Dec 09 10:01:06 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 09 10:01:06 crc restorecon[4751]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 10:01:06 crc restorecon[4751]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 09 10:01:06 crc restorecon[4751]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 09 10:01:06 crc restorecon[4751]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 09 10:01:06 crc restorecon[4751]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:06 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 09 10:01:06 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to
system_u:object_r:container_file_t:s0:c336,c787 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 10:01:07 crc restorecon[4751]: 
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 10:01:07 crc restorecon[4751]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 09 10:01:07 crc restorecon[4751]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 09 10:01:07 crc restorecon[4751]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 09 10:01:07 crc kubenswrapper[5002]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 09 10:01:07 crc kubenswrapper[5002]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 09 10:01:07 crc kubenswrapper[5002]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 09 10:01:07 crc kubenswrapper[5002]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
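The deprecation warnings above all point at the same remedy: move each flag into the KubeletConfiguration file passed via --config. Below is a minimal, hedged sketch of that migration for four of the flags named in this log (--container-runtime-endpoint, --volume-plugin-dir, --register-with-taints, --system-reserved), written against the public kubelet.config.k8s.io/v1beta1 Go API; the socket path, taint, and reservation values are illustrative placeholders, not values recovered from this node.

// Sketch: render config-file equivalents of the deprecated kubelet flags
// warned about in the log above. Field names follow the published
// KubeletConfiguration v1beta1 schema; all concrete values are assumptions.
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
	kubeletv1beta1 "k8s.io/kubelet/config/v1beta1"
	"sigs.k8s.io/yaml"
)

func main() {
	var cfg kubeletv1beta1.KubeletConfiguration
	cfg.APIVersion = "kubelet.config.k8s.io/v1beta1"
	cfg.Kind = "KubeletConfiguration"

	// --container-runtime-endpoint  ->  containerRuntimeEndpoint
	// (placeholder socket; CRC uses CRI-O, but the path is assumed here)
	cfg.ContainerRuntimeEndpoint = "unix:///var/run/crio/crio.sock"

	// --volume-plugin-dir  ->  volumePluginDir (placeholder path)
	cfg.VolumePluginDir = "/etc/kubernetes/kubelet-plugins/volume/exec"

	// --register-with-taints  ->  registerWithTaints (placeholder taint)
	cfg.RegisterWithTaints = []corev1.Taint{
		{Key: "node-role.kubernetes.io/master", Effect: corev1.TaintEffectNoSchedule},
	}

	// --system-reserved  ->  systemReserved (placeholder reservations)
	cfg.SystemReserved = map[string]string{"cpu": "500m", "memory": "1Gi"}

	// Emit the YAML document that would replace the deprecated flags.
	out, err := yaml.Marshal(&cfg)
	if err != nil {
		panic(err)
	}
	fmt.Print(string(out))
}

Running the sketch prints a YAML KubeletConfiguration of the shape the --config flag expects, which is the migration the warnings (and the linked kubelet-config-file documentation) describe.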
Dec 09 10:01:07 crc kubenswrapper[5002]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 09 10:01:07 crc kubenswrapper[5002]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.837236 5002 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842212 5002 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842243 5002 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842256 5002 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842267 5002 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842277 5002 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842286 5002 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842296 5002 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842305 5002 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842313 5002 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842322 5002 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842330 5002 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842340 5002 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842348 5002 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842357 5002 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842365 5002 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842373 5002 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842381 5002 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842390 5002 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842398 5002 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842406 5002 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842416 5002 
feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842426 5002 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842436 5002 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842445 5002 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842453 5002 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842460 5002 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842471 5002 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842481 5002 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842491 5002 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842500 5002 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842509 5002 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842532 5002 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842541 5002 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842550 5002 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842558 5002 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842566 5002 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842574 5002 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842582 5002 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842590 5002 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842598 5002 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842606 5002 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842614 5002 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842622 5002 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842631 5002 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842639 5002 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 09 10:01:07 crc kubenswrapper[5002]: 
W1209 10:01:07.842649 5002 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842657 5002 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842665 5002 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842673 5002 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842681 5002 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842689 5002 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842697 5002 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842704 5002 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842713 5002 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842721 5002 feature_gate.go:330] unrecognized feature gate: Example Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842729 5002 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842737 5002 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842745 5002 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842754 5002 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842762 5002 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842769 5002 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842777 5002 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842785 5002 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842794 5002 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842804 5002 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842836 5002 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842845 5002 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842852 5002 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842860 5002 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842868 5002 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.842876 5002 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 09 
10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843009 5002 flags.go:64] FLAG: --address="0.0.0.0" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843025 5002 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843040 5002 flags.go:64] FLAG: --anonymous-auth="true" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843052 5002 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843063 5002 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843073 5002 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843084 5002 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843095 5002 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843105 5002 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843114 5002 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843124 5002 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843133 5002 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843142 5002 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843152 5002 flags.go:64] FLAG: --cgroup-root="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843161 5002 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843171 5002 flags.go:64] FLAG: --client-ca-file="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843179 5002 flags.go:64] FLAG: --cloud-config="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843191 5002 flags.go:64] FLAG: --cloud-provider="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843201 5002 flags.go:64] FLAG: --cluster-dns="[]" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843212 5002 flags.go:64] FLAG: --cluster-domain="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843220 5002 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843230 5002 flags.go:64] FLAG: --config-dir="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843239 5002 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843248 5002 flags.go:64] FLAG: --container-log-max-files="5" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843259 5002 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843268 5002 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843277 5002 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843286 5002 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843295 5002 flags.go:64] FLAG: --contention-profiling="false" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 
10:01:07.843305 5002 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843314 5002 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843324 5002 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843333 5002 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843344 5002 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843353 5002 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843362 5002 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843371 5002 flags.go:64] FLAG: --enable-load-reader="false" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843379 5002 flags.go:64] FLAG: --enable-server="true" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843388 5002 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843400 5002 flags.go:64] FLAG: --event-burst="100" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843409 5002 flags.go:64] FLAG: --event-qps="50" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843418 5002 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843428 5002 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843437 5002 flags.go:64] FLAG: --eviction-hard="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843447 5002 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843456 5002 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843465 5002 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843474 5002 flags.go:64] FLAG: --eviction-soft="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843482 5002 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843492 5002 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843501 5002 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843510 5002 flags.go:64] FLAG: --experimental-mounter-path="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843519 5002 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843527 5002 flags.go:64] FLAG: --fail-swap-on="true" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843536 5002 flags.go:64] FLAG: --feature-gates="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843547 5002 flags.go:64] FLAG: --file-check-frequency="20s" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843556 5002 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843565 5002 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843574 5002 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 
10:01:07.843583 5002 flags.go:64] FLAG: --healthz-port="10248" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843593 5002 flags.go:64] FLAG: --help="false" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843602 5002 flags.go:64] FLAG: --hostname-override="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843611 5002 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843620 5002 flags.go:64] FLAG: --http-check-frequency="20s" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843628 5002 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843637 5002 flags.go:64] FLAG: --image-credential-provider-config="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843646 5002 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843655 5002 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843664 5002 flags.go:64] FLAG: --image-service-endpoint="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843672 5002 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843681 5002 flags.go:64] FLAG: --kube-api-burst="100" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843690 5002 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843700 5002 flags.go:64] FLAG: --kube-api-qps="50" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843708 5002 flags.go:64] FLAG: --kube-reserved="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843717 5002 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843726 5002 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843736 5002 flags.go:64] FLAG: --kubelet-cgroups="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843745 5002 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843755 5002 flags.go:64] FLAG: --lock-file="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843763 5002 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843772 5002 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843782 5002 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843795 5002 flags.go:64] FLAG: --log-json-split-stream="false" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843804 5002 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843839 5002 flags.go:64] FLAG: --log-text-split-stream="false" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843849 5002 flags.go:64] FLAG: --logging-format="text" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843857 5002 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843868 5002 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843876 5002 flags.go:64] FLAG: --manifest-url="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843885 5002 
flags.go:64] FLAG: --manifest-url-header="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843897 5002 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843906 5002 flags.go:64] FLAG: --max-open-files="1000000" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843917 5002 flags.go:64] FLAG: --max-pods="110" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843926 5002 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843936 5002 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843945 5002 flags.go:64] FLAG: --memory-manager-policy="None" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843954 5002 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843964 5002 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843973 5002 flags.go:64] FLAG: --node-ip="192.168.126.11" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.843982 5002 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.844002 5002 flags.go:64] FLAG: --node-status-max-images="50" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.844011 5002 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.844020 5002 flags.go:64] FLAG: --oom-score-adj="-999" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.844029 5002 flags.go:64] FLAG: --pod-cidr="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.844038 5002 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.844051 5002 flags.go:64] FLAG: --pod-manifest-path="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.844060 5002 flags.go:64] FLAG: --pod-max-pids="-1" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.844069 5002 flags.go:64] FLAG: --pods-per-core="0" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.844078 5002 flags.go:64] FLAG: --port="10250" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.844087 5002 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.844096 5002 flags.go:64] FLAG: --provider-id="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.844105 5002 flags.go:64] FLAG: --qos-reserved="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.844115 5002 flags.go:64] FLAG: --read-only-port="10255" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.844124 5002 flags.go:64] FLAG: --register-node="true" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.844134 5002 flags.go:64] FLAG: --register-schedulable="true" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.844143 5002 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.844157 5002 flags.go:64] FLAG: --registry-burst="10" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.844166 5002 flags.go:64] FLAG: --registry-qps="5" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.844175 5002 flags.go:64] 
FLAG: --reserved-cpus="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.844184 5002 flags.go:64] FLAG: --reserved-memory="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.844195 5002 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.844204 5002 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.844710 5002 flags.go:64] FLAG: --rotate-certificates="false" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.844725 5002 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.844735 5002 flags.go:64] FLAG: --runonce="false" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.844744 5002 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.844753 5002 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.844763 5002 flags.go:64] FLAG: --seccomp-default="false" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.844772 5002 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.844781 5002 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.844790 5002 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.844799 5002 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.844808 5002 flags.go:64] FLAG: --storage-driver-password="root" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.844842 5002 flags.go:64] FLAG: --storage-driver-secure="false" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.844851 5002 flags.go:64] FLAG: --storage-driver-table="stats" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.844859 5002 flags.go:64] FLAG: --storage-driver-user="root" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.844869 5002 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.844878 5002 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.844887 5002 flags.go:64] FLAG: --system-cgroups="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.844896 5002 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.844912 5002 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.844921 5002 flags.go:64] FLAG: --tls-cert-file="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.844930 5002 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.844940 5002 flags.go:64] FLAG: --tls-min-version="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.844949 5002 flags.go:64] FLAG: --tls-private-key-file="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.844958 5002 flags.go:64] FLAG: --topology-manager-policy="none" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.844975 5002 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.844984 5002 flags.go:64] FLAG: --topology-manager-scope="container" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.844993 5002 flags.go:64] 
FLAG: --v="2" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.845005 5002 flags.go:64] FLAG: --version="false" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.845017 5002 flags.go:64] FLAG: --vmodule="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.845028 5002 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.845038 5002 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845294 5002 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845308 5002 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845318 5002 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845329 5002 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845337 5002 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845348 5002 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845359 5002 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845367 5002 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845376 5002 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845384 5002 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845393 5002 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845402 5002 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845410 5002 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845418 5002 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845427 5002 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845436 5002 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845444 5002 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845454 5002 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845462 5002 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845469 5002 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845488 5002 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845496 5002 
feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845504 5002 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845512 5002 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845520 5002 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845531 5002 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845539 5002 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845546 5002 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845554 5002 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845561 5002 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845569 5002 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845585 5002 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845593 5002 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845602 5002 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845610 5002 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845618 5002 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845626 5002 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845633 5002 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845641 5002 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845649 5002 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845656 5002 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845664 5002 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845671 5002 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845680 5002 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845687 5002 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845697 5002 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
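The same set of "unrecognized feature gate" warnings repeats several times during startup (this run continues below); a quick tally confirms it is one set parsed repeatedly rather than distinct problems. A sketch under the same stdin assumption as above:

```python
# gate_warnings.py -- illustrative; tallies unrecognized-gate warnings:
#   journalctl -u kubelet --no-pager | python3 gate_warnings.py
import re
import sys
from collections import Counter

counts = Counter(re.findall(r"unrecognized feature gate: (\w+)", sys.stdin.read()))
for gate, n in counts.most_common():
    print(f"{n:3d}  {gate}")
```

These are OpenShift-side gate names that the embedded upstream feature-gate parser does not know; they are logged at warning level and startup proceeds.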
Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845708 5002 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845718 5002 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845726 5002 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845735 5002 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845743 5002 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845752 5002 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845760 5002 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845768 5002 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845776 5002 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845784 5002 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845792 5002 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845802 5002 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845836 5002 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845844 5002 feature_gate.go:330] unrecognized feature gate: Example Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845852 5002 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845860 5002 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845867 5002 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845877 5002 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845886 5002 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845893 5002 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845901 5002 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845910 5002 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845919 5002 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845929 5002 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.845937 5002 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.846185 5002 feature_gate.go:386] feature gates: 
{map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.856538 5002 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.856567 5002 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.856681 5002 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.856694 5002 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.856707 5002 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.856717 5002 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.856725 5002 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.856734 5002 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.856743 5002 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.856751 5002 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.856759 5002 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.856767 5002 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.856775 5002 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.856782 5002 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.856790 5002 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.856798 5002 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.856806 5002 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.856838 5002 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.856847 5002 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.856854 5002 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.856862 5002 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.856870 5002 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 
10:01:07.856877 5002 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.856885 5002 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.856893 5002 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.856901 5002 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.856912 5002 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.856921 5002 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.856929 5002 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.856937 5002 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.856945 5002 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.856954 5002 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.856962 5002 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.856970 5002 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.856978 5002 feature_gate.go:330] unrecognized feature gate: Example Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.856986 5002 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.856995 5002 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857003 5002 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857010 5002 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857020 5002 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857030 5002 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857040 5002 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857049 5002 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857057 5002 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857066 5002 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857074 5002 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857082 5002 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857090 5002 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857098 5002 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857106 5002 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857113 5002 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857121 5002 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857129 5002 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857136 5002 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857144 5002 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857152 5002 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857195 5002 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857203 5002 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857211 5002 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857218 5002 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857226 5002 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857234 5002 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857244 5002 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
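The feature_gate.go warnings come in three flavors, distinguishable by the source line in the klog header: :330 (unrecognized gate), :351 (deprecated gate explicitly set, here KMSv1), and :353 (GA-locked gate explicitly set, here CloudDualStackNodeIPs, DisableKubeletCloudCredentialProviders, ValidatingAdmissionPolicy). A sketch splitting them apart, with the source-line numbers taken from this log and the same stdin assumption:

```python
# split feature_gate.go warnings by klog source line, e.g.
#   W1209 10:01:07.857030 5002 feature_gate.go:330] unrecognized feature gate: ...
import re
import sys

text = sys.stdin.read()
unknown = set(re.findall(r"feature_gate\.go:330\] unrecognized feature gate: (\w+)", text))
deprecated = set(re.findall(r"feature_gate\.go:351\] Setting deprecated feature gate (\w+)=", text))
ga_locked = set(re.findall(r"feature_gate\.go:353\] Setting GA feature gate (\w+)=", text))
print(f"unrecognized ({len(unknown)}):", ", ".join(sorted(unknown)))
print("deprecated:", ", ".join(sorted(deprecated)))   # KMSv1
print("ga-locked:", ", ".join(sorted(ga_locked)))     # CloudDualStackNodeIPs, ...
```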
Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857253 5002 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857262 5002 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857270 5002 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857278 5002 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857286 5002 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857293 5002 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857301 5002 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857308 5002 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857316 5002 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857325 5002 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.857337 5002 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857553 5002 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857566 5002 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857574 5002 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857582 5002 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857592 5002 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
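The I-level "feature gates: {map[...]}" entry just above is Go's fmt rendering of the effective gate map. A sketch that pulls it into a dict; the literal below is abbreviated from the logged line, and the parsing is illustrative:

```python
import re

LINE = ("feature gates: {map[CloudDualStackNodeIPs:true "
        "DisableKubeletCloudCredentialProviders:true KMSv1:true "
        "ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}")
body = re.search(r"\{map\[(.*)\]\}", LINE).group(1)
gates = {k: v == "true" for k, v in (kv.split(":") for kv in body.split())}
print(gates["KMSv1"], gates["VolumeAttributesClass"])  # True False
```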
Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857603 5002 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857613 5002 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857622 5002 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857631 5002 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857640 5002 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857649 5002 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857657 5002 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857665 5002 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857672 5002 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857680 5002 feature_gate.go:330] unrecognized feature gate: Example Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857688 5002 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857696 5002 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857704 5002 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857711 5002 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857719 5002 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857727 5002 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857734 5002 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857742 5002 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857750 5002 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857758 5002 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857766 5002 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857773 5002 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857781 5002 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857789 5002 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857797 5002 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857804 5002 
feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857835 5002 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857843 5002 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857851 5002 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857869 5002 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857877 5002 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857885 5002 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857893 5002 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857901 5002 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857910 5002 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857918 5002 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857927 5002 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857935 5002 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857968 5002 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857978 5002 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857988 5002 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.857999 5002 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.858008 5002 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.858016 5002 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.858024 5002 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.858032 5002 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.858039 5002 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.858047 5002 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.858055 5002 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.858062 5002 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.858070 5002 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.858078 5002 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.858086 5002 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.858094 5002 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.858101 5002 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.858109 5002 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.858117 5002 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.858124 5002 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.858132 5002 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.858140 5002 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.858147 5002 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.858155 5002 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.858165 5002 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.858175 5002 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.858184 5002 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.858195 5002 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.858207 5002 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.858690 5002 server.go:940] "Client rotation is on, will bootstrap in background" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.862855 5002 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.862990 5002 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.863771 5002 server.go:997] "Starting client certificate rotation" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.863799 5002 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.864220 5002 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-14 02:47:54.075147487 +0000 UTC Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.864327 5002 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 112h46m46.210825392s for next certificate rotation Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.878466 5002 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.881408 5002 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.892105 5002 log.go:25] "Validated CRI v1 runtime API" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.906930 5002 log.go:25] "Validated CRI v1 image API" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.908389 5002 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.911339 5002 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-09-09-56-41-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.911385 5002 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 
minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.937753 5002 manager.go:217] Machine: {Timestamp:2025-12-09 10:01:07.935250607 +0000 UTC m=+0.327301758 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:8af61218-105c-4188-8c40-2d81c3899a86 BootID:bb4d43e7-bcbf-4472-90e9-44716d72c15e Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:d5:7b:db Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:d5:7b:db Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:c9:83:df Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:82:77:be Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:bc:97:b5 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:a3:53:5f Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:c5:26:25 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:66:4c:c0:b1:8b:a1 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:be:27:cc:3a:07:6c Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] 
Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.938143 5002 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
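
The certificate_manager lines above pick a rotation deadline of 2025-12-14 02:47:54 UTC for a client certificate that expires 2026-02-24 05:52:08 UTC, then report waiting 112h46m46s; that wait is simply the deadline minus the journal's current time (Dec 09 10:01:07 UTC). The short Go check below redoes the subtraction, dropping sub-second parts; how the deadline itself is chosen is up to the certificate manager, which upstream jitters to roughly 70-90% of the certificate's lifetime.

    // rotation_wait_check.go: recompute the logged "Waiting ..." duration
    // from the two timestamps in the certificate_manager lines above.
    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        const layout = "2006-01-02 15:04:05"
        now, _ := time.Parse(layout, "2025-12-09 10:01:07")      // journal time
        deadline, _ := time.Parse(layout, "2025-12-14 02:47:54") // rotation deadline
        // Prints 112h46m47s, matching the logged wait to within the
        // fractional seconds dropped here.
        fmt.Println(deadline.Sub(now))
    }
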
Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.938380 5002 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.938796 5002 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.939191 5002 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.939240 5002 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.939563 5002 topology_manager.go:138] "Creating topology manager with none policy" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.939580 5002 container_manager_linux.go:303] "Creating device plugin manager" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.940008 5002 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.940060 5002 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.940467 5002 state_mem.go:36] "Initialized new in-memory state store" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.940614 5002 server.go:1245] "Using root directory" path="/var/lib/kubelet" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.941575 5002 kubelet.go:418] "Attempting to sync node with API server" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.941607 5002 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" 
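
The Node Config record above reserves 200m CPU, 350Mi memory, and 350Mi ephemeral-storage for the system (KubeReserved is null) and sets a memory.available hard eviction threshold of 100Mi. Kubernetes derives node allocatable as capacity minus kube-reserved minus system-reserved minus the hard eviction threshold, so with the 33654128640-byte MemoryCapacity from the Machine record this node should advertise roughly 30.9 GiB of allocatable memory. A worked sketch follows; the figures come from the log lines above, the formula is the standard allocatable computation, and the file name is hypothetical.

    // allocatable_sketch.go: the standard node-allocatable arithmetic with
    // the memory figures from the Machine and NodeConfig records above.
    package main

    import "fmt"

    const Mi = 1024 * 1024

    func main() {
        capacity := int64(33654128640)    // MemoryCapacity (Machine record)
        kubeReserved := int64(0)          // "KubeReserved":null
        systemReserved := int64(350 * Mi) // "SystemReserved":{"memory":"350Mi",...}
        hardEviction := int64(100 * Mi)   // memory.available hard threshold
        allocatable := capacity - kubeReserved - systemReserved - hardEviction
        fmt.Printf("allocatable memory: %d bytes (%.2f GiB)\n",
            allocatable, float64(allocatable)/(1<<30))
    }
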
Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.941644 5002 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.941668 5002 kubelet.go:324] "Adding apiserver pod source" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.941701 5002 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.944454 5002 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.945007 5002 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.945144 5002 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.132:6443: connect: connection refused Dec 09 10:01:07 crc kubenswrapper[5002]: E1209 10:01:07.945248 5002 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.132:6443: connect: connection refused" logger="UnhandledError" Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.945241 5002 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.132:6443: connect: connection refused Dec 09 10:01:07 crc kubenswrapper[5002]: E1209 10:01:07.945370 5002 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.132:6443: connect: connection refused" logger="UnhandledError" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.946426 5002 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.949480 5002 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.949514 5002 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.949521 5002 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.949529 5002 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.949541 5002 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.949548 5002 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.949554 5002 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.949603 5002 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/downward-api" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.949612 5002 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.949619 5002 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.949632 5002 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.949640 5002 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.949853 5002 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.951095 5002 server.go:1280] "Started kubelet" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.951581 5002 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.132:6443: connect: connection refused Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.951590 5002 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.951953 5002 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 09 10:01:07 crc systemd[1]: Started Kubernetes Kubelet. Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.952942 5002 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.953075 5002 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.953388 5002 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.953483 5002 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 00:40:19.535390804 +0000 UTC Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.953527 5002 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 62h39m11.581883957s for next certificate rotation Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.953799 5002 server.go:460] "Adding debug handlers to kubelet server" Dec 09 10:01:07 crc kubenswrapper[5002]: E1209 10:01:07.953890 5002 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.953960 5002 volume_manager.go:287] "The desired_state_of_world populator starts" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.953972 5002 volume_manager.go:289] "Starting Kubelet Volume Manager" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.954021 5002 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 09 10:01:07 crc kubenswrapper[5002]: E1209 10:01:07.953896 5002 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.132:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187f83c69949ea84 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-09 10:01:07.951045252 +0000 UTC m=+0.343096383,LastTimestamp:2025-12-09 10:01:07.951045252 +0000 UTC m=+0.343096383,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 09 10:01:07 crc kubenswrapper[5002]: E1209 10:01:07.954437 5002 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.132:6443: connect: connection refused" interval="200ms" Dec 09 10:01:07 crc kubenswrapper[5002]: W1209 10:01:07.955579 5002 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.132:6443: connect: connection refused Dec 09 10:01:07 crc kubenswrapper[5002]: E1209 10:01:07.955640 5002 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.132:6443: connect: connection refused" logger="UnhandledError" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.955737 5002 factory.go:153] Registering CRI-O factory Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.955759 5002 factory.go:221] Registration of the crio container factory successfully Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.955850 5002 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.955871 5002 factory.go:55] Registering systemd factory Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.955880 5002 factory.go:221] Registration of the systemd container factory successfully Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.955901 5002 factory.go:103] Registering Raw factory Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.955916 5002 manager.go:1196] Started watching for new ooms in manager Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.956531 5002 manager.go:319] Starting recovery of all containers Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.971913 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.972288 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.972314 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.972344 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.972366 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.972382 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.972402 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.972415 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.972438 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.972453 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.972467 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.972486 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.972500 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.972533 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.972547 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.972567 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.972581 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.972597 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.972613 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.972628 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.972675 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.972690 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.972707 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.972725 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.972739 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" 
seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.972757 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.972777 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.972798 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.972847 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.972869 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.972885 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.972905 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.972921 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.972936 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.972958 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.972978 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" 
seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.973004 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.973019 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.973034 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.973767 5002 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.973849 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.973868 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.973883 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.973901 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.973915 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.973934 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.973948 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.973963 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.973980 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.973995 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.974015 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.974028 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.974042 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.974068 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.974088 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.974104 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.974125 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.974144 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" 
volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.974160 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.974173 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.974190 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.974203 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.974222 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.974236 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.974248 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.974267 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.974279 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.974297 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.974310 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.974324 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.974339 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.974352 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.974372 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.974386 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.974399 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.974416 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.974430 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.974447 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.974461 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.974476 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.974493 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.974507 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.974522 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.974540 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.974555 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.974572 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.974586 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.974600 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.974618 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.974631 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.974648 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.974661 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.974674 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.974691 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.974705 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.974721 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.974734 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.975645 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.975676 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.975711 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.975724 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.975737 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.975754 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.975767 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.975783 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.976397 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.976460 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.976486 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.976509 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.976532 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.976553 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.976573 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.976594 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" 
volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.976613 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.976656 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.976677 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.976694 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.976712 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.976728 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.976747 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.976768 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.976785 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.976801 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.976846 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.976875 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.976894 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.976911 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.976929 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.976945 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.976962 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.976980 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.976997 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977014 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977035 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977053 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977069 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977086 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977105 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977119 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977134 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977151 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977166 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977183 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977201 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977217 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977234 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977251 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977269 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977284 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977303 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977320 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977338 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977355 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977373 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977388 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977407 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977423 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977442 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977458 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977473 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977488 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977504 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977519 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977534 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977550 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977567 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977582 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977600 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977617 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977636 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977651 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977667 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977682 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977698 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977713 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977731 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977746 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977762 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977776 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" 
volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977791 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977807 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977846 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977861 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977878 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977912 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977930 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977948 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977964 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.977979 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.978000 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.978018 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.978033 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.978048 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.978065 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.978080 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.978097 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.978112 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.978128 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.978142 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.978158 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.978172 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.978191 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.978209 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.978224 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.978238 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.978256 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.978273 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.978290 5002 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.978306 5002 reconstruct.go:97] "Volume reconstruction finished" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.978317 5002 reconciler.go:26] "Reconciler: start to sync state" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.980954 5002 manager.go:324] Recovery completed Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.993001 5002 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.995293 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.995518 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.995528 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.997193 5002 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 
Dec 09 10:01:07 crc kubenswrapper[5002]: I1209 10:01:07.997230 5002 state_mem.go:36] "Initialized new in-memory state store"
Dec 09 10:01:08 crc kubenswrapper[5002]: E1209 10:01:08.054368 5002 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.056769 5002 policy_none.go:49] "None policy: Start"
Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.056755 5002 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.058878 5002 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.058936 5002 status_manager.go:217] "Starting to sync pod status with apiserver"
Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.058965 5002 kubelet.go:2335] "Starting kubelet main sync loop"
Dec 09 10:01:08 crc kubenswrapper[5002]: E1209 10:01:08.059361 5002 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Dec 09 10:01:08 crc kubenswrapper[5002]: W1209 10:01:08.059846 5002 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.132:6443: connect: connection refused
Dec 09 10:01:08 crc kubenswrapper[5002]: E1209 10:01:08.059904 5002 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.132:6443: connect: connection refused" logger="UnhandledError"
Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.060310 5002 memory_manager.go:170] "Starting memorymanager" policy="None"
Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.060345 5002 state_mem.go:35] "Initializing new in-memory state store"
Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.109094 5002 manager.go:334] "Starting Device Plugin manager"
Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.109157 5002 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.109172 5002 server.go:79] "Starting device plugin registration server"
Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.109640 5002 eviction_manager.go:189] "Eviction manager: starting control loop"
Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.109655 5002 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.109843 5002 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.110008 5002 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.110032 5002 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Dec 09 10:01:08 crc kubenswrapper[5002]: E1209 10:01:08.120115 5002 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Dec 09 10:01:08 crc kubenswrapper[5002]: E1209 10:01:08.155338 5002 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.132:6443: connect: connection refused" interval="400ms"
Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.159682 5002 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"]
Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.159804 5002 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.161353 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.161392 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.161402 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.161530 5002 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.161949 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.162012 5002 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.162541 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.162597 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.162608 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.162777 5002 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.162950 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.163011 5002 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.163397 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.163457 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.163476 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.164374 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.164431 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.164448 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.164623 5002 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.164922 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.164945 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.165021 5002 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.164962 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.165117 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.165778 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.165831 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.165844 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.165997 5002 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.166138 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.166163 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.166175 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:08 crc 
kubenswrapper[5002]: I1209 10:01:08.166361 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.166393 5002 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.166656 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.166693 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.166713 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.167006 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.167054 5002 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.167195 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.167234 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.167245 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.167984 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.168065 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.168081 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.210077 5002 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.211360 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.211399 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.211412 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.211437 5002 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 09 10:01:08 crc kubenswrapper[5002]: E1209 10:01:08.212017 5002 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.132:6443: connect: connection refused" node="crc" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.282629 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.282702 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.282740 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.282772 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.282792 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.282880 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.282938 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.282956 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.283011 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.283084 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.283156 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.283189 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.283216 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.283246 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.283296 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.384497 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.384535 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.384575 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.384645 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 10:01:08 crc 
Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.384650 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.384673 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.384695 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.384652 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.384714 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.384735 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.384761 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.384747 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.384772 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.384780 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.384839 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.384847 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.384795 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.384781 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.384894 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.384911 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.384926 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.384942 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.384958 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.384972 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.384976 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.385006 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.385031 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.385060 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.384956 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.385113 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.412438 5002 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.413985 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.414061 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.414086 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.414135 5002 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 09 10:01:08 crc kubenswrapper[5002]: E1209 10:01:08.414797 5002 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.132:6443: connect: connection refused" node="crc" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.498191 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.524003 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 09 10:01:08 crc kubenswrapper[5002]: W1209 10:01:08.524419 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-d570f59d833d35bb53de3208e31bc9ece4e7d9def61a67c5472a39692f6f7e58 WatchSource:0}: Error finding container d570f59d833d35bb53de3208e31bc9ece4e7d9def61a67c5472a39692f6f7e58: Status 404 returned error can't find the container with id d570f59d833d35bb53de3208e31bc9ece4e7d9def61a67c5472a39692f6f7e58 Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.534622 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 09 10:01:08 crc kubenswrapper[5002]: W1209 10:01:08.545042 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-68cb9fc5899f8c09f507b01139e7a74ffa61246fd42752a72ff262d958093911 WatchSource:0}: Error finding container 68cb9fc5899f8c09f507b01139e7a74ffa61246fd42752a72ff262d958093911: Status 404 returned error can't find the container with id 68cb9fc5899f8c09f507b01139e7a74ffa61246fd42752a72ff262d958093911 Dec 09 10:01:08 crc kubenswrapper[5002]: W1209 10:01:08.552762 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-cb215044df1b0fd08fba58ec61011f10e3c9a6bfbe0f413c88eb0a65cee537d8 WatchSource:0}: Error finding container cb215044df1b0fd08fba58ec61011f10e3c9a6bfbe0f413c88eb0a65cee537d8: Status 404 returned error can't find the container with id cb215044df1b0fd08fba58ec61011f10e3c9a6bfbe0f413c88eb0a65cee537d8 Dec 09 10:01:08 crc kubenswrapper[5002]: E1209 10:01:08.555863 5002 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.132:6443: connect: connection refused" interval="800ms" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.560707 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.566480 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 10:01:08 crc kubenswrapper[5002]: W1209 10:01:08.570054 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-871e1e00838dda1031fca315ac770a0bf6306b119e0bb56895f4bc6e2b2cb1ef WatchSource:0}: Error finding container 871e1e00838dda1031fca315ac770a0bf6306b119e0bb56895f4bc6e2b2cb1ef: Status 404 returned error can't find the container with id 871e1e00838dda1031fca315ac770a0bf6306b119e0bb56895f4bc6e2b2cb1ef Dec 09 10:01:08 crc kubenswrapper[5002]: W1209 10:01:08.581568 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-165dd357907d6f5eedaf8a706d99f80fade7a3d1705f752f73500f6b8ae70a15 WatchSource:0}: Error finding container 165dd357907d6f5eedaf8a706d99f80fade7a3d1705f752f73500f6b8ae70a15: Status 404 returned error can't find the container with id 165dd357907d6f5eedaf8a706d99f80fade7a3d1705f752f73500f6b8ae70a15 Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.815169 5002 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.816518 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.816554 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.816563 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.816588 5002 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 09 10:01:08 crc kubenswrapper[5002]: E1209 10:01:08.817099 5002 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.132:6443: connect: connection refused" node="crc" Dec 09 10:01:08 crc kubenswrapper[5002]: W1209 10:01:08.833461 5002 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.132:6443: connect: connection refused Dec 09 10:01:08 crc kubenswrapper[5002]: E1209 10:01:08.833539 5002 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.132:6443: connect: connection refused" logger="UnhandledError" Dec 09 10:01:08 crc kubenswrapper[5002]: I1209 10:01:08.952141 5002 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.132:6443: connect: connection refused Dec 09 10:01:09 crc kubenswrapper[5002]: I1209 10:01:09.066476 5002 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="aaffe6e7f597ce14c1bc564a62ec71519af84fa5d220d14a5e8000653d396c6d" exitCode=0 Dec 09 10:01:09 
Dec 09 10:01:09 crc kubenswrapper[5002]: I1209 10:01:09.066592 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"aaffe6e7f597ce14c1bc564a62ec71519af84fa5d220d14a5e8000653d396c6d"} Dec 09 10:01:09 crc kubenswrapper[5002]: I1209 10:01:09.066803 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"68cb9fc5899f8c09f507b01139e7a74ffa61246fd42752a72ff262d958093911"} Dec 09 10:01:09 crc kubenswrapper[5002]: I1209 10:01:09.066978 5002 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 10:01:09 crc kubenswrapper[5002]: I1209 10:01:09.069478 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:09 crc kubenswrapper[5002]: I1209 10:01:09.069523 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:09 crc kubenswrapper[5002]: I1209 10:01:09.069541 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:09 crc kubenswrapper[5002]: I1209 10:01:09.070277 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5a41e95e38748ef89ff0bc6429eb223b7821756bf1e0c84a3af512f4f0166a98"} Dec 09 10:01:09 crc kubenswrapper[5002]: I1209 10:01:09.070319 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d570f59d833d35bb53de3208e31bc9ece4e7d9def61a67c5472a39692f6f7e58"} Dec 09 10:01:09 crc kubenswrapper[5002]: I1209 10:01:09.072036 5002 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f" exitCode=0 Dec 09 10:01:09 crc kubenswrapper[5002]: I1209 10:01:09.072077 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f"} Dec 09 10:01:09 crc kubenswrapper[5002]: I1209 10:01:09.072146 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"165dd357907d6f5eedaf8a706d99f80fade7a3d1705f752f73500f6b8ae70a15"} Dec 09 10:01:09 crc kubenswrapper[5002]: I1209 10:01:09.072367 5002 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 10:01:09 crc kubenswrapper[5002]: I1209 10:01:09.073528 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:09 crc kubenswrapper[5002]: I1209 10:01:09.073560 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:09 crc kubenswrapper[5002]: I1209 10:01:09.073572 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09
10:01:09 crc kubenswrapper[5002]: I1209 10:01:09.075530 5002 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 10:01:09 crc kubenswrapper[5002]: I1209 10:01:09.077323 5002 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782" exitCode=0 Dec 09 10:01:09 crc kubenswrapper[5002]: I1209 10:01:09.077371 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782"} Dec 09 10:01:09 crc kubenswrapper[5002]: I1209 10:01:09.077423 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"871e1e00838dda1031fca315ac770a0bf6306b119e0bb56895f4bc6e2b2cb1ef"} Dec 09 10:01:09 crc kubenswrapper[5002]: I1209 10:01:09.077486 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:09 crc kubenswrapper[5002]: I1209 10:01:09.077548 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:09 crc kubenswrapper[5002]: I1209 10:01:09.077572 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:09 crc kubenswrapper[5002]: I1209 10:01:09.077655 5002 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 10:01:09 crc kubenswrapper[5002]: I1209 10:01:09.079316 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:09 crc kubenswrapper[5002]: I1209 10:01:09.079356 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:09 crc kubenswrapper[5002]: I1209 10:01:09.079371 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:09 crc kubenswrapper[5002]: I1209 10:01:09.080293 5002 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="ffd3267e6e931890e236a6659bb3f39f0b33ed2217ec46326efb6ef7b8e13b77" exitCode=0 Dec 09 10:01:09 crc kubenswrapper[5002]: I1209 10:01:09.080348 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"ffd3267e6e931890e236a6659bb3f39f0b33ed2217ec46326efb6ef7b8e13b77"} Dec 09 10:01:09 crc kubenswrapper[5002]: I1209 10:01:09.080388 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"cb215044df1b0fd08fba58ec61011f10e3c9a6bfbe0f413c88eb0a65cee537d8"} Dec 09 10:01:09 crc kubenswrapper[5002]: I1209 10:01:09.080548 5002 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 10:01:09 crc kubenswrapper[5002]: I1209 10:01:09.081919 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:09 crc kubenswrapper[5002]: I1209 10:01:09.081981 5002 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:09 crc kubenswrapper[5002]: I1209 10:01:09.082001 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:09 crc kubenswrapper[5002]: W1209 10:01:09.145158 5002 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.132:6443: connect: connection refused Dec 09 10:01:09 crc kubenswrapper[5002]: E1209 10:01:09.145269 5002 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.132:6443: connect: connection refused" logger="UnhandledError" Dec 09 10:01:09 crc kubenswrapper[5002]: W1209 10:01:09.326277 5002 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.132:6443: connect: connection refused Dec 09 10:01:09 crc kubenswrapper[5002]: E1209 10:01:09.326355 5002 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.132:6443: connect: connection refused" logger="UnhandledError" Dec 09 10:01:09 crc kubenswrapper[5002]: E1209 10:01:09.357017 5002 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.132:6443: connect: connection refused" interval="1.6s" Dec 09 10:01:09 crc kubenswrapper[5002]: W1209 10:01:09.373026 5002 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.132:6443: connect: connection refused Dec 09 10:01:09 crc kubenswrapper[5002]: E1209 10:01:09.373261 5002 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.132:6443: connect: connection refused" logger="UnhandledError" Dec 09 10:01:09 crc kubenswrapper[5002]: I1209 10:01:09.617612 5002 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 10:01:09 crc kubenswrapper[5002]: I1209 10:01:09.619131 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:09 crc kubenswrapper[5002]: I1209 10:01:09.619162 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:09 crc kubenswrapper[5002]: I1209 10:01:09.619173 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:09 crc kubenswrapper[5002]: I1209 10:01:09.619199 5002 kubelet_node_status.go:76] "Attempting to register node" 
node="crc" Dec 09 10:01:09 crc kubenswrapper[5002]: E1209 10:01:09.619650 5002 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.132:6443: connect: connection refused" node="crc" Dec 09 10:01:10 crc kubenswrapper[5002]: I1209 10:01:10.086312 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"fa189df1ab85704e8528d42da3e500dba354bb99dace868af40a49fec5b19fa3"} Dec 09 10:01:10 crc kubenswrapper[5002]: I1209 10:01:10.086366 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5de5d36b22bcd84f50dfb6ae8858f98665f3ae3981d5dc2233fa9e3b92db56b9"} Dec 09 10:01:10 crc kubenswrapper[5002]: I1209 10:01:10.086380 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0b30d7f55d95c36f85285f235dcf2de31c04cc358d5cce4d49f9ea43945fd3f5"} Dec 09 10:01:10 crc kubenswrapper[5002]: I1209 10:01:10.086452 5002 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 10:01:10 crc kubenswrapper[5002]: I1209 10:01:10.088093 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:10 crc kubenswrapper[5002]: I1209 10:01:10.088121 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:10 crc kubenswrapper[5002]: I1209 10:01:10.088128 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:10 crc kubenswrapper[5002]: I1209 10:01:10.089288 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8e1c1315eade2f326ac5feefc45cbcec29c7ee59fb40494f5153b7f8dbdfc404"} Dec 09 10:01:10 crc kubenswrapper[5002]: I1209 10:01:10.089313 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"cea3ea4cb1e3f00acc4ef769928988a0a2c2ee54afa0ab5f040ef50f465a9d6a"} Dec 09 10:01:10 crc kubenswrapper[5002]: I1209 10:01:10.089322 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b0db847425b24ea6804034220f2050b153b78d21bc1cc934dad6784c11c68dbd"} Dec 09 10:01:10 crc kubenswrapper[5002]: I1209 10:01:10.089378 5002 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 10:01:10 crc kubenswrapper[5002]: I1209 10:01:10.090023 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:10 crc kubenswrapper[5002]: I1209 10:01:10.090040 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:10 crc kubenswrapper[5002]: I1209 10:01:10.090048 5002 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:10 crc kubenswrapper[5002]: I1209 10:01:10.092330 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8"} Dec 09 10:01:10 crc kubenswrapper[5002]: I1209 10:01:10.092350 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8bd67b1b7ad3e21872e9aa91e61c978e0999546882ee2e21347d4f60e435e0b9"} Dec 09 10:01:10 crc kubenswrapper[5002]: I1209 10:01:10.092359 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b484d6dfa4fffc554c21f9d9bc8e77a62e96ebb9426f2bb34dea4b2c9244d680"} Dec 09 10:01:10 crc kubenswrapper[5002]: I1209 10:01:10.092368 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"301eb2b68a37707ea483536d639e5f476d3cb41ad8c60fbaea2e019fb51bad48"} Dec 09 10:01:10 crc kubenswrapper[5002]: I1209 10:01:10.092377 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9"} Dec 09 10:01:10 crc kubenswrapper[5002]: I1209 10:01:10.092436 5002 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 10:01:10 crc kubenswrapper[5002]: I1209 10:01:10.093070 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:10 crc kubenswrapper[5002]: I1209 10:01:10.093167 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:10 crc kubenswrapper[5002]: I1209 10:01:10.093236 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:10 crc kubenswrapper[5002]: I1209 10:01:10.094831 5002 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136" exitCode=0 Dec 09 10:01:10 crc kubenswrapper[5002]: I1209 10:01:10.094964 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136"} Dec 09 10:01:10 crc kubenswrapper[5002]: I1209 10:01:10.095125 5002 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 10:01:10 crc kubenswrapper[5002]: I1209 10:01:10.095865 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:10 crc kubenswrapper[5002]: I1209 10:01:10.095964 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:10 crc kubenswrapper[5002]: I1209 10:01:10.096138 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 09 10:01:10 crc kubenswrapper[5002]: I1209 10:01:10.097581 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"51d529f16cadd597347e6640d9f4c6fecf86943db64b9d1b154ee1adc9e68364"} Dec 09 10:01:10 crc kubenswrapper[5002]: I1209 10:01:10.097730 5002 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 10:01:10 crc kubenswrapper[5002]: I1209 10:01:10.098561 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:10 crc kubenswrapper[5002]: I1209 10:01:10.098579 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:10 crc kubenswrapper[5002]: I1209 10:01:10.098587 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:10 crc kubenswrapper[5002]: I1209 10:01:10.576629 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 10:01:11 crc kubenswrapper[5002]: I1209 10:01:11.105628 5002 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5" exitCode=0 Dec 09 10:01:11 crc kubenswrapper[5002]: I1209 10:01:11.105792 5002 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 09 10:01:11 crc kubenswrapper[5002]: I1209 10:01:11.105876 5002 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 10:01:11 crc kubenswrapper[5002]: I1209 10:01:11.105927 5002 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 10:01:11 crc kubenswrapper[5002]: I1209 10:01:11.106733 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5"} Dec 09 10:01:11 crc kubenswrapper[5002]: I1209 10:01:11.106947 5002 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 10:01:11 crc kubenswrapper[5002]: I1209 10:01:11.106988 5002 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 10:01:11 crc kubenswrapper[5002]: I1209 10:01:11.107127 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 09 10:01:11 crc kubenswrapper[5002]: I1209 10:01:11.107850 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:11 crc kubenswrapper[5002]: I1209 10:01:11.107906 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:11 crc kubenswrapper[5002]: I1209 10:01:11.107924 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:11 crc kubenswrapper[5002]: I1209 10:01:11.107856 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:11 crc kubenswrapper[5002]: I1209 10:01:11.108027 5002 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:11 crc kubenswrapper[5002]: I1209 10:01:11.108051 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:11 crc kubenswrapper[5002]: I1209 10:01:11.108208 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:11 crc kubenswrapper[5002]: I1209 10:01:11.108227 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:11 crc kubenswrapper[5002]: I1209 10:01:11.108239 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:11 crc kubenswrapper[5002]: I1209 10:01:11.108453 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:11 crc kubenswrapper[5002]: I1209 10:01:11.108470 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:11 crc kubenswrapper[5002]: I1209 10:01:11.108480 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:11 crc kubenswrapper[5002]: I1209 10:01:11.220469 5002 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 10:01:11 crc kubenswrapper[5002]: I1209 10:01:11.221799 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:11 crc kubenswrapper[5002]: I1209 10:01:11.221874 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:11 crc kubenswrapper[5002]: I1209 10:01:11.221891 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:11 crc kubenswrapper[5002]: I1209 10:01:11.221923 5002 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 09 10:01:11 crc kubenswrapper[5002]: I1209 10:01:11.876374 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 10:01:12 crc kubenswrapper[5002]: I1209 10:01:12.110434 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6b73ac21971d31e17f4f76bcbb1e02201b53b12189815ced304d7913f6aa76f4"} Dec 09 10:01:12 crc kubenswrapper[5002]: I1209 10:01:12.110476 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c750385ffbe9096e2cef43dffe002fd49599e44ed69806710b492749637dbe93"} Dec 09 10:01:12 crc kubenswrapper[5002]: I1209 10:01:12.110486 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"31e29bcc86bc8035d73f3f12857d0024d93752d70d6a77b51d98278981669ae4"} Dec 09 10:01:12 crc kubenswrapper[5002]: I1209 10:01:12.110495 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3df655bf1ce56b5aef6728759b8b3262260171afd0e4924212afc506fa313e1a"} Dec 09 10:01:12 crc kubenswrapper[5002]: I1209 10:01:12.110504 5002 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4cd16d2319de83381932ca61f8b08dd31ecfde33e270c3ea2ea3edb3c2fa174b"} Dec 09 10:01:12 crc kubenswrapper[5002]: I1209 10:01:12.110513 5002 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 10:01:12 crc kubenswrapper[5002]: I1209 10:01:12.110610 5002 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 10:01:12 crc kubenswrapper[5002]: I1209 10:01:12.110522 5002 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 10:01:12 crc kubenswrapper[5002]: I1209 10:01:12.111483 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:12 crc kubenswrapper[5002]: I1209 10:01:12.111515 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:12 crc kubenswrapper[5002]: I1209 10:01:12.111527 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:12 crc kubenswrapper[5002]: I1209 10:01:12.111572 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:12 crc kubenswrapper[5002]: I1209 10:01:12.111589 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:12 crc kubenswrapper[5002]: I1209 10:01:12.111598 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:12 crc kubenswrapper[5002]: I1209 10:01:12.111728 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:12 crc kubenswrapper[5002]: I1209 10:01:12.111742 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:12 crc kubenswrapper[5002]: I1209 10:01:12.111752 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:12 crc kubenswrapper[5002]: I1209 10:01:12.400225 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 10:01:12 crc kubenswrapper[5002]: I1209 10:01:12.400387 5002 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 09 10:01:12 crc kubenswrapper[5002]: I1209 10:01:12.400427 5002 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 10:01:12 crc kubenswrapper[5002]: I1209 10:01:12.402172 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:12 crc kubenswrapper[5002]: I1209 10:01:12.402215 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:12 crc kubenswrapper[5002]: I1209 10:01:12.402226 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:13 crc kubenswrapper[5002]: I1209 10:01:13.172511 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 10:01:13 crc kubenswrapper[5002]: I1209 10:01:13.172708 5002 
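
The "SyncLoop (PLEG)" records above are the densest signal in this stretch: each ContainerStarted or ContainerDied event is the pod lifecycle event generator reporting a container transition, so collecting them yields the exact startup order of the etcd, kube-apiserver, kube-controller-manager and kube-scheduler static pods, while the "SyncLoop (probe)" entries show startup probes still reporting unhealthy seconds after the processes launch. A hedged sketch for extracting that timeline; the regex is fitted to the quoted lines and is an assumption, not a stable kubelet interface:

```python
import re

# Illustrative sketch: rebuild a container start/stop timeline from the
# "SyncLoop (PLEG)" records above.
PLEG = re.compile(
    r'(?P<time>\d{2}:\d{2}:\d{2}\.\d{6}) \d+ kubelet\.go:\d+\] '
    r'"SyncLoop \(PLEG\): event for pod" pod="(?P<pod>[^"]+)" '
    r'event={"ID":"[^"]+","Type":"(?P<kind>\w+)","Data":"(?P<cid>[0-9a-f]+)"}'
)

def timeline(lines):
    """Return (time, pod, event, short container id) tuples, oldest first."""
    out = []
    for line in lines:
        m = PLEG.search(line)
        if m:
            out.append((m.group('time'), m.group('pod'),
                        m.group('kind'), m.group('cid')[:12]))
    return sorted(out)

sample = ('Dec 09 10:01:12 crc kubenswrapper[5002]: I1209 10:01:12.110434 5002 '
          'kubelet.go:2453] "SyncLoop (PLEG): event for pod" '
          'pod="openshift-etcd/etcd-crc" '
          'event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted",'
          '"Data":"6b73ac21971d31e17f4f76bcbb1e02201b53b12189815ced304d7913f6aa76f4"}')
print(timeline([sample]))
# [('10:01:12.110434', 'openshift-etcd/etcd-crc', 'ContainerStarted', '6b73ac21971d')]
```

Sorting on the klog timestamp alone is sufficient here because all records come from a single process on a single node.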
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 10:01:13 crc kubenswrapper[5002]: I1209 10:01:13.174031 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:13 crc kubenswrapper[5002]: I1209 10:01:13.174079 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:13 crc kubenswrapper[5002]: I1209 10:01:13.174094 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:13 crc kubenswrapper[5002]: I1209 10:01:13.225598 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 09 10:01:13 crc kubenswrapper[5002]: I1209 10:01:13.225783 5002 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 10:01:13 crc kubenswrapper[5002]: I1209 10:01:13.226829 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:13 crc kubenswrapper[5002]: I1209 10:01:13.226861 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:13 crc kubenswrapper[5002]: I1209 10:01:13.226874 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:15 crc kubenswrapper[5002]: I1209 10:01:15.854241 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 10:01:15 crc kubenswrapper[5002]: I1209 10:01:15.854448 5002 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 10:01:15 crc kubenswrapper[5002]: I1209 10:01:15.855383 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:15 crc kubenswrapper[5002]: I1209 10:01:15.855438 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:15 crc kubenswrapper[5002]: I1209 10:01:15.855451 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:16 crc kubenswrapper[5002]: I1209 10:01:16.172792 5002 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 09 10:01:16 crc kubenswrapper[5002]: I1209 10:01:16.172929 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 09 10:01:17 crc kubenswrapper[5002]: I1209 10:01:17.421149 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 10:01:17 crc kubenswrapper[5002]: I1209 10:01:17.421394 5002 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 10:01:17 crc kubenswrapper[5002]: I1209 10:01:17.422956 
5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:17 crc kubenswrapper[5002]: I1209 10:01:17.423007 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:17 crc kubenswrapper[5002]: I1209 10:01:17.423025 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:18 crc kubenswrapper[5002]: E1209 10:01:18.121121 5002 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 09 10:01:18 crc kubenswrapper[5002]: I1209 10:01:18.716099 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 10:01:18 crc kubenswrapper[5002]: I1209 10:01:18.716305 5002 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 10:01:18 crc kubenswrapper[5002]: I1209 10:01:18.718180 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:18 crc kubenswrapper[5002]: I1209 10:01:18.718219 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:18 crc kubenswrapper[5002]: I1209 10:01:18.718232 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:18 crc kubenswrapper[5002]: I1209 10:01:18.723339 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 10:01:19 crc kubenswrapper[5002]: I1209 10:01:19.127348 5002 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 10:01:19 crc kubenswrapper[5002]: I1209 10:01:19.128787 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:19 crc kubenswrapper[5002]: I1209 10:01:19.128898 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:19 crc kubenswrapper[5002]: I1209 10:01:19.128911 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:19 crc kubenswrapper[5002]: I1209 10:01:19.134218 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 10:01:19 crc kubenswrapper[5002]: I1209 10:01:19.953140 5002 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 09 10:01:20 crc kubenswrapper[5002]: I1209 10:01:20.129706 5002 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 10:01:20 crc kubenswrapper[5002]: I1209 10:01:20.130453 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:20 crc kubenswrapper[5002]: I1209 10:01:20.130488 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:20 crc kubenswrapper[5002]: I1209 10:01:20.130499 5002 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 09 10:01:20 crc kubenswrapper[5002]: I1209 10:01:20.316460 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 09 10:01:20 crc kubenswrapper[5002]: I1209 10:01:20.316631 5002 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 10:01:20 crc kubenswrapper[5002]: I1209 10:01:20.317857 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:20 crc kubenswrapper[5002]: I1209 10:01:20.317900 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:20 crc kubenswrapper[5002]: I1209 10:01:20.317918 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:20 crc kubenswrapper[5002]: I1209 10:01:20.576500 5002 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 09 10:01:20 crc kubenswrapper[5002]: I1209 10:01:20.576908 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 09 10:01:20 crc kubenswrapper[5002]: I1209 10:01:20.603861 5002 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 09 10:01:20 crc kubenswrapper[5002]: I1209 10:01:20.603986 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 09 10:01:20 crc kubenswrapper[5002]: I1209 10:01:20.609370 5002 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 09 10:01:20 crc kubenswrapper[5002]: I1209 10:01:20.609433 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 09 10:01:24 crc kubenswrapper[5002]: I1209 10:01:24.935422 5002 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection 
refused" start-of-body= Dec 09 10:01:24 crc kubenswrapper[5002]: I1209 10:01:24.935478 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 09 10:01:25 crc kubenswrapper[5002]: I1209 10:01:25.582525 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 10:01:25 crc kubenswrapper[5002]: I1209 10:01:25.582671 5002 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 10:01:25 crc kubenswrapper[5002]: I1209 10:01:25.583191 5002 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 09 10:01:25 crc kubenswrapper[5002]: I1209 10:01:25.583276 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 09 10:01:25 crc kubenswrapper[5002]: E1209 10:01:25.583282 5002 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="3.2s" Dec 09 10:01:25 crc kubenswrapper[5002]: I1209 10:01:25.583645 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:25 crc kubenswrapper[5002]: I1209 10:01:25.583708 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:25 crc kubenswrapper[5002]: I1209 10:01:25.583733 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:25 crc kubenswrapper[5002]: I1209 10:01:25.585033 5002 trace.go:236] Trace[96854899]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Dec-2025 10:01:12.132) (total time: 13452ms): Dec 09 10:01:25 crc kubenswrapper[5002]: Trace[96854899]: ---"Objects listed" error: 13452ms (10:01:25.584) Dec 09 10:01:25 crc kubenswrapper[5002]: Trace[96854899]: [13.452510638s] [13.452510638s] END Dec 09 10:01:25 crc kubenswrapper[5002]: I1209 10:01:25.585442 5002 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 09 10:01:25 crc kubenswrapper[5002]: I1209 10:01:25.586772 5002 trace.go:236] Trace[1494361159]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Dec-2025 10:01:11.772) (total time: 13814ms): Dec 09 10:01:25 crc kubenswrapper[5002]: Trace[1494361159]: ---"Objects listed" error: 13814ms (10:01:25.586) Dec 09 10:01:25 crc kubenswrapper[5002]: Trace[1494361159]: [13.814290616s] [13.814290616s] END Dec 09 10:01:25 crc kubenswrapper[5002]: I1209 10:01:25.586839 5002 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 09 10:01:25 crc 
Dec 09 10:01:25 crc kubenswrapper[5002]: E1209 10:01:25.586914 5002 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 09 10:01:25 crc kubenswrapper[5002]: I1209 10:01:25.587679 5002 trace.go:236] Trace[489198973]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Dec-2025 10:01:10.880) (total time: 14707ms): Dec 09 10:01:25 crc kubenswrapper[5002]: Trace[489198973]: ---"Objects listed" error: 14706ms (10:01:25.587) Dec 09 10:01:25 crc kubenswrapper[5002]: Trace[489198973]: [14.707092871s] [14.707092871s] END Dec 09 10:01:25 crc kubenswrapper[5002]: I1209 10:01:25.587705 5002 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 09 10:01:25 crc kubenswrapper[5002]: I1209 10:01:25.588519 5002 trace.go:236] Trace[11452535]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Dec-2025 10:01:11.033) (total time: 14555ms): Dec 09 10:01:25 crc kubenswrapper[5002]: Trace[11452535]: ---"Objects listed" error: 14555ms (10:01:25.588) Dec 09 10:01:25 crc kubenswrapper[5002]: Trace[11452535]: [14.555428827s] [14.555428827s] END Dec 09 10:01:25 crc kubenswrapper[5002]: I1209 10:01:25.588562 5002 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 09 10:01:25 crc kubenswrapper[5002]: I1209 10:01:25.590344 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 10:01:25 crc kubenswrapper[5002]: I1209 10:01:25.591912 5002 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 09 10:01:25 crc kubenswrapper[5002]: I1209 10:01:25.912031 5002 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:33422->192.168.126.11:17697: read: connection reset by peer" start-of-body= Dec 09 10:01:25 crc kubenswrapper[5002]: I1209 10:01:25.912077 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:33422->192.168.126.11:17697: read: connection reset by peer" Dec 09 10:01:25 crc kubenswrapper[5002]: I1209 10:01:25.953871 5002 apiserver.go:52] "Watching apiserver" Dec 09 10:01:25 crc kubenswrapper[5002]: I1209 10:01:25.956666 5002 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 09 10:01:25 crc kubenswrapper[5002]: I1209 10:01:25.956935 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Dec 09 10:01:25 crc kubenswrapper[5002]: I1209 10:01:25.957273 5002 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 09 10:01:25 crc kubenswrapper[5002]: I1209 10:01:25.957356 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:01:25 crc kubenswrapper[5002]: I1209 10:01:25.957349 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:01:25 crc kubenswrapper[5002]: I1209 10:01:25.957409 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 09 10:01:25 crc kubenswrapper[5002]: I1209 10:01:25.957495 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 09 10:01:25 crc kubenswrapper[5002]: I1209 10:01:25.957622 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:01:25 crc kubenswrapper[5002]: E1209 10:01:25.957799 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:01:25 crc kubenswrapper[5002]: E1209 10:01:25.957859 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:01:25 crc kubenswrapper[5002]: E1209 10:01:25.957961 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:01:25 crc kubenswrapper[5002]: I1209 10:01:25.959643 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 09 10:01:25 crc kubenswrapper[5002]: I1209 10:01:25.959657 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 09 10:01:25 crc kubenswrapper[5002]: I1209 10:01:25.960038 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 09 10:01:25 crc kubenswrapper[5002]: I1209 10:01:25.960063 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 09 10:01:25 crc kubenswrapper[5002]: I1209 10:01:25.960236 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 09 10:01:25 crc kubenswrapper[5002]: I1209 10:01:25.960093 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 09 10:01:25 crc kubenswrapper[5002]: I1209 10:01:25.960114 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 09 10:01:25 crc kubenswrapper[5002]: I1209 10:01:25.960199 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 09 10:01:25 crc kubenswrapper[5002]: I1209 10:01:25.960201 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 09 10:01:25 crc kubenswrapper[5002]: I1209 10:01:25.984486 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
Dec 09 10:01:25 crc kubenswrapper[5002]: I1209 10:01:25.984486 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 09 10:01:25 crc kubenswrapper[5002]: I1209 10:01:25.993480 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
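Each of the status updates in this stretch fails the same way: before accepting the patch, the API server must call the pod.network-node-identity.openshift.io admission webhook at https://127.0.0.1:9743/pod, and nothing is listening there yet. A quick probe, as a sketch, reproduces the dial error seen in the entries:

```go
// webhookprobe.go - sketch: dial the webhook endpoint named in the errors
// above to confirm the "connect: connection refused" failure mode.
package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	conn, err := net.DialTimeout("tcp", "127.0.0.1:9743", 2*time.Second)
	if err != nil {
		// While the network-node-identity webhook is down, this prints the
		// same "dial tcp 127.0.0.1:9743: connect: connection refused".
		fmt.Println("webhook unreachable:", err)
		return
	}
	conn.Close()
	fmt.Println("webhook port is accepting connections")
}
```

The circularity is visible in the entries that follow: even the webhook's own pod (network-node-identity-vrzqb) cannot have its status patched until the webhook answers.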
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.007726 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.016347 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.022857 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.029994 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
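The err= payloads in these entries are ordinary strategic-merge status patches, just hard to read through two layers of quoting; the recurring lastState.terminated with exitCode 137 and reason ContainerStatusUnknown appears to be the kubelet's stand-in for containers that vanished across the restart rather than an observed kill. A small sketch that decodes one such patch and prints the conditions it carries (the struct fields are assumed from the payloads above, not from any published schema):

```go
// patchpeek.go - sketch: decode a status patch like the ones quoted above,
// shown here with the journald/klog escaping already stripped.
package main

import (
	"encoding/json"
	"fmt"
)

type statusPatch struct {
	Metadata struct {
		UID string `json:"uid"`
	} `json:"metadata"`
	Status struct {
		Conditions []struct {
			Type   string `json:"type"`
			Status string `json:"status"`
			Reason string `json:"reason"`
		} `json:"conditions"`
	} `json:"status"`
}

func main() {
	raw := `{"metadata":{"uid":"3b6479f0-333b-4a96-9adf-2099afdc2447"},` +
		`"status":{"conditions":[{"type":"PodReadyToStartContainers","status":"False"},` +
		`{"type":"ContainersReady","status":"False","reason":"ContainersNotReady"}]}}`
	var p statusPatch
	if err := json.Unmarshal([]byte(raw), &p); err != nil {
		panic(err)
	}
	for _, c := range p.Status.Conditions {
		fmt.Printf("pod %s: %s=%s (%s)\n", p.Metadata.UID, c.Type, c.Status, c.Reason)
	}
}
```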
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.037688 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.055374 5002 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.095559 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.095617 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.095651 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.095681 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.095746 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.095779 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.095809 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
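From "Finished populating initial desired state of world" onward, the volume manager's reconciler begins tearing down volumes that belong to pods no longer in the desired state, one "UnmountVolume started" entry per volume. A sketch that tallies those entries per pod UID when fed the journal on stdin (the regexp mirrors the escaped message format above and is approximate):

```go
// unmounttally.go - sketch: count "UnmountVolume started" entries per pod
// UID from a journal dump on stdin; escaping is assumed as in the log above.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// Matches: UnmountVolume started for volume \"NAME\" ... pod \"UID\"
var started = regexp.MustCompile(
	`UnmountVolume started for volume \\"([^\\"]+)\\".* pod \\"([0-9a-f-]+)\\"`)

func main() {
	perPod := map[string]int{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines run long
	for sc.Scan() {
		if m := started.FindStringSubmatch(sc.Text()); m != nil {
			perPod[m[2]]++ // m[1] holds the volume name if per-volume detail is wanted
		}
	}
	for uid, n := range perPod {
		fmt.Printf("%s: %d volumes unmounting\n", uid, n)
	}
}
```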
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.095870 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.095902 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.095931 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.095962 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.095995 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.096026 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.096058 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.096088 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.096119 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.096149 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.096202 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.096248 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.096294 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.096339 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.096385 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.096430 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.096475 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.096565 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.096688 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.096735 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.096777 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.096806 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.096922 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.096966 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.097014 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.097100 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.097148 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.097175 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.097221 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.097204 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.097320 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.097327 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.097364 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.097405 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.097437 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.097474 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.097485 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.097642 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.097650 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.097681 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.097714 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.097743 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.097775 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.097808 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.097868 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.097900 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
"operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.097972 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.097996 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.098019 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.098042 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.098066 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.098094 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.098127 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.098158 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.098182 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.098209 5002 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.098277 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.098301 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.098351 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.098374 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.098407 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.098440 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.098465 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.098491 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.098521 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.098556 5002 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.098585 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.098607 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.098629 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.098651 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.098679 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.098701 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.098721 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.098746 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.098768 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 09 10:01:26 crc 
kubenswrapper[5002]: I1209 10:01:26.098788 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.098808 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.098856 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.098878 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.098901 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.098923 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.098951 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.098976 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.099003 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.099024 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 09 10:01:26 crc 
kubenswrapper[5002]: I1209 10:01:26.099046 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.099069 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.099092 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.099114 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.099168 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.099190 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.099215 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.099237 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.099257 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.099279 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod 
\"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.099303 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.099323 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.099345 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.099372 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.099393 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.099414 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.099438 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.099459 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.099480 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.099501 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.099524 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.099546 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.099568 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.099593 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.099615 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.099671 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.099694 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.099717 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.099742 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.099767 5002 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.099790 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.099962 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.099989 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.100012 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.100034 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.100057 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.100080 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.100109 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.100143 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 
10:01:26.100179 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.100204 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.100230 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.100255 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.100277 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.100301 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.100326 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.100348 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.100373 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.100403 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 09 10:01:26 crc 
kubenswrapper[5002]: I1209 10:01:26.100436 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.100460 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.100483 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.100508 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.100536 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.100560 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.100585 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.100608 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.100632 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.100656 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod 
\"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.100678 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.100702 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.100752 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.100778 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.100803 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.100849 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.100898 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.100926 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.100955 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.100984 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.101012 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.101040 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.101066 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.101092 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.101121 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.101145 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.101168 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.101202 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.101260 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.101286 5002 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.101309 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.101335 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.101359 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.101384 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.101407 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.101432 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.101456 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.101480 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.101510 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 09 10:01:26 crc 
kubenswrapper[5002]: I1209 10:01:26.101534 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.101556 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.101581 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.101606 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.101628 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.101654 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.101679 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.101703 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.101728 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.101752 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.101774 5002 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.101799 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.101853 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.101888 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.101922 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.101954 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.101980 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.102004 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.102028 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.102093 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:01:26 crc 
kubenswrapper[5002]: I1209 10:01:26.102123 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.102150 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.102176 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.102202 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.102230 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.102258 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.102285 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.102313 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.102339 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.102365 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.102390 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.102418 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.102444 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.102510 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.102526 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.102544 5002 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.102559 5002 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.102573 5002 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.107427 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.108617 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.097738 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.098152 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.115385 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.098382 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.098493 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.098638 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.098909 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.099049 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.099176 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.099503 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.100105 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.100145 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.100432 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.100629 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.100679 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.100699 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.101092 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.101192 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.102170 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: E1209 10:01:26.102671 5002 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.102716 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.102987 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.103009 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.103077 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.103306 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.103358 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.104131 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.104183 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.104219 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.104331 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.104639 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.105563 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.105961 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.106426 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.106575 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.106458 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.106791 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.106883 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). 
InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.107329 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.107597 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.107592 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.107790 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.107894 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.108297 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.108397 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.108448 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.108742 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.109174 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.109310 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.109647 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.109662 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.109706 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.109803 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.110005 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.110032 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.110170 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.110303 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.110350 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.110593 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.111432 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.111555 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.111577 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.111861 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.111991 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.112730 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.112763 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.113299 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.113355 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: E1209 10:01:26.113364 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-09 10:01:26.613336389 +0000 UTC m=+19.005387490 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.113666 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.114185 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.114386 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.114566 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.114610 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.114779 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.115508 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.115640 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.115829 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.115860 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.116378 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.113886 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.116460 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: E1209 10:01:26.117023 5002 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.116456 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.117364 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.117433 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.117851 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.118031 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.118637 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.118666 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.117519 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.118962 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.119095 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.119113 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.119150 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.119406 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.119426 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.119684 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). 
InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.119730 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.119177 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.120127 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.122801 5002 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.119741 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.119949 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.119738 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.119740 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.121137 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: E1209 10:01:26.121157 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 10:01:26.620552689 +0000 UTC m=+19.012603810 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 10:01:26 crc kubenswrapper[5002]: E1209 10:01:26.131719 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 10:01:26.631693492 +0000 UTC m=+19.023744593 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.131846 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.121310 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.121038 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.123426 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.122903 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.123557 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.123942 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.124070 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.124201 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.124225 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.124842 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.125136 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.125179 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.125356 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.125352 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.125484 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.125488 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.126159 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). 
InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.126702 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.127107 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.127224 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.127333 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.127697 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.127925 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.128413 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.126746 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.128626 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.128726 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.128089 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.128891 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.129344 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.129430 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.129692 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.129869 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.130179 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.130335 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.130477 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.130292 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.129622 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.130788 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.130870 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.130936 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.131310 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.131474 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: E1209 10:01:26.131479 5002 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 10:01:26 crc kubenswrapper[5002]: E1209 10:01:26.132419 5002 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 10:01:26 crc kubenswrapper[5002]: E1209 10:01:26.132439 5002 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 10:01:26 crc kubenswrapper[5002]: E1209 10:01:26.132511 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-09 10:01:26.632493423 +0000 UTC m=+19.024544524 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.136247 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.136510 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.137971 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.138365 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.139906 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.140982 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.142134 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.142380 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.142489 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.142511 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.142830 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.142968 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.143246 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.143299 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.143348 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.143366 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.143954 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.144136 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.144344 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.144376 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.144457 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.145054 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.145199 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.145331 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.145637 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.145704 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.146574 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.146723 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.146974 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.147466 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.149276 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.150972 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.151075 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.151489 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.151617 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.151764 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.152536 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.153168 5002 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8" exitCode=255 Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.153247 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8"} Dec 09 10:01:26 crc kubenswrapper[5002]: E1209 10:01:26.153456 5002 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 10:01:26 crc kubenswrapper[5002]: E1209 10:01:26.154114 5002 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 10:01:26 crc kubenswrapper[5002]: E1209 10:01:26.154151 5002 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 10:01:26 crc kubenswrapper[5002]: E1209 10:01:26.154221 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-09 10:01:26.654197273 +0000 UTC m=+19.046248424 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.155609 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.156961 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.157308 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). 
InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.157442 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.157465 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.157668 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.158056 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.159345 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.165447 5002 scope.go:117] "RemoveContainer" containerID="f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.165464 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.169488 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.172647 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.174745 5002 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.174797 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.178621 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.183040 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.187865 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.193134 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.197710 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.203467 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.203555 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.203604 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.203661 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.203674 5002 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.203684 5002 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.203692 5002 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.203700 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.203708 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.203716 5002 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.203741 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.203749 5002 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.203758 5002 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.203766 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.203774 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.203782 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.203799 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.203848 5002 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.203856 5002 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.203864 5002 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.203872 5002 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.203879 5002 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.203895 5002 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.203919 5002 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.203927 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: 
\"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.203935 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.203943 5002 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.203951 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.203959 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.203968 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.203993 5002 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204001 5002 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204010 5002 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204018 5002 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204027 5002 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204035 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204045 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204053 5002 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204077 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204087 5002 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204096 5002 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204104 5002 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204114 5002 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204068 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204124 5002 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204170 5002 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204180 5002 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204190 5002 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204199 5002 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204208 5002 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: 
I1209 10:01:26.204218 5002 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204227 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204236 5002 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204245 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204253 5002 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204261 5002 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204271 5002 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204281 5002 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204290 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204298 5002 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204307 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204315 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204324 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc 
kubenswrapper[5002]: I1209 10:01:26.204333 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204341 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204349 5002 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204357 5002 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204366 5002 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204375 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204384 5002 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204394 5002 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204403 5002 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204411 5002 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204420 5002 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204428 5002 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204436 5002 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" 
DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204443 5002 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204452 5002 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204460 5002 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204469 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204477 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204485 5002 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204493 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204500 5002 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204508 5002 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204516 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204525 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204534 5002 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204543 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: 
\"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204551 5002 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204559 5002 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204792 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204803 5002 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204837 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204847 5002 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204855 5002 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204863 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204871 5002 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204879 5002 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204888 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204897 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204907 5002 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204916 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204923 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204933 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204943 5002 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204952 5002 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204959 5002 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204968 5002 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204976 5002 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204991 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.204999 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205007 5002 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205016 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205024 5002 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205032 5002 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205040 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205049 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205058 5002 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205086 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205102 5002 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205111 5002 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205119 5002 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205127 5002 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205136 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205145 5002 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205153 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205161 5002 reconciler_common.go:293] 
"Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205169 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205178 5002 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205187 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205195 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205204 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205212 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205221 5002 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205228 5002 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205236 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205244 5002 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205252 5002 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205260 5002 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 
10:01:26.205267 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205275 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205283 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205290 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205298 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205306 5002 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205314 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205321 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205329 5002 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205336 5002 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205344 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205351 5002 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205360 5002 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205368 5002 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" 
(UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205375 5002 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205383 5002 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205391 5002 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205401 5002 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205409 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205417 5002 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205481 5002 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205492 5002 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205500 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205508 5002 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205516 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205526 5002 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205535 5002 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205544 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205552 5002 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205560 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205568 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205576 5002 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205584 5002 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205594 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205602 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205610 5002 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205618 5002 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205626 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205634 5002 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205642 5002 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") 
on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205614 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205650 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205772 5002 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.205780 5002 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.206588 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" 
Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.206628 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.206645 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.206660 5002 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.206675 5002 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.206691 5002 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.206709 5002 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.206725 5002 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.206740 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.206760 5002 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.216792 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.227402 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.270045 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.276155 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.282501 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 09 10:01:26 crc kubenswrapper[5002]: W1209 10:01:26.288725 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-6c175ea6e076013a37c2e3ee98ee123b350638b0cd2f76aa9f98f3026788dee2 WatchSource:0}: Error finding container 6c175ea6e076013a37c2e3ee98ee123b350638b0cd2f76aa9f98f3026788dee2: Status 404 returned error can't find the container with id 6c175ea6e076013a37c2e3ee98ee123b350638b0cd2f76aa9f98f3026788dee2 Dec 09 10:01:26 crc kubenswrapper[5002]: W1209 10:01:26.313237 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-4b31bc570ffacb5e590412e9e5e18a591f7c41179d20bfc0a0c36d6b3c8c0767 WatchSource:0}: Error finding container 4b31bc570ffacb5e590412e9e5e18a591f7c41179d20bfc0a0c36d6b3c8c0767: Status 404 returned error can't find the container with id 4b31bc570ffacb5e590412e9e5e18a591f7c41179d20bfc0a0c36d6b3c8c0767 Dec 09 10:01:26 crc kubenswrapper[5002]: W1209 10:01:26.313669 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-6376563a214a2f14e3f0290d4498b537ac76adc89df293417b22cb677a96015b WatchSource:0}: Error finding container 6376563a214a2f14e3f0290d4498b537ac76adc89df293417b22cb677a96015b: Status 404 returned error can't find the container with id 6376563a214a2f14e3f0290d4498b537ac76adc89df293417b22cb677a96015b Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.711196 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.711260 5002 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.711285 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.711302 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:01:26 crc kubenswrapper[5002]: I1209 10:01:26.711320 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:01:26 crc kubenswrapper[5002]: E1209 10:01:26.711399 5002 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 10:01:26 crc kubenswrapper[5002]: E1209 10:01:26.711416 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:01:27.711386318 +0000 UTC m=+20.103437409 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:01:26 crc kubenswrapper[5002]: E1209 10:01:26.711426 5002 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 10:01:26 crc kubenswrapper[5002]: E1209 10:01:26.711457 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 10:01:27.711447419 +0000 UTC m=+20.103498520 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 10:01:26 crc kubenswrapper[5002]: E1209 10:01:26.711458 5002 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 10:01:26 crc kubenswrapper[5002]: E1209 10:01:26.711472 5002 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 10:01:26 crc kubenswrapper[5002]: E1209 10:01:26.711487 5002 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 10:01:26 crc kubenswrapper[5002]: E1209 10:01:26.711498 5002 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 10:01:26 crc kubenswrapper[5002]: E1209 10:01:26.711505 5002 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 10:01:26 crc kubenswrapper[5002]: E1209 10:01:26.711519 5002 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 10:01:26 crc kubenswrapper[5002]: E1209 10:01:26.711542 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-09 10:01:27.711525151 +0000 UTC m=+20.103576322 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 10:01:26 crc kubenswrapper[5002]: E1209 10:01:26.711624 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-09 10:01:27.711609034 +0000 UTC m=+20.103660115 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 10:01:26 crc kubenswrapper[5002]: E1209 10:01:26.711645 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 10:01:27.711639414 +0000 UTC m=+20.103690495 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 10:01:27 crc kubenswrapper[5002]: I1209 10:01:27.059182 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:01:27 crc kubenswrapper[5002]: E1209 10:01:27.059327 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:01:27 crc kubenswrapper[5002]: I1209 10:01:27.059186 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:01:27 crc kubenswrapper[5002]: E1209 10:01:27.059417 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:01:27 crc kubenswrapper[5002]: I1209 10:01:27.157559 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"9ab29aa6d19002efb0309c548e059e004c5002ccde634df95e3c2661a3e81207"} Dec 09 10:01:27 crc kubenswrapper[5002]: I1209 10:01:27.157613 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"bbcda2271939034d8c6c54fa3d648500ebda150fafcce9216338fad68552a65d"} Dec 09 10:01:27 crc kubenswrapper[5002]: I1209 10:01:27.157627 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"6c175ea6e076013a37c2e3ee98ee123b350638b0cd2f76aa9f98f3026788dee2"} Dec 09 10:01:27 crc kubenswrapper[5002]: I1209 10:01:27.159587 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 09 10:01:27 crc kubenswrapper[5002]: I1209 10:01:27.160965 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c49d7b26bbe255f1217808981337d8190bdeac4f5008ee17df5242867e3103e7"} Dec 09 10:01:27 crc kubenswrapper[5002]: I1209 10:01:27.161444 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 10:01:27 crc kubenswrapper[5002]: I1209 10:01:27.161662 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"4b31bc570ffacb5e590412e9e5e18a591f7c41179d20bfc0a0c36d6b3c8c0767"} Dec 09 10:01:27 crc kubenswrapper[5002]: I1209 10:01:27.162516 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"c149ed39076fc7ee5538e60dbc0a8fc303a21578e5cc3ac89a3aeaad2c21c6a5"} Dec 09 10:01:27 crc kubenswrapper[5002]: I1209 10:01:27.162549 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"6376563a214a2f14e3f0290d4498b537ac76adc89df293417b22cb677a96015b"} Dec 09 10:01:27 crc kubenswrapper[5002]: I1209 10:01:27.175044 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:27Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:27 crc kubenswrapper[5002]: I1209 10:01:27.186751 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:27Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:27 crc kubenswrapper[5002]: I1209 10:01:27.205898 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab29aa6d19002efb0309c548e059e004c5002ccde634df95e3c2661a3e81207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcda2271939034d8c6c54fa3d648500ebda150fafcce9216338fad68552a65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:27Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:27 crc kubenswrapper[5002]: I1209 10:01:27.221481 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:27Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:27 crc kubenswrapper[5002]: I1209 10:01:27.233001 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:27Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:27 crc kubenswrapper[5002]: I1209 10:01:27.247037 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:27Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:27 crc kubenswrapper[5002]: I1209 10:01:27.262668 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b484d6dfa4fffc554c21f9d9bc8e77a62e96ebb9426f2bb34dea4b2c9244d680\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://301eb2b68a37707ea483536d639e5f476d3cb41ad8c60fbaea2e019fb51bad48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09
T10:01:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1209 10:01:25.592088 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 10:01:25.592207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 10:01:25.593930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3155801387/tls.crt::/tmp/serving-cert-3155801387/tls.key\\\\\\\"\\\\nI1209 10:01:25.900616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 10:01:25.902593 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 10:01:25.902611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 10:01:25.902628 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 10:01:25.902633 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 10:01:25.906542 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 10:01:25.906611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906618 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 10:01:25.906630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 10:01:25.906634 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 10:01:25.906639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 10:01:25.906563 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 10:01:25.907863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bd67b1b7ad3e21872e9aa91e61c978e0999546882ee2e21347d4f60e435e0b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:27Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:27 crc kubenswrapper[5002]: I1209 10:01:27.280419 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:27Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:27 crc kubenswrapper[5002]: I1209 10:01:27.295125 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b484d6dfa4fffc554c21f9d9bc8e77a62e96ebb9426f2bb34dea4b2c9244d680\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://301eb2b68a37707ea483536d639e5f476d3cb41ad8c60fbaea2e019fb51bad48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49d7b26bbe255f1217808981337d8190bdeac4f5008ee17df5242867e3103e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1209 10:01:25.592088 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 10:01:25.592207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 10:01:25.593930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3155801387/tls.crt::/tmp/serving-cert-3155801387/tls.key\\\\\\\"\\\\nI1209 10:01:25.900616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 10:01:25.902593 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 10:01:25.902611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 10:01:25.902628 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 10:01:25.902633 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 10:01:25.906542 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 10:01:25.906611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906618 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 10:01:25.906630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 10:01:25.906634 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 10:01:25.906639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 10:01:25.906563 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 10:01:25.907863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bd67b1b7ad3e21872e9aa91e61c978e0999546882ee2e21347d4f60e435e0b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:27Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:27 crc kubenswrapper[5002]: I1209 10:01:27.313599 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:27Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:27 crc kubenswrapper[5002]: I1209 10:01:27.328889 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab29aa6d19002efb0309c548e059e004c5002ccde634df95e3c2661a3e81207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcda2271939034d8c6c54fa3d648500ebda150fafcce9216338fad68552a65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:27Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:27 crc kubenswrapper[5002]: I1209 10:01:27.341785 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c149ed39076fc7ee5538e60dbc0a8fc303a21578e5cc3ac89a3aeaad2c21c6a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:27Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:27 crc kubenswrapper[5002]: I1209 10:01:27.353467 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:27Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:27 crc kubenswrapper[5002]: I1209 10:01:27.363888 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:27Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:27 crc kubenswrapper[5002]: I1209 10:01:27.718305 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 10:01:27 crc kubenswrapper[5002]: I1209 10:01:27.718388 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:01:27 crc kubenswrapper[5002]: E1209 10:01:27.718426 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:01:29.718405386 +0000 UTC m=+22.110456467 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:01:27 crc kubenswrapper[5002]: I1209 10:01:27.718455 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:01:27 crc kubenswrapper[5002]: I1209 10:01:27.718486 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:01:27 crc kubenswrapper[5002]: E1209 10:01:27.718507 5002 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 10:01:27 crc kubenswrapper[5002]: E1209 10:01:27.718526 5002 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 10:01:27 crc kubenswrapper[5002]: E1209 10:01:27.718538 5002 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 10:01:27 crc kubenswrapper[5002]: E1209 10:01:27.718562 5002 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 10:01:27 crc kubenswrapper[5002]: E1209 10:01:27.718572 5002 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 10:01:27 crc kubenswrapper[5002]: E1209 10:01:27.718585 5002 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 10:01:27 crc kubenswrapper[5002]: I1209 10:01:27.718512 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:01:27 crc kubenswrapper[5002]: E1209 10:01:27.718621 5002 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-09 10:01:29.718611091 +0000 UTC m=+22.110662172 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 10:01:27 crc kubenswrapper[5002]: E1209 10:01:27.718622 5002 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 10:01:27 crc kubenswrapper[5002]: E1209 10:01:27.718637 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-09 10:01:29.718629922 +0000 UTC m=+22.110681003 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 10:01:27 crc kubenswrapper[5002]: E1209 10:01:27.718663 5002 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 10:01:27 crc kubenswrapper[5002]: E1209 10:01:27.718689 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 10:01:29.718677053 +0000 UTC m=+22.110728134 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 10:01:27 crc kubenswrapper[5002]: E1209 10:01:27.718763 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 10:01:29.718746015 +0000 UTC m=+22.110797096 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.060412 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:01:28 crc kubenswrapper[5002]: E1209 10:01:28.060634 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.064888 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.066244 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.068739 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.070414 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.071674 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.073025 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.074346 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.075538 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.076979 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.078256 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.079576 
5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.080138 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b484d6dfa4fffc554c21f9d9bc8e77a62e96ebb9426f2bb34dea4b2c9244d680\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://301eb2b68a37707ea483536d639e5f476d3cb41ad8c60fbaea2e019fb51bad48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc47
8274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49d7b26bbe255f1217808981337d8190bdeac4f5008ee17df5242867e3103e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1209 10:01:25.592088 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 10:01:25.592207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 10:01:25.593930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3155801387/tls.crt::/tmp/serving-cert-3155801387/tls.key\\\\\\\"\\\\nI1209 10:01:25.900616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 10:01:25.902593 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 10:01:25.902611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 10:01:25.902628 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 10:01:25.902633 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 10:01:25.906542 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 10:01:25.906611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906618 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 10:01:25.906630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 10:01:25.906634 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 10:01:25.906639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 10:01:25.906563 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 10:01:25.907863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bd67b1b7ad3e21872e9aa91e61c978e0999546882ee2e21347d4f60e435e0b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:28Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.083119 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.084277 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.086340 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.087692 5002 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.089243 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.090296 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.091108 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.092309 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.093589 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.094579 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.095712 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.096530 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:28Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.096714 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.098114 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.098945 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.101041 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.102661 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.103860 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.106057 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.107059 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.108730 5002 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.108968 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.112481 5002 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.113512 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.114529 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.118224 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.119187 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab29aa6d19002efb0309c548e059e004c5002ccde634df95e3c2661a3e81207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcda2271939034d8c6c54fa3d648500ebda150fafcce9216338fad68552a65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath
\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:28Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.120620 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.122669 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.125749 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.129168 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.130616 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.135030 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.136628 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.138654 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.139293 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.140076 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.140780 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.141764 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.142432 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.142564 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c149ed39076fc7ee5538e60dbc0a8fc303a21578e5cc3ac89a3aeaad2c21c6a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:28Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.144906 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.146126 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.148344 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.149642 5002 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.150656 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.160125 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:28Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.175129 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:28Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.188132 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:28Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.787156 5002 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.788642 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.788725 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.788741 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.788804 5002 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.797712 5002 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.797976 5002 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.798971 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.799011 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.799023 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.799040 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.799053 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:28Z","lastTransitionTime":"2025-12-09T10:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:28 crc kubenswrapper[5002]: E1209 10:01:28.821719 5002 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb4d43e7-bcbf-4472-90e9-44716d72c15e\\\",\\\"systemUUID\\\":\\\"8af61218-105c-4188-8c40-2d81c3899a86\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:28Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.825136 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.825173 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.825189 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.825208 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.825220 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:28Z","lastTransitionTime":"2025-12-09T10:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:28 crc kubenswrapper[5002]: E1209 10:01:28.838489 5002 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb4d43e7-bcbf-4472-90e9-44716d72c15e\\\",\\\"systemUUID\\\":\\\"8af61218-105c-4188-8c40-2d81c3899a86\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:28Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.842170 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.842202 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.842211 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.842225 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.842238 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:28Z","lastTransitionTime":"2025-12-09T10:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:28 crc kubenswrapper[5002]: E1209 10:01:28.861310 5002 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb4d43e7-bcbf-4472-90e9-44716d72c15e\\\",\\\"systemUUID\\\":\\\"8af61218-105c-4188-8c40-2d81c3899a86\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:28Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.865123 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.865159 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.865168 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.865182 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.865191 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:28Z","lastTransitionTime":"2025-12-09T10:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:28 crc kubenswrapper[5002]: E1209 10:01:28.877329 5002 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 09 10:01:28 crc kubenswrapper[5002]: E1209 10:01:28.877329 5002 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{...}\" for node \"crc\" [status payload identical to the previous attempt; conditions, image list, and nodeInfo elided]: Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:28Z is after 2025-08-24T17:21:41Z"
Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.881306 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.881340 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
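Every failed patch attempt ends identically: the node.network-node-identity.openshift.io webhook at 127.0.0.1:9743 is serving a certificate that expired on 2025-08-24, long before the clock time in the log (2025-12-09). One way to confirm what the webhook actually presents is to dial it and read the certificate's validity window. A minimal sketch, assuming it runs on the node itself since the endpoint is loopback-only:

    package main

    import (
    	"crypto/tls"
    	"fmt"
    	"log"
    	"time"
    )

    func main() {
    	// Dial the webhook endpoint seen in the log and inspect its cert.
    	// InsecureSkipVerify is deliberate: the point is to read an
    	// expired certificate, not to trust it.
    	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
    	if err != nil {
    		log.Fatalf("dial: %v", err)
    	}
    	defer conn.Close()

    	certs := conn.ConnectionState().PeerCertificates
    	if len(certs) == 0 {
    		log.Fatal("no peer certificate presented")
    	}
    	cert := certs[0]
    	fmt.Println("subject:  ", cert.Subject.String())
    	fmt.Println("notBefore:", cert.NotBefore.Format(time.RFC3339))
    	fmt.Println("notAfter: ", cert.NotAfter.Format(time.RFC3339))
    	fmt.Println("expired:  ", time.Now().After(cert.NotAfter))
    }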
event="NodeHasNoDiskPressure" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.881350 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.881366 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.881379 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:28Z","lastTransitionTime":"2025-12-09T10:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:28 crc kubenswrapper[5002]: E1209 10:01:28.896015 5002 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 09 10:01:28 crc kubenswrapper[5002]: E1209 10:01:28.896015 5002 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{...}\" for node \"crc\" [status payload identical to the previous attempts; conditions, image list, and nodeInfo elided]: Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:28Z is after 2025-08-24T17:21:41Z"
Dec 09 10:01:28 crc kubenswrapper[5002]: E1209 10:01:28.896155 5002 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.897983 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.898021 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.898034 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.898050 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 10:01:28 crc kubenswrapper[5002]: I1209 10:01:28.898065 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:28Z","lastTransitionTime":"2025-12-09T10:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.000175 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.000230 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.000246 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.000270 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.000287 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:29Z","lastTransitionTime":"2025-12-09T10:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.059888 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.059920 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 09 10:01:29 crc kubenswrapper[5002]: E1209 10:01:29.060029 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 09 10:01:29 crc kubenswrapper[5002]: E1209 10:01:29.060209 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.102642 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.102688 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.102699 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.102717 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.102728 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:29Z","lastTransitionTime":"2025-12-09T10:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.169873 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"7d199b778c03f10fc3b1a2623600057801f54ee3240768ede1a79213b678fb7b"} Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.190273 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b484d6dfa4fffc554c21f9d9bc8e77a62e96ebb9426f2bb34dea4b2c9244d680\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://301eb2b68a37707ea483536d639e5f476d3cb41ad8c60fbaea2e019fb51bad48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49d7b26bbe255f1217808981337d8190bdeac4f5008ee17df5242867e3103e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1209 10:01:25.592088 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 10:01:25.592207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 10:01:25.593930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3155801387/tls.crt::/tmp/serving-cert-3155801387/tls.key\\\\\\\"\\\\nI1209 10:01:25.900616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 10:01:25.902593 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 10:01:25.902611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 10:01:25.902628 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 10:01:25.902633 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 10:01:25.906542 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 10:01:25.906611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906618 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 10:01:25.906630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 10:01:25.906634 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 10:01:25.906639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 10:01:25.906563 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 10:01:25.907863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bd67b1b7ad3e21872e9aa91e61c978e0999546882ee2e21347d4f60e435e0b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:29Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.205305 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.205346 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.205358 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.205374 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.205385 5002 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:29Z","lastTransitionTime":"2025-12-09T10:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.210860 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:29Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.227748 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab29aa6d19002efb0309c548e059e004c5002ccde634df95e3c2661a3e81207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcda2271939034d8c6c54fa3d648500ebda150fafcce9216338fad68552a65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:29Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.245650 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c149ed39076fc7ee5538e60dbc0a8fc303a21578e5cc3ac89a3aeaad2c21c6a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:29Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.260102 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d199b778c03f10fc3b1a2623600057801f54ee3240768ede1a79213b678fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:29Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.271894 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:29Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.285311 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:29Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.307583 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.307629 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.307639 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.307660 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.307671 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:29Z","lastTransitionTime":"2025-12-09T10:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.410385 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.410427 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.410439 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.410469 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.410480 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:29Z","lastTransitionTime":"2025-12-09T10:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.515798 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.515852 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.515863 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.515877 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.515887 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:29Z","lastTransitionTime":"2025-12-09T10:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.618100 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.618131 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.618140 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.618153 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.618161 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:29Z","lastTransitionTime":"2025-12-09T10:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.721733 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.721780 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.721793 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.721829 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.721841 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:29Z","lastTransitionTime":"2025-12-09T10:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.737116 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.737207 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.737280 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.737315 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:01:29 crc kubenswrapper[5002]: E1209 10:01:29.737334 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:01:33.737309418 +0000 UTC m=+26.129360519 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.737369 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:01:29 crc kubenswrapper[5002]: E1209 10:01:29.737410 5002 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 10:01:29 crc kubenswrapper[5002]: E1209 10:01:29.737421 5002 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 10:01:29 crc kubenswrapper[5002]: E1209 10:01:29.737431 5002 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 10:01:29 crc kubenswrapper[5002]: E1209 10:01:29.737464 5002 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 10:01:29 crc kubenswrapper[5002]: E1209 10:01:29.737481 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 10:01:33.737465542 +0000 UTC m=+26.129516633 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 10:01:29 crc kubenswrapper[5002]: E1209 10:01:29.737484 5002 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 10:01:29 crc kubenswrapper[5002]: E1209 10:01:29.737438 5002 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 10:01:29 crc kubenswrapper[5002]: E1209 10:01:29.737527 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2025-12-09 10:01:33.737517404 +0000 UTC m=+26.129568495 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 10:01:29 crc kubenswrapper[5002]: E1209 10:01:29.737439 5002 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 10:01:29 crc kubenswrapper[5002]: E1209 10:01:29.737537 5002 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 10:01:29 crc kubenswrapper[5002]: E1209 10:01:29.737561 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 10:01:33.737552705 +0000 UTC m=+26.129603806 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 10:01:29 crc kubenswrapper[5002]: E1209 10:01:29.737590 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-09 10:01:33.737573645 +0000 UTC m=+26.129624736 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.824630 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.824673 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.824685 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.824701 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.824713 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:29Z","lastTransitionTime":"2025-12-09T10:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.928138 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.928184 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.928202 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.928224 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:29 crc kubenswrapper[5002]: I1209 10:01:29.928243 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:29Z","lastTransitionTime":"2025-12-09T10:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.030840 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.030872 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.030887 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.030900 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.030910 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:30Z","lastTransitionTime":"2025-12-09T10:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.059630 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:01:30 crc kubenswrapper[5002]: E1209 10:01:30.059775 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.133342 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.133385 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.133396 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.133418 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.133430 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:30Z","lastTransitionTime":"2025-12-09T10:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.235925 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.235982 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.236000 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.236029 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.236049 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:30Z","lastTransitionTime":"2025-12-09T10:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.338124 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.338173 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.338183 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.338200 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.338214 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:30Z","lastTransitionTime":"2025-12-09T10:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.339540 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.353104 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.354257 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab29aa6d19002efb0309c548e059e004c5002ccde634df95e3c2661a3e81207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcda2271939034d8c6c54fa3d648500ebda150fafcce9216338fad68552a65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-12-09T10:01:30Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.354528 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.369461 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c149ed39076fc7ee5538e60dbc0a8fc303a21578e5cc3ac89a3aeaad2c21c6a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:30Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.382780 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d199b778c03f10fc3b1a2623600057801f54ee3240768ede1a79213b678fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:30Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.399276 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:30Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.412078 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b484d6dfa4fffc554c21f9d9bc8e77a62e96ebb9426f2bb34dea4b2c9244d680\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://301eb2b68a37707ea483536d639e5f476d3cb41ad8c60fbaea2e019fb51bad48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49d7b26bbe255f1217808981337d8190bdeac4f5008ee17df5242867e3103e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1209 10:01:25.592088 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 10:01:25.592207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 10:01:25.593930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3155801387/tls.crt::/tmp/serving-cert-3155801387/tls.key\\\\\\\"\\\\nI1209 10:01:25.900616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 10:01:25.902593 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 10:01:25.902611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 10:01:25.902628 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 10:01:25.902633 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 10:01:25.906542 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 10:01:25.906611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906618 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 10:01:25.906630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 10:01:25.906634 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 10:01:25.906639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 10:01:25.906563 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 10:01:25.907863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bd67b1b7ad3e21872e9aa91e61c978e0999546882ee2e21347d4f60e435e0b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:30Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.427843 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:30Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.440197 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.440238 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.440249 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.440265 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.440281 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:30Z","lastTransitionTime":"2025-12-09T10:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.442305 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:30Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.463226 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48170a45-0766-4c86-af19-b829960de244\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3df655bf1ce56b5aef6728759b8b3262260171afd0e4924212afc506fa313e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e29bcc86bc8035d73f3f12857d0024d93752d70d6a77b51d98278981669ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c750385ffbe9096e2cef43dffe002fd49599e44ed69806710b492749637dbe93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73ac21971d31e17f4f76bcbb1e02201b53b12
189815ced304d7913f6aa76f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd16d2319de83381932ca61f8b08dd31ecfde33e270c3ea2ea3edb3c2fa174b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:30Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.478680 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b484d6dfa4fffc554c21f9d9bc8e77a62e96ebb9426f2bb34dea4b2c9244d680\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://301eb2b68a37707ea483536d639e5f476d3cb41ad8c60fbaea2e019fb51bad48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49d7b26bbe255f1217808981337d8190bdeac4f5008ee17df5242867e3103e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1209 10:01:25.592088 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 10:01:25.592207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 10:01:25.593930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3155801387/tls.crt::/tmp/serving-cert-3155801387/tls.key\\\\\\\"\\\\nI1209 10:01:25.900616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 10:01:25.902593 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 10:01:25.902611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 10:01:25.902628 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 10:01:25.902633 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 10:01:25.906542 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 10:01:25.906611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906618 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 10:01:25.906630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 10:01:25.906634 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 10:01:25.906639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 10:01:25.906563 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 10:01:25.907863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bd67b1b7ad3e21872e9aa91e61c978e0999546882ee2e21347d4f60e435e0b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:30Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.497206 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:30Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.512416 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab29aa6d19002efb0309c548e059e004c5002ccde634df95e3c2661a3e81207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcda2271939034d8c6c54fa3d648500ebda150fafcce9216338fad68552a65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:30Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.537435 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c149ed39076fc7ee5538e60dbc0a8fc303a21578e5cc3ac89a3aeaad2c21c6a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:30Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.545742 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.545791 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.545803 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.545835 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.545847 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:30Z","lastTransitionTime":"2025-12-09T10:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.562711 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d199b778c03f10fc3b1a2623600057801f54ee3240768ede1a79213b678fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:30Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.577803 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:30Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.595199 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
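
The terminated-state fragments quoted in these patches ("exitCode":137, reason ContainerStatusUnknown, "The container could not be located when the pod was deleted") describe containers that vanished across the node restart; in these entries the code appears to be synthesized for containers the runtime can no longer locate rather than observed from a live exit. 137 itself is the conventional Unix 128+signal encoding for SIGKILL, as the short Go illustration below decodes; this is the generic convention, not kubelet source code.

    // exitcode.go — why the terminated containers above report exitCode 137:
    // exit codes above 128 conventionally encode 128 + signal number,
    // and 137 - 128 = 9 = SIGKILL.
    package main

    import (
        "fmt"
        "syscall"
    )

    func main() {
        exitCode := 137 // value reported in the status patches above
        if exitCode > 128 {
            sig := syscall.Signal(exitCode - 128) // prints "killed" on Linux
            fmt.Printf("exit code %d => terminated by signal %d (%s)\n",
                exitCode, exitCode-128, sig)
        }
    }
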
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:30Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.627999 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-t5hm5"] Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.628301 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-t5hm5" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.631912 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.631916 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.632047 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.648535 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.648572 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.648580 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.648595 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.648605 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:30Z","lastTransitionTime":"2025-12-09T10:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.649833 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t5hm5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de8c639-f176-405c-ae34-6717f9f9458c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9wwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t5hm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:30Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.665229 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:30Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.680482 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
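
Every one of these patch attempts dies on the same TLS handshake: the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 presents a serving certificate that expired on 2025-08-24T17:21:41Z. A minimal Go sketch for reading the presented certificate's validity window directly from the endpoint follows (it must run on the node itself, since the address is loopback); InsecureSkipVerify is deliberate here so the handshake completes even though verification would fail, letting the dates be read.

    // certdates.go — a minimal sketch for confirming the expiry reported above
    // by dialing the webhook address from the log and printing the serving
    // certificate's NotBefore/NotAfter window.
    package main

    import (
        "crypto/tls"
        "fmt"
    )

    func main() {
        conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{
            InsecureSkipVerify: true, // we only want to *read* the cert dates
        })
        if err != nil {
            fmt.Println("dial failed:", err)
            return
        }
        defer conn.Close()
        for _, cert := range conn.ConnectionState().PeerCertificates {
            fmt.Printf("subject=%s notBefore=%s notAfter=%s\n",
                cert.Subject, cert.NotBefore, cert.NotAfter)
        }
    }
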
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:30Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.697069 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab29aa6d19002efb0309c548e059e004c5002ccde634df95e3c2661a3e81207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcda2271939034d8c6c54fa3d648500ebda150fafcce9216338fad68552a65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:30Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.711041 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c149ed39076fc7ee5538e60dbc0a8fc303a21578e5cc3ac89a3aeaad2c21c6a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:30Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.726567 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d199b778c03f10fc3b1a2623600057801f54ee3240768ede1a79213b678fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:30Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.743087 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:30Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.748797 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9wwj\" (UniqueName: \"kubernetes.io/projected/4de8c639-f176-405c-ae34-6717f9f9458c-kube-api-access-p9wwj\") pod \"node-resolver-t5hm5\" (UID: \"4de8c639-f176-405c-ae34-6717f9f9458c\") " pod="openshift-dns/node-resolver-t5hm5" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.748857 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4de8c639-f176-405c-ae34-6717f9f9458c-hosts-file\") pod \"node-resolver-t5hm5\" (UID: \"4de8c639-f176-405c-ae34-6717f9f9458c\") " pod="openshift-dns/node-resolver-t5hm5" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.750342 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.750378 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.750387 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.750402 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.750411 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:30Z","lastTransitionTime":"2025-12-09T10:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
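
The certificate check that fails is a plain time comparison, and the log states both operands. Reproducing it with the exact timestamps quoted above makes the failure concrete: the certificate has been expired for roughly three and a half months, so every status patch in this section fails deterministically until the certificate is rotated.

    // expired.go — reproduces the comparison behind "current time
    // 2025-12-09T10:01:30Z is after 2025-08-24T17:21:41Z" using the
    // exact timestamps from the log.
    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        now, _ := time.Parse(time.RFC3339, "2025-12-09T10:01:30Z")      // log's "current time"
        notAfter, _ := time.Parse(time.RFC3339, "2025-08-24T17:21:41Z") // cert NotAfter
        if now.After(notAfter) {
            fmt.Printf("certificate expired %s ago; webhook calls fail until it is rotated\n",
                now.Sub(notAfter).Round(time.Hour))
        }
    }
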
Has your network provider started?"} Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.764265 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48170a45-0766-4c86-af19-b829960de244\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3df655bf1ce56b5aef6728759b8b3262260171afd0e4924212afc506fa313e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e29bcc86bc8035d73f3f12857d0024d93752d70d6a77b51d98278981669ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c750385ffbe9096e2cef43dffe002fd49599e44ed69806710b492749637dbe93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73ac21971d31e17f4f76bcbb1e02201b53b12189815ced304d7913f6aa76f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd16d2319de83381932ca61f8b08dd31ecfde33e270c3ea2ea3edb3c2fa174b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:30Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.776177 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b484d6dfa4fffc554c21f9d9bc8e77a62e96ebb9426f2bb34dea4b2c9244d680\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://301eb2b68a37707ea483536d639e5f476d3cb41ad8c60fbaea2e019fb51bad48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49d7b26bbe255f1217808981337d8190bdeac4f5008ee17df5242867e3103e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1209 10:01:25.592088 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 10:01:25.592207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 10:01:25.593930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3155801387/tls.crt::/tmp/serving-cert-3155801387/tls.key\\\\\\\"\\\\nI1209 10:01:25.900616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 10:01:25.902593 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 10:01:25.902611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 10:01:25.902628 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 10:01:25.902633 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 10:01:25.906542 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 10:01:25.906611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906618 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 10:01:25.906630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 10:01:25.906634 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 10:01:25.906639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 10:01:25.906563 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 10:01:25.907863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bd67b1b7ad3e21872e9aa91e61c978e0999546882ee2e21347d4f60e435e0b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:30Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.849675 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9wwj\" (UniqueName: \"kubernetes.io/projected/4de8c639-f176-405c-ae34-6717f9f9458c-kube-api-access-p9wwj\") pod \"node-resolver-t5hm5\" (UID: \"4de8c639-f176-405c-ae34-6717f9f9458c\") " pod="openshift-dns/node-resolver-t5hm5" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.849751 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4de8c639-f176-405c-ae34-6717f9f9458c-hosts-file\") pod \"node-resolver-t5hm5\" (UID: \"4de8c639-f176-405c-ae34-6717f9f9458c\") " pod="openshift-dns/node-resolver-t5hm5" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.849848 5002 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4de8c639-f176-405c-ae34-6717f9f9458c-hosts-file\") pod \"node-resolver-t5hm5\" (UID: \"4de8c639-f176-405c-ae34-6717f9f9458c\") " pod="openshift-dns/node-resolver-t5hm5" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.853284 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.853329 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.853338 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.853354 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.853364 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:30Z","lastTransitionTime":"2025-12-09T10:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.867323 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9wwj\" (UniqueName: \"kubernetes.io/projected/4de8c639-f176-405c-ae34-6717f9f9458c-kube-api-access-p9wwj\") pod \"node-resolver-t5hm5\" (UID: \"4de8c639-f176-405c-ae34-6717f9f9458c\") " pod="openshift-dns/node-resolver-t5hm5" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.939423 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-t5hm5" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.955805 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.955854 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.955865 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.955881 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:30 crc kubenswrapper[5002]: I1209 10:01:30.955893 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:30Z","lastTransitionTime":"2025-12-09T10:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.022893 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-kxpn6"] Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.023210 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.025034 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.025065 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.028286 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.028576 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.028983 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.044160 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48170a45-0766-4c86-af19-b829960de244\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3df655bf1ce56b5aef6728759b8b3262260171afd0e4924212afc506fa313e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e29bcc86bc8035d73f3f12857d0024d93752d70d6a77b51d98278981669ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\
\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c750385ffbe9096e2cef43dffe002fd49599e44ed69806710b492749637dbe93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73ac21971d31e17f4f76bcbb1e02201b53b12189815ced304d7913f6aa76f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd16d2319de83381932ca61f8b08dd31ecfde33e270c3ea2ea3edb3c2fa174b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerI
D\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:31Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.057750 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.057780 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.057790 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.057805 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.057832 5002 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:31Z","lastTransitionTime":"2025-12-09T10:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.059369 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.059449 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.059422 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b484d6dfa4fffc554c21f9d9bc8e77a62e96ebb9426f2bb34dea4b2c9244d680\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://301eb2b68a37707ea483536d639e5f476d3cb41ad8c60fbaea2e019fb51bad48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49d7b26bbe255f1217808981337d8190bdeac4f5008ee17df5242867e3103e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1209 10:01:25.592088 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 10:01:25.592207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 10:01:25.593930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3155801387/tls.crt::/tmp/serving-cert-3155801387/tls.key\\\\\\\"\\\\nI1209 10:01:25.900616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 10:01:25.902593 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 10:01:25.902611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 10:01:25.902628 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 10:01:25.902633 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 10:01:25.906542 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 10:01:25.906611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906618 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 10:01:25.906630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 10:01:25.906634 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 10:01:25.906639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 10:01:25.906563 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 10:01:25.907863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bd67b1b7ad3e21872e9aa91e61c978e0999546882ee2e21347d4f60e435e0b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:31Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:31 crc kubenswrapper[5002]: E1209 10:01:31.059563 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:01:31 crc kubenswrapper[5002]: E1209 10:01:31.059465 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.071361 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:31Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.086932 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab29aa6d19002efb0309c548e059e004c5002ccde634df95e3c2661a3e81207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcda2271939034d8c6c54fa3d648500ebda150fafcce9216338fad68552a65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:31Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.102396 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c149ed39076fc7ee5538e60dbc0a8fc303a21578e5cc3ac89a3aeaad2c21c6a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:31Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.130417 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d199b778c03f10fc3b1a2623600057801f54ee3240768ede1a79213b678fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:31Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.151446 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f49c6392-68b2-4847-9291-a0b4d9c1cbef-rootfs\") pod \"machine-config-daemon-kxpn6\" (UID: \"f49c6392-68b2-4847-9291-a0b4d9c1cbef\") " pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.151483 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f49c6392-68b2-4847-9291-a0b4d9c1cbef-proxy-tls\") pod \"machine-config-daemon-kxpn6\" (UID: \"f49c6392-68b2-4847-9291-a0b4d9c1cbef\") " pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.151499 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f49c6392-68b2-4847-9291-a0b4d9c1cbef-mcd-auth-proxy-config\") pod \"machine-config-daemon-kxpn6\" (UID: \"f49c6392-68b2-4847-9291-a0b4d9c1cbef\") " pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.151516 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sslq7\" 
(UniqueName: \"kubernetes.io/projected/f49c6392-68b2-4847-9291-a0b4d9c1cbef-kube-api-access-sslq7\") pod \"machine-config-daemon-kxpn6\" (UID: \"f49c6392-68b2-4847-9291-a0b4d9c1cbef\") " pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.160052 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.160082 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.160092 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.160109 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.160120 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:31Z","lastTransitionTime":"2025-12-09T10:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.160763 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:31Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.183323 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-t5hm5" event={"ID":"4de8c639-f176-405c-ae34-6717f9f9458c","Type":"ContainerStarted","Data":"097fa8d84999c942645591541e1377c08cdeec593f64525db5a5aa8f7ec8cbae"} Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.183357 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-t5hm5" event={"ID":"4de8c639-f176-405c-ae34-6717f9f9458c","Type":"ContainerStarted","Data":"af15ea70a2601e74bdde48c5d8002769e706d4afacd25b1087295c9e49023f06"} Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.208958 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49c6392-68b2-4847-9291-a0b4d9c1cbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kxpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:31Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.225546 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:31Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.244025 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t5hm5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de8c639-f176-405c-ae34-6717f9f9458c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9wwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t5hm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:31Z is after 2025-08-24T17:21:41Z"
Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.252212 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f49c6392-68b2-4847-9291-a0b4d9c1cbef-rootfs\") pod \"machine-config-daemon-kxpn6\" (UID: \"f49c6392-68b2-4847-9291-a0b4d9c1cbef\") " pod="openshift-machine-config-operator/machine-config-daemon-kxpn6"
Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.252258 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f49c6392-68b2-4847-9291-a0b4d9c1cbef-proxy-tls\") pod \"machine-config-daemon-kxpn6\" (UID: \"f49c6392-68b2-4847-9291-a0b4d9c1cbef\") " pod="openshift-machine-config-operator/machine-config-daemon-kxpn6"
Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.252282 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f49c6392-68b2-4847-9291-a0b4d9c1cbef-mcd-auth-proxy-config\") pod \"machine-config-daemon-kxpn6\" (UID: \"f49c6392-68b2-4847-9291-a0b4d9c1cbef\") " pod="openshift-machine-config-operator/machine-config-daemon-kxpn6"
Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.252303 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sslq7\" (UniqueName: \"kubernetes.io/projected/f49c6392-68b2-4847-9291-a0b4d9c1cbef-kube-api-access-sslq7\") pod \"machine-config-daemon-kxpn6\" (UID: \"f49c6392-68b2-4847-9291-a0b4d9c1cbef\") " pod="openshift-machine-config-operator/machine-config-daemon-kxpn6"
Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.252314 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f49c6392-68b2-4847-9291-a0b4d9c1cbef-rootfs\") pod \"machine-config-daemon-kxpn6\" (UID: \"f49c6392-68b2-4847-9291-a0b4d9c1cbef\") " pod="openshift-machine-config-operator/machine-config-daemon-kxpn6"
Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.253157 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f49c6392-68b2-4847-9291-a0b4d9c1cbef-mcd-auth-proxy-config\") pod \"machine-config-daemon-kxpn6\" (UID: \"f49c6392-68b2-4847-9291-a0b4d9c1cbef\") " pod="openshift-machine-config-operator/machine-config-daemon-kxpn6"
Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.260649 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f49c6392-68b2-4847-9291-a0b4d9c1cbef-proxy-tls\") pod \"machine-config-daemon-kxpn6\" (UID: \"f49c6392-68b2-4847-9291-a0b4d9c1cbef\") " pod="openshift-machine-config-operator/machine-config-daemon-kxpn6"
Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.262503 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.262537 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.262546 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.262560 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.262569 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:31Z","lastTransitionTime":"2025-12-09T10:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.268497 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48170a45-0766-4c86-af19-b829960de244\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3df655bf1ce56b5aef6728759b8b3262260171afd0e4924212afc506fa313e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e29bcc86bc8035d73f3f12857d0024d93752d70d6a77b51d98278981669ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c750385ffbe9096e2cef43dffe002fd49599e44ed69806710b492749637dbe93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73ac21971d31e17f4f76bcbb1e02201b53b12189815ced304d7913f6aa76f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd16d2319de83381932ca61f8b08dd31ecfde33e270c3ea2ea3edb3c2fa174b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:31Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.279309 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sslq7\" (UniqueName: \"kubernetes.io/projected/f49c6392-68b2-4847-9291-a0b4d9c1cbef-kube-api-access-sslq7\") pod \"machine-config-daemon-kxpn6\" (UID: \"f49c6392-68b2-4847-9291-a0b4d9c1cbef\") " pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.283848 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b484d6dfa4fffc554c21f9d9bc8e77a62e96ebb9426f2bb34dea4b2c9244d680\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://301eb2b68a37707ea483536d639e5f476d3cb41ad8c60fbaea2e019fb51bad48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49d7b26bbe255f1217808981337d8190bdeac4f5008ee17df5242867e3103e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1209 10:01:25.592088 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 10:01:25.592207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 10:01:25.593930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3155801387/tls.crt::/tmp/serving-cert-3155801387/tls.key\\\\\\\"\\\\nI1209 10:01:25.900616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 10:01:25.902593 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 10:01:25.902611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 10:01:25.902628 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 10:01:25.902633 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 10:01:25.906542 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 10:01:25.906611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906618 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 10:01:25.906630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 10:01:25.906634 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 10:01:25.906639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 10:01:25.906563 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 10:01:25.907863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bd67b1b7ad3e21872e9aa91e61c978e0999546882ee2e21347d4f60e435e0b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:31Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.298999 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:31Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.312454 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab29aa6d19002efb0309c548e059e004c5002ccde634df95e3c2661a3e81207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcda2271939034d8c6c54fa3d648500ebda150fafcce9216338fad68552a65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:31Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.333576 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.333583 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c149ed39076fc7ee5538e60dbc0a8fc303a21578e5cc3ac89a3aeaad2c21c6a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:31Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:31 crc kubenswrapper[5002]: W1209 10:01:31.349007 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf49c6392_68b2_4847_9291_a0b4d9c1cbef.slice/crio-eaac14bc5521655d768c82e7150d4102fee4e4831d50eb51344948770859252f WatchSource:0}: Error finding container eaac14bc5521655d768c82e7150d4102fee4e4831d50eb51344948770859252f: Status 404 returned error can't find the container with id eaac14bc5521655d768c82e7150d4102fee4e4831d50eb51344948770859252f Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.352684 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d199b778c03f10fc3b1a2623600057801f54ee3240768ede1a79213b678fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:31Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.364795 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.364848 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.364858 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.364905 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.364915 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:31Z","lastTransitionTime":"2025-12-09T10:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.376362 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:31Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.387752 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49c6392-68b2-4847-9291-a0b4d9c1cbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kxpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:31Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.398575 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-zjdhx"] Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.399168 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zjdhx" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.399954 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-rgf44"] Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.399974 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:31Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.400265 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.400500 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.400825 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.401215 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.401719 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.401745 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.403164 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.403253 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.411187 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t5hm5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de8c639-f176-405c-ae34-6717f9f9458c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097fa8d84999c942645591541e1377c08cdeec593f64525db5a5aa8f7ec8cbae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9wwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-09T10:01:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t5hm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:31Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.422100 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab29aa6d19002efb0309c548e059e004c5002ccde634df95e3c2661a3e81207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcda2271939034d8c6c54fa3d648500ebda150fafcce9216338fad68552a65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:31Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.434744 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c149ed39076fc7ee5538e60dbc0a8fc303a21578e5cc3ac89a3aeaad2c21c6a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:31Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.452112 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d199b778c03f10fc3b1a2623600057801f54ee3240768ede1a79213b678fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:31Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.465383 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:31Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.466662 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.466686 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.466696 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.466712 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.466723 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:31Z","lastTransitionTime":"2025-12-09T10:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.480585 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rgf44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ed6e93-eda5-4648-b185-25d2960ce0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh57p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\
\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rgf44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:31Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.499585 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48170a45-0766-4c86-af19-b829960de244\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3df655bf1ce56b5aef6728759b8b3262260171afd0e4924212afc506fa313e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e29bcc86bc8035d73f3f12857d0024d93752d70d6a77b51d98278981669ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c750385ffbe9096e2cef43dffe002fd49599e44ed69806710b492749637dbe93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73ac21971d31e17f4f76bcbb1e02201b53b12189815ced304d7913f6aa76f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd16d2319de83381932ca61f8b08dd31ecfde33e270c3ea2ea3edb3c2fa174b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:31Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.512878 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b484d6dfa4fffc554c21f9d9bc8e77a62e96ebb9426f2bb34dea4b2c9244d680\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://301eb2b68a37707ea483536d639e5f476d3cb41ad8c60fbaea2e019fb51bad48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49d7b26bbe255f1217808981337d8190bdeac4f5008ee17df5242867e3103e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1209 10:01:25.592088 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 10:01:25.592207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 10:01:25.593930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3155801387/tls.crt::/tmp/serving-cert-3155801387/tls.key\\\\\\\"\\\\nI1209 10:01:25.900616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 10:01:25.902593 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 10:01:25.902611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 10:01:25.902628 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 10:01:25.902633 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 10:01:25.906542 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 10:01:25.906611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906618 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 10:01:25.906630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 10:01:25.906634 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 10:01:25.906639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 10:01:25.906563 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 10:01:25.907863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bd67b1b7ad3e21872e9aa91e61c978e0999546882ee2e21347d4f60e435e0b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:31Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.524542 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:31Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.535698 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49c6392-68b2-4847-9291-a0b4d9c1cbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kxpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:31Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.554623 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/28ed6e93-eda5-4648-b185-25d2960ce0f0-multus-daemon-config\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") " pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.554678 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/28ed6e93-eda5-4648-b185-25d2960ce0f0-host-var-lib-kubelet\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") " pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.554795 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh57p\" (UniqueName: \"kubernetes.io/projected/28ed6e93-eda5-4648-b185-25d2960ce0f0-kube-api-access-sh57p\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") " pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.554863 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/9281be7a-3f7d-4b00-a3ff-f5b9236dd74a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zjdhx\" (UID: \"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\") " pod="openshift-multus/multus-additional-cni-plugins-zjdhx" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.554913 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/28ed6e93-eda5-4648-b185-25d2960ce0f0-host-run-multus-certs\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") " pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.554935 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/28ed6e93-eda5-4648-b185-25d2960ce0f0-host-var-lib-cni-multus\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") " pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.554967 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/28ed6e93-eda5-4648-b185-25d2960ce0f0-etc-kubernetes\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") " pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.555022 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/28ed6e93-eda5-4648-b185-25d2960ce0f0-host-run-netns\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") " pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.555054 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctz8l\" (UniqueName: \"kubernetes.io/projected/9281be7a-3f7d-4b00-a3ff-f5b9236dd74a-kube-api-access-ctz8l\") pod \"multus-additional-cni-plugins-zjdhx\" (UID: \"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\") " pod="openshift-multus/multus-additional-cni-plugins-zjdhx" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.555090 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/28ed6e93-eda5-4648-b185-25d2960ce0f0-multus-socket-dir-parent\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") " pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.555118 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9281be7a-3f7d-4b00-a3ff-f5b9236dd74a-system-cni-dir\") pod \"multus-additional-cni-plugins-zjdhx\" (UID: \"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\") " pod="openshift-multus/multus-additional-cni-plugins-zjdhx" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.555140 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9281be7a-3f7d-4b00-a3ff-f5b9236dd74a-os-release\") pod \"multus-additional-cni-plugins-zjdhx\" (UID: \"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\") " pod="openshift-multus/multus-additional-cni-plugins-zjdhx" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 
10:01:31.555165 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/28ed6e93-eda5-4648-b185-25d2960ce0f0-cnibin\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") " pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.555186 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9281be7a-3f7d-4b00-a3ff-f5b9236dd74a-cni-binary-copy\") pod \"multus-additional-cni-plugins-zjdhx\" (UID: \"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\") " pod="openshift-multus/multus-additional-cni-plugins-zjdhx" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.555208 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/28ed6e93-eda5-4648-b185-25d2960ce0f0-system-cni-dir\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") " pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.555229 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/28ed6e93-eda5-4648-b185-25d2960ce0f0-host-run-k8s-cni-cncf-io\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") " pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.555250 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/28ed6e93-eda5-4648-b185-25d2960ce0f0-hostroot\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") " pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.555271 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9281be7a-3f7d-4b00-a3ff-f5b9236dd74a-cnibin\") pod \"multus-additional-cni-plugins-zjdhx\" (UID: \"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\") " pod="openshift-multus/multus-additional-cni-plugins-zjdhx" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.555296 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/28ed6e93-eda5-4648-b185-25d2960ce0f0-host-var-lib-cni-bin\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") " pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.555318 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/28ed6e93-eda5-4648-b185-25d2960ce0f0-multus-cni-dir\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") " pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.555336 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/28ed6e93-eda5-4648-b185-25d2960ce0f0-cni-binary-copy\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") " pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.555355 5002 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9281be7a-3f7d-4b00-a3ff-f5b9236dd74a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zjdhx\" (UID: \"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\") " pod="openshift-multus/multus-additional-cni-plugins-zjdhx" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.555404 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/28ed6e93-eda5-4648-b185-25d2960ce0f0-os-release\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") " pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.555437 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/28ed6e93-eda5-4648-b185-25d2960ce0f0-multus-conf-dir\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") " pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.556573 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zjdhx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zjdhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:31Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.568645 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.568683 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.568692 5002 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.568707 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.568716 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:31Z","lastTransitionTime":"2025-12-09T10:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.570531 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:31Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.580353 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t5hm5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de8c639-f176-405c-ae34-6717f9f9458c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097fa8d84999c942645591541e1377c08cdeec593f64525db5a5aa8f7ec8cbae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9wwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t5hm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-09T10:01:31Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.655864 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/28ed6e93-eda5-4648-b185-25d2960ce0f0-host-var-lib-cni-bin\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") " pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.655908 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/28ed6e93-eda5-4648-b185-25d2960ce0f0-multus-cni-dir\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") " pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.655928 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9281be7a-3f7d-4b00-a3ff-f5b9236dd74a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zjdhx\" (UID: \"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\") " pod="openshift-multus/multus-additional-cni-plugins-zjdhx" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.655949 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/28ed6e93-eda5-4648-b185-25d2960ce0f0-cni-binary-copy\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") " pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.655967 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/28ed6e93-eda5-4648-b185-25d2960ce0f0-os-release\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") " pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.655989 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/28ed6e93-eda5-4648-b185-25d2960ce0f0-multus-conf-dir\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") " pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.656008 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/28ed6e93-eda5-4648-b185-25d2960ce0f0-multus-daemon-config\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") " pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.656029 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/28ed6e93-eda5-4648-b185-25d2960ce0f0-host-var-lib-kubelet\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") " pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.656036 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/28ed6e93-eda5-4648-b185-25d2960ce0f0-host-var-lib-cni-bin\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") " pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.656075 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/28ed6e93-eda5-4648-b185-25d2960ce0f0-multus-conf-dir\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") " pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.656056 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9281be7a-3f7d-4b00-a3ff-f5b9236dd74a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zjdhx\" (UID: \"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\") " pod="openshift-multus/multus-additional-cni-plugins-zjdhx" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.656123 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/28ed6e93-eda5-4648-b185-25d2960ce0f0-multus-cni-dir\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") " pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.656147 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/28ed6e93-eda5-4648-b185-25d2960ce0f0-host-run-multus-certs\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") " pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.656126 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/28ed6e93-eda5-4648-b185-25d2960ce0f0-host-var-lib-kubelet\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") " pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.656175 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh57p\" (UniqueName: \"kubernetes.io/projected/28ed6e93-eda5-4648-b185-25d2960ce0f0-kube-api-access-sh57p\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") " pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.656200 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/28ed6e93-eda5-4648-b185-25d2960ce0f0-etc-kubernetes\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") " pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.656196 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/28ed6e93-eda5-4648-b185-25d2960ce0f0-host-run-multus-certs\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") " pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.656234 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/28ed6e93-eda5-4648-b185-25d2960ce0f0-host-var-lib-cni-multus\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") " pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.656262 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/28ed6e93-eda5-4648-b185-25d2960ce0f0-etc-kubernetes\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") 
" pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.656298 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/28ed6e93-eda5-4648-b185-25d2960ce0f0-host-run-netns\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") " pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.656328 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/28ed6e93-eda5-4648-b185-25d2960ce0f0-host-run-netns\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") " pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.656299 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/28ed6e93-eda5-4648-b185-25d2960ce0f0-host-var-lib-cni-multus\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") " pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.656326 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctz8l\" (UniqueName: \"kubernetes.io/projected/9281be7a-3f7d-4b00-a3ff-f5b9236dd74a-kube-api-access-ctz8l\") pod \"multus-additional-cni-plugins-zjdhx\" (UID: \"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\") " pod="openshift-multus/multus-additional-cni-plugins-zjdhx" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.656401 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9281be7a-3f7d-4b00-a3ff-f5b9236dd74a-system-cni-dir\") pod \"multus-additional-cni-plugins-zjdhx\" (UID: \"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\") " pod="openshift-multus/multus-additional-cni-plugins-zjdhx" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.656433 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9281be7a-3f7d-4b00-a3ff-f5b9236dd74a-os-release\") pod \"multus-additional-cni-plugins-zjdhx\" (UID: \"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\") " pod="openshift-multus/multus-additional-cni-plugins-zjdhx" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.656461 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/28ed6e93-eda5-4648-b185-25d2960ce0f0-cnibin\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") " pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.656469 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9281be7a-3f7d-4b00-a3ff-f5b9236dd74a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zjdhx\" (UID: \"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\") " pod="openshift-multus/multus-additional-cni-plugins-zjdhx" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.656485 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/28ed6e93-eda5-4648-b185-25d2960ce0f0-multus-socket-dir-parent\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") " pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 
10:01:31.656520 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/28ed6e93-eda5-4648-b185-25d2960ce0f0-system-cni-dir\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") " pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.656530 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/28ed6e93-eda5-4648-b185-25d2960ce0f0-multus-socket-dir-parent\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") " pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.656546 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/28ed6e93-eda5-4648-b185-25d2960ce0f0-host-run-k8s-cni-cncf-io\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") " pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.656564 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9281be7a-3f7d-4b00-a3ff-f5b9236dd74a-system-cni-dir\") pod \"multus-additional-cni-plugins-zjdhx\" (UID: \"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\") " pod="openshift-multus/multus-additional-cni-plugins-zjdhx" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.656568 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/28ed6e93-eda5-4648-b185-25d2960ce0f0-hostroot\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") " pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.656592 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/28ed6e93-eda5-4648-b185-25d2960ce0f0-hostroot\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") " pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.656602 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9281be7a-3f7d-4b00-a3ff-f5b9236dd74a-cnibin\") pod \"multus-additional-cni-plugins-zjdhx\" (UID: \"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\") " pod="openshift-multus/multus-additional-cni-plugins-zjdhx" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.656625 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/28ed6e93-eda5-4648-b185-25d2960ce0f0-host-run-k8s-cni-cncf-io\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") " pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.656622 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9281be7a-3f7d-4b00-a3ff-f5b9236dd74a-cni-binary-copy\") pod \"multus-additional-cni-plugins-zjdhx\" (UID: \"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\") " pod="openshift-multus/multus-additional-cni-plugins-zjdhx" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.656699 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/28ed6e93-eda5-4648-b185-25d2960ce0f0-cnibin\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") " pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.656700 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/28ed6e93-eda5-4648-b185-25d2960ce0f0-system-cni-dir\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") " pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.656714 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9281be7a-3f7d-4b00-a3ff-f5b9236dd74a-os-release\") pod \"multus-additional-cni-plugins-zjdhx\" (UID: \"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\") " pod="openshift-multus/multus-additional-cni-plugins-zjdhx" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.656724 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9281be7a-3f7d-4b00-a3ff-f5b9236dd74a-cnibin\") pod \"multus-additional-cni-plugins-zjdhx\" (UID: \"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\") " pod="openshift-multus/multus-additional-cni-plugins-zjdhx" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.656722 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9281be7a-3f7d-4b00-a3ff-f5b9236dd74a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zjdhx\" (UID: \"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\") " pod="openshift-multus/multus-additional-cni-plugins-zjdhx" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.656784 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/28ed6e93-eda5-4648-b185-25d2960ce0f0-cni-binary-copy\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") " pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.656968 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/28ed6e93-eda5-4648-b185-25d2960ce0f0-multus-daemon-config\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") " pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.657099 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/28ed6e93-eda5-4648-b185-25d2960ce0f0-os-release\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") " pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.657257 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9281be7a-3f7d-4b00-a3ff-f5b9236dd74a-cni-binary-copy\") pod \"multus-additional-cni-plugins-zjdhx\" (UID: \"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\") " pod="openshift-multus/multus-additional-cni-plugins-zjdhx" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.671328 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.671377 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.671390 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.671407 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.671418 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:31Z","lastTransitionTime":"2025-12-09T10:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.678557 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctz8l\" (UniqueName: \"kubernetes.io/projected/9281be7a-3f7d-4b00-a3ff-f5b9236dd74a-kube-api-access-ctz8l\") pod \"multus-additional-cni-plugins-zjdhx\" (UID: \"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\") " pod="openshift-multus/multus-additional-cni-plugins-zjdhx" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.691383 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh57p\" (UniqueName: \"kubernetes.io/projected/28ed6e93-eda5-4648-b185-25d2960ce0f0-kube-api-access-sh57p\") pod \"multus-rgf44\" (UID: \"28ed6e93-eda5-4648-b185-25d2960ce0f0\") " pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.711156 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zjdhx" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.717240 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-rgf44" Dec 09 10:01:31 crc kubenswrapper[5002]: W1209 10:01:31.721274 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9281be7a_3f7d_4b00_a3ff_f5b9236dd74a.slice/crio-da9ed0a6a726b837afdd15b97658ba400a6f7ee51c416fd442c6e33baf54439b WatchSource:0}: Error finding container da9ed0a6a726b837afdd15b97658ba400a6f7ee51c416fd442c6e33baf54439b: Status 404 returned error can't find the container with id da9ed0a6a726b837afdd15b97658ba400a6f7ee51c416fd442c6e33baf54439b Dec 09 10:01:31 crc kubenswrapper[5002]: W1209 10:01:31.731531 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28ed6e93_eda5_4648_b185_25d2960ce0f0.slice/crio-78e9cf2be55449597881ca64212d4e563a47072618c117b8d13f1c46b4b23e91 WatchSource:0}: Error finding container 78e9cf2be55449597881ca64212d4e563a47072618c117b8d13f1c46b4b23e91: Status 404 returned error can't find the container with id 78e9cf2be55449597881ca64212d4e563a47072618c117b8d13f1c46b4b23e91 Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.759868 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-2mnnl"] Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.763195 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.764947 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.765199 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.765283 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.765405 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.765637 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.767743 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.770108 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.775384 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.775423 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.775435 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.775455 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.775469 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:31Z","lastTransitionTime":"2025-12-09T10:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.785844 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:31Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.799902 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49c6392-68b2-4847-9291-a0b4d9c1cbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kxpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:31Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.827524 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zjdhx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plu
gin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zjdhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-09T10:01:31Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.846281 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:31Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.859231 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t5hm5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de8c639-f176-405c-ae34-6717f9f9458c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097fa8d84999c942645591541e1377c08cdeec593f64525db5a5aa8f7ec8cbae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9wwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t5hm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:31Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.859655 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-host-cni-netd\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.859694 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-systemd-units\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.859715 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-run-systemd\") pod 
\"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.859755 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-host-kubelet\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.859773 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnh82\" (UniqueName: \"kubernetes.io/projected/7013527e-73de-4427-af9c-e33663b1c222-kube-api-access-fnh82\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.860016 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-host-run-netns\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.860089 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-run-ovn\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.860117 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7013527e-73de-4427-af9c-e33663b1c222-env-overrides\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.860160 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-run-openvswitch\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.860207 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-node-log\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.860229 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-var-lib-openvswitch\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.860259 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-host-slash\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.860284 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-host-run-ovn-kubernetes\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.860340 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-etc-openvswitch\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.860393 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7013527e-73de-4427-af9c-e33663b1c222-ovnkube-config\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.860433 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-log-socket\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.860456 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7013527e-73de-4427-af9c-e33663b1c222-ovn-node-metrics-cert\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.860493 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-host-cni-bin\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.860540 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.860567 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7013527e-73de-4427-af9c-e33663b1c222-ovnkube-script-lib\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 
10:01:31.873685 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab29aa6d19002efb0309c548e059e004c5002ccde634df95e3c2661a3e81207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcda2271939034d8c6c54fa3d648500ebda150fafcce9216338fad68552a65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:31Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.883015 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.883058 5002 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.883066 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.883081 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.883092 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:31Z","lastTransitionTime":"2025-12-09T10:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.891365 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c149ed39076fc7ee5538e60dbc0a8fc303a21578e5cc3ac89a3aeaad2c21c6a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:31Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.909404 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d199b778c03f10fc3b1a2623600057801f54ee3240768ede1a79213b678fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:31Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.920327 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:31Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.933580 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rgf44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ed6e93-eda5-4648-b185-25d2960ce0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh57p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rgf44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:31Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.951345 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48170a45-0766-4c86-af19-b829960de244\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3df655bf1ce56b5aef6728759b8b3262260171afd0e4924212afc506fa313e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e29bcc86bc8035d73f3f12857d0024d93752d70d6a77b51d98278981669ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c750385ffbe9096e2cef43dffe002fd49599e44ed69806710b492749637dbe93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73ac21971d31e17f4f76bcbb1e02201b53b12
189815ced304d7913f6aa76f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd16d2319de83381932ca61f8b08dd31ecfde33e270c3ea2ea3edb3c2fa174b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:31Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.961517 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-host-run-netns\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.961567 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-run-ovn\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.961589 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7013527e-73de-4427-af9c-e33663b1c222-env-overrides\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.961616 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-run-openvswitch\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.961642 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-node-log\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.961672 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-var-lib-openvswitch\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.961695 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-host-slash\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.961718 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-host-run-ovn-kubernetes\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.961730 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-run-openvswitch\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.961774 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-var-lib-openvswitch\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.961738 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-etc-openvswitch\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.962014 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7013527e-73de-4427-af9c-e33663b1c222-ovnkube-config\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.962058 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-log-socket\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.962087 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7013527e-73de-4427-af9c-e33663b1c222-ovn-node-metrics-cert\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.961722 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-host-run-netns\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.962152 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-host-cni-bin\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.961836 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-host-run-ovn-kubernetes\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.962181 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-log-socket\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.962123 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-host-cni-bin\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.961852 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-node-log\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.961779 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-etc-openvswitch\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.961802 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-run-ovn\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.962241 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.962317 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7013527e-73de-4427-af9c-e33663b1c222-ovnkube-script-lib\") pod \"ovnkube-node-2mnnl\" 
(UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.962398 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-host-cni-netd\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.962429 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-systemd-units\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.962458 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-run-systemd\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.962462 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7013527e-73de-4427-af9c-e33663b1c222-env-overrides\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.962513 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-host-kubelet\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.962551 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnh82\" (UniqueName: \"kubernetes.io/projected/7013527e-73de-4427-af9c-e33663b1c222-kube-api-access-fnh82\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.962260 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.962660 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-host-cni-netd\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.962693 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-systemd-units\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.962727 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-run-systemd\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.962759 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-host-kubelet\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.961787 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-host-slash\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.962928 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7013527e-73de-4427-af9c-e33663b1c222-ovnkube-config\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.963118 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7013527e-73de-4427-af9c-e33663b1c222-ovnkube-script-lib\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.966632 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b484d6dfa4fffc554c21f9d9bc8e77a62e96ebb9426f2bb34dea4b2c9244d680\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://301eb2b68a37707ea483536d639e5f476d3cb41ad8c60fbaea2e019fb51bad48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49d7b26bbe255f1217808981337d8190bdeac4f5008ee17df5242867e3103e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1209 10:01:25.592088 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 10:01:25.592207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 10:01:25.593930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3155801387/tls.crt::/tmp/serving-cert-3155801387/tls.key\\\\\\\"\\\\nI1209 10:01:25.900616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 10:01:25.902593 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 10:01:25.902611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 10:01:25.902628 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 10:01:25.902633 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 10:01:25.906542 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 10:01:25.906611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906618 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 10:01:25.906630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 10:01:25.906634 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 10:01:25.906639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 10:01:25.906563 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 10:01:25.907863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bd67b1b7ad3e21872e9aa91e61c978e0999546882ee2e21347d4f60e435e0b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:31Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.968611 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7013527e-73de-4427-af9c-e33663b1c222-ovn-node-metrics-cert\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.979791 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnh82\" (UniqueName: \"kubernetes.io/projected/7013527e-73de-4427-af9c-e33663b1c222-kube-api-access-fnh82\") pod \"ovnkube-node-2mnnl\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.985733 5002 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.985788 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.985843 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.985867 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.985888 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:31Z","lastTransitionTime":"2025-12-09T10:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:31 crc kubenswrapper[5002]: I1209 10:01:31.989321 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7013527e-73de-4427-af9c-e33663b1c222\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2mnnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:31Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.059614 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:01:32 crc kubenswrapper[5002]: E1209 10:01:32.059858 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.089046 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.089095 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.089107 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.089124 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.089137 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:32Z","lastTransitionTime":"2025-12-09T10:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.185094 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zjdhx" event={"ID":"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a","Type":"ContainerStarted","Data":"da9ed0a6a726b837afdd15b97658ba400a6f7ee51c416fd442c6e33baf54439b"} Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.186273 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerStarted","Data":"71a019843745b7d5198565771c39f7949cf45738e236a2283588db2e57d07f01"} Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.186294 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerStarted","Data":"44da8a223abf131b459b827b0e8de65b415150f406fe22f2efb7e160cba4166c"} Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.186303 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerStarted","Data":"eaac14bc5521655d768c82e7150d4102fee4e4831d50eb51344948770859252f"} Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.187083 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rgf44" event={"ID":"28ed6e93-eda5-4648-b185-25d2960ce0f0","Type":"ContainerStarted","Data":"78e9cf2be55449597881ca64212d4e563a47072618c117b8d13f1c46b4b23e91"} Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.191021 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.191057 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.191068 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.191095 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.191107 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:32Z","lastTransitionTime":"2025-12-09T10:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.206724 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7013527e-73de-4427-af9c-e33663b1c222\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2mnnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:32Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.220453 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:32Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.231526 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49c6392-68b2-4847-9291-a0b4d9c1cbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a019843745b7d5198565771c39f7949cf45738e236a2283588db2e57d07f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44da8a223abf131b459b827b0e8de65b415150f406fe22f2efb7e160cba4166c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kxpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:32Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.243418 5002 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:32Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.253701 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t5hm5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de8c639-f176-405c-ae34-6717f9f9458c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097fa8d84999c942645591541e1377c08cdeec593f64525db5a5aa8f7ec8cbae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9wwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t5hm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:32Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.268708 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zjdhx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plu
gin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zjdhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-09T10:01:32Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.280793 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rgf44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ed6e93-eda5-4648-b185-25d2960ce0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh57p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}
],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rgf44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:32Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.295012 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.295054 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.295065 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.295082 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.295092 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:32Z","lastTransitionTime":"2025-12-09T10:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.301511 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48170a45-0766-4c86-af19-b829960de244\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3df655bf1ce56b5aef6728759b8b3262260171afd0e4924212afc506fa313e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e29bcc86bc8035d73f3f12857d0024d93752d70d6a77b51d98278981669ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c750385ffbe9096e2cef43dffe002fd49599e44ed69806710b492749637dbe93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73ac21971d31e17f4f76bcbb1e02201b53b12
189815ced304d7913f6aa76f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd16d2319de83381932ca61f8b08dd31ecfde33e270c3ea2ea3edb3c2fa174b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:32Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.315740 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b484d6dfa4fffc554c21f9d9bc8e77a62e96ebb9426f2bb34dea4b2c9244d680\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://301eb2b68a37707ea483536d639e5f476d3cb41ad8c60fbaea2e019fb51bad48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49d7b26bbe255f1217808981337d8190bdeac4f5008ee17df5242867e3103e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1209 10:01:25.592088 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 10:01:25.592207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 10:01:25.593930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3155801387/tls.crt::/tmp/serving-cert-3155801387/tls.key\\\\\\\"\\\\nI1209 10:01:25.900616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 10:01:25.902593 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 10:01:25.902611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 10:01:25.902628 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 10:01:25.902633 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 10:01:25.906542 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 10:01:25.906611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906618 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 10:01:25.906630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 10:01:25.906634 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 10:01:25.906639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 10:01:25.906563 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 10:01:25.907863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bd67b1b7ad3e21872e9aa91e61c978e0999546882ee2e21347d4f60e435e0b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:32Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.329384 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab29aa6d19002efb0309c548e059e004c5002ccde634df95e3c2661a3e81207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcda2271939034d8c6c54fa3d648500ebda150fafcce9216338fad68552a65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:32Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.343374 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.349961 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c149ed39076fc7ee5538e60dbc0a8fc303a21578e5cc3ac89a3aeaad2c21c6a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:32Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.363569 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d199b778c03f10fc3b1a2623600057801f54ee3240768ede1a79213b678fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:32Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.375083 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:32Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.400871 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.400916 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.400924 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.400937 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.400945 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:32Z","lastTransitionTime":"2025-12-09T10:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.504061 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.504093 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.504102 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.504114 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.504125 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:32Z","lastTransitionTime":"2025-12-09T10:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.606674 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.606747 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.606760 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.606776 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.606789 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:32Z","lastTransitionTime":"2025-12-09T10:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.708532 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.708574 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.708586 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.708603 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.708616 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:32Z","lastTransitionTime":"2025-12-09T10:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.811413 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.811680 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.811752 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.811843 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.811921 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:32Z","lastTransitionTime":"2025-12-09T10:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.914200 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.914238 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.914247 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.914260 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:32 crc kubenswrapper[5002]: I1209 10:01:32.914269 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:32Z","lastTransitionTime":"2025-12-09T10:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.016767 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.016800 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.016829 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.016846 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.016858 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:33Z","lastTransitionTime":"2025-12-09T10:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.060201 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.060214 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:01:33 crc kubenswrapper[5002]: E1209 10:01:33.060354 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:01:33 crc kubenswrapper[5002]: E1209 10:01:33.060486 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.118962 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.119008 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.119020 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.119036 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.119046 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:33Z","lastTransitionTime":"2025-12-09T10:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.177790 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.181329 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.186117 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.195620 5002 generic.go:334] "Generic (PLEG): container finished" podID="7013527e-73de-4427-af9c-e33663b1c222" containerID="39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794" exitCode=0 Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.195688 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" event={"ID":"7013527e-73de-4427-af9c-e33663b1c222","Type":"ContainerDied","Data":"39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794"} Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.195735 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" event={"ID":"7013527e-73de-4427-af9c-e33663b1c222","Type":"ContainerStarted","Data":"b9e28e18876930927475b2285b8109c84f6a30fe113b3aeb6a3cbf15eaaaf7bc"} Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.200408 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rgf44" 
event={"ID":"28ed6e93-eda5-4648-b185-25d2960ce0f0","Type":"ContainerStarted","Data":"541795eab5847f1d993b8f9f324b454b951ed3930155e455f013e8da805c019b"} Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.202980 5002 generic.go:334] "Generic (PLEG): container finished" podID="9281be7a-3f7d-4b00-a3ff-f5b9236dd74a" containerID="cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b" exitCode=0 Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.203078 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zjdhx" event={"ID":"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a","Type":"ContainerDied","Data":"cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b"} Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.206011 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:33Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:33 crc kubenswrapper[5002]: E1209 10:01:33.210456 5002 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.224750 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49c6392-68b2-4847-9291-a0b4d9c1cbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a019843745b7d5198565771c39f7949cf45738e236a2283588db2e57d07f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44da8a223abf131b459b827b0e8de65b415150f406fe22f2efb7e160cba4166c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea1772
25c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kxpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:33Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.226030 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.226063 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.226074 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.226091 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.226103 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:33Z","lastTransitionTime":"2025-12-09T10:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.240076 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:33Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.250651 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t5hm5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de8c639-f176-405c-ae34-6717f9f9458c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097fa8d84999c942645591541e1377c08cdeec593f64525db5a5aa8f7ec8cbae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9wwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t5hm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:33Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.262538 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zjdhx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plu
gin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zjdhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-09T10:01:33Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.275860 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c149ed39076fc7ee5538e60dbc0a8fc303a21578e5cc3ac89a3aeaad2c21c6a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:33Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.287601 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d199b778c03f10fc3b1a2623600057801f54ee3240768ede1a79213b678fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:33Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.301798 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:33Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.314338 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rgf44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ed6e93-eda5-4648-b185-25d2960ce0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh57p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rgf44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:33Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.329743 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.329801 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.329846 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.329875 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.329894 5002 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:33Z","lastTransitionTime":"2025-12-09T10:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.332751 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48170a45-0766-4c86-af19-b829960de244\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3df655bf1ce56b5aef6728759b8b3262260171afd0e4924212afc506fa313e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e29bcc86bc8035d73f3f12857d0024d93752d70d6a77b51d98278981669ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c750385ffbe9096e2cef43dffe002fd49599e44ed69806710b492749637dbe93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73ac21971d31e17f4f76bcbb1e02201b53b12189815ced304d7913f6aa76f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd16d2319de83381932ca61f8b08dd31ecfde33e270c3ea2ea3edb3c2fa174b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:33Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.345670 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b484d6dfa4fffc554c21f9d9bc8e77a62e96ebb9426f2bb34dea4b2c9244d680\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://301eb2b68a37707ea483536d639e5f476d3cb41ad8c60fbaea2e019fb51bad48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49d7b26bbe255f1217808981337d8190bdeac4f5008ee17df5242867e3103e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1209 10:01:25.592088 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 10:01:25.592207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 10:01:25.593930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3155801387/tls.crt::/tmp/serving-cert-3155801387/tls.key\\\\\\\"\\\\nI1209 10:01:25.900616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 10:01:25.902593 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 10:01:25.902611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 10:01:25.902628 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 10:01:25.902633 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 10:01:25.906542 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 10:01:25.906611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906618 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 10:01:25.906630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 10:01:25.906634 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 10:01:25.906639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 10:01:25.906563 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 10:01:25.907863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bd67b1b7ad3e21872e9aa91e61c978e0999546882ee2e21347d4f60e435e0b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:33Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.360475 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab29aa6d19002efb0309c548e059e004c5002ccde634df95e3c2661a3e81207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcda2271939034d8c6c54fa3d648500ebda150fafcce9216338fad68552a65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:33Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.385082 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7013527e-73de-4427-af9c-e33663b1c222\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2mnnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:33Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.400723 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:33Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.410689 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t5hm5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de8c639-f176-405c-ae34-6717f9f9458c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097fa8d84999c942645591541e1377c08cdeec593f64525db5a5aa8f7ec8cbae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9wwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t5hm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-09T10:01:33Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.425667 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zjdhx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zjdhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:33Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.432498 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.432521 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.432529 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.432542 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.432550 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:33Z","lastTransitionTime":"2025-12-09T10:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.436567 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d199b778c03f10fc3b1a2623600057801f54ee3240768ede1a79213b678fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:33Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.449191 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:33Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.460964 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rgf44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ed6e93-eda5-4648-b185-25d2960ce0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541795eab5847f1d993b8f9f324b454b951ed3930155e455f013e8da805c019b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"syste
m-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh57p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rgf44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:33Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.473015 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bc64859-6675-4dc6-b0a1-579abb87580e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0db847425b24ea6804034220f2050b153b78d21bc1cc934dad6784c11c68dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a41e95e38748ef89ff0bc6429eb223b7821756bf1e0c84a3af512f4f0166a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea3ea4cb1e3f00acc4ef769928988a0a2c2ee54afa0ab5f040ef50f465a9d6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1c1315eade2f326ac5feefc45cbcec29c7ee59fb40494f5153b7f8dbdfc404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:33Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.490388 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48170a45-0766-4c86-af19-b829960de244\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3df655bf1ce56b5aef6728759b8b3262260171afd0e4924212afc506fa313e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e29bcc86bc8035d73f3f12857d0024d93752d70d6a77b51d98278981669ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07
b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c750385ffbe9096e2cef43dffe002fd49599e44ed69806710b492749637dbe93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73ac21971d31e17f4f76bcbb1e02201b53b12189815ced304d7913f6aa76f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd16d2319de83381932ca61f8b08dd31ecfde33e270c3ea2ea3edb3c2fa174b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:33Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.502989 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b484d6dfa4fffc554c21f9d9bc8e77a62e96ebb9426f2bb34dea4b2c9244d680\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://301eb2b68a37707ea483536d639e5f476d3cb41ad8c60fbaea2e019fb51bad48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49d7b26bbe255f1217808981337d8190bdeac4f5008ee17df5242867e3103e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1209 10:01:25.592088 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 10:01:25.592207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 10:01:25.593930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3155801387/tls.crt::/tmp/serving-cert-3155801387/tls.key\\\\\\\"\\\\nI1209 10:01:25.900616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 10:01:25.902593 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 10:01:25.902611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 10:01:25.902628 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 10:01:25.902633 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 10:01:25.906542 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 10:01:25.906611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906618 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 10:01:25.906630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 10:01:25.906634 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 10:01:25.906639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 10:01:25.906563 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 10:01:25.907863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bd67b1b7ad3e21872e9aa91e61c978e0999546882ee2e21347d4f60e435e0b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:33Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.515071 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab29aa6d19002efb0309c548e059e004c5002ccde634df95e3c2661a3e81207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcda2271939034d8c6c54fa3d648500ebda150fafcce9216338fad68552a65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:33Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.527745 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c149ed39076fc7ee5538e60dbc0a8fc303a21578e5cc3ac89a3aeaad2c21c6a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:33Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.534780 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.534829 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.534838 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.534852 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.534862 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:33Z","lastTransitionTime":"2025-12-09T10:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.544169 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7013527e-73de-4427-af9c-e33663b1c222\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d
93ebbc7bc3d794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2mnnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:33Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.556274 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:33Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.568453 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49c6392-68b2-4847-9291-a0b4d9c1cbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a019843745b7d5198565771c39f7949cf45738e236a2283588db2e57d07f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44da8a223abf131b459b827b0e8de65b415150f406fe22f2efb7e160cba4166c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kxpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:33Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.599262 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-p7t88"] Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.599689 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-p7t88" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.601531 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.602270 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.602388 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.602498 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.618118 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7013527e-73de-4427-af9c-e33663b1c222\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni
/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2mnnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not 
yet valid: current time 2025-12-09T10:01:33Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.633577 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:33Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.636595 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.636630 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.636640 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.636654 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.636664 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:33Z","lastTransitionTime":"2025-12-09T10:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.646070 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49c6392-68b2-4847-9291-a0b4d9c1cbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a019843745b7d5198565771c39f7949cf45738e236a2283588db2e57d07f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44da8a223abf131b459b827b0e8de65b415150f406fe22f2efb7e160cba4166c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kxpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:33Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.656395 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7t88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edef2440-1e4a-4676-9517-08b21b3b66ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzs5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7t88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:33Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.669516 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:33Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.678891 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t5hm5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de8c639-f176-405c-ae34-6717f9f9458c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097fa8d84999c942645591541e1377c08cdeec593f64525db5a5aa8f7ec8cbae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9wwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t5hm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:33Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.681028 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/edef2440-1e4a-4676-9517-08b21b3b66ca-host\") pod \"node-ca-p7t88\" (UID: \"edef2440-1e4a-4676-9517-08b21b3b66ca\") " pod="openshift-image-registry/node-ca-p7t88" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.681096 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/edef2440-1e4a-4676-9517-08b21b3b66ca-serviceca\") pod \"node-ca-p7t88\" (UID: \"edef2440-1e4a-4676-9517-08b21b3b66ca\") " pod="openshift-image-registry/node-ca-p7t88" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.681127 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzs5x\" (UniqueName: \"kubernetes.io/projected/edef2440-1e4a-4676-9517-08b21b3b66ca-kube-api-access-mzs5x\") pod \"node-ca-p7t88\" (UID: 
\"edef2440-1e4a-4676-9517-08b21b3b66ca\") " pod="openshift-image-registry/node-ca-p7t88" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.692507 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zjdhx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"
name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zjdhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:33Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.707878 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bc64859-6675-4dc6-b0a1-579abb87580e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0db847425b24ea6804034220f2050b153b78d21bc1cc934dad6784c11c68dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a41e95e38748ef89ff0bc6429eb223b7821756bf1e0c84a3af512f4f0166a98\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea3ea4cb1e3f00acc4ef769928988a0a2c2ee54afa0ab5f040ef50f465a9d6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1c1315eade2f326ac5feefc45cbcec29c7ee59fb40494f5153b7f8dbdfc404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:33Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.738362 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.738409 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.738421 5002 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.738435 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.738447 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:33Z","lastTransitionTime":"2025-12-09T10:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.762463 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48170a45-0766-4c86-af19-b829960de244\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3df655bf1ce56b5aef6728759b8b3262260171afd0e4924212afc506fa313e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e29bcc86bc8035d73f3f12857d0024d93752d70d6a77b51d98278981669ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c750385ffbe9096e2cef43dffe002fd49599e44ed69806710b492749637dbe93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73ac21971d31e17f4f76bcbb1e02201b53b12189815ced304d7913f6aa76f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd16d2319de83381932ca61f8b08dd31ecfde33e270c3ea2ea3edb3c2fa174b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"nam
e\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:33Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.782121 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 10:01:33 crc kubenswrapper[5002]: E1209 10:01:33.782307 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:01:41.782279162 +0000 UTC m=+34.174330253 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.782548 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.782610 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.782636 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.782660 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/edef2440-1e4a-4676-9517-08b21b3b66ca-host\") pod \"node-ca-p7t88\" (UID: \"edef2440-1e4a-4676-9517-08b21b3b66ca\") " pod="openshift-image-registry/node-ca-p7t88" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.782686 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.782724 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/edef2440-1e4a-4676-9517-08b21b3b66ca-serviceca\") pod \"node-ca-p7t88\" (UID: \"edef2440-1e4a-4676-9517-08b21b3b66ca\") " pod="openshift-image-registry/node-ca-p7t88" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.782768 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzs5x\" (UniqueName: \"kubernetes.io/projected/edef2440-1e4a-4676-9517-08b21b3b66ca-kube-api-access-mzs5x\") pod \"node-ca-p7t88\" (UID: \"edef2440-1e4a-4676-9517-08b21b3b66ca\") " pod="openshift-image-registry/node-ca-p7t88" Dec 09 10:01:33 crc kubenswrapper[5002]: E1209 10:01:33.782926 5002 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 
10:01:33 crc kubenswrapper[5002]: E1209 10:01:33.782989 5002 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 10:01:33 crc kubenswrapper[5002]: E1209 10:01:33.783021 5002 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 10:01:33 crc kubenswrapper[5002]: E1209 10:01:33.783035 5002 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.782962 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/edef2440-1e4a-4676-9517-08b21b3b66ca-host\") pod \"node-ca-p7t88\" (UID: \"edef2440-1e4a-4676-9517-08b21b3b66ca\") " pod="openshift-image-registry/node-ca-p7t88" Dec 09 10:01:33 crc kubenswrapper[5002]: E1209 10:01:33.783081 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-09 10:01:41.783065203 +0000 UTC m=+34.175116284 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 10:01:33 crc kubenswrapper[5002]: E1209 10:01:33.782933 5002 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 10:01:33 crc kubenswrapper[5002]: E1209 10:01:33.782944 5002 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 10:01:33 crc kubenswrapper[5002]: E1209 10:01:33.783131 5002 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 10:01:33 crc kubenswrapper[5002]: E1209 10:01:33.783149 5002 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 10:01:33 crc kubenswrapper[5002]: E1209 10:01:33.783179 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 10:01:41.783167465 +0000 UTC m=+34.175218626 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 10:01:33 crc kubenswrapper[5002]: E1209 10:01:33.783227 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-09 10:01:41.783218297 +0000 UTC m=+34.175269438 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 10:01:33 crc kubenswrapper[5002]: E1209 10:01:33.783381 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 10:01:41.783367791 +0000 UTC m=+34.175418872 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.784178 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/edef2440-1e4a-4676-9517-08b21b3b66ca-serviceca\") pod \"node-ca-p7t88\" (UID: \"edef2440-1e4a-4676-9517-08b21b3b66ca\") " pod="openshift-image-registry/node-ca-p7t88" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.803445 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b484d6dfa4fffc554c21f9d9bc8e77a62e96ebb9426f2bb34dea4b2c9244d680\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://301eb2b68a37707ea483536d639e5f476d3cb41ad8c60fbaea2e019fb51bad48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49d7b26bbe255f1217808981337d8190bdeac4f5008ee17df5242867e3103e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1209 10:01:25.592088 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 10:01:25.592207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 10:01:25.593930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3155801387/tls.crt::/tmp/serving-cert-3155801387/tls.key\\\\\\\"\\\\nI1209 10:01:25.900616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 10:01:25.902593 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 10:01:25.902611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 10:01:25.902628 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 10:01:25.902633 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 10:01:25.906542 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 10:01:25.906611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906618 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 10:01:25.906630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 10:01:25.906634 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 10:01:25.906639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 10:01:25.906563 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 10:01:25.907863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bd67b1b7ad3e21872e9aa91e61c978e0999546882ee2e21347d4f60e435e0b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:33Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.820151 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzs5x\" (UniqueName: \"kubernetes.io/projected/edef2440-1e4a-4676-9517-08b21b3b66ca-kube-api-access-mzs5x\") pod \"node-ca-p7t88\" (UID: \"edef2440-1e4a-4676-9517-08b21b3b66ca\") " pod="openshift-image-registry/node-ca-p7t88" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.840704 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.840753 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.840762 5002 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.840776 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.840785 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:33Z","lastTransitionTime":"2025-12-09T10:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.847914 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab29aa6d19002efb0309c548e059e004c5002ccde634df95e3c2661a3e81207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcda2271939034d8c6c54fa3d648500ebda150fafcce9216338fad68552a65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:33Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.890063 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c149ed39076fc7ee5538e60dbc0a8fc303a21578e5cc3ac89a3aeaad2c21c6a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:33Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.929457 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d199b778c03f10fc3b1a2623600057801f54ee3240768ede1a79213b678fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:33Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.940900 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-p7t88" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.942489 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.942519 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.942528 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.942541 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.942549 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:33Z","lastTransitionTime":"2025-12-09T10:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:33 crc kubenswrapper[5002]: W1209 10:01:33.953098 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedef2440_1e4a_4676_9517_08b21b3b66ca.slice/crio-b91acaf2f0827328126d1f6009bc0862d023de7122c3c2f77083f407d565c68a WatchSource:0}: Error finding container b91acaf2f0827328126d1f6009bc0862d023de7122c3c2f77083f407d565c68a: Status 404 returned error can't find the container with id b91acaf2f0827328126d1f6009bc0862d023de7122c3c2f77083f407d565c68a Dec 09 10:01:33 crc kubenswrapper[5002]: I1209 10:01:33.970056 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:33Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.009393 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rgf44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ed6e93-eda5-4648-b185-25d2960ce0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541795eab5847f1d993b8f9f324b454b951ed3930155e455f013e8da805c019b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh57p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rgf44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:34Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.044408 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.044448 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.044459 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.044473 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.044486 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:34Z","lastTransitionTime":"2025-12-09T10:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.059801 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:01:34 crc kubenswrapper[5002]: E1209 10:01:34.059938 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.147070 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.147101 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.147110 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.147123 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.147132 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:34Z","lastTransitionTime":"2025-12-09T10:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.208928 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" event={"ID":"7013527e-73de-4427-af9c-e33663b1c222","Type":"ContainerStarted","Data":"685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f"} Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.208967 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" event={"ID":"7013527e-73de-4427-af9c-e33663b1c222","Type":"ContainerStarted","Data":"e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d"} Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.208977 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" event={"ID":"7013527e-73de-4427-af9c-e33663b1c222","Type":"ContainerStarted","Data":"3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13"} Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.208987 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" event={"ID":"7013527e-73de-4427-af9c-e33663b1c222","Type":"ContainerStarted","Data":"7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20"} Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.208995 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" event={"ID":"7013527e-73de-4427-af9c-e33663b1c222","Type":"ContainerStarted","Data":"8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7"} Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.209003 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" event={"ID":"7013527e-73de-4427-af9c-e33663b1c222","Type":"ContainerStarted","Data":"dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4"} Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.210170 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-p7t88" event={"ID":"edef2440-1e4a-4676-9517-08b21b3b66ca","Type":"ContainerStarted","Data":"cc19a171af6f6875b4d953edd75048a1249b44348ba03757126ebe943c118be7"} Dec 09 10:01:34 crc 
kubenswrapper[5002]: I1209 10:01:34.210187 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-p7t88" event={"ID":"edef2440-1e4a-4676-9517-08b21b3b66ca","Type":"ContainerStarted","Data":"b91acaf2f0827328126d1f6009bc0862d023de7122c3c2f77083f407d565c68a"} Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.212030 5002 generic.go:334] "Generic (PLEG): container finished" podID="9281be7a-3f7d-4b00-a3ff-f5b9236dd74a" containerID="d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643" exitCode=0 Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.212680 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zjdhx" event={"ID":"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a","Type":"ContainerDied","Data":"d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643"} Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.223839 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:34Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.235296 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49c6392-68b2-4847-9291-a0b4d9c1cbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a019843745b7d5198565771c39f7949cf45738e236a2283588db2e57d07f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44da8a223abf131b459b827b0e8de65b415150f406fe22f2efb7e160cba4166c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kxpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:34Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.244866 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7t88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edef2440-1e4a-4676-9517-08b21b3b66ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc19a171af6f6875b4d953edd75048a1249b44348ba03757126ebe943c118be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzs5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7t88\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:34Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.249134 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.249182 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.249195 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.249212 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.249225 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:34Z","lastTransitionTime":"2025-12-09T10:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.256785 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:34Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.269628 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t5hm5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de8c639-f176-405c-ae34-6717f9f9458c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097fa8d84999c942645591541e1377c08cdeec593f64525db5a5aa8f7ec8cbae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9wwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t5hm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-09T10:01:34Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.283280 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zjdhx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zjdhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:34Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.296612 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:34Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.329653 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rgf44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ed6e93-eda5-4648-b185-25d2960ce0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541795eab5847f1d993b8f9f324b454b951ed3930155e455f013e8da805c019b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh57p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rgf44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:34Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.353067 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.353103 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.353111 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.353124 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.353149 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:34Z","lastTransitionTime":"2025-12-09T10:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.369803 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bc64859-6675-4dc6-b0a1-579abb87580e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0db847425b24ea6804034220f2050b153b78d21bc1cc934dad6784c11c68dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a41e95e38748ef89ff0bc6429eb223b7821756bf1e0c84a3af512f4f0166a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea3ea4cb1e3f00acc4ef769928988a0a2c2ee54afa0ab5f040ef50f465a9d6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1c1315eade2f326ac5feefc45cbcec29c7ee59fb40494f5153b7f8dbdfc404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:34Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.415081 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48170a45-0766-4c86-af19-b829960de244\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3df655bf1ce56b5aef6728759b8b3262260171afd0e4924212afc506fa313e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e29bcc86bc8035d73f3f12857d0024d93752d70d6a77b51d98278981669ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c750385ffbe9096e2cef43dffe002fd49599e44ed69806710b492749637dbe93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73ac21971d31e17f4f76bcbb1e02201b53b12
189815ced304d7913f6aa76f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd16d2319de83381932ca61f8b08dd31ecfde33e270c3ea2ea3edb3c2fa174b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:34Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.450708 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b484d6dfa4fffc554c21f9d9bc8e77a62e96ebb9426f2bb34dea4b2c9244d680\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://301eb2b68a37707ea483536d639e5f476d3cb41ad8c60fbaea2e019fb51bad48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49d7b26bbe255f1217808981337d8190bdeac4f5008ee17df5242867e3103e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1209 10:01:25.592088 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 10:01:25.592207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 10:01:25.593930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3155801387/tls.crt::/tmp/serving-cert-3155801387/tls.key\\\\\\\"\\\\nI1209 10:01:25.900616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 10:01:25.902593 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 10:01:25.902611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 10:01:25.902628 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 10:01:25.902633 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 10:01:25.906542 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 10:01:25.906611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906618 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 10:01:25.906630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 10:01:25.906634 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 10:01:25.906639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 10:01:25.906563 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 10:01:25.907863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bd67b1b7ad3e21872e9aa91e61c978e0999546882ee2e21347d4f60e435e0b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:34Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.455677 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.455707 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.455717 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.455730 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.455740 5002 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:34Z","lastTransitionTime":"2025-12-09T10:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.489730 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab29aa6d19002efb0309c548e059e004c5002ccde634df95e3c2661a3e81207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcda2271939034d8c6c54fa3d648500ebda150fafcce9216338fad68552a65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:34Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.534716 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c149ed39076fc7ee5538e60dbc0a8fc303a21578e5cc3ac89a3aeaad2c21c6a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:34Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.558410 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.558452 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.558464 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.558480 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.558493 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:34Z","lastTransitionTime":"2025-12-09T10:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.575618 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d199b778c03f10fc3b1a2623600057801f54ee3240768ede1a79213b678fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:34Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.628900 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7013527e-73de-4427-af9c-e33663b1c222\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2mnnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:34Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.655564 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:34Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.662103 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.662371 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.662391 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.662416 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.662432 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:34Z","lastTransitionTime":"2025-12-09T10:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.691678 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t5hm5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de8c639-f176-405c-ae34-6717f9f9458c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097fa8d84999c942645591541e1377c08cdeec593f64525db5a5aa8f7ec8cbae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9wwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t5hm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:34Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.731791 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zjdhx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\
":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,
\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zjdhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:34Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.764551 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.764589 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.764603 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.764618 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.764627 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:34Z","lastTransitionTime":"2025-12-09T10:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.771518 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c149ed39076fc7ee5538e60dbc0a8fc303a21578e5cc3ac89a3aeaad2c21c6a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:34Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.810590 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d199b778c03f10fc3b1a2623600057801f54ee3240768ede1a79213b678fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:34Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.857243 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:34Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.866652 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.866686 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.866697 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.866712 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.866723 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:34Z","lastTransitionTime":"2025-12-09T10:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.890182 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rgf44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ed6e93-eda5-4648-b185-25d2960ce0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541795eab5847f1d993b8f9f324b454b951ed3930155e455f013e8da805c019b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh57p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rgf44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:34Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.935532 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bc64859-6675-4dc6-b0a1-579abb87580e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0db847425b24ea6804034220f2050b153b78d21bc1cc934dad6784c11c68dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a41e95e38748ef89ff0bc6429eb223b7821756bf1e0c84a3af512f4f0166a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea3ea4cb1e3f00acc4ef769928988a0a2c2ee54afa0ab5f040ef50f465a9d6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-oper
ator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1c1315eade2f326ac5feefc45cbcec29c7ee59fb40494f5153b7f8dbdfc404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:34Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.969423 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.969475 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.969488 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.969509 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.969523 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:34Z","lastTransitionTime":"2025-12-09T10:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:34 crc kubenswrapper[5002]: I1209 10:01:34.979506 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48170a45-0766-4c86-af19-b829960de244\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3df655bf1ce56b5aef6728759b8b3262260171afd0e4924212afc506fa313e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e29bcc86bc8035d73f3f12857d0024d93752d70d6a77b51d98278981669ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c750385ffbe9096e2cef43dffe002fd49599e44ed69806710b492749637dbe93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73ac21971d31e17f4f76bcbb1e02201b53b12189815ced304d7913f6aa76f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd16d2319de83381932ca61f8b08dd31ecfde33e270c3ea2ea3edb3c2fa174b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:34Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.009923 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b484d6dfa4fffc554c21f9d9bc8e77a62e96ebb9426f2bb34dea4b2c9244d680\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://301eb2b68a37707ea483536d639e5f476d3cb41ad8c60fbaea2e019fb51bad48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49d7b26bbe255f1217808981337d8190bdeac4f5008ee17df5242867e3103e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1209 10:01:25.592088 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 10:01:25.592207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 10:01:25.593930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3155801387/tls.crt::/tmp/serving-cert-3155801387/tls.key\\\\\\\"\\\\nI1209 10:01:25.900616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 10:01:25.902593 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 10:01:25.902611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 10:01:25.902628 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 10:01:25.902633 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 10:01:25.906542 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 10:01:25.906611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906618 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 10:01:25.906630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 10:01:25.906634 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 10:01:25.906639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 10:01:25.906563 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 10:01:25.907863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bd67b1b7ad3e21872e9aa91e61c978e0999546882ee2e21347d4f60e435e0b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:35Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.052128 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab29aa6d19002efb0309c548e059e004c5002ccde634df95e3c2661a3e81207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcda2271939034d8c6c54fa3d648500ebda150fafcce9216338fad68552a65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:35Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.059281 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.059278 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:01:35 crc kubenswrapper[5002]: E1209 10:01:35.059433 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:01:35 crc kubenswrapper[5002]: E1209 10:01:35.059601 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.072128 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.072450 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.072529 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.072619 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.072700 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:35Z","lastTransitionTime":"2025-12-09T10:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.096078 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7013527e-73de-4427-af9c-e33663b1c222\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d
93ebbc7bc3d794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2mnnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:35Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.130423 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:35Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.169126 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49c6392-68b2-4847-9291-a0b4d9c1cbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a019843745b7d5198565771c39f7949cf45738e236a2283588db2e57d07f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44da8a223abf131b459b827b0e8de65b415150f406fe22f2efb7e160cba4166c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kxpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:35Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.174676 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.174745 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.174769 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.174802 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.174855 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:35Z","lastTransitionTime":"2025-12-09T10:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.207617 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7t88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edef2440-1e4a-4676-9517-08b21b3b66ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc19a171af6f6875b4d953edd75048a1249b44348ba03757126ebe943c118be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzs5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7t88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:35Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.218123 5002 generic.go:334] "Generic (PLEG): container finished" podID="9281be7a-3f7d-4b00-a3ff-f5b9236dd74a" containerID="006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b" exitCode=0 Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.218169 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zjdhx" event={"ID":"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a","Type":"ContainerDied","Data":"006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b"} Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.259032 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7013527e-73de-4427-af9c-e33663b1c222\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2mnnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:35Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.277254 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.277286 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.277295 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.277308 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.277317 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:35Z","lastTransitionTime":"2025-12-09T10:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.290220 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7t88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edef2440-1e4a-4676-9517-08b21b3b66ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc19a171af6f6875b4d953edd75048a1249b44348ba03757126ebe943c118be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzs5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7t88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:35Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.331053 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:35Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.367994 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49c6392-68b2-4847-9291-a0b4d9c1cbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a019843745b7d5198565771c39f7949cf45738e236a2283588db2e57d07f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44da8a223abf131b459b827b0e8de65b415150f406fe22f2efb7e160cba4166c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kxpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:35Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.379896 5002 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.379937 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.379946 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.379961 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.379972 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:35Z","lastTransitionTime":"2025-12-09T10:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.408484 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t5hm5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de8c639-f176-405c-ae34-6717f9f9458c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097fa8d84999c942645591541e1377c08cdeec593f64525db5a5aa8f7ec8cbae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9wwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t5hm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:35Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.451757 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zjdhx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"nam
e\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"
cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zjdhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:35Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.481694 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.481734 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.481746 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.481784 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.481798 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:35Z","lastTransitionTime":"2025-12-09T10:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.489467 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:35Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.528434 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b484d6dfa4fffc554c21f9d9bc8e77a62e96ebb9426f2bb34dea4b2c9244d680\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://301eb2b68a37707ea483536d639e5f476d3cb41ad8c60fbaea2e019fb51bad48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49d7b26bbe255f1217808981337d8190bdeac4f5008ee17df5242867e3103e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1209 10:01:25.592088 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 10:01:25.592207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 10:01:25.593930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3155801387/tls.crt::/tmp/serving-cert-3155801387/tls.key\\\\\\\"\\\\nI1209 10:01:25.900616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 10:01:25.902593 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 10:01:25.902611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 10:01:25.902628 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 10:01:25.902633 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 10:01:25.906542 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 10:01:25.906611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906618 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 10:01:25.906630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 10:01:25.906634 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 10:01:25.906639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 10:01:25.906563 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 10:01:25.907863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bd67b1b7ad3e21872e9aa91e61c978e0999546882ee2e21347d4f60e435e0b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:35Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.569501 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab29aa6d19002efb0309c548e059e004c5002ccde634df95e3c2661a3e81207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcda2271939034d8c6c54fa3d648500ebda150fafcce9216338fad68552a65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:35Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.584456 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.584497 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.584509 5002 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.584524 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.584535 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:35Z","lastTransitionTime":"2025-12-09T10:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.611768 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c149ed39076fc7ee5538e60dbc0a8fc303a21578e5cc3ac89a3aeaad2c21c6a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:35Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.649488 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d199b778c03f10fc3b1a2623600057801f54ee3240768ede1a79213b678fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:35Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.687261 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.687301 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.687311 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.687326 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.687337 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:35Z","lastTransitionTime":"2025-12-09T10:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.690833 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:35Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.732225 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rgf44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ed6e93-eda5-4648-b185-25d2960ce0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541795eab5847f1d993b8f9f324b454b951ed3930155e455f013e8da805c019b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh57p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rgf44\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:35Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.774486 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bc64859-6675-4dc6-b0a1-579abb87580e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0db847425b24ea6804034220f2050b153b78d21bc1cc934dad6784c11c68dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a41e95e38748ef89ff0bc6429eb223b7821756bf1e0c84a3af512f4f0166a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea3ea4cb1e3f00acc4ef769928988a0a2c2ee54afa0ab5f040ef50f465a9d6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1c1315eade2f326ac5feefc45cbcec29c7ee59fb40494f5153b7f8dbdfc404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:35Z is after 2025-08-24T17:21:41Z"
Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.790424 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.790457 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.790465 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.790478 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.790492 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:35Z","lastTransitionTime":"2025-12-09T10:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.824939 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48170a45-0766-4c86-af19-b829960de244\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3df655bf1ce56b5aef6728759b8b3262260171afd0e4924212afc506fa313e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e29bcc86bc8035d73f3f12857d0024d93752d70d6a77b51d98278981669ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c750385ffbe9096e2cef43dffe002fd49599e44ed69806710b492749637dbe93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73ac21971d31e17f4f76bcbb1e02201b53b12189815ced304d7913f6aa76f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd16d2319de83381932ca61f8b08dd31ecfde33e270c3ea2ea3edb3c2fa174b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:35Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.892601 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.892647 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.892659 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.892675 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.892689 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:35Z","lastTransitionTime":"2025-12-09T10:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.995690 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.995725 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.995733 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.995746 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 10:01:35 crc kubenswrapper[5002]: I1209 10:01:35.995756 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:35Z","lastTransitionTime":"2025-12-09T10:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.059731 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 09 10:01:36 crc kubenswrapper[5002]: E1209 10:01:36.059874 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.097518 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.097555 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.097565 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.097580 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.097590 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:36Z","lastTransitionTime":"2025-12-09T10:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.199668 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.199707 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.199718 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.199734 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.199744 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:36Z","lastTransitionTime":"2025-12-09T10:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.223901 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" event={"ID":"7013527e-73de-4427-af9c-e33663b1c222","Type":"ContainerStarted","Data":"ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d"}
Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.226074 5002 generic.go:334] "Generic (PLEG): container finished" podID="9281be7a-3f7d-4b00-a3ff-f5b9236dd74a" containerID="1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd" exitCode=0
Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.226105 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zjdhx" event={"ID":"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a","Type":"ContainerDied","Data":"1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd"}
Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.235847 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7t88" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edef2440-1e4a-4676-9517-08b21b3b66ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc19a171af6f6875b4d953edd75048a1249b44348ba03757126ebe943c118be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzs5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7t88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:36Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.247745 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:36Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.259578 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49c6392-68b2-4847-9291-a0b4d9c1cbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a019843745b7d5198565771c39f7949cf45738e236a2283588db2e57d07f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":
\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44da8a223abf131b459b827b0e8de65b415150f406fe22f2efb7e160cba4166c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kxpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:36Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.269941 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t5hm5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de8c639-f176-405c-ae34-6717f9f9458c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097fa8d84999c942645591541e1377c08cdeec593f64525db5a5aa8f7ec8cbae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9wwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t5hm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:36Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.286153 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zjdhx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zjdhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:36Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.296756 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:36Z is after 2025-08-24T17:21:41Z"
Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.301752 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.301782 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.301792 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.301808 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.301847 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:36Z","lastTransitionTime":"2025-12-09T10:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.309611 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b484d6dfa4fffc554c21f9d9bc8e77a62e96ebb9426f2bb34dea4b2c9244d680\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://301eb2b68a37707ea483536d639e5f476d3cb41ad8c60fbaea2e019fb51bad48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49d7b26bbe255f1217808981337d8190bdeac4f5008ee17df5242867e3103e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1209 10:01:25.592088 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 10:01:25.592207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 10:01:25.593930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3155801387/tls.crt::/tmp/serving-cert-3155801387/tls.key\\\\\\\"\\\\nI1209 10:01:25.900616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 10:01:25.902593 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 10:01:25.902611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 10:01:25.902628 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 10:01:25.902633 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 10:01:25.906542 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 10:01:25.906611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906618 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 10:01:25.906630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 10:01:25.906634 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 10:01:25.906639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 10:01:25.906563 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 10:01:25.907863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bd67b1b7ad3e21872e9aa91e61c978e0999546882ee2e21347d4f60e435e0b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:36Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.323414 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab29aa6d19002efb0309c548e059e004c5002ccde634df95e3c2661a3e81207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcda2271939034d8c6c54fa3d648500ebda150fafcce9216338fad68552a65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:36Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.335569 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c149ed39076fc7ee5538e60dbc0a8fc303a21578e5cc3ac89a3aeaad2c21c6a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:36Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.346890 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d199b778c03f10fc3b1a2623600057801f54ee3240768ede1a79213b678fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:36Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.361346 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:36Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.377471 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rgf44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ed6e93-eda5-4648-b185-25d2960ce0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541795eab5847f1d993b8f9f324b454b951ed3930155e455f013e8da805c019b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh57p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rgf44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:36Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.390446 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bc64859-6675-4dc6-b0a1-579abb87580e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0db847425b24ea6804034220f2050b153b78d21bc1cc934dad6784c11c68dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a41e95e38748ef89ff0bc6429eb223b7821756bf1e0c84a3af512f4f0166a98\\\",\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea3ea4cb1e3f00acc4ef769928988a0a2c2ee54afa0ab5f040ef50f465a9d6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1c1315eade2f326ac5feefc45cbcec29c7ee59fb40494f5153b7f8dbdfc404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:36Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.404177 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.404214 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.404224 5002 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.404239 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.404250 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:36Z","lastTransitionTime":"2025-12-09T10:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.416552 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48170a45-0766-4c86-af19-b829960de244\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3df655bf1ce56b5aef6728759b8b3262260171afd0e4924212afc506fa313e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e29bcc86bc8035d73f3f12857d0024d93752d70d6a77b51d98278981669ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\
"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c750385ffbe9096e2cef43dffe002fd49599e44ed69806710b492749637dbe93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73ac21971d31e17f4f76bcbb1e02201b53b12189815ced304d7913f6aa76f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd16d2319de83381932ca61f8b08dd31ecfde33e270c3ea2ea3edb3c2fa174b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\
"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:36Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.439274 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7013527e-73de-4427-af9c-e33663b1c222\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2mnnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:36Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.506283 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.506568 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.506577 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.506592 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.506602 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:36Z","lastTransitionTime":"2025-12-09T10:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.609348 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.609382 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.609390 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.609405 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.609416 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:36Z","lastTransitionTime":"2025-12-09T10:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.712131 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.712174 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.712184 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.712205 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.712218 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:36Z","lastTransitionTime":"2025-12-09T10:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.814524 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.814561 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.814572 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.814588 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.814599 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:36Z","lastTransitionTime":"2025-12-09T10:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.917187 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.917290 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.917308 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.917345 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:36 crc kubenswrapper[5002]: I1209 10:01:36.917356 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:36Z","lastTransitionTime":"2025-12-09T10:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.020910 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.020940 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.020948 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.020964 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.020973 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:37Z","lastTransitionTime":"2025-12-09T10:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.059992 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:01:37 crc kubenswrapper[5002]: E1209 10:01:37.060109 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.060465 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:01:37 crc kubenswrapper[5002]: E1209 10:01:37.060612 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.124499 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.124559 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.124574 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.124592 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.124626 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:37Z","lastTransitionTime":"2025-12-09T10:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.227404 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.227460 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.227474 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.227496 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.227516 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:37Z","lastTransitionTime":"2025-12-09T10:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.232571 5002 generic.go:334] "Generic (PLEG): container finished" podID="9281be7a-3f7d-4b00-a3ff-f5b9236dd74a" containerID="75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858" exitCode=0 Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.232626 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zjdhx" event={"ID":"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a","Type":"ContainerDied","Data":"75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858"} Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.257934 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7013527e-73de-4427-af9c-e33663b1c222\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2mnnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:37Z 
is after 2025-08-24T17:21:41Z" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.273752 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49c6392-68b2-4847-9291-a0b4d9c1cbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a019843745b7d5198565771c39f7949cf45738e236a2283588db2e57d07f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44da8a223abf131b459b827b0e8de65b415150f406fe22f2efb7e160cba4166c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kxpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:37Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.287988 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7t88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edef2440-1e4a-4676-9517-08b21b3b66ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc19a171af6f6875b4d953edd75048a1249b44348ba03757126ebe943c118be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzs5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7t88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:37Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.302742 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:37Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.316095 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:37Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.326127 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t5hm5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de8c639-f176-405c-ae34-6717f9f9458c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097fa8d84999c942645591541e1377c08cdeec593f64525db5a5aa8f7ec8cbae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9wwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t5hm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:37Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.329308 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.329337 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.329346 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.329380 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.329389 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:37Z","lastTransitionTime":"2025-12-09T10:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.338473 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zjdhx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zjdhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:37Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.357354 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48170a45-0766-4c86-af19-b829960de244\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3df655bf1ce56b5aef6728759b8b3262260171afd0e4924212afc506fa313e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e29bcc86bc8035d73f3f12857d0024d93752d70d6a77b51d98278981669ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c750385ffbe9096e2cef43dffe002fd49599e44ed69806710b492749637dbe93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73ac21971d31e17f4f76bcbb1e02201b53b12
189815ced304d7913f6aa76f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd16d2319de83381932ca61f8b08dd31ecfde33e270c3ea2ea3edb3c2fa174b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:37Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.371843 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b484d6dfa4fffc554c21f9d9bc8e77a62e96ebb9426f2bb34dea4b2c9244d680\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://301eb2b68a37707ea483536d639e5f476d3cb41ad8c60fbaea2e019fb51bad48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49d7b26bbe255f1217808981337d8190bdeac4f5008ee17df5242867e3103e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1209 10:01:25.592088 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 10:01:25.592207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 10:01:25.593930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3155801387/tls.crt::/tmp/serving-cert-3155801387/tls.key\\\\\\\"\\\\nI1209 10:01:25.900616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 10:01:25.902593 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 10:01:25.902611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 10:01:25.902628 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 10:01:25.902633 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 10:01:25.906542 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 10:01:25.906611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906618 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 10:01:25.906630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 10:01:25.906634 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 10:01:25.906639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 10:01:25.906563 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 10:01:25.907863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bd67b1b7ad3e21872e9aa91e61c978e0999546882ee2e21347d4f60e435e0b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:37Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.383795 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab29aa6d19002efb0309c548e059e004c5002ccde634df95e3c2661a3e81207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcda2271939034d8c6c54fa3d648500ebda150fafcce9216338fad68552a65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:37Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.397624 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c149ed39076fc7ee5538e60dbc0a8fc303a21578e5cc3ac89a3aeaad2c21c6a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:37Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.410249 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d199b778c03f10fc3b1a2623600057801f54ee3240768ede1a79213b678fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:37Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.421910 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:37Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.431397 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.431424 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.431432 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.431446 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.431454 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:37Z","lastTransitionTime":"2025-12-09T10:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.434586 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rgf44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ed6e93-eda5-4648-b185-25d2960ce0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541795eab5847f1d993b8f9f324b454b951ed3930155e455f013e8da805c019b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh57p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rgf44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:37Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.446053 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bc64859-6675-4dc6-b0a1-579abb87580e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0db847425b24ea6804034220f2050b153b78d21bc1cc934dad6784c11c68dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a41e95e38748ef89ff0bc6429eb223b7821756bf1e0c84a3af512f4f0166a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea3ea4cb1e3f00acc4ef769928988a0a2c2ee54afa0ab5f040ef50f465a9d6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-oper
ator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1c1315eade2f326ac5feefc45cbcec29c7ee59fb40494f5153b7f8dbdfc404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:37Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.534985 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.535027 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.535041 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.535063 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.535077 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:37Z","lastTransitionTime":"2025-12-09T10:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.637675 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.637749 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.637770 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.637798 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.637872 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:37Z","lastTransitionTime":"2025-12-09T10:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.741261 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.741314 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.741325 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.741343 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.741357 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:37Z","lastTransitionTime":"2025-12-09T10:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.843939 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.843977 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.843987 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.844006 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.844017 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:37Z","lastTransitionTime":"2025-12-09T10:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.946239 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.946279 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.946287 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.946302 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:37 crc kubenswrapper[5002]: I1209 10:01:37.946311 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:37Z","lastTransitionTime":"2025-12-09T10:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.048140 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.048168 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.048177 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.048191 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.048199 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:38Z","lastTransitionTime":"2025-12-09T10:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.060072 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:01:38 crc kubenswrapper[5002]: E1209 10:01:38.060187 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.072805 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:38Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.081914 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t5hm5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de8c639-f176-405c-ae34-6717f9f9458c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097fa8d84999c942645591541e1377c08cdeec593f64525db5a5aa8f7ec8cbae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9wwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t5hm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:38Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.096214 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zjdhx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zjdhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:38Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.108205 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c149ed39076fc7ee5538e60dbc0a8fc303a21578e5cc3ac89a3aeaad2c21c6a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:38Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.121622 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d199b778c03f10fc3b1a2623600057801f54ee3240768ede1a79213b678fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:38Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.134159 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:38Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.145158 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rgf44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ed6e93-eda5-4648-b185-25d2960ce0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541795eab5847f1d993b8f9f324b454b951ed3930155e455f013e8da805c019b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh57p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rgf44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:38Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.151003 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.151038 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.151066 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.151082 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.151091 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:38Z","lastTransitionTime":"2025-12-09T10:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.156125 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bc64859-6675-4dc6-b0a1-579abb87580e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0db847425b24ea6804034220f2050b153b78d21bc1cc934dad6784c11c68dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a41e95e38748ef89ff0bc6429eb223b7821756bf1e0c84a3af512f4f0166a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea3ea4cb1e3f00acc4ef769928988a0a2c2ee54afa0ab5f040ef50f465a9d6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1c1315eade2f326ac5feefc45cbcec29c7ee59fb40494f5153b7f8dbdfc404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:38Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.222215 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48170a45-0766-4c86-af19-b829960de244\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3df655bf1ce56b5aef6728759b8b3262260171afd0e4924212afc506fa313e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e29bcc86bc8035d73f3f12857d0024d93752d70d6a77b51d98278981669ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c750385ffbe9096e2cef43dffe002fd49599e44ed69806710b492749637dbe93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73ac21971d31e17f4f76bcbb1e02201b53b12
189815ced304d7913f6aa76f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd16d2319de83381932ca61f8b08dd31ecfde33e270c3ea2ea3edb3c2fa174b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:38Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.238170 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b484d6dfa4fffc554c21f9d9bc8e77a62e96ebb9426f2bb34dea4b2c9244d680\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://301eb2b68a37707ea483536d639e5f476d3cb41ad8c60fbaea2e019fb51bad48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49d7b26bbe255f1217808981337d8190bdeac4f5008ee17df5242867e3103e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1209 10:01:25.592088 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 10:01:25.592207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 10:01:25.593930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3155801387/tls.crt::/tmp/serving-cert-3155801387/tls.key\\\\\\\"\\\\nI1209 10:01:25.900616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 10:01:25.902593 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 10:01:25.902611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 10:01:25.902628 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 10:01:25.902633 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 10:01:25.906542 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 10:01:25.906611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906618 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 10:01:25.906630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 10:01:25.906634 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 10:01:25.906639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 10:01:25.906563 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 10:01:25.907863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bd67b1b7ad3e21872e9aa91e61c978e0999546882ee2e21347d4f60e435e0b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:38Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.239435 5002 generic.go:334] "Generic (PLEG): container finished" podID="9281be7a-3f7d-4b00-a3ff-f5b9236dd74a" containerID="257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210" exitCode=0 Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.239554 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zjdhx" event={"ID":"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a","Type":"ContainerDied","Data":"257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210"} Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.249978 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab29aa6d19002efb0309c548e059e004c5002ccde634df95e3c2661a3e81207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcda2271939034d8c6c54fa3d648500ebda150fafcce9216338fad68552a65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:38Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.253672 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.253701 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.253710 5002 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.253724 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.253733 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:38Z","lastTransitionTime":"2025-12-09T10:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.265252 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7013527e-73de-4427-af9c-e33663b1c222\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2mnnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:38Z 
is after 2025-08-24T17:21:41Z" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.276406 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:38Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.286393 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49c6392-68b2-4847-9291-a0b4d9c1cbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a019843745b7d5198565771c39f7949cf45738e236a2283588db2e57d07f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44da8a223abf131b459b827b0e8de65b415150f406fe22f2efb7e160cba4166c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kxpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:38Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.295706 5002 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-p7t88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edef2440-1e4a-4676-9517-08b21b3b66ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc19a171af6f6875b4d953edd75048a1249b44348ba03757126ebe943c118be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzs5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7t88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:38Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.309140 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:38Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.318902 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49c6392-68b2-4847-9291-a0b4d9c1cbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a019843745b7d5198565771c39f7949cf45738e236a2283588db2e57d07f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44da8a223abf131b459b827b0e8de65b415150f406fe22f2efb7e160cba4166c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kxpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:38Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.327204 5002 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-p7t88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edef2440-1e4a-4676-9517-08b21b3b66ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc19a171af6f6875b4d953edd75048a1249b44348ba03757126ebe943c118be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzs5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7t88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:38Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.337929 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:38Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.347674 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t5hm5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de8c639-f176-405c-ae34-6717f9f9458c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097fa8d84999c942645591541e1377c08cdeec593f64525db5a5aa8f7ec8cbae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9wwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t5hm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:38Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.355708 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.355748 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.355757 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.355771 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.355781 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:38Z","lastTransitionTime":"2025-12-09T10:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.364458 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zjdhx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]},{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},
\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zjdhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:38Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.375749 
5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c149ed39076fc7ee5538e60dbc0a8fc303a21578e5cc3ac89a3aeaad2c21c6a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:38Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.386145 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d199b778c03f10fc3b1a2623600057801f54ee3240768ede1a79213b678fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:38Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.397086 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:38Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.409549 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rgf44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ed6e93-eda5-4648-b185-25d2960ce0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541795eab5847f1d993b8f9f324b454b951ed3930155e455f013e8da805c019b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh57p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rgf44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:38Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.422562 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bc64859-6675-4dc6-b0a1-579abb87580e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0db847425b24ea6804034220f2050b153b78d21bc1cc934dad6784c11c68dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a41e95e38748ef89ff0bc6429eb223b7821756bf1e0c84a3af512f4f0166a98\\\",\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea3ea4cb1e3f00acc4ef769928988a0a2c2ee54afa0ab5f040ef50f465a9d6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1c1315eade2f326ac5feefc45cbcec29c7ee59fb40494f5153b7f8dbdfc404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:38Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.440193 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48170a45-0766-4c86-af19-b829960de244\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3df655bf1ce56b5aef6728759b8b3262260171afd0e4924212afc506fa313e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e29bcc86bc8035d73f3f12857d0024d93752d70d6a77b51d98278981669ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c750385ffbe9096e2cef43dffe002fd49599e44ed69806710b492749637dbe93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73ac21971d31e17f4f76bcbb1e02201b53b12
189815ced304d7913f6aa76f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd16d2319de83381932ca61f8b08dd31ecfde33e270c3ea2ea3edb3c2fa174b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:38Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.454234 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b484d6dfa4fffc554c21f9d9bc8e77a62e96ebb9426f2bb34dea4b2c9244d680\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://301eb2b68a37707ea483536d639e5f476d3cb41ad8c60fbaea2e019fb51bad48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49d7b26bbe255f1217808981337d8190bdeac4f5008ee17df5242867e3103e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1209 10:01:25.592088 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 10:01:25.592207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 10:01:25.593930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3155801387/tls.crt::/tmp/serving-cert-3155801387/tls.key\\\\\\\"\\\\nI1209 10:01:25.900616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 10:01:25.902593 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 10:01:25.902611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 10:01:25.902628 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 10:01:25.902633 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 10:01:25.906542 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 10:01:25.906611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906618 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 10:01:25.906630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 10:01:25.906634 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 10:01:25.906639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 10:01:25.906563 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 10:01:25.907863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bd67b1b7ad3e21872e9aa91e61c978e0999546882ee2e21347d4f60e435e0b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:38Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.458173 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.458374 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.458468 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.458569 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.458657 5002 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:38Z","lastTransitionTime":"2025-12-09T10:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.468661 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab29aa6d19002efb0309c548e059e004c5002ccde634df95e3c2661a3e81207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcda2271939034d8c6c54fa3d648500ebda150fafcce9216338fad68552a65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:38Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.486027 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7013527e-73de-4427-af9c-e33663b1c222\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secret
s/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatu
ses\\\":[{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2mnnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:38Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.561274 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.561333 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.561346 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.561366 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.561379 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:38Z","lastTransitionTime":"2025-12-09T10:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.662861 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.662904 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.662915 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.662931 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.662943 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:38Z","lastTransitionTime":"2025-12-09T10:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.766034 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.766069 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.766082 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.766098 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.766110 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:38Z","lastTransitionTime":"2025-12-09T10:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.869035 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.869079 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.869090 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.869107 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.869118 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:38Z","lastTransitionTime":"2025-12-09T10:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.972370 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.972416 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.972433 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.972450 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:38 crc kubenswrapper[5002]: I1209 10:01:38.972460 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:38Z","lastTransitionTime":"2025-12-09T10:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.059227 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.059302 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:01:39 crc kubenswrapper[5002]: E1209 10:01:39.059429 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:01:39 crc kubenswrapper[5002]: E1209 10:01:39.059589 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.075182 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.075233 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.075244 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.075262 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.075274 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:39Z","lastTransitionTime":"2025-12-09T10:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.079119 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.079152 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.079164 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.079178 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.079189 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:39Z","lastTransitionTime":"2025-12-09T10:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:39 crc kubenswrapper[5002]: E1209 10:01:39.095543 5002 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb4d43e7-bcbf-4472-90e9-44716d72c15e\\\",\\\"systemUUID\\\":\\\"8af61218-105c-4188-8c40-2d81c3899a86\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:39Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.099435 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.099488 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.099502 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.099520 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.099532 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:39Z","lastTransitionTime":"2025-12-09T10:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:39 crc kubenswrapper[5002]: E1209 10:01:39.118232 5002 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb4d43e7-bcbf-4472-90e9-44716d72c15e\\\",\\\"systemUUID\\\":\\\"8af61218-105c-4188-8c40-2d81c3899a86\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:39Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.122222 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.122266 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.122276 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.122289 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.122298 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:39Z","lastTransitionTime":"2025-12-09T10:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:39 crc kubenswrapper[5002]: E1209 10:01:39.134023 5002 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb4d43e7-bcbf-4472-90e9-44716d72c15e\\\",\\\"systemUUID\\\":\\\"8af61218-105c-4188-8c40-2d81c3899a86\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:39Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.138285 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.138323 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.138331 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.138350 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.138362 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:39Z","lastTransitionTime":"2025-12-09T10:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:39 crc kubenswrapper[5002]: E1209 10:01:39.151323 5002 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb4d43e7-bcbf-4472-90e9-44716d72c15e\\\",\\\"systemUUID\\\":\\\"8af61218-105c-4188-8c40-2d81c3899a86\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:39Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.154805 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.154844 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.154851 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.154864 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.154872 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:39Z","lastTransitionTime":"2025-12-09T10:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:39 crc kubenswrapper[5002]: E1209 10:01:39.169481 5002 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb4d43e7-bcbf-4472-90e9-44716d72c15e\\\",\\\"systemUUID\\\":\\\"8af61218-105c-4188-8c40-2d81c3899a86\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:39Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:39 crc kubenswrapper[5002]: E1209 10:01:39.169656 5002 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.177317 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.177344 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.177355 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.177371 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.177382 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:39Z","lastTransitionTime":"2025-12-09T10:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.279543 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.279589 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.279600 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.279618 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.279632 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:39Z","lastTransitionTime":"2025-12-09T10:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.706775 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.706845 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.706860 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.706894 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.706916 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:39Z","lastTransitionTime":"2025-12-09T10:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.716199 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" event={"ID":"7013527e-73de-4427-af9c-e33663b1c222","Type":"ContainerStarted","Data":"36e952fdc4745c127dee2c79ee17c081bca784c7d35bb530d63619c963442937"} Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.717367 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.717397 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.723924 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zjdhx" event={"ID":"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a","Type":"ContainerStarted","Data":"60fd7aba09010bae4f15fe793e0084c71d381f63bc4c1549f2ccbe57cdb90ef0"} Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.733155 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:39Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.743406 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.744874 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.748549 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t5hm5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de8c639-f176-405c-ae34-6717f9f9458c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097fa8d84999c942645591541e1377c08cdeec593f64525db5a5aa8f7ec8cbae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9wwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\
"startTime\\\":\\\"2025-12-09T10:01:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t5hm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:39Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.765344 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zjdhx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\
\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zjdhx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:39Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.779553 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d199b778c03f10fc3b1a2623600057801f54ee3240768ede1a79213b678fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:39Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.794981 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:39Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.809670 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.809724 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.809737 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.809762 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.809776 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:39Z","lastTransitionTime":"2025-12-09T10:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.809987 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rgf44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ed6e93-eda5-4648-b185-25d2960ce0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541795eab5847f1d993b8f9f324b454b951ed3930155e455f013e8da805c019b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh57p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rgf44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:39Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.825666 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bc64859-6675-4dc6-b0a1-579abb87580e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0db847425b24ea6804034220f2050b153b78d21bc1cc934dad6784c11c68dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a41e95e38748ef89ff0bc6429eb223b7821756bf1e0c84a3af512f4f0166a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea3ea4cb1e3f00acc4ef769928988a0a2c2ee54afa0ab5f040ef50f465a9d6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-oper
ator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1c1315eade2f326ac5feefc45cbcec29c7ee59fb40494f5153b7f8dbdfc404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:39Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.847505 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48170a45-0766-4c86-af19-b829960de244\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3df655bf1ce56b5aef6728759b8b3262260171afd0e4924212afc506fa313e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e29bcc86bc8035d73f3f12857d0024d93752d70d6a77b51d98278981669ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c750385ffbe9096e2cef43dffe002fd49599e44ed69806710b492749637dbe93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73ac21971d31e17f4f76bcbb1e02201b53b12
189815ced304d7913f6aa76f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd16d2319de83381932ca61f8b08dd31ecfde33e270c3ea2ea3edb3c2fa174b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:39Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.863447 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b484d6dfa4fffc554c21f9d9bc8e77a62e96ebb9426f2bb34dea4b2c9244d680\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://301eb2b68a37707ea483536d639e5f476d3cb41ad8c60fbaea2e019fb51bad48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49d7b26bbe255f1217808981337d8190bdeac4f5008ee17df5242867e3103e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1209 10:01:25.592088 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 10:01:25.592207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 10:01:25.593930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3155801387/tls.crt::/tmp/serving-cert-3155801387/tls.key\\\\\\\"\\\\nI1209 10:01:25.900616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 10:01:25.902593 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 10:01:25.902611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 10:01:25.902628 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 10:01:25.902633 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 10:01:25.906542 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 10:01:25.906611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906618 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 10:01:25.906630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 10:01:25.906634 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 10:01:25.906639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 10:01:25.906563 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 10:01:25.907863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bd67b1b7ad3e21872e9aa91e61c978e0999546882ee2e21347d4f60e435e0b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:39Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.877092 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab29aa6d19002efb0309c548e059e004c5002ccde634df95e3c2661a3e81207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcda2271939034d8c6c54fa3d648500ebda150fafcce9216338fad68552a65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:39Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.891681 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c149ed39076fc7ee5538e60dbc0a8fc303a21578e5cc3ac89a3aeaad2c21c6a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:39Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.911657 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.911696 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.911705 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.911720 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.911728 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:39Z","lastTransitionTime":"2025-12-09T10:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.915482 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7013527e-73de-4427-af9c-e33663b1c222\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}
,{\\\"containerID\\\":\\\"cri-o://685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"i
mageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36e952fdc4745c127dee2c79ee17c081bca784c7d35bb530d63619c963442937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2mnnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:39Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.929988 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:39Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.943691 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49c6392-68b2-4847-9291-a0b4d9c1cbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a019843745b7d5198565771c39f7949cf45738e236a2283588db2e57d07f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44da8a223abf131b459b827b0e8de65b415150f406fe22f2efb7e160cba4166c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kxpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:39Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.956605 5002 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-p7t88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edef2440-1e4a-4676-9517-08b21b3b66ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc19a171af6f6875b4d953edd75048a1249b44348ba03757126ebe943c118be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzs5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7t88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:39Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.973618 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:39Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:39 crc kubenswrapper[5002]: I1209 10:01:39.985465 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t5hm5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de8c639-f176-405c-ae34-6717f9f9458c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097fa8d84999c942645591541e1377c08cdeec593f64525db5a5aa8f7ec8cbae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9wwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t5hm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:39Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.007487 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zjdhx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60fd7aba09010bae4f15fe793e0084c71d381f63bc4c1549f2ccbe57cdb90ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zjdhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:40Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.014144 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.014201 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:40 crc 
kubenswrapper[5002]: I1209 10:01:40.014210 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.014226 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.014238 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:40Z","lastTransitionTime":"2025-12-09T10:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.022287 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rgf44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ed6e93-eda5-4648-b185-25d2960ce0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541795eab5847f1d993b8f9f324b454b951ed3930155e455f013e8da805c019b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"n
ame\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh57p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rgf44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:40Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.035285 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bc64859-6675-4dc6-b0a1-579abb87580e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0db847425b24ea6804034220f2050b153b78d21bc1cc934dad6784c11c68dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a41e95e38748ef89ff0bc6429eb223b7821756bf1e0c84a3af512f4f0166a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d
4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea3ea4cb1e3f00acc4ef769928988a0a2c2ee54afa0ab5f040ef50f465a9d6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1c1315eade2f326ac5feefc45cbcec29c7ee59fb40494f5153b7f8dbdfc404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:40Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.056983 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48170a45-0766-4c86-af19-b829960de244\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3df655bf1ce56b5aef6728759b8b3262260171afd0e4924212afc506fa313e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e29bcc86bc8035d73f3f12857d0024d93752d70d6a77b51d98278981669ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c750385ffbe9096e2cef43dffe002fd49599e44ed69806710b492749637dbe93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73ac21971d31e17f4f76bcbb1e02201b53b12
189815ced304d7913f6aa76f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd16d2319de83381932ca61f8b08dd31ecfde33e270c3ea2ea3edb3c2fa174b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:40Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.060286 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:01:40 crc kubenswrapper[5002]: E1209 10:01:40.060493 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.074983 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b484d6dfa4fffc554c21f9d9bc8e77a62e96ebb9426f2bb34dea4b2c9244d680\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://301eb2b68a37707ea483536d639e5f476d3cb41ad8c60fbaea2e019fb51bad48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49d7b26bbe255f1217808981337d8190bdeac4f5008ee17df5242867e3103e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1209 10:01:25.592088 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 10:01:25.592207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 10:01:25.593930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3155801387/tls.crt::/tmp/serving-cert-3155801387/tls.key\\\\\\\"\\\\nI1209 10:01:25.900616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 10:01:25.902593 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 10:01:25.902611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 10:01:25.902628 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 10:01:25.902633 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 10:01:25.906542 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 10:01:25.906611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906618 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 10:01:25.906630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 10:01:25.906634 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 10:01:25.906639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 10:01:25.906563 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 10:01:25.907863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bd67b1b7ad3e21872e9aa91e61c978e0999546882ee2e21347d4f60e435e0b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:40Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.091900 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab29aa6d19002efb0309c548e059e004c5002ccde634df95e3c2661a3e81207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcda2271939034d8c6c54fa3d648500ebda150fafcce9216338fad68552a65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:40Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.107671 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c149ed39076fc7ee5538e60dbc0a8fc303a21578e5cc3ac89a3aeaad2c21c6a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:40Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.117340 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.117394 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.117407 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.117458 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.117473 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:40Z","lastTransitionTime":"2025-12-09T10:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.122659 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d199b778c03f10fc3b1a2623600057801f54ee3240768ede1a79213b678fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:40Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.137971 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:40Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.163678 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7013527e-73de-4427-af9c-e33663b1c222\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36e952fdc4745c127dee2c79ee17c081bca784c7
d35bb530d63619c963442937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2mnnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:40Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.184091 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:40Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.196677 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49c6392-68b2-4847-9291-a0b4d9c1cbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a019843745b7d5198565771c39f7949cf45738e236a2283588db2e57d07f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44da8a223abf131b459b827b0e8de65b415150f406fe22f2efb7e160cba4166c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kxpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:40Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.208302 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7t88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edef2440-1e4a-4676-9517-08b21b3b66ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc19a171af6f6875b4d953edd75048a1249b44348ba03757126ebe943c118be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzs5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7t88\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:40Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.220520 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.220557 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.220571 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.220587 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.220598 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:40Z","lastTransitionTime":"2025-12-09T10:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.323734 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.323795 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.323805 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.323856 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.323870 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:40Z","lastTransitionTime":"2025-12-09T10:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.426678 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.426724 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.426735 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.426750 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.426762 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:40Z","lastTransitionTime":"2025-12-09T10:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.530240 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.530289 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.530299 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.530314 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.530325 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:40Z","lastTransitionTime":"2025-12-09T10:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.632594 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.632632 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.632641 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.632654 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.632663 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:40Z","lastTransitionTime":"2025-12-09T10:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.726983 5002 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.735113 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.735156 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.735168 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.735184 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.735196 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:40Z","lastTransitionTime":"2025-12-09T10:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.837956 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.837996 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.838006 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.838020 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.838029 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:40Z","lastTransitionTime":"2025-12-09T10:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.940546 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.940601 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.940610 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.940624 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:40 crc kubenswrapper[5002]: I1209 10:01:40.940632 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:40Z","lastTransitionTime":"2025-12-09T10:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.042549 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.042584 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.042593 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.042606 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.042615 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:41Z","lastTransitionTime":"2025-12-09T10:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.060116 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.060156 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:01:41 crc kubenswrapper[5002]: E1209 10:01:41.060259 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:01:41 crc kubenswrapper[5002]: E1209 10:01:41.060377 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.145102 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.145164 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.145174 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.145200 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.145211 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:41Z","lastTransitionTime":"2025-12-09T10:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.247737 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.247778 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.247789 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.247803 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.247827 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:41Z","lastTransitionTime":"2025-12-09T10:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.350522 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.350570 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.350583 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.350602 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.350615 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:41Z","lastTransitionTime":"2025-12-09T10:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.456799 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.456850 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.456858 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.456875 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.456885 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:41Z","lastTransitionTime":"2025-12-09T10:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.558806 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.558863 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.558874 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.558888 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.558900 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:41Z","lastTransitionTime":"2025-12-09T10:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.696455 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.696508 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.696520 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.696537 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.696550 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:41Z","lastTransitionTime":"2025-12-09T10:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.729699 5002 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.799042 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.799082 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.799092 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.799107 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.799118 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:41Z","lastTransitionTime":"2025-12-09T10:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.821468 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.821574 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.821603 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.821626 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.821663 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:01:41 crc kubenswrapper[5002]: E1209 10:01:41.821717 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:01:57.821690871 +0000 UTC m=+50.213741962 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:01:41 crc kubenswrapper[5002]: E1209 10:01:41.821726 5002 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 10:01:41 crc kubenswrapper[5002]: E1209 10:01:41.821744 5002 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 10:01:41 crc kubenswrapper[5002]: E1209 10:01:41.821767 5002 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 10:01:41 crc kubenswrapper[5002]: E1209 10:01:41.821739 5002 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 10:01:41 crc kubenswrapper[5002]: E1209 10:01:41.821804 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 10:01:57.821784343 +0000 UTC m=+50.213835474 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 10:01:41 crc kubenswrapper[5002]: E1209 10:01:41.821880 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 10:01:57.821859815 +0000 UTC m=+50.213910946 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 10:01:41 crc kubenswrapper[5002]: E1209 10:01:41.821784 5002 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 10:01:41 crc kubenswrapper[5002]: E1209 10:01:41.821941 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-12-09 10:01:57.821929257 +0000 UTC m=+50.213980428 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 10:01:41 crc kubenswrapper[5002]: E1209 10:01:41.821990 5002 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 10:01:41 crc kubenswrapper[5002]: E1209 10:01:41.822048 5002 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 10:01:41 crc kubenswrapper[5002]: E1209 10:01:41.822076 5002 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 10:01:41 crc kubenswrapper[5002]: E1209 10:01:41.822177 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-09 10:01:57.822138092 +0000 UTC m=+50.214189213 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.901913 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.901977 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.901993 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.902017 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:41 crc kubenswrapper[5002]: I1209 10:01:41.902038 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:41Z","lastTransitionTime":"2025-12-09T10:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
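Has your network provider started?"}

Two separate volume failures are interleaved above, both retried with backoff (durationBeforeRetry 16s). The UnmountVolume.TearDown error means no CSI driver named kubevirt.io.hostpath-provisioner has re-registered with the restarted kubelet yet, and the MountVolume.SetUp errors mean the kubelet's object managers have not yet registered the referenced ConfigMaps and Secrets (kube-root-ca.crt, openshift-service-ca.crt, networking-console-plugin-cert) for these pods; both normally clear on their own as the node comes up. A sketch that lists which plugins are currently registered, assuming the kubelet's default registration directory /var/lib/kubelet/plugins_registry (adjust if this node overrides it):

```go
package main

import (
	"fmt"
	"os"
)

func main() {
	// Plugins register with the kubelet by placing a unix socket here.
	dir := "/var/lib/kubelet/plugins_registry"
	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Printf("cannot read %s: %v\n", dir, err)
		return
	}
	if len(entries) == 0 {
		fmt.Println("no plugins registered yet")
		return
	}
	for _, e := range entries {
		// If no socket mentions kubevirt.io.hostpath-provisioner, the
		// TearDown error above is expected until the driver re-registers.
		fmt.Println("registration socket:", e.Name())
	}
}
```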
Has your network provider started?"} Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.004409 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.004462 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.004471 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.004483 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.004492 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:42Z","lastTransitionTime":"2025-12-09T10:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.060805 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:01:42 crc kubenswrapper[5002]: E1209 10:01:42.060992 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.106829 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.106872 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.106880 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.106894 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.106903 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:42Z","lastTransitionTime":"2025-12-09T10:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.209464 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.209503 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.209513 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.209527 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.209536 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:42Z","lastTransitionTime":"2025-12-09T10:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.311891 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.311940 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.311951 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.311968 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.311982 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:42Z","lastTransitionTime":"2025-12-09T10:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.414034 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.414076 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.414088 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.414106 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.414118 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:42Z","lastTransitionTime":"2025-12-09T10:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.516907 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.516973 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.516997 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.517025 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.517048 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:42Z","lastTransitionTime":"2025-12-09T10:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.619663 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.619719 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.619730 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.619745 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.619756 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:42Z","lastTransitionTime":"2025-12-09T10:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.722444 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.722495 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.722505 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.722520 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.722530 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:42Z","lastTransitionTime":"2025-12-09T10:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.824931 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.824969 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.824977 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.824992 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.825002 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:42Z","lastTransitionTime":"2025-12-09T10:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.927355 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.927406 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.927416 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.927430 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:42 crc kubenswrapper[5002]: I1209 10:01:42.927440 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:42Z","lastTransitionTime":"2025-12-09T10:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.029909 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.030165 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.030177 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.030198 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.030209 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:43Z","lastTransitionTime":"2025-12-09T10:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.059321 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.059350 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:01:43 crc kubenswrapper[5002]: E1209 10:01:43.059446 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:01:43 crc kubenswrapper[5002]: E1209 10:01:43.059539 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.132880 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.132932 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.132948 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.132969 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.132985 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:43Z","lastTransitionTime":"2025-12-09T10:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.235185 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.235637 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.235649 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.235665 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.235682 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:43Z","lastTransitionTime":"2025-12-09T10:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.338042 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.338086 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.338096 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.338110 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.338121 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:43Z","lastTransitionTime":"2025-12-09T10:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.441016 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.441061 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.441069 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.441086 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.441102 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:43Z","lastTransitionTime":"2025-12-09T10:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.544266 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.544325 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.544357 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.544380 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.544397 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:43Z","lastTransitionTime":"2025-12-09T10:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.646671 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.646704 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.646713 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.646725 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.646735 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:43Z","lastTransitionTime":"2025-12-09T10:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.737798 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2mnnl_7013527e-73de-4427-af9c-e33663b1c222/ovnkube-controller/0.log" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.740870 5002 generic.go:334] "Generic (PLEG): container finished" podID="7013527e-73de-4427-af9c-e33663b1c222" containerID="36e952fdc4745c127dee2c79ee17c081bca784c7d35bb530d63619c963442937" exitCode=1 Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.740923 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" event={"ID":"7013527e-73de-4427-af9c-e33663b1c222","Type":"ContainerDied","Data":"36e952fdc4745c127dee2c79ee17c081bca784c7d35bb530d63619c963442937"} Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.741645 5002 scope.go:117] "RemoveContainer" containerID="36e952fdc4745c127dee2c79ee17c081bca784c7d35bb530d63619c963442937" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.748431 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.748478 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.748496 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.748516 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.748529 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:43Z","lastTransitionTime":"2025-12-09T10:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.771906 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48170a45-0766-4c86-af19-b829960de244\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3df655bf1ce56b5aef6728759b8b3262260171afd0e4924212afc506fa313e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e29bcc86bc8035d73f3f12857d0024d93752d70d6a77b51d98278981669ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c750385ffbe9096e2cef43dffe002fd49599e44ed69806710b492749637dbe93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73ac21971d31e17f4f76bcbb1e02201b53b12189815ced304d7913f6aa76f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd16d2319de83381932ca61f8b08dd31ecfde33e270c3ea2ea3edb3c2fa174b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:43Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.791013 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b484d6dfa4fffc554c21f9d9bc8e77a62e96ebb9426f2bb34dea4b2c9244d680\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://301eb2b68a37707ea483536d639e5f476d3cb41ad8c60fbaea2e019fb51bad48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49d7b26bbe255f1217808981337d8190bdeac4f5008ee17df5242867e3103e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1209 10:01:25.592088 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 10:01:25.592207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 10:01:25.593930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3155801387/tls.crt::/tmp/serving-cert-3155801387/tls.key\\\\\\\"\\\\nI1209 10:01:25.900616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 10:01:25.902593 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 10:01:25.902611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 10:01:25.902628 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 10:01:25.902633 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 10:01:25.906542 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 10:01:25.906611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906618 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 10:01:25.906630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 10:01:25.906634 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 10:01:25.906639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 10:01:25.906563 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 10:01:25.907863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bd67b1b7ad3e21872e9aa91e61c978e0999546882ee2e21347d4f60e435e0b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:43Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.810013 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab29aa6d19002efb0309c548e059e004c5002ccde634df95e3c2661a3e81207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcda2271939034d8c6c54fa3d648500ebda150fafcce9216338fad68552a65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:43Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.824784 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c149ed39076fc7ee5538e60dbc0a8fc303a21578e5cc3ac89a3aeaad2c21c6a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:43Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.836043 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d199b778c03f10fc3b1a2623600057801f54ee3240768ede1a79213b678fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:43Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.848233 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:43Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.850851 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.850927 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.850940 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.850980 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.850994 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:43Z","lastTransitionTime":"2025-12-09T10:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.861455 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rgf44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ed6e93-eda5-4648-b185-25d2960ce0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541795eab5847f1d993b8f9f324b454b951ed3930155e455f013e8da805c019b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh57p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rgf44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:43Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.872270 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bc64859-6675-4dc6-b0a1-579abb87580e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0db847425b24ea6804034220f2050b153b78d21bc1cc934dad6784c11c68dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a41e95e38748ef89ff0bc6429eb223b7821756bf1e0c84a3af512f4f0166a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea3ea4cb1e3f00acc4ef769928988a0a2c2ee54afa0ab5f040ef50f465a9d6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-oper
ator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1c1315eade2f326ac5feefc45cbcec29c7ee59fb40494f5153b7f8dbdfc404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:43Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.888772 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7013527e-73de-4427-af9c-e33663b1c222\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36e952fdc4745c127dee2c79ee17c081bca784c7
d35bb530d63619c963442937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36e952fdc4745c127dee2c79ee17c081bca784c7d35bb530d63619c963442937\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T10:01:42Z\\\",\\\"message\\\":\\\"topping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 10:01:41.551903 6292 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 10:01:41.551957 6292 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 10:01:41.551991 6292 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 10:01:41.552129 6292 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 10:01:41.552456 6292 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 10:01:41.552499 6292 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1209 10:01:41.552527 6292 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1209 10:01:41.552537 6292 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1209 10:01:41.552557 6292 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1209 10:01:41.552545 6292 factory.go:656] Stopping watch factory\\\\nI1209 10:01:41.552577 6292 handler.go:208] Removed *v1.EgressFirewall 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2mnnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:43Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.899439 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49c6392-68b2-4847-9291-a0b4d9c1cbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a019843745b7d5198565771c39f7949cf45738e236a2283588db2e57d07f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44da8a223abf131b459b827b0e8de65b415150f406fe22f2efb7e160cba4166c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kxpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:43Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.911047 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7t88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edef2440-1e4a-4676-9517-08b21b3b66ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc19a171af6f6875b4d953edd75048a1249b44348ba03757126ebe943c118be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api
-access-mzs5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7t88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:43Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.923997 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:43Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.936582 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:43Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.948583 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t5hm5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de8c639-f176-405c-ae34-6717f9f9458c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097fa8d84999c942645591541e1377c08cdeec593f64525db5a5aa8f7ec8cbae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9wwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t5hm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-09T10:01:43Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.953751 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.953804 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.953856 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.953878 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:43 crc kubenswrapper[5002]: I1209 10:01:43.953894 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:43Z","lastTransitionTime":"2025-12-09T10:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.021545 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zjdhx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60fd7aba09010bae4f15fe793e0084c71d381f63bc4c1549f2ccbe57cdb90ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f
8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/e
tc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210\\\",\\\"exitC
ode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zjdhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:44Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.056361 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.056396 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.056407 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.056421 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.056430 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:44Z","lastTransitionTime":"2025-12-09T10:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.059790 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:01:44 crc kubenswrapper[5002]: E1209 10:01:44.059944 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.159367 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.159414 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.159430 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.159447 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.159458 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:44Z","lastTransitionTime":"2025-12-09T10:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.263303 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.263372 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.263391 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.263416 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.263432 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:44Z","lastTransitionTime":"2025-12-09T10:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.366592 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.366671 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.366697 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.366728 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.366751 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:44Z","lastTransitionTime":"2025-12-09T10:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.469446 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.469477 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.469486 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.469499 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.469508 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:44Z","lastTransitionTime":"2025-12-09T10:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.514950 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m689k"] Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.515682 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m689k" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.517429 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.518389 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.536094 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7013527e-73de-4427-af9c-e33663b1c222\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts
\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{
\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36e952fdc4745c127dee2c79ee17c081bca784c7d35bb530d63619c963442937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36e952fdc4745c127dee2c79ee17c081bca784c7d35bb530d63619c963442937\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T10:01:42Z\\\",\\\"message\\\":\\\"topping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 10:01:41.551903 6292 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 10:01:41.551957 6292 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 10:01:41.551991 6292 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 10:01:41.552129 6292 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 10:01:41.552456 6292 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 10:01:41.552499 6292 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1209 10:01:41.552527 6292 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1209 10:01:41.552537 6292 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1209 
10:01:41.552557 6292 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1209 10:01:41.552545 6292 factory.go:656] Stopping watch factory\\\\nI1209 10:01:41.552577 6292 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"in
itContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2mnnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:44Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.543994 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg8sd\" (UniqueName: \"kubernetes.io/projected/02305623-2d65-47e3-ac63-5182bf50d141-kube-api-access-dg8sd\") pod \"ovnkube-control-plane-749d76644c-m689k\" (UID: \"02305623-2d65-47e3-ac63-5182bf50d141\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m689k" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.544041 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/02305623-2d65-47e3-ac63-5182bf50d141-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-m689k\" (UID: \"02305623-2d65-47e3-ac63-5182bf50d141\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m689k" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.544059 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/02305623-2d65-47e3-ac63-5182bf50d141-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-m689k\" (UID: \"02305623-2d65-47e3-ac63-5182bf50d141\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m689k" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.544078 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/02305623-2d65-47e3-ac63-5182bf50d141-env-overrides\") pod \"ovnkube-control-plane-749d76644c-m689k\" (UID: \"02305623-2d65-47e3-ac63-5182bf50d141\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m689k" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.547690 5002 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m689k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02305623-2d65-47e3-ac63-5182bf50d141\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg8sd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg8sd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m689k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:44Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.558432 5002 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:44Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.571294 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49c6392-68b2-4847-9291-a0b4d9c1cbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a019843745b7d5198565771c39f7949cf45738e236a2283588db2e57d07f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44da8a223abf131b459b827b0e8de65b415150f406fe22f2efb7e160cba4166c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kxpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:44Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.571998 5002 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.572036 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.572047 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.572063 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.572073 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:44Z","lastTransitionTime":"2025-12-09T10:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.581762 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7t88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edef2440-1e4a-4676-9517-08b21b3b66ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc19a171af6f6875b4d953edd75048a1249b44348ba03757126ebe943c118be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzs5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7t88\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:44Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.595442 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:44Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.605438 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t5hm5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de8c639-f176-405c-ae34-6717f9f9458c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097fa8d84999c942645591541e1377c08cdeec593f64525db5a5aa8f7ec8cbae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9wwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t5hm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:44Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.624199 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zjdhx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60fd7aba09010bae4f15fe793e0084c71d381f63bc4c1549f2ccbe57cdb90ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zjdhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:44Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.638181 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:44Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.645517 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dg8sd\" (UniqueName: \"kubernetes.io/projected/02305623-2d65-47e3-ac63-5182bf50d141-kube-api-access-dg8sd\") pod \"ovnkube-control-plane-749d76644c-m689k\" (UID: \"02305623-2d65-47e3-ac63-5182bf50d141\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m689k" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.645573 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/02305623-2d65-47e3-ac63-5182bf50d141-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-m689k\" (UID: \"02305623-2d65-47e3-ac63-5182bf50d141\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m689k" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.645601 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/02305623-2d65-47e3-ac63-5182bf50d141-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-m689k\" (UID: \"02305623-2d65-47e3-ac63-5182bf50d141\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m689k" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.645625 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/02305623-2d65-47e3-ac63-5182bf50d141-env-overrides\") pod \"ovnkube-control-plane-749d76644c-m689k\" (UID: \"02305623-2d65-47e3-ac63-5182bf50d141\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m689k" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.646275 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/02305623-2d65-47e3-ac63-5182bf50d141-env-overrides\") pod \"ovnkube-control-plane-749d76644c-m689k\" (UID: \"02305623-2d65-47e3-ac63-5182bf50d141\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m689k" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.646367 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/02305623-2d65-47e3-ac63-5182bf50d141-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-m689k\" (UID: \"02305623-2d65-47e3-ac63-5182bf50d141\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m689k" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.650914 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/02305623-2d65-47e3-ac63-5182bf50d141-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-m689k\" (UID: \"02305623-2d65-47e3-ac63-5182bf50d141\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m689k" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.654319 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rgf44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ed6e93-eda5-4648-b185-25d2960ce0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541795eab5847f1d993b8f9f324b454b951ed3930155e455f013e8da805c019b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh57p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rgf44\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:44Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.662753 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg8sd\" (UniqueName: \"kubernetes.io/projected/02305623-2d65-47e3-ac63-5182bf50d141-kube-api-access-dg8sd\") pod \"ovnkube-control-plane-749d76644c-m689k\" (UID: \"02305623-2d65-47e3-ac63-5182bf50d141\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m689k" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.668285 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bc64859-6675-4dc6-b0a1-579abb87580e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0db847425b24ea6804034220f2050b153b78d21bc1cc934dad6784c11c68dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a41e95e38748ef89ff0bc6429eb223b7821756bf1e0c84a3af512f4f0166a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea3ea4cb1e3f00acc4ef769928988a0a2c2ee54afa0ab5f040ef50f465a9d6a\\\",\\\"image\\\":\\\"quay.io/
crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1c1315eade2f326ac5feefc45cbcec29c7ee59fb40494f5153b7f8dbdfc404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:44Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.674771 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.674791 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.674801 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.674830 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.674842 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:44Z","lastTransitionTime":"2025-12-09T10:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.688153 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48170a45-0766-4c86-af19-b829960de244\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3df655bf1ce56b5aef6728759b8b3262260171afd0e4924212afc506fa313e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e29bcc86bc8035d73f3f12857d0024d93752d70d6a77b51d98278981669ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c750385ffbe9096e2cef43dffe002fd49599e44ed69806710b492749637dbe93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73ac21971d31e17f4f76bcbb1e02201b53b12189815ced304d7913f6aa76f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd16d2319de83381932ca61f8b08dd31ecfde33e270c3ea2ea3edb3c2fa174b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:44Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.727455 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b484d6dfa4fffc554c21f9d9bc8e77a62e96ebb9426f2bb34dea4b2c9244d680\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://301eb2b68a37707ea483536d639e5f476d3cb41ad8c60fbaea2e019fb51bad48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49d7b26bbe255f1217808981337d8190bdeac4f5008ee17df5242867e3103e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1209 10:01:25.592088 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 10:01:25.592207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 10:01:25.593930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3155801387/tls.crt::/tmp/serving-cert-3155801387/tls.key\\\\\\\"\\\\nI1209 10:01:25.900616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 10:01:25.902593 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 10:01:25.902611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 10:01:25.902628 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 10:01:25.902633 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 10:01:25.906542 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 10:01:25.906611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906618 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 10:01:25.906630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 10:01:25.906634 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 10:01:25.906639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 10:01:25.906563 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 10:01:25.907863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bd67b1b7ad3e21872e9aa91e61c978e0999546882ee2e21347d4f60e435e0b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:44Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.746626 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2mnnl_7013527e-73de-4427-af9c-e33663b1c222/ovnkube-controller/0.log" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.751877 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab29aa6d19002efb0309c548e059e004c5002ccde634df95e3c2661a3e81207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcda2271939034d8c6c54fa3d648500ebda150fafcce9216338fad68552a65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:44Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.754455 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" event={"ID":"7013527e-73de-4427-af9c-e33663b1c222","Type":"ContainerStarted","Data":"f9ddee8ee4fd0e1234ecff411ce29f6bad1943263a66cbb3117da6f1b6e84b47"} Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.754598 5002 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 09 10:01:44 
crc kubenswrapper[5002]: I1209 10:01:44.763079 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c149ed39076fc7ee5538e60dbc0a8fc303a21578e5cc3ac89a3aeaad2c21c6a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:44Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.772602 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d199b778c03f10fc3b1a2623600057801f54ee3240768ede1a79213b678fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:44Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.778735 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.778758 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.778765 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.778777 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.778786 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:44Z","lastTransitionTime":"2025-12-09T10:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.783135 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:44Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.792945 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t5hm5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de8c639-f176-405c-ae34-6717f9f9458c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097fa8d84999c942645591541e1377c08cdeec593f64525db5a5aa8f7ec8cbae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9wwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t5hm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:44Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.805403 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zjdhx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60fd7aba09010bae4f15fe793e0084c71d381f63bc4c1549f2ccbe57cdb90ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zjdhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:44Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.816222 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c149ed39076fc7ee5538e60dbc0a8fc303a21578e5cc3ac89a3aeaad2c21c6a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:44Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.826676 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d199b778c03f10fc3b1a2623600057801f54ee3240768ede1a79213b678fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:44Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.827171 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m689k" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.838202 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:44Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:44 crc kubenswrapper[5002]: W1209 10:01:44.841935 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02305623_2d65_47e3_ac63_5182bf50d141.slice/crio-d00d053e3ebc5b5a0d6218df097badc14bd93cd940cce25c0e10fb73d118b898 WatchSource:0}: Error finding container d00d053e3ebc5b5a0d6218df097badc14bd93cd940cce25c0e10fb73d118b898: Status 404 returned error can't find the container with id d00d053e3ebc5b5a0d6218df097badc14bd93cd940cce25c0e10fb73d118b898 Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.852660 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rgf44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ed6e93-eda5-4648-b185-25d2960ce0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541795eab5847f1d993b8f9f324b454b951ed3930155e455f013e8da805c019b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh57p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rgf44\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:44Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.865984 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bc64859-6675-4dc6-b0a1-579abb87580e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0db847425b24ea6804034220f2050b153b78d21bc1cc934dad6784c11c68dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a41e95e38748ef89ff0bc6429eb223b7821756bf1e0c84a3af512f4f0166a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea3ea4cb1e3f00acc4ef769928988a0a2c2ee54afa0ab5f040ef50f465a9d6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1c1315eade2f326ac5feefc45cbcec29c7ee59fb40494f5153b7f8dbdfc404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:44Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.882373 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.882411 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.882420 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.882434 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.882443 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:44Z","lastTransitionTime":"2025-12-09T10:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.887539 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48170a45-0766-4c86-af19-b829960de244\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3df655bf1ce56b5aef6728759b8b3262260171afd0e4924212afc506fa313e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e29bcc86bc8035d73f3f12857d0024d93752d70d6a77b51d98278981669ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c750385ffbe9096e2cef43dffe002fd49599e44ed69806710b492749637dbe93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73ac21971d31e17f4f76bcbb1e02201b53b12189815ced304d7913f6aa76f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd16d2319de83381932ca61f8b08dd31ecfde33e270c3ea2ea3edb3c2fa174b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:44Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.899362 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b484d6dfa4fffc554c21f9d9bc8e77a62e96ebb9426f2bb34dea4b2c9244d680\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://301eb2b68a37707ea483536d639e5f476d3cb41ad8c60fbaea2e019fb51bad48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49d7b26bbe255f1217808981337d8190bdeac4f5008ee17df5242867e3103e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1209 10:01:25.592088 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 10:01:25.592207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 10:01:25.593930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3155801387/tls.crt::/tmp/serving-cert-3155801387/tls.key\\\\\\\"\\\\nI1209 10:01:25.900616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 10:01:25.902593 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 10:01:25.902611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 10:01:25.902628 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 10:01:25.902633 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 10:01:25.906542 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 10:01:25.906611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906618 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 10:01:25.906630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 10:01:25.906634 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 10:01:25.906639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 10:01:25.906563 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 10:01:25.907863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bd67b1b7ad3e21872e9aa91e61c978e0999546882ee2e21347d4f60e435e0b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:44Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.919964 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab29aa6d19002efb0309c548e059e004c5002ccde634df95e3c2661a3e81207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcda2271939034d8c6c54fa3d648500ebda150fafcce9216338fad68552a65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:44Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.936194 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7013527e-73de-4427-af9c-e33663b1c222\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ddee8ee4fd0e1234ecff411ce29f6bad1943263a66cbb3117da6f1b6e84b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36e952fdc4745c127dee2c79ee17c081bca784c7d35bb530d63619c963442937\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T10:01:42Z\\\",\\\"message\\\":\\\"topping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 10:01:41.551903 6292 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 10:01:41.551957 6292 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 10:01:41.551991 6292 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 10:01:41.552129 6292 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 10:01:41.552456 6292 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 10:01:41.552499 6292 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1209 10:01:41.552527 6292 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1209 10:01:41.552537 6292 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1209 10:01:41.552557 6292 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1209 10:01:41.552545 6292 factory.go:656] Stopping watch factory\\\\nI1209 10:01:41.552577 6292 handler.go:208] Removed *v1.EgressFirewall 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2mnnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:44Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.946610 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m689k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02305623-2d65-47e3-ac63-5182bf50d141\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg8sd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg8sd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m689k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:44Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.971928 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:44Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.984533 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.984572 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.984583 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.984599 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.984611 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:44Z","lastTransitionTime":"2025-12-09T10:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:44 crc kubenswrapper[5002]: I1209 10:01:44.988259 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49c6392-68b2-4847-9291-a0b4d9c1cbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a019843745b7d5198565771c39f7949cf45738e236a2283588db2e57d07f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44da8a223abf131b459b827b0e8de65b415150f406fe22f2efb7e160cba4166c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kxpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:44Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.000427 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7t88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edef2440-1e4a-4676-9517-08b21b3b66ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc19a171af6f6875b4d953edd75048a1249b44348ba03757126ebe943c118be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzs5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7t88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:44Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.060022 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.060097 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:01:45 crc kubenswrapper[5002]: E1209 10:01:45.060155 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:01:45 crc kubenswrapper[5002]: E1209 10:01:45.060232 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.086284 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.086307 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.086314 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.086327 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.086335 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:45Z","lastTransitionTime":"2025-12-09T10:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.190242 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.190548 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.190564 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.190578 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.190589 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:45Z","lastTransitionTime":"2025-12-09T10:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.293350 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.293386 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.293395 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.293409 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.293418 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:45Z","lastTransitionTime":"2025-12-09T10:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.396339 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.396401 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.396414 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.396441 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.396459 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:45Z","lastTransitionTime":"2025-12-09T10:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.499289 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.499337 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.499350 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.499372 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.499385 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:45Z","lastTransitionTime":"2025-12-09T10:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.601549 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.601595 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.601606 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.601624 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.601646 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:45Z","lastTransitionTime":"2025-12-09T10:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.704185 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.704262 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.704284 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.704346 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.704365 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:45Z","lastTransitionTime":"2025-12-09T10:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.759029 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m689k" event={"ID":"02305623-2d65-47e3-ac63-5182bf50d141","Type":"ContainerStarted","Data":"3c8fa17f92ca9a774f62e20c5eeec59041bee23e426c36b2949d4f82c7e45f73"} Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.759100 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m689k" event={"ID":"02305623-2d65-47e3-ac63-5182bf50d141","Type":"ContainerStarted","Data":"e93e73de78b5120b5d2bf38748e84dad9dd5353e18130635243988d7b131ace3"} Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.759127 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m689k" event={"ID":"02305623-2d65-47e3-ac63-5182bf50d141","Type":"ContainerStarted","Data":"d00d053e3ebc5b5a0d6218df097badc14bd93cd940cce25c0e10fb73d118b898"} Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.761261 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2mnnl_7013527e-73de-4427-af9c-e33663b1c222/ovnkube-controller/1.log" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.761898 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2mnnl_7013527e-73de-4427-af9c-e33663b1c222/ovnkube-controller/0.log" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.769278 5002 generic.go:334] "Generic (PLEG): container finished" podID="7013527e-73de-4427-af9c-e33663b1c222" containerID="f9ddee8ee4fd0e1234ecff411ce29f6bad1943263a66cbb3117da6f1b6e84b47" exitCode=1 Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.769330 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" event={"ID":"7013527e-73de-4427-af9c-e33663b1c222","Type":"ContainerDied","Data":"f9ddee8ee4fd0e1234ecff411ce29f6bad1943263a66cbb3117da6f1b6e84b47"} Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.769399 5002 scope.go:117] "RemoveContainer" containerID="36e952fdc4745c127dee2c79ee17c081bca784c7d35bb530d63619c963442937" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.770149 5002 scope.go:117] "RemoveContainer" containerID="f9ddee8ee4fd0e1234ecff411ce29f6bad1943263a66cbb3117da6f1b6e84b47" Dec 09 10:01:45 crc kubenswrapper[5002]: E1209 10:01:45.770302 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-2mnnl_openshift-ovn-kubernetes(7013527e-73de-4427-af9c-e33663b1c222)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" podUID="7013527e-73de-4427-af9c-e33663b1c222" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.777446 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bc64859-6675-4dc6-b0a1-579abb87580e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0db847425b24ea6804034220f2050b153b78d21bc1cc934dad6784c11c68dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a41e95e38748ef89ff0bc6429eb223b7821756bf1e0c84a3af512f4f0166a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea3ea4cb1e3f00acc4ef769928988a0a2c2ee54afa0ab5f040ef50f465a9d6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1c1315eade2f326ac5feefc45cbcec29c7ee59fb40494f5153b7f8dbdfc404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:45Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.802866 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48170a45-0766-4c86-af19-b829960de244\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3df655bf1ce56b5aef6728759b8b3262260171afd0e4924212afc506fa313e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e29bcc86bc8035d73f3f12857d0024d93752d70d6a77b51d98278981669ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07
b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c750385ffbe9096e2cef43dffe002fd49599e44ed69806710b492749637dbe93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73ac21971d31e17f4f76bcbb1e02201b53b12189815ced304d7913f6aa76f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd16d2319de83381932ca61f8b08dd31ecfde33e270c3ea2ea3edb3c2fa174b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:45Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.806609 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.806676 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.806699 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.806728 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.806749 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:45Z","lastTransitionTime":"2025-12-09T10:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.816509 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b484d6dfa4fffc554c21f9d9bc8e77a62e96ebb9426f2bb34dea4b2c9244d680\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://301eb2b68a37707ea483536d639e5f476d3cb41ad8c60fbaea2e019fb51bad48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49d7b26bbe255f1217808981337d8190bdeac4f5008ee17df5242867e3103e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1209 10:01:25.592088 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 10:01:25.592207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 10:01:25.593930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3155801387/tls.crt::/tmp/serving-cert-3155801387/tls.key\\\\\\\"\\\\nI1209 10:01:25.900616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 10:01:25.902593 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 10:01:25.902611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 10:01:25.902628 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 10:01:25.902633 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 10:01:25.906542 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 10:01:25.906611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906618 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 10:01:25.906630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 10:01:25.906634 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 10:01:25.906639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 10:01:25.906563 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 10:01:25.907863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bd67b1b7ad3e21872e9aa91e61c978e0999546882ee2e21347d4f60e435e0b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:45Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.828581 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab29aa6d19002efb0309c548e059e004c5002ccde634df95e3c2661a3e81207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcda2271939034d8c6c54fa3d648500ebda150fafcce9216338fad68552a65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:45Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.842415 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c149ed39076fc7ee5538e60dbc0a8fc303a21578e5cc3ac89a3aeaad2c21c6a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:45Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.855762 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d199b778c03f10fc3b1a2623600057801f54ee3240768ede1a79213b678fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:45Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.858524 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.872943 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:45Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.883074 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rgf44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ed6e93-eda5-4648-b185-25d2960ce0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541795eab5847f1d993b8f9f324b454b951ed3930155e455f013e8da805c019b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"syste
m-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh57p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rgf44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:45Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.899013 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7013527e-73de-4427-af9c-e33663b1c222\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ddee8ee4fd0e1234ecff411ce29f6bad194326
3a66cbb3117da6f1b6e84b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36e952fdc4745c127dee2c79ee17c081bca784c7d35bb530d63619c963442937\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T10:01:42Z\\\",\\\"message\\\":\\\"topping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 10:01:41.551903 6292 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 10:01:41.551957 6292 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 10:01:41.551991 6292 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 10:01:41.552129 6292 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 10:01:41.552456 6292 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 10:01:41.552499 6292 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1209 10:01:41.552527 6292 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1209 10:01:41.552537 6292 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1209 10:01:41.552557 6292 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1209 10:01:41.552545 6292 factory.go:656] Stopping watch factory\\\\nI1209 10:01:41.552577 6292 handler.go:208] Removed *v1.EgressFirewall 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2mnnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:45Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.908577 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.908619 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.908631 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.908649 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.908662 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:45Z","lastTransitionTime":"2025-12-09T10:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.912274 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m689k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02305623-2d65-47e3-ac63-5182bf50d141\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93e73de78b5120b5d2bf38748e84dad9dd5353e18130635243988d7b131ace3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg8sd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c8fa17f92ca9a774f62e20c5eeec59041bee23e426c36b2949d4f82c7e45f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg8sd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m689k\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:45Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.924631 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:45Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.939295 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49c6392-68b2-4847-9291-a0b4d9c1cbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a019843745b7d5198565771c39f7949cf45738e236a2283588db2e57d07f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44da8a223abf131b459b827b0e8de65b415150f406fe22f2efb7e160cba4166c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kxpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:45Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.952305 5002 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-p7t88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edef2440-1e4a-4676-9517-08b21b3b66ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc19a171af6f6875b4d953edd75048a1249b44348ba03757126ebe943c118be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzs5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7t88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:45Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.966427 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:45Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.977805 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t5hm5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de8c639-f176-405c-ae34-6717f9f9458c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097fa8d84999c942645591541e1377c08cdeec593f64525db5a5aa8f7ec8cbae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9wwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t5hm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:45Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.994749 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zjdhx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60fd7aba09010bae4f15fe793e0084c71d381f63bc4c1549f2ccbe57cdb90ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zjdhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:45Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.996191 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-98z2f"] Dec 09 10:01:45 crc kubenswrapper[5002]: I1209 10:01:45.996791 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:01:45 crc kubenswrapper[5002]: E1209 10:01:45.996922 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.008080 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:46Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.010318 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.010352 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.010361 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.010376 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.010386 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:46Z","lastTransitionTime":"2025-12-09T10:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.022385 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49c6392-68b2-4847-9291-a0b4d9c1cbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a019843745b7d5198565771c39f7949cf45738e236a2283588db2e57d07f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44da8a223abf131b459b827b0e8de65b415150f406fe22f2efb7e160cba4166c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kxpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:46Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.036877 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7t88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edef2440-1e4a-4676-9517-08b21b3b66ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc19a171af6f6875b4d953edd75048a1249b44348ba03757126ebe943c118be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzs5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7t88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:46Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.049262 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:46Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.058841 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t5hm5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de8c639-f176-405c-ae34-6717f9f9458c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097fa8d84999c942645591541e1377c08cdeec593f64525db5a5aa8f7ec8cbae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9wwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t5hm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:46Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.060331 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:01:46 crc kubenswrapper[5002]: E1209 10:01:46.060524 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.060577 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ffb94c3-624e-48aa-aaa9-450ace4e1862-metrics-certs\") pod \"network-metrics-daemon-98z2f\" (UID: \"7ffb94c3-624e-48aa-aaa9-450ace4e1862\") " pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.060675 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn62f\" (UniqueName: \"kubernetes.io/projected/7ffb94c3-624e-48aa-aaa9-450ace4e1862-kube-api-access-mn62f\") pod \"network-metrics-daemon-98z2f\" (UID: \"7ffb94c3-624e-48aa-aaa9-450ace4e1862\") " pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.072602 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zjdhx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60fd7aba09010bae4f15fe793e0084c71d381f63bc4c1549f2ccbe57cdb90ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"n
ame\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zjdhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:46Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.083570 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bc64859-6675-4dc6-b0a1-579abb87580e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0db847425b24ea6804034220f2050b153b78d21bc1cc934dad6784c11c68dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a41e95e38748ef89ff0bc6429eb223b7821756bf1e0c84a3af512f4f0166a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea3ea4cb1e3f00acc4
ef769928988a0a2c2ee54afa0ab5f040ef50f465a9d6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1c1315eade2f326ac5feefc45cbcec29c7ee59fb40494f5153b7f8dbdfc404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:46Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.101679 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48170a45-0766-4c86-af19-b829960de244\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3df655bf1ce56b5aef6728759b8b3262260171afd0e4924212afc506fa313e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e29bcc86bc8035d73f3f12857d0024d93752d70d6a77b51d98278981669ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c750385ffbe9096e2cef43dffe002fd49599e44ed69806710b492749637dbe93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73ac21971d31e17f4f76bcbb1e02201b53b12
189815ced304d7913f6aa76f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd16d2319de83381932ca61f8b08dd31ecfde33e270c3ea2ea3edb3c2fa174b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:46Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.112942 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.112992 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.113008 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.113028 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.113043 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:46Z","lastTransitionTime":"2025-12-09T10:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.159187 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b484d6dfa4fffc554c21f9d9bc8e77a62e96ebb9426f2bb34dea4b2c9244d680\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://301eb2b68a37707ea483536d639e5f476d3cb41ad8c60fbaea2e019fb51bad48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49d7b26bbe255f1217808981337d8190bdeac4f5008ee17df5242867e3103e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1209 10:01:25.592088 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 10:01:25.592207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 10:01:25.593930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3155801387/tls.crt::/tmp/serving-cert-3155801387/tls.key\\\\\\\"\\\\nI1209 10:01:25.900616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 10:01:25.902593 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 10:01:25.902611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 10:01:25.902628 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 10:01:25.902633 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 10:01:25.906542 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 10:01:25.906611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906618 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 10:01:25.906630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 10:01:25.906634 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 10:01:25.906639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 10:01:25.906563 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 10:01:25.907863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bd67b1b7ad3e21872e9aa91e61c978e0999546882ee2e21347d4f60e435e0b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:46Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.161594 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn62f\" (UniqueName: \"kubernetes.io/projected/7ffb94c3-624e-48aa-aaa9-450ace4e1862-kube-api-access-mn62f\") pod \"network-metrics-daemon-98z2f\" (UID: \"7ffb94c3-624e-48aa-aaa9-450ace4e1862\") " pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.161664 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ffb94c3-624e-48aa-aaa9-450ace4e1862-metrics-certs\") pod \"network-metrics-daemon-98z2f\" (UID: \"7ffb94c3-624e-48aa-aaa9-450ace4e1862\") " pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:01:46 crc kubenswrapper[5002]: E1209 
10:01:46.161781 5002 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 10:01:46 crc kubenswrapper[5002]: E1209 10:01:46.161864 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ffb94c3-624e-48aa-aaa9-450ace4e1862-metrics-certs podName:7ffb94c3-624e-48aa-aaa9-450ace4e1862 nodeName:}" failed. No retries permitted until 2025-12-09 10:01:46.661847373 +0000 UTC m=+39.053898454 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7ffb94c3-624e-48aa-aaa9-450ace4e1862-metrics-certs") pod "network-metrics-daemon-98z2f" (UID: "7ffb94c3-624e-48aa-aaa9-450ace4e1862") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.174055 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab29aa6d19002efb0309c548e059e004c5002ccde634df95e3c2661a3e81207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcda2271939034d8c6c54fa3d648500ebda150fafcce9216338fad68552a65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPa
th\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:46Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.183679 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn62f\" (UniqueName: \"kubernetes.io/projected/7ffb94c3-624e-48aa-aaa9-450ace4e1862-kube-api-access-mn62f\") pod \"network-metrics-daemon-98z2f\" (UID: \"7ffb94c3-624e-48aa-aaa9-450ace4e1862\") " pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.184830 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c149ed39076fc7ee5538e60dbc0a8fc303a21578e5cc3ac89a3aeaad2c21c6a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:46Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.197007 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d199b778c03f10fc3b1a2623600057801f54ee3240768ede1a79213b678fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:46Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.209957 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:46Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.215487 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.215510 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.215518 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.215532 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.215541 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:46Z","lastTransitionTime":"2025-12-09T10:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.222859 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rgf44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ed6e93-eda5-4648-b185-25d2960ce0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541795eab5847f1d993b8f9f324b454b951ed3930155e455f013e8da805c019b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh57p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rgf44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:46Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.239655 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7013527e-73de-4427-af9c-e33663b1c222\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\
\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ddee8ee4fd0e1234ecff411ce29f6bad1943263a66cbb3117da6f1b6e84b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36e952fdc4745c127dee2c79ee17c081bca784c7d35bb530d63619c963442937\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T10:01:42Z\\\",\\\"message\\\":\\\"topping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 10:01:41.551903 6292 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 10:01:41.551957 6292 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 10:01:41.551991 6292 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 10:01:41.552129 6292 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 10:01:41.552456 6292 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 10:01:41.552499 6292 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1209 10:01:41.552527 6292 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1209 10:01:41.552537 6292 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1209 10:01:41.552557 6292 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1209 10:01:41.552545 6292 factory.go:656] Stopping watch 
factory\\\\nI1209 10:01:41.552577 6292 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ddee8ee4fd0e1234ecff411ce29f6bad1943263a66cbb3117da6f1b6e84b47\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"message\\\":\\\"UID: UUIDName:}]\\\\nI1209 10:01:45.368189 6439 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1209 10:01:45.368217 6439 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1209 10:01:45.368249 6439 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:45Z is after 2025-08-24T17:21:41Z]\\\\nI1209 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099
482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2mnnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:46Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.250608 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m689k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02305623-2d65-47e3-ac63-5182bf50d141\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93e73de78b5120b5d2bf38748e84dad9dd5353e18130635243988d7b131ace3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg8sd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c8fa17f92ca9a774f62e20c5eeec59041bee23e426c36b2949d4f82c7e45f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg8sd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m689k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:46Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.265268 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zjdhx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60fd7aba09010bae4f15fe793e0084c71d381f63bc4c1549f2ccbe57cdb90ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zjdhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:46Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.275887 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-98z2f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ffb94c3-624e-48aa-aaa9-450ace4e1862\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mn62f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mn62f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-98z2f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:46Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.291403 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:46Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.304065 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t5hm5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de8c639-f176-405c-ae34-6717f9f9458c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097fa8d84999c942645591541e1377c08cdeec593f64525db5a5aa8f7ec8cbae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9wwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t5hm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:46Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.318258 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.318300 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.318309 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.318324 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.318334 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:46Z","lastTransitionTime":"2025-12-09T10:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.320628 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab29aa6d19002efb0309c548e059e004c5002ccde634df95e3c2661a3e81207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcda2271939034d8c6c54fa3d648500ebda150fafcce9216338fad68552a65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:46Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.331170 5002 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c149ed39076fc7ee5538e60dbc0a8fc303a21578e5cc3ac89a3aeaad2c21c6a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:46Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.341619 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d199b778c03f10fc3b1a2623600057801f54ee3240768ede1a79213b678fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:46Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.352995 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:46Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.365671 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rgf44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ed6e93-eda5-4648-b185-25d2960ce0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541795eab5847f1d993b8f9f324b454b951ed3930155e455f013e8da805c019b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh57p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rgf44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:46Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.378151 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bc64859-6675-4dc6-b0a1-579abb87580e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0db847425b24ea6804034220f2050b153b78d21bc1cc934dad6784c11c68dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a41e95e38748ef89ff0bc6429eb223b7821756bf1e0c84a3af512f4f0166a98\\\",\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea3ea4cb1e3f00acc4ef769928988a0a2c2ee54afa0ab5f040ef50f465a9d6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1c1315eade2f326ac5feefc45cbcec29c7ee59fb40494f5153b7f8dbdfc404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:46Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.399658 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48170a45-0766-4c86-af19-b829960de244\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3df655bf1ce56b5aef6728759b8b3262260171afd0e4924212afc506fa313e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e29bcc86bc8035d73f3f12857d0024d93752d70d6a77b51d98278981669ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c750385ffbe9096e2cef43dffe002fd49599e44ed69806710b492749637dbe93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73ac21971d31e17f4f76bcbb1e02201b53b12
189815ced304d7913f6aa76f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd16d2319de83381932ca61f8b08dd31ecfde33e270c3ea2ea3edb3c2fa174b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:46Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.412540 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b484d6dfa4fffc554c21f9d9bc8e77a6
2e96ebb9426f2bb34dea4b2c9244d680\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://301eb2b68a37707ea483536d639e5f476d3cb41ad8c60fbaea2e019fb51bad48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49d7b26bbe255f1217808981337d8190bdeac4f5008ee17df5242867e3103e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1209 10:01:25.592088 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 10:01:25.592207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 10:01:25.593930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3155801387/tls.crt::/tmp/serving-cert-3155801387/tls.key\\\\\\\"\\\\nI1209 10:01:25.900616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 10:01:25.902593 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 10:01:25.902611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 10:01:25.902628 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 10:01:25.902633 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 10:01:25.906542 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 10:01:25.906611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906618 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 10:01:25.906630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 10:01:25.906634 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 10:01:25.906639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 10:01:25.906563 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 10:01:25.907863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bd67b1b7ad3e21872e9aa91e61c978e0999546882ee2e21347d4f60e435e0b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:46Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.420698 5002 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.420751 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.420769 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.420791 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.420803 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:46Z","lastTransitionTime":"2025-12-09T10:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.435610 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7013527e-73de-4427-af9c-e33663b1c222\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ddee8ee4fd0e1234ecff411ce29f6bad194326
3a66cbb3117da6f1b6e84b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36e952fdc4745c127dee2c79ee17c081bca784c7d35bb530d63619c963442937\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T10:01:42Z\\\",\\\"message\\\":\\\"topping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 10:01:41.551903 6292 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 10:01:41.551957 6292 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 10:01:41.551991 6292 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 10:01:41.552129 6292 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 10:01:41.552456 6292 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 10:01:41.552499 6292 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1209 10:01:41.552527 6292 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1209 10:01:41.552537 6292 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1209 10:01:41.552557 6292 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1209 10:01:41.552545 6292 factory.go:656] Stopping watch factory\\\\nI1209 10:01:41.552577 6292 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ddee8ee4fd0e1234ecff411ce29f6bad1943263a66cbb3117da6f1b6e84b47\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"message\\\":\\\"UID: UUIDName:}]\\\\nI1209 10:01:45.368189 6439 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1209 10:01:45.368217 6439 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1209 10:01:45.368249 6439 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:45Z is after 2025-08-24T17:21:41Z]\\\\nI1209 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\"
:\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2mnnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:46Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.447664 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m689k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02305623-2d65-47e3-ac63-5182bf50d141\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93e73de78b5120b5d2bf38748e84dad9dd5353e18130635243988d7b131ace3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg8sd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c8fa17f92ca9a774f62e20c5eeec59041bee23e426c36b2949d4f82c7e45f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg8sd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m689k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:46Z is after 2025-08-24T17:21:41Z" Dec 09 
10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.460806 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:46Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.470973 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49c6392-68b2-4847-9291-a0b4d9c1cbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a019843745b7d5198565771c39f7949cf45738e236a2283588db2e57d07f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44da8a223abf131b459b827b0e8de65b415150f406fe22f2efb7e160cba4166c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kxpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:46Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.486778 5002 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-p7t88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edef2440-1e4a-4676-9517-08b21b3b66ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc19a171af6f6875b4d953edd75048a1249b44348ba03757126ebe943c118be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzs5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7t88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:46Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.523710 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.523769 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.523781 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.523799 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.523826 5002 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:46Z","lastTransitionTime":"2025-12-09T10:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.626136 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.626178 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.626191 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.626207 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.626220 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:46Z","lastTransitionTime":"2025-12-09T10:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.666352 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ffb94c3-624e-48aa-aaa9-450ace4e1862-metrics-certs\") pod \"network-metrics-daemon-98z2f\" (UID: \"7ffb94c3-624e-48aa-aaa9-450ace4e1862\") " pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:01:46 crc kubenswrapper[5002]: E1209 10:01:46.666573 5002 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 10:01:46 crc kubenswrapper[5002]: E1209 10:01:46.666650 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ffb94c3-624e-48aa-aaa9-450ace4e1862-metrics-certs podName:7ffb94c3-624e-48aa-aaa9-450ace4e1862 nodeName:}" failed. No retries permitted until 2025-12-09 10:01:47.666629641 +0000 UTC m=+40.058680732 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7ffb94c3-624e-48aa-aaa9-450ace4e1862-metrics-certs") pod "network-metrics-daemon-98z2f" (UID: "7ffb94c3-624e-48aa-aaa9-450ace4e1862") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.729255 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.729300 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.729309 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.729323 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.729332 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:46Z","lastTransitionTime":"2025-12-09T10:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.776054 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2mnnl_7013527e-73de-4427-af9c-e33663b1c222/ovnkube-controller/1.log" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.831924 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.831983 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.832007 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.832035 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.832058 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:46Z","lastTransitionTime":"2025-12-09T10:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.934099 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.934142 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.934151 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.934169 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:46 crc kubenswrapper[5002]: I1209 10:01:46.934179 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:46Z","lastTransitionTime":"2025-12-09T10:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.037191 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.037270 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.037287 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.037312 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.037328 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:47Z","lastTransitionTime":"2025-12-09T10:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.059568 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.059614 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:01:47 crc kubenswrapper[5002]: E1209 10:01:47.059751 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:01:47 crc kubenswrapper[5002]: E1209 10:01:47.059869 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.140923 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.140988 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.141009 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.141032 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.141046 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:47Z","lastTransitionTime":"2025-12-09T10:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.244008 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.244056 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.244068 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.244082 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.244091 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:47Z","lastTransitionTime":"2025-12-09T10:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.346852 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.346900 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.346915 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.346932 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.346944 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:47Z","lastTransitionTime":"2025-12-09T10:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.450220 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.450274 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.450286 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.450303 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.450314 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:47Z","lastTransitionTime":"2025-12-09T10:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.552849 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.552911 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.552928 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.552992 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.553010 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:47Z","lastTransitionTime":"2025-12-09T10:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.656618 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.656687 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.656710 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.656740 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.656762 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:47Z","lastTransitionTime":"2025-12-09T10:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.675309 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ffb94c3-624e-48aa-aaa9-450ace4e1862-metrics-certs\") pod \"network-metrics-daemon-98z2f\" (UID: \"7ffb94c3-624e-48aa-aaa9-450ace4e1862\") " pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:01:47 crc kubenswrapper[5002]: E1209 10:01:47.675481 5002 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 10:01:47 crc kubenswrapper[5002]: E1209 10:01:47.675563 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ffb94c3-624e-48aa-aaa9-450ace4e1862-metrics-certs podName:7ffb94c3-624e-48aa-aaa9-450ace4e1862 nodeName:}" failed. No retries permitted until 2025-12-09 10:01:49.675540037 +0000 UTC m=+42.067591138 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7ffb94c3-624e-48aa-aaa9-450ace4e1862-metrics-certs") pod "network-metrics-daemon-98z2f" (UID: "7ffb94c3-624e-48aa-aaa9-450ace4e1862") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.759788 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.759870 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.759878 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.759891 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.759902 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:47Z","lastTransitionTime":"2025-12-09T10:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.862061 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.862104 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.862115 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.862131 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.862142 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:47Z","lastTransitionTime":"2025-12-09T10:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.964218 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.964253 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.964263 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.964294 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:47 crc kubenswrapper[5002]: I1209 10:01:47.964305 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:47Z","lastTransitionTime":"2025-12-09T10:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.059455 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:01:48 crc kubenswrapper[5002]: E1209 10:01:48.059634 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862" Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.059691 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:01:48 crc kubenswrapper[5002]: E1209 10:01:48.059842 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.065965 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.066010 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.066024 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.066040 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.066052 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:48Z","lastTransitionTime":"2025-12-09T10:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.071103 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49c6392-68b2-4847-9291-a0b4d9c1cbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a019843745b7d5198565771c39f7949cf45738e236a2283588db2e57d07f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44da8a223abf131b459b827b0e8de65b415150f406fe22f2efb7e160cba4166c\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kxpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:48Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.082903 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7t88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edef2440-1e4a-4676-9517-08b21b3b66ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc19a171af6f6875b4d953edd75048a1249b44348ba03757126ebe943c118be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzs5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7t88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:48Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.095553 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:48Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.107721 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:48Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.122943 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t5hm5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de8c639-f176-405c-ae34-6717f9f9458c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097fa8d84999c942645591541e1377c08cdeec593f64525db5a5aa8f7ec8cbae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9wwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t5hm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-09T10:01:48Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.146456 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zjdhx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60fd7aba09010bae4f15fe793e0084c71d381f63bc4c1549f2ccbe57cdb90ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e
02577330049643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-
copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zjdhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:48Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.157169 5002 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-multus/network-metrics-daemon-98z2f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ffb94c3-624e-48aa-aaa9-450ace4e1862\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mn62f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mn62f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-98z2f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:48Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.169334 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.169540 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.169612 
5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.169675 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.169764 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:48Z","lastTransitionTime":"2025-12-09T10:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.180887 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48170a45-0766-4c86-af19-b829960de244\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3df655bf1ce56b5aef6728759b8b3262260171afd0e4924212afc506fa313e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e29bcc86bc8035d73f3f12857d0024d93752d70d6a77b51d98278981669ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-c
erts\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c750385ffbe9096e2cef43dffe002fd49599e44ed69806710b492749637dbe93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73ac21971d31e17f4f76bcbb1e02201b53b12189815ced304d7913f6aa76f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd16d2319de83381932ca61f8b08dd31ecfde33e270c3ea2ea3edb3c2fa174b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\
\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:48Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.201044 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b484d6dfa4fffc554c21f9d9bc8e77a62e96ebb9426f2bb34dea4b2c9244d680\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://301eb2b68a37707ea483536d639e5f476d3cb41ad8c60fbaea2e019fb51bad48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49d7b26bbe255f1217808981337d8190bdeac4f5008ee17df5242867e3103e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1209 10:01:25.592088 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 10:01:25.592207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 10:01:25.593930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3155801387/tls.crt::/tmp/serving-cert-3155801387/tls.key\\\\\\\"\\\\nI1209 10:01:25.900616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 10:01:25.902593 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 10:01:25.902611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 10:01:25.902628 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 10:01:25.902633 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 10:01:25.906542 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 10:01:25.906611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906618 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 10:01:25.906630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 10:01:25.906634 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 10:01:25.906639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 10:01:25.906563 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 10:01:25.907863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bd67b1b7ad3e21872e9aa91e61c978e0999546882ee2e21347d4f60e435e0b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:48Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.215154 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab29aa6d19002efb0309c548e059e004c5002ccde634df95e3c2661a3e81207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcda2271939034d8c6c54fa3d648500ebda150fafcce9216338fad68552a65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:48Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.231156 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c149ed39076fc7ee5538e60dbc0a8fc303a21578e5cc3ac89a3aeaad2c21c6a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:48Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.244674 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d199b778c03f10fc3b1a2623600057801f54ee3240768ede1a79213b678fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:48Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.261759 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:48Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.272067 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.272098 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.272106 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.272118 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.272127 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:48Z","lastTransitionTime":"2025-12-09T10:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.276249 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rgf44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ed6e93-eda5-4648-b185-25d2960ce0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541795eab5847f1d993b8f9f324b454b951ed3930155e455f013e8da805c019b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh57p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rgf44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:48Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.287684 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bc64859-6675-4dc6-b0a1-579abb87580e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0db847425b24ea6804034220f2050b153b78d21bc1cc934dad6784c11c68dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a41e95e38748ef89ff0bc6429eb223b7821756bf1e0c84a3af512f4f0166a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea3ea4cb1e3f00acc4ef769928988a0a2c2ee54afa0ab5f040ef50f465a9d6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-oper
ator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1c1315eade2f326ac5feefc45cbcec29c7ee59fb40494f5153b7f8dbdfc404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:48Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.308471 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7013527e-73de-4427-af9c-e33663b1c222\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ddee8ee4fd0e1234ecff411ce29f6bad194326
3a66cbb3117da6f1b6e84b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36e952fdc4745c127dee2c79ee17c081bca784c7d35bb530d63619c963442937\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T10:01:42Z\\\",\\\"message\\\":\\\"topping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 10:01:41.551903 6292 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 10:01:41.551957 6292 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 10:01:41.551991 6292 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 10:01:41.552129 6292 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 10:01:41.552456 6292 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 10:01:41.552499 6292 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1209 10:01:41.552527 6292 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1209 10:01:41.552537 6292 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1209 10:01:41.552557 6292 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1209 10:01:41.552545 6292 factory.go:656] Stopping watch factory\\\\nI1209 10:01:41.552577 6292 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ddee8ee4fd0e1234ecff411ce29f6bad1943263a66cbb3117da6f1b6e84b47\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"message\\\":\\\"UID: UUIDName:}]\\\\nI1209 10:01:45.368189 6439 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1209 10:01:45.368217 6439 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1209 10:01:45.368249 6439 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:45Z is after 2025-08-24T17:21:41Z]\\\\nI1209 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\"
:\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2mnnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:48Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.320108 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m689k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02305623-2d65-47e3-ac63-5182bf50d141\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93e73de78b5120b5d2bf38748e84dad9dd5353e18130635243988d7b131ace3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg8sd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c8fa17f92ca9a774f62e20c5eeec59041bee23e426c36b2949d4f82c7e45f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg8sd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m689k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:48Z is after 2025-08-24T17:21:41Z" Dec 09 
10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.374899 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.374937 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.374948 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.374964 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.374974 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:48Z","lastTransitionTime":"2025-12-09T10:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.477877 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.477932 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.477944 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.477959 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.477970 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:48Z","lastTransitionTime":"2025-12-09T10:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.581010 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.581062 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.581073 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.581090 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.581101 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:48Z","lastTransitionTime":"2025-12-09T10:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.683970 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.684020 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.684035 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.684056 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.684073 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:48Z","lastTransitionTime":"2025-12-09T10:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.786547 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.786585 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.786596 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.786610 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.786620 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:48Z","lastTransitionTime":"2025-12-09T10:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.888646 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.888691 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.888704 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.888723 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.888736 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:48Z","lastTransitionTime":"2025-12-09T10:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.992510 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.992579 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.992598 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.992623 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 10:01:48 crc kubenswrapper[5002]: I1209 10:01:48.992640 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:48Z","lastTransitionTime":"2025-12-09T10:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.059495 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.059536 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 09 10:01:49 crc kubenswrapper[5002]: E1209 10:01:49.060065 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 09 10:01:49 crc kubenswrapper[5002]: E1209 10:01:49.060288 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.096395 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.096430 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.096440 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.096453 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.096463 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:49Z","lastTransitionTime":"2025-12-09T10:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.199755 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.200055 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.200221 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.200404 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.200661 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:49Z","lastTransitionTime":"2025-12-09T10:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.304272 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.304358 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.304382 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.304407 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.304425 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:49Z","lastTransitionTime":"2025-12-09T10:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.407171 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.407231 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.407249 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.407270 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.407285 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:49Z","lastTransitionTime":"2025-12-09T10:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.478617 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.478664 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.478680 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.478699 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.478714 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:49Z","lastTransitionTime":"2025-12-09T10:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Dec 09 10:01:49 crc kubenswrapper[5002]: E1209 10:01:49.495072 5002 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb4d43e7-bcbf-4472-90e9-44716d72c15e\\\",\\\"systemUUID\\\":\\\"8af61218-105c-4188-8c40-2d81c3899a86\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:49Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.499287 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.499349 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.499368 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.499392 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.499411 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:49Z","lastTransitionTime":"2025-12-09T10:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:49 crc kubenswrapper[5002]: E1209 10:01:49.513716 5002 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb4d43e7-bcbf-4472-90e9-44716d72c15e\\\",\\\"systemUUID\\\":\\\"8af61218-105c-4188-8c40-2d81c3899a86\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:49Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.517969 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.518047 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.518071 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.518102 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.518125 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:49Z","lastTransitionTime":"2025-12-09T10:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:49 crc kubenswrapper[5002]: E1209 10:01:49.532716 5002 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb4d43e7-bcbf-4472-90e9-44716d72c15e\\\",\\\"systemUUID\\\":\\\"8af61218-105c-4188-8c40-2d81c3899a86\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:49Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.537795 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.537886 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.537904 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.537924 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.537937 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:49Z","lastTransitionTime":"2025-12-09T10:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:49 crc kubenswrapper[5002]: E1209 10:01:49.553375 5002 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb4d43e7-bcbf-4472-90e9-44716d72c15e\\\",\\\"systemUUID\\\":\\\"8af61218-105c-4188-8c40-2d81c3899a86\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:49Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.557365 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.557573 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.557713 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.557877 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.558014 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:49Z","lastTransitionTime":"2025-12-09T10:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:49 crc kubenswrapper[5002]: E1209 10:01:49.575388 5002 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb4d43e7-bcbf-4472-90e9-44716d72c15e\\\",\\\"systemUUID\\\":\\\"8af61218-105c-4188-8c40-2d81c3899a86\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:49Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:49 crc kubenswrapper[5002]: E1209 10:01:49.575800 5002 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.577898 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
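Every status-update attempt above fails for the same reason: the node.network-node-identity.openshift.io webhook on 127.0.0.1:9743 presents a serving certificate that expired on 2025-08-24T17:21:41Z while the node clock reads 2025-12-09, so the API server rejects each patch and the kubelet gives up once its retry budget is spent ("update node status exceeds retry count"). A minimal sketch to confirm the certificate's validity window from the node itself, assuming Python with the third-party cryptography package is available:

    # Sketch: fetch and inspect the webhook's serving certificate.
    # Assumes the "cryptography" package is installed and that 127.0.0.1:9743
    # is reachable from where this runs (i.e. on the node).
    import ssl
    from cryptography import x509

    HOST, PORT = "127.0.0.1", 9743  # endpoint taken from the webhook error above

    # get_server_certificate() retrieves the peer certificate as PEM without
    # verifying it -- necessary here, since verification is exactly what fails.
    pem = ssl.get_server_certificate((HOST, PORT))
    cert = x509.load_pem_x509_certificate(pem.encode())

    print("subject:   ", cert.subject.rfc4514_string())
    print("not before:", cert.not_valid_before)
    print("not after: ", cert.not_valid_after)  # the log implies 2025-08-24 17:21:41 UTC

If the window matches the log, the remedy is rotating that certificate; restarting the kubelet alone cannot make an expired certificate verify.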
event="NodeHasSufficientMemory" Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.578059 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.578151 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.578233 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.578312 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:49Z","lastTransitionTime":"2025-12-09T10:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.681412 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.682327 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.682475 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.682603 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.682721 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:49Z","lastTransitionTime":"2025-12-09T10:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.696318 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ffb94c3-624e-48aa-aaa9-450ace4e1862-metrics-certs\") pod \"network-metrics-daemon-98z2f\" (UID: \"7ffb94c3-624e-48aa-aaa9-450ace4e1862\") " pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:01:49 crc kubenswrapper[5002]: E1209 10:01:49.696499 5002 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 10:01:49 crc kubenswrapper[5002]: E1209 10:01:49.696788 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ffb94c3-624e-48aa-aaa9-450ace4e1862-metrics-certs podName:7ffb94c3-624e-48aa-aaa9-450ace4e1862 nodeName:}" failed. No retries permitted until 2025-12-09 10:01:53.69676287 +0000 UTC m=+46.088814121 (durationBeforeRetry 4s). 
Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.785582 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.785855 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.785926 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.785988 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.786079 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:49Z","lastTransitionTime":"2025-12-09T10:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.889100 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.889145 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.889156 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.889174 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.889185 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:49Z","lastTransitionTime":"2025-12-09T10:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.991040 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.991101 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.991112 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.991128 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:49 crc kubenswrapper[5002]: I1209 10:01:49.991141 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:49Z","lastTransitionTime":"2025-12-09T10:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:50 crc kubenswrapper[5002]: I1209 10:01:50.060110 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:01:50 crc kubenswrapper[5002]: I1209 10:01:50.060119 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:01:50 crc kubenswrapper[5002]: E1209 10:01:50.060309 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:01:50 crc kubenswrapper[5002]: E1209 10:01:50.060390 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862"
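Both "Error syncing pod, skipping" records carry the same underlying message as the Ready=False condition: the runtime reports NetworkReady=false because no CNI configuration file exists in /etc/kubernetes/cni/net.d/, so no new pod sandbox can be given a network and pods that need one (here the diagnostics target and the multus metrics daemon) are skipped until the network provider (typically OVN-Kubernetes on OpenShift) writes its config. A stdlib-only sketch of the check the message itself suggests:

    # Sketch: look for CNI network configs where the runtime expects them.
    # Directory path taken verbatim from the kubelet message; stdlib only.
    from pathlib import Path

    cni_dir = Path("/etc/kubernetes/cni/net.d")
    confs = sorted(cni_dir.glob("*.conf*")) if cni_dir.is_dir() else []

    print(f"{cni_dir}: {len(confs)} CNI config file(s)")
    for p in confs:
        print("  ", p.name)  # a .conf/.conflist typically appears once the network operator runs

An empty directory is expected while the cluster network has not come up; it is a problem only if it persists after the control plane is healthy, which in this log it cannot be while the certificate above remains expired.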
pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862" Dec 09 10:01:50 crc kubenswrapper[5002]: I1209 10:01:50.094279 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:50 crc kubenswrapper[5002]: I1209 10:01:50.094646 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:50 crc kubenswrapper[5002]: I1209 10:01:50.095096 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:50 crc kubenswrapper[5002]: I1209 10:01:50.095386 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:50 crc kubenswrapper[5002]: I1209 10:01:50.095557 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:50Z","lastTransitionTime":"2025-12-09T10:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:50 crc kubenswrapper[5002]: I1209 10:01:50.198265 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:50 crc kubenswrapper[5002]: I1209 10:01:50.198312 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:50 crc kubenswrapper[5002]: I1209 10:01:50.198328 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:50 crc kubenswrapper[5002]: I1209 10:01:50.198348 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:50 crc kubenswrapper[5002]: I1209 10:01:50.198365 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:50Z","lastTransitionTime":"2025-12-09T10:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:50 crc kubenswrapper[5002]: I1209 10:01:50.300884 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:50 crc kubenswrapper[5002]: I1209 10:01:50.300946 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:50 crc kubenswrapper[5002]: I1209 10:01:50.300963 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:50 crc kubenswrapper[5002]: I1209 10:01:50.300987 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:50 crc kubenswrapper[5002]: I1209 10:01:50.301004 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:50Z","lastTransitionTime":"2025-12-09T10:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:50 crc kubenswrapper[5002]: I1209 10:01:50.403782 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:50 crc kubenswrapper[5002]: I1209 10:01:50.403883 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:50 crc kubenswrapper[5002]: I1209 10:01:50.403907 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:50 crc kubenswrapper[5002]: I1209 10:01:50.403934 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:50 crc kubenswrapper[5002]: I1209 10:01:50.403952 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:50Z","lastTransitionTime":"2025-12-09T10:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:50 crc kubenswrapper[5002]: I1209 10:01:50.506840 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:50 crc kubenswrapper[5002]: I1209 10:01:50.507237 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:50 crc kubenswrapper[5002]: I1209 10:01:50.507412 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:50 crc kubenswrapper[5002]: I1209 10:01:50.507554 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:50 crc kubenswrapper[5002]: I1209 10:01:50.507693 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:50Z","lastTransitionTime":"2025-12-09T10:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:50 crc kubenswrapper[5002]: I1209 10:01:50.611690 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:50 crc kubenswrapper[5002]: I1209 10:01:50.612072 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:50 crc kubenswrapper[5002]: I1209 10:01:50.612354 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:50 crc kubenswrapper[5002]: I1209 10:01:50.612584 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:50 crc kubenswrapper[5002]: I1209 10:01:50.612782 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:50Z","lastTransitionTime":"2025-12-09T10:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:50 crc kubenswrapper[5002]: I1209 10:01:50.715680 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:50 crc kubenswrapper[5002]: I1209 10:01:50.715745 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:50 crc kubenswrapper[5002]: I1209 10:01:50.715766 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:50 crc kubenswrapper[5002]: I1209 10:01:50.715795 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:50 crc kubenswrapper[5002]: I1209 10:01:50.715847 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:50Z","lastTransitionTime":"2025-12-09T10:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:50 crc kubenswrapper[5002]: I1209 10:01:50.819520 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:50 crc kubenswrapper[5002]: I1209 10:01:50.819578 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:50 crc kubenswrapper[5002]: I1209 10:01:50.819596 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:50 crc kubenswrapper[5002]: I1209 10:01:50.819620 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:50 crc kubenswrapper[5002]: I1209 10:01:50.819638 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:50Z","lastTransitionTime":"2025-12-09T10:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
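From this point the log is a steady loop: the same four node events and the same Ready=False condition are re-recorded roughly every 100 ms, with only the timestamps changing. When triaging a dump like this it helps to collapse the loop into a count and a time range; a stdlib sketch, where "kubelet.log" is a hypothetical file holding this journal excerpt:

    # Sketch: collapse the repeating "Node became not ready" records
    # into one summary line. "kubelet.log" is a hypothetical dump file.
    import re

    pat = re.compile(r'I(\d{4} \d{2}:\d{2}:\d{2}\.\d+) \d+ setters\.go:\d+\] "Node became not ready"')

    stamps = []
    with open("kubelet.log") as f:
        for line in f:
            stamps.extend(pat.findall(line))

    if stamps:
        print(f'{len(stamps)} "Node became not ready" records '
              f"between {stamps[0]} and {stamps[-1]}")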
Dec 09 10:01:50 crc kubenswrapper[5002]: I1209 10:01:50.923303 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:50 crc kubenswrapper[5002]: I1209 10:01:50.923371 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:50 crc kubenswrapper[5002]: I1209 10:01:50.923395 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:50 crc kubenswrapper[5002]: I1209 10:01:50.923424 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:50 crc kubenswrapper[5002]: I1209 10:01:50.923446 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:50Z","lastTransitionTime":"2025-12-09T10:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.030158 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.030234 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.030260 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.030292 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.030325 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:51Z","lastTransitionTime":"2025-12-09T10:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.059283 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.059298 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:01:51 crc kubenswrapper[5002]: E1209 10:01:51.059753 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:01:51 crc kubenswrapper[5002]: E1209 10:01:51.059618 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.133318 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.133359 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.133371 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.133386 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.133397 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:51Z","lastTransitionTime":"2025-12-09T10:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.236347 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.236393 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.236406 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.236422 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.236434 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:51Z","lastTransitionTime":"2025-12-09T10:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.339132 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.339187 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.339198 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.339212 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.339221 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:51Z","lastTransitionTime":"2025-12-09T10:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.442079 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.442121 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.442135 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.442154 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.442167 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:51Z","lastTransitionTime":"2025-12-09T10:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.544880 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.544949 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.544972 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.545001 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.545022 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:51Z","lastTransitionTime":"2025-12-09T10:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.646976 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.647016 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.647026 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.647042 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.647053 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:51Z","lastTransitionTime":"2025-12-09T10:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.750900 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.750949 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.750958 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.750973 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.750983 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:51Z","lastTransitionTime":"2025-12-09T10:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.854138 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.854181 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.854193 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.854210 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.854221 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:51Z","lastTransitionTime":"2025-12-09T10:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.957351 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.957467 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.957481 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.957498 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:51 crc kubenswrapper[5002]: I1209 10:01:51.957510 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:51Z","lastTransitionTime":"2025-12-09T10:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.059233 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.059348 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:01:52 crc kubenswrapper[5002]: E1209 10:01:52.059473 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:01:52 crc kubenswrapper[5002]: E1209 10:01:52.059690 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862" Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.060367 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.060394 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.060405 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.060417 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.060427 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:52Z","lastTransitionTime":"2025-12-09T10:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.163434 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.163737 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.163749 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.163769 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.163781 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:52Z","lastTransitionTime":"2025-12-09T10:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.265794 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.266029 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.266112 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.266227 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.266302 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:52Z","lastTransitionTime":"2025-12-09T10:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.369255 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.369291 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.369302 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.369320 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.369332 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:52Z","lastTransitionTime":"2025-12-09T10:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.472411 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.472458 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.472470 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.472486 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.472501 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:52Z","lastTransitionTime":"2025-12-09T10:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.575547 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.575583 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.575592 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.575605 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.575614 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:52Z","lastTransitionTime":"2025-12-09T10:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.678804 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.679114 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.679206 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.679278 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.679355 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:52Z","lastTransitionTime":"2025-12-09T10:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.784184 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.784235 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.784250 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.784284 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.784301 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:52Z","lastTransitionTime":"2025-12-09T10:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.886838 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.886880 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.886896 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.886916 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.886929 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:52Z","lastTransitionTime":"2025-12-09T10:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.989472 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.989547 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.989571 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.989606 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:52 crc kubenswrapper[5002]: I1209 10:01:52.989632 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:52Z","lastTransitionTime":"2025-12-09T10:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:53 crc kubenswrapper[5002]: I1209 10:01:53.059198 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:01:53 crc kubenswrapper[5002]: E1209 10:01:53.059354 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:01:53 crc kubenswrapper[5002]: I1209 10:01:53.059678 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:01:53 crc kubenswrapper[5002]: E1209 10:01:53.059909 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:01:53 crc kubenswrapper[5002]: I1209 10:01:53.092138 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:53 crc kubenswrapper[5002]: I1209 10:01:53.092183 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:53 crc kubenswrapper[5002]: I1209 10:01:53.092203 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:53 crc kubenswrapper[5002]: I1209 10:01:53.092220 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:53 crc kubenswrapper[5002]: I1209 10:01:53.092231 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:53Z","lastTransitionTime":"2025-12-09T10:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:53 crc kubenswrapper[5002]: I1209 10:01:53.194348 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:53 crc kubenswrapper[5002]: I1209 10:01:53.194419 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:53 crc kubenswrapper[5002]: I1209 10:01:53.194445 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:53 crc kubenswrapper[5002]: I1209 10:01:53.194475 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:53 crc kubenswrapper[5002]: I1209 10:01:53.194500 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:53Z","lastTransitionTime":"2025-12-09T10:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:53 crc kubenswrapper[5002]: I1209 10:01:53.297280 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:53 crc kubenswrapper[5002]: I1209 10:01:53.297637 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:53 crc kubenswrapper[5002]: I1209 10:01:53.297874 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:53 crc kubenswrapper[5002]: I1209 10:01:53.298044 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:53 crc kubenswrapper[5002]: I1209 10:01:53.298200 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:53Z","lastTransitionTime":"2025-12-09T10:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:53 crc kubenswrapper[5002]: I1209 10:01:53.405109 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:53 crc kubenswrapper[5002]: I1209 10:01:53.405205 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:53 crc kubenswrapper[5002]: I1209 10:01:53.405247 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:53 crc kubenswrapper[5002]: I1209 10:01:53.406106 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:53 crc kubenswrapper[5002]: I1209 10:01:53.406211 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:53Z","lastTransitionTime":"2025-12-09T10:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:53 crc kubenswrapper[5002]: I1209 10:01:53.509028 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:53 crc kubenswrapper[5002]: I1209 10:01:53.509075 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:53 crc kubenswrapper[5002]: I1209 10:01:53.509087 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:53 crc kubenswrapper[5002]: I1209 10:01:53.509106 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:53 crc kubenswrapper[5002]: I1209 10:01:53.509117 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:53Z","lastTransitionTime":"2025-12-09T10:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:53 crc kubenswrapper[5002]: I1209 10:01:53.611918 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:53 crc kubenswrapper[5002]: I1209 10:01:53.611965 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:53 crc kubenswrapper[5002]: I1209 10:01:53.611976 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:53 crc kubenswrapper[5002]: I1209 10:01:53.611991 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:53 crc kubenswrapper[5002]: I1209 10:01:53.612002 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:53Z","lastTransitionTime":"2025-12-09T10:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:53 crc kubenswrapper[5002]: I1209 10:01:53.715059 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:53 crc kubenswrapper[5002]: I1209 10:01:53.715143 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:53 crc kubenswrapper[5002]: I1209 10:01:53.715168 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:53 crc kubenswrapper[5002]: I1209 10:01:53.715203 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:53 crc kubenswrapper[5002]: I1209 10:01:53.715227 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:53Z","lastTransitionTime":"2025-12-09T10:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:53 crc kubenswrapper[5002]: I1209 10:01:53.734889 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ffb94c3-624e-48aa-aaa9-450ace4e1862-metrics-certs\") pod \"network-metrics-daemon-98z2f\" (UID: \"7ffb94c3-624e-48aa-aaa9-450ace4e1862\") " pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:01:53 crc kubenswrapper[5002]: E1209 10:01:53.735026 5002 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 10:01:53 crc kubenswrapper[5002]: E1209 10:01:53.735090 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ffb94c3-624e-48aa-aaa9-450ace4e1862-metrics-certs podName:7ffb94c3-624e-48aa-aaa9-450ace4e1862 nodeName:}" failed. No retries permitted until 2025-12-09 10:02:01.735071359 +0000 UTC m=+54.127122440 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7ffb94c3-624e-48aa-aaa9-450ace4e1862-metrics-certs") pod "network-metrics-daemon-98z2f" (UID: "7ffb94c3-624e-48aa-aaa9-450ace4e1862") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 10:01:53 crc kubenswrapper[5002]: I1209 10:01:53.817893 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:53 crc kubenswrapper[5002]: I1209 10:01:53.818487 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:53 crc kubenswrapper[5002]: I1209 10:01:53.818572 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:53 crc kubenswrapper[5002]: I1209 10:01:53.818650 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:53 crc kubenswrapper[5002]: I1209 10:01:53.818713 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:53Z","lastTransitionTime":"2025-12-09T10:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:53 crc kubenswrapper[5002]: I1209 10:01:53.921165 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:53 crc kubenswrapper[5002]: I1209 10:01:53.921208 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:53 crc kubenswrapper[5002]: I1209 10:01:53.921221 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:53 crc kubenswrapper[5002]: I1209 10:01:53.921237 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:53 crc kubenswrapper[5002]: I1209 10:01:53.921248 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:53Z","lastTransitionTime":"2025-12-09T10:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.024572 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.024611 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.024623 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.024639 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.024652 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:54Z","lastTransitionTime":"2025-12-09T10:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.059640 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:01:54 crc kubenswrapper[5002]: E1209 10:01:54.059865 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.059670 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:01:54 crc kubenswrapper[5002]: E1209 10:01:54.060001 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862" Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.127611 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.127665 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.127677 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.127697 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.127711 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:54Z","lastTransitionTime":"2025-12-09T10:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.230334 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.230366 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.230374 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.230386 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.230395 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:54Z","lastTransitionTime":"2025-12-09T10:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.333076 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.333141 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.333158 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.333183 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.333200 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:54Z","lastTransitionTime":"2025-12-09T10:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.435694 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.435972 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.436063 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.436166 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.436232 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:54Z","lastTransitionTime":"2025-12-09T10:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.539391 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.539760 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.539953 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.540107 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.540250 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:54Z","lastTransitionTime":"2025-12-09T10:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.643258 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.643621 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.643785 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.644022 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.644162 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:54Z","lastTransitionTime":"2025-12-09T10:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.746921 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.746964 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.746975 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.746992 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.747004 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:54Z","lastTransitionTime":"2025-12-09T10:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.849230 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.850070 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.850095 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.850116 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.850127 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:54Z","lastTransitionTime":"2025-12-09T10:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.953467 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.953538 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.953562 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.953594 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:54 crc kubenswrapper[5002]: I1209 10:01:54.953617 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:54Z","lastTransitionTime":"2025-12-09T10:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.057374 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.057440 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.057465 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.057497 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.057518 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:55Z","lastTransitionTime":"2025-12-09T10:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.059697 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.059728 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:01:55 crc kubenswrapper[5002]: E1209 10:01:55.059919 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:01:55 crc kubenswrapper[5002]: E1209 10:01:55.060036 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.159196 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.159235 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.159255 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.159273 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.159283 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:55Z","lastTransitionTime":"2025-12-09T10:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.262267 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.262330 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.262346 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.262369 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.262387 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:55Z","lastTransitionTime":"2025-12-09T10:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.365127 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.365170 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.365179 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.365191 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.365199 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:55Z","lastTransitionTime":"2025-12-09T10:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.468706 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.468761 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.468774 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.468794 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.468805 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:55Z","lastTransitionTime":"2025-12-09T10:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.571756 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.571800 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.571826 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.571845 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.571857 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:55Z","lastTransitionTime":"2025-12-09T10:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.674686 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.674993 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.675114 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.675196 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.675278 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:55Z","lastTransitionTime":"2025-12-09T10:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.778182 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.778257 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.778281 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.778310 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.778330 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:55Z","lastTransitionTime":"2025-12-09T10:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.881162 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.881212 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.881223 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.881241 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.881253 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:55Z","lastTransitionTime":"2025-12-09T10:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.984248 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.984290 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.984301 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.984316 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:55 crc kubenswrapper[5002]: I1209 10:01:55.984328 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:55Z","lastTransitionTime":"2025-12-09T10:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:56 crc kubenswrapper[5002]: I1209 10:01:56.059288 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:01:56 crc kubenswrapper[5002]: I1209 10:01:56.059325 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:01:56 crc kubenswrapper[5002]: E1209 10:01:56.059477 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862" Dec 09 10:01:56 crc kubenswrapper[5002]: E1209 10:01:56.059577 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:01:56 crc kubenswrapper[5002]: I1209 10:01:56.086181 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:56 crc kubenswrapper[5002]: I1209 10:01:56.086218 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:56 crc kubenswrapper[5002]: I1209 10:01:56.086230 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:56 crc kubenswrapper[5002]: I1209 10:01:56.086245 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:56 crc kubenswrapper[5002]: I1209 10:01:56.086257 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:56Z","lastTransitionTime":"2025-12-09T10:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:56 crc kubenswrapper[5002]: I1209 10:01:56.188707 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:56 crc kubenswrapper[5002]: I1209 10:01:56.188755 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:56 crc kubenswrapper[5002]: I1209 10:01:56.188772 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:56 crc kubenswrapper[5002]: I1209 10:01:56.188795 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:56 crc kubenswrapper[5002]: I1209 10:01:56.188855 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:56Z","lastTransitionTime":"2025-12-09T10:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:56 crc kubenswrapper[5002]: I1209 10:01:56.291153 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:56 crc kubenswrapper[5002]: I1209 10:01:56.291188 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:56 crc kubenswrapper[5002]: I1209 10:01:56.291199 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:56 crc kubenswrapper[5002]: I1209 10:01:56.291216 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:56 crc kubenswrapper[5002]: I1209 10:01:56.291227 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:56Z","lastTransitionTime":"2025-12-09T10:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:56 crc kubenswrapper[5002]: I1209 10:01:56.393892 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:56 crc kubenswrapper[5002]: I1209 10:01:56.393933 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:56 crc kubenswrapper[5002]: I1209 10:01:56.393945 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:56 crc kubenswrapper[5002]: I1209 10:01:56.393960 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:56 crc kubenswrapper[5002]: I1209 10:01:56.393971 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:56Z","lastTransitionTime":"2025-12-09T10:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:56 crc kubenswrapper[5002]: I1209 10:01:56.497238 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:56 crc kubenswrapper[5002]: I1209 10:01:56.497519 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:56 crc kubenswrapper[5002]: I1209 10:01:56.497622 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:56 crc kubenswrapper[5002]: I1209 10:01:56.497731 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:56 crc kubenswrapper[5002]: I1209 10:01:56.497888 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:56Z","lastTransitionTime":"2025-12-09T10:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:56 crc kubenswrapper[5002]: I1209 10:01:56.600522 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:56 crc kubenswrapper[5002]: I1209 10:01:56.600849 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:56 crc kubenswrapper[5002]: I1209 10:01:56.600961 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:56 crc kubenswrapper[5002]: I1209 10:01:56.601063 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:56 crc kubenswrapper[5002]: I1209 10:01:56.601177 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:56Z","lastTransitionTime":"2025-12-09T10:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:56 crc kubenswrapper[5002]: I1209 10:01:56.704168 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:56 crc kubenswrapper[5002]: I1209 10:01:56.704495 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:56 crc kubenswrapper[5002]: I1209 10:01:56.704761 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:56 crc kubenswrapper[5002]: I1209 10:01:56.705036 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:56 crc kubenswrapper[5002]: I1209 10:01:56.705165 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:56Z","lastTransitionTime":"2025-12-09T10:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:56 crc kubenswrapper[5002]: I1209 10:01:56.808649 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:56 crc kubenswrapper[5002]: I1209 10:01:56.809017 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:56 crc kubenswrapper[5002]: I1209 10:01:56.809163 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:56 crc kubenswrapper[5002]: I1209 10:01:56.809306 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:56 crc kubenswrapper[5002]: I1209 10:01:56.809431 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:56Z","lastTransitionTime":"2025-12-09T10:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:56 crc kubenswrapper[5002]: I1209 10:01:56.911965 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:56 crc kubenswrapper[5002]: I1209 10:01:56.912735 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:56 crc kubenswrapper[5002]: I1209 10:01:56.912954 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:56 crc kubenswrapper[5002]: I1209 10:01:56.913170 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:56 crc kubenswrapper[5002]: I1209 10:01:56.913377 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:56Z","lastTransitionTime":"2025-12-09T10:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.017185 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.017221 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.017231 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.017246 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.017256 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:57Z","lastTransitionTime":"2025-12-09T10:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.060015 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.060030 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:01:57 crc kubenswrapper[5002]: E1209 10:01:57.060366 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
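Annotation: the kubelet is looping here, re-recording the same node conditions and re-posting Ready=False every ~100 ms, because the container runtime keeps reporting NetworkReady=false until a CNI config appears in /etc/kubernetes/cni/net.d/. On an OpenShift node that file is normally dropped in by the cluster network stack (Multus fronting OVN-Kubernetes), so the message usually means the network pods have not come up yet rather than a kubelet misconfiguration; the reason they have not comes later in this log. A minimal Go sketch of the check the message implies (the directory path is taken from the log line; treating .conf, .conflist and .json as config files mirrors the usual CNI convention but is otherwise an illustration):

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	// Directory the kubelet names in its NetworkPluginNotReady message.
	dir := "/etc/kubernetes/cni/net.d"
	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Println("cannot read CNI config dir:", err)
		return
	}
	found := false
	for _, e := range entries {
		// CNI config files conventionally end in .conf, .conflist or .json.
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			fmt.Println("found CNI config:", filepath.Join(dir, e.Name()))
			found = true
		}
	}
	if !found {
		fmt.Println("no CNI config present; node will stay NotReady")
	}
}

Run on the node itself, an empty result matches the state the kubelet is reporting above; the loop simply repeats until the network provider writes its config.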
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:01:57 crc kubenswrapper[5002]: E1209 10:01:57.060199 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.120133 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.120175 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.120185 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.120201 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.120213 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:57Z","lastTransitionTime":"2025-12-09T10:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.223287 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.223383 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.223401 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.223425 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.223443 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:57Z","lastTransitionTime":"2025-12-09T10:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.326702 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.326763 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.326776 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.326799 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.326871 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:57Z","lastTransitionTime":"2025-12-09T10:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.429569 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.429663 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.429677 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.429702 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.429722 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:57Z","lastTransitionTime":"2025-12-09T10:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.532530 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.532576 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.532595 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.532613 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.532624 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:57Z","lastTransitionTime":"2025-12-09T10:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.635957 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.636012 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.636025 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.636042 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.636052 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:57Z","lastTransitionTime":"2025-12-09T10:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.739646 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.739694 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.739708 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.739723 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.739732 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:57Z","lastTransitionTime":"2025-12-09T10:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.846706 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.846775 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.846805 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.846898 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.847170 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:57Z","lastTransitionTime":"2025-12-09T10:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.879985 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 10:01:57 crc kubenswrapper[5002]: E1209 10:01:57.880212 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:02:29.880174323 +0000 UTC m=+82.272225434 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.880529 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.880666 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:01:57 crc kubenswrapper[5002]: E1209 10:01:57.880686 5002 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 10:01:57 crc kubenswrapper[5002]: E1209 10:01:57.880857 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 10:02:29.88084522 +0000 UTC m=+82.272896301 (durationBeforeRetry 32s). 
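Annotation: two distinct volume failures are interleaved above. The CSI unmount fails because the kubevirt.io.hostpath-provisioner driver has not (re)registered with this kubelet process since its restart, so there is no driver endpoint to call; the secret and configmap mounts fail with "object ... not registered" because the kubelet's watch-based object cache has not yet re-registered the API objects referenced by the pods it is re-admitting, which is usually transient at startup. Both paths are retried with backoff, visible as "durationBeforeRetry 32s". A short Go sketch for the first case, assuming the default kubelet layout in which each registered plugin drops a registration socket under /var/lib/kubelet/plugins_registry (that path is an assumption, not taken from this log):

package main

import (
	"fmt"
	"os"
)

func main() {
	// Default kubelet plugin-registration directory; each registered CSI
	// driver exposes a socket here named after the driver.
	dir := "/var/lib/kubelet/plugins_registry"
	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Println("cannot read plugin registry:", err)
		return
	}
	if len(entries) == 0 {
		fmt.Println("no CSI drivers registered with this kubelet yet")
		return
	}
	for _, e := range entries {
		fmt.Println("registration socket:", e.Name())
	}
}

If kubevirt.io.hostpath-provisioner is absent from that listing, the TearDown error above is expected and will clear once the provisioner pod re-registers.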
Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.880791 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 09 10:01:57 crc kubenswrapper[5002]: E1209 10:01:57.880777 5002 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 09 10:01:57 crc kubenswrapper[5002]: E1209 10:01:57.881136 5002 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.881018 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 09 10:01:57 crc kubenswrapper[5002]: E1209 10:01:57.881139 5002 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 09 10:01:57 crc kubenswrapper[5002]: E1209 10:01:57.881330 5002 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 09 10:01:57 crc kubenswrapper[5002]: E1209 10:01:57.881154 5002 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 09 10:01:57 crc kubenswrapper[5002]: E1209 10:01:57.881374 5002 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 09 10:01:57 crc kubenswrapper[5002]: E1209 10:01:57.881383 5002 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 09 10:01:57 crc kubenswrapper[5002]: E1209 10:01:57.881751 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 10:02:29.881276732 +0000 UTC m=+82.273327823 (durationBeforeRetry 32s).
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 10:01:57 crc kubenswrapper[5002]: E1209 10:01:57.882527 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-09 10:02:29.882506254 +0000 UTC m=+82.274557335 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 10:01:57 crc kubenswrapper[5002]: E1209 10:01:57.882563 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-09 10:02:29.882556155 +0000 UTC m=+82.274607236 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.903982 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.904924 5002 scope.go:117] "RemoveContainer" containerID="f9ddee8ee4fd0e1234ecff411ce29f6bad1943263a66cbb3117da6f1b6e84b47" Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.927840 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7013527e-73de-4427-af9c-e33663b1c222\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ddee8ee4fd0e1234ecff411ce29f6bad194326
3a66cbb3117da6f1b6e84b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ddee8ee4fd0e1234ecff411ce29f6bad1943263a66cbb3117da6f1b6e84b47\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"message\\\":\\\"UID: UUIDName:}]\\\\nI1209 10:01:45.368189 6439 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1209 10:01:45.368217 6439 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1209 10:01:45.368249 6439 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:45Z is after 2025-08-24T17:21:41Z]\\\\nI1209 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2mnnl_openshift-ovn-kubernetes(7013527e-73de-4427-af9c-e33663b1c222)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2mnnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:57Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.939649 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m689k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02305623-2d65-47e3-ac63-5182bf50d141\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93e73de78b5120b5d2bf38748e84dad9dd5353e18130635243988d7b131ace3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg8sd
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c8fa17f92ca9a774f62e20c5eeec59041bee23e426c36b2949d4f82c7e45f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg8sd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m689k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:57Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.952251 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.952291 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.952300 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.952316 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.952325 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:57Z","lastTransitionTime":"2025-12-09T10:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.962039 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:57Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.977310 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49c6392-68b2-4847-9291-a0b4d9c1cbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a019843745b7d5198565771c39f7949cf45738e236a2283588db2e57d07f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44da8a223abf131b459b827b0e8de65b415150f406fe22f2efb7e160cba4166c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kxpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:57Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:57 crc kubenswrapper[5002]: I1209 10:01:57.992949 5002 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-p7t88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edef2440-1e4a-4676-9517-08b21b3b66ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc19a171af6f6875b4d953edd75048a1249b44348ba03757126ebe943c118be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzs5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7t88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:57Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.009742 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:58Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.025340 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t5hm5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de8c639-f176-405c-ae34-6717f9f9458c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097fa8d84999c942645591541e1377c08cdeec593f64525db5a5aa8f7ec8cbae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9wwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t5hm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:58Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.046746 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zjdhx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60fd7aba09010bae4f15fe793e0084c71d381f63bc4c1549f2ccbe57cdb90ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zjdhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:58Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.055725 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.055770 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:58 crc 
kubenswrapper[5002]: I1209 10:01:58.055791 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.055867 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.055880 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:58Z","lastTransitionTime":"2025-12-09T10:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.060275 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:01:58 crc kubenswrapper[5002]: E1209 10:01:58.060412 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.060907 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:01:58 crc kubenswrapper[5002]: E1209 10:01:58.061194 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.061309 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-98z2f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ffb94c3-624e-48aa-aaa9-450ace4e1862\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mn62f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mn62f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-98z2f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:58Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.075721 5002 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:58Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.087396 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rgf44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ed6e93-eda5-4648-b185-25d2960ce0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541795eab5847f1d993b8f9f324b454b951ed3930155e455f013e8da805c019b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh57p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rgf44\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:58Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.100858 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bc64859-6675-4dc6-b0a1-579abb87580e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0db847425b24ea6804034220f2050b153b78d21bc1cc934dad6784c11c68dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a41e95e38748ef89ff0bc6429eb223b7821756bf1e0c84a3af512f4f0166a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea3ea4cb1e3f00acc4ef769928988a0a2c2ee54afa0ab5f040ef50f465a9d6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1c1315eade2f326ac5feefc45cbcec29c7ee59fb40494f5153b7f8dbdfc404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:58Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.122495 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48170a45-0766-4c86-af19-b829960de244\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3df655bf1ce56b5aef6728759b8b3262260171afd0e4924212afc506fa313e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e29bcc86bc8035d73f3f12857d0024d93752d70d6a77b51d98278981669ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c750385ffbe9096e2cef43dffe002fd49599e44ed69806710b492749637dbe93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73ac21971d31e17f4f76bcbb1e02201b53b12
189815ced304d7913f6aa76f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd16d2319de83381932ca61f8b08dd31ecfde33e270c3ea2ea3edb3c2fa174b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:58Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.137983 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b484d6dfa4fffc554c21f9d9bc8e77a6
2e96ebb9426f2bb34dea4b2c9244d680\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://301eb2b68a37707ea483536d639e5f476d3cb41ad8c60fbaea2e019fb51bad48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49d7b26bbe255f1217808981337d8190bdeac4f5008ee17df5242867e3103e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1209 10:01:25.592088 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 10:01:25.592207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 10:01:25.593930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3155801387/tls.crt::/tmp/serving-cert-3155801387/tls.key\\\\\\\"\\\\nI1209 10:01:25.900616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 10:01:25.902593 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 10:01:25.902611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 10:01:25.902628 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 10:01:25.902633 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 10:01:25.906542 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 10:01:25.906611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906618 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 10:01:25.906630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 10:01:25.906634 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 10:01:25.906639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 10:01:25.906563 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 10:01:25.907863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bd67b1b7ad3e21872e9aa91e61c978e0999546882ee2e21347d4f60e435e0b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:58Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.149953 5002 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab29aa6d19002efb0309c548e059e004c5002ccde634df95e3c2661a3e81207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcda2271939034d8c6c54fa3d648500ebda150fafcce9216338fad68552a65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:58Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.158853 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.158932 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 
10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.158947 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.158966 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.158984 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:58Z","lastTransitionTime":"2025-12-09T10:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.164894 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c149ed39076fc7ee5538e60dbc0a8fc303a21578e5cc3ac89a3aeaad2c21c6a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:58Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.179654 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d199b778c03f10fc3b1a2623600057801f54ee3240768ede1a79213b678fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:58Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.195706 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:58Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.206996 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t5hm5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de8c639-f176-405c-ae34-6717f9f9458c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097fa8d84999c942645591541e1377c08cdeec593f64525db5a5aa8f7ec8cbae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9wwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t5hm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-09T10:01:58Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.224620 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zjdhx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60fd7aba09010bae4f15fe793e0084c71d381f63bc4c1549f2ccbe57cdb90ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e
02577330049643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-
copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zjdhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:58Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.235489 5002 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-multus/network-metrics-daemon-98z2f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ffb94c3-624e-48aa-aaa9-450ace4e1862\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mn62f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mn62f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-98z2f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:58Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.247906 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bc64859-6675-4dc6-b0a1-579abb87580e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0db847425b24ea6804034220f2050b153b78d21bc1cc934dad6784c11c68dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a41e95e38748ef89ff0bc6429eb223b7821756bf1e0c84a3af512f4f0166a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea3ea4cb1e3f00acc4ef769928988a0a2c2ee54afa0ab5f040ef50f465a9d6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1c1315eade2f326ac5feefc45cbcec29c7ee59fb40494f5153b7f8dbdfc404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:58Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.261161 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.261210 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.261219 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.261233 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.261242 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:58Z","lastTransitionTime":"2025-12-09T10:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.268077 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48170a45-0766-4c86-af19-b829960de244\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3df655bf1ce56b5aef6728759b8b3262260171afd0e4924212afc506fa313e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e29bcc86bc8035d73f3f12857d0024d93752d70d6a77b51d98278981669ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c750385ffbe9096e2cef43dffe002fd49599e44ed69806710b492749637dbe93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73ac21971d31e17f4f76bcbb1e02201b53b12189815ced304d7913f6aa76f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd16d2319de83381932ca61f8b08dd31ecfde33e270c3ea2ea3edb3c2fa174b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:58Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.283989 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b484d6dfa4fffc554c21f9d9bc8e77a62e96ebb9426f2bb34dea4b2c9244d680\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://301eb2b68a37707ea483536d639e5f476d3cb41ad8c60fbaea2e019fb51bad48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49d7b26bbe255f1217808981337d8190bdeac4f5008ee17df5242867e3103e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1209 10:01:25.592088 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 10:01:25.592207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 10:01:25.593930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3155801387/tls.crt::/tmp/serving-cert-3155801387/tls.key\\\\\\\"\\\\nI1209 10:01:25.900616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 10:01:25.902593 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 10:01:25.902611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 10:01:25.902628 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 10:01:25.902633 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 10:01:25.906542 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 10:01:25.906611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906618 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 10:01:25.906630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 10:01:25.906634 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 10:01:25.906639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 10:01:25.906563 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 10:01:25.907863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bd67b1b7ad3e21872e9aa91e61c978e0999546882ee2e21347d4f60e435e0b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:58Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.309149 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab29aa6d19002efb0309c548e059e004c5002ccde634df95e3c2661a3e81207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcda2271939034d8c6c54fa3d648500ebda150fafcce9216338fad68552a65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:58Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.325156 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c149ed39076fc7ee5538e60dbc0a8fc303a21578e5cc3ac89a3aeaad2c21c6a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:58Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.349917 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d199b778c03f10fc3b1a2623600057801f54ee3240768ede1a79213b678fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:58Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.363830 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.363877 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.363889 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.363907 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.363917 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:58Z","lastTransitionTime":"2025-12-09T10:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.367140 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:58Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.379585 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rgf44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ed6e93-eda5-4648-b185-25d2960ce0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541795eab5847f1d993b8f9f324b454b951ed3930155e455f013e8da805c019b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh57p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rgf44\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:58Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.401728 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7013527e-73de-4427-af9c-e33663b1c222\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9c0
0197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ddee8ee4fd0e1234ecff411ce29f6bad1943263a66cbb3117da6f1b6e84b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ddee8ee4fd0e1234ecff411ce29f6bad1943263a66cbb3117da6f1b6e84b47\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"message\\\":\\\"UID: UUIDName:}]\\\\nI1209 10:01:45.368189 6439 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1209 10:01:45.368217 6439 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1209 10:01:45.368249 6439 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:45Z is after 2025-08-24T17:21:41Z]\\\\nI1209 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-2mnnl_openshift-ovn-kubernetes(7013527e-73de-4427-af9c-e33663b1c222)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2mnnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:58Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.425654 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m689k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02305623-2d65-47e3-ac63-5182bf50d141\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93e73de78b5120b5d2bf38748e84dad9dd5353e18130635243988d7b131ace3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg8sd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c8fa17f92ca9a774f62e20c5eeec59041bee23e426c36b2949d4f82c7e45f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg8sd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m689k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:58Z is after 2025-08-24T17:21:41Z" Dec 09 
10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.442701 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:58Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.452943 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49c6392-68b2-4847-9291-a0b4d9c1cbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a019843745b7d5198565771c39f7949cf45738e236a2283588db2e57d07f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44da8a223abf131b459b827b0e8de65b415150f406fe22f2efb7e160cba4166c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kxpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:58Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.463025 5002 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-p7t88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edef2440-1e4a-4676-9517-08b21b3b66ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc19a171af6f6875b4d953edd75048a1249b44348ba03757126ebe943c118be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzs5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7t88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:58Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.466531 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.466575 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.466587 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.466603 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.466614 5002 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:58Z","lastTransitionTime":"2025-12-09T10:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.568991 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.569027 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.569037 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.569049 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.569058 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:58Z","lastTransitionTime":"2025-12-09T10:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.671375 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.671425 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.671436 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.671452 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.671463 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:58Z","lastTransitionTime":"2025-12-09T10:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.773626 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.773668 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.773679 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.773697 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.773708 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:58Z","lastTransitionTime":"2025-12-09T10:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.820943 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2mnnl_7013527e-73de-4427-af9c-e33663b1c222/ovnkube-controller/1.log" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.824864 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" event={"ID":"7013527e-73de-4427-af9c-e33663b1c222","Type":"ContainerStarted","Data":"ed556fa1bcef63fb127866b4317c98104f846e12a67a6865a362fe6777a3b40c"} Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.825332 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.838172 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:58Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.850033 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rgf44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ed6e93-eda5-4648-b185-25d2960ce0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541795eab5847f1d993b8f9f324b454b951ed3930155e455f013e8da805c019b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh57p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rgf44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:58Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.862680 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bc64859-6675-4dc6-b0a1-579abb87580e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0db847425b24ea6804034220f2050b153b78d21bc1cc934dad6784c11c68dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a41e95e38748ef89ff0bc6429eb223b7821756bf1e0c84a3af512f4f0166a98\\\",\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea3ea4cb1e3f00acc4ef769928988a0a2c2ee54afa0ab5f040ef50f465a9d6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1c1315eade2f326ac5feefc45cbcec29c7ee59fb40494f5153b7f8dbdfc404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:58Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.876378 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.876421 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.876430 5002 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.876445 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.876454 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:58Z","lastTransitionTime":"2025-12-09T10:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.885679 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48170a45-0766-4c86-af19-b829960de244\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3df655bf1ce56b5aef6728759b8b3262260171afd0e4924212afc506fa313e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e29bcc86bc8035d73f3f12857d0024d93752d70d6a77b51d98278981669ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\
"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c750385ffbe9096e2cef43dffe002fd49599e44ed69806710b492749637dbe93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73ac21971d31e17f4f76bcbb1e02201b53b12189815ced304d7913f6aa76f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd16d2319de83381932ca61f8b08dd31ecfde33e270c3ea2ea3edb3c2fa174b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\
"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:58Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.903941 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b484d6dfa4fffc554c21f9d9bc8e77a62e96ebb9426f2bb34dea4b2c9244d680\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://301eb2b68a37707ea483536d639e5f476d3cb41ad8c60fbaea2e019fb51bad48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49d7b26bbe255f1217808981337d8190bdeac4f5008ee17df5242867e3103e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1209 10:01:25.592088 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 10:01:25.592207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 10:01:25.593930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3155801387/tls.crt::/tmp/serving-cert-3155801387/tls.key\\\\\\\"\\\\nI1209 10:01:25.900616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 10:01:25.902593 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 10:01:25.902611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 10:01:25.902628 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 10:01:25.902633 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 10:01:25.906542 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 10:01:25.906611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906618 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 10:01:25.906630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 10:01:25.906634 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 10:01:25.906639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 10:01:25.906563 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 10:01:25.907863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bd67b1b7ad3e21872e9aa91e61c978e0999546882ee2e21347d4f60e435e0b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:58Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.919702 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab29aa6d19002efb0309c548e059e004c5002ccde634df95e3c2661a3e81207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcda2271939034d8c6c54fa3d648500ebda150fafcce9216338fad68552a65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:58Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.932140 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c149ed39076fc7ee5538e60dbc0a8fc303a21578e5cc3ac89a3aeaad2c21c6a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:58Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.942618 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d199b778c03f10fc3b1a2623600057801f54ee3240768ede1a79213b678fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:58Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.970547 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7013527e-73de-4427-af9c-e33663b1c222\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed556fa1bcef63fb127866b4317c98104f846e12
a67a6865a362fe6777a3b40c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ddee8ee4fd0e1234ecff411ce29f6bad1943263a66cbb3117da6f1b6e84b47\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"message\\\":\\\"UID: UUIDName:}]\\\\nI1209 10:01:45.368189 6439 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1209 10:01:45.368217 6439 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1209 10:01:45.368249 6439 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:45Z is after 2025-08-24T17:21:41Z]\\\\nI1209 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"conta
inerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2mnnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:58Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.978436 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.978469 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.978477 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.978492 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.978502 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:58Z","lastTransitionTime":"2025-12-09T10:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:58 crc kubenswrapper[5002]: I1209 10:01:58.984644 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m689k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02305623-2d65-47e3-ac63-5182bf50d141\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93e73de78b5120b5d2bf38748e84dad9dd5353e18130635243988d7b131ace3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg8sd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c8fa17f92ca9a774f62e20c5eeec59041bee23e426c36b2949d4f82c7e45f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg8sd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m689k\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:58Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.000043 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:58Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.012542 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49c6392-68b2-4847-9291-a0b4d9c1cbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a019843745b7d5198565771c39f7949cf45738e236a2283588db2e57d07f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44da8a223abf131b459b827b0e8de65b415150f406fe22f2efb7e160cba4166c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kxpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:59Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.023331 5002 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-p7t88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edef2440-1e4a-4676-9517-08b21b3b66ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc19a171af6f6875b4d953edd75048a1249b44348ba03757126ebe943c118be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzs5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7t88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:59Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.039293 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:59Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.049585 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t5hm5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de8c639-f176-405c-ae34-6717f9f9458c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097fa8d84999c942645591541e1377c08cdeec593f64525db5a5aa8f7ec8cbae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9wwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t5hm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:59Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.059179 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:01:59 crc kubenswrapper[5002]: E1209 10:01:59.059486 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.059194 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:01:59 crc kubenswrapper[5002]: E1209 10:01:59.059726 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.070169 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zjdhx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60fd7aba09010bae4f15fe793e0084c71d381f63bc4c1549f2ccbe57cdb90ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\
\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28
c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-
09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zjdhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:59Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.081347 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.081393 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.081406 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.081424 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.081436 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:59Z","lastTransitionTime":"2025-12-09T10:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.084558 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-98z2f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ffb94c3-624e-48aa-aaa9-450ace4e1862\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mn62f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mn62f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-98z2f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:59Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.183879 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.183918 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.183934 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.183950 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.183963 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:59Z","lastTransitionTime":"2025-12-09T10:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.286253 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.286338 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.286367 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.286392 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.286408 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:59Z","lastTransitionTime":"2025-12-09T10:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.299557 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.309328 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.315044 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:59Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.326154 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t5hm5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de8c639-f176-405c-ae34-6717f9f9458c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097fa8d84999c942645591541e1377c08cdeec593f64525db5a5aa8f7ec8cbae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9wwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t5hm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-09T10:01:59Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.347714 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zjdhx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60fd7aba09010bae4f15fe793e0084c71d381f63bc4c1549f2ccbe57cdb90ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e
02577330049643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-
copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zjdhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:59Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.362225 5002 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-multus/network-metrics-daemon-98z2f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ffb94c3-624e-48aa-aaa9-450ace4e1862\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mn62f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mn62f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-98z2f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:59Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.378456 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bc64859-6675-4dc6-b0a1-579abb87580e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0db847425b24ea6804034220f2050b153b78d21bc1cc934dad6784c11c68dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a41e95e38748ef89ff0bc6429eb223b7821756bf1e0c84a3af512f4f0166a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea3ea4cb1e3f00acc4ef769928988a0a2c2ee54afa0ab5f040ef50f465a9d6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1c1315eade2f326ac5feefc45cbcec29c7ee59fb40494f5153b7f8dbdfc404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:59Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.388667 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.388991 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.389112 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.389268 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.389387 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:59Z","lastTransitionTime":"2025-12-09T10:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.404881 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48170a45-0766-4c86-af19-b829960de244\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3df655bf1ce56b5aef6728759b8b3262260171afd0e4924212afc506fa313e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e29bcc86bc8035d73f3f12857d0024d93752d70d6a77b51d98278981669ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c750385ffbe9096e2cef43dffe002fd49599e44ed69806710b492749637dbe93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73ac21971d31e17f4f76bcbb1e02201b53b12189815ced304d7913f6aa76f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd16d2319de83381932ca61f8b08dd31ecfde33e270c3ea2ea3edb3c2fa174b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:59Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.422746 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b484d6dfa4fffc554c21f9d9bc8e77a62e96ebb9426f2bb34dea4b2c9244d680\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://301eb2b68a37707ea483536d639e5f476d3cb41ad8c60fbaea2e019fb51bad48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49d7b26bbe255f1217808981337d8190bdeac4f5008ee17df5242867e3103e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1209 10:01:25.592088 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 10:01:25.592207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 10:01:25.593930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3155801387/tls.crt::/tmp/serving-cert-3155801387/tls.key\\\\\\\"\\\\nI1209 10:01:25.900616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 10:01:25.902593 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 10:01:25.902611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 10:01:25.902628 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 10:01:25.902633 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 10:01:25.906542 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 10:01:25.906611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906618 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 10:01:25.906630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 10:01:25.906634 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 10:01:25.906639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 10:01:25.906563 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 10:01:25.907863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bd67b1b7ad3e21872e9aa91e61c978e0999546882ee2e21347d4f60e435e0b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:59Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.439800 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab29aa6d19002efb0309c548e059e004c5002ccde634df95e3c2661a3e81207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcda2271939034d8c6c54fa3d648500ebda150fafcce9216338fad68552a65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:59Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.455036 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c149ed39076fc7ee5538e60dbc0a8fc303a21578e5cc3ac89a3aeaad2c21c6a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:59Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.471056 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d199b778c03f10fc3b1a2623600057801f54ee3240768ede1a79213b678fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:59Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.485125 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:59Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.491857 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.491894 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.491904 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.491917 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.491927 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:59Z","lastTransitionTime":"2025-12-09T10:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.499419 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rgf44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ed6e93-eda5-4648-b185-25d2960ce0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541795eab5847f1d993b8f9f324b454b951ed3930155e455f013e8da805c019b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh57p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rgf44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:59Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.520454 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7013527e-73de-4427-af9c-e33663b1c222\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\
\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed556fa1bcef63fb127866b4317c98104f846e12a67a6865a362fe6777a3b40c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ddee8ee4fd0e1234ecff411ce29f6bad1943263a66cbb3117da6f1b6e84b47\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"message\\\":\\\"UID: UUIDName:}]\\\\nI1209 10:01:45.368189 6439 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1209 10:01:45.368217 6439 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1209 10:01:45.368249 6439 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:45Z is after 2025-08-24T17:21:41Z]\\\\nI1209 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2mnnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:59Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.531759 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m689k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02305623-2d65-47e3-ac63-5182bf50d141\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93e73de78b5120b5d2bf38748e84dad9dd5353e18130635243988d7b131ace3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg8sd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c8fa17f92ca9a774f62e20c5eeec59041bee23e426c36b2949d4f82c7e45f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg8sd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m689k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:59Z is after 2025-08-24T17:21:41Z" Dec 09 
10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.547280 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:59Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.558361 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49c6392-68b2-4847-9291-a0b4d9c1cbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a019843745b7d5198565771c39f7949cf45738e236a2283588db2e57d07f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44da8a223abf131b459b827b0e8de65b415150f406fe22f2efb7e160cba4166c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kxpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:59Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.567128 5002 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-p7t88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edef2440-1e4a-4676-9517-08b21b3b66ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc19a171af6f6875b4d953edd75048a1249b44348ba03757126ebe943c118be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzs5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7t88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:59Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.594386 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.594643 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.594854 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.595068 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.595239 5002 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:59Z","lastTransitionTime":"2025-12-09T10:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.698990 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.699076 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.699105 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.699143 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.699168 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:59Z","lastTransitionTime":"2025-12-09T10:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.801456 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.801499 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.801509 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.801524 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.801533 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:59Z","lastTransitionTime":"2025-12-09T10:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.830661 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2mnnl_7013527e-73de-4427-af9c-e33663b1c222/ovnkube-controller/2.log" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.831499 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2mnnl_7013527e-73de-4427-af9c-e33663b1c222/ovnkube-controller/1.log" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.834405 5002 generic.go:334] "Generic (PLEG): container finished" podID="7013527e-73de-4427-af9c-e33663b1c222" containerID="ed556fa1bcef63fb127866b4317c98104f846e12a67a6865a362fe6777a3b40c" exitCode=1 Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.834498 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" event={"ID":"7013527e-73de-4427-af9c-e33663b1c222","Type":"ContainerDied","Data":"ed556fa1bcef63fb127866b4317c98104f846e12a67a6865a362fe6777a3b40c"} Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.834544 5002 scope.go:117] "RemoveContainer" containerID="f9ddee8ee4fd0e1234ecff411ce29f6bad1943263a66cbb3117da6f1b6e84b47" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.835371 5002 scope.go:117] "RemoveContainer" containerID="ed556fa1bcef63fb127866b4317c98104f846e12a67a6865a362fe6777a3b40c" Dec 09 10:01:59 crc kubenswrapper[5002]: E1209 10:01:59.835511 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-2mnnl_openshift-ovn-kubernetes(7013527e-73de-4427-af9c-e33663b1c222)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" podUID="7013527e-73de-4427-af9c-e33663b1c222" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.855530 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:59Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.867841 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49c6392-68b2-4847-9291-a0b4d9c1cbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a019843745b7d5198565771c39f7949cf45738e236a2283588db2e57d07f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44da8a223abf131b459b827b0e8de65b415150f406fe22f2efb7e160cba4166c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kxpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:59Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.878593 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7t88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edef2440-1e4a-4676-9517-08b21b3b66ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc19a171af6f6875b4d953edd75048a1249b44348ba03757126ebe943c118be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzs5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7t88\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:59Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.890910 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea4578dd-5c3e-4509-b456-a42a642871d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b30d7f55d95c36f85285f235dcf2de31c04cc358d5cce4d49f9ea43945fd3f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de5d36b22bcd84f50dfb6ae8858f98665f3ae3981d5dc2233fa9e3b92db56b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa189df1ab85704e8528d42da3e500dba354bb99dace868af40a49fec5b19fa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaffe6e7f597ce14c1bc564a62ec71519af84fa5d220d14a5e8000653d396c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aaffe6e7f597ce14c1bc564a62ec71519af84fa5d220d14a5e8000653d396c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:59Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.901592 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:59Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.903648 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.903686 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.903697 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.903710 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.903720 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:59Z","lastTransitionTime":"2025-12-09T10:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.914440 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t5hm5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de8c639-f176-405c-ae34-6717f9f9458c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097fa8d84999c942645591541e1377c08cdeec593f64525db5a5aa8f7ec8cbae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9wwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t5hm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:59Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.937876 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zjdhx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60fd7aba09010bae4f15fe793e0084c71d381f63bc4c1549f2ccbe57cdb90ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zjdhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:59Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.938468 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.938496 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:59 crc 
kubenswrapper[5002]: I1209 10:01:59.938504 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.938517 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.938525 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:59Z","lastTransitionTime":"2025-12-09T10:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.950027 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-98z2f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ffb94c3-624e-48aa-aaa9-450ace4e1862\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mn62f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mn62f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-98z2f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:59Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:59 crc kubenswrapper[5002]: E1209 10:01:59.950853 5002 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056
b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951
},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb4d43e7-bcbf-4472-90e9-44716d72c15e\\\",\\\"systemUUID\\\":\\\"8af61218-105c-4188-8c40-2d81c3899a86\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-12-09T10:01:59Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.954226 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.954269 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.954278 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.954290 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.954300 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:59Z","lastTransitionTime":"2025-12-09T10:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.962665 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:59Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:59 crc kubenswrapper[5002]: E1209 10:01:59.967040 5002 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb4d43e7-bcbf-4472-90e9-44716d72c15e\\\",\\\"systemUUID\\\":\\\"8af61218-105c-4188-8c40-2d81c3899a86\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:59Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.970488 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.970549 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.970563 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.970577 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.970587 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:59Z","lastTransitionTime":"2025-12-09T10:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.980263 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rgf44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ed6e93-eda5-4648-b185-25d2960ce0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541795eab5847f1d993b8f9f324b454b951ed3930155e455f013e8da805c019b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubel
et\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh57p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rgf44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:59Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:59 crc kubenswrapper[5002]: E1209 10:01:59.982493 5002 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb4d43e7-bcbf-4472-90e9-44716d72c15e\\\",\\\"systemUUID\\\":\\\"8af61218-105c-4188-8c40-2d81c3899a86\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:59Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.985631 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.985670 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.985679 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.985695 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.985704 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:01:59Z","lastTransitionTime":"2025-12-09T10:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:01:59 crc kubenswrapper[5002]: I1209 10:01:59.994429 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bc64859-6675-4dc6-b0a1-579abb87580e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0db847425b24ea6804034220f2050b153b78d21bc1cc934dad6784c11c68dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a41e95e38748ef89ff0bc6429eb223b7821756bf1e0c84a3af512f4f0166a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-c
erts\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea3ea4cb1e3f00acc4ef769928988a0a2c2ee54afa0ab5f040ef50f465a9d6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1c1315eade2f326ac5feefc45cbcec29c7ee59fb40494f5153b7f8dbdfc404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:59Z is after 2025-08-24T17:21:41Z" Dec 09 10:01:59 crc kubenswrapper[5002]: E1209 10:01:59.997132 5002 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeByt
es\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb4d43e7-bcbf-4472-90e9-44716d72c15e\\\",\\\"systemUUID\\\":\\\"8
af61218-105c-4188-8c40-2d81c3899a86\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:59Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.000915 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.000945 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.000954 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.000969 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.000979 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:00Z","lastTransitionTime":"2025-12-09T10:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:00 crc kubenswrapper[5002]: E1209 10:02:00.012531 5002 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb4d43e7-bcbf-4472-90e9-44716d72c15e\\\",\\\"systemUUID\\\":\\\"8af61218-105c-4188-8c40-2d81c3899a86\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:00Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:00 crc kubenswrapper[5002]: E1209 10:02:00.012687 5002 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.014188 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.014242 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.014252 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.014265 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.014275 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:00Z","lastTransitionTime":"2025-12-09T10:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.018687 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48170a45-0766-4c86-af19-b829960de244\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3df655bf1ce56b5aef6728759b8b3262260171afd0e4924212afc506fa313e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e29bcc86bc8035d73f3f12857d0024d93752d70d6a77b51d98278981669ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c750385ffbe9096e2cef43dffe002fd49599e44ed69806710b492749637dbe93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73ac21971d31e17f4f76bcbb1e02201b53b12189815ced304d7913f6aa76f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd16d2319de83381932ca61f8b08dd31ecfde33e270c3ea2ea3edb3c2fa174b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":
\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:00Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.033718 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b484d6dfa4fffc554c21f9d9bc8e77a62e96ebb9426f2bb34dea4b2c9244d680\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://301eb2b68a37707ea483536d639e5f476d3cb41ad8c60fbaea2e019fb51bad48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49d7b26bbe255f1217808981337d8190bdeac4f5008ee17df5242867e3103e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1209 10:01:25.592088 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 10:01:25.592207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 10:01:25.593930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3155801387/tls.crt::/tmp/serving-cert-3155801387/tls.key\\\\\\\"\\\\nI1209 10:01:25.900616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 10:01:25.902593 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 10:01:25.902611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 10:01:25.902628 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 10:01:25.902633 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 10:01:25.906542 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 10:01:25.906611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906618 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 10:01:25.906630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 10:01:25.906634 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 10:01:25.906639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 10:01:25.906563 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 10:01:25.907863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bd67b1b7ad3e21872e9aa91e61c978e0999546882ee2e21347d4f60e435e0b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:00Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.047358 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab29aa6d19002efb0309c548e059e004c5002ccde634df95e3c2661a3e81207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcda2271939034d8c6c54fa3d648500ebda150fafcce9216338fad68552a65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:00Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.059289 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.059309 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:02:00 crc kubenswrapper[5002]: E1209 10:02:00.059666 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:02:00 crc kubenswrapper[5002]: E1209 10:02:00.060336 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.060911 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c149ed39076fc7ee5538e60dbc0a8fc303a21578e5cc3ac89a3aeaad2c21c6a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:00Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.072506 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d199b778c03f10fc3b1a2623600057801f54ee3240768ede1a79213b678fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:00Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.090878 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7013527e-73de-4427-af9c-e33663b1c222\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed556fa1bcef63fb127866b4317c98104f846e12
a67a6865a362fe6777a3b40c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ddee8ee4fd0e1234ecff411ce29f6bad1943263a66cbb3117da6f1b6e84b47\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"message\\\":\\\"UID: UUIDName:}]\\\\nI1209 10:01:45.368189 6439 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1209 10:01:45.368217 6439 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1209 10:01:45.368249 6439 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:45Z is after 2025-08-24T17:21:41Z]\\\\nI1209 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed556fa1bcef63fb127866b4317c98104f846e12a67a6865a362fe6777a3b40c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T10:01:58Z\\\",\\\"message\\\":\\\"ailed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:58Z is after 2025-08-24T17:21:41Z]\\\\nI1209 10:01:58.693873 6637 services_controller.go:445] Built service openshift-dns/dns-default LB template configs for network=default: []services.lbConfig(nil)\\\\nI1209 10:01:58.693874 6637 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI1209 10:01:58.693876 6637 obj_retry.go:303] Retry 
object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-kxpn6\\\\nI1209 10:01:58.693850 6637 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m689k in node crc\\\\nI1209 10:01:58.693892 6637 obj_retry.go\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"ho
stIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2mnnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:00Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.101943 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m689k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02305623-2d65-47e3-ac63-5182bf50d141\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93e73de78b5120b5d2bf38748e84dad9dd5353e18130635243988d7b131ace3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg8sd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c8fa17f92ca9a774f62e20c5eeec59041bee23e426c36b2949d4f82c7e45f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg8sd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m689k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:00Z is after 2025-08-24T17:21:41Z" Dec 09 
10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.117296 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.117338 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.117347 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.117362 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.117373 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:00Z","lastTransitionTime":"2025-12-09T10:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.220447 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.220482 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.220490 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.220505 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.220513 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:00Z","lastTransitionTime":"2025-12-09T10:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.322787 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.322875 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.322892 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.322939 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.322956 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:00Z","lastTransitionTime":"2025-12-09T10:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.425442 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.425485 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.425499 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.425520 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.425534 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:00Z","lastTransitionTime":"2025-12-09T10:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.527799 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.527864 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.527875 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.527892 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.527903 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:00Z","lastTransitionTime":"2025-12-09T10:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.630725 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.631201 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.631342 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.631506 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.631665 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:00Z","lastTransitionTime":"2025-12-09T10:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.734447 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.734493 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.734506 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.734521 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.734531 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:00Z","lastTransitionTime":"2025-12-09T10:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.837907 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.837958 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.837970 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.837986 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.837997 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:00Z","lastTransitionTime":"2025-12-09T10:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.841145 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2mnnl_7013527e-73de-4427-af9c-e33663b1c222/ovnkube-controller/2.log" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.845387 5002 scope.go:117] "RemoveContainer" containerID="ed556fa1bcef63fb127866b4317c98104f846e12a67a6865a362fe6777a3b40c" Dec 09 10:02:00 crc kubenswrapper[5002]: E1209 10:02:00.845574 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-2mnnl_openshift-ovn-kubernetes(7013527e-73de-4427-af9c-e33663b1c222)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" podUID="7013527e-73de-4427-af9c-e33663b1c222" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.864469 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7013527e-73de-4427-af9c-e33663b1c222\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed556fa1bcef63fb127866b4317c98104f846e12
a67a6865a362fe6777a3b40c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed556fa1bcef63fb127866b4317c98104f846e12a67a6865a362fe6777a3b40c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T10:01:58Z\\\",\\\"message\\\":\\\"ailed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:58Z is after 2025-08-24T17:21:41Z]\\\\nI1209 10:01:58.693873 6637 services_controller.go:445] Built service openshift-dns/dns-default LB template configs for network=default: []services.lbConfig(nil)\\\\nI1209 10:01:58.693874 6637 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI1209 10:01:58.693876 6637 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-kxpn6\\\\nI1209 10:01:58.693850 6637 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m689k in node crc\\\\nI1209 10:01:58.693892 6637 obj_retry.go\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2mnnl_openshift-ovn-kubernetes(7013527e-73de-4427-af9c-e33663b1c222)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2mnnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:00Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.875010 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m689k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02305623-2d65-47e3-ac63-5182bf50d141\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93e73de78b5120b5d2bf38748e84dad9dd5353e18130635243988d7b131ace3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg8sd
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c8fa17f92ca9a774f62e20c5eeec59041bee23e426c36b2949d4f82c7e45f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg8sd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m689k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:00Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.893327 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:00Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.908878 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49c6392-68b2-4847-9291-a0b4d9c1cbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a019843745b7d5198565771c39f7949cf45738e236a2283588db2e57d07f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44da8a223abf131b459b827b0e8de65b415150f406fe22f2efb7e160cba4166c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kxpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:00Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.921847 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7t88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edef2440-1e4a-4676-9517-08b21b3b66ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc19a171af6f6875b4d953edd75048a1249b44348ba03757126ebe943c118be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzs5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7t88\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:00Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.937059 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zjdhx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60fd7aba09010bae4f15fe793e0084c71d381f63bc4c1549f2ccbe57cdb90ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-12-09T10:01:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zjdhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-09T10:02:00Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.940875 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.940923 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.940931 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.940968 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.940980 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:00Z","lastTransitionTime":"2025-12-09T10:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.947466 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-98z2f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ffb94c3-624e-48aa-aaa9-450ace4e1862\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mn62f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mn62f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-98z2f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:00Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.956709 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea4578dd-5c3e-4509-b456-a42a642871d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b30d7f55d95c36f85285f235dcf2de31c04cc358d5cce4d49f9ea43945fd3f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de5d36b22bcd84f50dfb6ae8858f98665f3ae3981d5dc2233fa9e3b92db56b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa189df1ab85704e8528d42da3e500dba354bb99dace868af40a49fec5b19fa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaffe6e7f597ce14c1bc564a62ec71519af84fa5d220d14a5e8000653d396c6d\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aaffe6e7f597ce14c1bc564a62ec71519af84fa5d220d14a5e8000653d396c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:00Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.968274 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:00Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.977123 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t5hm5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de8c639-f176-405c-ae34-6717f9f9458c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097fa8d84999c942645591541e1377c08cdeec593f64525db5a5aa8f7ec8cbae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9wwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t5hm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-09T10:02:00Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:00 crc kubenswrapper[5002]: I1209 10:02:00.988631 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab29aa6d19002efb0309c548e059e004c5002ccde634df95e3c2661a3e81207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcda2271939034d8c6c54fa3d648500ebda150fafcce9216338fad68552a65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:00Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.000277 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c149ed39076fc7ee5538e60dbc0a8fc303a21578e5cc3ac89a3aeaad2c21c6a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:00Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.010424 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d199b778c03f10fc3b1a2623600057801f54ee3240768ede1a79213b678fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:01Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.025054 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:01Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.036277 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rgf44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ed6e93-eda5-4648-b185-25d2960ce0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541795eab5847f1d993b8f9f324b454b951ed3930155e455f013e8da805c019b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh57p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rgf44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:01Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.043046 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.043077 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.043088 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.043111 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.043126 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:01Z","lastTransitionTime":"2025-12-09T10:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.048033 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bc64859-6675-4dc6-b0a1-579abb87580e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0db847425b24ea6804034220f2050b153b78d21bc1cc934dad6784c11c68dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a41e95e38748ef89ff0bc6429eb223b7821756bf1e0c84a3af512f4f0166a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea3ea4cb1e3f00acc4ef769928988a0a2c2ee54afa0ab5f040ef50f465a9d6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1c1315eade2f326ac5feefc45cbcec29c7ee59fb40494f5153b7f8dbdfc404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:01Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.059151 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.059208 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:02:01 crc kubenswrapper[5002]: E1209 10:02:01.059274 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:02:01 crc kubenswrapper[5002]: E1209 10:02:01.059414 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.066004 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48170a45-0766-4c86-af19-b829960de244\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3df655bf1ce56b5aef6728759b8b3262260171afd0e4924212afc506fa313e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e29bcc86bc8035d73f3f12857d0024d93752d70d6a77b51d98278981669ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c750385ffbe9096e2cef43dffe002fd49599e44ed69806710b492749637dbe93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"
startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73ac21971d31e17f4f76bcbb1e02201b53b12189815ced304d7913f6aa76f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd16d2319de83381932ca61f8b08dd31ecfde33e270c3ea2ea3edb3c2fa174b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54a
a96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:01Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.078745 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b484d6dfa4fffc554c21f9d9bc8e77a62e96ebb9426f2bb34dea4b2c9244d680\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://301eb2b68a37707ea483536d639e5f476d3cb41ad8c60fbaea2e019fb51bad48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49d7b26bbe255f1217808981337d8190bdeac4f5008ee17df5242867e3103e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1209 10:01:25.592088 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 10:01:25.592207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 10:01:25.593930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3155801387/tls.crt::/tmp/serving-cert-3155801387/tls.key\\\\\\\"\\\\nI1209 10:01:25.900616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 10:01:25.902593 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 10:01:25.902611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 10:01:25.902628 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 10:01:25.902633 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 10:01:25.906542 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 10:01:25.906611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906618 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 10:01:25.906630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 10:01:25.906634 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 10:01:25.906639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 10:01:25.906563 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 10:01:25.907863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bd67b1b7ad3e21872e9aa91e61c978e0999546882ee2e21347d4f60e435e0b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:01Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.146726 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.146754 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.146762 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.146776 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.146787 5002 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:01Z","lastTransitionTime":"2025-12-09T10:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.250149 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.250191 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.250201 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.250215 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.250226 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:01Z","lastTransitionTime":"2025-12-09T10:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.352344 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.352398 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.352442 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.352458 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.352472 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:01Z","lastTransitionTime":"2025-12-09T10:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.455520 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.455584 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.455602 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.455628 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.455645 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:01Z","lastTransitionTime":"2025-12-09T10:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.558537 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.558902 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.559092 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.559235 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.559354 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:01Z","lastTransitionTime":"2025-12-09T10:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.661642 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.661889 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.662008 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.662096 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.662184 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:01Z","lastTransitionTime":"2025-12-09T10:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.765671 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.765721 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.765733 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.765755 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.765769 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:01Z","lastTransitionTime":"2025-12-09T10:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.818436 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ffb94c3-624e-48aa-aaa9-450ace4e1862-metrics-certs\") pod \"network-metrics-daemon-98z2f\" (UID: \"7ffb94c3-624e-48aa-aaa9-450ace4e1862\") " pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:02:01 crc kubenswrapper[5002]: E1209 10:02:01.818659 5002 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 10:02:01 crc kubenswrapper[5002]: E1209 10:02:01.818788 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ffb94c3-624e-48aa-aaa9-450ace4e1862-metrics-certs podName:7ffb94c3-624e-48aa-aaa9-450ace4e1862 nodeName:}" failed. No retries permitted until 2025-12-09 10:02:17.81875655 +0000 UTC m=+70.210807661 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7ffb94c3-624e-48aa-aaa9-450ace4e1862-metrics-certs") pod "network-metrics-daemon-98z2f" (UID: "7ffb94c3-624e-48aa-aaa9-450ace4e1862") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.868465 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.868522 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.868539 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.868564 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.868585 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:01Z","lastTransitionTime":"2025-12-09T10:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.972556 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.972614 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.972632 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.972655 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:01 crc kubenswrapper[5002]: I1209 10:02:01.972672 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:01Z","lastTransitionTime":"2025-12-09T10:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:02 crc kubenswrapper[5002]: I1209 10:02:02.059795 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:02:02 crc kubenswrapper[5002]: I1209 10:02:02.059860 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:02:02 crc kubenswrapper[5002]: E1209 10:02:02.059947 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:02:02 crc kubenswrapper[5002]: E1209 10:02:02.060053 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862" Dec 09 10:02:02 crc kubenswrapper[5002]: I1209 10:02:02.075765 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:02 crc kubenswrapper[5002]: I1209 10:02:02.075840 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:02 crc kubenswrapper[5002]: I1209 10:02:02.075851 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:02 crc kubenswrapper[5002]: I1209 10:02:02.075865 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:02 crc kubenswrapper[5002]: I1209 10:02:02.075876 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:02Z","lastTransitionTime":"2025-12-09T10:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:02 crc kubenswrapper[5002]: I1209 10:02:02.178788 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:02 crc kubenswrapper[5002]: I1209 10:02:02.179063 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:02 crc kubenswrapper[5002]: I1209 10:02:02.179226 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:02 crc kubenswrapper[5002]: I1209 10:02:02.179298 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:02 crc kubenswrapper[5002]: I1209 10:02:02.179357 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:02Z","lastTransitionTime":"2025-12-09T10:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:02 crc kubenswrapper[5002]: I1209 10:02:02.282393 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:02 crc kubenswrapper[5002]: I1209 10:02:02.282456 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:02 crc kubenswrapper[5002]: I1209 10:02:02.282473 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:02 crc kubenswrapper[5002]: I1209 10:02:02.282496 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:02 crc kubenswrapper[5002]: I1209 10:02:02.282513 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:02Z","lastTransitionTime":"2025-12-09T10:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:02 crc kubenswrapper[5002]: I1209 10:02:02.385082 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:02 crc kubenswrapper[5002]: I1209 10:02:02.385351 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:02 crc kubenswrapper[5002]: I1209 10:02:02.385413 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:02 crc kubenswrapper[5002]: I1209 10:02:02.385509 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:02 crc kubenswrapper[5002]: I1209 10:02:02.385605 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:02Z","lastTransitionTime":"2025-12-09T10:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:02 crc kubenswrapper[5002]: I1209 10:02:02.488165 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:02 crc kubenswrapper[5002]: I1209 10:02:02.488197 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:02 crc kubenswrapper[5002]: I1209 10:02:02.488205 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:02 crc kubenswrapper[5002]: I1209 10:02:02.488219 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:02 crc kubenswrapper[5002]: I1209 10:02:02.488228 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:02Z","lastTransitionTime":"2025-12-09T10:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:02 crc kubenswrapper[5002]: I1209 10:02:02.591596 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:02 crc kubenswrapper[5002]: I1209 10:02:02.591660 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:02 crc kubenswrapper[5002]: I1209 10:02:02.591684 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:02 crc kubenswrapper[5002]: I1209 10:02:02.591716 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:02 crc kubenswrapper[5002]: I1209 10:02:02.591740 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:02Z","lastTransitionTime":"2025-12-09T10:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:02 crc kubenswrapper[5002]: I1209 10:02:02.694080 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:02 crc kubenswrapper[5002]: I1209 10:02:02.694121 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:02 crc kubenswrapper[5002]: I1209 10:02:02.694135 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:02 crc kubenswrapper[5002]: I1209 10:02:02.694154 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:02 crc kubenswrapper[5002]: I1209 10:02:02.694167 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:02Z","lastTransitionTime":"2025-12-09T10:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:02 crc kubenswrapper[5002]: I1209 10:02:02.796671 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:02 crc kubenswrapper[5002]: I1209 10:02:02.796759 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:02 crc kubenswrapper[5002]: I1209 10:02:02.796786 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:02 crc kubenswrapper[5002]: I1209 10:02:02.796869 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:02 crc kubenswrapper[5002]: I1209 10:02:02.796909 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:02Z","lastTransitionTime":"2025-12-09T10:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:02 crc kubenswrapper[5002]: I1209 10:02:02.900030 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:02 crc kubenswrapper[5002]: I1209 10:02:02.900107 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:02 crc kubenswrapper[5002]: I1209 10:02:02.900137 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:02 crc kubenswrapper[5002]: I1209 10:02:02.900155 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:02 crc kubenswrapper[5002]: I1209 10:02:02.900167 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:02Z","lastTransitionTime":"2025-12-09T10:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.004121 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.004173 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.004194 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.004227 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.004250 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:03Z","lastTransitionTime":"2025-12-09T10:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.059586 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.059625 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:02:03 crc kubenswrapper[5002]: E1209 10:02:03.059731 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:02:03 crc kubenswrapper[5002]: E1209 10:02:03.059884 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.106385 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.106447 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.106472 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.106501 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.106526 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:03Z","lastTransitionTime":"2025-12-09T10:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.210043 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.210117 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.210140 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.210170 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.210194 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:03Z","lastTransitionTime":"2025-12-09T10:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.312988 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.313055 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.313078 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.313105 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.313127 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:03Z","lastTransitionTime":"2025-12-09T10:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.415428 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.416280 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.416310 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.416331 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.416345 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:03Z","lastTransitionTime":"2025-12-09T10:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.520051 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.520137 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.520163 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.520213 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.520238 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:03Z","lastTransitionTime":"2025-12-09T10:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.623142 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.623194 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.623209 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.623230 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.623241 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:03Z","lastTransitionTime":"2025-12-09T10:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.726333 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.726399 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.726412 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.726431 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.726443 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:03Z","lastTransitionTime":"2025-12-09T10:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.828889 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.828931 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.828940 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.828955 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.828966 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:03Z","lastTransitionTime":"2025-12-09T10:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.931897 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.931938 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.931954 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.931972 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:03 crc kubenswrapper[5002]: I1209 10:02:03.931981 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:03Z","lastTransitionTime":"2025-12-09T10:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.034302 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.034357 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.034370 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.034393 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.034413 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:04Z","lastTransitionTime":"2025-12-09T10:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.079428 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:02:04 crc kubenswrapper[5002]: E1209 10:02:04.079638 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.080251 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:02:04 crc kubenswrapper[5002]: E1209 10:02:04.080485 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862" Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.137925 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.137984 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.137999 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.138024 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.138040 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:04Z","lastTransitionTime":"2025-12-09T10:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.240972 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.241026 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.241039 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.241072 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.241087 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:04Z","lastTransitionTime":"2025-12-09T10:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.344212 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.344253 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.344263 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.344281 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.344293 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:04Z","lastTransitionTime":"2025-12-09T10:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.446921 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.447005 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.447032 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.447051 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.447062 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:04Z","lastTransitionTime":"2025-12-09T10:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.555334 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.555379 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.555392 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.555414 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.555427 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:04Z","lastTransitionTime":"2025-12-09T10:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.658737 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.659111 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.659204 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.659299 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.659379 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:04Z","lastTransitionTime":"2025-12-09T10:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.761729 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.761793 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.761835 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.761856 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.761868 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:04Z","lastTransitionTime":"2025-12-09T10:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.864061 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.864092 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.864101 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.864112 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.864121 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:04Z","lastTransitionTime":"2025-12-09T10:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.966241 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.966283 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.966294 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.966310 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:04 crc kubenswrapper[5002]: I1209 10:02:04.966320 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:04Z","lastTransitionTime":"2025-12-09T10:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.059719 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:02:05 crc kubenswrapper[5002]: E1209 10:02:05.059867 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.059719 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:02:05 crc kubenswrapper[5002]: E1209 10:02:05.060023 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.068491 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.068556 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.068580 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.068619 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.068641 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:05Z","lastTransitionTime":"2025-12-09T10:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.172364 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.172401 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.172412 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.172427 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.172440 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:05Z","lastTransitionTime":"2025-12-09T10:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.275544 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.275611 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.275630 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.275653 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.275673 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:05Z","lastTransitionTime":"2025-12-09T10:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.378608 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.378671 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.378684 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.378701 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.378714 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:05Z","lastTransitionTime":"2025-12-09T10:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.481042 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.481332 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.481396 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.481472 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.481535 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:05Z","lastTransitionTime":"2025-12-09T10:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.584693 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.584724 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.584753 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.584766 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.584774 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:05Z","lastTransitionTime":"2025-12-09T10:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.687261 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.687312 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.687353 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.687370 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.687381 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:05Z","lastTransitionTime":"2025-12-09T10:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.790315 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.790359 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.790370 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.790386 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.790399 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:05Z","lastTransitionTime":"2025-12-09T10:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.892731 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.892764 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.892776 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.892793 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.892804 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:05Z","lastTransitionTime":"2025-12-09T10:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.995671 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.995717 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.995731 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.995750 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:05 crc kubenswrapper[5002]: I1209 10:02:05.995766 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:05Z","lastTransitionTime":"2025-12-09T10:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:06 crc kubenswrapper[5002]: I1209 10:02:06.060291 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:02:06 crc kubenswrapper[5002]: I1209 10:02:06.060291 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:02:06 crc kubenswrapper[5002]: E1209 10:02:06.061031 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:02:06 crc kubenswrapper[5002]: E1209 10:02:06.060958 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862" Dec 09 10:02:06 crc kubenswrapper[5002]: I1209 10:02:06.098522 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:06 crc kubenswrapper[5002]: I1209 10:02:06.098589 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:06 crc kubenswrapper[5002]: I1209 10:02:06.098612 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:06 crc kubenswrapper[5002]: I1209 10:02:06.098640 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:06 crc kubenswrapper[5002]: I1209 10:02:06.098662 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:06Z","lastTransitionTime":"2025-12-09T10:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:06 crc kubenswrapper[5002]: I1209 10:02:06.202066 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:06 crc kubenswrapper[5002]: I1209 10:02:06.202105 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:06 crc kubenswrapper[5002]: I1209 10:02:06.202116 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:06 crc kubenswrapper[5002]: I1209 10:02:06.202133 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:06 crc kubenswrapper[5002]: I1209 10:02:06.202144 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:06Z","lastTransitionTime":"2025-12-09T10:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:06 crc kubenswrapper[5002]: I1209 10:02:06.303932 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:06 crc kubenswrapper[5002]: I1209 10:02:06.303987 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:06 crc kubenswrapper[5002]: I1209 10:02:06.304005 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:06 crc kubenswrapper[5002]: I1209 10:02:06.304029 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:06 crc kubenswrapper[5002]: I1209 10:02:06.304047 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:06Z","lastTransitionTime":"2025-12-09T10:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:06 crc kubenswrapper[5002]: I1209 10:02:06.406111 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:06 crc kubenswrapper[5002]: I1209 10:02:06.406146 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:06 crc kubenswrapper[5002]: I1209 10:02:06.406155 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:06 crc kubenswrapper[5002]: I1209 10:02:06.406168 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:06 crc kubenswrapper[5002]: I1209 10:02:06.406177 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:06Z","lastTransitionTime":"2025-12-09T10:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:06 crc kubenswrapper[5002]: I1209 10:02:06.508778 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:06 crc kubenswrapper[5002]: I1209 10:02:06.508856 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:06 crc kubenswrapper[5002]: I1209 10:02:06.508869 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:06 crc kubenswrapper[5002]: I1209 10:02:06.508886 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:06 crc kubenswrapper[5002]: I1209 10:02:06.508896 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:06Z","lastTransitionTime":"2025-12-09T10:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:06 crc kubenswrapper[5002]: I1209 10:02:06.611284 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:06 crc kubenswrapper[5002]: I1209 10:02:06.611325 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:06 crc kubenswrapper[5002]: I1209 10:02:06.611338 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:06 crc kubenswrapper[5002]: I1209 10:02:06.611357 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:06 crc kubenswrapper[5002]: I1209 10:02:06.611369 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:06Z","lastTransitionTime":"2025-12-09T10:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:06 crc kubenswrapper[5002]: I1209 10:02:06.713985 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:06 crc kubenswrapper[5002]: I1209 10:02:06.714025 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:06 crc kubenswrapper[5002]: I1209 10:02:06.714035 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:06 crc kubenswrapper[5002]: I1209 10:02:06.714051 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:06 crc kubenswrapper[5002]: I1209 10:02:06.714061 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:06Z","lastTransitionTime":"2025-12-09T10:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:06 crc kubenswrapper[5002]: I1209 10:02:06.816900 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:06 crc kubenswrapper[5002]: I1209 10:02:06.816937 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:06 crc kubenswrapper[5002]: I1209 10:02:06.816947 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:06 crc kubenswrapper[5002]: I1209 10:02:06.816959 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:06 crc kubenswrapper[5002]: I1209 10:02:06.816969 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:06Z","lastTransitionTime":"2025-12-09T10:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:06 crc kubenswrapper[5002]: I1209 10:02:06.919847 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:06 crc kubenswrapper[5002]: I1209 10:02:06.919893 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:06 crc kubenswrapper[5002]: I1209 10:02:06.919905 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:06 crc kubenswrapper[5002]: I1209 10:02:06.919922 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:06 crc kubenswrapper[5002]: I1209 10:02:06.919933 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:06Z","lastTransitionTime":"2025-12-09T10:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.027312 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.027788 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.028054 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.028295 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.028515 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:07Z","lastTransitionTime":"2025-12-09T10:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.059835 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:02:07 crc kubenswrapper[5002]: E1209 10:02:07.060216 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.059961 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:02:07 crc kubenswrapper[5002]: E1209 10:02:07.060510 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.131389 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.131436 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.131445 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.131458 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.131467 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:07Z","lastTransitionTime":"2025-12-09T10:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.233929 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.233967 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.233977 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.233995 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.234005 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:07Z","lastTransitionTime":"2025-12-09T10:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.336254 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.336300 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.336316 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.336333 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.336344 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:07Z","lastTransitionTime":"2025-12-09T10:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.439302 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.439407 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.439419 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.439432 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.439441 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:07Z","lastTransitionTime":"2025-12-09T10:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.541938 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.541972 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.541983 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.541999 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.542010 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:07Z","lastTransitionTime":"2025-12-09T10:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.644311 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.644342 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.644350 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.644362 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.644371 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:07Z","lastTransitionTime":"2025-12-09T10:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.747116 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.747184 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.747203 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.747227 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.747248 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:07Z","lastTransitionTime":"2025-12-09T10:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.850427 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.850500 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.850517 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.850546 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.850564 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:07Z","lastTransitionTime":"2025-12-09T10:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.954298 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.954367 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.954386 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.954408 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:07 crc kubenswrapper[5002]: I1209 10:02:07.954425 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:07Z","lastTransitionTime":"2025-12-09T10:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.057207 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.057240 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.057250 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.057265 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.057276 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:08Z","lastTransitionTime":"2025-12-09T10:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.060097 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:02:08 crc kubenswrapper[5002]: E1209 10:02:08.060289 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.060178 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:02:08 crc kubenswrapper[5002]: E1209 10:02:08.060990 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.082997 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b484d6dfa4fffc554c21f9d9bc8e77a62e96ebb9426f2bb34dea4b2c9244d680\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://301eb2b68a37707ea483536d639e5f476d3cb41ad8c60fbaea2e019fb51bad48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"q
uay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49d7b26bbe255f1217808981337d8190bdeac4f5008ee17df5242867e3103e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1209 10:01:25.592088 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 10:01:25.592207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 10:01:25.593930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3155801387/tls.crt::/tmp/serving-cert-3155801387/tls.key\\\\\\\"\\\\nI1209 10:01:25.900616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 10:01:25.902593 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 10:01:25.902611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 10:01:25.902628 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 10:01:25.902633 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 10:01:25.906542 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 10:01:25.906611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906618 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 10:01:25.906630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 10:01:25.906634 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 10:01:25.906639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 10:01:25.906563 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 10:01:25.907863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bd67b1b7ad3e21872e9aa91e61c978e0999546882ee2e21347d4f60e435e0b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:08Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.096411 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab29aa6d19002efb0309c548e059e004c5002ccde634df95e3c2661a3e81207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcda2271939034d8c6c54fa3d648500ebda150fafcce9216338fad68552a65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:08Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.109962 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c149ed39076fc7ee5538e60dbc0a8fc303a21578e5cc3ac89a3aeaad2c21c6a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:08Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.123734 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d199b778c03f10fc3b1a2623600057801f54ee3240768ede1a79213b678fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:08Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.143212 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:08Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.159772 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rgf44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ed6e93-eda5-4648-b185-25d2960ce0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541795eab5847f1d993b8f9f324b454b951ed3930155e455f013e8da805c019b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh57p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rgf44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:08Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.160149 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.160272 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.160296 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.160325 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.160345 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:08Z","lastTransitionTime":"2025-12-09T10:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.179654 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bc64859-6675-4dc6-b0a1-579abb87580e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0db847425b24ea6804034220f2050b153b78d21bc1cc934dad6784c11c68dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a41e95e38748ef89ff0bc6429eb223b7821756bf1e0c84a3af512f4f0166a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea3ea4cb1e3f00acc4ef769928988a0a2c2ee54afa0ab5f040ef50f465a9d6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1c1315eade2f326ac5feefc45cbcec29c7ee59fb40494f5153b7f8dbdfc404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:08Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.202156 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48170a45-0766-4c86-af19-b829960de244\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3df655bf1ce56b5aef6728759b8b3262260171afd0e4924212afc506fa313e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e29bcc86bc8035d73f3f12857d0024d93752d70d6a77b51d98278981669ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c750385ffbe9096e2cef43dffe002fd49599e44ed69806710b492749637dbe93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73ac21971d31e17f4f76bcbb1e02201b53b12
189815ced304d7913f6aa76f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd16d2319de83381932ca61f8b08dd31ecfde33e270c3ea2ea3edb3c2fa174b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:08Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.215195 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m689k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02305623-2d65-47e3-ac63-5182bf50d141\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93e73de78b5120b5d2bf38748e84dad9dd5353e18130635243988d7b131ace3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\
\\",\\\"name\\\":\\\"kube-api-access-dg8sd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c8fa17f92ca9a774f62e20c5eeec59041bee23e426c36b2949d4f82c7e45f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg8sd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m689k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:08Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.234489 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7013527e-73de-4427-af9c-e33663b1c222\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed556fa1bcef63fb127866b4317c98104f846e12
a67a6865a362fe6777a3b40c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed556fa1bcef63fb127866b4317c98104f846e12a67a6865a362fe6777a3b40c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T10:01:58Z\\\",\\\"message\\\":\\\"ailed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:58Z is after 2025-08-24T17:21:41Z]\\\\nI1209 10:01:58.693873 6637 services_controller.go:445] Built service openshift-dns/dns-default LB template configs for network=default: []services.lbConfig(nil)\\\\nI1209 10:01:58.693874 6637 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI1209 10:01:58.693876 6637 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-kxpn6\\\\nI1209 10:01:58.693850 6637 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m689k in node crc\\\\nI1209 10:01:58.693892 6637 obj_retry.go\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2mnnl_openshift-ovn-kubernetes(7013527e-73de-4427-af9c-e33663b1c222)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2mnnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:08Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.245560 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7t88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edef2440-1e4a-4676-9517-08b21b3b66ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc19a171af6f6875b4d953edd75048a1249b44348ba03757126ebe943c118be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzs5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7t88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:08Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.258510 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:08Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.262244 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.262279 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.262287 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.262303 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.262314 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:08Z","lastTransitionTime":"2025-12-09T10:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.270643 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49c6392-68b2-4847-9291-a0b4d9c1cbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a019843745b7d5198565771c39f7949cf45738e236a2283588db2e57d07f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44da8a223abf131b459b827b0e8de65b415150f406fe22f2efb7e160cba4166c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kxpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:08Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.282406 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t5hm5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de8c639-f176-405c-ae34-6717f9f9458c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097fa8d84999c942645591541e1377c08cdeec593f64525db5a5aa8f7ec8cbae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9wwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t5hm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:08Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.295225 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zjdhx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60fd7aba09010bae4f15fe793e0084c71d381f63bc4c1549f2ccbe57cdb90ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zjdhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:08Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.304721 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-98z2f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ffb94c3-624e-48aa-aaa9-450ace4e1862\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mn62f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mn62f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-98z2f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:08Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.316316 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea4578dd-5c3e-4509-b456-a42a642871d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b30d7f55d95c36f85285f235dcf2de31c04cc358d5cce4d49f9ea43945fd3f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de5d36b22bcd84f50dfb6ae8858f98665f3ae3981d5dc2233fa9e3b92db56b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa189df1ab85704e8528d42da3e500dba354bb99dace868af40a49fec5b19fa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaffe6e7f597ce14c1bc564a62ec71519af84fa5d220d14a5e8000653d396c6d\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aaffe6e7f597ce14c1bc564a62ec71519af84fa5d220d14a5e8000653d396c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:08Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.333044 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:08Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.364565 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.364805 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.364882 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.364977 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.365078 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:08Z","lastTransitionTime":"2025-12-09T10:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.468016 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.468479 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.468726 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.468981 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.469196 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:08Z","lastTransitionTime":"2025-12-09T10:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.571681 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.571722 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.571735 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.571751 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.571762 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:08Z","lastTransitionTime":"2025-12-09T10:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.674969 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.675007 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.675018 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.675034 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.675048 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:08Z","lastTransitionTime":"2025-12-09T10:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.777221 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.777295 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.777320 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.777352 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.777378 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:08Z","lastTransitionTime":"2025-12-09T10:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.880525 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.880914 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.880929 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.880950 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.880966 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:08Z","lastTransitionTime":"2025-12-09T10:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.983860 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.984101 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.984210 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.984304 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:08 crc kubenswrapper[5002]: I1209 10:02:08.984383 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:08Z","lastTransitionTime":"2025-12-09T10:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:09 crc kubenswrapper[5002]: I1209 10:02:09.059858 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:02:09 crc kubenswrapper[5002]: I1209 10:02:09.059858 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:02:09 crc kubenswrapper[5002]: E1209 10:02:09.059996 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:02:09 crc kubenswrapper[5002]: E1209 10:02:09.060081 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:02:09 crc kubenswrapper[5002]: I1209 10:02:09.086558 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:09 crc kubenswrapper[5002]: I1209 10:02:09.086609 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:09 crc kubenswrapper[5002]: I1209 10:02:09.086625 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:09 crc kubenswrapper[5002]: I1209 10:02:09.086646 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:09 crc kubenswrapper[5002]: I1209 10:02:09.086658 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:09Z","lastTransitionTime":"2025-12-09T10:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:09 crc kubenswrapper[5002]: I1209 10:02:09.189167 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:09 crc kubenswrapper[5002]: I1209 10:02:09.189208 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:09 crc kubenswrapper[5002]: I1209 10:02:09.189221 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:09 crc kubenswrapper[5002]: I1209 10:02:09.189237 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:09 crc kubenswrapper[5002]: I1209 10:02:09.189253 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:09Z","lastTransitionTime":"2025-12-09T10:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:09 crc kubenswrapper[5002]: I1209 10:02:09.292549 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:09 crc kubenswrapper[5002]: I1209 10:02:09.292856 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:09 crc kubenswrapper[5002]: I1209 10:02:09.292981 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:09 crc kubenswrapper[5002]: I1209 10:02:09.293231 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:09 crc kubenswrapper[5002]: I1209 10:02:09.293340 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:09Z","lastTransitionTime":"2025-12-09T10:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:09 crc kubenswrapper[5002]: I1209 10:02:09.395856 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:09 crc kubenswrapper[5002]: I1209 10:02:09.396284 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:09 crc kubenswrapper[5002]: I1209 10:02:09.396587 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:09 crc kubenswrapper[5002]: I1209 10:02:09.396906 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:09 crc kubenswrapper[5002]: I1209 10:02:09.397115 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:09Z","lastTransitionTime":"2025-12-09T10:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:09 crc kubenswrapper[5002]: I1209 10:02:09.500400 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:09 crc kubenswrapper[5002]: I1209 10:02:09.500466 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:09 crc kubenswrapper[5002]: I1209 10:02:09.500491 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:09 crc kubenswrapper[5002]: I1209 10:02:09.500519 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:09 crc kubenswrapper[5002]: I1209 10:02:09.500541 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:09Z","lastTransitionTime":"2025-12-09T10:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:09 crc kubenswrapper[5002]: I1209 10:02:09.603374 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:09 crc kubenswrapper[5002]: I1209 10:02:09.603678 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:09 crc kubenswrapper[5002]: I1209 10:02:09.603772 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:09 crc kubenswrapper[5002]: I1209 10:02:09.603908 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:09 crc kubenswrapper[5002]: I1209 10:02:09.603986 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:09Z","lastTransitionTime":"2025-12-09T10:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:09 crc kubenswrapper[5002]: I1209 10:02:09.706088 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:09 crc kubenswrapper[5002]: I1209 10:02:09.706166 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:09 crc kubenswrapper[5002]: I1209 10:02:09.706177 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:09 crc kubenswrapper[5002]: I1209 10:02:09.706191 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:09 crc kubenswrapper[5002]: I1209 10:02:09.706204 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:09Z","lastTransitionTime":"2025-12-09T10:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:09 crc kubenswrapper[5002]: I1209 10:02:09.808872 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:09 crc kubenswrapper[5002]: I1209 10:02:09.809132 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:09 crc kubenswrapper[5002]: I1209 10:02:09.809213 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:09 crc kubenswrapper[5002]: I1209 10:02:09.809302 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:09 crc kubenswrapper[5002]: I1209 10:02:09.809377 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:09Z","lastTransitionTime":"2025-12-09T10:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:09 crc kubenswrapper[5002]: I1209 10:02:09.911837 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:09 crc kubenswrapper[5002]: I1209 10:02:09.911869 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:09 crc kubenswrapper[5002]: I1209 10:02:09.911879 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:09 crc kubenswrapper[5002]: I1209 10:02:09.911893 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:09 crc kubenswrapper[5002]: I1209 10:02:09.911903 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:09Z","lastTransitionTime":"2025-12-09T10:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.015233 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.015288 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.015306 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.015332 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.015352 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:10Z","lastTransitionTime":"2025-12-09T10:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.060340 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:02:10 crc kubenswrapper[5002]: E1209 10:02:10.060618 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.060731 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:02:10 crc kubenswrapper[5002]: E1209 10:02:10.060859 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.117496 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.117545 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.117561 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.117583 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.117599 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:10Z","lastTransitionTime":"2025-12-09T10:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.220447 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.220513 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.220548 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.220563 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.220575 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:10Z","lastTransitionTime":"2025-12-09T10:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.323325 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.323371 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.323386 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.323403 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.323416 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:10Z","lastTransitionTime":"2025-12-09T10:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.377108 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.377139 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.377148 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.377161 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.377170 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:10Z","lastTransitionTime":"2025-12-09T10:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:10 crc kubenswrapper[5002]: E1209 10:02:10.389377 5002 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:02:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:02:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:02:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:02:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb4d43e7-bcbf-4472-90e9-44716d72c15e\\\",\\\"systemUUID\\\":\\\"8af61218-105c-4188-8c40-2d81c3899a86\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:10Z is after 
2025-08-24T17:21:41Z" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.393649 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.393677 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.393710 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.393725 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.393735 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:10Z","lastTransitionTime":"2025-12-09T10:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:10 crc kubenswrapper[5002]: E1209 10:02:10.410717 5002 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [...status patch payload identical to the previous attempt above, elided...] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:10Z is after 2025-08-24T17:21:41Z"
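
Every retry in this stretch fails the same way: the kubelet's node-status PATCH is intercepted by the node.network-node-identity.openshift.io validating webhook at https://127.0.0.1:9743, whose serving certificate expired on 2025-08-24T17:21:41Z, long before the node's current clock of 2025-12-09T10:02:10Z. A minimal Go sketch for confirming the certificate window from the node itself (an illustrative diagnostic, not part of the kubelet; only the endpoint address is taken from the log):

// certcheck.go: connect to the webhook endpoint named in the error above and
// print its serving certificate's validity window. Verification is disabled
// deliberately, because the certificate is already known to be expired.
package main

import (
	"crypto/tls"
	"fmt"
	"log"
)

func main() {
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{
		InsecureSkipVerify: true, // read the certificate without trusting it
	})
	if err != nil {
		log.Fatalf("dial webhook endpoint: %v", err)
	}
	defer conn.Close()
	for _, cert := range conn.ConnectionState().PeerCertificates {
		fmt.Printf("subject=%q notBefore=%s notAfter=%s\n",
			cert.Subject.CommonName,
			cert.NotBefore.UTC().Format("2006-01-02T15:04:05Z"),
			cert.NotAfter.UTC().Format("2006-01-02T15:04:05Z"))
	}
}
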
Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.416400 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.416453 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.416465 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.416483 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.416494 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:10Z","lastTransitionTime":"2025-12-09T10:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:10 crc kubenswrapper[5002]: E1209 10:02:10.429305 5002 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [...identical status patch payload elided...] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:10Z is after 2025-08-24T17:21:41Z"
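
Separate from the webhook failure, each cycle also records the Ready condition as False because no CNI configuration file exists in /etc/kubernetes/cni/net.d/, meaning the network plugin has not yet written its config. A small hedged sketch for checking whether that directory has been populated (the path is taken verbatim from the message above):

// cnicheck.go: inspect /etc/kubernetes/cni/net.d, the directory the
// NetworkPluginNotReady message reports as empty. A missing or empty
// directory matches the NotReady reason in the log.
package main

import (
	"fmt"
	"log"
	"os"
)

func main() {
	const dir = "/etc/kubernetes/cni/net.d"
	entries, err := os.ReadDir(dir)
	if err != nil {
		log.Fatalf("read CNI config dir: %v", err) // a missing dir also explains the NotReady reason
	}
	if len(entries) == 0 {
		fmt.Println("no CNI configuration files yet; network provider has not started")
		return
	}
	for _, e := range entries {
		fmt.Println(e.Name())
	}
}
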
Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.433898 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.433950 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.433965 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.433983 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.433993 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:10Z","lastTransitionTime":"2025-12-09T10:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:10 crc kubenswrapper[5002]: E1209 10:02:10.451725 5002 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [...identical status patch payload elided...] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:10Z is after 2025-08-24T17:21:41Z"
Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.455867 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.456042 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.456165 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.456293 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.456419 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:10Z","lastTransitionTime":"2025-12-09T10:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:10 crc kubenswrapper[5002]: E1209 10:02:10.473929 5002 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [...identical status patch payload elided...] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:10Z is after 2025-08-24T17:21:41Z"
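
The run of near-identical "Error updating node status, will retry" entries is the kubelet's bounded retry loop: it attempts the status update a fixed number of times per sync (five attempts via the upstream kubelet's nodeStatusUpdateRetry constant, which appears to match the five failures logged here) and only then gives up with the "exceeds retry count" error on the next line. A rough sketch of that pattern, with patchNodeStatus as a hypothetical stand-in for the real API call:

// retryloop.go: the shape of the kubelet's node-status update loop, assuming
// the upstream default of five attempts (nodeStatusUpdateRetry). Each failed
// attempt corresponds to one "will retry" line in the log; exhausting them
// yields the single "exceeds retry count" error.
package main

import (
	"errors"
	"fmt"
)

const nodeStatusUpdateRetry = 5 // upstream kubelet default (assumption)

// patchNodeStatus is a hypothetical stand-in for the PATCH against the API
// server; here it always fails, like the expired-certificate webhook above.
func patchNodeStatus() error {
	return errors.New(`failed calling webhook "node.network-node-identity.openshift.io": certificate has expired`)
}

func updateNodeStatus() error {
	for i := 0; i < nodeStatusUpdateRetry; i++ {
		if err := patchNodeStatus(); err != nil {
			fmt.Printf("Error updating node status, will retry: %v\n", err)
			continue
		}
		return nil
	}
	return errors.New("update node status exceeds retry count")
}

func main() {
	if err := updateNodeStatus(); err != nil {
		fmt.Println("Unable to update node status:", err)
	}
}
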
2025-08-24T17:21:41Z" Dec 09 10:02:10 crc kubenswrapper[5002]: E1209 10:02:10.474473 5002 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.476504 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.476534 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.476541 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.476555 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.476565 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:10Z","lastTransitionTime":"2025-12-09T10:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.579501 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.580046 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.580291 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.580537 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.580808 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:10Z","lastTransitionTime":"2025-12-09T10:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.684611 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.684668 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.684681 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.684700 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.684713 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:10Z","lastTransitionTime":"2025-12-09T10:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.786683 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.786740 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.786755 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.786775 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.786790 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:10Z","lastTransitionTime":"2025-12-09T10:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.889292 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.889329 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.889339 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.889357 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.889370 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:10Z","lastTransitionTime":"2025-12-09T10:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.993159 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.993602 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.993636 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.993669 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:10 crc kubenswrapper[5002]: I1209 10:02:10.993686 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:10Z","lastTransitionTime":"2025-12-09T10:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:11 crc kubenswrapper[5002]: I1209 10:02:11.059709 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:02:11 crc kubenswrapper[5002]: I1209 10:02:11.059728 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:02:11 crc kubenswrapper[5002]: E1209 10:02:11.059869 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:02:11 crc kubenswrapper[5002]: E1209 10:02:11.059932 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:02:11 crc kubenswrapper[5002]: I1209 10:02:11.097294 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:11 crc kubenswrapper[5002]: I1209 10:02:11.097357 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:11 crc kubenswrapper[5002]: I1209 10:02:11.097369 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:11 crc kubenswrapper[5002]: I1209 10:02:11.097385 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:11 crc kubenswrapper[5002]: I1209 10:02:11.097398 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:11Z","lastTransitionTime":"2025-12-09T10:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:11 crc kubenswrapper[5002]: I1209 10:02:11.200345 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:11 crc kubenswrapper[5002]: I1209 10:02:11.200959 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:11 crc kubenswrapper[5002]: I1209 10:02:11.200997 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:11 crc kubenswrapper[5002]: I1209 10:02:11.201029 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:11 crc kubenswrapper[5002]: I1209 10:02:11.201051 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:11Z","lastTransitionTime":"2025-12-09T10:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:11 crc kubenswrapper[5002]: I1209 10:02:11.303861 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:11 crc kubenswrapper[5002]: I1209 10:02:11.303899 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:11 crc kubenswrapper[5002]: I1209 10:02:11.303908 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:11 crc kubenswrapper[5002]: I1209 10:02:11.303921 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:11 crc kubenswrapper[5002]: I1209 10:02:11.303930 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:11Z","lastTransitionTime":"2025-12-09T10:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:11 crc kubenswrapper[5002]: I1209 10:02:11.406161 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:11 crc kubenswrapper[5002]: I1209 10:02:11.406223 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:11 crc kubenswrapper[5002]: I1209 10:02:11.406234 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:11 crc kubenswrapper[5002]: I1209 10:02:11.406249 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:11 crc kubenswrapper[5002]: I1209 10:02:11.406261 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:11Z","lastTransitionTime":"2025-12-09T10:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:11 crc kubenswrapper[5002]: I1209 10:02:11.508571 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:11 crc kubenswrapper[5002]: I1209 10:02:11.508610 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:11 crc kubenswrapper[5002]: I1209 10:02:11.508622 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:11 crc kubenswrapper[5002]: I1209 10:02:11.508639 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:11 crc kubenswrapper[5002]: I1209 10:02:11.508649 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:11Z","lastTransitionTime":"2025-12-09T10:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:11 crc kubenswrapper[5002]: I1209 10:02:11.611409 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:11 crc kubenswrapper[5002]: I1209 10:02:11.611454 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:11 crc kubenswrapper[5002]: I1209 10:02:11.611468 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:11 crc kubenswrapper[5002]: I1209 10:02:11.611487 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:11 crc kubenswrapper[5002]: I1209 10:02:11.611500 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:11Z","lastTransitionTime":"2025-12-09T10:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:11 crc kubenswrapper[5002]: I1209 10:02:11.713895 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:11 crc kubenswrapper[5002]: I1209 10:02:11.713927 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:11 crc kubenswrapper[5002]: I1209 10:02:11.713954 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:11 crc kubenswrapper[5002]: I1209 10:02:11.713968 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:11 crc kubenswrapper[5002]: I1209 10:02:11.713976 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:11Z","lastTransitionTime":"2025-12-09T10:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:11 crc kubenswrapper[5002]: I1209 10:02:11.816055 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:11 crc kubenswrapper[5002]: I1209 10:02:11.816112 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:11 crc kubenswrapper[5002]: I1209 10:02:11.816127 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:11 crc kubenswrapper[5002]: I1209 10:02:11.816146 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:11 crc kubenswrapper[5002]: I1209 10:02:11.816158 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:11Z","lastTransitionTime":"2025-12-09T10:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:11 crc kubenswrapper[5002]: I1209 10:02:11.919042 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:11 crc kubenswrapper[5002]: I1209 10:02:11.919129 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:11 crc kubenswrapper[5002]: I1209 10:02:11.919154 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:11 crc kubenswrapper[5002]: I1209 10:02:11.919185 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:11 crc kubenswrapper[5002]: I1209 10:02:11.919210 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:11Z","lastTransitionTime":"2025-12-09T10:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.021868 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.021912 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.021921 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.021935 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.021944 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:12Z","lastTransitionTime":"2025-12-09T10:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.060123 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:02:12 crc kubenswrapper[5002]: E1209 10:02:12.060256 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.060152 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:02:12 crc kubenswrapper[5002]: E1209 10:02:12.060366 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862" Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.124185 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.124229 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.124239 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.124256 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.124266 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:12Z","lastTransitionTime":"2025-12-09T10:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.226903 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.227435 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.227524 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.227609 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.227689 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:12Z","lastTransitionTime":"2025-12-09T10:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.330245 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.330533 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.330631 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.330719 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.330786 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:12Z","lastTransitionTime":"2025-12-09T10:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.433062 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.433097 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.433106 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.433121 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.433131 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:12Z","lastTransitionTime":"2025-12-09T10:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.535609 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.535896 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.536056 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.536167 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.536255 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:12Z","lastTransitionTime":"2025-12-09T10:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.638978 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.639025 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.639042 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.639059 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.639071 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:12Z","lastTransitionTime":"2025-12-09T10:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.741605 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.741667 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.741682 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.741700 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.741715 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:12Z","lastTransitionTime":"2025-12-09T10:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.843419 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.843469 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.843486 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.843502 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.843513 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:12Z","lastTransitionTime":"2025-12-09T10:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.946327 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.946370 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.946381 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.946397 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:12 crc kubenswrapper[5002]: I1209 10:02:12.946407 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:12Z","lastTransitionTime":"2025-12-09T10:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.048461 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.048497 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.048505 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.048517 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.048529 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:13Z","lastTransitionTime":"2025-12-09T10:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.059746 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.059770 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:02:13 crc kubenswrapper[5002]: E1209 10:02:13.059887 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:02:13 crc kubenswrapper[5002]: E1209 10:02:13.059965 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.060639 5002 scope.go:117] "RemoveContainer" containerID="ed556fa1bcef63fb127866b4317c98104f846e12a67a6865a362fe6777a3b40c" Dec 09 10:02:13 crc kubenswrapper[5002]: E1209 10:02:13.060800 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-2mnnl_openshift-ovn-kubernetes(7013527e-73de-4427-af9c-e33663b1c222)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" podUID="7013527e-73de-4427-af9c-e33663b1c222" Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.151047 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.151083 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.151093 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.151108 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.151119 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:13Z","lastTransitionTime":"2025-12-09T10:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.253562 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.253607 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.253618 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.253634 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.253643 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:13Z","lastTransitionTime":"2025-12-09T10:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.358289 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.358545 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.358626 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.358661 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.358746 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:13Z","lastTransitionTime":"2025-12-09T10:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.462125 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.462173 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.462185 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.462203 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.462216 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:13Z","lastTransitionTime":"2025-12-09T10:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.565225 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.565268 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.565281 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.565297 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.565307 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:13Z","lastTransitionTime":"2025-12-09T10:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.668862 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.668906 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.668914 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.668928 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.668937 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:13Z","lastTransitionTime":"2025-12-09T10:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.771472 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.771506 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.771523 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.771540 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.771551 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:13Z","lastTransitionTime":"2025-12-09T10:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.874376 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.874422 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.874434 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.874453 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.874464 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:13Z","lastTransitionTime":"2025-12-09T10:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.977467 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.977500 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.977509 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.977521 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:13 crc kubenswrapper[5002]: I1209 10:02:13.977532 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:13Z","lastTransitionTime":"2025-12-09T10:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:14 crc kubenswrapper[5002]: I1209 10:02:14.059507 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:02:14 crc kubenswrapper[5002]: I1209 10:02:14.059536 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:02:14 crc kubenswrapper[5002]: E1209 10:02:14.059643 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:02:14 crc kubenswrapper[5002]: E1209 10:02:14.059806 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862" Dec 09 10:02:14 crc kubenswrapper[5002]: I1209 10:02:14.080388 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:14 crc kubenswrapper[5002]: I1209 10:02:14.080436 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:14 crc kubenswrapper[5002]: I1209 10:02:14.080451 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:14 crc kubenswrapper[5002]: I1209 10:02:14.080473 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:14 crc kubenswrapper[5002]: I1209 10:02:14.080490 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:14Z","lastTransitionTime":"2025-12-09T10:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:14 crc kubenswrapper[5002]: I1209 10:02:14.183424 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:14 crc kubenswrapper[5002]: I1209 10:02:14.183507 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:14 crc kubenswrapper[5002]: I1209 10:02:14.183527 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:14 crc kubenswrapper[5002]: I1209 10:02:14.183548 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:14 crc kubenswrapper[5002]: I1209 10:02:14.183562 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:14Z","lastTransitionTime":"2025-12-09T10:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:14 crc kubenswrapper[5002]: I1209 10:02:14.286056 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:14 crc kubenswrapper[5002]: I1209 10:02:14.286101 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:14 crc kubenswrapper[5002]: I1209 10:02:14.286112 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:14 crc kubenswrapper[5002]: I1209 10:02:14.286128 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:14 crc kubenswrapper[5002]: I1209 10:02:14.286139 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:14Z","lastTransitionTime":"2025-12-09T10:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:14 crc kubenswrapper[5002]: I1209 10:02:14.388262 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:14 crc kubenswrapper[5002]: I1209 10:02:14.388304 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:14 crc kubenswrapper[5002]: I1209 10:02:14.388312 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:14 crc kubenswrapper[5002]: I1209 10:02:14.388328 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:14 crc kubenswrapper[5002]: I1209 10:02:14.388341 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:14Z","lastTransitionTime":"2025-12-09T10:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:14 crc kubenswrapper[5002]: I1209 10:02:14.490055 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:14 crc kubenswrapper[5002]: I1209 10:02:14.490090 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:14 crc kubenswrapper[5002]: I1209 10:02:14.490106 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:14 crc kubenswrapper[5002]: I1209 10:02:14.490121 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:14 crc kubenswrapper[5002]: I1209 10:02:14.490131 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:14Z","lastTransitionTime":"2025-12-09T10:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:14 crc kubenswrapper[5002]: I1209 10:02:14.592143 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:14 crc kubenswrapper[5002]: I1209 10:02:14.592176 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:14 crc kubenswrapper[5002]: I1209 10:02:14.592187 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:14 crc kubenswrapper[5002]: I1209 10:02:14.592201 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:14 crc kubenswrapper[5002]: I1209 10:02:14.592210 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:14Z","lastTransitionTime":"2025-12-09T10:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:14 crc kubenswrapper[5002]: I1209 10:02:14.694901 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:14 crc kubenswrapper[5002]: I1209 10:02:14.694954 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:14 crc kubenswrapper[5002]: I1209 10:02:14.694977 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:14 crc kubenswrapper[5002]: I1209 10:02:14.694997 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:14 crc kubenswrapper[5002]: I1209 10:02:14.695014 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:14Z","lastTransitionTime":"2025-12-09T10:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:14 crc kubenswrapper[5002]: I1209 10:02:14.796944 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:14 crc kubenswrapper[5002]: I1209 10:02:14.796992 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:14 crc kubenswrapper[5002]: I1209 10:02:14.797004 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:14 crc kubenswrapper[5002]: I1209 10:02:14.797023 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:14 crc kubenswrapper[5002]: I1209 10:02:14.797035 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:14Z","lastTransitionTime":"2025-12-09T10:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:14 crc kubenswrapper[5002]: I1209 10:02:14.899168 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:14 crc kubenswrapper[5002]: I1209 10:02:14.899203 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:14 crc kubenswrapper[5002]: I1209 10:02:14.899221 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:14 crc kubenswrapper[5002]: I1209 10:02:14.899239 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:14 crc kubenswrapper[5002]: I1209 10:02:14.899249 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:14Z","lastTransitionTime":"2025-12-09T10:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.001297 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.001334 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.001346 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.001362 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.001372 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:15Z","lastTransitionTime":"2025-12-09T10:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.059775 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.059775 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:02:15 crc kubenswrapper[5002]: E1209 10:02:15.059949 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:02:15 crc kubenswrapper[5002]: E1209 10:02:15.060063 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.103841 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.103881 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.103893 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.103908 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.103916 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:15Z","lastTransitionTime":"2025-12-09T10:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.206097 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.206139 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.206151 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.206166 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.206175 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:15Z","lastTransitionTime":"2025-12-09T10:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.308738 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.308773 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.308781 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.308794 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.308824 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:15Z","lastTransitionTime":"2025-12-09T10:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.411120 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.411155 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.411166 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.411181 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.411232 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:15Z","lastTransitionTime":"2025-12-09T10:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.513914 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.513980 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.513995 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.514020 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.514034 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:15Z","lastTransitionTime":"2025-12-09T10:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.618968 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.619017 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.619028 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.619045 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.619058 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:15Z","lastTransitionTime":"2025-12-09T10:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.721908 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.721959 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.721971 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.721988 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.722001 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:15Z","lastTransitionTime":"2025-12-09T10:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.824626 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.824682 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.824711 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.824728 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.824739 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:15Z","lastTransitionTime":"2025-12-09T10:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.927204 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.927243 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.927254 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.927268 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:15 crc kubenswrapper[5002]: I1209 10:02:15.927277 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:15Z","lastTransitionTime":"2025-12-09T10:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.029203 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.029247 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.029260 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.029277 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.029288 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:16Z","lastTransitionTime":"2025-12-09T10:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.059956 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.060029 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:02:16 crc kubenswrapper[5002]: E1209 10:02:16.060575 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:02:16 crc kubenswrapper[5002]: E1209 10:02:16.060713 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862" Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.131895 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.131974 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.131997 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.132027 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.132052 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:16Z","lastTransitionTime":"2025-12-09T10:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.234760 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.235078 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.235186 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.235290 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.235390 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:16Z","lastTransitionTime":"2025-12-09T10:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.337574 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.337609 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.337619 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.337656 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.337670 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:16Z","lastTransitionTime":"2025-12-09T10:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.440357 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.440434 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.440468 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.440486 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.440497 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:16Z","lastTransitionTime":"2025-12-09T10:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.543091 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.543128 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.543138 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.543155 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.543168 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:16Z","lastTransitionTime":"2025-12-09T10:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.645908 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.645949 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.645961 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.645978 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.645990 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:16Z","lastTransitionTime":"2025-12-09T10:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.748105 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.748152 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.748166 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.748181 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.748194 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:16Z","lastTransitionTime":"2025-12-09T10:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.851194 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.851243 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.851258 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.851275 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.851286 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:16Z","lastTransitionTime":"2025-12-09T10:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.953690 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.953736 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.953747 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.953763 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:16 crc kubenswrapper[5002]: I1209 10:02:16.953777 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:16Z","lastTransitionTime":"2025-12-09T10:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.056491 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.056536 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.056550 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.056568 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.056581 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:17Z","lastTransitionTime":"2025-12-09T10:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.059926 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.059986 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:02:17 crc kubenswrapper[5002]: E1209 10:02:17.060062 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:02:17 crc kubenswrapper[5002]: E1209 10:02:17.060190 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.159702 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.159782 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.159806 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.159879 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.159904 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:17Z","lastTransitionTime":"2025-12-09T10:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.262385 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.262436 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.262445 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.262461 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.262470 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:17Z","lastTransitionTime":"2025-12-09T10:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.365230 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.365316 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.365327 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.365348 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.365360 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:17Z","lastTransitionTime":"2025-12-09T10:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.467422 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.467461 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.467468 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.467482 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.467492 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:17Z","lastTransitionTime":"2025-12-09T10:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.570646 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.570717 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.570733 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.570753 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.570767 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:17Z","lastTransitionTime":"2025-12-09T10:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.672992 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.673037 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.673048 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.673063 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.673073 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:17Z","lastTransitionTime":"2025-12-09T10:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.775568 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.775603 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.775611 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.775626 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.775638 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:17Z","lastTransitionTime":"2025-12-09T10:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.819062 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ffb94c3-624e-48aa-aaa9-450ace4e1862-metrics-certs\") pod \"network-metrics-daemon-98z2f\" (UID: \"7ffb94c3-624e-48aa-aaa9-450ace4e1862\") " pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:02:17 crc kubenswrapper[5002]: E1209 10:02:17.819191 5002 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 10:02:17 crc kubenswrapper[5002]: E1209 10:02:17.819242 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ffb94c3-624e-48aa-aaa9-450ace4e1862-metrics-certs podName:7ffb94c3-624e-48aa-aaa9-450ace4e1862 nodeName:}" failed. No retries permitted until 2025-12-09 10:02:49.819229097 +0000 UTC m=+102.211280178 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7ffb94c3-624e-48aa-aaa9-450ace4e1862-metrics-certs") pod "network-metrics-daemon-98z2f" (UID: "7ffb94c3-624e-48aa-aaa9-450ace4e1862") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.879110 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.879155 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.879164 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.879177 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.879187 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:17Z","lastTransitionTime":"2025-12-09T10:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.981238 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.981277 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.981287 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.981300 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:17 crc kubenswrapper[5002]: I1209 10:02:17.981311 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:17Z","lastTransitionTime":"2025-12-09T10:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.059535 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.059628 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:02:18 crc kubenswrapper[5002]: E1209 10:02:18.059719 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:02:18 crc kubenswrapper[5002]: E1209 10:02:18.059842 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.082287 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48170a45-0766-4c86-af19-b829960de244\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3df655bf1ce56b5aef6728759b8b3262260171afd0e4924212afc506fa313e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e29bcc86bc8035d73f3f12857d0024d93752d70d6a77b51d98278981669ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c750385ffbe9096e2cef43dffe002fd49599e44ed69806710b49274963
7dbe93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73ac21971d31e17f4f76bcbb1e02201b53b12189815ced304d7913f6aa76f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd16d2319de83381932ca61f8b08dd31ecfde33e270c3ea2ea3edb3c2fa174b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"image\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:18Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.084083 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.084123 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.084136 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.084153 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.084164 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:18Z","lastTransitionTime":"2025-12-09T10:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.096785 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b484d6dfa4fffc554c21f9d9bc8e77a62e96ebb9426f2bb34dea4b2c9244d680\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://301eb2b68a37707ea483536d639e5f476d3cb41ad8c60fbaea2e019fb51bad48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49d7b26bbe255f1217808981337d8190bdeac4f5008ee17df5242867e3103e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1209 10:01:25.592088 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 10:01:25.592207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 10:01:25.593930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3155801387/tls.crt::/tmp/serving-cert-3155801387/tls.key\\\\\\\"\\\\nI1209 10:01:25.900616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 10:01:25.902593 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 10:01:25.902611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 10:01:25.902628 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 10:01:25.902633 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 10:01:25.906542 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 10:01:25.906611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906618 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 10:01:25.906630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 10:01:25.906634 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 10:01:25.906639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 10:01:25.906563 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 10:01:25.907863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bd67b1b7ad3e21872e9aa91e61c978e0999546882ee2e21347d4f60e435e0b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:18Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.108500 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab29aa6d19002efb0309c548e059e004c5002ccde634df95e3c2661a3e81207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcda2271939034d8c6c54fa3d648500ebda150fafcce9216338fad68552a65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:18Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.120327 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c149ed39076fc7ee5538e60dbc0a8fc303a21578e5cc3ac89a3aeaad2c21c6a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:18Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.131743 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d199b778c03f10fc3b1a2623600057801f54ee3240768ede1a79213b678fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:18Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.142974 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:18Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.156398 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rgf44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ed6e93-eda5-4648-b185-25d2960ce0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541795eab5847f1d993b8f9f324b454b951ed3930155e455f013e8da805c019b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh57p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rgf44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:18Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.169407 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bc64859-6675-4dc6-b0a1-579abb87580e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0db847425b24ea6804034220f2050b153b78d21bc1cc934dad6784c11c68dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a41e95e38748ef89ff0bc6429eb223b7821756bf1e0c84a3af512f4f0166a98\\\",\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea3ea4cb1e3f00acc4ef769928988a0a2c2ee54afa0ab5f040ef50f465a9d6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1c1315eade2f326ac5feefc45cbcec29c7ee59fb40494f5153b7f8dbdfc404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:18Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.186671 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.186900 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.186926 5002 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.186940 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.186950 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:18Z","lastTransitionTime":"2025-12-09T10:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.187117 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7013527e-73de-4427-af9c-e33663b1c222\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed556fa1bcef63fb127866b4317c98104f846e12
a67a6865a362fe6777a3b40c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed556fa1bcef63fb127866b4317c98104f846e12a67a6865a362fe6777a3b40c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T10:01:58Z\\\",\\\"message\\\":\\\"ailed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:58Z is after 2025-08-24T17:21:41Z]\\\\nI1209 10:01:58.693873 6637 services_controller.go:445] Built service openshift-dns/dns-default LB template configs for network=default: []services.lbConfig(nil)\\\\nI1209 10:01:58.693874 6637 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI1209 10:01:58.693876 6637 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-kxpn6\\\\nI1209 10:01:58.693850 6637 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m689k in node crc\\\\nI1209 10:01:58.693892 6637 obj_retry.go\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2mnnl_openshift-ovn-kubernetes(7013527e-73de-4427-af9c-e33663b1c222)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2mnnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:18Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.198872 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m689k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02305623-2d65-47e3-ac63-5182bf50d141\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93e73de78b5120b5d2bf38748e84dad9dd5353e18130635243988d7b131ace3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg8sd
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c8fa17f92ca9a774f62e20c5eeec59041bee23e426c36b2949d4f82c7e45f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg8sd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m689k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:18Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.211253 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49c6392-68b2-4847-9291-a0b4d9c1cbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a019843745b7d5198565771c39f7949cf45738e236a2283588db2e57d07f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44da8a223abf131b459b827b0e8de65b415150f406fe22f2efb7e160cba4166c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kxpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:18Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.221600 5002 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-p7t88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edef2440-1e4a-4676-9517-08b21b3b66ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc19a171af6f6875b4d953edd75048a1249b44348ba03757126ebe943c118be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzs5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7t88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:18Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.234285 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:18Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.246628 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:18Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.256253 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t5hm5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de8c639-f176-405c-ae34-6717f9f9458c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097fa8d84999c942645591541e1377c08cdeec593f64525db5a5aa8f7ec8cbae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9wwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t5hm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-09T10:02:18Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.270693 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zjdhx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60fd7aba09010bae4f15fe793e0084c71d381f63bc4c1549f2ccbe57cdb90ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e
02577330049643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-
copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zjdhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:18Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.280052 5002 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-multus/network-metrics-daemon-98z2f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ffb94c3-624e-48aa-aaa9-450ace4e1862\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mn62f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mn62f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-98z2f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:18Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.288828 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.288869 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.288880 
5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.288897 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.288907 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:18Z","lastTransitionTime":"2025-12-09T10:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.290644 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea4578dd-5c3e-4509-b456-a42a642871d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b30d7f55d95c36f85285f235dcf2de31c04cc358d5cce4d49f9ea43945fd3f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de5d36b22bcd84f50dfb6ae8858f98665f3ae3981d5dc2233fa9e3b92db56b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa189df1ab85704e8528d42da3e500dba3
54bb99dace868af40a49fec5b19fa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaffe6e7f597ce14c1bc564a62ec71519af84fa5d220d14a5e8000653d396c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aaffe6e7f597ce14c1bc564a62ec71519af84fa5d220d14a5e8000653d396c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:18Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.391246 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.391289 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.391304 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.391326 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.391341 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:18Z","lastTransitionTime":"2025-12-09T10:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.495532 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.497870 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.497887 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.497926 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.497942 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:18Z","lastTransitionTime":"2025-12-09T10:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.601514 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.601559 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.601569 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.601586 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.601602 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:18Z","lastTransitionTime":"2025-12-09T10:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.704298 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.704337 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.704350 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.704373 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.704384 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:18Z","lastTransitionTime":"2025-12-09T10:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.806682 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.806742 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.806751 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.806765 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.806775 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:18Z","lastTransitionTime":"2025-12-09T10:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.905445 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rgf44_28ed6e93-eda5-4648-b185-25d2960ce0f0/kube-multus/0.log" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.905488 5002 generic.go:334] "Generic (PLEG): container finished" podID="28ed6e93-eda5-4648-b185-25d2960ce0f0" containerID="541795eab5847f1d993b8f9f324b454b951ed3930155e455f013e8da805c019b" exitCode=1 Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.905514 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rgf44" event={"ID":"28ed6e93-eda5-4648-b185-25d2960ce0f0","Type":"ContainerDied","Data":"541795eab5847f1d993b8f9f324b454b951ed3930155e455f013e8da805c019b"} Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.905880 5002 scope.go:117] "RemoveContainer" containerID="541795eab5847f1d993b8f9f324b454b951ed3930155e455f013e8da805c019b" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.909175 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.909209 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.909220 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.909276 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.909290 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:18Z","lastTransitionTime":"2025-12-09T10:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.920628 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea4578dd-5c3e-4509-b456-a42a642871d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b30d7f55d95c36f85285f235dcf2de31c04cc358d5cce4d49f9ea43945fd3f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de5d36b22bcd84f50dfb6ae8858f98665f3ae3981d5dc2233fa9e3b92db56b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa189df1ab85704e8528d42da3e500dba354bb99dace868af40a49fec5b19fa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaffe6e7f597ce14c1bc564a62ec71519af84fa5d220d14a5e8000653d396c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aaffe6e7f597ce14c1bc564a62ec71519af84fa5d220d14a5e8000653d396c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:18Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.941767 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:18Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.951187 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t5hm5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de8c639-f176-405c-ae34-6717f9f9458c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097fa8d84999c942645591541e1377c08cdeec593f64525db5a5aa8f7ec8cbae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9wwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t5hm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-09T10:02:18Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.962861 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zjdhx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60fd7aba09010bae4f15fe793e0084c71d381f63bc4c1549f2ccbe57cdb90ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e
02577330049643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-
copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zjdhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:18Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.972806 5002 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-multus/network-metrics-daemon-98z2f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ffb94c3-624e-48aa-aaa9-450ace4e1862\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mn62f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mn62f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-98z2f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:18Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.984078 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d199b778c03f10fc3b1a2623600057801f54ee3240768ede1a79213b678fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:18Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:18 crc kubenswrapper[5002]: I1209 10:02:18.996092 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:18Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.007479 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rgf44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ed6e93-eda5-4648-b185-25d2960ce0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541795eab5847f1d993b8f9f324b454b951ed3930155e455f013e8da805c019b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541795eab5847f1d993b8f9f324b454b951ed3930155e455f013e8da805c019b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T10:02:18Z\\\",\\\"message\\\":\\\"2025-12-09T10:01:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e520db89-1ef9-44e8-b02a-6287f2e9d891\\\\n2025-12-09T10:01:32+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e520db89-1ef9-44e8-b02a-6287f2e9d891 to /host/opt/cni/bin/\\\\n2025-12-09T10:01:33Z [verbose] multus-daemon 
started\\\\n2025-12-09T10:01:33Z [verbose] Readiness Indicator file check\\\\n2025-12-09T10:02:18Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh57p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rgf44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:19Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.010698 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.010741 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.010755 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.010772 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.010784 5002 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:19Z","lastTransitionTime":"2025-12-09T10:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.018160 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bc64859-6675-4dc6-b0a1-579abb87580e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0db847425b24ea6804034220f2050b153b78d21bc1cc934dad6784c11c68dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a41e95e38748ef89ff0bc6429eb223b7821756bf1e0c84a3af512f4f0166a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea3ea4cb1e3f00acc4ef769928988a0a2c2ee54afa0ab5f040ef50f465a9d6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1c1315eade2f326ac5feefc45cbcec29c7ee59fb40494f5153b7f8dbdfc404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:19Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.045805 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48170a45-0766-4c86-af19-b829960de244\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3df655bf1ce56b5aef6728759b8b3262260171afd0e4924212afc506fa313e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e29bcc86bc8035d73f3f12857d0024d93752d70d6a77b51d98278981669ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c750385ffbe9096e2cef43dffe002fd49599e44ed69806710b492749637dbe93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73ac21971d31e17f4f76bcbb1e02201b53b12
189815ced304d7913f6aa76f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd16d2319de83381932ca61f8b08dd31ecfde33e270c3ea2ea3edb3c2fa174b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:19Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.058803 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b484d6dfa4fffc554c21f9d9bc8e77a6
2e96ebb9426f2bb34dea4b2c9244d680\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://301eb2b68a37707ea483536d639e5f476d3cb41ad8c60fbaea2e019fb51bad48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49d7b26bbe255f1217808981337d8190bdeac4f5008ee17df5242867e3103e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1209 10:01:25.592088 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 10:01:25.592207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 10:01:25.593930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3155801387/tls.crt::/tmp/serving-cert-3155801387/tls.key\\\\\\\"\\\\nI1209 10:01:25.900616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 10:01:25.902593 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 10:01:25.902611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 10:01:25.902628 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 10:01:25.902633 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 10:01:25.906542 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 10:01:25.906611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906618 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 10:01:25.906630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 10:01:25.906634 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 10:01:25.906639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 10:01:25.906563 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 10:01:25.907863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bd67b1b7ad3e21872e9aa91e61c978e0999546882ee2e21347d4f60e435e0b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:19Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.059118 5002 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.059132 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:02:19 crc kubenswrapper[5002]: E1209 10:02:19.059239 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:02:19 crc kubenswrapper[5002]: E1209 10:02:19.059379 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.070428 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab29aa6d19002efb0309c548e059e004c5002ccde634df95e3c2661a3e81207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcda2271939034d8c6c54fa3d648500ebda150fafcce9216338fad68552a65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:19Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.082526 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c149ed39076fc7ee5538e60dbc0a8fc303a21578e5cc3ac89a3aeaad2c21c6a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:19Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.098757 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7013527e-73de-4427-af9c-e33663b1c222\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed556fa1bcef63fb127866b4317c98104f846e12a67a6865a362fe6777a3b40c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed556fa1bcef63fb127866b4317c98104f846e12a67a6865a362fe6777a3b40c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T10:01:58Z\\\",\\\"message\\\":\\\"ailed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:58Z is after 2025-08-24T17:21:41Z]\\\\nI1209 10:01:58.693873 6637 services_controller.go:445] Built service openshift-dns/dns-default LB template configs for network=default: []services.lbConfig(nil)\\\\nI1209 10:01:58.693874 6637 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI1209 10:01:58.693876 6637 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-kxpn6\\\\nI1209 10:01:58.693850 6637 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m689k in node crc\\\\nI1209 10:01:58.693892 6637 obj_retry.go\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2mnnl_openshift-ovn-kubernetes(7013527e-73de-4427-af9c-e33663b1c222)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2mnnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:19Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.111869 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m689k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02305623-2d65-47e3-ac63-5182bf50d141\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93e73de78b5120b5d2bf38748e84dad9dd5353e18130635243988d7b131ace3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg8sd
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c8fa17f92ca9a774f62e20c5eeec59041bee23e426c36b2949d4f82c7e45f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg8sd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m689k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:19Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.113555 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.113596 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.113608 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.113627 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.113641 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:19Z","lastTransitionTime":"2025-12-09T10:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.125568 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:19Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.136886 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49c6392-68b2-4847-9291-a0b4d9c1cbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a019843745b7d5198565771c39f7949cf45738e236a2283588db2e57d07f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44da8a223abf131b459b827b0e8de65b415150f406fe22f2efb7e160cba4166c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kxpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:19Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.146548 5002 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-p7t88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edef2440-1e4a-4676-9517-08b21b3b66ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc19a171af6f6875b4d953edd75048a1249b44348ba03757126ebe943c118be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzs5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7t88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:19Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.216348 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.216391 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.216408 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.216423 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.216434 5002 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:19Z","lastTransitionTime":"2025-12-09T10:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.318997 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.319065 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.319079 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.319095 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.319106 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:19Z","lastTransitionTime":"2025-12-09T10:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.421652 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.421730 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.421740 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.421755 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.421763 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:19Z","lastTransitionTime":"2025-12-09T10:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.524754 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.524797 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.524808 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.524846 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.524857 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:19Z","lastTransitionTime":"2025-12-09T10:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.628311 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.628364 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.628378 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.628396 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.628409 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:19Z","lastTransitionTime":"2025-12-09T10:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.731856 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.731896 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.731905 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.731919 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.731928 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:19Z","lastTransitionTime":"2025-12-09T10:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.834540 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.834587 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.834595 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.834609 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.834617 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:19Z","lastTransitionTime":"2025-12-09T10:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.910762 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rgf44_28ed6e93-eda5-4648-b185-25d2960ce0f0/kube-multus/0.log" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.910842 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rgf44" event={"ID":"28ed6e93-eda5-4648-b185-25d2960ce0f0","Type":"ContainerStarted","Data":"fdfff3a756cc7a96ff82f208e082d8c283f7d558a14540a6953bcdc664b63f23"} Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.924394 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:19Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.935229 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49c6392-68b2-4847-9291-a0b4d9c1cbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a019843745b7d5198565771c39f7949cf45738e236a2283588db2e57d07f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44da8a223abf131b459b827b0e8de65b415150f406fe22f2efb7e160cba4166c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kxpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:19Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.937693 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.937744 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.937761 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.937783 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.937800 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:19Z","lastTransitionTime":"2025-12-09T10:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.946854 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7t88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edef2440-1e4a-4676-9517-08b21b3b66ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc19a171af6f6875b4d953edd75048a1249b44348ba03757126ebe943c118be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzs5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7t88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:19Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.964345 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zjdhx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60fd7aba09010bae4f15fe793e0084c71d381f63bc4c1549f2ccbe57cdb90ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zjdhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:19Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.980057 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-98z2f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ffb94c3-624e-48aa-aaa9-450ace4e1862\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mn62f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mn62f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-98z2f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:19Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:19 crc kubenswrapper[5002]: I1209 10:02:19.993676 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea4578dd-5c3e-4509-b456-a42a642871d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b30d7f55d95c36f85285f235dcf2de31c04cc358d5cce4d49f9ea43945fd3f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de5d36b22bcd84f50dfb6ae8858f98665f3ae3981d5dc2233fa9e3b92db56b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa189df1ab85704e8528d42da3e500dba354bb99dace868af40a49fec5b19fa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaffe6e7f597ce14c1bc564a62ec71519af84fa5d220d14a5e8000653d396c6d\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aaffe6e7f597ce14c1bc564a62ec71519af84fa5d220d14a5e8000653d396c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:19Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.005433 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:20Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.015237 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t5hm5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de8c639-f176-405c-ae34-6717f9f9458c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097fa8d84999c942645591541e1377c08cdeec593f64525db5a5aa8f7ec8cbae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9wwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t5hm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-09T10:02:20Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.028656 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab29aa6d19002efb0309c548e059e004c5002ccde634df95e3c2661a3e81207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcda2271939034d8c6c54fa3d648500ebda150fafcce9216338fad68552a65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:20Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.040139 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:20 crc 
kubenswrapper[5002]: I1209 10:02:20.040189 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.040205 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.040226 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.040244 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:20Z","lastTransitionTime":"2025-12-09T10:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.041908 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c149ed39076fc7ee5538e60dbc0a8fc303a21578e5cc3ac89a3aeaad2c21c6a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:20Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.054118 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d199b778c03f10fc3b1a2623600057801f54ee3240768ede1a79213b678fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:20Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.059751 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.059802 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:02:20 crc kubenswrapper[5002]: E1209 10:02:20.059957 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862" Dec 09 10:02:20 crc kubenswrapper[5002]: E1209 10:02:20.060101 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.067333 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:20Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.081258 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rgf44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ed6e93-eda5-4648-b185-25d2960ce0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdfff3a756cc7a96ff82f208e082d8c283f7d558a14540a6953bcdc664b63f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541795eab5847f1d993b8f9f324b454b951ed3930155e455f013e8da805c019b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T10:02:18Z\\\",\\\"message\\\":\\\"2025-12-09T10:01:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e520db89-1ef9-44e8-b02a-6287f2e9d891\\\\n2025-12-09T10:01:32+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e520db89-1ef9-44e8-b02a-6287f2e9d891 to /host/opt/cni/bin/\\\\n2025-12-09T10:01:33Z [verbose] multus-daemon started\\\\n2025-12-09T10:01:33Z [verbose] Readiness Indicator file check\\\\n2025-12-09T10:02:18Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh57p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rgf44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:20Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.094007 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bc64859-6675-4dc6-b0a1-579abb87580e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0db847425b24ea6804034220f2050b153b78d21bc1cc934dad6784c11c68dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a41e95e38748ef89ff0bc6429eb223b7821756bf1e0c84a3af512f4f0166a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea3ea4cb1e3f00acc4ef769928988a0a2c2ee54afa0ab5f040ef50f465a9d6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1c1315eade2f326ac5feefc45cbcec29c7ee59fb40494f5153b7f8dbdfc404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:20Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.110218 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48170a45-0766-4c86-af19-b829960de244\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3df655bf1ce56b5aef6728759b8b3262260171afd0e4924212afc506fa313e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e29bcc86bc8035d73f3f12857d0024d93752d70d6a77b51d98278981669ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07
b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c750385ffbe9096e2cef43dffe002fd49599e44ed69806710b492749637dbe93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73ac21971d31e17f4f76bcbb1e02201b53b12189815ced304d7913f6aa76f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd16d2319de83381932ca61f8b08dd31ecfde33e270c3ea2ea3edb3c2fa174b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:20Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.122151 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b484d6dfa4fffc554c21f9d9bc8e77a62e96ebb9426f2bb34dea4b2c9244d680\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://301eb2b68a37707ea483536d639e5f476d3cb41ad8c60fbaea2e019fb51bad48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49d7b26bbe255f1217808981337d8190bdeac4f5008ee17df5242867e3103e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1209 10:01:25.592088 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 10:01:25.592207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 10:01:25.593930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3155801387/tls.crt::/tmp/serving-cert-3155801387/tls.key\\\\\\\"\\\\nI1209 10:01:25.900616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 10:01:25.902593 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 10:01:25.902611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 10:01:25.902628 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 10:01:25.902633 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 10:01:25.906542 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 10:01:25.906611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906618 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 10:01:25.906630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 10:01:25.906634 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 10:01:25.906639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 10:01:25.906563 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 10:01:25.907863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bd67b1b7ad3e21872e9aa91e61c978e0999546882ee2e21347d4f60e435e0b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:20Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.138021 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7013527e-73de-4427-af9c-e33663b1c222\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed556fa1bcef63fb127866b4317c98104f846e12a67a6865a362fe6777a3b40c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed556fa1bcef63fb127866b4317c98104f846e12a67a6865a362fe6777a3b40c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T10:01:58Z\\\",\\\"message\\\":\\\"ailed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:58Z is after 2025-08-24T17:21:41Z]\\\\nI1209 10:01:58.693873 6637 services_controller.go:445] Built service openshift-dns/dns-default LB template configs for network=default: []services.lbConfig(nil)\\\\nI1209 10:01:58.693874 6637 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI1209 10:01:58.693876 6637 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-kxpn6\\\\nI1209 10:01:58.693850 6637 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m689k in node crc\\\\nI1209 10:01:58.693892 6637 obj_retry.go\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2mnnl_openshift-ovn-kubernetes(7013527e-73de-4427-af9c-e33663b1c222)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2mnnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:20Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.142144 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.142182 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.142193 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.142208 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.142217 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:20Z","lastTransitionTime":"2025-12-09T10:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.150312 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m689k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02305623-2d65-47e3-ac63-5182bf50d141\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93e73de78b5120b5d2bf38748e84dad9dd5353e18130635243988d7b131ace3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg8sd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c8fa17f92ca9a774f62e20c5eeec59041bee23e426c36b2949d4f82c7e45f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg8sd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m689k\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:20Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.244713 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.244760 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.244778 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.244845 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.244864 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:20Z","lastTransitionTime":"2025-12-09T10:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.346714 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.346754 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.346763 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.346777 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.346787 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:20Z","lastTransitionTime":"2025-12-09T10:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.448417 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.448463 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.448475 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.448494 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.448506 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:20Z","lastTransitionTime":"2025-12-09T10:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.551107 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.551152 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.551165 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.551181 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.551192 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:20Z","lastTransitionTime":"2025-12-09T10:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.554199 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.554247 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.554259 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.554275 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.554287 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:20Z","lastTransitionTime":"2025-12-09T10:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:20 crc kubenswrapper[5002]: E1209 10:02:20.570639 5002 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:02:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:02:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:02:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:02:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb4d43e7-bcbf-4472-90e9-44716d72c15e\\\",\\\"systemUUID\\\":\\\"8af61218-105c-4188-8c40-2d81c3899a86\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:20Z is after 
2025-08-24T17:21:41Z" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.574757 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.574833 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.574847 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.574866 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.574881 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:20Z","lastTransitionTime":"2025-12-09T10:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:20 crc kubenswrapper[5002]: E1209 10:02:20.588375 5002 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:02:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:02:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:02:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:02:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb4d43e7-bcbf-4472-90e9-44716d72c15e\\\",\\\"systemUUID\\\":\\\"8af61218-105c-4188-8c40-2d81c3899a86\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:20Z is after 
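Every NodeNotReady heartbeat above carries the same root message: the kubelet sees no CNI configuration file in /etc/kubernetes/cni/net.d/, so the container runtime network stays NetworkReady=false. Below is a minimal diagnostic sketch of the check that message implies (Python, stdlib only; the .conf/.conflist/.json extension filter follows common CNI convention and is an assumption, not something stated in the log):

    #!/usr/bin/env python3
    # Minimal sketch: list CNI config files in the directory the kubelet
    # complains about. The path is quoted verbatim in the log entries above;
    # the extension filter is an assumption based on common CNI convention.
    import pathlib

    CNI_DIR = pathlib.Path("/etc/kubernetes/cni/net.d")

    def cni_configs(d: pathlib.Path) -> list[pathlib.Path]:
        if not d.is_dir():
            return []
        return sorted(p for p in d.iterdir()
                      if p.suffix in {".conf", ".conflist", ".json"})

    if __name__ == "__main__":
        found = cni_configs(CNI_DIR)
        if not found:
            print(f"no CNI config in {CNI_DIR} -- consistent with NetworkPluginNotReady")
        for p in found:
            print(p)

An empty result matches the "Has your network provider started?" hint: the network plugin has not written its configuration yet, and the webhook failures in the surrounding entries suggest why it cannot make progress.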
2025-08-24T17:21:41Z" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.592182 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.592230 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.592327 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.592371 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.592382 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:20Z","lastTransitionTime":"2025-12-09T10:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:20 crc kubenswrapper[5002]: E1209 10:02:20.606718 5002 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:02:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:02:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:02:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:02:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb4d43e7-bcbf-4472-90e9-44716d72c15e\\\",\\\"systemUUID\\\":\\\"8af61218-105c-4188-8c40-2d81c3899a86\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:20Z is after 
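All of these retries fail identically: the network-node-identity webhook serving 127.0.0.1:9743 presents a certificate that expired on 2025-08-24T17:21:41Z, while the node clock reads 2025-12-09T10:02:20Z. Below is a small sketch to confirm the certificate's validity window from the node itself (it assumes the third-party cryptography package; verification is deliberately disabled, since an expired certificate cannot complete a verified handshake):

    #!/usr/bin/env python3
    # Minimal sketch: fetch the webhook's serving certificate and print its
    # validity window. Host and port come from the Post URL quoted in the
    # errors above; the `cryptography` dependency is an assumption.
    import datetime
    import socket
    import ssl

    from cryptography import x509

    HOST, PORT = "127.0.0.1", 9743

    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.check_hostname = False       # retrieval only: the cert is expired,
    ctx.verify_mode = ssl.CERT_NONE  # so a verified handshake would fail

    with socket.create_connection((HOST, PORT), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
            der = tls.getpeercert(binary_form=True)

    cert = x509.load_der_x509_certificate(der)
    not_after = cert.not_valid_after.replace(tzinfo=datetime.timezone.utc)
    print("notAfter:", not_after.isoformat())
    print("expired: ", not_after < datetime.datetime.now(datetime.timezone.utc))

If notAfter matches the 2025-08-24 date in the log, the problem lies in rotating the cluster's internal certificates rather than in the kubelet itself.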
2025-08-24T17:21:41Z" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.610416 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.610456 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.610465 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.610481 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.610493 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:20Z","lastTransitionTime":"2025-12-09T10:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:20 crc kubenswrapper[5002]: E1209 10:02:20.622771 5002 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:02:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:02:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:02:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:02:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb4d43e7-bcbf-4472-90e9-44716d72c15e\\\",\\\"systemUUID\\\":\\\"8af61218-105c-4188-8c40-2d81c3899a86\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:20Z is after 
2025-08-24T17:21:41Z" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.625746 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.625789 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.625806 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.625839 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.625850 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:20Z","lastTransitionTime":"2025-12-09T10:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:20 crc kubenswrapper[5002]: E1209 10:02:20.638734 5002 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:02:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:02:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:02:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:02:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb4d43e7-bcbf-4472-90e9-44716d72c15e\\\",\\\"systemUUID\\\":\\\"8af61218-105c-4188-8c40-2d81c3899a86\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:20Z is after 
2025-08-24T17:21:41Z" Dec 09 10:02:20 crc kubenswrapper[5002]: E1209 10:02:20.638931 5002 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.653602 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.653643 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.653654 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.653668 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.653678 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:20Z","lastTransitionTime":"2025-12-09T10:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.755681 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.755724 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.755734 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.755749 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.755759 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:20Z","lastTransitionTime":"2025-12-09T10:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.858045 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.858220 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.858237 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.858253 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.858263 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:20Z","lastTransitionTime":"2025-12-09T10:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.960731 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.960773 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.960782 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.960794 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:20 crc kubenswrapper[5002]: I1209 10:02:20.960803 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:20Z","lastTransitionTime":"2025-12-09T10:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.059577 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.059576 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:02:21 crc kubenswrapper[5002]: E1209 10:02:21.059720 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:02:21 crc kubenswrapper[5002]: E1209 10:02:21.059792 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.063831 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.063882 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.063893 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.063914 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.063926 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:21Z","lastTransitionTime":"2025-12-09T10:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.166411 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.166492 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.166503 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.166517 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.166526 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:21Z","lastTransitionTime":"2025-12-09T10:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.269113 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.269167 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.269182 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.269204 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.269298 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:21Z","lastTransitionTime":"2025-12-09T10:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.371632 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.371672 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.371686 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.371705 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.371717 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:21Z","lastTransitionTime":"2025-12-09T10:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.473866 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.473917 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.473927 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.473941 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.473952 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:21Z","lastTransitionTime":"2025-12-09T10:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.576005 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.576052 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.576061 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.576080 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.576101 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:21Z","lastTransitionTime":"2025-12-09T10:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.679343 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.679387 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.679399 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.679416 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.679434 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:21Z","lastTransitionTime":"2025-12-09T10:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.781800 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.781879 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.781895 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.781917 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.781930 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:21Z","lastTransitionTime":"2025-12-09T10:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.884992 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.885025 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.885034 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.885047 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.885055 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:21Z","lastTransitionTime":"2025-12-09T10:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.987737 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.987774 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.987782 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.987797 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:21 crc kubenswrapper[5002]: I1209 10:02:21.987823 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:21Z","lastTransitionTime":"2025-12-09T10:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:22 crc kubenswrapper[5002]: I1209 10:02:22.063077 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:02:22 crc kubenswrapper[5002]: E1209 10:02:22.063175 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:02:22 crc kubenswrapper[5002]: I1209 10:02:22.063384 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:02:22 crc kubenswrapper[5002]: E1209 10:02:22.063455 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862" Dec 09 10:02:22 crc kubenswrapper[5002]: I1209 10:02:22.077792 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 09 10:02:22 crc kubenswrapper[5002]: I1209 10:02:22.090362 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:22 crc kubenswrapper[5002]: I1209 10:02:22.090398 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:22 crc kubenswrapper[5002]: I1209 10:02:22.090408 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:22 crc kubenswrapper[5002]: I1209 10:02:22.090423 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:22 crc kubenswrapper[5002]: I1209 10:02:22.090435 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:22Z","lastTransitionTime":"2025-12-09T10:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:22 crc kubenswrapper[5002]: I1209 10:02:22.192989 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:22 crc kubenswrapper[5002]: I1209 10:02:22.193038 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:22 crc kubenswrapper[5002]: I1209 10:02:22.193048 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:22 crc kubenswrapper[5002]: I1209 10:02:22.193064 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:22 crc kubenswrapper[5002]: I1209 10:02:22.193074 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:22Z","lastTransitionTime":"2025-12-09T10:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:22 crc kubenswrapper[5002]: I1209 10:02:22.296706 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:22 crc kubenswrapper[5002]: I1209 10:02:22.296764 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:22 crc kubenswrapper[5002]: I1209 10:02:22.296775 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:22 crc kubenswrapper[5002]: I1209 10:02:22.296794 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:22 crc kubenswrapper[5002]: I1209 10:02:22.296808 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:22Z","lastTransitionTime":"2025-12-09T10:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:22 crc kubenswrapper[5002]: I1209 10:02:22.399934 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:22 crc kubenswrapper[5002]: I1209 10:02:22.399969 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:22 crc kubenswrapper[5002]: I1209 10:02:22.399983 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:22 crc kubenswrapper[5002]: I1209 10:02:22.400000 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:22 crc kubenswrapper[5002]: I1209 10:02:22.400012 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:22Z","lastTransitionTime":"2025-12-09T10:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:22 crc kubenswrapper[5002]: I1209 10:02:22.512284 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:22 crc kubenswrapper[5002]: I1209 10:02:22.512361 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:22 crc kubenswrapper[5002]: I1209 10:02:22.512375 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:22 crc kubenswrapper[5002]: I1209 10:02:22.512418 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:22 crc kubenswrapper[5002]: I1209 10:02:22.512430 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:22Z","lastTransitionTime":"2025-12-09T10:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:22 crc kubenswrapper[5002]: I1209 10:02:22.614920 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:22 crc kubenswrapper[5002]: I1209 10:02:22.614959 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:22 crc kubenswrapper[5002]: I1209 10:02:22.614968 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:22 crc kubenswrapper[5002]: I1209 10:02:22.614982 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:22 crc kubenswrapper[5002]: I1209 10:02:22.614992 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:22Z","lastTransitionTime":"2025-12-09T10:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:22 crc kubenswrapper[5002]: I1209 10:02:22.718216 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:22 crc kubenswrapper[5002]: I1209 10:02:22.718278 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:22 crc kubenswrapper[5002]: I1209 10:02:22.718296 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:22 crc kubenswrapper[5002]: I1209 10:02:22.718322 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:22 crc kubenswrapper[5002]: I1209 10:02:22.718358 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:22Z","lastTransitionTime":"2025-12-09T10:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:22 crc kubenswrapper[5002]: I1209 10:02:22.822455 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:22 crc kubenswrapper[5002]: I1209 10:02:22.822770 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:22 crc kubenswrapper[5002]: I1209 10:02:22.822793 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:22 crc kubenswrapper[5002]: I1209 10:02:22.822850 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:22 crc kubenswrapper[5002]: I1209 10:02:22.822872 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:22Z","lastTransitionTime":"2025-12-09T10:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:22 crc kubenswrapper[5002]: I1209 10:02:22.926558 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:22 crc kubenswrapper[5002]: I1209 10:02:22.926611 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:22 crc kubenswrapper[5002]: I1209 10:02:22.926624 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:22 crc kubenswrapper[5002]: I1209 10:02:22.926641 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:22 crc kubenswrapper[5002]: I1209 10:02:22.926653 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:22Z","lastTransitionTime":"2025-12-09T10:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.029291 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.029366 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.029386 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.029413 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.029432 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:23Z","lastTransitionTime":"2025-12-09T10:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.059922 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.059950 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:02:23 crc kubenswrapper[5002]: E1209 10:02:23.060210 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:02:23 crc kubenswrapper[5002]: E1209 10:02:23.060091 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.131755 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.131803 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.131838 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.131860 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.131869 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:23Z","lastTransitionTime":"2025-12-09T10:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.234390 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.234441 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.234455 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.234472 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.234485 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:23Z","lastTransitionTime":"2025-12-09T10:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.336300 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.336331 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.336340 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.336352 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.336360 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:23Z","lastTransitionTime":"2025-12-09T10:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.439067 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.439112 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.439122 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.439137 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.439148 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:23Z","lastTransitionTime":"2025-12-09T10:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.543095 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.543143 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.543151 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.543168 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.543177 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:23Z","lastTransitionTime":"2025-12-09T10:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.645960 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.645988 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.645996 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.646008 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.646017 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:23Z","lastTransitionTime":"2025-12-09T10:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.748473 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.748509 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.748517 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.748536 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.748546 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:23Z","lastTransitionTime":"2025-12-09T10:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.851584 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.851631 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.851641 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.851660 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.851671 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:23Z","lastTransitionTime":"2025-12-09T10:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.954234 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.954275 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.954286 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.954302 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:23 crc kubenswrapper[5002]: I1209 10:02:23.954315 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:23Z","lastTransitionTime":"2025-12-09T10:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.056633 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.056711 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.056735 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.056766 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.056790 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:24Z","lastTransitionTime":"2025-12-09T10:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.059476 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.059516 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:02:24 crc kubenswrapper[5002]: E1209 10:02:24.059669 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:02:24 crc kubenswrapper[5002]: E1209 10:02:24.059921 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862" Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.158881 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.158916 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.158928 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.158944 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.158952 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:24Z","lastTransitionTime":"2025-12-09T10:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.261669 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.262054 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.262156 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.262278 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.262394 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:24Z","lastTransitionTime":"2025-12-09T10:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.365891 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.366228 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.366364 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.366561 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.366703 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:24Z","lastTransitionTime":"2025-12-09T10:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.469378 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.469428 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.469445 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.469472 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.469489 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:24Z","lastTransitionTime":"2025-12-09T10:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.572040 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.572094 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.572111 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.572133 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.572150 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:24Z","lastTransitionTime":"2025-12-09T10:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.675217 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.675261 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.675269 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.675284 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.675294 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:24Z","lastTransitionTime":"2025-12-09T10:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.778508 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.778555 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.778568 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.778587 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.778597 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:24Z","lastTransitionTime":"2025-12-09T10:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.884926 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.884984 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.885005 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.885027 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.885042 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:24Z","lastTransitionTime":"2025-12-09T10:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.987520 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.987567 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.987579 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.987594 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:24 crc kubenswrapper[5002]: I1209 10:02:24.987604 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:24Z","lastTransitionTime":"2025-12-09T10:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.059937 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.059961 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:02:25 crc kubenswrapper[5002]: E1209 10:02:25.060092 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:02:25 crc kubenswrapper[5002]: E1209 10:02:25.060149 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.060952 5002 scope.go:117] "RemoveContainer" containerID="ed556fa1bcef63fb127866b4317c98104f846e12a67a6865a362fe6777a3b40c" Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.090145 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.090194 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.090207 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.090225 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.090239 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:25Z","lastTransitionTime":"2025-12-09T10:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.192898 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.192937 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.192947 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.192961 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.192972 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:25Z","lastTransitionTime":"2025-12-09T10:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.295433 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.295712 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.295787 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.295900 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.296008 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:25Z","lastTransitionTime":"2025-12-09T10:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.398408 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.398452 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.398469 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.398491 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.398507 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:25Z","lastTransitionTime":"2025-12-09T10:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.501440 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.501496 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.501509 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.501532 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.501544 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:25Z","lastTransitionTime":"2025-12-09T10:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.603526 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.603573 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.603582 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.603597 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.603605 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:25Z","lastTransitionTime":"2025-12-09T10:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.706742 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.706793 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.706804 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.706841 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.706852 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:25Z","lastTransitionTime":"2025-12-09T10:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.809672 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.809752 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.809768 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.809800 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.809836 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:25Z","lastTransitionTime":"2025-12-09T10:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.912000 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.912050 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.912067 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.912089 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.912103 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:25Z","lastTransitionTime":"2025-12-09T10:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.931793 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2mnnl_7013527e-73de-4427-af9c-e33663b1c222/ovnkube-controller/2.log" Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.934856 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" event={"ID":"7013527e-73de-4427-af9c-e33663b1c222","Type":"ContainerStarted","Data":"d793531ca62af532bfc4e6f7582de7db1e219e2329be5c82f2e30bb11df1b871"} Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.936009 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.946590 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"389768b5-f13a-492b-a44f-4d71acf1c9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51d529f16cadd597347e6640d9f4c6fecf86943db64b9d1b154ee1adc9e68364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd3267e6e931890e236a6659bb3f39f0b33ed2217ec46326efb6ef7b8e13b77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffd3267e6e931890e236a6659bb3f39f0b33ed2217ec46326efb6ef7b8e13b77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:25Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.977650 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7013527e-73de-4427-af9c-e33663b1c222\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d793531ca62af532bfc4e6f7582de7db1e219e2329be5c82f2e30bb11df1b871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed556fa1bcef63fb127866b4317c98104f846e12a67a6865a362fe6777a3b40c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T10:01:58Z\\\",\\\"message\\\":\\\"ailed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:58Z is after 2025-08-24T17:21:41Z]\\\\nI1209 10:01:58.693873 6637 services_controller.go:445] Built service openshift-dns/dns-default LB template configs for network=default: []services.lbConfig(nil)\\\\nI1209 10:01:58.693874 6637 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI1209 10:01:58.693876 6637 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-kxpn6\\\\nI1209 10:01:58.693850 6637 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m689k in node crc\\\\nI1209 10:01:58.693892 6637 
obj_retry.go\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\"
:[{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2mnnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:25Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.988277 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m689k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02305623-2d65-47e3-ac63-5182bf50d141\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93e73de78b5120b5d2bf38748e84dad9dd5353e18130635243988d7b131ace3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg8sd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c8fa17f92ca9a774f62e20c5eeec59041bee23e426c36b2949d4f82c7e45f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg8sd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m689k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:25Z is after 2025-08-24T17:21:41Z" Dec 09 
10:02:25 crc kubenswrapper[5002]: I1209 10:02:25.999581 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:25Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.010851 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49c6392-68b2-4847-9291-a0b4d9c1cbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a019843745b7d5198565771c39f7949cf45738e236a2283588db2e57d07f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44da8a223abf131b459b827b0e8de65b415150f406fe22f2efb7e160cba4166c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kxpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:26Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.014310 5002 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.014350 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.014358 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.014370 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.014379 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:26Z","lastTransitionTime":"2025-12-09T10:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.024690 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7t88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edef2440-1e4a-4676-9517-08b21b3b66ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc19a171af6f6875b4d953edd75048a1249b44348ba03757126ebe943c118be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzs5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7t88\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:26Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.036465 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea4578dd-5c3e-4509-b456-a42a642871d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b30d7f55d95c36f85285f235dcf2de31c04cc358d5cce4d49f9ea43945fd3f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de5d36b22bcd84f50dfb6ae8858f98665f3ae3981d5dc2233fa9e3b92db56b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa189df1ab85704e8528d42da3e500dba354bb99dace868af40a49fec5b19fa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaffe6e7f597ce14c1bc564a62ec71519af84fa5d220d14a5e8000653d396c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aaffe6e7f597ce14c1bc564a62ec71519af84fa5d220d14a5e8000653d396c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:26Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.048016 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:26Z is after 2025-08-24T17:21:41Z"
Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.056214 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t5hm5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de8c639-f176-405c-ae34-6717f9f9458c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097fa8d84999c942645591541e1377c08cdeec593f64525db5a5aa8f7ec8cbae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9wwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t5hm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time
2025-12-09T10:02:26Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.059699 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.059705 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:02:26 crc kubenswrapper[5002]: E1209 10:02:26.059910 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862" Dec 09 10:02:26 crc kubenswrapper[5002]: E1209 10:02:26.059806 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.073241 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zjdhx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60fd7aba09010bae4f15fe793e0084c71d381f63bc4c1549f2ccbe57cdb90ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd096282
0e3fed45d417f93136939082f5f9f76b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\
\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"s
tate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zjdhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:26Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.083474 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-98z2f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ffb94c3-624e-48aa-aaa9-450ace4e1862\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mn62f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mn62f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-98z2f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:26Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.094029 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:26Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.111095 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rgf44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ed6e93-eda5-4648-b185-25d2960ce0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdfff3a756cc7a96ff82f208e082d8c283f7d558a14540a6953bcdc664b63f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541795eab5847f1d993b8f9f324b454b951ed3930155e455f013e8da805c019b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T10:02:18Z\\\",\\\"message\\\":\\\"2025-12-09T10:01:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e520db89-1ef9-44e8-b02a-6287f2e9d891\\\\n2025-12-09T10:01:32+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e520db89-1ef9-44e8-b02a-6287f2e9d891 to /host/opt/cni/bin/\\\\n2025-12-09T10:01:33Z [verbose] multus-daemon started\\\\n2025-12-09T10:01:33Z [verbose] Readiness Indicator file check\\\\n2025-12-09T10:02:18Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh57p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rgf44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:26Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.116959 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.117009 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.117023 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.117041 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.117054 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:26Z","lastTransitionTime":"2025-12-09T10:02:26Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.122996 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bc64859-6675-4dc6-b0a1-579abb87580e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0db847425b24ea6804034220f2050b153b78d21bc1cc934dad6784c11c68dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a41e95e38748ef89ff0bc6429eb223b7821756bf1e0c84a3af512f4f0166a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea3ea4cb1e3f00acc4ef769928988a0a2c2ee54afa0ab5f040ef50f465a9d6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\
\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1c1315eade2f326ac5feefc45cbcec29c7ee59fb40494f5153b7f8dbdfc404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:26Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.141372 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48170a45-0766-4c86-af19-b829960de244\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3df655bf1ce56b5aef6728759b8b3262260171afd0e4924212afc506fa313e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e29bcc86bc8035d73f3f12857d0024d93752d70d6a77b51d98278981669ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c750385ffbe9096e2cef43dffe002fd49599e44ed69806710b492749637dbe93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73ac21971d31e17f4f76bcbb1e02201b53b12
189815ced304d7913f6aa76f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd16d2319de83381932ca61f8b08dd31ecfde33e270c3ea2ea3edb3c2fa174b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:26Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.154192 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b484d6dfa4fffc554c21f9d9bc8e77a6
2e96ebb9426f2bb34dea4b2c9244d680\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://301eb2b68a37707ea483536d639e5f476d3cb41ad8c60fbaea2e019fb51bad48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49d7b26bbe255f1217808981337d8190bdeac4f5008ee17df5242867e3103e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1209 10:01:25.592088 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 10:01:25.592207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 10:01:25.593930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3155801387/tls.crt::/tmp/serving-cert-3155801387/tls.key\\\\\\\"\\\\nI1209 10:01:25.900616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 10:01:25.902593 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 10:01:25.902611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 10:01:25.902628 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 10:01:25.902633 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 10:01:25.906542 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 10:01:25.906611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906618 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 10:01:25.906630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 10:01:25.906634 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 10:01:25.906639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 10:01:25.906563 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 10:01:25.907863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bd67b1b7ad3e21872e9aa91e61c978e0999546882ee2e21347d4f60e435e0b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:26Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.165172 5002 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab29aa6d19002efb0309c548e059e004c5002ccde634df95e3c2661a3e81207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcda2271939034d8c6c54fa3d648500ebda150fafcce9216338fad68552a65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:26Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.180912 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c149ed39076fc7ee5538e60dbc0a8fc303a21578e5cc3ac89a3aeaad2c21c6a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:26Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.193683 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d199b778c03f10fc3b1a2623600057801f54ee3240768ede1a79213b678fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:26Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.219479 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.219525 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.219536 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.219549 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.219558 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:26Z","lastTransitionTime":"2025-12-09T10:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.321952 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.322003 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.322014 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.322031 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.322042 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:26Z","lastTransitionTime":"2025-12-09T10:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.423856 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.423894 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.423903 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.423926 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.423943 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:26Z","lastTransitionTime":"2025-12-09T10:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.527214 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.527263 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.527279 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.527300 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.527319 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:26Z","lastTransitionTime":"2025-12-09T10:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.629871 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.630489 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.630581 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.630666 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.630758 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:26Z","lastTransitionTime":"2025-12-09T10:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.733253 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.733539 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.733618 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.733686 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.733749 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:26Z","lastTransitionTime":"2025-12-09T10:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.836179 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.836403 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.836511 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.836619 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.836694 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:26Z","lastTransitionTime":"2025-12-09T10:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.938684 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.938738 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.938750 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.938765 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.938777 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:26Z","lastTransitionTime":"2025-12-09T10:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.940745 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2mnnl_7013527e-73de-4427-af9c-e33663b1c222/ovnkube-controller/3.log" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.941620 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2mnnl_7013527e-73de-4427-af9c-e33663b1c222/ovnkube-controller/2.log" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.944004 5002 generic.go:334] "Generic (PLEG): container finished" podID="7013527e-73de-4427-af9c-e33663b1c222" containerID="d793531ca62af532bfc4e6f7582de7db1e219e2329be5c82f2e30bb11df1b871" exitCode=1 Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.944052 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" event={"ID":"7013527e-73de-4427-af9c-e33663b1c222","Type":"ContainerDied","Data":"d793531ca62af532bfc4e6f7582de7db1e219e2329be5c82f2e30bb11df1b871"} Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.944089 5002 scope.go:117] "RemoveContainer" containerID="ed556fa1bcef63fb127866b4317c98104f846e12a67a6865a362fe6777a3b40c" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.945443 5002 scope.go:117] "RemoveContainer" containerID="d793531ca62af532bfc4e6f7582de7db1e219e2329be5c82f2e30bb11df1b871" Dec 09 10:02:26 crc kubenswrapper[5002]: E1209 10:02:26.945695 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-2mnnl_openshift-ovn-kubernetes(7013527e-73de-4427-af9c-e33663b1c222)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" podUID="7013527e-73de-4427-af9c-e33663b1c222" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.965705 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48170a45-0766-4c86-af19-b829960de244\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3df655bf1ce56b5aef6728759b8b3262260171afd0e4924212afc506fa313e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e29bcc86bc8035d73f3f12857d0024d93752d70d6a77b51d98278981669ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c750385ffbe9096e2cef43dffe002fd49599e44ed69806710b492749637dbe93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73ac21971d31e17f4f76bcbb1e02201b53b12
189815ced304d7913f6aa76f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd16d2319de83381932ca61f8b08dd31ecfde33e270c3ea2ea3edb3c2fa174b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:26Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.978889 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b484d6dfa4fffc554c21f9d9bc8e77a6
2e96ebb9426f2bb34dea4b2c9244d680\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://301eb2b68a37707ea483536d639e5f476d3cb41ad8c60fbaea2e019fb51bad48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49d7b26bbe255f1217808981337d8190bdeac4f5008ee17df5242867e3103e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1209 10:01:25.592088 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 10:01:25.592207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 10:01:25.593930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3155801387/tls.crt::/tmp/serving-cert-3155801387/tls.key\\\\\\\"\\\\nI1209 10:01:25.900616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 10:01:25.902593 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 10:01:25.902611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 10:01:25.902628 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 10:01:25.902633 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 10:01:25.906542 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 10:01:25.906611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906618 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 10:01:25.906630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 10:01:25.906634 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 10:01:25.906639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 10:01:25.906563 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 10:01:25.907863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bd67b1b7ad3e21872e9aa91e61c978e0999546882ee2e21347d4f60e435e0b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:26Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:26 crc kubenswrapper[5002]: I1209 10:02:26.994505 5002 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab29aa6d19002efb0309c548e059e004c5002ccde634df95e3c2661a3e81207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcda2271939034d8c6c54fa3d648500ebda150fafcce9216338fad68552a65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:26Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.012394 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c149ed39076fc7ee5538e60dbc0a8fc303a21578e5cc3ac89a3aeaad2c21c6a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:27Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.027251 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d199b778c03f10fc3b1a2623600057801f54ee3240768ede1a79213b678fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:27Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.042716 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.043128 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:27Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.043168 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.043478 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.043579 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.043688 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:27Z","lastTransitionTime":"2025-12-09T10:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.060055 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:02:27 crc kubenswrapper[5002]: E1209 10:02:27.060201 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.060368 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:02:27 crc kubenswrapper[5002]: E1209 10:02:27.060415 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.061674 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rgf44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ed6e93-eda5-4648-b185-25d2960ce0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdfff3a756cc7a96ff82f208e082d8c283f7d558a14540a6953bcdc664b63f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541795eab5847f1d993b8f9f324b454b951ed3930155e455f013e8da805c019b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T10:02:18Z\\\",\\\"message\\\":\\\"2025-12-09T10:01:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e520db89-1ef9-44e8-b02a-6287f2e9d891\\\\n2025-12-09T10:01:32+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e520db89-1ef9-44e8-b02a-6287f2e9d891 to /host/opt/cni/bin/\\\\n2025-12-09T10:01:33Z [verbose] multus-daemon started\\\\n2025-12-09T10:01:33Z [verbose] Readiness Indicator file check\\\\n2025-12-09T10:02:18Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh57p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rgf44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:27Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.078338 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bc64859-6675-4dc6-b0a1-579abb87580e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0db847425b24ea6804034220f2050b153b78d21bc1cc934dad6784c11c68dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a41e95e38748ef89ff0bc6429eb223b7821756bf1e0c84a3af512f4f0166a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea3ea4cb1e3f00acc4ef769928988a0a2c2ee54afa0ab5f040ef50f465a9d6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1c1315eade2f326ac5feefc45cbcec29c7ee59fb40494f5153b7f8dbdfc404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:27Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.119518 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7013527e-73de-4427-af9c-e33663b1c222\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d793531ca62af532bfc4e6f7582de7db1e219e23
29be5c82f2e30bb11df1b871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed556fa1bcef63fb127866b4317c98104f846e12a67a6865a362fe6777a3b40c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T10:01:58Z\\\",\\\"message\\\":\\\"ailed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:01:58Z is after 2025-08-24T17:21:41Z]\\\\nI1209 10:01:58.693873 6637 services_controller.go:445] Built service openshift-dns/dns-default LB template configs for network=default: []services.lbConfig(nil)\\\\nI1209 10:01:58.693874 6637 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI1209 10:01:58.693876 6637 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-kxpn6\\\\nI1209 10:01:58.693850 6637 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m689k in node crc\\\\nI1209 10:01:58.693892 6637 obj_retry.go\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d793531ca62af532bfc4e6f7582de7db1e219e2329be5c82f2e30bb11df1b871\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T10:02:26Z\\\",\\\"message\\\":\\\"26.247440 7025 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress/router-internal-default]} name:Service_openshift-ingress/router-internal-default_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.176:1936: 10.217.4.176:443: 10.217.4.176:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {595f6e90-7cd8-4871-85ab-9519d3c9c3e5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1209 10:02:26.247605 7025 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: 
failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.
168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2mnnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:27Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.146350 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.146408 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.146421 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.146440 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.146457 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:27Z","lastTransitionTime":"2025-12-09T10:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.152403 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m689k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02305623-2d65-47e3-ac63-5182bf50d141\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93e73de78b5120b5d2bf38748e84dad9dd5353e18130635243988d7b131ace3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg8sd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c8fa17f92ca9a774f62e20c5eeec59041bee23e426c36b2949d4f82c7e45f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg8sd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m689k\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:27Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.164800 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"389768b5-f13a-492b-a44f-4d71acf1c9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51d529f16cadd597347e6640d9f4c6fecf86943db64b9d1b154ee1adc9e68364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd3267e6e931890e236a6659bb3f39f0b33ed2217ec46326efb6ef7b8e13b77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffd3267e6e931890e236a6659bb3f39f0b33ed2217ec46326efb6ef7b8e13b77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:27Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.177474 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49c6392-68b2-4847-9291-a0b4d9c1cbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a019843745b7d5198565771c39f7949cf45738e236a2283588db2e57d07f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44da8a223abf131b459b827b0e8de65b415150f406fe22f2efb7e160cba4166c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kxpn6\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:27Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.188670 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7t88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edef2440-1e4a-4676-9517-08b21b3b66ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc19a171af6f6875b4d953edd75048a1249b44348ba03757126ebe943c118be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzs5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7t88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:27Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.202953 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:27Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.214841 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:27Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.227920 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t5hm5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de8c639-f176-405c-ae34-6717f9f9458c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097fa8d84999c942645591541e1377c08cdeec593f64525db5a5aa8f7ec8cbae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9wwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t5hm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:27Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.244704 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zjdhx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60fd7aba09010bae4f15fe793e0084c71d381f63bc4c1549f2ccbe57cdb90ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-co
py\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-zjdhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:27Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.251321 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.251365 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.251381 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.251396 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.251405 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:27Z","lastTransitionTime":"2025-12-09T10:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.259903 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-98z2f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ffb94c3-624e-48aa-aaa9-450ace4e1862\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mn62f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mn62f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-98z2f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:27Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.271542 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea4578dd-5c3e-4509-b456-a42a642871d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b30d7f55d95c36f85285f235dcf2de31c04cc358d5cce4d49f9ea43945fd3f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de5d36b22bcd84f50dfb6ae8858f98665f3ae3981d5dc2233fa9e3b92db56b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa189df1ab85704e8528d42da3e500dba354bb99dace868af40a49fec5b19fa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaffe6e7f597ce14c1bc564a62ec71519af84fa5d220d14a5e8000653d396c6d\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aaffe6e7f597ce14c1bc564a62ec71519af84fa5d220d14a5e8000653d396c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:27Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.353955 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.353995 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.354005 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.354019 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.354029 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:27Z","lastTransitionTime":"2025-12-09T10:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.456243 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.456284 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.456298 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.456313 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.456323 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:27Z","lastTransitionTime":"2025-12-09T10:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.558685 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.558731 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.558749 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.558766 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.558778 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:27Z","lastTransitionTime":"2025-12-09T10:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.661506 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.661557 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.661580 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.661596 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.661607 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:27Z","lastTransitionTime":"2025-12-09T10:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.764135 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.764172 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.764184 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.764200 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.764211 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:27Z","lastTransitionTime":"2025-12-09T10:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.867255 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.867293 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.867306 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.867322 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.867334 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:27Z","lastTransitionTime":"2025-12-09T10:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.950328 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2mnnl_7013527e-73de-4427-af9c-e33663b1c222/ovnkube-controller/3.log"
Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.955341 5002 scope.go:117] "RemoveContainer" containerID="d793531ca62af532bfc4e6f7582de7db1e219e2329be5c82f2e30bb11df1b871"
Dec 09 10:02:27 crc kubenswrapper[5002]: E1209 10:02:27.955589 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-2mnnl_openshift-ovn-kubernetes(7013527e-73de-4427-af9c-e33663b1c222)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" podUID="7013527e-73de-4427-af9c-e33663b1c222"
Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.970740 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"389768b5-f13a-492b-a44f-4d71acf1c9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51d529f16cadd597347e6640d9f4c6fecf86943db64b9d1b154ee1adc9e68364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd3267e6e931890e236a6659bb3f39f0b33ed2217ec46326efb6ef7b8e13b77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffd3267e6e931890e236a6659bb3f39f0b33ed2217ec46326efb6ef7b8e13b77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:27Z is after 2025-08-24T17:21:41Z"
Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.971169 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.971202 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.971214 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.971230 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.971242 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:27Z","lastTransitionTime":"2025-12-09T10:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 09 10:02:27 crc kubenswrapper[5002]: I1209 10:02:27.998207 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7013527e-73de-4427-af9c-e33663b1c222\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d793531ca62af532bfc4e6f7582de7db1e219e2329be5c82f2e30bb11df1b871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d793531ca62af532bfc4e6f7582de7db1e219e2329be5c82f2e30bb11df1b871\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T10:02:26Z\\\",\\\"message\\\":\\\"26.247440 7025 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress/router-internal-default]} name:Service_openshift-ingress/router-internal-default_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.176:1936: 10.217.4.176:443: 10.217.4.176:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {595f6e90-7cd8-4871-85ab-9519d3c9c3e5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1209 10:02:26.247605 7025 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:02:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-2mnnl_openshift-ovn-kubernetes(7013527e-73de-4427-af9c-e33663b1c222)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2mnnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:27Z is after 2025-08-24T17:21:41Z"
Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.012570 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m689k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02305623-2d65-47e3-ac63-5182bf50d141\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93e73de78b5120b5d2bf38748e84dad9dd5353e18130635243988d7b131ace3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg8sd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c8fa17f92ca9a774f62e20c5eeec59041bee23e426c36b2949d4f82c7e45f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg8sd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m689k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:28Z is after 2025-08-24T17:21:41Z"
Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.028152 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:28Z is after 2025-08-24T17:21:41Z"
Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.041122 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49c6392-68b2-4847-9291-a0b4d9c1cbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a019843745b7d5198565771c39f7949cf45738e236a2283588db2e57d07f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44da8a223abf131b459b827b0e8de65b415150f406fe22f2efb7e160cba4166c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kxpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:28Z is after 2025-08-24T17:21:41Z"
Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.052360 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7t88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edef2440-1e4a-4676-9517-08b21b3b66ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc19a171af6f6875b4d953edd75048a1249b44348ba03757126ebe943c118be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzs5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7t88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:28Z is after 2025-08-24T17:21:41Z"
Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.059777 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.059871 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f"
Dec 09 10:02:28 crc kubenswrapper[5002]: E1209 10:02:28.059948 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 09 10:02:28 crc kubenswrapper[5002]: E1209 10:02:28.059993 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862"
Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.062564 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-98z2f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ffb94c3-624e-48aa-aaa9-450ace4e1862\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mn62f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mn62f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-98z2f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:28Z is after 2025-08-24T17:21:41Z"
Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.072904 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea4578dd-5c3e-4509-b456-a42a642871d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b30d7f55d95c36f85285f235dcf2de31c04cc358d5cce4d49f9ea43945fd3f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de5d36b22bcd84f50dfb6ae8858f98665f3ae3981d5dc2233fa9e3b92db56b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa189df1ab85704e8528d42da3e500dba354bb99dace868af40a49fec5b19fa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaffe6e7f597ce14c1bc564a62ec71519af84fa5d220d14a5e8000653d396c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aaffe6e7f597ce14c1bc564a62ec71519af84fa5d220d14a5e8000653d396c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:28Z is after 2025-08-24T17:21:41Z"
Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.073461 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.073497 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.073509 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.073525 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.073537 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:28Z","lastTransitionTime":"2025-12-09T10:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.086448 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:28Z is after 2025-08-24T17:21:41Z"
Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.096597 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t5hm5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de8c639-f176-405c-ae34-6717f9f9458c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097fa8d84999c942645591541e1377c08cdeec593f64525db5a5aa8f7ec8cbae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9wwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t5hm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:28Z is after 2025-08-24T17:21:41Z"
Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.109466 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zjdhx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60fd7aba09010bae4f15fe793e0084c71d381f63bc4c1549f2ccbe57cdb90ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zjdhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:28Z is after 2025-08-24T17:21:41Z"
Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.120861 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c149ed39076fc7ee5538e60dbc0a8fc303a21578e5cc3ac89a3aeaad2c21c6a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:28Z is after 2025-08-24T17:21:41Z"
Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.137898 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d199b778c03f10fc3b1a2623600057801f54ee3240768ede1a79213b678fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:28Z is after 2025-08-24T17:21:41Z"
Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.151470 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:28Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.166017 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rgf44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ed6e93-eda5-4648-b185-25d2960ce0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdfff3a756cc7a96ff82f208e082d8c283f7d558a14540a6953bcdc664b63f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541795eab5847f1d993b8f9f324b454b951ed3930155e455f013e8da805c019b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T10:02:18Z\\\",\\\"message\\\":\\\"2025-12-09T10:01:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e520db89-1ef9-44e8-b02a-6287f2e9d891\\\\n2025-12-09T10:01:32+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e520db89-1ef9-44e8-b02a-6287f2e9d891 to /host/opt/cni/bin/\\\\n2025-12-09T10:01:33Z [verbose] multus-daemon started\\\\n2025-12-09T10:01:33Z [verbose] Readiness Indicator file check\\\\n2025-12-09T10:02:18Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh57p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rgf44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:28Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.175359 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.175400 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.175411 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.175500 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.175511 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:28Z","lastTransitionTime":"2025-12-09T10:02:28Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.180379 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bc64859-6675-4dc6-b0a1-579abb87580e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0db847425b24ea6804034220f2050b153b78d21bc1cc934dad6784c11c68dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a41e95e38748ef89ff0bc6429eb223b7821756bf1e0c84a3af512f4f0166a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea3ea4cb1e3f00acc4ef769928988a0a2c2ee54afa0ab5f040ef50f465a9d6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\
\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1c1315eade2f326ac5feefc45cbcec29c7ee59fb40494f5153b7f8dbdfc404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:28Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.201063 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48170a45-0766-4c86-af19-b829960de244\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3df655bf1ce56b5aef6728759b8b3262260171afd0e4924212afc506fa313e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e29bcc86bc8035d73f3f12857d0024d93752d70d6a77b51d98278981669ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c750385ffbe9096e2cef43dffe002fd49599e44ed69806710b492749637dbe93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73ac21971d31e17f4f76bcbb1e02201b53b12
189815ced304d7913f6aa76f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd16d2319de83381932ca61f8b08dd31ecfde33e270c3ea2ea3edb3c2fa174b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:28Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.214861 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b484d6dfa4fffc554c21f9d9bc8e77a6
2e96ebb9426f2bb34dea4b2c9244d680\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://301eb2b68a37707ea483536d639e5f476d3cb41ad8c60fbaea2e019fb51bad48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49d7b26bbe255f1217808981337d8190bdeac4f5008ee17df5242867e3103e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1209 10:01:25.592088 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 10:01:25.592207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 10:01:25.593930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3155801387/tls.crt::/tmp/serving-cert-3155801387/tls.key\\\\\\\"\\\\nI1209 10:01:25.900616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 10:01:25.902593 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 10:01:25.902611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 10:01:25.902628 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 10:01:25.902633 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 10:01:25.906542 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 10:01:25.906611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906618 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 10:01:25.906630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 10:01:25.906634 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 10:01:25.906639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 10:01:25.906563 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 10:01:25.907863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bd67b1b7ad3e21872e9aa91e61c978e0999546882ee2e21347d4f60e435e0b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:28Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.227440 5002 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab29aa6d19002efb0309c548e059e004c5002ccde634df95e3c2661a3e81207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcda2271939034d8c6c54fa3d648500ebda150fafcce9216338fad68552a65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:28Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.238772 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea4578dd-5c3e-4509-b456-a42a642871d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b30d7f55d95c36f85285f235dcf2de31c04cc358d5cce4d49f9ea43945fd3f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de5d36b22bcd84f50dfb6ae8858f98665f3ae3981d5dc2233fa9e3b92db56b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa189df1ab85704e8528d42da3e500dba354bb99dace868af40a49fec5b19fa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaffe6e7f597ce14c1bc564a62ec71519af84fa5d220d14a5e8000653d396c6d\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aaffe6e7f597ce14c1bc564a62ec71519af84fa5d220d14a5e8000653d396c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:28Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.249991 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:28Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.258489 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t5hm5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de8c639-f176-405c-ae34-6717f9f9458c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097fa8d84999c942645591541e1377c08cdeec593f64525db5a5aa8f7ec8cbae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9wwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t5hm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-09T10:02:28Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.270848 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zjdhx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60fd7aba09010bae4f15fe793e0084c71d381f63bc4c1549f2ccbe57cdb90ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e
02577330049643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-
copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zjdhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:28Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.277631 5002 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.277832 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.277919 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.278012 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.278076 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:28Z","lastTransitionTime":"2025-12-09T10:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.281516 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-98z2f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ffb94c3-624e-48aa-aaa9-450ace4e1862\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mn62f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mn62f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-98z2f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:28Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.293370 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:28Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.305506 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rgf44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ed6e93-eda5-4648-b185-25d2960ce0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdfff3a756cc7a96ff82f208e082d8c283f7d558a14540a6953bcdc664b63f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541795eab5847f1d993b8f9f324b454b951ed3930155e455f013e8da805c019b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T10:02:18Z\\\",\\\"message\\\":\\\"2025-12-09T10:01:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e520db89-1ef9-44e8-b02a-6287f2e9d891\\\\n2025-12-09T10:01:32+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e520db89-1ef9-44e8-b02a-6287f2e9d891 to /host/opt/cni/bin/\\\\n2025-12-09T10:01:33Z [verbose] multus-daemon started\\\\n2025-12-09T10:01:33Z [verbose] Readiness Indicator file check\\\\n2025-12-09T10:02:18Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh57p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rgf44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:28Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.316650 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bc64859-6675-4dc6-b0a1-579abb87580e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0db847425b24ea6804034220f2050b153b78d21bc1cc934dad6784c11c68dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a41e95e38748ef89ff0bc6429eb223b7821756bf1e0c84a3af512f4f0166a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea3ea4cb1e3f00acc4ef769928988a0a2c2ee54afa0ab5f040ef50f465a9d6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1c1315eade2f326ac5feefc45cbcec29c7ee59fb40494f5153b7f8dbdfc404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:28Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.337547 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48170a45-0766-4c86-af19-b829960de244\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3df655bf1ce56b5aef6728759b8b3262260171afd0e4924212afc506fa313e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e29bcc86bc8035d73f3f12857d0024d93752d70d6a77b51d98278981669ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07
b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c750385ffbe9096e2cef43dffe002fd49599e44ed69806710b492749637dbe93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73ac21971d31e17f4f76bcbb1e02201b53b12189815ced304d7913f6aa76f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd16d2319de83381932ca61f8b08dd31ecfde33e270c3ea2ea3edb3c2fa174b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6276edea61fa26eb54c6c213ac9d63ac29345d0322fa80b2b98fccdc648bb782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b91eb209f111c54aa96bcbb7d65621f2d34eeb2554662305ccec5251d26f4136\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://474e5c6172b067d8f3f91fb7b8eb8259194d8defa339b5f03d60fbb45db923a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:28Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.351351 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b484d6dfa4fffc554c21f9d9bc8e77a62e96ebb9426f2bb34dea4b2c9244d680\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://301eb2b68a37707ea483536d639e5f476d3cb41ad8c60fbaea2e019fb51bad48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49d7b26bbe255f1217808981337d8190bdeac4f5008ee17df5242867e3103e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1209 10:01:25.592088 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 10:01:25.592207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 10:01:25.593930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3155801387/tls.crt::/tmp/serving-cert-3155801387/tls.key\\\\\\\"\\\\nI1209 10:01:25.900616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 10:01:25.902593 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 10:01:25.902611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 10:01:25.902628 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 10:01:25.902633 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 10:01:25.906542 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 10:01:25.906611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906618 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 10:01:25.906630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 10:01:25.906634 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 10:01:25.906639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 10:01:25.906563 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 10:01:25.907863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bd67b1b7ad3e21872e9aa91e61c978e0999546882ee2e21347d4f60e435e0b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:28Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.365593 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab29aa6d19002efb0309c548e059e004c5002ccde634df95e3c2661a3e81207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcda2271939034d8c6c54fa3d648500ebda150fafcce9216338fad68552a65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:28Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.380886 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.380931 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.380943 5002 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.380963 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.380977 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:28Z","lastTransitionTime":"2025-12-09T10:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.381005 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c149ed39076fc7ee5538e60dbc0a8fc303a21578e5cc3ac89a3aeaad2c21c6a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:28Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.394085 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d199b778c03f10fc3b1a2623600057801f54ee3240768ede1a79213b678fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:28Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.404615 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"389768b5-f13a-492b-a44f-4d71acf1c9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51d529f16cadd597347e6640d9f4c6fecf86943db64b9d1b154ee1adc9e68364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd3267e6e931890e236a6659bb3f39f0b33ed2217ec46326efb6ef7b8e13b77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffd3267e6e931890e236a6659bb3f39f0b33ed2217ec46326efb6ef7b8e13b77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:28Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.421914 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7013527e-73de-4427-af9c-e33663b1c222\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d793531ca62af532bfc4e6f7582de7db1e219e2329be5c82f2e30bb11df1b871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d793531ca62af532bfc4e6f7582de7db1e219e2329be5c82f2e30bb11df1b871\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T10:02:26Z\\\",\\\"message\\\":\\\"26.247440 7025 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress/router-internal-default]} name:Service_openshift-ingress/router-internal-default_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.176:1936: 10.217.4.176:443: 10.217.4.176:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {595f6e90-7cd8-4871-85ab-9519d3c9c3e5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1209 10:02:26.247605 7025 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:02:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2mnnl_openshift-ovn-kubernetes(7013527e-73de-4427-af9c-e33663b1c222)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnh82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2mnnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:28Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.432391 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m689k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02305623-2d65-47e3-ac63-5182bf50d141\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93e73de78b5120b5d2bf38748e84dad9dd5353e18130635243988d7b131ace3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg8sd
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c8fa17f92ca9a774f62e20c5eeec59041bee23e426c36b2949d4f82c7e45f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg8sd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m689k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:28Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.445887 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:28Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.455992 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49c6392-68b2-4847-9291-a0b4d9c1cbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a019843745b7d5198565771c39f7949cf45738e236a2283588db2e57d07f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44da8a223abf131b459b827b0e8de65b415150f406fe22f2efb7e160cba4166c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kxpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:28Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.467454 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7t88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edef2440-1e4a-4676-9517-08b21b3b66ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc19a171af6f6875b4d953edd75048a1249b44348ba03757126ebe943c118be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzs5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7t88\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:28Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.483650 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.483704 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.483717 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.483735 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.483748 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:28Z","lastTransitionTime":"2025-12-09T10:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.586668 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.586747 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.586770 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.586800 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.586895 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:28Z","lastTransitionTime":"2025-12-09T10:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.689518 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.689550 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.689559 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.689573 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.689582 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:28Z","lastTransitionTime":"2025-12-09T10:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.792538 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.792588 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.792606 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.792632 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.792650 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:28Z","lastTransitionTime":"2025-12-09T10:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.896043 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.896104 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.896124 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.896153 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.896172 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:28Z","lastTransitionTime":"2025-12-09T10:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.998938 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.999336 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.999415 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:28 crc kubenswrapper[5002]: I1209 10:02:28.999577 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.000023 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:28Z","lastTransitionTime":"2025-12-09T10:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.059802 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.059803 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:02:29 crc kubenswrapper[5002]: E1209 10:02:29.059957 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:02:29 crc kubenswrapper[5002]: E1209 10:02:29.060092 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.103241 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.103306 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.103318 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.103333 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.103344 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:29Z","lastTransitionTime":"2025-12-09T10:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.206593 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.206634 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.206643 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.206658 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.206667 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:29Z","lastTransitionTime":"2025-12-09T10:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.309337 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.309395 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.309408 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.309428 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.309441 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:29Z","lastTransitionTime":"2025-12-09T10:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.412860 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.412930 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.412945 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.412969 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.412982 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:29Z","lastTransitionTime":"2025-12-09T10:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.516290 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.516346 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.516360 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.516378 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.516389 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:29Z","lastTransitionTime":"2025-12-09T10:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.618632 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.618690 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.618698 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.618711 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.618737 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:29Z","lastTransitionTime":"2025-12-09T10:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.721656 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.721689 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.721697 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.721711 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.721721 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:29Z","lastTransitionTime":"2025-12-09T10:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.824248 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.824313 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.824338 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.824366 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.824387 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:29Z","lastTransitionTime":"2025-12-09T10:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.927076 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.927232 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.927254 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.927274 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.927287 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:29Z","lastTransitionTime":"2025-12-09T10:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.937895 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 10:02:29 crc kubenswrapper[5002]: E1209 10:02:29.938047 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:03:33.938014935 +0000 UTC m=+146.330066056 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.938108 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.938186 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.938230 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:02:29 crc kubenswrapper[5002]: I1209 10:02:29.938268 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:02:29 crc kubenswrapper[5002]: E1209 10:02:29.938383 5002 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 10:02:29 crc kubenswrapper[5002]: E1209 10:02:29.938446 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 10:03:33.938428616 +0000 UTC m=+146.330479737 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 10:02:29 crc kubenswrapper[5002]: E1209 10:02:29.938565 5002 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 10:02:29 crc kubenswrapper[5002]: E1209 10:02:29.938594 5002 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 10:02:29 crc kubenswrapper[5002]: E1209 10:02:29.938611 5002 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 10:02:29 crc kubenswrapper[5002]: E1209 10:02:29.938656 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-09 10:03:33.938642271 +0000 UTC m=+146.330693392 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 10:02:29 crc kubenswrapper[5002]: E1209 10:02:29.938736 5002 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 10:02:29 crc kubenswrapper[5002]: E1209 10:02:29.938763 5002 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 10:02:29 crc kubenswrapper[5002]: E1209 10:02:29.938779 5002 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 10:02:29 crc kubenswrapper[5002]: E1209 10:02:29.938868 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-09 10:03:33.938805496 +0000 UTC m=+146.330856607 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 10:02:29 crc kubenswrapper[5002]: E1209 10:02:29.938997 5002 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 10:02:29 crc kubenswrapper[5002]: E1209 10:02:29.939091 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 10:03:33.939068713 +0000 UTC m=+146.331119834 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.030750 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.030832 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.030847 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.030866 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.030885 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:30Z","lastTransitionTime":"2025-12-09T10:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.059863 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.059963 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:02:30 crc kubenswrapper[5002]: E1209 10:02:30.060030 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862" Dec 09 10:02:30 crc kubenswrapper[5002]: E1209 10:02:30.060162 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.132983 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.133030 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.133044 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.133061 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.133073 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:30Z","lastTransitionTime":"2025-12-09T10:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.235119 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.235160 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.235171 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.235187 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.235199 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:30Z","lastTransitionTime":"2025-12-09T10:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.338077 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.338330 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.338394 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.338465 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.338525 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:30Z","lastTransitionTime":"2025-12-09T10:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.442155 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.442195 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.442208 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.442224 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.442237 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:30Z","lastTransitionTime":"2025-12-09T10:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.544634 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.545015 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.545116 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.545274 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.545374 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:30Z","lastTransitionTime":"2025-12-09T10:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.647852 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.648143 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.648235 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.648320 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.648437 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:30Z","lastTransitionTime":"2025-12-09T10:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.751177 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.751254 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.751269 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.751290 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.751307 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:30Z","lastTransitionTime":"2025-12-09T10:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.853889 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.853935 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.853947 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.853968 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.853981 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:30Z","lastTransitionTime":"2025-12-09T10:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.917794 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.918224 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.918424 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.918623 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.918788 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:30Z","lastTransitionTime":"2025-12-09T10:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:30 crc kubenswrapper[5002]: E1209 10:02:30.937758 5002 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:02:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:02:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:02:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:02:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb4d43e7-bcbf-4472-90e9-44716d72c15e\\\",\\\"systemUUID\\\":\\\"8af61218-105c-4188-8c40-2d81c3899a86\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:30Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.942105 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.942142 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.942152 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.942164 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.942174 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:30Z","lastTransitionTime":"2025-12-09T10:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:30 crc kubenswrapper[5002]: E1209 10:02:30.958712 5002 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:02:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:02:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:02:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:02:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb4d43e7-bcbf-4472-90e9-44716d72c15e\\\",\\\"systemUUID\\\":\\\"8af61218-105c-4188-8c40-2d81c3899a86\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:30Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.965963 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.966004 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
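[Editor's note: every failed status patch in this log ends with the same x509 failure, so the retries cannot make progress until the webhook certificate is renewed. A minimal, hypothetical Python sketch for confirming that from a saved journal excerpt; the file name kubelet.log is illustrative (e.g. the output of journalctl -u kubelet redirected to a file), and the regex simply targets the error text visible above:]

```python
import re

# Matches the webhook TLS failure emitted at kubelet_node_status.go:585,
# capturing the node's reported current time and the certificate's expiry.
PATTERN = re.compile(
    r'x509: certificate has expired or is not yet valid: '
    r'current time (\S+) is after (\S+)"'
)

def summarize(path="kubelet.log"):
    """Count the repeated webhook failures and report the expiry window."""
    hits = []
    with open(path, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            m = PATTERN.search(line)
            if m:
                hits.append(m.groups())
    if hits:
        current, not_after = hits[-1]
        print(f"{len(hits)} failed node-status patches; "
              f"certificate expired {not_after}, node clock at {current}")
    else:
        print("no expired-certificate webhook failures found")

if __name__ == "__main__":
    summarize()
```

[If the counts keep climbing while the expiry timestamp never changes, as in the entries above, the failure is static rather than transient.]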
event="NodeHasNoDiskPressure" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.966013 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.966028 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.966037 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:30Z","lastTransitionTime":"2025-12-09T10:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:30 crc kubenswrapper[5002]: E1209 10:02:30.977965 5002 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:02:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:02:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:02:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:02:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb4d43e7-bcbf-4472-90e9-44716d72c15e\\\",\\\"systemUUID\\\":\\\"8af61218-105c-4188-8c40-2d81c3899a86\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:30Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.987049 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.987096 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.987106 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.987123 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:30 crc kubenswrapper[5002]: I1209 10:02:30.987134 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:30Z","lastTransitionTime":"2025-12-09T10:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:31 crc kubenswrapper[5002]: E1209 10:02:31.004771 5002 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:02:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:02:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:02:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:02:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb4d43e7-bcbf-4472-90e9-44716d72c15e\\\",\\\"systemUUID\\\":\\\"8af61218-105c-4188-8c40-2d81c3899a86\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:31Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.008706 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.008742 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
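[Editor's note: the failing webhook endpoint is named directly in the error: https://127.0.0.1:9743/node. A hypothetical sketch for inspecting that serving certificate from the node itself, assuming Python and the openssl CLI are available on the host and the port is reachable; it may fail if the server requires a client certificate during the handshake:]

```python
import ssl
import subprocess

HOST, PORT = "127.0.0.1", 9743  # host/port taken from the webhook Post URL above

# get_server_certificate performs no chain verification, so it can still
# fetch an already-expired certificate in PEM form.
pem = ssl.get_server_certificate((HOST, PORT))

# Hand the PEM to the openssl CLI to print the notAfter field.
result = subprocess.run(
    ["openssl", "x509", "-noout", "-enddate"],
    input=pem.encode(),
    capture_output=True,
    check=True,
)
# Expected shape (value here is what the log above implies, not a guarantee):
# notAfter=Aug 24 17:21:41 2025 GMT
print(result.stdout.decode().strip())
```

[A notAfter earlier than the node clock confirms the x509 error seen in the patch failures is coming from the webhook's own serving certificate rather than from clock skew.]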
event="NodeHasNoDiskPressure" Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.008750 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.008764 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.008773 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:31Z","lastTransitionTime":"2025-12-09T10:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:31 crc kubenswrapper[5002]: E1209 10:02:31.021707 5002 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:02:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:02:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:02:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T10:02:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T10:02:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb4d43e7-bcbf-4472-90e9-44716d72c15e\\\",\\\"systemUUID\\\":\\\"8af61218-105c-4188-8c40-2d81c3899a86\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:31Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:31 crc kubenswrapper[5002]: E1209 10:02:31.021913 5002 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
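The node status update above is rejected because the node-identity webhook's serving certificate expired on 2025-08-24T17:21:41Z, long before the clock time in the log (2025-12-09T10:02:31Z). A minimal Go sketch of the same validity-window test the TLS handshake applies; the certificate path is a hypothetical stand-in, since the log does not say where the webhook's PEM file lives:

```go
// certcheck.go: reproduce the x509 "expired or not yet valid" test from the log.
package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
	"time"
)

func main() {
	path := "/path/to/webhook-serving-cert.pem" // hypothetical; not named in the log
	data, err := os.ReadFile(path)
	if err != nil {
		fmt.Fprintln(os.Stderr, "read:", err)
		os.Exit(1)
	}
	block, _ := pem.Decode(data)
	if block == nil {
		fmt.Fprintln(os.Stderr, "no PEM block found")
		os.Exit(1)
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		fmt.Fprintln(os.Stderr, "parse:", err)
		os.Exit(1)
	}
	now := time.Now().UTC()
	fmt.Printf("NotBefore=%s NotAfter=%s now=%s\n", cert.NotBefore, cert.NotAfter, now)
	// The log's "current time ... is after 2025-08-24T17:21:41Z" corresponds
	// to now.After(cert.NotAfter) being true here.
	if now.Before(cert.NotBefore) || now.After(cert.NotAfter) {
		fmt.Println("certificate is expired or not yet valid")
		os.Exit(2)
	}
	fmt.Println("certificate is within its validity window")
}
```

With the certificate expired, every status PATCH hits the webhook, fails TLS verification, and the kubelet eventually gives up with "update node status exceeds retry count", as the next entry shows.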
event="NodeHasSufficientMemory" Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.023544 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.023552 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.023565 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.023575 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:31Z","lastTransitionTime":"2025-12-09T10:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.059123 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.059168 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:02:31 crc kubenswrapper[5002]: E1209 10:02:31.059278 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:02:31 crc kubenswrapper[5002]: E1209 10:02:31.059361 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.126473 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.126570 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.126592 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.126614 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.126632 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:31Z","lastTransitionTime":"2025-12-09T10:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.230251 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.230328 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.230352 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.230382 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.230407 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:31Z","lastTransitionTime":"2025-12-09T10:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.333317 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.333362 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.333371 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.333391 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.333402 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:31Z","lastTransitionTime":"2025-12-09T10:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.436741 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.436848 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.436873 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.436942 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.436965 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:31Z","lastTransitionTime":"2025-12-09T10:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.542134 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.542177 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.542187 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.542206 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.542216 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:31Z","lastTransitionTime":"2025-12-09T10:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.645471 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.645573 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.645592 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.645655 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.645677 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:31Z","lastTransitionTime":"2025-12-09T10:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.748797 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.748880 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.748915 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.748944 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.748960 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:31Z","lastTransitionTime":"2025-12-09T10:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.851615 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.851676 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.851692 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.851713 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.851729 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:31Z","lastTransitionTime":"2025-12-09T10:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.956128 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.956182 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.956198 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.956214 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:31 crc kubenswrapper[5002]: I1209 10:02:31.956473 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:31Z","lastTransitionTime":"2025-12-09T10:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.058178 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.058211 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.058219 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.058231 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.058240 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:32Z","lastTransitionTime":"2025-12-09T10:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.059582 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:02:32 crc kubenswrapper[5002]: E1209 10:02:32.059764 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862" Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.059926 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:02:32 crc kubenswrapper[5002]: E1209 10:02:32.060054 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.159667 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.159710 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.159721 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.159735 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.159744 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:32Z","lastTransitionTime":"2025-12-09T10:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.262122 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.262167 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.262178 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.262193 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.262205 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:32Z","lastTransitionTime":"2025-12-09T10:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.365051 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.365377 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.365388 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.365403 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.365415 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:32Z","lastTransitionTime":"2025-12-09T10:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.468221 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.468344 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.468372 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.468402 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.468427 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:32Z","lastTransitionTime":"2025-12-09T10:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.570752 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.570844 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.570879 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.570908 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.570944 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:32Z","lastTransitionTime":"2025-12-09T10:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.674324 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.674370 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.674381 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.674402 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.674419 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:32Z","lastTransitionTime":"2025-12-09T10:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.777605 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.777647 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.777661 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.777682 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.777696 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:32Z","lastTransitionTime":"2025-12-09T10:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.881232 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.881281 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.881290 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.881304 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.881317 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:32Z","lastTransitionTime":"2025-12-09T10:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.984159 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.984203 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.984213 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.984230 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:32 crc kubenswrapper[5002]: I1209 10:02:32.984243 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:32Z","lastTransitionTime":"2025-12-09T10:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:33 crc kubenswrapper[5002]: I1209 10:02:33.059651 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:02:33 crc kubenswrapper[5002]: E1209 10:02:33.059768 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:02:33 crc kubenswrapper[5002]: I1209 10:02:33.059651 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:02:33 crc kubenswrapper[5002]: E1209 10:02:33.059986 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:02:33 crc kubenswrapper[5002]: I1209 10:02:33.087089 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:33 crc kubenswrapper[5002]: I1209 10:02:33.087122 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:33 crc kubenswrapper[5002]: I1209 10:02:33.087131 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:33 crc kubenswrapper[5002]: I1209 10:02:33.087144 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:33 crc kubenswrapper[5002]: I1209 10:02:33.087155 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:33Z","lastTransitionTime":"2025-12-09T10:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:33 crc kubenswrapper[5002]: I1209 10:02:33.189525 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:33 crc kubenswrapper[5002]: I1209 10:02:33.190076 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:33 crc kubenswrapper[5002]: I1209 10:02:33.190100 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:33 crc kubenswrapper[5002]: I1209 10:02:33.190122 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:33 crc kubenswrapper[5002]: I1209 10:02:33.190138 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:33Z","lastTransitionTime":"2025-12-09T10:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:33 crc kubenswrapper[5002]: I1209 10:02:33.292766 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:33 crc kubenswrapper[5002]: I1209 10:02:33.292832 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:33 crc kubenswrapper[5002]: I1209 10:02:33.292844 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:33 crc kubenswrapper[5002]: I1209 10:02:33.292869 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:33 crc kubenswrapper[5002]: I1209 10:02:33.292885 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:33Z","lastTransitionTime":"2025-12-09T10:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:33 crc kubenswrapper[5002]: I1209 10:02:33.395612 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:33 crc kubenswrapper[5002]: I1209 10:02:33.395653 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:33 crc kubenswrapper[5002]: I1209 10:02:33.395664 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:33 crc kubenswrapper[5002]: I1209 10:02:33.395682 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:33 crc kubenswrapper[5002]: I1209 10:02:33.395697 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:33Z","lastTransitionTime":"2025-12-09T10:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:33 crc kubenswrapper[5002]: I1209 10:02:33.498693 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:33 crc kubenswrapper[5002]: I1209 10:02:33.498741 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:33 crc kubenswrapper[5002]: I1209 10:02:33.498749 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:33 crc kubenswrapper[5002]: I1209 10:02:33.498763 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:33 crc kubenswrapper[5002]: I1209 10:02:33.498772 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:33Z","lastTransitionTime":"2025-12-09T10:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:33 crc kubenswrapper[5002]: I1209 10:02:33.601876 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:33 crc kubenswrapper[5002]: I1209 10:02:33.601934 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:33 crc kubenswrapper[5002]: I1209 10:02:33.601944 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:33 crc kubenswrapper[5002]: I1209 10:02:33.601967 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:33 crc kubenswrapper[5002]: I1209 10:02:33.601980 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:33Z","lastTransitionTime":"2025-12-09T10:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:33 crc kubenswrapper[5002]: I1209 10:02:33.705115 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:33 crc kubenswrapper[5002]: I1209 10:02:33.705162 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:33 crc kubenswrapper[5002]: I1209 10:02:33.705171 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:33 crc kubenswrapper[5002]: I1209 10:02:33.705193 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:33 crc kubenswrapper[5002]: I1209 10:02:33.705205 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:33Z","lastTransitionTime":"2025-12-09T10:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:33 crc kubenswrapper[5002]: I1209 10:02:33.807540 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:33 crc kubenswrapper[5002]: I1209 10:02:33.807613 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:33 crc kubenswrapper[5002]: I1209 10:02:33.807624 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:33 crc kubenswrapper[5002]: I1209 10:02:33.807645 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:33 crc kubenswrapper[5002]: I1209 10:02:33.807659 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:33Z","lastTransitionTime":"2025-12-09T10:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:33 crc kubenswrapper[5002]: I1209 10:02:33.910437 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:33 crc kubenswrapper[5002]: I1209 10:02:33.910485 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:33 crc kubenswrapper[5002]: I1209 10:02:33.910497 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:33 crc kubenswrapper[5002]: I1209 10:02:33.910513 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:33 crc kubenswrapper[5002]: I1209 10:02:33.910523 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:33Z","lastTransitionTime":"2025-12-09T10:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.012750 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.012842 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.012857 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.012875 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.012887 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:34Z","lastTransitionTime":"2025-12-09T10:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.059421 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.059504 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:02:34 crc kubenswrapper[5002]: E1209 10:02:34.059564 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:02:34 crc kubenswrapper[5002]: E1209 10:02:34.059678 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862" Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.115950 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.115998 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.116011 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.116028 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.116041 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:34Z","lastTransitionTime":"2025-12-09T10:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.218742 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.218778 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.218789 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.218829 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.218843 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:34Z","lastTransitionTime":"2025-12-09T10:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.321611 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.321663 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.321675 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.321691 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.321709 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:34Z","lastTransitionTime":"2025-12-09T10:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.424182 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.424250 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.424273 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.424301 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.424321 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:34Z","lastTransitionTime":"2025-12-09T10:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.527339 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.527389 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.527398 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.527411 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.527419 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:34Z","lastTransitionTime":"2025-12-09T10:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.629451 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.629504 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.629515 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.629531 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.629542 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:34Z","lastTransitionTime":"2025-12-09T10:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.731751 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.731866 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.731901 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.731934 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.731954 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:34Z","lastTransitionTime":"2025-12-09T10:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.835313 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.835388 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.835408 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.835440 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.835458 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:34Z","lastTransitionTime":"2025-12-09T10:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.938598 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.938628 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.938639 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.938652 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:34 crc kubenswrapper[5002]: I1209 10:02:34.938660 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:34Z","lastTransitionTime":"2025-12-09T10:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.041500 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.041556 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.041568 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.041587 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.041601 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:35Z","lastTransitionTime":"2025-12-09T10:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.059499 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.059502 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:02:35 crc kubenswrapper[5002]: E1209 10:02:35.059911 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:02:35 crc kubenswrapper[5002]: E1209 10:02:35.060032 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.146541 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.146613 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.146628 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.146654 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.146669 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:35Z","lastTransitionTime":"2025-12-09T10:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.249648 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.249723 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.249733 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.249750 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.249767 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:35Z","lastTransitionTime":"2025-12-09T10:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.353207 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.353274 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.353288 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.353305 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.353316 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:35Z","lastTransitionTime":"2025-12-09T10:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.456460 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.456506 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.456516 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.456529 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.456539 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:35Z","lastTransitionTime":"2025-12-09T10:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.559245 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.559307 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.559322 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.559342 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.559356 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:35Z","lastTransitionTime":"2025-12-09T10:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.661946 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.662000 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.662012 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.662031 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.662043 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:35Z","lastTransitionTime":"2025-12-09T10:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.764227 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.764271 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.764287 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.764308 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.764323 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:35Z","lastTransitionTime":"2025-12-09T10:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.866912 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.866975 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.866995 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.867019 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.867039 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:35Z","lastTransitionTime":"2025-12-09T10:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.970355 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.970430 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.970441 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.970457 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:35 crc kubenswrapper[5002]: I1209 10:02:35.970468 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:35Z","lastTransitionTime":"2025-12-09T10:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:36 crc kubenswrapper[5002]: I1209 10:02:36.060106 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:02:36 crc kubenswrapper[5002]: I1209 10:02:36.060207 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:02:36 crc kubenswrapper[5002]: E1209 10:02:36.060274 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862" Dec 09 10:02:36 crc kubenswrapper[5002]: E1209 10:02:36.060331 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:02:36 crc kubenswrapper[5002]: I1209 10:02:36.072949 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:36 crc kubenswrapper[5002]: I1209 10:02:36.072993 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:36 crc kubenswrapper[5002]: I1209 10:02:36.073003 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:36 crc kubenswrapper[5002]: I1209 10:02:36.073018 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:36 crc kubenswrapper[5002]: I1209 10:02:36.073029 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:36Z","lastTransitionTime":"2025-12-09T10:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:36 crc kubenswrapper[5002]: I1209 10:02:36.175369 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:36 crc kubenswrapper[5002]: I1209 10:02:36.175415 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:36 crc kubenswrapper[5002]: I1209 10:02:36.175427 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:36 crc kubenswrapper[5002]: I1209 10:02:36.175442 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:36 crc kubenswrapper[5002]: I1209 10:02:36.175454 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:36Z","lastTransitionTime":"2025-12-09T10:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:36 crc kubenswrapper[5002]: I1209 10:02:36.278538 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:36 crc kubenswrapper[5002]: I1209 10:02:36.278622 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:36 crc kubenswrapper[5002]: I1209 10:02:36.278647 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:36 crc kubenswrapper[5002]: I1209 10:02:36.278675 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:36 crc kubenswrapper[5002]: I1209 10:02:36.278697 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:36Z","lastTransitionTime":"2025-12-09T10:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:36 crc kubenswrapper[5002]: I1209 10:02:36.381436 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:36 crc kubenswrapper[5002]: I1209 10:02:36.381506 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:36 crc kubenswrapper[5002]: I1209 10:02:36.381546 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:36 crc kubenswrapper[5002]: I1209 10:02:36.381577 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:36 crc kubenswrapper[5002]: I1209 10:02:36.381599 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:36Z","lastTransitionTime":"2025-12-09T10:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:36 crc kubenswrapper[5002]: I1209 10:02:36.485150 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:36 crc kubenswrapper[5002]: I1209 10:02:36.485196 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:36 crc kubenswrapper[5002]: I1209 10:02:36.485210 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:36 crc kubenswrapper[5002]: I1209 10:02:36.485231 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:36 crc kubenswrapper[5002]: I1209 10:02:36.485245 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:36Z","lastTransitionTime":"2025-12-09T10:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:36 crc kubenswrapper[5002]: I1209 10:02:36.587681 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:36 crc kubenswrapper[5002]: I1209 10:02:36.587727 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:36 crc kubenswrapper[5002]: I1209 10:02:36.587737 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:36 crc kubenswrapper[5002]: I1209 10:02:36.587752 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:36 crc kubenswrapper[5002]: I1209 10:02:36.587762 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:36Z","lastTransitionTime":"2025-12-09T10:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:36 crc kubenswrapper[5002]: I1209 10:02:36.690721 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:36 crc kubenswrapper[5002]: I1209 10:02:36.690799 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:36 crc kubenswrapper[5002]: I1209 10:02:36.690877 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:36 crc kubenswrapper[5002]: I1209 10:02:36.690910 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:36 crc kubenswrapper[5002]: I1209 10:02:36.690935 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:36Z","lastTransitionTime":"2025-12-09T10:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:36 crc kubenswrapper[5002]: I1209 10:02:36.793480 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:36 crc kubenswrapper[5002]: I1209 10:02:36.793513 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:36 crc kubenswrapper[5002]: I1209 10:02:36.793525 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:36 crc kubenswrapper[5002]: I1209 10:02:36.793540 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:36 crc kubenswrapper[5002]: I1209 10:02:36.793552 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:36Z","lastTransitionTime":"2025-12-09T10:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:36 crc kubenswrapper[5002]: I1209 10:02:36.897441 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:36 crc kubenswrapper[5002]: I1209 10:02:36.897526 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:36 crc kubenswrapper[5002]: I1209 10:02:36.897553 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:36 crc kubenswrapper[5002]: I1209 10:02:36.897584 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:36 crc kubenswrapper[5002]: I1209 10:02:36.897609 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:36Z","lastTransitionTime":"2025-12-09T10:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.001302 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.001368 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.001385 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.001408 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.001426 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:37Z","lastTransitionTime":"2025-12-09T10:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.059353 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.059352 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:02:37 crc kubenswrapper[5002]: E1209 10:02:37.059587 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:02:37 crc kubenswrapper[5002]: E1209 10:02:37.059676 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.104000 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.104044 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.104055 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.104073 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.104086 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:37Z","lastTransitionTime":"2025-12-09T10:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.207405 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.207486 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.207508 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.207533 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.207553 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:37Z","lastTransitionTime":"2025-12-09T10:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.310759 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.310970 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.311004 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.311037 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.311060 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:37Z","lastTransitionTime":"2025-12-09T10:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.414197 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.414262 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.414281 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.414306 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.414325 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:37Z","lastTransitionTime":"2025-12-09T10:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.518171 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.518217 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.518230 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.518246 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.518259 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:37Z","lastTransitionTime":"2025-12-09T10:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.620380 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.620420 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.620430 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.620445 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.620456 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:37Z","lastTransitionTime":"2025-12-09T10:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.722563 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.722611 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.722623 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.722641 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.722653 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:37Z","lastTransitionTime":"2025-12-09T10:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.825459 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.825507 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.825524 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.825546 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.825562 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:37Z","lastTransitionTime":"2025-12-09T10:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.928031 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.928090 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.928109 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.928131 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:37 crc kubenswrapper[5002]: I1209 10:02:37.928147 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:37Z","lastTransitionTime":"2025-12-09T10:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.030589 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.030617 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.030624 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.030636 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.030644 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:38Z","lastTransitionTime":"2025-12-09T10:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.064935 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:02:38 crc kubenswrapper[5002]: E1209 10:02:38.065107 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.065172 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:02:38 crc kubenswrapper[5002]: E1209 10:02:38.065964 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.076354 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7t88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edef2440-1e4a-4676-9517-08b21b3b66ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc19a171af6f6875b4d953edd75048a1249b44348ba03757126ebe943c118be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzs5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7t88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:38Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.089235 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:38Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.100221 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49c6392-68b2-4847-9291-a0b4d9c1cbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a019843745b7d5198565771c39f7949cf45738e236a2283588db2e57d07f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44da8a223abf131b459b827b0e8de65b415150f406fe22f2efb7e160cba4166c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sslq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kxpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:38Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.110063 5002 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-t5hm5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de8c639-f176-405c-ae34-6717f9f9458c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://097fa8d84999c942645591541e1377c08cdeec593f64525db5a5aa8f7ec8cbae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9wwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t5hm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:38Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.121977 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zjdhx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9281be7a-3f7d-4b00-a3ff-f5b9236dd74a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60fd7aba09010bae4f15fe793e0084c71d381f63bc4c1549f2ccbe57cdb90ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc4cbbaac292f51065cd93b4bd0962820e3fed45d417f93136939082f5f9f76b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d28aba5d30f5176f31f26a4917a4a454922f88e4732d16fe3e02577330049643\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://006a6eebee908dda7b3a073d145211abdac7832ac7d8759c15770cc9d1bd022b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a15d04ef5c291981367747944f9f25504f781d01b5ff8742f5be6a1f054d2cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75f6f0863417856e6de310d67d64adc7734c60c75bb1c8d7a14d30d10bbbe858\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://257f2f34a26cbd57dfbbe04f21b95523f6102919a61dbbc07b1202665c442210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctz8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zjdhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:38Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.132265 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-98z2f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ffb94c3-624e-48aa-aaa9-450ace4e1862\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mn62f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mn62f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-98z2f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:38Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.134084 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.134117 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.134132 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.134148 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.134157 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:38Z","lastTransitionTime":"2025-12-09T10:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.147265 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea4578dd-5c3e-4509-b456-a42a642871d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b30d7f55d95c36f85285f235dcf2de31c04cc358d5cce4d49f9ea43945fd3f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de5d36b22bcd84f50dfb6ae8858f98665f3ae3981d5dc2233fa9e3b92db56b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa189df1ab85704e8528d42da3e500dba354bb99dace868af40a49fec5b19fa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaffe6e7f597ce14c1bc564a62ec71519af84fa5d220d14a5e8000653d396c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aaffe6e7f597ce14c1bc564a62ec71519af84fa5d220d14a5e8000653d396c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:38Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.159996 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:38Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.171444 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b484d6dfa4fffc554c21f9d9bc8e77a62e96ebb9426f2bb34dea4b2c9244d680\\\",\\\"image\\\":\\\"quay.io/crcont/o
penshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://301eb2b68a37707ea483536d639e5f476d3cb41ad8c60fbaea2e019fb51bad48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49d7b26bbe255f1217808981337d8190bdeac4f5008ee17df5242867e3103e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1209 10:01:25.592088 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 10:01:25.592207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 10:01:25.593930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3155801387/tls.crt::/tmp/serving-cert-3155801387/tls.key\\\\\\\"\\\\nI1209 10:01:25.900616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 10:01:25.902593 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 10:01:25.902611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 10:01:25.902628 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 10:01:25.902633 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 10:01:25.906542 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 10:01:25.906611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 10:01:25.906618 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 
10:01:25.906623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 10:01:25.906630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 10:01:25.906634 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 10:01:25.906639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 10:01:25.906563 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 10:01:25.907863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bd67b1b7ad3e21872e9aa91e61c978e0999546882ee2e21347d4f60e435e0b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T10:01:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T10:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T10:01:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:38Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.184184 5002 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab29aa6d19002efb0309c548e059e004c5002ccde634df95e3c2661a3e81207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcda2271939034d8c6c54fa3d648500ebda150fafcce9216338fad68552a65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:38Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.195870 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c149ed39076fc7ee5538e60dbc0a8fc303a21578e5cc3ac89a3aeaad2c21c6a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:38Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.207534 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d199b778c03f10fc3b1a2623600057801f54ee3240768ede1a79213b678fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T10:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:38Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.219263 5002 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T10:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T10:02:38Z is after 2025-08-24T17:21:41Z" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.237653 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.237691 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.237701 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.237715 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.237724 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:38Z","lastTransitionTime":"2025-12-09T10:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.258041 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-rgf44" podStartSLOduration=68.258021245 podStartE2EDuration="1m8.258021245s" podCreationTimestamp="2025-12-09 10:01:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:02:38.245118332 +0000 UTC m=+90.637169413" watchObservedRunningTime="2025-12-09 10:02:38.258021245 +0000 UTC m=+90.650072326" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.285582 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=68.285564717 podStartE2EDuration="1m8.285564717s" podCreationTimestamp="2025-12-09 10:01:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:02:38.284742076 +0000 UTC m=+90.676793167" watchObservedRunningTime="2025-12-09 10:02:38.285564717 +0000 UTC m=+90.677615818" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.286022 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=65.286015529 podStartE2EDuration="1m5.286015529s" podCreationTimestamp="2025-12-09 10:01:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:02:38.258380815 +0000 UTC m=+90.650431906" watchObservedRunningTime="2025-12-09 10:02:38.286015529 +0000 UTC m=+90.678066610" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.297391 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m689k" podStartSLOduration=67.297369691 podStartE2EDuration="1m7.297369691s" podCreationTimestamp="2025-12-09 10:01:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:02:38.297273849 +0000 UTC m=+90.689324920" watchObservedRunningTime="2025-12-09 10:02:38.297369691 +0000 UTC m=+90.689420772" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.310701 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=16.310665685 podStartE2EDuration="16.310665685s" podCreationTimestamp="2025-12-09 10:02:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:02:38.310334496 +0000 UTC m=+90.702385577" watchObservedRunningTime="2025-12-09 10:02:38.310665685 +0000 UTC m=+90.702716776" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.339384 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.339431 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.339444 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.339460 5002 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.339471 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:38Z","lastTransitionTime":"2025-12-09T10:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.441917 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.441964 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.441977 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.441993 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.442006 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:38Z","lastTransitionTime":"2025-12-09T10:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.543599 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.543633 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.543643 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.543656 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.543664 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:38Z","lastTransitionTime":"2025-12-09T10:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.650388 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.650465 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.650478 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.650499 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.650514 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:38Z","lastTransitionTime":"2025-12-09T10:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.752893 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.752967 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.752985 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.753011 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.753029 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:38Z","lastTransitionTime":"2025-12-09T10:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.856088 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.856132 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.856145 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.856165 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.856180 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:38Z","lastTransitionTime":"2025-12-09T10:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.958998 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.959067 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.959079 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.959096 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:38 crc kubenswrapper[5002]: I1209 10:02:38.959105 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:38Z","lastTransitionTime":"2025-12-09T10:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.060099 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.060185 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:02:39 crc kubenswrapper[5002]: E1209 10:02:39.061025 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:02:39 crc kubenswrapper[5002]: E1209 10:02:39.061139 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.061849 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.061924 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.061941 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.061954 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.061964 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:39Z","lastTransitionTime":"2025-12-09T10:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.165375 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.165428 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.165438 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.165459 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.165472 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:39Z","lastTransitionTime":"2025-12-09T10:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.268496 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.268562 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.268582 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.268611 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.268629 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:39Z","lastTransitionTime":"2025-12-09T10:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.372506 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.372588 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.372606 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.372629 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.372644 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:39Z","lastTransitionTime":"2025-12-09T10:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.474750 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.474852 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.474872 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.474896 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.474912 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:39Z","lastTransitionTime":"2025-12-09T10:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.578864 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.578949 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.578966 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.578983 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.578995 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:39Z","lastTransitionTime":"2025-12-09T10:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.681610 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.681676 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.681692 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.681717 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.681733 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:39Z","lastTransitionTime":"2025-12-09T10:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.784306 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.784361 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.784379 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.784402 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.784419 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:39Z","lastTransitionTime":"2025-12-09T10:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.887863 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.887921 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.887937 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.887960 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.887977 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:39Z","lastTransitionTime":"2025-12-09T10:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.991126 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.991182 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.991198 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.991219 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:39 crc kubenswrapper[5002]: I1209 10:02:39.991234 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:39Z","lastTransitionTime":"2025-12-09T10:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:40 crc kubenswrapper[5002]: I1209 10:02:40.060099 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:02:40 crc kubenswrapper[5002]: E1209 10:02:40.060259 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:02:40 crc kubenswrapper[5002]: I1209 10:02:40.060387 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:02:40 crc kubenswrapper[5002]: E1209 10:02:40.060613 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862" Dec 09 10:02:40 crc kubenswrapper[5002]: I1209 10:02:40.094188 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:40 crc kubenswrapper[5002]: I1209 10:02:40.094266 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:40 crc kubenswrapper[5002]: I1209 10:02:40.094285 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:40 crc kubenswrapper[5002]: I1209 10:02:40.094317 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:40 crc kubenswrapper[5002]: I1209 10:02:40.094336 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:40Z","lastTransitionTime":"2025-12-09T10:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:40 crc kubenswrapper[5002]: I1209 10:02:40.199369 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:40 crc kubenswrapper[5002]: I1209 10:02:40.199415 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:40 crc kubenswrapper[5002]: I1209 10:02:40.199427 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:40 crc kubenswrapper[5002]: I1209 10:02:40.199447 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:40 crc kubenswrapper[5002]: I1209 10:02:40.199459 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:40Z","lastTransitionTime":"2025-12-09T10:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:40 crc kubenswrapper[5002]: I1209 10:02:40.301916 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:40 crc kubenswrapper[5002]: I1209 10:02:40.301998 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:40 crc kubenswrapper[5002]: I1209 10:02:40.302014 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:40 crc kubenswrapper[5002]: I1209 10:02:40.302037 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:40 crc kubenswrapper[5002]: I1209 10:02:40.302051 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:40Z","lastTransitionTime":"2025-12-09T10:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:40 crc kubenswrapper[5002]: I1209 10:02:40.404877 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:40 crc kubenswrapper[5002]: I1209 10:02:40.404969 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:40 crc kubenswrapper[5002]: I1209 10:02:40.404992 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:40 crc kubenswrapper[5002]: I1209 10:02:40.405026 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:40 crc kubenswrapper[5002]: I1209 10:02:40.405045 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:40Z","lastTransitionTime":"2025-12-09T10:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:40 crc kubenswrapper[5002]: I1209 10:02:40.508133 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:40 crc kubenswrapper[5002]: I1209 10:02:40.508179 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:40 crc kubenswrapper[5002]: I1209 10:02:40.508189 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:40 crc kubenswrapper[5002]: I1209 10:02:40.508209 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:40 crc kubenswrapper[5002]: I1209 10:02:40.508220 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:40Z","lastTransitionTime":"2025-12-09T10:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:40 crc kubenswrapper[5002]: I1209 10:02:40.612614 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:40 crc kubenswrapper[5002]: I1209 10:02:40.612669 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:40 crc kubenswrapper[5002]: I1209 10:02:40.612684 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:40 crc kubenswrapper[5002]: I1209 10:02:40.612704 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:40 crc kubenswrapper[5002]: I1209 10:02:40.612717 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:40Z","lastTransitionTime":"2025-12-09T10:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:40 crc kubenswrapper[5002]: I1209 10:02:40.715310 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:40 crc kubenswrapper[5002]: I1209 10:02:40.715388 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:40 crc kubenswrapper[5002]: I1209 10:02:40.715406 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:40 crc kubenswrapper[5002]: I1209 10:02:40.715433 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:40 crc kubenswrapper[5002]: I1209 10:02:40.715450 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:40Z","lastTransitionTime":"2025-12-09T10:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:40 crc kubenswrapper[5002]: I1209 10:02:40.818243 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:40 crc kubenswrapper[5002]: I1209 10:02:40.818326 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:40 crc kubenswrapper[5002]: I1209 10:02:40.818366 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:40 crc kubenswrapper[5002]: I1209 10:02:40.818398 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:40 crc kubenswrapper[5002]: I1209 10:02:40.818428 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:40Z","lastTransitionTime":"2025-12-09T10:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 10:02:40 crc kubenswrapper[5002]: I1209 10:02:40.921414 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:40 crc kubenswrapper[5002]: I1209 10:02:40.921469 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:40 crc kubenswrapper[5002]: I1209 10:02:40.921479 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:40 crc kubenswrapper[5002]: I1209 10:02:40.921492 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:40 crc kubenswrapper[5002]: I1209 10:02:40.921501 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:40Z","lastTransitionTime":"2025-12-09T10:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:41 crc kubenswrapper[5002]: I1209 10:02:41.024642 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:41 crc kubenswrapper[5002]: I1209 10:02:41.024702 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:41 crc kubenswrapper[5002]: I1209 10:02:41.024721 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:41 crc kubenswrapper[5002]: I1209 10:02:41.024744 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:41 crc kubenswrapper[5002]: I1209 10:02:41.024760 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:41Z","lastTransitionTime":"2025-12-09T10:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:41 crc kubenswrapper[5002]: I1209 10:02:41.059214 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:02:41 crc kubenswrapper[5002]: E1209 10:02:41.059414 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:02:41 crc kubenswrapper[5002]: I1209 10:02:41.059656 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:02:41 crc kubenswrapper[5002]: E1209 10:02:41.059911 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:02:41 crc kubenswrapper[5002]: I1209 10:02:41.096531 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 10:02:41 crc kubenswrapper[5002]: I1209 10:02:41.096567 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 10:02:41 crc kubenswrapper[5002]: I1209 10:02:41.096583 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 10:02:41 crc kubenswrapper[5002]: I1209 10:02:41.096600 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 10:02:41 crc kubenswrapper[5002]: I1209 10:02:41.096610 5002 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T10:02:41Z","lastTransitionTime":"2025-12-09T10:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 10:02:41 crc kubenswrapper[5002]: I1209 10:02:41.148806 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-pvjhn"] Dec 09 10:02:41 crc kubenswrapper[5002]: I1209 10:02:41.149394 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pvjhn" Dec 09 10:02:41 crc kubenswrapper[5002]: I1209 10:02:41.150949 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 09 10:02:41 crc kubenswrapper[5002]: I1209 10:02:41.151643 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 09 10:02:41 crc kubenswrapper[5002]: I1209 10:02:41.151738 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 09 10:02:41 crc kubenswrapper[5002]: I1209 10:02:41.152623 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 09 10:02:41 crc kubenswrapper[5002]: I1209 10:02:41.163260 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-p7t88" podStartSLOduration=71.163244554 podStartE2EDuration="1m11.163244554s" podCreationTimestamp="2025-12-09 10:01:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:02:41.16160552 +0000 UTC m=+93.553656601" watchObservedRunningTime="2025-12-09 10:02:41.163244554 +0000 UTC m=+93.555295635" Dec 09 10:02:41 crc kubenswrapper[5002]: I1209 10:02:41.203739 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podStartSLOduration=71.20371844 podStartE2EDuration="1m11.20371844s" podCreationTimestamp="2025-12-09 10:01:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:02:41.194533635 +0000 UTC m=+93.586584716" watchObservedRunningTime="2025-12-09 10:02:41.20371844 +0000 UTC m=+93.595769531" Dec 09 10:02:41 crc kubenswrapper[5002]: I1209 10:02:41.207343 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-t5hm5" podStartSLOduration=71.207327625 podStartE2EDuration="1m11.207327625s" podCreationTimestamp="2025-12-09 10:01:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:02:41.203679068 +0000 UTC m=+93.595730149" watchObservedRunningTime="2025-12-09 10:02:41.207327625 +0000 UTC m=+93.599378706" Dec 09 10:02:41 crc kubenswrapper[5002]: I1209 10:02:41.234625 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-zjdhx" podStartSLOduration=71.234603221 podStartE2EDuration="1m11.234603221s" podCreationTimestamp="2025-12-09 10:01:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:02:41.220165467 +0000 UTC m=+93.612216548" watchObservedRunningTime="2025-12-09 10:02:41.234603221 +0000 UTC m=+93.626654302" Dec 09 10:02:41 crc kubenswrapper[5002]: I1209 10:02:41.247721 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=42.247702019 podStartE2EDuration="42.247702019s" podCreationTimestamp="2025-12-09 10:01:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:02:41.247071982 +0000 UTC m=+93.639123093" watchObservedRunningTime="2025-12-09 10:02:41.247702019 +0000 UTC m=+93.639753100" Dec 09 10:02:41 crc kubenswrapper[5002]: I1209 10:02:41.258853 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/df998d36-4d4b-44e1-a3c3-60f878d11eb8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-pvjhn\" (UID: \"df998d36-4d4b-44e1-a3c3-60f878d11eb8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pvjhn" Dec 09 10:02:41 crc kubenswrapper[5002]: I1209 10:02:41.258902 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/df998d36-4d4b-44e1-a3c3-60f878d11eb8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-pvjhn\" (UID: \"df998d36-4d4b-44e1-a3c3-60f878d11eb8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pvjhn" Dec 09 10:02:41 crc kubenswrapper[5002]: I1209 10:02:41.258975 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/df998d36-4d4b-44e1-a3c3-60f878d11eb8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-pvjhn\" (UID: \"df998d36-4d4b-44e1-a3c3-60f878d11eb8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pvjhn" Dec 09 10:02:41 crc kubenswrapper[5002]: I1209 10:02:41.259012 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/df998d36-4d4b-44e1-a3c3-60f878d11eb8-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-pvjhn\" (UID: \"df998d36-4d4b-44e1-a3c3-60f878d11eb8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pvjhn" Dec 09 10:02:41 crc kubenswrapper[5002]: I1209 10:02:41.259036 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df998d36-4d4b-44e1-a3c3-60f878d11eb8-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-pvjhn\" (UID: \"df998d36-4d4b-44e1-a3c3-60f878d11eb8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pvjhn" Dec 09 10:02:41 crc kubenswrapper[5002]: I1209 10:02:41.274292 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=75.274274935 podStartE2EDuration="1m15.274274935s" podCreationTimestamp="2025-12-09 10:01:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:02:41.273806612 +0000 UTC m=+93.665857693" watchObservedRunningTime="2025-12-09 10:02:41.274274935 +0000 UTC m=+93.666326016" Dec 09 10:02:41 crc kubenswrapper[5002]: I1209 10:02:41.359954 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/df998d36-4d4b-44e1-a3c3-60f878d11eb8-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-pvjhn\" (UID: \"df998d36-4d4b-44e1-a3c3-60f878d11eb8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pvjhn" Dec 09 10:02:41 crc kubenswrapper[5002]: I1209 10:02:41.360004 5002 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df998d36-4d4b-44e1-a3c3-60f878d11eb8-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-pvjhn\" (UID: \"df998d36-4d4b-44e1-a3c3-60f878d11eb8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pvjhn" Dec 09 10:02:41 crc kubenswrapper[5002]: I1209 10:02:41.360032 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/df998d36-4d4b-44e1-a3c3-60f878d11eb8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-pvjhn\" (UID: \"df998d36-4d4b-44e1-a3c3-60f878d11eb8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pvjhn" Dec 09 10:02:41 crc kubenswrapper[5002]: I1209 10:02:41.360055 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/df998d36-4d4b-44e1-a3c3-60f878d11eb8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-pvjhn\" (UID: \"df998d36-4d4b-44e1-a3c3-60f878d11eb8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pvjhn" Dec 09 10:02:41 crc kubenswrapper[5002]: I1209 10:02:41.360123 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/df998d36-4d4b-44e1-a3c3-60f878d11eb8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-pvjhn\" (UID: \"df998d36-4d4b-44e1-a3c3-60f878d11eb8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pvjhn" Dec 09 10:02:41 crc kubenswrapper[5002]: I1209 10:02:41.360176 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/df998d36-4d4b-44e1-a3c3-60f878d11eb8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-pvjhn\" (UID: \"df998d36-4d4b-44e1-a3c3-60f878d11eb8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pvjhn" Dec 09 10:02:41 crc kubenswrapper[5002]: I1209 10:02:41.360203 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/df998d36-4d4b-44e1-a3c3-60f878d11eb8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-pvjhn\" (UID: \"df998d36-4d4b-44e1-a3c3-60f878d11eb8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pvjhn" Dec 09 10:02:41 crc kubenswrapper[5002]: I1209 10:02:41.361004 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/df998d36-4d4b-44e1-a3c3-60f878d11eb8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-pvjhn\" (UID: \"df998d36-4d4b-44e1-a3c3-60f878d11eb8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pvjhn" Dec 09 10:02:41 crc kubenswrapper[5002]: I1209 10:02:41.365786 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df998d36-4d4b-44e1-a3c3-60f878d11eb8-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-pvjhn\" (UID: \"df998d36-4d4b-44e1-a3c3-60f878d11eb8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pvjhn" Dec 09 10:02:41 crc kubenswrapper[5002]: I1209 10:02:41.377063 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/df998d36-4d4b-44e1-a3c3-60f878d11eb8-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-pvjhn\" (UID: \"df998d36-4d4b-44e1-a3c3-60f878d11eb8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pvjhn" Dec 09 10:02:41 crc kubenswrapper[5002]: I1209 10:02:41.464550 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pvjhn" Dec 09 10:02:41 crc kubenswrapper[5002]: W1209 10:02:41.485000 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf998d36_4d4b_44e1_a3c3_60f878d11eb8.slice/crio-064ca7fffc8284ca33e1f24037c8637910458cb550e2dd594c245520c0d95c13 WatchSource:0}: Error finding container 064ca7fffc8284ca33e1f24037c8637910458cb550e2dd594c245520c0d95c13: Status 404 returned error can't find the container with id 064ca7fffc8284ca33e1f24037c8637910458cb550e2dd594c245520c0d95c13 Dec 09 10:02:42 crc kubenswrapper[5002]: I1209 10:02:42.001309 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pvjhn" event={"ID":"df998d36-4d4b-44e1-a3c3-60f878d11eb8","Type":"ContainerStarted","Data":"ef08a994eb0162dcd12738df9b73531a1a5a4f64a1e10bfc8b28082a865a1a54"} Dec 09 10:02:42 crc kubenswrapper[5002]: I1209 10:02:42.001741 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pvjhn" event={"ID":"df998d36-4d4b-44e1-a3c3-60f878d11eb8","Type":"ContainerStarted","Data":"064ca7fffc8284ca33e1f24037c8637910458cb550e2dd594c245520c0d95c13"} Dec 09 10:02:42 crc kubenswrapper[5002]: I1209 10:02:42.059468 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:02:42 crc kubenswrapper[5002]: E1209 10:02:42.059625 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862" Dec 09 10:02:42 crc kubenswrapper[5002]: I1209 10:02:42.059922 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:02:42 crc kubenswrapper[5002]: E1209 10:02:42.060515 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:02:43 crc kubenswrapper[5002]: I1209 10:02:43.060193 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:02:43 crc kubenswrapper[5002]: I1209 10:02:43.060245 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:02:43 crc kubenswrapper[5002]: E1209 10:02:43.060399 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:02:43 crc kubenswrapper[5002]: E1209 10:02:43.060873 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:02:43 crc kubenswrapper[5002]: I1209 10:02:43.061355 5002 scope.go:117] "RemoveContainer" containerID="d793531ca62af532bfc4e6f7582de7db1e219e2329be5c82f2e30bb11df1b871" Dec 09 10:02:43 crc kubenswrapper[5002]: E1209 10:02:43.061582 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-2mnnl_openshift-ovn-kubernetes(7013527e-73de-4427-af9c-e33663b1c222)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" podUID="7013527e-73de-4427-af9c-e33663b1c222" Dec 09 10:02:44 crc kubenswrapper[5002]: I1209 10:02:44.059614 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:02:44 crc kubenswrapper[5002]: I1209 10:02:44.059768 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:02:44 crc kubenswrapper[5002]: E1209 10:02:44.059913 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:02:44 crc kubenswrapper[5002]: E1209 10:02:44.060065 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862" Dec 09 10:02:45 crc kubenswrapper[5002]: I1209 10:02:45.059289 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:02:45 crc kubenswrapper[5002]: I1209 10:02:45.059325 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:02:45 crc kubenswrapper[5002]: E1209 10:02:45.059415 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:02:45 crc kubenswrapper[5002]: E1209 10:02:45.059503 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:02:46 crc kubenswrapper[5002]: I1209 10:02:46.060347 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:02:46 crc kubenswrapper[5002]: I1209 10:02:46.060427 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:02:46 crc kubenswrapper[5002]: E1209 10:02:46.060550 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:02:46 crc kubenswrapper[5002]: E1209 10:02:46.060611 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862" Dec 09 10:02:47 crc kubenswrapper[5002]: I1209 10:02:47.059939 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:02:47 crc kubenswrapper[5002]: I1209 10:02:47.059939 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:02:47 crc kubenswrapper[5002]: E1209 10:02:47.060123 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:02:47 crc kubenswrapper[5002]: E1209 10:02:47.060290 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:02:48 crc kubenswrapper[5002]: I1209 10:02:48.059861 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:02:48 crc kubenswrapper[5002]: I1209 10:02:48.059985 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:02:48 crc kubenswrapper[5002]: E1209 10:02:48.061963 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862" Dec 09 10:02:48 crc kubenswrapper[5002]: E1209 10:02:48.062261 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:02:49 crc kubenswrapper[5002]: I1209 10:02:49.059316 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:02:49 crc kubenswrapper[5002]: I1209 10:02:49.059407 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:02:49 crc kubenswrapper[5002]: E1209 10:02:49.059531 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:02:49 crc kubenswrapper[5002]: E1209 10:02:49.059618 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:02:49 crc kubenswrapper[5002]: I1209 10:02:49.851208 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ffb94c3-624e-48aa-aaa9-450ace4e1862-metrics-certs\") pod \"network-metrics-daemon-98z2f\" (UID: \"7ffb94c3-624e-48aa-aaa9-450ace4e1862\") " pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:02:49 crc kubenswrapper[5002]: E1209 10:02:49.851353 5002 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 10:02:49 crc kubenswrapper[5002]: E1209 10:02:49.851698 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ffb94c3-624e-48aa-aaa9-450ace4e1862-metrics-certs podName:7ffb94c3-624e-48aa-aaa9-450ace4e1862 nodeName:}" failed. No retries permitted until 2025-12-09 10:03:53.851677916 +0000 UTC m=+166.243728997 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7ffb94c3-624e-48aa-aaa9-450ace4e1862-metrics-certs") pod "network-metrics-daemon-98z2f" (UID: "7ffb94c3-624e-48aa-aaa9-450ace4e1862") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 10:02:50 crc kubenswrapper[5002]: I1209 10:02:50.060281 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:02:50 crc kubenswrapper[5002]: I1209 10:02:50.060391 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:02:50 crc kubenswrapper[5002]: E1209 10:02:50.060886 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862" Dec 09 10:02:50 crc kubenswrapper[5002]: E1209 10:02:50.061736 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:02:51 crc kubenswrapper[5002]: I1209 10:02:51.059163 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:02:51 crc kubenswrapper[5002]: I1209 10:02:51.059175 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:02:51 crc kubenswrapper[5002]: E1209 10:02:51.059395 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:02:51 crc kubenswrapper[5002]: E1209 10:02:51.059624 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:02:52 crc kubenswrapper[5002]: I1209 10:02:52.059799 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:02:52 crc kubenswrapper[5002]: I1209 10:02:52.059897 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:02:52 crc kubenswrapper[5002]: E1209 10:02:52.060095 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:02:52 crc kubenswrapper[5002]: E1209 10:02:52.060243 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862" Dec 09 10:02:53 crc kubenswrapper[5002]: I1209 10:02:53.059508 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:02:53 crc kubenswrapper[5002]: I1209 10:02:53.059534 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:02:53 crc kubenswrapper[5002]: E1209 10:02:53.059647 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:02:53 crc kubenswrapper[5002]: E1209 10:02:53.059797 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:02:54 crc kubenswrapper[5002]: I1209 10:02:54.059966 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:02:54 crc kubenswrapper[5002]: I1209 10:02:54.060085 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:02:54 crc kubenswrapper[5002]: E1209 10:02:54.060158 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862" Dec 09 10:02:54 crc kubenswrapper[5002]: E1209 10:02:54.060603 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:02:54 crc kubenswrapper[5002]: I1209 10:02:54.060928 5002 scope.go:117] "RemoveContainer" containerID="d793531ca62af532bfc4e6f7582de7db1e219e2329be5c82f2e30bb11df1b871" Dec 09 10:02:54 crc kubenswrapper[5002]: E1209 10:02:54.061086 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-2mnnl_openshift-ovn-kubernetes(7013527e-73de-4427-af9c-e33663b1c222)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" podUID="7013527e-73de-4427-af9c-e33663b1c222" Dec 09 10:02:55 crc kubenswrapper[5002]: I1209 10:02:55.060041 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:02:55 crc kubenswrapper[5002]: I1209 10:02:55.060112 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:02:55 crc kubenswrapper[5002]: E1209 10:02:55.060175 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:02:55 crc kubenswrapper[5002]: E1209 10:02:55.060234 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:02:56 crc kubenswrapper[5002]: I1209 10:02:56.059750 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:02:56 crc kubenswrapper[5002]: I1209 10:02:56.059750 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:02:56 crc kubenswrapper[5002]: E1209 10:02:56.060055 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:02:56 crc kubenswrapper[5002]: E1209 10:02:56.059926 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862" Dec 09 10:02:57 crc kubenswrapper[5002]: I1209 10:02:57.059531 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:02:57 crc kubenswrapper[5002]: I1209 10:02:57.059565 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:02:57 crc kubenswrapper[5002]: E1209 10:02:57.059674 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:02:57 crc kubenswrapper[5002]: E1209 10:02:57.059891 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:02:58 crc kubenswrapper[5002]: I1209 10:02:58.060211 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:02:58 crc kubenswrapper[5002]: I1209 10:02:58.061498 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:02:58 crc kubenswrapper[5002]: E1209 10:02:58.061796 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862" Dec 09 10:02:58 crc kubenswrapper[5002]: E1209 10:02:58.061873 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:02:59 crc kubenswrapper[5002]: I1209 10:02:59.059738 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:02:59 crc kubenswrapper[5002]: I1209 10:02:59.059881 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:02:59 crc kubenswrapper[5002]: E1209 10:02:59.059940 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:02:59 crc kubenswrapper[5002]: E1209 10:02:59.060050 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:03:00 crc kubenswrapper[5002]: I1209 10:03:00.059273 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:03:00 crc kubenswrapper[5002]: I1209 10:03:00.059418 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:03:00 crc kubenswrapper[5002]: E1209 10:03:00.059564 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862" Dec 09 10:03:00 crc kubenswrapper[5002]: E1209 10:03:00.059789 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:03:01 crc kubenswrapper[5002]: I1209 10:03:01.059941 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:03:01 crc kubenswrapper[5002]: I1209 10:03:01.059990 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:03:01 crc kubenswrapper[5002]: E1209 10:03:01.060088 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:03:01 crc kubenswrapper[5002]: E1209 10:03:01.060230 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:03:02 crc kubenswrapper[5002]: I1209 10:03:02.059779 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:03:02 crc kubenswrapper[5002]: E1209 10:03:02.060021 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:03:02 crc kubenswrapper[5002]: I1209 10:03:02.060124 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:03:02 crc kubenswrapper[5002]: E1209 10:03:02.060326 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862" Dec 09 10:03:03 crc kubenswrapper[5002]: I1209 10:03:03.060141 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:03:03 crc kubenswrapper[5002]: I1209 10:03:03.060148 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:03:03 crc kubenswrapper[5002]: E1209 10:03:03.060337 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:03:03 crc kubenswrapper[5002]: E1209 10:03:03.060436 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:03:04 crc kubenswrapper[5002]: I1209 10:03:04.059762 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:03:04 crc kubenswrapper[5002]: E1209 10:03:04.059949 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862" Dec 09 10:03:04 crc kubenswrapper[5002]: I1209 10:03:04.060453 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:03:04 crc kubenswrapper[5002]: E1209 10:03:04.060568 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:03:05 crc kubenswrapper[5002]: I1209 10:03:05.060274 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:03:05 crc kubenswrapper[5002]: I1209 10:03:05.060546 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:03:05 crc kubenswrapper[5002]: E1209 10:03:05.060694 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:03:05 crc kubenswrapper[5002]: E1209 10:03:05.061151 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:03:05 crc kubenswrapper[5002]: I1209 10:03:05.079906 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rgf44_28ed6e93-eda5-4648-b185-25d2960ce0f0/kube-multus/1.log" Dec 09 10:03:05 crc kubenswrapper[5002]: I1209 10:03:05.080723 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rgf44_28ed6e93-eda5-4648-b185-25d2960ce0f0/kube-multus/0.log" Dec 09 10:03:05 crc kubenswrapper[5002]: I1209 10:03:05.080800 5002 generic.go:334] "Generic (PLEG): container finished" podID="28ed6e93-eda5-4648-b185-25d2960ce0f0" containerID="fdfff3a756cc7a96ff82f208e082d8c283f7d558a14540a6953bcdc664b63f23" exitCode=1 Dec 09 10:03:05 crc kubenswrapper[5002]: I1209 10:03:05.080885 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rgf44" event={"ID":"28ed6e93-eda5-4648-b185-25d2960ce0f0","Type":"ContainerDied","Data":"fdfff3a756cc7a96ff82f208e082d8c283f7d558a14540a6953bcdc664b63f23"} Dec 09 10:03:05 crc kubenswrapper[5002]: I1209 10:03:05.080976 5002 scope.go:117] "RemoveContainer" containerID="541795eab5847f1d993b8f9f324b454b951ed3930155e455f013e8da805c019b" Dec 09 10:03:05 crc kubenswrapper[5002]: I1209 10:03:05.081745 5002 scope.go:117] "RemoveContainer" containerID="fdfff3a756cc7a96ff82f208e082d8c283f7d558a14540a6953bcdc664b63f23" Dec 09 10:03:05 crc kubenswrapper[5002]: E1209 10:03:05.082341 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-rgf44_openshift-multus(28ed6e93-eda5-4648-b185-25d2960ce0f0)\"" pod="openshift-multus/multus-rgf44" podUID="28ed6e93-eda5-4648-b185-25d2960ce0f0" Dec 09 10:03:05 crc kubenswrapper[5002]: I1209 10:03:05.101007 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pvjhn" podStartSLOduration=95.100986592 podStartE2EDuration="1m35.100986592s" podCreationTimestamp="2025-12-09 10:01:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:02:42.023866139 +0000 UTC m=+94.415917280" watchObservedRunningTime="2025-12-09 10:03:05.100986592 +0000 UTC m=+117.493037673" Dec 09 10:03:06 crc kubenswrapper[5002]: I1209 10:03:06.059682 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:03:06 crc kubenswrapper[5002]: I1209 10:03:06.059764 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:03:06 crc kubenswrapper[5002]: E1209 10:03:06.059887 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:03:06 crc kubenswrapper[5002]: E1209 10:03:06.059940 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862" Dec 09 10:03:06 crc kubenswrapper[5002]: I1209 10:03:06.085618 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rgf44_28ed6e93-eda5-4648-b185-25d2960ce0f0/kube-multus/1.log" Dec 09 10:03:07 crc kubenswrapper[5002]: I1209 10:03:07.059449 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:03:07 crc kubenswrapper[5002]: I1209 10:03:07.059449 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:03:07 crc kubenswrapper[5002]: E1209 10:03:07.059630 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:03:07 crc kubenswrapper[5002]: E1209 10:03:07.059730 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:03:08 crc kubenswrapper[5002]: E1209 10:03:08.048808 5002 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 09 10:03:08 crc kubenswrapper[5002]: I1209 10:03:08.061728 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:03:08 crc kubenswrapper[5002]: I1209 10:03:08.061775 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:03:08 crc kubenswrapper[5002]: E1209 10:03:08.061918 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862" Dec 09 10:03:08 crc kubenswrapper[5002]: E1209 10:03:08.062134 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:03:08 crc kubenswrapper[5002]: I1209 10:03:08.062796 5002 scope.go:117] "RemoveContainer" containerID="d793531ca62af532bfc4e6f7582de7db1e219e2329be5c82f2e30bb11df1b871" Dec 09 10:03:08 crc kubenswrapper[5002]: E1209 10:03:08.138358 5002 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 09 10:03:08 crc kubenswrapper[5002]: I1209 10:03:08.846728 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-98z2f"] Dec 09 10:03:08 crc kubenswrapper[5002]: I1209 10:03:08.846911 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:03:08 crc kubenswrapper[5002]: E1209 10:03:08.847043 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862" Dec 09 10:03:09 crc kubenswrapper[5002]: I1209 10:03:09.059616 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:03:09 crc kubenswrapper[5002]: I1209 10:03:09.059631 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:03:09 crc kubenswrapper[5002]: E1209 10:03:09.059751 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:03:09 crc kubenswrapper[5002]: E1209 10:03:09.059937 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:03:09 crc kubenswrapper[5002]: I1209 10:03:09.106050 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2mnnl_7013527e-73de-4427-af9c-e33663b1c222/ovnkube-controller/3.log" Dec 09 10:03:09 crc kubenswrapper[5002]: I1209 10:03:09.109510 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" event={"ID":"7013527e-73de-4427-af9c-e33663b1c222","Type":"ContainerStarted","Data":"e4d7c92bb65c67935a65e2b8f0a344e9dc8d267be7e8dafcfb7b61a61ab1e424"} Dec 09 10:03:09 crc kubenswrapper[5002]: I1209 10:03:09.110240 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:03:09 crc kubenswrapper[5002]: I1209 10:03:09.134176 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" podStartSLOduration=99.13415924 podStartE2EDuration="1m39.13415924s" podCreationTimestamp="2025-12-09 10:01:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:03:09.133182074 +0000 UTC m=+121.525233165" watchObservedRunningTime="2025-12-09 10:03:09.13415924 +0000 UTC m=+121.526210321" Dec 09 10:03:10 crc kubenswrapper[5002]: I1209 10:03:10.059697 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:03:10 crc kubenswrapper[5002]: E1209 10:03:10.059913 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:03:11 crc kubenswrapper[5002]: I1209 10:03:11.059688 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:03:11 crc kubenswrapper[5002]: I1209 10:03:11.059776 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:03:11 crc kubenswrapper[5002]: E1209 10:03:11.059850 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:03:11 crc kubenswrapper[5002]: I1209 10:03:11.059782 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:03:11 crc kubenswrapper[5002]: E1209 10:03:11.059904 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862" Dec 09 10:03:11 crc kubenswrapper[5002]: E1209 10:03:11.059923 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:03:12 crc kubenswrapper[5002]: I1209 10:03:12.059438 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:03:12 crc kubenswrapper[5002]: E1209 10:03:12.059606 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:03:13 crc kubenswrapper[5002]: I1209 10:03:13.060046 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:03:13 crc kubenswrapper[5002]: I1209 10:03:13.060044 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:03:13 crc kubenswrapper[5002]: E1209 10:03:13.060231 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:03:13 crc kubenswrapper[5002]: I1209 10:03:13.060356 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:03:13 crc kubenswrapper[5002]: E1209 10:03:13.060452 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:03:13 crc kubenswrapper[5002]: E1209 10:03:13.060636 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
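Every failure above traces back to the single condition reported by kubelet.go:2916: no CNI configuration file under /etc/kubernetes/cni/net.d/. A hedged sketch of the on-node check an operator might script; the directory path is taken from the log, while the file extensions are an assumption based on common CNI config naming:

# Check whether any CNI config files exist in the directory the kubelet is watching.
import os

CNI_DIR = "/etc/kubernetes/cni/net.d"
confs = sorted(
    f for f in os.listdir(CNI_DIR)
    if f.endswith((".conf", ".conflist", ".json"))
) if os.path.isdir(CNI_DIR) else []
print("CNI config files:", confs or "none (network plugin not ready)")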
pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862" Dec 09 10:03:13 crc kubenswrapper[5002]: E1209 10:03:13.140181 5002 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 09 10:03:14 crc kubenswrapper[5002]: I1209 10:03:14.060265 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:03:14 crc kubenswrapper[5002]: E1209 10:03:14.060671 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:03:15 crc kubenswrapper[5002]: I1209 10:03:15.060106 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:03:15 crc kubenswrapper[5002]: E1209 10:03:15.060246 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:03:15 crc kubenswrapper[5002]: I1209 10:03:15.060263 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:03:15 crc kubenswrapper[5002]: I1209 10:03:15.060287 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:03:15 crc kubenswrapper[5002]: E1209 10:03:15.060372 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862" Dec 09 10:03:15 crc kubenswrapper[5002]: E1209 10:03:15.060441 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:03:16 crc kubenswrapper[5002]: I1209 10:03:16.059645 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:03:16 crc kubenswrapper[5002]: E1209 10:03:16.059866 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:03:17 crc kubenswrapper[5002]: I1209 10:03:17.059538 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:03:17 crc kubenswrapper[5002]: I1209 10:03:17.059577 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:03:17 crc kubenswrapper[5002]: I1209 10:03:17.059651 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:03:17 crc kubenswrapper[5002]: E1209 10:03:17.059733 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:03:17 crc kubenswrapper[5002]: E1209 10:03:17.059881 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:03:17 crc kubenswrapper[5002]: E1209 10:03:17.059966 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862" Dec 09 10:03:18 crc kubenswrapper[5002]: I1209 10:03:18.059259 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:03:18 crc kubenswrapper[5002]: E1209 10:03:18.061936 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:03:18 crc kubenswrapper[5002]: E1209 10:03:18.140702 5002 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Dec 09 10:03:19 crc kubenswrapper[5002]: I1209 10:03:19.060044 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:03:19 crc kubenswrapper[5002]: I1209 10:03:19.060138 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:03:19 crc kubenswrapper[5002]: I1209 10:03:19.060060 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:03:19 crc kubenswrapper[5002]: E1209 10:03:19.060225 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:03:19 crc kubenswrapper[5002]: E1209 10:03:19.060364 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:03:19 crc kubenswrapper[5002]: E1209 10:03:19.060580 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862" Dec 09 10:03:20 crc kubenswrapper[5002]: I1209 10:03:20.059706 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:03:20 crc kubenswrapper[5002]: E1209 10:03:20.060161 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:03:20 crc kubenswrapper[5002]: I1209 10:03:20.060200 5002 scope.go:117] "RemoveContainer" containerID="fdfff3a756cc7a96ff82f208e082d8c283f7d558a14540a6953bcdc664b63f23" Dec 09 10:03:21 crc kubenswrapper[5002]: I1209 10:03:21.059410 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:03:21 crc kubenswrapper[5002]: I1209 10:03:21.059440 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:03:21 crc kubenswrapper[5002]: E1209 10:03:21.059545 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:03:21 crc kubenswrapper[5002]: E1209 10:03:21.059781 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862" Dec 09 10:03:21 crc kubenswrapper[5002]: I1209 10:03:21.060030 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:03:21 crc kubenswrapper[5002]: E1209 10:03:21.060175 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:03:21 crc kubenswrapper[5002]: I1209 10:03:21.146282 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rgf44_28ed6e93-eda5-4648-b185-25d2960ce0f0/kube-multus/1.log" Dec 09 10:03:21 crc kubenswrapper[5002]: I1209 10:03:21.146344 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rgf44" event={"ID":"28ed6e93-eda5-4648-b185-25d2960ce0f0","Type":"ContainerStarted","Data":"687b3ce71d2c60749a207725755ffed8681eeb08793418478b64111660f7d33d"} Dec 09 10:03:22 crc kubenswrapper[5002]: I1209 10:03:22.060145 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:03:22 crc kubenswrapper[5002]: E1209 10:03:22.060354 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 10:03:23 crc kubenswrapper[5002]: I1209 10:03:23.059670 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:03:23 crc kubenswrapper[5002]: I1209 10:03:23.059657 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:03:23 crc kubenswrapper[5002]: E1209 10:03:23.060344 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 10:03:23 crc kubenswrapper[5002]: I1209 10:03:23.059731 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:03:23 crc kubenswrapper[5002]: E1209 10:03:23.060476 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 10:03:23 crc kubenswrapper[5002]: E1209 10:03:23.060630 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z2f" podUID="7ffb94c3-624e-48aa-aaa9-450ace4e1862" Dec 09 10:03:24 crc kubenswrapper[5002]: I1209 10:03:24.060220 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:03:24 crc kubenswrapper[5002]: I1209 10:03:24.062967 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 09 10:03:24 crc kubenswrapper[5002]: I1209 10:03:24.063118 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 09 10:03:25 crc kubenswrapper[5002]: I1209 10:03:25.059280 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:03:25 crc kubenswrapper[5002]: I1209 10:03:25.059353 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:03:25 crc kubenswrapper[5002]: I1209 10:03:25.059406 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:03:25 crc kubenswrapper[5002]: I1209 10:03:25.062538 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 09 10:03:25 crc kubenswrapper[5002]: I1209 10:03:25.062663 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 09 10:03:25 crc kubenswrapper[5002]: I1209 10:03:25.062666 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 09 10:03:25 crc kubenswrapper[5002]: I1209 10:03:25.062754 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 09 10:03:27 crc kubenswrapper[5002]: I1209 10:03:27.920156 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.046657 5002 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.090356 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-99l4h"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.092153 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-nmgxj"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.092550 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-99l4h" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.092894 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nmgxj" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.093222 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-np52q"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.093951 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-np52q" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.094436 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-k6s7s"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.095244 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-k6s7s" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.096340 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-899rx"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.096936 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-cmm6f"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.097085 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-899rx" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.097879 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9fjq6"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.098268 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.098317 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-cmm6f" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.098670 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-n967v"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.099163 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-n967v" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.100652 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-xp9jk"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.101456 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-xp9jk" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.102434 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.105870 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.106025 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.106192 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.106533 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.106810 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.107075 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.107149 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.108063 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.111153 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.111490 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.111674 5002 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-authentication-operator/authentication-operator-69f744f599-rwv2p"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.112132 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-c9pnt"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.112432 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-79h6t"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.112924 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.112966 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-rwv2p" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.113296 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c9pnt" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.112931 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-79h6t" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.113785 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zvz4c"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.113898 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.114009 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.114017 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.114119 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.114242 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.114301 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zvz4c" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.115087 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.115546 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.115563 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.116551 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.116961 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.117406 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.117613 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.117791 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.118747 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.119082 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.119297 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.119529 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.119889 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.120382 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.120632 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.120920 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.121362 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.121534 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.121709 5002 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-oauth-apiserver"/"etcd-client" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.121934 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.122146 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.123296 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.123541 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.123743 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.124049 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.124766 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.125013 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.125203 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.125335 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.125430 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.125633 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.125740 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.125870 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.125967 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.125967 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-mtwnb"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.126338 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.126395 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.126482 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.126524 
5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.126648 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.125013 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.126350 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.126753 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.125051 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.126689 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.127180 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wxfc6"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.127393 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-mtwnb" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.127648 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2s9w5"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.127897 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wxfc6" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.128077 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae53c0af-26d8-4db2-8885-86b643bf2ee1-client-ca\") pod \"controller-manager-879f6c89f-k6s7s\" (UID: \"ae53c0af-26d8-4db2-8885-86b643bf2ee1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k6s7s" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.128118 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae53c0af-26d8-4db2-8885-86b643bf2ee1-serving-cert\") pod \"controller-manager-879f6c89f-k6s7s\" (UID: \"ae53c0af-26d8-4db2-8885-86b643bf2ee1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k6s7s" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.128141 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ca986cd-d4df-4af5-8918-33f445a52f49-serving-cert\") pod \"openshift-config-operator-7777fb866f-899rx\" (UID: \"2ca986cd-d4df-4af5-8918-33f445a52f49\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-899rx" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.128167 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2fp5\" (UniqueName: \"kubernetes.io/projected/8089a4b0-6e30-4f86-a14c-17435b2eaab8-kube-api-access-p2fp5\") pod \"console-operator-58897d9998-mtwnb\" (UID: \"8089a4b0-6e30-4f86-a14c-17435b2eaab8\") " pod="openshift-console-operator/console-operator-58897d9998-mtwnb" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.128189 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr4w7\" (UniqueName: \"kubernetes.io/projected/e2598866-d004-41a0-b058-01fb9a379df5-kube-api-access-jr4w7\") pod \"oauth-openshift-558db77b4-9fjq6\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.128212 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84005632-49e7-400a-8748-5f16909cfda1-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-nmgxj\" (UID: \"84005632-49e7-400a-8748-5f16909cfda1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nmgxj" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.128232 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8089a4b0-6e30-4f86-a14c-17435b2eaab8-config\") pod \"console-operator-58897d9998-mtwnb\" (UID: \"8089a4b0-6e30-4f86-a14c-17435b2eaab8\") " pod="openshift-console-operator/console-operator-58897d9998-mtwnb" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.128255 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ae53c0af-26d8-4db2-8885-86b643bf2ee1-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-k6s7s\" (UID: \"ae53c0af-26d8-4db2-8885-86b643bf2ee1\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-k6s7s" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.128276 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66daadf1-5a22-4a90-a004-68fc7848d557-config\") pod \"authentication-operator-69f744f599-rwv2p\" (UID: \"66daadf1-5a22-4a90-a004-68fc7848d557\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rwv2p" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.128296 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fef799ad-c87c-4f60-9b8e-afb51958013d-config\") pod \"route-controller-manager-6576b87f9c-c9pnt\" (UID: \"fef799ad-c87c-4f60-9b8e-afb51958013d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c9pnt" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.128317 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/84005632-49e7-400a-8748-5f16909cfda1-encryption-config\") pod \"apiserver-7bbb656c7d-nmgxj\" (UID: \"84005632-49e7-400a-8748-5f16909cfda1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nmgxj" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.128341 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pfj8\" (UniqueName: \"kubernetes.io/projected/ae53c0af-26d8-4db2-8885-86b643bf2ee1-kube-api-access-4pfj8\") pod \"controller-manager-879f6c89f-k6s7s\" (UID: \"ae53c0af-26d8-4db2-8885-86b643bf2ee1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k6s7s" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.128364 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/dcdecdfe-61c0-4d85-99b5-e1fe25727259-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-cmm6f\" (UID: \"dcdecdfe-61c0-4d85-99b5-e1fe25727259\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cmm6f" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.128680 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwzvz\" (UniqueName: \"kubernetes.io/projected/95a7b196-74b4-4d67-a0e0-3e4b92e468ed-kube-api-access-nwzvz\") pod \"console-f9d7485db-n967v\" (UID: \"95a7b196-74b4-4d67-a0e0-3e4b92e468ed\") " pod="openshift-console/console-f9d7485db-n967v" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.128751 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcdecdfe-61c0-4d85-99b5-e1fe25727259-config\") pod \"machine-api-operator-5694c8668f-cmm6f\" (UID: \"dcdecdfe-61c0-4d85-99b5-e1fe25727259\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cmm6f" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.128776 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-9fjq6\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.128796 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fef799ad-c87c-4f60-9b8e-afb51958013d-serving-cert\") pod \"route-controller-manager-6576b87f9c-c9pnt\" (UID: \"fef799ad-c87c-4f60-9b8e-afb51958013d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c9pnt" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.128881 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/84005632-49e7-400a-8748-5f16909cfda1-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-nmgxj\" (UID: \"84005632-49e7-400a-8748-5f16909cfda1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nmgxj" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.128950 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmb4z\" (UniqueName: \"kubernetes.io/projected/fef799ad-c87c-4f60-9b8e-afb51958013d-kube-api-access-kmb4z\") pod \"route-controller-manager-6576b87f9c-c9pnt\" (UID: \"fef799ad-c87c-4f60-9b8e-afb51958013d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c9pnt" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129009 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/95a7b196-74b4-4d67-a0e0-3e4b92e468ed-console-serving-cert\") pod \"console-f9d7485db-n967v\" (UID: \"95a7b196-74b4-4d67-a0e0-3e4b92e468ed\") " pod="openshift-console/console-f9d7485db-n967v" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129057 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae53c0af-26d8-4db2-8885-86b643bf2ee1-config\") pod \"controller-manager-879f6c89f-k6s7s\" (UID: \"ae53c0af-26d8-4db2-8885-86b643bf2ee1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k6s7s" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129091 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/dcdecdfe-61c0-4d85-99b5-e1fe25727259-images\") pod \"machine-api-operator-5694c8668f-cmm6f\" (UID: \"dcdecdfe-61c0-4d85-99b5-e1fe25727259\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cmm6f" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129122 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2ltg\" (UniqueName: \"kubernetes.io/projected/8a5ea199-20b4-419b-97bc-0845ac5bb720-kube-api-access-h2ltg\") pod \"apiserver-76f77b778f-99l4h\" (UID: \"8a5ea199-20b4-419b-97bc-0845ac5bb720\") " pod="openshift-apiserver/apiserver-76f77b778f-99l4h" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129153 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mmb9\" (UniqueName: \"kubernetes.io/projected/58ef0377-76a5-4299-999f-7bdd55606533-kube-api-access-9mmb9\") pod \"downloads-7954f5f757-xp9jk\" (UID: \"58ef0377-76a5-4299-999f-7bdd55606533\") " pod="openshift-console/downloads-7954f5f757-xp9jk" Dec 09 10:03:32 crc 
kubenswrapper[5002]: I1209 10:03:32.129182 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvxdt\" (UniqueName: \"kubernetes.io/projected/66daadf1-5a22-4a90-a004-68fc7848d557-kube-api-access-fvxdt\") pod \"authentication-operator-69f744f599-rwv2p\" (UID: \"66daadf1-5a22-4a90-a004-68fc7848d557\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rwv2p" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129199 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8a5ea199-20b4-419b-97bc-0845ac5bb720-image-import-ca\") pod \"apiserver-76f77b778f-99l4h\" (UID: \"8a5ea199-20b4-419b-97bc-0845ac5bb720\") " pod="openshift-apiserver/apiserver-76f77b778f-99l4h" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129216 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e2598866-d004-41a0-b058-01fb9a379df5-audit-policies\") pod \"oauth-openshift-558db77b4-9fjq6\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129231 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bb35474f-f199-45c9-8a0d-3c7c93268ef6-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-zvz4c\" (UID: \"bb35474f-f199-45c9-8a0d-3c7c93268ef6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zvz4c" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129245 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-9fjq6\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129258 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8a5ea199-20b4-419b-97bc-0845ac5bb720-node-pullsecrets\") pod \"apiserver-76f77b778f-99l4h\" (UID: \"8a5ea199-20b4-419b-97bc-0845ac5bb720\") " pod="openshift-apiserver/apiserver-76f77b778f-99l4h" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129272 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a5ea199-20b4-419b-97bc-0845ac5bb720-serving-cert\") pod \"apiserver-76f77b778f-99l4h\" (UID: \"8a5ea199-20b4-419b-97bc-0845ac5bb720\") " pod="openshift-apiserver/apiserver-76f77b778f-99l4h" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129285 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/870c64cb-d3e9-4491-9c83-31c51100256a-config\") pod \"machine-approver-56656f9798-np52q\" (UID: \"870c64cb-d3e9-4491-9c83-31c51100256a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-np52q" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129308 5002 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/84005632-49e7-400a-8748-5f16909cfda1-audit-policies\") pod \"apiserver-7bbb656c7d-nmgxj\" (UID: \"84005632-49e7-400a-8748-5f16909cfda1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nmgxj" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129323 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66daadf1-5a22-4a90-a004-68fc7848d557-serving-cert\") pod \"authentication-operator-69f744f599-rwv2p\" (UID: \"66daadf1-5a22-4a90-a004-68fc7848d557\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rwv2p" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129337 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84005632-49e7-400a-8748-5f16909cfda1-serving-cert\") pod \"apiserver-7bbb656c7d-nmgxj\" (UID: \"84005632-49e7-400a-8748-5f16909cfda1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nmgxj" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129351 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcx4k\" (UniqueName: \"kubernetes.io/projected/870c64cb-d3e9-4491-9c83-31c51100256a-kube-api-access-xcx4k\") pod \"machine-approver-56656f9798-np52q\" (UID: \"870c64cb-d3e9-4491-9c83-31c51100256a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-np52q" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129366 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-9fjq6\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129386 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4mjj\" (UniqueName: \"kubernetes.io/projected/84005632-49e7-400a-8748-5f16909cfda1-kube-api-access-c4mjj\") pod \"apiserver-7bbb656c7d-nmgxj\" (UID: \"84005632-49e7-400a-8748-5f16909cfda1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nmgxj" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129400 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8089a4b0-6e30-4f86-a14c-17435b2eaab8-trusted-ca\") pod \"console-operator-58897d9998-mtwnb\" (UID: \"8089a4b0-6e30-4f86-a14c-17435b2eaab8\") " pod="openshift-console-operator/console-operator-58897d9998-mtwnb" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129414 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/95a7b196-74b4-4d67-a0e0-3e4b92e468ed-console-config\") pod \"console-f9d7485db-n967v\" (UID: \"95a7b196-74b4-4d67-a0e0-3e4b92e468ed\") " pod="openshift-console/console-f9d7485db-n967v" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129435 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-9fjq6\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129449 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krgnn\" (UniqueName: \"kubernetes.io/projected/2ca986cd-d4df-4af5-8918-33f445a52f49-kube-api-access-krgnn\") pod \"openshift-config-operator-7777fb866f-899rx\" (UID: \"2ca986cd-d4df-4af5-8918-33f445a52f49\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-899rx" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129466 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95a7b196-74b4-4d67-a0e0-3e4b92e468ed-trusted-ca-bundle\") pod \"console-f9d7485db-n967v\" (UID: \"95a7b196-74b4-4d67-a0e0-3e4b92e468ed\") " pod="openshift-console/console-f9d7485db-n967v" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129482 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-9fjq6\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129496 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8a5ea199-20b4-419b-97bc-0845ac5bb720-etcd-client\") pod \"apiserver-76f77b778f-99l4h\" (UID: \"8a5ea199-20b4-419b-97bc-0845ac5bb720\") " pod="openshift-apiserver/apiserver-76f77b778f-99l4h" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129510 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8a5ea199-20b4-419b-97bc-0845ac5bb720-encryption-config\") pod \"apiserver-76f77b778f-99l4h\" (UID: \"8a5ea199-20b4-419b-97bc-0845ac5bb720\") " pod="openshift-apiserver/apiserver-76f77b778f-99l4h" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129526 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66daadf1-5a22-4a90-a004-68fc7848d557-service-ca-bundle\") pod \"authentication-operator-69f744f599-rwv2p\" (UID: \"66daadf1-5a22-4a90-a004-68fc7848d557\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rwv2p" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129541 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/84005632-49e7-400a-8748-5f16909cfda1-audit-dir\") pod \"apiserver-7bbb656c7d-nmgxj\" (UID: \"84005632-49e7-400a-8748-5f16909cfda1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nmgxj" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129554 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8089a4b0-6e30-4f86-a14c-17435b2eaab8-serving-cert\") pod \"console-operator-58897d9998-mtwnb\" (UID: \"8089a4b0-6e30-4f86-a14c-17435b2eaab8\") " pod="openshift-console-operator/console-operator-58897d9998-mtwnb" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129571 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0da3a8de-5a81-4bfb-87d0-756273bfefb3-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-79h6t\" (UID: \"0da3a8de-5a81-4bfb-87d0-756273bfefb3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-79h6t" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129586 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn69d\" (UniqueName: \"kubernetes.io/projected/dcdecdfe-61c0-4d85-99b5-e1fe25727259-kube-api-access-jn69d\") pod \"machine-api-operator-5694c8668f-cmm6f\" (UID: \"dcdecdfe-61c0-4d85-99b5-e1fe25727259\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cmm6f" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129600 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8a5ea199-20b4-419b-97bc-0845ac5bb720-etcd-serving-ca\") pod \"apiserver-76f77b778f-99l4h\" (UID: \"8a5ea199-20b4-419b-97bc-0845ac5bb720\") " pod="openshift-apiserver/apiserver-76f77b778f-99l4h" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129617 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-9fjq6\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129632 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-9fjq6\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129647 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/84005632-49e7-400a-8748-5f16909cfda1-etcd-client\") pod \"apiserver-7bbb656c7d-nmgxj\" (UID: \"84005632-49e7-400a-8748-5f16909cfda1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nmgxj" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129661 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-9fjq6\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129675 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-rp6p9\" (UniqueName: \"kubernetes.io/projected/0da3a8de-5a81-4bfb-87d0-756273bfefb3-kube-api-access-rp6p9\") pod \"cluster-samples-operator-665b6dd947-79h6t\" (UID: \"0da3a8de-5a81-4bfb-87d0-756273bfefb3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-79h6t" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129689 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-9fjq6\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129702 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-9fjq6\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129718 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2ca986cd-d4df-4af5-8918-33f445a52f49-available-featuregates\") pod \"openshift-config-operator-7777fb866f-899rx\" (UID: \"2ca986cd-d4df-4af5-8918-33f445a52f49\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-899rx" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129734 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/95a7b196-74b4-4d67-a0e0-3e4b92e468ed-oauth-serving-cert\") pod \"console-f9d7485db-n967v\" (UID: \"95a7b196-74b4-4d67-a0e0-3e4b92e468ed\") " pod="openshift-console/console-f9d7485db-n967v" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129759 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb35474f-f199-45c9-8a0d-3c7c93268ef6-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-zvz4c\" (UID: \"bb35474f-f199-45c9-8a0d-3c7c93268ef6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zvz4c" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129775 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-9fjq6\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129789 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66daadf1-5a22-4a90-a004-68fc7848d557-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-rwv2p\" (UID: \"66daadf1-5a22-4a90-a004-68fc7848d557\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rwv2p" Dec 09 
10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129805 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n4g8\" (UniqueName: \"kubernetes.io/projected/bb35474f-f199-45c9-8a0d-3c7c93268ef6-kube-api-access-6n4g8\") pod \"cluster-image-registry-operator-dc59b4c8b-zvz4c\" (UID: \"bb35474f-f199-45c9-8a0d-3c7c93268ef6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zvz4c" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129842 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/95a7b196-74b4-4d67-a0e0-3e4b92e468ed-console-oauth-config\") pod \"console-f9d7485db-n967v\" (UID: \"95a7b196-74b4-4d67-a0e0-3e4b92e468ed\") " pod="openshift-console/console-f9d7485db-n967v" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129856 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e2598866-d004-41a0-b058-01fb9a379df5-audit-dir\") pod \"oauth-openshift-558db77b4-9fjq6\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129869 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8a5ea199-20b4-419b-97bc-0845ac5bb720-audit-dir\") pod \"apiserver-76f77b778f-99l4h\" (UID: \"8a5ea199-20b4-419b-97bc-0845ac5bb720\") " pod="openshift-apiserver/apiserver-76f77b778f-99l4h" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129883 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/870c64cb-d3e9-4491-9c83-31c51100256a-auth-proxy-config\") pod \"machine-approver-56656f9798-np52q\" (UID: \"870c64cb-d3e9-4491-9c83-31c51100256a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-np52q" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129904 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fef799ad-c87c-4f60-9b8e-afb51958013d-client-ca\") pod \"route-controller-manager-6576b87f9c-c9pnt\" (UID: \"fef799ad-c87c-4f60-9b8e-afb51958013d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c9pnt" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129929 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/95a7b196-74b4-4d67-a0e0-3e4b92e468ed-service-ca\") pod \"console-f9d7485db-n967v\" (UID: \"95a7b196-74b4-4d67-a0e0-3e4b92e468ed\") " pod="openshift-console/console-f9d7485db-n967v" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129944 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a5ea199-20b4-419b-97bc-0845ac5bb720-trusted-ca-bundle\") pod \"apiserver-76f77b778f-99l4h\" (UID: \"8a5ea199-20b4-419b-97bc-0845ac5bb720\") " pod="openshift-apiserver/apiserver-76f77b778f-99l4h" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129960 5002 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb35474f-f199-45c9-8a0d-3c7c93268ef6-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-zvz4c\" (UID: \"bb35474f-f199-45c9-8a0d-3c7c93268ef6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zvz4c" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129980 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a5ea199-20b4-419b-97bc-0845ac5bb720-config\") pod \"apiserver-76f77b778f-99l4h\" (UID: \"8a5ea199-20b4-419b-97bc-0845ac5bb720\") " pod="openshift-apiserver/apiserver-76f77b778f-99l4h" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.129993 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8a5ea199-20b4-419b-97bc-0845ac5bb720-audit\") pod \"apiserver-76f77b778f-99l4h\" (UID: \"8a5ea199-20b4-419b-97bc-0845ac5bb720\") " pod="openshift-apiserver/apiserver-76f77b778f-99l4h" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.130006 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/870c64cb-d3e9-4491-9c83-31c51100256a-machine-approver-tls\") pod \"machine-approver-56656f9798-np52q\" (UID: \"870c64cb-d3e9-4491-9c83-31c51100256a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-np52q" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.132749 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.133599 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-2s9w5" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.137003 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-82dlm"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.137494 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.143603 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.144085 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.144095 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.144118 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.144255 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.159825 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.144268 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.144524 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.170658 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.170845 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.170845 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.171366 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.171547 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.171584 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.183021 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ffzj8"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.183868 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ffzj8" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.185404 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.187130 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.187220 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.187296 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.187221 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.187984 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.188233 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.188285 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.188698 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.189152 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.193835 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.198425 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.198430 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.198481 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.198482 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.198519 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.198845 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.199276 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" 
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.199842 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.201022 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-nmgxj"]
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.202020 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-99l4h"]
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.202589 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.204369 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.204655 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.205677 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.205997 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.207310 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.211877 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-lxrwt"]
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.212549 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-lxrwt"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.227638 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.228400 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.228638 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9fjq6"]
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.228748 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zvz4c"]
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.237711 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.238694 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lb5s8"]
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.239168 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2fp5\" (UniqueName: \"kubernetes.io/projected/8089a4b0-6e30-4f86-a14c-17435b2eaab8-kube-api-access-p2fp5\") pod \"console-operator-58897d9998-mtwnb\" (UID: \"8089a4b0-6e30-4f86-a14c-17435b2eaab8\") " pod="openshift-console-operator/console-operator-58897d9998-mtwnb"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.239199 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a70e2957-fa0c-4393-adaf-93a0759263d1-serving-cert\") pod \"etcd-operator-b45778765-2s9w5\" (UID: \"a70e2957-fa0c-4393-adaf-93a0759263d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s9w5"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.239226 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae53c0af-26d8-4db2-8885-86b643bf2ee1-client-ca\") pod \"controller-manager-879f6c89f-k6s7s\" (UID: \"ae53c0af-26d8-4db2-8885-86b643bf2ee1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k6s7s"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.239246 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae53c0af-26d8-4db2-8885-86b643bf2ee1-serving-cert\") pod \"controller-manager-879f6c89f-k6s7s\" (UID: \"ae53c0af-26d8-4db2-8885-86b643bf2ee1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k6s7s"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.239261 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lb5s8"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.239266 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ca986cd-d4df-4af5-8918-33f445a52f49-serving-cert\") pod \"openshift-config-operator-7777fb866f-899rx\" (UID: \"2ca986cd-d4df-4af5-8918-33f445a52f49\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-899rx"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.239288 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jr4w7\" (UniqueName: \"kubernetes.io/projected/e2598866-d004-41a0-b058-01fb9a379df5-kube-api-access-jr4w7\") pod \"oauth-openshift-558db77b4-9fjq6\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.239307 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84005632-49e7-400a-8748-5f16909cfda1-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-nmgxj\" (UID: \"84005632-49e7-400a-8748-5f16909cfda1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nmgxj"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.239325 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8089a4b0-6e30-4f86-a14c-17435b2eaab8-config\") pod \"console-operator-58897d9998-mtwnb\" (UID: \"8089a4b0-6e30-4f86-a14c-17435b2eaab8\") " pod="openshift-console-operator/console-operator-58897d9998-mtwnb"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.239345 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcnr5\" (UniqueName: \"kubernetes.io/projected/a70e2957-fa0c-4393-adaf-93a0759263d1-kube-api-access-qcnr5\") pod \"etcd-operator-b45778765-2s9w5\" (UID: \"a70e2957-fa0c-4393-adaf-93a0759263d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s9w5"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.239364 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ae53c0af-26d8-4db2-8885-86b643bf2ee1-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-k6s7s\" (UID: \"ae53c0af-26d8-4db2-8885-86b643bf2ee1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k6s7s"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.239385 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66daadf1-5a22-4a90-a004-68fc7848d557-config\") pod \"authentication-operator-69f744f599-rwv2p\" (UID: \"66daadf1-5a22-4a90-a004-68fc7848d557\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rwv2p"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.239765 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/84005632-49e7-400a-8748-5f16909cfda1-encryption-config\") pod \"apiserver-7bbb656c7d-nmgxj\" (UID: \"84005632-49e7-400a-8748-5f16909cfda1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nmgxj"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.239804 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fef799ad-c87c-4f60-9b8e-afb51958013d-config\") pod \"route-controller-manager-6576b87f9c-c9pnt\" (UID: \"fef799ad-c87c-4f60-9b8e-afb51958013d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c9pnt"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.239848 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/dcdecdfe-61c0-4d85-99b5-e1fe25727259-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-cmm6f\" (UID: \"dcdecdfe-61c0-4d85-99b5-e1fe25727259\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cmm6f"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.239873 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwzvz\" (UniqueName: \"kubernetes.io/projected/95a7b196-74b4-4d67-a0e0-3e4b92e468ed-kube-api-access-nwzvz\") pod \"console-f9d7485db-n967v\" (UID: \"95a7b196-74b4-4d67-a0e0-3e4b92e468ed\") " pod="openshift-console/console-f9d7485db-n967v"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.239900 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pfj8\" (UniqueName: \"kubernetes.io/projected/ae53c0af-26d8-4db2-8885-86b643bf2ee1-kube-api-access-4pfj8\") pod \"controller-manager-879f6c89f-k6s7s\" (UID: \"ae53c0af-26d8-4db2-8885-86b643bf2ee1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k6s7s"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.239963 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcdecdfe-61c0-4d85-99b5-e1fe25727259-config\") pod \"machine-api-operator-5694c8668f-cmm6f\" (UID: \"dcdecdfe-61c0-4d85-99b5-e1fe25727259\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cmm6f"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.240062 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-9fjq6\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.240114 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fef799ad-c87c-4f60-9b8e-afb51958013d-serving-cert\") pod \"route-controller-manager-6576b87f9c-c9pnt\" (UID: \"fef799ad-c87c-4f60-9b8e-afb51958013d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c9pnt"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.240171 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/84005632-49e7-400a-8748-5f16909cfda1-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-nmgxj\" (UID: \"84005632-49e7-400a-8748-5f16909cfda1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nmgxj"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.240199 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmb4z\" (UniqueName: \"kubernetes.io/projected/fef799ad-c87c-4f60-9b8e-afb51958013d-kube-api-access-kmb4z\") pod \"route-controller-manager-6576b87f9c-c9pnt\" (UID: \"fef799ad-c87c-4f60-9b8e-afb51958013d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c9pnt"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.240223 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/95a7b196-74b4-4d67-a0e0-3e4b92e468ed-console-serving-cert\") pod \"console-f9d7485db-n967v\" (UID: \"95a7b196-74b4-4d67-a0e0-3e4b92e468ed\") " pod="openshift-console/console-f9d7485db-n967v"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.240252 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rmfg\" (UniqueName: \"kubernetes.io/projected/14ca2bab-e020-4c2d-be3c-e4138095c149-kube-api-access-5rmfg\") pod \"openshift-controller-manager-operator-756b6f6bc6-ffzj8\" (UID: \"14ca2bab-e020-4c2d-be3c-e4138095c149\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ffzj8"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.240279 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bf7272a5-afe4-4ad1-bc7e-2c2397b84a95-metrics-tls\") pod \"dns-operator-744455d44c-lxrwt\" (UID: \"bf7272a5-afe4-4ad1-bc7e-2c2397b84a95\") " pod="openshift-dns-operator/dns-operator-744455d44c-lxrwt"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.240303 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae53c0af-26d8-4db2-8885-86b643bf2ee1-config\") pod \"controller-manager-879f6c89f-k6s7s\" (UID: \"ae53c0af-26d8-4db2-8885-86b643bf2ee1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k6s7s"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.240326 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/dcdecdfe-61c0-4d85-99b5-e1fe25727259-images\") pod \"machine-api-operator-5694c8668f-cmm6f\" (UID: \"dcdecdfe-61c0-4d85-99b5-e1fe25727259\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cmm6f"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.240352 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2ltg\" (UniqueName: \"kubernetes.io/projected/8a5ea199-20b4-419b-97bc-0845ac5bb720-kube-api-access-h2ltg\") pod \"apiserver-76f77b778f-99l4h\" (UID: \"8a5ea199-20b4-419b-97bc-0845ac5bb720\") " pod="openshift-apiserver/apiserver-76f77b778f-99l4h"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.240375 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mmb9\" (UniqueName: \"kubernetes.io/projected/58ef0377-76a5-4299-999f-7bdd55606533-kube-api-access-9mmb9\") pod \"downloads-7954f5f757-xp9jk\" (UID: \"58ef0377-76a5-4299-999f-7bdd55606533\") " pod="openshift-console/downloads-7954f5f757-xp9jk"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.240397 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae53c0af-26d8-4db2-8885-86b643bf2ee1-client-ca\") pod \"controller-manager-879f6c89f-k6s7s\" (UID: \"ae53c0af-26d8-4db2-8885-86b643bf2ee1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k6s7s"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.240400 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvxdt\" (UniqueName: \"kubernetes.io/projected/66daadf1-5a22-4a90-a004-68fc7848d557-kube-api-access-fvxdt\") pod \"authentication-operator-69f744f599-rwv2p\" (UID: \"66daadf1-5a22-4a90-a004-68fc7848d557\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rwv2p"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.240448 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8a5ea199-20b4-419b-97bc-0845ac5bb720-image-import-ca\") pod \"apiserver-76f77b778f-99l4h\" (UID: \"8a5ea199-20b4-419b-97bc-0845ac5bb720\") " pod="openshift-apiserver/apiserver-76f77b778f-99l4h"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.240473 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a70e2957-fa0c-4393-adaf-93a0759263d1-config\") pod \"etcd-operator-b45778765-2s9w5\" (UID: \"a70e2957-fa0c-4393-adaf-93a0759263d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s9w5"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.240502 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e2598866-d004-41a0-b058-01fb9a379df5-audit-policies\") pod \"oauth-openshift-558db77b4-9fjq6\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.240526 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bb35474f-f199-45c9-8a0d-3c7c93268ef6-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-zvz4c\" (UID: \"bb35474f-f199-45c9-8a0d-3c7c93268ef6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zvz4c"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.240548 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14ca2bab-e020-4c2d-be3c-e4138095c149-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-ffzj8\" (UID: \"14ca2bab-e020-4c2d-be3c-e4138095c149\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ffzj8"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.240571 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-9fjq6\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.240618 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8a5ea199-20b4-419b-97bc-0845ac5bb720-node-pullsecrets\") pod \"apiserver-76f77b778f-99l4h\" (UID: \"8a5ea199-20b4-419b-97bc-0845ac5bb720\") " pod="openshift-apiserver/apiserver-76f77b778f-99l4h"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.240638 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a5ea199-20b4-419b-97bc-0845ac5bb720-serving-cert\") pod \"apiserver-76f77b778f-99l4h\" (UID: \"8a5ea199-20b4-419b-97bc-0845ac5bb720\") " pod="openshift-apiserver/apiserver-76f77b778f-99l4h"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.240660 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/870c64cb-d3e9-4491-9c83-31c51100256a-config\") pod \"machine-approver-56656f9798-np52q\" (UID: \"870c64cb-d3e9-4491-9c83-31c51100256a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-np52q"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.240694 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/84005632-49e7-400a-8748-5f16909cfda1-audit-policies\") pod \"apiserver-7bbb656c7d-nmgxj\" (UID: \"84005632-49e7-400a-8748-5f16909cfda1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nmgxj"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.240720 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66daadf1-5a22-4a90-a004-68fc7848d557-serving-cert\") pod \"authentication-operator-69f744f599-rwv2p\" (UID: \"66daadf1-5a22-4a90-a004-68fc7848d557\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rwv2p"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.240739 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84005632-49e7-400a-8748-5f16909cfda1-serving-cert\") pod \"apiserver-7bbb656c7d-nmgxj\" (UID: \"84005632-49e7-400a-8748-5f16909cfda1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nmgxj"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.240759 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcx4k\" (UniqueName: \"kubernetes.io/projected/870c64cb-d3e9-4491-9c83-31c51100256a-kube-api-access-xcx4k\") pod \"machine-approver-56656f9798-np52q\" (UID: \"870c64cb-d3e9-4491-9c83-31c51100256a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-np52q"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.240791 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-9fjq6\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.241046 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84005632-49e7-400a-8748-5f16909cfda1-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-nmgxj\" (UID: \"84005632-49e7-400a-8748-5f16909cfda1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nmgxj"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.241079 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.239309 5002 kubelet.go:2421] "SyncLoop ADD" source="api"
pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r2fkt"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.241454 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcdecdfe-61c0-4d85-99b5-e1fe25727259-config\") pod \"machine-api-operator-5694c8668f-cmm6f\" (UID: \"dcdecdfe-61c0-4d85-99b5-e1fe25727259\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cmm6f" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.241450 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8a5ea199-20b4-419b-97bc-0845ac5bb720-node-pullsecrets\") pod \"apiserver-76f77b778f-99l4h\" (UID: \"8a5ea199-20b4-419b-97bc-0845ac5bb720\") " pod="openshift-apiserver/apiserver-76f77b778f-99l4h" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.241772 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/84005632-49e7-400a-8748-5f16909cfda1-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-nmgxj\" (UID: \"84005632-49e7-400a-8748-5f16909cfda1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nmgxj" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.242350 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4mjj\" (UniqueName: \"kubernetes.io/projected/84005632-49e7-400a-8748-5f16909cfda1-kube-api-access-c4mjj\") pod \"apiserver-7bbb656c7d-nmgxj\" (UID: \"84005632-49e7-400a-8748-5f16909cfda1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nmgxj" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.242391 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8089a4b0-6e30-4f86-a14c-17435b2eaab8-trusted-ca\") pod \"console-operator-58897d9998-mtwnb\" (UID: \"8089a4b0-6e30-4f86-a14c-17435b2eaab8\") " pod="openshift-console-operator/console-operator-58897d9998-mtwnb" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.242454 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/95a7b196-74b4-4d67-a0e0-3e4b92e468ed-console-config\") pod \"console-f9d7485db-n967v\" (UID: \"95a7b196-74b4-4d67-a0e0-3e4b92e468ed\") " pod="openshift-console/console-f9d7485db-n967v" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.242480 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-9fjq6\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.242504 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krgnn\" (UniqueName: \"kubernetes.io/projected/2ca986cd-d4df-4af5-8918-33f445a52f49-kube-api-access-krgnn\") pod \"openshift-config-operator-7777fb866f-899rx\" (UID: \"2ca986cd-d4df-4af5-8918-33f445a52f49\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-899rx" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.242526 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/95a7b196-74b4-4d67-a0e0-3e4b92e468ed-trusted-ca-bundle\") pod \"console-f9d7485db-n967v\" (UID: \"95a7b196-74b4-4d67-a0e0-3e4b92e468ed\") " pod="openshift-console/console-f9d7485db-n967v" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.242550 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-9fjq6\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.242571 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8a5ea199-20b4-419b-97bc-0845ac5bb720-etcd-client\") pod \"apiserver-76f77b778f-99l4h\" (UID: \"8a5ea199-20b4-419b-97bc-0845ac5bb720\") " pod="openshift-apiserver/apiserver-76f77b778f-99l4h" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.242574 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e2598866-d004-41a0-b058-01fb9a379df5-audit-policies\") pod \"oauth-openshift-558db77b4-9fjq6\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.242592 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8a5ea199-20b4-419b-97bc-0845ac5bb720-encryption-config\") pod \"apiserver-76f77b778f-99l4h\" (UID: \"8a5ea199-20b4-419b-97bc-0845ac5bb720\") " pod="openshift-apiserver/apiserver-76f77b778f-99l4h" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.242615 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8089a4b0-6e30-4f86-a14c-17435b2eaab8-serving-cert\") pod \"console-operator-58897d9998-mtwnb\" (UID: \"8089a4b0-6e30-4f86-a14c-17435b2eaab8\") " pod="openshift-console-operator/console-operator-58897d9998-mtwnb" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.242637 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66daadf1-5a22-4a90-a004-68fc7848d557-service-ca-bundle\") pod \"authentication-operator-69f744f599-rwv2p\" (UID: \"66daadf1-5a22-4a90-a004-68fc7848d557\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rwv2p" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.242658 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/84005632-49e7-400a-8748-5f16909cfda1-audit-dir\") pod \"apiserver-7bbb656c7d-nmgxj\" (UID: \"84005632-49e7-400a-8748-5f16909cfda1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nmgxj" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.242680 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8a5ea199-20b4-419b-97bc-0845ac5bb720-etcd-serving-ca\") pod \"apiserver-76f77b778f-99l4h\" (UID: \"8a5ea199-20b4-419b-97bc-0845ac5bb720\") " pod="openshift-apiserver/apiserver-76f77b778f-99l4h" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 
10:03:32.242703 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8a05f51-2e2c-4918-80a6-201afd3d14a5-config\") pod \"openshift-apiserver-operator-796bbdcf4f-wxfc6\" (UID: \"e8a05f51-2e2c-4918-80a6-201afd3d14a5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wxfc6" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.242728 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0da3a8de-5a81-4bfb-87d0-756273bfefb3-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-79h6t\" (UID: \"0da3a8de-5a81-4bfb-87d0-756273bfefb3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-79h6t" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.242744 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r2fkt" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.242757 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jn69d\" (UniqueName: \"kubernetes.io/projected/dcdecdfe-61c0-4d85-99b5-e1fe25727259-kube-api-access-jn69d\") pod \"machine-api-operator-5694c8668f-cmm6f\" (UID: \"dcdecdfe-61c0-4d85-99b5-e1fe25727259\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cmm6f" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.242780 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/84005632-49e7-400a-8748-5f16909cfda1-etcd-client\") pod \"apiserver-7bbb656c7d-nmgxj\" (UID: \"84005632-49e7-400a-8748-5f16909cfda1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nmgxj" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.242802 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a70e2957-fa0c-4393-adaf-93a0759263d1-etcd-service-ca\") pod \"etcd-operator-b45778765-2s9w5\" (UID: \"a70e2957-fa0c-4393-adaf-93a0759263d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s9w5" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.242845 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-9fjq6\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.242870 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-9fjq6\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.242894 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a70e2957-fa0c-4393-adaf-93a0759263d1-etcd-client\") pod \"etcd-operator-b45778765-2s9w5\" (UID: 
\"a70e2957-fa0c-4393-adaf-93a0759263d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s9w5" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.242922 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-9fjq6\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.242951 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14ca2bab-e020-4c2d-be3c-e4138095c149-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-ffzj8\" (UID: \"14ca2bab-e020-4c2d-be3c-e4138095c149\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ffzj8" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.242975 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a70e2957-fa0c-4393-adaf-93a0759263d1-etcd-ca\") pod \"etcd-operator-b45778765-2s9w5\" (UID: \"a70e2957-fa0c-4393-adaf-93a0759263d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s9w5" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.243002 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/95a7b196-74b4-4d67-a0e0-3e4b92e468ed-oauth-serving-cert\") pod \"console-f9d7485db-n967v\" (UID: \"95a7b196-74b4-4d67-a0e0-3e4b92e468ed\") " pod="openshift-console/console-f9d7485db-n967v" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.243027 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7wjn\" (UniqueName: \"kubernetes.io/projected/e8a05f51-2e2c-4918-80a6-201afd3d14a5-kube-api-access-v7wjn\") pod \"openshift-apiserver-operator-796bbdcf4f-wxfc6\" (UID: \"e8a05f51-2e2c-4918-80a6-201afd3d14a5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wxfc6" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.243053 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp6p9\" (UniqueName: \"kubernetes.io/projected/0da3a8de-5a81-4bfb-87d0-756273bfefb3-kube-api-access-rp6p9\") pod \"cluster-samples-operator-665b6dd947-79h6t\" (UID: \"0da3a8de-5a81-4bfb-87d0-756273bfefb3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-79h6t" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.243270 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8a5ea199-20b4-419b-97bc-0845ac5bb720-image-import-ca\") pod \"apiserver-76f77b778f-99l4h\" (UID: \"8a5ea199-20b4-419b-97bc-0845ac5bb720\") " pod="openshift-apiserver/apiserver-76f77b778f-99l4h" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.243659 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae53c0af-26d8-4db2-8885-86b643bf2ee1-config\") pod \"controller-manager-879f6c89f-k6s7s\" (UID: \"ae53c0af-26d8-4db2-8885-86b643bf2ee1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k6s7s" 
Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.246092 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-9fjq6\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.246206 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/84005632-49e7-400a-8748-5f16909cfda1-encryption-config\") pod \"apiserver-7bbb656c7d-nmgxj\" (UID: \"84005632-49e7-400a-8748-5f16909cfda1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nmgxj" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.246288 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/dcdecdfe-61c0-4d85-99b5-e1fe25727259-images\") pod \"machine-api-operator-5694c8668f-cmm6f\" (UID: \"dcdecdfe-61c0-4d85-99b5-e1fe25727259\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cmm6f" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.246876 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-9fjq6\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.246962 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-9fjq6\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.246998 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2ca986cd-d4df-4af5-8918-33f445a52f49-available-featuregates\") pod \"openshift-config-operator-7777fb866f-899rx\" (UID: \"2ca986cd-d4df-4af5-8918-33f445a52f49\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-899rx" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.247030 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb35474f-f199-45c9-8a0d-3c7c93268ef6-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-zvz4c\" (UID: \"bb35474f-f199-45c9-8a0d-3c7c93268ef6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zvz4c" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.247058 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-9fjq6\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.247083 5002 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66daadf1-5a22-4a90-a004-68fc7848d557-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-rwv2p\" (UID: \"66daadf1-5a22-4a90-a004-68fc7848d557\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rwv2p" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.247109 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n4g8\" (UniqueName: \"kubernetes.io/projected/bb35474f-f199-45c9-8a0d-3c7c93268ef6-kube-api-access-6n4g8\") pod \"cluster-image-registry-operator-dc59b4c8b-zvz4c\" (UID: \"bb35474f-f199-45c9-8a0d-3c7c93268ef6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zvz4c" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.247370 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66daadf1-5a22-4a90-a004-68fc7848d557-service-ca-bundle\") pod \"authentication-operator-69f744f599-rwv2p\" (UID: \"66daadf1-5a22-4a90-a004-68fc7848d557\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rwv2p" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.247412 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/84005632-49e7-400a-8748-5f16909cfda1-audit-dir\") pod \"apiserver-7bbb656c7d-nmgxj\" (UID: \"84005632-49e7-400a-8748-5f16909cfda1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nmgxj" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.247693 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae53c0af-26d8-4db2-8885-86b643bf2ee1-serving-cert\") pod \"controller-manager-879f6c89f-k6s7s\" (UID: \"ae53c0af-26d8-4db2-8885-86b643bf2ee1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k6s7s" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.247791 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8a5ea199-20b4-419b-97bc-0845ac5bb720-etcd-serving-ca\") pod \"apiserver-76f77b778f-99l4h\" (UID: \"8a5ea199-20b4-419b-97bc-0845ac5bb720\") " pod="openshift-apiserver/apiserver-76f77b778f-99l4h" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.249081 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-9fjq6\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.249531 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fef799ad-c87c-4f60-9b8e-afb51958013d-config\") pod \"route-controller-manager-6576b87f9c-c9pnt\" (UID: \"fef799ad-c87c-4f60-9b8e-afb51958013d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c9pnt" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.249688 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/95a7b196-74b4-4d67-a0e0-3e4b92e468ed-console-oauth-config\") pod \"console-f9d7485db-n967v\" (UID: \"95a7b196-74b4-4d67-a0e0-3e4b92e468ed\") " pod="openshift-console/console-f9d7485db-n967v" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.249730 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e2598866-d004-41a0-b058-01fb9a379df5-audit-dir\") pod \"oauth-openshift-558db77b4-9fjq6\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.249756 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8a5ea199-20b4-419b-97bc-0845ac5bb720-audit-dir\") pod \"apiserver-76f77b778f-99l4h\" (UID: \"8a5ea199-20b4-419b-97bc-0845ac5bb720\") " pod="openshift-apiserver/apiserver-76f77b778f-99l4h" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.249780 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/870c64cb-d3e9-4491-9c83-31c51100256a-auth-proxy-config\") pod \"machine-approver-56656f9798-np52q\" (UID: \"870c64cb-d3e9-4491-9c83-31c51100256a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-np52q" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.249804 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fef799ad-c87c-4f60-9b8e-afb51958013d-client-ca\") pod \"route-controller-manager-6576b87f9c-c9pnt\" (UID: \"fef799ad-c87c-4f60-9b8e-afb51958013d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c9pnt" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.249860 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/95a7b196-74b4-4d67-a0e0-3e4b92e468ed-service-ca\") pod \"console-f9d7485db-n967v\" (UID: \"95a7b196-74b4-4d67-a0e0-3e4b92e468ed\") " pod="openshift-console/console-f9d7485db-n967v" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.249887 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8a05f51-2e2c-4918-80a6-201afd3d14a5-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-wxfc6\" (UID: \"e8a05f51-2e2c-4918-80a6-201afd3d14a5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wxfc6" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.249920 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a5ea199-20b4-419b-97bc-0845ac5bb720-trusted-ca-bundle\") pod \"apiserver-76f77b778f-99l4h\" (UID: \"8a5ea199-20b4-419b-97bc-0845ac5bb720\") " pod="openshift-apiserver/apiserver-76f77b778f-99l4h" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.249945 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scrpt\" (UniqueName: \"kubernetes.io/projected/bf7272a5-afe4-4ad1-bc7e-2c2397b84a95-kube-api-access-scrpt\") pod \"dns-operator-744455d44c-lxrwt\" (UID: \"bf7272a5-afe4-4ad1-bc7e-2c2397b84a95\") " pod="openshift-dns-operator/dns-operator-744455d44c-lxrwt" Dec 09 10:03:32 crc 
kubenswrapper[5002]: I1209 10:03:32.249970 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb35474f-f199-45c9-8a0d-3c7c93268ef6-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-zvz4c\" (UID: \"bb35474f-f199-45c9-8a0d-3c7c93268ef6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zvz4c" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.250009 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a5ea199-20b4-419b-97bc-0845ac5bb720-config\") pod \"apiserver-76f77b778f-99l4h\" (UID: \"8a5ea199-20b4-419b-97bc-0845ac5bb720\") " pod="openshift-apiserver/apiserver-76f77b778f-99l4h" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.250029 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8a5ea199-20b4-419b-97bc-0845ac5bb720-audit\") pod \"apiserver-76f77b778f-99l4h\" (UID: \"8a5ea199-20b4-419b-97bc-0845ac5bb720\") " pod="openshift-apiserver/apiserver-76f77b778f-99l4h" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.250054 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/870c64cb-d3e9-4491-9c83-31c51100256a-machine-approver-tls\") pod \"machine-approver-56656f9798-np52q\" (UID: \"870c64cb-d3e9-4491-9c83-31c51100256a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-np52q" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.250266 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fef799ad-c87c-4f60-9b8e-afb51958013d-serving-cert\") pod \"route-controller-manager-6576b87f9c-c9pnt\" (UID: \"fef799ad-c87c-4f60-9b8e-afb51958013d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c9pnt" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.242681 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-f2pn5"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.252046 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-9fjq6\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.252160 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8089a4b0-6e30-4f86-a14c-17435b2eaab8-serving-cert\") pod \"console-operator-58897d9998-mtwnb\" (UID: \"8089a4b0-6e30-4f86-a14c-17435b2eaab8\") " pod="openshift-console-operator/console-operator-58897d9998-mtwnb" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.252365 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zbsjj"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.252659 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-rbp2p"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.252857 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/8089a4b0-6e30-4f86-a14c-17435b2eaab8-config\") pod \"console-operator-58897d9998-mtwnb\" (UID: \"8089a4b0-6e30-4f86-a14c-17435b2eaab8\") " pod="openshift-console-operator/console-operator-58897d9998-mtwnb" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.253015 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421240-5jmzz"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.253414 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421240-5jmzz" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.253666 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-f2pn5" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.253747 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-95fhm"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.253852 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zbsjj" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.253990 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rbp2p" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.254212 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ae53c0af-26d8-4db2-8885-86b643bf2ee1-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-k6s7s\" (UID: \"ae53c0af-26d8-4db2-8885-86b643bf2ee1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k6s7s" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.254258 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ca986cd-d4df-4af5-8918-33f445a52f49-serving-cert\") pod \"openshift-config-operator-7777fb866f-899rx\" (UID: \"2ca986cd-d4df-4af5-8918-33f445a52f49\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-899rx" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.254415 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-95fhm" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.254698 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/95a7b196-74b4-4d67-a0e0-3e4b92e468ed-console-serving-cert\") pod \"console-f9d7485db-n967v\" (UID: \"95a7b196-74b4-4d67-a0e0-3e4b92e468ed\") " pod="openshift-console/console-f9d7485db-n967v" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.254800 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66daadf1-5a22-4a90-a004-68fc7848d557-config\") pod \"authentication-operator-69f744f599-rwv2p\" (UID: \"66daadf1-5a22-4a90-a004-68fc7848d557\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rwv2p" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.254855 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e2598866-d004-41a0-b058-01fb9a379df5-audit-dir\") pod \"oauth-openshift-558db77b4-9fjq6\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.255291 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8089a4b0-6e30-4f86-a14c-17435b2eaab8-trusted-ca\") pod \"console-operator-58897d9998-mtwnb\" (UID: \"8089a4b0-6e30-4f86-a14c-17435b2eaab8\") " pod="openshift-console-operator/console-operator-58897d9998-mtwnb" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.255873 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-9fjq6\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.256014 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a5ea199-20b4-419b-97bc-0845ac5bb720-serving-cert\") pod \"apiserver-76f77b778f-99l4h\" (UID: \"8a5ea199-20b4-419b-97bc-0845ac5bb720\") " pod="openshift-apiserver/apiserver-76f77b778f-99l4h" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.256597 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/95a7b196-74b4-4d67-a0e0-3e4b92e468ed-oauth-serving-cert\") pod \"console-f9d7485db-n967v\" (UID: \"95a7b196-74b4-4d67-a0e0-3e4b92e468ed\") " pod="openshift-console/console-f9d7485db-n967v" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.256634 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/870c64cb-d3e9-4491-9c83-31c51100256a-config\") pod \"machine-approver-56656f9798-np52q\" (UID: \"870c64cb-d3e9-4491-9c83-31c51100256a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-np52q" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.256893 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-9fjq6\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.257257 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/95a7b196-74b4-4d67-a0e0-3e4b92e468ed-console-config\") pod \"console-f9d7485db-n967v\" (UID: \"95a7b196-74b4-4d67-a0e0-3e4b92e468ed\") " pod="openshift-console/console-f9d7485db-n967v" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.257490 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8a5ea199-20b4-419b-97bc-0845ac5bb720-audit-dir\") pod \"apiserver-76f77b778f-99l4h\" (UID: \"8a5ea199-20b4-419b-97bc-0845ac5bb720\") " pod="openshift-apiserver/apiserver-76f77b778f-99l4h" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.257512 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95a7b196-74b4-4d67-a0e0-3e4b92e468ed-trusted-ca-bundle\") pod \"console-f9d7485db-n967v\" (UID: \"95a7b196-74b4-4d67-a0e0-3e4b92e468ed\") " pod="openshift-console/console-f9d7485db-n967v" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.257598 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/84005632-49e7-400a-8748-5f16909cfda1-audit-policies\") pod \"apiserver-7bbb656c7d-nmgxj\" (UID: \"84005632-49e7-400a-8748-5f16909cfda1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nmgxj" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.257655 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-9fjq6\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.258225 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0da3a8de-5a81-4bfb-87d0-756273bfefb3-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-79h6t\" (UID: \"0da3a8de-5a81-4bfb-87d0-756273bfefb3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-79h6t" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.258395 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pdrd"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.258470 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.258575 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/870c64cb-d3e9-4491-9c83-31c51100256a-auth-proxy-config\") pod \"machine-approver-56656f9798-np52q\" (UID: \"870c64cb-d3e9-4491-9c83-31c51100256a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-np52q" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.259384 5002 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a5ea199-20b4-419b-97bc-0845ac5bb720-trusted-ca-bundle\") pod \"apiserver-76f77b778f-99l4h\" (UID: \"8a5ea199-20b4-419b-97bc-0845ac5bb720\") " pod="openshift-apiserver/apiserver-76f77b778f-99l4h" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.260045 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r22tz"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.260574 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fef799ad-c87c-4f60-9b8e-afb51958013d-client-ca\") pod \"route-controller-manager-6576b87f9c-c9pnt\" (UID: \"fef799ad-c87c-4f60-9b8e-afb51958013d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c9pnt" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.261467 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zrfsb"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.261511 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a5ea199-20b4-419b-97bc-0845ac5bb720-config\") pod \"apiserver-76f77b778f-99l4h\" (UID: \"8a5ea199-20b4-419b-97bc-0845ac5bb720\") " pod="openshift-apiserver/apiserver-76f77b778f-99l4h" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.262073 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8a5ea199-20b4-419b-97bc-0845ac5bb720-audit\") pod \"apiserver-76f77b778f-99l4h\" (UID: \"8a5ea199-20b4-419b-97bc-0845ac5bb720\") " pod="openshift-apiserver/apiserver-76f77b778f-99l4h" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.262273 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-x5gjs"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.262328 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2ca986cd-d4df-4af5-8918-33f445a52f49-available-featuregates\") pod \"openshift-config-operator-7777fb866f-899rx\" (UID: \"2ca986cd-d4df-4af5-8918-33f445a52f49\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-899rx" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.262417 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pdrd" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.263190 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r22tz" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.264496 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66daadf1-5a22-4a90-a004-68fc7848d557-serving-cert\") pod \"authentication-operator-69f744f599-rwv2p\" (UID: \"66daadf1-5a22-4a90-a004-68fc7848d557\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rwv2p" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.264623 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zrfsb" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.264640 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66daadf1-5a22-4a90-a004-68fc7848d557-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-rwv2p\" (UID: \"66daadf1-5a22-4a90-a004-68fc7848d557\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rwv2p" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.264792 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8a5ea199-20b4-419b-97bc-0845ac5bb720-encryption-config\") pod \"apiserver-76f77b778f-99l4h\" (UID: \"8a5ea199-20b4-419b-97bc-0845ac5bb720\") " pod="openshift-apiserver/apiserver-76f77b778f-99l4h" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.264885 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/95a7b196-74b4-4d67-a0e0-3e4b92e468ed-service-ca\") pod \"console-f9d7485db-n967v\" (UID: \"95a7b196-74b4-4d67-a0e0-3e4b92e468ed\") " pod="openshift-console/console-f9d7485db-n967v" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.265098 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r24z6"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.265153 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-x5gjs" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.265097 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-9fjq6\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.266124 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dhngp"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.267200 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r24z6" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.267304 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-5gtc8"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.267422 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dhngp" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.267586 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/95a7b196-74b4-4d67-a0e0-3e4b92e468ed-console-oauth-config\") pod \"console-f9d7485db-n967v\" (UID: \"95a7b196-74b4-4d67-a0e0-3e4b92e468ed\") " pod="openshift-console/console-f9d7485db-n967v" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.268535 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-xv9r5"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.268715 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb35474f-f199-45c9-8a0d-3c7c93268ef6-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-zvz4c\" (UID: \"bb35474f-f199-45c9-8a0d-3c7c93268ef6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zvz4c" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.269139 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-lm8br"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.269412 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-5gtc8" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.269787 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2sjdm"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.269859 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-lm8br" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.269977 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xv9r5" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.270343 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4n59b"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.270388 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2sjdm" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.270704 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4n59b" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.271019 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb35474f-f199-45c9-8a0d-3c7c93268ef6-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-zvz4c\" (UID: \"bb35474f-f199-45c9-8a0d-3c7c93268ef6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zvz4c" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.271892 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-9fjq6\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.272277 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-mbrln"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.273092 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mbrln" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.274332 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.275516 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-kpldz"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.275943 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-kpldz" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.277278 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-cmm6f"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.277295 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-79h6t"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.279005 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-9fjq6\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.279069 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2s9w5"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.282201 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-rwv2p"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.283938 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-k6s7s"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.286049 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-xp9jk"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.288046 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-899rx"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.289406 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-9fjq6\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.289474 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-82dlm"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.289709 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84005632-49e7-400a-8748-5f16909cfda1-serving-cert\") pod \"apiserver-7bbb656c7d-nmgxj\" (UID: \"84005632-49e7-400a-8748-5f16909cfda1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nmgxj" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.289715 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/84005632-49e7-400a-8748-5f16909cfda1-etcd-client\") pod \"apiserver-7bbb656c7d-nmgxj\" (UID: \"84005632-49e7-400a-8748-5f16909cfda1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nmgxj" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.289870 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/dcdecdfe-61c0-4d85-99b5-e1fe25727259-machine-api-operator-tls\") pod 
\"machine-api-operator-5694c8668f-cmm6f\" (UID: \"dcdecdfe-61c0-4d85-99b5-e1fe25727259\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cmm6f" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.289890 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/870c64cb-d3e9-4491-9c83-31c51100256a-machine-approver-tls\") pod \"machine-approver-56656f9798-np52q\" (UID: \"870c64cb-d3e9-4491-9c83-31c51100256a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-np52q" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.290824 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8a5ea199-20b4-419b-97bc-0845ac5bb720-etcd-client\") pod \"apiserver-76f77b778f-99l4h\" (UID: \"8a5ea199-20b4-419b-97bc-0845ac5bb720\") " pod="openshift-apiserver/apiserver-76f77b778f-99l4h" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.291072 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wxfc6"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.292336 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zbsjj"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.294759 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-mtwnb"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.295948 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-9fjq6\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.296015 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.296834 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r2fkt"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.298242 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-rbp2p"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.299803 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lb5s8"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.301105 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-c9pnt"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.305751 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-lxrwt"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.310075 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-n967v"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.310129 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ffzj8"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.311850 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r24z6"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.318424 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.320914 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-95fhm"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.323105 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-x5gjs"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.326913 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pdrd"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.328458 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-fm26s"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.329361 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fm26s" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.330588 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dhngp"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.332294 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-5z75v"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.334347 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-lm8br"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.334450 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-5z75v" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.335230 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-xv9r5"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.335547 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.336578 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421240-5jmzz"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.337771 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r22tz"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.338916 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zrfsb"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.340046 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4n59b"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.341140 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-f2pn5"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.342470 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-fm26s"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.343661 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2sjdm"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.344927 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-mbrln"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.346398 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-5z75v"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.347483 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-pm68f"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.348717 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-pm68f"] Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.348850 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-pm68f" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.351291 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rmfg\" (UniqueName: \"kubernetes.io/projected/14ca2bab-e020-4c2d-be3c-e4138095c149-kube-api-access-5rmfg\") pod \"openshift-controller-manager-operator-756b6f6bc6-ffzj8\" (UID: \"14ca2bab-e020-4c2d-be3c-e4138095c149\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ffzj8" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.351341 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bf7272a5-afe4-4ad1-bc7e-2c2397b84a95-metrics-tls\") pod \"dns-operator-744455d44c-lxrwt\" (UID: \"bf7272a5-afe4-4ad1-bc7e-2c2397b84a95\") " pod="openshift-dns-operator/dns-operator-744455d44c-lxrwt" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.351399 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a70e2957-fa0c-4393-adaf-93a0759263d1-config\") pod \"etcd-operator-b45778765-2s9w5\" (UID: \"a70e2957-fa0c-4393-adaf-93a0759263d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s9w5" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.351423 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14ca2bab-e020-4c2d-be3c-e4138095c149-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-ffzj8\" (UID: \"14ca2bab-e020-4c2d-be3c-e4138095c149\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ffzj8" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.351511 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8a05f51-2e2c-4918-80a6-201afd3d14a5-config\") pod \"openshift-apiserver-operator-796bbdcf4f-wxfc6\" (UID: \"e8a05f51-2e2c-4918-80a6-201afd3d14a5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wxfc6" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.351537 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a70e2957-fa0c-4393-adaf-93a0759263d1-etcd-service-ca\") pod \"etcd-operator-b45778765-2s9w5\" (UID: \"a70e2957-fa0c-4393-adaf-93a0759263d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s9w5" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.351558 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a70e2957-fa0c-4393-adaf-93a0759263d1-etcd-ca\") pod \"etcd-operator-b45778765-2s9w5\" (UID: \"a70e2957-fa0c-4393-adaf-93a0759263d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s9w5" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.351577 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a70e2957-fa0c-4393-adaf-93a0759263d1-etcd-client\") pod \"etcd-operator-b45778765-2s9w5\" (UID: \"a70e2957-fa0c-4393-adaf-93a0759263d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s9w5" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.351611 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/14ca2bab-e020-4c2d-be3c-e4138095c149-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-ffzj8\" (UID: \"14ca2bab-e020-4c2d-be3c-e4138095c149\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ffzj8" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.351644 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7wjn\" (UniqueName: \"kubernetes.io/projected/e8a05f51-2e2c-4918-80a6-201afd3d14a5-kube-api-access-v7wjn\") pod \"openshift-apiserver-operator-796bbdcf4f-wxfc6\" (UID: \"e8a05f51-2e2c-4918-80a6-201afd3d14a5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wxfc6" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.351698 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scrpt\" (UniqueName: \"kubernetes.io/projected/bf7272a5-afe4-4ad1-bc7e-2c2397b84a95-kube-api-access-scrpt\") pod \"dns-operator-744455d44c-lxrwt\" (UID: \"bf7272a5-afe4-4ad1-bc7e-2c2397b84a95\") " pod="openshift-dns-operator/dns-operator-744455d44c-lxrwt" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.351720 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8a05f51-2e2c-4918-80a6-201afd3d14a5-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-wxfc6\" (UID: \"e8a05f51-2e2c-4918-80a6-201afd3d14a5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wxfc6" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.351770 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a70e2957-fa0c-4393-adaf-93a0759263d1-serving-cert\") pod \"etcd-operator-b45778765-2s9w5\" (UID: \"a70e2957-fa0c-4393-adaf-93a0759263d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s9w5" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.351792 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcnr5\" (UniqueName: \"kubernetes.io/projected/a70e2957-fa0c-4393-adaf-93a0759263d1-kube-api-access-qcnr5\") pod \"etcd-operator-b45778765-2s9w5\" (UID: \"a70e2957-fa0c-4393-adaf-93a0759263d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s9w5" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.352949 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a70e2957-fa0c-4393-adaf-93a0759263d1-etcd-ca\") pod \"etcd-operator-b45778765-2s9w5\" (UID: \"a70e2957-fa0c-4393-adaf-93a0759263d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s9w5" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.353422 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a70e2957-fa0c-4393-adaf-93a0759263d1-config\") pod \"etcd-operator-b45778765-2s9w5\" (UID: \"a70e2957-fa0c-4393-adaf-93a0759263d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s9w5" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.353738 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8a05f51-2e2c-4918-80a6-201afd3d14a5-config\") pod \"openshift-apiserver-operator-796bbdcf4f-wxfc6\" (UID: \"e8a05f51-2e2c-4918-80a6-201afd3d14a5\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wxfc6" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.354264 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a70e2957-fa0c-4393-adaf-93a0759263d1-etcd-service-ca\") pod \"etcd-operator-b45778765-2s9w5\" (UID: \"a70e2957-fa0c-4393-adaf-93a0759263d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s9w5" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.355802 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.357465 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a70e2957-fa0c-4393-adaf-93a0759263d1-serving-cert\") pod \"etcd-operator-b45778765-2s9w5\" (UID: \"a70e2957-fa0c-4393-adaf-93a0759263d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s9w5" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.357764 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a70e2957-fa0c-4393-adaf-93a0759263d1-etcd-client\") pod \"etcd-operator-b45778765-2s9w5\" (UID: \"a70e2957-fa0c-4393-adaf-93a0759263d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s9w5" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.358051 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8a05f51-2e2c-4918-80a6-201afd3d14a5-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-wxfc6\" (UID: \"e8a05f51-2e2c-4918-80a6-201afd3d14a5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wxfc6" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.375028 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.384604 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14ca2bab-e020-4c2d-be3c-e4138095c149-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-ffzj8\" (UID: \"14ca2bab-e020-4c2d-be3c-e4138095c149\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ffzj8" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.395294 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.414924 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.434696 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.444008 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14ca2bab-e020-4c2d-be3c-e4138095c149-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-ffzj8\" (UID: \"14ca2bab-e020-4c2d-be3c-e4138095c149\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ffzj8" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.475275 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.494754 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.514928 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.524731 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bf7272a5-afe4-4ad1-bc7e-2c2397b84a95-metrics-tls\") pod \"dns-operator-744455d44c-lxrwt\" (UID: \"bf7272a5-afe4-4ad1-bc7e-2c2397b84a95\") " pod="openshift-dns-operator/dns-operator-744455d44c-lxrwt" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.534884 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.574964 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.595511 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.615535 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.634197 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.670458 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr4w7\" (UniqueName: \"kubernetes.io/projected/e2598866-d004-41a0-b058-01fb9a379df5-kube-api-access-jr4w7\") pod \"oauth-openshift-558db77b4-9fjq6\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.690323 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2fp5\" (UniqueName: \"kubernetes.io/projected/8089a4b0-6e30-4f86-a14c-17435b2eaab8-kube-api-access-p2fp5\") pod \"console-operator-58897d9998-mtwnb\" (UID: \"8089a4b0-6e30-4f86-a14c-17435b2eaab8\") " pod="openshift-console-operator/console-operator-58897d9998-mtwnb" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.692089 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-mtwnb" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.709902 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvxdt\" (UniqueName: \"kubernetes.io/projected/66daadf1-5a22-4a90-a004-68fc7848d557-kube-api-access-fvxdt\") pod \"authentication-operator-69f744f599-rwv2p\" (UID: \"66daadf1-5a22-4a90-a004-68fc7848d557\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rwv2p" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.729034 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pfj8\" (UniqueName: \"kubernetes.io/projected/ae53c0af-26d8-4db2-8885-86b643bf2ee1-kube-api-access-4pfj8\") pod \"controller-manager-879f6c89f-k6s7s\" (UID: \"ae53c0af-26d8-4db2-8885-86b643bf2ee1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k6s7s" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.750208 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmb4z\" (UniqueName: \"kubernetes.io/projected/fef799ad-c87c-4f60-9b8e-afb51958013d-kube-api-access-kmb4z\") pod \"route-controller-manager-6576b87f9c-c9pnt\" (UID: \"fef799ad-c87c-4f60-9b8e-afb51958013d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c9pnt" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.754979 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.791446 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bb35474f-f199-45c9-8a0d-3c7c93268ef6-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-zvz4c\" (UID: \"bb35474f-f199-45c9-8a0d-3c7c93268ef6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zvz4c" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.811862 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2ltg\" (UniqueName: \"kubernetes.io/projected/8a5ea199-20b4-419b-97bc-0845ac5bb720-kube-api-access-h2ltg\") pod \"apiserver-76f77b778f-99l4h\" (UID: \"8a5ea199-20b4-419b-97bc-0845ac5bb720\") " pod="openshift-apiserver/apiserver-76f77b778f-99l4h" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.830282 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-k6s7s" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.832510 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mmb9\" (UniqueName: \"kubernetes.io/projected/58ef0377-76a5-4299-999f-7bdd55606533-kube-api-access-9mmb9\") pod \"downloads-7954f5f757-xp9jk\" (UID: \"58ef0377-76a5-4299-999f-7bdd55606533\") " pod="openshift-console/downloads-7954f5f757-xp9jk" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.849366 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn69d\" (UniqueName: \"kubernetes.io/projected/dcdecdfe-61c0-4d85-99b5-e1fe25727259-kube-api-access-jn69d\") pod \"machine-api-operator-5694c8668f-cmm6f\" (UID: \"dcdecdfe-61c0-4d85-99b5-e1fe25727259\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cmm6f" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.858236 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.859445 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-mtwnb"] Dec 09 10:03:32 crc kubenswrapper[5002]: W1209 10:03:32.868168 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8089a4b0_6e30_4f86_a14c_17435b2eaab8.slice/crio-68d3bd2f0ee24d01f346b1f4232de53d822d26f15497891f5d013b17dd0af5a5 WatchSource:0}: Error finding container 68d3bd2f0ee24d01f346b1f4232de53d822d26f15497891f5d013b17dd0af5a5: Status 404 returned error can't find the container with id 68d3bd2f0ee24d01f346b1f4232de53d822d26f15497891f5d013b17dd0af5a5 Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.874962 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.895124 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.910164 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.915658 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.922581 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-cmm6f" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.935527 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.941398 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-xp9jk" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.956532 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.957666 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-rwv2p" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.971565 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c9pnt" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.975460 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 09 10:03:32 crc kubenswrapper[5002]: I1209 10:03:32.999386 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-k6s7s"] Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.004564 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.015513 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.036400 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.039020 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-99l4h" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.054926 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.075036 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.098933 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.100390 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9fjq6"] Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.115024 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.126474 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-cmm6f"] Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.135651 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.154614 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-xp9jk"] Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.155206 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.176465 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.192412 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-mtwnb" 
event={"ID":"8089a4b0-6e30-4f86-a14c-17435b2eaab8","Type":"ContainerStarted","Data":"68d3bd2f0ee24d01f346b1f4232de53d822d26f15497891f5d013b17dd0af5a5"} Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.193464 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-k6s7s" event={"ID":"ae53c0af-26d8-4db2-8885-86b643bf2ee1","Type":"ContainerStarted","Data":"917b424463977c2974818a32b9e61d3ead53c2d7a90b84b270f96c2d44c65ee8"} Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.196477 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.215907 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.234715 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.257913 5002 request.go:700] Waited for 1.002844388s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-cluster-samples-operator/serviceaccounts/cluster-samples-operator/token Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.279789 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp6p9\" (UniqueName: \"kubernetes.io/projected/0da3a8de-5a81-4bfb-87d0-756273bfefb3-kube-api-access-rp6p9\") pod \"cluster-samples-operator-665b6dd947-79h6t\" (UID: \"0da3a8de-5a81-4bfb-87d0-756273bfefb3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-79h6t" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.296720 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcx4k\" (UniqueName: \"kubernetes.io/projected/870c64cb-d3e9-4491-9c83-31c51100256a-kube-api-access-xcx4k\") pod \"machine-approver-56656f9798-np52q\" (UID: \"870c64cb-d3e9-4491-9c83-31c51100256a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-np52q" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.308464 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4mjj\" (UniqueName: \"kubernetes.io/projected/84005632-49e7-400a-8748-5f16909cfda1-kube-api-access-c4mjj\") pod \"apiserver-7bbb656c7d-nmgxj\" (UID: \"84005632-49e7-400a-8748-5f16909cfda1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nmgxj" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.315876 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.348576 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krgnn\" (UniqueName: \"kubernetes.io/projected/2ca986cd-d4df-4af5-8918-33f445a52f49-kube-api-access-krgnn\") pod \"openshift-config-operator-7777fb866f-899rx\" (UID: \"2ca986cd-d4df-4af5-8918-33f445a52f49\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-899rx" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.355034 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 
10:03:33.371472 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nmgxj" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.387825 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-np52q" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.389119 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-rwv2p"] Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.391277 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwzvz\" (UniqueName: \"kubernetes.io/projected/95a7b196-74b4-4d67-a0e0-3e4b92e468ed-kube-api-access-nwzvz\") pod \"console-f9d7485db-n967v\" (UID: \"95a7b196-74b4-4d67-a0e0-3e4b92e468ed\") " pod="openshift-console/console-f9d7485db-n967v" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.395086 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.415719 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.435132 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.454402 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-899rx" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.454537 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.474500 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.494962 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.515838 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.535579 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-n967v" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.536167 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.574546 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n4g8\" (UniqueName: \"kubernetes.io/projected/bb35474f-f199-45c9-8a0d-3c7c93268ef6-kube-api-access-6n4g8\") pod \"cluster-image-registry-operator-dc59b4c8b-zvz4c\" (UID: \"bb35474f-f199-45c9-8a0d-3c7c93268ef6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zvz4c" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.575913 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.577102 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-79h6t" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.585143 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zvz4c" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.595648 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.616006 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.635281 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.656602 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.675966 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.696600 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.716593 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.735270 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.755670 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.785867 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.795179 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.814885 5002 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress"/"service-ca-bundle" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.835033 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.854849 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.875287 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.895460 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.916455 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.936670 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.956297 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.971137 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.971344 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.971525 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.971591 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.971697 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:03:33 crc kubenswrapper[5002]: E1209 10:03:33.972298 5002 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:05:35.972245617 +0000 UTC m=+268.364296738 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.977598 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.978778 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.978897 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.987478 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 09 10:03:33 crc kubenswrapper[5002]: I1209 10:03:33.995490 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.015507 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.036171 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.056154 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.072596 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.075408 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.076592 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:03:34 crc kubenswrapper[5002]: W1209 10:03:34.083785 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2598866_d004_41a0_b058_01fb9a379df5.slice/crio-567fbf93663e94a933f79457f9afaa944177b75e3ae16dfb87fc19fa0e90f5af WatchSource:0}: Error finding container 567fbf93663e94a933f79457f9afaa944177b75e3ae16dfb87fc19fa0e90f5af: Status 404 returned error can't find the container with id 567fbf93663e94a933f79457f9afaa944177b75e3ae16dfb87fc19fa0e90f5af Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.088960 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.094570 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-c9pnt"] Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.098556 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.116116 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 09 10:03:34 crc kubenswrapper[5002]: W1209 10:03:34.116687 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfef799ad_c87c_4f60_9b8e_afb51958013d.slice/crio-ae94c6aa98f334d9b2a408983bfebf6e7e28471071dd496597775f7c3d410e8e WatchSource:0}: Error finding container ae94c6aa98f334d9b2a408983bfebf6e7e28471071dd496597775f7c3d410e8e: Status 404 returned error can't find the container with id ae94c6aa98f334d9b2a408983bfebf6e7e28471071dd496597775f7c3d410e8e Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.135731 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.155045 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.175579 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.195458 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.202898 5002 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c9pnt" event={"ID":"fef799ad-c87c-4f60-9b8e-afb51958013d","Type":"ContainerStarted","Data":"ae94c6aa98f334d9b2a408983bfebf6e7e28471071dd496597775f7c3d410e8e"} Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.204309 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-rwv2p" event={"ID":"66daadf1-5a22-4a90-a004-68fc7848d557","Type":"ContainerStarted","Data":"a9ab154355fe95ecabaadaf3afa4b6058d0d47bed2d7e34d8877e2442ea117a2"} Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.207660 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-xp9jk" event={"ID":"58ef0377-76a5-4299-999f-7bdd55606533","Type":"ContainerStarted","Data":"002589faa7b3f6e0c9bd3a23b4c63c4a0067e3cea6be2887458fe5155e1ad813"} Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.208269 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-cmm6f" event={"ID":"dcdecdfe-61c0-4d85-99b5-e1fe25727259","Type":"ContainerStarted","Data":"95437a9e22a570525145cbb320ae0b16d3d9d57c950cff51605c5a4f3b53ab3c"} Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.208858 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6" event={"ID":"e2598866-d004-41a0-b058-01fb9a379df5","Type":"ContainerStarted","Data":"567fbf93663e94a933f79457f9afaa944177b75e3ae16dfb87fc19fa0e90f5af"} Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.211397 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-mtwnb" event={"ID":"8089a4b0-6e30-4f86-a14c-17435b2eaab8","Type":"ContainerStarted","Data":"55ed8743eb3737629832c15539be466d15947b56d764d759de90ca8df0fc346c"} Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.211761 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-mtwnb" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.217610 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.223631 5002 patch_prober.go:28] interesting pod/console-operator-58897d9998-mtwnb container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.18:8443/readyz\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.223678 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-mtwnb" podUID="8089a4b0-6e30-4f86-a14c-17435b2eaab8" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/readyz\": dial tcp 10.217.0.18:8443: connect: connection refused" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.235726 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.255527 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.274202 5002 request.go:700] Waited for 1.944436835s 
due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.276273 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.276807 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.300320 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.315805 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.335693 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.358291 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.376840 5002 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.395858 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.415291 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.440652 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.479147 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rmfg\" (UniqueName: \"kubernetes.io/projected/14ca2bab-e020-4c2d-be3c-e4138095c149-kube-api-access-5rmfg\") pod \"openshift-controller-manager-operator-756b6f6bc6-ffzj8\" (UID: \"14ca2bab-e020-4c2d-be3c-e4138095c149\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ffzj8" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.508322 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcnr5\" (UniqueName: \"kubernetes.io/projected/a70e2957-fa0c-4393-adaf-93a0759263d1-kube-api-access-qcnr5\") pod \"etcd-operator-b45778765-2s9w5\" (UID: \"a70e2957-fa0c-4393-adaf-93a0759263d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s9w5" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.522919 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ffzj8" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.527481 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scrpt\" (UniqueName: \"kubernetes.io/projected/bf7272a5-afe4-4ad1-bc7e-2c2397b84a95-kube-api-access-scrpt\") pod \"dns-operator-744455d44c-lxrwt\" (UID: \"bf7272a5-afe4-4ad1-bc7e-2c2397b84a95\") " pod="openshift-dns-operator/dns-operator-744455d44c-lxrwt" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.535848 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-lxrwt" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.542420 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7wjn\" (UniqueName: \"kubernetes.io/projected/e8a05f51-2e2c-4918-80a6-201afd3d14a5-kube-api-access-v7wjn\") pod \"openshift-apiserver-operator-796bbdcf4f-wxfc6\" (UID: \"e8a05f51-2e2c-4918-80a6-201afd3d14a5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wxfc6" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.594760 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cac4bbbb-1f9f-4107-99ee-cb71771ec460-registry-certificates\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.595171 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cac4bbbb-1f9f-4107-99ee-cb71771ec460-registry-tls\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.595189 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv5lc\" (UniqueName: \"kubernetes.io/projected/cac4bbbb-1f9f-4107-99ee-cb71771ec460-kube-api-access-xv5lc\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.595222 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cac4bbbb-1f9f-4107-99ee-cb71771ec460-ca-trust-extracted\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.595277 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cac4bbbb-1f9f-4107-99ee-cb71771ec460-trusted-ca\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.595317 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/cac4bbbb-1f9f-4107-99ee-cb71771ec460-installation-pull-secrets\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.595350 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.595454 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cac4bbbb-1f9f-4107-99ee-cb71771ec460-bound-sa-token\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:34 crc kubenswrapper[5002]: E1209 10:03:34.596376 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 10:03:35.096361967 +0000 UTC m=+147.488413048 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-82dlm" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.600791 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zvz4c"] Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.671154 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-79h6t"] Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.693979 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-99l4h"] Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.702514 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 10:03:34 crc kubenswrapper[5002]: E1209 10:03:34.702673 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:03:35.202646926 +0000 UTC m=+147.594697997 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.703042 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d9cdd76-7711-48de-a449-98e306ba4fe7-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-lb5s8\" (UID: \"4d9cdd76-7711-48de-a449-98e306ba4fe7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lb5s8" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.703077 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9pmz\" (UniqueName: \"kubernetes.io/projected/f85c45d5-d9fa-418c-91ea-0f054181f4c7-kube-api-access-p9pmz\") pod \"csi-hostpathplugin-5z75v\" (UID: \"f85c45d5-d9fa-418c-91ea-0f054181f4c7\") " pod="hostpath-provisioner/csi-hostpathplugin-5z75v" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.703142 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9e8605e9-eb8b-4676-8d27-b5f424cb94d1-srv-cert\") pod \"catalog-operator-68c6474976-2sjdm\" (UID: \"9e8605e9-eb8b-4676-8d27-b5f424cb94d1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2sjdm" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.703233 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea6bb106-46e5-4ae3-8bf2-4c07c9106a92-config\") pod \"service-ca-operator-777779d784-95fhm\" (UID: \"ea6bb106-46e5-4ae3-8bf2-4c07c9106a92\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-95fhm" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.703261 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/20bd2a89-b1af-4bb2-9e28-862273221604-srv-cert\") pod \"olm-operator-6b444d44fb-zrfsb\" (UID: \"20bd2a89-b1af-4bb2-9e28-862273221604\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zrfsb" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.703295 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/79241a31-6d92-482a-9cc3-c2370d516cbf-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-x5gjs\" (UID: \"79241a31-6d92-482a-9cc3-c2370d516cbf\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-x5gjs" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.703325 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn6zt\" (UniqueName: \"kubernetes.io/projected/54bf04d4-958e-4105-ab04-73920ec14b8f-kube-api-access-kn6zt\") pod \"ingress-operator-5b745b69d9-xv9r5\" (UID: \"54bf04d4-958e-4105-ab04-73920ec14b8f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xv9r5" Dec 09 10:03:34 crc 
kubenswrapper[5002]: I1209 10:03:34.703338 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b2z6\" (UniqueName: \"kubernetes.io/projected/ea6bb106-46e5-4ae3-8bf2-4c07c9106a92-kube-api-access-4b2z6\") pod \"service-ca-operator-777779d784-95fhm\" (UID: \"ea6bb106-46e5-4ae3-8bf2-4c07c9106a92\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-95fhm" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.703368 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cac4bbbb-1f9f-4107-99ee-cb71771ec460-bound-sa-token\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.703384 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b80223e8-1b01-446b-8db3-6f39c3455dca-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-4n59b\" (UID: \"b80223e8-1b01-446b-8db3-6f39c3455dca\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4n59b" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.703432 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/32959a40-5fd2-485b-9fff-373e5c39a86b-signing-cabundle\") pod \"service-ca-9c57cc56f-lm8br\" (UID: \"32959a40-5fd2-485b-9fff-373e5c39a86b\") " pod="openshift-service-ca/service-ca-9c57cc56f-lm8br" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.703449 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rj84f\" (UniqueName: \"kubernetes.io/projected/250ef544-c81b-4487-9eba-fcce62841a3d-kube-api-access-rj84f\") pod \"migrator-59844c95c7-f2pn5\" (UID: \"250ef544-c81b-4487-9eba-fcce62841a3d\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-f2pn5" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.703474 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/54bf04d4-958e-4105-ab04-73920ec14b8f-trusted-ca\") pod \"ingress-operator-5b745b69d9-xv9r5\" (UID: \"54bf04d4-958e-4105-ab04-73920ec14b8f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xv9r5" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.703489 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67316e5c-6e24-45d1-8789-eec0570baa40-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-r2fkt\" (UID: \"67316e5c-6e24-45d1-8789-eec0570baa40\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r2fkt" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.703514 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6e408545-84a5-4b62-ab02-e213a58d1c53-config-volume\") pod \"collect-profiles-29421240-5jmzz\" (UID: \"6e408545-84a5-4b62-ab02-e213a58d1c53\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421240-5jmzz" Dec 09 10:03:34 crc 
kubenswrapper[5002]: I1209 10:03:34.703527 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78m8f\" (UniqueName: \"kubernetes.io/projected/495d9e9a-51ee-49cc-b870-6c990157bdf5-kube-api-access-78m8f\") pod \"packageserver-d55dfcdfc-r24z6\" (UID: \"495d9e9a-51ee-49cc-b870-6c990157bdf5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r24z6" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.703553 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d9cdd76-7711-48de-a449-98e306ba4fe7-config\") pod \"kube-controller-manager-operator-78b949d7b-lb5s8\" (UID: \"4d9cdd76-7711-48de-a449-98e306ba4fe7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lb5s8" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.703568 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fb3d4992-003e-4411-a1e4-357e176545f8-auth-proxy-config\") pod \"machine-config-operator-74547568cd-rbp2p\" (UID: \"fb3d4992-003e-4411-a1e4-357e176545f8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rbp2p" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.705751 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggkpf\" (UniqueName: \"kubernetes.io/projected/20bd2a89-b1af-4bb2-9e28-862273221604-kube-api-access-ggkpf\") pod \"olm-operator-6b444d44fb-zrfsb\" (UID: \"20bd2a89-b1af-4bb2-9e28-862273221604\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zrfsb" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.705792 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/58eda1c5-bb10-4843-9ed4-64ee38d040ee-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zbsjj\" (UID: \"58eda1c5-bb10-4843-9ed4-64ee38d040ee\") " pod="openshift-marketplace/marketplace-operator-79b997595-zbsjj" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.705849 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cac4bbbb-1f9f-4107-99ee-cb71771ec460-registry-tls\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.705895 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xv5lc\" (UniqueName: \"kubernetes.io/projected/cac4bbbb-1f9f-4107-99ee-cb71771ec460-kube-api-access-xv5lc\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.705919 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/32959a40-5fd2-485b-9fff-373e5c39a86b-signing-key\") pod \"service-ca-9c57cc56f-lm8br\" (UID: \"32959a40-5fd2-485b-9fff-373e5c39a86b\") " pod="openshift-service-ca/service-ca-9c57cc56f-lm8br" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 
10:03:34.705963 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b80223e8-1b01-446b-8db3-6f39c3455dca-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-4n59b\" (UID: \"b80223e8-1b01-446b-8db3-6f39c3455dca\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4n59b" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.706129 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/99f91e54-9dc7-43ca-86ee-4258c22c6004-default-certificate\") pod \"router-default-5444994796-5gtc8\" (UID: \"99f91e54-9dc7-43ca-86ee-4258c22c6004\") " pod="openshift-ingress/router-default-5444994796-5gtc8" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.706298 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/54bf04d4-958e-4105-ab04-73920ec14b8f-metrics-tls\") pod \"ingress-operator-5b745b69d9-xv9r5\" (UID: \"54bf04d4-958e-4105-ab04-73920ec14b8f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xv9r5" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.706332 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ed519358-adaa-4396-a436-0695d79eb551-proxy-tls\") pod \"machine-config-controller-84d6567774-mbrln\" (UID: \"ed519358-adaa-4396-a436-0695d79eb551\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mbrln" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.706599 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/24664ba4-8500-4f9f-bd2a-c7ecd2b114dc-cert\") pod \"ingress-canary-fm26s\" (UID: \"24664ba4-8500-4f9f-bd2a-c7ecd2b114dc\") " pod="openshift-ingress-canary/ingress-canary-fm26s" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.706743 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqqc8\" (UniqueName: \"kubernetes.io/projected/ed519358-adaa-4396-a436-0695d79eb551-kube-api-access-lqqc8\") pod \"machine-config-controller-84d6567774-mbrln\" (UID: \"ed519358-adaa-4396-a436-0695d79eb551\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mbrln" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.706760 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99f91e54-9dc7-43ca-86ee-4258c22c6004-service-ca-bundle\") pod \"router-default-5444994796-5gtc8\" (UID: \"99f91e54-9dc7-43ca-86ee-4258c22c6004\") " pod="openshift-ingress/router-default-5444994796-5gtc8" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.706790 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/f85c45d5-d9fa-418c-91ea-0f054181f4c7-plugins-dir\") pod \"csi-hostpathplugin-5z75v\" (UID: \"f85c45d5-d9fa-418c-91ea-0f054181f4c7\") " pod="hostpath-provisioner/csi-hostpathplugin-5z75v" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.706825 5002 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/0af4f459-58c9-4433-9faf-5e669262fb2e-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-r22tz\" (UID: \"0af4f459-58c9-4433-9faf-5e669262fb2e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r22tz" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.706858 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/495d9e9a-51ee-49cc-b870-6c990157bdf5-tmpfs\") pod \"packageserver-d55dfcdfc-r24z6\" (UID: \"495d9e9a-51ee-49cc-b870-6c990157bdf5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r24z6" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.706911 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/495d9e9a-51ee-49cc-b870-6c990157bdf5-webhook-cert\") pod \"packageserver-d55dfcdfc-r24z6\" (UID: \"495d9e9a-51ee-49cc-b870-6c990157bdf5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r24z6" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.708182 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/20bd2a89-b1af-4bb2-9e28-862273221604-profile-collector-cert\") pod \"olm-operator-6b444d44fb-zrfsb\" (UID: \"20bd2a89-b1af-4bb2-9e28-862273221604\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zrfsb" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.708234 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/58eda1c5-bb10-4843-9ed4-64ee38d040ee-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zbsjj\" (UID: \"58eda1c5-bb10-4843-9ed4-64ee38d040ee\") " pod="openshift-marketplace/marketplace-operator-79b997595-zbsjj" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.708257 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fb3d4992-003e-4411-a1e4-357e176545f8-proxy-tls\") pod \"machine-config-operator-74547568cd-rbp2p\" (UID: \"fb3d4992-003e-4411-a1e4-357e176545f8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rbp2p" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.708273 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0357ba06-3248-4036-86c9-975788e67bdf-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-9pdrd\" (UID: \"0357ba06-3248-4036-86c9-975788e67bdf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pdrd" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.708318 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f579d242-665c-49be-b592-b6e539eb99c0-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-dhngp\" (UID: \"f579d242-665c-49be-b592-b6e539eb99c0\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dhngp" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.708465 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0357ba06-3248-4036-86c9-975788e67bdf-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-9pdrd\" (UID: \"0357ba06-3248-4036-86c9-975788e67bdf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pdrd" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.708489 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bclmz\" (UniqueName: \"kubernetes.io/projected/0af4f459-58c9-4433-9faf-5e669262fb2e-kube-api-access-bclmz\") pod \"control-plane-machine-set-operator-78cbb6b69f-r22tz\" (UID: \"0af4f459-58c9-4433-9faf-5e669262fb2e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r22tz" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.708508 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ab422596-7d3f-4a87-b7ac-c62e68465bb1-metrics-tls\") pod \"dns-default-pm68f\" (UID: \"ab422596-7d3f-4a87-b7ac-c62e68465bb1\") " pod="openshift-dns/dns-default-pm68f" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.708536 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ed519358-adaa-4396-a436-0695d79eb551-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-mbrln\" (UID: \"ed519358-adaa-4396-a436-0695d79eb551\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mbrln" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.708570 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea6bb106-46e5-4ae3-8bf2-4c07c9106a92-serving-cert\") pod \"service-ca-operator-777779d784-95fhm\" (UID: \"ea6bb106-46e5-4ae3-8bf2-4c07c9106a92\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-95fhm" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.710040 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0357ba06-3248-4036-86c9-975788e67bdf-config\") pod \"kube-apiserver-operator-766d6c64bb-9pdrd\" (UID: \"0357ba06-3248-4036-86c9-975788e67bdf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pdrd" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.710499 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnf5n\" (UniqueName: \"kubernetes.io/projected/f579d242-665c-49be-b592-b6e539eb99c0-kube-api-access-rnf5n\") pod \"package-server-manager-789f6589d5-dhngp\" (UID: \"f579d242-665c-49be-b592-b6e539eb99c0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dhngp" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.711073 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcq5j\" (UniqueName: \"kubernetes.io/projected/6e408545-84a5-4b62-ab02-e213a58d1c53-kube-api-access-tcq5j\") pod \"collect-profiles-29421240-5jmzz\" 
(UID: \"6e408545-84a5-4b62-ab02-e213a58d1c53\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421240-5jmzz" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.711142 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ab422596-7d3f-4a87-b7ac-c62e68465bb1-config-volume\") pod \"dns-default-pm68f\" (UID: \"ab422596-7d3f-4a87-b7ac-c62e68465bb1\") " pod="openshift-dns/dns-default-pm68f" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.711494 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwwtz\" (UniqueName: \"kubernetes.io/projected/32959a40-5fd2-485b-9fff-373e5c39a86b-kube-api-access-xwwtz\") pod \"service-ca-9c57cc56f-lm8br\" (UID: \"32959a40-5fd2-485b-9fff-373e5c39a86b\") " pod="openshift-service-ca/service-ca-9c57cc56f-lm8br" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.711598 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shw9f\" (UniqueName: \"kubernetes.io/projected/fb3d4992-003e-4411-a1e4-357e176545f8-kube-api-access-shw9f\") pod \"machine-config-operator-74547568cd-rbp2p\" (UID: \"fb3d4992-003e-4411-a1e4-357e176545f8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rbp2p" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.711990 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/9f2b7402-d357-4912-9cb3-b0976966ad20-certs\") pod \"machine-config-server-kpldz\" (UID: \"9f2b7402-d357-4912-9cb3-b0976966ad20\") " pod="openshift-machine-config-operator/machine-config-server-kpldz" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.712067 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwldf\" (UniqueName: \"kubernetes.io/projected/b80223e8-1b01-446b-8db3-6f39c3455dca-kube-api-access-rwldf\") pod \"kube-storage-version-migrator-operator-b67b599dd-4n59b\" (UID: \"b80223e8-1b01-446b-8db3-6f39c3455dca\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4n59b" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.712096 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99f91e54-9dc7-43ca-86ee-4258c22c6004-metrics-certs\") pod \"router-default-5444994796-5gtc8\" (UID: \"99f91e54-9dc7-43ca-86ee-4258c22c6004\") " pod="openshift-ingress/router-default-5444994796-5gtc8" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.712180 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/54bf04d4-958e-4105-ab04-73920ec14b8f-bound-sa-token\") pod \"ingress-operator-5b745b69d9-xv9r5\" (UID: \"54bf04d4-958e-4105-ab04-73920ec14b8f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xv9r5" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.712473 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6e408545-84a5-4b62-ab02-e213a58d1c53-secret-volume\") pod \"collect-profiles-29421240-5jmzz\" (UID: \"6e408545-84a5-4b62-ab02-e213a58d1c53\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29421240-5jmzz" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.712568 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krdfc\" (UniqueName: \"kubernetes.io/projected/ab422596-7d3f-4a87-b7ac-c62e68465bb1-kube-api-access-krdfc\") pod \"dns-default-pm68f\" (UID: \"ab422596-7d3f-4a87-b7ac-c62e68465bb1\") " pod="openshift-dns/dns-default-pm68f" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.712605 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cac4bbbb-1f9f-4107-99ee-cb71771ec460-registry-certificates\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.712909 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9e8605e9-eb8b-4676-8d27-b5f424cb94d1-profile-collector-cert\") pod \"catalog-operator-68c6474976-2sjdm\" (UID: \"9e8605e9-eb8b-4676-8d27-b5f424cb94d1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2sjdm" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.712933 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs9vj\" (UniqueName: \"kubernetes.io/projected/9f2b7402-d357-4912-9cb3-b0976966ad20-kube-api-access-zs9vj\") pod \"machine-config-server-kpldz\" (UID: \"9f2b7402-d357-4912-9cb3-b0976966ad20\") " pod="openshift-machine-config-operator/machine-config-server-kpldz" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.712967 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psrns\" (UniqueName: \"kubernetes.io/projected/24664ba4-8500-4f9f-bd2a-c7ecd2b114dc-kube-api-access-psrns\") pod \"ingress-canary-fm26s\" (UID: \"24664ba4-8500-4f9f-bd2a-c7ecd2b114dc\") " pod="openshift-ingress-canary/ingress-canary-fm26s" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.714504 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cac4bbbb-1f9f-4107-99ee-cb71771ec460-ca-trust-extracted\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.714835 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f85c45d5-d9fa-418c-91ea-0f054181f4c7-registration-dir\") pod \"csi-hostpathplugin-5z75v\" (UID: \"f85c45d5-d9fa-418c-91ea-0f054181f4c7\") " pod="hostpath-provisioner/csi-hostpathplugin-5z75v" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.714862 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/495d9e9a-51ee-49cc-b870-6c990157bdf5-apiservice-cert\") pod \"packageserver-d55dfcdfc-r24z6\" (UID: \"495d9e9a-51ee-49cc-b870-6c990157bdf5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r24z6" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 
10:03:34.714907 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/9f2b7402-d357-4912-9cb3-b0976966ad20-node-bootstrap-token\") pod \"machine-config-server-kpldz\" (UID: \"9f2b7402-d357-4912-9cb3-b0976966ad20\") " pod="openshift-machine-config-operator/machine-config-server-kpldz" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.715334 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cac4bbbb-1f9f-4107-99ee-cb71771ec460-registry-certificates\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.715385 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cac4bbbb-1f9f-4107-99ee-cb71771ec460-ca-trust-extracted\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.715408 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk94v\" (UniqueName: \"kubernetes.io/projected/79241a31-6d92-482a-9cc3-c2370d516cbf-kube-api-access-hk94v\") pod \"multus-admission-controller-857f4d67dd-x5gjs\" (UID: \"79241a31-6d92-482a-9cc3-c2370d516cbf\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-x5gjs" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.715791 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/67316e5c-6e24-45d1-8789-eec0570baa40-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-r2fkt\" (UID: \"67316e5c-6e24-45d1-8789-eec0570baa40\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r2fkt" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.715990 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47gkv\" (UniqueName: \"kubernetes.io/projected/9e8605e9-eb8b-4676-8d27-b5f424cb94d1-kube-api-access-47gkv\") pod \"catalog-operator-68c6474976-2sjdm\" (UID: \"9e8605e9-eb8b-4676-8d27-b5f424cb94d1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2sjdm" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.716011 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbszl\" (UniqueName: \"kubernetes.io/projected/58eda1c5-bb10-4843-9ed4-64ee38d040ee-kube-api-access-jbszl\") pod \"marketplace-operator-79b997595-zbsjj\" (UID: \"58eda1c5-bb10-4843-9ed4-64ee38d040ee\") " pod="openshift-marketplace/marketplace-operator-79b997595-zbsjj" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.716308 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d9cdd76-7711-48de-a449-98e306ba4fe7-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-lb5s8\" (UID: \"4d9cdd76-7711-48de-a449-98e306ba4fe7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lb5s8" Dec 
09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.716560 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4q4t\" (UniqueName: \"kubernetes.io/projected/99f91e54-9dc7-43ca-86ee-4258c22c6004-kube-api-access-v4q4t\") pod \"router-default-5444994796-5gtc8\" (UID: \"99f91e54-9dc7-43ca-86ee-4258c22c6004\") " pod="openshift-ingress/router-default-5444994796-5gtc8" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.716624 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cac4bbbb-1f9f-4107-99ee-cb71771ec460-trusted-ca\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.716645 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f85c45d5-d9fa-418c-91ea-0f054181f4c7-socket-dir\") pod \"csi-hostpathplugin-5z75v\" (UID: \"f85c45d5-d9fa-418c-91ea-0f054181f4c7\") " pod="hostpath-provisioner/csi-hostpathplugin-5z75v" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.716739 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cac4bbbb-1f9f-4107-99ee-cb71771ec460-installation-pull-secrets\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.716838 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67316e5c-6e24-45d1-8789-eec0570baa40-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-r2fkt\" (UID: \"67316e5c-6e24-45d1-8789-eec0570baa40\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r2fkt" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.716867 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/f85c45d5-d9fa-418c-91ea-0f054181f4c7-csi-data-dir\") pod \"csi-hostpathplugin-5z75v\" (UID: \"f85c45d5-d9fa-418c-91ea-0f054181f4c7\") " pod="hostpath-provisioner/csi-hostpathplugin-5z75v" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.716883 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/99f91e54-9dc7-43ca-86ee-4258c22c6004-stats-auth\") pod \"router-default-5444994796-5gtc8\" (UID: \"99f91e54-9dc7-43ca-86ee-4258c22c6004\") " pod="openshift-ingress/router-default-5444994796-5gtc8" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.716951 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.716969 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"images\" (UniqueName: \"kubernetes.io/configmap/fb3d4992-003e-4411-a1e4-357e176545f8-images\") pod \"machine-config-operator-74547568cd-rbp2p\" (UID: \"fb3d4992-003e-4411-a1e4-357e176545f8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rbp2p" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.717125 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/f85c45d5-d9fa-418c-91ea-0f054181f4c7-mountpoint-dir\") pod \"csi-hostpathplugin-5z75v\" (UID: \"f85c45d5-d9fa-418c-91ea-0f054181f4c7\") " pod="hostpath-provisioner/csi-hostpathplugin-5z75v" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.719638 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cac4bbbb-1f9f-4107-99ee-cb71771ec460-registry-tls\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:34 crc kubenswrapper[5002]: E1209 10:03:34.719933 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 10:03:35.219916199 +0000 UTC m=+147.611967280 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-82dlm" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.721617 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cac4bbbb-1f9f-4107-99ee-cb71771ec460-installation-pull-secrets\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.723981 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cac4bbbb-1f9f-4107-99ee-cb71771ec460-bound-sa-token\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.725430 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-n967v"] Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.729001 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cac4bbbb-1f9f-4107-99ee-cb71771ec460-trusted-ca\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.740063 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv5lc\" (UniqueName: 
\"kubernetes.io/projected/cac4bbbb-1f9f-4107-99ee-cb71771ec460-kube-api-access-xv5lc\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.799231 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wxfc6" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.807055 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-2s9w5" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.817506 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 10:03:34 crc kubenswrapper[5002]: E1209 10:03:34.817702 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:03:35.317669444 +0000 UTC m=+147.709720535 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.817996 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0357ba06-3248-4036-86c9-975788e67bdf-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-9pdrd\" (UID: \"0357ba06-3248-4036-86c9-975788e67bdf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pdrd" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.818022 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f579d242-665c-49be-b592-b6e539eb99c0-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-dhngp\" (UID: \"f579d242-665c-49be-b592-b6e539eb99c0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dhngp" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.818038 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0357ba06-3248-4036-86c9-975788e67bdf-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-9pdrd\" (UID: \"0357ba06-3248-4036-86c9-975788e67bdf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pdrd" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.818054 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bclmz\" (UniqueName: \"kubernetes.io/projected/0af4f459-58c9-4433-9faf-5e669262fb2e-kube-api-access-bclmz\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-r22tz\" (UID: \"0af4f459-58c9-4433-9faf-5e669262fb2e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r22tz" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.818229 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ab422596-7d3f-4a87-b7ac-c62e68465bb1-metrics-tls\") pod \"dns-default-pm68f\" (UID: \"ab422596-7d3f-4a87-b7ac-c62e68465bb1\") " pod="openshift-dns/dns-default-pm68f" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.818248 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ed519358-adaa-4396-a436-0695d79eb551-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-mbrln\" (UID: \"ed519358-adaa-4396-a436-0695d79eb551\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mbrln" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.818391 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea6bb106-46e5-4ae3-8bf2-4c07c9106a92-serving-cert\") pod \"service-ca-operator-777779d784-95fhm\" (UID: \"ea6bb106-46e5-4ae3-8bf2-4c07c9106a92\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-95fhm" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.818413 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0357ba06-3248-4036-86c9-975788e67bdf-config\") pod \"kube-apiserver-operator-766d6c64bb-9pdrd\" (UID: \"0357ba06-3248-4036-86c9-975788e67bdf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pdrd" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.818428 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnf5n\" (UniqueName: \"kubernetes.io/projected/f579d242-665c-49be-b592-b6e539eb99c0-kube-api-access-rnf5n\") pod \"package-server-manager-789f6589d5-dhngp\" (UID: \"f579d242-665c-49be-b592-b6e539eb99c0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dhngp" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.818448 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcq5j\" (UniqueName: \"kubernetes.io/projected/6e408545-84a5-4b62-ab02-e213a58d1c53-kube-api-access-tcq5j\") pod \"collect-profiles-29421240-5jmzz\" (UID: \"6e408545-84a5-4b62-ab02-e213a58d1c53\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421240-5jmzz" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.818462 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ab422596-7d3f-4a87-b7ac-c62e68465bb1-config-volume\") pod \"dns-default-pm68f\" (UID: \"ab422596-7d3f-4a87-b7ac-c62e68465bb1\") " pod="openshift-dns/dns-default-pm68f" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.818476 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwwtz\" (UniqueName: \"kubernetes.io/projected/32959a40-5fd2-485b-9fff-373e5c39a86b-kube-api-access-xwwtz\") pod \"service-ca-9c57cc56f-lm8br\" (UID: \"32959a40-5fd2-485b-9fff-373e5c39a86b\") " pod="openshift-service-ca/service-ca-9c57cc56f-lm8br" Dec 09 10:03:34 crc kubenswrapper[5002]: 
I1209 10:03:34.818494 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shw9f\" (UniqueName: \"kubernetes.io/projected/fb3d4992-003e-4411-a1e4-357e176545f8-kube-api-access-shw9f\") pod \"machine-config-operator-74547568cd-rbp2p\" (UID: \"fb3d4992-003e-4411-a1e4-357e176545f8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rbp2p" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.818509 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/9f2b7402-d357-4912-9cb3-b0976966ad20-certs\") pod \"machine-config-server-kpldz\" (UID: \"9f2b7402-d357-4912-9cb3-b0976966ad20\") " pod="openshift-machine-config-operator/machine-config-server-kpldz" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.818523 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwldf\" (UniqueName: \"kubernetes.io/projected/b80223e8-1b01-446b-8db3-6f39c3455dca-kube-api-access-rwldf\") pod \"kube-storage-version-migrator-operator-b67b599dd-4n59b\" (UID: \"b80223e8-1b01-446b-8db3-6f39c3455dca\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4n59b" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.818538 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99f91e54-9dc7-43ca-86ee-4258c22c6004-metrics-certs\") pod \"router-default-5444994796-5gtc8\" (UID: \"99f91e54-9dc7-43ca-86ee-4258c22c6004\") " pod="openshift-ingress/router-default-5444994796-5gtc8" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.818553 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/54bf04d4-958e-4105-ab04-73920ec14b8f-bound-sa-token\") pod \"ingress-operator-5b745b69d9-xv9r5\" (UID: \"54bf04d4-958e-4105-ab04-73920ec14b8f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xv9r5" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.818567 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krdfc\" (UniqueName: \"kubernetes.io/projected/ab422596-7d3f-4a87-b7ac-c62e68465bb1-kube-api-access-krdfc\") pod \"dns-default-pm68f\" (UID: \"ab422596-7d3f-4a87-b7ac-c62e68465bb1\") " pod="openshift-dns/dns-default-pm68f" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.818586 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6e408545-84a5-4b62-ab02-e213a58d1c53-secret-volume\") pod \"collect-profiles-29421240-5jmzz\" (UID: \"6e408545-84a5-4b62-ab02-e213a58d1c53\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421240-5jmzz" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.818602 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9e8605e9-eb8b-4676-8d27-b5f424cb94d1-profile-collector-cert\") pod \"catalog-operator-68c6474976-2sjdm\" (UID: \"9e8605e9-eb8b-4676-8d27-b5f424cb94d1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2sjdm" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.818616 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs9vj\" (UniqueName: 
\"kubernetes.io/projected/9f2b7402-d357-4912-9cb3-b0976966ad20-kube-api-access-zs9vj\") pod \"machine-config-server-kpldz\" (UID: \"9f2b7402-d357-4912-9cb3-b0976966ad20\") " pod="openshift-machine-config-operator/machine-config-server-kpldz" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.818633 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psrns\" (UniqueName: \"kubernetes.io/projected/24664ba4-8500-4f9f-bd2a-c7ecd2b114dc-kube-api-access-psrns\") pod \"ingress-canary-fm26s\" (UID: \"24664ba4-8500-4f9f-bd2a-c7ecd2b114dc\") " pod="openshift-ingress-canary/ingress-canary-fm26s" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.818650 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f85c45d5-d9fa-418c-91ea-0f054181f4c7-registration-dir\") pod \"csi-hostpathplugin-5z75v\" (UID: \"f85c45d5-d9fa-418c-91ea-0f054181f4c7\") " pod="hostpath-provisioner/csi-hostpathplugin-5z75v" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.818666 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/495d9e9a-51ee-49cc-b870-6c990157bdf5-apiservice-cert\") pod \"packageserver-d55dfcdfc-r24z6\" (UID: \"495d9e9a-51ee-49cc-b870-6c990157bdf5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r24z6" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.818680 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/9f2b7402-d357-4912-9cb3-b0976966ad20-node-bootstrap-token\") pod \"machine-config-server-kpldz\" (UID: \"9f2b7402-d357-4912-9cb3-b0976966ad20\") " pod="openshift-machine-config-operator/machine-config-server-kpldz" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.818696 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hk94v\" (UniqueName: \"kubernetes.io/projected/79241a31-6d92-482a-9cc3-c2370d516cbf-kube-api-access-hk94v\") pod \"multus-admission-controller-857f4d67dd-x5gjs\" (UID: \"79241a31-6d92-482a-9cc3-c2370d516cbf\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-x5gjs" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.818713 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/67316e5c-6e24-45d1-8789-eec0570baa40-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-r2fkt\" (UID: \"67316e5c-6e24-45d1-8789-eec0570baa40\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r2fkt" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.818740 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbszl\" (UniqueName: \"kubernetes.io/projected/58eda1c5-bb10-4843-9ed4-64ee38d040ee-kube-api-access-jbszl\") pod \"marketplace-operator-79b997595-zbsjj\" (UID: \"58eda1c5-bb10-4843-9ed4-64ee38d040ee\") " pod="openshift-marketplace/marketplace-operator-79b997595-zbsjj" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.818758 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47gkv\" (UniqueName: \"kubernetes.io/projected/9e8605e9-eb8b-4676-8d27-b5f424cb94d1-kube-api-access-47gkv\") pod \"catalog-operator-68c6474976-2sjdm\" (UID: 
\"9e8605e9-eb8b-4676-8d27-b5f424cb94d1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2sjdm" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.818775 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d9cdd76-7711-48de-a449-98e306ba4fe7-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-lb5s8\" (UID: \"4d9cdd76-7711-48de-a449-98e306ba4fe7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lb5s8" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.818794 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4q4t\" (UniqueName: \"kubernetes.io/projected/99f91e54-9dc7-43ca-86ee-4258c22c6004-kube-api-access-v4q4t\") pod \"router-default-5444994796-5gtc8\" (UID: \"99f91e54-9dc7-43ca-86ee-4258c22c6004\") " pod="openshift-ingress/router-default-5444994796-5gtc8" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.818826 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67316e5c-6e24-45d1-8789-eec0570baa40-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-r2fkt\" (UID: \"67316e5c-6e24-45d1-8789-eec0570baa40\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r2fkt" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.818842 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f85c45d5-d9fa-418c-91ea-0f054181f4c7-socket-dir\") pod \"csi-hostpathplugin-5z75v\" (UID: \"f85c45d5-d9fa-418c-91ea-0f054181f4c7\") " pod="hostpath-provisioner/csi-hostpathplugin-5z75v" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.818864 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/f85c45d5-d9fa-418c-91ea-0f054181f4c7-csi-data-dir\") pod \"csi-hostpathplugin-5z75v\" (UID: \"f85c45d5-d9fa-418c-91ea-0f054181f4c7\") " pod="hostpath-provisioner/csi-hostpathplugin-5z75v" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.818949 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/99f91e54-9dc7-43ca-86ee-4258c22c6004-stats-auth\") pod \"router-default-5444994796-5gtc8\" (UID: \"99f91e54-9dc7-43ca-86ee-4258c22c6004\") " pod="openshift-ingress/router-default-5444994796-5gtc8" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.818972 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.818987 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fb3d4992-003e-4411-a1e4-357e176545f8-images\") pod \"machine-config-operator-74547568cd-rbp2p\" (UID: \"fb3d4992-003e-4411-a1e4-357e176545f8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rbp2p" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.819005 5002 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/f85c45d5-d9fa-418c-91ea-0f054181f4c7-mountpoint-dir\") pod \"csi-hostpathplugin-5z75v\" (UID: \"f85c45d5-d9fa-418c-91ea-0f054181f4c7\") " pod="hostpath-provisioner/csi-hostpathplugin-5z75v" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.819021 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d9cdd76-7711-48de-a449-98e306ba4fe7-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-lb5s8\" (UID: \"4d9cdd76-7711-48de-a449-98e306ba4fe7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lb5s8" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.819037 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9pmz\" (UniqueName: \"kubernetes.io/projected/f85c45d5-d9fa-418c-91ea-0f054181f4c7-kube-api-access-p9pmz\") pod \"csi-hostpathplugin-5z75v\" (UID: \"f85c45d5-d9fa-418c-91ea-0f054181f4c7\") " pod="hostpath-provisioner/csi-hostpathplugin-5z75v" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.819052 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9e8605e9-eb8b-4676-8d27-b5f424cb94d1-srv-cert\") pod \"catalog-operator-68c6474976-2sjdm\" (UID: \"9e8605e9-eb8b-4676-8d27-b5f424cb94d1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2sjdm" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.819065 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea6bb106-46e5-4ae3-8bf2-4c07c9106a92-config\") pod \"service-ca-operator-777779d784-95fhm\" (UID: \"ea6bb106-46e5-4ae3-8bf2-4c07c9106a92\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-95fhm" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.819081 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/20bd2a89-b1af-4bb2-9e28-862273221604-srv-cert\") pod \"olm-operator-6b444d44fb-zrfsb\" (UID: \"20bd2a89-b1af-4bb2-9e28-862273221604\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zrfsb" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.819096 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/79241a31-6d92-482a-9cc3-c2370d516cbf-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-x5gjs\" (UID: \"79241a31-6d92-482a-9cc3-c2370d516cbf\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-x5gjs" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.819114 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn6zt\" (UniqueName: \"kubernetes.io/projected/54bf04d4-958e-4105-ab04-73920ec14b8f-kube-api-access-kn6zt\") pod \"ingress-operator-5b745b69d9-xv9r5\" (UID: \"54bf04d4-958e-4105-ab04-73920ec14b8f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xv9r5" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.819130 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b2z6\" (UniqueName: \"kubernetes.io/projected/ea6bb106-46e5-4ae3-8bf2-4c07c9106a92-kube-api-access-4b2z6\") pod 
\"service-ca-operator-777779d784-95fhm\" (UID: \"ea6bb106-46e5-4ae3-8bf2-4c07c9106a92\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-95fhm" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.819146 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b80223e8-1b01-446b-8db3-6f39c3455dca-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-4n59b\" (UID: \"b80223e8-1b01-446b-8db3-6f39c3455dca\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4n59b" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.819164 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/32959a40-5fd2-485b-9fff-373e5c39a86b-signing-cabundle\") pod \"service-ca-9c57cc56f-lm8br\" (UID: \"32959a40-5fd2-485b-9fff-373e5c39a86b\") " pod="openshift-service-ca/service-ca-9c57cc56f-lm8br" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.819179 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rj84f\" (UniqueName: \"kubernetes.io/projected/250ef544-c81b-4487-9eba-fcce62841a3d-kube-api-access-rj84f\") pod \"migrator-59844c95c7-f2pn5\" (UID: \"250ef544-c81b-4487-9eba-fcce62841a3d\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-f2pn5" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.819194 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67316e5c-6e24-45d1-8789-eec0570baa40-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-r2fkt\" (UID: \"67316e5c-6e24-45d1-8789-eec0570baa40\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r2fkt" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.819208 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/54bf04d4-958e-4105-ab04-73920ec14b8f-trusted-ca\") pod \"ingress-operator-5b745b69d9-xv9r5\" (UID: \"54bf04d4-958e-4105-ab04-73920ec14b8f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xv9r5" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.819222 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6e408545-84a5-4b62-ab02-e213a58d1c53-config-volume\") pod \"collect-profiles-29421240-5jmzz\" (UID: \"6e408545-84a5-4b62-ab02-e213a58d1c53\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421240-5jmzz" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.819236 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78m8f\" (UniqueName: \"kubernetes.io/projected/495d9e9a-51ee-49cc-b870-6c990157bdf5-kube-api-access-78m8f\") pod \"packageserver-d55dfcdfc-r24z6\" (UID: \"495d9e9a-51ee-49cc-b870-6c990157bdf5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r24z6" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.819259 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d9cdd76-7711-48de-a449-98e306ba4fe7-config\") pod \"kube-controller-manager-operator-78b949d7b-lb5s8\" (UID: \"4d9cdd76-7711-48de-a449-98e306ba4fe7\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lb5s8" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.819274 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fb3d4992-003e-4411-a1e4-357e176545f8-auth-proxy-config\") pod \"machine-config-operator-74547568cd-rbp2p\" (UID: \"fb3d4992-003e-4411-a1e4-357e176545f8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rbp2p" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.819539 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/58eda1c5-bb10-4843-9ed4-64ee38d040ee-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zbsjj\" (UID: \"58eda1c5-bb10-4843-9ed4-64ee38d040ee\") " pod="openshift-marketplace/marketplace-operator-79b997595-zbsjj" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.819565 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggkpf\" (UniqueName: \"kubernetes.io/projected/20bd2a89-b1af-4bb2-9e28-862273221604-kube-api-access-ggkpf\") pod \"olm-operator-6b444d44fb-zrfsb\" (UID: \"20bd2a89-b1af-4bb2-9e28-862273221604\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zrfsb" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.819582 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/32959a40-5fd2-485b-9fff-373e5c39a86b-signing-key\") pod \"service-ca-9c57cc56f-lm8br\" (UID: \"32959a40-5fd2-485b-9fff-373e5c39a86b\") " pod="openshift-service-ca/service-ca-9c57cc56f-lm8br" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.819600 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b80223e8-1b01-446b-8db3-6f39c3455dca-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-4n59b\" (UID: \"b80223e8-1b01-446b-8db3-6f39c3455dca\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4n59b" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.819616 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/99f91e54-9dc7-43ca-86ee-4258c22c6004-default-certificate\") pod \"router-default-5444994796-5gtc8\" (UID: \"99f91e54-9dc7-43ca-86ee-4258c22c6004\") " pod="openshift-ingress/router-default-5444994796-5gtc8" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.819631 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/54bf04d4-958e-4105-ab04-73920ec14b8f-metrics-tls\") pod \"ingress-operator-5b745b69d9-xv9r5\" (UID: \"54bf04d4-958e-4105-ab04-73920ec14b8f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xv9r5" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.819647 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ed519358-adaa-4396-a436-0695d79eb551-proxy-tls\") pod \"machine-config-controller-84d6567774-mbrln\" (UID: \"ed519358-adaa-4396-a436-0695d79eb551\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mbrln" Dec 09 10:03:34 crc 
kubenswrapper[5002]: I1209 10:03:34.819662 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/24664ba4-8500-4f9f-bd2a-c7ecd2b114dc-cert\") pod \"ingress-canary-fm26s\" (UID: \"24664ba4-8500-4f9f-bd2a-c7ecd2b114dc\") " pod="openshift-ingress-canary/ingress-canary-fm26s" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.819678 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqqc8\" (UniqueName: \"kubernetes.io/projected/ed519358-adaa-4396-a436-0695d79eb551-kube-api-access-lqqc8\") pod \"machine-config-controller-84d6567774-mbrln\" (UID: \"ed519358-adaa-4396-a436-0695d79eb551\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mbrln" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.819693 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99f91e54-9dc7-43ca-86ee-4258c22c6004-service-ca-bundle\") pod \"router-default-5444994796-5gtc8\" (UID: \"99f91e54-9dc7-43ca-86ee-4258c22c6004\") " pod="openshift-ingress/router-default-5444994796-5gtc8" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.819708 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/495d9e9a-51ee-49cc-b870-6c990157bdf5-tmpfs\") pod \"packageserver-d55dfcdfc-r24z6\" (UID: \"495d9e9a-51ee-49cc-b870-6c990157bdf5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r24z6" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.819724 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/f85c45d5-d9fa-418c-91ea-0f054181f4c7-plugins-dir\") pod \"csi-hostpathplugin-5z75v\" (UID: \"f85c45d5-d9fa-418c-91ea-0f054181f4c7\") " pod="hostpath-provisioner/csi-hostpathplugin-5z75v" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.819739 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/0af4f459-58c9-4433-9faf-5e669262fb2e-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-r22tz\" (UID: \"0af4f459-58c9-4433-9faf-5e669262fb2e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r22tz" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.819755 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/495d9e9a-51ee-49cc-b870-6c990157bdf5-webhook-cert\") pod \"packageserver-d55dfcdfc-r24z6\" (UID: \"495d9e9a-51ee-49cc-b870-6c990157bdf5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r24z6" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.819771 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/20bd2a89-b1af-4bb2-9e28-862273221604-profile-collector-cert\") pod \"olm-operator-6b444d44fb-zrfsb\" (UID: \"20bd2a89-b1af-4bb2-9e28-862273221604\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zrfsb" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.819789 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/58eda1c5-bb10-4843-9ed4-64ee38d040ee-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zbsjj\" (UID: \"58eda1c5-bb10-4843-9ed4-64ee38d040ee\") " pod="openshift-marketplace/marketplace-operator-79b997595-zbsjj" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.819804 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fb3d4992-003e-4411-a1e4-357e176545f8-proxy-tls\") pod \"machine-config-operator-74547568cd-rbp2p\" (UID: \"fb3d4992-003e-4411-a1e4-357e176545f8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rbp2p" Dec 09 10:03:34 crc kubenswrapper[5002]: E1209 10:03:34.820228 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 10:03:35.320214366 +0000 UTC m=+147.712265557 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-82dlm" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.820853 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d9cdd76-7711-48de-a449-98e306ba4fe7-config\") pod \"kube-controller-manager-operator-78b949d7b-lb5s8\" (UID: \"4d9cdd76-7711-48de-a449-98e306ba4fe7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lb5s8" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.821402 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fb3d4992-003e-4411-a1e4-357e176545f8-images\") pod \"machine-config-operator-74547568cd-rbp2p\" (UID: \"fb3d4992-003e-4411-a1e4-357e176545f8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rbp2p" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.822535 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fb3d4992-003e-4411-a1e4-357e176545f8-auth-proxy-config\") pod \"machine-config-operator-74547568cd-rbp2p\" (UID: \"fb3d4992-003e-4411-a1e4-357e176545f8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rbp2p" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.826605 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/54bf04d4-958e-4105-ab04-73920ec14b8f-trusted-ca\") pod \"ingress-operator-5b745b69d9-xv9r5\" (UID: \"54bf04d4-958e-4105-ab04-73920ec14b8f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xv9r5" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.826724 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6e408545-84a5-4b62-ab02-e213a58d1c53-config-volume\") pod \"collect-profiles-29421240-5jmzz\" (UID: \"6e408545-84a5-4b62-ab02-e213a58d1c53\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29421240-5jmzz" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.827798 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/f85c45d5-d9fa-418c-91ea-0f054181f4c7-csi-data-dir\") pod \"csi-hostpathplugin-5z75v\" (UID: \"f85c45d5-d9fa-418c-91ea-0f054181f4c7\") " pod="hostpath-provisioner/csi-hostpathplugin-5z75v" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.828022 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f85c45d5-d9fa-418c-91ea-0f054181f4c7-socket-dir\") pod \"csi-hostpathplugin-5z75v\" (UID: \"f85c45d5-d9fa-418c-91ea-0f054181f4c7\") " pod="hostpath-provisioner/csi-hostpathplugin-5z75v" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.828073 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f85c45d5-d9fa-418c-91ea-0f054181f4c7-registration-dir\") pod \"csi-hostpathplugin-5z75v\" (UID: \"f85c45d5-d9fa-418c-91ea-0f054181f4c7\") " pod="hostpath-provisioner/csi-hostpathplugin-5z75v" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.828949 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f579d242-665c-49be-b592-b6e539eb99c0-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-dhngp\" (UID: \"f579d242-665c-49be-b592-b6e539eb99c0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dhngp" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.829935 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b80223e8-1b01-446b-8db3-6f39c3455dca-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-4n59b\" (UID: \"b80223e8-1b01-446b-8db3-6f39c3455dca\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4n59b" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.830570 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/24664ba4-8500-4f9f-bd2a-c7ecd2b114dc-cert\") pod \"ingress-canary-fm26s\" (UID: \"24664ba4-8500-4f9f-bd2a-c7ecd2b114dc\") " pod="openshift-ingress-canary/ingress-canary-fm26s" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.830881 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67316e5c-6e24-45d1-8789-eec0570baa40-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-r2fkt\" (UID: \"67316e5c-6e24-45d1-8789-eec0570baa40\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r2fkt" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.831232 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99f91e54-9dc7-43ca-86ee-4258c22c6004-service-ca-bundle\") pod \"router-default-5444994796-5gtc8\" (UID: \"99f91e54-9dc7-43ca-86ee-4258c22c6004\") " pod="openshift-ingress/router-default-5444994796-5gtc8" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.831583 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/495d9e9a-51ee-49cc-b870-6c990157bdf5-tmpfs\") pod \"packageserver-d55dfcdfc-r24z6\" (UID: \"495d9e9a-51ee-49cc-b870-6c990157bdf5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r24z6" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.831646 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/f85c45d5-d9fa-418c-91ea-0f054181f4c7-plugins-dir\") pod \"csi-hostpathplugin-5z75v\" (UID: \"f85c45d5-d9fa-418c-91ea-0f054181f4c7\") " pod="hostpath-provisioner/csi-hostpathplugin-5z75v" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.831945 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/f85c45d5-d9fa-418c-91ea-0f054181f4c7-mountpoint-dir\") pod \"csi-hostpathplugin-5z75v\" (UID: \"f85c45d5-d9fa-418c-91ea-0f054181f4c7\") " pod="hostpath-provisioner/csi-hostpathplugin-5z75v" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.834440 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-nmgxj"] Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.834869 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/32959a40-5fd2-485b-9fff-373e5c39a86b-signing-cabundle\") pod \"service-ca-9c57cc56f-lm8br\" (UID: \"32959a40-5fd2-485b-9fff-373e5c39a86b\") " pod="openshift-service-ca/service-ca-9c57cc56f-lm8br" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.835376 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fb3d4992-003e-4411-a1e4-357e176545f8-proxy-tls\") pod \"machine-config-operator-74547568cd-rbp2p\" (UID: \"fb3d4992-003e-4411-a1e4-357e176545f8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rbp2p" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.837247 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ab422596-7d3f-4a87-b7ac-c62e68465bb1-config-volume\") pod \"dns-default-pm68f\" (UID: \"ab422596-7d3f-4a87-b7ac-c62e68465bb1\") " pod="openshift-dns/dns-default-pm68f" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.838052 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea6bb106-46e5-4ae3-8bf2-4c07c9106a92-config\") pod \"service-ca-operator-777779d784-95fhm\" (UID: \"ea6bb106-46e5-4ae3-8bf2-4c07c9106a92\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-95fhm" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.839165 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/32959a40-5fd2-485b-9fff-373e5c39a86b-signing-key\") pod \"service-ca-9c57cc56f-lm8br\" (UID: \"32959a40-5fd2-485b-9fff-373e5c39a86b\") " pod="openshift-service-ca/service-ca-9c57cc56f-lm8br" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.841444 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/58eda1c5-bb10-4843-9ed4-64ee38d040ee-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zbsjj\" (UID: \"58eda1c5-bb10-4843-9ed4-64ee38d040ee\") " pod="openshift-marketplace/marketplace-operator-79b997595-zbsjj" Dec 09 
10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.842109 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9e8605e9-eb8b-4676-8d27-b5f424cb94d1-srv-cert\") pod \"catalog-operator-68c6474976-2sjdm\" (UID: \"9e8605e9-eb8b-4676-8d27-b5f424cb94d1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2sjdm" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.842878 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d9cdd76-7711-48de-a449-98e306ba4fe7-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-lb5s8\" (UID: \"4d9cdd76-7711-48de-a449-98e306ba4fe7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lb5s8" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.844237 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67316e5c-6e24-45d1-8789-eec0570baa40-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-r2fkt\" (UID: \"67316e5c-6e24-45d1-8789-eec0570baa40\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r2fkt" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.846565 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/20bd2a89-b1af-4bb2-9e28-862273221604-profile-collector-cert\") pod \"olm-operator-6b444d44fb-zrfsb\" (UID: \"20bd2a89-b1af-4bb2-9e28-862273221604\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zrfsb" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.848217 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0357ba06-3248-4036-86c9-975788e67bdf-config\") pod \"kube-apiserver-operator-766d6c64bb-9pdrd\" (UID: \"0357ba06-3248-4036-86c9-975788e67bdf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pdrd" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.843324 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ed519358-adaa-4396-a436-0695d79eb551-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-mbrln\" (UID: \"ed519358-adaa-4396-a436-0695d79eb551\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mbrln" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.852756 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/0af4f459-58c9-4433-9faf-5e669262fb2e-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-r22tz\" (UID: \"0af4f459-58c9-4433-9faf-5e669262fb2e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r22tz" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.859492 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b80223e8-1b01-446b-8db3-6f39c3455dca-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-4n59b\" (UID: \"b80223e8-1b01-446b-8db3-6f39c3455dca\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4n59b" Dec 09 10:03:34 crc 
kubenswrapper[5002]: I1209 10:03:34.864429 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/58eda1c5-bb10-4843-9ed4-64ee38d040ee-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zbsjj\" (UID: \"58eda1c5-bb10-4843-9ed4-64ee38d040ee\") " pod="openshift-marketplace/marketplace-operator-79b997595-zbsjj" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.864454 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6e408545-84a5-4b62-ab02-e213a58d1c53-secret-volume\") pod \"collect-profiles-29421240-5jmzz\" (UID: \"6e408545-84a5-4b62-ab02-e213a58d1c53\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421240-5jmzz" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.864966 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea6bb106-46e5-4ae3-8bf2-4c07c9106a92-serving-cert\") pod \"service-ca-operator-777779d784-95fhm\" (UID: \"ea6bb106-46e5-4ae3-8bf2-4c07c9106a92\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-95fhm" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.865545 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9e8605e9-eb8b-4676-8d27-b5f424cb94d1-profile-collector-cert\") pod \"catalog-operator-68c6474976-2sjdm\" (UID: \"9e8605e9-eb8b-4676-8d27-b5f424cb94d1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2sjdm" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.866318 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/20bd2a89-b1af-4bb2-9e28-862273221604-srv-cert\") pod \"olm-operator-6b444d44fb-zrfsb\" (UID: \"20bd2a89-b1af-4bb2-9e28-862273221604\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zrfsb" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.873004 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/99f91e54-9dc7-43ca-86ee-4258c22c6004-stats-auth\") pod \"router-default-5444994796-5gtc8\" (UID: \"99f91e54-9dc7-43ca-86ee-4258c22c6004\") " pod="openshift-ingress/router-default-5444994796-5gtc8" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.874413 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/9f2b7402-d357-4912-9cb3-b0976966ad20-node-bootstrap-token\") pod \"machine-config-server-kpldz\" (UID: \"9f2b7402-d357-4912-9cb3-b0976966ad20\") " pod="openshift-machine-config-operator/machine-config-server-kpldz" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.874998 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99f91e54-9dc7-43ca-86ee-4258c22c6004-metrics-certs\") pod \"router-default-5444994796-5gtc8\" (UID: \"99f91e54-9dc7-43ca-86ee-4258c22c6004\") " pod="openshift-ingress/router-default-5444994796-5gtc8" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.875644 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ed519358-adaa-4396-a436-0695d79eb551-proxy-tls\") pod \"machine-config-controller-84d6567774-mbrln\" (UID: 
\"ed519358-adaa-4396-a436-0695d79eb551\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mbrln" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.876161 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/9f2b7402-d357-4912-9cb3-b0976966ad20-certs\") pod \"machine-config-server-kpldz\" (UID: \"9f2b7402-d357-4912-9cb3-b0976966ad20\") " pod="openshift-machine-config-operator/machine-config-server-kpldz" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.879268 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0357ba06-3248-4036-86c9-975788e67bdf-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-9pdrd\" (UID: \"0357ba06-3248-4036-86c9-975788e67bdf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pdrd" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.882012 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/99f91e54-9dc7-43ca-86ee-4258c22c6004-default-certificate\") pod \"router-default-5444994796-5gtc8\" (UID: \"99f91e54-9dc7-43ca-86ee-4258c22c6004\") " pod="openshift-ingress/router-default-5444994796-5gtc8" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.883770 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ab422596-7d3f-4a87-b7ac-c62e68465bb1-metrics-tls\") pod \"dns-default-pm68f\" (UID: \"ab422596-7d3f-4a87-b7ac-c62e68465bb1\") " pod="openshift-dns/dns-default-pm68f" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.889644 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krdfc\" (UniqueName: \"kubernetes.io/projected/ab422596-7d3f-4a87-b7ac-c62e68465bb1-kube-api-access-krdfc\") pod \"dns-default-pm68f\" (UID: \"ab422596-7d3f-4a87-b7ac-c62e68465bb1\") " pod="openshift-dns/dns-default-pm68f" Dec 09 10:03:34 crc kubenswrapper[5002]: W1209 10:03:34.898466 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84005632_49e7_400a_8748_5f16909cfda1.slice/crio-d92d197e27535eb7c4717e2200ef74a59a0863549150c8b805e7b7659440dfb3 WatchSource:0}: Error finding container d92d197e27535eb7c4717e2200ef74a59a0863549150c8b805e7b7659440dfb3: Status 404 returned error can't find the container with id d92d197e27535eb7c4717e2200ef74a59a0863549150c8b805e7b7659440dfb3 Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.899158 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/54bf04d4-958e-4105-ab04-73920ec14b8f-metrics-tls\") pod \"ingress-operator-5b745b69d9-xv9r5\" (UID: \"54bf04d4-958e-4105-ab04-73920ec14b8f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xv9r5" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.899565 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/79241a31-6d92-482a-9cc3-c2370d516cbf-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-x5gjs\" (UID: \"79241a31-6d92-482a-9cc3-c2370d516cbf\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-x5gjs" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.899570 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ffzj8"] Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.900151 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/495d9e9a-51ee-49cc-b870-6c990157bdf5-apiservice-cert\") pod \"packageserver-d55dfcdfc-r24z6\" (UID: \"495d9e9a-51ee-49cc-b870-6c990157bdf5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r24z6" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.907364 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/495d9e9a-51ee-49cc-b870-6c990157bdf5-webhook-cert\") pod \"packageserver-d55dfcdfc-r24z6\" (UID: \"495d9e9a-51ee-49cc-b870-6c990157bdf5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r24z6" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.910185 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b2z6\" (UniqueName: \"kubernetes.io/projected/ea6bb106-46e5-4ae3-8bf2-4c07c9106a92-kube-api-access-4b2z6\") pod \"service-ca-operator-777779d784-95fhm\" (UID: \"ea6bb106-46e5-4ae3-8bf2-4c07c9106a92\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-95fhm" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.920890 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 10:03:34 crc kubenswrapper[5002]: E1209 10:03:34.921051 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:03:35.421018708 +0000 UTC m=+147.813069789 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.921528 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.921548 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rj84f\" (UniqueName: \"kubernetes.io/projected/250ef544-c81b-4487-9eba-fcce62841a3d-kube-api-access-rj84f\") pod \"migrator-59844c95c7-f2pn5\" (UID: \"250ef544-c81b-4487-9eba-fcce62841a3d\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-f2pn5" Dec 09 10:03:34 crc kubenswrapper[5002]: E1209 10:03:34.921945 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 10:03:35.421921384 +0000 UTC m=+147.813972465 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-82dlm" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.934503 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-lxrwt"] Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.940192 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnf5n\" (UniqueName: \"kubernetes.io/projected/f579d242-665c-49be-b592-b6e539eb99c0-kube-api-access-rnf5n\") pod \"package-server-manager-789f6589d5-dhngp\" (UID: \"f579d242-665c-49be-b592-b6e539eb99c0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dhngp" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.951504 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78m8f\" (UniqueName: \"kubernetes.io/projected/495d9e9a-51ee-49cc-b870-6c990157bdf5-kube-api-access-78m8f\") pod \"packageserver-d55dfcdfc-r24z6\" (UID: \"495d9e9a-51ee-49cc-b870-6c990157bdf5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r24z6" Dec 09 10:03:34 crc kubenswrapper[5002]: I1209 10:03:34.996948 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0357ba06-3248-4036-86c9-975788e67bdf-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-9pdrd\" (UID: \"0357ba06-3248-4036-86c9-975788e67bdf\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pdrd" Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.007298 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-pm68f" Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.009692 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bclmz\" (UniqueName: \"kubernetes.io/projected/0af4f459-58c9-4433-9faf-5e669262fb2e-kube-api-access-bclmz\") pod \"control-plane-machine-set-operator-78cbb6b69f-r22tz\" (UID: \"0af4f459-58c9-4433-9faf-5e669262fb2e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r22tz" Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.022765 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 10:03:35 crc kubenswrapper[5002]: E1209 10:03:35.023355 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:03:35.523339223 +0000 UTC m=+147.915390304 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.034782 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqqc8\" (UniqueName: \"kubernetes.io/projected/ed519358-adaa-4396-a436-0695d79eb551-kube-api-access-lqqc8\") pod \"machine-config-controller-84d6567774-mbrln\" (UID: \"ed519358-adaa-4396-a436-0695d79eb551\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mbrln" Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.044742 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-899rx"] Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.055091 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs9vj\" (UniqueName: \"kubernetes.io/projected/9f2b7402-d357-4912-9cb3-b0976966ad20-kube-api-access-zs9vj\") pod \"machine-config-server-kpldz\" (UID: \"9f2b7402-d357-4912-9cb3-b0976966ad20\") " pod="openshift-machine-config-operator/machine-config-server-kpldz" Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.069431 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psrns\" (UniqueName: \"kubernetes.io/projected/24664ba4-8500-4f9f-bd2a-c7ecd2b114dc-kube-api-access-psrns\") pod \"ingress-canary-fm26s\" (UID: \"24664ba4-8500-4f9f-bd2a-c7ecd2b114dc\") " pod="openshift-ingress-canary/ingress-canary-fm26s" Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.078461 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-47gkv\" (UniqueName: \"kubernetes.io/projected/9e8605e9-eb8b-4676-8d27-b5f424cb94d1-kube-api-access-47gkv\") pod \"catalog-operator-68c6474976-2sjdm\" (UID: \"9e8605e9-eb8b-4676-8d27-b5f424cb94d1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2sjdm" Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.124442 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.124722 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d9cdd76-7711-48de-a449-98e306ba4fe7-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-lb5s8\" (UID: \"4d9cdd76-7711-48de-a449-98e306ba4fe7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lb5s8" Dec 09 10:03:35 crc kubenswrapper[5002]: E1209 10:03:35.124861 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 10:03:35.624847205 +0000 UTC m=+148.016898286 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-82dlm" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.131711 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4q4t\" (UniqueName: \"kubernetes.io/projected/99f91e54-9dc7-43ca-86ee-4258c22c6004-kube-api-access-v4q4t\") pod \"router-default-5444994796-5gtc8\" (UID: \"99f91e54-9dc7-43ca-86ee-4258c22c6004\") " pod="openshift-ingress/router-default-5444994796-5gtc8" Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.140936 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lb5s8" Dec 09 10:03:35 crc kubenswrapper[5002]: W1209 10:03:35.158078 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ca986cd_d4df_4af5_8918_33f445a52f49.slice/crio-1e3b22ba36047a47fc190cf3dc7a9f9f40577eac526cf474a0c58344d108e4e7 WatchSource:0}: Error finding container 1e3b22ba36047a47fc190cf3dc7a9f9f40577eac526cf474a0c58344d108e4e7: Status 404 returned error can't find the container with id 1e3b22ba36047a47fc190cf3dc7a9f9f40577eac526cf474a0c58344d108e4e7 Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.161456 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggkpf\" (UniqueName: \"kubernetes.io/projected/20bd2a89-b1af-4bb2-9e28-862273221604-kube-api-access-ggkpf\") pod \"olm-operator-6b444d44fb-zrfsb\" (UID: \"20bd2a89-b1af-4bb2-9e28-862273221604\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zrfsb" Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.165870 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn6zt\" (UniqueName: \"kubernetes.io/projected/54bf04d4-958e-4105-ab04-73920ec14b8f-kube-api-access-kn6zt\") pod \"ingress-operator-5b745b69d9-xv9r5\" (UID: \"54bf04d4-958e-4105-ab04-73920ec14b8f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xv9r5" Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.168355 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-f2pn5" Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.186502 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shw9f\" (UniqueName: \"kubernetes.io/projected/fb3d4992-003e-4411-a1e4-357e176545f8-kube-api-access-shw9f\") pod \"machine-config-operator-74547568cd-rbp2p\" (UID: \"fb3d4992-003e-4411-a1e4-357e176545f8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rbp2p" Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.189116 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-95fhm" Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.197358 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pdrd" Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.202503 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r22tz" Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.204860 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcq5j\" (UniqueName: \"kubernetes.io/projected/6e408545-84a5-4b62-ab02-e213a58d1c53-kube-api-access-tcq5j\") pod \"collect-profiles-29421240-5jmzz\" (UID: \"6e408545-84a5-4b62-ab02-e213a58d1c53\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421240-5jmzz" Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.217682 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zrfsb" Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.218257 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wxfc6"] Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.222525 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwwtz\" (UniqueName: \"kubernetes.io/projected/32959a40-5fd2-485b-9fff-373e5c39a86b-kube-api-access-xwwtz\") pod \"service-ca-9c57cc56f-lm8br\" (UID: \"32959a40-5fd2-485b-9fff-373e5c39a86b\") " pod="openshift-service-ca/service-ca-9c57cc56f-lm8br" Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.223688 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dhngp" Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.225863 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 10:03:35 crc kubenswrapper[5002]: E1209 10:03:35.226318 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:03:35.726298895 +0000 UTC m=+148.118349976 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.229608 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r24z6" Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.239956 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-5gtc8" Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.245351 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-lm8br" Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.246266 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbszl\" (UniqueName: \"kubernetes.io/projected/58eda1c5-bb10-4843-9ed4-64ee38d040ee-kube-api-access-jbszl\") pod \"marketplace-operator-79b997595-zbsjj\" (UID: \"58eda1c5-bb10-4843-9ed4-64ee38d040ee\") " pod="openshift-marketplace/marketplace-operator-79b997595-zbsjj" Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.260271 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2sjdm" Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.273198 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mbrln" Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.277968 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-kpldz" Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.282270 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk94v\" (UniqueName: \"kubernetes.io/projected/79241a31-6d92-482a-9cc3-c2370d516cbf-kube-api-access-hk94v\") pod \"multus-admission-controller-857f4d67dd-x5gjs\" (UID: \"79241a31-6d92-482a-9cc3-c2370d516cbf\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-x5gjs" Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.292031 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-99l4h" event={"ID":"8a5ea199-20b4-419b-97bc-0845ac5bb720","Type":"ContainerStarted","Data":"eb964ee33f379534e6fe698557f0113f67ed8f08065107709fcc7a0a83024c02"} Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.305417 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/67316e5c-6e24-45d1-8789-eec0570baa40-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-r2fkt\" (UID: \"67316e5c-6e24-45d1-8789-eec0570baa40\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r2fkt" Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.309232 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2s9w5"] Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.312077 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9pmz\" (UniqueName: \"kubernetes.io/projected/f85c45d5-d9fa-418c-91ea-0f054181f4c7-kube-api-access-p9pmz\") pod \"csi-hostpathplugin-5z75v\" (UID: \"f85c45d5-d9fa-418c-91ea-0f054181f4c7\") " pod="hostpath-provisioner/csi-hostpathplugin-5z75v" Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.314327 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwldf\" (UniqueName: \"kubernetes.io/projected/b80223e8-1b01-446b-8db3-6f39c3455dca-kube-api-access-rwldf\") pod \"kube-storage-version-migrator-operator-b67b599dd-4n59b\" (UID: \"b80223e8-1b01-446b-8db3-6f39c3455dca\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4n59b" Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.337463 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/54bf04d4-958e-4105-ab04-73920ec14b8f-bound-sa-token\") pod \"ingress-operator-5b745b69d9-xv9r5\" (UID: \"54bf04d4-958e-4105-ab04-73920ec14b8f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xv9r5" Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.338239 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:35 crc kubenswrapper[5002]: E1209 10:03:35.338656 5002 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 10:03:35.838641787 +0000 UTC m=+148.230692868 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-82dlm" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.350538 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fm26s" Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.364472 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"343b8429c2f2896a912fdddf42f6b8e1c18e1b7449a38c4b22111fc0cf556571"} Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.364522 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"65c9ded000d69788e626eb8a5c3727bda07e55254d6d3fde9a02512aa3821762"} Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.367294 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nmgxj" event={"ID":"84005632-49e7-400a-8748-5f16909cfda1","Type":"ContainerStarted","Data":"d92d197e27535eb7c4717e2200ef74a59a0863549150c8b805e7b7659440dfb3"} Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.400176 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-k6s7s" event={"ID":"ae53c0af-26d8-4db2-8885-86b643bf2ee1","Type":"ContainerStarted","Data":"bad62c812098db740748f873f1eff54a9fe95c067b9d1183cfed78980ef8835b"} Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.400219 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-k6s7s" Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.418127 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-k6s7s" Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.427788 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"ceb81fcf95623a6a0ba31b2f03c818555de200ffa62b1a7294d55b8d52c5281d"} Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.427858 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"d5f7a7cc01984ee009f328f18e0ec5813cf04a46d0d11de90fb0d3ca2b3055dc"} Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.440201 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 10:03:35 crc kubenswrapper[5002]: E1209 10:03:35.441831 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:03:35.941795976 +0000 UTC m=+148.333847057 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.451394 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r2fkt" Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.461087 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421240-5jmzz" Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.470287 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-cmm6f" event={"ID":"dcdecdfe-61c0-4d85-99b5-e1fe25727259","Type":"ContainerStarted","Data":"0d9e6afb64adb24073978431b7062abb17f94cdfadff066476baaf3442ee2d43"} Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.470339 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-cmm6f" event={"ID":"dcdecdfe-61c0-4d85-99b5-e1fe25727259","Type":"ContainerStarted","Data":"957ac60ef42bf89e412cd28ce90585ae58975a71b4a14fc31aa934473849ab46"} Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.481065 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rbp2p" Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.494858 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zbsjj" Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.515741 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-lxrwt" event={"ID":"bf7272a5-afe4-4ad1-bc7e-2c2397b84a95","Type":"ContainerStarted","Data":"c5d557d1ca8e0a9e60237bca8c8fe8cdb93158bf1d7b196ffa17795d579e03b9"} Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.515784 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-x5gjs" Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.544194 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:35 crc kubenswrapper[5002]: E1209 10:03:35.550357 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 10:03:36.050341459 +0000 UTC m=+148.442392540 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-82dlm" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.557474 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xv9r5" Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.564757 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4n59b" Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.589752 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ffzj8" event={"ID":"14ca2bab-e020-4c2d-be3c-e4138095c149","Type":"ContainerStarted","Data":"213b004614e093c0fced9bc06553986b4e50791faa83bcf34e61878d9d3233b2"} Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.596044 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"8f7fe6b243bab9057867237754751394e9b0f941af19ab3f7a00b8b2b3fdc87d"} Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.598669 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-899rx" event={"ID":"2ca986cd-d4df-4af5-8918-33f445a52f49","Type":"ContainerStarted","Data":"1e3b22ba36047a47fc190cf3dc7a9f9f40577eac526cf474a0c58344d108e4e7"} Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.601126 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-5z75v" Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.646326 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-n967v" event={"ID":"95a7b196-74b4-4d67-a0e0-3e4b92e468ed","Type":"ContainerStarted","Data":"09059a35ce6aa88786547afb643646c7fe5bec07def4529d5f5ece5863215bd5"} Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.646458 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 10:03:35 crc kubenswrapper[5002]: E1209 10:03:35.656642 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:03:36.156617217 +0000 UTC m=+148.548668308 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.676042 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-np52q" event={"ID":"870c64cb-d3e9-4491-9c83-31c51100256a","Type":"ContainerStarted","Data":"b4acfe51d1bbd7af667809a198260f1f3f94373c4f4bf4f5138b90512f226fc2"} Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.676090 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-np52q" event={"ID":"870c64cb-d3e9-4491-9c83-31c51100256a","Type":"ContainerStarted","Data":"e5b32e56acc5e57f83ef9a7a907a3c8e49e6ace315fa0f3545027201edbd4b38"} Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.709080 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zvz4c" event={"ID":"bb35474f-f199-45c9-8a0d-3c7c93268ef6","Type":"ContainerStarted","Data":"bc8efc74ce016dcc2926f0e0258ac352b92eddc14d1a2c08e6d8aeb09d55f737"} Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.709123 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zvz4c" event={"ID":"bb35474f-f199-45c9-8a0d-3c7c93268ef6","Type":"ContainerStarted","Data":"fec5ddae4181271d678df193a58f480aa2d804d57429852c7c08b785a234d08f"} Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.717678 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-79h6t" event={"ID":"0da3a8de-5a81-4bfb-87d0-756273bfefb3","Type":"ContainerStarted","Data":"9510a3dbcae387babc7eac76aba26cb3b62e7187ddc15c495b56d5e07b3b0028"} Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.752888 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:35 crc kubenswrapper[5002]: E1209 10:03:35.754363 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 10:03:36.254350331 +0000 UTC m=+148.646401412 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-82dlm" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.779647 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lb5s8"] Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.796108 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6" event={"ID":"e2598866-d004-41a0-b058-01fb9a379df5","Type":"ContainerStarted","Data":"34ade61866cdd6a700bff8d377b79d4e7de7f055a5300de170e234c2d76a07ce"} Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.796751 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6" Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.834017 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c9pnt" event={"ID":"fef799ad-c87c-4f60-9b8e-afb51958013d","Type":"ContainerStarted","Data":"1953c98e8c09876af85ad4075d814391ca089b221cd8d169608db865976b9f86"} Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.835001 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c9pnt" Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.855121 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 10:03:35 crc kubenswrapper[5002]: E1209 10:03:35.855249 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:03:36.355207414 +0000 UTC m=+148.747258495 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.855481 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:35 crc kubenswrapper[5002]: E1209 10:03:35.857607 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 10:03:36.357594812 +0000 UTC m=+148.749645893 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-82dlm" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.858331 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-rwv2p" event={"ID":"66daadf1-5a22-4a90-a004-68fc7848d557","Type":"ContainerStarted","Data":"88bcc5c82ca50d2c5e949b0d4ae53991aea82b162ca10bb08e8be1ce2b9dfcce"} Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.876140 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c9pnt" Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.878338 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-f2pn5"] Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.903447 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-xp9jk" event={"ID":"58ef0377-76a5-4299-999f-7bdd55606533","Type":"ContainerStarted","Data":"76618f79a2a893eef2a62c7b2bd6f8a0670cad5988c0584cc68d5ea595bc5492"} Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.903496 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-xp9jk" Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.933346 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-mtwnb" Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.943525 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-pm68f"] Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.958222 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 10:03:35 crc kubenswrapper[5002]: E1209 10:03:35.959913 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:03:36.459890316 +0000 UTC m=+148.851941447 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.962038 5002 patch_prober.go:28] interesting pod/downloads-7954f5f757-xp9jk container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Dec 09 10:03:35 crc kubenswrapper[5002]: I1209 10:03:35.962093 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xp9jk" podUID="58ef0377-76a5-4299-999f-7bdd55606533" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Dec 09 10:03:36 crc kubenswrapper[5002]: I1209 10:03:36.064562 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:36 crc kubenswrapper[5002]: E1209 10:03:36.064970 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 10:03:36.56495348 +0000 UTC m=+148.957004561 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-82dlm" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:36 crc kubenswrapper[5002]: I1209 10:03:36.162616 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6" Dec 09 10:03:36 crc kubenswrapper[5002]: I1209 10:03:36.166860 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 10:03:36 crc kubenswrapper[5002]: E1209 10:03:36.169455 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:03:36.669433177 +0000 UTC m=+149.061484258 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:36 crc kubenswrapper[5002]: I1209 10:03:36.210625 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-lm8br"] Dec 09 10:03:36 crc kubenswrapper[5002]: I1209 10:03:36.246884 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r22tz"] Dec 09 10:03:36 crc kubenswrapper[5002]: I1209 10:03:36.270780 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:36 crc kubenswrapper[5002]: E1209 10:03:36.272458 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 10:03:36.772443911 +0000 UTC m=+149.164494992 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-82dlm" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:36 crc kubenswrapper[5002]: I1209 10:03:36.373070 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 10:03:36 crc kubenswrapper[5002]: E1209 10:03:36.373326 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:03:36.873293804 +0000 UTC m=+149.265344885 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:36 crc kubenswrapper[5002]: I1209 10:03:36.379208 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:36 crc kubenswrapper[5002]: E1209 10:03:36.379549 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 10:03:36.879538853 +0000 UTC m=+149.271589934 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-82dlm" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:36 crc kubenswrapper[5002]: I1209 10:03:36.485252 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 10:03:36 crc kubenswrapper[5002]: E1209 10:03:36.485952 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:03:36.985931434 +0000 UTC m=+149.377982515 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:36 crc kubenswrapper[5002]: I1209 10:03:36.591624 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:36 crc kubenswrapper[5002]: E1209 10:03:36.591980 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 10:03:37.091968316 +0000 UTC m=+149.484019397 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-82dlm" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:36 crc kubenswrapper[5002]: I1209 10:03:36.592880 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r24z6"] Dec 09 10:03:36 crc kubenswrapper[5002]: I1209 10:03:36.688331 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-mtwnb" podStartSLOduration=126.68831278 podStartE2EDuration="2m6.68831278s" podCreationTimestamp="2025-12-09 10:01:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:03:36.683918044 +0000 UTC m=+149.075969125" watchObservedRunningTime="2025-12-09 10:03:36.68831278 +0000 UTC m=+149.080363941" Dec 09 10:03:36 crc kubenswrapper[5002]: I1209 10:03:36.692793 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 10:03:36 crc kubenswrapper[5002]: E1209 10:03:36.696476 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:03:37.196449902 +0000 UTC m=+149.588500993 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:36 crc kubenswrapper[5002]: I1209 10:03:36.699480 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:36 crc kubenswrapper[5002]: E1209 10:03:36.699859 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 10:03:37.19984907 +0000 UTC m=+149.591900151 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-82dlm" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:36 crc kubenswrapper[5002]: I1209 10:03:36.728304 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ffzj8" podStartSLOduration=126.728289623 podStartE2EDuration="2m6.728289623s" podCreationTimestamp="2025-12-09 10:01:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:03:36.726949384 +0000 UTC m=+149.119000475" watchObservedRunningTime="2025-12-09 10:03:36.728289623 +0000 UTC m=+149.120340704" Dec 09 10:03:36 crc kubenswrapper[5002]: I1209 10:03:36.782125 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-k6s7s" podStartSLOduration=126.782103841 podStartE2EDuration="2m6.782103841s" podCreationTimestamp="2025-12-09 10:01:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:03:36.778059605 +0000 UTC m=+149.170110686" watchObservedRunningTime="2025-12-09 10:03:36.782103841 +0000 UTC m=+149.174154922" Dec 09 10:03:36 crc kubenswrapper[5002]: I1209 10:03:36.803230 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 10:03:36 crc kubenswrapper[5002]: E1209 10:03:36.803544 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:03:37.303529663 +0000 UTC m=+149.695580744 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:36 crc kubenswrapper[5002]: I1209 10:03:36.819913 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-xp9jk" podStartSLOduration=126.819889541 podStartE2EDuration="2m6.819889541s" podCreationTimestamp="2025-12-09 10:01:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:03:36.819520841 +0000 UTC m=+149.211571932" watchObservedRunningTime="2025-12-09 10:03:36.819889541 +0000 UTC m=+149.211940622" Dec 09 10:03:36 crc kubenswrapper[5002]: I1209 10:03:36.897114 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-cmm6f" podStartSLOduration=125.897089898 podStartE2EDuration="2m5.897089898s" podCreationTimestamp="2025-12-09 10:01:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:03:36.861037507 +0000 UTC m=+149.253088588" watchObservedRunningTime="2025-12-09 10:03:36.897089898 +0000 UTC m=+149.289140979" Dec 09 10:03:36 crc kubenswrapper[5002]: I1209 10:03:36.897888 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c9pnt" podStartSLOduration=125.897882111 podStartE2EDuration="2m5.897882111s" podCreationTimestamp="2025-12-09 10:01:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:03:36.89646203 +0000 UTC m=+149.288513111" watchObservedRunningTime="2025-12-09 10:03:36.897882111 +0000 UTC m=+149.289933192" Dec 09 10:03:36 crc kubenswrapper[5002]: I1209 10:03:36.904206 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:36 crc kubenswrapper[5002]: E1209 10:03:36.904551 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 10:03:37.404539191 +0000 UTC m=+149.796590272 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-82dlm" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:36 crc kubenswrapper[5002]: I1209 10:03:36.923156 5002 generic.go:334] "Generic (PLEG): container finished" podID="2ca986cd-d4df-4af5-8918-33f445a52f49" containerID="6a343d1cd70448ece06bc77942b8895e4f5fb37f13410be1de80f35eeab10c5e" exitCode=0 Dec 09 10:03:36 crc kubenswrapper[5002]: I1209 10:03:36.923213 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-899rx" event={"ID":"2ca986cd-d4df-4af5-8918-33f445a52f49","Type":"ContainerDied","Data":"6a343d1cd70448ece06bc77942b8895e4f5fb37f13410be1de80f35eeab10c5e"} Dec 09 10:03:36 crc kubenswrapper[5002]: I1209 10:03:36.925967 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-5gtc8" event={"ID":"99f91e54-9dc7-43ca-86ee-4258c22c6004","Type":"ContainerStarted","Data":"856ae2aeb19437a4799e4ebf7a0bb757abae737450a73e8c66cd7234ad85bff8"} Dec 09 10:03:36 crc kubenswrapper[5002]: I1209 10:03:36.928447 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lb5s8" event={"ID":"4d9cdd76-7711-48de-a449-98e306ba4fe7","Type":"ContainerStarted","Data":"b165d98c2baa0b75dbe8928b7faf20ee4ab4bcf19bea8703c74db079537e8862"} Dec 09 10:03:36 crc kubenswrapper[5002]: I1209 10:03:36.929466 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r22tz" event={"ID":"0af4f459-58c9-4433-9faf-5e669262fb2e","Type":"ContainerStarted","Data":"dd24ad34b25377056a71f327de1ed561595a1f897b2056796a98f4ccb9729238"} Dec 09 10:03:36 crc kubenswrapper[5002]: I1209 10:03:36.939993 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wxfc6" event={"ID":"e8a05f51-2e2c-4918-80a6-201afd3d14a5","Type":"ContainerStarted","Data":"7f3732304009b7fa24bdcfe56cf7737fe5dfc3abc60a15c8250a052a240b136f"} Dec 09 10:03:36 crc kubenswrapper[5002]: I1209 10:03:36.971101 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pm68f" event={"ID":"ab422596-7d3f-4a87-b7ac-c62e68465bb1","Type":"ContainerStarted","Data":"8c5681b04fb0c735f4649a976033531c59713b39d6a4429e8f94aefc1b356fb6"} Dec 09 10:03:36 crc kubenswrapper[5002]: I1209 10:03:36.974598 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-lxrwt" event={"ID":"bf7272a5-afe4-4ad1-bc7e-2c2397b84a95","Type":"ContainerStarted","Data":"0ae73abf821c54d4119676d7400b53225a009bb128f8e515e010e84de42d0d65"} Dec 09 10:03:36 crc kubenswrapper[5002]: I1209 10:03:36.984100 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dhngp"] Dec 09 10:03:36 crc kubenswrapper[5002]: I1209 10:03:36.984352 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-np52q" podStartSLOduration=126.984322112 
podStartE2EDuration="2m6.984322112s" podCreationTimestamp="2025-12-09 10:01:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:03:36.968088638 +0000 UTC m=+149.360139719" watchObservedRunningTime="2025-12-09 10:03:36.984322112 +0000 UTC m=+149.376373193" Dec 09 10:03:37 crc kubenswrapper[5002]: I1209 10:03:37.001281 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ffzj8" event={"ID":"14ca2bab-e020-4c2d-be3c-e4138095c149","Type":"ContainerStarted","Data":"915a60efe4bb94a31ffa3f40bcdd85800607042f5cc7150bf2e06cdc5e659b69"} Dec 09 10:03:37 crc kubenswrapper[5002]: I1209 10:03:37.008665 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 10:03:37 crc kubenswrapper[5002]: E1209 10:03:37.009887 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:03:37.509867372 +0000 UTC m=+149.901918463 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:37 crc kubenswrapper[5002]: I1209 10:03:37.029327 5002 generic.go:334] "Generic (PLEG): container finished" podID="8a5ea199-20b4-419b-97bc-0845ac5bb720" containerID="bbd5beee4de9609175417105361d216b6bf6d2202965c57806299eb0ca1d0529" exitCode=0 Dec 09 10:03:37 crc kubenswrapper[5002]: I1209 10:03:37.029408 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-99l4h" event={"ID":"8a5ea199-20b4-419b-97bc-0845ac5bb720","Type":"ContainerDied","Data":"bbd5beee4de9609175417105361d216b6bf6d2202965c57806299eb0ca1d0529"} Dec 09 10:03:37 crc kubenswrapper[5002]: I1209 10:03:37.058377 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-n967v" event={"ID":"95a7b196-74b4-4d67-a0e0-3e4b92e468ed","Type":"ContainerStarted","Data":"e9caf65bafab7832ef59432a756beb5c823c0967b6ee7935eb0976ea6de5c45a"} Dec 09 10:03:37 crc kubenswrapper[5002]: I1209 10:03:37.118849 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:37 crc kubenswrapper[5002]: E1209 10:03:37.122742 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" 
failed. No retries permitted until 2025-12-09 10:03:37.622712428 +0000 UTC m=+150.014763519 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-82dlm" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:37 crc kubenswrapper[5002]: I1209 10:03:37.125050 5002 generic.go:334] "Generic (PLEG): container finished" podID="84005632-49e7-400a-8748-5f16909cfda1" containerID="f9c39d5e658a2e24d8413126c060b406b376c9058e5666c3bc774ec0eb3e12b8" exitCode=0 Dec 09 10:03:37 crc kubenswrapper[5002]: I1209 10:03:37.125139 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nmgxj" event={"ID":"84005632-49e7-400a-8748-5f16909cfda1","Type":"ContainerDied","Data":"f9c39d5e658a2e24d8413126c060b406b376c9058e5666c3bc774ec0eb3e12b8"} Dec 09 10:03:37 crc kubenswrapper[5002]: I1209 10:03:37.167150 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-f2pn5" event={"ID":"250ef544-c81b-4487-9eba-fcce62841a3d","Type":"ContainerStarted","Data":"763dba9667ed4b212c1ba3ae7ab5fb69d233e0bd91a5e534e72e23f98415514e"} Dec 09 10:03:37 crc kubenswrapper[5002]: I1209 10:03:37.223060 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 10:03:37 crc kubenswrapper[5002]: E1209 10:03:37.223611 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:03:37.723597272 +0000 UTC m=+150.115648353 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 10:03:37 crc kubenswrapper[5002]: I1209 10:03:37.232130 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-79h6t" event={"ID":"0da3a8de-5a81-4bfb-87d0-756273bfefb3","Type":"ContainerStarted","Data":"d567e14fc7a1b80a71437fb31e6759cac9aa9f05f478843b71de61f8e9c35b47"}
Dec 09 10:03:37 crc kubenswrapper[5002]: I1209 10:03:37.237664 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-kpldz" event={"ID":"9f2b7402-d357-4912-9cb3-b0976966ad20","Type":"ContainerStarted","Data":"388f82b7229896f3064d5f07eab4e9c386611ace5e7435fd4f440aaa77d76533"}
Dec 09 10:03:37 crc kubenswrapper[5002]: I1209 10:03:37.238574 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-lm8br" event={"ID":"32959a40-5fd2-485b-9fff-373e5c39a86b","Type":"ContainerStarted","Data":"633dad07997cc703d4edfc2cd877a4c74bb000f634114ee72ecef26c8873b7a6"}
Dec 09 10:03:37 crc kubenswrapper[5002]: I1209 10:03:37.243148 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-2s9w5" event={"ID":"a70e2957-fa0c-4393-adaf-93a0759263d1","Type":"ContainerStarted","Data":"cc4707fadb0f1774f1b5e9ade5ff53cdd60eaf8cb6c19e5d51ec24e99d5d0a34"}
Dec 09 10:03:37 crc kubenswrapper[5002]: I1209 10:03:37.257232 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-np52q" event={"ID":"870c64cb-d3e9-4491-9c83-31c51100256a","Type":"ContainerStarted","Data":"4d7f11f73d872a86312f89ffc2d5e33ec384a9427645e355ac72b29f33819e75"}
Dec 09 10:03:37 crc kubenswrapper[5002]: I1209 10:03:37.260608 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"7fc1bd34863a41b2aa909539a84bcf03fda5b68b409517c79174c471154abb17"}
Dec 09 10:03:37 crc kubenswrapper[5002]: I1209 10:03:37.260648 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 09 10:03:37 crc kubenswrapper[5002]: I1209 10:03:37.264645 5002 patch_prober.go:28] interesting pod/downloads-7954f5f757-xp9jk container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body=
Dec 09 10:03:37 crc kubenswrapper[5002]: I1209 10:03:37.264700 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xp9jk" podUID="58ef0377-76a5-4299-999f-7bdd55606533" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused"
Dec 09 10:03:37 crc kubenswrapper[5002]: I1209 10:03:37.311396 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-rwv2p" podStartSLOduration=127.311371271 podStartE2EDuration="2m7.311371271s" podCreationTimestamp="2025-12-09 10:01:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:03:37.277433821 +0000 UTC m=+149.669484902" watchObservedRunningTime="2025-12-09 10:03:37.311371271 +0000 UTC m=+149.703422452"
Dec 09 10:03:37 crc kubenswrapper[5002]: I1209 10:03:37.312115 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zvz4c" podStartSLOduration=127.312110092 podStartE2EDuration="2m7.312110092s" podCreationTimestamp="2025-12-09 10:01:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:03:37.311029961 +0000 UTC m=+149.703081062" watchObservedRunningTime="2025-12-09 10:03:37.312110092 +0000 UTC m=+149.704161163"
Dec 09 10:03:37 crc kubenswrapper[5002]: I1209 10:03:37.325971 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm"
Dec 09 10:03:37 crc kubenswrapper[5002]: E1209 10:03:37.329855 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 10:03:37.829837299 +0000 UTC m=+150.221888380 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-82dlm" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 10:03:37 crc kubenswrapper[5002]: I1209 10:03:37.408197 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-n967v" podStartSLOduration=127.408173728 podStartE2EDuration="2m7.408173728s" podCreationTimestamp="2025-12-09 10:01:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:03:37.40647207 +0000 UTC m=+149.798523171" watchObservedRunningTime="2025-12-09 10:03:37.408173728 +0000 UTC m=+149.800224809"
Dec 09 10:03:37 crc kubenswrapper[5002]: I1209 10:03:37.409631 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6" podStartSLOduration=127.40961837 podStartE2EDuration="2m7.40961837s" podCreationTimestamp="2025-12-09 10:01:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:03:37.343483989 +0000 UTC m=+149.735535070" watchObservedRunningTime="2025-12-09 10:03:37.40961837 +0000 UTC m=+149.801669451"
Dec 09 10:03:37 crc kubenswrapper[5002]: I1209 10:03:37.428491 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-95fhm"]
Dec 09 10:03:37 crc kubenswrapper[5002]: I1209 10:03:37.431763 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 09 10:03:37 crc kubenswrapper[5002]: E1209 10:03:37.432537 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:03:37.932517994 +0000 UTC m=+150.324569075 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 10:03:37 crc kubenswrapper[5002]: I1209 10:03:37.485360 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2sjdm"]
Dec 09 10:03:37 crc kubenswrapper[5002]: W1209 10:03:37.497898 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea6bb106_46e5_4ae3_8bf2_4c07c9106a92.slice/crio-b4492b09d8a64618753f5be477e53d591673f8ab490be1cc506214c321afca2f WatchSource:0}: Error finding container b4492b09d8a64618753f5be477e53d591673f8ab490be1cc506214c321afca2f: Status 404 returned error can't find the container with id b4492b09d8a64618753f5be477e53d591673f8ab490be1cc506214c321afca2f
Dec 09 10:03:37 crc kubenswrapper[5002]: I1209 10:03:37.533550 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm"
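The two errors repeating through this window are one root cause seen from both sides: the kubelet is tearing down pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 for the deleted pod 8f668bae-612b-4b75-9490-919e737c6a3b while mounting the same volume for the replacement pod image-registry-697d97f7c8-82dlm, and both paths fail because the CSI driver kubevirt.io.hostpath-provisioner has not yet registered with the kubelet. After each failure, nestedpendingoperations gates the operation ("No retries permitted until ... durationBeforeRetry 500ms"). A minimal Go sketch of that retry gate, illustrative only and not the kubelet's actual implementation:

    package main

    import (
        "errors"
        "fmt"
        "time"
    )

    // pendingOp models the "No retries permitted until <t>" bookkeeping
    // visible in the nestedpendingoperations.go:348 records above.
    type pendingOp struct {
        retryAfter time.Time     // earliest time the next attempt may run
        backoff    time.Duration // durationBeforeRetry
    }

    func (o *pendingOp) try(run func() error) error {
        now := time.Now()
        if now.Before(o.retryAfter) {
            return fmt.Errorf("no retries permitted until %v (durationBeforeRetry %v)",
                o.retryAfter, o.backoff)
        }
        if err := run(); err != nil {
            o.retryAfter = now.Add(o.backoff) // gate the next attempt
            return err
        }
        return nil
    }

    func main() {
        op := &pendingOp{backoff: 500 * time.Millisecond}
        attempt := func() error {
            // The same failure every time, until the driver registers.
            return errors.New("driver name kubevirt.io.hostpath-provisioner " +
                "not found in the list of registered CSI drivers")
        }
        for i := 0; i < 3; i++ {
            if err := op.try(attempt); err != nil {
                fmt.Println("attempt", i, "->", err)
            }
            time.Sleep(600 * time.Millisecond) // outlive the 500ms gate
        }
    }

The loop resolves on its own once the driver registers; the csi-hostpathplugin pod that provides it starts later in this log.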
Dec 09 10:03:37 crc kubenswrapper[5002]: E1209 10:03:37.534003 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 10:03:38.033986855 +0000 UTC m=+150.426037936 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-82dlm" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 10:03:37 crc kubenswrapper[5002]: I1209 10:03:37.552318 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4n59b"]
Dec 09 10:03:37 crc kubenswrapper[5002]: I1209 10:03:37.571095 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-mbrln"]
Dec 09 10:03:37 crc kubenswrapper[5002]: I1209 10:03:37.599880 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zrfsb"]
Dec 09 10:03:37 crc kubenswrapper[5002]: I1209 10:03:37.601869 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-79h6t" podStartSLOduration=127.601845835 podStartE2EDuration="2m7.601845835s" podCreationTimestamp="2025-12-09 10:01:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:03:37.595684529 +0000 UTC m=+149.987735610" watchObservedRunningTime="2025-12-09 10:03:37.601845835 +0000 UTC m=+149.993896926"
Dec 09 10:03:37 crc kubenswrapper[5002]: W1209 10:03:37.628988 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb80223e8_1b01_446b_8db3_6f39c3455dca.slice/crio-11d6b5e38f8d86f8ea53402224ff436fd36bc51506395cc6ae3914b25cc794a0 WatchSource:0}: Error finding container 11d6b5e38f8d86f8ea53402224ff436fd36bc51506395cc6ae3914b25cc794a0: Status 404 returned error can't find the container with id 11d6b5e38f8d86f8ea53402224ff436fd36bc51506395cc6ae3914b25cc794a0
Dec 09 10:03:37 crc kubenswrapper[5002]: I1209 10:03:37.637298 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 09 10:03:37 crc kubenswrapper[5002]: E1209 10:03:37.637669 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:03:38.137655018 +0000 UTC m=+150.529706099 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 10:03:37 crc kubenswrapper[5002]: W1209 10:03:37.672878 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded519358_adaa_4396_a436_0695d79eb551.slice/crio-6d5da747f7fab8542fbdd7df8d3aada84f341da29ca310054c1c5c6e570a21ca WatchSource:0}: Error finding container 6d5da747f7fab8542fbdd7df8d3aada84f341da29ca310054c1c5c6e570a21ca: Status 404 returned error can't find the container with id 6d5da747f7fab8542fbdd7df8d3aada84f341da29ca310054c1c5c6e570a21ca
Dec 09 10:03:37 crc kubenswrapper[5002]: I1209 10:03:37.681044 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r2fkt"]
Dec 09 10:03:37 crc kubenswrapper[5002]: I1209 10:03:37.738589 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm"
Dec 09 10:03:37 crc kubenswrapper[5002]: E1209 10:03:37.738954 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 10:03:38.238943393 +0000 UTC m=+150.630994474 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-82dlm" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 10:03:37 crc kubenswrapper[5002]: I1209 10:03:37.795015 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-xv9r5"]
Dec 09 10:03:37 crc kubenswrapper[5002]: I1209 10:03:37.841106 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 09 10:03:37 crc kubenswrapper[5002]: E1209 10:03:37.858061 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:03:38.358019797 +0000 UTC m=+150.750070878 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 10:03:37 crc kubenswrapper[5002]: I1209 10:03:37.921773 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-rbp2p"]
Dec 09 10:03:37 crc kubenswrapper[5002]: I1209 10:03:37.946113 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421240-5jmzz"]
Dec 09 10:03:37 crc kubenswrapper[5002]: I1209 10:03:37.948640 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm"
Dec 09 10:03:37 crc kubenswrapper[5002]: E1209 10:03:37.949036 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 10:03:38.449022649 +0000 UTC m=+150.841073730 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-82dlm" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 10:03:37 crc kubenswrapper[5002]: I1209 10:03:37.978026 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 09 10:03:37 crc kubenswrapper[5002]: I1209 10:03:37.978092 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 09 10:03:37 crc kubenswrapper[5002]: I1209 10:03:37.989412 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pdrd"]
Dec 09 10:03:38 crc kubenswrapper[5002]: I1209 10:03:38.001292 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-5z75v"]
Dec 09 10:03:38 crc kubenswrapper[5002]: I1209 10:03:38.027724 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-fm26s"]
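The probe records here follow a fixed two-line pattern: patch_prober.go:28 logs the raw check output, then prober.go:107 records the failure verdict. A "connect: connection refused" output simply means the kubelet dialed the container's port before anything was listening. A minimal Go sketch of such an HTTP check (the probe helper below is hypothetical, not kubelet code; 127.0.0.1:8798 mirrors the machine-config-daemon health endpoint above):

    package main

    import (
        "fmt"
        "net/http"
        "time"
    )

    // probe performs one HTTP GET, the way a readiness/liveness probe would.
    func probe(url string) string {
        client := &http.Client{Timeout: time.Second}
        resp, err := client.Get(url)
        if err != nil {
            // With no listener bound, this is the "connection refused" case.
            return fmt.Sprintf("failure output=%q", err.Error())
        }
        defer resp.Body.Close()
        if resp.StatusCode >= 200 && resp.StatusCode < 400 {
            return "success"
        }
        return fmt.Sprintf("failure: HTTP probe failed with statuscode: %d", resp.StatusCode)
    }

    func main() {
        fmt.Println(probe("http://127.0.0.1:8798/health"))
    }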
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 10:03:38 crc kubenswrapper[5002]: E1209 10:03:38.050376 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:03:38.550362436 +0000 UTC m=+150.942413517 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:38 crc kubenswrapper[5002]: I1209 10:03:38.097150 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zbsjj"] Dec 09 10:03:38 crc kubenswrapper[5002]: W1209 10:03:38.140019 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24664ba4_8500_4f9f_bd2a_c7ecd2b114dc.slice/crio-413026deb2354a87da96769d6146dcbdf440e284e3e2062a77e3532320b00285 WatchSource:0}: Error finding container 413026deb2354a87da96769d6146dcbdf440e284e3e2062a77e3532320b00285: Status 404 returned error can't find the container with id 413026deb2354a87da96769d6146dcbdf440e284e3e2062a77e3532320b00285 Dec 09 10:03:38 crc kubenswrapper[5002]: I1209 10:03:38.153998 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:38 crc kubenswrapper[5002]: E1209 10:03:38.158216 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 10:03:38.658198088 +0000 UTC m=+151.050249169 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-82dlm" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:38 crc kubenswrapper[5002]: I1209 10:03:38.169299 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-x5gjs"] Dec 09 10:03:38 crc kubenswrapper[5002]: I1209 10:03:38.265422 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 10:03:38 crc kubenswrapper[5002]: E1209 10:03:38.266253 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:03:38.766236117 +0000 UTC m=+151.158287198 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:38 crc kubenswrapper[5002]: I1209 10:03:38.325126 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-f2pn5" event={"ID":"250ef544-c81b-4487-9eba-fcce62841a3d","Type":"ContainerStarted","Data":"dc38a1a11c6e4de47b9c4af6e335420b252d933898b4dc2a4ab7965b95a2eda6"} Dec 09 10:03:38 crc kubenswrapper[5002]: I1209 10:03:38.325185 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-f2pn5" event={"ID":"250ef544-c81b-4487-9eba-fcce62841a3d","Type":"ContainerStarted","Data":"8192898c4c9bc8a1ce21a16946a0ec303112bea5589adbb611454ec596dcd6c8"} Dec 09 10:03:38 crc kubenswrapper[5002]: I1209 10:03:38.358986 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-lxrwt" event={"ID":"bf7272a5-afe4-4ad1-bc7e-2c2397b84a95","Type":"ContainerStarted","Data":"05c613cd2167d9d386a69cbf7585e4ad6a2972a6acf07455882bc0257b755e41"} Dec 09 10:03:38 crc kubenswrapper[5002]: I1209 10:03:38.369759 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:38 crc kubenswrapper[5002]: E1209 10:03:38.370977 5002 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 10:03:38.870964091 +0000 UTC m=+151.263015182 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-82dlm" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:38 crc kubenswrapper[5002]: I1209 10:03:38.407370 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nmgxj" event={"ID":"84005632-49e7-400a-8748-5f16909cfda1","Type":"ContainerStarted","Data":"b1a2316dbedbecaaf663b567fe867927d5b83452dcb504db3aacaddbe34a7575"} Dec 09 10:03:38 crc kubenswrapper[5002]: I1209 10:03:38.408772 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nmgxj" Dec 09 10:03:38 crc kubenswrapper[5002]: I1209 10:03:38.410460 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nmgxj" Dec 09 10:03:38 crc kubenswrapper[5002]: I1209 10:03:38.412871 5002 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-nmgxj container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.6:8443/livez\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Dec 09 10:03:38 crc kubenswrapper[5002]: I1209 10:03:38.412931 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nmgxj" podUID="84005632-49e7-400a-8748-5f16909cfda1" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.6:8443/livez\": dial tcp 10.217.0.6:8443: connect: connection refused" Dec 09 10:03:38 crc kubenswrapper[5002]: I1209 10:03:38.471297 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 10:03:38 crc kubenswrapper[5002]: E1209 10:03:38.472506 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:03:38.972490283 +0000 UTC m=+151.364541364 (durationBeforeRetry 500ms). 
Dec 09 10:03:38 crc kubenswrapper[5002]: E1209 10:03:38.472506 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:03:38.972490283 +0000 UTC m=+151.364541364 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 10:03:38 crc kubenswrapper[5002]: I1209 10:03:38.499366 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-5gtc8" event={"ID":"99f91e54-9dc7-43ca-86ee-4258c22c6004","Type":"ContainerStarted","Data":"5dbc7f1dee195d7cebf13aa5f4d1ba1ea7e3a61e40bc46ecf510717ffa6df580"}
Dec 09 10:03:38 crc kubenswrapper[5002]: I1209 10:03:38.537674 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-99l4h" event={"ID":"8a5ea199-20b4-419b-97bc-0845ac5bb720","Type":"ContainerStarted","Data":"63ecd720ada3f699e678f1c87826bfaced85d675994b3a5d49161bcd299e3f76"}
Dec 09 10:03:38 crc kubenswrapper[5002]: I1209 10:03:38.567844 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wxfc6" event={"ID":"e8a05f51-2e2c-4918-80a6-201afd3d14a5","Type":"ContainerStarted","Data":"698372557d2a277458a34407d7e73bc8200f34bf25da8e399edd78caad9cf936"}
Dec 09 10:03:38 crc kubenswrapper[5002]: I1209 10:03:38.577755 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm"
Dec 09 10:03:38 crc kubenswrapper[5002]: E1209 10:03:38.579174 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 10:03:39.079161242 +0000 UTC m=+151.471212323 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-82dlm" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 10:03:38 crc kubenswrapper[5002]: I1209 10:03:38.600529 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-lm8br" event={"ID":"32959a40-5fd2-485b-9fff-373e5c39a86b","Type":"ContainerStarted","Data":"ef803bbedaf206373ee73dadb9220fd2c0408daa96df71a04a5acdd34979a12e"}
Dec 09 10:03:38 crc kubenswrapper[5002]: I1209 10:03:38.629350 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mbrln" event={"ID":"ed519358-adaa-4396-a436-0695d79eb551","Type":"ContainerStarted","Data":"6d5da747f7fab8542fbdd7df8d3aada84f341da29ca310054c1c5c6e570a21ca"}
Dec 09 10:03:38 crc kubenswrapper[5002]: I1209 10:03:38.644528 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-899rx" event={"ID":"2ca986cd-d4df-4af5-8918-33f445a52f49","Type":"ContainerStarted","Data":"f0833959ea5c9d07c2e68a65d125e2c26df65ddfa0de7d44ef9643b6cfa01d54"}
Dec 09 10:03:38 crc kubenswrapper[5002]: I1209 10:03:38.644914 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-899rx"
Dec 09 10:03:38 crc kubenswrapper[5002]: I1209 10:03:38.678548 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 09 10:03:38 crc kubenswrapper[5002]: E1209 10:03:38.678992 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:03:39.178961475 +0000 UTC m=+151.571012566 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 10:03:38 crc kubenswrapper[5002]: I1209 10:03:38.687891 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r24z6" event={"ID":"495d9e9a-51ee-49cc-b870-6c990157bdf5","Type":"ContainerStarted","Data":"323dd01e88f30f7eaf50196b35f2bb92571c75425eb1c76e26f499214f593eae"}
Dec 09 10:03:38 crc kubenswrapper[5002]: I1209 10:03:38.687938 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r24z6" event={"ID":"495d9e9a-51ee-49cc-b870-6c990157bdf5","Type":"ContainerStarted","Data":"f08bfd5f8825763f079be6b2a057788de4595cc588ca3380644cc76b7ed8281b"}
Dec 09 10:03:38 crc kubenswrapper[5002]: I1209 10:03:38.688933 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r24z6"
Dec 09 10:03:38 crc kubenswrapper[5002]: I1209 10:03:38.721998 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4n59b" event={"ID":"b80223e8-1b01-446b-8db3-6f39c3455dca","Type":"ContainerStarted","Data":"11d6b5e38f8d86f8ea53402224ff436fd36bc51506395cc6ae3914b25cc794a0"}
Dec 09 10:03:38 crc kubenswrapper[5002]: I1209 10:03:38.759914 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r2fkt" event={"ID":"67316e5c-6e24-45d1-8789-eec0570baa40","Type":"ContainerStarted","Data":"f6a7b08d7014a73728fac423162a22e0b67788bba6b622d8155fb22742d8feab"}
Dec 09 10:03:38 crc kubenswrapper[5002]: I1209 10:03:38.788232 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm"
Dec 09 10:03:38 crc kubenswrapper[5002]: E1209 10:03:38.789734 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 10:03:39.289720802 +0000 UTC m=+151.681771893 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-82dlm" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 10:03:38 crc kubenswrapper[5002]: I1209 10:03:38.797533 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dhngp" event={"ID":"f579d242-665c-49be-b592-b6e539eb99c0","Type":"ContainerStarted","Data":"60e1ed61bc7edc1044feaf7e29ca6ffdff4417a300504689720a4e0f11acedc7"}
Dec 09 10:03:38 crc kubenswrapper[5002]: I1209 10:03:38.797583 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dhngp" event={"ID":"f579d242-665c-49be-b592-b6e539eb99c0","Type":"ContainerStarted","Data":"988e2a8b3890b627a531fa9e97b736d8a29e3fa9bb267af66b466d90a65ca942"}
Dec 09 10:03:38 crc kubenswrapper[5002]: I1209 10:03:38.860895 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lb5s8" event={"ID":"4d9cdd76-7711-48de-a449-98e306ba4fe7","Type":"ContainerStarted","Data":"f25691e9e42f5575920d5e38bf9494e37ea5455486b5af9e15d678cb87d3fc8d"}
Dec 09 10:03:38 crc kubenswrapper[5002]: I1209 10:03:38.873146 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r22tz" event={"ID":"0af4f459-58c9-4433-9faf-5e669262fb2e","Type":"ContainerStarted","Data":"0c742cbb1fa262a672b1489fed4f71bd947d5905575252562e0bcd875a92906c"}
Dec 09 10:03:38 crc kubenswrapper[5002]: I1209 10:03:38.891349 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 09 10:03:38 crc kubenswrapper[5002]: E1209 10:03:38.891459 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:03:39.39144137 +0000 UTC m=+151.783492451 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 10:03:38 crc kubenswrapper[5002]: I1209 10:03:38.891655 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm"
Dec 09 10:03:38 crc kubenswrapper[5002]: E1209 10:03:38.892853 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 10:03:39.39284284 +0000 UTC m=+151.784893921 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-82dlm" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 10:03:38 crc kubenswrapper[5002]: I1209 10:03:38.922522 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-f2pn5" podStartSLOduration=127.922504058 podStartE2EDuration="2m7.922504058s" podCreationTimestamp="2025-12-09 10:01:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:03:38.922500237 +0000 UTC m=+151.314551318" watchObservedRunningTime="2025-12-09 10:03:38.922504058 +0000 UTC m=+151.314555139"
Dec 09 10:03:38 crc kubenswrapper[5002]: I1209 10:03:38.927324 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-95fhm" event={"ID":"ea6bb106-46e5-4ae3-8bf2-4c07c9106a92","Type":"ContainerStarted","Data":"37d245ab391ab9f68358a71d61aa06f4fce1100af821e223a543687e04456f64"}
Dec 09 10:03:38 crc kubenswrapper[5002]: I1209 10:03:38.927373 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-95fhm" event={"ID":"ea6bb106-46e5-4ae3-8bf2-4c07c9106a92","Type":"ContainerStarted","Data":"b4492b09d8a64618753f5be477e53d591673f8ab490be1cc506214c321afca2f"}
Dec 09 10:03:38 crc kubenswrapper[5002]: I1209 10:03:38.949433 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wxfc6" podStartSLOduration=128.949418327 podStartE2EDuration="2m8.949418327s" podCreationTimestamp="2025-12-09 10:01:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:03:38.947651886 +0000 UTC m=+151.339702977" watchObservedRunningTime="2025-12-09 10:03:38.949418327 +0000 UTC m=+151.341469408"
Dec 09 10:03:38 crc kubenswrapper[5002]: I1209 10:03:38.992547 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 09 10:03:38 crc kubenswrapper[5002]: E1209 10:03:38.992657 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:03:39.492627232 +0000 UTC m=+151.884678303 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 10:03:38 crc kubenswrapper[5002]: I1209 10:03:38.992788 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm"
Dec 09 10:03:38 crc kubenswrapper[5002]: I1209 10:03:38.994855 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2sjdm" event={"ID":"9e8605e9-eb8b-4676-8d27-b5f424cb94d1","Type":"ContainerStarted","Data":"8ba7e83768fb7bbb1c4c5e48f516b79477173c3157236cb59e1a12813b3219d7"}
Dec 09 10:03:38 crc kubenswrapper[5002]: I1209 10:03:38.994888 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2sjdm"
Dec 09 10:03:38 crc kubenswrapper[5002]: I1209 10:03:38.994898 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2sjdm" event={"ID":"9e8605e9-eb8b-4676-8d27-b5f424cb94d1","Type":"ContainerStarted","Data":"509f7d11f5da8e7df2482bb4553b39fb01081374a75d35623fd91128ca71a850"}
Dec 09 10:03:39 crc kubenswrapper[5002]: E1209 10:03:39.004148 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 10:03:39.504127961 +0000 UTC m=+151.896179042 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-82dlm" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.007846 5002 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-2sjdm container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" start-of-body=
Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.007888 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2sjdm" podUID="9e8605e9-eb8b-4676-8d27-b5f424cb94d1" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused"
Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.022002 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-79h6t" event={"ID":"0da3a8de-5a81-4bfb-87d0-756273bfefb3","Type":"ContainerStarted","Data":"23d01488f1f937a779be04361593192eb4615bbb02b16d4a5dfe49e17a666a8d"}
Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.029590 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nmgxj" podStartSLOduration=128.029573258 podStartE2EDuration="2m8.029573258s" podCreationTimestamp="2025-12-09 10:01:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:03:38.995117493 +0000 UTC m=+151.387168574" watchObservedRunningTime="2025-12-09 10:03:39.029573258 +0000 UTC m=+151.421624339"
Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.036420 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5z75v" event={"ID":"f85c45d5-d9fa-418c-91ea-0f054181f4c7","Type":"ContainerStarted","Data":"6e2625429d9eff08f2f9881d19f2fd7d6faf23728233f845ed79549e2bc3d726"}
Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.048885 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pm68f" event={"ID":"ab422596-7d3f-4a87-b7ac-c62e68465bb1","Type":"ContainerStarted","Data":"fafaeae3173a0eb305f70fd044c08f4f96d79f542b8561f77717391a9f7c7fcd"}
Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.050542 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-pm68f"
Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.089777 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-899rx" podStartSLOduration=129.089750619 podStartE2EDuration="2m9.089750619s" podCreationTimestamp="2025-12-09 10:01:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:03:39.083329005 +0000 UTC m=+151.475380086" watchObservedRunningTime="2025-12-09 10:03:39.089750619 +0000 UTC m=+151.481801720"
Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.091078 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4n59b" podStartSLOduration=128.091069466 podStartE2EDuration="2m8.091069466s" podCreationTimestamp="2025-12-09 10:01:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:03:39.034158059 +0000 UTC m=+151.426209170" watchObservedRunningTime="2025-12-09 10:03:39.091069466 +0000 UTC m=+151.483120547"
Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.096263 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 09 10:03:39 crc kubenswrapper[5002]: E1209 10:03:39.098417 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:03:39.598393806 +0000 UTC m=+151.990444887 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.103436 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm"
Dec 09 10:03:39 crc kubenswrapper[5002]: E1209 10:03:39.105897 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 10:03:39.60588173 +0000 UTC m=+151.997932811 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-82dlm" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.109861 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rbp2p" event={"ID":"fb3d4992-003e-4411-a1e4-357e176545f8","Type":"ContainerStarted","Data":"6a39d5f035019b2c8ba961091ea68059bfeef827ee9480ac18536966a58d34e6"}
Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.112752 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r22tz" podStartSLOduration=128.112735906 podStartE2EDuration="2m8.112735906s" podCreationTimestamp="2025-12-09 10:01:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:03:39.112122588 +0000 UTC m=+151.504173669" watchObservedRunningTime="2025-12-09 10:03:39.112735906 +0000 UTC m=+151.504786987"
Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.124780 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-kpldz" event={"ID":"9f2b7402-d357-4912-9cb3-b0976966ad20","Type":"ContainerStarted","Data":"55fca14be047e2892c5cad78405897260f382e038dbf8b22fbfd5be0443825e3"}
Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.144764 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-lm8br" podStartSLOduration=128.144746041 podStartE2EDuration="2m8.144746041s" podCreationTimestamp="2025-12-09 10:01:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:03:39.138556464 +0000 UTC m=+151.530607545" watchObservedRunningTime="2025-12-09 10:03:39.144746041 +0000 UTC m=+151.536797122"
Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.152046 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-2s9w5" event={"ID":"a70e2957-fa0c-4393-adaf-93a0759263d1","Type":"ContainerStarted","Data":"3284bfc0e44bd02d81e8460b3fa0685d611798e29f4f806cc7b9d461304b0d95"}
Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.178692 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xv9r5" event={"ID":"54bf04d4-958e-4105-ab04-73920ec14b8f","Type":"ContainerStarted","Data":"92acbcf7d8ff715e86eaeea55f8d9c1016b85fae1134bdab36fb1473e700833a"}
Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.198269 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-lxrwt" podStartSLOduration=129.19824553 podStartE2EDuration="2m9.19824553s" podCreationTimestamp="2025-12-09 10:01:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:03:39.175590073 +0000 UTC m=+151.567641154" watchObservedRunningTime="2025-12-09 10:03:39.19824553 +0000 UTC m=+151.590296611"
Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.206518 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 09 10:03:39 crc kubenswrapper[5002]: E1209 10:03:39.206637 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:03:39.70661519 +0000 UTC m=+152.098666271 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.206994 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm"
Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.208000 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lb5s8" podStartSLOduration=129.207979669 podStartE2EDuration="2m9.207979669s" podCreationTimestamp="2025-12-09 10:01:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:03:39.20031593 +0000 UTC m=+151.592367011" watchObservedRunningTime="2025-12-09 10:03:39.207979669 +0000 UTC m=+151.600030750"
Dec 09 10:03:39 crc kubenswrapper[5002]: E1209 10:03:39.211288 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 10:03:39.711270833 +0000 UTC m=+152.103321914 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-82dlm" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.242508 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r24z6" podStartSLOduration=128.242490585 podStartE2EDuration="2m8.242490585s" podCreationTimestamp="2025-12-09 10:01:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:03:39.241381713 +0000 UTC m=+151.633432794" watchObservedRunningTime="2025-12-09 10:03:39.242490585 +0000 UTC m=+151.634541666"
Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.243104 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-5gtc8"
Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.254114 5002 patch_prober.go:28] interesting pod/router-default-5444994796-5gtc8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 09 10:03:39 crc kubenswrapper[5002]: [-]has-synced failed: reason withheld
Dec 09 10:03:39 crc kubenswrapper[5002]: [+]process-running ok
Dec 09 10:03:39 crc kubenswrapper[5002]: healthz check failed
Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.254174 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gtc8" podUID="99f91e54-9dc7-43ca-86ee-4258c22c6004" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.263183 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421240-5jmzz" event={"ID":"6e408545-84a5-4b62-ab02-e213a58d1c53","Type":"ContainerStarted","Data":"00a4fcf53a1aceb562e7a08fe6700ddd4530e5b1020944c74dbc60fd401ea8ac"}
Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.289298 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-5gtc8" podStartSLOduration=129.289275283 podStartE2EDuration="2m9.289275283s" podCreationTimestamp="2025-12-09 10:01:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:03:39.280378458 +0000 UTC m=+151.672429549" watchObservedRunningTime="2025-12-09 10:03:39.289275283 +0000 UTC m=+151.681326364"
Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.306113 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pdrd" event={"ID":"0357ba06-3248-4036-86c9-975788e67bdf","Type":"ContainerStarted","Data":"aa9e63c0432070638a559341db88763d0d36bdf070874fecaef78c33c0bb9533"}
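The router's startup probe fails differently from the connection-refused cases: its endpoint is up but answers HTTP 500, and the body lists each sub-check as [+] ok or [-] failed (backend-http and has-synced failing, process-running ok). A minimal Go sketch of an aggregated healthz handler in that style; this is an assumed shape for illustration, not the router's actual code:

    package main

    import (
        "fmt"
        "net/http"
    )

    // healthz aggregates named checks; any failure makes the whole response
    // HTTP 500, which is what a startup probe reports as "statuscode: 500".
    func healthz(checks map[string]func() error) http.HandlerFunc {
        return func(w http.ResponseWriter, r *http.Request) {
            failed := false
            body := ""
            for name, check := range checks { // map order is random; fine for a sketch
                if err := check(); err != nil {
                    failed = true
                    body += fmt.Sprintf("[-]%s failed: reason withheld\n", name)
                } else {
                    body += fmt.Sprintf("[+]%s ok\n", name)
                }
            }
            if failed {
                w.WriteHeader(http.StatusInternalServerError)
                body += "healthz check failed\n"
            }
            fmt.Fprint(w, body)
        }
    }

    func main() {
        http.HandleFunc("/healthz", healthz(map[string]func() error{
            "process-running": func() error { return nil },
            "backend-http":    func() error { return fmt.Errorf("not ready") },
        }))
        fmt.Println(http.ListenAndServe("127.0.0.1:8080", nil))
    }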
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:03:39.811433066 +0000 UTC m=+152.203484147 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.308431 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.316241 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:39 crc kubenswrapper[5002]: E1209 10:03:39.316587 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 10:03:39.816572313 +0000 UTC m=+152.208623394 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-82dlm" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.329203 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-fm26s" event={"ID":"24664ba4-8500-4f9f-bd2a-c7ecd2b114dc","Type":"ContainerStarted","Data":"413026deb2354a87da96769d6146dcbdf440e284e3e2062a77e3532320b00285"} Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.332093 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xv9r5" podStartSLOduration=129.332077016 podStartE2EDuration="2m9.332077016s" podCreationTimestamp="2025-12-09 10:01:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:03:39.331183431 +0000 UTC m=+151.723234512" watchObservedRunningTime="2025-12-09 10:03:39.332077016 +0000 UTC m=+151.724128087" Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.335640 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zbsjj" event={"ID":"58eda1c5-bb10-4843-9ed4-64ee38d040ee","Type":"ContainerStarted","Data":"f997ec069e6e15f37849acdf17bc7ab4eb22ef7212b328411bd5bbfc3c757675"} Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.336774 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-zbsjj" Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.346278 5002 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-zbsjj container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.346334 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-zbsjj" podUID="58eda1c5-bb10-4843-9ed4-64ee38d040ee" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.368863 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zrfsb" event={"ID":"20bd2a89-b1af-4bb2-9e28-862273221604","Type":"ContainerStarted","Data":"accbadcbba39ceaf1d92d113d21a78b827511cb075c85d96202bedcc409bac26"} Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.369641 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zrfsb" Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.374666 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-kpldz" podStartSLOduration=7.374647533 podStartE2EDuration="7.374647533s" podCreationTimestamp="2025-12-09 10:03:32 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:03:39.373441309 +0000 UTC m=+151.765492410" watchObservedRunningTime="2025-12-09 10:03:39.374647533 +0000 UTC m=+151.766698614" Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.380038 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zrfsb" Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.423283 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 10:03:39 crc kubenswrapper[5002]: E1209 10:03:39.424597 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:03:39.92458094 +0000 UTC m=+152.316632021 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.448245 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-2s9w5" podStartSLOduration=129.448218756 podStartE2EDuration="2m9.448218756s" podCreationTimestamp="2025-12-09 10:01:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:03:39.396838817 +0000 UTC m=+151.788889918" watchObservedRunningTime="2025-12-09 10:03:39.448218756 +0000 UTC m=+151.840269837" Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.478901 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2sjdm" podStartSLOduration=128.478885913 podStartE2EDuration="2m8.478885913s" podCreationTimestamp="2025-12-09 10:01:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:03:39.444110489 +0000 UTC m=+151.836161570" watchObservedRunningTime="2025-12-09 10:03:39.478885913 +0000 UTC m=+151.870936994" Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.521366 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-95fhm" podStartSLOduration=128.521346697 podStartE2EDuration="2m8.521346697s" podCreationTimestamp="2025-12-09 10:01:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:03:39.481054605 +0000 UTC m=+151.873105676" watchObservedRunningTime="2025-12-09 10:03:39.521346697 +0000 UTC m=+151.913397768" Dec 09 10:03:39 crc kubenswrapper[5002]: 
I1209 10:03:39.532519 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:39 crc kubenswrapper[5002]: E1209 10:03:39.532804 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 10:03:40.032793644 +0000 UTC m=+152.424844725 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-82dlm" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.579982 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29421240-5jmzz" podStartSLOduration=129.579962842 podStartE2EDuration="2m9.579962842s" podCreationTimestamp="2025-12-09 10:01:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:03:39.521490381 +0000 UTC m=+151.913541462" watchObservedRunningTime="2025-12-09 10:03:39.579962842 +0000 UTC m=+151.972013923" Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.580952 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-pm68f" podStartSLOduration=7.580946751 podStartE2EDuration="7.580946751s" podCreationTimestamp="2025-12-09 10:03:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:03:39.578968214 +0000 UTC m=+151.971019305" watchObservedRunningTime="2025-12-09 10:03:39.580946751 +0000 UTC m=+151.972997832" Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.618807 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-zbsjj" podStartSLOduration=128.618789092 podStartE2EDuration="2m8.618789092s" podCreationTimestamp="2025-12-09 10:01:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:03:39.618564436 +0000 UTC m=+152.010615517" watchObservedRunningTime="2025-12-09 10:03:39.618789092 +0000 UTC m=+152.010840183" Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.638360 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 10:03:39 crc kubenswrapper[5002]: E1209 10:03:39.638835 5002 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:03:40.138798044 +0000 UTC m=+152.530849125 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.691374 5002 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-r24z6 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.691456 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r24z6" podUID="495d9e9a-51ee-49cc-b870-6c990157bdf5" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.27:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.748671 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:39 crc kubenswrapper[5002]: E1209 10:03:39.749103 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 10:03:40.249088117 +0000 UTC m=+152.641139198 (durationBeforeRetry 500ms). 
Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.759320 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pdrd" podStartSLOduration=129.759299159 podStartE2EDuration="2m9.759299159s" podCreationTimestamp="2025-12-09 10:01:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:03:39.70687312 +0000 UTC m=+152.098924201" watchObservedRunningTime="2025-12-09 10:03:39.759299159 +0000 UTC m=+152.151350240"
Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.806290 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zrfsb" podStartSLOduration=128.806271692 podStartE2EDuration="2m8.806271692s" podCreationTimestamp="2025-12-09 10:01:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:03:39.759171495 +0000 UTC m=+152.151222586" watchObservedRunningTime="2025-12-09 10:03:39.806271692 +0000 UTC m=+152.198322773"
Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.851303 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 09 10:03:39 crc kubenswrapper[5002]: E1209 10:03:39.851445 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:03:40.351420792 +0000 UTC m=+152.743471873 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.851560 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm"
Dec 09 10:03:39 crc kubenswrapper[5002]: E1209 10:03:39.851977 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 10:03:40.351964568 +0000 UTC m=+152.744015639 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-82dlm" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.952576 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 09 10:03:39 crc kubenswrapper[5002]: E1209 10:03:39.952774 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:03:40.452751539 +0000 UTC m=+152.844802620 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 10:03:39 crc kubenswrapper[5002]: I1209 10:03:39.952825 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm"
Dec 09 10:03:39 crc kubenswrapper[5002]: E1209 10:03:39.953205 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 10:03:40.453192602 +0000 UTC m=+152.845243693 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-82dlm" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.053515 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 09 10:03:40 crc kubenswrapper[5002]: E1209 10:03:40.053718 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:03:40.553687245 +0000 UTC m=+152.945738326 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.053834 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm"
Dec 09 10:03:40 crc kubenswrapper[5002]: E1209 10:03:40.054142 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 10:03:40.554130107 +0000 UTC m=+152.946181188 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-82dlm" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.154534 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 09 10:03:40 crc kubenswrapper[5002]: E1209 10:03:40.154660 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:03:40.654643061 +0000 UTC m=+153.046694142 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.154796 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm"
Dec 09 10:03:40 crc kubenswrapper[5002]: E1209 10:03:40.155065 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 10:03:40.655056982 +0000 UTC m=+153.047108063 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-82dlm" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.244172 5002 patch_prober.go:28] interesting pod/router-default-5444994796-5gtc8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 09 10:03:40 crc kubenswrapper[5002]: [-]has-synced failed: reason withheld
Dec 09 10:03:40 crc kubenswrapper[5002]: [+]process-running ok
Dec 09 10:03:40 crc kubenswrapper[5002]: healthz check failed
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.244495 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gtc8" podUID="99f91e54-9dc7-43ca-86ee-4258c22c6004" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.255881 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 09 10:03:40 crc kubenswrapper[5002]: E1209 10:03:40.256326 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:03:40.756310807 +0000 UTC m=+153.148361888 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.358006 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm"
Dec 09 10:03:40 crc kubenswrapper[5002]: E1209 10:03:40.358356 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 10:03:40.858341824 +0000 UTC m=+153.250392905 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-82dlm" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.373376 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-x5gjs" event={"ID":"79241a31-6d92-482a-9cc3-c2370d516cbf","Type":"ContainerStarted","Data":"4980284cd9fced061a0a350bdb1838a41bf38e17cd3b0c116556b3aebf1c016a"}
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.373426 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-x5gjs" event={"ID":"79241a31-6d92-482a-9cc3-c2370d516cbf","Type":"ContainerStarted","Data":"af2b83f724dc7b99648e4fe18be2f182a3242188efae755f17c48bf882ce85bc"}
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.375443 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-99l4h" event={"ID":"8a5ea199-20b4-419b-97bc-0845ac5bb720","Type":"ContainerStarted","Data":"1f74ee09d6fafe59252568bf49fcffd7422c63c7ba6d03e240e3d33a8fdf576f"}
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.376671 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zbsjj" event={"ID":"58eda1c5-bb10-4843-9ed4-64ee38d040ee","Type":"ContainerStarted","Data":"7f187db52e6b1de8f873db3376e44a8f54b8c3929a1d134c3d504770dc0328f9"}
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.378512 5002 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-zbsjj container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body=
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.378558 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-zbsjj" podUID="58eda1c5-bb10-4843-9ed4-64ee38d040ee" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused"
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.380267 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rbp2p" event={"ID":"fb3d4992-003e-4411-a1e4-357e176545f8","Type":"ContainerStarted","Data":"968a8496d2e8a60fa39bcae3b381e29581f3e5b6865e0b422e56166155b84c3f"}
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.380304 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rbp2p" event={"ID":"fb3d4992-003e-4411-a1e4-357e176545f8","Type":"ContainerStarted","Data":"4ec4c9cc821c8dcce1ecd97b66317408d2f89484b9a572d05636d3fd34cca6d9"}
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.382049 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zrfsb" event={"ID":"20bd2a89-b1af-4bb2-9e28-862273221604","Type":"ContainerStarted","Data":"1b2d0cb0c4d8cc698d1206996f82ae5c5ea0eb29d9508afd470a50d0da6c7748"}
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.387201 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pdrd" event={"ID":"0357ba06-3248-4036-86c9-975788e67bdf","Type":"ContainerStarted","Data":"db2d962c0f30884d3281207385b4d7d841f58cc056b8d2fd4dda3ef5a0fe4578"}
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.388917 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421240-5jmzz" event={"ID":"6e408545-84a5-4b62-ab02-e213a58d1c53","Type":"ContainerStarted","Data":"07d5e13d633c647f14baf9f5c3a61ce86e7e8f34a3d20dcdfa800af8231d6639"}
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.390772 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-fm26s" event={"ID":"24664ba4-8500-4f9f-bd2a-c7ecd2b114dc","Type":"ContainerStarted","Data":"e422599dce98663267ef894b45f3a21f7d4d299beae779f2f6bbfd53738f1ef9"}
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.392538 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5z75v" event={"ID":"f85c45d5-d9fa-418c-91ea-0f054181f4c7","Type":"ContainerStarted","Data":"134a1ac2e7a4add6569b3a13aefbc7e90a1a391e49261a8cff420f35d8f0548e"}
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.393770 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mbrln" event={"ID":"ed519358-adaa-4396-a436-0695d79eb551","Type":"ContainerStarted","Data":"c92d6252adeb9c0b0f260f5b90c820c73e1897d1d1f52aae1d3f4ece74fa609a"}
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.393801 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mbrln" event={"ID":"ed519358-adaa-4396-a436-0695d79eb551","Type":"ContainerStarted","Data":"daa8d7b26b536c02b68d037ccf20bf121c7be46ab8e54718152106718e2bbf1e"}
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.396313 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xv9r5" event={"ID":"54bf04d4-958e-4105-ab04-73920ec14b8f","Type":"ContainerStarted","Data":"64e908dcb844c451a0deff90375a876dcf3830ae677628fb7a55dcbda5504864"}
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.396351 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xv9r5" event={"ID":"54bf04d4-958e-4105-ab04-73920ec14b8f","Type":"ContainerStarted","Data":"5f879a91db2da49cb92c9c4110997cbf9f790ba5bf62b3577655e3eef5cc3885"}
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.399265 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-fm26s" podStartSLOduration=8.399249103 podStartE2EDuration="8.399249103s" podCreationTimestamp="2025-12-09 10:03:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:03:39.806152218 +0000 UTC m=+152.198203329" watchObservedRunningTime="2025-12-09 10:03:40.399249103 +0000 UTC m=+152.791300184"
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.402268 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dhngp" event={"ID":"f579d242-665c-49be-b592-b6e539eb99c0","Type":"ContainerStarted","Data":"a1cdad49cdbd11f9e792dbcd51ae6d0fc6f1b4ca7d5b024f74fe84bfee9eef19"}
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.403195 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dhngp"
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.405032 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pm68f" event={"ID":"ab422596-7d3f-4a87-b7ac-c62e68465bb1","Type":"ContainerStarted","Data":"36fb5a35f9c026b3d8c3c49ff65bc3bcda3f833fd1cc37dcd46303d45792886c"}
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.406147 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4n59b" event={"ID":"b80223e8-1b01-446b-8db3-6f39c3455dca","Type":"ContainerStarted","Data":"f9e716ee7eafefa6f9c4a97b5584446407a3eb8b19d7858fc3cb3b80078d8fea"}
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.408647 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r2fkt" event={"ID":"67316e5c-6e24-45d1-8789-eec0570baa40","Type":"ContainerStarted","Data":"213b0a664cf33b11bb9692f080495fb8b2ae9e4845d278d2ca76f041ea297362"}
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.415191 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2sjdm"
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.426960 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mbrln" podStartSLOduration=129.426944845 podStartE2EDuration="2m9.426944845s" podCreationTimestamp="2025-12-09 10:01:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:03:40.426216074 +0000 UTC m=+152.818267155" watchObservedRunningTime="2025-12-09 10:03:40.426944845 +0000 UTC m=+152.818995916"
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.428251 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-99l4h" podStartSLOduration=130.428246882 podStartE2EDuration="2m10.428246882s" podCreationTimestamp="2025-12-09 10:01:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:03:40.40087192 +0000 UTC m=+152.792923021" watchObservedRunningTime="2025-12-09 10:03:40.428246882 +0000 UTC m=+152.820297963"
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.448490 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rbp2p" podStartSLOduration=129.44846878 podStartE2EDuration="2m9.44846878s" podCreationTimestamp="2025-12-09 10:01:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:03:40.445388222 +0000 UTC m=+152.837439303" watchObservedRunningTime="2025-12-09 10:03:40.44846878 +0000 UTC m=+152.840519861"
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.461276 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 09 10:03:40 crc kubenswrapper[5002]: E1209 10:03:40.461432 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:03:40.96141104 +0000 UTC m=+153.353462121 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.461520 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm"
Dec 09 10:03:40 crc kubenswrapper[5002]: E1209 10:03:40.462988 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 10:03:40.962969805 +0000 UTC m=+153.355020886 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-82dlm" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.483155 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r2fkt" podStartSLOduration=130.483134241 podStartE2EDuration="2m10.483134241s" podCreationTimestamp="2025-12-09 10:01:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:03:40.481202296 +0000 UTC m=+152.873253377" watchObservedRunningTime="2025-12-09 10:03:40.483134241 +0000 UTC m=+152.875185332"
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.498197 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dhngp" podStartSLOduration=129.498178641 podStartE2EDuration="2m9.498178641s" podCreationTimestamp="2025-12-09 10:01:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:03:40.495777263 +0000 UTC m=+152.887828354" watchObservedRunningTime="2025-12-09 10:03:40.498178641 +0000 UTC m=+152.890229722"
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.537397 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r24z6"
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.562913 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 09 10:03:40 crc kubenswrapper[5002]: E1209 10:03:40.563088 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:03:41.063063186 +0000 UTC m=+153.455114267 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.563741 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm"
Dec 09 10:03:40 crc kubenswrapper[5002]: E1209 10:03:40.565386 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 10:03:41.065368042 +0000 UTC m=+153.457419243 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-82dlm" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.668051 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 09 10:03:40 crc kubenswrapper[5002]: E1209 10:03:40.668385 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:03:41.168369407 +0000 UTC m=+153.560420488 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.764731 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2t8ql"]
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.765909 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2t8ql"
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.773227 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm"
Dec 09 10:03:40 crc kubenswrapper[5002]: E1209 10:03:40.773538 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 10:03:41.273527383 +0000 UTC m=+153.665578464 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-82dlm" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.774071 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.782722 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2t8ql"]
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.874692 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 09 10:03:40 crc kubenswrapper[5002]: E1209 10:03:40.874964 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:03:41.374934422 +0000 UTC m=+153.766985503 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.875155 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39e38e20-dd9f-43b8-848f-e5618d52f6af-catalog-content\") pod \"community-operators-2t8ql\" (UID: \"39e38e20-dd9f-43b8-848f-e5618d52f6af\") " pod="openshift-marketplace/community-operators-2t8ql"
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.875292 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39e38e20-dd9f-43b8-848f-e5618d52f6af-utilities\") pod \"community-operators-2t8ql\" (UID: \"39e38e20-dd9f-43b8-848f-e5618d52f6af\") " pod="openshift-marketplace/community-operators-2t8ql"
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.875335 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whdbq\" (UniqueName: \"kubernetes.io/projected/39e38e20-dd9f-43b8-848f-e5618d52f6af-kube-api-access-whdbq\") pod \"community-operators-2t8ql\" (UID: \"39e38e20-dd9f-43b8-848f-e5618d52f6af\") " pod="openshift-marketplace/community-operators-2t8ql"
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.875402 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm"
Dec 09 10:03:40 crc kubenswrapper[5002]: E1209 10:03:40.875764 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 10:03:41.375753845 +0000 UTC m=+153.767804926 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-82dlm" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.939247 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lwx7v"]
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.940148 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lwx7v"
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.945765 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.958012 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lwx7v"]
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.976493 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 09 10:03:40 crc kubenswrapper[5002]: E1209 10:03:40.976659 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:03:41.476634519 +0000 UTC m=+153.868685600 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.976769 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39e38e20-dd9f-43b8-848f-e5618d52f6af-catalog-content\") pod \"community-operators-2t8ql\" (UID: \"39e38e20-dd9f-43b8-848f-e5618d52f6af\") " pod="openshift-marketplace/community-operators-2t8ql"
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.976877 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39e38e20-dd9f-43b8-848f-e5618d52f6af-utilities\") pod \"community-operators-2t8ql\" (UID: \"39e38e20-dd9f-43b8-848f-e5618d52f6af\") " pod="openshift-marketplace/community-operators-2t8ql"
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.976909 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whdbq\" (UniqueName: \"kubernetes.io/projected/39e38e20-dd9f-43b8-848f-e5618d52f6af-kube-api-access-whdbq\") pod \"community-operators-2t8ql\" (UID: \"39e38e20-dd9f-43b8-848f-e5618d52f6af\") " pod="openshift-marketplace/community-operators-2t8ql"
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.976943 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm"
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.977194 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39e38e20-dd9f-43b8-848f-e5618d52f6af-catalog-content\") pod \"community-operators-2t8ql\" (UID: \"39e38e20-dd9f-43b8-848f-e5618d52f6af\") " pod="openshift-marketplace/community-operators-2t8ql"
Dec 09 10:03:40 crc kubenswrapper[5002]: E1209 10:03:40.977245 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 10:03:41.477230506 +0000 UTC m=+153.869281587 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-82dlm" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 10:03:40 crc kubenswrapper[5002]: I1209 10:03:40.977342 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39e38e20-dd9f-43b8-848f-e5618d52f6af-utilities\") pod \"community-operators-2t8ql\" (UID: \"39e38e20-dd9f-43b8-848f-e5618d52f6af\") " pod="openshift-marketplace/community-operators-2t8ql"
Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.015032 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whdbq\" (UniqueName: \"kubernetes.io/projected/39e38e20-dd9f-43b8-848f-e5618d52f6af-kube-api-access-whdbq\") pod \"community-operators-2t8ql\" (UID: \"39e38e20-dd9f-43b8-848f-e5618d52f6af\") " pod="openshift-marketplace/community-operators-2t8ql"
Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.077556 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.077782 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c7271b9-c21f-44a0-ad27-f4158f36f317-utilities\") pod \"certified-operators-lwx7v\" (UID: \"2c7271b9-c21f-44a0-ad27-f4158f36f317\") " pod="openshift-marketplace/certified-operators-lwx7v"
Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.077903 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wl4j\" (UniqueName: \"kubernetes.io/projected/2c7271b9-c21f-44a0-ad27-f4158f36f317-kube-api-access-6wl4j\") pod \"certified-operators-lwx7v\" (UID: \"2c7271b9-c21f-44a0-ad27-f4158f36f317\") " pod="openshift-marketplace/certified-operators-lwx7v"
Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.077961 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c7271b9-c21f-44a0-ad27-f4158f36f317-catalog-content\") pod \"certified-operators-lwx7v\" (UID: \"2c7271b9-c21f-44a0-ad27-f4158f36f317\") " pod="openshift-marketplace/certified-operators-lwx7v"
Dec 09 10:03:41 crc kubenswrapper[5002]: E1209 10:03:41.078091 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:03:41.578073089 +0000 UTC m=+153.970124170 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.090037 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2t8ql"
Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.142220 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kk56c"]
Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.143109 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kk56c"
Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.158702 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kk56c"]
Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.177936 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-899rx"
Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.179018 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c7271b9-c21f-44a0-ad27-f4158f36f317-catalog-content\") pod \"certified-operators-lwx7v\" (UID: \"2c7271b9-c21f-44a0-ad27-f4158f36f317\") " pod="openshift-marketplace/certified-operators-lwx7v"
Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.179094 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm"
Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.179133 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c7271b9-c21f-44a0-ad27-f4158f36f317-utilities\") pod \"certified-operators-lwx7v\" (UID: \"2c7271b9-c21f-44a0-ad27-f4158f36f317\") " pod="openshift-marketplace/certified-operators-lwx7v"
Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.179158 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wl4j\" (UniqueName: \"kubernetes.io/projected/2c7271b9-c21f-44a0-ad27-f4158f36f317-kube-api-access-6wl4j\") pod \"certified-operators-lwx7v\" (UID: \"2c7271b9-c21f-44a0-ad27-f4158f36f317\") " pod="openshift-marketplace/certified-operators-lwx7v"
Dec 09 10:03:41 crc kubenswrapper[5002]: E1209 10:03:41.179577 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 10:03:41.67956239 +0000 UTC m=+154.071613471 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-82dlm" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.179705 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c7271b9-c21f-44a0-ad27-f4158f36f317-utilities\") pod \"certified-operators-lwx7v\" (UID: \"2c7271b9-c21f-44a0-ad27-f4158f36f317\") " pod="openshift-marketplace/certified-operators-lwx7v"
Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.179753 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c7271b9-c21f-44a0-ad27-f4158f36f317-catalog-content\") pod \"certified-operators-lwx7v\" (UID: \"2c7271b9-c21f-44a0-ad27-f4158f36f317\") " pod="openshift-marketplace/certified-operators-lwx7v"
Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.202994 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wl4j\" (UniqueName: \"kubernetes.io/projected/2c7271b9-c21f-44a0-ad27-f4158f36f317-kube-api-access-6wl4j\") pod \"certified-operators-lwx7v\" (UID: \"2c7271b9-c21f-44a0-ad27-f4158f36f317\") " pod="openshift-marketplace/certified-operators-lwx7v"
Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.252506 5002 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/certified-operators-lwx7v" Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.266448 5002 patch_prober.go:28] interesting pod/router-default-5444994796-5gtc8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 10:03:41 crc kubenswrapper[5002]: [-]has-synced failed: reason withheld Dec 09 10:03:41 crc kubenswrapper[5002]: [+]process-running ok Dec 09 10:03:41 crc kubenswrapper[5002]: healthz check failed Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.266496 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gtc8" podUID="99f91e54-9dc7-43ca-86ee-4258c22c6004" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.280065 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.280360 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a788f35f-cb43-4452-a6e0-f7fb2d040d6f-utilities\") pod \"community-operators-kk56c\" (UID: \"a788f35f-cb43-4452-a6e0-f7fb2d040d6f\") " pod="openshift-marketplace/community-operators-kk56c" Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.280394 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a788f35f-cb43-4452-a6e0-f7fb2d040d6f-catalog-content\") pod \"community-operators-kk56c\" (UID: \"a788f35f-cb43-4452-a6e0-f7fb2d040d6f\") " pod="openshift-marketplace/community-operators-kk56c" Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.280494 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr7m4\" (UniqueName: \"kubernetes.io/projected/a788f35f-cb43-4452-a6e0-f7fb2d040d6f-kube-api-access-pr7m4\") pod \"community-operators-kk56c\" (UID: \"a788f35f-cb43-4452-a6e0-f7fb2d040d6f\") " pod="openshift-marketplace/community-operators-kk56c" Dec 09 10:03:41 crc kubenswrapper[5002]: E1209 10:03:41.280642 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:03:41.780623718 +0000 UTC m=+154.172674799 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.365069 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zsnb5"] Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.366196 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zsnb5" Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.370955 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zsnb5"] Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.382476 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pr7m4\" (UniqueName: \"kubernetes.io/projected/a788f35f-cb43-4452-a6e0-f7fb2d040d6f-kube-api-access-pr7m4\") pod \"community-operators-kk56c\" (UID: \"a788f35f-cb43-4452-a6e0-f7fb2d040d6f\") " pod="openshift-marketplace/community-operators-kk56c" Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.382525 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.382545 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a788f35f-cb43-4452-a6e0-f7fb2d040d6f-utilities\") pod \"community-operators-kk56c\" (UID: \"a788f35f-cb43-4452-a6e0-f7fb2d040d6f\") " pod="openshift-marketplace/community-operators-kk56c" Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.382563 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a788f35f-cb43-4452-a6e0-f7fb2d040d6f-catalog-content\") pod \"community-operators-kk56c\" (UID: \"a788f35f-cb43-4452-a6e0-f7fb2d040d6f\") " pod="openshift-marketplace/community-operators-kk56c" Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.383002 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a788f35f-cb43-4452-a6e0-f7fb2d040d6f-catalog-content\") pod \"community-operators-kk56c\" (UID: \"a788f35f-cb43-4452-a6e0-f7fb2d040d6f\") " pod="openshift-marketplace/community-operators-kk56c" Dec 09 10:03:41 crc kubenswrapper[5002]: E1209 10:03:41.383432 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 10:03:41.883423087 +0000 UTC m=+154.275474168 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-82dlm" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.383751 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a788f35f-cb43-4452-a6e0-f7fb2d040d6f-utilities\") pod \"community-operators-kk56c\" (UID: \"a788f35f-cb43-4452-a6e0-f7fb2d040d6f\") " pod="openshift-marketplace/community-operators-kk56c" Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.423596 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pr7m4\" (UniqueName: \"kubernetes.io/projected/a788f35f-cb43-4452-a6e0-f7fb2d040d6f-kube-api-access-pr7m4\") pod \"community-operators-kk56c\" (UID: \"a788f35f-cb43-4452-a6e0-f7fb2d040d6f\") " pod="openshift-marketplace/community-operators-kk56c" Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.462151 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kk56c" Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.482134 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5z75v" event={"ID":"f85c45d5-d9fa-418c-91ea-0f054181f4c7","Type":"ContainerStarted","Data":"c1664fe228dbb37cacb9b7bb7ab09a61c7d997f06f742b23c61bdc8425512167"} Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.486391 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.486665 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3c9ac96-2913-4efd-b800-4f5c18dc08ab-utilities\") pod \"certified-operators-zsnb5\" (UID: \"a3c9ac96-2913-4efd-b800-4f5c18dc08ab\") " pod="openshift-marketplace/certified-operators-zsnb5" Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.486711 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjl4c\" (UniqueName: \"kubernetes.io/projected/a3c9ac96-2913-4efd-b800-4f5c18dc08ab-kube-api-access-cjl4c\") pod \"certified-operators-zsnb5\" (UID: \"a3c9ac96-2913-4efd-b800-4f5c18dc08ab\") " pod="openshift-marketplace/certified-operators-zsnb5" Dec 09 10:03:41 crc kubenswrapper[5002]: E1209 10:03:41.486748 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:03:41.98673034 +0000 UTC m=+154.378781431 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.486773 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3c9ac96-2913-4efd-b800-4f5c18dc08ab-catalog-content\") pod \"certified-operators-zsnb5\" (UID: \"a3c9ac96-2913-4efd-b800-4f5c18dc08ab\") " pod="openshift-marketplace/certified-operators-zsnb5" Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.486829 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:41 crc kubenswrapper[5002]: E1209 10:03:41.487105 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 10:03:41.98709276 +0000 UTC m=+154.379143841 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-82dlm" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.495326 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-x5gjs" event={"ID":"79241a31-6d92-482a-9cc3-c2370d516cbf","Type":"ContainerStarted","Data":"ce1f7d820fb4977ca03c623fef24cd144ae3fa4ce5b755f5673d8e4f19b78be2"} Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.496941 5002 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-zbsjj container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.496976 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-zbsjj" podUID="58eda1c5-bb10-4843-9ed4-64ee38d040ee" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.545406 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-x5gjs" podStartSLOduration=130.545381437 podStartE2EDuration="2m10.545381437s" podCreationTimestamp="2025-12-09 10:01:31 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:03:41.538963753 +0000 UTC m=+153.931014854" watchObservedRunningTime="2025-12-09 10:03:41.545381437 +0000 UTC m=+153.937432518" Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.557823 5002 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.587688 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.588151 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3c9ac96-2913-4efd-b800-4f5c18dc08ab-utilities\") pod \"certified-operators-zsnb5\" (UID: \"a3c9ac96-2913-4efd-b800-4f5c18dc08ab\") " pod="openshift-marketplace/certified-operators-zsnb5" Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.588277 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjl4c\" (UniqueName: \"kubernetes.io/projected/a3c9ac96-2913-4efd-b800-4f5c18dc08ab-kube-api-access-cjl4c\") pod \"certified-operators-zsnb5\" (UID: \"a3c9ac96-2913-4efd-b800-4f5c18dc08ab\") " pod="openshift-marketplace/certified-operators-zsnb5" Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.588435 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3c9ac96-2913-4efd-b800-4f5c18dc08ab-catalog-content\") pod \"certified-operators-zsnb5\" (UID: \"a3c9ac96-2913-4efd-b800-4f5c18dc08ab\") " pod="openshift-marketplace/certified-operators-zsnb5" Dec 09 10:03:41 crc kubenswrapper[5002]: E1209 10:03:41.590532 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:03:42.090506357 +0000 UTC m=+154.482557448 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.590935 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3c9ac96-2913-4efd-b800-4f5c18dc08ab-utilities\") pod \"certified-operators-zsnb5\" (UID: \"a3c9ac96-2913-4efd-b800-4f5c18dc08ab\") " pod="openshift-marketplace/certified-operators-zsnb5" Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.608210 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3c9ac96-2913-4efd-b800-4f5c18dc08ab-catalog-content\") pod \"certified-operators-zsnb5\" (UID: \"a3c9ac96-2913-4efd-b800-4f5c18dc08ab\") " pod="openshift-marketplace/certified-operators-zsnb5" Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.627162 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjl4c\" (UniqueName: \"kubernetes.io/projected/a3c9ac96-2913-4efd-b800-4f5c18dc08ab-kube-api-access-cjl4c\") pod \"certified-operators-zsnb5\" (UID: \"a3c9ac96-2913-4efd-b800-4f5c18dc08ab\") " pod="openshift-marketplace/certified-operators-zsnb5" Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.659341 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2t8ql"] Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.694629 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:41 crc kubenswrapper[5002]: E1209 10:03:41.695056 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 10:03:42.195042255 +0000 UTC m=+154.587093336 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-82dlm" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
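
Note: the nestedpendingoperations.go:348 errors above are the kubelet's volume manager enforcing a per-volume retry gate: after each failure, no retry of that operation is permitted until the previous failure time plus the reported backoff (a 500 ms durationBeforeRetry in every entry here) has elapsed, and each attempt keeps failing with the same cause until the kubevirt.io.hostpath-provisioner driver registers. A minimal illustrative Go sketch of such a gate follows; retryGate and attempt are hypothetical names, not kubelet source.

    package main

    import (
        "errors"
        "fmt"
        "time"
    )

    // retryGate mimics the behavior in the log: after a failure, further
    // attempts are rejected outright until lastFailure + backoff has passed.
    // Illustrative sketch only, not the kubelet's nestedpendingoperations code.
    type retryGate struct {
        lastFailure time.Time
        backoff     time.Duration // the entries above report 500ms
    }

    func (g *retryGate) attempt(op func() error) error {
        if until := g.lastFailure.Add(g.backoff); time.Now().Before(until) {
            return fmt.Errorf("no retries permitted until %s", until.Format(time.RFC3339Nano))
        }
        if err := op(); err != nil {
            g.lastFailure = time.Now()
            return err
        }
        return nil
    }

    func main() {
        g := &retryGate{backoff: 500 * time.Millisecond}
        mountDevice := func() error {
            return errors.New("driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers")
        }
        for i := 0; i < 4; i++ {
            fmt.Println(g.attempt(mountDevice))
            time.Sleep(250 * time.Millisecond)
        }
    }

The gate explains the log's cadence: the reconciler re-queues the same mount and unmount operations every reconcile pass, but each pass inside the 500 ms window is rejected immediately rather than re-dialing the missing driver.
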
Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.713862 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zsnb5" Dec 09 10:03:41 crc kubenswrapper[5002]: W1209 10:03:41.714422 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39e38e20_dd9f_43b8_848f_e5618d52f6af.slice/crio-4e273412823b2a2334e2f0b11bf3f1ad0219e3cead84719ca1fa8ed58a9c55ea WatchSource:0}: Error finding container 4e273412823b2a2334e2f0b11bf3f1ad0219e3cead84719ca1fa8ed58a9c55ea: Status 404 returned error can't find the container with id 4e273412823b2a2334e2f0b11bf3f1ad0219e3cead84719ca1fa8ed58a9c55ea Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.795642 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 10:03:41 crc kubenswrapper[5002]: E1209 10:03:41.796009 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:03:42.295986151 +0000 UTC m=+154.688037232 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.796115 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:41 crc kubenswrapper[5002]: E1209 10:03:41.796483 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 10:03:42.296473295 +0000 UTC m=+154.688524376 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-82dlm" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.883637 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lwx7v"] Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.897795 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 10:03:41 crc kubenswrapper[5002]: E1209 10:03:41.898082 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:03:42.398066139 +0000 UTC m=+154.790117220 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:41 crc kubenswrapper[5002]: I1209 10:03:41.898328 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:41 crc kubenswrapper[5002]: E1209 10:03:41.898762 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 10:03:42.398749888 +0000 UTC m=+154.790800969 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-82dlm" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:41 crc kubenswrapper[5002]: W1209 10:03:41.927189 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c7271b9_c21f_44a0_ad27_f4158f36f317.slice/crio-e7d0c686101ed9ef230922365b540600dcb072af37dc3a21e4470f962c84f315 WatchSource:0}: Error finding container e7d0c686101ed9ef230922365b540600dcb072af37dc3a21e4470f962c84f315: Status 404 returned error can't find the container with id e7d0c686101ed9ef230922365b540600dcb072af37dc3a21e4470f962c84f315 Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:41.999742 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 10:03:42 crc kubenswrapper[5002]: E1209 10:03:42.000230 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:03:42.500212509 +0000 UTC m=+154.892263590 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.100183 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kk56c"] Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.101036 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:42 crc kubenswrapper[5002]: E1209 10:03:42.101336 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 10:03:42.601320069 +0000 UTC m=+154.993371150 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-82dlm" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.114965 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zsnb5"] Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.201900 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 10:03:42 crc kubenswrapper[5002]: E1209 10:03:42.203550 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 10:03:42.703530261 +0000 UTC m=+155.095581342 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.240151 5002 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-09T10:03:41.557851853Z","Handler":null,"Name":""} Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.246040 5002 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.246102 5002 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.247461 5002 patch_prober.go:28] interesting pod/router-default-5444994796-5gtc8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 10:03:42 crc kubenswrapper[5002]: [-]has-synced failed: reason withheld Dec 09 10:03:42 crc kubenswrapper[5002]: [+]process-running ok Dec 09 10:03:42 crc kubenswrapper[5002]: healthz check failed Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.247507 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gtc8" podUID="99f91e54-9dc7-43ca-86ee-4258c22c6004" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
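
Note: the entries above capture the driver-registration handshake that ends the failure loop: plugin_watcher.go notices /var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock (10:03:41.557823), reconciler.go starts RegisterPlugin, and csi_plugin.go validates and registers the driver at /var/lib/kubelet/plugins/csi-hostpath/csi.sock (10:03:42.246102). A rough Go sketch of the watch step, using the fsnotify library; the directory path matches the log, everything else is illustrative rather than the kubelet's actual plugin watcher.

    package main

    import (
        "log"
        "strings"

        "github.com/fsnotify/fsnotify"
    )

    func main() {
        // Watch the same directory the kubelet's plugin watcher uses. When a
        // driver drops a *-reg.sock file here, the kubelet dials it, validates
        // the driver, and adds it to the registered-drivers list.
        w, err := fsnotify.NewWatcher()
        if err != nil {
            log.Fatal(err)
        }
        defer w.Close()
        if err := w.Add("/var/lib/kubelet/plugins_registry"); err != nil {
            log.Fatal(err)
        }
        for ev := range w.Events {
            if ev.Op&fsnotify.Create != 0 && strings.HasSuffix(ev.Name, "-reg.sock") {
                log.Printf("Adding socket path to desired state cache: %s", ev.Name)
                // A real registrar would now connect to the socket and run the
                // plugin-registration gRPC handshake before marking it usable.
            }
        }
    }

Once this registration completes, the very next reconcile pass below succeeds: the "driver name ... not found" errors stop, MountDevice is skipped (the driver does not advertise STAGE_UNSTAGE_VOLUME), and both the pending SetUp and TearDown go through.
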
Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.304745 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.310764 5002 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.310795 5002 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.335087 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-82dlm\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") " pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.408943 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.415670 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.501431 5002 generic.go:334] "Generic (PLEG): container finished" podID="a3c9ac96-2913-4efd-b800-4f5c18dc08ab" containerID="5ab93d92bda564c722949a7c11beadd44373acc2a3de4e23147199b19226977e" exitCode=0 Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.501489 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zsnb5" event={"ID":"a3c9ac96-2913-4efd-b800-4f5c18dc08ab","Type":"ContainerDied","Data":"5ab93d92bda564c722949a7c11beadd44373acc2a3de4e23147199b19226977e"} Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.501514 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zsnb5" event={"ID":"a3c9ac96-2913-4efd-b800-4f5c18dc08ab","Type":"ContainerStarted","Data":"6d96b6833a67266ebad3d256fe7bae460344f49c5ac7bad4a5a3120bf2413b1c"} Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.502776 5002 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.506187 5002 generic.go:334] "Generic (PLEG): container finished" podID="2c7271b9-c21f-44a0-ad27-f4158f36f317" containerID="8716a18b6cf69a4d331c466e77fc5276e9f028a821ebe3f9bfde4b0bb599d9e0" exitCode=0 Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.506239 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lwx7v" event={"ID":"2c7271b9-c21f-44a0-ad27-f4158f36f317","Type":"ContainerDied","Data":"8716a18b6cf69a4d331c466e77fc5276e9f028a821ebe3f9bfde4b0bb599d9e0"} Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.506256 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lwx7v" event={"ID":"2c7271b9-c21f-44a0-ad27-f4158f36f317","Type":"ContainerStarted","Data":"e7d0c686101ed9ef230922365b540600dcb072af37dc3a21e4470f962c84f315"} Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.508517 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5z75v" event={"ID":"f85c45d5-d9fa-418c-91ea-0f054181f4c7","Type":"ContainerStarted","Data":"09589ffa06fe8467756616ed67621bc0c52413e0b1e63a693145db85ba797aa1"} Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.508543 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5z75v" event={"ID":"f85c45d5-d9fa-418c-91ea-0f054181f4c7","Type":"ContainerStarted","Data":"8b7bddca3cbb90a2dfef37a0d4bfaebef5fc38e67fbb7642e0309250d0b720b4"} Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.510418 5002 generic.go:334] "Generic (PLEG): container finished" podID="39e38e20-dd9f-43b8-848f-e5618d52f6af" containerID="f1401b3b4b61efe0f5bd2c77722361f032246bc600229a0e0b7237691d356867" exitCode=0 Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.510475 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2t8ql" event={"ID":"39e38e20-dd9f-43b8-848f-e5618d52f6af","Type":"ContainerDied","Data":"f1401b3b4b61efe0f5bd2c77722361f032246bc600229a0e0b7237691d356867"} Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.510492 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2t8ql" 
event={"ID":"39e38e20-dd9f-43b8-848f-e5618d52f6af","Type":"ContainerStarted","Data":"4e273412823b2a2334e2f0b11bf3f1ad0219e3cead84719ca1fa8ed58a9c55ea"} Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.514774 5002 generic.go:334] "Generic (PLEG): container finished" podID="a788f35f-cb43-4452-a6e0-f7fb2d040d6f" containerID="b2b398af59565a2a35760cf022c3e26366a729d6a1a2b37fbb89cf5f677d38f2" exitCode=0 Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.514979 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kk56c" event={"ID":"a788f35f-cb43-4452-a6e0-f7fb2d040d6f","Type":"ContainerDied","Data":"b2b398af59565a2a35760cf022c3e26366a729d6a1a2b37fbb89cf5f677d38f2"} Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.515017 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kk56c" event={"ID":"a788f35f-cb43-4452-a6e0-f7fb2d040d6f","Type":"ContainerStarted","Data":"ac1f0c776ff75b2b74398e76a07d08fbc7aca29be4a79d21624d6b080f4f2ce3"} Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.593059 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-5z75v" podStartSLOduration=10.593038596 podStartE2EDuration="10.593038596s" podCreationTimestamp="2025-12-09 10:03:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:03:42.571885321 +0000 UTC m=+154.963936402" watchObservedRunningTime="2025-12-09 10:03:42.593038596 +0000 UTC m=+154.985089677" Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.616119 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.742365 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ptdnb"] Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.746093 5002 util.go:30] "No sandbox for pod can be found. 
Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.746093 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ptdnb" Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.748448 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.762979 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ptdnb"] Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.814511 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-82dlm"] Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.814831 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/038508b9-13ca-43a3-8414-09097229bc83-utilities\") pod \"redhat-marketplace-ptdnb\" (UID: \"038508b9-13ca-43a3-8414-09097229bc83\") " pod="openshift-marketplace/redhat-marketplace-ptdnb" Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.814874 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5z2pz\" (UniqueName: \"kubernetes.io/projected/038508b9-13ca-43a3-8414-09097229bc83-kube-api-access-5z2pz\") pod \"redhat-marketplace-ptdnb\" (UID: \"038508b9-13ca-43a3-8414-09097229bc83\") " pod="openshift-marketplace/redhat-marketplace-ptdnb" Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.814926 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/038508b9-13ca-43a3-8414-09097229bc83-catalog-content\") pod \"redhat-marketplace-ptdnb\" (UID: \"038508b9-13ca-43a3-8414-09097229bc83\") " pod="openshift-marketplace/redhat-marketplace-ptdnb" Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.915666 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/038508b9-13ca-43a3-8414-09097229bc83-utilities\") pod \"redhat-marketplace-ptdnb\" (UID: \"038508b9-13ca-43a3-8414-09097229bc83\") " pod="openshift-marketplace/redhat-marketplace-ptdnb" Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.915987 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5z2pz\" (UniqueName: \"kubernetes.io/projected/038508b9-13ca-43a3-8414-09097229bc83-kube-api-access-5z2pz\") pod \"redhat-marketplace-ptdnb\" (UID: \"038508b9-13ca-43a3-8414-09097229bc83\") " pod="openshift-marketplace/redhat-marketplace-ptdnb" Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.916026 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/038508b9-13ca-43a3-8414-09097229bc83-catalog-content\") pod \"redhat-marketplace-ptdnb\" (UID: \"038508b9-13ca-43a3-8414-09097229bc83\") " pod="openshift-marketplace/redhat-marketplace-ptdnb" Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.916152 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/038508b9-13ca-43a3-8414-09097229bc83-utilities\") pod \"redhat-marketplace-ptdnb\" (UID: \"038508b9-13ca-43a3-8414-09097229bc83\") " pod="openshift-marketplace/redhat-marketplace-ptdnb" Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.916457 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/038508b9-13ca-43a3-8414-09097229bc83-catalog-content\") pod \"redhat-marketplace-ptdnb\" (UID: \"038508b9-13ca-43a3-8414-09097229bc83\") " pod="openshift-marketplace/redhat-marketplace-ptdnb" Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.935459 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5z2pz\" (UniqueName: \"kubernetes.io/projected/038508b9-13ca-43a3-8414-09097229bc83-kube-api-access-5z2pz\") pod \"redhat-marketplace-ptdnb\" (UID: \"038508b9-13ca-43a3-8414-09097229bc83\") " pod="openshift-marketplace/redhat-marketplace-ptdnb" Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.942780 5002 patch_prober.go:28] interesting pod/downloads-7954f5f757-xp9jk container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.942891 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-xp9jk" podUID="58ef0377-76a5-4299-999f-7bdd55606533" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.942878 5002 patch_prober.go:28] interesting pod/downloads-7954f5f757-xp9jk container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Dec 09 10:03:42 crc kubenswrapper[5002]: I1209 10:03:42.942960 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xp9jk" podUID="58ef0377-76a5-4299-999f-7bdd55606533" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Dec 09 10:03:43 crc kubenswrapper[5002]: I1209 10:03:43.039786 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-99l4h" Dec 09 10:03:43 crc kubenswrapper[5002]: I1209 10:03:43.039837 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-99l4h" Dec 09 10:03:43 crc kubenswrapper[5002]: I1209 10:03:43.045118 5002 patch_prober.go:28] interesting pod/apiserver-76f77b778f-99l4h container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 09 10:03:43 crc kubenswrapper[5002]: [+]log ok Dec 09 10:03:43 crc kubenswrapper[5002]: [+]etcd ok Dec 09 10:03:43 crc kubenswrapper[5002]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 09 10:03:43 crc kubenswrapper[5002]: [+]poststarthook/generic-apiserver-start-informers ok Dec 09 10:03:43 crc kubenswrapper[5002]: [+]poststarthook/max-in-flight-filter ok Dec 09 10:03:43 crc kubenswrapper[5002]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 09 10:03:43 crc kubenswrapper[5002]: [+]poststarthook/image.openshift.io-apiserver-caches ok Dec 09 10:03:43 crc kubenswrapper[5002]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Dec 09 10:03:43 crc kubenswrapper[5002]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Dec 09 10:03:43 
crc kubenswrapper[5002]: [+]poststarthook/project.openshift.io-projectcache ok Dec 09 10:03:43 crc kubenswrapper[5002]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Dec 09 10:03:43 crc kubenswrapper[5002]: [+]poststarthook/openshift.io-startinformers ok Dec 09 10:03:43 crc kubenswrapper[5002]: [+]poststarthook/openshift.io-restmapperupdater ok Dec 09 10:03:43 crc kubenswrapper[5002]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 09 10:03:43 crc kubenswrapper[5002]: livez check failed Dec 09 10:03:43 crc kubenswrapper[5002]: I1209 10:03:43.045182 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-99l4h" podUID="8a5ea199-20b4-419b-97bc-0845ac5bb720" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 10:03:43 crc kubenswrapper[5002]: I1209 10:03:43.063728 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ptdnb" Dec 09 10:03:43 crc kubenswrapper[5002]: I1209 10:03:43.150378 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2dc2h"] Dec 09 10:03:43 crc kubenswrapper[5002]: I1209 10:03:43.151640 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2dc2h" Dec 09 10:03:43 crc kubenswrapper[5002]: I1209 10:03:43.171333 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2dc2h"] Dec 09 10:03:43 crc kubenswrapper[5002]: I1209 10:03:43.220460 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c47a6b7-a5f5-4aea-aec6-a90e1c75859c-utilities\") pod \"redhat-marketplace-2dc2h\" (UID: \"9c47a6b7-a5f5-4aea-aec6-a90e1c75859c\") " pod="openshift-marketplace/redhat-marketplace-2dc2h" Dec 09 10:03:43 crc kubenswrapper[5002]: I1209 10:03:43.221050 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c47a6b7-a5f5-4aea-aec6-a90e1c75859c-catalog-content\") pod \"redhat-marketplace-2dc2h\" (UID: \"9c47a6b7-a5f5-4aea-aec6-a90e1c75859c\") " pod="openshift-marketplace/redhat-marketplace-2dc2h" Dec 09 10:03:43 crc kubenswrapper[5002]: I1209 10:03:43.221082 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8zrj\" (UniqueName: \"kubernetes.io/projected/9c47a6b7-a5f5-4aea-aec6-a90e1c75859c-kube-api-access-b8zrj\") pod \"redhat-marketplace-2dc2h\" (UID: \"9c47a6b7-a5f5-4aea-aec6-a90e1c75859c\") " pod="openshift-marketplace/redhat-marketplace-2dc2h" Dec 09 10:03:43 crc kubenswrapper[5002]: I1209 10:03:43.243996 5002 patch_prober.go:28] interesting pod/router-default-5444994796-5gtc8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 10:03:43 crc kubenswrapper[5002]: [-]has-synced failed: reason withheld Dec 09 10:03:43 crc kubenswrapper[5002]: [+]process-running ok Dec 09 10:03:43 crc kubenswrapper[5002]: healthz check failed
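
Note: the startup probe failures above all share one output convention: an aggregated health endpoint prints one [+]name ok or [-]name failed line per sub-check and answers HTTP 500 when any sub-check fails, which the kubelet then records as "HTTP probe failed with statuscode: 500". A minimal Go sketch of an endpoint in that style; the check names are copied from the router output above, but the handler, failure reasons, and port are illustrative.

    package main

    import (
        "fmt"
        "log"
        "net/http"
    )

    type check struct {
        name string
        fn   func() error
    }

    func main() {
        // Sub-checks named after the router's healthz output above; the
        // failure conditions here are stand-ins for the real ones.
        checks := []check{
            {"backend-http", func() error { return fmt.Errorf("not ready") }},
            {"has-synced", func() error { return fmt.Errorf("not ready") }},
            {"process-running", func() error { return nil }},
        }
        http.HandleFunc("/healthz", func(w http.ResponseWriter, r *http.Request) {
            body, failed := "", false
            for _, c := range checks {
                if err := c.fn(); err != nil {
                    failed = true
                    body += fmt.Sprintf("[-]%s failed: reason withheld\n", c.name)
                } else {
                    body += fmt.Sprintf("[+]%s ok\n", c.name)
                }
            }
            if failed {
                // The 500 status is what the kubelet's prober reports above.
                w.WriteHeader(http.StatusInternalServerError)
                fmt.Fprint(w, body+"healthz check failed")
                return
            }
            fmt.Fprint(w, body+"ok")
        })
        log.Fatal(http.ListenAndServe(":8080", nil)) // port arbitrary for the sketch
    }
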
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 10:03:43 crc kubenswrapper[5002]: I1209 10:03:43.319578 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ptdnb"] Dec 09 10:03:43 crc kubenswrapper[5002]: I1209 10:03:43.322577 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c47a6b7-a5f5-4aea-aec6-a90e1c75859c-utilities\") pod \"redhat-marketplace-2dc2h\" (UID: \"9c47a6b7-a5f5-4aea-aec6-a90e1c75859c\") " pod="openshift-marketplace/redhat-marketplace-2dc2h" Dec 09 10:03:43 crc kubenswrapper[5002]: I1209 10:03:43.322648 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c47a6b7-a5f5-4aea-aec6-a90e1c75859c-catalog-content\") pod \"redhat-marketplace-2dc2h\" (UID: \"9c47a6b7-a5f5-4aea-aec6-a90e1c75859c\") " pod="openshift-marketplace/redhat-marketplace-2dc2h" Dec 09 10:03:43 crc kubenswrapper[5002]: I1209 10:03:43.322677 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8zrj\" (UniqueName: \"kubernetes.io/projected/9c47a6b7-a5f5-4aea-aec6-a90e1c75859c-kube-api-access-b8zrj\") pod \"redhat-marketplace-2dc2h\" (UID: \"9c47a6b7-a5f5-4aea-aec6-a90e1c75859c\") " pod="openshift-marketplace/redhat-marketplace-2dc2h" Dec 09 10:03:43 crc kubenswrapper[5002]: I1209 10:03:43.323614 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c47a6b7-a5f5-4aea-aec6-a90e1c75859c-utilities\") pod \"redhat-marketplace-2dc2h\" (UID: \"9c47a6b7-a5f5-4aea-aec6-a90e1c75859c\") " pod="openshift-marketplace/redhat-marketplace-2dc2h" Dec 09 10:03:43 crc kubenswrapper[5002]: I1209 10:03:43.325885 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c47a6b7-a5f5-4aea-aec6-a90e1c75859c-catalog-content\") pod \"redhat-marketplace-2dc2h\" (UID: \"9c47a6b7-a5f5-4aea-aec6-a90e1c75859c\") " pod="openshift-marketplace/redhat-marketplace-2dc2h" Dec 09 10:03:43 crc kubenswrapper[5002]: W1209 10:03:43.333102 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod038508b9_13ca_43a3_8414_09097229bc83.slice/crio-110eefe8216fc554134c9a753cf2bd8e2d88ef78e50a66c4abcdfb5380faf93a WatchSource:0}: Error finding container 110eefe8216fc554134c9a753cf2bd8e2d88ef78e50a66c4abcdfb5380faf93a: Status 404 returned error can't find the container with id 110eefe8216fc554134c9a753cf2bd8e2d88ef78e50a66c4abcdfb5380faf93a Dec 09 10:03:43 crc kubenswrapper[5002]: I1209 10:03:43.339541 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8zrj\" (UniqueName: \"kubernetes.io/projected/9c47a6b7-a5f5-4aea-aec6-a90e1c75859c-kube-api-access-b8zrj\") pod \"redhat-marketplace-2dc2h\" (UID: \"9c47a6b7-a5f5-4aea-aec6-a90e1c75859c\") " pod="openshift-marketplace/redhat-marketplace-2dc2h" Dec 09 10:03:43 crc kubenswrapper[5002]: I1209 10:03:43.378283 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nmgxj" Dec 09 10:03:43 crc kubenswrapper[5002]: I1209 10:03:43.385441 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nmgxj" Dec 09 
Dec 09 10:03:43 crc kubenswrapper[5002]: I1209 10:03:43.482598 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2dc2h" Dec 09 10:03:43 crc kubenswrapper[5002]: I1209 10:03:43.536345 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-n967v" Dec 09 10:03:43 crc kubenswrapper[5002]: I1209 10:03:43.536400 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-n967v" Dec 09 10:03:43 crc kubenswrapper[5002]: I1209 10:03:43.548504 5002 patch_prober.go:28] interesting pod/console-f9d7485db-n967v container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.25:8443/health\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Dec 09 10:03:43 crc kubenswrapper[5002]: I1209 10:03:43.548571 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-n967v" podUID="95a7b196-74b4-4d67-a0e0-3e4b92e468ed" containerName="console" probeResult="failure" output="Get \"https://10.217.0.25:8443/health\": dial tcp 10.217.0.25:8443: connect: connection refused" Dec 09 10:03:43 crc kubenswrapper[5002]: I1209 10:03:43.555624 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" event={"ID":"cac4bbbb-1f9f-4107-99ee-cb71771ec460","Type":"ContainerStarted","Data":"deeb54b469ee1431d93f7d5a33c9c094bd182a9b86e0812938c9ee0141909463"} Dec 09 10:03:43 crc kubenswrapper[5002]: I1209 10:03:43.555678 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" event={"ID":"cac4bbbb-1f9f-4107-99ee-cb71771ec460","Type":"ContainerStarted","Data":"7a39326b13b9b0c10f2706750a200be8136c35c004bfa8e788f9075df02b0dab"} Dec 09 10:03:43 crc kubenswrapper[5002]: I1209 10:03:43.555741 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:03:43 crc kubenswrapper[5002]: I1209 10:03:43.565397 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ptdnb" event={"ID":"038508b9-13ca-43a3-8414-09097229bc83","Type":"ContainerStarted","Data":"110eefe8216fc554134c9a753cf2bd8e2d88ef78e50a66c4abcdfb5380faf93a"} Dec 09 10:03:43 crc kubenswrapper[5002]: I1209 10:03:43.584521 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" podStartSLOduration=133.584497529 podStartE2EDuration="2m13.584497529s" podCreationTimestamp="2025-12-09 10:01:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:03:43.581223655 +0000 UTC m=+155.973274756" watchObservedRunningTime="2025-12-09 10:03:43.584497529 +0000 UTC m=+155.976548610" Dec 09 10:03:43 crc kubenswrapper[5002]: I1209 10:03:43.801410 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 09 10:03:43 crc kubenswrapper[5002]: I1209 10:03:43.802050 5002 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 09 10:03:43 crc kubenswrapper[5002]: I1209 10:03:43.809249 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 09 10:03:43 crc kubenswrapper[5002]: I1209 10:03:43.809461 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 09 10:03:43 crc kubenswrapper[5002]: I1209 10:03:43.809683 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 09 10:03:43 crc kubenswrapper[5002]: I1209 10:03:43.943326 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dlr44"] Dec 09 10:03:43 crc kubenswrapper[5002]: I1209 10:03:43.944513 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dlr44" Dec 09 10:03:43 crc kubenswrapper[5002]: I1209 10:03:43.946802 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 09 10:03:43 crc kubenswrapper[5002]: I1209 10:03:43.947978 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/38935473-5396-4d7b-a458-c35e34a2ce39-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"38935473-5396-4d7b-a458-c35e34a2ce39\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 09 10:03:43 crc kubenswrapper[5002]: I1209 10:03:43.948045 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/38935473-5396-4d7b-a458-c35e34a2ce39-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"38935473-5396-4d7b-a458-c35e34a2ce39\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 09 10:03:43 crc kubenswrapper[5002]: I1209 10:03:43.972106 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dlr44"] Dec 09 10:03:44 crc kubenswrapper[5002]: I1209 10:03:44.046710 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2dc2h"] Dec 09 10:03:44 crc kubenswrapper[5002]: I1209 10:03:44.049791 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/38935473-5396-4d7b-a458-c35e34a2ce39-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"38935473-5396-4d7b-a458-c35e34a2ce39\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 09 10:03:44 crc kubenswrapper[5002]: I1209 10:03:44.049868 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eee6148b-020b-46a1-9066-4cc2fb430d13-catalog-content\") pod \"redhat-operators-dlr44\" (UID: \"eee6148b-020b-46a1-9066-4cc2fb430d13\") " pod="openshift-marketplace/redhat-operators-dlr44" Dec 09 10:03:44 crc kubenswrapper[5002]: I1209 10:03:44.049937 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/38935473-5396-4d7b-a458-c35e34a2ce39-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"38935473-5396-4d7b-a458-c35e34a2ce39\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 09 10:03:44 crc kubenswrapper[5002]: I1209 10:03:44.049994 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eee6148b-020b-46a1-9066-4cc2fb430d13-utilities\") pod \"redhat-operators-dlr44\" (UID: \"eee6148b-020b-46a1-9066-4cc2fb430d13\") " pod="openshift-marketplace/redhat-operators-dlr44" Dec 09 10:03:44 crc kubenswrapper[5002]: I1209 10:03:44.050021 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfffc\" (UniqueName: \"kubernetes.io/projected/eee6148b-020b-46a1-9066-4cc2fb430d13-kube-api-access-rfffc\") pod \"redhat-operators-dlr44\" (UID: \"eee6148b-020b-46a1-9066-4cc2fb430d13\") " pod="openshift-marketplace/redhat-operators-dlr44" Dec 09 10:03:44 crc kubenswrapper[5002]: I1209 10:03:44.050110 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/38935473-5396-4d7b-a458-c35e34a2ce39-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"38935473-5396-4d7b-a458-c35e34a2ce39\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 09 10:03:44 crc kubenswrapper[5002]: I1209 10:03:44.069438 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/38935473-5396-4d7b-a458-c35e34a2ce39-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"38935473-5396-4d7b-a458-c35e34a2ce39\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 09 10:03:44 crc kubenswrapper[5002]: I1209 10:03:44.082075 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 09 10:03:44 crc kubenswrapper[5002]: I1209 10:03:44.151658 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eee6148b-020b-46a1-9066-4cc2fb430d13-utilities\") pod \"redhat-operators-dlr44\" (UID: \"eee6148b-020b-46a1-9066-4cc2fb430d13\") " pod="openshift-marketplace/redhat-operators-dlr44" Dec 09 10:03:44 crc kubenswrapper[5002]: I1209 10:03:44.151709 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfffc\" (UniqueName: \"kubernetes.io/projected/eee6148b-020b-46a1-9066-4cc2fb430d13-kube-api-access-rfffc\") pod \"redhat-operators-dlr44\" (UID: \"eee6148b-020b-46a1-9066-4cc2fb430d13\") " pod="openshift-marketplace/redhat-operators-dlr44" Dec 09 10:03:44 crc kubenswrapper[5002]: I1209 10:03:44.151759 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eee6148b-020b-46a1-9066-4cc2fb430d13-catalog-content\") pod \"redhat-operators-dlr44\" (UID: \"eee6148b-020b-46a1-9066-4cc2fb430d13\") " pod="openshift-marketplace/redhat-operators-dlr44" Dec 09 10:03:44 crc kubenswrapper[5002]: I1209 10:03:44.152202 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eee6148b-020b-46a1-9066-4cc2fb430d13-catalog-content\") pod \"redhat-operators-dlr44\" (UID: \"eee6148b-020b-46a1-9066-4cc2fb430d13\") " pod="openshift-marketplace/redhat-operators-dlr44" Dec 09 10:03:44 crc kubenswrapper[5002]: I1209 10:03:44.152504 5002 
Dec 09 10:03:44 crc kubenswrapper[5002]: I1209 10:03:44.152504 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eee6148b-020b-46a1-9066-4cc2fb430d13-utilities\") pod \"redhat-operators-dlr44\" (UID: \"eee6148b-020b-46a1-9066-4cc2fb430d13\") " pod="openshift-marketplace/redhat-operators-dlr44" Dec 09 10:03:44 crc kubenswrapper[5002]: I1209 10:03:44.160495 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 09 10:03:44 crc kubenswrapper[5002]: I1209 10:03:44.189366 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfffc\" (UniqueName: \"kubernetes.io/projected/eee6148b-020b-46a1-9066-4cc2fb430d13-kube-api-access-rfffc\") pod \"redhat-operators-dlr44\" (UID: \"eee6148b-020b-46a1-9066-4cc2fb430d13\") " pod="openshift-marketplace/redhat-operators-dlr44" Dec 09 10:03:44 crc kubenswrapper[5002]: I1209 10:03:44.244269 5002 patch_prober.go:28] interesting pod/router-default-5444994796-5gtc8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 10:03:44 crc kubenswrapper[5002]: [-]has-synced failed: reason withheld Dec 09 10:03:44 crc kubenswrapper[5002]: [+]process-running ok Dec 09 10:03:44 crc kubenswrapper[5002]: healthz check failed Dec 09 10:03:44 crc kubenswrapper[5002]: I1209 10:03:44.244327 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gtc8" podUID="99f91e54-9dc7-43ca-86ee-4258c22c6004" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 10:03:44 crc kubenswrapper[5002]: I1209 10:03:44.287760 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dlr44" Dec 09 10:03:44 crc kubenswrapper[5002]: I1209 10:03:44.342484 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lmx5d"] Dec 09 10:03:44 crc kubenswrapper[5002]: I1209 10:03:44.343548 5002 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-operators-lmx5d" Dec 09 10:03:44 crc kubenswrapper[5002]: I1209 10:03:44.358546 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lmx5d"] Dec 09 10:03:44 crc kubenswrapper[5002]: I1209 10:03:44.455624 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs2wc\" (UniqueName: \"kubernetes.io/projected/bfc9ae06-2b91-4708-8ea6-e8e8789b2475-kube-api-access-xs2wc\") pod \"redhat-operators-lmx5d\" (UID: \"bfc9ae06-2b91-4708-8ea6-e8e8789b2475\") " pod="openshift-marketplace/redhat-operators-lmx5d" Dec 09 10:03:44 crc kubenswrapper[5002]: I1209 10:03:44.456548 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfc9ae06-2b91-4708-8ea6-e8e8789b2475-catalog-content\") pod \"redhat-operators-lmx5d\" (UID: \"bfc9ae06-2b91-4708-8ea6-e8e8789b2475\") " pod="openshift-marketplace/redhat-operators-lmx5d" Dec 09 10:03:44 crc kubenswrapper[5002]: I1209 10:03:44.456725 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfc9ae06-2b91-4708-8ea6-e8e8789b2475-utilities\") pod \"redhat-operators-lmx5d\" (UID: \"bfc9ae06-2b91-4708-8ea6-e8e8789b2475\") " pod="openshift-marketplace/redhat-operators-lmx5d" Dec 09 10:03:44 crc kubenswrapper[5002]: I1209 10:03:44.566586 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs2wc\" (UniqueName: \"kubernetes.io/projected/bfc9ae06-2b91-4708-8ea6-e8e8789b2475-kube-api-access-xs2wc\") pod \"redhat-operators-lmx5d\" (UID: \"bfc9ae06-2b91-4708-8ea6-e8e8789b2475\") " pod="openshift-marketplace/redhat-operators-lmx5d" Dec 09 10:03:44 crc kubenswrapper[5002]: I1209 10:03:44.566627 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfc9ae06-2b91-4708-8ea6-e8e8789b2475-catalog-content\") pod \"redhat-operators-lmx5d\" (UID: \"bfc9ae06-2b91-4708-8ea6-e8e8789b2475\") " pod="openshift-marketplace/redhat-operators-lmx5d" Dec 09 10:03:44 crc kubenswrapper[5002]: I1209 10:03:44.566673 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfc9ae06-2b91-4708-8ea6-e8e8789b2475-utilities\") pod \"redhat-operators-lmx5d\" (UID: \"bfc9ae06-2b91-4708-8ea6-e8e8789b2475\") " pod="openshift-marketplace/redhat-operators-lmx5d" Dec 09 10:03:44 crc kubenswrapper[5002]: I1209 10:03:44.569244 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfc9ae06-2b91-4708-8ea6-e8e8789b2475-utilities\") pod \"redhat-operators-lmx5d\" (UID: \"bfc9ae06-2b91-4708-8ea6-e8e8789b2475\") " pod="openshift-marketplace/redhat-operators-lmx5d" Dec 09 10:03:44 crc kubenswrapper[5002]: I1209 10:03:44.569290 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfc9ae06-2b91-4708-8ea6-e8e8789b2475-catalog-content\") pod \"redhat-operators-lmx5d\" (UID: \"bfc9ae06-2b91-4708-8ea6-e8e8789b2475\") " pod="openshift-marketplace/redhat-operators-lmx5d" Dec 09 10:03:44 crc kubenswrapper[5002]: I1209 10:03:44.580763 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 09 10:03:44 crc kubenswrapper[5002]: I1209 10:03:44.584107 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs2wc\" (UniqueName: \"kubernetes.io/projected/bfc9ae06-2b91-4708-8ea6-e8e8789b2475-kube-api-access-xs2wc\") pod \"redhat-operators-lmx5d\" (UID: \"bfc9ae06-2b91-4708-8ea6-e8e8789b2475\") " pod="openshift-marketplace/redhat-operators-lmx5d" Dec 09 10:03:44 crc kubenswrapper[5002]: I1209 10:03:44.601298 5002 generic.go:334] "Generic (PLEG): container finished" podID="038508b9-13ca-43a3-8414-09097229bc83" containerID="97ca49eb7ee783f45d7c09dda4ace63c3d6f47600a0f7028959d427ba1b6e1f7" exitCode=0 Dec 09 10:03:44 crc kubenswrapper[5002]: I1209 10:03:44.601372 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ptdnb" event={"ID":"038508b9-13ca-43a3-8414-09097229bc83","Type":"ContainerDied","Data":"97ca49eb7ee783f45d7c09dda4ace63c3d6f47600a0f7028959d427ba1b6e1f7"} Dec 09 10:03:44 crc kubenswrapper[5002]: I1209 10:03:44.641455 5002 generic.go:334] "Generic (PLEG): container finished" podID="9c47a6b7-a5f5-4aea-aec6-a90e1c75859c" containerID="29fd90e99a76aff7f225c49dd389222f67aa6e33cc01f9d10a746de39da1e858" exitCode=0 Dec 09 10:03:44 crc kubenswrapper[5002]: I1209 10:03:44.642865 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2dc2h" event={"ID":"9c47a6b7-a5f5-4aea-aec6-a90e1c75859c","Type":"ContainerDied","Data":"29fd90e99a76aff7f225c49dd389222f67aa6e33cc01f9d10a746de39da1e858"} Dec 09 10:03:44 crc kubenswrapper[5002]: I1209 10:03:44.642897 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2dc2h" event={"ID":"9c47a6b7-a5f5-4aea-aec6-a90e1c75859c","Type":"ContainerStarted","Data":"882f19544ed280a07c75e5afab55a47dc4cea53af7a9a5be15fb1143f3531965"} Dec 09 10:03:44 crc kubenswrapper[5002]: I1209 10:03:44.650076 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dlr44"] Dec 09 10:03:44 crc kubenswrapper[5002]: I1209 10:03:44.683013 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lmx5d" Dec 09 10:03:44 crc kubenswrapper[5002]: I1209 10:03:44.901077 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lmx5d"] Dec 09 10:03:45 crc kubenswrapper[5002]: I1209 10:03:45.245200 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-5gtc8" Dec 09 10:03:45 crc kubenswrapper[5002]: I1209 10:03:45.248947 5002 patch_prober.go:28] interesting pod/router-default-5444994796-5gtc8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 10:03:45 crc kubenswrapper[5002]: [-]has-synced failed: reason withheld Dec 09 10:03:45 crc kubenswrapper[5002]: [+]process-running ok Dec 09 10:03:45 crc kubenswrapper[5002]: healthz check failed Dec 09 10:03:45 crc kubenswrapper[5002]: I1209 10:03:45.249017 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gtc8" podUID="99f91e54-9dc7-43ca-86ee-4258c22c6004" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 10:03:45 crc kubenswrapper[5002]: I1209 10:03:45.500374 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-zbsjj" Dec 09 10:03:45 crc kubenswrapper[5002]: I1209 10:03:45.652216 5002 generic.go:334] "Generic (PLEG): container finished" podID="eee6148b-020b-46a1-9066-4cc2fb430d13" containerID="27b1443e54c6d8eafba4a3e3be1fa051be438ef8ade31c3ab7e8491c1731a8df" exitCode=0 Dec 09 10:03:45 crc kubenswrapper[5002]: I1209 10:03:45.652305 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dlr44" event={"ID":"eee6148b-020b-46a1-9066-4cc2fb430d13","Type":"ContainerDied","Data":"27b1443e54c6d8eafba4a3e3be1fa051be438ef8ade31c3ab7e8491c1731a8df"} Dec 09 10:03:45 crc kubenswrapper[5002]: I1209 10:03:45.652352 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dlr44" event={"ID":"eee6148b-020b-46a1-9066-4cc2fb430d13","Type":"ContainerStarted","Data":"ec00257f9c62aaeac26b4afaf67d3e28276590708523ae92d7340822adedb31a"} Dec 09 10:03:45 crc kubenswrapper[5002]: I1209 10:03:45.661112 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"38935473-5396-4d7b-a458-c35e34a2ce39","Type":"ContainerStarted","Data":"29eb8e13f63f6a6e87957e2b54b4f9858c645d961e0910f265b0b96f0bd9410f"} Dec 09 10:03:45 crc kubenswrapper[5002]: I1209 10:03:45.661156 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"38935473-5396-4d7b-a458-c35e34a2ce39","Type":"ContainerStarted","Data":"98ad83484100847036d776d78bcfb7bbafe8ecc09c3cb7cc7e0f9cd4cca036a5"} Dec 09 10:03:45 crc kubenswrapper[5002]: I1209 10:03:45.665280 5002 generic.go:334] "Generic (PLEG): container finished" podID="bfc9ae06-2b91-4708-8ea6-e8e8789b2475" containerID="7d71c5dd30974758cf12343414a0705a13b36009432ec1e7bb6d946e865ffa4b" exitCode=0 Dec 09 10:03:45 crc kubenswrapper[5002]: I1209 10:03:45.665323 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lmx5d" 
event={"ID":"bfc9ae06-2b91-4708-8ea6-e8e8789b2475","Type":"ContainerDied","Data":"7d71c5dd30974758cf12343414a0705a13b36009432ec1e7bb6d946e865ffa4b"} Dec 09 10:03:45 crc kubenswrapper[5002]: I1209 10:03:45.665360 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lmx5d" event={"ID":"bfc9ae06-2b91-4708-8ea6-e8e8789b2475","Type":"ContainerStarted","Data":"9570f3bf8a1cde7ea5598683447e729186333218d9d66b59b5b73e4d237e54dc"} Dec 09 10:03:45 crc kubenswrapper[5002]: I1209 10:03:45.686726 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.686708254 podStartE2EDuration="2.686708254s" podCreationTimestamp="2025-12-09 10:03:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:03:45.685903591 +0000 UTC m=+158.077954692" watchObservedRunningTime="2025-12-09 10:03:45.686708254 +0000 UTC m=+158.078759335" Dec 09 10:03:46 crc kubenswrapper[5002]: I1209 10:03:46.246400 5002 patch_prober.go:28] interesting pod/router-default-5444994796-5gtc8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 10:03:46 crc kubenswrapper[5002]: [-]has-synced failed: reason withheld Dec 09 10:03:46 crc kubenswrapper[5002]: [+]process-running ok Dec 09 10:03:46 crc kubenswrapper[5002]: healthz check failed Dec 09 10:03:46 crc kubenswrapper[5002]: I1209 10:03:46.246476 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gtc8" podUID="99f91e54-9dc7-43ca-86ee-4258c22c6004" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 10:03:46 crc kubenswrapper[5002]: I1209 10:03:46.682873 5002 generic.go:334] "Generic (PLEG): container finished" podID="38935473-5396-4d7b-a458-c35e34a2ce39" containerID="29eb8e13f63f6a6e87957e2b54b4f9858c645d961e0910f265b0b96f0bd9410f" exitCode=0 Dec 09 10:03:46 crc kubenswrapper[5002]: I1209 10:03:46.682936 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"38935473-5396-4d7b-a458-c35e34a2ce39","Type":"ContainerDied","Data":"29eb8e13f63f6a6e87957e2b54b4f9858c645d961e0910f265b0b96f0bd9410f"} Dec 09 10:03:46 crc kubenswrapper[5002]: I1209 10:03:46.686120 5002 generic.go:334] "Generic (PLEG): container finished" podID="6e408545-84a5-4b62-ab02-e213a58d1c53" containerID="07d5e13d633c647f14baf9f5c3a61ce86e7e8f34a3d20dcdfa800af8231d6639" exitCode=0 Dec 09 10:03:46 crc kubenswrapper[5002]: I1209 10:03:46.686206 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421240-5jmzz" event={"ID":"6e408545-84a5-4b62-ab02-e213a58d1c53","Type":"ContainerDied","Data":"07d5e13d633c647f14baf9f5c3a61ce86e7e8f34a3d20dcdfa800af8231d6639"} Dec 09 10:03:47 crc kubenswrapper[5002]: I1209 10:03:47.016424 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-pm68f" Dec 09 10:03:47 crc kubenswrapper[5002]: I1209 10:03:47.243575 5002 patch_prober.go:28] interesting pod/router-default-5444994796-5gtc8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason 
withheld Dec 09 10:03:47 crc kubenswrapper[5002]: [-]has-synced failed: reason withheld Dec 09 10:03:47 crc kubenswrapper[5002]: [+]process-running ok Dec 09 10:03:47 crc kubenswrapper[5002]: healthz check failed Dec 09 10:03:47 crc kubenswrapper[5002]: I1209 10:03:47.243997 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gtc8" podUID="99f91e54-9dc7-43ca-86ee-4258c22c6004" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 10:03:48 crc kubenswrapper[5002]: I1209 10:03:48.046954 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-99l4h" Dec 09 10:03:48 crc kubenswrapper[5002]: I1209 10:03:48.054689 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-99l4h" Dec 09 10:03:48 crc kubenswrapper[5002]: I1209 10:03:48.243432 5002 patch_prober.go:28] interesting pod/router-default-5444994796-5gtc8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 10:03:48 crc kubenswrapper[5002]: [-]has-synced failed: reason withheld Dec 09 10:03:48 crc kubenswrapper[5002]: [+]process-running ok Dec 09 10:03:48 crc kubenswrapper[5002]: healthz check failed Dec 09 10:03:48 crc kubenswrapper[5002]: I1209 10:03:48.243483 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gtc8" podUID="99f91e54-9dc7-43ca-86ee-4258c22c6004" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 10:03:48 crc kubenswrapper[5002]: I1209 10:03:48.286274 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 09 10:03:48 crc kubenswrapper[5002]: I1209 10:03:48.286999 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 09 10:03:48 crc kubenswrapper[5002]: I1209 10:03:48.289646 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 09 10:03:48 crc kubenswrapper[5002]: I1209 10:03:48.290632 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 09 10:03:48 crc kubenswrapper[5002]: I1209 10:03:48.291480 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 09 10:03:48 crc kubenswrapper[5002]: I1209 10:03:48.337393 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/87bc4a85-f705-4635-b8a2-594809da87cc-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"87bc4a85-f705-4635-b8a2-594809da87cc\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 09 10:03:48 crc kubenswrapper[5002]: I1209 10:03:48.337549 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/87bc4a85-f705-4635-b8a2-594809da87cc-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"87bc4a85-f705-4635-b8a2-594809da87cc\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 09 10:03:48 crc kubenswrapper[5002]: I1209 10:03:48.451572 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/87bc4a85-f705-4635-b8a2-594809da87cc-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"87bc4a85-f705-4635-b8a2-594809da87cc\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 09 10:03:48 crc kubenswrapper[5002]: I1209 10:03:48.451686 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/87bc4a85-f705-4635-b8a2-594809da87cc-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"87bc4a85-f705-4635-b8a2-594809da87cc\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 09 10:03:48 crc kubenswrapper[5002]: I1209 10:03:48.451766 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/87bc4a85-f705-4635-b8a2-594809da87cc-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"87bc4a85-f705-4635-b8a2-594809da87cc\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 09 10:03:48 crc kubenswrapper[5002]: I1209 10:03:48.486643 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/87bc4a85-f705-4635-b8a2-594809da87cc-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"87bc4a85-f705-4635-b8a2-594809da87cc\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 09 10:03:48 crc kubenswrapper[5002]: I1209 10:03:48.612184 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 09 10:03:49 crc kubenswrapper[5002]: I1209 10:03:49.243517 5002 patch_prober.go:28] interesting pod/router-default-5444994796-5gtc8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 10:03:49 crc kubenswrapper[5002]: [-]has-synced failed: reason withheld Dec 09 10:03:49 crc kubenswrapper[5002]: [+]process-running ok Dec 09 10:03:49 crc kubenswrapper[5002]: healthz check failed Dec 09 10:03:49 crc kubenswrapper[5002]: I1209 10:03:49.243577 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gtc8" podUID="99f91e54-9dc7-43ca-86ee-4258c22c6004" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 10:03:50 crc kubenswrapper[5002]: I1209 10:03:50.249436 5002 patch_prober.go:28] interesting pod/router-default-5444994796-5gtc8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 10:03:50 crc kubenswrapper[5002]: [-]has-synced failed: reason withheld Dec 09 10:03:50 crc kubenswrapper[5002]: [+]process-running ok Dec 09 10:03:50 crc kubenswrapper[5002]: healthz check failed Dec 09 10:03:50 crc kubenswrapper[5002]: I1209 10:03:50.249509 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gtc8" podUID="99f91e54-9dc7-43ca-86ee-4258c22c6004" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 10:03:51 crc kubenswrapper[5002]: I1209 10:03:51.244443 5002 patch_prober.go:28] interesting pod/router-default-5444994796-5gtc8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 10:03:51 crc kubenswrapper[5002]: [-]has-synced failed: reason withheld Dec 09 10:03:51 crc kubenswrapper[5002]: [+]process-running ok Dec 09 10:03:51 crc kubenswrapper[5002]: healthz check failed Dec 09 10:03:51 crc kubenswrapper[5002]: I1209 10:03:51.244737 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gtc8" podUID="99f91e54-9dc7-43ca-86ee-4258c22c6004" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 10:03:52 crc kubenswrapper[5002]: I1209 10:03:52.243656 5002 patch_prober.go:28] interesting pod/router-default-5444994796-5gtc8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 10:03:52 crc kubenswrapper[5002]: [-]has-synced failed: reason withheld Dec 09 10:03:52 crc kubenswrapper[5002]: [+]process-running ok Dec 09 10:03:52 crc kubenswrapper[5002]: healthz check failed Dec 09 10:03:52 crc kubenswrapper[5002]: I1209 10:03:52.243730 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gtc8" podUID="99f91e54-9dc7-43ca-86ee-4258c22c6004" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
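The router's startup probe fails at almost exactly one-second intervals from 10:03:43 until it finally reports started at 10:03:55, which is consistent with a startup probe configured with a 1s period and a large failure threshold. A sketch of such a probe as client-go types — the path, port, and threshold here are assumptions for illustration, not values read from the router's actual deployment:

```go
// A sketch of a startup probe consistent with the once-per-second
// failures above. Path, port, and failureThreshold are assumed.
package main

import (
	corev1 "k8s.io/api/core/v1"
	"k8s.io/apimachinery/pkg/util/intstr"
)

func routerStartupProbe() *corev1.Probe {
	return &corev1.Probe{
		ProbeHandler: corev1.ProbeHandler{
			HTTPGet: &corev1.HTTPGetAction{
				Path: "/healthz/ready",    // assumed health path
				Port: intstr.FromInt(1936), // assumed stats/health port
			},
		},
		PeriodSeconds:    1,   // matches the one-second failure cadence in the log
		FailureThreshold: 120, // keep probing ~2 minutes before giving up
	}
}

func main() { _ = routerStartupProbe() }
```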
pod="openshift-console/downloads-7954f5f757-xp9jk" Dec 09 10:03:53 crc kubenswrapper[5002]: I1209 10:03:53.242947 5002 patch_prober.go:28] interesting pod/router-default-5444994796-5gtc8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 10:03:53 crc kubenswrapper[5002]: [-]has-synced failed: reason withheld Dec 09 10:03:53 crc kubenswrapper[5002]: [+]process-running ok Dec 09 10:03:53 crc kubenswrapper[5002]: healthz check failed Dec 09 10:03:53 crc kubenswrapper[5002]: I1209 10:03:53.243021 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gtc8" podUID="99f91e54-9dc7-43ca-86ee-4258c22c6004" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 10:03:53 crc kubenswrapper[5002]: I1209 10:03:53.536396 5002 patch_prober.go:28] interesting pod/console-f9d7485db-n967v container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.25:8443/health\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Dec 09 10:03:53 crc kubenswrapper[5002]: I1209 10:03:53.536446 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-n967v" podUID="95a7b196-74b4-4d67-a0e0-3e4b92e468ed" containerName="console" probeResult="failure" output="Get \"https://10.217.0.25:8443/health\": dial tcp 10.217.0.25:8443: connect: connection refused" Dec 09 10:03:53 crc kubenswrapper[5002]: I1209 10:03:53.904601 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ffb94c3-624e-48aa-aaa9-450ace4e1862-metrics-certs\") pod \"network-metrics-daemon-98z2f\" (UID: \"7ffb94c3-624e-48aa-aaa9-450ace4e1862\") " pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:03:53 crc kubenswrapper[5002]: I1209 10:03:53.912059 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ffb94c3-624e-48aa-aaa9-450ace4e1862-metrics-certs\") pod \"network-metrics-daemon-98z2f\" (UID: \"7ffb94c3-624e-48aa-aaa9-450ace4e1862\") " pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:03:54 crc kubenswrapper[5002]: I1209 10:03:54.180277 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z2f" Dec 09 10:03:54 crc kubenswrapper[5002]: I1209 10:03:54.242888 5002 patch_prober.go:28] interesting pod/router-default-5444994796-5gtc8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 10:03:54 crc kubenswrapper[5002]: [+]has-synced ok Dec 09 10:03:54 crc kubenswrapper[5002]: [+]process-running ok Dec 09 10:03:54 crc kubenswrapper[5002]: healthz check failed Dec 09 10:03:54 crc kubenswrapper[5002]: I1209 10:03:54.243059 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gtc8" podUID="99f91e54-9dc7-43ca-86ee-4258c22c6004" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 10:03:55 crc kubenswrapper[5002]: I1209 10:03:55.207539 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 09 10:03:55 crc kubenswrapper[5002]: I1209 10:03:55.218289 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421240-5jmzz" Dec 09 10:03:55 crc kubenswrapper[5002]: I1209 10:03:55.272388 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-5gtc8" Dec 09 10:03:55 crc kubenswrapper[5002]: I1209 10:03:55.275783 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-5gtc8" Dec 09 10:03:55 crc kubenswrapper[5002]: I1209 10:03:55.326634 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/38935473-5396-4d7b-a458-c35e34a2ce39-kubelet-dir\") pod \"38935473-5396-4d7b-a458-c35e34a2ce39\" (UID: \"38935473-5396-4d7b-a458-c35e34a2ce39\") " Dec 09 10:03:55 crc kubenswrapper[5002]: I1209 10:03:55.326759 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/38935473-5396-4d7b-a458-c35e34a2ce39-kube-api-access\") pod \"38935473-5396-4d7b-a458-c35e34a2ce39\" (UID: \"38935473-5396-4d7b-a458-c35e34a2ce39\") " Dec 09 10:03:55 crc kubenswrapper[5002]: I1209 10:03:55.326797 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6e408545-84a5-4b62-ab02-e213a58d1c53-config-volume\") pod \"6e408545-84a5-4b62-ab02-e213a58d1c53\" (UID: \"6e408545-84a5-4b62-ab02-e213a58d1c53\") " Dec 09 10:03:55 crc kubenswrapper[5002]: I1209 10:03:55.326850 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6e408545-84a5-4b62-ab02-e213a58d1c53-secret-volume\") pod \"6e408545-84a5-4b62-ab02-e213a58d1c53\" (UID: \"6e408545-84a5-4b62-ab02-e213a58d1c53\") " Dec 09 10:03:55 crc kubenswrapper[5002]: I1209 10:03:55.326880 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcq5j\" (UniqueName: \"kubernetes.io/projected/6e408545-84a5-4b62-ab02-e213a58d1c53-kube-api-access-tcq5j\") pod \"6e408545-84a5-4b62-ab02-e213a58d1c53\" (UID: \"6e408545-84a5-4b62-ab02-e213a58d1c53\") " Dec 09 10:03:55 crc kubenswrapper[5002]: I1209 10:03:55.326983 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/38935473-5396-4d7b-a458-c35e34a2ce39-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "38935473-5396-4d7b-a458-c35e34a2ce39" (UID: "38935473-5396-4d7b-a458-c35e34a2ce39"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 10:03:55 crc kubenswrapper[5002]: I1209 10:03:55.327475 5002 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/38935473-5396-4d7b-a458-c35e34a2ce39-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 09 10:03:55 crc kubenswrapper[5002]: I1209 10:03:55.327834 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e408545-84a5-4b62-ab02-e213a58d1c53-config-volume" (OuterVolumeSpecName: "config-volume") pod "6e408545-84a5-4b62-ab02-e213a58d1c53" (UID: "6e408545-84a5-4b62-ab02-e213a58d1c53"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:03:55 crc kubenswrapper[5002]: I1209 10:03:55.340709 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e408545-84a5-4b62-ab02-e213a58d1c53-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6e408545-84a5-4b62-ab02-e213a58d1c53" (UID: "6e408545-84a5-4b62-ab02-e213a58d1c53"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:03:55 crc kubenswrapper[5002]: I1209 10:03:55.340832 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38935473-5396-4d7b-a458-c35e34a2ce39-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "38935473-5396-4d7b-a458-c35e34a2ce39" (UID: "38935473-5396-4d7b-a458-c35e34a2ce39"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:03:55 crc kubenswrapper[5002]: I1209 10:03:55.341364 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e408545-84a5-4b62-ab02-e213a58d1c53-kube-api-access-tcq5j" (OuterVolumeSpecName: "kube-api-access-tcq5j") pod "6e408545-84a5-4b62-ab02-e213a58d1c53" (UID: "6e408545-84a5-4b62-ab02-e213a58d1c53"). InnerVolumeSpecName "kube-api-access-tcq5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:03:55 crc kubenswrapper[5002]: I1209 10:03:55.429631 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/38935473-5396-4d7b-a458-c35e34a2ce39-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 09 10:03:55 crc kubenswrapper[5002]: I1209 10:03:55.429712 5002 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6e408545-84a5-4b62-ab02-e213a58d1c53-config-volume\") on node \"crc\" DevicePath \"\"" Dec 09 10:03:55 crc kubenswrapper[5002]: I1209 10:03:55.429725 5002 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6e408545-84a5-4b62-ab02-e213a58d1c53-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 09 10:03:55 crc kubenswrapper[5002]: I1209 10:03:55.429737 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcq5j\" (UniqueName: \"kubernetes.io/projected/6e408545-84a5-4b62-ab02-e213a58d1c53-kube-api-access-tcq5j\") on node \"crc\" DevicePath \"\"" Dec 09 10:03:55 crc kubenswrapper[5002]: I1209 10:03:55.742102 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 09 10:03:55 crc kubenswrapper[5002]: I1209 10:03:55.742102 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"38935473-5396-4d7b-a458-c35e34a2ce39","Type":"ContainerDied","Data":"98ad83484100847036d776d78bcfb7bbafe8ecc09c3cb7cc7e0f9cd4cca036a5"} Dec 09 10:03:55 crc kubenswrapper[5002]: I1209 10:03:55.742227 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98ad83484100847036d776d78bcfb7bbafe8ecc09c3cb7cc7e0f9cd4cca036a5" Dec 09 10:03:55 crc kubenswrapper[5002]: I1209 10:03:55.743734 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421240-5jmzz" event={"ID":"6e408545-84a5-4b62-ab02-e213a58d1c53","Type":"ContainerDied","Data":"00a4fcf53a1aceb562e7a08fe6700ddd4530e5b1020944c74dbc60fd401ea8ac"} Dec 09 10:03:55 crc kubenswrapper[5002]: I1209 10:03:55.743754 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421240-5jmzz" Dec 09 10:03:55 crc kubenswrapper[5002]: I1209 10:03:55.743768 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00a4fcf53a1aceb562e7a08fe6700ddd4530e5b1020944c74dbc60fd401ea8ac" Dec 09 10:04:02 crc kubenswrapper[5002]: I1209 10:04:02.629277 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" Dec 09 10:04:03 crc kubenswrapper[5002]: I1209 10:04:03.541668 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-n967v" Dec 09 10:04:03 crc kubenswrapper[5002]: I1209 10:04:03.546303 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-n967v" Dec 09 10:04:03 crc kubenswrapper[5002]: I1209 10:04:03.834114 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-98z2f"] Dec 09 10:04:07 crc kubenswrapper[5002]: E1209 10:04:07.959521 5002 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 09 10:04:07 crc kubenswrapper[5002]: E1209 10:04:07.960123 5002 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-whdbq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-2t8ql_openshift-marketplace(39e38e20-dd9f-43b8-848f-e5618d52f6af): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 09 10:04:07 crc kubenswrapper[5002]: E1209 10:04:07.961306 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-2t8ql" podUID="39e38e20-dd9f-43b8-848f-e5618d52f6af" Dec 09 10:04:07 crc kubenswrapper[5002]: I1209 10:04:07.965285 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 10:04:07 crc kubenswrapper[5002]: I1209 10:04:07.965378 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 10:04:11 crc kubenswrapper[5002]: E1209 10:04:11.994661 5002 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 09 10:04:11 crc kubenswrapper[5002]: E1209 10:04:11.995200 5002 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cjl4c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-zsnb5_openshift-marketplace(a3c9ac96-2913-4efd-b800-4f5c18dc08ab): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 09 10:04:11 crc kubenswrapper[5002]: E1209 10:04:11.996471 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-zsnb5" podUID="a3c9ac96-2913-4efd-b800-4f5c18dc08ab" Dec 09 10:04:13 crc kubenswrapper[5002]: E1209 10:04:13.352521 5002 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 09 10:04:13 crc kubenswrapper[5002]: E1209 10:04:13.352792 5002 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5z2pz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-ptdnb_openshift-marketplace(038508b9-13ca-43a3-8414-09097229bc83): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 09 10:04:13 crc kubenswrapper[5002]: E1209 10:04:13.354246 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-ptdnb" podUID="038508b9-13ca-43a3-8414-09097229bc83" Dec 09 10:04:13 crc kubenswrapper[5002]: I1209 10:04:13.838744 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-98z2f" event={"ID":"7ffb94c3-624e-48aa-aaa9-450ace4e1862","Type":"ContainerStarted","Data":"0a670ae92bf5c90ef1416b6c6c63c20e8f21f6c74e0a94650a6175c580e0824b"} Dec 09 10:04:14 crc kubenswrapper[5002]: I1209 10:04:14.783794 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 10:04:15 crc kubenswrapper[5002]: E1209 10:04:15.075107 5002 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 09 10:04:15 crc kubenswrapper[5002]: E1209 10:04:15.075250 5002 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6wl4j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-lwx7v_openshift-marketplace(2c7271b9-c21f-44a0-ad27-f4158f36f317): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 09 10:04:15 crc kubenswrapper[5002]: E1209 10:04:15.077278 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-lwx7v" podUID="2c7271b9-c21f-44a0-ad27-f4158f36f317" Dec 09 10:04:15 crc kubenswrapper[5002]: I1209 10:04:15.230843 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dhngp" Dec 09 10:04:18 crc kubenswrapper[5002]: I1209 10:04:18.782567 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 09 10:04:18 crc kubenswrapper[5002]: E1209 10:04:18.783267 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38935473-5396-4d7b-a458-c35e34a2ce39" containerName="pruner" Dec 09 10:04:18 crc kubenswrapper[5002]: I1209 10:04:18.783283 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="38935473-5396-4d7b-a458-c35e34a2ce39" containerName="pruner" Dec 09 10:04:18 crc kubenswrapper[5002]: E1209 10:04:18.783327 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e408545-84a5-4b62-ab02-e213a58d1c53" containerName="collect-profiles" Dec 09 10:04:18 crc kubenswrapper[5002]: I1209 10:04:18.783336 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e408545-84a5-4b62-ab02-e213a58d1c53" containerName="collect-profiles" Dec 09 10:04:18 crc kubenswrapper[5002]: I1209 10:04:18.783460 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="38935473-5396-4d7b-a458-c35e34a2ce39" containerName="pruner" Dec 09 10:04:18 crc kubenswrapper[5002]: I1209 10:04:18.783494 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e408545-84a5-4b62-ab02-e213a58d1c53" containerName="collect-profiles" Dec 09 10:04:18 
Dec 09 10:04:18 crc kubenswrapper[5002]: I1209 10:04:18.784122 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 09 10:04:18 crc kubenswrapper[5002]: I1209 10:04:18.784666 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Dec 09 10:04:18 crc kubenswrapper[5002]: I1209 10:04:18.830368 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e224295a-f0bd-4501-8bee-6f02ae064e78-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e224295a-f0bd-4501-8bee-6f02ae064e78\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 09 10:04:18 crc kubenswrapper[5002]: I1209 10:04:18.830422 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e224295a-f0bd-4501-8bee-6f02ae064e78-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e224295a-f0bd-4501-8bee-6f02ae064e78\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 09 10:04:18 crc kubenswrapper[5002]: I1209 10:04:18.931179 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e224295a-f0bd-4501-8bee-6f02ae064e78-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e224295a-f0bd-4501-8bee-6f02ae064e78\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 09 10:04:18 crc kubenswrapper[5002]: I1209 10:04:18.931262 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e224295a-f0bd-4501-8bee-6f02ae064e78-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e224295a-f0bd-4501-8bee-6f02ae064e78\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 09 10:04:18 crc kubenswrapper[5002]: I1209 10:04:18.931334 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e224295a-f0bd-4501-8bee-6f02ae064e78-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e224295a-f0bd-4501-8bee-6f02ae064e78\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 09 10:04:18 crc kubenswrapper[5002]: I1209 10:04:18.957441 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e224295a-f0bd-4501-8bee-6f02ae064e78-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e224295a-f0bd-4501-8bee-6f02ae064e78\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 09 10:04:19 crc kubenswrapper[5002]: I1209 10:04:19.154346 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
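
The "kube-api-access" volume being attached and mounted above is the kubelet's projected service-account volume. As a sketch of its usual shape, expressed with the k8s.io/api/core/v1 types; the concrete values (token lifetime, paths) are the common defaults and are assumptions, not read from this pod's spec:

```go
package main

import corev1 "k8s.io/api/core/v1"

// kubeAPIAccessVolume sketches the projected volume behind the
// "kube-api-access" mounts seen above: a bound SA token plus the
// cluster CA bundle and the pod's namespace. Values are assumed
// defaults, not taken from the revision-pruner pod itself.
func kubeAPIAccessVolume() corev1.Volume {
	expiry := int64(3607) // commonly generated bound-token lifetime
	return corev1.Volume{
		Name: "kube-api-access",
		VolumeSource: corev1.VolumeSource{
			Projected: &corev1.ProjectedVolumeSource{
				Sources: []corev1.VolumeProjection{
					{ServiceAccountToken: &corev1.ServiceAccountTokenProjection{
						Path:              "token",
						ExpirationSeconds: &expiry,
					}},
					{ConfigMap: &corev1.ConfigMapProjection{
						LocalObjectReference: corev1.LocalObjectReference{Name: "kube-root-ca.crt"},
						Items:                []corev1.KeyToPath{{Key: "ca.crt", Path: "ca.crt"}},
					}},
					{DownwardAPI: &corev1.DownwardAPIProjection{
						Items: []corev1.DownwardAPIVolumeFile{{
							Path:     "namespace",
							FieldRef: &corev1.ObjectFieldSelector{FieldPath: "metadata.namespace"},
						}},
					}},
				},
			},
		},
	}
}

func main() { _ = kubeAPIAccessVolume() }
```

The VerifyControllerAttachedVolume, MountVolume, and MountVolume.SetUp lines above are the kubelet's reconciler working through exactly this kind of volume before the pod sandbox can start.
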
Dec 09 10:04:21 crc kubenswrapper[5002]: E1209 10:04:21.126371 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-ptdnb" podUID="038508b9-13ca-43a3-8414-09097229bc83"
Dec 09 10:04:21 crc kubenswrapper[5002]: E1209 10:04:21.126777 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-lwx7v" podUID="2c7271b9-c21f-44a0-ad27-f4158f36f317"
Dec 09 10:04:21 crc kubenswrapper[5002]: E1209 10:04:21.127488 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-zsnb5" podUID="a3c9ac96-2913-4efd-b800-4f5c18dc08ab"
Dec 09 10:04:21 crc kubenswrapper[5002]: I1209 10:04:21.774243 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Dec 09 10:04:21 crc kubenswrapper[5002]: I1209 10:04:21.871449 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Dec 09 10:04:21 crc kubenswrapper[5002]: W1209 10:04:21.887975 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode224295a_f0bd_4501_8bee_6f02ae064e78.slice/crio-976fb3b31baf0299ac78185c60a3355f7fbc659acc11b2c411134d7746e0ef52 WatchSource:0}: Error finding container 976fb3b31baf0299ac78185c60a3355f7fbc659acc11b2c411134d7746e0ef52: Status 404 returned error can't find the container with id 976fb3b31baf0299ac78185c60a3355f7fbc659acc11b2c411134d7746e0ef52
Dec 09 10:04:21 crc kubenswrapper[5002]: I1209 10:04:21.892008 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lmx5d" event={"ID":"bfc9ae06-2b91-4708-8ea6-e8e8789b2475","Type":"ContainerStarted","Data":"7c113c8ab1b12477f5480bda4f3129627ce4a536fac90d2204ec5560ffb648b1"}
Dec 09 10:04:21 crc kubenswrapper[5002]: I1209 10:04:21.901790 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dlr44" event={"ID":"eee6148b-020b-46a1-9066-4cc2fb430d13","Type":"ContainerStarted","Data":"cf1abfd43763cb606db536b9dc79084807ad5e698579ae25755b164a4a02963d"}
Dec 09 10:04:21 crc kubenswrapper[5002]: I1209 10:04:21.916555 5002 generic.go:334] "Generic (PLEG): container finished" podID="9c47a6b7-a5f5-4aea-aec6-a90e1c75859c" containerID="931a5b1abd320b957b303805787534e3557adaaba28e5eaa4dda68e4a762ccc7" exitCode=0
Dec 09 10:04:21 crc kubenswrapper[5002]: I1209 10:04:21.916614 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2dc2h" event={"ID":"9c47a6b7-a5f5-4aea-aec6-a90e1c75859c","Type":"ContainerDied","Data":"931a5b1abd320b957b303805787534e3557adaaba28e5eaa4dda68e4a762ccc7"}
Dec 09 10:04:21 crc kubenswrapper[5002]: I1209 10:04:21.919772 5002 generic.go:334] "Generic (PLEG): container finished" podID="a788f35f-cb43-4452-a6e0-f7fb2d040d6f" containerID="37b655e008a9bc2e0d5a4e8059ee9d73c154c07c7b26ca95dbe9721ec5b0af5d" exitCode=0
containerID="37b655e008a9bc2e0d5a4e8059ee9d73c154c07c7b26ca95dbe9721ec5b0af5d" exitCode=0 Dec 09 10:04:21 crc kubenswrapper[5002]: I1209 10:04:21.919831 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kk56c" event={"ID":"a788f35f-cb43-4452-a6e0-f7fb2d040d6f","Type":"ContainerDied","Data":"37b655e008a9bc2e0d5a4e8059ee9d73c154c07c7b26ca95dbe9721ec5b0af5d"} Dec 09 10:04:21 crc kubenswrapper[5002]: I1209 10:04:21.921500 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"87bc4a85-f705-4635-b8a2-594809da87cc","Type":"ContainerStarted","Data":"511c95ae0425b926d81f6ebb52c7e77be265c7ee60eda997ae1d9e99ea4d05da"} Dec 09 10:04:21 crc kubenswrapper[5002]: I1209 10:04:21.923992 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-98z2f" event={"ID":"7ffb94c3-624e-48aa-aaa9-450ace4e1862","Type":"ContainerStarted","Data":"8a9677a4d47b5fe83d9346018998121e7953887a2fdd49fb55277d00bb01c3ef"} Dec 09 10:04:22 crc kubenswrapper[5002]: I1209 10:04:22.979514 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"87bc4a85-f705-4635-b8a2-594809da87cc","Type":"ContainerStarted","Data":"3ab2eac0891b96382fa54f4901d599e081e2225606d2c7faf0a47816ed26b90d"} Dec 09 10:04:22 crc kubenswrapper[5002]: I1209 10:04:22.985931 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-98z2f" event={"ID":"7ffb94c3-624e-48aa-aaa9-450ace4e1862","Type":"ContainerStarted","Data":"4ae79cc1eee027f31e4a489bfbb80484020b46c3ba4cd933f39bc0ace220cc4f"} Dec 09 10:04:22 crc kubenswrapper[5002]: I1209 10:04:22.989914 5002 generic.go:334] "Generic (PLEG): container finished" podID="bfc9ae06-2b91-4708-8ea6-e8e8789b2475" containerID="7c113c8ab1b12477f5480bda4f3129627ce4a536fac90d2204ec5560ffb648b1" exitCode=0 Dec 09 10:04:22 crc kubenswrapper[5002]: I1209 10:04:22.989971 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lmx5d" event={"ID":"bfc9ae06-2b91-4708-8ea6-e8e8789b2475","Type":"ContainerDied","Data":"7c113c8ab1b12477f5480bda4f3129627ce4a536fac90d2204ec5560ffb648b1"} Dec 09 10:04:22 crc kubenswrapper[5002]: I1209 10:04:22.999246 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=34.999232042 podStartE2EDuration="34.999232042s" podCreationTimestamp="2025-12-09 10:03:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:04:22.996116833 +0000 UTC m=+195.388167914" watchObservedRunningTime="2025-12-09 10:04:22.999232042 +0000 UTC m=+195.391283123" Dec 09 10:04:23 crc kubenswrapper[5002]: I1209 10:04:23.002658 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2t8ql" event={"ID":"39e38e20-dd9f-43b8-848f-e5618d52f6af","Type":"ContainerStarted","Data":"c8cf32ec45ce42190e0e3cb894d8f3b69c14539d9b0b4fbd7a2f498d773b3dc2"} Dec 09 10:04:23 crc kubenswrapper[5002]: I1209 10:04:23.017422 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"e224295a-f0bd-4501-8bee-6f02ae064e78","Type":"ContainerStarted","Data":"01add78670bdcaf29b65a55a14dc4f5b6bc3ffdd1a1db2ae323aa04e6e60a9af"} Dec 09 10:04:23 crc kubenswrapper[5002]: I1209 
10:04:23.017466 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"e224295a-f0bd-4501-8bee-6f02ae064e78","Type":"ContainerStarted","Data":"976fb3b31baf0299ac78185c60a3355f7fbc659acc11b2c411134d7746e0ef52"} Dec 09 10:04:23 crc kubenswrapper[5002]: I1209 10:04:23.065936 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-98z2f" podStartSLOduration=173.065914319 podStartE2EDuration="2m53.065914319s" podCreationTimestamp="2025-12-09 10:01:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:04:23.038200976 +0000 UTC m=+195.430252047" watchObservedRunningTime="2025-12-09 10:04:23.065914319 +0000 UTC m=+195.457965400" Dec 09 10:04:23 crc kubenswrapper[5002]: I1209 10:04:23.086835 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2dc2h" podStartSLOduration=2.333855068 podStartE2EDuration="40.086798516s" podCreationTimestamp="2025-12-09 10:03:43 +0000 UTC" firstStartedPulling="2025-12-09 10:03:44.654297141 +0000 UTC m=+157.046348222" lastFinishedPulling="2025-12-09 10:04:22.407240589 +0000 UTC m=+194.799291670" observedRunningTime="2025-12-09 10:04:23.082264026 +0000 UTC m=+195.474315107" watchObservedRunningTime="2025-12-09 10:04:23.086798516 +0000 UTC m=+195.478849597" Dec 09 10:04:23 crc kubenswrapper[5002]: I1209 10:04:23.118225 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=5.118203463 podStartE2EDuration="5.118203463s" podCreationTimestamp="2025-12-09 10:04:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:04:23.117980457 +0000 UTC m=+195.510031538" watchObservedRunningTime="2025-12-09 10:04:23.118203463 +0000 UTC m=+195.510254544" Dec 09 10:04:23 crc kubenswrapper[5002]: I1209 10:04:23.483664 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2dc2h" Dec 09 10:04:23 crc kubenswrapper[5002]: I1209 10:04:23.484004 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2dc2h" Dec 09 10:04:24 crc kubenswrapper[5002]: I1209 10:04:24.022997 5002 generic.go:334] "Generic (PLEG): container finished" podID="e224295a-f0bd-4501-8bee-6f02ae064e78" containerID="01add78670bdcaf29b65a55a14dc4f5b6bc3ffdd1a1db2ae323aa04e6e60a9af" exitCode=0 Dec 09 10:04:24 crc kubenswrapper[5002]: I1209 10:04:24.023220 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"e224295a-f0bd-4501-8bee-6f02ae064e78","Type":"ContainerDied","Data":"01add78670bdcaf29b65a55a14dc4f5b6bc3ffdd1a1db2ae323aa04e6e60a9af"} Dec 09 10:04:24 crc kubenswrapper[5002]: I1209 10:04:24.025432 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kk56c" event={"ID":"a788f35f-cb43-4452-a6e0-f7fb2d040d6f","Type":"ContainerStarted","Data":"2fd5de55329c978f6cdae85fc4a9b78284d145d496b9b7638fbf2b2c2828bd22"} Dec 09 10:04:24 crc kubenswrapper[5002]: I1209 10:04:24.026829 5002 generic.go:334] "Generic (PLEG): container finished" podID="87bc4a85-f705-4635-b8a2-594809da87cc" 
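
The podStartSLOduration fields in the tracker lines above are consistent with "E2E start duration minus time spent pulling images": for redhat-marketplace-2dc2h, 40.086798516s minus the pull window (10:03:44.654297141 to 10:04:22.407240589) is exactly 2.333855068s. A small Go check of that arithmetic, using timestamps copied from the record above; the subtraction rule is inferred from the numbers, not taken from kubelet source:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Layout matching the log's "2025-12-09 10:03:44.654297141 +0000 UTC" form.
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	firstPull, _ := time.Parse(layout, "2025-12-09 10:03:44.654297141 +0000 UTC")
	lastPull, _ := time.Parse(layout, "2025-12-09 10:04:22.407240589 +0000 UTC")

	e2e := 40086798516 * time.Nanosecond // podStartE2EDuration=40.086798516s
	pull := lastPull.Sub(firstPull)      // image-pull window from the log
	slo := e2e - pull

	fmt.Println(pull, slo) // 37.752943448s 2.333855068s = podStartSLOduration
}
```
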
containerID="3ab2eac0891b96382fa54f4901d599e081e2225606d2c7faf0a47816ed26b90d" exitCode=0 Dec 09 10:04:24 crc kubenswrapper[5002]: I1209 10:04:24.026900 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"87bc4a85-f705-4635-b8a2-594809da87cc","Type":"ContainerDied","Data":"3ab2eac0891b96382fa54f4901d599e081e2225606d2c7faf0a47816ed26b90d"} Dec 09 10:04:24 crc kubenswrapper[5002]: I1209 10:04:24.029330 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lmx5d" event={"ID":"bfc9ae06-2b91-4708-8ea6-e8e8789b2475","Type":"ContainerStarted","Data":"7f8454abdd12407ec6d82ba47e5acfdda1e1b42a2f80a8508390ffa13b15c4a2"} Dec 09 10:04:24 crc kubenswrapper[5002]: I1209 10:04:24.031403 5002 generic.go:334] "Generic (PLEG): container finished" podID="eee6148b-020b-46a1-9066-4cc2fb430d13" containerID="cf1abfd43763cb606db536b9dc79084807ad5e698579ae25755b164a4a02963d" exitCode=0 Dec 09 10:04:24 crc kubenswrapper[5002]: I1209 10:04:24.031501 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dlr44" event={"ID":"eee6148b-020b-46a1-9066-4cc2fb430d13","Type":"ContainerDied","Data":"cf1abfd43763cb606db536b9dc79084807ad5e698579ae25755b164a4a02963d"} Dec 09 10:04:24 crc kubenswrapper[5002]: I1209 10:04:24.033183 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2t8ql" event={"ID":"39e38e20-dd9f-43b8-848f-e5618d52f6af","Type":"ContainerDied","Data":"c8cf32ec45ce42190e0e3cb894d8f3b69c14539d9b0b4fbd7a2f498d773b3dc2"} Dec 09 10:04:24 crc kubenswrapper[5002]: I1209 10:04:24.033190 5002 generic.go:334] "Generic (PLEG): container finished" podID="39e38e20-dd9f-43b8-848f-e5618d52f6af" containerID="c8cf32ec45ce42190e0e3cb894d8f3b69c14539d9b0b4fbd7a2f498d773b3dc2" exitCode=0 Dec 09 10:04:24 crc kubenswrapper[5002]: I1209 10:04:24.036364 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2dc2h" event={"ID":"9c47a6b7-a5f5-4aea-aec6-a90e1c75859c","Type":"ContainerStarted","Data":"7575d4f5dda12d2d07e1bd94c1050a65e279b69669939ba1dd92a33cc88f2920"} Dec 09 10:04:24 crc kubenswrapper[5002]: I1209 10:04:24.063598 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kk56c" podStartSLOduration=2.912495856 podStartE2EDuration="43.063576759s" podCreationTimestamp="2025-12-09 10:03:41 +0000 UTC" firstStartedPulling="2025-12-09 10:03:42.516037655 +0000 UTC m=+154.908088736" lastFinishedPulling="2025-12-09 10:04:22.667118548 +0000 UTC m=+195.059169639" observedRunningTime="2025-12-09 10:04:24.063442845 +0000 UTC m=+196.455493926" watchObservedRunningTime="2025-12-09 10:04:24.063576759 +0000 UTC m=+196.455627840" Dec 09 10:04:24 crc kubenswrapper[5002]: I1209 10:04:24.080402 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lmx5d" podStartSLOduration=3.029657605 podStartE2EDuration="40.080383739s" podCreationTimestamp="2025-12-09 10:03:44 +0000 UTC" firstStartedPulling="2025-12-09 10:03:46.690801578 +0000 UTC m=+159.082852659" lastFinishedPulling="2025-12-09 10:04:23.741527722 +0000 UTC m=+196.133578793" observedRunningTime="2025-12-09 10:04:24.078776513 +0000 UTC m=+196.470827594" watchObservedRunningTime="2025-12-09 10:04:24.080383739 +0000 UTC m=+196.472434830" Dec 09 10:04:24 crc kubenswrapper[5002]: I1209 10:04:24.683404 5002 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lmx5d" Dec 09 10:04:24 crc kubenswrapper[5002]: I1209 10:04:24.683464 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lmx5d" Dec 09 10:04:24 crc kubenswrapper[5002]: I1209 10:04:24.751494 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-2dc2h" podUID="9c47a6b7-a5f5-4aea-aec6-a90e1c75859c" containerName="registry-server" probeResult="failure" output=< Dec 09 10:04:24 crc kubenswrapper[5002]: timeout: failed to connect service ":50051" within 1s Dec 09 10:04:24 crc kubenswrapper[5002]: > Dec 09 10:04:25 crc kubenswrapper[5002]: I1209 10:04:25.050543 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dlr44" event={"ID":"eee6148b-020b-46a1-9066-4cc2fb430d13","Type":"ContainerStarted","Data":"3a0a50587467613ed93843246df92f780495ce62bc1c402465593d1317d65e6d"} Dec 09 10:04:25 crc kubenswrapper[5002]: I1209 10:04:25.052946 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2t8ql" event={"ID":"39e38e20-dd9f-43b8-848f-e5618d52f6af","Type":"ContainerStarted","Data":"bc662e51aa429f06840b5229e4ab31e576f663fbe9d9a33571f4b4a3b2eef90b"} Dec 09 10:04:25 crc kubenswrapper[5002]: I1209 10:04:25.074981 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dlr44" podStartSLOduration=3.2616292639999998 podStartE2EDuration="42.074964465s" podCreationTimestamp="2025-12-09 10:03:43 +0000 UTC" firstStartedPulling="2025-12-09 10:03:45.654962456 +0000 UTC m=+158.047013537" lastFinishedPulling="2025-12-09 10:04:24.468297657 +0000 UTC m=+196.860348738" observedRunningTime="2025-12-09 10:04:25.073271135 +0000 UTC m=+197.465322216" watchObservedRunningTime="2025-12-09 10:04:25.074964465 +0000 UTC m=+197.467015546" Dec 09 10:04:25 crc kubenswrapper[5002]: I1209 10:04:25.175747 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2t8ql" podStartSLOduration=3.256255695 podStartE2EDuration="45.17571911s" podCreationTimestamp="2025-12-09 10:03:40 +0000 UTC" firstStartedPulling="2025-12-09 10:03:42.511634809 +0000 UTC m=+154.903685890" lastFinishedPulling="2025-12-09 10:04:24.431098224 +0000 UTC m=+196.823149305" observedRunningTime="2025-12-09 10:04:25.097188932 +0000 UTC m=+197.489240013" watchObservedRunningTime="2025-12-09 10:04:25.17571911 +0000 UTC m=+197.567770191" Dec 09 10:04:25 crc kubenswrapper[5002]: I1209 10:04:25.179601 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 09 10:04:25 crc kubenswrapper[5002]: I1209 10:04:25.213066 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/18877059-32eb-40f8-98c9-3cca1a075ec6-var-lock\") pod \"installer-9-crc\" (UID: \"18877059-32eb-40f8-98c9-3cca1a075ec6\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 09 10:04:25 crc kubenswrapper[5002]: I1209 10:04:25.213592 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/18877059-32eb-40f8-98c9-3cca1a075ec6-kube-api-access\") pod \"installer-9-crc\" (UID: \"18877059-32eb-40f8-98c9-3cca1a075ec6\") " 
pod="openshift-kube-apiserver/installer-9-crc" Dec 09 10:04:25 crc kubenswrapper[5002]: I1209 10:04:25.213690 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/18877059-32eb-40f8-98c9-3cca1a075ec6-kubelet-dir\") pod \"installer-9-crc\" (UID: \"18877059-32eb-40f8-98c9-3cca1a075ec6\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 09 10:04:25 crc kubenswrapper[5002]: I1209 10:04:25.216468 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 09 10:04:25 crc kubenswrapper[5002]: I1209 10:04:25.217998 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 09 10:04:25 crc kubenswrapper[5002]: I1209 10:04:25.314943 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/18877059-32eb-40f8-98c9-3cca1a075ec6-kubelet-dir\") pod \"installer-9-crc\" (UID: \"18877059-32eb-40f8-98c9-3cca1a075ec6\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 09 10:04:25 crc kubenswrapper[5002]: I1209 10:04:25.315065 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/18877059-32eb-40f8-98c9-3cca1a075ec6-var-lock\") pod \"installer-9-crc\" (UID: \"18877059-32eb-40f8-98c9-3cca1a075ec6\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 09 10:04:25 crc kubenswrapper[5002]: I1209 10:04:25.315102 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/18877059-32eb-40f8-98c9-3cca1a075ec6-kube-api-access\") pod \"installer-9-crc\" (UID: \"18877059-32eb-40f8-98c9-3cca1a075ec6\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 09 10:04:25 crc kubenswrapper[5002]: I1209 10:04:25.315404 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/18877059-32eb-40f8-98c9-3cca1a075ec6-kubelet-dir\") pod \"installer-9-crc\" (UID: \"18877059-32eb-40f8-98c9-3cca1a075ec6\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 09 10:04:25 crc kubenswrapper[5002]: I1209 10:04:25.315405 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/18877059-32eb-40f8-98c9-3cca1a075ec6-var-lock\") pod \"installer-9-crc\" (UID: \"18877059-32eb-40f8-98c9-3cca1a075ec6\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 09 10:04:25 crc kubenswrapper[5002]: I1209 10:04:25.354400 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/18877059-32eb-40f8-98c9-3cca1a075ec6-kube-api-access\") pod \"installer-9-crc\" (UID: \"18877059-32eb-40f8-98c9-3cca1a075ec6\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 09 10:04:25 crc kubenswrapper[5002]: I1209 10:04:25.533408 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 09 10:04:25 crc kubenswrapper[5002]: I1209 10:04:25.536513 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 09 10:04:25 crc kubenswrapper[5002]: I1209 10:04:25.580196 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 09 10:04:25 crc kubenswrapper[5002]: I1209 10:04:25.618466 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e224295a-f0bd-4501-8bee-6f02ae064e78-kube-api-access\") pod \"e224295a-f0bd-4501-8bee-6f02ae064e78\" (UID: \"e224295a-f0bd-4501-8bee-6f02ae064e78\") " Dec 09 10:04:25 crc kubenswrapper[5002]: I1209 10:04:25.618533 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/87bc4a85-f705-4635-b8a2-594809da87cc-kube-api-access\") pod \"87bc4a85-f705-4635-b8a2-594809da87cc\" (UID: \"87bc4a85-f705-4635-b8a2-594809da87cc\") " Dec 09 10:04:25 crc kubenswrapper[5002]: I1209 10:04:25.618577 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/87bc4a85-f705-4635-b8a2-594809da87cc-kubelet-dir\") pod \"87bc4a85-f705-4635-b8a2-594809da87cc\" (UID: \"87bc4a85-f705-4635-b8a2-594809da87cc\") " Dec 09 10:04:25 crc kubenswrapper[5002]: I1209 10:04:25.618682 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e224295a-f0bd-4501-8bee-6f02ae064e78-kubelet-dir\") pod \"e224295a-f0bd-4501-8bee-6f02ae064e78\" (UID: \"e224295a-f0bd-4501-8bee-6f02ae064e78\") " Dec 09 10:04:25 crc kubenswrapper[5002]: I1209 10:04:25.619109 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e224295a-f0bd-4501-8bee-6f02ae064e78-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e224295a-f0bd-4501-8bee-6f02ae064e78" (UID: "e224295a-f0bd-4501-8bee-6f02ae064e78"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 10:04:25 crc kubenswrapper[5002]: I1209 10:04:25.619470 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87bc4a85-f705-4635-b8a2-594809da87cc-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "87bc4a85-f705-4635-b8a2-594809da87cc" (UID: "87bc4a85-f705-4635-b8a2-594809da87cc"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 10:04:25 crc kubenswrapper[5002]: I1209 10:04:25.624081 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87bc4a85-f705-4635-b8a2-594809da87cc-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "87bc4a85-f705-4635-b8a2-594809da87cc" (UID: "87bc4a85-f705-4635-b8a2-594809da87cc"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:04:25 crc kubenswrapper[5002]: I1209 10:04:25.624567 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e224295a-f0bd-4501-8bee-6f02ae064e78-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e224295a-f0bd-4501-8bee-6f02ae064e78" (UID: "e224295a-f0bd-4501-8bee-6f02ae064e78"). InnerVolumeSpecName "kube-api-access". 
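
This UnmountVolume/TearDown burst is the volume manager reconciling after the pruner pods finished: volumes still present in the actual state of the world but no longer in the desired state get unmounted, then reported as detached (the reconciler_common.go:293 lines that follow). A toy Go sketch of that diff-and-unmount pattern; the types and names are illustrative only, not kubelet's volumemanager code:

```go
package main

import "fmt"

// reconcile sketches the desired-vs-actual volume loop implied by the
// log: anything mounted (actual) but no longer wanted (desired) is
// torn down. Illustrative pattern only, not kubelet's implementation.
func reconcile(desired, actual map[string]bool) []string {
	var unmounted []string
	for vol := range actual {
		if !desired[vol] {
			unmounted = append(unmounted, vol) // operationExecutor.UnmountVolume
		}
	}
	return unmounted
}

func main() {
	actual := map[string]bool{
		"e224295a/kube-api-access": true,
		"e224295a/kubelet-dir":     true,
	}
	desired := map[string]bool{} // revision-pruner-9-crc has terminated
	fmt.Println(reconcile(desired, actual))
}
```
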
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:04:25 crc kubenswrapper[5002]: I1209 10:04:25.720233 5002 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e224295a-f0bd-4501-8bee-6f02ae064e78-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 09 10:04:25 crc kubenswrapper[5002]: I1209 10:04:25.720277 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e224295a-f0bd-4501-8bee-6f02ae064e78-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 09 10:04:25 crc kubenswrapper[5002]: I1209 10:04:25.720290 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/87bc4a85-f705-4635-b8a2-594809da87cc-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 09 10:04:25 crc kubenswrapper[5002]: I1209 10:04:25.720301 5002 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/87bc4a85-f705-4635-b8a2-594809da87cc-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 09 10:04:25 crc kubenswrapper[5002]: I1209 10:04:25.735517 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lmx5d" podUID="bfc9ae06-2b91-4708-8ea6-e8e8789b2475" containerName="registry-server" probeResult="failure" output=< Dec 09 10:04:25 crc kubenswrapper[5002]: timeout: failed to connect service ":50051" within 1s Dec 09 10:04:25 crc kubenswrapper[5002]: > Dec 09 10:04:25 crc kubenswrapper[5002]: I1209 10:04:25.979018 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 09 10:04:25 crc kubenswrapper[5002]: W1209 10:04:25.987220 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod18877059_32eb_40f8_98c9_3cca1a075ec6.slice/crio-11482b11764290cc8ab2266eed16234bc4b2bb823e42208c76fc76ec9fa3223a WatchSource:0}: Error finding container 11482b11764290cc8ab2266eed16234bc4b2bb823e42208c76fc76ec9fa3223a: Status 404 returned error can't find the container with id 11482b11764290cc8ab2266eed16234bc4b2bb823e42208c76fc76ec9fa3223a Dec 09 10:04:26 crc kubenswrapper[5002]: I1209 10:04:26.058965 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"e224295a-f0bd-4501-8bee-6f02ae064e78","Type":"ContainerDied","Data":"976fb3b31baf0299ac78185c60a3355f7fbc659acc11b2c411134d7746e0ef52"} Dec 09 10:04:26 crc kubenswrapper[5002]: I1209 10:04:26.059273 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="976fb3b31baf0299ac78185c60a3355f7fbc659acc11b2c411134d7746e0ef52" Dec 09 10:04:26 crc kubenswrapper[5002]: I1209 10:04:26.058997 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 09 10:04:26 crc kubenswrapper[5002]: I1209 10:04:26.062459 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 09 10:04:26 crc kubenswrapper[5002]: I1209 10:04:26.067620 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"18877059-32eb-40f8-98c9-3cca1a075ec6","Type":"ContainerStarted","Data":"11482b11764290cc8ab2266eed16234bc4b2bb823e42208c76fc76ec9fa3223a"} Dec 09 10:04:26 crc kubenswrapper[5002]: I1209 10:04:26.067655 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"87bc4a85-f705-4635-b8a2-594809da87cc","Type":"ContainerDied","Data":"511c95ae0425b926d81f6ebb52c7e77be265c7ee60eda997ae1d9e99ea4d05da"} Dec 09 10:04:26 crc kubenswrapper[5002]: I1209 10:04:26.067669 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="511c95ae0425b926d81f6ebb52c7e77be265c7ee60eda997ae1d9e99ea4d05da" Dec 09 10:04:27 crc kubenswrapper[5002]: I1209 10:04:27.069598 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"18877059-32eb-40f8-98c9-3cca1a075ec6","Type":"ContainerStarted","Data":"22aa4a36fdaf89584ca58ac5e7551c46230411d4df2219a5d8baa691cc72b16c"} Dec 09 10:04:27 crc kubenswrapper[5002]: I1209 10:04:27.089255 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.089239034 podStartE2EDuration="2.089239034s" podCreationTimestamp="2025-12-09 10:04:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:04:27.085590166 +0000 UTC m=+199.477641247" watchObservedRunningTime="2025-12-09 10:04:27.089239034 +0000 UTC m=+199.481290115" Dec 09 10:04:27 crc kubenswrapper[5002]: I1209 10:04:27.714617 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9fjq6"] Dec 09 10:04:31 crc kubenswrapper[5002]: I1209 10:04:31.091099 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2t8ql" Dec 09 10:04:31 crc kubenswrapper[5002]: I1209 10:04:31.091525 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2t8ql" Dec 09 10:04:31 crc kubenswrapper[5002]: I1209 10:04:31.137029 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2t8ql" Dec 09 10:04:31 crc kubenswrapper[5002]: I1209 10:04:31.463240 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kk56c" Dec 09 10:04:31 crc kubenswrapper[5002]: I1209 10:04:31.463291 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kk56c" Dec 09 10:04:31 crc kubenswrapper[5002]: I1209 10:04:31.508919 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kk56c" Dec 09 10:04:32 crc kubenswrapper[5002]: I1209 10:04:32.131835 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2t8ql" Dec 09 10:04:32 crc kubenswrapper[5002]: I1209 10:04:32.133285 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kk56c" Dec 09 10:04:33 crc 
kubenswrapper[5002]: I1209 10:04:33.523652 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2dc2h" Dec 09 10:04:33 crc kubenswrapper[5002]: I1209 10:04:33.563252 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2dc2h" Dec 09 10:04:34 crc kubenswrapper[5002]: I1209 10:04:34.288632 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dlr44" Dec 09 10:04:34 crc kubenswrapper[5002]: I1209 10:04:34.288967 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dlr44" Dec 09 10:04:34 crc kubenswrapper[5002]: I1209 10:04:34.324001 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dlr44" Dec 09 10:04:34 crc kubenswrapper[5002]: I1209 10:04:34.568751 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kk56c"] Dec 09 10:04:34 crc kubenswrapper[5002]: I1209 10:04:34.569478 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kk56c" podUID="a788f35f-cb43-4452-a6e0-f7fb2d040d6f" containerName="registry-server" containerID="cri-o://2fd5de55329c978f6cdae85fc4a9b78284d145d496b9b7638fbf2b2c2828bd22" gracePeriod=2 Dec 09 10:04:34 crc kubenswrapper[5002]: I1209 10:04:34.736748 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lmx5d" Dec 09 10:04:34 crc kubenswrapper[5002]: I1209 10:04:34.778823 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lmx5d" Dec 09 10:04:35 crc kubenswrapper[5002]: I1209 10:04:35.169343 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dlr44" Dec 09 10:04:36 crc kubenswrapper[5002]: I1209 10:04:36.132098 5002 generic.go:334] "Generic (PLEG): container finished" podID="a788f35f-cb43-4452-a6e0-f7fb2d040d6f" containerID="2fd5de55329c978f6cdae85fc4a9b78284d145d496b9b7638fbf2b2c2828bd22" exitCode=0 Dec 09 10:04:36 crc kubenswrapper[5002]: I1209 10:04:36.132454 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kk56c" event={"ID":"a788f35f-cb43-4452-a6e0-f7fb2d040d6f","Type":"ContainerDied","Data":"2fd5de55329c978f6cdae85fc4a9b78284d145d496b9b7638fbf2b2c2828bd22"} Dec 09 10:04:36 crc kubenswrapper[5002]: I1209 10:04:36.368958 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2dc2h"] Dec 09 10:04:36 crc kubenswrapper[5002]: I1209 10:04:36.369198 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2dc2h" podUID="9c47a6b7-a5f5-4aea-aec6-a90e1c75859c" containerName="registry-server" containerID="cri-o://7575d4f5dda12d2d07e1bd94c1050a65e279b69669939ba1dd92a33cc88f2920" gracePeriod=2 Dec 09 10:04:36 crc kubenswrapper[5002]: I1209 10:04:36.557844 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kk56c" Dec 09 10:04:36 crc kubenswrapper[5002]: I1209 10:04:36.754062 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a788f35f-cb43-4452-a6e0-f7fb2d040d6f-utilities\") pod \"a788f35f-cb43-4452-a6e0-f7fb2d040d6f\" (UID: \"a788f35f-cb43-4452-a6e0-f7fb2d040d6f\") " Dec 09 10:04:36 crc kubenswrapper[5002]: I1209 10:04:36.754154 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a788f35f-cb43-4452-a6e0-f7fb2d040d6f-catalog-content\") pod \"a788f35f-cb43-4452-a6e0-f7fb2d040d6f\" (UID: \"a788f35f-cb43-4452-a6e0-f7fb2d040d6f\") " Dec 09 10:04:36 crc kubenswrapper[5002]: I1209 10:04:36.754237 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pr7m4\" (UniqueName: \"kubernetes.io/projected/a788f35f-cb43-4452-a6e0-f7fb2d040d6f-kube-api-access-pr7m4\") pod \"a788f35f-cb43-4452-a6e0-f7fb2d040d6f\" (UID: \"a788f35f-cb43-4452-a6e0-f7fb2d040d6f\") " Dec 09 10:04:36 crc kubenswrapper[5002]: I1209 10:04:36.757124 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a788f35f-cb43-4452-a6e0-f7fb2d040d6f-utilities" (OuterVolumeSpecName: "utilities") pod "a788f35f-cb43-4452-a6e0-f7fb2d040d6f" (UID: "a788f35f-cb43-4452-a6e0-f7fb2d040d6f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:04:36 crc kubenswrapper[5002]: I1209 10:04:36.761073 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a788f35f-cb43-4452-a6e0-f7fb2d040d6f-kube-api-access-pr7m4" (OuterVolumeSpecName: "kube-api-access-pr7m4") pod "a788f35f-cb43-4452-a6e0-f7fb2d040d6f" (UID: "a788f35f-cb43-4452-a6e0-f7fb2d040d6f"). InnerVolumeSpecName "kube-api-access-pr7m4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:04:36 crc kubenswrapper[5002]: I1209 10:04:36.855250 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pr7m4\" (UniqueName: \"kubernetes.io/projected/a788f35f-cb43-4452-a6e0-f7fb2d040d6f-kube-api-access-pr7m4\") on node \"crc\" DevicePath \"\"" Dec 09 10:04:36 crc kubenswrapper[5002]: I1209 10:04:36.855280 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a788f35f-cb43-4452-a6e0-f7fb2d040d6f-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 10:04:36 crc kubenswrapper[5002]: I1209 10:04:36.969385 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lmx5d"] Dec 09 10:04:36 crc kubenswrapper[5002]: I1209 10:04:36.969629 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lmx5d" podUID="bfc9ae06-2b91-4708-8ea6-e8e8789b2475" containerName="registry-server" containerID="cri-o://7f8454abdd12407ec6d82ba47e5acfdda1e1b42a2f80a8508390ffa13b15c4a2" gracePeriod=2 Dec 09 10:04:37 crc kubenswrapper[5002]: I1209 10:04:37.027871 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a788f35f-cb43-4452-a6e0-f7fb2d040d6f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a788f35f-cb43-4452-a6e0-f7fb2d040d6f" (UID: "a788f35f-cb43-4452-a6e0-f7fb2d040d6f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:04:37 crc kubenswrapper[5002]: I1209 10:04:37.057288 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a788f35f-cb43-4452-a6e0-f7fb2d040d6f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 10:04:37 crc kubenswrapper[5002]: I1209 10:04:37.138470 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kk56c" Dec 09 10:04:37 crc kubenswrapper[5002]: I1209 10:04:37.138496 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kk56c" event={"ID":"a788f35f-cb43-4452-a6e0-f7fb2d040d6f","Type":"ContainerDied","Data":"ac1f0c776ff75b2b74398e76a07d08fbc7aca29be4a79d21624d6b080f4f2ce3"} Dec 09 10:04:37 crc kubenswrapper[5002]: I1209 10:04:37.138555 5002 scope.go:117] "RemoveContainer" containerID="2fd5de55329c978f6cdae85fc4a9b78284d145d496b9b7638fbf2b2c2828bd22" Dec 09 10:04:37 crc kubenswrapper[5002]: I1209 10:04:37.140596 5002 generic.go:334] "Generic (PLEG): container finished" podID="9c47a6b7-a5f5-4aea-aec6-a90e1c75859c" containerID="7575d4f5dda12d2d07e1bd94c1050a65e279b69669939ba1dd92a33cc88f2920" exitCode=0 Dec 09 10:04:37 crc kubenswrapper[5002]: I1209 10:04:37.140628 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2dc2h" event={"ID":"9c47a6b7-a5f5-4aea-aec6-a90e1c75859c","Type":"ContainerDied","Data":"7575d4f5dda12d2d07e1bd94c1050a65e279b69669939ba1dd92a33cc88f2920"} Dec 09 10:04:37 crc kubenswrapper[5002]: I1209 10:04:37.162085 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kk56c"] Dec 09 10:04:37 crc kubenswrapper[5002]: I1209 10:04:37.171566 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kk56c"] Dec 09 10:04:37 crc kubenswrapper[5002]: I1209 10:04:37.965176 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 10:04:37 crc kubenswrapper[5002]: I1209 10:04:37.965240 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 10:04:37 crc kubenswrapper[5002]: I1209 10:04:37.965286 5002 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" Dec 09 10:04:37 crc kubenswrapper[5002]: I1209 10:04:37.965863 5002 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"44da8a223abf131b459b827b0e8de65b415150f406fe22f2efb7e160cba4166c"} pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 10:04:37 crc kubenswrapper[5002]: I1209 10:04:37.965980 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" 
podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" containerID="cri-o://44da8a223abf131b459b827b0e8de65b415150f406fe22f2efb7e160cba4166c" gracePeriod=600 Dec 09 10:04:38 crc kubenswrapper[5002]: I1209 10:04:38.067072 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a788f35f-cb43-4452-a6e0-f7fb2d040d6f" path="/var/lib/kubelet/pods/a788f35f-cb43-4452-a6e0-f7fb2d040d6f/volumes" Dec 09 10:04:39 crc kubenswrapper[5002]: I1209 10:04:39.152782 5002 generic.go:334] "Generic (PLEG): container finished" podID="bfc9ae06-2b91-4708-8ea6-e8e8789b2475" containerID="7f8454abdd12407ec6d82ba47e5acfdda1e1b42a2f80a8508390ffa13b15c4a2" exitCode=0 Dec 09 10:04:39 crc kubenswrapper[5002]: I1209 10:04:39.152864 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lmx5d" event={"ID":"bfc9ae06-2b91-4708-8ea6-e8e8789b2475","Type":"ContainerDied","Data":"7f8454abdd12407ec6d82ba47e5acfdda1e1b42a2f80a8508390ffa13b15c4a2"} Dec 09 10:04:39 crc kubenswrapper[5002]: I1209 10:04:39.154755 5002 generic.go:334] "Generic (PLEG): container finished" podID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerID="44da8a223abf131b459b827b0e8de65b415150f406fe22f2efb7e160cba4166c" exitCode=0 Dec 09 10:04:39 crc kubenswrapper[5002]: I1209 10:04:39.154778 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerDied","Data":"44da8a223abf131b459b827b0e8de65b415150f406fe22f2efb7e160cba4166c"} Dec 09 10:04:40 crc kubenswrapper[5002]: I1209 10:04:40.735494 5002 scope.go:117] "RemoveContainer" containerID="37b655e008a9bc2e0d5a4e8059ee9d73c154c07c7b26ca95dbe9721ec5b0af5d" Dec 09 10:04:40 crc kubenswrapper[5002]: I1209 10:04:40.760691 5002 util.go:48] "No ready sandbox for pod can be found. 
Dec 09 10:04:40 crc kubenswrapper[5002]: I1209 10:04:40.816316 5002 scope.go:117] "RemoveContainer" containerID="b2b398af59565a2a35760cf022c3e26366a729d6a1a2b37fbb89cf5f677d38f2"
Dec 09 10:04:40 crc kubenswrapper[5002]: I1209 10:04:40.937988 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c47a6b7-a5f5-4aea-aec6-a90e1c75859c-catalog-content\") pod \"9c47a6b7-a5f5-4aea-aec6-a90e1c75859c\" (UID: \"9c47a6b7-a5f5-4aea-aec6-a90e1c75859c\") "
Dec 09 10:04:40 crc kubenswrapper[5002]: I1209 10:04:40.938414 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c47a6b7-a5f5-4aea-aec6-a90e1c75859c-utilities\") pod \"9c47a6b7-a5f5-4aea-aec6-a90e1c75859c\" (UID: \"9c47a6b7-a5f5-4aea-aec6-a90e1c75859c\") "
Dec 09 10:04:40 crc kubenswrapper[5002]: I1209 10:04:40.938478 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8zrj\" (UniqueName: \"kubernetes.io/projected/9c47a6b7-a5f5-4aea-aec6-a90e1c75859c-kube-api-access-b8zrj\") pod \"9c47a6b7-a5f5-4aea-aec6-a90e1c75859c\" (UID: \"9c47a6b7-a5f5-4aea-aec6-a90e1c75859c\") "
Dec 09 10:04:40 crc kubenswrapper[5002]: I1209 10:04:40.943249 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c47a6b7-a5f5-4aea-aec6-a90e1c75859c-utilities" (OuterVolumeSpecName: "utilities") pod "9c47a6b7-a5f5-4aea-aec6-a90e1c75859c" (UID: "9c47a6b7-a5f5-4aea-aec6-a90e1c75859c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 10:04:40 crc kubenswrapper[5002]: I1209 10:04:40.953631 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c47a6b7-a5f5-4aea-aec6-a90e1c75859c-kube-api-access-b8zrj" (OuterVolumeSpecName: "kube-api-access-b8zrj") pod "9c47a6b7-a5f5-4aea-aec6-a90e1c75859c" (UID: "9c47a6b7-a5f5-4aea-aec6-a90e1c75859c"). InnerVolumeSpecName "kube-api-access-b8zrj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 10:04:40 crc kubenswrapper[5002]: I1209 10:04:40.970946 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c47a6b7-a5f5-4aea-aec6-a90e1c75859c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9c47a6b7-a5f5-4aea-aec6-a90e1c75859c" (UID: "9c47a6b7-a5f5-4aea-aec6-a90e1c75859c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 10:04:40 crc kubenswrapper[5002]: I1209 10:04:40.972472 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lmx5d"
Dec 09 10:04:41 crc kubenswrapper[5002]: I1209 10:04:41.039963 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c47a6b7-a5f5-4aea-aec6-a90e1c75859c-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 09 10:04:41 crc kubenswrapper[5002]: I1209 10:04:41.040336 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c47a6b7-a5f5-4aea-aec6-a90e1c75859c-utilities\") on node \"crc\" DevicePath \"\""
Dec 09 10:04:41 crc kubenswrapper[5002]: I1209 10:04:41.040476 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8zrj\" (UniqueName: \"kubernetes.io/projected/9c47a6b7-a5f5-4aea-aec6-a90e1c75859c-kube-api-access-b8zrj\") on node \"crc\" DevicePath \"\""
Dec 09 10:04:41 crc kubenswrapper[5002]: I1209 10:04:41.141733 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xs2wc\" (UniqueName: \"kubernetes.io/projected/bfc9ae06-2b91-4708-8ea6-e8e8789b2475-kube-api-access-xs2wc\") pod \"bfc9ae06-2b91-4708-8ea6-e8e8789b2475\" (UID: \"bfc9ae06-2b91-4708-8ea6-e8e8789b2475\") "
Dec 09 10:04:41 crc kubenswrapper[5002]: I1209 10:04:41.141782 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfc9ae06-2b91-4708-8ea6-e8e8789b2475-utilities\") pod \"bfc9ae06-2b91-4708-8ea6-e8e8789b2475\" (UID: \"bfc9ae06-2b91-4708-8ea6-e8e8789b2475\") "
Dec 09 10:04:41 crc kubenswrapper[5002]: I1209 10:04:41.141841 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfc9ae06-2b91-4708-8ea6-e8e8789b2475-catalog-content\") pod \"bfc9ae06-2b91-4708-8ea6-e8e8789b2475\" (UID: \"bfc9ae06-2b91-4708-8ea6-e8e8789b2475\") "
Dec 09 10:04:41 crc kubenswrapper[5002]: I1209 10:04:41.144363 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfc9ae06-2b91-4708-8ea6-e8e8789b2475-utilities" (OuterVolumeSpecName: "utilities") pod "bfc9ae06-2b91-4708-8ea6-e8e8789b2475" (UID: "bfc9ae06-2b91-4708-8ea6-e8e8789b2475"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 10:04:41 crc kubenswrapper[5002]: I1209 10:04:41.146256 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfc9ae06-2b91-4708-8ea6-e8e8789b2475-kube-api-access-xs2wc" (OuterVolumeSpecName: "kube-api-access-xs2wc") pod "bfc9ae06-2b91-4708-8ea6-e8e8789b2475" (UID: "bfc9ae06-2b91-4708-8ea6-e8e8789b2475"). InnerVolumeSpecName "kube-api-access-xs2wc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 10:04:41 crc kubenswrapper[5002]: I1209 10:04:41.167457 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2dc2h" event={"ID":"9c47a6b7-a5f5-4aea-aec6-a90e1c75859c","Type":"ContainerDied","Data":"882f19544ed280a07c75e5afab55a47dc4cea53af7a9a5be15fb1143f3531965"}
Dec 09 10:04:41 crc kubenswrapper[5002]: I1209 10:04:41.167494 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2dc2h"
Dec 09 10:04:41 crc kubenswrapper[5002]: I1209 10:04:41.167514 5002 scope.go:117] "RemoveContainer" containerID="7575d4f5dda12d2d07e1bd94c1050a65e279b69669939ba1dd92a33cc88f2920"
Dec 09 10:04:41 crc kubenswrapper[5002]: I1209 10:04:41.171207 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zsnb5" event={"ID":"a3c9ac96-2913-4efd-b800-4f5c18dc08ab","Type":"ContainerStarted","Data":"42ccb0e24cab331f4554a70440d115beb6f8e70d844555549f7859ec42602e16"}
Dec 09 10:04:41 crc kubenswrapper[5002]: I1209 10:04:41.193891 5002 scope.go:117] "RemoveContainer" containerID="931a5b1abd320b957b303805787534e3557adaaba28e5eaa4dda68e4a762ccc7"
Dec 09 10:04:41 crc kubenswrapper[5002]: I1209 10:04:41.195738 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lwx7v" event={"ID":"2c7271b9-c21f-44a0-ad27-f4158f36f317","Type":"ContainerStarted","Data":"1a1703fc4497ed500211349bc3d5d1221e45f4a6555a6dc38d5aa7bce849cf17"}
Dec 09 10:04:41 crc kubenswrapper[5002]: I1209 10:04:41.220782 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2dc2h"]
Dec 09 10:04:41 crc kubenswrapper[5002]: I1209 10:04:41.222972 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lmx5d" event={"ID":"bfc9ae06-2b91-4708-8ea6-e8e8789b2475","Type":"ContainerDied","Data":"9570f3bf8a1cde7ea5598683447e729186333218d9d66b59b5b73e4d237e54dc"}
Dec 09 10:04:41 crc kubenswrapper[5002]: I1209 10:04:41.223249 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lmx5d"
Dec 09 10:04:41 crc kubenswrapper[5002]: I1209 10:04:41.224907 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2dc2h"]
Dec 09 10:04:41 crc kubenswrapper[5002]: I1209 10:04:41.225617 5002 generic.go:334] "Generic (PLEG): container finished" podID="038508b9-13ca-43a3-8414-09097229bc83" containerID="428ae6ca9b62f545763de424679882ca418bb5328308202cbe5e76a963b7548f" exitCode=0
Dec 09 10:04:41 crc kubenswrapper[5002]: I1209 10:04:41.225678 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ptdnb" event={"ID":"038508b9-13ca-43a3-8414-09097229bc83","Type":"ContainerDied","Data":"428ae6ca9b62f545763de424679882ca418bb5328308202cbe5e76a963b7548f"}
Dec 09 10:04:41 crc kubenswrapper[5002]: I1209 10:04:41.232473 5002 scope.go:117] "RemoveContainer" containerID="29fd90e99a76aff7f225c49dd389222f67aa6e33cc01f9d10a746de39da1e858"
Dec 09 10:04:41 crc kubenswrapper[5002]: I1209 10:04:41.234012 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerStarted","Data":"a8e4e15cd2cad8dac6467aa0019df2b17c907b359be99120bdb37a1c05087210"}
Dec 09 10:04:41 crc kubenswrapper[5002]: I1209 10:04:41.242964 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xs2wc\" (UniqueName: \"kubernetes.io/projected/bfc9ae06-2b91-4708-8ea6-e8e8789b2475-kube-api-access-xs2wc\") on node \"crc\" DevicePath \"\""
Dec 09 10:04:41 crc kubenswrapper[5002]: I1209 10:04:41.242991 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfc9ae06-2b91-4708-8ea6-e8e8789b2475-utilities\") on node \"crc\" DevicePath \"\""
Dec 09 10:04:41 crc kubenswrapper[5002]: I1209 10:04:41.255258 5002 scope.go:117] "RemoveContainer" containerID="7f8454abdd12407ec6d82ba47e5acfdda1e1b42a2f80a8508390ffa13b15c4a2"
Dec 09 10:04:41 crc kubenswrapper[5002]: I1209 10:04:41.313433 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfc9ae06-2b91-4708-8ea6-e8e8789b2475-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bfc9ae06-2b91-4708-8ea6-e8e8789b2475" (UID: "bfc9ae06-2b91-4708-8ea6-e8e8789b2475"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 10:04:41 crc kubenswrapper[5002]: I1209 10:04:41.344605 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfc9ae06-2b91-4708-8ea6-e8e8789b2475-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 09 10:04:41 crc kubenswrapper[5002]: I1209 10:04:41.385613 5002 scope.go:117] "RemoveContainer" containerID="7c113c8ab1b12477f5480bda4f3129627ce4a536fac90d2204ec5560ffb648b1"
Dec 09 10:04:41 crc kubenswrapper[5002]: I1209 10:04:41.402647 5002 scope.go:117] "RemoveContainer" containerID="7d71c5dd30974758cf12343414a0705a13b36009432ec1e7bb6d946e865ffa4b"
Dec 09 10:04:41 crc kubenswrapper[5002]: I1209 10:04:41.549024 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lmx5d"]
Dec 09 10:04:41 crc kubenswrapper[5002]: I1209 10:04:41.553085 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lmx5d"]
Dec 09 10:04:42 crc kubenswrapper[5002]: I1209 10:04:42.068974 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c47a6b7-a5f5-4aea-aec6-a90e1c75859c" path="/var/lib/kubelet/pods/9c47a6b7-a5f5-4aea-aec6-a90e1c75859c/volumes"
Dec 09 10:04:42 crc kubenswrapper[5002]: I1209 10:04:42.069550 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfc9ae06-2b91-4708-8ea6-e8e8789b2475" path="/var/lib/kubelet/pods/bfc9ae06-2b91-4708-8ea6-e8e8789b2475/volumes"
Dec 09 10:04:42 crc kubenswrapper[5002]: I1209 10:04:42.240523 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ptdnb" event={"ID":"038508b9-13ca-43a3-8414-09097229bc83","Type":"ContainerStarted","Data":"5bc033d4dc59fccb04739a55fc369dcb7d6798a60c7fb76563249e7f9bb248aa"}
Dec 09 10:04:42 crc kubenswrapper[5002]: I1209 10:04:42.243077 5002 generic.go:334] "Generic (PLEG): container finished" podID="a3c9ac96-2913-4efd-b800-4f5c18dc08ab" containerID="42ccb0e24cab331f4554a70440d115beb6f8e70d844555549f7859ec42602e16" exitCode=0
Dec 09 10:04:42 crc kubenswrapper[5002]: I1209 10:04:42.243148 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zsnb5" event={"ID":"a3c9ac96-2913-4efd-b800-4f5c18dc08ab","Type":"ContainerDied","Data":"42ccb0e24cab331f4554a70440d115beb6f8e70d844555549f7859ec42602e16"}
Dec 09 10:04:42 crc kubenswrapper[5002]: I1209 10:04:42.245893 5002 generic.go:334] "Generic (PLEG): container finished" podID="2c7271b9-c21f-44a0-ad27-f4158f36f317" containerID="1a1703fc4497ed500211349bc3d5d1221e45f4a6555a6dc38d5aa7bce849cf17" exitCode=0
Dec 09 10:04:42 crc kubenswrapper[5002]: I1209 10:04:42.245950 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lwx7v" event={"ID":"2c7271b9-c21f-44a0-ad27-f4158f36f317","Type":"ContainerDied","Data":"1a1703fc4497ed500211349bc3d5d1221e45f4a6555a6dc38d5aa7bce849cf17"}
Dec 09 10:04:42 crc kubenswrapper[5002]: I1209 10:04:42.263883 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ptdnb" podStartSLOduration=3.261009565 podStartE2EDuration="1m0.263866546s" podCreationTimestamp="2025-12-09 10:03:42 +0000 UTC" firstStartedPulling="2025-12-09 10:03:44.603001355 +0000 UTC m=+156.995052436" lastFinishedPulling="2025-12-09 10:04:41.605858336 +0000 UTC m=+213.997909417" observedRunningTime="2025-12-09 10:04:42.259904469 +0000 UTC m=+214.651955550" watchObservedRunningTime="2025-12-09 10:04:42.263866546 +0000 UTC m=+214.655917647"
Dec 09 10:04:43 crc kubenswrapper[5002]: I1209 10:04:43.064169 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ptdnb"
Dec 09 10:04:43 crc kubenswrapper[5002]: I1209 10:04:43.064498 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ptdnb"
Dec 09 10:04:43 crc kubenswrapper[5002]: I1209 10:04:43.106011 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ptdnb"
Dec 09 10:04:43 crc kubenswrapper[5002]: I1209 10:04:43.257080 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zsnb5" event={"ID":"a3c9ac96-2913-4efd-b800-4f5c18dc08ab","Type":"ContainerStarted","Data":"21f84c02a592057ba6c728094afa66d57af3215208d4c22360ca663b21f84bdb"}
Dec 09 10:04:43 crc kubenswrapper[5002]: I1209 10:04:43.273312 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zsnb5" podStartSLOduration=1.860032516 podStartE2EDuration="1m2.273297603s" podCreationTimestamp="2025-12-09 10:03:41 +0000 UTC" firstStartedPulling="2025-12-09 10:03:42.50257441 +0000 UTC m=+154.894625491" lastFinishedPulling="2025-12-09 10:04:42.915839497 +0000 UTC m=+215.307890578" observedRunningTime="2025-12-09 10:04:43.270761908 +0000 UTC m=+215.662812989" watchObservedRunningTime="2025-12-09 10:04:43.273297603 +0000 UTC m=+215.665348684"
Dec 09 10:04:44 crc kubenswrapper[5002]: I1209 10:04:44.263374 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lwx7v" event={"ID":"2c7271b9-c21f-44a0-ad27-f4158f36f317","Type":"ContainerStarted","Data":"df7dbdd05bab8cb0e7ee78228042d387e0d50f88fe6661c9e15e153ba91180da"}
Dec 09 10:04:44 crc kubenswrapper[5002]: I1209 10:04:44.282820 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lwx7v" podStartSLOduration=3.707251468 podStartE2EDuration="1m4.282799772s" podCreationTimestamp="2025-12-09 10:03:40 +0000 UTC" firstStartedPulling="2025-12-09 10:03:42.507189982 +0000 UTC m=+154.899241063" lastFinishedPulling="2025-12-09 10:04:43.082738286 +0000 UTC m=+215.474789367" observedRunningTime="2025-12-09 10:04:44.279762522 +0000 UTC m=+216.671813603" watchObservedRunningTime="2025-12-09 10:04:44.282799772 +0000 UTC m=+216.674850853"
Dec 09 10:04:51 crc kubenswrapper[5002]: I1209 10:04:51.253859 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lwx7v"
Dec 09 10:04:51 crc kubenswrapper[5002]: I1209 10:04:51.254385 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lwx7v"
Dec 09 10:04:51 crc kubenswrapper[5002]: I1209 10:04:51.298900 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lwx7v"
Dec 09 10:04:51 crc kubenswrapper[5002]: I1209 10:04:51.340464 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lwx7v"
Dec 09 10:04:51 crc kubenswrapper[5002]: I1209 10:04:51.715471 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zsnb5"
Dec 09 10:04:51 crc kubenswrapper[5002]: I1209 10:04:51.715555 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zsnb5"
Dec 09 10:04:51 crc kubenswrapper[5002]: I1209 10:04:51.755481 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zsnb5"
Dec 09 10:04:52 crc kubenswrapper[5002]: I1209 10:04:52.336292 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zsnb5"
Dec 09 10:04:52 crc kubenswrapper[5002]: I1209 10:04:52.756625 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6" podUID="e2598866-d004-41a0-b058-01fb9a379df5" containerName="oauth-openshift" containerID="cri-o://34ade61866cdd6a700bff8d377b79d4e7de7f055a5300de170e234c2d76a07ce" gracePeriod=15
Dec 09 10:04:52 crc kubenswrapper[5002]: I1209 10:04:52.911669 5002 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-9fjq6 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" start-of-body=
Dec 09 10:04:52 crc kubenswrapper[5002]: I1209 10:04:52.911729 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6" podUID="e2598866-d004-41a0-b058-01fb9a379df5" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused"
Dec 09 10:04:53 crc kubenswrapper[5002]: I1209 10:04:53.099458 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ptdnb"
Dec 09 10:04:53 crc kubenswrapper[5002]: I1209 10:04:53.926511 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zsnb5"]
Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.238675 5002 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.290375 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8"] Dec 09 10:04:54 crc kubenswrapper[5002]: E1209 10:04:54.290569 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfc9ae06-2b91-4708-8ea6-e8e8789b2475" containerName="extract-content" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.290581 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfc9ae06-2b91-4708-8ea6-e8e8789b2475" containerName="extract-content" Dec 09 10:04:54 crc kubenswrapper[5002]: E1209 10:04:54.290588 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfc9ae06-2b91-4708-8ea6-e8e8789b2475" containerName="extract-utilities" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.290595 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfc9ae06-2b91-4708-8ea6-e8e8789b2475" containerName="extract-utilities" Dec 09 10:04:54 crc kubenswrapper[5002]: E1209 10:04:54.290602 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2598866-d004-41a0-b058-01fb9a379df5" containerName="oauth-openshift" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.290608 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2598866-d004-41a0-b058-01fb9a379df5" containerName="oauth-openshift" Dec 09 10:04:54 crc kubenswrapper[5002]: E1209 10:04:54.290617 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c47a6b7-a5f5-4aea-aec6-a90e1c75859c" containerName="registry-server" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.290623 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c47a6b7-a5f5-4aea-aec6-a90e1c75859c" containerName="registry-server" Dec 09 10:04:54 crc kubenswrapper[5002]: E1209 10:04:54.290632 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a788f35f-cb43-4452-a6e0-f7fb2d040d6f" containerName="extract-content" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.290638 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="a788f35f-cb43-4452-a6e0-f7fb2d040d6f" containerName="extract-content" Dec 09 10:04:54 crc kubenswrapper[5002]: E1209 10:04:54.290645 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a788f35f-cb43-4452-a6e0-f7fb2d040d6f" containerName="registry-server" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.290653 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="a788f35f-cb43-4452-a6e0-f7fb2d040d6f" containerName="registry-server" Dec 09 10:04:54 crc kubenswrapper[5002]: E1209 10:04:54.290662 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a788f35f-cb43-4452-a6e0-f7fb2d040d6f" containerName="extract-utilities" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.290668 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="a788f35f-cb43-4452-a6e0-f7fb2d040d6f" containerName="extract-utilities" Dec 09 10:04:54 crc kubenswrapper[5002]: E1209 10:04:54.290679 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87bc4a85-f705-4635-b8a2-594809da87cc" containerName="pruner" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.290685 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="87bc4a85-f705-4635-b8a2-594809da87cc" containerName="pruner" Dec 09 10:04:54 crc kubenswrapper[5002]: E1209 10:04:54.290692 5002 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="bfc9ae06-2b91-4708-8ea6-e8e8789b2475" containerName="registry-server" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.290697 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfc9ae06-2b91-4708-8ea6-e8e8789b2475" containerName="registry-server" Dec 09 10:04:54 crc kubenswrapper[5002]: E1209 10:04:54.290707 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e224295a-f0bd-4501-8bee-6f02ae064e78" containerName="pruner" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.290712 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="e224295a-f0bd-4501-8bee-6f02ae064e78" containerName="pruner" Dec 09 10:04:54 crc kubenswrapper[5002]: E1209 10:04:54.290721 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c47a6b7-a5f5-4aea-aec6-a90e1c75859c" containerName="extract-content" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.290727 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c47a6b7-a5f5-4aea-aec6-a90e1c75859c" containerName="extract-content" Dec 09 10:04:54 crc kubenswrapper[5002]: E1209 10:04:54.290733 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c47a6b7-a5f5-4aea-aec6-a90e1c75859c" containerName="extract-utilities" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.290739 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c47a6b7-a5f5-4aea-aec6-a90e1c75859c" containerName="extract-utilities" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.290838 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="87bc4a85-f705-4635-b8a2-594809da87cc" containerName="pruner" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.290849 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c47a6b7-a5f5-4aea-aec6-a90e1c75859c" containerName="registry-server" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.290861 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="a788f35f-cb43-4452-a6e0-f7fb2d040d6f" containerName="registry-server" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.290868 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="e224295a-f0bd-4501-8bee-6f02ae064e78" containerName="pruner" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.290875 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfc9ae06-2b91-4708-8ea6-e8e8789b2475" containerName="registry-server" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.290884 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2598866-d004-41a0-b058-01fb9a379df5" containerName="oauth-openshift" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.291200 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.345230 5002 generic.go:334] "Generic (PLEG): container finished" podID="e2598866-d004-41a0-b058-01fb9a379df5" containerID="34ade61866cdd6a700bff8d377b79d4e7de7f055a5300de170e234c2d76a07ce" exitCode=0 Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.345336 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.345386 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6" event={"ID":"e2598866-d004-41a0-b058-01fb9a379df5","Type":"ContainerDied","Data":"34ade61866cdd6a700bff8d377b79d4e7de7f055a5300de170e234c2d76a07ce"} Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.345427 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-9fjq6" event={"ID":"e2598866-d004-41a0-b058-01fb9a379df5","Type":"ContainerDied","Data":"567fbf93663e94a933f79457f9afaa944177b75e3ae16dfb87fc19fa0e90f5af"} Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.345444 5002 scope.go:117] "RemoveContainer" containerID="34ade61866cdd6a700bff8d377b79d4e7de7f055a5300de170e234c2d76a07ce" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.345472 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zsnb5" podUID="a3c9ac96-2913-4efd-b800-4f5c18dc08ab" containerName="registry-server" containerID="cri-o://21f84c02a592057ba6c728094afa66d57af3215208d4c22360ca663b21f84bdb" gracePeriod=2 Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.369938 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8"] Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.391966 5002 scope.go:117] "RemoveContainer" containerID="34ade61866cdd6a700bff8d377b79d4e7de7f055a5300de170e234c2d76a07ce" Dec 09 10:04:54 crc kubenswrapper[5002]: E1209 10:04:54.395851 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34ade61866cdd6a700bff8d377b79d4e7de7f055a5300de170e234c2d76a07ce\": container with ID starting with 34ade61866cdd6a700bff8d377b79d4e7de7f055a5300de170e234c2d76a07ce not found: ID does not exist" containerID="34ade61866cdd6a700bff8d377b79d4e7de7f055a5300de170e234c2d76a07ce" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.395893 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34ade61866cdd6a700bff8d377b79d4e7de7f055a5300de170e234c2d76a07ce"} err="failed to get container status \"34ade61866cdd6a700bff8d377b79d4e7de7f055a5300de170e234c2d76a07ce\": rpc error: code = NotFound desc = could not find container \"34ade61866cdd6a700bff8d377b79d4e7de7f055a5300de170e234c2d76a07ce\": container with ID starting with 34ade61866cdd6a700bff8d377b79d4e7de7f055a5300de170e234c2d76a07ce not found: ID does not exist" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.411301 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jr4w7\" (UniqueName: \"kubernetes.io/projected/e2598866-d004-41a0-b058-01fb9a379df5-kube-api-access-jr4w7\") pod \"e2598866-d004-41a0-b058-01fb9a379df5\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.411339 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e2598866-d004-41a0-b058-01fb9a379df5-audit-policies\") pod \"e2598866-d004-41a0-b058-01fb9a379df5\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.411369 5002 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-system-service-ca\") pod \"e2598866-d004-41a0-b058-01fb9a379df5\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.411389 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-user-template-provider-selection\") pod \"e2598866-d004-41a0-b058-01fb9a379df5\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.411407 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-user-template-login\") pod \"e2598866-d004-41a0-b058-01fb9a379df5\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.411429 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-system-cliconfig\") pod \"e2598866-d004-41a0-b058-01fb9a379df5\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.411471 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-user-template-error\") pod \"e2598866-d004-41a0-b058-01fb9a379df5\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.411494 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-system-router-certs\") pod \"e2598866-d004-41a0-b058-01fb9a379df5\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.411510 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-system-trusted-ca-bundle\") pod \"e2598866-d004-41a0-b058-01fb9a379df5\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.411535 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-system-session\") pod \"e2598866-d004-41a0-b058-01fb9a379df5\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.411561 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-system-serving-cert\") pod \"e2598866-d004-41a0-b058-01fb9a379df5\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.411592 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e2598866-d004-41a0-b058-01fb9a379df5-audit-dir\") pod \"e2598866-d004-41a0-b058-01fb9a379df5\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.411619 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-user-idp-0-file-data\") pod \"e2598866-d004-41a0-b058-01fb9a379df5\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.411645 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-system-ocp-branding-template\") pod \"e2598866-d004-41a0-b058-01fb9a379df5\" (UID: \"e2598866-d004-41a0-b058-01fb9a379df5\") " Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.411756 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6bbf4c9fdf-vngt8\" (UID: \"4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.411777 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61-v4-0-config-user-template-login\") pod \"oauth-openshift-6bbf4c9fdf-vngt8\" (UID: \"4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.411804 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61-audit-dir\") pod \"oauth-openshift-6bbf4c9fdf-vngt8\" (UID: \"4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.411842 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61-audit-policies\") pod \"oauth-openshift-6bbf4c9fdf-vngt8\" (UID: \"4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.411858 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6bbf4c9fdf-vngt8\" (UID: \"4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.411891 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61-v4-0-config-system-service-ca\") pod \"oauth-openshift-6bbf4c9fdf-vngt8\" (UID: \"4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.411906 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61-v4-0-config-user-template-error\") pod \"oauth-openshift-6bbf4c9fdf-vngt8\" (UID: \"4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.411932 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6bbf4c9fdf-vngt8\" (UID: \"4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.411952 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61-v4-0-config-system-router-certs\") pod \"oauth-openshift-6bbf4c9fdf-vngt8\" (UID: \"4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.411971 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6bbf4c9fdf-vngt8\" (UID: \"4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.411985 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5fd4\" (UniqueName: \"kubernetes.io/projected/4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61-kube-api-access-b5fd4\") pod \"oauth-openshift-6bbf4c9fdf-vngt8\" (UID: \"4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.412004 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61-v4-0-config-system-session\") pod \"oauth-openshift-6bbf4c9fdf-vngt8\" (UID: \"4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.412022 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6bbf4c9fdf-vngt8\" (UID: \"4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.412043 5002 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6bbf4c9fdf-vngt8\" (UID: \"4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.412130 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "e2598866-d004-41a0-b058-01fb9a379df5" (UID: "e2598866-d004-41a0-b058-01fb9a379df5"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.417886 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "e2598866-d004-41a0-b058-01fb9a379df5" (UID: "e2598866-d004-41a0-b058-01fb9a379df5"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.418480 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2598866-d004-41a0-b058-01fb9a379df5-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "e2598866-d004-41a0-b058-01fb9a379df5" (UID: "e2598866-d004-41a0-b058-01fb9a379df5"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.421893 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2598866-d004-41a0-b058-01fb9a379df5-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "e2598866-d004-41a0-b058-01fb9a379df5" (UID: "e2598866-d004-41a0-b058-01fb9a379df5"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.422516 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "e2598866-d004-41a0-b058-01fb9a379df5" (UID: "e2598866-d004-41a0-b058-01fb9a379df5"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.422697 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "e2598866-d004-41a0-b058-01fb9a379df5" (UID: "e2598866-d004-41a0-b058-01fb9a379df5"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.429977 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2598866-d004-41a0-b058-01fb9a379df5-kube-api-access-jr4w7" (OuterVolumeSpecName: "kube-api-access-jr4w7") pod "e2598866-d004-41a0-b058-01fb9a379df5" (UID: "e2598866-d004-41a0-b058-01fb9a379df5"). InnerVolumeSpecName "kube-api-access-jr4w7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.430520 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "e2598866-d004-41a0-b058-01fb9a379df5" (UID: "e2598866-d004-41a0-b058-01fb9a379df5"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.430691 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "e2598866-d004-41a0-b058-01fb9a379df5" (UID: "e2598866-d004-41a0-b058-01fb9a379df5"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.430979 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "e2598866-d004-41a0-b058-01fb9a379df5" (UID: "e2598866-d004-41a0-b058-01fb9a379df5"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.430990 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "e2598866-d004-41a0-b058-01fb9a379df5" (UID: "e2598866-d004-41a0-b058-01fb9a379df5"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.432756 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "e2598866-d004-41a0-b058-01fb9a379df5" (UID: "e2598866-d004-41a0-b058-01fb9a379df5"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.432846 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "e2598866-d004-41a0-b058-01fb9a379df5" (UID: "e2598866-d004-41a0-b058-01fb9a379df5"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.433430 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "e2598866-d004-41a0-b058-01fb9a379df5" (UID: "e2598866-d004-41a0-b058-01fb9a379df5"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.513304 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6bbf4c9fdf-vngt8\" (UID: \"4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.513369 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61-v4-0-config-system-router-certs\") pod \"oauth-openshift-6bbf4c9fdf-vngt8\" (UID: \"4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.513399 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6bbf4c9fdf-vngt8\" (UID: \"4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.513422 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5fd4\" (UniqueName: \"kubernetes.io/projected/4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61-kube-api-access-b5fd4\") pod \"oauth-openshift-6bbf4c9fdf-vngt8\" (UID: \"4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.513479 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61-v4-0-config-system-session\") pod \"oauth-openshift-6bbf4c9fdf-vngt8\" (UID: \"4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.513508 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6bbf4c9fdf-vngt8\" (UID: \"4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.513542 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6bbf4c9fdf-vngt8\" (UID: 
\"4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.513570 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6bbf4c9fdf-vngt8\" (UID: \"4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.513594 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61-v4-0-config-user-template-login\") pod \"oauth-openshift-6bbf4c9fdf-vngt8\" (UID: \"4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.513686 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61-audit-dir\") pod \"oauth-openshift-6bbf4c9fdf-vngt8\" (UID: \"4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.513716 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61-audit-policies\") pod \"oauth-openshift-6bbf4c9fdf-vngt8\" (UID: \"4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.513739 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6bbf4c9fdf-vngt8\" (UID: \"4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.513785 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61-v4-0-config-system-service-ca\") pod \"oauth-openshift-6bbf4c9fdf-vngt8\" (UID: \"4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.513809 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61-v4-0-config-user-template-error\") pod \"oauth-openshift-6bbf4c9fdf-vngt8\" (UID: \"4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.513892 5002 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.513909 5002 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jr4w7\" (UniqueName: \"kubernetes.io/projected/e2598866-d004-41a0-b058-01fb9a379df5-kube-api-access-jr4w7\") on node \"crc\" DevicePath \"\"" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.513923 5002 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e2598866-d004-41a0-b058-01fb9a379df5-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.513936 5002 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.513950 5002 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.513962 5002 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.513977 5002 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.513991 5002 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.514006 5002 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.514020 5002 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.514033 5002 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.514045 5002 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.514056 5002 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e2598866-d004-41a0-b058-01fb9a379df5-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.514068 5002 reconciler_common.go:293] 
"Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e2598866-d004-41a0-b058-01fb9a379df5-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.514686 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61-audit-dir\") pod \"oauth-openshift-6bbf4c9fdf-vngt8\" (UID: \"4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.514945 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6bbf4c9fdf-vngt8\" (UID: \"4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.516393 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6bbf4c9fdf-vngt8\" (UID: \"4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.516583 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61-audit-policies\") pod \"oauth-openshift-6bbf4c9fdf-vngt8\" (UID: \"4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.516984 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61-v4-0-config-system-service-ca\") pod \"oauth-openshift-6bbf4c9fdf-vngt8\" (UID: \"4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.517493 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61-v4-0-config-user-template-error\") pod \"oauth-openshift-6bbf4c9fdf-vngt8\" (UID: \"4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.517499 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6bbf4c9fdf-vngt8\" (UID: \"4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.518506 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6bbf4c9fdf-vngt8\" (UID: 
\"4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.518673 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6bbf4c9fdf-vngt8\" (UID: \"4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.519918 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61-v4-0-config-user-template-login\") pod \"oauth-openshift-6bbf4c9fdf-vngt8\" (UID: \"4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.520754 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6bbf4c9fdf-vngt8\" (UID: \"4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.521140 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61-v4-0-config-system-router-certs\") pod \"oauth-openshift-6bbf4c9fdf-vngt8\" (UID: \"4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.521572 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61-v4-0-config-system-session\") pod \"oauth-openshift-6bbf4c9fdf-vngt8\" (UID: \"4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.531946 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5fd4\" (UniqueName: \"kubernetes.io/projected/4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61-kube-api-access-b5fd4\") pod \"oauth-openshift-6bbf4c9fdf-vngt8\" (UID: \"4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.619245 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.684330 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9fjq6"] Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.689233 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9fjq6"] Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.690039 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zsnb5" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.817783 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3c9ac96-2913-4efd-b800-4f5c18dc08ab-utilities\") pod \"a3c9ac96-2913-4efd-b800-4f5c18dc08ab\" (UID: \"a3c9ac96-2913-4efd-b800-4f5c18dc08ab\") " Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.818216 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3c9ac96-2913-4efd-b800-4f5c18dc08ab-catalog-content\") pod \"a3c9ac96-2913-4efd-b800-4f5c18dc08ab\" (UID: \"a3c9ac96-2913-4efd-b800-4f5c18dc08ab\") " Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.818356 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjl4c\" (UniqueName: \"kubernetes.io/projected/a3c9ac96-2913-4efd-b800-4f5c18dc08ab-kube-api-access-cjl4c\") pod \"a3c9ac96-2913-4efd-b800-4f5c18dc08ab\" (UID: \"a3c9ac96-2913-4efd-b800-4f5c18dc08ab\") " Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.818715 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3c9ac96-2913-4efd-b800-4f5c18dc08ab-utilities" (OuterVolumeSpecName: "utilities") pod "a3c9ac96-2913-4efd-b800-4f5c18dc08ab" (UID: "a3c9ac96-2913-4efd-b800-4f5c18dc08ab"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.822854 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3c9ac96-2913-4efd-b800-4f5c18dc08ab-kube-api-access-cjl4c" (OuterVolumeSpecName: "kube-api-access-cjl4c") pod "a3c9ac96-2913-4efd-b800-4f5c18dc08ab" (UID: "a3c9ac96-2913-4efd-b800-4f5c18dc08ab"). InnerVolumeSpecName "kube-api-access-cjl4c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.864428 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3c9ac96-2913-4efd-b800-4f5c18dc08ab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a3c9ac96-2913-4efd-b800-4f5c18dc08ab" (UID: "a3c9ac96-2913-4efd-b800-4f5c18dc08ab"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.919469 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjl4c\" (UniqueName: \"kubernetes.io/projected/a3c9ac96-2913-4efd-b800-4f5c18dc08ab-kube-api-access-cjl4c\") on node \"crc\" DevicePath \"\"" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.919502 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3c9ac96-2913-4efd-b800-4f5c18dc08ab-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 10:04:54 crc kubenswrapper[5002]: I1209 10:04:54.919513 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3c9ac96-2913-4efd-b800-4f5c18dc08ab-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 10:04:55 crc kubenswrapper[5002]: W1209 10:04:55.004357 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b16d5e0_54d2_4cd4_a3b6_267e3fa3ee61.slice/crio-30c76755f39db71289154d7785f6475e3e7adc99149a729fcb99ba04f40f117c WatchSource:0}: Error finding container 30c76755f39db71289154d7785f6475e3e7adc99149a729fcb99ba04f40f117c: Status 404 returned error can't find the container with id 30c76755f39db71289154d7785f6475e3e7adc99149a729fcb99ba04f40f117c Dec 09 10:04:55 crc kubenswrapper[5002]: I1209 10:04:55.004599 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8"] Dec 09 10:04:55 crc kubenswrapper[5002]: I1209 10:04:55.351855 5002 generic.go:334] "Generic (PLEG): container finished" podID="a3c9ac96-2913-4efd-b800-4f5c18dc08ab" containerID="21f84c02a592057ba6c728094afa66d57af3215208d4c22360ca663b21f84bdb" exitCode=0 Dec 09 10:04:55 crc kubenswrapper[5002]: I1209 10:04:55.351906 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zsnb5" event={"ID":"a3c9ac96-2913-4efd-b800-4f5c18dc08ab","Type":"ContainerDied","Data":"21f84c02a592057ba6c728094afa66d57af3215208d4c22360ca663b21f84bdb"} Dec 09 10:04:55 crc kubenswrapper[5002]: I1209 10:04:55.351928 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zsnb5" event={"ID":"a3c9ac96-2913-4efd-b800-4f5c18dc08ab","Type":"ContainerDied","Data":"6d96b6833a67266ebad3d256fe7bae460344f49c5ac7bad4a5a3120bf2413b1c"} Dec 09 10:04:55 crc kubenswrapper[5002]: I1209 10:04:55.351944 5002 scope.go:117] "RemoveContainer" containerID="21f84c02a592057ba6c728094afa66d57af3215208d4c22360ca663b21f84bdb" Dec 09 10:04:55 crc kubenswrapper[5002]: I1209 10:04:55.352051 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zsnb5"
Dec 09 10:04:55 crc kubenswrapper[5002]: I1209 10:04:55.358052 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8" event={"ID":"4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61","Type":"ContainerStarted","Data":"9f5532a6d380d96db55f379c6bec0a3ec6b27bbd84614f58465331b9dec08552"}
Dec 09 10:04:55 crc kubenswrapper[5002]: I1209 10:04:55.358082 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8" event={"ID":"4b16d5e0-54d2-4cd4-a3b6-267e3fa3ee61","Type":"ContainerStarted","Data":"30c76755f39db71289154d7785f6475e3e7adc99149a729fcb99ba04f40f117c"}
Dec 09 10:04:55 crc kubenswrapper[5002]: I1209 10:04:55.358402 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8"
Dec 09 10:04:55 crc kubenswrapper[5002]: I1209 10:04:55.396283 5002 scope.go:117] "RemoveContainer" containerID="42ccb0e24cab331f4554a70440d115beb6f8e70d844555549f7859ec42602e16"
Dec 09 10:04:55 crc kubenswrapper[5002]: I1209 10:04:55.398728 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8" podStartSLOduration=28.398713369 podStartE2EDuration="28.398713369s" podCreationTimestamp="2025-12-09 10:04:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:04:55.381796669 +0000 UTC m=+227.773847770" watchObservedRunningTime="2025-12-09 10:04:55.398713369 +0000 UTC m=+227.790764450"
Dec 09 10:04:55 crc kubenswrapper[5002]: I1209 10:04:55.398935 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zsnb5"]
Dec 09 10:04:55 crc kubenswrapper[5002]: I1209 10:04:55.402423 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zsnb5"]
Dec 09 10:04:55 crc kubenswrapper[5002]: I1209 10:04:55.411761 5002 scope.go:117] "RemoveContainer" containerID="5ab93d92bda564c722949a7c11beadd44373acc2a3de4e23147199b19226977e"
Dec 09 10:04:55 crc kubenswrapper[5002]: I1209 10:04:55.427533 5002 scope.go:117] "RemoveContainer" containerID="21f84c02a592057ba6c728094afa66d57af3215208d4c22360ca663b21f84bdb"
Dec 09 10:04:55 crc kubenswrapper[5002]: E1209 10:04:55.427937 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21f84c02a592057ba6c728094afa66d57af3215208d4c22360ca663b21f84bdb\": container with ID starting with 21f84c02a592057ba6c728094afa66d57af3215208d4c22360ca663b21f84bdb not found: ID does not exist" containerID="21f84c02a592057ba6c728094afa66d57af3215208d4c22360ca663b21f84bdb"
Dec 09 10:04:55 crc kubenswrapper[5002]: I1209 10:04:55.427982 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21f84c02a592057ba6c728094afa66d57af3215208d4c22360ca663b21f84bdb"} err="failed to get container status \"21f84c02a592057ba6c728094afa66d57af3215208d4c22360ca663b21f84bdb\": rpc error: code = NotFound desc = could not find container \"21f84c02a592057ba6c728094afa66d57af3215208d4c22360ca663b21f84bdb\": container with ID starting with 21f84c02a592057ba6c728094afa66d57af3215208d4c22360ca663b21f84bdb not found: ID does not exist"
Dec 09 10:04:55 crc kubenswrapper[5002]: I1209 10:04:55.428016 5002 scope.go:117] "RemoveContainer" containerID="42ccb0e24cab331f4554a70440d115beb6f8e70d844555549f7859ec42602e16"
Dec 09 10:04:55 crc kubenswrapper[5002]: E1209 10:04:55.428350 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42ccb0e24cab331f4554a70440d115beb6f8e70d844555549f7859ec42602e16\": container with ID starting with 42ccb0e24cab331f4554a70440d115beb6f8e70d844555549f7859ec42602e16 not found: ID does not exist" containerID="42ccb0e24cab331f4554a70440d115beb6f8e70d844555549f7859ec42602e16"
Dec 09 10:04:55 crc kubenswrapper[5002]: I1209 10:04:55.428396 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42ccb0e24cab331f4554a70440d115beb6f8e70d844555549f7859ec42602e16"} err="failed to get container status \"42ccb0e24cab331f4554a70440d115beb6f8e70d844555549f7859ec42602e16\": rpc error: code = NotFound desc = could not find container \"42ccb0e24cab331f4554a70440d115beb6f8e70d844555549f7859ec42602e16\": container with ID starting with 42ccb0e24cab331f4554a70440d115beb6f8e70d844555549f7859ec42602e16 not found: ID does not exist"
Dec 09 10:04:55 crc kubenswrapper[5002]: I1209 10:04:55.428419 5002 scope.go:117] "RemoveContainer" containerID="5ab93d92bda564c722949a7c11beadd44373acc2a3de4e23147199b19226977e"
Dec 09 10:04:55 crc kubenswrapper[5002]: E1209 10:04:55.428832 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ab93d92bda564c722949a7c11beadd44373acc2a3de4e23147199b19226977e\": container with ID starting with 5ab93d92bda564c722949a7c11beadd44373acc2a3de4e23147199b19226977e not found: ID does not exist" containerID="5ab93d92bda564c722949a7c11beadd44373acc2a3de4e23147199b19226977e"
Dec 09 10:04:55 crc kubenswrapper[5002]: I1209 10:04:55.428863 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ab93d92bda564c722949a7c11beadd44373acc2a3de4e23147199b19226977e"} err="failed to get container status \"5ab93d92bda564c722949a7c11beadd44373acc2a3de4e23147199b19226977e\": rpc error: code = NotFound desc = could not find container \"5ab93d92bda564c722949a7c11beadd44373acc2a3de4e23147199b19226977e\": container with ID starting with 5ab93d92bda564c722949a7c11beadd44373acc2a3de4e23147199b19226977e not found: ID does not exist"
Dec 09 10:04:55 crc kubenswrapper[5002]: I1209 10:04:55.685301 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-vngt8"
Dec 09 10:04:56 crc kubenswrapper[5002]: I1209 10:04:56.071754 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3c9ac96-2913-4efd-b800-4f5c18dc08ab" path="/var/lib/kubelet/pods/a3c9ac96-2913-4efd-b800-4f5c18dc08ab/volumes"
Dec 09 10:04:56 crc kubenswrapper[5002]: I1209 10:04:56.072980 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2598866-d004-41a0-b058-01fb9a379df5" path="/var/lib/kubelet/pods/e2598866-d004-41a0-b058-01fb9a379df5/volumes"
Dec 09 10:05:03 crc kubenswrapper[5002]: I1209 10:05:03.959409 5002 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Dec 09 10:05:03 crc kubenswrapper[5002]: E1209 10:05:03.960118 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3c9ac96-2913-4efd-b800-4f5c18dc08ab" containerName="extract-content"
Dec 09 10:05:03 crc kubenswrapper[5002]: I1209 10:05:03.960133 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3c9ac96-2913-4efd-b800-4f5c18dc08ab" containerName="extract-content"
Dec 09 10:05:03 crc kubenswrapper[5002]: E1209 10:05:03.960151 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3c9ac96-2913-4efd-b800-4f5c18dc08ab" containerName="registry-server"
Dec 09 10:05:03 crc kubenswrapper[5002]: I1209 10:05:03.960156 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3c9ac96-2913-4efd-b800-4f5c18dc08ab" containerName="registry-server"
Dec 09 10:05:03 crc kubenswrapper[5002]: E1209 10:05:03.960169 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3c9ac96-2913-4efd-b800-4f5c18dc08ab" containerName="extract-utilities"
Dec 09 10:05:03 crc kubenswrapper[5002]: I1209 10:05:03.960175 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3c9ac96-2913-4efd-b800-4f5c18dc08ab" containerName="extract-utilities"
Dec 09 10:05:03 crc kubenswrapper[5002]: I1209 10:05:03.960282 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3c9ac96-2913-4efd-b800-4f5c18dc08ab" containerName="registry-server"
Dec 09 10:05:03 crc kubenswrapper[5002]: I1209 10:05:03.960645 5002 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Dec 09 10:05:03 crc kubenswrapper[5002]: I1209 10:05:03.960977 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 09 10:05:03 crc kubenswrapper[5002]: I1209 10:05:03.960938 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9" gracePeriod=15
Dec 09 10:05:03 crc kubenswrapper[5002]: I1209 10:05:03.961112 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://301eb2b68a37707ea483536d639e5f476d3cb41ad8c60fbaea2e019fb51bad48" gracePeriod=15
Dec 09 10:05:03 crc kubenswrapper[5002]: I1209 10:05:03.961093 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://8bd67b1b7ad3e21872e9aa91e61c978e0999546882ee2e21347d4f60e435e0b9" gracePeriod=15
Dec 09 10:05:03 crc kubenswrapper[5002]: I1209 10:05:03.961093 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://b484d6dfa4fffc554c21f9d9bc8e77a62e96ebb9426f2bb34dea4b2c9244d680" gracePeriod=15
Dec 09 10:05:03 crc kubenswrapper[5002]: I1209 10:05:03.961089 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://c49d7b26bbe255f1217808981337d8190bdeac4f5008ee17df5242867e3103e7" gracePeriod=15
Dec 09 10:05:03 crc kubenswrapper[5002]: I1209 10:05:03.961597 5002 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
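The five "Killing container with a grace period" entries (gracePeriod=15) show the polite-stop-then-escalate pattern: ask the container to exit, and force-kill only if the grace period runs out. A hedged sketch of that shape; runtimeClient and its two methods are invented here, since the real kubelet drives this through the CRI:

```go
package main

import (
	"context"
	"fmt"
	"time"
)

// runtimeClient is a hypothetical stand-in for a CRI connection.
type runtimeClient interface {
	StopContainer(ctx context.Context, id string) error // polite stop (e.g. SIGTERM)
	KillContainer(id string) error                      // forced stop (e.g. SIGKILL)
}

// stopWithGrace asks for a polite stop, bounded by the grace period;
// if the deadline expires first, it escalates to a forced kill.
func stopWithGrace(r runtimeClient, id string, grace time.Duration) error {
	ctx, cancel := context.WithTimeout(context.Background(), grace)
	defer cancel()
	err := r.StopContainer(ctx, id)
	if err == nil {
		return nil // exited within the grace period
	}
	if ctx.Err() == nil {
		return err // failed for some reason other than the deadline
	}
	return r.KillContainer(id) // grace period elapsed: force-kill
}

// stubborn simulates a container that ignores the polite stop.
type stubborn struct{}

func (stubborn) StopContainer(ctx context.Context, id string) error {
	<-ctx.Done()
	return ctx.Err()
}

func (stubborn) KillContainer(id string) error {
	fmt.Printf("force-killed %.12s after grace period\n", id)
	return nil
}

func main() {
	_ = stopWithGrace(stubborn{}, "a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9", 50*time.Millisecond)
}
```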
Dec 09 10:05:03 crc kubenswrapper[5002]: E1209 10:05:03.961848 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Dec 09 10:05:03 crc kubenswrapper[5002]: I1209 10:05:03.961861 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Dec 09 10:05:03 crc kubenswrapper[5002]: E1209 10:05:03.961872 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Dec 09 10:05:03 crc kubenswrapper[5002]: I1209 10:05:03.961878 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Dec 09 10:05:03 crc kubenswrapper[5002]: E1209 10:05:03.961890 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Dec 09 10:05:03 crc kubenswrapper[5002]: I1209 10:05:03.961914 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Dec 09 10:05:03 crc kubenswrapper[5002]: E1209 10:05:03.961929 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Dec 09 10:05:03 crc kubenswrapper[5002]: I1209 10:05:03.961935 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Dec 09 10:05:03 crc kubenswrapper[5002]: E1209 10:05:03.961945 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Dec 09 10:05:03 crc kubenswrapper[5002]: I1209 10:05:03.961952 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Dec 09 10:05:03 crc kubenswrapper[5002]: E1209 10:05:03.961962 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Dec 09 10:05:03 crc kubenswrapper[5002]: I1209 10:05:03.961968 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Dec 09 10:05:03 crc kubenswrapper[5002]: E1209 10:05:03.961993 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Dec 09 10:05:03 crc kubenswrapper[5002]: I1209 10:05:03.962000 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Dec 09 10:05:03 crc kubenswrapper[5002]: I1209 10:05:03.962112 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Dec 09 10:05:03 crc kubenswrapper[5002]: I1209 10:05:03.962126 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Dec 09 10:05:03 crc kubenswrapper[5002]: I1209 10:05:03.962132 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
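The cpu_manager / state_mem / memory_manager entries above and below purge per-container resource assignments left behind by pods that no longer exist. A toy model of that bookkeeping; the key and cpuState types and the map layout are invented for illustration, not the kubelet's actual state store:

```go
package main

import "fmt"

// key identifies an assignment the way the log entries do:
// by pod UID plus container name.
type key struct{ podUID, container string }

// cpuState maps each (pod, container) to a CPU set like "0-3".
type cpuState struct{ assignments map[key]string }

// removeStaleState drops assignments whose pod is no longer live,
// mirroring the "RemoveStaleState: removing container" entries.
func (s *cpuState) removeStaleState(livePods map[string]bool) {
	for k := range s.assignments {
		if !livePods[k.podUID] {
			fmt.Println("removing stale assignment:", k.podUID, k.container)
			delete(s.assignments, k)
		}
	}
}

func main() {
	s := &cpuState{assignments: map[key]string{
		{"f4b27818a5e8e43d0dc095d08835c792", "kube-apiserver"}: "0-3",
		{"f4b27818a5e8e43d0dc095d08835c792", "setup"}:          "0-3",
	}}
	// No live pods: every assignment is stale and gets purged.
	s.removeStaleState(map[string]bool{})
}
```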
Dec 09 10:05:03 crc kubenswrapper[5002]: I1209 10:05:03.962159 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Dec 09 10:05:03 crc kubenswrapper[5002]: I1209 10:05:03.962169 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Dec 09 10:05:03 crc kubenswrapper[5002]: I1209 10:05:03.962176 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Dec 09 10:05:04 crc kubenswrapper[5002]: I1209 10:05:04.000493 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Dec 09 10:05:04 crc kubenswrapper[5002]: I1209 10:05:04.135445 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 09 10:05:04 crc kubenswrapper[5002]: I1209 10:05:04.135503 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 09 10:05:04 crc kubenswrapper[5002]: I1209 10:05:04.135558 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 09 10:05:04 crc kubenswrapper[5002]: I1209 10:05:04.135587 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 09 10:05:04 crc kubenswrapper[5002]: I1209 10:05:04.135611 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 09 10:05:04 crc kubenswrapper[5002]: I1209 10:05:04.135651 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 09 10:05:04 crc kubenswrapper[5002]: I1209 10:05:04.135710 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 09 10:05:04 crc kubenswrapper[5002]: I1209 10:05:04.135733 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 09 10:05:04 crc kubenswrapper[5002]: I1209 10:05:04.236646 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 09 10:05:04 crc kubenswrapper[5002]: I1209 10:05:04.236727 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 09 10:05:04 crc kubenswrapper[5002]: I1209 10:05:04.236745 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 09 10:05:04 crc kubenswrapper[5002]: I1209 10:05:04.236753 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 09 10:05:04 crc kubenswrapper[5002]: I1209 10:05:04.236766 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 09 10:05:04 crc kubenswrapper[5002]: I1209 10:05:04.236787 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 09 10:05:04 crc kubenswrapper[5002]: I1209 10:05:04.236805 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 09 10:05:04 crc kubenswrapper[5002]: I1209 10:05:04.236854 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 09 10:05:04 crc kubenswrapper[5002]: I1209 10:05:04.236876 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 09 10:05:04 crc kubenswrapper[5002]: I1209 10:05:04.236895 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 09 10:05:04 crc kubenswrapper[5002]: I1209 10:05:04.236905 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 09 10:05:04 crc kubenswrapper[5002]: I1209 10:05:04.236920 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 09 10:05:04 crc kubenswrapper[5002]: I1209 10:05:04.236926 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 09 10:05:04 crc kubenswrapper[5002]: I1209 10:05:04.236951 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 09 10:05:04 crc kubenswrapper[5002]: I1209 10:05:04.236978 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 09 10:05:04 crc kubenswrapper[5002]: I1209 10:05:04.236982 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 10:05:04 crc kubenswrapper[5002]: E1209 10:05:04.329639 5002 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.132:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187f83fda27f75b5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-09 10:05:04.328750517 +0000 UTC m=+236.720801598,LastTimestamp:2025-12-09 10:05:04.328750517 +0000 UTC m=+236.720801598,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 09 10:05:04 crc kubenswrapper[5002]: I1209 10:05:04.406108 5002 generic.go:334] "Generic (PLEG): container finished" podID="18877059-32eb-40f8-98c9-3cca1a075ec6" containerID="22aa4a36fdaf89584ca58ac5e7551c46230411d4df2219a5d8baa691cc72b16c" exitCode=0 Dec 09 10:05:04 crc kubenswrapper[5002]: I1209 10:05:04.406194 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"18877059-32eb-40f8-98c9-3cca1a075ec6","Type":"ContainerDied","Data":"22aa4a36fdaf89584ca58ac5e7551c46230411d4df2219a5d8baa691cc72b16c"} Dec 09 10:05:04 crc kubenswrapper[5002]: I1209 10:05:04.407057 5002 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.132:6443: connect: connection refused" Dec 09 10:05:04 crc kubenswrapper[5002]: I1209 10:05:04.407543 5002 status_manager.go:851] "Failed to get status for pod" podUID="18877059-32eb-40f8-98c9-3cca1a075ec6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.132:6443: connect: connection refused" Dec 09 10:05:04 crc kubenswrapper[5002]: I1209 10:05:04.407706 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"75ba65061a4f7d5166de95b4bdfa3c39730db00b84c86798181fe76d09ac04f7"} Dec 09 10:05:04 crc kubenswrapper[5002]: I1209 10:05:04.409201 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 09 10:05:04 crc kubenswrapper[5002]: I1209 10:05:04.410229 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 09 10:05:04 crc kubenswrapper[5002]: I1209 10:05:04.411103 5002 
Dec 09 10:05:04 crc kubenswrapper[5002]: I1209 10:05:04.411103 5002 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c49d7b26bbe255f1217808981337d8190bdeac4f5008ee17df5242867e3103e7" exitCode=0
Dec 09 10:05:04 crc kubenswrapper[5002]: I1209 10:05:04.411128 5002 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8bd67b1b7ad3e21872e9aa91e61c978e0999546882ee2e21347d4f60e435e0b9" exitCode=0
Dec 09 10:05:04 crc kubenswrapper[5002]: I1209 10:05:04.411136 5002 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b484d6dfa4fffc554c21f9d9bc8e77a62e96ebb9426f2bb34dea4b2c9244d680" exitCode=0
Dec 09 10:05:04 crc kubenswrapper[5002]: I1209 10:05:04.411143 5002 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="301eb2b68a37707ea483536d639e5f476d3cb41ad8c60fbaea2e019fb51bad48" exitCode=2
Dec 09 10:05:04 crc kubenswrapper[5002]: I1209 10:05:04.411186 5002 scope.go:117] "RemoveContainer" containerID="f7f8b636bb2b94c1869a80c0fdf6b143adc693e92248be6dc0a12d26bd03adf8"
Dec 09 10:05:04 crc kubenswrapper[5002]: E1209 10:05:04.741376 5002 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.132:6443: connect: connection refused"
Dec 09 10:05:04 crc kubenswrapper[5002]: E1209 10:05:04.741592 5002 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.132:6443: connect: connection refused"
Dec 09 10:05:04 crc kubenswrapper[5002]: E1209 10:05:04.741767 5002 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.132:6443: connect: connection refused"
Dec 09 10:05:04 crc kubenswrapper[5002]: E1209 10:05:04.742087 5002 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.132:6443: connect: connection refused"
Dec 09 10:05:04 crc kubenswrapper[5002]: E1209 10:05:04.742473 5002 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.132:6443: connect: connection refused"
Dec 09 10:05:04 crc kubenswrapper[5002]: I1209 10:05:04.742495 5002 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
Dec 09 10:05:04 crc kubenswrapper[5002]: E1209 10:05:04.742718 5002 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.132:6443: connect: connection refused" interval="200ms"
Dec 09 10:05:04 crc kubenswrapper[5002]: E1209 10:05:04.943551 5002 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.132:6443: connect: connection refused" interval="400ms"
Dec 09 10:05:05 crc kubenswrapper[5002]: E1209 10:05:05.345249 5002 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.132:6443: connect: connection refused" interval="800ms"
Dec 09 10:05:05 crc kubenswrapper[5002]: I1209 10:05:05.418994 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"47f495cec7a4ee16110d5ccee09fa16753752241ab1d51f76112b659fb602e66"}
Dec 09 10:05:05 crc kubenswrapper[5002]: I1209 10:05:05.420067 5002 status_manager.go:851] "Failed to get status for pod" podUID="18877059-32eb-40f8-98c9-3cca1a075ec6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.132:6443: connect: connection refused"
Dec 09 10:05:05 crc kubenswrapper[5002]: I1209 10:05:05.420878 5002 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.132:6443: connect: connection refused"
Dec 09 10:05:05 crc kubenswrapper[5002]: I1209 10:05:05.427884 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Dec 09 10:05:05 crc kubenswrapper[5002]: I1209 10:05:05.648084 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Dec 09 10:05:05 crc kubenswrapper[5002]: I1209 10:05:05.648877 5002 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.132:6443: connect: connection refused"
Dec 09 10:05:05 crc kubenswrapper[5002]: I1209 10:05:05.649042 5002 status_manager.go:851] "Failed to get status for pod" podUID="18877059-32eb-40f8-98c9-3cca1a075ec6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.132:6443: connect: connection refused"
Dec 09 10:05:05 crc kubenswrapper[5002]: I1209 10:05:05.753327 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/18877059-32eb-40f8-98c9-3cca1a075ec6-var-lock\") pod \"18877059-32eb-40f8-98c9-3cca1a075ec6\" (UID: \"18877059-32eb-40f8-98c9-3cca1a075ec6\") "
Dec 09 10:05:05 crc kubenswrapper[5002]: I1209 10:05:05.753434 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/18877059-32eb-40f8-98c9-3cca1a075ec6-kube-api-access\") pod \"18877059-32eb-40f8-98c9-3cca1a075ec6\" (UID: \"18877059-32eb-40f8-98c9-3cca1a075ec6\") "
"18877059-32eb-40f8-98c9-3cca1a075ec6" (UID: "18877059-32eb-40f8-98c9-3cca1a075ec6"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 10:05:05 crc kubenswrapper[5002]: I1209 10:05:05.753480 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/18877059-32eb-40f8-98c9-3cca1a075ec6-kubelet-dir\") pod \"18877059-32eb-40f8-98c9-3cca1a075ec6\" (UID: \"18877059-32eb-40f8-98c9-3cca1a075ec6\") " Dec 09 10:05:05 crc kubenswrapper[5002]: I1209 10:05:05.753569 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/18877059-32eb-40f8-98c9-3cca1a075ec6-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "18877059-32eb-40f8-98c9-3cca1a075ec6" (UID: "18877059-32eb-40f8-98c9-3cca1a075ec6"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 10:05:05 crc kubenswrapper[5002]: I1209 10:05:05.754015 5002 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/18877059-32eb-40f8-98c9-3cca1a075ec6-var-lock\") on node \"crc\" DevicePath \"\"" Dec 09 10:05:05 crc kubenswrapper[5002]: I1209 10:05:05.754036 5002 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/18877059-32eb-40f8-98c9-3cca1a075ec6-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 09 10:05:05 crc kubenswrapper[5002]: I1209 10:05:05.758370 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18877059-32eb-40f8-98c9-3cca1a075ec6-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "18877059-32eb-40f8-98c9-3cca1a075ec6" (UID: "18877059-32eb-40f8-98c9-3cca1a075ec6"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:05:05 crc kubenswrapper[5002]: I1209 10:05:05.855345 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/18877059-32eb-40f8-98c9-3cca1a075ec6-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 09 10:05:06 crc kubenswrapper[5002]: E1209 10:05:06.145939 5002 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.132:6443: connect: connection refused" interval="1.6s" Dec 09 10:05:06 crc kubenswrapper[5002]: I1209 10:05:06.437254 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 09 10:05:06 crc kubenswrapper[5002]: I1209 10:05:06.437942 5002 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9" exitCode=0 Dec 09 10:05:06 crc kubenswrapper[5002]: I1209 10:05:06.438077 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="165dd357907d6f5eedaf8a706d99f80fade7a3d1705f752f73500f6b8ae70a15" Dec 09 10:05:06 crc kubenswrapper[5002]: I1209 10:05:06.440363 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 09 10:05:06 crc kubenswrapper[5002]: I1209 10:05:06.440707 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"18877059-32eb-40f8-98c9-3cca1a075ec6","Type":"ContainerDied","Data":"11482b11764290cc8ab2266eed16234bc4b2bb823e42208c76fc76ec9fa3223a"} Dec 09 10:05:06 crc kubenswrapper[5002]: I1209 10:05:06.440736 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11482b11764290cc8ab2266eed16234bc4b2bb823e42208c76fc76ec9fa3223a" Dec 09 10:05:06 crc kubenswrapper[5002]: I1209 10:05:06.459352 5002 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.132:6443: connect: connection refused" Dec 09 10:05:06 crc kubenswrapper[5002]: I1209 10:05:06.459661 5002 status_manager.go:851] "Failed to get status for pod" podUID="18877059-32eb-40f8-98c9-3cca1a075ec6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.132:6443: connect: connection refused" Dec 09 10:05:06 crc kubenswrapper[5002]: I1209 10:05:06.464782 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 09 10:05:06 crc kubenswrapper[5002]: I1209 10:05:06.465499 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 10:05:06 crc kubenswrapper[5002]: I1209 10:05:06.465927 5002 status_manager.go:851] "Failed to get status for pod" podUID="18877059-32eb-40f8-98c9-3cca1a075ec6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.132:6443: connect: connection refused" Dec 09 10:05:06 crc kubenswrapper[5002]: I1209 10:05:06.466136 5002 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.132:6443: connect: connection refused" Dec 09 10:05:06 crc kubenswrapper[5002]: I1209 10:05:06.466465 5002 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.132:6443: connect: connection refused" Dec 09 10:05:06 crc kubenswrapper[5002]: I1209 10:05:06.563523 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 09 10:05:06 crc kubenswrapper[5002]: I1209 10:05:06.563618 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 09 10:05:06 crc kubenswrapper[5002]: I1209 10:05:06.563646 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 09 10:05:06 crc kubenswrapper[5002]: I1209 10:05:06.563952 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 10:05:06 crc kubenswrapper[5002]: I1209 10:05:06.563990 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 10:05:06 crc kubenswrapper[5002]: I1209 10:05:06.564004 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 10:05:06 crc kubenswrapper[5002]: I1209 10:05:06.665375 5002 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 09 10:05:06 crc kubenswrapper[5002]: I1209 10:05:06.665417 5002 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 09 10:05:06 crc kubenswrapper[5002]: I1209 10:05:06.665429 5002 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 09 10:05:07 crc kubenswrapper[5002]: I1209 10:05:07.444455 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 10:05:07 crc kubenswrapper[5002]: I1209 10:05:07.458158 5002 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.132:6443: connect: connection refused" Dec 09 10:05:07 crc kubenswrapper[5002]: I1209 10:05:07.458530 5002 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.132:6443: connect: connection refused" Dec 09 10:05:07 crc kubenswrapper[5002]: I1209 10:05:07.458798 5002 status_manager.go:851] "Failed to get status for pod" podUID="18877059-32eb-40f8-98c9-3cca1a075ec6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.132:6443: connect: connection refused" Dec 09 10:05:07 crc kubenswrapper[5002]: E1209 10:05:07.747285 5002 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.132:6443: connect: connection refused" interval="3.2s" Dec 09 10:05:08 crc kubenswrapper[5002]: I1209 10:05:08.063182 5002 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.132:6443: connect: connection refused" Dec 09 10:05:08 crc kubenswrapper[5002]: I1209 10:05:08.063793 5002 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.132:6443: connect: connection refused" Dec 09 10:05:08 crc kubenswrapper[5002]: I1209 10:05:08.064123 5002 status_manager.go:851] "Failed to get status for pod" podUID="18877059-32eb-40f8-98c9-3cca1a075ec6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.132:6443: connect: connection refused" Dec 09 10:05:08 crc kubenswrapper[5002]: I1209 10:05:08.068374 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 09 10:05:08 crc kubenswrapper[5002]: E1209 10:05:08.977483 5002 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.132:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187f83fda27f75b5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-09 10:05:04.328750517 +0000 UTC m=+236.720801598,LastTimestamp:2025-12-09 10:05:04.328750517 +0000 UTC m=+236.720801598,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 09 10:05:10 crc kubenswrapper[5002]: E1209 10:05:10.948258 5002 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.132:6443: connect: connection refused" interval="6.4s" Dec 09 10:05:14 crc kubenswrapper[5002]: E1209 10:05:14.069461 5002 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.132:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" volumeName="registry-storage" Dec 09 10:05:17 crc kubenswrapper[5002]: E1209 10:05:17.350257 5002 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.132:6443: connect: connection refused" interval="7s" Dec 09 10:05:18 crc kubenswrapper[5002]: I1209 10:05:18.062296 5002 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.132:6443: connect: connection refused" Dec 09 10:05:18 crc kubenswrapper[5002]: I1209 10:05:18.063302 5002 status_manager.go:851] "Failed to get status for pod" podUID="18877059-32eb-40f8-98c9-3cca1a075ec6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.132:6443: connect: connection refused" Dec 09 10:05:18 crc kubenswrapper[5002]: E1209 10:05:18.978521 5002 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.132:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187f83fda27f75b5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-09 10:05:04.328750517 +0000 UTC m=+236.720801598,LastTimestamp:2025-12-09 10:05:04.328750517 +0000 UTC m=+236.720801598,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 09 10:05:19 crc kubenswrapper[5002]: I1209 10:05:19.060117 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 10:05:19 crc kubenswrapper[5002]: I1209 10:05:19.061615 5002 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.132:6443: connect: connection refused" Dec 09 10:05:19 crc kubenswrapper[5002]: I1209 10:05:19.062415 5002 status_manager.go:851] "Failed to get status for pod" podUID="18877059-32eb-40f8-98c9-3cca1a075ec6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.132:6443: connect: connection refused" Dec 09 10:05:19 crc kubenswrapper[5002]: I1209 10:05:19.077723 5002 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9" Dec 09 10:05:19 crc kubenswrapper[5002]: I1209 10:05:19.077751 5002 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9" Dec 09 10:05:19 crc kubenswrapper[5002]: E1209 10:05:19.078182 5002 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.132:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 10:05:19 crc kubenswrapper[5002]: I1209 10:05:19.078692 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 10:05:19 crc kubenswrapper[5002]: W1209 10:05:19.107379 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-ecb2cc3829a882d602f232a35644d65ca8d2109d87036ba9e71bae282893be07 WatchSource:0}: Error finding container ecb2cc3829a882d602f232a35644d65ca8d2109d87036ba9e71bae282893be07: Status 404 returned error can't find the container with id ecb2cc3829a882d602f232a35644d65ca8d2109d87036ba9e71bae282893be07 Dec 09 10:05:19 crc kubenswrapper[5002]: I1209 10:05:19.555372 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 09 10:05:19 crc kubenswrapper[5002]: I1209 10:05:19.555751 5002 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="5a41e95e38748ef89ff0bc6429eb223b7821756bf1e0c84a3af512f4f0166a98" exitCode=1 Dec 09 10:05:19 crc kubenswrapper[5002]: I1209 10:05:19.555845 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"5a41e95e38748ef89ff0bc6429eb223b7821756bf1e0c84a3af512f4f0166a98"} Dec 09 10:05:19 crc kubenswrapper[5002]: I1209 10:05:19.556304 5002 scope.go:117] "RemoveContainer" containerID="5a41e95e38748ef89ff0bc6429eb223b7821756bf1e0c84a3af512f4f0166a98" Dec 09 10:05:19 crc kubenswrapper[5002]: I1209 10:05:19.556942 5002 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.132:6443: connect: connection refused" Dec 09 10:05:19 crc kubenswrapper[5002]: I1209 10:05:19.557563 5002 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.132:6443: connect: connection refused" Dec 09 10:05:19 crc kubenswrapper[5002]: I1209 10:05:19.557802 5002 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="70752b4fdce0b0969978dc3e8d7f9dd0e39727110731eb2a22d39a2eca1abcc8" exitCode=0 Dec 09 10:05:19 crc kubenswrapper[5002]: I1209 10:05:19.557890 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"70752b4fdce0b0969978dc3e8d7f9dd0e39727110731eb2a22d39a2eca1abcc8"} Dec 09 10:05:19 crc kubenswrapper[5002]: I1209 10:05:19.557931 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ecb2cc3829a882d602f232a35644d65ca8d2109d87036ba9e71bae282893be07"} Dec 09 10:05:19 crc kubenswrapper[5002]: I1209 10:05:19.558048 5002 status_manager.go:851] "Failed to get status for pod" podUID="18877059-32eb-40f8-98c9-3cca1a075ec6" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.132:6443: connect: connection refused" Dec 09 10:05:19 crc kubenswrapper[5002]: I1209 10:05:19.558240 5002 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9" Dec 09 10:05:19 crc kubenswrapper[5002]: I1209 10:05:19.558288 5002 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9" Dec 09 10:05:19 crc kubenswrapper[5002]: I1209 10:05:19.558520 5002 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.132:6443: connect: connection refused" Dec 09 10:05:19 crc kubenswrapper[5002]: E1209 10:05:19.558537 5002 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.132:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 10:05:19 crc kubenswrapper[5002]: I1209 10:05:19.559052 5002 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.132:6443: connect: connection refused" Dec 09 10:05:19 crc kubenswrapper[5002]: I1209 10:05:19.559366 5002 status_manager.go:851] "Failed to get status for pod" podUID="18877059-32eb-40f8-98c9-3cca1a075ec6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.132:6443: connect: connection refused" Dec 09 10:05:20 crc kubenswrapper[5002]: I1209 10:05:20.585728 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 09 10:05:20 crc kubenswrapper[5002]: I1209 10:05:20.586236 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0423af2370a51189e93a0a18b39efe1282a793428c78b007d7d03edc0d7dc2cb"} Dec 09 10:05:20 crc kubenswrapper[5002]: I1209 10:05:20.591843 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5e6d04d5d900c4eca56ea7c60606c9c12d8c3c3468340226e696e48fccc76623"} Dec 09 10:05:20 crc kubenswrapper[5002]: I1209 10:05:20.591893 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a4f8bd35d460234d6737c817335045be5b47360254d71c81f3844943feb19cbe"} Dec 09 10:05:20 crc kubenswrapper[5002]: I1209 10:05:20.591907 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7e6c9652f355d127adbdb71ee27036011240440c3deb217ca8820a91caa59c9e"} Dec 09 10:05:20 crc kubenswrapper[5002]: I1209 10:05:20.591920 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"69133d82cacc18506bb037bc7012f4f0861fa47127b0c9c0ecdcfdedea9a0934"} Dec 09 10:05:21 crc kubenswrapper[5002]: I1209 10:05:21.599499 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c757f1a96622db99d61edb6865650c3ff365e594b3b413149391c7c9ff92f2f4"} Dec 09 10:05:21 crc kubenswrapper[5002]: I1209 10:05:21.600086 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 10:05:21 crc kubenswrapper[5002]: I1209 10:05:21.600080 5002 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9" Dec 09 10:05:21 crc kubenswrapper[5002]: I1209 10:05:21.600118 5002 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9" Dec 09 10:05:21 crc kubenswrapper[5002]: I1209 10:05:21.876669 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 10:05:24 crc kubenswrapper[5002]: I1209 10:05:24.078833 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 10:05:24 crc kubenswrapper[5002]: I1209 10:05:24.079134 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 10:05:24 crc kubenswrapper[5002]: I1209 10:05:24.085134 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 10:05:26 crc kubenswrapper[5002]: I1209 10:05:26.610404 5002 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 10:05:27 crc kubenswrapper[5002]: I1209 10:05:27.625163 5002 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9" Dec 09 10:05:27 crc kubenswrapper[5002]: I1209 10:05:27.625197 5002 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9" Dec 09 10:05:27 crc kubenswrapper[5002]: I1209 10:05:27.630086 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 10:05:28 crc kubenswrapper[5002]: I1209 10:05:28.070613 5002 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="ee095eb4-56f3-4c48-b0f5-e235dcce7a5b" Dec 09 10:05:28 crc kubenswrapper[5002]: I1209 10:05:28.630585 5002 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9" Dec 09 10:05:28 crc kubenswrapper[5002]: I1209 10:05:28.630622 5002 mirror_client.go:130] 
"Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a2b8dc78-0dbd-4f3b-b40b-7e4bb24998d9" Dec 09 10:05:28 crc kubenswrapper[5002]: I1209 10:05:28.633285 5002 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="ee095eb4-56f3-4c48-b0f5-e235dcce7a5b" Dec 09 10:05:28 crc kubenswrapper[5002]: I1209 10:05:28.716121 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 10:05:28 crc kubenswrapper[5002]: I1209 10:05:28.719740 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 10:05:29 crc kubenswrapper[5002]: I1209 10:05:29.639791 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 10:05:37 crc kubenswrapper[5002]: I1209 10:05:37.500729 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 09 10:05:37 crc kubenswrapper[5002]: I1209 10:05:37.888736 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 09 10:05:38 crc kubenswrapper[5002]: I1209 10:05:38.041432 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 09 10:05:38 crc kubenswrapper[5002]: I1209 10:05:38.125615 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 09 10:05:38 crc kubenswrapper[5002]: I1209 10:05:38.134186 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 09 10:05:38 crc kubenswrapper[5002]: I1209 10:05:38.509058 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 09 10:05:38 crc kubenswrapper[5002]: I1209 10:05:38.617661 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 09 10:05:38 crc kubenswrapper[5002]: I1209 10:05:38.697651 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 09 10:05:38 crc kubenswrapper[5002]: I1209 10:05:38.780681 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 09 10:05:38 crc kubenswrapper[5002]: I1209 10:05:38.857909 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 09 10:05:38 crc kubenswrapper[5002]: I1209 10:05:38.905753 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 09 10:05:39 crc kubenswrapper[5002]: I1209 10:05:39.133904 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 09 10:05:39 crc kubenswrapper[5002]: I1209 10:05:39.180082 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 09 10:05:39 crc kubenswrapper[5002]: I1209 10:05:39.379626 5002 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 09 10:05:39 crc kubenswrapper[5002]: I1209 10:05:39.404134 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 09 10:05:39 crc kubenswrapper[5002]: I1209 10:05:39.409421 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 09 10:05:39 crc kubenswrapper[5002]: I1209 10:05:39.438267 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 09 10:05:39 crc kubenswrapper[5002]: I1209 10:05:39.441747 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 09 10:05:39 crc kubenswrapper[5002]: I1209 10:05:39.784124 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 09 10:05:39 crc kubenswrapper[5002]: I1209 10:05:39.795035 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 09 10:05:39 crc kubenswrapper[5002]: I1209 10:05:39.834346 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 09 10:05:39 crc kubenswrapper[5002]: I1209 10:05:39.914200 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 09 10:05:39 crc kubenswrapper[5002]: I1209 10:05:39.917426 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 09 10:05:39 crc kubenswrapper[5002]: I1209 10:05:39.955228 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 09 10:05:39 crc kubenswrapper[5002]: I1209 10:05:39.964149 5002 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 09 10:05:39 crc kubenswrapper[5002]: I1209 10:05:39.972928 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 09 10:05:40 crc kubenswrapper[5002]: I1209 10:05:40.077578 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 09 10:05:40 crc kubenswrapper[5002]: I1209 10:05:40.095259 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 09 10:05:40 crc kubenswrapper[5002]: I1209 10:05:40.242802 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 09 10:05:40 crc kubenswrapper[5002]: I1209 10:05:40.319013 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 09 10:05:40 crc kubenswrapper[5002]: I1209 10:05:40.369771 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 09 10:05:40 crc kubenswrapper[5002]: I1209 10:05:40.422022 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 09 10:05:40 crc kubenswrapper[5002]: I1209 10:05:40.452484 5002 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 09 10:05:40 crc kubenswrapper[5002]: I1209 10:05:40.505055 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 09 10:05:40 crc kubenswrapper[5002]: I1209 10:05:40.539637 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 09 10:05:40 crc kubenswrapper[5002]: I1209 10:05:40.588519 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 09 10:05:40 crc kubenswrapper[5002]: I1209 10:05:40.656602 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 09 10:05:40 crc kubenswrapper[5002]: I1209 10:05:40.806262 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 09 10:05:41 crc kubenswrapper[5002]: I1209 10:05:41.173183 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 09 10:05:41 crc kubenswrapper[5002]: I1209 10:05:41.175951 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 09 10:05:41 crc kubenswrapper[5002]: I1209 10:05:41.221703 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 09 10:05:41 crc kubenswrapper[5002]: I1209 10:05:41.349991 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 09 10:05:41 crc kubenswrapper[5002]: I1209 10:05:41.356953 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 09 10:05:41 crc kubenswrapper[5002]: I1209 10:05:41.385546 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 09 10:05:41 crc kubenswrapper[5002]: I1209 10:05:41.388585 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 09 10:05:41 crc kubenswrapper[5002]: I1209 10:05:41.482748 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 09 10:05:41 crc kubenswrapper[5002]: I1209 10:05:41.654427 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 09 10:05:41 crc kubenswrapper[5002]: I1209 10:05:41.689944 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 09 10:05:41 crc kubenswrapper[5002]: I1209 10:05:41.690588 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 09 10:05:41 crc kubenswrapper[5002]: I1209 10:05:41.705362 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 09 10:05:41 crc kubenswrapper[5002]: I1209 10:05:41.711504 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 09 
Dec 09 10:05:41 crc kubenswrapper[5002]: I1209 10:05:41.935985 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Dec 09 10:05:41 crc kubenswrapper[5002]: I1209 10:05:41.948307 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Dec 09 10:05:42 crc kubenswrapper[5002]: I1209 10:05:42.006142 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Dec 09 10:05:42 crc kubenswrapper[5002]: I1209 10:05:42.118402 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Dec 09 10:05:42 crc kubenswrapper[5002]: I1209 10:05:42.137911 5002 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Dec 09 10:05:42 crc kubenswrapper[5002]: I1209 10:05:42.140307 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=39.14028778 podStartE2EDuration="39.14028778s" podCreationTimestamp="2025-12-09 10:05:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:05:26.235763287 +0000 UTC m=+258.627814368" watchObservedRunningTime="2025-12-09 10:05:42.14028778 +0000 UTC m=+274.532338861"
Dec 09 10:05:42 crc kubenswrapper[5002]: I1209 10:05:42.142855 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Dec 09 10:05:42 crc kubenswrapper[5002]: I1209 10:05:42.142933 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Dec 09 10:05:42 crc kubenswrapper[5002]: I1209 10:05:42.148989 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 09 10:05:42 crc kubenswrapper[5002]: I1209 10:05:42.158963 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=16.15894447 podStartE2EDuration="16.15894447s" podCreationTimestamp="2025-12-09 10:05:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:05:42.15723689 +0000 UTC m=+274.549287971" watchObservedRunningTime="2025-12-09 10:05:42.15894447 +0000 UTC m=+274.550995551"
Dec 09 10:05:42 crc kubenswrapper[5002]: I1209 10:05:42.169600 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Dec 09 10:05:42 crc kubenswrapper[5002]: I1209 10:05:42.170530 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Dec 09 10:05:42 crc kubenswrapper[5002]: I1209 10:05:42.337657 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Dec 09 10:05:42 crc kubenswrapper[5002]: I1209 10:05:42.366797 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Dec 09 10:05:42 crc kubenswrapper[5002]: I1209 10:05:42.408772 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Dec 09 10:05:42 crc kubenswrapper[5002]: I1209 10:05:42.450384 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Dec 09 10:05:42 crc kubenswrapper[5002]: I1209 10:05:42.451679 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Dec 09 10:05:42 crc kubenswrapper[5002]: I1209 10:05:42.480510 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Dec 09 10:05:42 crc kubenswrapper[5002]: I1209 10:05:42.493770 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Dec 09 10:05:42 crc kubenswrapper[5002]: I1209 10:05:42.501196 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Dec 09 10:05:42 crc kubenswrapper[5002]: I1209 10:05:42.586843 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Dec 09 10:05:42 crc kubenswrapper[5002]: I1209 10:05:42.758227 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Dec 09 10:05:42 crc kubenswrapper[5002]: I1209 10:05:42.788401 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Dec 09 10:05:42 crc kubenswrapper[5002]: I1209 10:05:42.811068 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Dec 09 10:05:42 crc kubenswrapper[5002]: I1209 10:05:42.846530 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Dec 09 10:05:42 crc kubenswrapper[5002]: I1209 10:05:42.887454 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Dec 09 10:05:42 crc kubenswrapper[5002]: I1209 10:05:42.887850 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Dec 09 10:05:42 crc kubenswrapper[5002]: I1209 10:05:42.943524 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Dec 09 10:05:43 crc kubenswrapper[5002]: I1209 10:05:43.030902 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Dec 09 10:05:43 crc kubenswrapper[5002]: I1209 10:05:43.101523 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Dec 09 10:05:43 crc kubenswrapper[5002]: I1209 10:05:43.194871 5002 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Dec 09 10:05:43 crc kubenswrapper[5002]: I1209 10:05:43.202407 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Dec 09 10:05:43 crc kubenswrapper[5002]: I1209 10:05:43.238957 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Dec 09 10:05:43 crc kubenswrapper[5002]: I1209 10:05:43.243059 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Dec 09 10:05:43 crc kubenswrapper[5002]: I1209 10:05:43.246199 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Dec 09 10:05:43 crc kubenswrapper[5002]: I1209 10:05:43.292654 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Dec 09 10:05:43 crc kubenswrapper[5002]: I1209 10:05:43.317881 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Dec 09 10:05:43 crc kubenswrapper[5002]: I1209 10:05:43.354580 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Dec 09 10:05:43 crc kubenswrapper[5002]: I1209 10:05:43.375921 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Dec 09 10:05:43 crc kubenswrapper[5002]: I1209 10:05:43.392175 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Dec 09 10:05:43 crc kubenswrapper[5002]: I1209 10:05:43.402439 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Dec 09 10:05:43 crc kubenswrapper[5002]: I1209 10:05:43.478529 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Dec 09 10:05:43 crc kubenswrapper[5002]: I1209 10:05:43.549291 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Dec 09 10:05:43 crc kubenswrapper[5002]: I1209 10:05:43.756765 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Dec 09 10:05:43 crc kubenswrapper[5002]: I1209 10:05:43.766083 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Dec 09 10:05:43 crc kubenswrapper[5002]: I1209 10:05:43.832046 5002 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Dec 09 10:05:43 crc kubenswrapper[5002]: I1209 10:05:43.837838 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Dec 09 10:05:43 crc kubenswrapper[5002]: I1209 10:05:43.843120 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Dec 09 10:05:43 crc kubenswrapper[5002]: I1209 10:05:43.928254 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Dec 09 10:05:43 crc kubenswrapper[5002]: I1209 10:05:43.928361 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Dec 09 10:05:43 crc kubenswrapper[5002]: I1209 10:05:43.991645 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Dec 09 10:05:43 crc kubenswrapper[5002]: I1209 10:05:43.994676 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Dec 09 10:05:44 crc kubenswrapper[5002]: I1209 10:05:44.015849 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Dec 09 10:05:44 crc kubenswrapper[5002]: I1209 10:05:44.071162 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Dec 09 10:05:44 crc kubenswrapper[5002]: I1209 10:05:44.075492 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Dec 09 10:05:44 crc kubenswrapper[5002]: I1209 10:05:44.096975 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Dec 09 10:05:44 crc kubenswrapper[5002]: I1209 10:05:44.111489 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Dec 09 10:05:44 crc kubenswrapper[5002]: I1209 10:05:44.121278 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Dec 09 10:05:44 crc kubenswrapper[5002]: I1209 10:05:44.125676 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Dec 09 10:05:44 crc kubenswrapper[5002]: I1209 10:05:44.145456 5002 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Dec 09 10:05:44 crc kubenswrapper[5002]: I1209 10:05:44.182612 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Dec 09 10:05:44 crc kubenswrapper[5002]: I1209 10:05:44.298351 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Dec 09 10:05:44 crc kubenswrapper[5002]: I1209 10:05:44.315093 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Dec 09 10:05:44 crc kubenswrapper[5002]: I1209 10:05:44.379849 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Dec 09 10:05:44 crc kubenswrapper[5002]: I1209 10:05:44.500136 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Dec 09 10:05:44 crc kubenswrapper[5002]: I1209 10:05:44.672481 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Dec 09 10:05:44 crc kubenswrapper[5002]: I1209 10:05:44.726895 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Dec 09 10:05:44 crc kubenswrapper[5002]: I1209 10:05:44.759445 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Dec 09 10:05:44 crc kubenswrapper[5002]: I1209 10:05:44.847918 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Dec 09 10:05:44 crc kubenswrapper[5002]: I1209 10:05:44.901701 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Dec 09 10:05:44 crc kubenswrapper[5002]: I1209 10:05:44.904093 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Dec 09 10:05:44 crc kubenswrapper[5002]: I1209 10:05:44.970575 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Dec 09 10:05:44 crc kubenswrapper[5002]: I1209 10:05:44.971651 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Dec 09 10:05:44 crc kubenswrapper[5002]: I1209 10:05:44.973319 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Dec 09 10:05:44 crc kubenswrapper[5002]: I1209 10:05:44.988297 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Dec 09 10:05:44 crc kubenswrapper[5002]: I1209 10:05:44.992251 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Dec 09 10:05:45 crc kubenswrapper[5002]: I1209 10:05:45.018047 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Dec 09 10:05:45 crc kubenswrapper[5002]: I1209 10:05:45.037399 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Dec 09 10:05:45 crc kubenswrapper[5002]: I1209 10:05:45.279422 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Dec 09 10:05:45 crc kubenswrapper[5002]: I1209 10:05:45.298633 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Dec 09 10:05:45 crc kubenswrapper[5002]: I1209 10:05:45.316071 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Dec 09 10:05:45 crc kubenswrapper[5002]: I1209 10:05:45.348684 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Dec 09 10:05:45 crc kubenswrapper[5002]: I1209 10:05:45.353640 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Dec 09 10:05:45 crc kubenswrapper[5002]: I1209 10:05:45.447366 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Dec 09 10:05:45 crc kubenswrapper[5002]: I1209 10:05:45.459768 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Dec 09 10:05:45 crc kubenswrapper[5002]: I1209 10:05:45.471364 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Dec 09 10:05:45 crc kubenswrapper[5002]: I1209 10:05:45.551950 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Dec 09 10:05:45 crc kubenswrapper[5002]: I1209 10:05:45.599952 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Dec 09 10:05:45 crc kubenswrapper[5002]: I1209 10:05:45.606063 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Dec 09 10:05:45 crc kubenswrapper[5002]: I1209 10:05:45.661104 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Dec 09 10:05:45 crc kubenswrapper[5002]: I1209 10:05:45.678138 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Dec 09 10:05:45 crc kubenswrapper[5002]: I1209 10:05:45.724439 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Dec 09 10:05:45 crc kubenswrapper[5002]: I1209 10:05:45.856922 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Dec 09 10:05:45 crc kubenswrapper[5002]: I1209 10:05:45.923849 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Dec 09 10:05:45 crc kubenswrapper[5002]: I1209 10:05:45.972419 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Dec 09 10:05:45 crc kubenswrapper[5002]: I1209 10:05:45.981104 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Dec 09 10:05:46 crc kubenswrapper[5002]: I1209 10:05:46.000915 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Dec 09 10:05:46 crc kubenswrapper[5002]: I1209 10:05:46.063951 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Dec 09 10:05:46 crc kubenswrapper[5002]: I1209 10:05:46.078345 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Dec 09 10:05:46 crc kubenswrapper[5002]: I1209 10:05:46.091087 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Dec 09 10:05:46 crc kubenswrapper[5002]: I1209 10:05:46.120296 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Dec 09 10:05:46 crc kubenswrapper[5002]: I1209 10:05:46.221886 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Dec 09 10:05:46 crc kubenswrapper[5002]: I1209 10:05:46.233832 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Dec 09 10:05:46 crc kubenswrapper[5002]: I1209 10:05:46.325568 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Dec 09 10:05:46 crc kubenswrapper[5002]: I1209 10:05:46.338624 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Dec 09 10:05:46 crc kubenswrapper[5002]: I1209 10:05:46.347432 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Dec 09 10:05:46 crc kubenswrapper[5002]: I1209 10:05:46.441734 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Dec 09 10:05:46 crc kubenswrapper[5002]: I1209 10:05:46.634671 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Dec 09 10:05:46 crc kubenswrapper[5002]: I1209 10:05:46.720828 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Dec 09 10:05:46 crc kubenswrapper[5002]: I1209 10:05:46.897045 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Dec 09 10:05:46 crc kubenswrapper[5002]: I1209 10:05:46.939447 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Dec 09 10:05:47 crc kubenswrapper[5002]: I1209 10:05:47.047301 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Dec 09 10:05:47 crc kubenswrapper[5002]: I1209 10:05:47.079513 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Dec 09 10:05:47 crc kubenswrapper[5002]: I1209 10:05:47.134091 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Dec 09 10:05:47 crc kubenswrapper[5002]: I1209 10:05:47.148123 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Dec 09 10:05:47 crc kubenswrapper[5002]: I1209 10:05:47.156761 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Dec 09 10:05:47 crc kubenswrapper[5002]: I1209 10:05:47.222384 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Dec 09 10:05:47 crc kubenswrapper[5002]: I1209 10:05:47.239744 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Dec 09 10:05:47 crc kubenswrapper[5002]: I1209 10:05:47.259205 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Dec 09 10:05:47 crc kubenswrapper[5002]: I1209 10:05:47.284929 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Dec 09 10:05:47 crc kubenswrapper[5002]: I1209 10:05:47.379426 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Dec 09 10:05:47 crc kubenswrapper[5002]: I1209 10:05:47.397664 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Dec 09 10:05:47 crc kubenswrapper[5002]: I1209 10:05:47.535166 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Dec 09 10:05:47 crc kubenswrapper[5002]: I1209 10:05:47.543250 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Dec 09 10:05:47 crc kubenswrapper[5002]: I1209 10:05:47.545068 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Dec 09 10:05:47 crc kubenswrapper[5002]: I1209 10:05:47.595369 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Dec 09 10:05:47 crc kubenswrapper[5002]: I1209 10:05:47.597120 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Dec 09 10:05:47 crc kubenswrapper[5002]: I1209 10:05:47.650267 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Dec 09 10:05:47 crc kubenswrapper[5002]: I1209 10:05:47.664978 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Dec 09 10:05:47 crc kubenswrapper[5002]: I1209 10:05:47.672012 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Dec 09 10:05:47 crc kubenswrapper[5002]: I1209 10:05:47.684006 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Dec 09 10:05:47 crc kubenswrapper[5002]: I1209 10:05:47.693686 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Dec 09 10:05:47 crc kubenswrapper[5002]: I1209 10:05:47.799074 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Dec 09 10:05:47 crc kubenswrapper[5002]: I1209 10:05:47.823439 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Dec 09 10:05:47 crc kubenswrapper[5002]: I1209 10:05:47.870336 5002 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Dec 09 10:05:47 crc kubenswrapper[5002]: I1209 10:05:47.971970 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Dec 09 10:05:48 crc kubenswrapper[5002]: I1209 10:05:48.063091 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Dec 09 10:05:48 crc kubenswrapper[5002]: I1209 10:05:48.106269 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Dec 09 10:05:48 crc kubenswrapper[5002]: I1209 10:05:48.209802 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Dec 09 10:05:48 crc kubenswrapper[5002]: I1209 10:05:48.270767 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Dec 09 10:05:48 crc kubenswrapper[5002]: I1209 10:05:48.286636 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Dec 09 10:05:48 crc kubenswrapper[5002]: I1209 10:05:48.300894 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Dec 09 10:05:48 crc kubenswrapper[5002]: I1209 10:05:48.350315 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Dec 09 10:05:48 crc kubenswrapper[5002]: I1209 10:05:48.367212 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Dec 09 10:05:48 crc kubenswrapper[5002]: I1209 10:05:48.445742 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Dec 09 10:05:48 crc kubenswrapper[5002]: I1209 10:05:48.475584 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Dec 09 10:05:48 crc kubenswrapper[5002]: I1209 10:05:48.534349 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Dec 09 10:05:48 crc kubenswrapper[5002]: I1209 10:05:48.549591 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Dec 09 10:05:48 crc kubenswrapper[5002]: I1209 10:05:48.591760 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Dec 09 10:05:48 crc kubenswrapper[5002]: I1209 10:05:48.694411 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Dec 09 10:05:48 crc kubenswrapper[5002]: I1209 10:05:48.718163 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Dec 09 10:05:48 crc kubenswrapper[5002]: I1209 10:05:48.736018 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Dec 09 10:05:48 crc kubenswrapper[5002]: I1209 10:05:48.832335 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Dec 09 10:05:48 crc kubenswrapper[5002]: I1209 10:05:48.872113 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Dec 09 10:05:48 crc kubenswrapper[5002]: I1209 10:05:48.901380 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Dec 09 10:05:48 crc kubenswrapper[5002]: I1209 10:05:48.913559 5002 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Dec 09 10:05:48 crc kubenswrapper[5002]: I1209 10:05:48.913852 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://47f495cec7a4ee16110d5ccee09fa16753752241ab1d51f76112b659fb602e66" gracePeriod=5
Dec 09 10:05:48 crc kubenswrapper[5002]: I1209 10:05:48.919745 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Dec 09 10:05:48 crc kubenswrapper[5002]: I1209 10:05:48.942268 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Dec 09 10:05:48 crc kubenswrapper[5002]: I1209 10:05:48.966418 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Dec 09 10:05:49 crc kubenswrapper[5002]: I1209 10:05:49.084144 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Dec 09 10:05:49 crc kubenswrapper[5002]: I1209 10:05:49.093231 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Dec 09 10:05:49 crc kubenswrapper[5002]: I1209 10:05:49.147443 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Dec 09 10:05:49 crc kubenswrapper[5002]: I1209 10:05:49.262326 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Dec 09 10:05:49 crc kubenswrapper[5002]: I1209 10:05:49.364675 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Dec 09 10:05:49 crc kubenswrapper[5002]: I1209 10:05:49.404897 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Dec 09 10:05:49 crc kubenswrapper[5002]: I1209 10:05:49.405358 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Dec 09 10:05:49 crc kubenswrapper[5002]: I1209 10:05:49.408916 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Dec 09 10:05:49 crc kubenswrapper[5002]: I1209 10:05:49.416179 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Dec 09 10:05:49 crc kubenswrapper[5002]: I1209 10:05:49.434406 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Dec 09 10:05:49 crc kubenswrapper[5002]: I1209 10:05:49.512303 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Dec 09 10:05:49 crc kubenswrapper[5002]: I1209 10:05:49.568095 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Dec 09 10:05:49 crc kubenswrapper[5002]: I1209 10:05:49.605535 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Dec 09 10:05:49 crc kubenswrapper[5002]: I1209 10:05:49.674492 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Dec 09 10:05:49 crc kubenswrapper[5002]: I1209 10:05:49.866490 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Dec 09 10:05:49 crc kubenswrapper[5002]: I1209 10:05:49.901221 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Dec 09 10:05:50 crc kubenswrapper[5002]: I1209 10:05:50.015162 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Dec 09 10:05:50 crc kubenswrapper[5002]: I1209 10:05:50.134492 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Dec 09 10:05:50 crc kubenswrapper[5002]: I1209 10:05:50.175295 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Dec 09 10:05:50 crc kubenswrapper[5002]: I1209 10:05:50.184681 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Dec 09 10:05:50 crc kubenswrapper[5002]: I1209 10:05:50.380856 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Dec 09 10:05:50 crc kubenswrapper[5002]: I1209 10:05:50.404502 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Dec 09 10:05:50 crc kubenswrapper[5002]: I1209 10:05:50.440773 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Dec 09 10:05:50 crc kubenswrapper[5002]: I1209 10:05:50.452304 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Dec 09 10:05:50 crc kubenswrapper[5002]: I1209 10:05:50.466498 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Dec 09 10:05:50 crc kubenswrapper[5002]: I1209 10:05:50.479292 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Dec 09 10:05:50 crc kubenswrapper[5002]: I1209 10:05:50.547903 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Dec 09 10:05:50 crc kubenswrapper[5002]: I1209 10:05:50.567655 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Dec 09 10:05:50 crc kubenswrapper[5002]: I1209 10:05:50.608248 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Dec 09 10:05:50 crc kubenswrapper[5002]: I1209 10:05:50.632958 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Dec 09 10:05:50 crc kubenswrapper[5002]: I1209 10:05:50.919267 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Dec 09 10:05:50 crc kubenswrapper[5002]: I1209 10:05:50.935431 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lwx7v"]
Dec 09 10:05:50 crc kubenswrapper[5002]: I1209 10:05:50.936392 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lwx7v" podUID="2c7271b9-c21f-44a0-ad27-f4158f36f317" containerName="registry-server" containerID="cri-o://df7dbdd05bab8cb0e7ee78228042d387e0d50f88fe6661c9e15e153ba91180da" gracePeriod=30
Dec 09 10:05:50 crc kubenswrapper[5002]: I1209 10:05:50.947194 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2t8ql"]
Dec 09 10:05:50 crc kubenswrapper[5002]: I1209 10:05:50.947992 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2t8ql" podUID="39e38e20-dd9f-43b8-848f-e5618d52f6af" containerName="registry-server" containerID="cri-o://bc662e51aa429f06840b5229e4ab31e576f663fbe9d9a33571f4b4a3b2eef90b" gracePeriod=30
Dec 09 10:05:50 crc kubenswrapper[5002]: I1209 10:05:50.955634 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zbsjj"]
Dec 09 10:05:50 crc kubenswrapper[5002]: I1209 10:05:50.955961 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-zbsjj" podUID="58eda1c5-bb10-4843-9ed4-64ee38d040ee" containerName="marketplace-operator" containerID="cri-o://7f187db52e6b1de8f873db3376e44a8f54b8c3929a1d134c3d504770dc0328f9" gracePeriod=30
Dec 09 10:05:50 crc kubenswrapper[5002]: I1209 10:05:50.973760 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ptdnb"]
Dec 09 10:05:50 crc kubenswrapper[5002]: I1209 10:05:50.974052 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ptdnb" podUID="038508b9-13ca-43a3-8414-09097229bc83" containerName="registry-server" containerID="cri-o://5bc033d4dc59fccb04739a55fc369dcb7d6798a60c7fb76563249e7f9bb248aa" gracePeriod=30
Dec 09 10:05:50 crc kubenswrapper[5002]: I1209 10:05:50.985228 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Dec 09 10:05:50 crc kubenswrapper[5002]: I1209 10:05:50.988366 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dlr44"]
Dec 09 10:05:50 crc kubenswrapper[5002]: I1209 10:05:50.989364 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dlr44" podUID="eee6148b-020b-46a1-9066-4cc2fb430d13" containerName="registry-server" containerID="cri-o://3a0a50587467613ed93843246df92f780495ce62bc1c402465593d1317d65e6d" gracePeriod=30
Dec 09 10:05:50 crc kubenswrapper[5002]: I1209 10:05:50.998038 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ph5kb"]
Dec 09 10:05:50 crc kubenswrapper[5002]: E1209 10:05:50.998303 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18877059-32eb-40f8-98c9-3cca1a075ec6" containerName="installer"
Dec 09 10:05:50 crc kubenswrapper[5002]: I1209 10:05:50.998325 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="18877059-32eb-40f8-98c9-3cca1a075ec6" containerName="installer"
Dec 09 10:05:50 crc kubenswrapper[5002]: E1209 10:05:50.998346 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Dec 09 10:05:50 crc kubenswrapper[5002]: I1209 10:05:50.998354 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Dec 09 10:05:50 crc kubenswrapper[5002]: I1209 10:05:50.998491 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="18877059-32eb-40f8-98c9-3cca1a075ec6" containerName="installer"
Dec 09 10:05:50 crc kubenswrapper[5002]: I1209 10:05:50.998511 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Dec 09 10:05:50 crc kubenswrapper[5002]: I1209 10:05:50.998972 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ph5kb"
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ph5kb" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.008032 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.010872 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ph5kb"] Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.011988 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.084411 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csgln\" (UniqueName: \"kubernetes.io/projected/fe698bbf-4fa3-4d21-880a-110c1eedf006-kube-api-access-csgln\") pod \"marketplace-operator-79b997595-ph5kb\" (UID: \"fe698bbf-4fa3-4d21-880a-110c1eedf006\") " pod="openshift-marketplace/marketplace-operator-79b997595-ph5kb" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.084482 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fe698bbf-4fa3-4d21-880a-110c1eedf006-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ph5kb\" (UID: \"fe698bbf-4fa3-4d21-880a-110c1eedf006\") " pod="openshift-marketplace/marketplace-operator-79b997595-ph5kb" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.084526 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fe698bbf-4fa3-4d21-880a-110c1eedf006-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ph5kb\" (UID: \"fe698bbf-4fa3-4d21-880a-110c1eedf006\") " pod="openshift-marketplace/marketplace-operator-79b997595-ph5kb" Dec 09 10:05:51 crc kubenswrapper[5002]: E1209 10:05:51.091478 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of bc662e51aa429f06840b5229e4ab31e576f663fbe9d9a33571f4b4a3b2eef90b is running failed: container process not found" containerID="bc662e51aa429f06840b5229e4ab31e576f663fbe9d9a33571f4b4a3b2eef90b" cmd=["grpc_health_probe","-addr=:50051"] Dec 09 10:05:51 crc kubenswrapper[5002]: E1209 10:05:51.095157 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of bc662e51aa429f06840b5229e4ab31e576f663fbe9d9a33571f4b4a3b2eef90b is running failed: container process not found" containerID="bc662e51aa429f06840b5229e4ab31e576f663fbe9d9a33571f4b4a3b2eef90b" cmd=["grpc_health_probe","-addr=:50051"] Dec 09 10:05:51 crc kubenswrapper[5002]: E1209 10:05:51.095540 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of bc662e51aa429f06840b5229e4ab31e576f663fbe9d9a33571f4b4a3b2eef90b is running failed: container process not found" containerID="bc662e51aa429f06840b5229e4ab31e576f663fbe9d9a33571f4b4a3b2eef90b" cmd=["grpc_health_probe","-addr=:50051"] Dec 09 10:05:51 crc kubenswrapper[5002]: E1209 10:05:51.095619 5002 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.147953 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.186417 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csgln\" (UniqueName: \"kubernetes.io/projected/fe698bbf-4fa3-4d21-880a-110c1eedf006-kube-api-access-csgln\") pod \"marketplace-operator-79b997595-ph5kb\" (UID: \"fe698bbf-4fa3-4d21-880a-110c1eedf006\") " pod="openshift-marketplace/marketplace-operator-79b997595-ph5kb"
Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.186470 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fe698bbf-4fa3-4d21-880a-110c1eedf006-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ph5kb\" (UID: \"fe698bbf-4fa3-4d21-880a-110c1eedf006\") " pod="openshift-marketplace/marketplace-operator-79b997595-ph5kb"
Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.186501 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fe698bbf-4fa3-4d21-880a-110c1eedf006-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ph5kb\" (UID: \"fe698bbf-4fa3-4d21-880a-110c1eedf006\") " pod="openshift-marketplace/marketplace-operator-79b997595-ph5kb"
Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.187982 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fe698bbf-4fa3-4d21-880a-110c1eedf006-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ph5kb\" (UID: \"fe698bbf-4fa3-4d21-880a-110c1eedf006\") " pod="openshift-marketplace/marketplace-operator-79b997595-ph5kb"
Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.193115 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fe698bbf-4fa3-4d21-880a-110c1eedf006-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ph5kb\" (UID: \"fe698bbf-4fa3-4d21-880a-110c1eedf006\") " pod="openshift-marketplace/marketplace-operator-79b997595-ph5kb"
Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.206169 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csgln\" (UniqueName: \"kubernetes.io/projected/fe698bbf-4fa3-4d21-880a-110c1eedf006-kube-api-access-csgln\") pod \"marketplace-operator-79b997595-ph5kb\" (UID: \"fe698bbf-4fa3-4d21-880a-110c1eedf006\") " pod="openshift-marketplace/marketplace-operator-79b997595-ph5kb"
Dec 09 10:05:51 crc kubenswrapper[5002]: E1209 10:05:51.254975 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of df7dbdd05bab8cb0e7ee78228042d387e0d50f88fe6661c9e15e153ba91180da is running failed: container process not found" containerID="df7dbdd05bab8cb0e7ee78228042d387e0d50f88fe6661c9e15e153ba91180da" cmd=["grpc_health_probe","-addr=:50051"]
Dec 09 10:05:51 crc kubenswrapper[5002]: E1209 10:05:51.255525 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of df7dbdd05bab8cb0e7ee78228042d387e0d50f88fe6661c9e15e153ba91180da is running failed: container process not found" containerID="df7dbdd05bab8cb0e7ee78228042d387e0d50f88fe6661c9e15e153ba91180da" cmd=["grpc_health_probe","-addr=:50051"]
Dec 09 10:05:51 crc kubenswrapper[5002]: E1209 10:05:51.256994 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of df7dbdd05bab8cb0e7ee78228042d387e0d50f88fe6661c9e15e153ba91180da is running failed: container process not found" containerID="df7dbdd05bab8cb0e7ee78228042d387e0d50f88fe6661c9e15e153ba91180da" cmd=["grpc_health_probe","-addr=:50051"]
Dec 09 10:05:51 crc kubenswrapper[5002]: E1209 10:05:51.257033 5002 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of df7dbdd05bab8cb0e7ee78228042d387e0d50f88fe6661c9e15e153ba91180da is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-lwx7v" podUID="2c7271b9-c21f-44a0-ad27-f4158f36f317" containerName="registry-server"
Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.329479 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.385060 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.389900 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ph5kb"
Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.394612 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zbsjj"
Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.429194 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2t8ql"
Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.435444 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ptdnb"
Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.465771 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lwx7v"
Need to start a new one" pod="openshift-marketplace/certified-operators-lwx7v" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.504805 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39e38e20-dd9f-43b8-848f-e5618d52f6af-catalog-content\") pod \"39e38e20-dd9f-43b8-848f-e5618d52f6af\" (UID: \"39e38e20-dd9f-43b8-848f-e5618d52f6af\") " Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.504872 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/58eda1c5-bb10-4843-9ed4-64ee38d040ee-marketplace-operator-metrics\") pod \"58eda1c5-bb10-4843-9ed4-64ee38d040ee\" (UID: \"58eda1c5-bb10-4843-9ed4-64ee38d040ee\") " Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.504901 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/038508b9-13ca-43a3-8414-09097229bc83-utilities\") pod \"038508b9-13ca-43a3-8414-09097229bc83\" (UID: \"038508b9-13ca-43a3-8414-09097229bc83\") " Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.504925 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wl4j\" (UniqueName: \"kubernetes.io/projected/2c7271b9-c21f-44a0-ad27-f4158f36f317-kube-api-access-6wl4j\") pod \"2c7271b9-c21f-44a0-ad27-f4158f36f317\" (UID: \"2c7271b9-c21f-44a0-ad27-f4158f36f317\") " Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.504969 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39e38e20-dd9f-43b8-848f-e5618d52f6af-utilities\") pod \"39e38e20-dd9f-43b8-848f-e5618d52f6af\" (UID: \"39e38e20-dd9f-43b8-848f-e5618d52f6af\") " Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.505011 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/038508b9-13ca-43a3-8414-09097229bc83-catalog-content\") pod \"038508b9-13ca-43a3-8414-09097229bc83\" (UID: \"038508b9-13ca-43a3-8414-09097229bc83\") " Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.505051 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c7271b9-c21f-44a0-ad27-f4158f36f317-utilities\") pod \"2c7271b9-c21f-44a0-ad27-f4158f36f317\" (UID: \"2c7271b9-c21f-44a0-ad27-f4158f36f317\") " Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.505074 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5z2pz\" (UniqueName: \"kubernetes.io/projected/038508b9-13ca-43a3-8414-09097229bc83-kube-api-access-5z2pz\") pod \"038508b9-13ca-43a3-8414-09097229bc83\" (UID: \"038508b9-13ca-43a3-8414-09097229bc83\") " Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.505096 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbszl\" (UniqueName: \"kubernetes.io/projected/58eda1c5-bb10-4843-9ed4-64ee38d040ee-kube-api-access-jbszl\") pod \"58eda1c5-bb10-4843-9ed4-64ee38d040ee\" (UID: \"58eda1c5-bb10-4843-9ed4-64ee38d040ee\") " Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.505127 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/58eda1c5-bb10-4843-9ed4-64ee38d040ee-marketplace-trusted-ca\") pod \"58eda1c5-bb10-4843-9ed4-64ee38d040ee\" (UID: \"58eda1c5-bb10-4843-9ed4-64ee38d040ee\") " Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.505152 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whdbq\" (UniqueName: \"kubernetes.io/projected/39e38e20-dd9f-43b8-848f-e5618d52f6af-kube-api-access-whdbq\") pod \"39e38e20-dd9f-43b8-848f-e5618d52f6af\" (UID: \"39e38e20-dd9f-43b8-848f-e5618d52f6af\") " Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.505182 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c7271b9-c21f-44a0-ad27-f4158f36f317-catalog-content\") pod \"2c7271b9-c21f-44a0-ad27-f4158f36f317\" (UID: \"2c7271b9-c21f-44a0-ad27-f4158f36f317\") " Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.511680 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39e38e20-dd9f-43b8-848f-e5618d52f6af-utilities" (OuterVolumeSpecName: "utilities") pod "39e38e20-dd9f-43b8-848f-e5618d52f6af" (UID: "39e38e20-dd9f-43b8-848f-e5618d52f6af"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.512289 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/038508b9-13ca-43a3-8414-09097229bc83-utilities" (OuterVolumeSpecName: "utilities") pod "038508b9-13ca-43a3-8414-09097229bc83" (UID: "038508b9-13ca-43a3-8414-09097229bc83"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.512698 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c7271b9-c21f-44a0-ad27-f4158f36f317-utilities" (OuterVolumeSpecName: "utilities") pod "2c7271b9-c21f-44a0-ad27-f4158f36f317" (UID: "2c7271b9-c21f-44a0-ad27-f4158f36f317"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.513438 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/038508b9-13ca-43a3-8414-09097229bc83-kube-api-access-5z2pz" (OuterVolumeSpecName: "kube-api-access-5z2pz") pod "038508b9-13ca-43a3-8414-09097229bc83" (UID: "038508b9-13ca-43a3-8414-09097229bc83"). InnerVolumeSpecName "kube-api-access-5z2pz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.513641 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58eda1c5-bb10-4843-9ed4-64ee38d040ee-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "58eda1c5-bb10-4843-9ed4-64ee38d040ee" (UID: "58eda1c5-bb10-4843-9ed4-64ee38d040ee"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.516307 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39e38e20-dd9f-43b8-848f-e5618d52f6af-kube-api-access-whdbq" (OuterVolumeSpecName: "kube-api-access-whdbq") pod "39e38e20-dd9f-43b8-848f-e5618d52f6af" (UID: "39e38e20-dd9f-43b8-848f-e5618d52f6af"). InnerVolumeSpecName "kube-api-access-whdbq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.516419 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c7271b9-c21f-44a0-ad27-f4158f36f317-kube-api-access-6wl4j" (OuterVolumeSpecName: "kube-api-access-6wl4j") pod "2c7271b9-c21f-44a0-ad27-f4158f36f317" (UID: "2c7271b9-c21f-44a0-ad27-f4158f36f317"). InnerVolumeSpecName "kube-api-access-6wl4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.517723 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58eda1c5-bb10-4843-9ed4-64ee38d040ee-kube-api-access-jbszl" (OuterVolumeSpecName: "kube-api-access-jbszl") pod "58eda1c5-bb10-4843-9ed4-64ee38d040ee" (UID: "58eda1c5-bb10-4843-9ed4-64ee38d040ee"). InnerVolumeSpecName "kube-api-access-jbszl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.529451 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dlr44" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.533054 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58eda1c5-bb10-4843-9ed4-64ee38d040ee-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "58eda1c5-bb10-4843-9ed4-64ee38d040ee" (UID: "58eda1c5-bb10-4843-9ed4-64ee38d040ee"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.536651 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/038508b9-13ca-43a3-8414-09097229bc83-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "038508b9-13ca-43a3-8414-09097229bc83" (UID: "038508b9-13ca-43a3-8414-09097229bc83"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.547683 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.559294 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39e38e20-dd9f-43b8-848f-e5618d52f6af-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "39e38e20-dd9f-43b8-848f-e5618d52f6af" (UID: "39e38e20-dd9f-43b8-848f-e5618d52f6af"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.582252 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c7271b9-c21f-44a0-ad27-f4158f36f317-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2c7271b9-c21f-44a0-ad27-f4158f36f317" (UID: "2c7271b9-c21f-44a0-ad27-f4158f36f317"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.612317 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eee6148b-020b-46a1-9066-4cc2fb430d13-utilities\") pod \"eee6148b-020b-46a1-9066-4cc2fb430d13\" (UID: \"eee6148b-020b-46a1-9066-4cc2fb430d13\") " Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.612372 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfffc\" (UniqueName: \"kubernetes.io/projected/eee6148b-020b-46a1-9066-4cc2fb430d13-kube-api-access-rfffc\") pod \"eee6148b-020b-46a1-9066-4cc2fb430d13\" (UID: \"eee6148b-020b-46a1-9066-4cc2fb430d13\") " Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.612444 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eee6148b-020b-46a1-9066-4cc2fb430d13-catalog-content\") pod \"eee6148b-020b-46a1-9066-4cc2fb430d13\" (UID: \"eee6148b-020b-46a1-9066-4cc2fb430d13\") " Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.612692 5002 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/58eda1c5-bb10-4843-9ed4-64ee38d040ee-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.612709 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39e38e20-dd9f-43b8-848f-e5618d52f6af-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.612721 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/038508b9-13ca-43a3-8414-09097229bc83-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.612732 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wl4j\" (UniqueName: \"kubernetes.io/projected/2c7271b9-c21f-44a0-ad27-f4158f36f317-kube-api-access-6wl4j\") on node \"crc\" DevicePath \"\"" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.612745 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39e38e20-dd9f-43b8-848f-e5618d52f6af-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.613104 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/038508b9-13ca-43a3-8414-09097229bc83-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.613116 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c7271b9-c21f-44a0-ad27-f4158f36f317-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.613127 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5z2pz\" (UniqueName: \"kubernetes.io/projected/038508b9-13ca-43a3-8414-09097229bc83-kube-api-access-5z2pz\") on node \"crc\" DevicePath \"\"" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.613138 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbszl\" (UniqueName: \"kubernetes.io/projected/58eda1c5-bb10-4843-9ed4-64ee38d040ee-kube-api-access-jbszl\") on node 
\"crc\" DevicePath \"\"" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.613149 5002 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/58eda1c5-bb10-4843-9ed4-64ee38d040ee-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.613160 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whdbq\" (UniqueName: \"kubernetes.io/projected/39e38e20-dd9f-43b8-848f-e5618d52f6af-kube-api-access-whdbq\") on node \"crc\" DevicePath \"\"" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.613170 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c7271b9-c21f-44a0-ad27-f4158f36f317-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.613192 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eee6148b-020b-46a1-9066-4cc2fb430d13-utilities" (OuterVolumeSpecName: "utilities") pod "eee6148b-020b-46a1-9066-4cc2fb430d13" (UID: "eee6148b-020b-46a1-9066-4cc2fb430d13"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.619984 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eee6148b-020b-46a1-9066-4cc2fb430d13-kube-api-access-rfffc" (OuterVolumeSpecName: "kube-api-access-rfffc") pod "eee6148b-020b-46a1-9066-4cc2fb430d13" (UID: "eee6148b-020b-46a1-9066-4cc2fb430d13"). InnerVolumeSpecName "kube-api-access-rfffc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.714752 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eee6148b-020b-46a1-9066-4cc2fb430d13-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.714792 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfffc\" (UniqueName: \"kubernetes.io/projected/eee6148b-020b-46a1-9066-4cc2fb430d13-kube-api-access-rfffc\") on node \"crc\" DevicePath \"\"" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.722665 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eee6148b-020b-46a1-9066-4cc2fb430d13-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eee6148b-020b-46a1-9066-4cc2fb430d13" (UID: "eee6148b-020b-46a1-9066-4cc2fb430d13"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.756728 5002 generic.go:334] "Generic (PLEG): container finished" podID="39e38e20-dd9f-43b8-848f-e5618d52f6af" containerID="bc662e51aa429f06840b5229e4ab31e576f663fbe9d9a33571f4b4a3b2eef90b" exitCode=0 Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.756798 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2t8ql" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.756825 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2t8ql" event={"ID":"39e38e20-dd9f-43b8-848f-e5618d52f6af","Type":"ContainerDied","Data":"bc662e51aa429f06840b5229e4ab31e576f663fbe9d9a33571f4b4a3b2eef90b"} Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.756858 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2t8ql" event={"ID":"39e38e20-dd9f-43b8-848f-e5618d52f6af","Type":"ContainerDied","Data":"4e273412823b2a2334e2f0b11bf3f1ad0219e3cead84719ca1fa8ed58a9c55ea"} Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.756877 5002 scope.go:117] "RemoveContainer" containerID="bc662e51aa429f06840b5229e4ab31e576f663fbe9d9a33571f4b4a3b2eef90b" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.761851 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lwx7v" event={"ID":"2c7271b9-c21f-44a0-ad27-f4158f36f317","Type":"ContainerDied","Data":"df7dbdd05bab8cb0e7ee78228042d387e0d50f88fe6661c9e15e153ba91180da"} Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.762180 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lwx7v" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.762871 5002 generic.go:334] "Generic (PLEG): container finished" podID="2c7271b9-c21f-44a0-ad27-f4158f36f317" containerID="df7dbdd05bab8cb0e7ee78228042d387e0d50f88fe6661c9e15e153ba91180da" exitCode=0 Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.762937 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lwx7v" event={"ID":"2c7271b9-c21f-44a0-ad27-f4158f36f317","Type":"ContainerDied","Data":"e7d0c686101ed9ef230922365b540600dcb072af37dc3a21e4470f962c84f315"} Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.764892 5002 generic.go:334] "Generic (PLEG): container finished" podID="58eda1c5-bb10-4843-9ed4-64ee38d040ee" containerID="7f187db52e6b1de8f873db3376e44a8f54b8c3929a1d134c3d504770dc0328f9" exitCode=0 Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.764986 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zbsjj" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.765019 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zbsjj" event={"ID":"58eda1c5-bb10-4843-9ed4-64ee38d040ee","Type":"ContainerDied","Data":"7f187db52e6b1de8f873db3376e44a8f54b8c3929a1d134c3d504770dc0328f9"} Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.765051 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zbsjj" event={"ID":"58eda1c5-bb10-4843-9ed4-64ee38d040ee","Type":"ContainerDied","Data":"f997ec069e6e15f37849acdf17bc7ab4eb22ef7212b328411bd5bbfc3c757675"} Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.767899 5002 generic.go:334] "Generic (PLEG): container finished" podID="038508b9-13ca-43a3-8414-09097229bc83" containerID="5bc033d4dc59fccb04739a55fc369dcb7d6798a60c7fb76563249e7f9bb248aa" exitCode=0 Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.767944 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ptdnb" event={"ID":"038508b9-13ca-43a3-8414-09097229bc83","Type":"ContainerDied","Data":"5bc033d4dc59fccb04739a55fc369dcb7d6798a60c7fb76563249e7f9bb248aa"} Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.767971 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ptdnb" event={"ID":"038508b9-13ca-43a3-8414-09097229bc83","Type":"ContainerDied","Data":"110eefe8216fc554134c9a753cf2bd8e2d88ef78e50a66c4abcdfb5380faf93a"} Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.768027 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ptdnb" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.777348 5002 generic.go:334] "Generic (PLEG): container finished" podID="eee6148b-020b-46a1-9066-4cc2fb430d13" containerID="3a0a50587467613ed93843246df92f780495ce62bc1c402465593d1317d65e6d" exitCode=0 Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.777404 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dlr44" event={"ID":"eee6148b-020b-46a1-9066-4cc2fb430d13","Type":"ContainerDied","Data":"3a0a50587467613ed93843246df92f780495ce62bc1c402465593d1317d65e6d"} Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.777438 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dlr44" event={"ID":"eee6148b-020b-46a1-9066-4cc2fb430d13","Type":"ContainerDied","Data":"ec00257f9c62aaeac26b4afaf67d3e28276590708523ae92d7340822adedb31a"} Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.777521 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dlr44" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.786940 5002 scope.go:117] "RemoveContainer" containerID="c8cf32ec45ce42190e0e3cb894d8f3b69c14539d9b0b4fbd7a2f498d773b3dc2" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.811540 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2t8ql"] Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.817559 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eee6148b-020b-46a1-9066-4cc2fb430d13-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.834591 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2t8ql"] Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.839452 5002 scope.go:117] "RemoveContainer" containerID="f1401b3b4b61efe0f5bd2c77722361f032246bc600229a0e0b7237691d356867" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.850422 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zbsjj"] Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.862339 5002 scope.go:117] "RemoveContainer" containerID="bc662e51aa429f06840b5229e4ab31e576f663fbe9d9a33571f4b4a3b2eef90b" Dec 09 10:05:51 crc kubenswrapper[5002]: E1209 10:05:51.862941 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc662e51aa429f06840b5229e4ab31e576f663fbe9d9a33571f4b4a3b2eef90b\": container with ID starting with bc662e51aa429f06840b5229e4ab31e576f663fbe9d9a33571f4b4a3b2eef90b not found: ID does not exist" containerID="bc662e51aa429f06840b5229e4ab31e576f663fbe9d9a33571f4b4a3b2eef90b" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.862982 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc662e51aa429f06840b5229e4ab31e576f663fbe9d9a33571f4b4a3b2eef90b"} err="failed to get container status \"bc662e51aa429f06840b5229e4ab31e576f663fbe9d9a33571f4b4a3b2eef90b\": rpc error: code = NotFound desc = could not find container \"bc662e51aa429f06840b5229e4ab31e576f663fbe9d9a33571f4b4a3b2eef90b\": container with ID starting with bc662e51aa429f06840b5229e4ab31e576f663fbe9d9a33571f4b4a3b2eef90b not found: ID does not exist" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.863007 5002 scope.go:117] "RemoveContainer" containerID="c8cf32ec45ce42190e0e3cb894d8f3b69c14539d9b0b4fbd7a2f498d773b3dc2" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.863014 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zbsjj"] Dec 09 10:05:51 crc kubenswrapper[5002]: E1209 10:05:51.863386 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8cf32ec45ce42190e0e3cb894d8f3b69c14539d9b0b4fbd7a2f498d773b3dc2\": container with ID starting with c8cf32ec45ce42190e0e3cb894d8f3b69c14539d9b0b4fbd7a2f498d773b3dc2 not found: ID does not exist" containerID="c8cf32ec45ce42190e0e3cb894d8f3b69c14539d9b0b4fbd7a2f498d773b3dc2" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.863483 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8cf32ec45ce42190e0e3cb894d8f3b69c14539d9b0b4fbd7a2f498d773b3dc2"} err="failed to get 
container status \"c8cf32ec45ce42190e0e3cb894d8f3b69c14539d9b0b4fbd7a2f498d773b3dc2\": rpc error: code = NotFound desc = could not find container \"c8cf32ec45ce42190e0e3cb894d8f3b69c14539d9b0b4fbd7a2f498d773b3dc2\": container with ID starting with c8cf32ec45ce42190e0e3cb894d8f3b69c14539d9b0b4fbd7a2f498d773b3dc2 not found: ID does not exist" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.863569 5002 scope.go:117] "RemoveContainer" containerID="f1401b3b4b61efe0f5bd2c77722361f032246bc600229a0e0b7237691d356867" Dec 09 10:05:51 crc kubenswrapper[5002]: E1209 10:05:51.864112 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1401b3b4b61efe0f5bd2c77722361f032246bc600229a0e0b7237691d356867\": container with ID starting with f1401b3b4b61efe0f5bd2c77722361f032246bc600229a0e0b7237691d356867 not found: ID does not exist" containerID="f1401b3b4b61efe0f5bd2c77722361f032246bc600229a0e0b7237691d356867" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.864174 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1401b3b4b61efe0f5bd2c77722361f032246bc600229a0e0b7237691d356867"} err="failed to get container status \"f1401b3b4b61efe0f5bd2c77722361f032246bc600229a0e0b7237691d356867\": rpc error: code = NotFound desc = could not find container \"f1401b3b4b61efe0f5bd2c77722361f032246bc600229a0e0b7237691d356867\": container with ID starting with f1401b3b4b61efe0f5bd2c77722361f032246bc600229a0e0b7237691d356867 not found: ID does not exist" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.864193 5002 scope.go:117] "RemoveContainer" containerID="df7dbdd05bab8cb0e7ee78228042d387e0d50f88fe6661c9e15e153ba91180da" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.868272 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lwx7v"] Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.872926 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lwx7v"] Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.877155 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ph5kb"] Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.880065 5002 scope.go:117] "RemoveContainer" containerID="1a1703fc4497ed500211349bc3d5d1221e45f4a6555a6dc38d5aa7bce849cf17" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.881344 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ptdnb"] Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.885102 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ptdnb"] Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.888889 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dlr44"] Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.893029 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dlr44"] Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.897001 5002 scope.go:117] "RemoveContainer" containerID="8716a18b6cf69a4d331c466e77fc5276e9f028a821ebe3f9bfde4b0bb599d9e0" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.921576 5002 scope.go:117] "RemoveContainer" containerID="df7dbdd05bab8cb0e7ee78228042d387e0d50f88fe6661c9e15e153ba91180da" Dec 09 10:05:51 crc kubenswrapper[5002]: 
Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.922262 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df7dbdd05bab8cb0e7ee78228042d387e0d50f88fe6661c9e15e153ba91180da"} err="failed to get container status \"df7dbdd05bab8cb0e7ee78228042d387e0d50f88fe6661c9e15e153ba91180da\": rpc error: code = NotFound desc = could not find container \"df7dbdd05bab8cb0e7ee78228042d387e0d50f88fe6661c9e15e153ba91180da\": container with ID starting with df7dbdd05bab8cb0e7ee78228042d387e0d50f88fe6661c9e15e153ba91180da not found: ID does not exist"
Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.922285 5002 scope.go:117] "RemoveContainer" containerID="1a1703fc4497ed500211349bc3d5d1221e45f4a6555a6dc38d5aa7bce849cf17"
Dec 09 10:05:51 crc kubenswrapper[5002]: E1209 10:05:51.922520 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a1703fc4497ed500211349bc3d5d1221e45f4a6555a6dc38d5aa7bce849cf17\": container with ID starting with 1a1703fc4497ed500211349bc3d5d1221e45f4a6555a6dc38d5aa7bce849cf17 not found: ID does not exist" containerID="1a1703fc4497ed500211349bc3d5d1221e45f4a6555a6dc38d5aa7bce849cf17"
Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.922538 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a1703fc4497ed500211349bc3d5d1221e45f4a6555a6dc38d5aa7bce849cf17"} err="failed to get container status \"1a1703fc4497ed500211349bc3d5d1221e45f4a6555a6dc38d5aa7bce849cf17\": rpc error: code = NotFound desc = could not find container \"1a1703fc4497ed500211349bc3d5d1221e45f4a6555a6dc38d5aa7bce849cf17\": container with ID starting with 1a1703fc4497ed500211349bc3d5d1221e45f4a6555a6dc38d5aa7bce849cf17 not found: ID does not exist"
Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.922553 5002 scope.go:117] "RemoveContainer" containerID="8716a18b6cf69a4d331c466e77fc5276e9f028a821ebe3f9bfde4b0bb599d9e0"
Dec 09 10:05:51 crc kubenswrapper[5002]: E1209 10:05:51.922850 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8716a18b6cf69a4d331c466e77fc5276e9f028a821ebe3f9bfde4b0bb599d9e0\": container with ID starting with 8716a18b6cf69a4d331c466e77fc5276e9f028a821ebe3f9bfde4b0bb599d9e0 not found: ID does not exist" containerID="8716a18b6cf69a4d331c466e77fc5276e9f028a821ebe3f9bfde4b0bb599d9e0"
Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.922866 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8716a18b6cf69a4d331c466e77fc5276e9f028a821ebe3f9bfde4b0bb599d9e0"} err="failed to get container status \"8716a18b6cf69a4d331c466e77fc5276e9f028a821ebe3f9bfde4b0bb599d9e0\": rpc error: code = NotFound desc = could not find container \"8716a18b6cf69a4d331c466e77fc5276e9f028a821ebe3f9bfde4b0bb599d9e0\": container with ID starting with 8716a18b6cf69a4d331c466e77fc5276e9f028a821ebe3f9bfde4b0bb599d9e0 not found: ID does not exist"
Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.922878 5002 scope.go:117] "RemoveContainer" containerID="7f187db52e6b1de8f873db3376e44a8f54b8c3929a1d134c3d504770dc0328f9"
5002 scope.go:117] "RemoveContainer" containerID="7f187db52e6b1de8f873db3376e44a8f54b8c3929a1d134c3d504770dc0328f9" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.949150 5002 scope.go:117] "RemoveContainer" containerID="7f187db52e6b1de8f873db3376e44a8f54b8c3929a1d134c3d504770dc0328f9" Dec 09 10:05:51 crc kubenswrapper[5002]: E1209 10:05:51.955447 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f187db52e6b1de8f873db3376e44a8f54b8c3929a1d134c3d504770dc0328f9\": container with ID starting with 7f187db52e6b1de8f873db3376e44a8f54b8c3929a1d134c3d504770dc0328f9 not found: ID does not exist" containerID="7f187db52e6b1de8f873db3376e44a8f54b8c3929a1d134c3d504770dc0328f9" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.955489 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f187db52e6b1de8f873db3376e44a8f54b8c3929a1d134c3d504770dc0328f9"} err="failed to get container status \"7f187db52e6b1de8f873db3376e44a8f54b8c3929a1d134c3d504770dc0328f9\": rpc error: code = NotFound desc = could not find container \"7f187db52e6b1de8f873db3376e44a8f54b8c3929a1d134c3d504770dc0328f9\": container with ID starting with 7f187db52e6b1de8f873db3376e44a8f54b8c3929a1d134c3d504770dc0328f9 not found: ID does not exist" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.955520 5002 scope.go:117] "RemoveContainer" containerID="5bc033d4dc59fccb04739a55fc369dcb7d6798a60c7fb76563249e7f9bb248aa" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.971088 5002 scope.go:117] "RemoveContainer" containerID="428ae6ca9b62f545763de424679882ca418bb5328308202cbe5e76a963b7548f" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.983940 5002 scope.go:117] "RemoveContainer" containerID="97ca49eb7ee783f45d7c09dda4ace63c3d6f47600a0f7028959d427ba1b6e1f7" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.995073 5002 scope.go:117] "RemoveContainer" containerID="5bc033d4dc59fccb04739a55fc369dcb7d6798a60c7fb76563249e7f9bb248aa" Dec 09 10:05:51 crc kubenswrapper[5002]: E1209 10:05:51.995573 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bc033d4dc59fccb04739a55fc369dcb7d6798a60c7fb76563249e7f9bb248aa\": container with ID starting with 5bc033d4dc59fccb04739a55fc369dcb7d6798a60c7fb76563249e7f9bb248aa not found: ID does not exist" containerID="5bc033d4dc59fccb04739a55fc369dcb7d6798a60c7fb76563249e7f9bb248aa" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.995621 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bc033d4dc59fccb04739a55fc369dcb7d6798a60c7fb76563249e7f9bb248aa"} err="failed to get container status \"5bc033d4dc59fccb04739a55fc369dcb7d6798a60c7fb76563249e7f9bb248aa\": rpc error: code = NotFound desc = could not find container \"5bc033d4dc59fccb04739a55fc369dcb7d6798a60c7fb76563249e7f9bb248aa\": container with ID starting with 5bc033d4dc59fccb04739a55fc369dcb7d6798a60c7fb76563249e7f9bb248aa not found: ID does not exist" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.995647 5002 scope.go:117] "RemoveContainer" containerID="428ae6ca9b62f545763de424679882ca418bb5328308202cbe5e76a963b7548f" Dec 09 10:05:51 crc kubenswrapper[5002]: E1209 10:05:51.997556 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"428ae6ca9b62f545763de424679882ca418bb5328308202cbe5e76a963b7548f\": container with ID starting with 428ae6ca9b62f545763de424679882ca418bb5328308202cbe5e76a963b7548f not found: ID does not exist" containerID="428ae6ca9b62f545763de424679882ca418bb5328308202cbe5e76a963b7548f" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.997589 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"428ae6ca9b62f545763de424679882ca418bb5328308202cbe5e76a963b7548f"} err="failed to get container status \"428ae6ca9b62f545763de424679882ca418bb5328308202cbe5e76a963b7548f\": rpc error: code = NotFound desc = could not find container \"428ae6ca9b62f545763de424679882ca418bb5328308202cbe5e76a963b7548f\": container with ID starting with 428ae6ca9b62f545763de424679882ca418bb5328308202cbe5e76a963b7548f not found: ID does not exist" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.997610 5002 scope.go:117] "RemoveContainer" containerID="97ca49eb7ee783f45d7c09dda4ace63c3d6f47600a0f7028959d427ba1b6e1f7" Dec 09 10:05:51 crc kubenswrapper[5002]: E1209 10:05:51.997962 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97ca49eb7ee783f45d7c09dda4ace63c3d6f47600a0f7028959d427ba1b6e1f7\": container with ID starting with 97ca49eb7ee783f45d7c09dda4ace63c3d6f47600a0f7028959d427ba1b6e1f7 not found: ID does not exist" containerID="97ca49eb7ee783f45d7c09dda4ace63c3d6f47600a0f7028959d427ba1b6e1f7" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.997997 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97ca49eb7ee783f45d7c09dda4ace63c3d6f47600a0f7028959d427ba1b6e1f7"} err="failed to get container status \"97ca49eb7ee783f45d7c09dda4ace63c3d6f47600a0f7028959d427ba1b6e1f7\": rpc error: code = NotFound desc = could not find container \"97ca49eb7ee783f45d7c09dda4ace63c3d6f47600a0f7028959d427ba1b6e1f7\": container with ID starting with 97ca49eb7ee783f45d7c09dda4ace63c3d6f47600a0f7028959d427ba1b6e1f7 not found: ID does not exist" Dec 09 10:05:51 crc kubenswrapper[5002]: I1209 10:05:51.998164 5002 scope.go:117] "RemoveContainer" containerID="3a0a50587467613ed93843246df92f780495ce62bc1c402465593d1317d65e6d" Dec 09 10:05:52 crc kubenswrapper[5002]: I1209 10:05:52.011295 5002 scope.go:117] "RemoveContainer" containerID="cf1abfd43763cb606db536b9dc79084807ad5e698579ae25755b164a4a02963d" Dec 09 10:05:52 crc kubenswrapper[5002]: I1209 10:05:52.025052 5002 scope.go:117] "RemoveContainer" containerID="27b1443e54c6d8eafba4a3e3be1fa051be438ef8ade31c3ab7e8491c1731a8df" Dec 09 10:05:52 crc kubenswrapper[5002]: I1209 10:05:52.037385 5002 scope.go:117] "RemoveContainer" containerID="3a0a50587467613ed93843246df92f780495ce62bc1c402465593d1317d65e6d" Dec 09 10:05:52 crc kubenswrapper[5002]: E1209 10:05:52.037709 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a0a50587467613ed93843246df92f780495ce62bc1c402465593d1317d65e6d\": container with ID starting with 3a0a50587467613ed93843246df92f780495ce62bc1c402465593d1317d65e6d not found: ID does not exist" containerID="3a0a50587467613ed93843246df92f780495ce62bc1c402465593d1317d65e6d" Dec 09 10:05:52 crc kubenswrapper[5002]: I1209 10:05:52.037743 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a0a50587467613ed93843246df92f780495ce62bc1c402465593d1317d65e6d"} err="failed to get container status 
\"3a0a50587467613ed93843246df92f780495ce62bc1c402465593d1317d65e6d\": rpc error: code = NotFound desc = could not find container \"3a0a50587467613ed93843246df92f780495ce62bc1c402465593d1317d65e6d\": container with ID starting with 3a0a50587467613ed93843246df92f780495ce62bc1c402465593d1317d65e6d not found: ID does not exist" Dec 09 10:05:52 crc kubenswrapper[5002]: I1209 10:05:52.037770 5002 scope.go:117] "RemoveContainer" containerID="cf1abfd43763cb606db536b9dc79084807ad5e698579ae25755b164a4a02963d" Dec 09 10:05:52 crc kubenswrapper[5002]: E1209 10:05:52.038106 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf1abfd43763cb606db536b9dc79084807ad5e698579ae25755b164a4a02963d\": container with ID starting with cf1abfd43763cb606db536b9dc79084807ad5e698579ae25755b164a4a02963d not found: ID does not exist" containerID="cf1abfd43763cb606db536b9dc79084807ad5e698579ae25755b164a4a02963d" Dec 09 10:05:52 crc kubenswrapper[5002]: I1209 10:05:52.038141 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf1abfd43763cb606db536b9dc79084807ad5e698579ae25755b164a4a02963d"} err="failed to get container status \"cf1abfd43763cb606db536b9dc79084807ad5e698579ae25755b164a4a02963d\": rpc error: code = NotFound desc = could not find container \"cf1abfd43763cb606db536b9dc79084807ad5e698579ae25755b164a4a02963d\": container with ID starting with cf1abfd43763cb606db536b9dc79084807ad5e698579ae25755b164a4a02963d not found: ID does not exist" Dec 09 10:05:52 crc kubenswrapper[5002]: I1209 10:05:52.038161 5002 scope.go:117] "RemoveContainer" containerID="27b1443e54c6d8eafba4a3e3be1fa051be438ef8ade31c3ab7e8491c1731a8df" Dec 09 10:05:52 crc kubenswrapper[5002]: E1209 10:05:52.038473 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27b1443e54c6d8eafba4a3e3be1fa051be438ef8ade31c3ab7e8491c1731a8df\": container with ID starting with 27b1443e54c6d8eafba4a3e3be1fa051be438ef8ade31c3ab7e8491c1731a8df not found: ID does not exist" containerID="27b1443e54c6d8eafba4a3e3be1fa051be438ef8ade31c3ab7e8491c1731a8df" Dec 09 10:05:52 crc kubenswrapper[5002]: I1209 10:05:52.038501 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27b1443e54c6d8eafba4a3e3be1fa051be438ef8ade31c3ab7e8491c1731a8df"} err="failed to get container status \"27b1443e54c6d8eafba4a3e3be1fa051be438ef8ade31c3ab7e8491c1731a8df\": rpc error: code = NotFound desc = could not find container \"27b1443e54c6d8eafba4a3e3be1fa051be438ef8ade31c3ab7e8491c1731a8df\": container with ID starting with 27b1443e54c6d8eafba4a3e3be1fa051be438ef8ade31c3ab7e8491c1731a8df not found: ID does not exist" Dec 09 10:05:52 crc kubenswrapper[5002]: I1209 10:05:52.073525 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="038508b9-13ca-43a3-8414-09097229bc83" path="/var/lib/kubelet/pods/038508b9-13ca-43a3-8414-09097229bc83/volumes" Dec 09 10:05:52 crc kubenswrapper[5002]: I1209 10:05:52.074121 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c7271b9-c21f-44a0-ad27-f4158f36f317" path="/var/lib/kubelet/pods/2c7271b9-c21f-44a0-ad27-f4158f36f317/volumes" Dec 09 10:05:52 crc kubenswrapper[5002]: I1209 10:05:52.074738 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39e38e20-dd9f-43b8-848f-e5618d52f6af" path="/var/lib/kubelet/pods/39e38e20-dd9f-43b8-848f-e5618d52f6af/volumes" Dec 
Dec 09 10:05:52 crc kubenswrapper[5002]: I1209 10:05:52.076624 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eee6148b-020b-46a1-9066-4cc2fb430d13" path="/var/lib/kubelet/pods/eee6148b-020b-46a1-9066-4cc2fb430d13/volumes"
Dec 09 10:05:52 crc kubenswrapper[5002]: I1209 10:05:52.269044 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Dec 09 10:05:52 crc kubenswrapper[5002]: I1209 10:05:52.330903 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Dec 09 10:05:52 crc kubenswrapper[5002]: I1209 10:05:52.354548 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Dec 09 10:05:52 crc kubenswrapper[5002]: I1209 10:05:52.372340 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Dec 09 10:05:52 crc kubenswrapper[5002]: I1209 10:05:52.785955 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ph5kb" event={"ID":"fe698bbf-4fa3-4d21-880a-110c1eedf006","Type":"ContainerStarted","Data":"a472f0702c57d5baee1780dfe3ae97de1da7820329b670cc6b19f84ac8bbba4a"}
Dec 09 10:05:52 crc kubenswrapper[5002]: I1209 10:05:52.786282 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ph5kb" event={"ID":"fe698bbf-4fa3-4d21-880a-110c1eedf006","Type":"ContainerStarted","Data":"c9b38134e0ed50589e55ca8b5c5ab55582aa1438306eb9e148e0bedadf835dfb"}
Dec 09 10:05:52 crc kubenswrapper[5002]: I1209 10:05:52.790056 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-ph5kb"
Dec 09 10:05:52 crc kubenswrapper[5002]: I1209 10:05:52.796224 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-ph5kb"
Dec 09 10:05:52 crc kubenswrapper[5002]: I1209 10:05:52.804858 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-ph5kb" podStartSLOduration=2.804761869 podStartE2EDuration="2.804761869s" podCreationTimestamp="2025-12-09 10:05:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:05:52.802460433 +0000 UTC m=+285.194511504" watchObservedRunningTime="2025-12-09 10:05:52.804761869 +0000 UTC m=+285.196812950"
Dec 09 10:05:53 crc kubenswrapper[5002]: I1209 10:05:53.018321 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Dec 09 10:05:53 crc kubenswrapper[5002]: I1209 10:05:53.213788 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Dec 09 10:05:53 crc kubenswrapper[5002]: I1209 10:05:53.340698 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Dec 09 10:05:54 crc kubenswrapper[5002]: I1209 10:05:54.477243 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 09 10:05:54 crc kubenswrapper[5002]: I1209 10:05:54.477332 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 10:05:54 crc kubenswrapper[5002]: I1209 10:05:54.551549 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 09 10:05:54 crc kubenswrapper[5002]: I1209 10:05:54.551612 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 09 10:05:54 crc kubenswrapper[5002]: I1209 10:05:54.551657 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 09 10:05:54 crc kubenswrapper[5002]: I1209 10:05:54.551650 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 10:05:54 crc kubenswrapper[5002]: I1209 10:05:54.551674 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 09 10:05:54 crc kubenswrapper[5002]: I1209 10:05:54.551714 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 10:05:54 crc kubenswrapper[5002]: I1209 10:05:54.551746 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 10:05:54 crc kubenswrapper[5002]: I1209 10:05:54.551823 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 09 10:05:54 crc kubenswrapper[5002]: I1209 10:05:54.551908 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 10:05:54 crc kubenswrapper[5002]: I1209 10:05:54.552134 5002 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 09 10:05:54 crc kubenswrapper[5002]: I1209 10:05:54.552150 5002 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 09 10:05:54 crc kubenswrapper[5002]: I1209 10:05:54.552158 5002 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 09 10:05:54 crc kubenswrapper[5002]: I1209 10:05:54.552166 5002 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 09 10:05:54 crc kubenswrapper[5002]: I1209 10:05:54.560014 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 10:05:54 crc kubenswrapper[5002]: I1209 10:05:54.653360 5002 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 09 10:05:54 crc kubenswrapper[5002]: I1209 10:05:54.809980 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 09 10:05:54 crc kubenswrapper[5002]: I1209 10:05:54.810319 5002 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="47f495cec7a4ee16110d5ccee09fa16753752241ab1d51f76112b659fb602e66" exitCode=137 Dec 09 10:05:54 crc kubenswrapper[5002]: I1209 10:05:54.810393 5002 scope.go:117] "RemoveContainer" containerID="47f495cec7a4ee16110d5ccee09fa16753752241ab1d51f76112b659fb602e66" Dec 09 10:05:54 crc kubenswrapper[5002]: I1209 10:05:54.810445 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 10:05:54 crc kubenswrapper[5002]: I1209 10:05:54.832685 5002 scope.go:117] "RemoveContainer" containerID="47f495cec7a4ee16110d5ccee09fa16753752241ab1d51f76112b659fb602e66" Dec 09 10:05:54 crc kubenswrapper[5002]: E1209 10:05:54.833303 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47f495cec7a4ee16110d5ccee09fa16753752241ab1d51f76112b659fb602e66\": container with ID starting with 47f495cec7a4ee16110d5ccee09fa16753752241ab1d51f76112b659fb602e66 not found: ID does not exist" containerID="47f495cec7a4ee16110d5ccee09fa16753752241ab1d51f76112b659fb602e66" Dec 09 10:05:54 crc kubenswrapper[5002]: I1209 10:05:54.833370 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47f495cec7a4ee16110d5ccee09fa16753752241ab1d51f76112b659fb602e66"} err="failed to get container status \"47f495cec7a4ee16110d5ccee09fa16753752241ab1d51f76112b659fb602e66\": rpc error: code = NotFound desc = could not find container \"47f495cec7a4ee16110d5ccee09fa16753752241ab1d51f76112b659fb602e66\": container with ID starting with 47f495cec7a4ee16110d5ccee09fa16753752241ab1d51f76112b659fb602e66 not found: ID does not exist" Dec 09 10:05:56 crc kubenswrapper[5002]: I1209 10:05:56.066244 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 09 10:05:56 crc kubenswrapper[5002]: I1209 10:05:56.066637 5002 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Dec 09 10:05:56 crc kubenswrapper[5002]: I1209 10:05:56.076055 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 09 10:05:56 crc kubenswrapper[5002]: I1209 10:05:56.076091 5002 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="c8d1730c-d62e-46da-8614-ba6d257325d8" Dec 09 10:05:56 crc kubenswrapper[5002]: I1209 10:05:56.080188 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 09 10:05:56 crc kubenswrapper[5002]: I1209 10:05:56.080230 5002 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="c8d1730c-d62e-46da-8614-ba6d257325d8" Dec 09 10:06:08 crc kubenswrapper[5002]: I1209 10:06:08.962643 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 09 10:06:12 crc kubenswrapper[5002]: I1209 10:06:12.597043 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-k6s7s"] Dec 09 10:06:12 crc kubenswrapper[5002]: I1209 10:06:12.597780 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-k6s7s" podUID="ae53c0af-26d8-4db2-8885-86b643bf2ee1" containerName="controller-manager" containerID="cri-o://bad62c812098db740748f873f1eff54a9fe95c067b9d1183cfed78980ef8835b" gracePeriod=30 Dec 09 10:06:12 crc kubenswrapper[5002]: I1209 10:06:12.705733 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-c9pnt"] Dec 09 10:06:12 crc kubenswrapper[5002]: I1209 10:06:12.705991 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c9pnt" podUID="fef799ad-c87c-4f60-9b8e-afb51958013d" containerName="route-controller-manager" containerID="cri-o://1953c98e8c09876af85ad4075d814391ca089b221cd8d169608db865976b9f86" gracePeriod=30 Dec 09 10:06:12 crc kubenswrapper[5002]: I1209 10:06:12.831188 5002 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-k6s7s container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/healthz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Dec 09 10:06:12 crc kubenswrapper[5002]: I1209 10:06:12.831245 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-k6s7s" podUID="ae53c0af-26d8-4db2-8885-86b643bf2ee1" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.14:8443/healthz\": dial tcp 10.217.0.14:8443: connect: connection refused" Dec 09 10:06:12 crc kubenswrapper[5002]: I1209 10:06:12.908286 5002 generic.go:334] "Generic (PLEG): container finished" podID="fef799ad-c87c-4f60-9b8e-afb51958013d" containerID="1953c98e8c09876af85ad4075d814391ca089b221cd8d169608db865976b9f86" exitCode=0 Dec 09 10:06:12 crc kubenswrapper[5002]: I1209 10:06:12.908366 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c9pnt" event={"ID":"fef799ad-c87c-4f60-9b8e-afb51958013d","Type":"ContainerDied","Data":"1953c98e8c09876af85ad4075d814391ca089b221cd8d169608db865976b9f86"} Dec 09 10:06:12 crc kubenswrapper[5002]: I1209 10:06:12.909487 5002 generic.go:334] "Generic (PLEG): container finished" podID="ae53c0af-26d8-4db2-8885-86b643bf2ee1" containerID="bad62c812098db740748f873f1eff54a9fe95c067b9d1183cfed78980ef8835b" exitCode=0 Dec 09 10:06:12 crc kubenswrapper[5002]: I1209 10:06:12.909509 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-k6s7s" event={"ID":"ae53c0af-26d8-4db2-8885-86b643bf2ee1","Type":"ContainerDied","Data":"bad62c812098db740748f873f1eff54a9fe95c067b9d1183cfed78980ef8835b"} Dec 09 10:06:12 crc kubenswrapper[5002]: I1209 10:06:12.994973 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-k6s7s" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.056000 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c9pnt" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.141869 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae53c0af-26d8-4db2-8885-86b643bf2ee1-config\") pod \"ae53c0af-26d8-4db2-8885-86b643bf2ee1\" (UID: \"ae53c0af-26d8-4db2-8885-86b643bf2ee1\") " Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.141917 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pfj8\" (UniqueName: \"kubernetes.io/projected/ae53c0af-26d8-4db2-8885-86b643bf2ee1-kube-api-access-4pfj8\") pod \"ae53c0af-26d8-4db2-8885-86b643bf2ee1\" (UID: \"ae53c0af-26d8-4db2-8885-86b643bf2ee1\") " Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.141999 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae53c0af-26d8-4db2-8885-86b643bf2ee1-serving-cert\") pod \"ae53c0af-26d8-4db2-8885-86b643bf2ee1\" (UID: \"ae53c0af-26d8-4db2-8885-86b643bf2ee1\") " Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.142056 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ae53c0af-26d8-4db2-8885-86b643bf2ee1-proxy-ca-bundles\") pod \"ae53c0af-26d8-4db2-8885-86b643bf2ee1\" (UID: \"ae53c0af-26d8-4db2-8885-86b643bf2ee1\") " Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.142105 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae53c0af-26d8-4db2-8885-86b643bf2ee1-client-ca\") pod \"ae53c0af-26d8-4db2-8885-86b643bf2ee1\" (UID: \"ae53c0af-26d8-4db2-8885-86b643bf2ee1\") " Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.142601 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae53c0af-26d8-4db2-8885-86b643bf2ee1-client-ca" (OuterVolumeSpecName: "client-ca") pod "ae53c0af-26d8-4db2-8885-86b643bf2ee1" (UID: "ae53c0af-26d8-4db2-8885-86b643bf2ee1"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.142658 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae53c0af-26d8-4db2-8885-86b643bf2ee1-config" (OuterVolumeSpecName: "config") pod "ae53c0af-26d8-4db2-8885-86b643bf2ee1" (UID: "ae53c0af-26d8-4db2-8885-86b643bf2ee1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.142959 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae53c0af-26d8-4db2-8885-86b643bf2ee1-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ae53c0af-26d8-4db2-8885-86b643bf2ee1" (UID: "ae53c0af-26d8-4db2-8885-86b643bf2ee1"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.147569 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae53c0af-26d8-4db2-8885-86b643bf2ee1-kube-api-access-4pfj8" (OuterVolumeSpecName: "kube-api-access-4pfj8") pod "ae53c0af-26d8-4db2-8885-86b643bf2ee1" (UID: "ae53c0af-26d8-4db2-8885-86b643bf2ee1"). 
InnerVolumeSpecName "kube-api-access-4pfj8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.147592 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae53c0af-26d8-4db2-8885-86b643bf2ee1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ae53c0af-26d8-4db2-8885-86b643bf2ee1" (UID: "ae53c0af-26d8-4db2-8885-86b643bf2ee1"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.243101 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fef799ad-c87c-4f60-9b8e-afb51958013d-client-ca\") pod \"fef799ad-c87c-4f60-9b8e-afb51958013d\" (UID: \"fef799ad-c87c-4f60-9b8e-afb51958013d\") " Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.243177 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fef799ad-c87c-4f60-9b8e-afb51958013d-serving-cert\") pod \"fef799ad-c87c-4f60-9b8e-afb51958013d\" (UID: \"fef799ad-c87c-4f60-9b8e-afb51958013d\") " Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.243212 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmb4z\" (UniqueName: \"kubernetes.io/projected/fef799ad-c87c-4f60-9b8e-afb51958013d-kube-api-access-kmb4z\") pod \"fef799ad-c87c-4f60-9b8e-afb51958013d\" (UID: \"fef799ad-c87c-4f60-9b8e-afb51958013d\") " Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.243254 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fef799ad-c87c-4f60-9b8e-afb51958013d-config\") pod \"fef799ad-c87c-4f60-9b8e-afb51958013d\" (UID: \"fef799ad-c87c-4f60-9b8e-afb51958013d\") " Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.243439 5002 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae53c0af-26d8-4db2-8885-86b643bf2ee1-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.243450 5002 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ae53c0af-26d8-4db2-8885-86b643bf2ee1-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.243459 5002 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae53c0af-26d8-4db2-8885-86b643bf2ee1-client-ca\") on node \"crc\" DevicePath \"\"" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.243468 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae53c0af-26d8-4db2-8885-86b643bf2ee1-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.243476 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pfj8\" (UniqueName: \"kubernetes.io/projected/ae53c0af-26d8-4db2-8885-86b643bf2ee1-kube-api-access-4pfj8\") on node \"crc\" DevicePath \"\"" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.243966 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fef799ad-c87c-4f60-9b8e-afb51958013d-client-ca" (OuterVolumeSpecName: "client-ca") pod "fef799ad-c87c-4f60-9b8e-afb51958013d" (UID: 
"fef799ad-c87c-4f60-9b8e-afb51958013d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.244016 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fef799ad-c87c-4f60-9b8e-afb51958013d-config" (OuterVolumeSpecName: "config") pod "fef799ad-c87c-4f60-9b8e-afb51958013d" (UID: "fef799ad-c87c-4f60-9b8e-afb51958013d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.246464 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fef799ad-c87c-4f60-9b8e-afb51958013d-kube-api-access-kmb4z" (OuterVolumeSpecName: "kube-api-access-kmb4z") pod "fef799ad-c87c-4f60-9b8e-afb51958013d" (UID: "fef799ad-c87c-4f60-9b8e-afb51958013d"). InnerVolumeSpecName "kube-api-access-kmb4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.247173 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fef799ad-c87c-4f60-9b8e-afb51958013d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "fef799ad-c87c-4f60-9b8e-afb51958013d" (UID: "fef799ad-c87c-4f60-9b8e-afb51958013d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.344381 5002 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fef799ad-c87c-4f60-9b8e-afb51958013d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.344419 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmb4z\" (UniqueName: \"kubernetes.io/projected/fef799ad-c87c-4f60-9b8e-afb51958013d-kube-api-access-kmb4z\") on node \"crc\" DevicePath \"\"" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.344429 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fef799ad-c87c-4f60-9b8e-afb51958013d-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.344437 5002 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fef799ad-c87c-4f60-9b8e-afb51958013d-client-ca\") on node \"crc\" DevicePath \"\"" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.576849 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-786fd98fb4-pwgfk"] Dec 09 10:06:13 crc kubenswrapper[5002]: E1209 10:06:13.577044 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fef799ad-c87c-4f60-9b8e-afb51958013d" containerName="route-controller-manager" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.577057 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="fef799ad-c87c-4f60-9b8e-afb51958013d" containerName="route-controller-manager" Dec 09 10:06:13 crc kubenswrapper[5002]: E1209 10:06:13.577067 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c7271b9-c21f-44a0-ad27-f4158f36f317" containerName="extract-utilities" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.577073 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c7271b9-c21f-44a0-ad27-f4158f36f317" containerName="extract-utilities" Dec 09 10:06:13 crc kubenswrapper[5002]: E1209 
10:06:13.577083 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="038508b9-13ca-43a3-8414-09097229bc83" containerName="registry-server" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.577090 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="038508b9-13ca-43a3-8414-09097229bc83" containerName="registry-server" Dec 09 10:06:13 crc kubenswrapper[5002]: E1209 10:06:13.577097 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c7271b9-c21f-44a0-ad27-f4158f36f317" containerName="registry-server" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.577104 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c7271b9-c21f-44a0-ad27-f4158f36f317" containerName="registry-server" Dec 09 10:06:13 crc kubenswrapper[5002]: E1209 10:06:13.577114 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eee6148b-020b-46a1-9066-4cc2fb430d13" containerName="extract-utilities" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.577120 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="eee6148b-020b-46a1-9066-4cc2fb430d13" containerName="extract-utilities" Dec 09 10:06:13 crc kubenswrapper[5002]: E1209 10:06:13.577127 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39e38e20-dd9f-43b8-848f-e5618d52f6af" containerName="registry-server" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.577132 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="39e38e20-dd9f-43b8-848f-e5618d52f6af" containerName="registry-server" Dec 09 10:06:13 crc kubenswrapper[5002]: E1209 10:06:13.577140 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39e38e20-dd9f-43b8-848f-e5618d52f6af" containerName="extract-utilities" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.577145 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="39e38e20-dd9f-43b8-848f-e5618d52f6af" containerName="extract-utilities" Dec 09 10:06:13 crc kubenswrapper[5002]: E1209 10:06:13.577153 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eee6148b-020b-46a1-9066-4cc2fb430d13" containerName="extract-content" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.577159 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="eee6148b-020b-46a1-9066-4cc2fb430d13" containerName="extract-content" Dec 09 10:06:13 crc kubenswrapper[5002]: E1209 10:06:13.577166 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58eda1c5-bb10-4843-9ed4-64ee38d040ee" containerName="marketplace-operator" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.577173 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="58eda1c5-bb10-4843-9ed4-64ee38d040ee" containerName="marketplace-operator" Dec 09 10:06:13 crc kubenswrapper[5002]: E1209 10:06:13.577185 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="038508b9-13ca-43a3-8414-09097229bc83" containerName="extract-utilities" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.577191 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="038508b9-13ca-43a3-8414-09097229bc83" containerName="extract-utilities" Dec 09 10:06:13 crc kubenswrapper[5002]: E1209 10:06:13.577198 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eee6148b-020b-46a1-9066-4cc2fb430d13" containerName="registry-server" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.577203 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="eee6148b-020b-46a1-9066-4cc2fb430d13" containerName="registry-server" Dec 09 
10:06:13 crc kubenswrapper[5002]: E1209 10:06:13.577212 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39e38e20-dd9f-43b8-848f-e5618d52f6af" containerName="extract-content" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.577217 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="39e38e20-dd9f-43b8-848f-e5618d52f6af" containerName="extract-content" Dec 09 10:06:13 crc kubenswrapper[5002]: E1209 10:06:13.577226 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c7271b9-c21f-44a0-ad27-f4158f36f317" containerName="extract-content" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.577232 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c7271b9-c21f-44a0-ad27-f4158f36f317" containerName="extract-content" Dec 09 10:06:13 crc kubenswrapper[5002]: E1209 10:06:13.577240 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae53c0af-26d8-4db2-8885-86b643bf2ee1" containerName="controller-manager" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.577245 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae53c0af-26d8-4db2-8885-86b643bf2ee1" containerName="controller-manager" Dec 09 10:06:13 crc kubenswrapper[5002]: E1209 10:06:13.577253 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="038508b9-13ca-43a3-8414-09097229bc83" containerName="extract-content" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.577258 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="038508b9-13ca-43a3-8414-09097229bc83" containerName="extract-content" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.577345 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c7271b9-c21f-44a0-ad27-f4158f36f317" containerName="registry-server" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.577360 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="038508b9-13ca-43a3-8414-09097229bc83" containerName="registry-server" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.577368 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="fef799ad-c87c-4f60-9b8e-afb51958013d" containerName="route-controller-manager" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.577377 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="eee6148b-020b-46a1-9066-4cc2fb430d13" containerName="registry-server" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.577384 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae53c0af-26d8-4db2-8885-86b643bf2ee1" containerName="controller-manager" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.577393 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="58eda1c5-bb10-4843-9ed4-64ee38d040ee" containerName="marketplace-operator" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.577400 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="39e38e20-dd9f-43b8-848f-e5618d52f6af" containerName="registry-server" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.577749 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-786fd98fb4-pwgfk" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.588220 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-786fd98fb4-pwgfk"] Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.660386 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-mcsj7"] Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.661097 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-mcsj7" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.669485 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-mcsj7"] Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.748131 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cfcb9729-596d-4afa-a55d-66220c7d04e9-client-ca\") pod \"controller-manager-786fd98fb4-pwgfk\" (UID: \"cfcb9729-596d-4afa-a55d-66220c7d04e9\") " pod="openshift-controller-manager/controller-manager-786fd98fb4-pwgfk" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.748175 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cfcb9729-596d-4afa-a55d-66220c7d04e9-serving-cert\") pod \"controller-manager-786fd98fb4-pwgfk\" (UID: \"cfcb9729-596d-4afa-a55d-66220c7d04e9\") " pod="openshift-controller-manager/controller-manager-786fd98fb4-pwgfk" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.748391 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cfcb9729-596d-4afa-a55d-66220c7d04e9-proxy-ca-bundles\") pod \"controller-manager-786fd98fb4-pwgfk\" (UID: \"cfcb9729-596d-4afa-a55d-66220c7d04e9\") " pod="openshift-controller-manager/controller-manager-786fd98fb4-pwgfk" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.748511 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfcb9729-596d-4afa-a55d-66220c7d04e9-config\") pod \"controller-manager-786fd98fb4-pwgfk\" (UID: \"cfcb9729-596d-4afa-a55d-66220c7d04e9\") " pod="openshift-controller-manager/controller-manager-786fd98fb4-pwgfk" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.748559 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mts95\" (UniqueName: \"kubernetes.io/projected/cfcb9729-596d-4afa-a55d-66220c7d04e9-kube-api-access-mts95\") pod \"controller-manager-786fd98fb4-pwgfk\" (UID: \"cfcb9729-596d-4afa-a55d-66220c7d04e9\") " pod="openshift-controller-manager/controller-manager-786fd98fb4-pwgfk" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.849209 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cfcb9729-596d-4afa-a55d-66220c7d04e9-client-ca\") pod \"controller-manager-786fd98fb4-pwgfk\" (UID: \"cfcb9729-596d-4afa-a55d-66220c7d04e9\") " pod="openshift-controller-manager/controller-manager-786fd98fb4-pwgfk" Dec 09 10:06:13 crc kubenswrapper[5002]: 
I1209 10:06:13.849274 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cfcb9729-596d-4afa-a55d-66220c7d04e9-serving-cert\") pod \"controller-manager-786fd98fb4-pwgfk\" (UID: \"cfcb9729-596d-4afa-a55d-66220c7d04e9\") " pod="openshift-controller-manager/controller-manager-786fd98fb4-pwgfk" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.849310 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9qw5\" (UniqueName: \"kubernetes.io/projected/fa217b39-c5f7-47d5-8abe-30d2e2912ab0-kube-api-access-b9qw5\") pod \"route-controller-manager-6d88dcd4c6-mcsj7\" (UID: \"fa217b39-c5f7-47d5-8abe-30d2e2912ab0\") " pod="openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-mcsj7" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.849340 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cfcb9729-596d-4afa-a55d-66220c7d04e9-proxy-ca-bundles\") pod \"controller-manager-786fd98fb4-pwgfk\" (UID: \"cfcb9729-596d-4afa-a55d-66220c7d04e9\") " pod="openshift-controller-manager/controller-manager-786fd98fb4-pwgfk" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.849365 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa217b39-c5f7-47d5-8abe-30d2e2912ab0-config\") pod \"route-controller-manager-6d88dcd4c6-mcsj7\" (UID: \"fa217b39-c5f7-47d5-8abe-30d2e2912ab0\") " pod="openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-mcsj7" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.849389 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfcb9729-596d-4afa-a55d-66220c7d04e9-config\") pod \"controller-manager-786fd98fb4-pwgfk\" (UID: \"cfcb9729-596d-4afa-a55d-66220c7d04e9\") " pod="openshift-controller-manager/controller-manager-786fd98fb4-pwgfk" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.849503 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mts95\" (UniqueName: \"kubernetes.io/projected/cfcb9729-596d-4afa-a55d-66220c7d04e9-kube-api-access-mts95\") pod \"controller-manager-786fd98fb4-pwgfk\" (UID: \"cfcb9729-596d-4afa-a55d-66220c7d04e9\") " pod="openshift-controller-manager/controller-manager-786fd98fb4-pwgfk" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.849583 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fa217b39-c5f7-47d5-8abe-30d2e2912ab0-client-ca\") pod \"route-controller-manager-6d88dcd4c6-mcsj7\" (UID: \"fa217b39-c5f7-47d5-8abe-30d2e2912ab0\") " pod="openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-mcsj7" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.849650 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa217b39-c5f7-47d5-8abe-30d2e2912ab0-serving-cert\") pod \"route-controller-manager-6d88dcd4c6-mcsj7\" (UID: \"fa217b39-c5f7-47d5-8abe-30d2e2912ab0\") " pod="openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-mcsj7" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.850176 5002 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cfcb9729-596d-4afa-a55d-66220c7d04e9-client-ca\") pod \"controller-manager-786fd98fb4-pwgfk\" (UID: \"cfcb9729-596d-4afa-a55d-66220c7d04e9\") " pod="openshift-controller-manager/controller-manager-786fd98fb4-pwgfk" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.850529 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfcb9729-596d-4afa-a55d-66220c7d04e9-config\") pod \"controller-manager-786fd98fb4-pwgfk\" (UID: \"cfcb9729-596d-4afa-a55d-66220c7d04e9\") " pod="openshift-controller-manager/controller-manager-786fd98fb4-pwgfk" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.850682 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cfcb9729-596d-4afa-a55d-66220c7d04e9-proxy-ca-bundles\") pod \"controller-manager-786fd98fb4-pwgfk\" (UID: \"cfcb9729-596d-4afa-a55d-66220c7d04e9\") " pod="openshift-controller-manager/controller-manager-786fd98fb4-pwgfk" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.852940 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cfcb9729-596d-4afa-a55d-66220c7d04e9-serving-cert\") pod \"controller-manager-786fd98fb4-pwgfk\" (UID: \"cfcb9729-596d-4afa-a55d-66220c7d04e9\") " pod="openshift-controller-manager/controller-manager-786fd98fb4-pwgfk" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.866678 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mts95\" (UniqueName: \"kubernetes.io/projected/cfcb9729-596d-4afa-a55d-66220c7d04e9-kube-api-access-mts95\") pod \"controller-manager-786fd98fb4-pwgfk\" (UID: \"cfcb9729-596d-4afa-a55d-66220c7d04e9\") " pod="openshift-controller-manager/controller-manager-786fd98fb4-pwgfk" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.892833 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-786fd98fb4-pwgfk" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.915198 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-k6s7s" event={"ID":"ae53c0af-26d8-4db2-8885-86b643bf2ee1","Type":"ContainerDied","Data":"917b424463977c2974818a32b9e61d3ead53c2d7a90b84b270f96c2d44c65ee8"} Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.915275 5002 scope.go:117] "RemoveContainer" containerID="bad62c812098db740748f873f1eff54a9fe95c067b9d1183cfed78980ef8835b" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.915281 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-k6s7s" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.920060 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c9pnt" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.919999 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c9pnt" event={"ID":"fef799ad-c87c-4f60-9b8e-afb51958013d","Type":"ContainerDied","Data":"ae94c6aa98f334d9b2a408983bfebf6e7e28471071dd496597775f7c3d410e8e"} Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.943768 5002 scope.go:117] "RemoveContainer" containerID="1953c98e8c09876af85ad4075d814391ca089b221cd8d169608db865976b9f86" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.950502 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa217b39-c5f7-47d5-8abe-30d2e2912ab0-serving-cert\") pod \"route-controller-manager-6d88dcd4c6-mcsj7\" (UID: \"fa217b39-c5f7-47d5-8abe-30d2e2912ab0\") " pod="openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-mcsj7" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.950574 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9qw5\" (UniqueName: \"kubernetes.io/projected/fa217b39-c5f7-47d5-8abe-30d2e2912ab0-kube-api-access-b9qw5\") pod \"route-controller-manager-6d88dcd4c6-mcsj7\" (UID: \"fa217b39-c5f7-47d5-8abe-30d2e2912ab0\") " pod="openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-mcsj7" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.950605 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa217b39-c5f7-47d5-8abe-30d2e2912ab0-config\") pod \"route-controller-manager-6d88dcd4c6-mcsj7\" (UID: \"fa217b39-c5f7-47d5-8abe-30d2e2912ab0\") " pod="openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-mcsj7" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.950636 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fa217b39-c5f7-47d5-8abe-30d2e2912ab0-client-ca\") pod \"route-controller-manager-6d88dcd4c6-mcsj7\" (UID: \"fa217b39-c5f7-47d5-8abe-30d2e2912ab0\") " pod="openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-mcsj7" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.952077 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fa217b39-c5f7-47d5-8abe-30d2e2912ab0-client-ca\") pod \"route-controller-manager-6d88dcd4c6-mcsj7\" (UID: \"fa217b39-c5f7-47d5-8abe-30d2e2912ab0\") " pod="openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-mcsj7" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.954935 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-k6s7s"] Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.955428 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa217b39-c5f7-47d5-8abe-30d2e2912ab0-config\") pod \"route-controller-manager-6d88dcd4c6-mcsj7\" (UID: \"fa217b39-c5f7-47d5-8abe-30d2e2912ab0\") " pod="openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-mcsj7" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.957338 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa217b39-c5f7-47d5-8abe-30d2e2912ab0-serving-cert\") pod \"route-controller-manager-6d88dcd4c6-mcsj7\" (UID: \"fa217b39-c5f7-47d5-8abe-30d2e2912ab0\") " pod="openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-mcsj7" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.959541 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-k6s7s"] Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.965434 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-c9pnt"] Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.969707 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-c9pnt"] Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.980080 5002 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-c9pnt container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: i/o timeout" start-of-body= Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.980148 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c9pnt" podUID="fef799ad-c87c-4f60-9b8e-afb51958013d" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: i/o timeout" Dec 09 10:06:13 crc kubenswrapper[5002]: I1209 10:06:13.981367 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9qw5\" (UniqueName: \"kubernetes.io/projected/fa217b39-c5f7-47d5-8abe-30d2e2912ab0-kube-api-access-b9qw5\") pod \"route-controller-manager-6d88dcd4c6-mcsj7\" (UID: \"fa217b39-c5f7-47d5-8abe-30d2e2912ab0\") " pod="openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-mcsj7" Dec 09 10:06:14 crc kubenswrapper[5002]: I1209 10:06:14.069228 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae53c0af-26d8-4db2-8885-86b643bf2ee1" path="/var/lib/kubelet/pods/ae53c0af-26d8-4db2-8885-86b643bf2ee1/volumes" Dec 09 10:06:14 crc kubenswrapper[5002]: I1209 10:06:14.069888 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fef799ad-c87c-4f60-9b8e-afb51958013d" path="/var/lib/kubelet/pods/fef799ad-c87c-4f60-9b8e-afb51958013d/volumes" Dec 09 10:06:14 crc kubenswrapper[5002]: I1209 10:06:14.077306 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-786fd98fb4-pwgfk"] Dec 09 10:06:14 crc kubenswrapper[5002]: W1209 10:06:14.082186 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfcb9729_596d_4afa_a55d_66220c7d04e9.slice/crio-2b47974ce58aeed1abeb6adc5ba8c759c25fff53515a83774f29a98f931e1714 WatchSource:0}: Error finding container 2b47974ce58aeed1abeb6adc5ba8c759c25fff53515a83774f29a98f931e1714: Status 404 returned error can't find the container with id 2b47974ce58aeed1abeb6adc5ba8c759c25fff53515a83774f29a98f931e1714 Dec 09 10:06:14 crc kubenswrapper[5002]: I1209 10:06:14.184542 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 09 10:06:14 crc kubenswrapper[5002]: 
I1209 10:06:14.203331 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-786fd98fb4-pwgfk"] Dec 09 10:06:14 crc kubenswrapper[5002]: I1209 10:06:14.277896 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-mcsj7"] Dec 09 10:06:14 crc kubenswrapper[5002]: I1209 10:06:14.278240 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-mcsj7" Dec 09 10:06:14 crc kubenswrapper[5002]: I1209 10:06:14.449750 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-mcsj7"] Dec 09 10:06:14 crc kubenswrapper[5002]: W1209 10:06:14.457868 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa217b39_c5f7_47d5_8abe_30d2e2912ab0.slice/crio-5b56a4e5f196b1fae0cd400254b4bc9da0b674627309e6665a06b0351632298c WatchSource:0}: Error finding container 5b56a4e5f196b1fae0cd400254b4bc9da0b674627309e6665a06b0351632298c: Status 404 returned error can't find the container with id 5b56a4e5f196b1fae0cd400254b4bc9da0b674627309e6665a06b0351632298c Dec 09 10:06:14 crc kubenswrapper[5002]: I1209 10:06:14.926717 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-786fd98fb4-pwgfk" event={"ID":"cfcb9729-596d-4afa-a55d-66220c7d04e9","Type":"ContainerStarted","Data":"514f47359340d5e7032e22c529e7658b20863607088de6cc9319eb1d274ab2a0"} Dec 09 10:06:14 crc kubenswrapper[5002]: I1209 10:06:14.926772 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-786fd98fb4-pwgfk" event={"ID":"cfcb9729-596d-4afa-a55d-66220c7d04e9","Type":"ContainerStarted","Data":"2b47974ce58aeed1abeb6adc5ba8c759c25fff53515a83774f29a98f931e1714"} Dec 09 10:06:14 crc kubenswrapper[5002]: I1209 10:06:14.927585 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-mcsj7" event={"ID":"fa217b39-c5f7-47d5-8abe-30d2e2912ab0","Type":"ContainerStarted","Data":"5b56a4e5f196b1fae0cd400254b4bc9da0b674627309e6665a06b0351632298c"} Dec 09 10:06:15 crc kubenswrapper[5002]: I1209 10:06:15.935900 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-mcsj7" event={"ID":"fa217b39-c5f7-47d5-8abe-30d2e2912ab0","Type":"ContainerStarted","Data":"64c48f4389fcedd72f729b595799e025ede65bd80ed33666185d3dc94126de10"} Dec 09 10:06:15 crc kubenswrapper[5002]: I1209 10:06:15.935931 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-mcsj7" podUID="fa217b39-c5f7-47d5-8abe-30d2e2912ab0" containerName="route-controller-manager" containerID="cri-o://64c48f4389fcedd72f729b595799e025ede65bd80ed33666185d3dc94126de10" gracePeriod=30 Dec 09 10:06:15 crc kubenswrapper[5002]: I1209 10:06:15.936031 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-786fd98fb4-pwgfk" podUID="cfcb9729-596d-4afa-a55d-66220c7d04e9" containerName="controller-manager" containerID="cri-o://514f47359340d5e7032e22c529e7658b20863607088de6cc9319eb1d274ab2a0" gracePeriod=30 Dec 09 10:06:15 crc kubenswrapper[5002]: I1209 
10:06:15.936289 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-mcsj7" Dec 09 10:06:15 crc kubenswrapper[5002]: I1209 10:06:15.941080 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-mcsj7" Dec 09 10:06:15 crc kubenswrapper[5002]: I1209 10:06:15.958726 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-mcsj7" podStartSLOduration=2.95870347 podStartE2EDuration="2.95870347s" podCreationTimestamp="2025-12-09 10:06:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:06:15.956462655 +0000 UTC m=+308.348513736" watchObservedRunningTime="2025-12-09 10:06:15.95870347 +0000 UTC m=+308.350754551" Dec 09 10:06:15 crc kubenswrapper[5002]: I1209 10:06:15.986015 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-786fd98fb4-pwgfk" podStartSLOduration=2.98599769 podStartE2EDuration="2.98599769s" podCreationTimestamp="2025-12-09 10:06:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:06:15.983199419 +0000 UTC m=+308.375250500" watchObservedRunningTime="2025-12-09 10:06:15.98599769 +0000 UTC m=+308.378048771" Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.318173 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-mcsj7" Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.322753 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-786fd98fb4-pwgfk" Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.346481 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d49bbb4c4-zsfzt"] Dec 09 10:06:16 crc kubenswrapper[5002]: E1209 10:06:16.346716 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa217b39-c5f7-47d5-8abe-30d2e2912ab0" containerName="route-controller-manager" Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.346732 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa217b39-c5f7-47d5-8abe-30d2e2912ab0" containerName="route-controller-manager" Dec 09 10:06:16 crc kubenswrapper[5002]: E1209 10:06:16.346791 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfcb9729-596d-4afa-a55d-66220c7d04e9" containerName="controller-manager" Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.346801 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfcb9729-596d-4afa-a55d-66220c7d04e9" containerName="controller-manager" Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.346925 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfcb9729-596d-4afa-a55d-66220c7d04e9" containerName="controller-manager" Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.346940 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa217b39-c5f7-47d5-8abe-30d2e2912ab0" containerName="route-controller-manager" Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.347337 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-d49bbb4c4-zsfzt" Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.353078 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d49bbb4c4-zsfzt"] Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.481398 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cfcb9729-596d-4afa-a55d-66220c7d04e9-proxy-ca-bundles\") pod \"cfcb9729-596d-4afa-a55d-66220c7d04e9\" (UID: \"cfcb9729-596d-4afa-a55d-66220c7d04e9\") " Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.481656 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa217b39-c5f7-47d5-8abe-30d2e2912ab0-config\") pod \"fa217b39-c5f7-47d5-8abe-30d2e2912ab0\" (UID: \"fa217b39-c5f7-47d5-8abe-30d2e2912ab0\") " Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.481740 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fa217b39-c5f7-47d5-8abe-30d2e2912ab0-client-ca\") pod \"fa217b39-c5f7-47d5-8abe-30d2e2912ab0\" (UID: \"fa217b39-c5f7-47d5-8abe-30d2e2912ab0\") " Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.481923 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfcb9729-596d-4afa-a55d-66220c7d04e9-config\") pod \"cfcb9729-596d-4afa-a55d-66220c7d04e9\" (UID: \"cfcb9729-596d-4afa-a55d-66220c7d04e9\") " Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.482046 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9qw5\" (UniqueName: 
\"kubernetes.io/projected/fa217b39-c5f7-47d5-8abe-30d2e2912ab0-kube-api-access-b9qw5\") pod \"fa217b39-c5f7-47d5-8abe-30d2e2912ab0\" (UID: \"fa217b39-c5f7-47d5-8abe-30d2e2912ab0\") " Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.482124 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mts95\" (UniqueName: \"kubernetes.io/projected/cfcb9729-596d-4afa-a55d-66220c7d04e9-kube-api-access-mts95\") pod \"cfcb9729-596d-4afa-a55d-66220c7d04e9\" (UID: \"cfcb9729-596d-4afa-a55d-66220c7d04e9\") " Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.482220 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cfcb9729-596d-4afa-a55d-66220c7d04e9-serving-cert\") pod \"cfcb9729-596d-4afa-a55d-66220c7d04e9\" (UID: \"cfcb9729-596d-4afa-a55d-66220c7d04e9\") " Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.482294 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cfcb9729-596d-4afa-a55d-66220c7d04e9-client-ca\") pod \"cfcb9729-596d-4afa-a55d-66220c7d04e9\" (UID: \"cfcb9729-596d-4afa-a55d-66220c7d04e9\") " Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.482383 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa217b39-c5f7-47d5-8abe-30d2e2912ab0-serving-cert\") pod \"fa217b39-c5f7-47d5-8abe-30d2e2912ab0\" (UID: \"fa217b39-c5f7-47d5-8abe-30d2e2912ab0\") " Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.482594 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab033b75-38a6-41b7-8fed-c6296aac6570-config\") pod \"route-controller-manager-d49bbb4c4-zsfzt\" (UID: \"ab033b75-38a6-41b7-8fed-c6296aac6570\") " pod="openshift-route-controller-manager/route-controller-manager-d49bbb4c4-zsfzt" Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.482681 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa217b39-c5f7-47d5-8abe-30d2e2912ab0-config" (OuterVolumeSpecName: "config") pod "fa217b39-c5f7-47d5-8abe-30d2e2912ab0" (UID: "fa217b39-c5f7-47d5-8abe-30d2e2912ab0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.482540 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfcb9729-596d-4afa-a55d-66220c7d04e9-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "cfcb9729-596d-4afa-a55d-66220c7d04e9" (UID: "cfcb9729-596d-4afa-a55d-66220c7d04e9"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.482914 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab033b75-38a6-41b7-8fed-c6296aac6570-serving-cert\") pod \"route-controller-manager-d49bbb4c4-zsfzt\" (UID: \"ab033b75-38a6-41b7-8fed-c6296aac6570\") " pod="openshift-route-controller-manager/route-controller-manager-d49bbb4c4-zsfzt" Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.483051 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfcb9729-596d-4afa-a55d-66220c7d04e9-client-ca" (OuterVolumeSpecName: "client-ca") pod "cfcb9729-596d-4afa-a55d-66220c7d04e9" (UID: "cfcb9729-596d-4afa-a55d-66220c7d04e9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.483248 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab033b75-38a6-41b7-8fed-c6296aac6570-client-ca\") pod \"route-controller-manager-d49bbb4c4-zsfzt\" (UID: \"ab033b75-38a6-41b7-8fed-c6296aac6570\") " pod="openshift-route-controller-manager/route-controller-manager-d49bbb4c4-zsfzt" Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.483345 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkjkb\" (UniqueName: \"kubernetes.io/projected/ab033b75-38a6-41b7-8fed-c6296aac6570-kube-api-access-zkjkb\") pod \"route-controller-manager-d49bbb4c4-zsfzt\" (UID: \"ab033b75-38a6-41b7-8fed-c6296aac6570\") " pod="openshift-route-controller-manager/route-controller-manager-d49bbb4c4-zsfzt" Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.483374 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfcb9729-596d-4afa-a55d-66220c7d04e9-config" (OuterVolumeSpecName: "config") pod "cfcb9729-596d-4afa-a55d-66220c7d04e9" (UID: "cfcb9729-596d-4afa-a55d-66220c7d04e9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.483449 5002 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cfcb9729-596d-4afa-a55d-66220c7d04e9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.483468 5002 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cfcb9729-596d-4afa-a55d-66220c7d04e9-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.483483 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa217b39-c5f7-47d5-8abe-30d2e2912ab0-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.483494 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfcb9729-596d-4afa-a55d-66220c7d04e9-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.483618 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa217b39-c5f7-47d5-8abe-30d2e2912ab0-client-ca" (OuterVolumeSpecName: "client-ca") pod "fa217b39-c5f7-47d5-8abe-30d2e2912ab0" (UID: "fa217b39-c5f7-47d5-8abe-30d2e2912ab0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.487190 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa217b39-c5f7-47d5-8abe-30d2e2912ab0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "fa217b39-c5f7-47d5-8abe-30d2e2912ab0" (UID: "fa217b39-c5f7-47d5-8abe-30d2e2912ab0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.487211 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa217b39-c5f7-47d5-8abe-30d2e2912ab0-kube-api-access-b9qw5" (OuterVolumeSpecName: "kube-api-access-b9qw5") pod "fa217b39-c5f7-47d5-8abe-30d2e2912ab0" (UID: "fa217b39-c5f7-47d5-8abe-30d2e2912ab0"). InnerVolumeSpecName "kube-api-access-b9qw5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.487191 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfcb9729-596d-4afa-a55d-66220c7d04e9-kube-api-access-mts95" (OuterVolumeSpecName: "kube-api-access-mts95") pod "cfcb9729-596d-4afa-a55d-66220c7d04e9" (UID: "cfcb9729-596d-4afa-a55d-66220c7d04e9"). InnerVolumeSpecName "kube-api-access-mts95". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.487303 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfcb9729-596d-4afa-a55d-66220c7d04e9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "cfcb9729-596d-4afa-a55d-66220c7d04e9" (UID: "cfcb9729-596d-4afa-a55d-66220c7d04e9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.584652 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab033b75-38a6-41b7-8fed-c6296aac6570-serving-cert\") pod \"route-controller-manager-d49bbb4c4-zsfzt\" (UID: \"ab033b75-38a6-41b7-8fed-c6296aac6570\") " pod="openshift-route-controller-manager/route-controller-manager-d49bbb4c4-zsfzt" Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.585205 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab033b75-38a6-41b7-8fed-c6296aac6570-client-ca\") pod \"route-controller-manager-d49bbb4c4-zsfzt\" (UID: \"ab033b75-38a6-41b7-8fed-c6296aac6570\") " pod="openshift-route-controller-manager/route-controller-manager-d49bbb4c4-zsfzt" Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.585270 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkjkb\" (UniqueName: \"kubernetes.io/projected/ab033b75-38a6-41b7-8fed-c6296aac6570-kube-api-access-zkjkb\") pod \"route-controller-manager-d49bbb4c4-zsfzt\" (UID: \"ab033b75-38a6-41b7-8fed-c6296aac6570\") " pod="openshift-route-controller-manager/route-controller-manager-d49bbb4c4-zsfzt" Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.585319 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab033b75-38a6-41b7-8fed-c6296aac6570-config\") pod \"route-controller-manager-d49bbb4c4-zsfzt\" (UID: \"ab033b75-38a6-41b7-8fed-c6296aac6570\") " pod="openshift-route-controller-manager/route-controller-manager-d49bbb4c4-zsfzt" Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.585776 5002 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa217b39-c5f7-47d5-8abe-30d2e2912ab0-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.585797 5002 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fa217b39-c5f7-47d5-8abe-30d2e2912ab0-client-ca\") on node \"crc\" DevicePath \"\"" Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.585809 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9qw5\" (UniqueName: \"kubernetes.io/projected/fa217b39-c5f7-47d5-8abe-30d2e2912ab0-kube-api-access-b9qw5\") on node \"crc\" DevicePath \"\"" Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.585841 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mts95\" (UniqueName: \"kubernetes.io/projected/cfcb9729-596d-4afa-a55d-66220c7d04e9-kube-api-access-mts95\") on node \"crc\" DevicePath \"\"" Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.585854 5002 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cfcb9729-596d-4afa-a55d-66220c7d04e9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.586216 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab033b75-38a6-41b7-8fed-c6296aac6570-client-ca\") pod \"route-controller-manager-d49bbb4c4-zsfzt\" (UID: \"ab033b75-38a6-41b7-8fed-c6296aac6570\") " pod="openshift-route-controller-manager/route-controller-manager-d49bbb4c4-zsfzt" Dec 
Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.586888 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab033b75-38a6-41b7-8fed-c6296aac6570-config\") pod \"route-controller-manager-d49bbb4c4-zsfzt\" (UID: \"ab033b75-38a6-41b7-8fed-c6296aac6570\") " pod="openshift-route-controller-manager/route-controller-manager-d49bbb4c4-zsfzt"
Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.587572 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab033b75-38a6-41b7-8fed-c6296aac6570-serving-cert\") pod \"route-controller-manager-d49bbb4c4-zsfzt\" (UID: \"ab033b75-38a6-41b7-8fed-c6296aac6570\") " pod="openshift-route-controller-manager/route-controller-manager-d49bbb4c4-zsfzt"
Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.599120 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkjkb\" (UniqueName: \"kubernetes.io/projected/ab033b75-38a6-41b7-8fed-c6296aac6570-kube-api-access-zkjkb\") pod \"route-controller-manager-d49bbb4c4-zsfzt\" (UID: \"ab033b75-38a6-41b7-8fed-c6296aac6570\") " pod="openshift-route-controller-manager/route-controller-manager-d49bbb4c4-zsfzt"
Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.666202 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-d49bbb4c4-zsfzt"
Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.838371 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d49bbb4c4-zsfzt"]
Dec 09 10:06:16 crc kubenswrapper[5002]: W1209 10:06:16.842977 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab033b75_38a6_41b7_8fed_c6296aac6570.slice/crio-5a5bb45f00632711922a2724836363ba7026f815b98eecaea71d0292e17334ac WatchSource:0}: Error finding container 5a5bb45f00632711922a2724836363ba7026f815b98eecaea71d0292e17334ac: Status 404 returned error can't find the container with id 5a5bb45f00632711922a2724836363ba7026f815b98eecaea71d0292e17334ac
Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.943600 5002 generic.go:334] "Generic (PLEG): container finished" podID="cfcb9729-596d-4afa-a55d-66220c7d04e9" containerID="514f47359340d5e7032e22c529e7658b20863607088de6cc9319eb1d274ab2a0" exitCode=0
Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.943634 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-786fd98fb4-pwgfk"
Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.943660 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-786fd98fb4-pwgfk" event={"ID":"cfcb9729-596d-4afa-a55d-66220c7d04e9","Type":"ContainerDied","Data":"514f47359340d5e7032e22c529e7658b20863607088de6cc9319eb1d274ab2a0"}
Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.944076 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-786fd98fb4-pwgfk" event={"ID":"cfcb9729-596d-4afa-a55d-66220c7d04e9","Type":"ContainerDied","Data":"2b47974ce58aeed1abeb6adc5ba8c759c25fff53515a83774f29a98f931e1714"}
Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.944094 5002 scope.go:117] "RemoveContainer" containerID="514f47359340d5e7032e22c529e7658b20863607088de6cc9319eb1d274ab2a0"
Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.945794 5002 generic.go:334] "Generic (PLEG): container finished" podID="fa217b39-c5f7-47d5-8abe-30d2e2912ab0" containerID="64c48f4389fcedd72f729b595799e025ede65bd80ed33666185d3dc94126de10" exitCode=0
Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.945865 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-mcsj7" event={"ID":"fa217b39-c5f7-47d5-8abe-30d2e2912ab0","Type":"ContainerDied","Data":"64c48f4389fcedd72f729b595799e025ede65bd80ed33666185d3dc94126de10"}
Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.945890 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-mcsj7" event={"ID":"fa217b39-c5f7-47d5-8abe-30d2e2912ab0","Type":"ContainerDied","Data":"5b56a4e5f196b1fae0cd400254b4bc9da0b674627309e6665a06b0351632298c"}
Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.945933 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-mcsj7"
Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.950511 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d49bbb4c4-zsfzt" event={"ID":"ab033b75-38a6-41b7-8fed-c6296aac6570","Type":"ContainerStarted","Data":"5a5bb45f00632711922a2724836363ba7026f815b98eecaea71d0292e17334ac"}
Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.965964 5002 scope.go:117] "RemoveContainer" containerID="514f47359340d5e7032e22c529e7658b20863607088de6cc9319eb1d274ab2a0"
Dec 09 10:06:16 crc kubenswrapper[5002]: E1209 10:06:16.966597 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"514f47359340d5e7032e22c529e7658b20863607088de6cc9319eb1d274ab2a0\": container with ID starting with 514f47359340d5e7032e22c529e7658b20863607088de6cc9319eb1d274ab2a0 not found: ID does not exist" containerID="514f47359340d5e7032e22c529e7658b20863607088de6cc9319eb1d274ab2a0"
Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.966631 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"514f47359340d5e7032e22c529e7658b20863607088de6cc9319eb1d274ab2a0"} err="failed to get container status \"514f47359340d5e7032e22c529e7658b20863607088de6cc9319eb1d274ab2a0\": rpc error: code = NotFound desc = could not find container \"514f47359340d5e7032e22c529e7658b20863607088de6cc9319eb1d274ab2a0\": container with ID starting with 514f47359340d5e7032e22c529e7658b20863607088de6cc9319eb1d274ab2a0 not found: ID does not exist"
Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.966677 5002 scope.go:117] "RemoveContainer" containerID="64c48f4389fcedd72f729b595799e025ede65bd80ed33666185d3dc94126de10"
Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.975303 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-mcsj7"]
Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.978588 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-mcsj7"]
Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.986703 5002 scope.go:117] "RemoveContainer" containerID="64c48f4389fcedd72f729b595799e025ede65bd80ed33666185d3dc94126de10"
Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.987213 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-786fd98fb4-pwgfk"]
Dec 09 10:06:16 crc kubenswrapper[5002]: E1209 10:06:16.987327 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64c48f4389fcedd72f729b595799e025ede65bd80ed33666185d3dc94126de10\": container with ID starting with 64c48f4389fcedd72f729b595799e025ede65bd80ed33666185d3dc94126de10 not found: ID does not exist" containerID="64c48f4389fcedd72f729b595799e025ede65bd80ed33666185d3dc94126de10"
\"64c48f4389fcedd72f729b595799e025ede65bd80ed33666185d3dc94126de10\": container with ID starting with 64c48f4389fcedd72f729b595799e025ede65bd80ed33666185d3dc94126de10 not found: ID does not exist" Dec 09 10:06:16 crc kubenswrapper[5002]: I1209 10:06:16.992799 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-786fd98fb4-pwgfk"] Dec 09 10:06:17 crc kubenswrapper[5002]: I1209 10:06:17.958863 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d49bbb4c4-zsfzt" event={"ID":"ab033b75-38a6-41b7-8fed-c6296aac6570","Type":"ContainerStarted","Data":"b80c6e3323d21e234448d038857d8d7263754838f13bdd76f543fd67fd708e3b"} Dec 09 10:06:17 crc kubenswrapper[5002]: I1209 10:06:17.960029 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-d49bbb4c4-zsfzt" Dec 09 10:06:17 crc kubenswrapper[5002]: I1209 10:06:17.963955 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-d49bbb4c4-zsfzt" Dec 09 10:06:17 crc kubenswrapper[5002]: I1209 10:06:17.974024 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-d49bbb4c4-zsfzt" podStartSLOduration=3.974002269 podStartE2EDuration="3.974002269s" podCreationTimestamp="2025-12-09 10:06:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:06:17.97336595 +0000 UTC m=+310.365417041" watchObservedRunningTime="2025-12-09 10:06:17.974002269 +0000 UTC m=+310.366053360" Dec 09 10:06:18 crc kubenswrapper[5002]: I1209 10:06:18.068862 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfcb9729-596d-4afa-a55d-66220c7d04e9" path="/var/lib/kubelet/pods/cfcb9729-596d-4afa-a55d-66220c7d04e9/volumes" Dec 09 10:06:18 crc kubenswrapper[5002]: I1209 10:06:18.069404 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa217b39-c5f7-47d5-8abe-30d2e2912ab0" path="/var/lib/kubelet/pods/fa217b39-c5f7-47d5-8abe-30d2e2912ab0/volumes" Dec 09 10:06:18 crc kubenswrapper[5002]: I1209 10:06:18.912512 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-77b7d685b-c2jrp"] Dec 09 10:06:18 crc kubenswrapper[5002]: I1209 10:06:18.913107 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-77b7d685b-c2jrp" Dec 09 10:06:18 crc kubenswrapper[5002]: I1209 10:06:18.916637 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 09 10:06:18 crc kubenswrapper[5002]: I1209 10:06:18.917061 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 09 10:06:18 crc kubenswrapper[5002]: I1209 10:06:18.917224 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 09 10:06:18 crc kubenswrapper[5002]: I1209 10:06:18.917420 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 09 10:06:18 crc kubenswrapper[5002]: I1209 10:06:18.917607 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 09 10:06:18 crc kubenswrapper[5002]: I1209 10:06:18.920907 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 09 10:06:18 crc kubenswrapper[5002]: I1209 10:06:18.926375 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 09 10:06:18 crc kubenswrapper[5002]: I1209 10:06:18.931812 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-77b7d685b-c2jrp"] Dec 09 10:06:19 crc kubenswrapper[5002]: I1209 10:06:19.015580 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p94mt\" (UniqueName: \"kubernetes.io/projected/778e9f9f-a7bd-4426-bb62-a0819aea8fe1-kube-api-access-p94mt\") pod \"controller-manager-77b7d685b-c2jrp\" (UID: \"778e9f9f-a7bd-4426-bb62-a0819aea8fe1\") " pod="openshift-controller-manager/controller-manager-77b7d685b-c2jrp" Dec 09 10:06:19 crc kubenswrapper[5002]: I1209 10:06:19.015684 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/778e9f9f-a7bd-4426-bb62-a0819aea8fe1-client-ca\") pod \"controller-manager-77b7d685b-c2jrp\" (UID: \"778e9f9f-a7bd-4426-bb62-a0819aea8fe1\") " pod="openshift-controller-manager/controller-manager-77b7d685b-c2jrp" Dec 09 10:06:19 crc kubenswrapper[5002]: I1209 10:06:19.015742 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/778e9f9f-a7bd-4426-bb62-a0819aea8fe1-config\") pod \"controller-manager-77b7d685b-c2jrp\" (UID: \"778e9f9f-a7bd-4426-bb62-a0819aea8fe1\") " pod="openshift-controller-manager/controller-manager-77b7d685b-c2jrp" Dec 09 10:06:19 crc kubenswrapper[5002]: I1209 10:06:19.015775 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/778e9f9f-a7bd-4426-bb62-a0819aea8fe1-serving-cert\") pod \"controller-manager-77b7d685b-c2jrp\" (UID: \"778e9f9f-a7bd-4426-bb62-a0819aea8fe1\") " pod="openshift-controller-manager/controller-manager-77b7d685b-c2jrp" Dec 09 10:06:19 crc kubenswrapper[5002]: I1209 10:06:19.015885 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/778e9f9f-a7bd-4426-bb62-a0819aea8fe1-proxy-ca-bundles\") pod \"controller-manager-77b7d685b-c2jrp\" (UID: \"778e9f9f-a7bd-4426-bb62-a0819aea8fe1\") " pod="openshift-controller-manager/controller-manager-77b7d685b-c2jrp" Dec 09 10:06:19 crc kubenswrapper[5002]: I1209 10:06:19.117522 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/778e9f9f-a7bd-4426-bb62-a0819aea8fe1-proxy-ca-bundles\") pod \"controller-manager-77b7d685b-c2jrp\" (UID: \"778e9f9f-a7bd-4426-bb62-a0819aea8fe1\") " pod="openshift-controller-manager/controller-manager-77b7d685b-c2jrp" Dec 09 10:06:19 crc kubenswrapper[5002]: I1209 10:06:19.117858 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p94mt\" (UniqueName: \"kubernetes.io/projected/778e9f9f-a7bd-4426-bb62-a0819aea8fe1-kube-api-access-p94mt\") pod \"controller-manager-77b7d685b-c2jrp\" (UID: \"778e9f9f-a7bd-4426-bb62-a0819aea8fe1\") " pod="openshift-controller-manager/controller-manager-77b7d685b-c2jrp" Dec 09 10:06:19 crc kubenswrapper[5002]: I1209 10:06:19.117896 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/778e9f9f-a7bd-4426-bb62-a0819aea8fe1-client-ca\") pod \"controller-manager-77b7d685b-c2jrp\" (UID: \"778e9f9f-a7bd-4426-bb62-a0819aea8fe1\") " pod="openshift-controller-manager/controller-manager-77b7d685b-c2jrp" Dec 09 10:06:19 crc kubenswrapper[5002]: I1209 10:06:19.117941 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/778e9f9f-a7bd-4426-bb62-a0819aea8fe1-config\") pod \"controller-manager-77b7d685b-c2jrp\" (UID: \"778e9f9f-a7bd-4426-bb62-a0819aea8fe1\") " pod="openshift-controller-manager/controller-manager-77b7d685b-c2jrp" Dec 09 10:06:19 crc kubenswrapper[5002]: I1209 10:06:19.117973 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/778e9f9f-a7bd-4426-bb62-a0819aea8fe1-serving-cert\") pod \"controller-manager-77b7d685b-c2jrp\" (UID: \"778e9f9f-a7bd-4426-bb62-a0819aea8fe1\") " pod="openshift-controller-manager/controller-manager-77b7d685b-c2jrp" Dec 09 10:06:19 crc kubenswrapper[5002]: I1209 10:06:19.119488 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/778e9f9f-a7bd-4426-bb62-a0819aea8fe1-client-ca\") pod \"controller-manager-77b7d685b-c2jrp\" (UID: \"778e9f9f-a7bd-4426-bb62-a0819aea8fe1\") " pod="openshift-controller-manager/controller-manager-77b7d685b-c2jrp" Dec 09 10:06:19 crc kubenswrapper[5002]: I1209 10:06:19.119491 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/778e9f9f-a7bd-4426-bb62-a0819aea8fe1-proxy-ca-bundles\") pod \"controller-manager-77b7d685b-c2jrp\" (UID: \"778e9f9f-a7bd-4426-bb62-a0819aea8fe1\") " pod="openshift-controller-manager/controller-manager-77b7d685b-c2jrp" Dec 09 10:06:19 crc kubenswrapper[5002]: I1209 10:06:19.120400 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/778e9f9f-a7bd-4426-bb62-a0819aea8fe1-config\") pod \"controller-manager-77b7d685b-c2jrp\" (UID: \"778e9f9f-a7bd-4426-bb62-a0819aea8fe1\") " pod="openshift-controller-manager/controller-manager-77b7d685b-c2jrp" Dec 09 
10:06:19 crc kubenswrapper[5002]: I1209 10:06:19.126020 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/778e9f9f-a7bd-4426-bb62-a0819aea8fe1-serving-cert\") pod \"controller-manager-77b7d685b-c2jrp\" (UID: \"778e9f9f-a7bd-4426-bb62-a0819aea8fe1\") " pod="openshift-controller-manager/controller-manager-77b7d685b-c2jrp" Dec 09 10:06:19 crc kubenswrapper[5002]: I1209 10:06:19.142406 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p94mt\" (UniqueName: \"kubernetes.io/projected/778e9f9f-a7bd-4426-bb62-a0819aea8fe1-kube-api-access-p94mt\") pod \"controller-manager-77b7d685b-c2jrp\" (UID: \"778e9f9f-a7bd-4426-bb62-a0819aea8fe1\") " pod="openshift-controller-manager/controller-manager-77b7d685b-c2jrp" Dec 09 10:06:19 crc kubenswrapper[5002]: I1209 10:06:19.230596 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77b7d685b-c2jrp" Dec 09 10:06:19 crc kubenswrapper[5002]: I1209 10:06:19.446953 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-77b7d685b-c2jrp"] Dec 09 10:06:19 crc kubenswrapper[5002]: W1209 10:06:19.461466 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod778e9f9f_a7bd_4426_bb62_a0819aea8fe1.slice/crio-60b80805ceedf8af30c05cca832b1fdb9cb1246c932e4d65b056140aa9687882 WatchSource:0}: Error finding container 60b80805ceedf8af30c05cca832b1fdb9cb1246c932e4d65b056140aa9687882: Status 404 returned error can't find the container with id 60b80805ceedf8af30c05cca832b1fdb9cb1246c932e4d65b056140aa9687882 Dec 09 10:06:19 crc kubenswrapper[5002]: I1209 10:06:19.971350 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77b7d685b-c2jrp" event={"ID":"778e9f9f-a7bd-4426-bb62-a0819aea8fe1","Type":"ContainerStarted","Data":"3f989fa3ca8eb1ed0e0c0f32e1dce7a720e3d3c05b6fe40a3203d5296756c5d7"} Dec 09 10:06:19 crc kubenswrapper[5002]: I1209 10:06:19.971446 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77b7d685b-c2jrp" event={"ID":"778e9f9f-a7bd-4426-bb62-a0819aea8fe1","Type":"ContainerStarted","Data":"60b80805ceedf8af30c05cca832b1fdb9cb1246c932e4d65b056140aa9687882"} Dec 09 10:06:19 crc kubenswrapper[5002]: I1209 10:06:19.971479 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-77b7d685b-c2jrp" Dec 09 10:06:19 crc kubenswrapper[5002]: I1209 10:06:19.986276 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-77b7d685b-c2jrp" Dec 09 10:06:20 crc kubenswrapper[5002]: I1209 10:06:20.007037 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-77b7d685b-c2jrp" podStartSLOduration=6.007014769 podStartE2EDuration="6.007014769s" podCreationTimestamp="2025-12-09 10:06:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:06:19.989221384 +0000 UTC m=+312.381272475" watchObservedRunningTime="2025-12-09 10:06:20.007014769 +0000 UTC m=+312.399065850" Dec 09 10:06:28 crc kubenswrapper[5002]: I1209 10:06:28.975674 5002 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/community-operators-6h4f9"] Dec 09 10:06:28 crc kubenswrapper[5002]: I1209 10:06:28.978171 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6h4f9" Dec 09 10:06:28 crc kubenswrapper[5002]: I1209 10:06:28.983267 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 09 10:06:28 crc kubenswrapper[5002]: I1209 10:06:28.985086 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6h4f9"] Dec 09 10:06:29 crc kubenswrapper[5002]: I1209 10:06:29.149999 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35816d2c-1627-4817-a15b-4339b0ca0ef6-utilities\") pod \"community-operators-6h4f9\" (UID: \"35816d2c-1627-4817-a15b-4339b0ca0ef6\") " pod="openshift-marketplace/community-operators-6h4f9" Dec 09 10:06:29 crc kubenswrapper[5002]: I1209 10:06:29.150129 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnmhw\" (UniqueName: \"kubernetes.io/projected/35816d2c-1627-4817-a15b-4339b0ca0ef6-kube-api-access-hnmhw\") pod \"community-operators-6h4f9\" (UID: \"35816d2c-1627-4817-a15b-4339b0ca0ef6\") " pod="openshift-marketplace/community-operators-6h4f9" Dec 09 10:06:29 crc kubenswrapper[5002]: I1209 10:06:29.150273 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35816d2c-1627-4817-a15b-4339b0ca0ef6-catalog-content\") pod \"community-operators-6h4f9\" (UID: \"35816d2c-1627-4817-a15b-4339b0ca0ef6\") " pod="openshift-marketplace/community-operators-6h4f9" Dec 09 10:06:29 crc kubenswrapper[5002]: I1209 10:06:29.165893 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kgbq4"] Dec 09 10:06:29 crc kubenswrapper[5002]: I1209 10:06:29.166809 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kgbq4" Dec 09 10:06:29 crc kubenswrapper[5002]: I1209 10:06:29.170420 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 09 10:06:29 crc kubenswrapper[5002]: I1209 10:06:29.172906 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kgbq4"] Dec 09 10:06:29 crc kubenswrapper[5002]: I1209 10:06:29.252056 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35816d2c-1627-4817-a15b-4339b0ca0ef6-catalog-content\") pod \"community-operators-6h4f9\" (UID: \"35816d2c-1627-4817-a15b-4339b0ca0ef6\") " pod="openshift-marketplace/community-operators-6h4f9" Dec 09 10:06:29 crc kubenswrapper[5002]: I1209 10:06:29.252143 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35816d2c-1627-4817-a15b-4339b0ca0ef6-utilities\") pod \"community-operators-6h4f9\" (UID: \"35816d2c-1627-4817-a15b-4339b0ca0ef6\") " pod="openshift-marketplace/community-operators-6h4f9" Dec 09 10:06:29 crc kubenswrapper[5002]: I1209 10:06:29.252201 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnmhw\" (UniqueName: \"kubernetes.io/projected/35816d2c-1627-4817-a15b-4339b0ca0ef6-kube-api-access-hnmhw\") pod \"community-operators-6h4f9\" (UID: \"35816d2c-1627-4817-a15b-4339b0ca0ef6\") " pod="openshift-marketplace/community-operators-6h4f9" Dec 09 10:06:29 crc kubenswrapper[5002]: I1209 10:06:29.252670 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35816d2c-1627-4817-a15b-4339b0ca0ef6-utilities\") pod \"community-operators-6h4f9\" (UID: \"35816d2c-1627-4817-a15b-4339b0ca0ef6\") " pod="openshift-marketplace/community-operators-6h4f9" Dec 09 10:06:29 crc kubenswrapper[5002]: I1209 10:06:29.252670 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35816d2c-1627-4817-a15b-4339b0ca0ef6-catalog-content\") pod \"community-operators-6h4f9\" (UID: \"35816d2c-1627-4817-a15b-4339b0ca0ef6\") " pod="openshift-marketplace/community-operators-6h4f9" Dec 09 10:06:29 crc kubenswrapper[5002]: I1209 10:06:29.275845 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnmhw\" (UniqueName: \"kubernetes.io/projected/35816d2c-1627-4817-a15b-4339b0ca0ef6-kube-api-access-hnmhw\") pod \"community-operators-6h4f9\" (UID: \"35816d2c-1627-4817-a15b-4339b0ca0ef6\") " pod="openshift-marketplace/community-operators-6h4f9" Dec 09 10:06:29 crc kubenswrapper[5002]: I1209 10:06:29.349517 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6h4f9" Dec 09 10:06:29 crc kubenswrapper[5002]: I1209 10:06:29.353522 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t7cb\" (UniqueName: \"kubernetes.io/projected/170c8ce8-33c6-4369-bee5-88bbf86890ca-kube-api-access-8t7cb\") pod \"certified-operators-kgbq4\" (UID: \"170c8ce8-33c6-4369-bee5-88bbf86890ca\") " pod="openshift-marketplace/certified-operators-kgbq4" Dec 09 10:06:29 crc kubenswrapper[5002]: I1209 10:06:29.353606 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/170c8ce8-33c6-4369-bee5-88bbf86890ca-catalog-content\") pod \"certified-operators-kgbq4\" (UID: \"170c8ce8-33c6-4369-bee5-88bbf86890ca\") " pod="openshift-marketplace/certified-operators-kgbq4" Dec 09 10:06:29 crc kubenswrapper[5002]: I1209 10:06:29.353927 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/170c8ce8-33c6-4369-bee5-88bbf86890ca-utilities\") pod \"certified-operators-kgbq4\" (UID: \"170c8ce8-33c6-4369-bee5-88bbf86890ca\") " pod="openshift-marketplace/certified-operators-kgbq4" Dec 09 10:06:29 crc kubenswrapper[5002]: I1209 10:06:29.455308 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/170c8ce8-33c6-4369-bee5-88bbf86890ca-utilities\") pod \"certified-operators-kgbq4\" (UID: \"170c8ce8-33c6-4369-bee5-88bbf86890ca\") " pod="openshift-marketplace/certified-operators-kgbq4" Dec 09 10:06:29 crc kubenswrapper[5002]: I1209 10:06:29.455631 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8t7cb\" (UniqueName: \"kubernetes.io/projected/170c8ce8-33c6-4369-bee5-88bbf86890ca-kube-api-access-8t7cb\") pod \"certified-operators-kgbq4\" (UID: \"170c8ce8-33c6-4369-bee5-88bbf86890ca\") " pod="openshift-marketplace/certified-operators-kgbq4" Dec 09 10:06:29 crc kubenswrapper[5002]: I1209 10:06:29.455655 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/170c8ce8-33c6-4369-bee5-88bbf86890ca-catalog-content\") pod \"certified-operators-kgbq4\" (UID: \"170c8ce8-33c6-4369-bee5-88bbf86890ca\") " pod="openshift-marketplace/certified-operators-kgbq4" Dec 09 10:06:29 crc kubenswrapper[5002]: I1209 10:06:29.455749 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/170c8ce8-33c6-4369-bee5-88bbf86890ca-utilities\") pod \"certified-operators-kgbq4\" (UID: \"170c8ce8-33c6-4369-bee5-88bbf86890ca\") " pod="openshift-marketplace/certified-operators-kgbq4" Dec 09 10:06:29 crc kubenswrapper[5002]: I1209 10:06:29.456034 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/170c8ce8-33c6-4369-bee5-88bbf86890ca-catalog-content\") pod \"certified-operators-kgbq4\" (UID: \"170c8ce8-33c6-4369-bee5-88bbf86890ca\") " pod="openshift-marketplace/certified-operators-kgbq4" Dec 09 10:06:29 crc kubenswrapper[5002]: I1209 10:06:29.473793 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t7cb\" (UniqueName: \"kubernetes.io/projected/170c8ce8-33c6-4369-bee5-88bbf86890ca-kube-api-access-8t7cb\") pod 
\"certified-operators-kgbq4\" (UID: \"170c8ce8-33c6-4369-bee5-88bbf86890ca\") " pod="openshift-marketplace/certified-operators-kgbq4" Dec 09 10:06:29 crc kubenswrapper[5002]: I1209 10:06:29.483675 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kgbq4" Dec 09 10:06:29 crc kubenswrapper[5002]: W1209 10:06:29.756172 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35816d2c_1627_4817_a15b_4339b0ca0ef6.slice/crio-2860b4dbe4006ee0e3369e845f2ccba6628b133fec309dad71170d196054626f WatchSource:0}: Error finding container 2860b4dbe4006ee0e3369e845f2ccba6628b133fec309dad71170d196054626f: Status 404 returned error can't find the container with id 2860b4dbe4006ee0e3369e845f2ccba6628b133fec309dad71170d196054626f Dec 09 10:06:29 crc kubenswrapper[5002]: I1209 10:06:29.757054 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6h4f9"] Dec 09 10:06:29 crc kubenswrapper[5002]: I1209 10:06:29.889665 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kgbq4"] Dec 09 10:06:29 crc kubenswrapper[5002]: W1209 10:06:29.893282 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod170c8ce8_33c6_4369_bee5_88bbf86890ca.slice/crio-b6c30fb16a99f6cae916d782c9b93f989a13a3f9aa30e743b07a698c75c2b7c6 WatchSource:0}: Error finding container b6c30fb16a99f6cae916d782c9b93f989a13a3f9aa30e743b07a698c75c2b7c6: Status 404 returned error can't find the container with id b6c30fb16a99f6cae916d782c9b93f989a13a3f9aa30e743b07a698c75c2b7c6 Dec 09 10:06:30 crc kubenswrapper[5002]: I1209 10:06:30.038417 5002 generic.go:334] "Generic (PLEG): container finished" podID="35816d2c-1627-4817-a15b-4339b0ca0ef6" containerID="a59944a7ae74ffe719c7b164f45113927f14311b421f9eed5148fb4bacc6f420" exitCode=0 Dec 09 10:06:30 crc kubenswrapper[5002]: I1209 10:06:30.038487 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6h4f9" event={"ID":"35816d2c-1627-4817-a15b-4339b0ca0ef6","Type":"ContainerDied","Data":"a59944a7ae74ffe719c7b164f45113927f14311b421f9eed5148fb4bacc6f420"} Dec 09 10:06:30 crc kubenswrapper[5002]: I1209 10:06:30.038511 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6h4f9" event={"ID":"35816d2c-1627-4817-a15b-4339b0ca0ef6","Type":"ContainerStarted","Data":"2860b4dbe4006ee0e3369e845f2ccba6628b133fec309dad71170d196054626f"} Dec 09 10:06:30 crc kubenswrapper[5002]: I1209 10:06:30.041334 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kgbq4" event={"ID":"170c8ce8-33c6-4369-bee5-88bbf86890ca","Type":"ContainerStarted","Data":"b6c30fb16a99f6cae916d782c9b93f989a13a3f9aa30e743b07a698c75c2b7c6"} Dec 09 10:06:30 crc kubenswrapper[5002]: I1209 10:06:30.759585 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-q9xhw"] Dec 09 10:06:30 crc kubenswrapper[5002]: I1209 10:06:30.760804 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q9xhw" Dec 09 10:06:30 crc kubenswrapper[5002]: I1209 10:06:30.765207 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 09 10:06:30 crc kubenswrapper[5002]: I1209 10:06:30.769771 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q9xhw"] Dec 09 10:06:30 crc kubenswrapper[5002]: I1209 10:06:30.771961 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntdgq\" (UniqueName: \"kubernetes.io/projected/cf487cfa-4480-4b72-86a8-6c57cc06403a-kube-api-access-ntdgq\") pod \"redhat-marketplace-q9xhw\" (UID: \"cf487cfa-4480-4b72-86a8-6c57cc06403a\") " pod="openshift-marketplace/redhat-marketplace-q9xhw" Dec 09 10:06:30 crc kubenswrapper[5002]: I1209 10:06:30.772045 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf487cfa-4480-4b72-86a8-6c57cc06403a-catalog-content\") pod \"redhat-marketplace-q9xhw\" (UID: \"cf487cfa-4480-4b72-86a8-6c57cc06403a\") " pod="openshift-marketplace/redhat-marketplace-q9xhw" Dec 09 10:06:30 crc kubenswrapper[5002]: I1209 10:06:30.772115 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf487cfa-4480-4b72-86a8-6c57cc06403a-utilities\") pod \"redhat-marketplace-q9xhw\" (UID: \"cf487cfa-4480-4b72-86a8-6c57cc06403a\") " pod="openshift-marketplace/redhat-marketplace-q9xhw" Dec 09 10:06:30 crc kubenswrapper[5002]: I1209 10:06:30.872920 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntdgq\" (UniqueName: \"kubernetes.io/projected/cf487cfa-4480-4b72-86a8-6c57cc06403a-kube-api-access-ntdgq\") pod \"redhat-marketplace-q9xhw\" (UID: \"cf487cfa-4480-4b72-86a8-6c57cc06403a\") " pod="openshift-marketplace/redhat-marketplace-q9xhw" Dec 09 10:06:30 crc kubenswrapper[5002]: I1209 10:06:30.872987 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf487cfa-4480-4b72-86a8-6c57cc06403a-catalog-content\") pod \"redhat-marketplace-q9xhw\" (UID: \"cf487cfa-4480-4b72-86a8-6c57cc06403a\") " pod="openshift-marketplace/redhat-marketplace-q9xhw" Dec 09 10:06:30 crc kubenswrapper[5002]: I1209 10:06:30.873028 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf487cfa-4480-4b72-86a8-6c57cc06403a-utilities\") pod \"redhat-marketplace-q9xhw\" (UID: \"cf487cfa-4480-4b72-86a8-6c57cc06403a\") " pod="openshift-marketplace/redhat-marketplace-q9xhw" Dec 09 10:06:30 crc kubenswrapper[5002]: I1209 10:06:30.873419 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf487cfa-4480-4b72-86a8-6c57cc06403a-utilities\") pod \"redhat-marketplace-q9xhw\" (UID: \"cf487cfa-4480-4b72-86a8-6c57cc06403a\") " pod="openshift-marketplace/redhat-marketplace-q9xhw" Dec 09 10:06:30 crc kubenswrapper[5002]: I1209 10:06:30.873621 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf487cfa-4480-4b72-86a8-6c57cc06403a-catalog-content\") pod \"redhat-marketplace-q9xhw\" (UID: 
\"cf487cfa-4480-4b72-86a8-6c57cc06403a\") " pod="openshift-marketplace/redhat-marketplace-q9xhw" Dec 09 10:06:30 crc kubenswrapper[5002]: I1209 10:06:30.894445 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntdgq\" (UniqueName: \"kubernetes.io/projected/cf487cfa-4480-4b72-86a8-6c57cc06403a-kube-api-access-ntdgq\") pod \"redhat-marketplace-q9xhw\" (UID: \"cf487cfa-4480-4b72-86a8-6c57cc06403a\") " pod="openshift-marketplace/redhat-marketplace-q9xhw" Dec 09 10:06:31 crc kubenswrapper[5002]: I1209 10:06:31.053960 5002 generic.go:334] "Generic (PLEG): container finished" podID="170c8ce8-33c6-4369-bee5-88bbf86890ca" containerID="1c7ee562cb991749758d9421a62cdea11b6a41e7e79c6a80166afb6565441892" exitCode=0 Dec 09 10:06:31 crc kubenswrapper[5002]: I1209 10:06:31.054015 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kgbq4" event={"ID":"170c8ce8-33c6-4369-bee5-88bbf86890ca","Type":"ContainerDied","Data":"1c7ee562cb991749758d9421a62cdea11b6a41e7e79c6a80166afb6565441892"} Dec 09 10:06:31 crc kubenswrapper[5002]: I1209 10:06:31.075774 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q9xhw" Dec 09 10:06:31 crc kubenswrapper[5002]: I1209 10:06:31.469367 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q9xhw"] Dec 09 10:06:31 crc kubenswrapper[5002]: I1209 10:06:31.766221 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lp6l4"] Dec 09 10:06:31 crc kubenswrapper[5002]: I1209 10:06:31.767522 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lp6l4" Dec 09 10:06:31 crc kubenswrapper[5002]: I1209 10:06:31.769940 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 09 10:06:31 crc kubenswrapper[5002]: I1209 10:06:31.771323 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lp6l4"] Dec 09 10:06:31 crc kubenswrapper[5002]: I1209 10:06:31.886556 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9dc64ca0-5e54-466d-b19f-7726b430268a-catalog-content\") pod \"redhat-operators-lp6l4\" (UID: \"9dc64ca0-5e54-466d-b19f-7726b430268a\") " pod="openshift-marketplace/redhat-operators-lp6l4" Dec 09 10:06:31 crc kubenswrapper[5002]: I1209 10:06:31.886618 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xb7m\" (UniqueName: \"kubernetes.io/projected/9dc64ca0-5e54-466d-b19f-7726b430268a-kube-api-access-6xb7m\") pod \"redhat-operators-lp6l4\" (UID: \"9dc64ca0-5e54-466d-b19f-7726b430268a\") " pod="openshift-marketplace/redhat-operators-lp6l4" Dec 09 10:06:31 crc kubenswrapper[5002]: I1209 10:06:31.886685 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9dc64ca0-5e54-466d-b19f-7726b430268a-utilities\") pod \"redhat-operators-lp6l4\" (UID: \"9dc64ca0-5e54-466d-b19f-7726b430268a\") " pod="openshift-marketplace/redhat-operators-lp6l4" Dec 09 10:06:31 crc kubenswrapper[5002]: I1209 10:06:31.987472 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/9dc64ca0-5e54-466d-b19f-7726b430268a-catalog-content\") pod \"redhat-operators-lp6l4\" (UID: \"9dc64ca0-5e54-466d-b19f-7726b430268a\") " pod="openshift-marketplace/redhat-operators-lp6l4" Dec 09 10:06:31 crc kubenswrapper[5002]: I1209 10:06:31.987522 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xb7m\" (UniqueName: \"kubernetes.io/projected/9dc64ca0-5e54-466d-b19f-7726b430268a-kube-api-access-6xb7m\") pod \"redhat-operators-lp6l4\" (UID: \"9dc64ca0-5e54-466d-b19f-7726b430268a\") " pod="openshift-marketplace/redhat-operators-lp6l4" Dec 09 10:06:31 crc kubenswrapper[5002]: I1209 10:06:31.987562 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9dc64ca0-5e54-466d-b19f-7726b430268a-utilities\") pod \"redhat-operators-lp6l4\" (UID: \"9dc64ca0-5e54-466d-b19f-7726b430268a\") " pod="openshift-marketplace/redhat-operators-lp6l4" Dec 09 10:06:31 crc kubenswrapper[5002]: I1209 10:06:31.987920 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9dc64ca0-5e54-466d-b19f-7726b430268a-catalog-content\") pod \"redhat-operators-lp6l4\" (UID: \"9dc64ca0-5e54-466d-b19f-7726b430268a\") " pod="openshift-marketplace/redhat-operators-lp6l4" Dec 09 10:06:31 crc kubenswrapper[5002]: I1209 10:06:31.987952 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9dc64ca0-5e54-466d-b19f-7726b430268a-utilities\") pod \"redhat-operators-lp6l4\" (UID: \"9dc64ca0-5e54-466d-b19f-7726b430268a\") " pod="openshift-marketplace/redhat-operators-lp6l4" Dec 09 10:06:32 crc kubenswrapper[5002]: I1209 10:06:32.010258 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xb7m\" (UniqueName: \"kubernetes.io/projected/9dc64ca0-5e54-466d-b19f-7726b430268a-kube-api-access-6xb7m\") pod \"redhat-operators-lp6l4\" (UID: \"9dc64ca0-5e54-466d-b19f-7726b430268a\") " pod="openshift-marketplace/redhat-operators-lp6l4" Dec 09 10:06:32 crc kubenswrapper[5002]: I1209 10:06:32.060986 5002 generic.go:334] "Generic (PLEG): container finished" podID="170c8ce8-33c6-4369-bee5-88bbf86890ca" containerID="fc10d15f016e220c333c0e9a1e39a765e3cd7738d1e72d4fe814d1d9aeaba84f" exitCode=0 Dec 09 10:06:32 crc kubenswrapper[5002]: I1209 10:06:32.062459 5002 generic.go:334] "Generic (PLEG): container finished" podID="cf487cfa-4480-4b72-86a8-6c57cc06403a" containerID="d2062719df7a8342ab0bbe6033b74429917a29f52763a453da36849c953cb103" exitCode=0 Dec 09 10:06:32 crc kubenswrapper[5002]: I1209 10:06:32.065071 5002 generic.go:334] "Generic (PLEG): container finished" podID="35816d2c-1627-4817-a15b-4339b0ca0ef6" containerID="3f63158af148a09ed36da1b37c3be5007e1147a6e2da756fca8b422a9f02e625" exitCode=0 Dec 09 10:06:32 crc kubenswrapper[5002]: I1209 10:06:32.070684 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kgbq4" event={"ID":"170c8ce8-33c6-4369-bee5-88bbf86890ca","Type":"ContainerDied","Data":"fc10d15f016e220c333c0e9a1e39a765e3cd7738d1e72d4fe814d1d9aeaba84f"} Dec 09 10:06:32 crc kubenswrapper[5002]: I1209 10:06:32.070751 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q9xhw" 
event={"ID":"cf487cfa-4480-4b72-86a8-6c57cc06403a","Type":"ContainerDied","Data":"d2062719df7a8342ab0bbe6033b74429917a29f52763a453da36849c953cb103"} Dec 09 10:06:32 crc kubenswrapper[5002]: I1209 10:06:32.070767 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q9xhw" event={"ID":"cf487cfa-4480-4b72-86a8-6c57cc06403a","Type":"ContainerStarted","Data":"08d90ee7d0d1b20d3707207531cfb005c692d633cc4ac334445ba61b76de95e6"} Dec 09 10:06:32 crc kubenswrapper[5002]: I1209 10:06:32.070779 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6h4f9" event={"ID":"35816d2c-1627-4817-a15b-4339b0ca0ef6","Type":"ContainerDied","Data":"3f63158af148a09ed36da1b37c3be5007e1147a6e2da756fca8b422a9f02e625"} Dec 09 10:06:32 crc kubenswrapper[5002]: I1209 10:06:32.095153 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lp6l4" Dec 09 10:06:32 crc kubenswrapper[5002]: I1209 10:06:32.517682 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lp6l4"] Dec 09 10:06:32 crc kubenswrapper[5002]: I1209 10:06:32.605714 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-77b7d685b-c2jrp"] Dec 09 10:06:32 crc kubenswrapper[5002]: I1209 10:06:32.606242 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-77b7d685b-c2jrp" podUID="778e9f9f-a7bd-4426-bb62-a0819aea8fe1" containerName="controller-manager" containerID="cri-o://3f989fa3ca8eb1ed0e0c0f32e1dce7a720e3d3c05b6fe40a3203d5296756c5d7" gracePeriod=30 Dec 09 10:06:33 crc kubenswrapper[5002]: I1209 10:06:33.072539 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kgbq4" event={"ID":"170c8ce8-33c6-4369-bee5-88bbf86890ca","Type":"ContainerStarted","Data":"6d67419f4121804d866dd478fc528db07e8fe5b1167950e623652915edca0c1e"} Dec 09 10:06:33 crc kubenswrapper[5002]: I1209 10:06:33.073966 5002 generic.go:334] "Generic (PLEG): container finished" podID="cf487cfa-4480-4b72-86a8-6c57cc06403a" containerID="eb2c5e7428e14cf4ce4519ae5cb8b3d5489e367637cece6c4c31e47eb45cec13" exitCode=0 Dec 09 10:06:33 crc kubenswrapper[5002]: I1209 10:06:33.074078 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q9xhw" event={"ID":"cf487cfa-4480-4b72-86a8-6c57cc06403a","Type":"ContainerDied","Data":"eb2c5e7428e14cf4ce4519ae5cb8b3d5489e367637cece6c4c31e47eb45cec13"} Dec 09 10:06:33 crc kubenswrapper[5002]: I1209 10:06:33.078710 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6h4f9" event={"ID":"35816d2c-1627-4817-a15b-4339b0ca0ef6","Type":"ContainerStarted","Data":"7aa40db3ce828ba126a41c400835ed17f0ec52e35c53be6e4b3f3620fe530e49"} Dec 09 10:06:33 crc kubenswrapper[5002]: I1209 10:06:33.081351 5002 generic.go:334] "Generic (PLEG): container finished" podID="778e9f9f-a7bd-4426-bb62-a0819aea8fe1" containerID="3f989fa3ca8eb1ed0e0c0f32e1dce7a720e3d3c05b6fe40a3203d5296756c5d7" exitCode=0 Dec 09 10:06:33 crc kubenswrapper[5002]: I1209 10:06:33.081622 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77b7d685b-c2jrp" event={"ID":"778e9f9f-a7bd-4426-bb62-a0819aea8fe1","Type":"ContainerDied","Data":"3f989fa3ca8eb1ed0e0c0f32e1dce7a720e3d3c05b6fe40a3203d5296756c5d7"} Dec 
09 10:06:33 crc kubenswrapper[5002]: I1209 10:06:33.083624 5002 generic.go:334] "Generic (PLEG): container finished" podID="9dc64ca0-5e54-466d-b19f-7726b430268a" containerID="8ceb08ae4329a958a7aab9104ead6e429d6f81115d8c1bb32338e2b5c2dad0fd" exitCode=0
Dec 09 10:06:33 crc kubenswrapper[5002]: I1209 10:06:33.083695 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lp6l4" event={"ID":"9dc64ca0-5e54-466d-b19f-7726b430268a","Type":"ContainerDied","Data":"8ceb08ae4329a958a7aab9104ead6e429d6f81115d8c1bb32338e2b5c2dad0fd"}
Dec 09 10:06:33 crc kubenswrapper[5002]: I1209 10:06:33.083737 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lp6l4" event={"ID":"9dc64ca0-5e54-466d-b19f-7726b430268a","Type":"ContainerStarted","Data":"8186823966bea611dd060448f76f12653c0b07b0f415c9dbe0f634b6d6b5da91"}
Dec 09 10:06:33 crc kubenswrapper[5002]: I1209 10:06:33.096489 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kgbq4" podStartSLOduration=2.6427763239999997 podStartE2EDuration="4.096469057s" podCreationTimestamp="2025-12-09 10:06:29 +0000 UTC" firstStartedPulling="2025-12-09 10:06:31.056498015 +0000 UTC m=+323.448549096" lastFinishedPulling="2025-12-09 10:06:32.510190748 +0000 UTC m=+324.902241829" observedRunningTime="2025-12-09 10:06:33.095247551 +0000 UTC m=+325.487298632" watchObservedRunningTime="2025-12-09 10:06:33.096469057 +0000 UTC m=+325.488520138"
Dec 09 10:06:33 crc kubenswrapper[5002]: I1209 10:06:33.113494 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77b7d685b-c2jrp"
Dec 09 10:06:33 crc kubenswrapper[5002]: I1209 10:06:33.132113 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6h4f9" podStartSLOduration=2.60314656 podStartE2EDuration="5.132095077s" podCreationTimestamp="2025-12-09 10:06:28 +0000 UTC" firstStartedPulling="2025-12-09 10:06:30.039722062 +0000 UTC m=+322.431773143" lastFinishedPulling="2025-12-09 10:06:32.568670579 +0000 UTC m=+324.960721660" observedRunningTime="2025-12-09 10:06:33.130077549 +0000 UTC m=+325.522128630" watchObservedRunningTime="2025-12-09 10:06:33.132095077 +0000 UTC m=+325.524146148"
Dec 09 10:06:33 crc kubenswrapper[5002]: I1209 10:06:33.305532 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/778e9f9f-a7bd-4426-bb62-a0819aea8fe1-config\") pod \"778e9f9f-a7bd-4426-bb62-a0819aea8fe1\" (UID: \"778e9f9f-a7bd-4426-bb62-a0819aea8fe1\") "
Dec 09 10:06:33 crc kubenswrapper[5002]: I1209 10:06:33.305599 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p94mt\" (UniqueName: \"kubernetes.io/projected/778e9f9f-a7bd-4426-bb62-a0819aea8fe1-kube-api-access-p94mt\") pod \"778e9f9f-a7bd-4426-bb62-a0819aea8fe1\" (UID: \"778e9f9f-a7bd-4426-bb62-a0819aea8fe1\") "
Dec 09 10:06:33 crc kubenswrapper[5002]: I1209 10:06:33.305703 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/778e9f9f-a7bd-4426-bb62-a0819aea8fe1-proxy-ca-bundles\") pod \"778e9f9f-a7bd-4426-bb62-a0819aea8fe1\" (UID: \"778e9f9f-a7bd-4426-bb62-a0819aea8fe1\") "
Dec 09 10:06:33 crc kubenswrapper[5002]: I1209 10:06:33.305727 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/778e9f9f-a7bd-4426-bb62-a0819aea8fe1-client-ca\") pod \"778e9f9f-a7bd-4426-bb62-a0819aea8fe1\" (UID: \"778e9f9f-a7bd-4426-bb62-a0819aea8fe1\") "
Dec 09 10:06:33 crc kubenswrapper[5002]: I1209 10:06:33.305747 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/778e9f9f-a7bd-4426-bb62-a0819aea8fe1-serving-cert\") pod \"778e9f9f-a7bd-4426-bb62-a0819aea8fe1\" (UID: \"778e9f9f-a7bd-4426-bb62-a0819aea8fe1\") "
Dec 09 10:06:33 crc kubenswrapper[5002]: I1209 10:06:33.306305 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/778e9f9f-a7bd-4426-bb62-a0819aea8fe1-client-ca" (OuterVolumeSpecName: "client-ca") pod "778e9f9f-a7bd-4426-bb62-a0819aea8fe1" (UID: "778e9f9f-a7bd-4426-bb62-a0819aea8fe1"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 10:06:33 crc kubenswrapper[5002]: I1209 10:06:33.306367 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/778e9f9f-a7bd-4426-bb62-a0819aea8fe1-config" (OuterVolumeSpecName: "config") pod "778e9f9f-a7bd-4426-bb62-a0819aea8fe1" (UID: "778e9f9f-a7bd-4426-bb62-a0819aea8fe1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 10:06:33 crc kubenswrapper[5002]: I1209 10:06:33.306391 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/778e9f9f-a7bd-4426-bb62-a0819aea8fe1-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "778e9f9f-a7bd-4426-bb62-a0819aea8fe1" (UID: "778e9f9f-a7bd-4426-bb62-a0819aea8fe1"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 10:06:33 crc kubenswrapper[5002]: I1209 10:06:33.311219 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/778e9f9f-a7bd-4426-bb62-a0819aea8fe1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "778e9f9f-a7bd-4426-bb62-a0819aea8fe1" (UID: "778e9f9f-a7bd-4426-bb62-a0819aea8fe1"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 10:06:33 crc kubenswrapper[5002]: I1209 10:06:33.311378 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/778e9f9f-a7bd-4426-bb62-a0819aea8fe1-kube-api-access-p94mt" (OuterVolumeSpecName: "kube-api-access-p94mt") pod "778e9f9f-a7bd-4426-bb62-a0819aea8fe1" (UID: "778e9f9f-a7bd-4426-bb62-a0819aea8fe1"). InnerVolumeSpecName "kube-api-access-p94mt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 10:06:33 crc kubenswrapper[5002]: I1209 10:06:33.407442 5002 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/778e9f9f-a7bd-4426-bb62-a0819aea8fe1-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Dec 09 10:06:33 crc kubenswrapper[5002]: I1209 10:06:33.407501 5002 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/778e9f9f-a7bd-4426-bb62-a0819aea8fe1-client-ca\") on node \"crc\" DevicePath \"\""
Dec 09 10:06:33 crc kubenswrapper[5002]: I1209 10:06:33.407517 5002 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/778e9f9f-a7bd-4426-bb62-a0819aea8fe1-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 09 10:06:33 crc kubenswrapper[5002]: I1209 10:06:33.407529 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/778e9f9f-a7bd-4426-bb62-a0819aea8fe1-config\") on node \"crc\" DevicePath \"\""
Dec 09 10:06:33 crc kubenswrapper[5002]: I1209 10:06:33.407541 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p94mt\" (UniqueName: \"kubernetes.io/projected/778e9f9f-a7bd-4426-bb62-a0819aea8fe1-kube-api-access-p94mt\") on node \"crc\" DevicePath \"\""
Dec 09 10:06:33 crc kubenswrapper[5002]: I1209 10:06:33.928923 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-786fd98fb4-kzs46"]
Dec 09 10:06:33 crc kubenswrapper[5002]: E1209 10:06:33.929475 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="778e9f9f-a7bd-4426-bb62-a0819aea8fe1" containerName="controller-manager"
Dec 09 10:06:33 crc kubenswrapper[5002]: I1209 10:06:33.929503 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="778e9f9f-a7bd-4426-bb62-a0819aea8fe1" containerName="controller-manager"
Dec 09 10:06:33 crc kubenswrapper[5002]: I1209 10:06:33.929786 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="778e9f9f-a7bd-4426-bb62-a0819aea8fe1" containerName="controller-manager"
Dec 09 10:06:33 crc kubenswrapper[5002]: I1209 10:06:33.933927 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-786fd98fb4-kzs46"
Dec 09 10:06:33 crc kubenswrapper[5002]: I1209 10:06:33.947474 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-786fd98fb4-kzs46"]
Dec 09 10:06:34 crc kubenswrapper[5002]: I1209 10:06:34.097190 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77b7d685b-c2jrp" event={"ID":"778e9f9f-a7bd-4426-bb62-a0819aea8fe1","Type":"ContainerDied","Data":"60b80805ceedf8af30c05cca832b1fdb9cb1246c932e4d65b056140aa9687882"}
Dec 09 10:06:34 crc kubenswrapper[5002]: I1209 10:06:34.097561 5002 scope.go:117] "RemoveContainer" containerID="3f989fa3ca8eb1ed0e0c0f32e1dce7a720e3d3c05b6fe40a3203d5296756c5d7"
Dec 09 10:06:34 crc kubenswrapper[5002]: I1209 10:06:34.097228 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77b7d685b-c2jrp"
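[Editor's note: the "Generic (PLEG): container finished" / "SyncLoop (PLEG): event for pod" pairs above are the kubelet's pod lifecycle event generator noticing a container state change on relist and feeding a typed event into the sync loop. A minimal Go sketch of that event shape follows; the PodLifecycleEvent struct here is a simplified stand-in mirroring the ID/Type/Data fields visible in the log, not the kubelet's actual type.]

package main

import "fmt"

// PodLifecycleEvent is a hypothetical, simplified stand-in for the event the
// PLEG emits when a relist sees a container change state. Field names mirror
// the ID/Type/Data keys printed in the log above.
type PodLifecycleEvent struct {
	PodID string // pod UID, e.g. "9dc64ca0-5e54-466d-b19f-7726b430268a"
	Type  string // "ContainerStarted" or "ContainerDied"
	Data  string // the container ID the event refers to
}

func main() {
	// The two redhat-operators-lp6l4 events logged at 10:06:33 (IDs shortened).
	events := []PodLifecycleEvent{
		{PodID: "9dc64ca0-5e54-466d-b19f-7726b430268a", Type: "ContainerDied", Data: "8ceb08ae4329..."},
		{PodID: "9dc64ca0-5e54-466d-b19f-7726b430268a", Type: "ContainerStarted", Data: "8186823966be..."},
	}
	for _, ev := range events {
		// The sync loop reacts per event type: a died container may be
		// restarted or garbage-collected; a started one begins probing.
		fmt.Printf("SyncLoop (PLEG): pod %s: %s %s\n", ev.PodID, ev.Type, ev.Data)
	}
}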
Dec 09 10:06:34 crc kubenswrapper[5002]: I1209 10:06:34.099195 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lp6l4" event={"ID":"9dc64ca0-5e54-466d-b19f-7726b430268a","Type":"ContainerStarted","Data":"b5488d9913306cef1d39b41e2c80ea75cfef4f60cadc56af7f872aba129b5892"}
Dec 09 10:06:34 crc kubenswrapper[5002]: I1209 10:06:34.103318 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q9xhw" event={"ID":"cf487cfa-4480-4b72-86a8-6c57cc06403a","Type":"ContainerStarted","Data":"bda740c950b313e4ecd957c2b8400560bb4958d2705945aa0ddff3dbaf5de270"}
Dec 09 10:06:34 crc kubenswrapper[5002]: I1209 10:06:34.117347 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/29b01f64-7d15-4a86-b94f-457ea3cfc067-proxy-ca-bundles\") pod \"controller-manager-786fd98fb4-kzs46\" (UID: \"29b01f64-7d15-4a86-b94f-457ea3cfc067\") " pod="openshift-controller-manager/controller-manager-786fd98fb4-kzs46"
Dec 09 10:06:34 crc kubenswrapper[5002]: I1209 10:06:34.117429 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/29b01f64-7d15-4a86-b94f-457ea3cfc067-client-ca\") pod \"controller-manager-786fd98fb4-kzs46\" (UID: \"29b01f64-7d15-4a86-b94f-457ea3cfc067\") " pod="openshift-controller-manager/controller-manager-786fd98fb4-kzs46"
Dec 09 10:06:34 crc kubenswrapper[5002]: I1209 10:06:34.117490 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29b01f64-7d15-4a86-b94f-457ea3cfc067-config\") pod \"controller-manager-786fd98fb4-kzs46\" (UID: \"29b01f64-7d15-4a86-b94f-457ea3cfc067\") " pod="openshift-controller-manager/controller-manager-786fd98fb4-kzs46"
Dec 09 10:06:34 crc kubenswrapper[5002]: I1209 10:06:34.117514 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5558\" (UniqueName: \"kubernetes.io/projected/29b01f64-7d15-4a86-b94f-457ea3cfc067-kube-api-access-p5558\") pod \"controller-manager-786fd98fb4-kzs46\" (UID: \"29b01f64-7d15-4a86-b94f-457ea3cfc067\") " pod="openshift-controller-manager/controller-manager-786fd98fb4-kzs46"
Dec 09 10:06:34 crc kubenswrapper[5002]: I1209 10:06:34.117534 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29b01f64-7d15-4a86-b94f-457ea3cfc067-serving-cert\") pod \"controller-manager-786fd98fb4-kzs46\" (UID: \"29b01f64-7d15-4a86-b94f-457ea3cfc067\") " pod="openshift-controller-manager/controller-manager-786fd98fb4-kzs46"
Dec 09 10:06:34 crc kubenswrapper[5002]: I1209 10:06:34.140609 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-77b7d685b-c2jrp"]
Dec 09 10:06:34 crc kubenswrapper[5002]: I1209 10:06:34.142919 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-77b7d685b-c2jrp"]
Dec 09 10:06:34 crc kubenswrapper[5002]: I1209 10:06:34.152707 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-q9xhw" podStartSLOduration=2.736891406 podStartE2EDuration="4.152684991s" podCreationTimestamp="2025-12-09 10:06:30 +0000 UTC" firstStartedPulling="2025-12-09 10:06:32.063430934 +0000 UTC m=+324.455482015" lastFinishedPulling="2025-12-09 10:06:33.479224519 +0000 UTC m=+325.871275600" observedRunningTime="2025-12-09 10:06:34.151134796 +0000 UTC m=+326.543185877" watchObservedRunningTime="2025-12-09 10:06:34.152684991 +0000 UTC m=+326.544736072"
Dec 09 10:06:34 crc kubenswrapper[5002]: I1209 10:06:34.218011 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/29b01f64-7d15-4a86-b94f-457ea3cfc067-proxy-ca-bundles\") pod \"controller-manager-786fd98fb4-kzs46\" (UID: \"29b01f64-7d15-4a86-b94f-457ea3cfc067\") " pod="openshift-controller-manager/controller-manager-786fd98fb4-kzs46"
Dec 09 10:06:34 crc kubenswrapper[5002]: I1209 10:06:34.218100 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/29b01f64-7d15-4a86-b94f-457ea3cfc067-client-ca\") pod \"controller-manager-786fd98fb4-kzs46\" (UID: \"29b01f64-7d15-4a86-b94f-457ea3cfc067\") " pod="openshift-controller-manager/controller-manager-786fd98fb4-kzs46"
Dec 09 10:06:34 crc kubenswrapper[5002]: I1209 10:06:34.218152 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29b01f64-7d15-4a86-b94f-457ea3cfc067-config\") pod \"controller-manager-786fd98fb4-kzs46\" (UID: \"29b01f64-7d15-4a86-b94f-457ea3cfc067\") " pod="openshift-controller-manager/controller-manager-786fd98fb4-kzs46"
Dec 09 10:06:34 crc kubenswrapper[5002]: I1209 10:06:34.218183 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5558\" (UniqueName: \"kubernetes.io/projected/29b01f64-7d15-4a86-b94f-457ea3cfc067-kube-api-access-p5558\") pod \"controller-manager-786fd98fb4-kzs46\" (UID: \"29b01f64-7d15-4a86-b94f-457ea3cfc067\") " pod="openshift-controller-manager/controller-manager-786fd98fb4-kzs46"
Dec 09 10:06:34 crc kubenswrapper[5002]: I1209 10:06:34.218211 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29b01f64-7d15-4a86-b94f-457ea3cfc067-serving-cert\") pod \"controller-manager-786fd98fb4-kzs46\" (UID: \"29b01f64-7d15-4a86-b94f-457ea3cfc067\") " pod="openshift-controller-manager/controller-manager-786fd98fb4-kzs46"
Dec 09 10:06:34 crc kubenswrapper[5002]: I1209 10:06:34.219041 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/29b01f64-7d15-4a86-b94f-457ea3cfc067-client-ca\") pod \"controller-manager-786fd98fb4-kzs46\" (UID: \"29b01f64-7d15-4a86-b94f-457ea3cfc067\") " pod="openshift-controller-manager/controller-manager-786fd98fb4-kzs46"
Dec 09 10:06:34 crc kubenswrapper[5002]: I1209 10:06:34.219599 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29b01f64-7d15-4a86-b94f-457ea3cfc067-config\") pod \"controller-manager-786fd98fb4-kzs46\" (UID: \"29b01f64-7d15-4a86-b94f-457ea3cfc067\") " pod="openshift-controller-manager/controller-manager-786fd98fb4-kzs46"
Dec 09 10:06:34 crc kubenswrapper[5002]: I1209 10:06:34.220442 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/29b01f64-7d15-4a86-b94f-457ea3cfc067-proxy-ca-bundles\") pod \"controller-manager-786fd98fb4-kzs46\" (UID: \"29b01f64-7d15-4a86-b94f-457ea3cfc067\") " pod="openshift-controller-manager/controller-manager-786fd98fb4-kzs46"
Dec 09 10:06:34 crc kubenswrapper[5002]: I1209 10:06:34.240166 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29b01f64-7d15-4a86-b94f-457ea3cfc067-serving-cert\") pod \"controller-manager-786fd98fb4-kzs46\" (UID: \"29b01f64-7d15-4a86-b94f-457ea3cfc067\") " pod="openshift-controller-manager/controller-manager-786fd98fb4-kzs46"
Dec 09 10:06:34 crc kubenswrapper[5002]: I1209 10:06:34.252744 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5558\" (UniqueName: \"kubernetes.io/projected/29b01f64-7d15-4a86-b94f-457ea3cfc067-kube-api-access-p5558\") pod \"controller-manager-786fd98fb4-kzs46\" (UID: \"29b01f64-7d15-4a86-b94f-457ea3cfc067\") " pod="openshift-controller-manager/controller-manager-786fd98fb4-kzs46"
Dec 09 10:06:34 crc kubenswrapper[5002]: I1209 10:06:34.549674 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-786fd98fb4-kzs46"
Dec 09 10:06:34 crc kubenswrapper[5002]: I1209 10:06:34.947885 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-786fd98fb4-kzs46"]
Dec 09 10:06:35 crc kubenswrapper[5002]: I1209 10:06:35.111487 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-786fd98fb4-kzs46" event={"ID":"29b01f64-7d15-4a86-b94f-457ea3cfc067","Type":"ContainerStarted","Data":"2fc77c141791220b89cb43cd263846477de2567f5463b280ac610cd177898db0"}
Dec 09 10:06:36 crc kubenswrapper[5002]: I1209 10:06:36.067453 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="778e9f9f-a7bd-4426-bb62-a0819aea8fe1" path="/var/lib/kubelet/pods/778e9f9f-a7bd-4426-bb62-a0819aea8fe1/volumes"
Dec 09 10:06:36 crc kubenswrapper[5002]: I1209 10:06:36.119272 5002 generic.go:334] "Generic (PLEG): container finished" podID="9dc64ca0-5e54-466d-b19f-7726b430268a" containerID="b5488d9913306cef1d39b41e2c80ea75cfef4f60cadc56af7f872aba129b5892" exitCode=0
Dec 09 10:06:36 crc kubenswrapper[5002]: I1209 10:06:36.119352 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lp6l4" event={"ID":"9dc64ca0-5e54-466d-b19f-7726b430268a","Type":"ContainerDied","Data":"b5488d9913306cef1d39b41e2c80ea75cfef4f60cadc56af7f872aba129b5892"}
Dec 09 10:06:36 crc kubenswrapper[5002]: I1209 10:06:36.121208 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-786fd98fb4-kzs46" event={"ID":"29b01f64-7d15-4a86-b94f-457ea3cfc067","Type":"ContainerStarted","Data":"7c559c079cd66bce6d5bb8b16e9a280491341f1847e9923b81666309604e3019"}
Dec 09 10:06:36 crc kubenswrapper[5002]: I1209 10:06:36.121448 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-786fd98fb4-kzs46"
Dec 09 10:06:36 crc kubenswrapper[5002]: I1209 10:06:36.126640 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-786fd98fb4-kzs46"
Dec 09 10:06:36 crc kubenswrapper[5002]: I1209 10:06:36.157939 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-786fd98fb4-kzs46" podStartSLOduration=4.157918717 podStartE2EDuration="4.157918717s" podCreationTimestamp="2025-12-09 10:06:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:06:36.157598329 +0000 UTC m=+328.549649410" watchObservedRunningTime="2025-12-09 10:06:36.157918717 +0000 UTC m=+328.549969798"
Dec 09 10:06:37 crc kubenswrapper[5002]: I1209 10:06:37.139323 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lp6l4" event={"ID":"9dc64ca0-5e54-466d-b19f-7726b430268a","Type":"ContainerStarted","Data":"0f18d8edee8bc8d1606f74cb4bf2f562b8ee9b1c438cd91f2ddb43e9b8578ab6"}
Dec 09 10:06:37 crc kubenswrapper[5002]: I1209 10:06:37.162363 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lp6l4" podStartSLOduration=2.723620415 podStartE2EDuration="6.162335482s" podCreationTimestamp="2025-12-09 10:06:31 +0000 UTC" firstStartedPulling="2025-12-09 10:06:33.08482825 +0000 UTC m=+325.476879331" lastFinishedPulling="2025-12-09 10:06:36.523543307 +0000 UTC m=+328.915594398" observedRunningTime="2025-12-09 10:06:37.155907493 +0000 UTC m=+329.547958584" watchObservedRunningTime="2025-12-09 10:06:37.162335482 +0000 UTC m=+329.554386563"
Dec 09 10:06:39 crc kubenswrapper[5002]: I1209 10:06:39.350723 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6h4f9"
Dec 09 10:06:39 crc kubenswrapper[5002]: I1209 10:06:39.351102 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6h4f9"
Dec 09 10:06:39 crc kubenswrapper[5002]: I1209 10:06:39.408832 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6h4f9"
Dec 09 10:06:39 crc kubenswrapper[5002]: I1209 10:06:39.484588 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kgbq4"
Dec 09 10:06:39 crc kubenswrapper[5002]: I1209 10:06:39.484633 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kgbq4"
Dec 09 10:06:39 crc kubenswrapper[5002]: I1209 10:06:39.527458 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kgbq4"
Dec 09 10:06:40 crc kubenswrapper[5002]: I1209 10:06:40.195527 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6h4f9"
Dec 09 10:06:40 crc kubenswrapper[5002]: I1209 10:06:40.198770 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kgbq4"
Dec 09 10:06:41 crc kubenswrapper[5002]: I1209 10:06:41.076211 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-q9xhw"
Dec 09 10:06:41 crc kubenswrapper[5002]: I1209 10:06:41.077158 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-q9xhw"
Dec 09 10:06:41 crc kubenswrapper[5002]: I1209 10:06:41.127720 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-q9xhw"
Dec 09 10:06:41 crc kubenswrapper[5002]: I1209 10:06:41.204039 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-q9xhw"
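[Editor's note: the pod_startup_latency_tracker records above carry enough fields to recompute both reported durations. The relationship inferred from the logged values (not taken from kubelet source) is: podStartE2EDuration = watchObservedRunningTime - podCreationTimestamp, and podStartSLOduration = E2E minus the image-pull window (lastFinishedPulling - firstStartedPulling). A check in Go against the redhat-operators-lp6l4 record:]

package main

import (
	"fmt"
	"time"
)

func main() {
	// Layout matching the timestamps printed in the log; Go accepts the
	// fractional seconds in the input even though the layout omits them.
	const layout = "2006-01-02 15:04:05 -0700 MST"
	parse := func(s string) time.Time {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		return t
	}
	created := parse("2025-12-09 10:06:31 +0000 UTC")
	firstPull := parse("2025-12-09 10:06:33.08482825 +0000 UTC")
	lastPull := parse("2025-12-09 10:06:36.523543307 +0000 UTC")
	running := parse("2025-12-09 10:06:37.162335482 +0000 UTC")

	e2e := running.Sub(created)
	slo := e2e - lastPull.Sub(firstPull)
	fmt.Println("podStartE2EDuration:", e2e) // 6.162335482s, exactly as logged
	// Prints ~2.723620425s; the log shows 2.723620415. The ~10ns gap is the
	// wall clock here vs the monotonic (m=+...) clock the tracker uses for
	// the pull window.
	fmt.Println("podStartSLOduration:", slo)
}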
Dec 09 10:06:42 crc kubenswrapper[5002]: I1209 10:06:42.182366 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lp6l4"
Dec 09 10:06:42 crc kubenswrapper[5002]: I1209 10:06:42.182436 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lp6l4"
Dec 09 10:06:42 crc kubenswrapper[5002]: I1209 10:06:42.227613 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lp6l4"
Dec 09 10:06:43 crc kubenswrapper[5002]: I1209 10:06:43.235753 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lp6l4"
Dec 09 10:06:52 crc kubenswrapper[5002]: I1209 10:06:52.612684 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d49bbb4c4-zsfzt"]
Dec 09 10:06:52 crc kubenswrapper[5002]: I1209 10:06:52.613482 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-d49bbb4c4-zsfzt" podUID="ab033b75-38a6-41b7-8fed-c6296aac6570" containerName="route-controller-manager" containerID="cri-o://b80c6e3323d21e234448d038857d8d7263754838f13bdd76f543fd67fd708e3b" gracePeriod=30
Dec 09 10:06:56 crc kubenswrapper[5002]: I1209 10:06:56.260752 5002 generic.go:334] "Generic (PLEG): container finished" podID="ab033b75-38a6-41b7-8fed-c6296aac6570" containerID="b80c6e3323d21e234448d038857d8d7263754838f13bdd76f543fd67fd708e3b" exitCode=0
Dec 09 10:06:56 crc kubenswrapper[5002]: I1209 10:06:56.260866 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d49bbb4c4-zsfzt" event={"ID":"ab033b75-38a6-41b7-8fed-c6296aac6570","Type":"ContainerDied","Data":"b80c6e3323d21e234448d038857d8d7263754838f13bdd76f543fd67fd708e3b"}
Dec 09 10:06:56 crc kubenswrapper[5002]: I1209 10:06:56.809021 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-d49bbb4c4-zsfzt"
Dec 09 10:06:56 crc kubenswrapper[5002]: I1209 10:06:56.836523 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-dvf7j"]
Dec 09 10:06:56 crc kubenswrapper[5002]: E1209 10:06:56.836724 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab033b75-38a6-41b7-8fed-c6296aac6570" containerName="route-controller-manager"
Dec 09 10:06:56 crc kubenswrapper[5002]: I1209 10:06:56.836735 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab033b75-38a6-41b7-8fed-c6296aac6570" containerName="route-controller-manager"
Dec 09 10:06:56 crc kubenswrapper[5002]: I1209 10:06:56.836861 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab033b75-38a6-41b7-8fed-c6296aac6570" containerName="route-controller-manager"
Dec 09 10:06:56 crc kubenswrapper[5002]: I1209 10:06:56.837181 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-dvf7j"
Dec 09 10:06:56 crc kubenswrapper[5002]: I1209 10:06:56.862316 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-dvf7j"]
Dec 09 10:06:56 crc kubenswrapper[5002]: I1209 10:06:56.973764 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkjkb\" (UniqueName: \"kubernetes.io/projected/ab033b75-38a6-41b7-8fed-c6296aac6570-kube-api-access-zkjkb\") pod \"ab033b75-38a6-41b7-8fed-c6296aac6570\" (UID: \"ab033b75-38a6-41b7-8fed-c6296aac6570\") "
Dec 09 10:06:56 crc kubenswrapper[5002]: I1209 10:06:56.973932 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab033b75-38a6-41b7-8fed-c6296aac6570-config\") pod \"ab033b75-38a6-41b7-8fed-c6296aac6570\" (UID: \"ab033b75-38a6-41b7-8fed-c6296aac6570\") "
Dec 09 10:06:56 crc kubenswrapper[5002]: I1209 10:06:56.973964 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab033b75-38a6-41b7-8fed-c6296aac6570-serving-cert\") pod \"ab033b75-38a6-41b7-8fed-c6296aac6570\" (UID: \"ab033b75-38a6-41b7-8fed-c6296aac6570\") "
Dec 09 10:06:56 crc kubenswrapper[5002]: I1209 10:06:56.973987 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab033b75-38a6-41b7-8fed-c6296aac6570-client-ca\") pod \"ab033b75-38a6-41b7-8fed-c6296aac6570\" (UID: \"ab033b75-38a6-41b7-8fed-c6296aac6570\") "
Dec 09 10:06:56 crc kubenswrapper[5002]: I1209 10:06:56.974162 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c12dcb5-91b6-4ec1-9fde-af10463056f3-config\") pod \"route-controller-manager-6d88dcd4c6-dvf7j\" (UID: \"5c12dcb5-91b6-4ec1-9fde-af10463056f3\") " pod="openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-dvf7j"
Dec 09 10:06:56 crc kubenswrapper[5002]: I1209 10:06:56.974195 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krxvr\" (UniqueName: \"kubernetes.io/projected/5c12dcb5-91b6-4ec1-9fde-af10463056f3-kube-api-access-krxvr\") pod \"route-controller-manager-6d88dcd4c6-dvf7j\" (UID: \"5c12dcb5-91b6-4ec1-9fde-af10463056f3\") " pod="openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-dvf7j"
Dec 09 10:06:56 crc kubenswrapper[5002]: I1209 10:06:56.974226 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c12dcb5-91b6-4ec1-9fde-af10463056f3-serving-cert\") pod \"route-controller-manager-6d88dcd4c6-dvf7j\" (UID: \"5c12dcb5-91b6-4ec1-9fde-af10463056f3\") " pod="openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-dvf7j"
Dec 09 10:06:56 crc kubenswrapper[5002]: I1209 10:06:56.974246 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5c12dcb5-91b6-4ec1-9fde-af10463056f3-client-ca\") pod \"route-controller-manager-6d88dcd4c6-dvf7j\" (UID: \"5c12dcb5-91b6-4ec1-9fde-af10463056f3\") " pod="openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-dvf7j"
Dec 09 10:06:56 crc kubenswrapper[5002]: I1209 10:06:56.974637 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab033b75-38a6-41b7-8fed-c6296aac6570-config" (OuterVolumeSpecName: "config") pod "ab033b75-38a6-41b7-8fed-c6296aac6570" (UID: "ab033b75-38a6-41b7-8fed-c6296aac6570"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 10:06:56 crc kubenswrapper[5002]: I1209 10:06:56.975045 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab033b75-38a6-41b7-8fed-c6296aac6570-client-ca" (OuterVolumeSpecName: "client-ca") pod "ab033b75-38a6-41b7-8fed-c6296aac6570" (UID: "ab033b75-38a6-41b7-8fed-c6296aac6570"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 10:06:56 crc kubenswrapper[5002]: I1209 10:06:56.979958 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab033b75-38a6-41b7-8fed-c6296aac6570-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ab033b75-38a6-41b7-8fed-c6296aac6570" (UID: "ab033b75-38a6-41b7-8fed-c6296aac6570"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 10:06:56 crc kubenswrapper[5002]: I1209 10:06:56.980031 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab033b75-38a6-41b7-8fed-c6296aac6570-kube-api-access-zkjkb" (OuterVolumeSpecName: "kube-api-access-zkjkb") pod "ab033b75-38a6-41b7-8fed-c6296aac6570" (UID: "ab033b75-38a6-41b7-8fed-c6296aac6570"). InnerVolumeSpecName "kube-api-access-zkjkb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 10:06:57 crc kubenswrapper[5002]: I1209 10:06:57.075702 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c12dcb5-91b6-4ec1-9fde-af10463056f3-serving-cert\") pod \"route-controller-manager-6d88dcd4c6-dvf7j\" (UID: \"5c12dcb5-91b6-4ec1-9fde-af10463056f3\") " pod="openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-dvf7j"
Dec 09 10:06:57 crc kubenswrapper[5002]: I1209 10:06:57.075792 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5c12dcb5-91b6-4ec1-9fde-af10463056f3-client-ca\") pod \"route-controller-manager-6d88dcd4c6-dvf7j\" (UID: \"5c12dcb5-91b6-4ec1-9fde-af10463056f3\") " pod="openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-dvf7j"
Dec 09 10:06:57 crc kubenswrapper[5002]: I1209 10:06:57.076780 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5c12dcb5-91b6-4ec1-9fde-af10463056f3-client-ca\") pod \"route-controller-manager-6d88dcd4c6-dvf7j\" (UID: \"5c12dcb5-91b6-4ec1-9fde-af10463056f3\") " pod="openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-dvf7j"
Dec 09 10:06:57 crc kubenswrapper[5002]: I1209 10:06:57.076980 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c12dcb5-91b6-4ec1-9fde-af10463056f3-config\") pod \"route-controller-manager-6d88dcd4c6-dvf7j\" (UID: \"5c12dcb5-91b6-4ec1-9fde-af10463056f3\") " pod="openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-dvf7j"
Dec 09 10:06:57 crc kubenswrapper[5002]: I1209 10:06:57.077973 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c12dcb5-91b6-4ec1-9fde-af10463056f3-config\") pod \"route-controller-manager-6d88dcd4c6-dvf7j\" (UID: \"5c12dcb5-91b6-4ec1-9fde-af10463056f3\") " pod="openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-dvf7j"
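[Editor's note: the probe transitions logged between 10:06:39 and 10:06:43 all follow one shape: the startup probe reports unhealthy first, readiness is held at "" (unknown) until startup flips to started, and only then does readiness go ready. A compact Go sketch of that gating follows, under the assumption that this ordering (startup gates readiness) is what the log is showing; the probeState type is illustrative, not kubelet API.]

package main

import "fmt"

// probeState models the gating seen in the log: readiness stays unknown ("")
// until the startup probe has succeeded once; only then may it become ready.
type probeState struct {
	startupDone bool
	readiness   string // "" (unknown) or "ready"
}

// observe applies one probe result in log order and returns the status string
// the kubelet would print for that probe.
func (p *probeState) observe(probe string, success bool) string {
	switch probe {
	case "startup":
		if success {
			p.startupDone = true
			return "started"
		}
		return "unhealthy"
	case "readiness":
		if !p.startupDone {
			return "" // held unknown while the startup probe is still gating
		}
		if success {
			p.readiness = "ready"
		}
		return p.readiness
	}
	return ""
}

func main() {
	var s probeState
	// Same order as the community-operators-6h4f9 lines above.
	fmt.Printf("startup: %q\n", s.observe("startup", false))   // "unhealthy"
	fmt.Printf("readiness: %q\n", s.observe("readiness", true)) // "" (gated)
	fmt.Printf("startup: %q\n", s.observe("startup", true))    // "started"
	fmt.Printf("readiness: %q\n", s.observe("readiness", true)) // "ready"
}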
Dec 09 10:06:57 crc kubenswrapper[5002]: I1209 10:06:57.077998 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krxvr\" (UniqueName: \"kubernetes.io/projected/5c12dcb5-91b6-4ec1-9fde-af10463056f3-kube-api-access-krxvr\") pod \"route-controller-manager-6d88dcd4c6-dvf7j\" (UID: \"5c12dcb5-91b6-4ec1-9fde-af10463056f3\") " pod="openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-dvf7j"
Dec 09 10:06:57 crc kubenswrapper[5002]: I1209 10:06:57.078261 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkjkb\" (UniqueName: \"kubernetes.io/projected/ab033b75-38a6-41b7-8fed-c6296aac6570-kube-api-access-zkjkb\") on node \"crc\" DevicePath \"\""
Dec 09 10:06:57 crc kubenswrapper[5002]: I1209 10:06:57.078288 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab033b75-38a6-41b7-8fed-c6296aac6570-config\") on node \"crc\" DevicePath \"\""
Dec 09 10:06:57 crc kubenswrapper[5002]: I1209 10:06:57.078300 5002 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab033b75-38a6-41b7-8fed-c6296aac6570-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 09 10:06:57 crc kubenswrapper[5002]: I1209 10:06:57.078312 5002 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab033b75-38a6-41b7-8fed-c6296aac6570-client-ca\") on node \"crc\" DevicePath \"\""
Dec 09 10:06:57 crc kubenswrapper[5002]: I1209 10:06:57.079980 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c12dcb5-91b6-4ec1-9fde-af10463056f3-serving-cert\") pod \"route-controller-manager-6d88dcd4c6-dvf7j\" (UID: \"5c12dcb5-91b6-4ec1-9fde-af10463056f3\") " pod="openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-dvf7j"
Dec 09 10:06:57 crc kubenswrapper[5002]: I1209 10:06:57.096899 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krxvr\" (UniqueName: \"kubernetes.io/projected/5c12dcb5-91b6-4ec1-9fde-af10463056f3-kube-api-access-krxvr\") pod \"route-controller-manager-6d88dcd4c6-dvf7j\" (UID: \"5c12dcb5-91b6-4ec1-9fde-af10463056f3\") " pod="openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-dvf7j"
Dec 09 10:06:57 crc kubenswrapper[5002]: I1209 10:06:57.154687 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-dvf7j"
Dec 09 10:06:57 crc kubenswrapper[5002]: I1209 10:06:57.269670 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d49bbb4c4-zsfzt" event={"ID":"ab033b75-38a6-41b7-8fed-c6296aac6570","Type":"ContainerDied","Data":"5a5bb45f00632711922a2724836363ba7026f815b98eecaea71d0292e17334ac"}
Dec 09 10:06:57 crc kubenswrapper[5002]: I1209 10:06:57.269714 5002 scope.go:117] "RemoveContainer" containerID="b80c6e3323d21e234448d038857d8d7263754838f13bdd76f543fd67fd708e3b"
Dec 09 10:06:57 crc kubenswrapper[5002]: I1209 10:06:57.269827 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-d49bbb4c4-zsfzt"
Dec 09 10:06:57 crc kubenswrapper[5002]: I1209 10:06:57.297789 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d49bbb4c4-zsfzt"]
Dec 09 10:06:57 crc kubenswrapper[5002]: I1209 10:06:57.300657 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d49bbb4c4-zsfzt"]
Dec 09 10:06:57 crc kubenswrapper[5002]: I1209 10:06:57.540151 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-dvf7j"]
Dec 09 10:06:57 crc kubenswrapper[5002]: I1209 10:06:57.667474 5002 patch_prober.go:28] interesting pod/route-controller-manager-d49bbb4c4-zsfzt container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.61:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 09 10:06:57 crc kubenswrapper[5002]: I1209 10:06:57.668034 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-d49bbb4c4-zsfzt" podUID="ab033b75-38a6-41b7-8fed-c6296aac6570" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.61:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 09 10:06:58 crc kubenswrapper[5002]: I1209 10:06:58.069324 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab033b75-38a6-41b7-8fed-c6296aac6570" path="/var/lib/kubelet/pods/ab033b75-38a6-41b7-8fed-c6296aac6570/volumes"
Dec 09 10:06:58 crc kubenswrapper[5002]: I1209 10:06:58.276962 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-dvf7j" event={"ID":"5c12dcb5-91b6-4ec1-9fde-af10463056f3","Type":"ContainerStarted","Data":"22ef3156c6526c7dfda655f4a2b295062d7233d98bb941b52e106ae73c0f4de9"}
Dec 09 10:06:58 crc kubenswrapper[5002]: I1209 10:06:58.277012 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-dvf7j" event={"ID":"5c12dcb5-91b6-4ec1-9fde-af10463056f3","Type":"ContainerStarted","Data":"ed7365ab184836c1f482d88675abf742433ee84088799dd174b0f8fdb4a23113"}
Dec 09 10:06:58 crc kubenswrapper[5002]: I1209 10:06:58.277217 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-dvf7j"
Dec 09 10:06:58 crc kubenswrapper[5002]: I1209 10:06:58.282373 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-dvf7j"
Dec 09 10:06:58 crc kubenswrapper[5002]: I1209 10:06:58.294198 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6d88dcd4c6-dvf7j" podStartSLOduration=6.294179218 podStartE2EDuration="6.294179218s" podCreationTimestamp="2025-12-09 10:06:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:06:58.292469843 +0000 UTC m=+350.684520934" watchObservedRunningTime="2025-12-09 10:06:58.294179218 +0000 UTC m=+350.686230319"
Dec 09 10:07:03 crc kubenswrapper[5002]: I1209 10:07:03.888271 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-pjvkx"]
Dec 09 10:07:03 crc kubenswrapper[5002]: I1209 10:07:03.889662 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-pjvkx"
Dec 09 10:07:03 crc kubenswrapper[5002]: I1209 10:07:03.951312 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-pjvkx"]
Dec 09 10:07:03 crc kubenswrapper[5002]: I1209 10:07:03.970795 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d9cd7ef1-1f18-4b91-ad6b-47ec29108f1f-ca-trust-extracted\") pod \"image-registry-66df7c8f76-pjvkx\" (UID: \"d9cd7ef1-1f18-4b91-ad6b-47ec29108f1f\") " pod="openshift-image-registry/image-registry-66df7c8f76-pjvkx"
Dec 09 10:07:03 crc kubenswrapper[5002]: I1209 10:07:03.970902 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d9cd7ef1-1f18-4b91-ad6b-47ec29108f1f-registry-certificates\") pod \"image-registry-66df7c8f76-pjvkx\" (UID: \"d9cd7ef1-1f18-4b91-ad6b-47ec29108f1f\") " pod="openshift-image-registry/image-registry-66df7c8f76-pjvkx"
Dec 09 10:07:03 crc kubenswrapper[5002]: I1209 10:07:03.971025 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-pjvkx\" (UID: \"d9cd7ef1-1f18-4b91-ad6b-47ec29108f1f\") " pod="openshift-image-registry/image-registry-66df7c8f76-pjvkx"
Dec 09 10:07:03 crc kubenswrapper[5002]: I1209 10:07:03.971126 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d9cd7ef1-1f18-4b91-ad6b-47ec29108f1f-installation-pull-secrets\") pod \"image-registry-66df7c8f76-pjvkx\" (UID: \"d9cd7ef1-1f18-4b91-ad6b-47ec29108f1f\") " pod="openshift-image-registry/image-registry-66df7c8f76-pjvkx"
Dec 09 10:07:03 crc kubenswrapper[5002]: I1209 10:07:03.971209 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d9cd7ef1-1f18-4b91-ad6b-47ec29108f1f-trusted-ca\") pod \"image-registry-66df7c8f76-pjvkx\" (UID: \"d9cd7ef1-1f18-4b91-ad6b-47ec29108f1f\") " pod="openshift-image-registry/image-registry-66df7c8f76-pjvkx"
Dec 09 10:07:03 crc kubenswrapper[5002]: I1209 10:07:03.971252 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d9cd7ef1-1f18-4b91-ad6b-47ec29108f1f-bound-sa-token\") pod \"image-registry-66df7c8f76-pjvkx\" (UID: \"d9cd7ef1-1f18-4b91-ad6b-47ec29108f1f\") " pod="openshift-image-registry/image-registry-66df7c8f76-pjvkx"
Dec 09 10:07:03 crc kubenswrapper[5002]: I1209 10:07:03.971363 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d9cd7ef1-1f18-4b91-ad6b-47ec29108f1f-registry-tls\") pod \"image-registry-66df7c8f76-pjvkx\" (UID: \"d9cd7ef1-1f18-4b91-ad6b-47ec29108f1f\") " pod="openshift-image-registry/image-registry-66df7c8f76-pjvkx"
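[Editor's note: the records from 10:06:52 to 10:06:58 form one complete rolling replacement of the route-controller-manager pod: SyncLoop DELETE, a CRI-O kill with gracePeriod=30, the PLEG observing the exit, per-volume teardown and detach, SyncLoop REMOVE, orphaned-volume cleanup, and only then the replacement pod going ready. A Go sketch that walks that ordering as a checklist follows; the stage names are lifted from the log messages themselves, and the verifier is illustrative, not kubelet API.]

package main

import "fmt"

// teardownStages is the ordering visible in the log for one replaced pod,
// read straight off the messages above; nothing here is kubelet API.
var teardownStages = []string{
	"SyncLoop DELETE",                        // API server marks the old pod deleted
	"Killing container with a grace period",  // CRI kill, gracePeriod=30 here
	"Generic (PLEG): container finished",     // exitCode observed on relist
	"UnmountVolume.TearDown succeeded",       // per-volume teardown
	"Volume detached",                        // reconciler marks volumes detached
	"SyncLoop REMOVE",                        // pod object fully removed
	"Cleaned up orphaned pod volumes dir",    // /var/lib/kubelet/pods/<uid>/volumes
}

// verifyOrder reports whether a stream of log messages hits the stages in
// order; extra interleaved messages (like the new pod starting) are ignored.
func verifyOrder(msgs []string) bool {
	i := 0
	for _, m := range msgs {
		if i < len(teardownStages) && m == teardownStages[i] {
			i++
		}
	}
	return i == len(teardownStages)
}

func main() {
	fmt.Println(verifyOrder(teardownStages)) // true for the sequence above
}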
Dec 09 10:07:03 crc kubenswrapper[5002]: I1209 10:07:03.971410 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlllf\" (UniqueName: \"kubernetes.io/projected/d9cd7ef1-1f18-4b91-ad6b-47ec29108f1f-kube-api-access-rlllf\") pod \"image-registry-66df7c8f76-pjvkx\" (UID: \"d9cd7ef1-1f18-4b91-ad6b-47ec29108f1f\") " pod="openshift-image-registry/image-registry-66df7c8f76-pjvkx"
Dec 09 10:07:03 crc kubenswrapper[5002]: I1209 10:07:03.990887 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-pjvkx\" (UID: \"d9cd7ef1-1f18-4b91-ad6b-47ec29108f1f\") " pod="openshift-image-registry/image-registry-66df7c8f76-pjvkx"
Dec 09 10:07:04 crc kubenswrapper[5002]: I1209 10:07:04.073051 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d9cd7ef1-1f18-4b91-ad6b-47ec29108f1f-trusted-ca\") pod \"image-registry-66df7c8f76-pjvkx\" (UID: \"d9cd7ef1-1f18-4b91-ad6b-47ec29108f1f\") " pod="openshift-image-registry/image-registry-66df7c8f76-pjvkx"
Dec 09 10:07:04 crc kubenswrapper[5002]: I1209 10:07:04.073207 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d9cd7ef1-1f18-4b91-ad6b-47ec29108f1f-bound-sa-token\") pod \"image-registry-66df7c8f76-pjvkx\" (UID: \"d9cd7ef1-1f18-4b91-ad6b-47ec29108f1f\") " pod="openshift-image-registry/image-registry-66df7c8f76-pjvkx"
Dec 09 10:07:04 crc kubenswrapper[5002]: I1209 10:07:04.073244 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d9cd7ef1-1f18-4b91-ad6b-47ec29108f1f-registry-tls\") pod \"image-registry-66df7c8f76-pjvkx\" (UID: \"d9cd7ef1-1f18-4b91-ad6b-47ec29108f1f\") " pod="openshift-image-registry/image-registry-66df7c8f76-pjvkx"
Dec 09 10:07:04 crc kubenswrapper[5002]: I1209 10:07:04.073267 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlllf\" (UniqueName: \"kubernetes.io/projected/d9cd7ef1-1f18-4b91-ad6b-47ec29108f1f-kube-api-access-rlllf\") pod \"image-registry-66df7c8f76-pjvkx\" (UID: \"d9cd7ef1-1f18-4b91-ad6b-47ec29108f1f\") " pod="openshift-image-registry/image-registry-66df7c8f76-pjvkx"
Dec 09 10:07:04 crc kubenswrapper[5002]: I1209 10:07:04.073289 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d9cd7ef1-1f18-4b91-ad6b-47ec29108f1f-ca-trust-extracted\") pod \"image-registry-66df7c8f76-pjvkx\" (UID: \"d9cd7ef1-1f18-4b91-ad6b-47ec29108f1f\") " pod="openshift-image-registry/image-registry-66df7c8f76-pjvkx"
Dec 09 10:07:04 crc kubenswrapper[5002]: I1209 10:07:04.073315 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d9cd7ef1-1f18-4b91-ad6b-47ec29108f1f-registry-certificates\") pod \"image-registry-66df7c8f76-pjvkx\" (UID: \"d9cd7ef1-1f18-4b91-ad6b-47ec29108f1f\") " pod="openshift-image-registry/image-registry-66df7c8f76-pjvkx"
Dec 09 10:07:04 crc kubenswrapper[5002]: I1209 10:07:04.073343 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d9cd7ef1-1f18-4b91-ad6b-47ec29108f1f-installation-pull-secrets\") pod \"image-registry-66df7c8f76-pjvkx\" (UID: \"d9cd7ef1-1f18-4b91-ad6b-47ec29108f1f\") " pod="openshift-image-registry/image-registry-66df7c8f76-pjvkx"
Dec 09 10:07:04 crc kubenswrapper[5002]: I1209 10:07:04.073948 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d9cd7ef1-1f18-4b91-ad6b-47ec29108f1f-ca-trust-extracted\") pod \"image-registry-66df7c8f76-pjvkx\" (UID: \"d9cd7ef1-1f18-4b91-ad6b-47ec29108f1f\") " pod="openshift-image-registry/image-registry-66df7c8f76-pjvkx"
Dec 09 10:07:04 crc kubenswrapper[5002]: I1209 10:07:04.074613 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d9cd7ef1-1f18-4b91-ad6b-47ec29108f1f-trusted-ca\") pod \"image-registry-66df7c8f76-pjvkx\" (UID: \"d9cd7ef1-1f18-4b91-ad6b-47ec29108f1f\") " pod="openshift-image-registry/image-registry-66df7c8f76-pjvkx"
Dec 09 10:07:04 crc kubenswrapper[5002]: I1209 10:07:04.074729 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d9cd7ef1-1f18-4b91-ad6b-47ec29108f1f-registry-certificates\") pod \"image-registry-66df7c8f76-pjvkx\" (UID: \"d9cd7ef1-1f18-4b91-ad6b-47ec29108f1f\") " pod="openshift-image-registry/image-registry-66df7c8f76-pjvkx"
Dec 09 10:07:04 crc kubenswrapper[5002]: I1209 10:07:04.079005 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d9cd7ef1-1f18-4b91-ad6b-47ec29108f1f-installation-pull-secrets\") pod \"image-registry-66df7c8f76-pjvkx\" (UID: \"d9cd7ef1-1f18-4b91-ad6b-47ec29108f1f\") " pod="openshift-image-registry/image-registry-66df7c8f76-pjvkx"
Dec 09 10:07:04 crc kubenswrapper[5002]: I1209 10:07:04.079602 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d9cd7ef1-1f18-4b91-ad6b-47ec29108f1f-registry-tls\") pod \"image-registry-66df7c8f76-pjvkx\" (UID: \"d9cd7ef1-1f18-4b91-ad6b-47ec29108f1f\") " pod="openshift-image-registry/image-registry-66df7c8f76-pjvkx"
Dec 09 10:07:04 crc kubenswrapper[5002]: I1209 10:07:04.088910 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d9cd7ef1-1f18-4b91-ad6b-47ec29108f1f-bound-sa-token\") pod \"image-registry-66df7c8f76-pjvkx\" (UID: \"d9cd7ef1-1f18-4b91-ad6b-47ec29108f1f\") " pod="openshift-image-registry/image-registry-66df7c8f76-pjvkx"
Dec 09 10:07:04 crc kubenswrapper[5002]: I1209 10:07:04.092788 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlllf\" (UniqueName: \"kubernetes.io/projected/d9cd7ef1-1f18-4b91-ad6b-47ec29108f1f-kube-api-access-rlllf\") pod \"image-registry-66df7c8f76-pjvkx\" (UID: \"d9cd7ef1-1f18-4b91-ad6b-47ec29108f1f\") " pod="openshift-image-registry/image-registry-66df7c8f76-pjvkx"
Dec 09 10:07:04 crc kubenswrapper[5002]: I1209 10:07:04.223556 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-pjvkx"
Dec 09 10:07:04 crc kubenswrapper[5002]: I1209 10:07:04.640150 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-pjvkx"]
Dec 09 10:07:05 crc kubenswrapper[5002]: I1209 10:07:05.314129 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-pjvkx" event={"ID":"d9cd7ef1-1f18-4b91-ad6b-47ec29108f1f","Type":"ContainerStarted","Data":"940af52376cd06effbf34a9d587cc9d29abd71f4f580a78557c5a53de9efec4b"}
Dec 09 10:07:05 crc kubenswrapper[5002]: I1209 10:07:05.314461 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-pjvkx" event={"ID":"d9cd7ef1-1f18-4b91-ad6b-47ec29108f1f","Type":"ContainerStarted","Data":"f06031dc12265733b887868fea14b752d0cf41264f24e39345d5a18aa3c98915"}
Dec 09 10:07:05 crc kubenswrapper[5002]: I1209 10:07:05.314484 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-pjvkx"
Dec 09 10:07:05 crc kubenswrapper[5002]: I1209 10:07:05.340233 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-pjvkx" podStartSLOduration=2.3402078250000002 podStartE2EDuration="2.340207825s" podCreationTimestamp="2025-12-09 10:07:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:07:05.332127262 +0000 UTC m=+357.724178383" watchObservedRunningTime="2025-12-09 10:07:05.340207825 +0000 UTC m=+357.732258946"
Dec 09 10:07:07 crc kubenswrapper[5002]: I1209 10:07:07.964093 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 09 10:07:07 crc kubenswrapper[5002]: I1209 10:07:07.964153 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 09 10:07:24 crc kubenswrapper[5002]: I1209 10:07:24.230563 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-pjvkx"
Dec 09 10:07:24 crc kubenswrapper[5002]: I1209 10:07:24.299865 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-82dlm"]
Dec 09 10:07:37 crc kubenswrapper[5002]: I1209 10:07:37.964878 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 09 10:07:37 crc kubenswrapper[5002]: I1209 10:07:37.967089 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
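[Editor's note: every volume the new image-registry pod uses appears three times above: VerifyControllerAttachedVolume, then MountVolume started, then MountVolume.SetUp succeeded; the CSI-provisioned PVC (kubevirt.io.hostpath-provisioner^pvc-657094db-...) goes through the same pipeline as the configmap, secret, and projected volumes. A small Go sketch that extracts volume names from such lines and flags any volume that started mounting but never finished; the two regexps only encode what the quoted log text itself shows.]

package main

import (
	"fmt"
	"regexp"
)

// Patterns for the two mount messages as they appear in the log, where the
// volume name is wrapped in escaped quotes (\"...\").
var (
	started   = regexp.MustCompile(`operationExecutor\.MountVolume started for volume \\"([^"\\]+)\\"`)
	succeeded = regexp.MustCompile(`MountVolume\.SetUp succeeded for volume \\"([^"\\]+)\\"`)
)

func main() {
	// Abbreviated sample lines in the same escaped form as the journal above.
	lines := []string{
		`"operationExecutor.MountVolume started for volume \"trusted-ca\" ..."`,
		`"MountVolume.SetUp succeeded for volume \"trusted-ca\" ..."`,
		`"operationExecutor.MountVolume started for volume \"registry-tls\" ..."`,
	}
	pending := map[string]bool{}
	for _, l := range lines {
		if m := started.FindStringSubmatch(l); m != nil {
			pending[m[1]] = true // mount began
		}
		if m := succeeded.FindStringSubmatch(l); m != nil {
			delete(pending, m[1]) // mount completed
		}
	}
	fmt.Println("volumes still mounting:", pending) // map[registry-tls:true]
}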
Dec 09 10:07:49 crc kubenswrapper[5002]: I1209 10:07:49.366474 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" podUID="cac4bbbb-1f9f-4107-99ee-cb71771ec460" containerName="registry" containerID="cri-o://deeb54b469ee1431d93f7d5a33c9c094bd182a9b86e0812938c9ee0141909463" gracePeriod=30
Dec 09 10:07:49 crc kubenswrapper[5002]: I1209 10:07:49.613036 5002 generic.go:334] "Generic (PLEG): container finished" podID="cac4bbbb-1f9f-4107-99ee-cb71771ec460" containerID="deeb54b469ee1431d93f7d5a33c9c094bd182a9b86e0812938c9ee0141909463" exitCode=0
Dec 09 10:07:49 crc kubenswrapper[5002]: I1209 10:07:49.613371 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" event={"ID":"cac4bbbb-1f9f-4107-99ee-cb71771ec460","Type":"ContainerDied","Data":"deeb54b469ee1431d93f7d5a33c9c094bd182a9b86e0812938c9ee0141909463"}
Dec 09 10:07:49 crc kubenswrapper[5002]: I1209 10:07:49.846534 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-82dlm"
Dec 09 10:07:50 crc kubenswrapper[5002]: I1209 10:07:50.013550 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cac4bbbb-1f9f-4107-99ee-cb71771ec460-ca-trust-extracted\") pod \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") "
Dec 09 10:07:50 crc kubenswrapper[5002]: I1209 10:07:50.013645 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cac4bbbb-1f9f-4107-99ee-cb71771ec460-trusted-ca\") pod \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") "
Dec 09 10:07:50 crc kubenswrapper[5002]: I1209 10:07:50.013684 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cac4bbbb-1f9f-4107-99ee-cb71771ec460-bound-sa-token\") pod \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") "
Dec 09 10:07:50 crc kubenswrapper[5002]: I1209 10:07:50.013937 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") "
Dec 09 10:07:50 crc kubenswrapper[5002]: I1209 10:07:50.014012 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cac4bbbb-1f9f-4107-99ee-cb71771ec460-registry-tls\") pod \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") "
Dec 09 10:07:50 crc kubenswrapper[5002]: I1209 10:07:50.014041 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cac4bbbb-1f9f-4107-99ee-cb71771ec460-installation-pull-secrets\") pod \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") "
Dec 09 10:07:50 crc kubenswrapper[5002]: I1209 10:07:50.014078 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xv5lc\" (UniqueName: \"kubernetes.io/projected/cac4bbbb-1f9f-4107-99ee-cb71771ec460-kube-api-access-xv5lc\") pod \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") "
Dec 09 10:07:50 crc kubenswrapper[5002]: I1209 10:07:50.014107 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cac4bbbb-1f9f-4107-99ee-cb71771ec460-registry-certificates\") pod \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\" (UID: \"cac4bbbb-1f9f-4107-99ee-cb71771ec460\") "
Dec 09 10:07:50 crc kubenswrapper[5002]: I1209 10:07:50.014805 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cac4bbbb-1f9f-4107-99ee-cb71771ec460-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "cac4bbbb-1f9f-4107-99ee-cb71771ec460" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 10:07:50 crc kubenswrapper[5002]: I1209 10:07:50.014908 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cac4bbbb-1f9f-4107-99ee-cb71771ec460-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "cac4bbbb-1f9f-4107-99ee-cb71771ec460" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 10:07:50 crc kubenswrapper[5002]: I1209 10:07:50.021062 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cac4bbbb-1f9f-4107-99ee-cb71771ec460-kube-api-access-xv5lc" (OuterVolumeSpecName: "kube-api-access-xv5lc") pod "cac4bbbb-1f9f-4107-99ee-cb71771ec460" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460"). InnerVolumeSpecName "kube-api-access-xv5lc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 10:07:50 crc kubenswrapper[5002]: I1209 10:07:50.021046 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cac4bbbb-1f9f-4107-99ee-cb71771ec460-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "cac4bbbb-1f9f-4107-99ee-cb71771ec460" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 10:07:50 crc kubenswrapper[5002]: I1209 10:07:50.021492 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cac4bbbb-1f9f-4107-99ee-cb71771ec460-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "cac4bbbb-1f9f-4107-99ee-cb71771ec460" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 10:07:50 crc kubenswrapper[5002]: I1209 10:07:50.022178 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cac4bbbb-1f9f-4107-99ee-cb71771ec460-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "cac4bbbb-1f9f-4107-99ee-cb71771ec460" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 10:07:50 crc kubenswrapper[5002]: I1209 10:07:50.026778 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "cac4bbbb-1f9f-4107-99ee-cb71771ec460" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Dec 09 10:07:50 crc kubenswrapper[5002]: I1209 10:07:50.033308 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cac4bbbb-1f9f-4107-99ee-cb71771ec460-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "cac4bbbb-1f9f-4107-99ee-cb71771ec460" (UID: "cac4bbbb-1f9f-4107-99ee-cb71771ec460"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 10:07:50 crc kubenswrapper[5002]: I1209 10:07:50.116098 5002 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cac4bbbb-1f9f-4107-99ee-cb71771ec460-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Dec 09 10:07:50 crc kubenswrapper[5002]: I1209 10:07:50.116154 5002 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cac4bbbb-1f9f-4107-99ee-cb71771ec460-trusted-ca\") on node \"crc\" DevicePath \"\""
Dec 09 10:07:50 crc kubenswrapper[5002]: I1209 10:07:50.116172 5002 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cac4bbbb-1f9f-4107-99ee-cb71771ec460-bound-sa-token\") on node \"crc\" DevicePath \"\""
Dec 09 10:07:50 crc kubenswrapper[5002]: I1209 10:07:50.116189 5002 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cac4bbbb-1f9f-4107-99ee-cb71771ec460-registry-tls\") on node \"crc\" DevicePath \"\""
Dec 09 10:07:50 crc kubenswrapper[5002]: I1209 10:07:50.116210 5002 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cac4bbbb-1f9f-4107-99ee-cb71771ec460-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Dec 09 10:07:50 crc kubenswrapper[5002]: I1209 10:07:50.116228 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xv5lc\" (UniqueName: \"kubernetes.io/projected/cac4bbbb-1f9f-4107-99ee-cb71771ec460-kube-api-access-xv5lc\") on node \"crc\" DevicePath \"\""
Dec 09 10:07:50 crc kubenswrapper[5002]: I1209 10:07:50.116245 5002 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cac4bbbb-1f9f-4107-99ee-cb71771ec460-registry-certificates\") on node \"crc\" DevicePath \"\""
Dec 09 10:07:50 crc kubenswrapper[5002]: I1209 10:07:50.623937 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-82dlm" event={"ID":"cac4bbbb-1f9f-4107-99ee-cb71771ec460","Type":"ContainerDied","Data":"7a39326b13b9b0c10f2706750a200be8136c35c004bfa8e788f9075df02b0dab"}
Dec 09 10:07:50 crc kubenswrapper[5002]: I1209 10:07:50.624034 5002 scope.go:117] "RemoveContainer" containerID="deeb54b469ee1431d93f7d5a33c9c094bd182a9b86e0812938c9ee0141909463"
Dec 09 10:07:50 crc kubenswrapper[5002]: I1209 10:07:50.624276 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-82dlm"
Dec 09 10:07:50 crc kubenswrapper[5002]: I1209 10:07:50.642807 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-82dlm"]
Dec 09 10:07:50 crc kubenswrapper[5002]: I1209 10:07:50.647112 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-82dlm"]
Dec 09 10:07:52 crc kubenswrapper[5002]: I1209 10:07:52.069150 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cac4bbbb-1f9f-4107-99ee-cb71771ec460" path="/var/lib/kubelet/pods/cac4bbbb-1f9f-4107-99ee-cb71771ec460/volumes"
Dec 09 10:08:07 crc kubenswrapper[5002]: I1209 10:08:07.964717 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 09 10:08:07 crc kubenswrapper[5002]: I1209 10:08:07.965504 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 09 10:08:07 crc kubenswrapper[5002]: I1209 10:08:07.965560 5002 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6"
Dec 09 10:08:07 crc kubenswrapper[5002]: I1209 10:08:07.966277 5002 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a8e4e15cd2cad8dac6467aa0019df2b17c907b359be99120bdb37a1c05087210"} pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 09 10:08:07 crc kubenswrapper[5002]: I1209 10:08:07.966346 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" containerID="cri-o://a8e4e15cd2cad8dac6467aa0019df2b17c907b359be99120bdb37a1c05087210" gracePeriod=600
Dec 09 10:08:08 crc kubenswrapper[5002]: I1209 10:08:08.168203 5002 scope.go:117] "RemoveContainer" containerID="c49d7b26bbe255f1217808981337d8190bdeac4f5008ee17df5242867e3103e7"
Dec 09 10:08:08 crc kubenswrapper[5002]: I1209 10:08:08.186599 5002 scope.go:117] "RemoveContainer" containerID="b484d6dfa4fffc554c21f9d9bc8e77a62e96ebb9426f2bb34dea4b2c9244d680"
Dec 09 10:08:08 crc kubenswrapper[5002]: I1209 10:08:08.199001 5002 scope.go:117] "RemoveContainer" containerID="a17adbbd63e7acd84c78e9058c55bbb95628ad8b486d7b4f65b3a59682285be9"
Dec 09 10:08:08 crc kubenswrapper[5002]: I1209 10:08:08.212656 5002 scope.go:117] "RemoveContainer" containerID="8bd67b1b7ad3e21872e9aa91e61c978e0999546882ee2e21347d4f60e435e0b9"
Dec 09 10:08:08 crc kubenswrapper[5002]: I1209 10:08:08.227095 5002 scope.go:117] "RemoveContainer" containerID="301eb2b68a37707ea483536d639e5f476d3cb41ad8c60fbaea2e019fb51bad48"
Dec 09 10:08:08 crc kubenswrapper[5002]: I1209 10:08:08.245147 5002 scope.go:117] "RemoveContainer" containerID="9e0798c79b0f2252ab999dbc35cab1e90ac5a68b731b112aeba0ecb47eef4b5f"
Dec 09 10:08:08 crc kubenswrapper[5002]: I1209 10:08:08.741165 5002 generic.go:334] "Generic (PLEG): container finished" podID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerID="a8e4e15cd2cad8dac6467aa0019df2b17c907b359be99120bdb37a1c05087210" exitCode=0
Dec 09 10:08:08 crc kubenswrapper[5002]: I1209 10:08:08.741517 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerDied","Data":"a8e4e15cd2cad8dac6467aa0019df2b17c907b359be99120bdb37a1c05087210"}
Dec 09 10:08:08 crc kubenswrapper[5002]: I1209 10:08:08.741544 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerStarted","Data":"6e8ddc7938962efdbd7e068e9a049cef2b10bcb309a4f79600438289cac64211"}
Dec 09 10:08:08 crc kubenswrapper[5002]: I1209 10:08:08.741561 5002 scope.go:117] "RemoveContainer" containerID="44da8a223abf131b459b827b0e8de65b415150f406fe22f2efb7e160cba4166c"
Dec 09 10:08:42 crc kubenswrapper[5002]: E1209 10:08:42.778692 5002 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/NetworkManager-dispatcher.service\": RecentStats: unable to find data in memory cache]"
Dec 09 10:10:37 crc kubenswrapper[5002]: I1209 10:10:37.964578 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 09 10:10:37 crc kubenswrapper[5002]: I1209 10:10:37.965246 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 09 10:11:07 crc kubenswrapper[5002]: I1209 10:11:07.966029 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 09 10:11:07 crc kubenswrapper[5002]: I1209 10:11:07.966925 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 09 10:11:37 crc kubenswrapper[5002]: I1209 10:11:37.965361 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 09 10:11:37 crc kubenswrapper[5002]: I1209 10:11:37.966162 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 10:11:37 crc kubenswrapper[5002]: I1209 10:11:37.966231 5002 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" Dec 09 10:11:37 crc kubenswrapper[5002]: I1209 10:11:37.967113 5002 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6e8ddc7938962efdbd7e068e9a049cef2b10bcb309a4f79600438289cac64211"} pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 10:11:37 crc kubenswrapper[5002]: I1209 10:11:37.967288 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" containerID="cri-o://6e8ddc7938962efdbd7e068e9a049cef2b10bcb309a4f79600438289cac64211" gracePeriod=600 Dec 09 10:11:38 crc kubenswrapper[5002]: I1209 10:11:38.214421 5002 generic.go:334] "Generic (PLEG): container finished" podID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerID="6e8ddc7938962efdbd7e068e9a049cef2b10bcb309a4f79600438289cac64211" exitCode=0 Dec 09 10:11:38 crc kubenswrapper[5002]: I1209 10:11:38.214473 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerDied","Data":"6e8ddc7938962efdbd7e068e9a049cef2b10bcb309a4f79600438289cac64211"} Dec 09 10:11:38 crc kubenswrapper[5002]: I1209 10:11:38.214511 5002 scope.go:117] "RemoveContainer" containerID="a8e4e15cd2cad8dac6467aa0019df2b17c907b359be99120bdb37a1c05087210" Dec 09 10:11:39 crc kubenswrapper[5002]: I1209 10:11:39.223522 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerStarted","Data":"637ff8a569cd1216521e8fa15ac9579d9708df3205b27a6dad02b958376b5a55"} Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.305630 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-4qffn"] Dec 09 10:13:00 crc kubenswrapper[5002]: E1209 10:13:00.306412 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cac4bbbb-1f9f-4107-99ee-cb71771ec460" containerName="registry" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.306431 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="cac4bbbb-1f9f-4107-99ee-cb71771ec460" containerName="registry" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.306542 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="cac4bbbb-1f9f-4107-99ee-cb71771ec460" containerName="registry" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.307091 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-4qffn" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.310744 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.311493 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.311845 5002 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-vhsld" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.312021 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.317925 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-4qffn"] Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.406623 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7b0ffb30-1518-4d4c-ac8b-b619bd493a5b-node-mnt\") pod \"crc-storage-crc-4qffn\" (UID: \"7b0ffb30-1518-4d4c-ac8b-b619bd493a5b\") " pod="crc-storage/crc-storage-crc-4qffn" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.406888 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgzj5\" (UniqueName: \"kubernetes.io/projected/7b0ffb30-1518-4d4c-ac8b-b619bd493a5b-kube-api-access-pgzj5\") pod \"crc-storage-crc-4qffn\" (UID: \"7b0ffb30-1518-4d4c-ac8b-b619bd493a5b\") " pod="crc-storage/crc-storage-crc-4qffn" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.406958 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7b0ffb30-1518-4d4c-ac8b-b619bd493a5b-crc-storage\") pod \"crc-storage-crc-4qffn\" (UID: \"7b0ffb30-1518-4d4c-ac8b-b619bd493a5b\") " pod="crc-storage/crc-storage-crc-4qffn" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.508657 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7b0ffb30-1518-4d4c-ac8b-b619bd493a5b-node-mnt\") pod \"crc-storage-crc-4qffn\" (UID: \"7b0ffb30-1518-4d4c-ac8b-b619bd493a5b\") " pod="crc-storage/crc-storage-crc-4qffn" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.508801 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgzj5\" (UniqueName: \"kubernetes.io/projected/7b0ffb30-1518-4d4c-ac8b-b619bd493a5b-kube-api-access-pgzj5\") pod \"crc-storage-crc-4qffn\" (UID: \"7b0ffb30-1518-4d4c-ac8b-b619bd493a5b\") " pod="crc-storage/crc-storage-crc-4qffn" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.508885 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7b0ffb30-1518-4d4c-ac8b-b619bd493a5b-crc-storage\") pod \"crc-storage-crc-4qffn\" (UID: \"7b0ffb30-1518-4d4c-ac8b-b619bd493a5b\") " pod="crc-storage/crc-storage-crc-4qffn" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.509159 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7b0ffb30-1518-4d4c-ac8b-b619bd493a5b-node-mnt\") pod \"crc-storage-crc-4qffn\" (UID: \"7b0ffb30-1518-4d4c-ac8b-b619bd493a5b\") " 
pod="crc-storage/crc-storage-crc-4qffn" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.510627 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7b0ffb30-1518-4d4c-ac8b-b619bd493a5b-crc-storage\") pod \"crc-storage-crc-4qffn\" (UID: \"7b0ffb30-1518-4d4c-ac8b-b619bd493a5b\") " pod="crc-storage/crc-storage-crc-4qffn" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.544676 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-2mnnl"] Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.545309 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" podUID="7013527e-73de-4427-af9c-e33663b1c222" containerName="ovn-controller" containerID="cri-o://dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4" gracePeriod=30 Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.545683 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgzj5\" (UniqueName: \"kubernetes.io/projected/7b0ffb30-1518-4d4c-ac8b-b619bd493a5b-kube-api-access-pgzj5\") pod \"crc-storage-crc-4qffn\" (UID: \"7b0ffb30-1518-4d4c-ac8b-b619bd493a5b\") " pod="crc-storage/crc-storage-crc-4qffn" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.545804 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" podUID="7013527e-73de-4427-af9c-e33663b1c222" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13" gracePeriod=30 Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.545855 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" podUID="7013527e-73de-4427-af9c-e33663b1c222" containerName="ovn-acl-logging" containerID="cri-o://8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7" gracePeriod=30 Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.545804 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" podUID="7013527e-73de-4427-af9c-e33663b1c222" containerName="kube-rbac-proxy-node" containerID="cri-o://7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20" gracePeriod=30 Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.546081 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" podUID="7013527e-73de-4427-af9c-e33663b1c222" containerName="sbdb" containerID="cri-o://ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d" gracePeriod=30 Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.546167 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" podUID="7013527e-73de-4427-af9c-e33663b1c222" containerName="nbdb" containerID="cri-o://685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f" gracePeriod=30 Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.546208 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" podUID="7013527e-73de-4427-af9c-e33663b1c222" containerName="northd" containerID="cri-o://e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d" gracePeriod=30 Dec 09 10:13:00 crc 
kubenswrapper[5002]: I1209 10:13:00.584773 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" podUID="7013527e-73de-4427-af9c-e33663b1c222" containerName="ovnkube-controller" containerID="cri-o://e4d7c92bb65c67935a65e2b8f0a344e9dc8d267be7e8dafcfb7b61a61ab1e424" gracePeriod=30 Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.637059 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-4qffn" Dec 09 10:13:00 crc kubenswrapper[5002]: E1209 10:13:00.731687 5002 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-4qffn_crc-storage_7b0ffb30-1518-4d4c-ac8b-b619bd493a5b_0(d1dc1e456795543c3de544b740bc631c16962b9680b49ef4e316c92c1ea8061c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 09 10:13:00 crc kubenswrapper[5002]: E1209 10:13:00.731788 5002 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-4qffn_crc-storage_7b0ffb30-1518-4d4c-ac8b-b619bd493a5b_0(d1dc1e456795543c3de544b740bc631c16962b9680b49ef4e316c92c1ea8061c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-4qffn" Dec 09 10:13:00 crc kubenswrapper[5002]: E1209 10:13:00.731878 5002 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-4qffn_crc-storage_7b0ffb30-1518-4d4c-ac8b-b619bd493a5b_0(d1dc1e456795543c3de544b740bc631c16962b9680b49ef4e316c92c1ea8061c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-4qffn" Dec 09 10:13:00 crc kubenswrapper[5002]: E1209 10:13:00.731950 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-4qffn_crc-storage(7b0ffb30-1518-4d4c-ac8b-b619bd493a5b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-4qffn_crc-storage(7b0ffb30-1518-4d4c-ac8b-b619bd493a5b)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-4qffn_crc-storage_7b0ffb30-1518-4d4c-ac8b-b619bd493a5b_0(d1dc1e456795543c3de544b740bc631c16962b9680b49ef4e316c92c1ea8061c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-4qffn" podUID="7b0ffb30-1518-4d4c-ac8b-b619bd493a5b" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.855270 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2mnnl_7013527e-73de-4427-af9c-e33663b1c222/ovnkube-controller/3.log" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.858000 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2mnnl_7013527e-73de-4427-af9c-e33663b1c222/ovn-acl-logging/0.log" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.858908 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2mnnl_7013527e-73de-4427-af9c-e33663b1c222/ovn-controller/0.log" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.859554 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.913976 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnh82\" (UniqueName: \"kubernetes.io/projected/7013527e-73de-4427-af9c-e33663b1c222-kube-api-access-fnh82\") pod \"7013527e-73de-4427-af9c-e33663b1c222\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.914068 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-etc-openvswitch\") pod \"7013527e-73de-4427-af9c-e33663b1c222\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.914139 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-host-cni-bin\") pod \"7013527e-73de-4427-af9c-e33663b1c222\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.914211 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-var-lib-openvswitch\") pod \"7013527e-73de-4427-af9c-e33663b1c222\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.914298 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-host-slash\") pod \"7013527e-73de-4427-af9c-e33663b1c222\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.914358 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-run-systemd\") pod \"7013527e-73de-4427-af9c-e33663b1c222\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.914397 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-log-socket\") pod \"7013527e-73de-4427-af9c-e33663b1c222\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.914437 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-run-openvswitch\") pod \"7013527e-73de-4427-af9c-e33663b1c222\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.914574 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "7013527e-73de-4427-af9c-e33663b1c222" (UID: "7013527e-73de-4427-af9c-e33663b1c222"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.914786 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-host-slash" (OuterVolumeSpecName: "host-slash") pod "7013527e-73de-4427-af9c-e33663b1c222" (UID: "7013527e-73de-4427-af9c-e33663b1c222"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.914987 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "7013527e-73de-4427-af9c-e33663b1c222" (UID: "7013527e-73de-4427-af9c-e33663b1c222"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.915020 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "7013527e-73de-4427-af9c-e33663b1c222" (UID: "7013527e-73de-4427-af9c-e33663b1c222"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.914970 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-log-socket" (OuterVolumeSpecName: "log-socket") pod "7013527e-73de-4427-af9c-e33663b1c222" (UID: "7013527e-73de-4427-af9c-e33663b1c222"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.914994 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "7013527e-73de-4427-af9c-e33663b1c222" (UID: "7013527e-73de-4427-af9c-e33663b1c222"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.915206 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-systemd-units\") pod \"7013527e-73de-4427-af9c-e33663b1c222\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.915299 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-run-ovn\") pod \"7013527e-73de-4427-af9c-e33663b1c222\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.915322 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "7013527e-73de-4427-af9c-e33663b1c222" (UID: "7013527e-73de-4427-af9c-e33663b1c222"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.915512 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7013527e-73de-4427-af9c-e33663b1c222-ovnkube-script-lib\") pod \"7013527e-73de-4427-af9c-e33663b1c222\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.915527 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "7013527e-73de-4427-af9c-e33663b1c222" (UID: "7013527e-73de-4427-af9c-e33663b1c222"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.915623 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-host-kubelet\") pod \"7013527e-73de-4427-af9c-e33663b1c222\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.915632 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "7013527e-73de-4427-af9c-e33663b1c222" (UID: "7013527e-73de-4427-af9c-e33663b1c222"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.915683 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7013527e-73de-4427-af9c-e33663b1c222-ovnkube-config\") pod \"7013527e-73de-4427-af9c-e33663b1c222\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.915728 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7013527e-73de-4427-af9c-e33663b1c222-env-overrides\") pod \"7013527e-73de-4427-af9c-e33663b1c222\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.915758 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-node-log\") pod \"7013527e-73de-4427-af9c-e33663b1c222\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.915805 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7013527e-73de-4427-af9c-e33663b1c222-ovn-node-metrics-cert\") pod \"7013527e-73de-4427-af9c-e33663b1c222\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.915930 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-node-log" (OuterVolumeSpecName: "node-log") pod "7013527e-73de-4427-af9c-e33663b1c222" (UID: "7013527e-73de-4427-af9c-e33663b1c222"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.916001 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-host-cni-netd\") pod \"7013527e-73de-4427-af9c-e33663b1c222\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.916152 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-host-run-ovn-kubernetes\") pod \"7013527e-73de-4427-af9c-e33663b1c222\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.916289 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-host-var-lib-cni-networks-ovn-kubernetes\") pod \"7013527e-73de-4427-af9c-e33663b1c222\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.916086 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "7013527e-73de-4427-af9c-e33663b1c222" (UID: "7013527e-73de-4427-af9c-e33663b1c222"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.916229 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "7013527e-73de-4427-af9c-e33663b1c222" (UID: "7013527e-73de-4427-af9c-e33663b1c222"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.916238 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7013527e-73de-4427-af9c-e33663b1c222-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "7013527e-73de-4427-af9c-e33663b1c222" (UID: "7013527e-73de-4427-af9c-e33663b1c222"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.916289 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7013527e-73de-4427-af9c-e33663b1c222-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "7013527e-73de-4427-af9c-e33663b1c222" (UID: "7013527e-73de-4427-af9c-e33663b1c222"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.916361 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-host-run-netns\") pod \"7013527e-73de-4427-af9c-e33663b1c222\" (UID: \"7013527e-73de-4427-af9c-e33663b1c222\") " Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.916379 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "7013527e-73de-4427-af9c-e33663b1c222" (UID: "7013527e-73de-4427-af9c-e33663b1c222"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.916501 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "7013527e-73de-4427-af9c-e33663b1c222" (UID: "7013527e-73de-4427-af9c-e33663b1c222"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.916809 5002 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-host-slash\") on node \"crc\" DevicePath \"\"" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.916896 5002 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-log-socket\") on node \"crc\" DevicePath \"\"" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.916918 5002 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.916938 5002 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.916957 5002 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.916974 5002 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.916990 5002 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7013527e-73de-4427-af9c-e33663b1c222-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.917006 5002 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7013527e-73de-4427-af9c-e33663b1c222-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.917023 5002 
reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-node-log\") on node \"crc\" DevicePath \"\"" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.917039 5002 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.917056 5002 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.917073 5002 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.917090 5002 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.917108 5002 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.917124 5002 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.917219 5002 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.921042 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7013527e-73de-4427-af9c-e33663b1c222-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "7013527e-73de-4427-af9c-e33663b1c222" (UID: "7013527e-73de-4427-af9c-e33663b1c222"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.921797 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7013527e-73de-4427-af9c-e33663b1c222-kube-api-access-fnh82" (OuterVolumeSpecName: "kube-api-access-fnh82") pod "7013527e-73de-4427-af9c-e33663b1c222" (UID: "7013527e-73de-4427-af9c-e33663b1c222"). InnerVolumeSpecName "kube-api-access-fnh82". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.923758 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7013527e-73de-4427-af9c-e33663b1c222-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "7013527e-73de-4427-af9c-e33663b1c222" (UID: "7013527e-73de-4427-af9c-e33663b1c222"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.938517 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "7013527e-73de-4427-af9c-e33663b1c222" (UID: "7013527e-73de-4427-af9c-e33663b1c222"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.940564 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ppjt6"] Dec 09 10:13:00 crc kubenswrapper[5002]: E1209 10:13:00.941376 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7013527e-73de-4427-af9c-e33663b1c222" containerName="ovnkube-controller" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.941427 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="7013527e-73de-4427-af9c-e33663b1c222" containerName="ovnkube-controller" Dec 09 10:13:00 crc kubenswrapper[5002]: E1209 10:13:00.941459 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7013527e-73de-4427-af9c-e33663b1c222" containerName="ovnkube-controller" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.941479 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="7013527e-73de-4427-af9c-e33663b1c222" containerName="ovnkube-controller" Dec 09 10:13:00 crc kubenswrapper[5002]: E1209 10:13:00.941505 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7013527e-73de-4427-af9c-e33663b1c222" containerName="ovn-controller" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.941525 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="7013527e-73de-4427-af9c-e33663b1c222" containerName="ovn-controller" Dec 09 10:13:00 crc kubenswrapper[5002]: E1209 10:13:00.941555 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7013527e-73de-4427-af9c-e33663b1c222" containerName="ovnkube-controller" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.941573 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="7013527e-73de-4427-af9c-e33663b1c222" containerName="ovnkube-controller" Dec 09 10:13:00 crc kubenswrapper[5002]: E1209 10:13:00.941599 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7013527e-73de-4427-af9c-e33663b1c222" containerName="nbdb" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.941617 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="7013527e-73de-4427-af9c-e33663b1c222" containerName="nbdb" Dec 09 10:13:00 crc kubenswrapper[5002]: E1209 10:13:00.941639 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7013527e-73de-4427-af9c-e33663b1c222" containerName="ovn-acl-logging" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.941658 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="7013527e-73de-4427-af9c-e33663b1c222" containerName="ovn-acl-logging" Dec 09 10:13:00 crc kubenswrapper[5002]: E1209 10:13:00.941687 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7013527e-73de-4427-af9c-e33663b1c222" containerName="sbdb" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.941705 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="7013527e-73de-4427-af9c-e33663b1c222" containerName="sbdb" Dec 09 10:13:00 crc kubenswrapper[5002]: E1209 10:13:00.941735 5002 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7013527e-73de-4427-af9c-e33663b1c222" containerName="kube-rbac-proxy-ovn-metrics" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.941754 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="7013527e-73de-4427-af9c-e33663b1c222" containerName="kube-rbac-proxy-ovn-metrics" Dec 09 10:13:00 crc kubenswrapper[5002]: E1209 10:13:00.941781 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7013527e-73de-4427-af9c-e33663b1c222" containerName="northd" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.941799 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="7013527e-73de-4427-af9c-e33663b1c222" containerName="northd" Dec 09 10:13:00 crc kubenswrapper[5002]: E1209 10:13:00.941855 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7013527e-73de-4427-af9c-e33663b1c222" containerName="kube-rbac-proxy-node" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.941877 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="7013527e-73de-4427-af9c-e33663b1c222" containerName="kube-rbac-proxy-node" Dec 09 10:13:00 crc kubenswrapper[5002]: E1209 10:13:00.941952 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7013527e-73de-4427-af9c-e33663b1c222" containerName="kubecfg-setup" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.941974 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="7013527e-73de-4427-af9c-e33663b1c222" containerName="kubecfg-setup" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.942220 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="7013527e-73de-4427-af9c-e33663b1c222" containerName="kube-rbac-proxy-ovn-metrics" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.942250 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="7013527e-73de-4427-af9c-e33663b1c222" containerName="ovn-acl-logging" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.942275 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="7013527e-73de-4427-af9c-e33663b1c222" containerName="sbdb" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.942300 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="7013527e-73de-4427-af9c-e33663b1c222" containerName="ovnkube-controller" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.942322 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="7013527e-73de-4427-af9c-e33663b1c222" containerName="kube-rbac-proxy-node" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.942345 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="7013527e-73de-4427-af9c-e33663b1c222" containerName="ovnkube-controller" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.942368 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="7013527e-73de-4427-af9c-e33663b1c222" containerName="northd" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.942389 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="7013527e-73de-4427-af9c-e33663b1c222" containerName="ovnkube-controller" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.942412 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="7013527e-73de-4427-af9c-e33663b1c222" containerName="nbdb" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.942431 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="7013527e-73de-4427-af9c-e33663b1c222" containerName="ovnkube-controller" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 
10:13:00.942453 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="7013527e-73de-4427-af9c-e33663b1c222" containerName="ovn-controller" Dec 09 10:13:00 crc kubenswrapper[5002]: E1209 10:13:00.942745 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7013527e-73de-4427-af9c-e33663b1c222" containerName="ovnkube-controller" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.942773 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="7013527e-73de-4427-af9c-e33663b1c222" containerName="ovnkube-controller" Dec 09 10:13:00 crc kubenswrapper[5002]: E1209 10:13:00.942804 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7013527e-73de-4427-af9c-e33663b1c222" containerName="ovnkube-controller" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.942858 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="7013527e-73de-4427-af9c-e33663b1c222" containerName="ovnkube-controller" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.943205 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="7013527e-73de-4427-af9c-e33663b1c222" containerName="ovnkube-controller" Dec 09 10:13:00 crc kubenswrapper[5002]: I1209 10:13:00.948677 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.003007 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2mnnl_7013527e-73de-4427-af9c-e33663b1c222/ovnkube-controller/3.log" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.005552 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2mnnl_7013527e-73de-4427-af9c-e33663b1c222/ovn-acl-logging/0.log" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.006230 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2mnnl_7013527e-73de-4427-af9c-e33663b1c222/ovn-controller/0.log" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.006838 5002 generic.go:334] "Generic (PLEG): container finished" podID="7013527e-73de-4427-af9c-e33663b1c222" containerID="e4d7c92bb65c67935a65e2b8f0a344e9dc8d267be7e8dafcfb7b61a61ab1e424" exitCode=0 Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.006859 5002 generic.go:334] "Generic (PLEG): container finished" podID="7013527e-73de-4427-af9c-e33663b1c222" containerID="ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d" exitCode=0 Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.006870 5002 generic.go:334] "Generic (PLEG): container finished" podID="7013527e-73de-4427-af9c-e33663b1c222" containerID="685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f" exitCode=0 Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.006878 5002 generic.go:334] "Generic (PLEG): container finished" podID="7013527e-73de-4427-af9c-e33663b1c222" containerID="e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d" exitCode=0 Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.006884 5002 generic.go:334] "Generic (PLEG): container finished" podID="7013527e-73de-4427-af9c-e33663b1c222" containerID="3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13" exitCode=0 Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.006891 5002 generic.go:334] "Generic (PLEG): container finished" podID="7013527e-73de-4427-af9c-e33663b1c222" 
containerID="7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20" exitCode=0 Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.006897 5002 generic.go:334] "Generic (PLEG): container finished" podID="7013527e-73de-4427-af9c-e33663b1c222" containerID="8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7" exitCode=143 Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.006903 5002 generic.go:334] "Generic (PLEG): container finished" podID="7013527e-73de-4427-af9c-e33663b1c222" containerID="dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4" exitCode=143 Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.006931 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" event={"ID":"7013527e-73de-4427-af9c-e33663b1c222","Type":"ContainerDied","Data":"e4d7c92bb65c67935a65e2b8f0a344e9dc8d267be7e8dafcfb7b61a61ab1e424"} Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.006974 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" event={"ID":"7013527e-73de-4427-af9c-e33663b1c222","Type":"ContainerDied","Data":"ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d"} Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.006976 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.007002 5002 scope.go:117] "RemoveContainer" containerID="e4d7c92bb65c67935a65e2b8f0a344e9dc8d267be7e8dafcfb7b61a61ab1e424" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.006990 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" event={"ID":"7013527e-73de-4427-af9c-e33663b1c222","Type":"ContainerDied","Data":"685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f"} Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.007175 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" event={"ID":"7013527e-73de-4427-af9c-e33663b1c222","Type":"ContainerDied","Data":"e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d"} Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.007200 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" event={"ID":"7013527e-73de-4427-af9c-e33663b1c222","Type":"ContainerDied","Data":"3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13"} Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.007220 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" event={"ID":"7013527e-73de-4427-af9c-e33663b1c222","Type":"ContainerDied","Data":"7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20"} Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.007237 5002 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d793531ca62af532bfc4e6f7582de7db1e219e2329be5c82f2e30bb11df1b871"} Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.007256 5002 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d"} Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.007267 5002 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f"} Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.007277 5002 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d"} Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.007287 5002 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13"} Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.007296 5002 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20"} Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.007320 5002 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7"} Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.007333 5002 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4"} Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.007343 5002 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794"} Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.007357 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" event={"ID":"7013527e-73de-4427-af9c-e33663b1c222","Type":"ContainerDied","Data":"8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7"} Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.007374 5002 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e4d7c92bb65c67935a65e2b8f0a344e9dc8d267be7e8dafcfb7b61a61ab1e424"} Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.007386 5002 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d793531ca62af532bfc4e6f7582de7db1e219e2329be5c82f2e30bb11df1b871"} Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.007396 5002 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d"} Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.007406 5002 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f"} Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.007415 5002 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d"} Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.007425 5002 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13"} Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.007434 5002 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20"} Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.007444 5002 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7"} Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.007453 5002 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4"} Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.007464 5002 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794"} Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.007477 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" event={"ID":"7013527e-73de-4427-af9c-e33663b1c222","Type":"ContainerDied","Data":"dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4"} Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.007520 5002 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e4d7c92bb65c67935a65e2b8f0a344e9dc8d267be7e8dafcfb7b61a61ab1e424"} Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.007533 5002 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d793531ca62af532bfc4e6f7582de7db1e219e2329be5c82f2e30bb11df1b871"} Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.007543 5002 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d"} Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.007553 5002 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f"} Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.007563 5002 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d"} Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.007574 5002 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13"} Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.007583 5002 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20"} Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.007593 5002 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7"} Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.007603 5002 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4"} Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.007612 5002 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794"} Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.007626 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mnnl" event={"ID":"7013527e-73de-4427-af9c-e33663b1c222","Type":"ContainerDied","Data":"b9e28e18876930927475b2285b8109c84f6a30fe113b3aeb6a3cbf15eaaaf7bc"} Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.008107 5002 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e4d7c92bb65c67935a65e2b8f0a344e9dc8d267be7e8dafcfb7b61a61ab1e424"} Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.008133 5002 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d793531ca62af532bfc4e6f7582de7db1e219e2329be5c82f2e30bb11df1b871"} Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.008143 5002 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d"} Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.008152 5002 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f"} Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.008162 5002 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d"} Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.008171 5002 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13"} Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.008180 5002 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20"} Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.008190 5002 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7"} Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.008199 5002 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4"} Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.008208 5002 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794"} Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.009602 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rgf44_28ed6e93-eda5-4648-b185-25d2960ce0f0/kube-multus/2.log" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.010275 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rgf44_28ed6e93-eda5-4648-b185-25d2960ce0f0/kube-multus/1.log" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.010314 5002 generic.go:334] "Generic (PLEG): container finished" podID="28ed6e93-eda5-4648-b185-25d2960ce0f0" 
containerID="687b3ce71d2c60749a207725755ffed8681eeb08793418478b64111660f7d33d" exitCode=2 Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.010363 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-4qffn" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.010425 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rgf44" event={"ID":"28ed6e93-eda5-4648-b185-25d2960ce0f0","Type":"ContainerDied","Data":"687b3ce71d2c60749a207725755ffed8681eeb08793418478b64111660f7d33d"} Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.010473 5002 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fdfff3a756cc7a96ff82f208e082d8c283f7d558a14540a6953bcdc664b63f23"} Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.010644 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-4qffn" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.010942 5002 scope.go:117] "RemoveContainer" containerID="687b3ce71d2c60749a207725755ffed8681eeb08793418478b64111660f7d33d" Dec 09 10:13:01 crc kubenswrapper[5002]: E1209 10:13:01.011099 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-rgf44_openshift-multus(28ed6e93-eda5-4648-b185-25d2960ce0f0)\"" pod="openshift-multus/multus-rgf44" podUID="28ed6e93-eda5-4648-b185-25d2960ce0f0" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.018521 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-host-slash\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.018571 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-ovnkube-script-lib\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.018608 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-host-kubelet\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.018700 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-ovn-node-metrics-cert\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.019314 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-run-openvswitch\") pod \"ovnkube-node-ppjt6\" (UID: 
\"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.019718 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-run-ovn\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.019771 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.019808 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-run-systemd\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.019870 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-host-run-netns\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.019900 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-log-socket\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.019939 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-host-cni-bin\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.019988 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-host-cni-netd\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.020036 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-ovnkube-config\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.020069 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-etc-openvswitch\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.020119 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-994d8\" (UniqueName: \"kubernetes.io/projected/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-kube-api-access-994d8\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.020141 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-host-run-ovn-kubernetes\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.020156 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-node-log\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.020173 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-var-lib-openvswitch\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.020207 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-systemd-units\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.020229 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-env-overrides\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.020294 5002 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7013527e-73de-4427-af9c-e33663b1c222-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.020307 5002 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7013527e-73de-4427-af9c-e33663b1c222-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.020317 5002 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7013527e-73de-4427-af9c-e33663b1c222-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.020328 5002 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnh82\" (UniqueName: \"kubernetes.io/projected/7013527e-73de-4427-af9c-e33663b1c222-kube-api-access-fnh82\") on node \"crc\" DevicePath \"\"" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.041245 5002 scope.go:117] "RemoveContainer" containerID="d793531ca62af532bfc4e6f7582de7db1e219e2329be5c82f2e30bb11df1b871" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.066614 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-2mnnl"] Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.071577 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-2mnnl"] Dec 09 10:13:01 crc kubenswrapper[5002]: E1209 10:13:01.076049 5002 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-4qffn_crc-storage_7b0ffb30-1518-4d4c-ac8b-b619bd493a5b_0(38bbd925ba863573e5f237e78f22061c012751dde693d61113db82d81c939e76): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 09 10:13:01 crc kubenswrapper[5002]: E1209 10:13:01.076106 5002 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-4qffn_crc-storage_7b0ffb30-1518-4d4c-ac8b-b619bd493a5b_0(38bbd925ba863573e5f237e78f22061c012751dde693d61113db82d81c939e76): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-4qffn" Dec 09 10:13:01 crc kubenswrapper[5002]: E1209 10:13:01.076132 5002 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-4qffn_crc-storage_7b0ffb30-1518-4d4c-ac8b-b619bd493a5b_0(38bbd925ba863573e5f237e78f22061c012751dde693d61113db82d81c939e76): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-4qffn" Dec 09 10:13:01 crc kubenswrapper[5002]: E1209 10:13:01.076177 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-4qffn_crc-storage(7b0ffb30-1518-4d4c-ac8b-b619bd493a5b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-4qffn_crc-storage(7b0ffb30-1518-4d4c-ac8b-b619bd493a5b)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-4qffn_crc-storage_7b0ffb30-1518-4d4c-ac8b-b619bd493a5b_0(38bbd925ba863573e5f237e78f22061c012751dde693d61113db82d81c939e76): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="crc-storage/crc-storage-crc-4qffn" podUID="7b0ffb30-1518-4d4c-ac8b-b619bd493a5b" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.083328 5002 scope.go:117] "RemoveContainer" containerID="ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.100293 5002 scope.go:117] "RemoveContainer" containerID="685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.121767 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-run-openvswitch\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.121851 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-run-ovn\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.121908 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.121934 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-run-systemd\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.121957 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-host-run-netns\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.121960 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-run-openvswitch\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.121996 5002 scope.go:117] "RemoveContainer" containerID="e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.122026 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-run-ovn\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.122035 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-log-socket\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.121975 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-log-socket\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.122157 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-host-cni-bin\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.122189 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-host-cni-netd\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.122222 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-ovnkube-config\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.122226 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-run-systemd\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.122291 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.122317 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-etc-openvswitch\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.122359 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-host-cni-netd\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.122361 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-host-run-netns\") pod 
\"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.122428 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-994d8\" (UniqueName: \"kubernetes.io/projected/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-kube-api-access-994d8\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.122468 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-host-run-ovn-kubernetes\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.122514 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-var-lib-openvswitch\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.122544 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-node-log\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.122582 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-systemd-units\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.122620 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-env-overrides\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.122732 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-host-slash\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.122765 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-ovnkube-script-lib\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.122860 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-host-kubelet\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.122895 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-host-cni-bin\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.122921 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-ovn-node-metrics-cert\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.124535 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-node-log\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.124593 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-var-lib-openvswitch\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.124670 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-etc-openvswitch\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.124724 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-systemd-units\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.125093 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-host-slash\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.125159 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-host-kubelet\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.125268 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-host-run-ovn-kubernetes\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.125809 5002 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-ovnkube-config\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.126144 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-ovnkube-script-lib\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.126408 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-env-overrides\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.130608 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-ovn-node-metrics-cert\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.141680 5002 scope.go:117] "RemoveContainer" containerID="3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.143749 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-994d8\" (UniqueName: \"kubernetes.io/projected/7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1-kube-api-access-994d8\") pod \"ovnkube-node-ppjt6\" (UID: \"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.155039 5002 scope.go:117] "RemoveContainer" containerID="7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.167134 5002 scope.go:117] "RemoveContainer" containerID="8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.181393 5002 scope.go:117] "RemoveContainer" containerID="dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.196444 5002 scope.go:117] "RemoveContainer" containerID="39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.210227 5002 scope.go:117] "RemoveContainer" containerID="e4d7c92bb65c67935a65e2b8f0a344e9dc8d267be7e8dafcfb7b61a61ab1e424" Dec 09 10:13:01 crc kubenswrapper[5002]: E1209 10:13:01.210565 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4d7c92bb65c67935a65e2b8f0a344e9dc8d267be7e8dafcfb7b61a61ab1e424\": container with ID starting with e4d7c92bb65c67935a65e2b8f0a344e9dc8d267be7e8dafcfb7b61a61ab1e424 not found: ID does not exist" containerID="e4d7c92bb65c67935a65e2b8f0a344e9dc8d267be7e8dafcfb7b61a61ab1e424" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.210599 5002 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e4d7c92bb65c67935a65e2b8f0a344e9dc8d267be7e8dafcfb7b61a61ab1e424"} err="failed to get container status \"e4d7c92bb65c67935a65e2b8f0a344e9dc8d267be7e8dafcfb7b61a61ab1e424\": rpc error: code = NotFound desc = could not find container \"e4d7c92bb65c67935a65e2b8f0a344e9dc8d267be7e8dafcfb7b61a61ab1e424\": container with ID starting with e4d7c92bb65c67935a65e2b8f0a344e9dc8d267be7e8dafcfb7b61a61ab1e424 not found: ID does not exist" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.210625 5002 scope.go:117] "RemoveContainer" containerID="d793531ca62af532bfc4e6f7582de7db1e219e2329be5c82f2e30bb11df1b871" Dec 09 10:13:01 crc kubenswrapper[5002]: E1209 10:13:01.210939 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d793531ca62af532bfc4e6f7582de7db1e219e2329be5c82f2e30bb11df1b871\": container with ID starting with d793531ca62af532bfc4e6f7582de7db1e219e2329be5c82f2e30bb11df1b871 not found: ID does not exist" containerID="d793531ca62af532bfc4e6f7582de7db1e219e2329be5c82f2e30bb11df1b871" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.210985 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d793531ca62af532bfc4e6f7582de7db1e219e2329be5c82f2e30bb11df1b871"} err="failed to get container status \"d793531ca62af532bfc4e6f7582de7db1e219e2329be5c82f2e30bb11df1b871\": rpc error: code = NotFound desc = could not find container \"d793531ca62af532bfc4e6f7582de7db1e219e2329be5c82f2e30bb11df1b871\": container with ID starting with d793531ca62af532bfc4e6f7582de7db1e219e2329be5c82f2e30bb11df1b871 not found: ID does not exist" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.211018 5002 scope.go:117] "RemoveContainer" containerID="ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d" Dec 09 10:13:01 crc kubenswrapper[5002]: E1209 10:13:01.211291 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d\": container with ID starting with ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d not found: ID does not exist" containerID="ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.211322 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d"} err="failed to get container status \"ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d\": rpc error: code = NotFound desc = could not find container \"ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d\": container with ID starting with ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d not found: ID does not exist" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.211342 5002 scope.go:117] "RemoveContainer" containerID="685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f" Dec 09 10:13:01 crc kubenswrapper[5002]: E1209 10:13:01.211874 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f\": container with ID starting with 685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f not found: ID does not exist" 
containerID="685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.211914 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f"} err="failed to get container status \"685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f\": rpc error: code = NotFound desc = could not find container \"685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f\": container with ID starting with 685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f not found: ID does not exist" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.211963 5002 scope.go:117] "RemoveContainer" containerID="e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d" Dec 09 10:13:01 crc kubenswrapper[5002]: E1209 10:13:01.212252 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d\": container with ID starting with e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d not found: ID does not exist" containerID="e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.212278 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d"} err="failed to get container status \"e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d\": rpc error: code = NotFound desc = could not find container \"e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d\": container with ID starting with e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d not found: ID does not exist" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.212293 5002 scope.go:117] "RemoveContainer" containerID="3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13" Dec 09 10:13:01 crc kubenswrapper[5002]: E1209 10:13:01.212507 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13\": container with ID starting with 3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13 not found: ID does not exist" containerID="3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.212535 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13"} err="failed to get container status \"3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13\": rpc error: code = NotFound desc = could not find container \"3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13\": container with ID starting with 3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13 not found: ID does not exist" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.212552 5002 scope.go:117] "RemoveContainer" containerID="7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20" Dec 09 10:13:01 crc kubenswrapper[5002]: E1209 10:13:01.212789 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20\": container with ID starting with 7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20 not found: ID does not exist" containerID="7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.212848 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20"} err="failed to get container status \"7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20\": rpc error: code = NotFound desc = could not find container \"7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20\": container with ID starting with 7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20 not found: ID does not exist" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.212873 5002 scope.go:117] "RemoveContainer" containerID="8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7" Dec 09 10:13:01 crc kubenswrapper[5002]: E1209 10:13:01.213113 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7\": container with ID starting with 8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7 not found: ID does not exist" containerID="8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.213150 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7"} err="failed to get container status \"8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7\": rpc error: code = NotFound desc = could not find container \"8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7\": container with ID starting with 8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7 not found: ID does not exist" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.213172 5002 scope.go:117] "RemoveContainer" containerID="dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4" Dec 09 10:13:01 crc kubenswrapper[5002]: E1209 10:13:01.213746 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4\": container with ID starting with dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4 not found: ID does not exist" containerID="dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.213782 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4"} err="failed to get container status \"dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4\": rpc error: code = NotFound desc = could not find container \"dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4\": container with ID starting with dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4 not found: ID does not exist" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.213806 5002 scope.go:117] "RemoveContainer" containerID="39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794" Dec 09 10:13:01 crc 
kubenswrapper[5002]: E1209 10:13:01.214364 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\": container with ID starting with 39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794 not found: ID does not exist" containerID="39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.214402 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794"} err="failed to get container status \"39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\": rpc error: code = NotFound desc = could not find container \"39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\": container with ID starting with 39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794 not found: ID does not exist" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.214428 5002 scope.go:117] "RemoveContainer" containerID="e4d7c92bb65c67935a65e2b8f0a344e9dc8d267be7e8dafcfb7b61a61ab1e424" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.214667 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4d7c92bb65c67935a65e2b8f0a344e9dc8d267be7e8dafcfb7b61a61ab1e424"} err="failed to get container status \"e4d7c92bb65c67935a65e2b8f0a344e9dc8d267be7e8dafcfb7b61a61ab1e424\": rpc error: code = NotFound desc = could not find container \"e4d7c92bb65c67935a65e2b8f0a344e9dc8d267be7e8dafcfb7b61a61ab1e424\": container with ID starting with e4d7c92bb65c67935a65e2b8f0a344e9dc8d267be7e8dafcfb7b61a61ab1e424 not found: ID does not exist" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.214697 5002 scope.go:117] "RemoveContainer" containerID="d793531ca62af532bfc4e6f7582de7db1e219e2329be5c82f2e30bb11df1b871" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.214942 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d793531ca62af532bfc4e6f7582de7db1e219e2329be5c82f2e30bb11df1b871"} err="failed to get container status \"d793531ca62af532bfc4e6f7582de7db1e219e2329be5c82f2e30bb11df1b871\": rpc error: code = NotFound desc = could not find container \"d793531ca62af532bfc4e6f7582de7db1e219e2329be5c82f2e30bb11df1b871\": container with ID starting with d793531ca62af532bfc4e6f7582de7db1e219e2329be5c82f2e30bb11df1b871 not found: ID does not exist" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.214971 5002 scope.go:117] "RemoveContainer" containerID="ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.215178 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d"} err="failed to get container status \"ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d\": rpc error: code = NotFound desc = could not find container \"ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d\": container with ID starting with ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d not found: ID does not exist" Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.215205 5002 scope.go:117] "RemoveContainer" containerID="685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f" Dec 09 10:13:01 crc 
kubenswrapper[5002]: I1209 10:13:01.215402 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f"} err="failed to get container status \"685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f\": rpc error: code = NotFound desc = could not find container \"685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f\": container with ID starting with 685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f not found: ID does not exist"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.215432 5002 scope.go:117] "RemoveContainer" containerID="e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.215681 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d"} err="failed to get container status \"e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d\": rpc error: code = NotFound desc = could not find container \"e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d\": container with ID starting with e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d not found: ID does not exist"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.215708 5002 scope.go:117] "RemoveContainer" containerID="3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.215941 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13"} err="failed to get container status \"3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13\": rpc error: code = NotFound desc = could not find container \"3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13\": container with ID starting with 3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13 not found: ID does not exist"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.215974 5002 scope.go:117] "RemoveContainer" containerID="7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.216315 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20"} err="failed to get container status \"7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20\": rpc error: code = NotFound desc = could not find container \"7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20\": container with ID starting with 7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20 not found: ID does not exist"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.216347 5002 scope.go:117] "RemoveContainer" containerID="8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.216597 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7"} err="failed to get container status \"8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7\": rpc error: code = NotFound desc = could not find container \"8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7\": container with ID starting with 8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7 not found: ID does not exist"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.216627 5002 scope.go:117] "RemoveContainer" containerID="dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.217343 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4"} err="failed to get container status \"dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4\": rpc error: code = NotFound desc = could not find container \"dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4\": container with ID starting with dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4 not found: ID does not exist"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.217374 5002 scope.go:117] "RemoveContainer" containerID="39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.217729 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794"} err="failed to get container status \"39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\": rpc error: code = NotFound desc = could not find container \"39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\": container with ID starting with 39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794 not found: ID does not exist"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.217758 5002 scope.go:117] "RemoveContainer" containerID="e4d7c92bb65c67935a65e2b8f0a344e9dc8d267be7e8dafcfb7b61a61ab1e424"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.218229 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4d7c92bb65c67935a65e2b8f0a344e9dc8d267be7e8dafcfb7b61a61ab1e424"} err="failed to get container status \"e4d7c92bb65c67935a65e2b8f0a344e9dc8d267be7e8dafcfb7b61a61ab1e424\": rpc error: code = NotFound desc = could not find container \"e4d7c92bb65c67935a65e2b8f0a344e9dc8d267be7e8dafcfb7b61a61ab1e424\": container with ID starting with e4d7c92bb65c67935a65e2b8f0a344e9dc8d267be7e8dafcfb7b61a61ab1e424 not found: ID does not exist"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.218266 5002 scope.go:117] "RemoveContainer" containerID="d793531ca62af532bfc4e6f7582de7db1e219e2329be5c82f2e30bb11df1b871"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.218605 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d793531ca62af532bfc4e6f7582de7db1e219e2329be5c82f2e30bb11df1b871"} err="failed to get container status \"d793531ca62af532bfc4e6f7582de7db1e219e2329be5c82f2e30bb11df1b871\": rpc error: code = NotFound desc = could not find container \"d793531ca62af532bfc4e6f7582de7db1e219e2329be5c82f2e30bb11df1b871\": container with ID starting with d793531ca62af532bfc4e6f7582de7db1e219e2329be5c82f2e30bb11df1b871 not found: ID does not exist"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.218635 5002 scope.go:117] "RemoveContainer" containerID="ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.218979 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d"} err="failed to get container status \"ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d\": rpc error: code = NotFound desc = could not find container \"ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d\": container with ID starting with ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d not found: ID does not exist"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.219010 5002 scope.go:117] "RemoveContainer" containerID="685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.219415 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f"} err="failed to get container status \"685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f\": rpc error: code = NotFound desc = could not find container \"685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f\": container with ID starting with 685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f not found: ID does not exist"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.219448 5002 scope.go:117] "RemoveContainer" containerID="e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.219882 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d"} err="failed to get container status \"e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d\": rpc error: code = NotFound desc = could not find container \"e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d\": container with ID starting with e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d not found: ID does not exist"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.219911 5002 scope.go:117] "RemoveContainer" containerID="3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.220127 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13"} err="failed to get container status \"3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13\": rpc error: code = NotFound desc = could not find container \"3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13\": container with ID starting with 3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13 not found: ID does not exist"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.220152 5002 scope.go:117] "RemoveContainer" containerID="7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.220539 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20"} err="failed to get container status \"7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20\": rpc error: code = NotFound desc = could not find container \"7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20\": container with ID starting with 7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20 not found: ID does not exist"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.220571 5002 scope.go:117] "RemoveContainer" containerID="8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.220801 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7"} err="failed to get container status \"8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7\": rpc error: code = NotFound desc = could not find container \"8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7\": container with ID starting with 8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7 not found: ID does not exist"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.220852 5002 scope.go:117] "RemoveContainer" containerID="dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.221143 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4"} err="failed to get container status \"dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4\": rpc error: code = NotFound desc = could not find container \"dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4\": container with ID starting with dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4 not found: ID does not exist"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.221174 5002 scope.go:117] "RemoveContainer" containerID="39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.221518 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794"} err="failed to get container status \"39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\": rpc error: code = NotFound desc = could not find container \"39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\": container with ID starting with 39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794 not found: ID does not exist"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.221546 5002 scope.go:117] "RemoveContainer" containerID="e4d7c92bb65c67935a65e2b8f0a344e9dc8d267be7e8dafcfb7b61a61ab1e424"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.221753 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4d7c92bb65c67935a65e2b8f0a344e9dc8d267be7e8dafcfb7b61a61ab1e424"} err="failed to get container status \"e4d7c92bb65c67935a65e2b8f0a344e9dc8d267be7e8dafcfb7b61a61ab1e424\": rpc error: code = NotFound desc = could not find container \"e4d7c92bb65c67935a65e2b8f0a344e9dc8d267be7e8dafcfb7b61a61ab1e424\": container with ID starting with e4d7c92bb65c67935a65e2b8f0a344e9dc8d267be7e8dafcfb7b61a61ab1e424 not found: ID does not exist"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.221779 5002 scope.go:117] "RemoveContainer" containerID="d793531ca62af532bfc4e6f7582de7db1e219e2329be5c82f2e30bb11df1b871"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.222050 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d793531ca62af532bfc4e6f7582de7db1e219e2329be5c82f2e30bb11df1b871"} err="failed to get container status \"d793531ca62af532bfc4e6f7582de7db1e219e2329be5c82f2e30bb11df1b871\": rpc error: code = NotFound desc = could not find container \"d793531ca62af532bfc4e6f7582de7db1e219e2329be5c82f2e30bb11df1b871\": container with ID starting with d793531ca62af532bfc4e6f7582de7db1e219e2329be5c82f2e30bb11df1b871 not found: ID does not exist"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.222079 5002 scope.go:117] "RemoveContainer" containerID="ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.222295 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d"} err="failed to get container status \"ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d\": rpc error: code = NotFound desc = could not find container \"ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d\": container with ID starting with ce7ef077bc889c4fdc66e721130cfd77f70f09b4e6865c7e0489084fe640544d not found: ID does not exist"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.222324 5002 scope.go:117] "RemoveContainer" containerID="685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.222728 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f"} err="failed to get container status \"685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f\": rpc error: code = NotFound desc = could not find container \"685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f\": container with ID starting with 685d60a49851f54dc5271e1645d575dff0f9fbe3ef50fe34296ea681b357284f not found: ID does not exist"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.222762 5002 scope.go:117] "RemoveContainer" containerID="e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.223065 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d"} err="failed to get container status \"e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d\": rpc error: code = NotFound desc = could not find container \"e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d\": container with ID starting with e1103a820e4f843e4946ffa6e51d4763de14d47565b25b5e9063a22cce86f86d not found: ID does not exist"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.223096 5002 scope.go:117] "RemoveContainer" containerID="3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.223993 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13"} err="failed to get container status \"3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13\": rpc error: code = NotFound desc = could not find container \"3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13\": container with ID starting with 3d9fe35a5ac87704e2b728e3a34273358ecddc032bd275d2563e15c608b47e13 not found: ID does not exist"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.224025 5002 scope.go:117] "RemoveContainer" containerID="7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.224300 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20"} err="failed to get container status \"7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20\": rpc error: code = NotFound desc = could not find container \"7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20\": container with ID starting with 7482c29da9fc8b7a51842c8963cbe0f106617424f8dff818bc90d3170b190a20 not found: ID does not exist"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.224334 5002 scope.go:117] "RemoveContainer" containerID="8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.224836 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7"} err="failed to get container status \"8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7\": rpc error: code = NotFound desc = could not find container \"8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7\": container with ID starting with 8e39d71c51eb2c83f915b02af1b39a7b2fc519c7bf72e63320ac9ec1ec9d45b7 not found: ID does not exist"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.224866 5002 scope.go:117] "RemoveContainer" containerID="dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.225058 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4"} err="failed to get container status \"dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4\": rpc error: code = NotFound desc = could not find container \"dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4\": container with ID starting with dc9c00197f7e726922464ed088af2757c77566ffade28c648ea8e3e2cce573f4 not found: ID does not exist"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.225087 5002 scope.go:117] "RemoveContainer" containerID="39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.225320 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794"} err="failed to get container status \"39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\": rpc error: code = NotFound desc = could not find container \"39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794\": container with ID starting with 39e50c3f884082c18fc291d144104b141530370826fe33429d93ebbc7bc3d794 not found: ID does not exist"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.225352 5002 scope.go:117] "RemoveContainer" containerID="e4d7c92bb65c67935a65e2b8f0a344e9dc8d267be7e8dafcfb7b61a61ab1e424"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.225638 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4d7c92bb65c67935a65e2b8f0a344e9dc8d267be7e8dafcfb7b61a61ab1e424"} err="failed to get container status \"e4d7c92bb65c67935a65e2b8f0a344e9dc8d267be7e8dafcfb7b61a61ab1e424\": rpc error: code = NotFound desc = could not find container \"e4d7c92bb65c67935a65e2b8f0a344e9dc8d267be7e8dafcfb7b61a61ab1e424\": container with ID starting with e4d7c92bb65c67935a65e2b8f0a344e9dc8d267be7e8dafcfb7b61a61ab1e424 not found: ID does not exist"
Dec 09 10:13:01 crc kubenswrapper[5002]: I1209 10:13:01.275456 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6"
Dec 09 10:13:01 crc kubenswrapper[5002]: W1209 10:13:01.299613 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c15dae5_92e8_4ce7_8646_28c1e4b0e1d1.slice/crio-1376d815ab7c808b193d343a351e336e231bb0ae0fee82ed1ca7e1005bee296f WatchSource:0}: Error finding container 1376d815ab7c808b193d343a351e336e231bb0ae0fee82ed1ca7e1005bee296f: Status 404 returned error can't find the container with id 1376d815ab7c808b193d343a351e336e231bb0ae0fee82ed1ca7e1005bee296f
Dec 09 10:13:02 crc kubenswrapper[5002]: I1209 10:13:02.019197 5002 generic.go:334] "Generic (PLEG): container finished" podID="7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1" containerID="89218df98cebee545976e4caa02849fc0964691a50ee3bb9d9c1d96077827f40" exitCode=0
Dec 09 10:13:02 crc kubenswrapper[5002]: I1209 10:13:02.019306 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" event={"ID":"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1","Type":"ContainerDied","Data":"89218df98cebee545976e4caa02849fc0964691a50ee3bb9d9c1d96077827f40"}
Dec 09 10:13:02 crc kubenswrapper[5002]: I1209 10:13:02.020986 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" event={"ID":"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1","Type":"ContainerStarted","Data":"1376d815ab7c808b193d343a351e336e231bb0ae0fee82ed1ca7e1005bee296f"}
Dec 09 10:13:02 crc kubenswrapper[5002]: I1209 10:13:02.094618 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7013527e-73de-4427-af9c-e33663b1c222" path="/var/lib/kubelet/pods/7013527e-73de-4427-af9c-e33663b1c222/volumes"
Dec 09 10:13:03 crc kubenswrapper[5002]: I1209 10:13:03.029062 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" event={"ID":"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1","Type":"ContainerStarted","Data":"5ecbcf283c2416348c18e5dae3390221b8c8d7db5d93a7ba6de732f83e08edb6"}
Dec 09 10:13:03 crc kubenswrapper[5002]: I1209 10:13:03.029385 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" event={"ID":"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1","Type":"ContainerStarted","Data":"91c2870e68d96a8ac6e3430aa99a9bbbea016cc3ac84be5ab9eba5a087d40a7f"}
Dec 09 10:13:03 crc kubenswrapper[5002]: I1209 10:13:03.029395 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" event={"ID":"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1","Type":"ContainerStarted","Data":"e0eb00d32252f403ee462520bace001a1fac8b597a273bfae2b87a76a37d88e9"}
Dec 09 10:13:03 crc kubenswrapper[5002]: I1209 10:13:03.029407 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" event={"ID":"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1","Type":"ContainerStarted","Data":"2de47527ce6ea121816b4abfa039351800c309c71f2f859fd3e47927685363d7"}
Dec 09 10:13:03 crc kubenswrapper[5002]: I1209 10:13:03.029416 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" event={"ID":"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1","Type":"ContainerStarted","Data":"b6aa5415deed374fbcdd2587c595be2cc6a6199cccdb82f5f5fee819ae412b1f"}
event={"ID":"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1","Type":"ContainerStarted","Data":"b6aa5415deed374fbcdd2587c595be2cc6a6199cccdb82f5f5fee819ae412b1f"} Dec 09 10:13:03 crc kubenswrapper[5002]: I1209 10:13:03.029425 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" event={"ID":"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1","Type":"ContainerStarted","Data":"887fda57625152c62da904ece771892d15c984e07b118151627d43ed27213d81"} Dec 09 10:13:05 crc kubenswrapper[5002]: I1209 10:13:05.055111 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" event={"ID":"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1","Type":"ContainerStarted","Data":"ffed92a6f8f98a5e94b8c8214a1c354c191857448fe6810db30db0bdb88c132f"} Dec 09 10:13:08 crc kubenswrapper[5002]: I1209 10:13:08.084114 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" event={"ID":"7c15dae5-92e8-4ce7-8646-28c1e4b0e1d1","Type":"ContainerStarted","Data":"f8a97b94fa891531690a356854399dbf229f0a72d42db3c476b13ac170f55fe7"} Dec 09 10:13:08 crc kubenswrapper[5002]: I1209 10:13:08.122112 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" podStartSLOduration=8.122092856 podStartE2EDuration="8.122092856s" podCreationTimestamp="2025-12-09 10:13:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:13:08.119094015 +0000 UTC m=+720.511145106" watchObservedRunningTime="2025-12-09 10:13:08.122092856 +0000 UTC m=+720.514143947" Dec 09 10:13:08 crc kubenswrapper[5002]: I1209 10:13:08.326080 5002 scope.go:117] "RemoveContainer" containerID="fdfff3a756cc7a96ff82f208e082d8c283f7d558a14540a6953bcdc664b63f23" Dec 09 10:13:09 crc kubenswrapper[5002]: I1209 10:13:09.098546 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rgf44_28ed6e93-eda5-4648-b185-25d2960ce0f0/kube-multus/2.log" Dec 09 10:13:09 crc kubenswrapper[5002]: I1209 10:13:09.099387 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:09 crc kubenswrapper[5002]: I1209 10:13:09.099460 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:09 crc kubenswrapper[5002]: I1209 10:13:09.099656 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:09 crc kubenswrapper[5002]: I1209 10:13:09.146913 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:09 crc kubenswrapper[5002]: I1209 10:13:09.149599 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6" Dec 09 10:13:15 crc kubenswrapper[5002]: I1209 10:13:15.060127 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-4qffn" Dec 09 10:13:15 crc kubenswrapper[5002]: I1209 10:13:15.060859 5002 scope.go:117] "RemoveContainer" containerID="687b3ce71d2c60749a207725755ffed8681eeb08793418478b64111660f7d33d" Dec 09 10:13:15 crc kubenswrapper[5002]: E1209 10:13:15.061786 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-rgf44_openshift-multus(28ed6e93-eda5-4648-b185-25d2960ce0f0)\"" pod="openshift-multus/multus-rgf44" podUID="28ed6e93-eda5-4648-b185-25d2960ce0f0" Dec 09 10:13:15 crc kubenswrapper[5002]: I1209 10:13:15.061906 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-4qffn" Dec 09 10:13:15 crc kubenswrapper[5002]: E1209 10:13:15.104603 5002 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-4qffn_crc-storage_7b0ffb30-1518-4d4c-ac8b-b619bd493a5b_0(bf951cdb9dcc949acd8a0cbe12a73c14ae64fb827258c5d0554b39c6d46015e9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 09 10:13:15 crc kubenswrapper[5002]: E1209 10:13:15.104686 5002 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-4qffn_crc-storage_7b0ffb30-1518-4d4c-ac8b-b619bd493a5b_0(bf951cdb9dcc949acd8a0cbe12a73c14ae64fb827258c5d0554b39c6d46015e9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-4qffn" Dec 09 10:13:15 crc kubenswrapper[5002]: E1209 10:13:15.104717 5002 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-4qffn_crc-storage_7b0ffb30-1518-4d4c-ac8b-b619bd493a5b_0(bf951cdb9dcc949acd8a0cbe12a73c14ae64fb827258c5d0554b39c6d46015e9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-4qffn" Dec 09 10:13:15 crc kubenswrapper[5002]: E1209 10:13:15.104779 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-4qffn_crc-storage(7b0ffb30-1518-4d4c-ac8b-b619bd493a5b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-4qffn_crc-storage(7b0ffb30-1518-4d4c-ac8b-b619bd493a5b)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-4qffn_crc-storage_7b0ffb30-1518-4d4c-ac8b-b619bd493a5b_0(bf951cdb9dcc949acd8a0cbe12a73c14ae64fb827258c5d0554b39c6d46015e9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-4qffn" podUID="7b0ffb30-1518-4d4c-ac8b-b619bd493a5b" Dec 09 10:13:28 crc kubenswrapper[5002]: I1209 10:13:28.060249 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-4qffn" Dec 09 10:13:28 crc kubenswrapper[5002]: I1209 10:13:28.066459 5002 util.go:30] "No sandbox for pod can be found. 
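The "back-off 20s restarting failed container" line above is the kubelet's CrashLoopBackOff at its second step: restart delays double from a 10s base up to a 5m cap, and reset only after a container has run cleanly for a while. A toy sketch of that schedule (the 10s base and 5m cap match upstream kubelet defaults, but this is an illustration, not kubelet code):

    // backoff.go - the CrashLoopBackOff delay schedule; "back-off 20s"
    // in the log corresponds to the second crash below.
    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        delay := 10 * time.Second // kubelet's initial container back-off
        for crash := 1; crash <= 7; crash++ {
            fmt.Printf("crash #%d -> back-off %v\n", crash, delay)
            delay *= 2
            if delay > 5*time.Minute {
                delay = 5 * time.Minute // the max container back-off
            }
        }
    }

Running it prints 10s, 20s, 40s, 1m20s, 2m40s, then 5m0s for every later crash.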
Dec 09 10:13:28 crc kubenswrapper[5002]: E1209 10:13:28.121239 5002 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-4qffn_crc-storage_7b0ffb30-1518-4d4c-ac8b-b619bd493a5b_0(5925b02effc5d470a6029845f7dd3e0afe4e33a6290956eef7a13ec451071061): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 09 10:13:28 crc kubenswrapper[5002]: E1209 10:13:28.121600 5002 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-4qffn_crc-storage_7b0ffb30-1518-4d4c-ac8b-b619bd493a5b_0(5925b02effc5d470a6029845f7dd3e0afe4e33a6290956eef7a13ec451071061): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-4qffn"
Dec 09 10:13:28 crc kubenswrapper[5002]: E1209 10:13:28.121627 5002 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-4qffn_crc-storage_7b0ffb30-1518-4d4c-ac8b-b619bd493a5b_0(5925b02effc5d470a6029845f7dd3e0afe4e33a6290956eef7a13ec451071061): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-4qffn"
Dec 09 10:13:28 crc kubenswrapper[5002]: E1209 10:13:28.121760 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-4qffn_crc-storage(7b0ffb30-1518-4d4c-ac8b-b619bd493a5b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-4qffn_crc-storage(7b0ffb30-1518-4d4c-ac8b-b619bd493a5b)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-4qffn_crc-storage_7b0ffb30-1518-4d4c-ac8b-b619bd493a5b_0(5925b02effc5d470a6029845f7dd3e0afe4e33a6290956eef7a13ec451071061): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-4qffn" podUID="7b0ffb30-1518-4d4c-ac8b-b619bd493a5b"
Dec 09 10:13:29 crc kubenswrapper[5002]: I1209 10:13:29.060633 5002 scope.go:117] "RemoveContainer" containerID="687b3ce71d2c60749a207725755ffed8681eeb08793418478b64111660f7d33d"
Dec 09 10:13:30 crc kubenswrapper[5002]: I1209 10:13:30.234598 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rgf44_28ed6e93-eda5-4648-b185-25d2960ce0f0/kube-multus/2.log"
Dec 09 10:13:30 crc kubenswrapper[5002]: I1209 10:13:30.235007 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rgf44" event={"ID":"28ed6e93-eda5-4648-b185-25d2960ce0f0","Type":"ContainerStarted","Data":"0e1883565ed7ae8176692b7353e7ee2d28f68f5eb3cfd5c3bd2bfb92221ef17b"}
Dec 09 10:13:31 crc kubenswrapper[5002]: I1209 10:13:31.309958 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ppjt6"
Dec 09 10:13:41 crc kubenswrapper[5002]: I1209 10:13:41.059932 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-4qffn"
Dec 09 10:13:41 crc kubenswrapper[5002]: I1209 10:13:41.061778 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-4qffn"
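The repeated RunPodSandbox failures above all have the same cause: the runtime finds no network configuration, so every sandbox create for crc-storage-crc-4qffn fails and is retried until OVN-Kubernetes writes its CNI config into /etc/kubernetes/cni/net.d/ (the retries stop at 10:13:41 below, once the node network is up). A rough stand-alone check of that directory, approximating what CNI config discovery looks for; this sketch is not CRI-O's actual code path:

    // cnicheck.go - list CNI network configs the way a runtime would
    // look for them (*.conf, *.conflist, *.json in its net.d directory).
    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    func main() {
        dir := "/etc/kubernetes/cni/net.d" // the net.d path from the log
        entries, err := os.ReadDir(dir)
        if err != nil {
            fmt.Println("cannot read CNI config dir:", err)
            return
        }
        var found []string
        for _, e := range entries {
            switch filepath.Ext(e.Name()) {
            case ".conf", ".conflist", ".json":
                found = append(found, e.Name())
            }
        }
        if len(found) == 0 {
            // The state the kubelet keeps reporting above:
            // "Has your network provider started?"
            fmt.Println("no CNI configuration file in", dir)
            return
        }
        fmt.Println("CNI configs:", found)
    }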
Dec 09 10:13:41 crc kubenswrapper[5002]: I1209 10:13:41.426870 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-4qffn"]
Dec 09 10:13:41 crc kubenswrapper[5002]: W1209 10:13:41.435669 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b0ffb30_1518_4d4c_ac8b_b619bd493a5b.slice/crio-1b731e9a22349af32c5e539cb107523d007014623dde255d298d02751862c924 WatchSource:0}: Error finding container 1b731e9a22349af32c5e539cb107523d007014623dde255d298d02751862c924: Status 404 returned error can't find the container with id 1b731e9a22349af32c5e539cb107523d007014623dde255d298d02751862c924
Dec 09 10:13:41 crc kubenswrapper[5002]: I1209 10:13:41.438324 5002 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 09 10:13:42 crc kubenswrapper[5002]: I1209 10:13:42.326740 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-4qffn" event={"ID":"7b0ffb30-1518-4d4c-ac8b-b619bd493a5b","Type":"ContainerStarted","Data":"1b731e9a22349af32c5e539cb107523d007014623dde255d298d02751862c924"}
Dec 09 10:13:43 crc kubenswrapper[5002]: I1209 10:13:43.335006 5002 generic.go:334] "Generic (PLEG): container finished" podID="7b0ffb30-1518-4d4c-ac8b-b619bd493a5b" containerID="9cd0cf638903e9cf4ac27f38a9d808ccdcf8521186458e924d0f193ef5e4b6f6" exitCode=0
Dec 09 10:13:43 crc kubenswrapper[5002]: I1209 10:13:43.335064 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-4qffn" event={"ID":"7b0ffb30-1518-4d4c-ac8b-b619bd493a5b","Type":"ContainerDied","Data":"9cd0cf638903e9cf4ac27f38a9d808ccdcf8521186458e924d0f193ef5e4b6f6"}
Dec 09 10:13:44 crc kubenswrapper[5002]: I1209 10:13:44.590379 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-4qffn"
Dec 09 10:13:44 crc kubenswrapper[5002]: I1209 10:13:44.758295 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7b0ffb30-1518-4d4c-ac8b-b619bd493a5b-crc-storage\") pod \"7b0ffb30-1518-4d4c-ac8b-b619bd493a5b\" (UID: \"7b0ffb30-1518-4d4c-ac8b-b619bd493a5b\") "
Dec 09 10:13:44 crc kubenswrapper[5002]: I1209 10:13:44.758420 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgzj5\" (UniqueName: \"kubernetes.io/projected/7b0ffb30-1518-4d4c-ac8b-b619bd493a5b-kube-api-access-pgzj5\") pod \"7b0ffb30-1518-4d4c-ac8b-b619bd493a5b\" (UID: \"7b0ffb30-1518-4d4c-ac8b-b619bd493a5b\") "
Dec 09 10:13:44 crc kubenswrapper[5002]: I1209 10:13:44.758464 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7b0ffb30-1518-4d4c-ac8b-b619bd493a5b-node-mnt\") pod \"7b0ffb30-1518-4d4c-ac8b-b619bd493a5b\" (UID: \"7b0ffb30-1518-4d4c-ac8b-b619bd493a5b\") "
Dec 09 10:13:44 crc kubenswrapper[5002]: I1209 10:13:44.758659 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7b0ffb30-1518-4d4c-ac8b-b619bd493a5b-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "7b0ffb30-1518-4d4c-ac8b-b619bd493a5b" (UID: "7b0ffb30-1518-4d4c-ac8b-b619bd493a5b"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 09 10:13:44 crc kubenswrapper[5002]: I1209 10:13:44.758921 5002 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7b0ffb30-1518-4d4c-ac8b-b619bd493a5b-node-mnt\") on node \"crc\" DevicePath \"\""
Dec 09 10:13:44 crc kubenswrapper[5002]: I1209 10:13:44.763632 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b0ffb30-1518-4d4c-ac8b-b619bd493a5b-kube-api-access-pgzj5" (OuterVolumeSpecName: "kube-api-access-pgzj5") pod "7b0ffb30-1518-4d4c-ac8b-b619bd493a5b" (UID: "7b0ffb30-1518-4d4c-ac8b-b619bd493a5b"). InnerVolumeSpecName "kube-api-access-pgzj5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 10:13:44 crc kubenswrapper[5002]: I1209 10:13:44.771523 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b0ffb30-1518-4d4c-ac8b-b619bd493a5b-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "7b0ffb30-1518-4d4c-ac8b-b619bd493a5b" (UID: "7b0ffb30-1518-4d4c-ac8b-b619bd493a5b"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 10:13:44 crc kubenswrapper[5002]: I1209 10:13:44.860323 5002 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7b0ffb30-1518-4d4c-ac8b-b619bd493a5b-crc-storage\") on node \"crc\" DevicePath \"\""
Dec 09 10:13:44 crc kubenswrapper[5002]: I1209 10:13:44.860364 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgzj5\" (UniqueName: \"kubernetes.io/projected/7b0ffb30-1518-4d4c-ac8b-b619bd493a5b-kube-api-access-pgzj5\") on node \"crc\" DevicePath \"\""
Dec 09 10:13:45 crc kubenswrapper[5002]: I1209 10:13:45.349280 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-4qffn" event={"ID":"7b0ffb30-1518-4d4c-ac8b-b619bd493a5b","Type":"ContainerDied","Data":"1b731e9a22349af32c5e539cb107523d007014623dde255d298d02751862c924"}
Dec 09 10:13:45 crc kubenswrapper[5002]: I1209 10:13:45.349584 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b731e9a22349af32c5e539cb107523d007014623dde255d298d02751862c924"
Dec 09 10:13:45 crc kubenswrapper[5002]: I1209 10:13:45.349363 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-4qffn"
Dec 09 10:13:48 crc kubenswrapper[5002]: I1209 10:13:48.767623 5002 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 09 10:13:52 crc kubenswrapper[5002]: I1209 10:13:52.438340 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbppkz"]
Dec 09 10:13:52 crc kubenswrapper[5002]: E1209 10:13:52.438807 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b0ffb30-1518-4d4c-ac8b-b619bd493a5b" containerName="storage"
Dec 09 10:13:52 crc kubenswrapper[5002]: I1209 10:13:52.438836 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b0ffb30-1518-4d4c-ac8b-b619bd493a5b" containerName="storage"
Dec 09 10:13:52 crc kubenswrapper[5002]: I1209 10:13:52.438935 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b0ffb30-1518-4d4c-ac8b-b619bd493a5b" containerName="storage"
Dec 09 10:13:52 crc kubenswrapper[5002]: I1209 10:13:52.439648 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbppkz"
Dec 09 10:13:52 crc kubenswrapper[5002]: I1209 10:13:52.441617 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Dec 09 10:13:52 crc kubenswrapper[5002]: I1209 10:13:52.450509 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbppkz"]
Dec 09 10:13:52 crc kubenswrapper[5002]: I1209 10:13:52.466862 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb466\" (UniqueName: \"kubernetes.io/projected/90bbec9d-d897-4556-845c-249b0d52c202-kube-api-access-mb466\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbppkz\" (UID: \"90bbec9d-d897-4556-845c-249b0d52c202\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbppkz"
Dec 09 10:13:52 crc kubenswrapper[5002]: I1209 10:13:52.466909 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/90bbec9d-d897-4556-845c-249b0d52c202-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbppkz\" (UID: \"90bbec9d-d897-4556-845c-249b0d52c202\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbppkz"
Dec 09 10:13:52 crc kubenswrapper[5002]: I1209 10:13:52.467043 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/90bbec9d-d897-4556-845c-249b0d52c202-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbppkz\" (UID: \"90bbec9d-d897-4556-845c-249b0d52c202\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbppkz"
Dec 09 10:13:52 crc kubenswrapper[5002]: I1209 10:13:52.568017 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/90bbec9d-d897-4556-845c-249b0d52c202-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbppkz\" (UID: \"90bbec9d-d897-4556-845c-249b0d52c202\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbppkz"
Dec 09 10:13:52 crc kubenswrapper[5002]: I1209 10:13:52.568146 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mb466\" (UniqueName: \"kubernetes.io/projected/90bbec9d-d897-4556-845c-249b0d52c202-kube-api-access-mb466\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbppkz\" (UID: \"90bbec9d-d897-4556-845c-249b0d52c202\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbppkz"
Dec 09 10:13:52 crc kubenswrapper[5002]: I1209 10:13:52.568186 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/90bbec9d-d897-4556-845c-249b0d52c202-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbppkz\" (UID: \"90bbec9d-d897-4556-845c-249b0d52c202\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbppkz"
Dec 09 10:13:52 crc kubenswrapper[5002]: I1209 10:13:52.568952 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/90bbec9d-d897-4556-845c-249b0d52c202-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbppkz\" (UID: \"90bbec9d-d897-4556-845c-249b0d52c202\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbppkz"
Dec 09 10:13:52 crc kubenswrapper[5002]: I1209 10:13:52.568976 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/90bbec9d-d897-4556-845c-249b0d52c202-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbppkz\" (UID: \"90bbec9d-d897-4556-845c-249b0d52c202\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbppkz"
Dec 09 10:13:52 crc kubenswrapper[5002]: I1209 10:13:52.605073 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mb466\" (UniqueName: \"kubernetes.io/projected/90bbec9d-d897-4556-845c-249b0d52c202-kube-api-access-mb466\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbppkz\" (UID: \"90bbec9d-d897-4556-845c-249b0d52c202\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbppkz"
Dec 09 10:13:52 crc kubenswrapper[5002]: I1209 10:13:52.755140 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbppkz"
Dec 09 10:13:53 crc kubenswrapper[5002]: I1209 10:13:53.242474 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbppkz"]
Dec 09 10:13:53 crc kubenswrapper[5002]: W1209 10:13:53.257110 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90bbec9d_d897_4556_845c_249b0d52c202.slice/crio-f149b81de074781efb3b886551908df64016037b25c7d22b2c40bd25a7bb6fb6 WatchSource:0}: Error finding container f149b81de074781efb3b886551908df64016037b25c7d22b2c40bd25a7bb6fb6: Status 404 returned error can't find the container with id f149b81de074781efb3b886551908df64016037b25c7d22b2c40bd25a7bb6fb6
Dec 09 10:13:53 crc kubenswrapper[5002]: I1209 10:13:53.396168 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbppkz" event={"ID":"90bbec9d-d897-4556-845c-249b0d52c202","Type":"ContainerStarted","Data":"f149b81de074781efb3b886551908df64016037b25c7d22b2c40bd25a7bb6fb6"}
Dec 09 10:13:54 crc kubenswrapper[5002]: I1209 10:13:54.405739 5002 generic.go:334] "Generic (PLEG): container finished" podID="90bbec9d-d897-4556-845c-249b0d52c202" containerID="7bb6ed045256f75b4b16f4e46435642805fcdf518a17189ce8e6f022a2629d45" exitCode=0
Dec 09 10:13:54 crc kubenswrapper[5002]: I1209 10:13:54.405864 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbppkz" event={"ID":"90bbec9d-d897-4556-845c-249b0d52c202","Type":"ContainerDied","Data":"7bb6ed045256f75b4b16f4e46435642805fcdf518a17189ce8e6f022a2629d45"}
Dec 09 10:13:54 crc kubenswrapper[5002]: I1209 10:13:54.709369 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-d2pdw"]
Dec 09 10:13:54 crc kubenswrapper[5002]: I1209 10:13:54.711935 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d2pdw"
Dec 09 10:13:54 crc kubenswrapper[5002]: I1209 10:13:54.722434 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d2pdw"]
Dec 09 10:13:54 crc kubenswrapper[5002]: I1209 10:13:54.905384 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4pgt\" (UniqueName: \"kubernetes.io/projected/646295cd-b0c3-419c-936c-32178b49a757-kube-api-access-z4pgt\") pod \"redhat-operators-d2pdw\" (UID: \"646295cd-b0c3-419c-936c-32178b49a757\") " pod="openshift-marketplace/redhat-operators-d2pdw"
Dec 09 10:13:54 crc kubenswrapper[5002]: I1209 10:13:54.905526 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/646295cd-b0c3-419c-936c-32178b49a757-catalog-content\") pod \"redhat-operators-d2pdw\" (UID: \"646295cd-b0c3-419c-936c-32178b49a757\") " pod="openshift-marketplace/redhat-operators-d2pdw"
Dec 09 10:13:54 crc kubenswrapper[5002]: I1209 10:13:54.905632 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/646295cd-b0c3-419c-936c-32178b49a757-utilities\") pod \"redhat-operators-d2pdw\" (UID: \"646295cd-b0c3-419c-936c-32178b49a757\") " pod="openshift-marketplace/redhat-operators-d2pdw"
Dec 09 10:13:55 crc kubenswrapper[5002]: I1209 10:13:55.006669 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/646295cd-b0c3-419c-936c-32178b49a757-utilities\") pod \"redhat-operators-d2pdw\" (UID: \"646295cd-b0c3-419c-936c-32178b49a757\") " pod="openshift-marketplace/redhat-operators-d2pdw"
Dec 09 10:13:55 crc kubenswrapper[5002]: I1209 10:13:55.006731 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4pgt\" (UniqueName: \"kubernetes.io/projected/646295cd-b0c3-419c-936c-32178b49a757-kube-api-access-z4pgt\") pod \"redhat-operators-d2pdw\" (UID: \"646295cd-b0c3-419c-936c-32178b49a757\") " pod="openshift-marketplace/redhat-operators-d2pdw"
Dec 09 10:13:55 crc kubenswrapper[5002]: I1209 10:13:55.006770 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/646295cd-b0c3-419c-936c-32178b49a757-catalog-content\") pod \"redhat-operators-d2pdw\" (UID: \"646295cd-b0c3-419c-936c-32178b49a757\") " pod="openshift-marketplace/redhat-operators-d2pdw"
Dec 09 10:13:55 crc kubenswrapper[5002]: I1209 10:13:55.007862 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/646295cd-b0c3-419c-936c-32178b49a757-catalog-content\") pod \"redhat-operators-d2pdw\" (UID: \"646295cd-b0c3-419c-936c-32178b49a757\") " pod="openshift-marketplace/redhat-operators-d2pdw"
Dec 09 10:13:55 crc kubenswrapper[5002]: I1209 10:13:55.007983 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/646295cd-b0c3-419c-936c-32178b49a757-utilities\") pod \"redhat-operators-d2pdw\" (UID: \"646295cd-b0c3-419c-936c-32178b49a757\") " pod="openshift-marketplace/redhat-operators-d2pdw"
Dec 09 10:13:55 crc kubenswrapper[5002]: I1209 10:13:55.043060 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4pgt\" (UniqueName: \"kubernetes.io/projected/646295cd-b0c3-419c-936c-32178b49a757-kube-api-access-z4pgt\") pod \"redhat-operators-d2pdw\" (UID: \"646295cd-b0c3-419c-936c-32178b49a757\") " pod="openshift-marketplace/redhat-operators-d2pdw"
Dec 09 10:13:55 crc kubenswrapper[5002]: I1209 10:13:55.052308 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d2pdw"
Dec 09 10:13:55 crc kubenswrapper[5002]: I1209 10:13:55.321433 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d2pdw"]
Dec 09 10:13:55 crc kubenswrapper[5002]: W1209 10:13:55.325713 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod646295cd_b0c3_419c_936c_32178b49a757.slice/crio-f1350a018652a45d46e284bdc9d34532804cd3976ab91232f2480c31ad727178 WatchSource:0}: Error finding container f1350a018652a45d46e284bdc9d34532804cd3976ab91232f2480c31ad727178: Status 404 returned error can't find the container with id f1350a018652a45d46e284bdc9d34532804cd3976ab91232f2480c31ad727178
Dec 09 10:13:55 crc kubenswrapper[5002]: I1209 10:13:55.411647 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d2pdw" event={"ID":"646295cd-b0c3-419c-936c-32178b49a757","Type":"ContainerStarted","Data":"f1350a018652a45d46e284bdc9d34532804cd3976ab91232f2480c31ad727178"}
Dec 09 10:13:56 crc kubenswrapper[5002]: I1209 10:13:56.420771 5002 generic.go:334] "Generic (PLEG): container finished" podID="90bbec9d-d897-4556-845c-249b0d52c202" containerID="2afc006909705348c850ae3391a7bce13fa2b6a2cf032653232b60c92f2bc371" exitCode=0
Dec 09 10:13:56 crc kubenswrapper[5002]: I1209 10:13:56.420889 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbppkz" event={"ID":"90bbec9d-d897-4556-845c-249b0d52c202","Type":"ContainerDied","Data":"2afc006909705348c850ae3391a7bce13fa2b6a2cf032653232b60c92f2bc371"}
Dec 09 10:13:56 crc kubenswrapper[5002]: I1209 10:13:56.423514 5002 generic.go:334] "Generic (PLEG): container finished" podID="646295cd-b0c3-419c-936c-32178b49a757" containerID="55639e7be73d39043f1a6c39193f9cb278c9e323ba72cb5d4a4ba62eececc767" exitCode=0
Dec 09 10:13:56 crc kubenswrapper[5002]: I1209 10:13:56.423590 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d2pdw" event={"ID":"646295cd-b0c3-419c-936c-32178b49a757","Type":"ContainerDied","Data":"55639e7be73d39043f1a6c39193f9cb278c9e323ba72cb5d4a4ba62eececc767"}
Dec 09 10:13:57 crc kubenswrapper[5002]: I1209 10:13:57.439232 5002 generic.go:334] "Generic (PLEG): container finished" podID="90bbec9d-d897-4556-845c-249b0d52c202" containerID="dc597c2fe2ff841b99a867029a90c8f53281995c8a2c6ccf28df53060cec5096" exitCode=0
Dec 09 10:13:57 crc kubenswrapper[5002]: I1209 10:13:57.439292 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbppkz" event={"ID":"90bbec9d-d897-4556-845c-249b0d52c202","Type":"ContainerDied","Data":"dc597c2fe2ff841b99a867029a90c8f53281995c8a2c6ccf28df53060cec5096"}
Dec 09 10:13:57 crc kubenswrapper[5002]: I1209 10:13:57.444892 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d2pdw" event={"ID":"646295cd-b0c3-419c-936c-32178b49a757","Type":"ContainerStarted","Data":"717765310b305275bd300101c05169849a08bb507c408b75025fd65e13403971"}
Dec 09 10:13:58 crc kubenswrapper[5002]: I1209 10:13:58.454095 5002 generic.go:334] "Generic (PLEG): container finished" podID="646295cd-b0c3-419c-936c-32178b49a757" containerID="717765310b305275bd300101c05169849a08bb507c408b75025fd65e13403971" exitCode=0
Dec 09 10:13:58 crc kubenswrapper[5002]: I1209 10:13:58.454219 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d2pdw" event={"ID":"646295cd-b0c3-419c-936c-32178b49a757","Type":"ContainerDied","Data":"717765310b305275bd300101c05169849a08bb507c408b75025fd65e13403971"}
Dec 09 10:13:58 crc kubenswrapper[5002]: I1209 10:13:58.730472 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbppkz"
Dec 09 10:13:58 crc kubenswrapper[5002]: I1209 10:13:58.857174 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/90bbec9d-d897-4556-845c-249b0d52c202-bundle\") pod \"90bbec9d-d897-4556-845c-249b0d52c202\" (UID: \"90bbec9d-d897-4556-845c-249b0d52c202\") "
Dec 09 10:13:58 crc kubenswrapper[5002]: I1209 10:13:58.857303 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mb466\" (UniqueName: \"kubernetes.io/projected/90bbec9d-d897-4556-845c-249b0d52c202-kube-api-access-mb466\") pod \"90bbec9d-d897-4556-845c-249b0d52c202\" (UID: \"90bbec9d-d897-4556-845c-249b0d52c202\") "
Dec 09 10:13:58 crc kubenswrapper[5002]: I1209 10:13:58.857349 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/90bbec9d-d897-4556-845c-249b0d52c202-util\") pod \"90bbec9d-d897-4556-845c-249b0d52c202\" (UID: \"90bbec9d-d897-4556-845c-249b0d52c202\") "
Dec 09 10:13:58 crc kubenswrapper[5002]: I1209 10:13:58.860167 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90bbec9d-d897-4556-845c-249b0d52c202-bundle" (OuterVolumeSpecName: "bundle") pod "90bbec9d-d897-4556-845c-249b0d52c202" (UID: "90bbec9d-d897-4556-845c-249b0d52c202"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 10:13:58 crc kubenswrapper[5002]: I1209 10:13:58.866554 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90bbec9d-d897-4556-845c-249b0d52c202-kube-api-access-mb466" (OuterVolumeSpecName: "kube-api-access-mb466") pod "90bbec9d-d897-4556-845c-249b0d52c202" (UID: "90bbec9d-d897-4556-845c-249b0d52c202"). InnerVolumeSpecName "kube-api-access-mb466". PluginName "kubernetes.io/projected", VolumeGidValue ""
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:13:58 crc kubenswrapper[5002]: I1209 10:13:58.958587 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mb466\" (UniqueName: \"kubernetes.io/projected/90bbec9d-d897-4556-845c-249b0d52c202-kube-api-access-mb466\") on node \"crc\" DevicePath \"\"" Dec 09 10:13:58 crc kubenswrapper[5002]: I1209 10:13:58.958635 5002 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/90bbec9d-d897-4556-845c-249b0d52c202-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:13:59 crc kubenswrapper[5002]: I1209 10:13:59.113316 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90bbec9d-d897-4556-845c-249b0d52c202-util" (OuterVolumeSpecName: "util") pod "90bbec9d-d897-4556-845c-249b0d52c202" (UID: "90bbec9d-d897-4556-845c-249b0d52c202"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:13:59 crc kubenswrapper[5002]: I1209 10:13:59.161463 5002 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/90bbec9d-d897-4556-845c-249b0d52c202-util\") on node \"crc\" DevicePath \"\"" Dec 09 10:13:59 crc kubenswrapper[5002]: I1209 10:13:59.463178 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbppkz" event={"ID":"90bbec9d-d897-4556-845c-249b0d52c202","Type":"ContainerDied","Data":"f149b81de074781efb3b886551908df64016037b25c7d22b2c40bd25a7bb6fb6"} Dec 09 10:13:59 crc kubenswrapper[5002]: I1209 10:13:59.463266 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f149b81de074781efb3b886551908df64016037b25c7d22b2c40bd25a7bb6fb6" Dec 09 10:13:59 crc kubenswrapper[5002]: I1209 10:13:59.463206 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbppkz" Dec 09 10:13:59 crc kubenswrapper[5002]: I1209 10:13:59.466272 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d2pdw" event={"ID":"646295cd-b0c3-419c-936c-32178b49a757","Type":"ContainerStarted","Data":"59a38b068796218e65b392dd9b6503f6b6424fddca821e8df2dc5432bd40e79d"} Dec 09 10:13:59 crc kubenswrapper[5002]: I1209 10:13:59.780692 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-d2pdw" podStartSLOduration=3.298418785 podStartE2EDuration="5.780677837s" podCreationTimestamp="2025-12-09 10:13:54 +0000 UTC" firstStartedPulling="2025-12-09 10:13:56.426729036 +0000 UTC m=+768.818780157" lastFinishedPulling="2025-12-09 10:13:58.908988128 +0000 UTC m=+771.301039209" observedRunningTime="2025-12-09 10:13:59.492183636 +0000 UTC m=+771.884234807" watchObservedRunningTime="2025-12-09 10:13:59.780677837 +0000 UTC m=+772.172728908" Dec 09 10:14:02 crc kubenswrapper[5002]: I1209 10:14:02.946746 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-dzhl5"] Dec 09 10:14:02 crc kubenswrapper[5002]: E1209 10:14:02.947978 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90bbec9d-d897-4556-845c-249b0d52c202" containerName="extract" Dec 09 10:14:02 crc kubenswrapper[5002]: I1209 10:14:02.948063 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="90bbec9d-d897-4556-845c-249b0d52c202" containerName="extract" Dec 09 10:14:02 crc kubenswrapper[5002]: E1209 10:14:02.948138 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90bbec9d-d897-4556-845c-249b0d52c202" containerName="pull" Dec 09 10:14:02 crc kubenswrapper[5002]: I1209 10:14:02.948199 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="90bbec9d-d897-4556-845c-249b0d52c202" containerName="pull" Dec 09 10:14:02 crc kubenswrapper[5002]: E1209 10:14:02.948268 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90bbec9d-d897-4556-845c-249b0d52c202" containerName="util" Dec 09 10:14:02 crc kubenswrapper[5002]: I1209 10:14:02.948347 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="90bbec9d-d897-4556-845c-249b0d52c202" containerName="util" Dec 09 10:14:02 crc kubenswrapper[5002]: I1209 10:14:02.948519 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="90bbec9d-d897-4556-845c-249b0d52c202" containerName="extract" Dec 09 10:14:02 crc kubenswrapper[5002]: I1209 10:14:02.949031 5002 util.go:30] "No sandbox for pod can be found. 
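The redhat-operators-d2pdw startup record above shows how podStartSLOduration relates to the other fields: the SLO figure excludes image-pull time, i.e. it is podStartE2EDuration minus the pull window (lastFinishedPulling minus firstStartedPulling). Checking the arithmetic with the monotonic m=+ offsets copied from that line:

    // slomath.go - reproduce the podStartSLOduration arithmetic for
    // redhat-operators-d2pdw using the values logged above.
    package main

    import "fmt"

    func main() {
        const (
            firstStartedPulling = 768.818780157 // m=+768.818780157, seconds
            lastFinishedPulling = 771.301039209 // m=+771.301039209, seconds
            e2e                 = 5.780677837   // podStartE2EDuration, seconds
        )
        slo := e2e - (lastFinishedPulling - firstStartedPulling)
        fmt.Printf("podStartSLOduration=%.9f\n", slo) // 3.298418785, as logged
    }

The earlier ovnkube-node record has firstStartedPulling and lastFinishedPulling at the zero time, so nothing is subtracted and its SLO duration equals its end-to-end duration (8.122092856s).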
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-dzhl5" Dec 09 10:14:02 crc kubenswrapper[5002]: I1209 10:14:02.951391 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-qvpvv" Dec 09 10:14:02 crc kubenswrapper[5002]: I1209 10:14:02.951591 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 09 10:14:02 crc kubenswrapper[5002]: I1209 10:14:02.951844 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 09 10:14:02 crc kubenswrapper[5002]: I1209 10:14:02.962199 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-dzhl5"] Dec 09 10:14:03 crc kubenswrapper[5002]: I1209 10:14:03.113868 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zzqq\" (UniqueName: \"kubernetes.io/projected/c1916146-c9a5-4a76-ba95-d677b5bc1263-kube-api-access-6zzqq\") pod \"nmstate-operator-5b5b58f5c8-dzhl5\" (UID: \"c1916146-c9a5-4a76-ba95-d677b5bc1263\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-dzhl5" Dec 09 10:14:03 crc kubenswrapper[5002]: I1209 10:14:03.214667 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zzqq\" (UniqueName: \"kubernetes.io/projected/c1916146-c9a5-4a76-ba95-d677b5bc1263-kube-api-access-6zzqq\") pod \"nmstate-operator-5b5b58f5c8-dzhl5\" (UID: \"c1916146-c9a5-4a76-ba95-d677b5bc1263\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-dzhl5" Dec 09 10:14:03 crc kubenswrapper[5002]: I1209 10:14:03.238610 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zzqq\" (UniqueName: \"kubernetes.io/projected/c1916146-c9a5-4a76-ba95-d677b5bc1263-kube-api-access-6zzqq\") pod \"nmstate-operator-5b5b58f5c8-dzhl5\" (UID: \"c1916146-c9a5-4a76-ba95-d677b5bc1263\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-dzhl5" Dec 09 10:14:03 crc kubenswrapper[5002]: I1209 10:14:03.264413 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-dzhl5" Dec 09 10:14:03 crc kubenswrapper[5002]: I1209 10:14:03.503124 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-dzhl5"] Dec 09 10:14:03 crc kubenswrapper[5002]: W1209 10:14:03.506516 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1916146_c9a5_4a76_ba95_d677b5bc1263.slice/crio-37e83b6b299e6137d0e0032fdf3a2a09157a5b489bc1c15223dd14d8b1605809 WatchSource:0}: Error finding container 37e83b6b299e6137d0e0032fdf3a2a09157a5b489bc1c15223dd14d8b1605809: Status 404 returned error can't find the container with id 37e83b6b299e6137d0e0032fdf3a2a09157a5b489bc1c15223dd14d8b1605809 Dec 09 10:14:04 crc kubenswrapper[5002]: I1209 10:14:04.499786 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-dzhl5" event={"ID":"c1916146-c9a5-4a76-ba95-d677b5bc1263","Type":"ContainerStarted","Data":"37e83b6b299e6137d0e0032fdf3a2a09157a5b489bc1c15223dd14d8b1605809"} Dec 09 10:14:05 crc kubenswrapper[5002]: I1209 10:14:05.052775 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-d2pdw" Dec 09 10:14:05 crc kubenswrapper[5002]: I1209 10:14:05.052883 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-d2pdw" Dec 09 10:14:06 crc kubenswrapper[5002]: I1209 10:14:06.122232 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-d2pdw" podUID="646295cd-b0c3-419c-936c-32178b49a757" containerName="registry-server" probeResult="failure" output=< Dec 09 10:14:06 crc kubenswrapper[5002]: timeout: failed to connect service ":50051" within 1s Dec 09 10:14:06 crc kubenswrapper[5002]: > Dec 09 10:14:07 crc kubenswrapper[5002]: I1209 10:14:07.525858 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-dzhl5" event={"ID":"c1916146-c9a5-4a76-ba95-d677b5bc1263","Type":"ContainerStarted","Data":"f1e2e07bc9a78304bbfd985cbb4858f2ed7d4f7e0043d4e1ef8c8f5b253ff003"} Dec 09 10:14:07 crc kubenswrapper[5002]: I1209 10:14:07.555683 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-dzhl5" podStartSLOduration=2.655158512 podStartE2EDuration="5.555665123s" podCreationTimestamp="2025-12-09 10:14:02 +0000 UTC" firstStartedPulling="2025-12-09 10:14:03.508974753 +0000 UTC m=+775.901025834" lastFinishedPulling="2025-12-09 10:14:06.409481364 +0000 UTC m=+778.801532445" observedRunningTime="2025-12-09 10:14:07.552885888 +0000 UTC m=+779.944936979" watchObservedRunningTime="2025-12-09 10:14:07.555665123 +0000 UTC m=+779.947716214" Dec 09 10:14:07 crc kubenswrapper[5002]: I1209 10:14:07.964435 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 10:14:07 crc kubenswrapper[5002]: I1209 10:14:07.964494 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.035523 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-xw282"] Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.036870 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-xw282" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.039264 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-np9qz" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.040188 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-crlfm"] Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.040864 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-crlfm" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.044000 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.062266 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-crlfm"] Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.069753 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-xw282"] Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.093749 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-q4qht"] Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.094557 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-q4qht" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.164271 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-qmxnt"] Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.165119 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-qmxnt" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.175181 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.175213 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.175359 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-hpq6s" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.180437 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f62b3fbf-c7ef-4477-a655-5432c8392db4-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-crlfm\" (UID: \"f62b3fbf-c7ef-4477-a655-5432c8392db4\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-crlfm" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.180471 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtkbt\" (UniqueName: \"kubernetes.io/projected/f62b3fbf-c7ef-4477-a655-5432c8392db4-kube-api-access-dtkbt\") pod \"nmstate-webhook-5f6d4c5ccb-crlfm\" (UID: \"f62b3fbf-c7ef-4477-a655-5432c8392db4\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-crlfm" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.180493 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vks8\" (UniqueName: \"kubernetes.io/projected/14984393-4074-4385-80a1-04fc38c0580f-kube-api-access-6vks8\") pod \"nmstate-metrics-7f946cbc9-xw282\" (UID: \"14984393-4074-4385-80a1-04fc38c0580f\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-xw282" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.187182 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-qmxnt"] Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.282340 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svggl\" (UniqueName: \"kubernetes.io/projected/802b1295-7ccc-4cf1-a241-045ff519eae4-kube-api-access-svggl\") pod \"nmstate-handler-q4qht\" (UID: \"802b1295-7ccc-4cf1-a241-045ff519eae4\") " pod="openshift-nmstate/nmstate-handler-q4qht" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.282411 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fcee10e4-739d-4559-a50f-3fbb1300ad78-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-qmxnt\" (UID: \"fcee10e4-739d-4559-a50f-3fbb1300ad78\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-qmxnt" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.282441 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-449d4\" (UniqueName: \"kubernetes.io/projected/fcee10e4-739d-4559-a50f-3fbb1300ad78-kube-api-access-449d4\") pod \"nmstate-console-plugin-7fbb5f6569-qmxnt\" (UID: \"fcee10e4-739d-4559-a50f-3fbb1300ad78\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-qmxnt" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.282472 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" 
(UniqueName: \"kubernetes.io/host-path/802b1295-7ccc-4cf1-a241-045ff519eae4-ovs-socket\") pod \"nmstate-handler-q4qht\" (UID: \"802b1295-7ccc-4cf1-a241-045ff519eae4\") " pod="openshift-nmstate/nmstate-handler-q4qht" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.282538 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/802b1295-7ccc-4cf1-a241-045ff519eae4-nmstate-lock\") pod \"nmstate-handler-q4qht\" (UID: \"802b1295-7ccc-4cf1-a241-045ff519eae4\") " pod="openshift-nmstate/nmstate-handler-q4qht" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.282566 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fcee10e4-739d-4559-a50f-3fbb1300ad78-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-qmxnt\" (UID: \"fcee10e4-739d-4559-a50f-3fbb1300ad78\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-qmxnt" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.282590 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/802b1295-7ccc-4cf1-a241-045ff519eae4-dbus-socket\") pod \"nmstate-handler-q4qht\" (UID: \"802b1295-7ccc-4cf1-a241-045ff519eae4\") " pod="openshift-nmstate/nmstate-handler-q4qht" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.282651 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f62b3fbf-c7ef-4477-a655-5432c8392db4-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-crlfm\" (UID: \"f62b3fbf-c7ef-4477-a655-5432c8392db4\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-crlfm" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.282679 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtkbt\" (UniqueName: \"kubernetes.io/projected/f62b3fbf-c7ef-4477-a655-5432c8392db4-kube-api-access-dtkbt\") pod \"nmstate-webhook-5f6d4c5ccb-crlfm\" (UID: \"f62b3fbf-c7ef-4477-a655-5432c8392db4\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-crlfm" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.282710 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vks8\" (UniqueName: \"kubernetes.io/projected/14984393-4074-4385-80a1-04fc38c0580f-kube-api-access-6vks8\") pod \"nmstate-metrics-7f946cbc9-xw282\" (UID: \"14984393-4074-4385-80a1-04fc38c0580f\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-xw282" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.299052 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f62b3fbf-c7ef-4477-a655-5432c8392db4-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-crlfm\" (UID: \"f62b3fbf-c7ef-4477-a655-5432c8392db4\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-crlfm" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.305289 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtkbt\" (UniqueName: \"kubernetes.io/projected/f62b3fbf-c7ef-4477-a655-5432c8392db4-kube-api-access-dtkbt\") pod \"nmstate-webhook-5f6d4c5ccb-crlfm\" (UID: \"f62b3fbf-c7ef-4477-a655-5432c8392db4\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-crlfm" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.306240 
5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vks8\" (UniqueName: \"kubernetes.io/projected/14984393-4074-4385-80a1-04fc38c0580f-kube-api-access-6vks8\") pod \"nmstate-metrics-7f946cbc9-xw282\" (UID: \"14984393-4074-4385-80a1-04fc38c0580f\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-xw282" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.346867 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5547b9b954-cp759"] Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.347635 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5547b9b954-cp759" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.353896 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-xw282" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.363497 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-crlfm" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.375379 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5547b9b954-cp759"] Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.384758 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fcee10e4-739d-4559-a50f-3fbb1300ad78-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-qmxnt\" (UID: \"fcee10e4-739d-4559-a50f-3fbb1300ad78\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-qmxnt" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.384838 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-449d4\" (UniqueName: \"kubernetes.io/projected/fcee10e4-739d-4559-a50f-3fbb1300ad78-kube-api-access-449d4\") pod \"nmstate-console-plugin-7fbb5f6569-qmxnt\" (UID: \"fcee10e4-739d-4559-a50f-3fbb1300ad78\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-qmxnt" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.384883 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/802b1295-7ccc-4cf1-a241-045ff519eae4-ovs-socket\") pod \"nmstate-handler-q4qht\" (UID: \"802b1295-7ccc-4cf1-a241-045ff519eae4\") " pod="openshift-nmstate/nmstate-handler-q4qht" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.384941 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/802b1295-7ccc-4cf1-a241-045ff519eae4-nmstate-lock\") pod \"nmstate-handler-q4qht\" (UID: \"802b1295-7ccc-4cf1-a241-045ff519eae4\") " pod="openshift-nmstate/nmstate-handler-q4qht" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.384971 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fcee10e4-739d-4559-a50f-3fbb1300ad78-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-qmxnt\" (UID: \"fcee10e4-739d-4559-a50f-3fbb1300ad78\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-qmxnt" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.385009 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/802b1295-7ccc-4cf1-a241-045ff519eae4-dbus-socket\") pod 
\"nmstate-handler-q4qht\" (UID: \"802b1295-7ccc-4cf1-a241-045ff519eae4\") " pod="openshift-nmstate/nmstate-handler-q4qht" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.385063 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svggl\" (UniqueName: \"kubernetes.io/projected/802b1295-7ccc-4cf1-a241-045ff519eae4-kube-api-access-svggl\") pod \"nmstate-handler-q4qht\" (UID: \"802b1295-7ccc-4cf1-a241-045ff519eae4\") " pod="openshift-nmstate/nmstate-handler-q4qht" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.385210 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/802b1295-7ccc-4cf1-a241-045ff519eae4-ovs-socket\") pod \"nmstate-handler-q4qht\" (UID: \"802b1295-7ccc-4cf1-a241-045ff519eae4\") " pod="openshift-nmstate/nmstate-handler-q4qht" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.385458 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/802b1295-7ccc-4cf1-a241-045ff519eae4-nmstate-lock\") pod \"nmstate-handler-q4qht\" (UID: \"802b1295-7ccc-4cf1-a241-045ff519eae4\") " pod="openshift-nmstate/nmstate-handler-q4qht" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.386617 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fcee10e4-739d-4559-a50f-3fbb1300ad78-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-qmxnt\" (UID: \"fcee10e4-739d-4559-a50f-3fbb1300ad78\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-qmxnt" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.394347 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fcee10e4-739d-4559-a50f-3fbb1300ad78-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-qmxnt\" (UID: \"fcee10e4-739d-4559-a50f-3fbb1300ad78\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-qmxnt" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.394672 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/802b1295-7ccc-4cf1-a241-045ff519eae4-dbus-socket\") pod \"nmstate-handler-q4qht\" (UID: \"802b1295-7ccc-4cf1-a241-045ff519eae4\") " pod="openshift-nmstate/nmstate-handler-q4qht" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.407335 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svggl\" (UniqueName: \"kubernetes.io/projected/802b1295-7ccc-4cf1-a241-045ff519eae4-kube-api-access-svggl\") pod \"nmstate-handler-q4qht\" (UID: \"802b1295-7ccc-4cf1-a241-045ff519eae4\") " pod="openshift-nmstate/nmstate-handler-q4qht" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.409137 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-q4qht" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.410245 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-449d4\" (UniqueName: \"kubernetes.io/projected/fcee10e4-739d-4559-a50f-3fbb1300ad78-kube-api-access-449d4\") pod \"nmstate-console-plugin-7fbb5f6569-qmxnt\" (UID: \"fcee10e4-739d-4559-a50f-3fbb1300ad78\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-qmxnt" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.485258 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-qmxnt" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.489290 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fb2fd211-c4c9-40d9-a7bc-af4792b9e058-oauth-serving-cert\") pod \"console-5547b9b954-cp759\" (UID: \"fb2fd211-c4c9-40d9-a7bc-af4792b9e058\") " pod="openshift-console/console-5547b9b954-cp759" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.489324 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb2fd211-c4c9-40d9-a7bc-af4792b9e058-trusted-ca-bundle\") pod \"console-5547b9b954-cp759\" (UID: \"fb2fd211-c4c9-40d9-a7bc-af4792b9e058\") " pod="openshift-console/console-5547b9b954-cp759" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.489373 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fb2fd211-c4c9-40d9-a7bc-af4792b9e058-console-config\") pod \"console-5547b9b954-cp759\" (UID: \"fb2fd211-c4c9-40d9-a7bc-af4792b9e058\") " pod="openshift-console/console-5547b9b954-cp759" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.489410 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzwvn\" (UniqueName: \"kubernetes.io/projected/fb2fd211-c4c9-40d9-a7bc-af4792b9e058-kube-api-access-jzwvn\") pod \"console-5547b9b954-cp759\" (UID: \"fb2fd211-c4c9-40d9-a7bc-af4792b9e058\") " pod="openshift-console/console-5547b9b954-cp759" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.489490 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fb2fd211-c4c9-40d9-a7bc-af4792b9e058-service-ca\") pod \"console-5547b9b954-cp759\" (UID: \"fb2fd211-c4c9-40d9-a7bc-af4792b9e058\") " pod="openshift-console/console-5547b9b954-cp759" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.489547 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fb2fd211-c4c9-40d9-a7bc-af4792b9e058-console-oauth-config\") pod \"console-5547b9b954-cp759\" (UID: \"fb2fd211-c4c9-40d9-a7bc-af4792b9e058\") " pod="openshift-console/console-5547b9b954-cp759" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.489586 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fb2fd211-c4c9-40d9-a7bc-af4792b9e058-console-serving-cert\") pod \"console-5547b9b954-cp759\" (UID: \"fb2fd211-c4c9-40d9-a7bc-af4792b9e058\") " 
pod="openshift-console/console-5547b9b954-cp759" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.565887 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-q4qht" event={"ID":"802b1295-7ccc-4cf1-a241-045ff519eae4","Type":"ContainerStarted","Data":"f1ffec9a53d084dc5cc5f3820d77bffd30ec9e1fd2f1c2f3a178ba3965618ae7"} Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.590610 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fb2fd211-c4c9-40d9-a7bc-af4792b9e058-console-config\") pod \"console-5547b9b954-cp759\" (UID: \"fb2fd211-c4c9-40d9-a7bc-af4792b9e058\") " pod="openshift-console/console-5547b9b954-cp759" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.590657 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzwvn\" (UniqueName: \"kubernetes.io/projected/fb2fd211-c4c9-40d9-a7bc-af4792b9e058-kube-api-access-jzwvn\") pod \"console-5547b9b954-cp759\" (UID: \"fb2fd211-c4c9-40d9-a7bc-af4792b9e058\") " pod="openshift-console/console-5547b9b954-cp759" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.590681 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fb2fd211-c4c9-40d9-a7bc-af4792b9e058-service-ca\") pod \"console-5547b9b954-cp759\" (UID: \"fb2fd211-c4c9-40d9-a7bc-af4792b9e058\") " pod="openshift-console/console-5547b9b954-cp759" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.590704 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fb2fd211-c4c9-40d9-a7bc-af4792b9e058-console-oauth-config\") pod \"console-5547b9b954-cp759\" (UID: \"fb2fd211-c4c9-40d9-a7bc-af4792b9e058\") " pod="openshift-console/console-5547b9b954-cp759" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.590725 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fb2fd211-c4c9-40d9-a7bc-af4792b9e058-console-serving-cert\") pod \"console-5547b9b954-cp759\" (UID: \"fb2fd211-c4c9-40d9-a7bc-af4792b9e058\") " pod="openshift-console/console-5547b9b954-cp759" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.590742 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fb2fd211-c4c9-40d9-a7bc-af4792b9e058-oauth-serving-cert\") pod \"console-5547b9b954-cp759\" (UID: \"fb2fd211-c4c9-40d9-a7bc-af4792b9e058\") " pod="openshift-console/console-5547b9b954-cp759" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.590756 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb2fd211-c4c9-40d9-a7bc-af4792b9e058-trusted-ca-bundle\") pod \"console-5547b9b954-cp759\" (UID: \"fb2fd211-c4c9-40d9-a7bc-af4792b9e058\") " pod="openshift-console/console-5547b9b954-cp759" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.591585 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fb2fd211-c4c9-40d9-a7bc-af4792b9e058-console-config\") pod \"console-5547b9b954-cp759\" (UID: \"fb2fd211-c4c9-40d9-a7bc-af4792b9e058\") " pod="openshift-console/console-5547b9b954-cp759" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 
10:14:13.593840 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb2fd211-c4c9-40d9-a7bc-af4792b9e058-trusted-ca-bundle\") pod \"console-5547b9b954-cp759\" (UID: \"fb2fd211-c4c9-40d9-a7bc-af4792b9e058\") " pod="openshift-console/console-5547b9b954-cp759" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.594016 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fb2fd211-c4c9-40d9-a7bc-af4792b9e058-service-ca\") pod \"console-5547b9b954-cp759\" (UID: \"fb2fd211-c4c9-40d9-a7bc-af4792b9e058\") " pod="openshift-console/console-5547b9b954-cp759" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.594442 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fb2fd211-c4c9-40d9-a7bc-af4792b9e058-console-oauth-config\") pod \"console-5547b9b954-cp759\" (UID: \"fb2fd211-c4c9-40d9-a7bc-af4792b9e058\") " pod="openshift-console/console-5547b9b954-cp759" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.594482 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fb2fd211-c4c9-40d9-a7bc-af4792b9e058-oauth-serving-cert\") pod \"console-5547b9b954-cp759\" (UID: \"fb2fd211-c4c9-40d9-a7bc-af4792b9e058\") " pod="openshift-console/console-5547b9b954-cp759" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.595230 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fb2fd211-c4c9-40d9-a7bc-af4792b9e058-console-serving-cert\") pod \"console-5547b9b954-cp759\" (UID: \"fb2fd211-c4c9-40d9-a7bc-af4792b9e058\") " pod="openshift-console/console-5547b9b954-cp759" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.606375 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzwvn\" (UniqueName: \"kubernetes.io/projected/fb2fd211-c4c9-40d9-a7bc-af4792b9e058-kube-api-access-jzwvn\") pod \"console-5547b9b954-cp759\" (UID: \"fb2fd211-c4c9-40d9-a7bc-af4792b9e058\") " pod="openshift-console/console-5547b9b954-cp759" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.660028 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-qmxnt"] Dec 09 10:14:13 crc kubenswrapper[5002]: W1209 10:14:13.664602 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcee10e4_739d_4559_a50f_3fbb1300ad78.slice/crio-94ac986a0de077660bc817dd67b1d3cdfcd7328b64f76b91c5def8dde79ad70f WatchSource:0}: Error finding container 94ac986a0de077660bc817dd67b1d3cdfcd7328b64f76b91c5def8dde79ad70f: Status 404 returned error can't find the container with id 94ac986a0de077660bc817dd67b1d3cdfcd7328b64f76b91c5def8dde79ad70f Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.665384 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5547b9b954-cp759" Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.781755 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-crlfm"] Dec 09 10:14:13 crc kubenswrapper[5002]: W1209 10:14:13.788982 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf62b3fbf_c7ef_4477_a655_5432c8392db4.slice/crio-bbb3f3a4bcebe3555ffe6b03000626546e8691edf8cbcd933f09450d6bf5100a WatchSource:0}: Error finding container bbb3f3a4bcebe3555ffe6b03000626546e8691edf8cbcd933f09450d6bf5100a: Status 404 returned error can't find the container with id bbb3f3a4bcebe3555ffe6b03000626546e8691edf8cbcd933f09450d6bf5100a Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.824202 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-xw282"] Dec 09 10:14:13 crc kubenswrapper[5002]: W1209 10:14:13.833521 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14984393_4074_4385_80a1_04fc38c0580f.slice/crio-2c3baec39c966f164069945d2dd6eccf90f157ce8bba04de2450046e8d798215 WatchSource:0}: Error finding container 2c3baec39c966f164069945d2dd6eccf90f157ce8bba04de2450046e8d798215: Status 404 returned error can't find the container with id 2c3baec39c966f164069945d2dd6eccf90f157ce8bba04de2450046e8d798215 Dec 09 10:14:13 crc kubenswrapper[5002]: I1209 10:14:13.854207 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5547b9b954-cp759"] Dec 09 10:14:13 crc kubenswrapper[5002]: W1209 10:14:13.860113 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb2fd211_c4c9_40d9_a7bc_af4792b9e058.slice/crio-b20d8273a1597bb2fbe104531b0d79dfa4408173198f296e77a8ed7f7ac4e19f WatchSource:0}: Error finding container b20d8273a1597bb2fbe104531b0d79dfa4408173198f296e77a8ed7f7ac4e19f: Status 404 returned error can't find the container with id b20d8273a1597bb2fbe104531b0d79dfa4408173198f296e77a8ed7f7ac4e19f Dec 09 10:14:14 crc kubenswrapper[5002]: I1209 10:14:14.575156 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5547b9b954-cp759" event={"ID":"fb2fd211-c4c9-40d9-a7bc-af4792b9e058","Type":"ContainerStarted","Data":"72043ad19c6503f1a8d78ed6f311bcbfa305effcd42851a49b6d56fe6e387ef6"} Dec 09 10:14:14 crc kubenswrapper[5002]: I1209 10:14:14.575557 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5547b9b954-cp759" event={"ID":"fb2fd211-c4c9-40d9-a7bc-af4792b9e058","Type":"ContainerStarted","Data":"b20d8273a1597bb2fbe104531b0d79dfa4408173198f296e77a8ed7f7ac4e19f"} Dec 09 10:14:14 crc kubenswrapper[5002]: I1209 10:14:14.576599 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-xw282" event={"ID":"14984393-4074-4385-80a1-04fc38c0580f","Type":"ContainerStarted","Data":"2c3baec39c966f164069945d2dd6eccf90f157ce8bba04de2450046e8d798215"} Dec 09 10:14:14 crc kubenswrapper[5002]: I1209 10:14:14.577880 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-crlfm" event={"ID":"f62b3fbf-c7ef-4477-a655-5432c8392db4","Type":"ContainerStarted","Data":"bbb3f3a4bcebe3555ffe6b03000626546e8691edf8cbcd933f09450d6bf5100a"} Dec 09 10:14:14 crc kubenswrapper[5002]: I1209 
10:14:14.579196 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-qmxnt" event={"ID":"fcee10e4-739d-4559-a50f-3fbb1300ad78","Type":"ContainerStarted","Data":"94ac986a0de077660bc817dd67b1d3cdfcd7328b64f76b91c5def8dde79ad70f"} Dec 09 10:14:14 crc kubenswrapper[5002]: I1209 10:14:14.593581 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5547b9b954-cp759" podStartSLOduration=1.5935661140000001 podStartE2EDuration="1.593566114s" podCreationTimestamp="2025-12-09 10:14:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:14:14.593079491 +0000 UTC m=+786.985130582" watchObservedRunningTime="2025-12-09 10:14:14.593566114 +0000 UTC m=+786.985617195" Dec 09 10:14:15 crc kubenswrapper[5002]: I1209 10:14:15.106623 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-d2pdw" Dec 09 10:14:15 crc kubenswrapper[5002]: I1209 10:14:15.158602 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-d2pdw" Dec 09 10:14:17 crc kubenswrapper[5002]: I1209 10:14:17.491096 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d2pdw"] Dec 09 10:14:17 crc kubenswrapper[5002]: I1209 10:14:17.492007 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-d2pdw" podUID="646295cd-b0c3-419c-936c-32178b49a757" containerName="registry-server" containerID="cri-o://59a38b068796218e65b392dd9b6503f6b6424fddca821e8df2dc5432bd40e79d" gracePeriod=2 Dec 09 10:14:17 crc kubenswrapper[5002]: I1209 10:14:17.597429 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-xw282" event={"ID":"14984393-4074-4385-80a1-04fc38c0580f","Type":"ContainerStarted","Data":"c5cf1bd071b29c76bf11f837f747d5bce58eee9d61e4e82bb32f22f7f0280179"} Dec 09 10:14:17 crc kubenswrapper[5002]: I1209 10:14:17.602595 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-crlfm" event={"ID":"f62b3fbf-c7ef-4477-a655-5432c8392db4","Type":"ContainerStarted","Data":"1378bebe73129c7435deb6b08dd1366ca985e987cbcefad1c307aa8e35f786e8"} Dec 09 10:14:17 crc kubenswrapper[5002]: I1209 10:14:17.602807 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-crlfm" Dec 09 10:14:17 crc kubenswrapper[5002]: I1209 10:14:17.604846 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-q4qht" event={"ID":"802b1295-7ccc-4cf1-a241-045ff519eae4","Type":"ContainerStarted","Data":"f033e2a69fa079c5177592219535c9df10b1e3c832d77bdbdf54ff2a157a8385"} Dec 09 10:14:17 crc kubenswrapper[5002]: I1209 10:14:17.605855 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-q4qht" Dec 09 10:14:17 crc kubenswrapper[5002]: I1209 10:14:17.606293 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-qmxnt" event={"ID":"fcee10e4-739d-4559-a50f-3fbb1300ad78","Type":"ContainerStarted","Data":"b380606d39cfdf8f00d0d2dcdde15df55c62f8f48948b128ef69680cfaae302c"} Dec 09 10:14:17 crc kubenswrapper[5002]: I1209 10:14:17.635163 5002 pod_startup_latency_tracker.go:104] 
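
The SyncLoop DELETE above starts a graceful stop: the runtime delivers SIGTERM and escalates to SIGKILL only once the grace period expires (2s for this catalog pod; the console pod killed at 10:14:49 below gets 15s). A container stays inside such a short window simply by exiting promptly on SIGTERM; a minimal sketch of that handler (illustrative, not the registry-server's actual code):

    package main

    import (
        "fmt"
        "os"
        "os/signal"
        "syscall"
    )

    func main() {
        stop := make(chan os.Signal, 1)
        signal.Notify(stop, syscall.SIGTERM, syscall.SIGINT)
        fmt.Println("serving; waiting for SIGTERM")
        <-stop
        // Flush state and close listeners here, well inside gracePeriod,
        // so the runtime never has to escalate to SIGKILL.
        fmt.Println("terminating cleanly")
    }
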
"Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-crlfm" podStartSLOduration=1.927021941 podStartE2EDuration="4.635132664s" podCreationTimestamp="2025-12-09 10:14:13 +0000 UTC" firstStartedPulling="2025-12-09 10:14:13.792973058 +0000 UTC m=+786.185024169" lastFinishedPulling="2025-12-09 10:14:16.501083811 +0000 UTC m=+788.893134892" observedRunningTime="2025-12-09 10:14:17.628085124 +0000 UTC m=+790.020136235" watchObservedRunningTime="2025-12-09 10:14:17.635132664 +0000 UTC m=+790.027183775" Dec 09 10:14:17 crc kubenswrapper[5002]: I1209 10:14:17.658006 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-q4qht" podStartSLOduration=1.575182518 podStartE2EDuration="4.65797617s" podCreationTimestamp="2025-12-09 10:14:13 +0000 UTC" firstStartedPulling="2025-12-09 10:14:13.42979728 +0000 UTC m=+785.821848351" lastFinishedPulling="2025-12-09 10:14:16.512590892 +0000 UTC m=+788.904642003" observedRunningTime="2025-12-09 10:14:17.649209404 +0000 UTC m=+790.041260525" watchObservedRunningTime="2025-12-09 10:14:17.65797617 +0000 UTC m=+790.050027291" Dec 09 10:14:17 crc kubenswrapper[5002]: I1209 10:14:17.672093 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-qmxnt" podStartSLOduration=1.8567377569999999 podStartE2EDuration="4.67206443s" podCreationTimestamp="2025-12-09 10:14:13 +0000 UTC" firstStartedPulling="2025-12-09 10:14:13.672646065 +0000 UTC m=+786.064697156" lastFinishedPulling="2025-12-09 10:14:16.487972748 +0000 UTC m=+788.880023829" observedRunningTime="2025-12-09 10:14:17.671918746 +0000 UTC m=+790.063969867" watchObservedRunningTime="2025-12-09 10:14:17.67206443 +0000 UTC m=+790.064115551" Dec 09 10:14:17 crc kubenswrapper[5002]: I1209 10:14:17.981141 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d2pdw" Dec 09 10:14:18 crc kubenswrapper[5002]: I1209 10:14:18.176038 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/646295cd-b0c3-419c-936c-32178b49a757-catalog-content\") pod \"646295cd-b0c3-419c-936c-32178b49a757\" (UID: \"646295cd-b0c3-419c-936c-32178b49a757\") " Dec 09 10:14:18 crc kubenswrapper[5002]: I1209 10:14:18.176119 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4pgt\" (UniqueName: \"kubernetes.io/projected/646295cd-b0c3-419c-936c-32178b49a757-kube-api-access-z4pgt\") pod \"646295cd-b0c3-419c-936c-32178b49a757\" (UID: \"646295cd-b0c3-419c-936c-32178b49a757\") " Dec 09 10:14:18 crc kubenswrapper[5002]: I1209 10:14:18.176186 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/646295cd-b0c3-419c-936c-32178b49a757-utilities\") pod \"646295cd-b0c3-419c-936c-32178b49a757\" (UID: \"646295cd-b0c3-419c-936c-32178b49a757\") " Dec 09 10:14:18 crc kubenswrapper[5002]: I1209 10:14:18.177093 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/646295cd-b0c3-419c-936c-32178b49a757-utilities" (OuterVolumeSpecName: "utilities") pod "646295cd-b0c3-419c-936c-32178b49a757" (UID: "646295cd-b0c3-419c-936c-32178b49a757"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:14:18 crc kubenswrapper[5002]: I1209 10:14:18.200907 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/646295cd-b0c3-419c-936c-32178b49a757-kube-api-access-z4pgt" (OuterVolumeSpecName: "kube-api-access-z4pgt") pod "646295cd-b0c3-419c-936c-32178b49a757" (UID: "646295cd-b0c3-419c-936c-32178b49a757"). InnerVolumeSpecName "kube-api-access-z4pgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:14:18 crc kubenswrapper[5002]: I1209 10:14:18.277846 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4pgt\" (UniqueName: \"kubernetes.io/projected/646295cd-b0c3-419c-936c-32178b49a757-kube-api-access-z4pgt\") on node \"crc\" DevicePath \"\"" Dec 09 10:14:18 crc kubenswrapper[5002]: I1209 10:14:18.277882 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/646295cd-b0c3-419c-936c-32178b49a757-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 10:14:18 crc kubenswrapper[5002]: I1209 10:14:18.300401 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/646295cd-b0c3-419c-936c-32178b49a757-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "646295cd-b0c3-419c-936c-32178b49a757" (UID: "646295cd-b0c3-419c-936c-32178b49a757"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:14:18 crc kubenswrapper[5002]: I1209 10:14:18.379056 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/646295cd-b0c3-419c-936c-32178b49a757-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 10:14:18 crc kubenswrapper[5002]: I1209 10:14:18.616284 5002 generic.go:334] "Generic (PLEG): container finished" podID="646295cd-b0c3-419c-936c-32178b49a757" containerID="59a38b068796218e65b392dd9b6503f6b6424fddca821e8df2dc5432bd40e79d" exitCode=0 Dec 09 10:14:18 crc kubenswrapper[5002]: I1209 10:14:18.616869 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d2pdw" Dec 09 10:14:18 crc kubenswrapper[5002]: I1209 10:14:18.618893 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d2pdw" event={"ID":"646295cd-b0c3-419c-936c-32178b49a757","Type":"ContainerDied","Data":"59a38b068796218e65b392dd9b6503f6b6424fddca821e8df2dc5432bd40e79d"} Dec 09 10:14:18 crc kubenswrapper[5002]: I1209 10:14:18.618951 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d2pdw" event={"ID":"646295cd-b0c3-419c-936c-32178b49a757","Type":"ContainerDied","Data":"f1350a018652a45d46e284bdc9d34532804cd3976ab91232f2480c31ad727178"} Dec 09 10:14:18 crc kubenswrapper[5002]: I1209 10:14:18.618975 5002 scope.go:117] "RemoveContainer" containerID="59a38b068796218e65b392dd9b6503f6b6424fddca821e8df2dc5432bd40e79d" Dec 09 10:14:18 crc kubenswrapper[5002]: I1209 10:14:18.644055 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d2pdw"] Dec 09 10:14:18 crc kubenswrapper[5002]: I1209 10:14:18.647419 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-d2pdw"] Dec 09 10:14:18 crc kubenswrapper[5002]: I1209 10:14:18.652941 5002 scope.go:117] "RemoveContainer" containerID="717765310b305275bd300101c05169849a08bb507c408b75025fd65e13403971" Dec 09 10:14:18 crc kubenswrapper[5002]: I1209 10:14:18.668172 5002 scope.go:117] "RemoveContainer" containerID="55639e7be73d39043f1a6c39193f9cb278c9e323ba72cb5d4a4ba62eececc767" Dec 09 10:14:18 crc kubenswrapper[5002]: I1209 10:14:18.711302 5002 scope.go:117] "RemoveContainer" containerID="59a38b068796218e65b392dd9b6503f6b6424fddca821e8df2dc5432bd40e79d" Dec 09 10:14:18 crc kubenswrapper[5002]: E1209 10:14:18.711780 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59a38b068796218e65b392dd9b6503f6b6424fddca821e8df2dc5432bd40e79d\": container with ID starting with 59a38b068796218e65b392dd9b6503f6b6424fddca821e8df2dc5432bd40e79d not found: ID does not exist" containerID="59a38b068796218e65b392dd9b6503f6b6424fddca821e8df2dc5432bd40e79d" Dec 09 10:14:18 crc kubenswrapper[5002]: I1209 10:14:18.711861 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59a38b068796218e65b392dd9b6503f6b6424fddca821e8df2dc5432bd40e79d"} err="failed to get container status \"59a38b068796218e65b392dd9b6503f6b6424fddca821e8df2dc5432bd40e79d\": rpc error: code = NotFound desc = could not find container \"59a38b068796218e65b392dd9b6503f6b6424fddca821e8df2dc5432bd40e79d\": container with ID starting with 59a38b068796218e65b392dd9b6503f6b6424fddca821e8df2dc5432bd40e79d not found: ID does not exist" Dec 09 10:14:18 crc kubenswrapper[5002]: I1209 10:14:18.711901 5002 scope.go:117] "RemoveContainer" containerID="717765310b305275bd300101c05169849a08bb507c408b75025fd65e13403971" Dec 09 10:14:18 crc kubenswrapper[5002]: E1209 10:14:18.712305 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"717765310b305275bd300101c05169849a08bb507c408b75025fd65e13403971\": container with ID starting with 717765310b305275bd300101c05169849a08bb507c408b75025fd65e13403971 not found: ID does not exist" containerID="717765310b305275bd300101c05169849a08bb507c408b75025fd65e13403971" Dec 09 10:14:18 crc kubenswrapper[5002]: I1209 10:14:18.712346 5002 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"717765310b305275bd300101c05169849a08bb507c408b75025fd65e13403971"} err="failed to get container status \"717765310b305275bd300101c05169849a08bb507c408b75025fd65e13403971\": rpc error: code = NotFound desc = could not find container \"717765310b305275bd300101c05169849a08bb507c408b75025fd65e13403971\": container with ID starting with 717765310b305275bd300101c05169849a08bb507c408b75025fd65e13403971 not found: ID does not exist" Dec 09 10:14:18 crc kubenswrapper[5002]: I1209 10:14:18.712373 5002 scope.go:117] "RemoveContainer" containerID="55639e7be73d39043f1a6c39193f9cb278c9e323ba72cb5d4a4ba62eececc767" Dec 09 10:14:18 crc kubenswrapper[5002]: E1209 10:14:18.712671 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55639e7be73d39043f1a6c39193f9cb278c9e323ba72cb5d4a4ba62eececc767\": container with ID starting with 55639e7be73d39043f1a6c39193f9cb278c9e323ba72cb5d4a4ba62eececc767 not found: ID does not exist" containerID="55639e7be73d39043f1a6c39193f9cb278c9e323ba72cb5d4a4ba62eececc767" Dec 09 10:14:18 crc kubenswrapper[5002]: I1209 10:14:18.712709 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55639e7be73d39043f1a6c39193f9cb278c9e323ba72cb5d4a4ba62eececc767"} err="failed to get container status \"55639e7be73d39043f1a6c39193f9cb278c9e323ba72cb5d4a4ba62eececc767\": rpc error: code = NotFound desc = could not find container \"55639e7be73d39043f1a6c39193f9cb278c9e323ba72cb5d4a4ba62eececc767\": container with ID starting with 55639e7be73d39043f1a6c39193f9cb278c9e323ba72cb5d4a4ba62eececc767 not found: ID does not exist" Dec 09 10:14:19 crc kubenswrapper[5002]: I1209 10:14:19.626184 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-xw282" event={"ID":"14984393-4074-4385-80a1-04fc38c0580f","Type":"ContainerStarted","Data":"e6691456cc4594e1501800e9c2d3e93afd47d50eb8896d362435c6d71c6e24bf"} Dec 09 10:14:19 crc kubenswrapper[5002]: I1209 10:14:19.655294 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-xw282" podStartSLOduration=1.77484728 podStartE2EDuration="6.655264917s" podCreationTimestamp="2025-12-09 10:14:13 +0000 UTC" firstStartedPulling="2025-12-09 10:14:13.835869684 +0000 UTC m=+786.227920765" lastFinishedPulling="2025-12-09 10:14:18.716287321 +0000 UTC m=+791.108338402" observedRunningTime="2025-12-09 10:14:19.648545456 +0000 UTC m=+792.040596567" watchObservedRunningTime="2025-12-09 10:14:19.655264917 +0000 UTC m=+792.047316028" Dec 09 10:14:20 crc kubenswrapper[5002]: I1209 10:14:20.114402 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="646295cd-b0c3-419c-936c-32178b49a757" path="/var/lib/kubelet/pods/646295cd-b0c3-419c-936c-32178b49a757/volumes" Dec 09 10:14:23 crc kubenswrapper[5002]: I1209 10:14:23.444118 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-q4qht" Dec 09 10:14:23 crc kubenswrapper[5002]: I1209 10:14:23.666647 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5547b9b954-cp759" Dec 09 10:14:23 crc kubenswrapper[5002]: I1209 10:14:23.667036 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5547b9b954-cp759" Dec 09 10:14:23 crc 
Dec 09 10:14:19 crc kubenswrapper[5002]: I1209 10:14:19.626184 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-xw282" event={"ID":"14984393-4074-4385-80a1-04fc38c0580f","Type":"ContainerStarted","Data":"e6691456cc4594e1501800e9c2d3e93afd47d50eb8896d362435c6d71c6e24bf"}
Dec 09 10:14:19 crc kubenswrapper[5002]: I1209 10:14:19.655294 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-xw282" podStartSLOduration=1.77484728 podStartE2EDuration="6.655264917s" podCreationTimestamp="2025-12-09 10:14:13 +0000 UTC" firstStartedPulling="2025-12-09 10:14:13.835869684 +0000 UTC m=+786.227920765" lastFinishedPulling="2025-12-09 10:14:18.716287321 +0000 UTC m=+791.108338402" observedRunningTime="2025-12-09 10:14:19.648545456 +0000 UTC m=+792.040596567" watchObservedRunningTime="2025-12-09 10:14:19.655264917 +0000 UTC m=+792.047316028"
Dec 09 10:14:20 crc kubenswrapper[5002]: I1209 10:14:20.114402 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="646295cd-b0c3-419c-936c-32178b49a757" path="/var/lib/kubelet/pods/646295cd-b0c3-419c-936c-32178b49a757/volumes"
Dec 09 10:14:23 crc kubenswrapper[5002]: I1209 10:14:23.444118 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-q4qht"
Dec 09 10:14:23 crc kubenswrapper[5002]: I1209 10:14:23.666647 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5547b9b954-cp759"
Dec 09 10:14:23 crc kubenswrapper[5002]: I1209 10:14:23.667036 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5547b9b954-cp759"
Dec 09 10:14:23 crc kubenswrapper[5002]: I1209 10:14:23.672949 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5547b9b954-cp759"
Dec 09 10:14:24 crc kubenswrapper[5002]: I1209 10:14:24.683283 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5547b9b954-cp759"
Dec 09 10:14:24 crc kubenswrapper[5002]: I1209 10:14:24.745246 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-n967v"]
Dec 09 10:14:33 crc kubenswrapper[5002]: I1209 10:14:33.371204 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-crlfm"
Dec 09 10:14:37 crc kubenswrapper[5002]: I1209 10:14:37.965635 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 09 10:14:37 crc kubenswrapper[5002]: I1209 10:14:37.966282 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 09 10:14:46 crc kubenswrapper[5002]: I1209 10:14:46.954398 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ctlb2"]
Dec 09 10:14:46 crc kubenswrapper[5002]: E1209 10:14:46.955264 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="646295cd-b0c3-419c-936c-32178b49a757" containerName="extract-utilities"
Dec 09 10:14:46 crc kubenswrapper[5002]: I1209 10:14:46.955284 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="646295cd-b0c3-419c-936c-32178b49a757" containerName="extract-utilities"
Dec 09 10:14:46 crc kubenswrapper[5002]: E1209 10:14:46.955300 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="646295cd-b0c3-419c-936c-32178b49a757" containerName="extract-content"
Dec 09 10:14:46 crc kubenswrapper[5002]: I1209 10:14:46.955308 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="646295cd-b0c3-419c-936c-32178b49a757" containerName="extract-content"
Dec 09 10:14:46 crc kubenswrapper[5002]: E1209 10:14:46.955330 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="646295cd-b0c3-419c-936c-32178b49a757" containerName="registry-server"
Dec 09 10:14:46 crc kubenswrapper[5002]: I1209 10:14:46.955339 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="646295cd-b0c3-419c-936c-32178b49a757" containerName="registry-server"
Dec 09 10:14:46 crc kubenswrapper[5002]: I1209 10:14:46.955451 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="646295cd-b0c3-419c-936c-32178b49a757" containerName="registry-server"
Dec 09 10:14:46 crc kubenswrapper[5002]: I1209 10:14:46.956377 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ctlb2"
Dec 09 10:14:46 crc kubenswrapper[5002]: I1209 10:14:46.963971 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Dec 09 10:14:46 crc kubenswrapper[5002]: I1209 10:14:46.964024 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ctlb2"]
Dec 09 10:14:47 crc kubenswrapper[5002]: I1209 10:14:47.102963 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ctlb2\" (UID: \"4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ctlb2"
Dec 09 10:14:47 crc kubenswrapper[5002]: I1209 10:14:47.103028 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ctlb2\" (UID: \"4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ctlb2"
Dec 09 10:14:47 crc kubenswrapper[5002]: I1209 10:14:47.103130 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8jbb\" (UniqueName: \"kubernetes.io/projected/4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c-kube-api-access-d8jbb\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ctlb2\" (UID: \"4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ctlb2"
Dec 09 10:14:47 crc kubenswrapper[5002]: I1209 10:14:47.204728 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8jbb\" (UniqueName: \"kubernetes.io/projected/4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c-kube-api-access-d8jbb\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ctlb2\" (UID: \"4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ctlb2"
Dec 09 10:14:47 crc kubenswrapper[5002]: I1209 10:14:47.204916 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ctlb2\" (UID: \"4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ctlb2"
Dec 09 10:14:47 crc kubenswrapper[5002]: I1209 10:14:47.205004 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ctlb2\" (UID: \"4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ctlb2"
Dec 09 10:14:47 crc kubenswrapper[5002]: I1209 10:14:47.205625 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ctlb2\" (UID: \"4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ctlb2"
Dec 09 10:14:47 crc kubenswrapper[5002]: I1209 10:14:47.206213 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ctlb2\" (UID: \"4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ctlb2"
Dec 09 10:14:47 crc kubenswrapper[5002]: I1209 10:14:47.236599 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8jbb\" (UniqueName: \"kubernetes.io/projected/4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c-kube-api-access-d8jbb\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ctlb2\" (UID: \"4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ctlb2"
Dec 09 10:14:47 crc kubenswrapper[5002]: I1209 10:14:47.282613 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ctlb2"
Dec 09 10:14:47 crc kubenswrapper[5002]: I1209 10:14:47.568474 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ctlb2"]
Dec 09 10:14:47 crc kubenswrapper[5002]: W1209 10:14:47.578070 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ed47aa8_6b9f_4f89_b6b1_c6c87f8d567c.slice/crio-e3453cac64d4bb4b86efede6e2224b23d66d59cbec7b15df30955d8fd481437a WatchSource:0}: Error finding container e3453cac64d4bb4b86efede6e2224b23d66d59cbec7b15df30955d8fd481437a: Status 404 returned error can't find the container with id e3453cac64d4bb4b86efede6e2224b23d66d59cbec7b15df30955d8fd481437a
Dec 09 10:14:47 crc kubenswrapper[5002]: I1209 10:14:47.829777 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ctlb2" event={"ID":"4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c","Type":"ContainerStarted","Data":"e3453cac64d4bb4b86efede6e2224b23d66d59cbec7b15df30955d8fd481437a"}
Dec 09 10:14:48 crc kubenswrapper[5002]: I1209 10:14:48.848196 5002 generic.go:334] "Generic (PLEG): container finished" podID="4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c" containerID="df8499f2fb96dbc652b5bbae0f4887f60a5b8cb4253a10aa6e3289eaacc7f082" exitCode=0
Dec 09 10:14:48 crc kubenswrapper[5002]: I1209 10:14:48.848371 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ctlb2" event={"ID":"4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c","Type":"ContainerDied","Data":"df8499f2fb96dbc652b5bbae0f4887f60a5b8cb4253a10aa6e3289eaacc7f082"}
Dec 09 10:14:49 crc kubenswrapper[5002]: I1209 10:14:49.786329 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-n967v" podUID="95a7b196-74b4-4d67-a0e0-3e4b92e468ed" containerName="console" containerID="cri-o://e9caf65bafab7832ef59432a756beb5c823c0967b6ee7935eb0976ea6de5c45a" gracePeriod=15
Dec 09 10:14:50 crc kubenswrapper[5002]: I1209 10:14:50.273327 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-n967v_95a7b196-74b4-4d67-a0e0-3e4b92e468ed/console/0.log"
Dec 09 10:14:50 crc kubenswrapper[5002]: I1209 10:14:50.273663 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-n967v"
Dec 09 10:14:50 crc kubenswrapper[5002]: I1209 10:14:50.348931 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/95a7b196-74b4-4d67-a0e0-3e4b92e468ed-service-ca\") pod \"95a7b196-74b4-4d67-a0e0-3e4b92e468ed\" (UID: \"95a7b196-74b4-4d67-a0e0-3e4b92e468ed\") "
Dec 09 10:14:50 crc kubenswrapper[5002]: I1209 10:14:50.348980 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95a7b196-74b4-4d67-a0e0-3e4b92e468ed-trusted-ca-bundle\") pod \"95a7b196-74b4-4d67-a0e0-3e4b92e468ed\" (UID: \"95a7b196-74b4-4d67-a0e0-3e4b92e468ed\") "
Dec 09 10:14:50 crc kubenswrapper[5002]: I1209 10:14:50.349009 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwzvz\" (UniqueName: \"kubernetes.io/projected/95a7b196-74b4-4d67-a0e0-3e4b92e468ed-kube-api-access-nwzvz\") pod \"95a7b196-74b4-4d67-a0e0-3e4b92e468ed\" (UID: \"95a7b196-74b4-4d67-a0e0-3e4b92e468ed\") "
Dec 09 10:14:50 crc kubenswrapper[5002]: I1209 10:14:50.349033 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/95a7b196-74b4-4d67-a0e0-3e4b92e468ed-oauth-serving-cert\") pod \"95a7b196-74b4-4d67-a0e0-3e4b92e468ed\" (UID: \"95a7b196-74b4-4d67-a0e0-3e4b92e468ed\") "
Dec 09 10:14:50 crc kubenswrapper[5002]: I1209 10:14:50.349084 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/95a7b196-74b4-4d67-a0e0-3e4b92e468ed-console-serving-cert\") pod \"95a7b196-74b4-4d67-a0e0-3e4b92e468ed\" (UID: \"95a7b196-74b4-4d67-a0e0-3e4b92e468ed\") "
Dec 09 10:14:50 crc kubenswrapper[5002]: I1209 10:14:50.349103 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/95a7b196-74b4-4d67-a0e0-3e4b92e468ed-console-config\") pod \"95a7b196-74b4-4d67-a0e0-3e4b92e468ed\" (UID: \"95a7b196-74b4-4d67-a0e0-3e4b92e468ed\") "
Dec 09 10:14:50 crc kubenswrapper[5002]: I1209 10:14:50.349135 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/95a7b196-74b4-4d67-a0e0-3e4b92e468ed-console-oauth-config\") pod \"95a7b196-74b4-4d67-a0e0-3e4b92e468ed\" (UID: \"95a7b196-74b4-4d67-a0e0-3e4b92e468ed\") "
Dec 09 10:14:50 crc kubenswrapper[5002]: I1209 10:14:50.350189 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95a7b196-74b4-4d67-a0e0-3e4b92e468ed-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "95a7b196-74b4-4d67-a0e0-3e4b92e468ed" (UID: "95a7b196-74b4-4d67-a0e0-3e4b92e468ed"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 10:14:50 crc kubenswrapper[5002]: I1209 10:14:50.350205 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95a7b196-74b4-4d67-a0e0-3e4b92e468ed-console-config" (OuterVolumeSpecName: "console-config") pod "95a7b196-74b4-4d67-a0e0-3e4b92e468ed" (UID: "95a7b196-74b4-4d67-a0e0-3e4b92e468ed"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 10:14:50 crc kubenswrapper[5002]: I1209 10:14:50.350319 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95a7b196-74b4-4d67-a0e0-3e4b92e468ed-service-ca" (OuterVolumeSpecName: "service-ca") pod "95a7b196-74b4-4d67-a0e0-3e4b92e468ed" (UID: "95a7b196-74b4-4d67-a0e0-3e4b92e468ed"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 10:14:50 crc kubenswrapper[5002]: I1209 10:14:50.350657 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95a7b196-74b4-4d67-a0e0-3e4b92e468ed-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "95a7b196-74b4-4d67-a0e0-3e4b92e468ed" (UID: "95a7b196-74b4-4d67-a0e0-3e4b92e468ed"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 10:14:50 crc kubenswrapper[5002]: I1209 10:14:50.355631 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95a7b196-74b4-4d67-a0e0-3e4b92e468ed-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "95a7b196-74b4-4d67-a0e0-3e4b92e468ed" (UID: "95a7b196-74b4-4d67-a0e0-3e4b92e468ed"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 10:14:50 crc kubenswrapper[5002]: I1209 10:14:50.355689 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95a7b196-74b4-4d67-a0e0-3e4b92e468ed-kube-api-access-nwzvz" (OuterVolumeSpecName: "kube-api-access-nwzvz") pod "95a7b196-74b4-4d67-a0e0-3e4b92e468ed" (UID: "95a7b196-74b4-4d67-a0e0-3e4b92e468ed"). InnerVolumeSpecName "kube-api-access-nwzvz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 10:14:50 crc kubenswrapper[5002]: I1209 10:14:50.357113 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95a7b196-74b4-4d67-a0e0-3e4b92e468ed-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "95a7b196-74b4-4d67-a0e0-3e4b92e468ed" (UID: "95a7b196-74b4-4d67-a0e0-3e4b92e468ed"). InnerVolumeSpecName "console-serving-cert".
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:14:50 crc kubenswrapper[5002]: I1209 10:14:50.450473 5002 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95a7b196-74b4-4d67-a0e0-3e4b92e468ed-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:14:50 crc kubenswrapper[5002]: I1209 10:14:50.450513 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwzvz\" (UniqueName: \"kubernetes.io/projected/95a7b196-74b4-4d67-a0e0-3e4b92e468ed-kube-api-access-nwzvz\") on node \"crc\" DevicePath \"\"" Dec 09 10:14:50 crc kubenswrapper[5002]: I1209 10:14:50.450528 5002 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/95a7b196-74b4-4d67-a0e0-3e4b92e468ed-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 10:14:50 crc kubenswrapper[5002]: I1209 10:14:50.450539 5002 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/95a7b196-74b4-4d67-a0e0-3e4b92e468ed-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 10:14:50 crc kubenswrapper[5002]: I1209 10:14:50.450551 5002 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/95a7b196-74b4-4d67-a0e0-3e4b92e468ed-console-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:14:50 crc kubenswrapper[5002]: I1209 10:14:50.450562 5002 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/95a7b196-74b4-4d67-a0e0-3e4b92e468ed-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:14:50 crc kubenswrapper[5002]: I1209 10:14:50.450573 5002 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/95a7b196-74b4-4d67-a0e0-3e4b92e468ed-service-ca\") on node \"crc\" DevicePath \"\"" Dec 09 10:14:50 crc kubenswrapper[5002]: I1209 10:14:50.862754 5002 generic.go:334] "Generic (PLEG): container finished" podID="4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c" containerID="7a21f58f6fcb2e202aade94e02feb0f783d4c44318f8751532b1a737dbbeb9b9" exitCode=0 Dec 09 10:14:50 crc kubenswrapper[5002]: I1209 10:14:50.862839 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ctlb2" event={"ID":"4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c","Type":"ContainerDied","Data":"7a21f58f6fcb2e202aade94e02feb0f783d4c44318f8751532b1a737dbbeb9b9"} Dec 09 10:14:50 crc kubenswrapper[5002]: I1209 10:14:50.864370 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-n967v_95a7b196-74b4-4d67-a0e0-3e4b92e468ed/console/0.log" Dec 09 10:14:50 crc kubenswrapper[5002]: I1209 10:14:50.864405 5002 generic.go:334] "Generic (PLEG): container finished" podID="95a7b196-74b4-4d67-a0e0-3e4b92e468ed" containerID="e9caf65bafab7832ef59432a756beb5c823c0967b6ee7935eb0976ea6de5c45a" exitCode=2 Dec 09 10:14:50 crc kubenswrapper[5002]: I1209 10:14:50.864427 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-n967v" event={"ID":"95a7b196-74b4-4d67-a0e0-3e4b92e468ed","Type":"ContainerDied","Data":"e9caf65bafab7832ef59432a756beb5c823c0967b6ee7935eb0976ea6de5c45a"} Dec 09 10:14:50 crc kubenswrapper[5002]: I1209 10:14:50.864445 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-n967v" 
event={"ID":"95a7b196-74b4-4d67-a0e0-3e4b92e468ed","Type":"ContainerDied","Data":"09059a35ce6aa88786547afb643646c7fe5bec07def4529d5f5ece5863215bd5"} Dec 09 10:14:50 crc kubenswrapper[5002]: I1209 10:14:50.864445 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-n967v" Dec 09 10:14:50 crc kubenswrapper[5002]: I1209 10:14:50.864499 5002 scope.go:117] "RemoveContainer" containerID="e9caf65bafab7832ef59432a756beb5c823c0967b6ee7935eb0976ea6de5c45a" Dec 09 10:14:50 crc kubenswrapper[5002]: I1209 10:14:50.944756 5002 scope.go:117] "RemoveContainer" containerID="e9caf65bafab7832ef59432a756beb5c823c0967b6ee7935eb0976ea6de5c45a" Dec 09 10:14:50 crc kubenswrapper[5002]: E1209 10:14:50.945114 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9caf65bafab7832ef59432a756beb5c823c0967b6ee7935eb0976ea6de5c45a\": container with ID starting with e9caf65bafab7832ef59432a756beb5c823c0967b6ee7935eb0976ea6de5c45a not found: ID does not exist" containerID="e9caf65bafab7832ef59432a756beb5c823c0967b6ee7935eb0976ea6de5c45a" Dec 09 10:14:50 crc kubenswrapper[5002]: I1209 10:14:50.945143 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9caf65bafab7832ef59432a756beb5c823c0967b6ee7935eb0976ea6de5c45a"} err="failed to get container status \"e9caf65bafab7832ef59432a756beb5c823c0967b6ee7935eb0976ea6de5c45a\": rpc error: code = NotFound desc = could not find container \"e9caf65bafab7832ef59432a756beb5c823c0967b6ee7935eb0976ea6de5c45a\": container with ID starting with e9caf65bafab7832ef59432a756beb5c823c0967b6ee7935eb0976ea6de5c45a not found: ID does not exist" Dec 09 10:14:50 crc kubenswrapper[5002]: I1209 10:14:50.974626 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-n967v"] Dec 09 10:14:50 crc kubenswrapper[5002]: I1209 10:14:50.977764 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-n967v"] Dec 09 10:14:51 crc kubenswrapper[5002]: I1209 10:14:51.878585 5002 generic.go:334] "Generic (PLEG): container finished" podID="4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c" containerID="77a6cf85ec553e67c91a35c4d4e41c6158419da8150f6fd1cc6a994ac3e49029" exitCode=0 Dec 09 10:14:51 crc kubenswrapper[5002]: I1209 10:14:51.878648 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ctlb2" event={"ID":"4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c","Type":"ContainerDied","Data":"77a6cf85ec553e67c91a35c4d4e41c6158419da8150f6fd1cc6a994ac3e49029"} Dec 09 10:14:52 crc kubenswrapper[5002]: I1209 10:14:52.071441 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95a7b196-74b4-4d67-a0e0-3e4b92e468ed" path="/var/lib/kubelet/pods/95a7b196-74b4-4d67-a0e0-3e4b92e468ed/volumes" Dec 09 10:14:53 crc kubenswrapper[5002]: I1209 10:14:53.167531 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ctlb2" Dec 09 10:14:53 crc kubenswrapper[5002]: I1209 10:14:53.194650 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c-util\") pod \"4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c\" (UID: \"4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c\") " Dec 09 10:14:53 crc kubenswrapper[5002]: I1209 10:14:53.194771 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c-bundle\") pod \"4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c\" (UID: \"4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c\") " Dec 09 10:14:53 crc kubenswrapper[5002]: I1209 10:14:53.194865 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8jbb\" (UniqueName: \"kubernetes.io/projected/4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c-kube-api-access-d8jbb\") pod \"4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c\" (UID: \"4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c\") " Dec 09 10:14:53 crc kubenswrapper[5002]: I1209 10:14:53.196922 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c-bundle" (OuterVolumeSpecName: "bundle") pod "4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c" (UID: "4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:14:53 crc kubenswrapper[5002]: I1209 10:14:53.200327 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c-kube-api-access-d8jbb" (OuterVolumeSpecName: "kube-api-access-d8jbb") pod "4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c" (UID: "4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c"). InnerVolumeSpecName "kube-api-access-d8jbb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:14:53 crc kubenswrapper[5002]: I1209 10:14:53.216645 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c-util" (OuterVolumeSpecName: "util") pod "4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c" (UID: "4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:14:53 crc kubenswrapper[5002]: I1209 10:14:53.296226 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8jbb\" (UniqueName: \"kubernetes.io/projected/4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c-kube-api-access-d8jbb\") on node \"crc\" DevicePath \"\"" Dec 09 10:14:53 crc kubenswrapper[5002]: I1209 10:14:53.296273 5002 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c-util\") on node \"crc\" DevicePath \"\"" Dec 09 10:14:53 crc kubenswrapper[5002]: I1209 10:14:53.296289 5002 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:14:53 crc kubenswrapper[5002]: I1209 10:14:53.892626 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ctlb2" event={"ID":"4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c","Type":"ContainerDied","Data":"e3453cac64d4bb4b86efede6e2224b23d66d59cbec7b15df30955d8fd481437a"} Dec 09 10:14:53 crc kubenswrapper[5002]: I1209 10:14:53.892693 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3453cac64d4bb4b86efede6e2224b23d66d59cbec7b15df30955d8fd481437a" Dec 09 10:14:53 crc kubenswrapper[5002]: I1209 10:14:53.892789 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ctlb2" Dec 09 10:15:00 crc kubenswrapper[5002]: I1209 10:15:00.162197 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421255-9bs7v"] Dec 09 10:15:00 crc kubenswrapper[5002]: E1209 10:15:00.163039 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c" containerName="extract" Dec 09 10:15:00 crc kubenswrapper[5002]: I1209 10:15:00.163059 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c" containerName="extract" Dec 09 10:15:00 crc kubenswrapper[5002]: E1209 10:15:00.163070 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c" containerName="pull" Dec 09 10:15:00 crc kubenswrapper[5002]: I1209 10:15:00.163077 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c" containerName="pull" Dec 09 10:15:00 crc kubenswrapper[5002]: E1209 10:15:00.163084 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95a7b196-74b4-4d67-a0e0-3e4b92e468ed" containerName="console" Dec 09 10:15:00 crc kubenswrapper[5002]: I1209 10:15:00.163091 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="95a7b196-74b4-4d67-a0e0-3e4b92e468ed" containerName="console" Dec 09 10:15:00 crc kubenswrapper[5002]: E1209 10:15:00.163130 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c" containerName="util" Dec 09 10:15:00 crc kubenswrapper[5002]: I1209 10:15:00.163138 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c" containerName="util" Dec 09 10:15:00 crc kubenswrapper[5002]: I1209 10:15:00.163252 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c" containerName="extract" Dec 
09 10:15:00 crc kubenswrapper[5002]: I1209 10:15:00.163266 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="95a7b196-74b4-4d67-a0e0-3e4b92e468ed" containerName="console" Dec 09 10:15:00 crc kubenswrapper[5002]: I1209 10:15:00.163715 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421255-9bs7v" Dec 09 10:15:00 crc kubenswrapper[5002]: I1209 10:15:00.166756 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 09 10:15:00 crc kubenswrapper[5002]: I1209 10:15:00.168676 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 09 10:15:00 crc kubenswrapper[5002]: I1209 10:15:00.180863 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421255-9bs7v"] Dec 09 10:15:00 crc kubenswrapper[5002]: I1209 10:15:00.281711 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/93168b2c-7da4-41aa-911a-3501ac4931e6-secret-volume\") pod \"collect-profiles-29421255-9bs7v\" (UID: \"93168b2c-7da4-41aa-911a-3501ac4931e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421255-9bs7v" Dec 09 10:15:00 crc kubenswrapper[5002]: I1209 10:15:00.281758 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c66rd\" (UniqueName: \"kubernetes.io/projected/93168b2c-7da4-41aa-911a-3501ac4931e6-kube-api-access-c66rd\") pod \"collect-profiles-29421255-9bs7v\" (UID: \"93168b2c-7da4-41aa-911a-3501ac4931e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421255-9bs7v" Dec 09 10:15:00 crc kubenswrapper[5002]: I1209 10:15:00.281807 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/93168b2c-7da4-41aa-911a-3501ac4931e6-config-volume\") pod \"collect-profiles-29421255-9bs7v\" (UID: \"93168b2c-7da4-41aa-911a-3501ac4931e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421255-9bs7v" Dec 09 10:15:00 crc kubenswrapper[5002]: I1209 10:15:00.382562 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/93168b2c-7da4-41aa-911a-3501ac4931e6-secret-volume\") pod \"collect-profiles-29421255-9bs7v\" (UID: \"93168b2c-7da4-41aa-911a-3501ac4931e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421255-9bs7v" Dec 09 10:15:00 crc kubenswrapper[5002]: I1209 10:15:00.382617 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c66rd\" (UniqueName: \"kubernetes.io/projected/93168b2c-7da4-41aa-911a-3501ac4931e6-kube-api-access-c66rd\") pod \"collect-profiles-29421255-9bs7v\" (UID: \"93168b2c-7da4-41aa-911a-3501ac4931e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421255-9bs7v" Dec 09 10:15:00 crc kubenswrapper[5002]: I1209 10:15:00.382673 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/93168b2c-7da4-41aa-911a-3501ac4931e6-config-volume\") pod \"collect-profiles-29421255-9bs7v\" (UID: \"93168b2c-7da4-41aa-911a-3501ac4931e6\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29421255-9bs7v" Dec 09 10:15:00 crc kubenswrapper[5002]: I1209 10:15:00.383734 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/93168b2c-7da4-41aa-911a-3501ac4931e6-config-volume\") pod \"collect-profiles-29421255-9bs7v\" (UID: \"93168b2c-7da4-41aa-911a-3501ac4931e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421255-9bs7v" Dec 09 10:15:00 crc kubenswrapper[5002]: I1209 10:15:00.388584 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/93168b2c-7da4-41aa-911a-3501ac4931e6-secret-volume\") pod \"collect-profiles-29421255-9bs7v\" (UID: \"93168b2c-7da4-41aa-911a-3501ac4931e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421255-9bs7v" Dec 09 10:15:00 crc kubenswrapper[5002]: I1209 10:15:00.398907 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c66rd\" (UniqueName: \"kubernetes.io/projected/93168b2c-7da4-41aa-911a-3501ac4931e6-kube-api-access-c66rd\") pod \"collect-profiles-29421255-9bs7v\" (UID: \"93168b2c-7da4-41aa-911a-3501ac4931e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421255-9bs7v" Dec 09 10:15:00 crc kubenswrapper[5002]: I1209 10:15:00.481730 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421255-9bs7v" Dec 09 10:15:00 crc kubenswrapper[5002]: I1209 10:15:00.705506 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421255-9bs7v"] Dec 09 10:15:00 crc kubenswrapper[5002]: I1209 10:15:00.932342 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421255-9bs7v" event={"ID":"93168b2c-7da4-41aa-911a-3501ac4931e6","Type":"ContainerStarted","Data":"9c557a866dd6f766e68014badca9ed55533ce9360c966a957fdbab95e6fe7edc"} Dec 09 10:15:00 crc kubenswrapper[5002]: I1209 10:15:00.932725 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421255-9bs7v" event={"ID":"93168b2c-7da4-41aa-911a-3501ac4931e6","Type":"ContainerStarted","Data":"20527e16118bd8a422546eac171118f36683d08c2cb4db97ad0b4f17eb609955"} Dec 09 10:15:00 crc kubenswrapper[5002]: I1209 10:15:00.948141 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29421255-9bs7v" podStartSLOduration=0.948119193 podStartE2EDuration="948.119193ms" podCreationTimestamp="2025-12-09 10:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:15:00.944594478 +0000 UTC m=+833.336645559" watchObservedRunningTime="2025-12-09 10:15:00.948119193 +0000 UTC m=+833.340170294" Dec 09 10:15:01 crc kubenswrapper[5002]: I1209 10:15:01.941402 5002 generic.go:334] "Generic (PLEG): container finished" podID="93168b2c-7da4-41aa-911a-3501ac4931e6" containerID="9c557a866dd6f766e68014badca9ed55533ce9360c966a957fdbab95e6fe7edc" exitCode=0 Dec 09 10:15:01 crc kubenswrapper[5002]: I1209 10:15:01.941450 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421255-9bs7v" 
event={"ID":"93168b2c-7da4-41aa-911a-3501ac4931e6","Type":"ContainerDied","Data":"9c557a866dd6f766e68014badca9ed55533ce9360c966a957fdbab95e6fe7edc"} Dec 09 10:15:02 crc kubenswrapper[5002]: I1209 10:15:02.317375 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-86c8bdfb4c-jrzfr"] Dec 09 10:15:02 crc kubenswrapper[5002]: I1209 10:15:02.318186 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-86c8bdfb4c-jrzfr" Dec 09 10:15:02 crc kubenswrapper[5002]: I1209 10:15:02.320508 5002 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 09 10:15:02 crc kubenswrapper[5002]: I1209 10:15:02.320588 5002 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 09 10:15:02 crc kubenswrapper[5002]: I1209 10:15:02.320758 5002 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-nbzgk" Dec 09 10:15:02 crc kubenswrapper[5002]: I1209 10:15:02.320844 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 09 10:15:02 crc kubenswrapper[5002]: I1209 10:15:02.321629 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 09 10:15:02 crc kubenswrapper[5002]: I1209 10:15:02.347239 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-86c8bdfb4c-jrzfr"] Dec 09 10:15:02 crc kubenswrapper[5002]: I1209 10:15:02.511476 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d3a83a4b-db5e-4bed-817a-64fefad2fb7c-webhook-cert\") pod \"metallb-operator-controller-manager-86c8bdfb4c-jrzfr\" (UID: \"d3a83a4b-db5e-4bed-817a-64fefad2fb7c\") " pod="metallb-system/metallb-operator-controller-manager-86c8bdfb4c-jrzfr" Dec 09 10:15:02 crc kubenswrapper[5002]: I1209 10:15:02.511569 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d3a83a4b-db5e-4bed-817a-64fefad2fb7c-apiservice-cert\") pod \"metallb-operator-controller-manager-86c8bdfb4c-jrzfr\" (UID: \"d3a83a4b-db5e-4bed-817a-64fefad2fb7c\") " pod="metallb-system/metallb-operator-controller-manager-86c8bdfb4c-jrzfr" Dec 09 10:15:02 crc kubenswrapper[5002]: I1209 10:15:02.511621 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crlfd\" (UniqueName: \"kubernetes.io/projected/d3a83a4b-db5e-4bed-817a-64fefad2fb7c-kube-api-access-crlfd\") pod \"metallb-operator-controller-manager-86c8bdfb4c-jrzfr\" (UID: \"d3a83a4b-db5e-4bed-817a-64fefad2fb7c\") " pod="metallb-system/metallb-operator-controller-manager-86c8bdfb4c-jrzfr" Dec 09 10:15:02 crc kubenswrapper[5002]: I1209 10:15:02.612928 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crlfd\" (UniqueName: \"kubernetes.io/projected/d3a83a4b-db5e-4bed-817a-64fefad2fb7c-kube-api-access-crlfd\") pod \"metallb-operator-controller-manager-86c8bdfb4c-jrzfr\" (UID: \"d3a83a4b-db5e-4bed-817a-64fefad2fb7c\") " pod="metallb-system/metallb-operator-controller-manager-86c8bdfb4c-jrzfr" Dec 09 10:15:02 crc kubenswrapper[5002]: I1209 
10:15:02.613043 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d3a83a4b-db5e-4bed-817a-64fefad2fb7c-webhook-cert\") pod \"metallb-operator-controller-manager-86c8bdfb4c-jrzfr\" (UID: \"d3a83a4b-db5e-4bed-817a-64fefad2fb7c\") " pod="metallb-system/metallb-operator-controller-manager-86c8bdfb4c-jrzfr" Dec 09 10:15:02 crc kubenswrapper[5002]: I1209 10:15:02.613087 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d3a83a4b-db5e-4bed-817a-64fefad2fb7c-apiservice-cert\") pod \"metallb-operator-controller-manager-86c8bdfb4c-jrzfr\" (UID: \"d3a83a4b-db5e-4bed-817a-64fefad2fb7c\") " pod="metallb-system/metallb-operator-controller-manager-86c8bdfb4c-jrzfr" Dec 09 10:15:02 crc kubenswrapper[5002]: I1209 10:15:02.620774 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d3a83a4b-db5e-4bed-817a-64fefad2fb7c-apiservice-cert\") pod \"metallb-operator-controller-manager-86c8bdfb4c-jrzfr\" (UID: \"d3a83a4b-db5e-4bed-817a-64fefad2fb7c\") " pod="metallb-system/metallb-operator-controller-manager-86c8bdfb4c-jrzfr" Dec 09 10:15:02 crc kubenswrapper[5002]: I1209 10:15:02.620987 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d3a83a4b-db5e-4bed-817a-64fefad2fb7c-webhook-cert\") pod \"metallb-operator-controller-manager-86c8bdfb4c-jrzfr\" (UID: \"d3a83a4b-db5e-4bed-817a-64fefad2fb7c\") " pod="metallb-system/metallb-operator-controller-manager-86c8bdfb4c-jrzfr" Dec 09 10:15:02 crc kubenswrapper[5002]: I1209 10:15:02.638032 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crlfd\" (UniqueName: \"kubernetes.io/projected/d3a83a4b-db5e-4bed-817a-64fefad2fb7c-kube-api-access-crlfd\") pod \"metallb-operator-controller-manager-86c8bdfb4c-jrzfr\" (UID: \"d3a83a4b-db5e-4bed-817a-64fefad2fb7c\") " pod="metallb-system/metallb-operator-controller-manager-86c8bdfb4c-jrzfr" Dec 09 10:15:02 crc kubenswrapper[5002]: I1209 10:15:02.679499 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-75dc995dc4-dczqv"] Dec 09 10:15:02 crc kubenswrapper[5002]: I1209 10:15:02.680430 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-75dc995dc4-dczqv" Dec 09 10:15:02 crc kubenswrapper[5002]: I1209 10:15:02.683250 5002 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 09 10:15:02 crc kubenswrapper[5002]: I1209 10:15:02.684100 5002 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 09 10:15:02 crc kubenswrapper[5002]: I1209 10:15:02.686014 5002 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-4xrhw" Dec 09 10:15:02 crc kubenswrapper[5002]: I1209 10:15:02.699216 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-75dc995dc4-dczqv"] Dec 09 10:15:02 crc kubenswrapper[5002]: I1209 10:15:02.714508 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/72f4f251-fe6b-4c0e-8d63-927a2d37eba5-webhook-cert\") pod \"metallb-operator-webhook-server-75dc995dc4-dczqv\" (UID: \"72f4f251-fe6b-4c0e-8d63-927a2d37eba5\") " pod="metallb-system/metallb-operator-webhook-server-75dc995dc4-dczqv" Dec 09 10:15:02 crc kubenswrapper[5002]: I1209 10:15:02.714568 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79f5t\" (UniqueName: \"kubernetes.io/projected/72f4f251-fe6b-4c0e-8d63-927a2d37eba5-kube-api-access-79f5t\") pod \"metallb-operator-webhook-server-75dc995dc4-dczqv\" (UID: \"72f4f251-fe6b-4c0e-8d63-927a2d37eba5\") " pod="metallb-system/metallb-operator-webhook-server-75dc995dc4-dczqv" Dec 09 10:15:02 crc kubenswrapper[5002]: I1209 10:15:02.714593 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/72f4f251-fe6b-4c0e-8d63-927a2d37eba5-apiservice-cert\") pod \"metallb-operator-webhook-server-75dc995dc4-dczqv\" (UID: \"72f4f251-fe6b-4c0e-8d63-927a2d37eba5\") " pod="metallb-system/metallb-operator-webhook-server-75dc995dc4-dczqv" Dec 09 10:15:02 crc kubenswrapper[5002]: I1209 10:15:02.815351 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/72f4f251-fe6b-4c0e-8d63-927a2d37eba5-webhook-cert\") pod \"metallb-operator-webhook-server-75dc995dc4-dczqv\" (UID: \"72f4f251-fe6b-4c0e-8d63-927a2d37eba5\") " pod="metallb-system/metallb-operator-webhook-server-75dc995dc4-dczqv" Dec 09 10:15:02 crc kubenswrapper[5002]: I1209 10:15:02.815417 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79f5t\" (UniqueName: \"kubernetes.io/projected/72f4f251-fe6b-4c0e-8d63-927a2d37eba5-kube-api-access-79f5t\") pod \"metallb-operator-webhook-server-75dc995dc4-dczqv\" (UID: \"72f4f251-fe6b-4c0e-8d63-927a2d37eba5\") " pod="metallb-system/metallb-operator-webhook-server-75dc995dc4-dczqv" Dec 09 10:15:02 crc kubenswrapper[5002]: I1209 10:15:02.815453 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/72f4f251-fe6b-4c0e-8d63-927a2d37eba5-apiservice-cert\") pod \"metallb-operator-webhook-server-75dc995dc4-dczqv\" (UID: \"72f4f251-fe6b-4c0e-8d63-927a2d37eba5\") " pod="metallb-system/metallb-operator-webhook-server-75dc995dc4-dczqv" Dec 09 10:15:02 crc kubenswrapper[5002]: I1209 
10:15:02.823632 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/72f4f251-fe6b-4c0e-8d63-927a2d37eba5-apiservice-cert\") pod \"metallb-operator-webhook-server-75dc995dc4-dczqv\" (UID: \"72f4f251-fe6b-4c0e-8d63-927a2d37eba5\") " pod="metallb-system/metallb-operator-webhook-server-75dc995dc4-dczqv" Dec 09 10:15:02 crc kubenswrapper[5002]: I1209 10:15:02.823652 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/72f4f251-fe6b-4c0e-8d63-927a2d37eba5-webhook-cert\") pod \"metallb-operator-webhook-server-75dc995dc4-dczqv\" (UID: \"72f4f251-fe6b-4c0e-8d63-927a2d37eba5\") " pod="metallb-system/metallb-operator-webhook-server-75dc995dc4-dczqv" Dec 09 10:15:02 crc kubenswrapper[5002]: I1209 10:15:02.837987 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79f5t\" (UniqueName: \"kubernetes.io/projected/72f4f251-fe6b-4c0e-8d63-927a2d37eba5-kube-api-access-79f5t\") pod \"metallb-operator-webhook-server-75dc995dc4-dczqv\" (UID: \"72f4f251-fe6b-4c0e-8d63-927a2d37eba5\") " pod="metallb-system/metallb-operator-webhook-server-75dc995dc4-dczqv" Dec 09 10:15:02 crc kubenswrapper[5002]: I1209 10:15:02.935982 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-86c8bdfb4c-jrzfr" Dec 09 10:15:02 crc kubenswrapper[5002]: I1209 10:15:02.997164 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-75dc995dc4-dczqv" Dec 09 10:15:03 crc kubenswrapper[5002]: I1209 10:15:03.225930 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421255-9bs7v" Dec 09 10:15:03 crc kubenswrapper[5002]: I1209 10:15:03.322858 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/93168b2c-7da4-41aa-911a-3501ac4931e6-config-volume\") pod \"93168b2c-7da4-41aa-911a-3501ac4931e6\" (UID: \"93168b2c-7da4-41aa-911a-3501ac4931e6\") " Dec 09 10:15:03 crc kubenswrapper[5002]: I1209 10:15:03.323004 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c66rd\" (UniqueName: \"kubernetes.io/projected/93168b2c-7da4-41aa-911a-3501ac4931e6-kube-api-access-c66rd\") pod \"93168b2c-7da4-41aa-911a-3501ac4931e6\" (UID: \"93168b2c-7da4-41aa-911a-3501ac4931e6\") " Dec 09 10:15:03 crc kubenswrapper[5002]: I1209 10:15:03.323039 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/93168b2c-7da4-41aa-911a-3501ac4931e6-secret-volume\") pod \"93168b2c-7da4-41aa-911a-3501ac4931e6\" (UID: \"93168b2c-7da4-41aa-911a-3501ac4931e6\") " Dec 09 10:15:03 crc kubenswrapper[5002]: I1209 10:15:03.323885 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93168b2c-7da4-41aa-911a-3501ac4931e6-config-volume" (OuterVolumeSpecName: "config-volume") pod "93168b2c-7da4-41aa-911a-3501ac4931e6" (UID: "93168b2c-7da4-41aa-911a-3501ac4931e6"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:15:03 crc kubenswrapper[5002]: I1209 10:15:03.328652 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93168b2c-7da4-41aa-911a-3501ac4931e6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "93168b2c-7da4-41aa-911a-3501ac4931e6" (UID: "93168b2c-7da4-41aa-911a-3501ac4931e6"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:15:03 crc kubenswrapper[5002]: I1209 10:15:03.328956 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93168b2c-7da4-41aa-911a-3501ac4931e6-kube-api-access-c66rd" (OuterVolumeSpecName: "kube-api-access-c66rd") pod "93168b2c-7da4-41aa-911a-3501ac4931e6" (UID: "93168b2c-7da4-41aa-911a-3501ac4931e6"). InnerVolumeSpecName "kube-api-access-c66rd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:15:03 crc kubenswrapper[5002]: I1209 10:15:03.425453 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c66rd\" (UniqueName: \"kubernetes.io/projected/93168b2c-7da4-41aa-911a-3501ac4931e6-kube-api-access-c66rd\") on node \"crc\" DevicePath \"\"" Dec 09 10:15:03 crc kubenswrapper[5002]: I1209 10:15:03.426120 5002 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/93168b2c-7da4-41aa-911a-3501ac4931e6-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 09 10:15:03 crc kubenswrapper[5002]: I1209 10:15:03.426231 5002 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/93168b2c-7da4-41aa-911a-3501ac4931e6-config-volume\") on node \"crc\" DevicePath \"\"" Dec 09 10:15:03 crc kubenswrapper[5002]: I1209 10:15:03.483534 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-86c8bdfb4c-jrzfr"] Dec 09 10:15:03 crc kubenswrapper[5002]: W1209 10:15:03.488855 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3a83a4b_db5e_4bed_817a_64fefad2fb7c.slice/crio-ebc8c292e547c8477ba18ce4af88ce874b3a21d34bb90a8aaac5ff87a8a5cf33 WatchSource:0}: Error finding container ebc8c292e547c8477ba18ce4af88ce874b3a21d34bb90a8aaac5ff87a8a5cf33: Status 404 returned error can't find the container with id ebc8c292e547c8477ba18ce4af88ce874b3a21d34bb90a8aaac5ff87a8a5cf33 Dec 09 10:15:03 crc kubenswrapper[5002]: I1209 10:15:03.573783 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-75dc995dc4-dczqv"] Dec 09 10:15:03 crc kubenswrapper[5002]: W1209 10:15:03.575297 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72f4f251_fe6b_4c0e_8d63_927a2d37eba5.slice/crio-bfc199b690f33224721590a21615543367f9d4937f3e19ee39cbea8d0dca7333 WatchSource:0}: Error finding container bfc199b690f33224721590a21615543367f9d4937f3e19ee39cbea8d0dca7333: Status 404 returned error can't find the container with id bfc199b690f33224721590a21615543367f9d4937f3e19ee39cbea8d0dca7333 Dec 09 10:15:03 crc kubenswrapper[5002]: I1209 10:15:03.952275 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421255-9bs7v" 
event={"ID":"93168b2c-7da4-41aa-911a-3501ac4931e6","Type":"ContainerDied","Data":"20527e16118bd8a422546eac171118f36683d08c2cb4db97ad0b4f17eb609955"} Dec 09 10:15:03 crc kubenswrapper[5002]: I1209 10:15:03.952333 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20527e16118bd8a422546eac171118f36683d08c2cb4db97ad0b4f17eb609955" Dec 09 10:15:03 crc kubenswrapper[5002]: I1209 10:15:03.952301 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421255-9bs7v" Dec 09 10:15:03 crc kubenswrapper[5002]: I1209 10:15:03.953756 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-86c8bdfb4c-jrzfr" event={"ID":"d3a83a4b-db5e-4bed-817a-64fefad2fb7c","Type":"ContainerStarted","Data":"ebc8c292e547c8477ba18ce4af88ce874b3a21d34bb90a8aaac5ff87a8a5cf33"} Dec 09 10:15:03 crc kubenswrapper[5002]: I1209 10:15:03.955160 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-75dc995dc4-dczqv" event={"ID":"72f4f251-fe6b-4c0e-8d63-927a2d37eba5","Type":"ContainerStarted","Data":"bfc199b690f33224721590a21615543367f9d4937f3e19ee39cbea8d0dca7333"} Dec 09 10:15:07 crc kubenswrapper[5002]: I1209 10:15:07.964701 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 10:15:07 crc kubenswrapper[5002]: I1209 10:15:07.965058 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 10:15:07 crc kubenswrapper[5002]: I1209 10:15:07.965125 5002 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" Dec 09 10:15:07 crc kubenswrapper[5002]: I1209 10:15:07.965703 5002 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"637ff8a569cd1216521e8fa15ac9579d9708df3205b27a6dad02b958376b5a55"} pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 10:15:07 crc kubenswrapper[5002]: I1209 10:15:07.965750 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" containerID="cri-o://637ff8a569cd1216521e8fa15ac9579d9708df3205b27a6dad02b958376b5a55" gracePeriod=600 Dec 09 10:15:08 crc kubenswrapper[5002]: I1209 10:15:08.997719 5002 generic.go:334] "Generic (PLEG): container finished" podID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerID="637ff8a569cd1216521e8fa15ac9579d9708df3205b27a6dad02b958376b5a55" exitCode=0 Dec 09 10:15:08 crc kubenswrapper[5002]: I1209 10:15:08.997996 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" 
event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerDied","Data":"637ff8a569cd1216521e8fa15ac9579d9708df3205b27a6dad02b958376b5a55"} Dec 09 10:15:08 crc kubenswrapper[5002]: I1209 10:15:08.998233 5002 scope.go:117] "RemoveContainer" containerID="6e8ddc7938962efdbd7e068e9a049cef2b10bcb309a4f79600438289cac64211" Dec 09 10:15:10 crc kubenswrapper[5002]: I1209 10:15:10.006893 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerStarted","Data":"9580dfbfe31ac43f61d3b220c2620f364cbabb0180f5e1a555d93ea2015032be"} Dec 09 10:15:10 crc kubenswrapper[5002]: I1209 10:15:10.009496 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-86c8bdfb4c-jrzfr" event={"ID":"d3a83a4b-db5e-4bed-817a-64fefad2fb7c","Type":"ContainerStarted","Data":"07dfb2e7c390cc2675c6be7f82cebaa5889898e48229e6fb31cf8373b68ab74e"} Dec 09 10:15:10 crc kubenswrapper[5002]: I1209 10:15:10.009710 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-86c8bdfb4c-jrzfr" Dec 09 10:15:10 crc kubenswrapper[5002]: I1209 10:15:10.011105 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-75dc995dc4-dczqv" event={"ID":"72f4f251-fe6b-4c0e-8d63-927a2d37eba5","Type":"ContainerStarted","Data":"bc5a08854db0948c2eaba50521bec1f9ed97bfed2be3f7dc0ef56e654d975d1b"} Dec 09 10:15:10 crc kubenswrapper[5002]: I1209 10:15:10.011366 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-75dc995dc4-dczqv" Dec 09 10:15:10 crc kubenswrapper[5002]: I1209 10:15:10.085590 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-86c8bdfb4c-jrzfr" podStartSLOduration=2.790356171 podStartE2EDuration="8.085570217s" podCreationTimestamp="2025-12-09 10:15:02 +0000 UTC" firstStartedPulling="2025-12-09 10:15:03.491788604 +0000 UTC m=+835.883839685" lastFinishedPulling="2025-12-09 10:15:08.78700264 +0000 UTC m=+841.179053731" observedRunningTime="2025-12-09 10:15:10.081399534 +0000 UTC m=+842.473450635" watchObservedRunningTime="2025-12-09 10:15:10.085570217 +0000 UTC m=+842.477621298" Dec 09 10:15:10 crc kubenswrapper[5002]: I1209 10:15:10.086250 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-75dc995dc4-dczqv" podStartSLOduration=2.858891769 podStartE2EDuration="8.086244315s" podCreationTimestamp="2025-12-09 10:15:02 +0000 UTC" firstStartedPulling="2025-12-09 10:15:03.57806751 +0000 UTC m=+835.970118591" lastFinishedPulling="2025-12-09 10:15:08.805420046 +0000 UTC m=+841.197471137" observedRunningTime="2025-12-09 10:15:10.052116235 +0000 UTC m=+842.444167326" watchObservedRunningTime="2025-12-09 10:15:10.086244315 +0000 UTC m=+842.478295396" Dec 09 10:15:23 crc kubenswrapper[5002]: I1209 10:15:23.002654 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-75dc995dc4-dczqv" Dec 09 10:15:42 crc kubenswrapper[5002]: I1209 10:15:42.939791 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-86c8bdfb4c-jrzfr" Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.304322 5002 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vxlcz"] Dec 09 10:15:43 crc kubenswrapper[5002]: E1209 10:15:43.304590 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93168b2c-7da4-41aa-911a-3501ac4931e6" containerName="collect-profiles" Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.304606 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="93168b2c-7da4-41aa-911a-3501ac4931e6" containerName="collect-profiles" Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.304725 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="93168b2c-7da4-41aa-911a-3501ac4931e6" containerName="collect-profiles" Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.305622 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vxlcz" Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.319960 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vxlcz"] Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.399471 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/174ba3d1-ccf7-4886-998f-f1c125c1fb4b-utilities\") pod \"certified-operators-vxlcz\" (UID: \"174ba3d1-ccf7-4886-998f-f1c125c1fb4b\") " pod="openshift-marketplace/certified-operators-vxlcz" Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.399531 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6cr2\" (UniqueName: \"kubernetes.io/projected/174ba3d1-ccf7-4886-998f-f1c125c1fb4b-kube-api-access-m6cr2\") pod \"certified-operators-vxlcz\" (UID: \"174ba3d1-ccf7-4886-998f-f1c125c1fb4b\") " pod="openshift-marketplace/certified-operators-vxlcz" Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.399584 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/174ba3d1-ccf7-4886-998f-f1c125c1fb4b-catalog-content\") pod \"certified-operators-vxlcz\" (UID: \"174ba3d1-ccf7-4886-998f-f1c125c1fb4b\") " pod="openshift-marketplace/certified-operators-vxlcz" Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.500673 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6cr2\" (UniqueName: \"kubernetes.io/projected/174ba3d1-ccf7-4886-998f-f1c125c1fb4b-kube-api-access-m6cr2\") pod \"certified-operators-vxlcz\" (UID: \"174ba3d1-ccf7-4886-998f-f1c125c1fb4b\") " pod="openshift-marketplace/certified-operators-vxlcz" Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.500761 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/174ba3d1-ccf7-4886-998f-f1c125c1fb4b-catalog-content\") pod \"certified-operators-vxlcz\" (UID: \"174ba3d1-ccf7-4886-998f-f1c125c1fb4b\") " pod="openshift-marketplace/certified-operators-vxlcz" Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.500851 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/174ba3d1-ccf7-4886-998f-f1c125c1fb4b-utilities\") pod \"certified-operators-vxlcz\" (UID: \"174ba3d1-ccf7-4886-998f-f1c125c1fb4b\") " pod="openshift-marketplace/certified-operators-vxlcz" Dec 09 10:15:43 crc kubenswrapper[5002]: 
I1209 10:15:43.501255 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/174ba3d1-ccf7-4886-998f-f1c125c1fb4b-catalog-content\") pod \"certified-operators-vxlcz\" (UID: \"174ba3d1-ccf7-4886-998f-f1c125c1fb4b\") " pod="openshift-marketplace/certified-operators-vxlcz" Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.501376 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/174ba3d1-ccf7-4886-998f-f1c125c1fb4b-utilities\") pod \"certified-operators-vxlcz\" (UID: \"174ba3d1-ccf7-4886-998f-f1c125c1fb4b\") " pod="openshift-marketplace/certified-operators-vxlcz" Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.524356 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6cr2\" (UniqueName: \"kubernetes.io/projected/174ba3d1-ccf7-4886-998f-f1c125c1fb4b-kube-api-access-m6cr2\") pod \"certified-operators-vxlcz\" (UID: \"174ba3d1-ccf7-4886-998f-f1c125c1fb4b\") " pod="openshift-marketplace/certified-operators-vxlcz" Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.623033 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vxlcz" Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.701021 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-8cbcs"] Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.705396 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-8cbcs" Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.707617 5002 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.707853 5002 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-sfprz" Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.708179 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.723682 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-qwxw6"] Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.726942 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-qwxw6" Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.728732 5002 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.738768 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-qwxw6"] Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.805587 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cb8964b1-8a1b-4af6-8340-b7678fef088c-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-qwxw6\" (UID: \"cb8964b1-8a1b-4af6-8340-b7678fef088c\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-qwxw6" Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.805627 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/21791258-f31e-49b6-8470-1915bc504a3f-metrics\") pod \"frr-k8s-8cbcs\" (UID: \"21791258-f31e-49b6-8470-1915bc504a3f\") " pod="metallb-system/frr-k8s-8cbcs" Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.805652 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/21791258-f31e-49b6-8470-1915bc504a3f-frr-sockets\") pod \"frr-k8s-8cbcs\" (UID: \"21791258-f31e-49b6-8470-1915bc504a3f\") " pod="metallb-system/frr-k8s-8cbcs" Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.805675 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvln5\" (UniqueName: \"kubernetes.io/projected/21791258-f31e-49b6-8470-1915bc504a3f-kube-api-access-tvln5\") pod \"frr-k8s-8cbcs\" (UID: \"21791258-f31e-49b6-8470-1915bc504a3f\") " pod="metallb-system/frr-k8s-8cbcs" Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.805700 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/21791258-f31e-49b6-8470-1915bc504a3f-reloader\") pod \"frr-k8s-8cbcs\" (UID: \"21791258-f31e-49b6-8470-1915bc504a3f\") " pod="metallb-system/frr-k8s-8cbcs" Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.805721 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/21791258-f31e-49b6-8470-1915bc504a3f-metrics-certs\") pod \"frr-k8s-8cbcs\" (UID: \"21791258-f31e-49b6-8470-1915bc504a3f\") " pod="metallb-system/frr-k8s-8cbcs" Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.805760 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/21791258-f31e-49b6-8470-1915bc504a3f-frr-conf\") pod \"frr-k8s-8cbcs\" (UID: \"21791258-f31e-49b6-8470-1915bc504a3f\") " pod="metallb-system/frr-k8s-8cbcs" Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.805828 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6m4j4\" (UniqueName: \"kubernetes.io/projected/cb8964b1-8a1b-4af6-8340-b7678fef088c-kube-api-access-6m4j4\") pod \"frr-k8s-webhook-server-7fcb986d4-qwxw6\" (UID: \"cb8964b1-8a1b-4af6-8340-b7678fef088c\") " 
pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-qwxw6" Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.805932 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/21791258-f31e-49b6-8470-1915bc504a3f-frr-startup\") pod \"frr-k8s-8cbcs\" (UID: \"21791258-f31e-49b6-8470-1915bc504a3f\") " pod="metallb-system/frr-k8s-8cbcs" Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.845853 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-jdlbp"] Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.846870 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-jdlbp" Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.848660 5002 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.848991 5002 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.848997 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-v2gfv"] Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.849158 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.860746 5002 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-d5sqf" Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.867154 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-v2gfv"] Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.868933 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-v2gfv" Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.875352 5002 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.908385 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6m4j4\" (UniqueName: \"kubernetes.io/projected/cb8964b1-8a1b-4af6-8340-b7678fef088c-kube-api-access-6m4j4\") pod \"frr-k8s-webhook-server-7fcb986d4-qwxw6\" (UID: \"cb8964b1-8a1b-4af6-8340-b7678fef088c\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-qwxw6" Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.908452 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/21791258-f31e-49b6-8470-1915bc504a3f-frr-startup\") pod \"frr-k8s-8cbcs\" (UID: \"21791258-f31e-49b6-8470-1915bc504a3f\") " pod="metallb-system/frr-k8s-8cbcs" Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.908498 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cb8964b1-8a1b-4af6-8340-b7678fef088c-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-qwxw6\" (UID: \"cb8964b1-8a1b-4af6-8340-b7678fef088c\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-qwxw6" Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.908521 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/21791258-f31e-49b6-8470-1915bc504a3f-metrics\") pod \"frr-k8s-8cbcs\" (UID: \"21791258-f31e-49b6-8470-1915bc504a3f\") " pod="metallb-system/frr-k8s-8cbcs" Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.908541 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/21791258-f31e-49b6-8470-1915bc504a3f-frr-sockets\") pod \"frr-k8s-8cbcs\" (UID: \"21791258-f31e-49b6-8470-1915bc504a3f\") " pod="metallb-system/frr-k8s-8cbcs" Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.908562 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvln5\" (UniqueName: \"kubernetes.io/projected/21791258-f31e-49b6-8470-1915bc504a3f-kube-api-access-tvln5\") pod \"frr-k8s-8cbcs\" (UID: \"21791258-f31e-49b6-8470-1915bc504a3f\") " pod="metallb-system/frr-k8s-8cbcs" Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.908584 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/21791258-f31e-49b6-8470-1915bc504a3f-reloader\") pod \"frr-k8s-8cbcs\" (UID: \"21791258-f31e-49b6-8470-1915bc504a3f\") " pod="metallb-system/frr-k8s-8cbcs" Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.908608 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/21791258-f31e-49b6-8470-1915bc504a3f-metrics-certs\") pod \"frr-k8s-8cbcs\" (UID: \"21791258-f31e-49b6-8470-1915bc504a3f\") " pod="metallb-system/frr-k8s-8cbcs" Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.908644 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/21791258-f31e-49b6-8470-1915bc504a3f-frr-conf\") pod \"frr-k8s-8cbcs\" (UID: \"21791258-f31e-49b6-8470-1915bc504a3f\") " 
pod="metallb-system/frr-k8s-8cbcs" Dec 09 10:15:43 crc kubenswrapper[5002]: E1209 10:15:43.908787 5002 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Dec 09 10:15:43 crc kubenswrapper[5002]: E1209 10:15:43.908881 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb8964b1-8a1b-4af6-8340-b7678fef088c-cert podName:cb8964b1-8a1b-4af6-8340-b7678fef088c nodeName:}" failed. No retries permitted until 2025-12-09 10:15:44.408864438 +0000 UTC m=+876.800915529 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cb8964b1-8a1b-4af6-8340-b7678fef088c-cert") pod "frr-k8s-webhook-server-7fcb986d4-qwxw6" (UID: "cb8964b1-8a1b-4af6-8340-b7678fef088c") : secret "frr-k8s-webhook-server-cert" not found Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.909017 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/21791258-f31e-49b6-8470-1915bc504a3f-frr-conf\") pod \"frr-k8s-8cbcs\" (UID: \"21791258-f31e-49b6-8470-1915bc504a3f\") " pod="metallb-system/frr-k8s-8cbcs" Dec 09 10:15:43 crc kubenswrapper[5002]: E1209 10:15:43.909093 5002 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.909257 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/21791258-f31e-49b6-8470-1915bc504a3f-reloader\") pod \"frr-k8s-8cbcs\" (UID: \"21791258-f31e-49b6-8470-1915bc504a3f\") " pod="metallb-system/frr-k8s-8cbcs" Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.909335 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/21791258-f31e-49b6-8470-1915bc504a3f-frr-startup\") pod \"frr-k8s-8cbcs\" (UID: \"21791258-f31e-49b6-8470-1915bc504a3f\") " pod="metallb-system/frr-k8s-8cbcs" Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.909534 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/21791258-f31e-49b6-8470-1915bc504a3f-metrics\") pod \"frr-k8s-8cbcs\" (UID: \"21791258-f31e-49b6-8470-1915bc504a3f\") " pod="metallb-system/frr-k8s-8cbcs" Dec 09 10:15:43 crc kubenswrapper[5002]: E1209 10:15:43.909578 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21791258-f31e-49b6-8470-1915bc504a3f-metrics-certs podName:21791258-f31e-49b6-8470-1915bc504a3f nodeName:}" failed. No retries permitted until 2025-12-09 10:15:44.409115655 +0000 UTC m=+876.801166736 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/21791258-f31e-49b6-8470-1915bc504a3f-metrics-certs") pod "frr-k8s-8cbcs" (UID: "21791258-f31e-49b6-8470-1915bc504a3f") : secret "frr-k8s-certs-secret" not found Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.910085 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/21791258-f31e-49b6-8470-1915bc504a3f-frr-sockets\") pod \"frr-k8s-8cbcs\" (UID: \"21791258-f31e-49b6-8470-1915bc504a3f\") " pod="metallb-system/frr-k8s-8cbcs" Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.936831 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6m4j4\" (UniqueName: \"kubernetes.io/projected/cb8964b1-8a1b-4af6-8340-b7678fef088c-kube-api-access-6m4j4\") pod \"frr-k8s-webhook-server-7fcb986d4-qwxw6\" (UID: \"cb8964b1-8a1b-4af6-8340-b7678fef088c\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-qwxw6" Dec 09 10:15:43 crc kubenswrapper[5002]: I1209 10:15:43.954595 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvln5\" (UniqueName: \"kubernetes.io/projected/21791258-f31e-49b6-8470-1915bc504a3f-kube-api-access-tvln5\") pod \"frr-k8s-8cbcs\" (UID: \"21791258-f31e-49b6-8470-1915bc504a3f\") " pod="metallb-system/frr-k8s-8cbcs" Dec 09 10:15:44 crc kubenswrapper[5002]: I1209 10:15:44.001311 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vxlcz"] Dec 09 10:15:44 crc kubenswrapper[5002]: I1209 10:15:44.013442 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2b43b64e-4d97-41ba-8514-550bbb0f497b-metrics-certs\") pod \"speaker-jdlbp\" (UID: \"2b43b64e-4d97-41ba-8514-550bbb0f497b\") " pod="metallb-system/speaker-jdlbp" Dec 09 10:15:44 crc kubenswrapper[5002]: I1209 10:15:44.013489 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tmrj\" (UniqueName: \"kubernetes.io/projected/74c8a781-bfca-4207-89e2-31bc463c8db9-kube-api-access-9tmrj\") pod \"controller-f8648f98b-v2gfv\" (UID: \"74c8a781-bfca-4207-89e2-31bc463c8db9\") " pod="metallb-system/controller-f8648f98b-v2gfv" Dec 09 10:15:44 crc kubenswrapper[5002]: I1209 10:15:44.013542 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2b43b64e-4d97-41ba-8514-550bbb0f497b-memberlist\") pod \"speaker-jdlbp\" (UID: \"2b43b64e-4d97-41ba-8514-550bbb0f497b\") " pod="metallb-system/speaker-jdlbp" Dec 09 10:15:44 crc kubenswrapper[5002]: I1209 10:15:44.013565 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/74c8a781-bfca-4207-89e2-31bc463c8db9-cert\") pod \"controller-f8648f98b-v2gfv\" (UID: \"74c8a781-bfca-4207-89e2-31bc463c8db9\") " pod="metallb-system/controller-f8648f98b-v2gfv" Dec 09 10:15:44 crc kubenswrapper[5002]: I1209 10:15:44.013609 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/2b43b64e-4d97-41ba-8514-550bbb0f497b-metallb-excludel2\") pod \"speaker-jdlbp\" (UID: \"2b43b64e-4d97-41ba-8514-550bbb0f497b\") " pod="metallb-system/speaker-jdlbp" Dec 09 10:15:44 crc 
kubenswrapper[5002]: I1209 10:15:44.013635 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b2b6\" (UniqueName: \"kubernetes.io/projected/2b43b64e-4d97-41ba-8514-550bbb0f497b-kube-api-access-4b2b6\") pod \"speaker-jdlbp\" (UID: \"2b43b64e-4d97-41ba-8514-550bbb0f497b\") " pod="metallb-system/speaker-jdlbp" Dec 09 10:15:44 crc kubenswrapper[5002]: I1209 10:15:44.013667 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/74c8a781-bfca-4207-89e2-31bc463c8db9-metrics-certs\") pod \"controller-f8648f98b-v2gfv\" (UID: \"74c8a781-bfca-4207-89e2-31bc463c8db9\") " pod="metallb-system/controller-f8648f98b-v2gfv" Dec 09 10:15:44 crc kubenswrapper[5002]: I1209 10:15:44.115243 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2b43b64e-4d97-41ba-8514-550bbb0f497b-memberlist\") pod \"speaker-jdlbp\" (UID: \"2b43b64e-4d97-41ba-8514-550bbb0f497b\") " pod="metallb-system/speaker-jdlbp" Dec 09 10:15:44 crc kubenswrapper[5002]: I1209 10:15:44.115606 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/74c8a781-bfca-4207-89e2-31bc463c8db9-cert\") pod \"controller-f8648f98b-v2gfv\" (UID: \"74c8a781-bfca-4207-89e2-31bc463c8db9\") " pod="metallb-system/controller-f8648f98b-v2gfv" Dec 09 10:15:44 crc kubenswrapper[5002]: I1209 10:15:44.115639 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/2b43b64e-4d97-41ba-8514-550bbb0f497b-metallb-excludel2\") pod \"speaker-jdlbp\" (UID: \"2b43b64e-4d97-41ba-8514-550bbb0f497b\") " pod="metallb-system/speaker-jdlbp" Dec 09 10:15:44 crc kubenswrapper[5002]: I1209 10:15:44.115661 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b2b6\" (UniqueName: \"kubernetes.io/projected/2b43b64e-4d97-41ba-8514-550bbb0f497b-kube-api-access-4b2b6\") pod \"speaker-jdlbp\" (UID: \"2b43b64e-4d97-41ba-8514-550bbb0f497b\") " pod="metallb-system/speaker-jdlbp" Dec 09 10:15:44 crc kubenswrapper[5002]: I1209 10:15:44.115681 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/74c8a781-bfca-4207-89e2-31bc463c8db9-metrics-certs\") pod \"controller-f8648f98b-v2gfv\" (UID: \"74c8a781-bfca-4207-89e2-31bc463c8db9\") " pod="metallb-system/controller-f8648f98b-v2gfv" Dec 09 10:15:44 crc kubenswrapper[5002]: I1209 10:15:44.115725 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2b43b64e-4d97-41ba-8514-550bbb0f497b-metrics-certs\") pod \"speaker-jdlbp\" (UID: \"2b43b64e-4d97-41ba-8514-550bbb0f497b\") " pod="metallb-system/speaker-jdlbp" Dec 09 10:15:44 crc kubenswrapper[5002]: E1209 10:15:44.115428 5002 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 09 10:15:44 crc kubenswrapper[5002]: I1209 10:15:44.115751 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tmrj\" (UniqueName: \"kubernetes.io/projected/74c8a781-bfca-4207-89e2-31bc463c8db9-kube-api-access-9tmrj\") pod \"controller-f8648f98b-v2gfv\" (UID: \"74c8a781-bfca-4207-89e2-31bc463c8db9\") " 
pod="metallb-system/controller-f8648f98b-v2gfv" Dec 09 10:15:44 crc kubenswrapper[5002]: E1209 10:15:44.115796 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b43b64e-4d97-41ba-8514-550bbb0f497b-memberlist podName:2b43b64e-4d97-41ba-8514-550bbb0f497b nodeName:}" failed. No retries permitted until 2025-12-09 10:15:44.615777422 +0000 UTC m=+877.007828503 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/2b43b64e-4d97-41ba-8514-550bbb0f497b-memberlist") pod "speaker-jdlbp" (UID: "2b43b64e-4d97-41ba-8514-550bbb0f497b") : secret "metallb-memberlist" not found Dec 09 10:15:44 crc kubenswrapper[5002]: I1209 10:15:44.116954 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/2b43b64e-4d97-41ba-8514-550bbb0f497b-metallb-excludel2\") pod \"speaker-jdlbp\" (UID: \"2b43b64e-4d97-41ba-8514-550bbb0f497b\") " pod="metallb-system/speaker-jdlbp" Dec 09 10:15:44 crc kubenswrapper[5002]: I1209 10:15:44.118289 5002 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 09 10:15:44 crc kubenswrapper[5002]: I1209 10:15:44.119752 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2b43b64e-4d97-41ba-8514-550bbb0f497b-metrics-certs\") pod \"speaker-jdlbp\" (UID: \"2b43b64e-4d97-41ba-8514-550bbb0f497b\") " pod="metallb-system/speaker-jdlbp" Dec 09 10:15:44 crc kubenswrapper[5002]: I1209 10:15:44.120753 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/74c8a781-bfca-4207-89e2-31bc463c8db9-metrics-certs\") pod \"controller-f8648f98b-v2gfv\" (UID: \"74c8a781-bfca-4207-89e2-31bc463c8db9\") " pod="metallb-system/controller-f8648f98b-v2gfv" Dec 09 10:15:44 crc kubenswrapper[5002]: I1209 10:15:44.133250 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/74c8a781-bfca-4207-89e2-31bc463c8db9-cert\") pod \"controller-f8648f98b-v2gfv\" (UID: \"74c8a781-bfca-4207-89e2-31bc463c8db9\") " pod="metallb-system/controller-f8648f98b-v2gfv" Dec 09 10:15:44 crc kubenswrapper[5002]: I1209 10:15:44.143717 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tmrj\" (UniqueName: \"kubernetes.io/projected/74c8a781-bfca-4207-89e2-31bc463c8db9-kube-api-access-9tmrj\") pod \"controller-f8648f98b-v2gfv\" (UID: \"74c8a781-bfca-4207-89e2-31bc463c8db9\") " pod="metallb-system/controller-f8648f98b-v2gfv" Dec 09 10:15:44 crc kubenswrapper[5002]: I1209 10:15:44.144952 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b2b6\" (UniqueName: \"kubernetes.io/projected/2b43b64e-4d97-41ba-8514-550bbb0f497b-kube-api-access-4b2b6\") pod \"speaker-jdlbp\" (UID: \"2b43b64e-4d97-41ba-8514-550bbb0f497b\") " pod="metallb-system/speaker-jdlbp" Dec 09 10:15:44 crc kubenswrapper[5002]: I1209 10:15:44.208447 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-v2gfv" Dec 09 10:15:44 crc kubenswrapper[5002]: I1209 10:15:44.221877 5002 generic.go:334] "Generic (PLEG): container finished" podID="174ba3d1-ccf7-4886-998f-f1c125c1fb4b" containerID="374e323b8f314f9d8e872eb8d5911e8a4add4b06d3fe89299622cb6f549996f3" exitCode=0 Dec 09 10:15:44 crc kubenswrapper[5002]: I1209 10:15:44.221921 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vxlcz" event={"ID":"174ba3d1-ccf7-4886-998f-f1c125c1fb4b","Type":"ContainerDied","Data":"374e323b8f314f9d8e872eb8d5911e8a4add4b06d3fe89299622cb6f549996f3"} Dec 09 10:15:44 crc kubenswrapper[5002]: I1209 10:15:44.221976 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vxlcz" event={"ID":"174ba3d1-ccf7-4886-998f-f1c125c1fb4b","Type":"ContainerStarted","Data":"fa06fd1b0d4d03e7eba22570b8d6abcf40d147cfdd25cb85c296e18ed084faef"} Dec 09 10:15:44 crc kubenswrapper[5002]: I1209 10:15:44.419371 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cb8964b1-8a1b-4af6-8340-b7678fef088c-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-qwxw6\" (UID: \"cb8964b1-8a1b-4af6-8340-b7678fef088c\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-qwxw6" Dec 09 10:15:44 crc kubenswrapper[5002]: I1209 10:15:44.419756 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/21791258-f31e-49b6-8470-1915bc504a3f-metrics-certs\") pod \"frr-k8s-8cbcs\" (UID: \"21791258-f31e-49b6-8470-1915bc504a3f\") " pod="metallb-system/frr-k8s-8cbcs" Dec 09 10:15:44 crc kubenswrapper[5002]: I1209 10:15:44.425002 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cb8964b1-8a1b-4af6-8340-b7678fef088c-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-qwxw6\" (UID: \"cb8964b1-8a1b-4af6-8340-b7678fef088c\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-qwxw6" Dec 09 10:15:44 crc kubenswrapper[5002]: I1209 10:15:44.425540 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/21791258-f31e-49b6-8470-1915bc504a3f-metrics-certs\") pod \"frr-k8s-8cbcs\" (UID: \"21791258-f31e-49b6-8470-1915bc504a3f\") " pod="metallb-system/frr-k8s-8cbcs" Dec 09 10:15:44 crc kubenswrapper[5002]: I1209 10:15:44.469114 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-v2gfv"] Dec 09 10:15:44 crc kubenswrapper[5002]: W1209 10:15:44.479357 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74c8a781_bfca_4207_89e2_31bc463c8db9.slice/crio-c1cbe7e2d74a148681064d9b7943cea5dcda400ab9a9e1dd9d668c25aac63553 WatchSource:0}: Error finding container c1cbe7e2d74a148681064d9b7943cea5dcda400ab9a9e1dd9d668c25aac63553: Status 404 returned error can't find the container with id c1cbe7e2d74a148681064d9b7943cea5dcda400ab9a9e1dd9d668c25aac63553 Dec 09 10:15:44 crc kubenswrapper[5002]: I1209 10:15:44.622228 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2b43b64e-4d97-41ba-8514-550bbb0f497b-memberlist\") pod \"speaker-jdlbp\" (UID: \"2b43b64e-4d97-41ba-8514-550bbb0f497b\") " pod="metallb-system/speaker-jdlbp" Dec 09 10:15:44 crc 
kubenswrapper[5002]: E1209 10:15:44.622497 5002 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 09 10:15:44 crc kubenswrapper[5002]: E1209 10:15:44.622577 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b43b64e-4d97-41ba-8514-550bbb0f497b-memberlist podName:2b43b64e-4d97-41ba-8514-550bbb0f497b nodeName:}" failed. No retries permitted until 2025-12-09 10:15:45.622554831 +0000 UTC m=+878.014605932 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/2b43b64e-4d97-41ba-8514-550bbb0f497b-memberlist") pod "speaker-jdlbp" (UID: "2b43b64e-4d97-41ba-8514-550bbb0f497b") : secret "metallb-memberlist" not found Dec 09 10:15:44 crc kubenswrapper[5002]: I1209 10:15:44.626399 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-8cbcs" Dec 09 10:15:44 crc kubenswrapper[5002]: I1209 10:15:44.647565 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-qwxw6" Dec 09 10:15:44 crc kubenswrapper[5002]: W1209 10:15:44.886936 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb8964b1_8a1b_4af6_8340_b7678fef088c.slice/crio-325e44155f6a46894c2af421ac3e4f683272c6bb994e3a2ecef72295949b1595 WatchSource:0}: Error finding container 325e44155f6a46894c2af421ac3e4f683272c6bb994e3a2ecef72295949b1595: Status 404 returned error can't find the container with id 325e44155f6a46894c2af421ac3e4f683272c6bb994e3a2ecef72295949b1595 Dec 09 10:15:44 crc kubenswrapper[5002]: I1209 10:15:44.888931 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-qwxw6"] Dec 09 10:15:45 crc kubenswrapper[5002]: I1209 10:15:45.230875 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vxlcz" event={"ID":"174ba3d1-ccf7-4886-998f-f1c125c1fb4b","Type":"ContainerStarted","Data":"57a2a41a90dd3e958ed9dd384030cfb1e25dca29497086cd1f954b62c98a0f24"} Dec 09 10:15:45 crc kubenswrapper[5002]: I1209 10:15:45.232172 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-qwxw6" event={"ID":"cb8964b1-8a1b-4af6-8340-b7678fef088c","Type":"ContainerStarted","Data":"325e44155f6a46894c2af421ac3e4f683272c6bb994e3a2ecef72295949b1595"} Dec 09 10:15:45 crc kubenswrapper[5002]: I1209 10:15:45.234741 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-v2gfv" event={"ID":"74c8a781-bfca-4207-89e2-31bc463c8db9","Type":"ContainerStarted","Data":"dc729ca85bf90eeb9d6e9022634999bde351e5bbce1e958c36afb2c71dd67164"} Dec 09 10:15:45 crc kubenswrapper[5002]: I1209 10:15:45.234783 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-v2gfv" event={"ID":"74c8a781-bfca-4207-89e2-31bc463c8db9","Type":"ContainerStarted","Data":"2b6ef030b18717002716553a2dbc54e086fd91f90ce1c61069badbe768c9d318"} Dec 09 10:15:45 crc kubenswrapper[5002]: I1209 10:15:45.234799 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-v2gfv" event={"ID":"74c8a781-bfca-4207-89e2-31bc463c8db9","Type":"ContainerStarted","Data":"c1cbe7e2d74a148681064d9b7943cea5dcda400ab9a9e1dd9d668c25aac63553"} Dec 09 10:15:45 crc kubenswrapper[5002]: I1209 10:15:45.234876 5002 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-v2gfv" Dec 09 10:15:45 crc kubenswrapper[5002]: I1209 10:15:45.236339 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8cbcs" event={"ID":"21791258-f31e-49b6-8470-1915bc504a3f","Type":"ContainerStarted","Data":"6e701cd09f3fa00071cb976911ff2f264bb2474f759a7ad4571bea2c30e9b142"} Dec 09 10:15:45 crc kubenswrapper[5002]: I1209 10:15:45.271974 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-v2gfv" podStartSLOduration=2.2719556020000002 podStartE2EDuration="2.271955602s" podCreationTimestamp="2025-12-09 10:15:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:15:45.269607739 +0000 UTC m=+877.661658830" watchObservedRunningTime="2025-12-09 10:15:45.271955602 +0000 UTC m=+877.664006703" Dec 09 10:15:45 crc kubenswrapper[5002]: I1209 10:15:45.635156 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2b43b64e-4d97-41ba-8514-550bbb0f497b-memberlist\") pod \"speaker-jdlbp\" (UID: \"2b43b64e-4d97-41ba-8514-550bbb0f497b\") " pod="metallb-system/speaker-jdlbp" Dec 09 10:15:45 crc kubenswrapper[5002]: I1209 10:15:45.640275 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2b43b64e-4d97-41ba-8514-550bbb0f497b-memberlist\") pod \"speaker-jdlbp\" (UID: \"2b43b64e-4d97-41ba-8514-550bbb0f497b\") " pod="metallb-system/speaker-jdlbp" Dec 09 10:15:45 crc kubenswrapper[5002]: I1209 10:15:45.673712 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-jdlbp" Dec 09 10:15:45 crc kubenswrapper[5002]: W1209 10:15:45.692035 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b43b64e_4d97_41ba_8514_550bbb0f497b.slice/crio-4945abd495e829dc8a5d4a5a7b70b67efecfa10b322f0e645e8f0f4e8376d4dc WatchSource:0}: Error finding container 4945abd495e829dc8a5d4a5a7b70b67efecfa10b322f0e645e8f0f4e8376d4dc: Status 404 returned error can't find the container with id 4945abd495e829dc8a5d4a5a7b70b67efecfa10b322f0e645e8f0f4e8376d4dc Dec 09 10:15:46 crc kubenswrapper[5002]: I1209 10:15:46.250560 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-jdlbp" event={"ID":"2b43b64e-4d97-41ba-8514-550bbb0f497b","Type":"ContainerStarted","Data":"2be7846823fd4faa15aafa6bf300581648cef2ea233c075e9c3977c685b39b63"} Dec 09 10:15:46 crc kubenswrapper[5002]: I1209 10:15:46.251531 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-jdlbp" event={"ID":"2b43b64e-4d97-41ba-8514-550bbb0f497b","Type":"ContainerStarted","Data":"4945abd495e829dc8a5d4a5a7b70b67efecfa10b322f0e645e8f0f4e8376d4dc"} Dec 09 10:15:46 crc kubenswrapper[5002]: I1209 10:15:46.262450 5002 generic.go:334] "Generic (PLEG): container finished" podID="174ba3d1-ccf7-4886-998f-f1c125c1fb4b" containerID="57a2a41a90dd3e958ed9dd384030cfb1e25dca29497086cd1f954b62c98a0f24" exitCode=0 Dec 09 10:15:46 crc kubenswrapper[5002]: I1209 10:15:46.262512 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vxlcz" event={"ID":"174ba3d1-ccf7-4886-998f-f1c125c1fb4b","Type":"ContainerDied","Data":"57a2a41a90dd3e958ed9dd384030cfb1e25dca29497086cd1f954b62c98a0f24"} Dec 09 10:15:47 crc kubenswrapper[5002]: I1209 10:15:47.273277 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-jdlbp" event={"ID":"2b43b64e-4d97-41ba-8514-550bbb0f497b","Type":"ContainerStarted","Data":"12ebe8e09478ded0e9d9302ab6b54217d0051bf2d2a191fb7d368976c76b74cc"} Dec 09 10:15:47 crc kubenswrapper[5002]: I1209 10:15:47.273326 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-jdlbp" Dec 09 10:15:47 crc kubenswrapper[5002]: I1209 10:15:47.276727 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vxlcz" event={"ID":"174ba3d1-ccf7-4886-998f-f1c125c1fb4b","Type":"ContainerStarted","Data":"63352309af6f67771715bb69d30f174dbc65a8b14c3a5bb8d2d41d6df6a9558f"} Dec 09 10:15:47 crc kubenswrapper[5002]: I1209 10:15:47.293273 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-jdlbp" podStartSLOduration=4.293256435 podStartE2EDuration="4.293256435s" podCreationTimestamp="2025-12-09 10:15:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:15:47.290896712 +0000 UTC m=+879.682947803" watchObservedRunningTime="2025-12-09 10:15:47.293256435 +0000 UTC m=+879.685307516" Dec 09 10:15:47 crc kubenswrapper[5002]: I1209 10:15:47.314008 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vxlcz" podStartSLOduration=1.8153660280000001 podStartE2EDuration="4.313990441s" podCreationTimestamp="2025-12-09 10:15:43 +0000 UTC" firstStartedPulling="2025-12-09 10:15:44.224750592 +0000 UTC m=+876.616801673" 
lastFinishedPulling="2025-12-09 10:15:46.723374995 +0000 UTC m=+879.115426086" observedRunningTime="2025-12-09 10:15:47.311199856 +0000 UTC m=+879.703250977" watchObservedRunningTime="2025-12-09 10:15:47.313990441 +0000 UTC m=+879.706041522" Dec 09 10:15:50 crc kubenswrapper[5002]: I1209 10:15:50.515874 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7qbf7"] Dec 09 10:15:50 crc kubenswrapper[5002]: I1209 10:15:50.518050 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7qbf7" Dec 09 10:15:50 crc kubenswrapper[5002]: I1209 10:15:50.533296 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7qbf7"] Dec 09 10:15:50 crc kubenswrapper[5002]: I1209 10:15:50.608607 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f-utilities\") pod \"community-operators-7qbf7\" (UID: \"73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f\") " pod="openshift-marketplace/community-operators-7qbf7" Dec 09 10:15:50 crc kubenswrapper[5002]: I1209 10:15:50.608673 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsg9x\" (UniqueName: \"kubernetes.io/projected/73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f-kube-api-access-xsg9x\") pod \"community-operators-7qbf7\" (UID: \"73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f\") " pod="openshift-marketplace/community-operators-7qbf7" Dec 09 10:15:50 crc kubenswrapper[5002]: I1209 10:15:50.608694 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f-catalog-content\") pod \"community-operators-7qbf7\" (UID: \"73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f\") " pod="openshift-marketplace/community-operators-7qbf7" Dec 09 10:15:51 crc kubenswrapper[5002]: I1209 10:15:50.710060 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f-utilities\") pod \"community-operators-7qbf7\" (UID: \"73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f\") " pod="openshift-marketplace/community-operators-7qbf7" Dec 09 10:15:51 crc kubenswrapper[5002]: I1209 10:15:50.710125 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsg9x\" (UniqueName: \"kubernetes.io/projected/73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f-kube-api-access-xsg9x\") pod \"community-operators-7qbf7\" (UID: \"73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f\") " pod="openshift-marketplace/community-operators-7qbf7" Dec 09 10:15:51 crc kubenswrapper[5002]: I1209 10:15:50.710145 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f-catalog-content\") pod \"community-operators-7qbf7\" (UID: \"73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f\") " pod="openshift-marketplace/community-operators-7qbf7" Dec 09 10:15:51 crc kubenswrapper[5002]: I1209 10:15:50.783049 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f-utilities\") pod \"community-operators-7qbf7\" (UID: \"73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f\") " 
pod="openshift-marketplace/community-operators-7qbf7" Dec 09 10:15:51 crc kubenswrapper[5002]: I1209 10:15:50.783093 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f-catalog-content\") pod \"community-operators-7qbf7\" (UID: \"73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f\") " pod="openshift-marketplace/community-operators-7qbf7" Dec 09 10:15:51 crc kubenswrapper[5002]: I1209 10:15:50.783922 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsg9x\" (UniqueName: \"kubernetes.io/projected/73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f-kube-api-access-xsg9x\") pod \"community-operators-7qbf7\" (UID: \"73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f\") " pod="openshift-marketplace/community-operators-7qbf7" Dec 09 10:15:51 crc kubenswrapper[5002]: I1209 10:15:50.879495 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7qbf7" Dec 09 10:15:52 crc kubenswrapper[5002]: I1209 10:15:52.026902 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7qbf7"] Dec 09 10:15:52 crc kubenswrapper[5002]: W1209 10:15:52.029146 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73b6d3ec_aaa0_4e5d_8213_ebe5e0b2dc9f.slice/crio-fb3acbb6641ae2b2b39c634971979cd4c7c9cbf35588c617754b72562bd15c0f WatchSource:0}: Error finding container fb3acbb6641ae2b2b39c634971979cd4c7c9cbf35588c617754b72562bd15c0f: Status 404 returned error can't find the container with id fb3acbb6641ae2b2b39c634971979cd4c7c9cbf35588c617754b72562bd15c0f Dec 09 10:15:52 crc kubenswrapper[5002]: I1209 10:15:52.311102 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-qwxw6" event={"ID":"cb8964b1-8a1b-4af6-8340-b7678fef088c","Type":"ContainerStarted","Data":"db512848187ad169b1198c1d788da78c947c9464b51fa6b25e423eec58eefdc4"} Dec 09 10:15:52 crc kubenswrapper[5002]: I1209 10:15:52.311411 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-qwxw6" Dec 09 10:15:52 crc kubenswrapper[5002]: I1209 10:15:52.314426 5002 generic.go:334] "Generic (PLEG): container finished" podID="21791258-f31e-49b6-8470-1915bc504a3f" containerID="7c5bdca43cab978c8bc6fc4f4ec6e6055d9fc2743c710ba96a89471ae4e55ad5" exitCode=0 Dec 09 10:15:52 crc kubenswrapper[5002]: I1209 10:15:52.314474 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8cbcs" event={"ID":"21791258-f31e-49b6-8470-1915bc504a3f","Type":"ContainerDied","Data":"7c5bdca43cab978c8bc6fc4f4ec6e6055d9fc2743c710ba96a89471ae4e55ad5"} Dec 09 10:15:52 crc kubenswrapper[5002]: I1209 10:15:52.316714 5002 generic.go:334] "Generic (PLEG): container finished" podID="73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f" containerID="82f7cd3a22e9d9a4e075f67cd2e6d40ce01cbbee9adfd17372a5dbf9e07f9f9d" exitCode=0 Dec 09 10:15:52 crc kubenswrapper[5002]: I1209 10:15:52.316753 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7qbf7" event={"ID":"73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f","Type":"ContainerDied","Data":"82f7cd3a22e9d9a4e075f67cd2e6d40ce01cbbee9adfd17372a5dbf9e07f9f9d"} Dec 09 10:15:52 crc kubenswrapper[5002]: I1209 10:15:52.316777 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-7qbf7" event={"ID":"73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f","Type":"ContainerStarted","Data":"fb3acbb6641ae2b2b39c634971979cd4c7c9cbf35588c617754b72562bd15c0f"} Dec 09 10:15:52 crc kubenswrapper[5002]: I1209 10:15:52.342360 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-qwxw6" podStartSLOduration=2.330881842 podStartE2EDuration="9.342328299s" podCreationTimestamp="2025-12-09 10:15:43 +0000 UTC" firstStartedPulling="2025-12-09 10:15:44.888926879 +0000 UTC m=+877.280977960" lastFinishedPulling="2025-12-09 10:15:51.900373336 +0000 UTC m=+884.292424417" observedRunningTime="2025-12-09 10:15:52.3416141 +0000 UTC m=+884.733665251" watchObservedRunningTime="2025-12-09 10:15:52.342328299 +0000 UTC m=+884.734379480" Dec 09 10:15:53 crc kubenswrapper[5002]: I1209 10:15:53.324767 5002 generic.go:334] "Generic (PLEG): container finished" podID="21791258-f31e-49b6-8470-1915bc504a3f" containerID="0b9d4b11879f90d780d44952064f0994f7cbe227768b2d97a4a776794c6aa687" exitCode=0 Dec 09 10:15:53 crc kubenswrapper[5002]: I1209 10:15:53.324871 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8cbcs" event={"ID":"21791258-f31e-49b6-8470-1915bc504a3f","Type":"ContainerDied","Data":"0b9d4b11879f90d780d44952064f0994f7cbe227768b2d97a4a776794c6aa687"} Dec 09 10:15:53 crc kubenswrapper[5002]: I1209 10:15:53.328892 5002 generic.go:334] "Generic (PLEG): container finished" podID="73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f" containerID="7a12ee046d249efada5bbc8980bd29ab56a32fb57be4b4888b180d4184674e18" exitCode=0 Dec 09 10:15:53 crc kubenswrapper[5002]: I1209 10:15:53.329917 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7qbf7" event={"ID":"73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f","Type":"ContainerDied","Data":"7a12ee046d249efada5bbc8980bd29ab56a32fb57be4b4888b180d4184674e18"} Dec 09 10:15:53 crc kubenswrapper[5002]: I1209 10:15:53.624048 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vxlcz" Dec 09 10:15:53 crc kubenswrapper[5002]: I1209 10:15:53.624192 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vxlcz" Dec 09 10:15:53 crc kubenswrapper[5002]: I1209 10:15:53.685476 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vxlcz" Dec 09 10:15:54 crc kubenswrapper[5002]: I1209 10:15:54.216859 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-v2gfv" Dec 09 10:15:54 crc kubenswrapper[5002]: I1209 10:15:54.335933 5002 generic.go:334] "Generic (PLEG): container finished" podID="21791258-f31e-49b6-8470-1915bc504a3f" containerID="4d9ac226a6114beec5c4e3cba497da4c85c15836b0a0565231dc55cecbb4ce3a" exitCode=0 Dec 09 10:15:54 crc kubenswrapper[5002]: I1209 10:15:54.336001 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8cbcs" event={"ID":"21791258-f31e-49b6-8470-1915bc504a3f","Type":"ContainerDied","Data":"4d9ac226a6114beec5c4e3cba497da4c85c15836b0a0565231dc55cecbb4ce3a"} Dec 09 10:15:54 crc kubenswrapper[5002]: I1209 10:15:54.338211 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7qbf7" 
event={"ID":"73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f","Type":"ContainerStarted","Data":"9a960bbe2a2ddd1bce6f63018a261ad9a9e84ba85e0ee8e98897c1cd8353ce81"} Dec 09 10:15:54 crc kubenswrapper[5002]: I1209 10:15:54.391094 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vxlcz" Dec 09 10:15:54 crc kubenswrapper[5002]: I1209 10:15:54.406834 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7qbf7" podStartSLOduration=2.984941609 podStartE2EDuration="4.406801589s" podCreationTimestamp="2025-12-09 10:15:50 +0000 UTC" firstStartedPulling="2025-12-09 10:15:52.318624484 +0000 UTC m=+884.710675595" lastFinishedPulling="2025-12-09 10:15:53.740484494 +0000 UTC m=+886.132535575" observedRunningTime="2025-12-09 10:15:54.387044209 +0000 UTC m=+886.779095290" watchObservedRunningTime="2025-12-09 10:15:54.406801589 +0000 UTC m=+886.798852670" Dec 09 10:15:55 crc kubenswrapper[5002]: I1209 10:15:55.357453 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8cbcs" event={"ID":"21791258-f31e-49b6-8470-1915bc504a3f","Type":"ContainerStarted","Data":"dbe4fc1f3d7ebdd117fcfe0f24355c09446ede61966e3b719735bee9866d3f66"} Dec 09 10:15:55 crc kubenswrapper[5002]: I1209 10:15:55.678308 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-jdlbp" Dec 09 10:15:56 crc kubenswrapper[5002]: I1209 10:15:56.102840 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vxlcz"] Dec 09 10:15:56 crc kubenswrapper[5002]: I1209 10:15:56.371636 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8cbcs" event={"ID":"21791258-f31e-49b6-8470-1915bc504a3f","Type":"ContainerStarted","Data":"6c5fdf1dcc7b24df9e36d346243b6c688eb97a9af43cc739cbabb12a1f06287e"} Dec 09 10:15:56 crc kubenswrapper[5002]: I1209 10:15:56.371670 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8cbcs" event={"ID":"21791258-f31e-49b6-8470-1915bc504a3f","Type":"ContainerStarted","Data":"7a268854fa6f33f4abc8f483cfbd6c11ffab8b8f0b6489aef18a5fa81cd1c6a8"} Dec 09 10:15:56 crc kubenswrapper[5002]: I1209 10:15:56.371680 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8cbcs" event={"ID":"21791258-f31e-49b6-8470-1915bc504a3f","Type":"ContainerStarted","Data":"d5c8ce56a2af8bcea33c655c1be3b426f2a7b86208322a0a4c081f51f66f7967"} Dec 09 10:15:56 crc kubenswrapper[5002]: I1209 10:15:56.371689 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8cbcs" event={"ID":"21791258-f31e-49b6-8470-1915bc504a3f","Type":"ContainerStarted","Data":"5d72d3249b452294cd5c10280e04c8165f5e382231e833e8dd454058eb0c01a9"} Dec 09 10:15:57 crc kubenswrapper[5002]: I1209 10:15:57.387043 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8cbcs" event={"ID":"21791258-f31e-49b6-8470-1915bc504a3f","Type":"ContainerStarted","Data":"21b04bd03eabd1fbf8f4624004bfcf1c803a50685d0e7bd23c0d55f7e96e9024"} Dec 09 10:15:57 crc kubenswrapper[5002]: I1209 10:15:57.388224 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-8cbcs" Dec 09 10:15:57 crc kubenswrapper[5002]: I1209 10:15:57.387245 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vxlcz" podUID="174ba3d1-ccf7-4886-998f-f1c125c1fb4b" 
containerName="registry-server" containerID="cri-o://63352309af6f67771715bb69d30f174dbc65a8b14c3a5bb8d2d41d6df6a9558f" gracePeriod=2 Dec 09 10:15:57 crc kubenswrapper[5002]: I1209 10:15:57.556381 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-8cbcs" podStartSLOduration=7.50735782 podStartE2EDuration="14.556360193s" podCreationTimestamp="2025-12-09 10:15:43 +0000 UTC" firstStartedPulling="2025-12-09 10:15:44.83037051 +0000 UTC m=+877.222421591" lastFinishedPulling="2025-12-09 10:15:51.879372883 +0000 UTC m=+884.271423964" observedRunningTime="2025-12-09 10:15:57.42562961 +0000 UTC m=+889.817680731" watchObservedRunningTime="2025-12-09 10:15:57.556360193 +0000 UTC m=+889.948411284" Dec 09 10:15:57 crc kubenswrapper[5002]: I1209 10:15:57.559451 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ad4jrx"] Dec 09 10:15:57 crc kubenswrapper[5002]: I1209 10:15:57.560952 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ad4jrx" Dec 09 10:15:57 crc kubenswrapper[5002]: I1209 10:15:57.563740 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 09 10:15:57 crc kubenswrapper[5002]: I1209 10:15:57.574290 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ad4jrx"] Dec 09 10:15:57 crc kubenswrapper[5002]: I1209 10:15:57.725365 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/98fed57c-cdf3-4e3e-a7e5-422973325063-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ad4jrx\" (UID: \"98fed57c-cdf3-4e3e-a7e5-422973325063\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ad4jrx" Dec 09 10:15:57 crc kubenswrapper[5002]: I1209 10:15:57.725515 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/98fed57c-cdf3-4e3e-a7e5-422973325063-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ad4jrx\" (UID: \"98fed57c-cdf3-4e3e-a7e5-422973325063\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ad4jrx" Dec 09 10:15:57 crc kubenswrapper[5002]: I1209 10:15:57.725571 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmc8r\" (UniqueName: \"kubernetes.io/projected/98fed57c-cdf3-4e3e-a7e5-422973325063-kube-api-access-rmc8r\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ad4jrx\" (UID: \"98fed57c-cdf3-4e3e-a7e5-422973325063\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ad4jrx" Dec 09 10:15:57 crc kubenswrapper[5002]: I1209 10:15:57.827088 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/98fed57c-cdf3-4e3e-a7e5-422973325063-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ad4jrx\" (UID: \"98fed57c-cdf3-4e3e-a7e5-422973325063\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ad4jrx" Dec 09 10:15:57 crc kubenswrapper[5002]: I1209 10:15:57.827217 5002 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/98fed57c-cdf3-4e3e-a7e5-422973325063-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ad4jrx\" (UID: \"98fed57c-cdf3-4e3e-a7e5-422973325063\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ad4jrx" Dec 09 10:15:57 crc kubenswrapper[5002]: I1209 10:15:57.827270 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmc8r\" (UniqueName: \"kubernetes.io/projected/98fed57c-cdf3-4e3e-a7e5-422973325063-kube-api-access-rmc8r\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ad4jrx\" (UID: \"98fed57c-cdf3-4e3e-a7e5-422973325063\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ad4jrx" Dec 09 10:15:57 crc kubenswrapper[5002]: I1209 10:15:57.828904 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/98fed57c-cdf3-4e3e-a7e5-422973325063-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ad4jrx\" (UID: \"98fed57c-cdf3-4e3e-a7e5-422973325063\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ad4jrx" Dec 09 10:15:57 crc kubenswrapper[5002]: I1209 10:15:57.830183 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/98fed57c-cdf3-4e3e-a7e5-422973325063-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ad4jrx\" (UID: \"98fed57c-cdf3-4e3e-a7e5-422973325063\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ad4jrx" Dec 09 10:15:57 crc kubenswrapper[5002]: I1209 10:15:57.840635 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vxlcz" Dec 09 10:15:57 crc kubenswrapper[5002]: I1209 10:15:57.849703 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmc8r\" (UniqueName: \"kubernetes.io/projected/98fed57c-cdf3-4e3e-a7e5-422973325063-kube-api-access-rmc8r\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ad4jrx\" (UID: \"98fed57c-cdf3-4e3e-a7e5-422973325063\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ad4jrx" Dec 09 10:15:57 crc kubenswrapper[5002]: I1209 10:15:57.891407 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ad4jrx" Dec 09 10:15:57 crc kubenswrapper[5002]: I1209 10:15:57.928443 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/174ba3d1-ccf7-4886-998f-f1c125c1fb4b-catalog-content\") pod \"174ba3d1-ccf7-4886-998f-f1c125c1fb4b\" (UID: \"174ba3d1-ccf7-4886-998f-f1c125c1fb4b\") " Dec 09 10:15:57 crc kubenswrapper[5002]: I1209 10:15:57.928863 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6cr2\" (UniqueName: \"kubernetes.io/projected/174ba3d1-ccf7-4886-998f-f1c125c1fb4b-kube-api-access-m6cr2\") pod \"174ba3d1-ccf7-4886-998f-f1c125c1fb4b\" (UID: \"174ba3d1-ccf7-4886-998f-f1c125c1fb4b\") " Dec 09 10:15:57 crc kubenswrapper[5002]: I1209 10:15:57.928928 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/174ba3d1-ccf7-4886-998f-f1c125c1fb4b-utilities\") pod \"174ba3d1-ccf7-4886-998f-f1c125c1fb4b\" (UID: \"174ba3d1-ccf7-4886-998f-f1c125c1fb4b\") " Dec 09 10:15:57 crc kubenswrapper[5002]: I1209 10:15:57.930341 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/174ba3d1-ccf7-4886-998f-f1c125c1fb4b-utilities" (OuterVolumeSpecName: "utilities") pod "174ba3d1-ccf7-4886-998f-f1c125c1fb4b" (UID: "174ba3d1-ccf7-4886-998f-f1c125c1fb4b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:15:57 crc kubenswrapper[5002]: I1209 10:15:57.935640 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/174ba3d1-ccf7-4886-998f-f1c125c1fb4b-kube-api-access-m6cr2" (OuterVolumeSpecName: "kube-api-access-m6cr2") pod "174ba3d1-ccf7-4886-998f-f1c125c1fb4b" (UID: "174ba3d1-ccf7-4886-998f-f1c125c1fb4b"). InnerVolumeSpecName "kube-api-access-m6cr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:15:57 crc kubenswrapper[5002]: I1209 10:15:57.976197 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/174ba3d1-ccf7-4886-998f-f1c125c1fb4b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "174ba3d1-ccf7-4886-998f-f1c125c1fb4b" (UID: "174ba3d1-ccf7-4886-998f-f1c125c1fb4b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:15:58 crc kubenswrapper[5002]: I1209 10:15:58.030738 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/174ba3d1-ccf7-4886-998f-f1c125c1fb4b-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 10:15:58 crc kubenswrapper[5002]: I1209 10:15:58.030768 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/174ba3d1-ccf7-4886-998f-f1c125c1fb4b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 10:15:58 crc kubenswrapper[5002]: I1209 10:15:58.030785 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6cr2\" (UniqueName: \"kubernetes.io/projected/174ba3d1-ccf7-4886-998f-f1c125c1fb4b-kube-api-access-m6cr2\") on node \"crc\" DevicePath \"\"" Dec 09 10:15:58 crc kubenswrapper[5002]: I1209 10:15:58.107435 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-22jb8"] Dec 09 10:15:58 crc kubenswrapper[5002]: E1209 10:15:58.107669 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="174ba3d1-ccf7-4886-998f-f1c125c1fb4b" containerName="extract-utilities" Dec 09 10:15:58 crc kubenswrapper[5002]: I1209 10:15:58.107683 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="174ba3d1-ccf7-4886-998f-f1c125c1fb4b" containerName="extract-utilities" Dec 09 10:15:58 crc kubenswrapper[5002]: E1209 10:15:58.107704 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="174ba3d1-ccf7-4886-998f-f1c125c1fb4b" containerName="extract-content" Dec 09 10:15:58 crc kubenswrapper[5002]: I1209 10:15:58.107711 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="174ba3d1-ccf7-4886-998f-f1c125c1fb4b" containerName="extract-content" Dec 09 10:15:58 crc kubenswrapper[5002]: E1209 10:15:58.107719 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="174ba3d1-ccf7-4886-998f-f1c125c1fb4b" containerName="registry-server" Dec 09 10:15:58 crc kubenswrapper[5002]: I1209 10:15:58.107725 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="174ba3d1-ccf7-4886-998f-f1c125c1fb4b" containerName="registry-server" Dec 09 10:15:58 crc kubenswrapper[5002]: I1209 10:15:58.107977 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="174ba3d1-ccf7-4886-998f-f1c125c1fb4b" containerName="registry-server" Dec 09 10:15:58 crc kubenswrapper[5002]: I1209 10:15:58.108700 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-22jb8" Dec 09 10:15:58 crc kubenswrapper[5002]: I1209 10:15:58.122392 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-22jb8"] Dec 09 10:15:58 crc kubenswrapper[5002]: I1209 10:15:58.233203 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbk5w\" (UniqueName: \"kubernetes.io/projected/d10c96a5-2f3f-4a79-a452-5a84dee8dc35-kube-api-access-dbk5w\") pod \"redhat-marketplace-22jb8\" (UID: \"d10c96a5-2f3f-4a79-a452-5a84dee8dc35\") " pod="openshift-marketplace/redhat-marketplace-22jb8" Dec 09 10:15:58 crc kubenswrapper[5002]: I1209 10:15:58.233254 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d10c96a5-2f3f-4a79-a452-5a84dee8dc35-catalog-content\") pod \"redhat-marketplace-22jb8\" (UID: \"d10c96a5-2f3f-4a79-a452-5a84dee8dc35\") " pod="openshift-marketplace/redhat-marketplace-22jb8" Dec 09 10:15:58 crc kubenswrapper[5002]: I1209 10:15:58.233289 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d10c96a5-2f3f-4a79-a452-5a84dee8dc35-utilities\") pod \"redhat-marketplace-22jb8\" (UID: \"d10c96a5-2f3f-4a79-a452-5a84dee8dc35\") " pod="openshift-marketplace/redhat-marketplace-22jb8" Dec 09 10:15:58 crc kubenswrapper[5002]: I1209 10:15:58.300273 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ad4jrx"] Dec 09 10:15:58 crc kubenswrapper[5002]: W1209 10:15:58.304110 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98fed57c_cdf3_4e3e_a7e5_422973325063.slice/crio-132a59786b781cbb7c10ad6c402593c40f9b011105801e6a828a298bcec6a0ad WatchSource:0}: Error finding container 132a59786b781cbb7c10ad6c402593c40f9b011105801e6a828a298bcec6a0ad: Status 404 returned error can't find the container with id 132a59786b781cbb7c10ad6c402593c40f9b011105801e6a828a298bcec6a0ad Dec 09 10:15:58 crc kubenswrapper[5002]: I1209 10:15:58.333983 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d10c96a5-2f3f-4a79-a452-5a84dee8dc35-utilities\") pod \"redhat-marketplace-22jb8\" (UID: \"d10c96a5-2f3f-4a79-a452-5a84dee8dc35\") " pod="openshift-marketplace/redhat-marketplace-22jb8" Dec 09 10:15:58 crc kubenswrapper[5002]: I1209 10:15:58.334081 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbk5w\" (UniqueName: \"kubernetes.io/projected/d10c96a5-2f3f-4a79-a452-5a84dee8dc35-kube-api-access-dbk5w\") pod \"redhat-marketplace-22jb8\" (UID: \"d10c96a5-2f3f-4a79-a452-5a84dee8dc35\") " pod="openshift-marketplace/redhat-marketplace-22jb8" Dec 09 10:15:58 crc kubenswrapper[5002]: I1209 10:15:58.334107 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d10c96a5-2f3f-4a79-a452-5a84dee8dc35-catalog-content\") pod \"redhat-marketplace-22jb8\" (UID: \"d10c96a5-2f3f-4a79-a452-5a84dee8dc35\") " pod="openshift-marketplace/redhat-marketplace-22jb8" Dec 09 10:15:58 crc kubenswrapper[5002]: I1209 10:15:58.334597 5002 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d10c96a5-2f3f-4a79-a452-5a84dee8dc35-utilities\") pod \"redhat-marketplace-22jb8\" (UID: \"d10c96a5-2f3f-4a79-a452-5a84dee8dc35\") " pod="openshift-marketplace/redhat-marketplace-22jb8" Dec 09 10:15:58 crc kubenswrapper[5002]: I1209 10:15:58.334624 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d10c96a5-2f3f-4a79-a452-5a84dee8dc35-catalog-content\") pod \"redhat-marketplace-22jb8\" (UID: \"d10c96a5-2f3f-4a79-a452-5a84dee8dc35\") " pod="openshift-marketplace/redhat-marketplace-22jb8" Dec 09 10:15:58 crc kubenswrapper[5002]: I1209 10:15:58.355504 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbk5w\" (UniqueName: \"kubernetes.io/projected/d10c96a5-2f3f-4a79-a452-5a84dee8dc35-kube-api-access-dbk5w\") pod \"redhat-marketplace-22jb8\" (UID: \"d10c96a5-2f3f-4a79-a452-5a84dee8dc35\") " pod="openshift-marketplace/redhat-marketplace-22jb8" Dec 09 10:15:58 crc kubenswrapper[5002]: I1209 10:15:58.396885 5002 generic.go:334] "Generic (PLEG): container finished" podID="174ba3d1-ccf7-4886-998f-f1c125c1fb4b" containerID="63352309af6f67771715bb69d30f174dbc65a8b14c3a5bb8d2d41d6df6a9558f" exitCode=0 Dec 09 10:15:58 crc kubenswrapper[5002]: I1209 10:15:58.396960 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vxlcz" event={"ID":"174ba3d1-ccf7-4886-998f-f1c125c1fb4b","Type":"ContainerDied","Data":"63352309af6f67771715bb69d30f174dbc65a8b14c3a5bb8d2d41d6df6a9558f"} Dec 09 10:15:58 crc kubenswrapper[5002]: I1209 10:15:58.396989 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vxlcz" event={"ID":"174ba3d1-ccf7-4886-998f-f1c125c1fb4b","Type":"ContainerDied","Data":"fa06fd1b0d4d03e7eba22570b8d6abcf40d147cfdd25cb85c296e18ed084faef"} Dec 09 10:15:58 crc kubenswrapper[5002]: I1209 10:15:58.396996 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vxlcz" Dec 09 10:15:58 crc kubenswrapper[5002]: I1209 10:15:58.397007 5002 scope.go:117] "RemoveContainer" containerID="63352309af6f67771715bb69d30f174dbc65a8b14c3a5bb8d2d41d6df6a9558f" Dec 09 10:15:58 crc kubenswrapper[5002]: I1209 10:15:58.398420 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ad4jrx" event={"ID":"98fed57c-cdf3-4e3e-a7e5-422973325063","Type":"ContainerStarted","Data":"132a59786b781cbb7c10ad6c402593c40f9b011105801e6a828a298bcec6a0ad"} Dec 09 10:15:58 crc kubenswrapper[5002]: I1209 10:15:58.413961 5002 scope.go:117] "RemoveContainer" containerID="57a2a41a90dd3e958ed9dd384030cfb1e25dca29497086cd1f954b62c98a0f24" Dec 09 10:15:58 crc kubenswrapper[5002]: I1209 10:15:58.419973 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vxlcz"] Dec 09 10:15:58 crc kubenswrapper[5002]: I1209 10:15:58.424537 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vxlcz"] Dec 09 10:15:58 crc kubenswrapper[5002]: I1209 10:15:58.432587 5002 scope.go:117] "RemoveContainer" containerID="374e323b8f314f9d8e872eb8d5911e8a4add4b06d3fe89299622cb6f549996f3" Dec 09 10:15:58 crc kubenswrapper[5002]: I1209 10:15:58.432987 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-22jb8" Dec 09 10:15:58 crc kubenswrapper[5002]: I1209 10:15:58.449927 5002 scope.go:117] "RemoveContainer" containerID="63352309af6f67771715bb69d30f174dbc65a8b14c3a5bb8d2d41d6df6a9558f" Dec 09 10:15:58 crc kubenswrapper[5002]: E1209 10:15:58.451141 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63352309af6f67771715bb69d30f174dbc65a8b14c3a5bb8d2d41d6df6a9558f\": container with ID starting with 63352309af6f67771715bb69d30f174dbc65a8b14c3a5bb8d2d41d6df6a9558f not found: ID does not exist" containerID="63352309af6f67771715bb69d30f174dbc65a8b14c3a5bb8d2d41d6df6a9558f" Dec 09 10:15:58 crc kubenswrapper[5002]: I1209 10:15:58.451195 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63352309af6f67771715bb69d30f174dbc65a8b14c3a5bb8d2d41d6df6a9558f"} err="failed to get container status \"63352309af6f67771715bb69d30f174dbc65a8b14c3a5bb8d2d41d6df6a9558f\": rpc error: code = NotFound desc = could not find container \"63352309af6f67771715bb69d30f174dbc65a8b14c3a5bb8d2d41d6df6a9558f\": container with ID starting with 63352309af6f67771715bb69d30f174dbc65a8b14c3a5bb8d2d41d6df6a9558f not found: ID does not exist" Dec 09 10:15:58 crc kubenswrapper[5002]: I1209 10:15:58.451229 5002 scope.go:117] "RemoveContainer" containerID="57a2a41a90dd3e958ed9dd384030cfb1e25dca29497086cd1f954b62c98a0f24" Dec 09 10:15:58 crc kubenswrapper[5002]: E1209 10:15:58.451599 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57a2a41a90dd3e958ed9dd384030cfb1e25dca29497086cd1f954b62c98a0f24\": container with ID starting with 57a2a41a90dd3e958ed9dd384030cfb1e25dca29497086cd1f954b62c98a0f24 not found: ID does not exist" containerID="57a2a41a90dd3e958ed9dd384030cfb1e25dca29497086cd1f954b62c98a0f24" Dec 09 10:15:58 crc kubenswrapper[5002]: I1209 10:15:58.451631 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57a2a41a90dd3e958ed9dd384030cfb1e25dca29497086cd1f954b62c98a0f24"} err="failed to get container status \"57a2a41a90dd3e958ed9dd384030cfb1e25dca29497086cd1f954b62c98a0f24\": rpc error: code = NotFound desc = could not find container \"57a2a41a90dd3e958ed9dd384030cfb1e25dca29497086cd1f954b62c98a0f24\": container with ID starting with 57a2a41a90dd3e958ed9dd384030cfb1e25dca29497086cd1f954b62c98a0f24 not found: ID does not exist" Dec 09 10:15:58 crc kubenswrapper[5002]: I1209 10:15:58.451651 5002 scope.go:117] "RemoveContainer" containerID="374e323b8f314f9d8e872eb8d5911e8a4add4b06d3fe89299622cb6f549996f3" Dec 09 10:15:58 crc kubenswrapper[5002]: E1209 10:15:58.451971 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"374e323b8f314f9d8e872eb8d5911e8a4add4b06d3fe89299622cb6f549996f3\": container with ID starting with 374e323b8f314f9d8e872eb8d5911e8a4add4b06d3fe89299622cb6f549996f3 not found: ID does not exist" containerID="374e323b8f314f9d8e872eb8d5911e8a4add4b06d3fe89299622cb6f549996f3" Dec 09 10:15:58 crc kubenswrapper[5002]: I1209 10:15:58.452006 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"374e323b8f314f9d8e872eb8d5911e8a4add4b06d3fe89299622cb6f549996f3"} err="failed to get container status \"374e323b8f314f9d8e872eb8d5911e8a4add4b06d3fe89299622cb6f549996f3\": rpc error: code = 
NotFound desc = could not find container \"374e323b8f314f9d8e872eb8d5911e8a4add4b06d3fe89299622cb6f549996f3\": container with ID starting with 374e323b8f314f9d8e872eb8d5911e8a4add4b06d3fe89299622cb6f549996f3 not found: ID does not exist" Dec 09 10:15:58 crc kubenswrapper[5002]: I1209 10:15:58.895927 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-22jb8"] Dec 09 10:15:59 crc kubenswrapper[5002]: I1209 10:15:59.411863 5002 generic.go:334] "Generic (PLEG): container finished" podID="98fed57c-cdf3-4e3e-a7e5-422973325063" containerID="227c1b2c15a1b6976e388ef796f2d6979dedd9a4add9955984c1cac1bb5c6779" exitCode=0 Dec 09 10:15:59 crc kubenswrapper[5002]: I1209 10:15:59.412214 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ad4jrx" event={"ID":"98fed57c-cdf3-4e3e-a7e5-422973325063","Type":"ContainerDied","Data":"227c1b2c15a1b6976e388ef796f2d6979dedd9a4add9955984c1cac1bb5c6779"} Dec 09 10:15:59 crc kubenswrapper[5002]: I1209 10:15:59.417294 5002 generic.go:334] "Generic (PLEG): container finished" podID="d10c96a5-2f3f-4a79-a452-5a84dee8dc35" containerID="aced69f28f4299e4d81b87bc88d49e5eec9dfae3505761bb6e90e665613b8755" exitCode=0 Dec 09 10:15:59 crc kubenswrapper[5002]: I1209 10:15:59.417329 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-22jb8" event={"ID":"d10c96a5-2f3f-4a79-a452-5a84dee8dc35","Type":"ContainerDied","Data":"aced69f28f4299e4d81b87bc88d49e5eec9dfae3505761bb6e90e665613b8755"} Dec 09 10:15:59 crc kubenswrapper[5002]: I1209 10:15:59.417353 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-22jb8" event={"ID":"d10c96a5-2f3f-4a79-a452-5a84dee8dc35","Type":"ContainerStarted","Data":"a8fb13063633fac77e9416783658c2628b6af5f4b337644684ba405793302b56"} Dec 09 10:15:59 crc kubenswrapper[5002]: I1209 10:15:59.627698 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-8cbcs" Dec 09 10:15:59 crc kubenswrapper[5002]: I1209 10:15:59.664107 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-8cbcs" Dec 09 10:16:00 crc kubenswrapper[5002]: I1209 10:16:00.069805 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="174ba3d1-ccf7-4886-998f-f1c125c1fb4b" path="/var/lib/kubelet/pods/174ba3d1-ccf7-4886-998f-f1c125c1fb4b/volumes" Dec 09 10:16:00 crc kubenswrapper[5002]: I1209 10:16:00.880681 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7qbf7" Dec 09 10:16:00 crc kubenswrapper[5002]: I1209 10:16:00.881083 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7qbf7" Dec 09 10:16:00 crc kubenswrapper[5002]: I1209 10:16:00.930128 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7qbf7" Dec 09 10:16:01 crc kubenswrapper[5002]: I1209 10:16:01.432429 5002 generic.go:334] "Generic (PLEG): container finished" podID="d10c96a5-2f3f-4a79-a452-5a84dee8dc35" containerID="a2cb46cd3f361c06a60bc4f672101374d935e89e0a03d75781df6ae10cb265a8" exitCode=0 Dec 09 10:16:01 crc kubenswrapper[5002]: I1209 10:16:01.432498 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-22jb8" 
event={"ID":"d10c96a5-2f3f-4a79-a452-5a84dee8dc35","Type":"ContainerDied","Data":"a2cb46cd3f361c06a60bc4f672101374d935e89e0a03d75781df6ae10cb265a8"} Dec 09 10:16:01 crc kubenswrapper[5002]: I1209 10:16:01.496120 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7qbf7" Dec 09 10:16:03 crc kubenswrapper[5002]: I1209 10:16:03.459408 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ad4jrx" event={"ID":"98fed57c-cdf3-4e3e-a7e5-422973325063","Type":"ContainerStarted","Data":"272c3e21f3dc98872705f10cb6e26e2e88c0f8305dec2ccc12694648b6b061b2"} Dec 09 10:16:03 crc kubenswrapper[5002]: I1209 10:16:03.462743 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-22jb8" event={"ID":"d10c96a5-2f3f-4a79-a452-5a84dee8dc35","Type":"ContainerStarted","Data":"bea89da576b011b6976e11064493c3395ba8466a842abd0899da3c5974670e4d"} Dec 09 10:16:03 crc kubenswrapper[5002]: I1209 10:16:03.501559 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-22jb8" podStartSLOduration=1.928820425 podStartE2EDuration="5.501539719s" podCreationTimestamp="2025-12-09 10:15:58 +0000 UTC" firstStartedPulling="2025-12-09 10:15:59.421514841 +0000 UTC m=+891.813565922" lastFinishedPulling="2025-12-09 10:16:02.994234125 +0000 UTC m=+895.386285216" observedRunningTime="2025-12-09 10:16:03.498284392 +0000 UTC m=+895.890335493" watchObservedRunningTime="2025-12-09 10:16:03.501539719 +0000 UTC m=+895.893590820" Dec 09 10:16:04 crc kubenswrapper[5002]: I1209 10:16:04.470335 5002 generic.go:334] "Generic (PLEG): container finished" podID="98fed57c-cdf3-4e3e-a7e5-422973325063" containerID="272c3e21f3dc98872705f10cb6e26e2e88c0f8305dec2ccc12694648b6b061b2" exitCode=0 Dec 09 10:16:04 crc kubenswrapper[5002]: I1209 10:16:04.470377 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ad4jrx" event={"ID":"98fed57c-cdf3-4e3e-a7e5-422973325063","Type":"ContainerDied","Data":"272c3e21f3dc98872705f10cb6e26e2e88c0f8305dec2ccc12694648b6b061b2"} Dec 09 10:16:04 crc kubenswrapper[5002]: I1209 10:16:04.656885 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-qwxw6" Dec 09 10:16:05 crc kubenswrapper[5002]: I1209 10:16:05.482472 5002 generic.go:334] "Generic (PLEG): container finished" podID="98fed57c-cdf3-4e3e-a7e5-422973325063" containerID="dd09e4fd568fc40d68be87607c7d73b29eac67a0c9a60a5bb649a597a84a8ab8" exitCode=0 Dec 09 10:16:05 crc kubenswrapper[5002]: I1209 10:16:05.482550 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ad4jrx" event={"ID":"98fed57c-cdf3-4e3e-a7e5-422973325063","Type":"ContainerDied","Data":"dd09e4fd568fc40d68be87607c7d73b29eac67a0c9a60a5bb649a597a84a8ab8"} Dec 09 10:16:05 crc kubenswrapper[5002]: I1209 10:16:05.499203 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7qbf7"] Dec 09 10:16:05 crc kubenswrapper[5002]: I1209 10:16:05.499492 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7qbf7" podUID="73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f" containerName="registry-server" 
containerID="cri-o://9a960bbe2a2ddd1bce6f63018a261ad9a9e84ba85e0ee8e98897c1cd8353ce81" gracePeriod=2 Dec 09 10:16:05 crc kubenswrapper[5002]: I1209 10:16:05.954985 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7qbf7" Dec 09 10:16:06 crc kubenswrapper[5002]: I1209 10:16:06.087440 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsg9x\" (UniqueName: \"kubernetes.io/projected/73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f-kube-api-access-xsg9x\") pod \"73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f\" (UID: \"73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f\") " Dec 09 10:16:06 crc kubenswrapper[5002]: I1209 10:16:06.087514 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f-utilities\") pod \"73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f\" (UID: \"73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f\") " Dec 09 10:16:06 crc kubenswrapper[5002]: I1209 10:16:06.087535 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f-catalog-content\") pod \"73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f\" (UID: \"73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f\") " Dec 09 10:16:06 crc kubenswrapper[5002]: I1209 10:16:06.088465 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f-utilities" (OuterVolumeSpecName: "utilities") pod "73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f" (UID: "73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:16:06 crc kubenswrapper[5002]: I1209 10:16:06.100005 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f-kube-api-access-xsg9x" (OuterVolumeSpecName: "kube-api-access-xsg9x") pod "73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f" (UID: "73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f"). InnerVolumeSpecName "kube-api-access-xsg9x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:16:06 crc kubenswrapper[5002]: I1209 10:16:06.135966 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f" (UID: "73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:16:06 crc kubenswrapper[5002]: I1209 10:16:06.188558 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsg9x\" (UniqueName: \"kubernetes.io/projected/73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f-kube-api-access-xsg9x\") on node \"crc\" DevicePath \"\"" Dec 09 10:16:06 crc kubenswrapper[5002]: I1209 10:16:06.188589 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 10:16:06 crc kubenswrapper[5002]: I1209 10:16:06.188600 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 10:16:06 crc kubenswrapper[5002]: I1209 10:16:06.490723 5002 generic.go:334] "Generic (PLEG): container finished" podID="73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f" containerID="9a960bbe2a2ddd1bce6f63018a261ad9a9e84ba85e0ee8e98897c1cd8353ce81" exitCode=0 Dec 09 10:16:06 crc kubenswrapper[5002]: I1209 10:16:06.490855 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7qbf7" Dec 09 10:16:06 crc kubenswrapper[5002]: I1209 10:16:06.490847 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7qbf7" event={"ID":"73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f","Type":"ContainerDied","Data":"9a960bbe2a2ddd1bce6f63018a261ad9a9e84ba85e0ee8e98897c1cd8353ce81"} Dec 09 10:16:06 crc kubenswrapper[5002]: I1209 10:16:06.491653 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7qbf7" event={"ID":"73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f","Type":"ContainerDied","Data":"fb3acbb6641ae2b2b39c634971979cd4c7c9cbf35588c617754b72562bd15c0f"} Dec 09 10:16:06 crc kubenswrapper[5002]: I1209 10:16:06.491676 5002 scope.go:117] "RemoveContainer" containerID="9a960bbe2a2ddd1bce6f63018a261ad9a9e84ba85e0ee8e98897c1cd8353ce81" Dec 09 10:16:06 crc kubenswrapper[5002]: I1209 10:16:06.524247 5002 scope.go:117] "RemoveContainer" containerID="7a12ee046d249efada5bbc8980bd29ab56a32fb57be4b4888b180d4184674e18" Dec 09 10:16:06 crc kubenswrapper[5002]: I1209 10:16:06.529260 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7qbf7"] Dec 09 10:16:06 crc kubenswrapper[5002]: I1209 10:16:06.533130 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7qbf7"] Dec 09 10:16:06 crc kubenswrapper[5002]: I1209 10:16:06.558244 5002 scope.go:117] "RemoveContainer" containerID="82f7cd3a22e9d9a4e075f67cd2e6d40ce01cbbee9adfd17372a5dbf9e07f9f9d" Dec 09 10:16:06 crc kubenswrapper[5002]: I1209 10:16:06.586139 5002 scope.go:117] "RemoveContainer" containerID="9a960bbe2a2ddd1bce6f63018a261ad9a9e84ba85e0ee8e98897c1cd8353ce81" Dec 09 10:16:06 crc kubenswrapper[5002]: E1209 10:16:06.586705 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a960bbe2a2ddd1bce6f63018a261ad9a9e84ba85e0ee8e98897c1cd8353ce81\": container with ID starting with 9a960bbe2a2ddd1bce6f63018a261ad9a9e84ba85e0ee8e98897c1cd8353ce81 not found: ID does not exist" containerID="9a960bbe2a2ddd1bce6f63018a261ad9a9e84ba85e0ee8e98897c1cd8353ce81" Dec 09 10:16:06 crc kubenswrapper[5002]: I1209 10:16:06.586748 
5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a960bbe2a2ddd1bce6f63018a261ad9a9e84ba85e0ee8e98897c1cd8353ce81"} err="failed to get container status \"9a960bbe2a2ddd1bce6f63018a261ad9a9e84ba85e0ee8e98897c1cd8353ce81\": rpc error: code = NotFound desc = could not find container \"9a960bbe2a2ddd1bce6f63018a261ad9a9e84ba85e0ee8e98897c1cd8353ce81\": container with ID starting with 9a960bbe2a2ddd1bce6f63018a261ad9a9e84ba85e0ee8e98897c1cd8353ce81 not found: ID does not exist" Dec 09 10:16:06 crc kubenswrapper[5002]: I1209 10:16:06.586775 5002 scope.go:117] "RemoveContainer" containerID="7a12ee046d249efada5bbc8980bd29ab56a32fb57be4b4888b180d4184674e18" Dec 09 10:16:06 crc kubenswrapper[5002]: E1209 10:16:06.587189 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a12ee046d249efada5bbc8980bd29ab56a32fb57be4b4888b180d4184674e18\": container with ID starting with 7a12ee046d249efada5bbc8980bd29ab56a32fb57be4b4888b180d4184674e18 not found: ID does not exist" containerID="7a12ee046d249efada5bbc8980bd29ab56a32fb57be4b4888b180d4184674e18" Dec 09 10:16:06 crc kubenswrapper[5002]: I1209 10:16:06.587222 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a12ee046d249efada5bbc8980bd29ab56a32fb57be4b4888b180d4184674e18"} err="failed to get container status \"7a12ee046d249efada5bbc8980bd29ab56a32fb57be4b4888b180d4184674e18\": rpc error: code = NotFound desc = could not find container \"7a12ee046d249efada5bbc8980bd29ab56a32fb57be4b4888b180d4184674e18\": container with ID starting with 7a12ee046d249efada5bbc8980bd29ab56a32fb57be4b4888b180d4184674e18 not found: ID does not exist" Dec 09 10:16:06 crc kubenswrapper[5002]: I1209 10:16:06.587240 5002 scope.go:117] "RemoveContainer" containerID="82f7cd3a22e9d9a4e075f67cd2e6d40ce01cbbee9adfd17372a5dbf9e07f9f9d" Dec 09 10:16:06 crc kubenswrapper[5002]: E1209 10:16:06.587611 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82f7cd3a22e9d9a4e075f67cd2e6d40ce01cbbee9adfd17372a5dbf9e07f9f9d\": container with ID starting with 82f7cd3a22e9d9a4e075f67cd2e6d40ce01cbbee9adfd17372a5dbf9e07f9f9d not found: ID does not exist" containerID="82f7cd3a22e9d9a4e075f67cd2e6d40ce01cbbee9adfd17372a5dbf9e07f9f9d" Dec 09 10:16:06 crc kubenswrapper[5002]: I1209 10:16:06.587642 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82f7cd3a22e9d9a4e075f67cd2e6d40ce01cbbee9adfd17372a5dbf9e07f9f9d"} err="failed to get container status \"82f7cd3a22e9d9a4e075f67cd2e6d40ce01cbbee9adfd17372a5dbf9e07f9f9d\": rpc error: code = NotFound desc = could not find container \"82f7cd3a22e9d9a4e075f67cd2e6d40ce01cbbee9adfd17372a5dbf9e07f9f9d\": container with ID starting with 82f7cd3a22e9d9a4e075f67cd2e6d40ce01cbbee9adfd17372a5dbf9e07f9f9d not found: ID does not exist" Dec 09 10:16:06 crc kubenswrapper[5002]: I1209 10:16:06.815130 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ad4jrx" Dec 09 10:16:06 crc kubenswrapper[5002]: I1209 10:16:06.999411 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmc8r\" (UniqueName: \"kubernetes.io/projected/98fed57c-cdf3-4e3e-a7e5-422973325063-kube-api-access-rmc8r\") pod \"98fed57c-cdf3-4e3e-a7e5-422973325063\" (UID: \"98fed57c-cdf3-4e3e-a7e5-422973325063\") " Dec 09 10:16:06 crc kubenswrapper[5002]: I1209 10:16:06.999528 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/98fed57c-cdf3-4e3e-a7e5-422973325063-bundle\") pod \"98fed57c-cdf3-4e3e-a7e5-422973325063\" (UID: \"98fed57c-cdf3-4e3e-a7e5-422973325063\") " Dec 09 10:16:06 crc kubenswrapper[5002]: I1209 10:16:06.999704 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/98fed57c-cdf3-4e3e-a7e5-422973325063-util\") pod \"98fed57c-cdf3-4e3e-a7e5-422973325063\" (UID: \"98fed57c-cdf3-4e3e-a7e5-422973325063\") " Dec 09 10:16:07 crc kubenswrapper[5002]: I1209 10:16:07.001290 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98fed57c-cdf3-4e3e-a7e5-422973325063-bundle" (OuterVolumeSpecName: "bundle") pod "98fed57c-cdf3-4e3e-a7e5-422973325063" (UID: "98fed57c-cdf3-4e3e-a7e5-422973325063"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:16:07 crc kubenswrapper[5002]: I1209 10:16:07.007707 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98fed57c-cdf3-4e3e-a7e5-422973325063-kube-api-access-rmc8r" (OuterVolumeSpecName: "kube-api-access-rmc8r") pod "98fed57c-cdf3-4e3e-a7e5-422973325063" (UID: "98fed57c-cdf3-4e3e-a7e5-422973325063"). InnerVolumeSpecName "kube-api-access-rmc8r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:16:07 crc kubenswrapper[5002]: I1209 10:16:07.026786 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98fed57c-cdf3-4e3e-a7e5-422973325063-util" (OuterVolumeSpecName: "util") pod "98fed57c-cdf3-4e3e-a7e5-422973325063" (UID: "98fed57c-cdf3-4e3e-a7e5-422973325063"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:16:07 crc kubenswrapper[5002]: I1209 10:16:07.102072 5002 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/98fed57c-cdf3-4e3e-a7e5-422973325063-util\") on node \"crc\" DevicePath \"\"" Dec 09 10:16:07 crc kubenswrapper[5002]: I1209 10:16:07.102121 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmc8r\" (UniqueName: \"kubernetes.io/projected/98fed57c-cdf3-4e3e-a7e5-422973325063-kube-api-access-rmc8r\") on node \"crc\" DevicePath \"\"" Dec 09 10:16:07 crc kubenswrapper[5002]: I1209 10:16:07.102186 5002 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/98fed57c-cdf3-4e3e-a7e5-422973325063-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:16:07 crc kubenswrapper[5002]: I1209 10:16:07.500526 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ad4jrx" event={"ID":"98fed57c-cdf3-4e3e-a7e5-422973325063","Type":"ContainerDied","Data":"132a59786b781cbb7c10ad6c402593c40f9b011105801e6a828a298bcec6a0ad"} Dec 09 10:16:07 crc kubenswrapper[5002]: I1209 10:16:07.500555 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ad4jrx" Dec 09 10:16:07 crc kubenswrapper[5002]: I1209 10:16:07.500582 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="132a59786b781cbb7c10ad6c402593c40f9b011105801e6a828a298bcec6a0ad" Dec 09 10:16:08 crc kubenswrapper[5002]: I1209 10:16:08.069925 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f" path="/var/lib/kubelet/pods/73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f/volumes" Dec 09 10:16:08 crc kubenswrapper[5002]: I1209 10:16:08.433804 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-22jb8" Dec 09 10:16:08 crc kubenswrapper[5002]: I1209 10:16:08.433947 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-22jb8" Dec 09 10:16:08 crc kubenswrapper[5002]: I1209 10:16:08.499794 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-22jb8" Dec 09 10:16:08 crc kubenswrapper[5002]: I1209 10:16:08.606228 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-22jb8" Dec 09 10:16:13 crc kubenswrapper[5002]: I1209 10:16:13.096945 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-22jb8"] Dec 09 10:16:13 crc kubenswrapper[5002]: I1209 10:16:13.097554 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-22jb8" podUID="d10c96a5-2f3f-4a79-a452-5a84dee8dc35" containerName="registry-server" containerID="cri-o://bea89da576b011b6976e11064493c3395ba8466a842abd0899da3c5974670e4d" gracePeriod=2 Dec 09 10:16:13 crc kubenswrapper[5002]: I1209 10:16:13.443623 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-gb4b6"] Dec 09 10:16:13 crc kubenswrapper[5002]: E1209 10:16:13.444199 5002 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="98fed57c-cdf3-4e3e-a7e5-422973325063" containerName="extract" Dec 09 10:16:13 crc kubenswrapper[5002]: I1209 10:16:13.444223 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="98fed57c-cdf3-4e3e-a7e5-422973325063" containerName="extract" Dec 09 10:16:13 crc kubenswrapper[5002]: E1209 10:16:13.444237 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f" containerName="extract-utilities" Dec 09 10:16:13 crc kubenswrapper[5002]: I1209 10:16:13.444245 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f" containerName="extract-utilities" Dec 09 10:16:13 crc kubenswrapper[5002]: E1209 10:16:13.444254 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f" containerName="extract-content" Dec 09 10:16:13 crc kubenswrapper[5002]: I1209 10:16:13.444262 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f" containerName="extract-content" Dec 09 10:16:13 crc kubenswrapper[5002]: E1209 10:16:13.444276 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f" containerName="registry-server" Dec 09 10:16:13 crc kubenswrapper[5002]: I1209 10:16:13.444283 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f" containerName="registry-server" Dec 09 10:16:13 crc kubenswrapper[5002]: E1209 10:16:13.444291 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98fed57c-cdf3-4e3e-a7e5-422973325063" containerName="pull" Dec 09 10:16:13 crc kubenswrapper[5002]: I1209 10:16:13.444298 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="98fed57c-cdf3-4e3e-a7e5-422973325063" containerName="pull" Dec 09 10:16:13 crc kubenswrapper[5002]: E1209 10:16:13.444329 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98fed57c-cdf3-4e3e-a7e5-422973325063" containerName="util" Dec 09 10:16:13 crc kubenswrapper[5002]: I1209 10:16:13.444336 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="98fed57c-cdf3-4e3e-a7e5-422973325063" containerName="util" Dec 09 10:16:13 crc kubenswrapper[5002]: I1209 10:16:13.444480 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="73b6d3ec-aaa0-4e5d-8213-ebe5e0b2dc9f" containerName="registry-server" Dec 09 10:16:13 crc kubenswrapper[5002]: I1209 10:16:13.444493 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="98fed57c-cdf3-4e3e-a7e5-422973325063" containerName="extract" Dec 09 10:16:13 crc kubenswrapper[5002]: I1209 10:16:13.445017 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-gb4b6" Dec 09 10:16:13 crc kubenswrapper[5002]: I1209 10:16:13.446690 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Dec 09 10:16:13 crc kubenswrapper[5002]: I1209 10:16:13.447108 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Dec 09 10:16:13 crc kubenswrapper[5002]: I1209 10:16:13.449200 5002 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-zxtpx" Dec 09 10:16:13 crc kubenswrapper[5002]: I1209 10:16:13.458341 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-gb4b6"] Dec 09 10:16:13 crc kubenswrapper[5002]: I1209 10:16:13.606566 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kw6c\" (UniqueName: \"kubernetes.io/projected/46fae402-f213-4829-b136-e2d6cb44a505-kube-api-access-7kw6c\") pod \"cert-manager-operator-controller-manager-64cf6dff88-gb4b6\" (UID: \"46fae402-f213-4829-b136-e2d6cb44a505\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-gb4b6" Dec 09 10:16:13 crc kubenswrapper[5002]: I1209 10:16:13.606667 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/46fae402-f213-4829-b136-e2d6cb44a505-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-gb4b6\" (UID: \"46fae402-f213-4829-b136-e2d6cb44a505\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-gb4b6" Dec 09 10:16:13 crc kubenswrapper[5002]: I1209 10:16:13.707326 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kw6c\" (UniqueName: \"kubernetes.io/projected/46fae402-f213-4829-b136-e2d6cb44a505-kube-api-access-7kw6c\") pod \"cert-manager-operator-controller-manager-64cf6dff88-gb4b6\" (UID: \"46fae402-f213-4829-b136-e2d6cb44a505\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-gb4b6" Dec 09 10:16:13 crc kubenswrapper[5002]: I1209 10:16:13.707401 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/46fae402-f213-4829-b136-e2d6cb44a505-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-gb4b6\" (UID: \"46fae402-f213-4829-b136-e2d6cb44a505\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-gb4b6" Dec 09 10:16:13 crc kubenswrapper[5002]: I1209 10:16:13.707891 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/46fae402-f213-4829-b136-e2d6cb44a505-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-gb4b6\" (UID: \"46fae402-f213-4829-b136-e2d6cb44a505\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-gb4b6" Dec 09 10:16:13 crc kubenswrapper[5002]: I1209 10:16:13.748499 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kw6c\" (UniqueName: \"kubernetes.io/projected/46fae402-f213-4829-b136-e2d6cb44a505-kube-api-access-7kw6c\") pod \"cert-manager-operator-controller-manager-64cf6dff88-gb4b6\" (UID: \"46fae402-f213-4829-b136-e2d6cb44a505\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-gb4b6" Dec 09 10:16:13 crc kubenswrapper[5002]: I1209 10:16:13.760302 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-gb4b6" Dec 09 10:16:14 crc kubenswrapper[5002]: I1209 10:16:14.047189 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-22jb8" Dec 09 10:16:14 crc kubenswrapper[5002]: I1209 10:16:14.214296 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d10c96a5-2f3f-4a79-a452-5a84dee8dc35-catalog-content\") pod \"d10c96a5-2f3f-4a79-a452-5a84dee8dc35\" (UID: \"d10c96a5-2f3f-4a79-a452-5a84dee8dc35\") " Dec 09 10:16:14 crc kubenswrapper[5002]: I1209 10:16:14.214357 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbk5w\" (UniqueName: \"kubernetes.io/projected/d10c96a5-2f3f-4a79-a452-5a84dee8dc35-kube-api-access-dbk5w\") pod \"d10c96a5-2f3f-4a79-a452-5a84dee8dc35\" (UID: \"d10c96a5-2f3f-4a79-a452-5a84dee8dc35\") " Dec 09 10:16:14 crc kubenswrapper[5002]: I1209 10:16:14.214538 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d10c96a5-2f3f-4a79-a452-5a84dee8dc35-utilities\") pod \"d10c96a5-2f3f-4a79-a452-5a84dee8dc35\" (UID: \"d10c96a5-2f3f-4a79-a452-5a84dee8dc35\") " Dec 09 10:16:14 crc kubenswrapper[5002]: I1209 10:16:14.215883 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d10c96a5-2f3f-4a79-a452-5a84dee8dc35-utilities" (OuterVolumeSpecName: "utilities") pod "d10c96a5-2f3f-4a79-a452-5a84dee8dc35" (UID: "d10c96a5-2f3f-4a79-a452-5a84dee8dc35"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:16:14 crc kubenswrapper[5002]: I1209 10:16:14.222801 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d10c96a5-2f3f-4a79-a452-5a84dee8dc35-kube-api-access-dbk5w" (OuterVolumeSpecName: "kube-api-access-dbk5w") pod "d10c96a5-2f3f-4a79-a452-5a84dee8dc35" (UID: "d10c96a5-2f3f-4a79-a452-5a84dee8dc35"). InnerVolumeSpecName "kube-api-access-dbk5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:16:14 crc kubenswrapper[5002]: I1209 10:16:14.234285 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d10c96a5-2f3f-4a79-a452-5a84dee8dc35-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d10c96a5-2f3f-4a79-a452-5a84dee8dc35" (UID: "d10c96a5-2f3f-4a79-a452-5a84dee8dc35"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:16:14 crc kubenswrapper[5002]: I1209 10:16:14.260118 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-gb4b6"] Dec 09 10:16:14 crc kubenswrapper[5002]: W1209 10:16:14.275548 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46fae402_f213_4829_b136_e2d6cb44a505.slice/crio-b7fa619d35a2336cdf2c99ab69bf5686207ed39d6fc0d02078b4adc6b9583f9b WatchSource:0}: Error finding container b7fa619d35a2336cdf2c99ab69bf5686207ed39d6fc0d02078b4adc6b9583f9b: Status 404 returned error can't find the container with id b7fa619d35a2336cdf2c99ab69bf5686207ed39d6fc0d02078b4adc6b9583f9b Dec 09 10:16:14 crc kubenswrapper[5002]: I1209 10:16:14.316025 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d10c96a5-2f3f-4a79-a452-5a84dee8dc35-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 10:16:14 crc kubenswrapper[5002]: I1209 10:16:14.316068 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d10c96a5-2f3f-4a79-a452-5a84dee8dc35-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 10:16:14 crc kubenswrapper[5002]: I1209 10:16:14.316082 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbk5w\" (UniqueName: \"kubernetes.io/projected/d10c96a5-2f3f-4a79-a452-5a84dee8dc35-kube-api-access-dbk5w\") on node \"crc\" DevicePath \"\"" Dec 09 10:16:14 crc kubenswrapper[5002]: I1209 10:16:14.557928 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-gb4b6" event={"ID":"46fae402-f213-4829-b136-e2d6cb44a505","Type":"ContainerStarted","Data":"b7fa619d35a2336cdf2c99ab69bf5686207ed39d6fc0d02078b4adc6b9583f9b"} Dec 09 10:16:14 crc kubenswrapper[5002]: I1209 10:16:14.559918 5002 generic.go:334] "Generic (PLEG): container finished" podID="d10c96a5-2f3f-4a79-a452-5a84dee8dc35" containerID="bea89da576b011b6976e11064493c3395ba8466a842abd0899da3c5974670e4d" exitCode=0 Dec 09 10:16:14 crc kubenswrapper[5002]: I1209 10:16:14.559970 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-22jb8" event={"ID":"d10c96a5-2f3f-4a79-a452-5a84dee8dc35","Type":"ContainerDied","Data":"bea89da576b011b6976e11064493c3395ba8466a842abd0899da3c5974670e4d"} Dec 09 10:16:14 crc kubenswrapper[5002]: I1209 10:16:14.560002 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-22jb8" event={"ID":"d10c96a5-2f3f-4a79-a452-5a84dee8dc35","Type":"ContainerDied","Data":"a8fb13063633fac77e9416783658c2628b6af5f4b337644684ba405793302b56"} Dec 09 10:16:14 crc kubenswrapper[5002]: I1209 10:16:14.560020 5002 scope.go:117] "RemoveContainer" containerID="bea89da576b011b6976e11064493c3395ba8466a842abd0899da3c5974670e4d" Dec 09 10:16:14 crc kubenswrapper[5002]: I1209 10:16:14.559977 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-22jb8" Dec 09 10:16:14 crc kubenswrapper[5002]: I1209 10:16:14.578741 5002 scope.go:117] "RemoveContainer" containerID="a2cb46cd3f361c06a60bc4f672101374d935e89e0a03d75781df6ae10cb265a8" Dec 09 10:16:14 crc kubenswrapper[5002]: I1209 10:16:14.596474 5002 scope.go:117] "RemoveContainer" containerID="aced69f28f4299e4d81b87bc88d49e5eec9dfae3505761bb6e90e665613b8755" Dec 09 10:16:14 crc kubenswrapper[5002]: I1209 10:16:14.603737 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-22jb8"] Dec 09 10:16:14 crc kubenswrapper[5002]: I1209 10:16:14.613174 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-22jb8"] Dec 09 10:16:14 crc kubenswrapper[5002]: I1209 10:16:14.621463 5002 scope.go:117] "RemoveContainer" containerID="bea89da576b011b6976e11064493c3395ba8466a842abd0899da3c5974670e4d" Dec 09 10:16:14 crc kubenswrapper[5002]: E1209 10:16:14.621939 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bea89da576b011b6976e11064493c3395ba8466a842abd0899da3c5974670e4d\": container with ID starting with bea89da576b011b6976e11064493c3395ba8466a842abd0899da3c5974670e4d not found: ID does not exist" containerID="bea89da576b011b6976e11064493c3395ba8466a842abd0899da3c5974670e4d" Dec 09 10:16:14 crc kubenswrapper[5002]: I1209 10:16:14.621986 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bea89da576b011b6976e11064493c3395ba8466a842abd0899da3c5974670e4d"} err="failed to get container status \"bea89da576b011b6976e11064493c3395ba8466a842abd0899da3c5974670e4d\": rpc error: code = NotFound desc = could not find container \"bea89da576b011b6976e11064493c3395ba8466a842abd0899da3c5974670e4d\": container with ID starting with bea89da576b011b6976e11064493c3395ba8466a842abd0899da3c5974670e4d not found: ID does not exist" Dec 09 10:16:14 crc kubenswrapper[5002]: I1209 10:16:14.622020 5002 scope.go:117] "RemoveContainer" containerID="a2cb46cd3f361c06a60bc4f672101374d935e89e0a03d75781df6ae10cb265a8" Dec 09 10:16:14 crc kubenswrapper[5002]: E1209 10:16:14.622325 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2cb46cd3f361c06a60bc4f672101374d935e89e0a03d75781df6ae10cb265a8\": container with ID starting with a2cb46cd3f361c06a60bc4f672101374d935e89e0a03d75781df6ae10cb265a8 not found: ID does not exist" containerID="a2cb46cd3f361c06a60bc4f672101374d935e89e0a03d75781df6ae10cb265a8" Dec 09 10:16:14 crc kubenswrapper[5002]: I1209 10:16:14.622355 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2cb46cd3f361c06a60bc4f672101374d935e89e0a03d75781df6ae10cb265a8"} err="failed to get container status \"a2cb46cd3f361c06a60bc4f672101374d935e89e0a03d75781df6ae10cb265a8\": rpc error: code = NotFound desc = could not find container \"a2cb46cd3f361c06a60bc4f672101374d935e89e0a03d75781df6ae10cb265a8\": container with ID starting with a2cb46cd3f361c06a60bc4f672101374d935e89e0a03d75781df6ae10cb265a8 not found: ID does not exist" Dec 09 10:16:14 crc kubenswrapper[5002]: I1209 10:16:14.622375 5002 scope.go:117] "RemoveContainer" containerID="aced69f28f4299e4d81b87bc88d49e5eec9dfae3505761bb6e90e665613b8755" Dec 09 10:16:14 crc kubenswrapper[5002]: E1209 10:16:14.622589 5002 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"aced69f28f4299e4d81b87bc88d49e5eec9dfae3505761bb6e90e665613b8755\": container with ID starting with aced69f28f4299e4d81b87bc88d49e5eec9dfae3505761bb6e90e665613b8755 not found: ID does not exist" containerID="aced69f28f4299e4d81b87bc88d49e5eec9dfae3505761bb6e90e665613b8755" Dec 09 10:16:14 crc kubenswrapper[5002]: I1209 10:16:14.622623 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aced69f28f4299e4d81b87bc88d49e5eec9dfae3505761bb6e90e665613b8755"} err="failed to get container status \"aced69f28f4299e4d81b87bc88d49e5eec9dfae3505761bb6e90e665613b8755\": rpc error: code = NotFound desc = could not find container \"aced69f28f4299e4d81b87bc88d49e5eec9dfae3505761bb6e90e665613b8755\": container with ID starting with aced69f28f4299e4d81b87bc88d49e5eec9dfae3505761bb6e90e665613b8755 not found: ID does not exist" Dec 09 10:16:14 crc kubenswrapper[5002]: I1209 10:16:14.636097 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-8cbcs" Dec 09 10:16:16 crc kubenswrapper[5002]: I1209 10:16:16.094572 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d10c96a5-2f3f-4a79-a452-5a84dee8dc35" path="/var/lib/kubelet/pods/d10c96a5-2f3f-4a79-a452-5a84dee8dc35/volumes" Dec 09 10:16:22 crc kubenswrapper[5002]: I1209 10:16:22.632963 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-gb4b6" event={"ID":"46fae402-f213-4829-b136-e2d6cb44a505","Type":"ContainerStarted","Data":"243325d16d0874a4acb8dc2fc23e1e5184aeeced47d36a8c957b0f7a5eac9426"} Dec 09 10:16:22 crc kubenswrapper[5002]: I1209 10:16:22.670666 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-gb4b6" podStartSLOduration=1.907089474 podStartE2EDuration="9.670637056s" podCreationTimestamp="2025-12-09 10:16:13 +0000 UTC" firstStartedPulling="2025-12-09 10:16:14.276944555 +0000 UTC m=+906.668995646" lastFinishedPulling="2025-12-09 10:16:22.040492127 +0000 UTC m=+914.432543228" observedRunningTime="2025-12-09 10:16:22.666752313 +0000 UTC m=+915.058803454" watchObservedRunningTime="2025-12-09 10:16:22.670637056 +0000 UTC m=+915.062688187" Dec 09 10:16:25 crc kubenswrapper[5002]: I1209 10:16:25.827226 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-nm6zd"] Dec 09 10:16:25 crc kubenswrapper[5002]: E1209 10:16:25.827701 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d10c96a5-2f3f-4a79-a452-5a84dee8dc35" containerName="registry-server" Dec 09 10:16:25 crc kubenswrapper[5002]: I1209 10:16:25.827712 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="d10c96a5-2f3f-4a79-a452-5a84dee8dc35" containerName="registry-server" Dec 09 10:16:25 crc kubenswrapper[5002]: E1209 10:16:25.827729 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d10c96a5-2f3f-4a79-a452-5a84dee8dc35" containerName="extract-content" Dec 09 10:16:25 crc kubenswrapper[5002]: I1209 10:16:25.827736 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="d10c96a5-2f3f-4a79-a452-5a84dee8dc35" containerName="extract-content" Dec 09 10:16:25 crc kubenswrapper[5002]: E1209 10:16:25.827752 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d10c96a5-2f3f-4a79-a452-5a84dee8dc35" containerName="extract-utilities" 
Dec 09 10:16:25 crc kubenswrapper[5002]: I1209 10:16:25.827757 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="d10c96a5-2f3f-4a79-a452-5a84dee8dc35" containerName="extract-utilities" Dec 09 10:16:25 crc kubenswrapper[5002]: I1209 10:16:25.827877 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="d10c96a5-2f3f-4a79-a452-5a84dee8dc35" containerName="registry-server" Dec 09 10:16:25 crc kubenswrapper[5002]: I1209 10:16:25.828262 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-nm6zd" Dec 09 10:16:25 crc kubenswrapper[5002]: I1209 10:16:25.829971 5002 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-dlk9j" Dec 09 10:16:25 crc kubenswrapper[5002]: I1209 10:16:25.830233 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 09 10:16:25 crc kubenswrapper[5002]: I1209 10:16:25.833238 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 09 10:16:25 crc kubenswrapper[5002]: I1209 10:16:25.844650 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-nm6zd"] Dec 09 10:16:25 crc kubenswrapper[5002]: I1209 10:16:25.969893 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg65q\" (UniqueName: \"kubernetes.io/projected/19550cb9-1d04-4081-b7e9-f0f678c45926-kube-api-access-vg65q\") pod \"cert-manager-webhook-f4fb5df64-nm6zd\" (UID: \"19550cb9-1d04-4081-b7e9-f0f678c45926\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-nm6zd" Dec 09 10:16:25 crc kubenswrapper[5002]: I1209 10:16:25.970104 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/19550cb9-1d04-4081-b7e9-f0f678c45926-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-nm6zd\" (UID: \"19550cb9-1d04-4081-b7e9-f0f678c45926\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-nm6zd" Dec 09 10:16:26 crc kubenswrapper[5002]: I1209 10:16:26.071697 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/19550cb9-1d04-4081-b7e9-f0f678c45926-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-nm6zd\" (UID: \"19550cb9-1d04-4081-b7e9-f0f678c45926\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-nm6zd" Dec 09 10:16:26 crc kubenswrapper[5002]: I1209 10:16:26.071810 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vg65q\" (UniqueName: \"kubernetes.io/projected/19550cb9-1d04-4081-b7e9-f0f678c45926-kube-api-access-vg65q\") pod \"cert-manager-webhook-f4fb5df64-nm6zd\" (UID: \"19550cb9-1d04-4081-b7e9-f0f678c45926\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-nm6zd" Dec 09 10:16:26 crc kubenswrapper[5002]: I1209 10:16:26.098466 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/19550cb9-1d04-4081-b7e9-f0f678c45926-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-nm6zd\" (UID: \"19550cb9-1d04-4081-b7e9-f0f678c45926\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-nm6zd" Dec 09 10:16:26 crc kubenswrapper[5002]: I1209 10:16:26.098505 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg65q\" 
(UniqueName: \"kubernetes.io/projected/19550cb9-1d04-4081-b7e9-f0f678c45926-kube-api-access-vg65q\") pod \"cert-manager-webhook-f4fb5df64-nm6zd\" (UID: \"19550cb9-1d04-4081-b7e9-f0f678c45926\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-nm6zd" Dec 09 10:16:26 crc kubenswrapper[5002]: I1209 10:16:26.145270 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-nm6zd" Dec 09 10:16:26 crc kubenswrapper[5002]: I1209 10:16:26.619601 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-nm6zd"] Dec 09 10:16:26 crc kubenswrapper[5002]: I1209 10:16:26.657474 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-nm6zd" event={"ID":"19550cb9-1d04-4081-b7e9-f0f678c45926","Type":"ContainerStarted","Data":"8ec15484d5c75ebb6ed7dc3322f8b9a604281a17ace27ccd8fe5a80f5708d0be"} Dec 09 10:16:29 crc kubenswrapper[5002]: I1209 10:16:29.331709 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-wwznf"] Dec 09 10:16:29 crc kubenswrapper[5002]: I1209 10:16:29.332928 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-wwznf" Dec 09 10:16:29 crc kubenswrapper[5002]: I1209 10:16:29.336392 5002 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-pg7xk" Dec 09 10:16:29 crc kubenswrapper[5002]: I1209 10:16:29.348890 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-wwznf"] Dec 09 10:16:29 crc kubenswrapper[5002]: I1209 10:16:29.428992 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzch2\" (UniqueName: \"kubernetes.io/projected/0f05f554-fd63-4f53-bd2a-229c13a59ae2-kube-api-access-xzch2\") pod \"cert-manager-cainjector-855d9ccff4-wwznf\" (UID: \"0f05f554-fd63-4f53-bd2a-229c13a59ae2\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-wwznf" Dec 09 10:16:29 crc kubenswrapper[5002]: I1209 10:16:29.429215 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0f05f554-fd63-4f53-bd2a-229c13a59ae2-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-wwznf\" (UID: \"0f05f554-fd63-4f53-bd2a-229c13a59ae2\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-wwznf" Dec 09 10:16:29 crc kubenswrapper[5002]: I1209 10:16:29.530652 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzch2\" (UniqueName: \"kubernetes.io/projected/0f05f554-fd63-4f53-bd2a-229c13a59ae2-kube-api-access-xzch2\") pod \"cert-manager-cainjector-855d9ccff4-wwznf\" (UID: \"0f05f554-fd63-4f53-bd2a-229c13a59ae2\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-wwznf" Dec 09 10:16:29 crc kubenswrapper[5002]: I1209 10:16:29.530708 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0f05f554-fd63-4f53-bd2a-229c13a59ae2-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-wwznf\" (UID: \"0f05f554-fd63-4f53-bd2a-229c13a59ae2\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-wwznf" Dec 09 10:16:29 crc kubenswrapper[5002]: I1209 10:16:29.555323 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0f05f554-fd63-4f53-bd2a-229c13a59ae2-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-wwznf\" (UID: \"0f05f554-fd63-4f53-bd2a-229c13a59ae2\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-wwznf" Dec 09 10:16:29 crc kubenswrapper[5002]: I1209 10:16:29.558905 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzch2\" (UniqueName: \"kubernetes.io/projected/0f05f554-fd63-4f53-bd2a-229c13a59ae2-kube-api-access-xzch2\") pod \"cert-manager-cainjector-855d9ccff4-wwznf\" (UID: \"0f05f554-fd63-4f53-bd2a-229c13a59ae2\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-wwznf" Dec 09 10:16:29 crc kubenswrapper[5002]: I1209 10:16:29.660891 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-wwznf" Dec 09 10:16:30 crc kubenswrapper[5002]: I1209 10:16:30.118883 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-wwznf"] Dec 09 10:16:30 crc kubenswrapper[5002]: I1209 10:16:30.682168 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-wwznf" event={"ID":"0f05f554-fd63-4f53-bd2a-229c13a59ae2","Type":"ContainerStarted","Data":"a85b66e2a75e15589e875a47f4bebd0ec4a2b2b7710738ca5962f7b06a7041aa"} Dec 09 10:16:35 crc kubenswrapper[5002]: I1209 10:16:35.712495 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-nm6zd" event={"ID":"19550cb9-1d04-4081-b7e9-f0f678c45926","Type":"ContainerStarted","Data":"f5945034512028549bc0797f9738eb6b7a9a56218197a91cbc53aec2d4c1c4df"} Dec 09 10:16:35 crc kubenswrapper[5002]: I1209 10:16:35.714053 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-f4fb5df64-nm6zd" Dec 09 10:16:35 crc kubenswrapper[5002]: I1209 10:16:35.714425 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-wwznf" event={"ID":"0f05f554-fd63-4f53-bd2a-229c13a59ae2","Type":"ContainerStarted","Data":"73dc6e8ef56a0ec7833a189720cc5de81dbd82a59a5304f04ff9ba1e3ea726ae"} Dec 09 10:16:35 crc kubenswrapper[5002]: I1209 10:16:35.733069 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-f4fb5df64-nm6zd" podStartSLOduration=2.638978345 podStartE2EDuration="10.733048918s" podCreationTimestamp="2025-12-09 10:16:25 +0000 UTC" firstStartedPulling="2025-12-09 10:16:26.627332999 +0000 UTC m=+919.019384080" lastFinishedPulling="2025-12-09 10:16:34.721403572 +0000 UTC m=+927.113454653" observedRunningTime="2025-12-09 10:16:35.727964664 +0000 UTC m=+928.120015745" watchObservedRunningTime="2025-12-09 10:16:35.733048918 +0000 UTC m=+928.125100009" Dec 09 10:16:39 crc kubenswrapper[5002]: I1209 10:16:39.248448 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-855d9ccff4-wwznf" podStartSLOduration=5.678759618 podStartE2EDuration="10.248405535s" podCreationTimestamp="2025-12-09 10:16:29 +0000 UTC" firstStartedPulling="2025-12-09 10:16:30.135511586 +0000 UTC m=+922.527562667" lastFinishedPulling="2025-12-09 10:16:34.705157503 +0000 UTC m=+927.097208584" observedRunningTime="2025-12-09 10:16:35.749759289 +0000 UTC m=+928.141810380" watchObservedRunningTime="2025-12-09 10:16:39.248405535 +0000 UTC m=+931.640456636" Dec 09 10:16:39 crc 
Dec 09 10:16:39 crc kubenswrapper[5002]: I1209 10:16:39.263091 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-86cb77c54b-lqcpz"]
Dec 09 10:16:39 crc kubenswrapper[5002]: I1209 10:16:39.264062 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-lqcpz"
Dec 09 10:16:39 crc kubenswrapper[5002]: I1209 10:16:39.268749 5002 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-kx4rv"
Dec 09 10:16:39 crc kubenswrapper[5002]: I1209 10:16:39.279845 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-lqcpz"]
Dec 09 10:16:39 crc kubenswrapper[5002]: I1209 10:16:39.364497 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jbmc\" (UniqueName: \"kubernetes.io/projected/e8dfcd91-b1dc-4187-98b6-5d7e10e0a92d-kube-api-access-7jbmc\") pod \"cert-manager-86cb77c54b-lqcpz\" (UID: \"e8dfcd91-b1dc-4187-98b6-5d7e10e0a92d\") " pod="cert-manager/cert-manager-86cb77c54b-lqcpz"
Dec 09 10:16:39 crc kubenswrapper[5002]: I1209 10:16:39.364787 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e8dfcd91-b1dc-4187-98b6-5d7e10e0a92d-bound-sa-token\") pod \"cert-manager-86cb77c54b-lqcpz\" (UID: \"e8dfcd91-b1dc-4187-98b6-5d7e10e0a92d\") " pod="cert-manager/cert-manager-86cb77c54b-lqcpz"
Dec 09 10:16:39 crc kubenswrapper[5002]: I1209 10:16:39.466170 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e8dfcd91-b1dc-4187-98b6-5d7e10e0a92d-bound-sa-token\") pod \"cert-manager-86cb77c54b-lqcpz\" (UID: \"e8dfcd91-b1dc-4187-98b6-5d7e10e0a92d\") " pod="cert-manager/cert-manager-86cb77c54b-lqcpz"
Dec 09 10:16:39 crc kubenswrapper[5002]: I1209 10:16:39.466526 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jbmc\" (UniqueName: \"kubernetes.io/projected/e8dfcd91-b1dc-4187-98b6-5d7e10e0a92d-kube-api-access-7jbmc\") pod \"cert-manager-86cb77c54b-lqcpz\" (UID: \"e8dfcd91-b1dc-4187-98b6-5d7e10e0a92d\") " pod="cert-manager/cert-manager-86cb77c54b-lqcpz"
Dec 09 10:16:39 crc kubenswrapper[5002]: I1209 10:16:39.490472 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e8dfcd91-b1dc-4187-98b6-5d7e10e0a92d-bound-sa-token\") pod \"cert-manager-86cb77c54b-lqcpz\" (UID: \"e8dfcd91-b1dc-4187-98b6-5d7e10e0a92d\") " pod="cert-manager/cert-manager-86cb77c54b-lqcpz"
Dec 09 10:16:39 crc kubenswrapper[5002]: I1209 10:16:39.490834 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jbmc\" (UniqueName: \"kubernetes.io/projected/e8dfcd91-b1dc-4187-98b6-5d7e10e0a92d-kube-api-access-7jbmc\") pod \"cert-manager-86cb77c54b-lqcpz\" (UID: \"e8dfcd91-b1dc-4187-98b6-5d7e10e0a92d\") " pod="cert-manager/cert-manager-86cb77c54b-lqcpz"
Dec 09 10:16:39 crc kubenswrapper[5002]: I1209 10:16:39.594929 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-lqcpz"
Dec 09 10:16:40 crc kubenswrapper[5002]: I1209 10:16:40.021144 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-lqcpz"]
Dec 09 10:16:40 crc kubenswrapper[5002]: W1209 10:16:40.025461 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8dfcd91_b1dc_4187_98b6_5d7e10e0a92d.slice/crio-345295913cad705ae6919429e170ee6fe476ef5c6fddee8869372eae9f471afa WatchSource:0}: Error finding container 345295913cad705ae6919429e170ee6fe476ef5c6fddee8869372eae9f471afa: Status 404 returned error can't find the container with id 345295913cad705ae6919429e170ee6fe476ef5c6fddee8869372eae9f471afa
Dec 09 10:16:40 crc kubenswrapper[5002]: I1209 10:16:40.761869 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-lqcpz" event={"ID":"e8dfcd91-b1dc-4187-98b6-5d7e10e0a92d","Type":"ContainerStarted","Data":"90aa1d64a7fa0735b0f0c29b19d56601757f0fd35fe1ce8f761607dbc0d006c3"}
Dec 09 10:16:40 crc kubenswrapper[5002]: I1209 10:16:40.762237 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-lqcpz" event={"ID":"e8dfcd91-b1dc-4187-98b6-5d7e10e0a92d","Type":"ContainerStarted","Data":"345295913cad705ae6919429e170ee6fe476ef5c6fddee8869372eae9f471afa"}
Dec 09 10:16:40 crc kubenswrapper[5002]: I1209 10:16:40.781806 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-86cb77c54b-lqcpz" podStartSLOduration=1.7817829889999999 podStartE2EDuration="1.781782989s" podCreationTimestamp="2025-12-09 10:16:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:16:40.777306901 +0000 UTC m=+933.169358022" watchObservedRunningTime="2025-12-09 10:16:40.781782989 +0000 UTC m=+933.173834080"
Dec 09 10:16:41 crc kubenswrapper[5002]: I1209 10:16:41.147940 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-f4fb5df64-nm6zd"
Dec 09 10:16:44 crc kubenswrapper[5002]: I1209 10:16:44.583720 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-nfzvd"]
Dec 09 10:16:44 crc kubenswrapper[5002]: I1209 10:16:44.584983 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-nfzvd"
Dec 09 10:16:44 crc kubenswrapper[5002]: I1209 10:16:44.586771 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-n5qrh"
Dec 09 10:16:44 crc kubenswrapper[5002]: I1209 10:16:44.586941 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt"
Dec 09 10:16:44 crc kubenswrapper[5002]: I1209 10:16:44.587068 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt"
Dec 09 10:16:44 crc kubenswrapper[5002]: I1209 10:16:44.619638 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-nfzvd"]
Dec 09 10:16:44 crc kubenswrapper[5002]: I1209 10:16:44.638315 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8blt\" (UniqueName: \"kubernetes.io/projected/1444a761-8aeb-4dd1-b55b-22e924d924b0-kube-api-access-p8blt\") pod \"openstack-operator-index-nfzvd\" (UID: \"1444a761-8aeb-4dd1-b55b-22e924d924b0\") " pod="openstack-operators/openstack-operator-index-nfzvd"
Dec 09 10:16:44 crc kubenswrapper[5002]: I1209 10:16:44.739345 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8blt\" (UniqueName: \"kubernetes.io/projected/1444a761-8aeb-4dd1-b55b-22e924d924b0-kube-api-access-p8blt\") pod \"openstack-operator-index-nfzvd\" (UID: \"1444a761-8aeb-4dd1-b55b-22e924d924b0\") " pod="openstack-operators/openstack-operator-index-nfzvd"
Dec 09 10:16:44 crc kubenswrapper[5002]: I1209 10:16:44.766111 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8blt\" (UniqueName: \"kubernetes.io/projected/1444a761-8aeb-4dd1-b55b-22e924d924b0-kube-api-access-p8blt\") pod \"openstack-operator-index-nfzvd\" (UID: \"1444a761-8aeb-4dd1-b55b-22e924d924b0\") " pod="openstack-operators/openstack-operator-index-nfzvd"
Dec 09 10:16:44 crc kubenswrapper[5002]: I1209 10:16:44.908490 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-nfzvd"
Dec 09 10:16:45 crc kubenswrapper[5002]: I1209 10:16:45.340713 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-nfzvd"]
Dec 09 10:16:45 crc kubenswrapper[5002]: I1209 10:16:45.791269 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nfzvd" event={"ID":"1444a761-8aeb-4dd1-b55b-22e924d924b0","Type":"ContainerStarted","Data":"84faae9d6bd72bf026df77bde28ffada6b1947a47c2dff4c71ffb5c2a5e93d18"}
Dec 09 10:16:47 crc kubenswrapper[5002]: I1209 10:16:47.804485 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nfzvd" event={"ID":"1444a761-8aeb-4dd1-b55b-22e924d924b0","Type":"ContainerStarted","Data":"a6a3180ad41bcb0deb5cbbeb310cab6935415c16879bf1c599eed7bb08b3ca29"}
Dec 09 10:16:47 crc kubenswrapper[5002]: I1209 10:16:47.830875 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-nfzvd" podStartSLOduration=1.557009002 podStartE2EDuration="3.830856747s" podCreationTimestamp="2025-12-09 10:16:44 +0000 UTC" firstStartedPulling="2025-12-09 10:16:45.349581968 +0000 UTC m=+937.741633089" lastFinishedPulling="2025-12-09 10:16:47.623429753 +0000 UTC m=+940.015480834" observedRunningTime="2025-12-09 10:16:47.826075631 +0000 UTC m=+940.218126712" watchObservedRunningTime="2025-12-09 10:16:47.830856747 +0000 UTC m=+940.222907828"
Dec 09 10:16:47 crc kubenswrapper[5002]: I1209 10:16:47.965085 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-nfzvd"]
Dec 09 10:16:48 crc kubenswrapper[5002]: I1209 10:16:48.576472 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-s67vq"]
Dec 09 10:16:48 crc kubenswrapper[5002]: I1209 10:16:48.579288 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-s67vq"
Dec 09 10:16:48 crc kubenswrapper[5002]: I1209 10:16:48.587001 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-s67vq"]
Dec 09 10:16:48 crc kubenswrapper[5002]: I1209 10:16:48.612021 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njj96\" (UniqueName: \"kubernetes.io/projected/267396c4-1ded-4436-9caa-a45bb8a54e75-kube-api-access-njj96\") pod \"openstack-operator-index-s67vq\" (UID: \"267396c4-1ded-4436-9caa-a45bb8a54e75\") " pod="openstack-operators/openstack-operator-index-s67vq"
Dec 09 10:16:48 crc kubenswrapper[5002]: I1209 10:16:48.714664 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njj96\" (UniqueName: \"kubernetes.io/projected/267396c4-1ded-4436-9caa-a45bb8a54e75-kube-api-access-njj96\") pod \"openstack-operator-index-s67vq\" (UID: \"267396c4-1ded-4436-9caa-a45bb8a54e75\") " pod="openstack-operators/openstack-operator-index-s67vq"
Dec 09 10:16:48 crc kubenswrapper[5002]: I1209 10:16:48.736284 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njj96\" (UniqueName: \"kubernetes.io/projected/267396c4-1ded-4436-9caa-a45bb8a54e75-kube-api-access-njj96\") pod \"openstack-operator-index-s67vq\" (UID: \"267396c4-1ded-4436-9caa-a45bb8a54e75\") " pod="openstack-operators/openstack-operator-index-s67vq"
Dec 09 10:16:48 crc kubenswrapper[5002]: I1209 10:16:48.912717 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-s67vq"
Dec 09 10:16:49 crc kubenswrapper[5002]: I1209 10:16:49.424034 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-s67vq"]
Dec 09 10:16:49 crc kubenswrapper[5002]: W1209 10:16:49.437639 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod267396c4_1ded_4436_9caa_a45bb8a54e75.slice/crio-a51390362e5a28421d74f01b1125322ff96b7d4439eb159295a05a293ce56eab WatchSource:0}: Error finding container a51390362e5a28421d74f01b1125322ff96b7d4439eb159295a05a293ce56eab: Status 404 returned error can't find the container with id a51390362e5a28421d74f01b1125322ff96b7d4439eb159295a05a293ce56eab
Dec 09 10:16:49 crc kubenswrapper[5002]: I1209 10:16:49.819590 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-s67vq" event={"ID":"267396c4-1ded-4436-9caa-a45bb8a54e75","Type":"ContainerStarted","Data":"0df8f96bd89beb0f81cc8113957da69870208984e65a553e451f0b3fa3c93958"}
Dec 09 10:16:49 crc kubenswrapper[5002]: I1209 10:16:49.820320 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-s67vq" event={"ID":"267396c4-1ded-4436-9caa-a45bb8a54e75","Type":"ContainerStarted","Data":"a51390362e5a28421d74f01b1125322ff96b7d4439eb159295a05a293ce56eab"}
Dec 09 10:16:49 crc kubenswrapper[5002]: I1209 10:16:49.819784 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-nfzvd" podUID="1444a761-8aeb-4dd1-b55b-22e924d924b0" containerName="registry-server" containerID="cri-o://a6a3180ad41bcb0deb5cbbeb310cab6935415c16879bf1c599eed7bb08b3ca29" gracePeriod=2
Dec 09 10:16:49 crc kubenswrapper[5002]: I1209 10:16:49.842990 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-s67vq" podStartSLOduration=1.778278287 podStartE2EDuration="1.842971664s" podCreationTimestamp="2025-12-09 10:16:48 +0000 UTC" firstStartedPulling="2025-12-09 10:16:49.442471265 +0000 UTC m=+941.834522386" lastFinishedPulling="2025-12-09 10:16:49.507164672 +0000 UTC m=+941.899215763" observedRunningTime="2025-12-09 10:16:49.841992888 +0000 UTC m=+942.234043979" watchObservedRunningTime="2025-12-09 10:16:49.842971664 +0000 UTC m=+942.235022755"
Dec 09 10:16:50 crc kubenswrapper[5002]: I1209 10:16:50.198779 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-nfzvd"
Dec 09 10:16:50 crc kubenswrapper[5002]: I1209 10:16:50.237293 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8blt\" (UniqueName: \"kubernetes.io/projected/1444a761-8aeb-4dd1-b55b-22e924d924b0-kube-api-access-p8blt\") pod \"1444a761-8aeb-4dd1-b55b-22e924d924b0\" (UID: \"1444a761-8aeb-4dd1-b55b-22e924d924b0\") "
Dec 09 10:16:50 crc kubenswrapper[5002]: I1209 10:16:50.242398 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1444a761-8aeb-4dd1-b55b-22e924d924b0-kube-api-access-p8blt" (OuterVolumeSpecName: "kube-api-access-p8blt") pod "1444a761-8aeb-4dd1-b55b-22e924d924b0" (UID: "1444a761-8aeb-4dd1-b55b-22e924d924b0"). InnerVolumeSpecName "kube-api-access-p8blt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 10:16:50 crc kubenswrapper[5002]: I1209 10:16:50.339607 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8blt\" (UniqueName: \"kubernetes.io/projected/1444a761-8aeb-4dd1-b55b-22e924d924b0-kube-api-access-p8blt\") on node \"crc\" DevicePath \"\""
Dec 09 10:16:50 crc kubenswrapper[5002]: I1209 10:16:50.828880 5002 generic.go:334] "Generic (PLEG): container finished" podID="1444a761-8aeb-4dd1-b55b-22e924d924b0" containerID="a6a3180ad41bcb0deb5cbbeb310cab6935415c16879bf1c599eed7bb08b3ca29" exitCode=0
Dec 09 10:16:50 crc kubenswrapper[5002]: I1209 10:16:50.828953 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nfzvd" event={"ID":"1444a761-8aeb-4dd1-b55b-22e924d924b0","Type":"ContainerDied","Data":"a6a3180ad41bcb0deb5cbbeb310cab6935415c16879bf1c599eed7bb08b3ca29"}
Dec 09 10:16:50 crc kubenswrapper[5002]: I1209 10:16:50.829234 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nfzvd" event={"ID":"1444a761-8aeb-4dd1-b55b-22e924d924b0","Type":"ContainerDied","Data":"84faae9d6bd72bf026df77bde28ffada6b1947a47c2dff4c71ffb5c2a5e93d18"}
Dec 09 10:16:50 crc kubenswrapper[5002]: I1209 10:16:50.829269 5002 scope.go:117] "RemoveContainer" containerID="a6a3180ad41bcb0deb5cbbeb310cab6935415c16879bf1c599eed7bb08b3ca29"
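
Records in this format are regular enough to mine mechanically; the "SyncLoop (PLEG)" lines, for instance, carry a pod name plus a small JSON event. A minimal Python sketch of a filter for them, assuming one journal record per line on stdin (for example via `journalctl -u kubelet -o cat`); the field layout is taken from the lines above:

    import json
    import re
    import sys

    # Matches: "SyncLoop (PLEG): event for pod" pod="ns/name" event={...}
    PLEG = re.compile(r'"SyncLoop \(PLEG\): event for pod" pod="([^"]+)" event=(\{.*?\})')

    for line in sys.stdin:
        m = PLEG.search(line)
        if m:
            pod, event = m.group(1), json.loads(m.group(2))
            print(pod, event["Type"], event["Data"])
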
Dec 09 10:16:50 crc kubenswrapper[5002]: I1209 10:16:50.828996 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-nfzvd"
Dec 09 10:16:50 crc kubenswrapper[5002]: I1209 10:16:50.851931 5002 scope.go:117] "RemoveContainer" containerID="a6a3180ad41bcb0deb5cbbeb310cab6935415c16879bf1c599eed7bb08b3ca29"
Dec 09 10:16:50 crc kubenswrapper[5002]: E1209 10:16:50.852620 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6a3180ad41bcb0deb5cbbeb310cab6935415c16879bf1c599eed7bb08b3ca29\": container with ID starting with a6a3180ad41bcb0deb5cbbeb310cab6935415c16879bf1c599eed7bb08b3ca29 not found: ID does not exist" containerID="a6a3180ad41bcb0deb5cbbeb310cab6935415c16879bf1c599eed7bb08b3ca29"
Dec 09 10:16:50 crc kubenswrapper[5002]: I1209 10:16:50.852666 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6a3180ad41bcb0deb5cbbeb310cab6935415c16879bf1c599eed7bb08b3ca29"} err="failed to get container status \"a6a3180ad41bcb0deb5cbbeb310cab6935415c16879bf1c599eed7bb08b3ca29\": rpc error: code = NotFound desc = could not find container \"a6a3180ad41bcb0deb5cbbeb310cab6935415c16879bf1c599eed7bb08b3ca29\": container with ID starting with a6a3180ad41bcb0deb5cbbeb310cab6935415c16879bf1c599eed7bb08b3ca29 not found: ID does not exist"
Dec 09 10:16:50 crc kubenswrapper[5002]: I1209 10:16:50.869445 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-nfzvd"]
Dec 09 10:16:50 crc kubenswrapper[5002]: I1209 10:16:50.875687 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-nfzvd"]
Dec 09 10:16:52 crc kubenswrapper[5002]: I1209 10:16:52.075690 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1444a761-8aeb-4dd1-b55b-22e924d924b0" path="/var/lib/kubelet/pods/1444a761-8aeb-4dd1-b55b-22e924d924b0/volumes"
Dec 09 10:16:58 crc kubenswrapper[5002]: I1209 10:16:58.913346 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-s67vq"
Dec 09 10:16:58 crc kubenswrapper[5002]: I1209 10:16:58.915582 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-s67vq"
Dec 09 10:16:58 crc kubenswrapper[5002]: I1209 10:16:58.939447 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-s67vq"
Dec 09 10:16:59 crc kubenswrapper[5002]: I1209 10:16:59.943169 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-s67vq"
Dec 09 10:17:07 crc kubenswrapper[5002]: I1209 10:17:07.383492 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/538f80a306e96e20ae463cfd8070d55ec3227b7ea36219057239327815sk54c"]
Dec 09 10:17:07 crc kubenswrapper[5002]: E1209 10:17:07.384447 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1444a761-8aeb-4dd1-b55b-22e924d924b0" containerName="registry-server"
Dec 09 10:17:07 crc kubenswrapper[5002]: I1209 10:17:07.384544 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="1444a761-8aeb-4dd1-b55b-22e924d924b0" containerName="registry-server"
Dec 09 10:17:07 crc kubenswrapper[5002]: I1209 10:17:07.384670 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="1444a761-8aeb-4dd1-b55b-22e924d924b0" containerName="registry-server"
Dec 09 10:17:07 crc kubenswrapper[5002]: I1209 10:17:07.385468 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/538f80a306e96e20ae463cfd8070d55ec3227b7ea36219057239327815sk54c"
Dec 09 10:17:07 crc kubenswrapper[5002]: I1209 10:17:07.389256 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-5x46w"
Dec 09 10:17:07 crc kubenswrapper[5002]: I1209 10:17:07.394488 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/538f80a306e96e20ae463cfd8070d55ec3227b7ea36219057239327815sk54c"]
Dec 09 10:17:07 crc kubenswrapper[5002]: I1209 10:17:07.575171 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c673bba7-aec6-4d60-a236-e168b4570805-bundle\") pod \"538f80a306e96e20ae463cfd8070d55ec3227b7ea36219057239327815sk54c\" (UID: \"c673bba7-aec6-4d60-a236-e168b4570805\") " pod="openstack-operators/538f80a306e96e20ae463cfd8070d55ec3227b7ea36219057239327815sk54c"
Dec 09 10:17:07 crc kubenswrapper[5002]: I1209 10:17:07.575249 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c673bba7-aec6-4d60-a236-e168b4570805-util\") pod \"538f80a306e96e20ae463cfd8070d55ec3227b7ea36219057239327815sk54c\" (UID: \"c673bba7-aec6-4d60-a236-e168b4570805\") " pod="openstack-operators/538f80a306e96e20ae463cfd8070d55ec3227b7ea36219057239327815sk54c"
Dec 09 10:17:07 crc kubenswrapper[5002]: I1209 10:17:07.575297 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhqz2\" (UniqueName: \"kubernetes.io/projected/c673bba7-aec6-4d60-a236-e168b4570805-kube-api-access-jhqz2\") pod \"538f80a306e96e20ae463cfd8070d55ec3227b7ea36219057239327815sk54c\" (UID: \"c673bba7-aec6-4d60-a236-e168b4570805\") " pod="openstack-operators/538f80a306e96e20ae463cfd8070d55ec3227b7ea36219057239327815sk54c"
Dec 09 10:17:07 crc kubenswrapper[5002]: I1209 10:17:07.676723 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c673bba7-aec6-4d60-a236-e168b4570805-bundle\") pod \"538f80a306e96e20ae463cfd8070d55ec3227b7ea36219057239327815sk54c\" (UID: \"c673bba7-aec6-4d60-a236-e168b4570805\") " pod="openstack-operators/538f80a306e96e20ae463cfd8070d55ec3227b7ea36219057239327815sk54c"
Dec 09 10:17:07 crc kubenswrapper[5002]: I1209 10:17:07.676781 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c673bba7-aec6-4d60-a236-e168b4570805-util\") pod \"538f80a306e96e20ae463cfd8070d55ec3227b7ea36219057239327815sk54c\" (UID: \"c673bba7-aec6-4d60-a236-e168b4570805\") " pod="openstack-operators/538f80a306e96e20ae463cfd8070d55ec3227b7ea36219057239327815sk54c"
Dec 09 10:17:07 crc kubenswrapper[5002]: I1209 10:17:07.676843 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhqz2\" (UniqueName: \"kubernetes.io/projected/c673bba7-aec6-4d60-a236-e168b4570805-kube-api-access-jhqz2\") pod \"538f80a306e96e20ae463cfd8070d55ec3227b7ea36219057239327815sk54c\" (UID: \"c673bba7-aec6-4d60-a236-e168b4570805\") " pod="openstack-operators/538f80a306e96e20ae463cfd8070d55ec3227b7ea36219057239327815sk54c"
Dec 09 10:17:07 crc kubenswrapper[5002]: I1209 10:17:07.677449 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c673bba7-aec6-4d60-a236-e168b4570805-bundle\") pod \"538f80a306e96e20ae463cfd8070d55ec3227b7ea36219057239327815sk54c\" (UID: \"c673bba7-aec6-4d60-a236-e168b4570805\") " pod="openstack-operators/538f80a306e96e20ae463cfd8070d55ec3227b7ea36219057239327815sk54c"
Dec 09 10:17:07 crc kubenswrapper[5002]: I1209 10:17:07.677525 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c673bba7-aec6-4d60-a236-e168b4570805-util\") pod \"538f80a306e96e20ae463cfd8070d55ec3227b7ea36219057239327815sk54c\" (UID: \"c673bba7-aec6-4d60-a236-e168b4570805\") " pod="openstack-operators/538f80a306e96e20ae463cfd8070d55ec3227b7ea36219057239327815sk54c"
Dec 09 10:17:07 crc kubenswrapper[5002]: I1209 10:17:07.706862 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhqz2\" (UniqueName: \"kubernetes.io/projected/c673bba7-aec6-4d60-a236-e168b4570805-kube-api-access-jhqz2\") pod \"538f80a306e96e20ae463cfd8070d55ec3227b7ea36219057239327815sk54c\" (UID: \"c673bba7-aec6-4d60-a236-e168b4570805\") " pod="openstack-operators/538f80a306e96e20ae463cfd8070d55ec3227b7ea36219057239327815sk54c"
Dec 09 10:17:08 crc kubenswrapper[5002]: I1209 10:17:08.004476 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-5x46w"
Dec 09 10:17:08 crc kubenswrapper[5002]: I1209 10:17:08.011617 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/538f80a306e96e20ae463cfd8070d55ec3227b7ea36219057239327815sk54c"
Dec 09 10:17:08 crc kubenswrapper[5002]: I1209 10:17:08.280600 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/538f80a306e96e20ae463cfd8070d55ec3227b7ea36219057239327815sk54c"]
Dec 09 10:17:08 crc kubenswrapper[5002]: I1209 10:17:08.960577 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/538f80a306e96e20ae463cfd8070d55ec3227b7ea36219057239327815sk54c" event={"ID":"c673bba7-aec6-4d60-a236-e168b4570805","Type":"ContainerStarted","Data":"1f23170ecb4e82d5d7744111ff563e044c109f3a6c36ff4954cccdcbda1002bc"}
Dec 09 10:17:09 crc kubenswrapper[5002]: I1209 10:17:09.972430 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/538f80a306e96e20ae463cfd8070d55ec3227b7ea36219057239327815sk54c" event={"ID":"c673bba7-aec6-4d60-a236-e168b4570805","Type":"ContainerStarted","Data":"73a5166e5368be0b43bf18511ffa60df33dd0a1b07f67b59ff7bedcc6e8a5388"}
Dec 09 10:17:10 crc kubenswrapper[5002]: I1209 10:17:10.982630 5002 generic.go:334] "Generic (PLEG): container finished" podID="c673bba7-aec6-4d60-a236-e168b4570805" containerID="73a5166e5368be0b43bf18511ffa60df33dd0a1b07f67b59ff7bedcc6e8a5388" exitCode=0
Dec 09 10:17:10 crc kubenswrapper[5002]: I1209 10:17:10.982695 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/538f80a306e96e20ae463cfd8070d55ec3227b7ea36219057239327815sk54c" event={"ID":"c673bba7-aec6-4d60-a236-e168b4570805","Type":"ContainerDied","Data":"73a5166e5368be0b43bf18511ffa60df33dd0a1b07f67b59ff7bedcc6e8a5388"}
Dec 09 10:17:11 crc kubenswrapper[5002]: I1209 10:17:11.994359 5002 generic.go:334] "Generic (PLEG): container finished" podID="c673bba7-aec6-4d60-a236-e168b4570805" containerID="f5488ec4e02622a6addac72a9ed75e45ebe3ea4f041f437f7767af99953653b8" exitCode=0
Dec 09 10:17:11 crc kubenswrapper[5002]: I1209 10:17:11.994461 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/538f80a306e96e20ae463cfd8070d55ec3227b7ea36219057239327815sk54c" event={"ID":"c673bba7-aec6-4d60-a236-e168b4570805","Type":"ContainerDied","Data":"f5488ec4e02622a6addac72a9ed75e45ebe3ea4f041f437f7767af99953653b8"}
Dec 09 10:17:13 crc kubenswrapper[5002]: I1209 10:17:13.008345 5002 generic.go:334] "Generic (PLEG): container finished" podID="c673bba7-aec6-4d60-a236-e168b4570805" containerID="49573b134b5a1f51772badcb669a829d97f0e6fb929c0a6e7f53afa92d81cd39" exitCode=0
Dec 09 10:17:13 crc kubenswrapper[5002]: I1209 10:17:13.008413 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/538f80a306e96e20ae463cfd8070d55ec3227b7ea36219057239327815sk54c" event={"ID":"c673bba7-aec6-4d60-a236-e168b4570805","Type":"ContainerDied","Data":"49573b134b5a1f51772badcb669a829d97f0e6fb929c0a6e7f53afa92d81cd39"}
Dec 09 10:17:14 crc kubenswrapper[5002]: I1209 10:17:14.400770 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/538f80a306e96e20ae463cfd8070d55ec3227b7ea36219057239327815sk54c"
Dec 09 10:17:14 crc kubenswrapper[5002]: I1209 10:17:14.578752 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c673bba7-aec6-4d60-a236-e168b4570805-util\") pod \"c673bba7-aec6-4d60-a236-e168b4570805\" (UID: \"c673bba7-aec6-4d60-a236-e168b4570805\") "
Dec 09 10:17:14 crc kubenswrapper[5002]: I1209 10:17:14.578999 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c673bba7-aec6-4d60-a236-e168b4570805-bundle\") pod \"c673bba7-aec6-4d60-a236-e168b4570805\" (UID: \"c673bba7-aec6-4d60-a236-e168b4570805\") "
Dec 09 10:17:14 crc kubenswrapper[5002]: I1209 10:17:14.579134 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhqz2\" (UniqueName: \"kubernetes.io/projected/c673bba7-aec6-4d60-a236-e168b4570805-kube-api-access-jhqz2\") pod \"c673bba7-aec6-4d60-a236-e168b4570805\" (UID: \"c673bba7-aec6-4d60-a236-e168b4570805\") "
Dec 09 10:17:14 crc kubenswrapper[5002]: I1209 10:17:14.580042 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c673bba7-aec6-4d60-a236-e168b4570805-bundle" (OuterVolumeSpecName: "bundle") pod "c673bba7-aec6-4d60-a236-e168b4570805" (UID: "c673bba7-aec6-4d60-a236-e168b4570805"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 10:17:14 crc kubenswrapper[5002]: I1209 10:17:14.587908 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c673bba7-aec6-4d60-a236-e168b4570805-kube-api-access-jhqz2" (OuterVolumeSpecName: "kube-api-access-jhqz2") pod "c673bba7-aec6-4d60-a236-e168b4570805" (UID: "c673bba7-aec6-4d60-a236-e168b4570805"). InnerVolumeSpecName "kube-api-access-jhqz2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 10:17:14 crc kubenswrapper[5002]: I1209 10:17:14.608240 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c673bba7-aec6-4d60-a236-e168b4570805-util" (OuterVolumeSpecName: "util") pod "c673bba7-aec6-4d60-a236-e168b4570805" (UID: "c673bba7-aec6-4d60-a236-e168b4570805"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 10:17:14 crc kubenswrapper[5002]: I1209 10:17:14.680878 5002 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c673bba7-aec6-4d60-a236-e168b4570805-util\") on node \"crc\" DevicePath \"\""
Dec 09 10:17:14 crc kubenswrapper[5002]: I1209 10:17:14.680939 5002 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c673bba7-aec6-4d60-a236-e168b4570805-bundle\") on node \"crc\" DevicePath \"\""
Dec 09 10:17:14 crc kubenswrapper[5002]: I1209 10:17:14.680963 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhqz2\" (UniqueName: \"kubernetes.io/projected/c673bba7-aec6-4d60-a236-e168b4570805-kube-api-access-jhqz2\") on node \"crc\" DevicePath \"\""
Dec 09 10:17:15 crc kubenswrapper[5002]: I1209 10:17:15.027946 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/538f80a306e96e20ae463cfd8070d55ec3227b7ea36219057239327815sk54c" event={"ID":"c673bba7-aec6-4d60-a236-e168b4570805","Type":"ContainerDied","Data":"1f23170ecb4e82d5d7744111ff563e044c109f3a6c36ff4954cccdcbda1002bc"}
Dec 09 10:17:15 crc kubenswrapper[5002]: I1209 10:17:15.028536 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f23170ecb4e82d5d7744111ff563e044c109f3a6c36ff4954cccdcbda1002bc"
Dec 09 10:17:15 crc kubenswrapper[5002]: I1209 10:17:15.028055 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/538f80a306e96e20ae463cfd8070d55ec3227b7ea36219057239327815sk54c"
Dec 09 10:17:19 crc kubenswrapper[5002]: I1209 10:17:19.630888 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5d69f78f7c-pmv68"]
Dec 09 10:17:19 crc kubenswrapper[5002]: E1209 10:17:19.631509 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c673bba7-aec6-4d60-a236-e168b4570805" containerName="util"
Dec 09 10:17:19 crc kubenswrapper[5002]: I1209 10:17:19.631528 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="c673bba7-aec6-4d60-a236-e168b4570805" containerName="util"
Dec 09 10:17:19 crc kubenswrapper[5002]: E1209 10:17:19.631541 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c673bba7-aec6-4d60-a236-e168b4570805" containerName="extract"
Dec 09 10:17:19 crc kubenswrapper[5002]: I1209 10:17:19.631550 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="c673bba7-aec6-4d60-a236-e168b4570805" containerName="extract"
Dec 09 10:17:19 crc kubenswrapper[5002]: E1209 10:17:19.631564 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c673bba7-aec6-4d60-a236-e168b4570805" containerName="pull"
Dec 09 10:17:19 crc kubenswrapper[5002]: I1209 10:17:19.631573 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="c673bba7-aec6-4d60-a236-e168b4570805" containerName="pull"
Dec 09 10:17:19 crc kubenswrapper[5002]: I1209 10:17:19.631704 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="c673bba7-aec6-4d60-a236-e168b4570805" containerName="extract"
Dec 09 10:17:19 crc kubenswrapper[5002]: I1209 10:17:19.632279 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-5d69f78f7c-pmv68"
Dec 09 10:17:19 crc kubenswrapper[5002]: I1209 10:17:19.634312 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-x5tvc"
Dec 09 10:17:19 crc kubenswrapper[5002]: I1209 10:17:19.654850 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5d69f78f7c-pmv68"]
Dec 09 10:17:19 crc kubenswrapper[5002]: I1209 10:17:19.755432 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6tvr\" (UniqueName: \"kubernetes.io/projected/71024129-a040-442d-9e93-f3d0125153ac-kube-api-access-d6tvr\") pod \"openstack-operator-controller-operator-5d69f78f7c-pmv68\" (UID: \"71024129-a040-442d-9e93-f3d0125153ac\") " pod="openstack-operators/openstack-operator-controller-operator-5d69f78f7c-pmv68"
Dec 09 10:17:19 crc kubenswrapper[5002]: I1209 10:17:19.857089 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6tvr\" (UniqueName: \"kubernetes.io/projected/71024129-a040-442d-9e93-f3d0125153ac-kube-api-access-d6tvr\") pod \"openstack-operator-controller-operator-5d69f78f7c-pmv68\" (UID: \"71024129-a040-442d-9e93-f3d0125153ac\") " pod="openstack-operators/openstack-operator-controller-operator-5d69f78f7c-pmv68"
Dec 09 10:17:19 crc kubenswrapper[5002]: I1209 10:17:19.875760 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6tvr\" (UniqueName: \"kubernetes.io/projected/71024129-a040-442d-9e93-f3d0125153ac-kube-api-access-d6tvr\") pod \"openstack-operator-controller-operator-5d69f78f7c-pmv68\" (UID: \"71024129-a040-442d-9e93-f3d0125153ac\") " pod="openstack-operators/openstack-operator-controller-operator-5d69f78f7c-pmv68"
Dec 09 10:17:19 crc kubenswrapper[5002]: I1209 10:17:19.951866 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-5d69f78f7c-pmv68"
Dec 09 10:17:20 crc kubenswrapper[5002]: I1209 10:17:20.425829 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5d69f78f7c-pmv68"]
Dec 09 10:17:21 crc kubenswrapper[5002]: I1209 10:17:21.068170 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5d69f78f7c-pmv68" event={"ID":"71024129-a040-442d-9e93-f3d0125153ac","Type":"ContainerStarted","Data":"2ca758bc5dafa9de9a2902ec13ccc2e32789ea21220b83c8c5edacbe840796c5"}
Dec 09 10:17:25 crc kubenswrapper[5002]: I1209 10:17:25.263305 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5d69f78f7c-pmv68" event={"ID":"71024129-a040-442d-9e93-f3d0125153ac","Type":"ContainerStarted","Data":"38f748f6e5220048415aca231bbd3cca2549894db7419e28d8d463351ec971d3"}
Dec 09 10:17:25 crc kubenswrapper[5002]: I1209 10:17:25.263905 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-5d69f78f7c-pmv68"
Dec 09 10:17:25 crc kubenswrapper[5002]: I1209 10:17:25.315918 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-5d69f78f7c-pmv68" podStartSLOduration=1.762447616 podStartE2EDuration="6.315902826s" podCreationTimestamp="2025-12-09 10:17:19 +0000 UTC" firstStartedPulling="2025-12-09 10:17:20.440853749 +0000 UTC m=+972.832904840" lastFinishedPulling="2025-12-09 10:17:24.994308969 +0000 UTC m=+977.386360050" observedRunningTime="2025-12-09 10:17:25.315317101 +0000 UTC m=+977.707368242" watchObservedRunningTime="2025-12-09 10:17:25.315902826 +0000 UTC m=+977.707953907"
Dec 09 10:17:37 crc kubenswrapper[5002]: I1209 10:17:37.964856 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 09 10:17:37 crc kubenswrapper[5002]: I1209 10:17:37.965678 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 09 10:17:39 crc kubenswrapper[5002]: I1209 10:17:39.955416 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-5d69f78f7c-pmv68"
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.656070 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6c677c69b-4dzfs"]
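
The machine-config-daemon liveness failure a few records above is an ordinary HTTP GET being refused on 127.0.0.1:8798, so the same check the prober ran can be replayed by hand on the node itself (the URL comes from the log; the 1-second timeout is an illustrative assumption; stdlib only):

    from urllib.error import URLError
    from urllib.request import urlopen

    try:
        with urlopen("http://127.0.0.1:8798/health", timeout=1) as resp:
            print("HTTP", resp.status)          # a 2xx/3xx status counts as probe success
    except URLError as err:
        print("probe would fail:", err.reason)  # here: connection refused
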
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.657980 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-4dzfs"
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.661294 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-l9kxz"
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.667575 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-kj2xr"]
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.668546 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-kj2xr"
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.674142 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-wghph"
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.692235 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-kj2xr"]
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.701307 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-697fb699cf-9th65"]
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.702285 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-9th65"
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.704677 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-b9jcp"
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.719874 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6c677c69b-4dzfs"]
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.727073 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-697fb699cf-9th65"]
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.751543 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5697bb5779-vcspp"]
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.753308 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-vcspp"
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.763524 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-9xpw5"
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.781320 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5pq7d"]
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.782933 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5pq7d"
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.784518 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5697bb5779-vcspp"]
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.785190 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-4zc6v"
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.793695 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmn9h\" (UniqueName: \"kubernetes.io/projected/78d17849-0dbc-4b43-b5e6-49bd7c766aa1-kube-api-access-fmn9h\") pod \"cinder-operator-controller-manager-6c677c69b-4dzfs\" (UID: \"78d17849-0dbc-4b43-b5e6-49bd7c766aa1\") " pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-4dzfs"
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.793759 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d59jn\" (UniqueName: \"kubernetes.io/projected/7fce722c-f3f5-48b6-a567-e867e4e1f27b-kube-api-access-d59jn\") pod \"barbican-operator-controller-manager-7d9dfd778-kj2xr\" (UID: \"7fce722c-f3f5-48b6-a567-e867e4e1f27b\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-kj2xr"
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.816922 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-b8mpv"]
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.818457 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-b8mpv"
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.826181 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-z5zgn"
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.829535 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5pq7d"]
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.851100 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-b8mpv"]
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.876837 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-78d48bff9d-qqkqr"]
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.877873 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-qqkqr"
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.887731 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.887978 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-zwx57"
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.894752 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmn9h\" (UniqueName: \"kubernetes.io/projected/78d17849-0dbc-4b43-b5e6-49bd7c766aa1-kube-api-access-fmn9h\") pod \"cinder-operator-controller-manager-6c677c69b-4dzfs\" (UID: \"78d17849-0dbc-4b43-b5e6-49bd7c766aa1\") " pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-4dzfs"
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.894859 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cn8wm\" (UniqueName: \"kubernetes.io/projected/ce6b14b8-1656-4ac0-b6e2-24fd9329faf4-kube-api-access-cn8wm\") pod \"glance-operator-controller-manager-5697bb5779-vcspp\" (UID: \"ce6b14b8-1656-4ac0-b6e2-24fd9329faf4\") " pod="openstack-operators/glance-operator-controller-manager-5697bb5779-vcspp"
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.894892 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d59jn\" (UniqueName: \"kubernetes.io/projected/7fce722c-f3f5-48b6-a567-e867e4e1f27b-kube-api-access-d59jn\") pod \"barbican-operator-controller-manager-7d9dfd778-kj2xr\" (UID: \"7fce722c-f3f5-48b6-a567-e867e4e1f27b\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-kj2xr"
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.894948 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk6r2\" (UniqueName: \"kubernetes.io/projected/f0f92bf3-053d-4ea9-bd30-4e6724c414c8-kube-api-access-pk6r2\") pod \"heat-operator-controller-manager-5f64f6f8bb-5pq7d\" (UID: \"f0f92bf3-053d-4ea9-bd30-4e6724c414c8\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5pq7d"
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.895163 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxvfn\" (UniqueName: \"kubernetes.io/projected/80762edf-2f88-41a7-b96b-2856b2e2c00e-kube-api-access-pxvfn\") pod \"designate-operator-controller-manager-697fb699cf-9th65\" (UID: \"80762edf-2f88-41a7-b96b-2856b2e2c00e\") " pod="openstack-operators/designate-operator-controller-manager-697fb699cf-9th65"
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.902286 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-78d48bff9d-qqkqr"]
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.912167 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-967d97867-qvnbd"]
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.913517 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-967d97867-qvnbd"
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.918289 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-nd7sq"
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.925543 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d59jn\" (UniqueName: \"kubernetes.io/projected/7fce722c-f3f5-48b6-a567-e867e4e1f27b-kube-api-access-d59jn\") pod \"barbican-operator-controller-manager-7d9dfd778-kj2xr\" (UID: \"7fce722c-f3f5-48b6-a567-e867e4e1f27b\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-kj2xr"
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.942414 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmn9h\" (UniqueName: \"kubernetes.io/projected/78d17849-0dbc-4b43-b5e6-49bd7c766aa1-kube-api-access-fmn9h\") pod \"cinder-operator-controller-manager-6c677c69b-4dzfs\" (UID: \"78d17849-0dbc-4b43-b5e6-49bd7c766aa1\") " pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-4dzfs"
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.942972 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-495q6"]
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.944495 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-495q6"
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.949324 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-57fvt"
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.952175 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-967d97867-qvnbd"]
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.963663 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-5b5fd79c9c-8jgwr"]
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.964847 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-8jgwr"
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.968848 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-lmtgc"
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.980845 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-495q6"]
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.982108 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-4dzfs"
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.988915 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-79c8c4686c-25fmz"]
Dec 09 10:18:04 crc kubenswrapper[5002]: I1209 10:18:04.990105 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-25fmz"
Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.001223 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-pkchl"
Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.001624 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-kj2xr"
Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.001761 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sl4m\" (UniqueName: \"kubernetes.io/projected/8519dc96-e20f-47d9-9b48-e64a07393d39-kube-api-access-5sl4m\") pod \"horizon-operator-controller-manager-68c6d99b8f-b8mpv\" (UID: \"8519dc96-e20f-47d9-9b48-e64a07393d39\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-b8mpv"
Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.001869 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cn8wm\" (UniqueName: \"kubernetes.io/projected/ce6b14b8-1656-4ac0-b6e2-24fd9329faf4-kube-api-access-cn8wm\") pod \"glance-operator-controller-manager-5697bb5779-vcspp\" (UID: \"ce6b14b8-1656-4ac0-b6e2-24fd9329faf4\") " pod="openstack-operators/glance-operator-controller-manager-5697bb5779-vcspp"
Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.001939 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kpxd\" (UniqueName: \"kubernetes.io/projected/531ff665-3733-405e-b178-0f185d8cb22e-kube-api-access-9kpxd\") pod \"infra-operator-controller-manager-78d48bff9d-qqkqr\" (UID: \"531ff665-3733-405e-b178-0f185d8cb22e\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-qqkqr"
Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.002006 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pk6r2\" (UniqueName: \"kubernetes.io/projected/f0f92bf3-053d-4ea9-bd30-4e6724c414c8-kube-api-access-pk6r2\") pod \"heat-operator-controller-manager-5f64f6f8bb-5pq7d\" (UID: \"f0f92bf3-053d-4ea9-bd30-4e6724c414c8\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5pq7d"
Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.002041 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxvfn\" (UniqueName: \"kubernetes.io/projected/80762edf-2f88-41a7-b96b-2856b2e2c00e-kube-api-access-pxvfn\") pod \"designate-operator-controller-manager-697fb699cf-9th65\" (UID: \"80762edf-2f88-41a7-b96b-2856b2e2c00e\") " pod="openstack-operators/designate-operator-controller-manager-697fb699cf-9th65"
Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.002120 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/531ff665-3733-405e-b178-0f185d8cb22e-cert\") pod \"infra-operator-controller-manager-78d48bff9d-qqkqr\" (UID: \"531ff665-3733-405e-b178-0f185d8cb22e\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-qqkqr"
Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.027128 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5b5fd79c9c-8jgwr"]
Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.037494 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxvfn\" (UniqueName: \"kubernetes.io/projected/80762edf-2f88-41a7-b96b-2856b2e2c00e-kube-api-access-pxvfn\") pod \"designate-operator-controller-manager-697fb699cf-9th65\" (UID: \"80762edf-2f88-41a7-b96b-2856b2e2c00e\") " pod="openstack-operators/designate-operator-controller-manager-697fb699cf-9th65"
Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.046882 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-79c8c4686c-25fmz"]
Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.055066 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cn8wm\" (UniqueName: \"kubernetes.io/projected/ce6b14b8-1656-4ac0-b6e2-24fd9329faf4-kube-api-access-cn8wm\") pod \"glance-operator-controller-manager-5697bb5779-vcspp\" (UID: \"ce6b14b8-1656-4ac0-b6e2-24fd9329faf4\") " pod="openstack-operators/glance-operator-controller-manager-5697bb5779-vcspp"
Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.055746 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk6r2\" (UniqueName: \"kubernetes.io/projected/f0f92bf3-053d-4ea9-bd30-4e6724c414c8-kube-api-access-pk6r2\") pod \"heat-operator-controller-manager-5f64f6f8bb-5pq7d\" (UID: \"f0f92bf3-053d-4ea9-bd30-4e6724c414c8\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5pq7d"
Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.074170 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-q294d"]
Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.075313 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-q294d"
Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.079829 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-46j8r"
Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.088358 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-jb68h"]
Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.089720 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-jb68h"
Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.091374 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-czm5q"
Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.093365 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-vcspp"
Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.103042 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/531ff665-3733-405e-b178-0f185d8cb22e-cert\") pod \"infra-operator-controller-manager-78d48bff9d-qqkqr\" (UID: \"531ff665-3733-405e-b178-0f185d8cb22e\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-qqkqr"
Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.103091 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sl4m\" (UniqueName: \"kubernetes.io/projected/8519dc96-e20f-47d9-9b48-e64a07393d39-kube-api-access-5sl4m\") pod \"horizon-operator-controller-manager-68c6d99b8f-b8mpv\" (UID: \"8519dc96-e20f-47d9-9b48-e64a07393d39\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-b8mpv"
Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.103129 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tf5xh\" (UniqueName: \"kubernetes.io/projected/2eeed466-946c-49a5-9fe9-b393629c3394-kube-api-access-tf5xh\") pod \"mariadb-operator-controller-manager-79c8c4686c-25fmz\" (UID: \"2eeed466-946c-49a5-9fe9-b393629c3394\") " pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-25fmz"
Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.103154 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm2lh\" (UniqueName: \"kubernetes.io/projected/2e565997-148f-4d70-ae78-89f194f791f8-kube-api-access-lm2lh\") pod \"manila-operator-controller-manager-5b5fd79c9c-8jgwr\" (UID: \"2e565997-148f-4d70-ae78-89f194f791f8\") " pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-8jgwr"
Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.103199 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kpxd\" (UniqueName: \"kubernetes.io/projected/531ff665-3733-405e-b178-0f185d8cb22e-kube-api-access-9kpxd\") pod \"infra-operator-controller-manager-78d48bff9d-qqkqr\" (UID: \"531ff665-3733-405e-b178-0f185d8cb22e\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-qqkqr"
Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.103217 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qvsn\" (UniqueName: \"kubernetes.io/projected/d3588b02-67b1-4172-9ab8-9da4cb7b09dc-kube-api-access-5qvsn\") pod \"ironic-operator-controller-manager-967d97867-qvnbd\" (UID: \"d3588b02-67b1-4172-9ab8-9da4cb7b09dc\") " pod="openstack-operators/ironic-operator-controller-manager-967d97867-qvnbd"
Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.103251 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwnfd\" (UniqueName: \"kubernetes.io/projected/a3c0ed3b-56a5-490b-bca4-9541eeb8e2cf-kube-api-access-kwnfd\") pod \"keystone-operator-controller-manager-7765d96ddf-495q6\" (UID: \"a3c0ed3b-56a5-490b-bca4-9541eeb8e2cf\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-495q6"
Dec 09 10:18:05 crc kubenswrapper[5002]: E1209 10:18:05.103286 5002 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Dec 09 10:18:05 crc kubenswrapper[5002]: E1209 10:18:05.103396 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/531ff665-3733-405e-b178-0f185d8cb22e-cert podName:531ff665-3733-405e-b178-0f185d8cb22e nodeName:}" failed. No retries permitted until 2025-12-09 10:18:05.603364627 +0000 UTC m=+1017.995415698 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/531ff665-3733-405e-b178-0f185d8cb22e-cert") pod "infra-operator-controller-manager-78d48bff9d-qqkqr" (UID: "531ff665-3733-405e-b178-0f185d8cb22e") : secret "infra-operator-webhook-server-cert" not found
"infra-operator-webhook-server-cert" not found Dec 09 10:18:05 crc kubenswrapper[5002]: E1209 10:18:05.103396 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/531ff665-3733-405e-b178-0f185d8cb22e-cert podName:531ff665-3733-405e-b178-0f185d8cb22e nodeName:}" failed. No retries permitted until 2025-12-09 10:18:05.603364627 +0000 UTC m=+1017.995415698 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/531ff665-3733-405e-b178-0f185d8cb22e-cert") pod "infra-operator-controller-manager-78d48bff9d-qqkqr" (UID: "531ff665-3733-405e-b178-0f185d8cb22e") : secret "infra-operator-webhook-server-cert" not found Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.104193 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5pq7d" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.116586 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-7kqxz"] Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.117855 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7kqxz" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.119909 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-p6csj" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.123940 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-jb68h"] Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.130951 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sl4m\" (UniqueName: \"kubernetes.io/projected/8519dc96-e20f-47d9-9b48-e64a07393d39-kube-api-access-5sl4m\") pod \"horizon-operator-controller-manager-68c6d99b8f-b8mpv\" (UID: \"8519dc96-e20f-47d9-9b48-e64a07393d39\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-b8mpv" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.141575 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kpxd\" (UniqueName: \"kubernetes.io/projected/531ff665-3733-405e-b178-0f185d8cb22e-kube-api-access-9kpxd\") pod \"infra-operator-controller-manager-78d48bff9d-qqkqr\" (UID: \"531ff665-3733-405e-b178-0f185d8cb22e\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-qqkqr" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.141652 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fm85h8"] Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.142598 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fm85h8" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.146115 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-b8mpv" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.149373 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.149641 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-h4r6z" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.163538 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-q294d"] Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.188223 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-7kqxz"] Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.195860 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fm85h8"] Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.204097 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5rlr\" (UniqueName: \"kubernetes.io/projected/5f6d862d-d775-4712-b455-ad5110968d19-kube-api-access-f5rlr\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-q294d\" (UID: \"5f6d862d-d775-4712-b455-ad5110968d19\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-q294d" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.204131 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tf5xh\" (UniqueName: \"kubernetes.io/projected/2eeed466-946c-49a5-9fe9-b393629c3394-kube-api-access-tf5xh\") pod \"mariadb-operator-controller-manager-79c8c4686c-25fmz\" (UID: \"2eeed466-946c-49a5-9fe9-b393629c3394\") " pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-25fmz" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.204159 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lm2lh\" (UniqueName: \"kubernetes.io/projected/2e565997-148f-4d70-ae78-89f194f791f8-kube-api-access-lm2lh\") pod \"manila-operator-controller-manager-5b5fd79c9c-8jgwr\" (UID: \"2e565997-148f-4d70-ae78-89f194f791f8\") " pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-8jgwr" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.204208 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsdqw\" (UniqueName: \"kubernetes.io/projected/097e947b-8923-4155-9d10-f0241f751ad8-kube-api-access-jsdqw\") pod \"octavia-operator-controller-manager-998648c74-jb68h\" (UID: \"097e947b-8923-4155-9d10-f0241f751ad8\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-jb68h" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.204233 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qvsn\" (UniqueName: \"kubernetes.io/projected/d3588b02-67b1-4172-9ab8-9da4cb7b09dc-kube-api-access-5qvsn\") pod \"ironic-operator-controller-manager-967d97867-qvnbd\" (UID: \"d3588b02-67b1-4172-9ab8-9da4cb7b09dc\") " pod="openstack-operators/ironic-operator-controller-manager-967d97867-qvnbd" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.204283 5002 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwnfd\" (UniqueName: \"kubernetes.io/projected/a3c0ed3b-56a5-490b-bca4-9541eeb8e2cf-kube-api-access-kwnfd\") pod \"keystone-operator-controller-manager-7765d96ddf-495q6\" (UID: \"a3c0ed3b-56a5-490b-bca4-9541eeb8e2cf\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-495q6" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.223937 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-rdjzk"] Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.225307 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-rdjzk" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.231337 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-hblhh" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.250901 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwnfd\" (UniqueName: \"kubernetes.io/projected/a3c0ed3b-56a5-490b-bca4-9541eeb8e2cf-kube-api-access-kwnfd\") pod \"keystone-operator-controller-manager-7765d96ddf-495q6\" (UID: \"a3c0ed3b-56a5-490b-bca4-9541eeb8e2cf\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-495q6" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.258789 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-hnnmt"] Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.264710 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm2lh\" (UniqueName: \"kubernetes.io/projected/2e565997-148f-4d70-ae78-89f194f791f8-kube-api-access-lm2lh\") pod \"manila-operator-controller-manager-5b5fd79c9c-8jgwr\" (UID: \"2e565997-148f-4d70-ae78-89f194f791f8\") " pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-8jgwr" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.267667 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qvsn\" (UniqueName: \"kubernetes.io/projected/d3588b02-67b1-4172-9ab8-9da4cb7b09dc-kube-api-access-5qvsn\") pod \"ironic-operator-controller-manager-967d97867-qvnbd\" (UID: \"d3588b02-67b1-4172-9ab8-9da4cb7b09dc\") " pod="openstack-operators/ironic-operator-controller-manager-967d97867-qvnbd" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.281260 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tf5xh\" (UniqueName: \"kubernetes.io/projected/2eeed466-946c-49a5-9fe9-b393629c3394-kube-api-access-tf5xh\") pod \"mariadb-operator-controller-manager-79c8c4686c-25fmz\" (UID: \"2eeed466-946c-49a5-9fe9-b393629c3394\") " pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-25fmz" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.290540 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-967d97867-qvnbd" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.325982 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsdqw\" (UniqueName: \"kubernetes.io/projected/097e947b-8923-4155-9d10-f0241f751ad8-kube-api-access-jsdqw\") pod \"octavia-operator-controller-manager-998648c74-jb68h\" (UID: \"097e947b-8923-4155-9d10-f0241f751ad8\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-jb68h" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.334497 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3f8701c7-f70e-4584-8cf1-ed40adda7b84-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fm85h8\" (UID: \"3f8701c7-f70e-4584-8cf1-ed40adda7b84\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fm85h8" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.334616 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw28v\" (UniqueName: \"kubernetes.io/projected/51978e4b-eb47-4e87-8528-38631838226a-kube-api-access-rw28v\") pod \"nova-operator-controller-manager-697bc559fc-7kqxz\" (UID: \"51978e4b-eb47-4e87-8528-38631838226a\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7kqxz" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.334890 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dx57\" (UniqueName: \"kubernetes.io/projected/3f8701c7-f70e-4584-8cf1-ed40adda7b84-kube-api-access-9dx57\") pod \"openstack-baremetal-operator-controller-manager-84b575879fm85h8\" (UID: \"3f8701c7-f70e-4584-8cf1-ed40adda7b84\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fm85h8" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.335055 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5rlr\" (UniqueName: \"kubernetes.io/projected/5f6d862d-d775-4712-b455-ad5110968d19-kube-api-access-f5rlr\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-q294d\" (UID: \"5f6d862d-d775-4712-b455-ad5110968d19\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-q294d" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.339022 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-9th65" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.398917 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-495q6" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.410088 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-8jgwr" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.424001 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-9d58d64bc-vvm2b"] Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.424985 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9d58d64bc-vvm2b"] Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.425014 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-hnnmt"] Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.425083 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-vvm2b" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.425485 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-hnnmt" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.427557 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-rwgvb" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.436611 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5rlr\" (UniqueName: \"kubernetes.io/projected/5f6d862d-d775-4712-b455-ad5110968d19-kube-api-access-f5rlr\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-q294d\" (UID: \"5f6d862d-d775-4712-b455-ad5110968d19\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-q294d" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.437510 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3f8701c7-f70e-4584-8cf1-ed40adda7b84-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fm85h8\" (UID: \"3f8701c7-f70e-4584-8cf1-ed40adda7b84\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fm85h8" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.437549 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd26k\" (UniqueName: \"kubernetes.io/projected/1c8db604-51f8-4fe5-b659-fc14b7c706f1-kube-api-access-gd26k\") pod \"swift-operator-controller-manager-9d58d64bc-vvm2b\" (UID: \"1c8db604-51f8-4fe5-b659-fc14b7c706f1\") " pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-vvm2b" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.437582 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rw28v\" (UniqueName: \"kubernetes.io/projected/51978e4b-eb47-4e87-8528-38631838226a-kube-api-access-rw28v\") pod \"nova-operator-controller-manager-697bc559fc-7kqxz\" (UID: \"51978e4b-eb47-4e87-8528-38631838226a\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7kqxz" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.437617 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzzj2\" (UniqueName: \"kubernetes.io/projected/d172ce03-c24c-41ca-a61a-eecdd9572f5d-kube-api-access-bzzj2\") pod \"placement-operator-controller-manager-78f8948974-hnnmt\" (UID: 
\"d172ce03-c24c-41ca-a61a-eecdd9572f5d\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-hnnmt" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.437649 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jphq9\" (UniqueName: \"kubernetes.io/projected/276be0c6-fc03-4693-91c7-5a999e6f0d89-kube-api-access-jphq9\") pod \"ovn-operator-controller-manager-b6456fdb6-rdjzk\" (UID: \"276be0c6-fc03-4693-91c7-5a999e6f0d89\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-rdjzk" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.437670 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dx57\" (UniqueName: \"kubernetes.io/projected/3f8701c7-f70e-4584-8cf1-ed40adda7b84-kube-api-access-9dx57\") pod \"openstack-baremetal-operator-controller-manager-84b575879fm85h8\" (UID: \"3f8701c7-f70e-4584-8cf1-ed40adda7b84\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fm85h8" Dec 09 10:18:05 crc kubenswrapper[5002]: E1209 10:18:05.438130 5002 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 09 10:18:05 crc kubenswrapper[5002]: E1209 10:18:05.438169 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f8701c7-f70e-4584-8cf1-ed40adda7b84-cert podName:3f8701c7-f70e-4584-8cf1-ed40adda7b84 nodeName:}" failed. No retries permitted until 2025-12-09 10:18:05.938158562 +0000 UTC m=+1018.330209643 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3f8701c7-f70e-4584-8cf1-ed40adda7b84-cert") pod "openstack-baremetal-operator-controller-manager-84b575879fm85h8" (UID: "3f8701c7-f70e-4584-8cf1-ed40adda7b84") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.438225 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-rdjzk"] Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.438781 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsdqw\" (UniqueName: \"kubernetes.io/projected/097e947b-8923-4155-9d10-f0241f751ad8-kube-api-access-jsdqw\") pod \"octavia-operator-controller-manager-998648c74-jb68h\" (UID: \"097e947b-8923-4155-9d10-f0241f751ad8\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-jb68h" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.439786 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-25fmz" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.442573 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-hbvmw" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.474100 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-q294d" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.481660 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-jb68h" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.503224 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-58d5ff84df-4vg8l"] Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.508266 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dx57\" (UniqueName: \"kubernetes.io/projected/3f8701c7-f70e-4584-8cf1-ed40adda7b84-kube-api-access-9dx57\") pod \"openstack-baremetal-operator-controller-manager-84b575879fm85h8\" (UID: \"3f8701c7-f70e-4584-8cf1-ed40adda7b84\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fm85h8" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.513165 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-4vg8l" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.514588 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw28v\" (UniqueName: \"kubernetes.io/projected/51978e4b-eb47-4e87-8528-38631838226a-kube-api-access-rw28v\") pod \"nova-operator-controller-manager-697bc559fc-7kqxz\" (UID: \"51978e4b-eb47-4e87-8528-38631838226a\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7kqxz" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.532825 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-xmx75" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.544428 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzzj2\" (UniqueName: \"kubernetes.io/projected/d172ce03-c24c-41ca-a61a-eecdd9572f5d-kube-api-access-bzzj2\") pod \"placement-operator-controller-manager-78f8948974-hnnmt\" (UID: \"d172ce03-c24c-41ca-a61a-eecdd9572f5d\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-hnnmt" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.544489 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jphq9\" (UniqueName: \"kubernetes.io/projected/276be0c6-fc03-4693-91c7-5a999e6f0d89-kube-api-access-jphq9\") pod \"ovn-operator-controller-manager-b6456fdb6-rdjzk\" (UID: \"276be0c6-fc03-4693-91c7-5a999e6f0d89\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-rdjzk" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.544537 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hstwc\" (UniqueName: \"kubernetes.io/projected/c66676bd-3461-405a-bce7-60c91858b55e-kube-api-access-hstwc\") pod \"telemetry-operator-controller-manager-58d5ff84df-4vg8l\" (UID: \"c66676bd-3461-405a-bce7-60c91858b55e\") " pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-4vg8l" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.544588 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd26k\" (UniqueName: \"kubernetes.io/projected/1c8db604-51f8-4fe5-b659-fc14b7c706f1-kube-api-access-gd26k\") pod \"swift-operator-controller-manager-9d58d64bc-vvm2b\" (UID: \"1c8db604-51f8-4fe5-b659-fc14b7c706f1\") " pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-vvm2b" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 
10:18:05.555512 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-58d5ff84df-4vg8l"] Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.595844 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd26k\" (UniqueName: \"kubernetes.io/projected/1c8db604-51f8-4fe5-b659-fc14b7c706f1-kube-api-access-gd26k\") pod \"swift-operator-controller-manager-9d58d64bc-vvm2b\" (UID: \"1c8db604-51f8-4fe5-b659-fc14b7c706f1\") " pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-vvm2b" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.596698 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzzj2\" (UniqueName: \"kubernetes.io/projected/d172ce03-c24c-41ca-a61a-eecdd9572f5d-kube-api-access-bzzj2\") pod \"placement-operator-controller-manager-78f8948974-hnnmt\" (UID: \"d172ce03-c24c-41ca-a61a-eecdd9572f5d\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-hnnmt" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.603142 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-8n5z8"] Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.604606 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jphq9\" (UniqueName: \"kubernetes.io/projected/276be0c6-fc03-4693-91c7-5a999e6f0d89-kube-api-access-jphq9\") pod \"ovn-operator-controller-manager-b6456fdb6-rdjzk\" (UID: \"276be0c6-fc03-4693-91c7-5a999e6f0d89\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-rdjzk" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.607944 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-8n5z8" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.622626 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-8n5z8"] Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.625899 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-kmk2n" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.626605 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-rdjzk" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.636111 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-75944c9b7-vkkw9"] Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.639756 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-vkkw9" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.643665 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-75944c9b7-vkkw9"] Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.649210 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-jh5g7" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.650108 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sq92g\" (UniqueName: \"kubernetes.io/projected/048abed6-46b3-4dba-bf28-8c0ee1685817-kube-api-access-sq92g\") pod \"test-operator-controller-manager-5854674fcc-8n5z8\" (UID: \"048abed6-46b3-4dba-bf28-8c0ee1685817\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-8n5z8" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.650137 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/531ff665-3733-405e-b178-0f185d8cb22e-cert\") pod \"infra-operator-controller-manager-78d48bff9d-qqkqr\" (UID: \"531ff665-3733-405e-b178-0f185d8cb22e\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-qqkqr" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.650167 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hstwc\" (UniqueName: \"kubernetes.io/projected/c66676bd-3461-405a-bce7-60c91858b55e-kube-api-access-hstwc\") pod \"telemetry-operator-controller-manager-58d5ff84df-4vg8l\" (UID: \"c66676bd-3461-405a-bce7-60c91858b55e\") " pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-4vg8l" Dec 09 10:18:05 crc kubenswrapper[5002]: E1209 10:18:05.650933 5002 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 09 10:18:05 crc kubenswrapper[5002]: E1209 10:18:05.650968 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/531ff665-3733-405e-b178-0f185d8cb22e-cert podName:531ff665-3733-405e-b178-0f185d8cb22e nodeName:}" failed. No retries permitted until 2025-12-09 10:18:06.650955198 +0000 UTC m=+1019.043006279 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/531ff665-3733-405e-b178-0f185d8cb22e-cert") pod "infra-operator-controller-manager-78d48bff9d-qqkqr" (UID: "531ff665-3733-405e-b178-0f185d8cb22e") : secret "infra-operator-webhook-server-cert" not found Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.674765 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7f8d8fb65b-6gj42"] Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.676188 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7f8d8fb65b-6gj42" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.679493 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.679699 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-qj9nr" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.679936 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.681507 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7f8d8fb65b-6gj42"] Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.686281 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hstwc\" (UniqueName: \"kubernetes.io/projected/c66676bd-3461-405a-bce7-60c91858b55e-kube-api-access-hstwc\") pod \"telemetry-operator-controller-manager-58d5ff84df-4vg8l\" (UID: \"c66676bd-3461-405a-bce7-60c91858b55e\") " pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-4vg8l" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.717446 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-x578z"] Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.718974 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-x578z" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.723917 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-x578z"] Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.731512 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-jkjs2" Dec 09 10:18:05 crc kubenswrapper[5002]: W1209 10:18:05.742437 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78d17849_0dbc_4b43_b5e6_49bd7c766aa1.slice/crio-e175159a3029e0e96de6e21ffc7b200dc1c5a2a0ae8ef7d0d40f1504cf0f8bf9 WatchSource:0}: Error finding container e175159a3029e0e96de6e21ffc7b200dc1c5a2a0ae8ef7d0d40f1504cf0f8bf9: Status 404 returned error can't find the container with id e175159a3029e0e96de6e21ffc7b200dc1c5a2a0ae8ef7d0d40f1504cf0f8bf9 Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.751866 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfrkm\" (UniqueName: \"kubernetes.io/projected/2c2f2646-d609-433d-894e-5577cc497adc-kube-api-access-vfrkm\") pod \"openstack-operator-controller-manager-7f8d8fb65b-6gj42\" (UID: \"2c2f2646-d609-433d-894e-5577cc497adc\") " pod="openstack-operators/openstack-operator-controller-manager-7f8d8fb65b-6gj42" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.752713 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sq92g\" (UniqueName: \"kubernetes.io/projected/048abed6-46b3-4dba-bf28-8c0ee1685817-kube-api-access-sq92g\") pod \"test-operator-controller-manager-5854674fcc-8n5z8\" (UID: \"048abed6-46b3-4dba-bf28-8c0ee1685817\") " 
pod="openstack-operators/test-operator-controller-manager-5854674fcc-8n5z8" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.752763 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhx64\" (UniqueName: \"kubernetes.io/projected/cecd7de4-71b0-4914-a5c5-68b8da7704cf-kube-api-access-hhx64\") pod \"rabbitmq-cluster-operator-manager-668c99d594-x578z\" (UID: \"cecd7de4-71b0-4914-a5c5-68b8da7704cf\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-x578z" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.752785 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2c2f2646-d609-433d-894e-5577cc497adc-webhook-certs\") pod \"openstack-operator-controller-manager-7f8d8fb65b-6gj42\" (UID: \"2c2f2646-d609-433d-894e-5577cc497adc\") " pod="openstack-operators/openstack-operator-controller-manager-7f8d8fb65b-6gj42" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.752827 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z785k\" (UniqueName: \"kubernetes.io/projected/53e6de02-2fb5-4491-aa93-7ad8bd2a190c-kube-api-access-z785k\") pod \"watcher-operator-controller-manager-75944c9b7-vkkw9\" (UID: \"53e6de02-2fb5-4491-aa93-7ad8bd2a190c\") " pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-vkkw9" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.752888 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c2f2646-d609-433d-894e-5577cc497adc-metrics-certs\") pod \"openstack-operator-controller-manager-7f8d8fb65b-6gj42\" (UID: \"2c2f2646-d609-433d-894e-5577cc497adc\") " pod="openstack-operators/openstack-operator-controller-manager-7f8d8fb65b-6gj42" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.771512 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sq92g\" (UniqueName: \"kubernetes.io/projected/048abed6-46b3-4dba-bf28-8c0ee1685817-kube-api-access-sq92g\") pod \"test-operator-controller-manager-5854674fcc-8n5z8\" (UID: \"048abed6-46b3-4dba-bf28-8c0ee1685817\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-8n5z8" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.776322 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6c677c69b-4dzfs"] Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.794293 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7kqxz" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.848396 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-vvm2b" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.855615 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c2f2646-d609-433d-894e-5577cc497adc-metrics-certs\") pod \"openstack-operator-controller-manager-7f8d8fb65b-6gj42\" (UID: \"2c2f2646-d609-433d-894e-5577cc497adc\") " pod="openstack-operators/openstack-operator-controller-manager-7f8d8fb65b-6gj42" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.855918 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfrkm\" (UniqueName: \"kubernetes.io/projected/2c2f2646-d609-433d-894e-5577cc497adc-kube-api-access-vfrkm\") pod \"openstack-operator-controller-manager-7f8d8fb65b-6gj42\" (UID: \"2c2f2646-d609-433d-894e-5577cc497adc\") " pod="openstack-operators/openstack-operator-controller-manager-7f8d8fb65b-6gj42" Dec 09 10:18:05 crc kubenswrapper[5002]: E1209 10:18:05.855789 5002 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 09 10:18:05 crc kubenswrapper[5002]: E1209 10:18:05.856155 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c2f2646-d609-433d-894e-5577cc497adc-metrics-certs podName:2c2f2646-d609-433d-894e-5577cc497adc nodeName:}" failed. No retries permitted until 2025-12-09 10:18:06.356124482 +0000 UTC m=+1018.748175563 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2c2f2646-d609-433d-894e-5577cc497adc-metrics-certs") pod "openstack-operator-controller-manager-7f8d8fb65b-6gj42" (UID: "2c2f2646-d609-433d-894e-5577cc497adc") : secret "metrics-server-cert" not found Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.856267 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhx64\" (UniqueName: \"kubernetes.io/projected/cecd7de4-71b0-4914-a5c5-68b8da7704cf-kube-api-access-hhx64\") pod \"rabbitmq-cluster-operator-manager-668c99d594-x578z\" (UID: \"cecd7de4-71b0-4914-a5c5-68b8da7704cf\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-x578z" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.856407 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2c2f2646-d609-433d-894e-5577cc497adc-webhook-certs\") pod \"openstack-operator-controller-manager-7f8d8fb65b-6gj42\" (UID: \"2c2f2646-d609-433d-894e-5577cc497adc\") " pod="openstack-operators/openstack-operator-controller-manager-7f8d8fb65b-6gj42" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.856502 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z785k\" (UniqueName: \"kubernetes.io/projected/53e6de02-2fb5-4491-aa93-7ad8bd2a190c-kube-api-access-z785k\") pod \"watcher-operator-controller-manager-75944c9b7-vkkw9\" (UID: \"53e6de02-2fb5-4491-aa93-7ad8bd2a190c\") " pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-vkkw9" Dec 09 10:18:05 crc kubenswrapper[5002]: E1209 10:18:05.856699 5002 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 09 10:18:05 crc kubenswrapper[5002]: E1209 10:18:05.856734 5002 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/2c2f2646-d609-433d-894e-5577cc497adc-webhook-certs podName:2c2f2646-d609-433d-894e-5577cc497adc nodeName:}" failed. No retries permitted until 2025-12-09 10:18:06.356725988 +0000 UTC m=+1018.748777069 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2c2f2646-d609-433d-894e-5577cc497adc-webhook-certs") pod "openstack-operator-controller-manager-7f8d8fb65b-6gj42" (UID: "2c2f2646-d609-433d-894e-5577cc497adc") : secret "webhook-server-cert" not found Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.861769 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-hnnmt" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.879079 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-kj2xr"] Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.884416 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z785k\" (UniqueName: \"kubernetes.io/projected/53e6de02-2fb5-4491-aa93-7ad8bd2a190c-kube-api-access-z785k\") pod \"watcher-operator-controller-manager-75944c9b7-vkkw9\" (UID: \"53e6de02-2fb5-4491-aa93-7ad8bd2a190c\") " pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-vkkw9" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.887654 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhx64\" (UniqueName: \"kubernetes.io/projected/cecd7de4-71b0-4914-a5c5-68b8da7704cf-kube-api-access-hhx64\") pod \"rabbitmq-cluster-operator-manager-668c99d594-x578z\" (UID: \"cecd7de4-71b0-4914-a5c5-68b8da7704cf\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-x578z" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.888150 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfrkm\" (UniqueName: \"kubernetes.io/projected/2c2f2646-d609-433d-894e-5577cc497adc-kube-api-access-vfrkm\") pod \"openstack-operator-controller-manager-7f8d8fb65b-6gj42\" (UID: \"2c2f2646-d609-433d-894e-5577cc497adc\") " pod="openstack-operators/openstack-operator-controller-manager-7f8d8fb65b-6gj42" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.888486 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-4vg8l" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.929246 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-8n5z8" Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.957958 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3f8701c7-f70e-4584-8cf1-ed40adda7b84-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fm85h8\" (UID: \"3f8701c7-f70e-4584-8cf1-ed40adda7b84\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fm85h8" Dec 09 10:18:05 crc kubenswrapper[5002]: E1209 10:18:05.958090 5002 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 09 10:18:05 crc kubenswrapper[5002]: E1209 10:18:05.958136 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f8701c7-f70e-4584-8cf1-ed40adda7b84-cert podName:3f8701c7-f70e-4584-8cf1-ed40adda7b84 nodeName:}" failed. No retries permitted until 2025-12-09 10:18:06.958121604 +0000 UTC m=+1019.350172675 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3f8701c7-f70e-4584-8cf1-ed40adda7b84-cert") pod "openstack-baremetal-operator-controller-manager-84b575879fm85h8" (UID: "3f8701c7-f70e-4584-8cf1-ed40adda7b84") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 09 10:18:05 crc kubenswrapper[5002]: I1209 10:18:05.977957 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-vkkw9" Dec 09 10:18:06 crc kubenswrapper[5002]: I1209 10:18:06.020110 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5pq7d"] Dec 09 10:18:06 crc kubenswrapper[5002]: I1209 10:18:06.068699 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-x578z" Dec 09 10:18:06 crc kubenswrapper[5002]: I1209 10:18:06.070373 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5697bb5779-vcspp"] Dec 09 10:18:06 crc kubenswrapper[5002]: I1209 10:18:06.120934 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-b8mpv"] Dec 09 10:18:06 crc kubenswrapper[5002]: I1209 10:18:06.298513 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5b5fd79c9c-8jgwr"] Dec 09 10:18:06 crc kubenswrapper[5002]: I1209 10:18:06.301914 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-495q6"] Dec 09 10:18:06 crc kubenswrapper[5002]: W1209 10:18:06.306500 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3c0ed3b_56a5_490b_bca4_9541eeb8e2cf.slice/crio-de49510bb017c2d87ce0aa6ab635114f0e8b7e2f06b5b0b3e3472a1d539dc0ae WatchSource:0}: Error finding container de49510bb017c2d87ce0aa6ab635114f0e8b7e2f06b5b0b3e3472a1d539dc0ae: Status 404 returned error can't find the container with id de49510bb017c2d87ce0aa6ab635114f0e8b7e2f06b5b0b3e3472a1d539dc0ae Dec 09 10:18:06 crc kubenswrapper[5002]: I1209 10:18:06.311647 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-967d97867-qvnbd"] Dec 09 10:18:06 crc kubenswrapper[5002]: W1209 10:18:06.317000 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3588b02_67b1_4172_9ab8_9da4cb7b09dc.slice/crio-416adf305bb3240ff8714fca446beb008cc65e2d03bf296912dc91cf51908616 WatchSource:0}: Error finding container 416adf305bb3240ff8714fca446beb008cc65e2d03bf296912dc91cf51908616: Status 404 returned error can't find the container with id 416adf305bb3240ff8714fca446beb008cc65e2d03bf296912dc91cf51908616 Dec 09 10:18:06 crc kubenswrapper[5002]: I1209 10:18:06.335114 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-697fb699cf-9th65"] Dec 09 10:18:06 crc kubenswrapper[5002]: I1209 10:18:06.395537 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2c2f2646-d609-433d-894e-5577cc497adc-webhook-certs\") pod \"openstack-operator-controller-manager-7f8d8fb65b-6gj42\" (UID: \"2c2f2646-d609-433d-894e-5577cc497adc\") " pod="openstack-operators/openstack-operator-controller-manager-7f8d8fb65b-6gj42" Dec 09 10:18:06 crc kubenswrapper[5002]: I1209 10:18:06.395616 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c2f2646-d609-433d-894e-5577cc497adc-metrics-certs\") pod \"openstack-operator-controller-manager-7f8d8fb65b-6gj42\" (UID: \"2c2f2646-d609-433d-894e-5577cc497adc\") " pod="openstack-operators/openstack-operator-controller-manager-7f8d8fb65b-6gj42" Dec 09 10:18:06 crc kubenswrapper[5002]: E1209 10:18:06.395728 5002 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 09 10:18:06 crc kubenswrapper[5002]: E1209 10:18:06.395750 5002 secret.go:188] Couldn't get secret 
openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 09 10:18:06 crc kubenswrapper[5002]: E1209 10:18:06.395778 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c2f2646-d609-433d-894e-5577cc497adc-metrics-certs podName:2c2f2646-d609-433d-894e-5577cc497adc nodeName:}" failed. No retries permitted until 2025-12-09 10:18:07.395764073 +0000 UTC m=+1019.787815154 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2c2f2646-d609-433d-894e-5577cc497adc-metrics-certs") pod "openstack-operator-controller-manager-7f8d8fb65b-6gj42" (UID: "2c2f2646-d609-433d-894e-5577cc497adc") : secret "metrics-server-cert" not found Dec 09 10:18:06 crc kubenswrapper[5002]: E1209 10:18:06.395831 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c2f2646-d609-433d-894e-5577cc497adc-webhook-certs podName:2c2f2646-d609-433d-894e-5577cc497adc nodeName:}" failed. No retries permitted until 2025-12-09 10:18:07.395802324 +0000 UTC m=+1019.787853405 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2c2f2646-d609-433d-894e-5577cc497adc-webhook-certs") pod "openstack-operator-controller-manager-7f8d8fb65b-6gj42" (UID: "2c2f2646-d609-433d-894e-5577cc497adc") : secret "webhook-server-cert" not found Dec 09 10:18:06 crc kubenswrapper[5002]: I1209 10:18:06.539303 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-jb68h"] Dec 09 10:18:06 crc kubenswrapper[5002]: I1209 10:18:06.546798 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-q294d"] Dec 09 10:18:06 crc kubenswrapper[5002]: I1209 10:18:06.552386 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-79c8c4686c-25fmz"] Dec 09 10:18:06 crc kubenswrapper[5002]: W1209 10:18:06.553713 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f6d862d_d775_4712_b455_ad5110968d19.slice/crio-0bb01e9b1b5efffa73955b5b43b04fda9a6c600fae6e4c311d7a9eb28c136924 WatchSource:0}: Error finding container 0bb01e9b1b5efffa73955b5b43b04fda9a6c600fae6e4c311d7a9eb28c136924: Status 404 returned error can't find the container with id 0bb01e9b1b5efffa73955b5b43b04fda9a6c600fae6e4c311d7a9eb28c136924 Dec 09 10:18:06 crc kubenswrapper[5002]: W1209 10:18:06.555364 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2eeed466_946c_49a5_9fe9_b393629c3394.slice/crio-ef2a6e77d10d1196abded1aad0784fbc92e77689e257c82697a66f11789e6a2d WatchSource:0}: Error finding container ef2a6e77d10d1196abded1aad0784fbc92e77689e257c82697a66f11789e6a2d: Status 404 returned error can't find the container with id ef2a6e77d10d1196abded1aad0784fbc92e77689e257c82697a66f11789e6a2d Dec 09 10:18:06 crc kubenswrapper[5002]: I1209 10:18:06.573737 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-495q6" event={"ID":"a3c0ed3b-56a5-490b-bca4-9541eeb8e2cf","Type":"ContainerStarted","Data":"de49510bb017c2d87ce0aa6ab635114f0e8b7e2f06b5b0b3e3472a1d539dc0ae"} Dec 09 10:18:06 crc kubenswrapper[5002]: I1209 10:18:06.575394 5002 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-jb68h" event={"ID":"097e947b-8923-4155-9d10-f0241f751ad8","Type":"ContainerStarted","Data":"2b3ff5f7ce5ab5754be88f3559daa80e716971f54dd290abc4c6862bb02352f0"} Dec 09 10:18:06 crc kubenswrapper[5002]: I1209 10:18:06.577347 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-25fmz" event={"ID":"2eeed466-946c-49a5-9fe9-b393629c3394","Type":"ContainerStarted","Data":"ef2a6e77d10d1196abded1aad0784fbc92e77689e257c82697a66f11789e6a2d"} Dec 09 10:18:06 crc kubenswrapper[5002]: I1209 10:18:06.578619 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5pq7d" event={"ID":"f0f92bf3-053d-4ea9-bd30-4e6724c414c8","Type":"ContainerStarted","Data":"f9b64919881dac06fbb8913dd8d3391f2f6dc56ecd49a53d0a16c50a9fac3f8b"} Dec 09 10:18:06 crc kubenswrapper[5002]: I1209 10:18:06.579953 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-vcspp" event={"ID":"ce6b14b8-1656-4ac0-b6e2-24fd9329faf4","Type":"ContainerStarted","Data":"b2a3d05f3b57a747cbab6f9b495e529948e90f5dd74b5e742d9d0d21371eef92"} Dec 09 10:18:06 crc kubenswrapper[5002]: I1209 10:18:06.581039 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-q294d" event={"ID":"5f6d862d-d775-4712-b455-ad5110968d19","Type":"ContainerStarted","Data":"0bb01e9b1b5efffa73955b5b43b04fda9a6c600fae6e4c311d7a9eb28c136924"} Dec 09 10:18:06 crc kubenswrapper[5002]: I1209 10:18:06.582412 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-4dzfs" event={"ID":"78d17849-0dbc-4b43-b5e6-49bd7c766aa1","Type":"ContainerStarted","Data":"e175159a3029e0e96de6e21ffc7b200dc1c5a2a0ae8ef7d0d40f1504cf0f8bf9"} Dec 09 10:18:06 crc kubenswrapper[5002]: I1209 10:18:06.583865 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-9th65" event={"ID":"80762edf-2f88-41a7-b96b-2856b2e2c00e","Type":"ContainerStarted","Data":"4af527adf5fbc5313cb703bf917ae01107b59d93af6ebae2de8cd03ec4777175"} Dec 09 10:18:06 crc kubenswrapper[5002]: I1209 10:18:06.585100 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-kj2xr" event={"ID":"7fce722c-f3f5-48b6-a567-e867e4e1f27b","Type":"ContainerStarted","Data":"1235887e3b4167cc0ef92211bad18fee175719aae3dc0e80119903c137ff3a0c"} Dec 09 10:18:06 crc kubenswrapper[5002]: I1209 10:18:06.597021 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-8jgwr" event={"ID":"2e565997-148f-4d70-ae78-89f194f791f8","Type":"ContainerStarted","Data":"9e814c444ac2390cbc1768e1ab7ef5d9a89c0742634a5be2891ad6ad833bf27c"} Dec 09 10:18:06 crc kubenswrapper[5002]: I1209 10:18:06.600483 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-967d97867-qvnbd" event={"ID":"d3588b02-67b1-4172-9ab8-9da4cb7b09dc","Type":"ContainerStarted","Data":"416adf305bb3240ff8714fca446beb008cc65e2d03bf296912dc91cf51908616"} Dec 09 10:18:06 crc kubenswrapper[5002]: I1209 10:18:06.603577 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-b8mpv" event={"ID":"8519dc96-e20f-47d9-9b48-e64a07393d39","Type":"ContainerStarted","Data":"19a9b0f82fd11bbd7fc75dd893d5df0dcf0026711ff90c84245d79eb284a8c1e"} Dec 09 10:18:06 crc kubenswrapper[5002]: I1209 10:18:06.623153 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-75944c9b7-vkkw9"] Dec 09 10:18:06 crc kubenswrapper[5002]: I1209 10:18:06.632136 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-8n5z8"] Dec 09 10:18:06 crc kubenswrapper[5002]: I1209 10:18:06.640228 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-hnnmt"] Dec 09 10:18:06 crc kubenswrapper[5002]: I1209 10:18:06.647206 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-58d5ff84df-4vg8l"] Dec 09 10:18:06 crc kubenswrapper[5002]: I1209 10:18:06.670613 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-7kqxz"] Dec 09 10:18:06 crc kubenswrapper[5002]: W1209 10:18:06.671807 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd172ce03_c24c_41ca_a61a_eecdd9572f5d.slice/crio-9929a905eeaed9dc2b9ee159e0a1da9f6c5dbe61e422e9a546d58db21b3b54f9 WatchSource:0}: Error finding container 9929a905eeaed9dc2b9ee159e0a1da9f6c5dbe61e422e9a546d58db21b3b54f9: Status 404 returned error can't find the container with id 9929a905eeaed9dc2b9ee159e0a1da9f6c5dbe61e422e9a546d58db21b3b54f9 Dec 09 10:18:06 crc kubenswrapper[5002]: W1209 10:18:06.673735 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51978e4b_eb47_4e87_8528_38631838226a.slice/crio-3ccd11032bf0d639a0526e90fd4fcc4f1f150c628b536670633d02379c0338d2 WatchSource:0}: Error finding container 3ccd11032bf0d639a0526e90fd4fcc4f1f150c628b536670633d02379c0338d2: Status 404 returned error can't find the container with id 3ccd11032bf0d639a0526e90fd4fcc4f1f150c628b536670633d02379c0338d2 Dec 09 10:18:06 crc kubenswrapper[5002]: E1209 10:18:06.678605 5002 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rw28v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-7kqxz_openstack-operators(51978e4b-eb47-4e87-8528-38631838226a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 09 10:18:06 crc kubenswrapper[5002]: I1209 10:18:06.683589 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-rdjzk"] Dec 09 10:18:06 crc kubenswrapper[5002]: E1209 10:18:06.683823 5002 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rw28v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-7kqxz_openstack-operators(51978e4b-eb47-4e87-8528-38631838226a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 09 10:18:06 crc kubenswrapper[5002]: E1209 10:18:06.686053 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" 
pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7kqxz" podUID="51978e4b-eb47-4e87-8528-38631838226a" Dec 09 10:18:06 crc kubenswrapper[5002]: I1209 10:18:06.689049 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9d58d64bc-vvm2b"] Dec 09 10:18:06 crc kubenswrapper[5002]: E1209 10:18:06.695165 5002 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gd26k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-9d58d64bc-vvm2b_openstack-operators(1c8db604-51f8-4fe5-b659-fc14b7c706f1): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 09 10:18:06 crc kubenswrapper[5002]: E1209 10:18:06.695334 5002 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:f27e732ec1faee765461bf137d9be81278b2fa39675019a73622755e1e610b6f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hstwc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-58d5ff84df-4vg8l_openstack-operators(c66676bd-3461-405a-bce7-60c91858b55e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 09 10:18:06 crc kubenswrapper[5002]: E1209 10:18:06.695458 5002 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jphq9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-rdjzk_openstack-operators(276be0c6-fc03-4693-91c7-5a999e6f0d89): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 09 10:18:06 crc kubenswrapper[5002]: E1209 10:18:06.697580 5002 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hstwc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-58d5ff84df-4vg8l_openstack-operators(c66676bd-3461-405a-bce7-60c91858b55e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 09 10:18:06 crc kubenswrapper[5002]: E1209 10:18:06.697927 5002 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sq92g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-8n5z8_openstack-operators(048abed6-46b3-4dba-bf28-8c0ee1685817): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 09 10:18:06 crc kubenswrapper[5002]: E1209 10:18:06.698418 5002 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jphq9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-rdjzk_openstack-operators(276be0c6-fc03-4693-91c7-5a999e6f0d89): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 09 10:18:06 crc kubenswrapper[5002]: E1209 10:18:06.699127 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-4vg8l" podUID="c66676bd-3461-405a-bce7-60c91858b55e" Dec 09 10:18:06 crc kubenswrapper[5002]: E1209 10:18:06.699278 5002 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gd26k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-9d58d64bc-vvm2b_openstack-operators(1c8db604-51f8-4fe5-b659-fc14b7c706f1): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 09 10:18:06 crc kubenswrapper[5002]: E1209 10:18:06.700463 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" 
pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-vvm2b" podUID="1c8db604-51f8-4fe5-b659-fc14b7c706f1" Dec 09 10:18:06 crc kubenswrapper[5002]: E1209 10:18:06.700557 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-rdjzk" podUID="276be0c6-fc03-4693-91c7-5a999e6f0d89" Dec 09 10:18:06 crc kubenswrapper[5002]: E1209 10:18:06.701748 5002 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sq92g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-8n5z8_openstack-operators(048abed6-46b3-4dba-bf28-8c0ee1685817): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 09 10:18:06 crc kubenswrapper[5002]: E1209 10:18:06.703458 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-8n5z8" podUID="048abed6-46b3-4dba-bf28-8c0ee1685817" Dec 09 10:18:06 crc kubenswrapper[5002]: I1209 10:18:06.715155 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/531ff665-3733-405e-b178-0f185d8cb22e-cert\") pod \"infra-operator-controller-manager-78d48bff9d-qqkqr\" (UID: \"531ff665-3733-405e-b178-0f185d8cb22e\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-qqkqr" Dec 09 10:18:06 crc kubenswrapper[5002]: E1209 10:18:06.715383 5002 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 09 10:18:06 crc kubenswrapper[5002]: E1209 10:18:06.715455 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/531ff665-3733-405e-b178-0f185d8cb22e-cert 
podName:531ff665-3733-405e-b178-0f185d8cb22e nodeName:}" failed. No retries permitted until 2025-12-09 10:18:08.715435969 +0000 UTC m=+1021.107487050 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/531ff665-3733-405e-b178-0f185d8cb22e-cert") pod "infra-operator-controller-manager-78d48bff9d-qqkqr" (UID: "531ff665-3733-405e-b178-0f185d8cb22e") : secret "infra-operator-webhook-server-cert" not found Dec 09 10:18:06 crc kubenswrapper[5002]: I1209 10:18:06.828733 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-x578z"] Dec 09 10:18:06 crc kubenswrapper[5002]: W1209 10:18:06.836747 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcecd7de4_71b0_4914_a5c5_68b8da7704cf.slice/crio-0426b3cad6107f07e7d8344b8d1bacd9d8566f9989b3e4324376841a6a68b981 WatchSource:0}: Error finding container 0426b3cad6107f07e7d8344b8d1bacd9d8566f9989b3e4324376841a6a68b981: Status 404 returned error can't find the container with id 0426b3cad6107f07e7d8344b8d1bacd9d8566f9989b3e4324376841a6a68b981 Dec 09 10:18:07 crc kubenswrapper[5002]: I1209 10:18:07.019894 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3f8701c7-f70e-4584-8cf1-ed40adda7b84-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fm85h8\" (UID: \"3f8701c7-f70e-4584-8cf1-ed40adda7b84\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fm85h8" Dec 09 10:18:07 crc kubenswrapper[5002]: E1209 10:18:07.020431 5002 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 09 10:18:07 crc kubenswrapper[5002]: E1209 10:18:07.020502 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f8701c7-f70e-4584-8cf1-ed40adda7b84-cert podName:3f8701c7-f70e-4584-8cf1-ed40adda7b84 nodeName:}" failed. No retries permitted until 2025-12-09 10:18:09.020484608 +0000 UTC m=+1021.412535689 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3f8701c7-f70e-4584-8cf1-ed40adda7b84-cert") pod "openstack-baremetal-operator-controller-manager-84b575879fm85h8" (UID: "3f8701c7-f70e-4584-8cf1-ed40adda7b84") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 09 10:18:07 crc kubenswrapper[5002]: I1209 10:18:07.425787 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2c2f2646-d609-433d-894e-5577cc497adc-webhook-certs\") pod \"openstack-operator-controller-manager-7f8d8fb65b-6gj42\" (UID: \"2c2f2646-d609-433d-894e-5577cc497adc\") " pod="openstack-operators/openstack-operator-controller-manager-7f8d8fb65b-6gj42" Dec 09 10:18:07 crc kubenswrapper[5002]: I1209 10:18:07.425910 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c2f2646-d609-433d-894e-5577cc497adc-metrics-certs\") pod \"openstack-operator-controller-manager-7f8d8fb65b-6gj42\" (UID: \"2c2f2646-d609-433d-894e-5577cc497adc\") " pod="openstack-operators/openstack-operator-controller-manager-7f8d8fb65b-6gj42" Dec 09 10:18:07 crc kubenswrapper[5002]: E1209 10:18:07.426148 5002 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 09 10:18:07 crc kubenswrapper[5002]: E1209 10:18:07.426209 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c2f2646-d609-433d-894e-5577cc497adc-metrics-certs podName:2c2f2646-d609-433d-894e-5577cc497adc nodeName:}" failed. No retries permitted until 2025-12-09 10:18:09.426190145 +0000 UTC m=+1021.818241226 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2c2f2646-d609-433d-894e-5577cc497adc-metrics-certs") pod "openstack-operator-controller-manager-7f8d8fb65b-6gj42" (UID: "2c2f2646-d609-433d-894e-5577cc497adc") : secret "metrics-server-cert" not found Dec 09 10:18:07 crc kubenswrapper[5002]: E1209 10:18:07.426588 5002 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 09 10:18:07 crc kubenswrapper[5002]: E1209 10:18:07.426620 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c2f2646-d609-433d-894e-5577cc497adc-webhook-certs podName:2c2f2646-d609-433d-894e-5577cc497adc nodeName:}" failed. No retries permitted until 2025-12-09 10:18:09.426610566 +0000 UTC m=+1021.818661647 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2c2f2646-d609-433d-894e-5577cc497adc-webhook-certs") pod "openstack-operator-controller-manager-7f8d8fb65b-6gj42" (UID: "2c2f2646-d609-433d-894e-5577cc497adc") : secret "webhook-server-cert" not found Dec 09 10:18:07 crc kubenswrapper[5002]: I1209 10:18:07.637312 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7kqxz" event={"ID":"51978e4b-eb47-4e87-8528-38631838226a","Type":"ContainerStarted","Data":"3ccd11032bf0d639a0526e90fd4fcc4f1f150c628b536670633d02379c0338d2"} Dec 09 10:18:07 crc kubenswrapper[5002]: I1209 10:18:07.638611 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-x578z" event={"ID":"cecd7de4-71b0-4914-a5c5-68b8da7704cf","Type":"ContainerStarted","Data":"0426b3cad6107f07e7d8344b8d1bacd9d8566f9989b3e4324376841a6a68b981"} Dec 09 10:18:07 crc kubenswrapper[5002]: E1209 10:18:07.639837 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7kqxz" podUID="51978e4b-eb47-4e87-8528-38631838226a" Dec 09 10:18:07 crc kubenswrapper[5002]: I1209 10:18:07.640731 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-rdjzk" event={"ID":"276be0c6-fc03-4693-91c7-5a999e6f0d89","Type":"ContainerStarted","Data":"6ef34ec5e25c9855cd815dd817892395bc1bd4daf74d20391f9be6a94cb95305"} Dec 09 10:18:07 crc kubenswrapper[5002]: E1209 10:18:07.642376 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-rdjzk" podUID="276be0c6-fc03-4693-91c7-5a999e6f0d89" Dec 09 10:18:07 crc kubenswrapper[5002]: I1209 10:18:07.642884 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-8n5z8" event={"ID":"048abed6-46b3-4dba-bf28-8c0ee1685817","Type":"ContainerStarted","Data":"42d914fe1c845c0c4e648fc78e5cda7c92d216bec6cc143a53a7988cad79eeb7"} Dec 09 10:18:07 crc kubenswrapper[5002]: E1209 10:18:07.650086 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-8n5z8" 
podUID="048abed6-46b3-4dba-bf28-8c0ee1685817" Dec 09 10:18:07 crc kubenswrapper[5002]: I1209 10:18:07.674582 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-4vg8l" event={"ID":"c66676bd-3461-405a-bce7-60c91858b55e","Type":"ContainerStarted","Data":"26fb9e884abeb9b1531e019e997801dfb203449ea6b2004eebd5e0370b8db5eb"} Dec 09 10:18:07 crc kubenswrapper[5002]: E1209 10:18:07.677378 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:f27e732ec1faee765461bf137d9be81278b2fa39675019a73622755e1e610b6f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-4vg8l" podUID="c66676bd-3461-405a-bce7-60c91858b55e" Dec 09 10:18:07 crc kubenswrapper[5002]: I1209 10:18:07.681285 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-vkkw9" event={"ID":"53e6de02-2fb5-4491-aa93-7ad8bd2a190c","Type":"ContainerStarted","Data":"0e3c2e2e421e57aa025b74d9c06258630f4f83a3b9ff1b9fcb988582644da00f"} Dec 09 10:18:07 crc kubenswrapper[5002]: I1209 10:18:07.683316 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-vvm2b" event={"ID":"1c8db604-51f8-4fe5-b659-fc14b7c706f1","Type":"ContainerStarted","Data":"db6dd361a1f715ff5f8058ef3dc21892075e905eec57defd83db69f573a4142a"} Dec 09 10:18:07 crc kubenswrapper[5002]: E1209 10:18:07.685488 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-vvm2b" podUID="1c8db604-51f8-4fe5-b659-fc14b7c706f1" Dec 09 10:18:07 crc kubenswrapper[5002]: I1209 10:18:07.691339 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-hnnmt" event={"ID":"d172ce03-c24c-41ca-a61a-eecdd9572f5d","Type":"ContainerStarted","Data":"9929a905eeaed9dc2b9ee159e0a1da9f6c5dbe61e422e9a546d58db21b3b54f9"} Dec 09 10:18:07 crc kubenswrapper[5002]: I1209 10:18:07.964486 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 10:18:07 crc kubenswrapper[5002]: I1209 10:18:07.964962 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 10:18:08 crc kubenswrapper[5002]: E1209 10:18:08.700118 5002 
pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7kqxz" podUID="51978e4b-eb47-4e87-8528-38631838226a" Dec 09 10:18:08 crc kubenswrapper[5002]: E1209 10:18:08.702239 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-8n5z8" podUID="048abed6-46b3-4dba-bf28-8c0ee1685817" Dec 09 10:18:08 crc kubenswrapper[5002]: E1209 10:18:08.702254 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-vvm2b" podUID="1c8db604-51f8-4fe5-b659-fc14b7c706f1" Dec 09 10:18:08 crc kubenswrapper[5002]: E1209 10:18:08.702320 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-rdjzk" podUID="276be0c6-fc03-4693-91c7-5a999e6f0d89" Dec 09 10:18:08 crc kubenswrapper[5002]: E1209 10:18:08.702325 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:f27e732ec1faee765461bf137d9be81278b2fa39675019a73622755e1e610b6f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-4vg8l" podUID="c66676bd-3461-405a-bce7-60c91858b55e" Dec 09 10:18:08 crc kubenswrapper[5002]: I1209 10:18:08.759635 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/531ff665-3733-405e-b178-0f185d8cb22e-cert\") pod \"infra-operator-controller-manager-78d48bff9d-qqkqr\" (UID: \"531ff665-3733-405e-b178-0f185d8cb22e\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-qqkqr" Dec 
09 10:18:08 crc kubenswrapper[5002]: E1209 10:18:08.760341 5002 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 09 10:18:08 crc kubenswrapper[5002]: E1209 10:18:08.760426 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/531ff665-3733-405e-b178-0f185d8cb22e-cert podName:531ff665-3733-405e-b178-0f185d8cb22e nodeName:}" failed. No retries permitted until 2025-12-09 10:18:12.760405963 +0000 UTC m=+1025.152457044 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/531ff665-3733-405e-b178-0f185d8cb22e-cert") pod "infra-operator-controller-manager-78d48bff9d-qqkqr" (UID: "531ff665-3733-405e-b178-0f185d8cb22e") : secret "infra-operator-webhook-server-cert" not found Dec 09 10:18:09 crc kubenswrapper[5002]: I1209 10:18:09.063490 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3f8701c7-f70e-4584-8cf1-ed40adda7b84-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fm85h8\" (UID: \"3f8701c7-f70e-4584-8cf1-ed40adda7b84\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fm85h8" Dec 09 10:18:09 crc kubenswrapper[5002]: E1209 10:18:09.063658 5002 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 09 10:18:09 crc kubenswrapper[5002]: E1209 10:18:09.063738 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f8701c7-f70e-4584-8cf1-ed40adda7b84-cert podName:3f8701c7-f70e-4584-8cf1-ed40adda7b84 nodeName:}" failed. No retries permitted until 2025-12-09 10:18:13.063719977 +0000 UTC m=+1025.455771118 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3f8701c7-f70e-4584-8cf1-ed40adda7b84-cert") pod "openstack-baremetal-operator-controller-manager-84b575879fm85h8" (UID: "3f8701c7-f70e-4584-8cf1-ed40adda7b84") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 09 10:18:09 crc kubenswrapper[5002]: I1209 10:18:09.482524 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c2f2646-d609-433d-894e-5577cc497adc-metrics-certs\") pod \"openstack-operator-controller-manager-7f8d8fb65b-6gj42\" (UID: \"2c2f2646-d609-433d-894e-5577cc497adc\") " pod="openstack-operators/openstack-operator-controller-manager-7f8d8fb65b-6gj42" Dec 09 10:18:09 crc kubenswrapper[5002]: E1209 10:18:09.483020 5002 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 09 10:18:09 crc kubenswrapper[5002]: E1209 10:18:09.483130 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c2f2646-d609-433d-894e-5577cc497adc-metrics-certs podName:2c2f2646-d609-433d-894e-5577cc497adc nodeName:}" failed. No retries permitted until 2025-12-09 10:18:13.483102404 +0000 UTC m=+1025.875153475 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2c2f2646-d609-433d-894e-5577cc497adc-metrics-certs") pod "openstack-operator-controller-manager-7f8d8fb65b-6gj42" (UID: "2c2f2646-d609-433d-894e-5577cc497adc") : secret "metrics-server-cert" not found Dec 09 10:18:09 crc kubenswrapper[5002]: I1209 10:18:09.484028 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2c2f2646-d609-433d-894e-5577cc497adc-webhook-certs\") pod \"openstack-operator-controller-manager-7f8d8fb65b-6gj42\" (UID: \"2c2f2646-d609-433d-894e-5577cc497adc\") " pod="openstack-operators/openstack-operator-controller-manager-7f8d8fb65b-6gj42" Dec 09 10:18:09 crc kubenswrapper[5002]: E1209 10:18:09.484452 5002 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 09 10:18:09 crc kubenswrapper[5002]: E1209 10:18:09.484583 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c2f2646-d609-433d-894e-5577cc497adc-webhook-certs podName:2c2f2646-d609-433d-894e-5577cc497adc nodeName:}" failed. No retries permitted until 2025-12-09 10:18:13.484530312 +0000 UTC m=+1025.876581393 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2c2f2646-d609-433d-894e-5577cc497adc-webhook-certs") pod "openstack-operator-controller-manager-7f8d8fb65b-6gj42" (UID: "2c2f2646-d609-433d-894e-5577cc497adc") : secret "webhook-server-cert" not found Dec 09 10:18:12 crc kubenswrapper[5002]: I1209 10:18:12.838061 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/531ff665-3733-405e-b178-0f185d8cb22e-cert\") pod \"infra-operator-controller-manager-78d48bff9d-qqkqr\" (UID: \"531ff665-3733-405e-b178-0f185d8cb22e\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-qqkqr" Dec 09 10:18:12 crc kubenswrapper[5002]: E1209 10:18:12.838442 5002 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 09 10:18:12 crc kubenswrapper[5002]: E1209 10:18:12.839002 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/531ff665-3733-405e-b178-0f185d8cb22e-cert podName:531ff665-3733-405e-b178-0f185d8cb22e nodeName:}" failed. No retries permitted until 2025-12-09 10:18:20.838973622 +0000 UTC m=+1033.231024743 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/531ff665-3733-405e-b178-0f185d8cb22e-cert") pod "infra-operator-controller-manager-78d48bff9d-qqkqr" (UID: "531ff665-3733-405e-b178-0f185d8cb22e") : secret "infra-operator-webhook-server-cert" not found Dec 09 10:18:13 crc kubenswrapper[5002]: I1209 10:18:13.143410 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3f8701c7-f70e-4584-8cf1-ed40adda7b84-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fm85h8\" (UID: \"3f8701c7-f70e-4584-8cf1-ed40adda7b84\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fm85h8" Dec 09 10:18:13 crc kubenswrapper[5002]: E1209 10:18:13.143588 5002 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 09 10:18:13 crc kubenswrapper[5002]: E1209 10:18:13.144693 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f8701c7-f70e-4584-8cf1-ed40adda7b84-cert podName:3f8701c7-f70e-4584-8cf1-ed40adda7b84 nodeName:}" failed. No retries permitted until 2025-12-09 10:18:21.143900148 +0000 UTC m=+1033.535951229 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3f8701c7-f70e-4584-8cf1-ed40adda7b84-cert") pod "openstack-baremetal-operator-controller-manager-84b575879fm85h8" (UID: "3f8701c7-f70e-4584-8cf1-ed40adda7b84") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 09 10:18:13 crc kubenswrapper[5002]: I1209 10:18:13.562778 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2c2f2646-d609-433d-894e-5577cc497adc-webhook-certs\") pod \"openstack-operator-controller-manager-7f8d8fb65b-6gj42\" (UID: \"2c2f2646-d609-433d-894e-5577cc497adc\") " pod="openstack-operators/openstack-operator-controller-manager-7f8d8fb65b-6gj42" Dec 09 10:18:13 crc kubenswrapper[5002]: I1209 10:18:13.562908 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c2f2646-d609-433d-894e-5577cc497adc-metrics-certs\") pod \"openstack-operator-controller-manager-7f8d8fb65b-6gj42\" (UID: \"2c2f2646-d609-433d-894e-5577cc497adc\") " pod="openstack-operators/openstack-operator-controller-manager-7f8d8fb65b-6gj42" Dec 09 10:18:13 crc kubenswrapper[5002]: E1209 10:18:13.563050 5002 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 09 10:18:13 crc kubenswrapper[5002]: E1209 10:18:13.563101 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c2f2646-d609-433d-894e-5577cc497adc-metrics-certs podName:2c2f2646-d609-433d-894e-5577cc497adc nodeName:}" failed. No retries permitted until 2025-12-09 10:18:21.56308569 +0000 UTC m=+1033.955136771 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2c2f2646-d609-433d-894e-5577cc497adc-metrics-certs") pod "openstack-operator-controller-manager-7f8d8fb65b-6gj42" (UID: "2c2f2646-d609-433d-894e-5577cc497adc") : secret "metrics-server-cert" not found Dec 09 10:18:13 crc kubenswrapper[5002]: E1209 10:18:13.563144 5002 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 09 10:18:13 crc kubenswrapper[5002]: E1209 10:18:13.563164 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c2f2646-d609-433d-894e-5577cc497adc-webhook-certs podName:2c2f2646-d609-433d-894e-5577cc497adc nodeName:}" failed. No retries permitted until 2025-12-09 10:18:21.563158262 +0000 UTC m=+1033.955209343 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2c2f2646-d609-433d-894e-5577cc497adc-webhook-certs") pod "openstack-operator-controller-manager-7f8d8fb65b-6gj42" (UID: "2c2f2646-d609-433d-894e-5577cc497adc") : secret "webhook-server-cert" not found Dec 09 10:18:18 crc kubenswrapper[5002]: E1209 10:18:18.658892 5002 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429" Dec 09 10:18:18 crc kubenswrapper[5002]: E1209 10:18:18.659345 5002 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pk6r2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-5f64f6f8bb-5pq7d_openstack-operators(f0f92bf3-053d-4ea9-bd30-4e6724c414c8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 09 10:18:20 crc kubenswrapper[5002]: I1209 10:18:20.886963 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/531ff665-3733-405e-b178-0f185d8cb22e-cert\") pod \"infra-operator-controller-manager-78d48bff9d-qqkqr\" (UID: \"531ff665-3733-405e-b178-0f185d8cb22e\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-qqkqr" Dec 09 10:18:20 crc kubenswrapper[5002]: I1209 10:18:20.892366 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/531ff665-3733-405e-b178-0f185d8cb22e-cert\") pod \"infra-operator-controller-manager-78d48bff9d-qqkqr\" (UID: \"531ff665-3733-405e-b178-0f185d8cb22e\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-qqkqr" Dec 09 10:18:21 crc kubenswrapper[5002]: I1209 10:18:21.109140 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-zwx57" Dec 09 10:18:21 crc kubenswrapper[5002]: I1209 10:18:21.118144 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-qqkqr" Dec 09 10:18:21 crc kubenswrapper[5002]: I1209 10:18:21.191774 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3f8701c7-f70e-4584-8cf1-ed40adda7b84-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fm85h8\" (UID: \"3f8701c7-f70e-4584-8cf1-ed40adda7b84\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fm85h8" Dec 09 10:18:21 crc kubenswrapper[5002]: I1209 10:18:21.200582 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3f8701c7-f70e-4584-8cf1-ed40adda7b84-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fm85h8\" (UID: \"3f8701c7-f70e-4584-8cf1-ed40adda7b84\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fm85h8" Dec 09 10:18:21 crc kubenswrapper[5002]: I1209 10:18:21.408489 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-h4r6z" Dec 09 10:18:21 crc kubenswrapper[5002]: I1209 10:18:21.416612 5002 util.go:30] "No sandbox for pod can be found. 
Dec 09 10:18:21 crc kubenswrapper[5002]: I1209 10:18:21.597307 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c2f2646-d609-433d-894e-5577cc497adc-metrics-certs\") pod \"openstack-operator-controller-manager-7f8d8fb65b-6gj42\" (UID: \"2c2f2646-d609-433d-894e-5577cc497adc\") " pod="openstack-operators/openstack-operator-controller-manager-7f8d8fb65b-6gj42"
Dec 09 10:18:21 crc kubenswrapper[5002]: I1209 10:18:21.597474 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2c2f2646-d609-433d-894e-5577cc497adc-webhook-certs\") pod \"openstack-operator-controller-manager-7f8d8fb65b-6gj42\" (UID: \"2c2f2646-d609-433d-894e-5577cc497adc\") " pod="openstack-operators/openstack-operator-controller-manager-7f8d8fb65b-6gj42"
Dec 09 10:18:21 crc kubenswrapper[5002]: E1209 10:18:21.597627 5002 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Dec 09 10:18:21 crc kubenswrapper[5002]: E1209 10:18:21.597688 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c2f2646-d609-433d-894e-5577cc497adc-webhook-certs podName:2c2f2646-d609-433d-894e-5577cc497adc nodeName:}" failed. No retries permitted until 2025-12-09 10:18:37.597669715 +0000 UTC m=+1049.989720796 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2c2f2646-d609-433d-894e-5577cc497adc-webhook-certs") pod "openstack-operator-controller-manager-7f8d8fb65b-6gj42" (UID: "2c2f2646-d609-433d-894e-5577cc497adc") : secret "webhook-server-cert" not found
Dec 09 10:18:21 crc kubenswrapper[5002]: I1209 10:18:21.601343 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c2f2646-d609-433d-894e-5577cc497adc-metrics-certs\") pod \"openstack-operator-controller-manager-7f8d8fb65b-6gj42\" (UID: \"2c2f2646-d609-433d-894e-5577cc497adc\") " pod="openstack-operators/openstack-operator-controller-manager-7f8d8fb65b-6gj42"
Dec 09 10:18:27 crc kubenswrapper[5002]: E1209 10:18:27.991170 5002 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168"
Dec 09 10:18:27 crc kubenswrapper[5002]: E1209 10:18:27.991775 5002 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jsdqw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-jb68h_openstack-operators(097e947b-8923-4155-9d10-f0241f751ad8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 09 10:18:30 crc kubenswrapper[5002]: E1209 10:18:30.365548 5002 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:961417d59f527d925ac48ff6a11de747d0493315e496e34dc83d76a1a1fff58a"
Dec 09 10:18:30 crc kubenswrapper[5002]: E1209 10:18:30.365913 5002 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:961417d59f527d925ac48ff6a11de747d0493315e496e34dc83d76a1a1fff58a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z785k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-75944c9b7-vkkw9_openstack-operators(53e6de02-2fb5-4491-aa93-7ad8bd2a190c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 09 10:18:30 crc kubenswrapper[5002]: E1209 10:18:30.921462 5002 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7"
Dec 09 10:18:30 crc kubenswrapper[5002]: E1209 10:18:30.921920 5002 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kwnfd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-495q6_openstack-operators(a3c0ed3b-56a5-490b-bca4-9541eeb8e2cf): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 09 10:18:31 crc kubenswrapper[5002]: E1209 10:18:31.333914 5002 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2"
Dec 09 10:18:31 crc kubenswrapper[5002]: E1209 10:18:31.334188 5002 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hhx64,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-x578z_openstack-operators(cecd7de4-71b0-4914-a5c5-68b8da7704cf): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
logger="UnhandledError" Dec 09 10:18:31 crc kubenswrapper[5002]: E1209 10:18:31.335377 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-x578z" podUID="cecd7de4-71b0-4914-a5c5-68b8da7704cf" Dec 09 10:18:31 crc kubenswrapper[5002]: E1209 10:18:31.901025 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-x578z" podUID="cecd7de4-71b0-4914-a5c5-68b8da7704cf" Dec 09 10:18:34 crc kubenswrapper[5002]: I1209 10:18:34.733051 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fm85h8"] Dec 09 10:18:34 crc kubenswrapper[5002]: I1209 10:18:34.847325 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-78d48bff9d-qqkqr"] Dec 09 10:18:35 crc kubenswrapper[5002]: W1209 10:18:35.085145 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f8701c7_f70e_4584_8cf1_ed40adda7b84.slice/crio-7375619b8b0679f379fb61020bc3abe5c1256bc2413fa416f198390892a0e253 WatchSource:0}: Error finding container 7375619b8b0679f379fb61020bc3abe5c1256bc2413fa416f198390892a0e253: Status 404 returned error can't find the container with id 7375619b8b0679f379fb61020bc3abe5c1256bc2413fa416f198390892a0e253 Dec 09 10:18:35 crc kubenswrapper[5002]: W1209 10:18:35.086979 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod531ff665_3733_405e_b178_0f185d8cb22e.slice/crio-a9dec5c168f6336b9999dc0532e97b30ed36d2fc5f780d4bec52a33e80c32904 WatchSource:0}: Error finding container a9dec5c168f6336b9999dc0532e97b30ed36d2fc5f780d4bec52a33e80c32904: Status 404 returned error can't find the container with id a9dec5c168f6336b9999dc0532e97b30ed36d2fc5f780d4bec52a33e80c32904 Dec 09 10:18:35 crc kubenswrapper[5002]: I1209 10:18:35.925386 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-q294d" event={"ID":"5f6d862d-d775-4712-b455-ad5110968d19","Type":"ContainerStarted","Data":"27a95a585558f0eb1bfa1ce237dc698330a92d841674435172e1d53cbe782d23"} Dec 09 10:18:35 crc kubenswrapper[5002]: I1209 10:18:35.933719 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-25fmz" event={"ID":"2eeed466-946c-49a5-9fe9-b393629c3394","Type":"ContainerStarted","Data":"65f879d085f3d2abebb85849689a5e71e8cae26a5f3cba0be3621dcc4465bf06"} Dec 09 10:18:35 crc kubenswrapper[5002]: I1209 10:18:35.953916 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-8jgwr" event={"ID":"2e565997-148f-4d70-ae78-89f194f791f8","Type":"ContainerStarted","Data":"83001d7e1d82db15235bb9cd625fb9d8b3ef28b18b67c52c5f8e1423eeaa6181"} Dec 09 10:18:35 crc kubenswrapper[5002]: I1209 10:18:35.957347 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ironic-operator-controller-manager-967d97867-qvnbd" event={"ID":"d3588b02-67b1-4172-9ab8-9da4cb7b09dc","Type":"ContainerStarted","Data":"554134110ef502e3a67a8b0efacd1b03d77e5e02bd8b19ac71e6a091dcb55495"} Dec 09 10:18:35 crc kubenswrapper[5002]: I1209 10:18:35.958794 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-vcspp" event={"ID":"ce6b14b8-1656-4ac0-b6e2-24fd9329faf4","Type":"ContainerStarted","Data":"249fe1d9c75f4e4f9ac34fbd3cd895c7c0bc081e891fd4a71c083eb6c26aae64"} Dec 09 10:18:35 crc kubenswrapper[5002]: I1209 10:18:35.967194 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-b8mpv" event={"ID":"8519dc96-e20f-47d9-9b48-e64a07393d39","Type":"ContainerStarted","Data":"95293bf7b1e2d6417917e0c92b0ff78a1b77e037dae61f0ea131dc65b97b474c"} Dec 09 10:18:35 crc kubenswrapper[5002]: I1209 10:18:35.968407 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-4dzfs" event={"ID":"78d17849-0dbc-4b43-b5e6-49bd7c766aa1","Type":"ContainerStarted","Data":"21d360c42d409ffe206dc76a64249c15cab71008dedfa3289976a042b0f487b1"} Dec 09 10:18:35 crc kubenswrapper[5002]: I1209 10:18:35.969284 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-hnnmt" event={"ID":"d172ce03-c24c-41ca-a61a-eecdd9572f5d","Type":"ContainerStarted","Data":"de99fd514933dd38abe0aed1183c65a40d4aabab7a4d4eb7b6f69264417e6bb3"} Dec 09 10:18:35 crc kubenswrapper[5002]: I1209 10:18:35.970579 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-kj2xr" event={"ID":"7fce722c-f3f5-48b6-a567-e867e4e1f27b","Type":"ContainerStarted","Data":"6f778db56cd8bcc90ea406752289952228a52c0ef637c6cd2bfea8f2a5957244"} Dec 09 10:18:35 crc kubenswrapper[5002]: I1209 10:18:35.975131 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fm85h8" event={"ID":"3f8701c7-f70e-4584-8cf1-ed40adda7b84","Type":"ContainerStarted","Data":"7375619b8b0679f379fb61020bc3abe5c1256bc2413fa416f198390892a0e253"} Dec 09 10:18:35 crc kubenswrapper[5002]: I1209 10:18:35.976208 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-qqkqr" event={"ID":"531ff665-3733-405e-b178-0f185d8cb22e","Type":"ContainerStarted","Data":"a9dec5c168f6336b9999dc0532e97b30ed36d2fc5f780d4bec52a33e80c32904"} Dec 09 10:18:36 crc kubenswrapper[5002]: I1209 10:18:36.984367 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-9th65" event={"ID":"80762edf-2f88-41a7-b96b-2856b2e2c00e","Type":"ContainerStarted","Data":"3b94c6ba054d4cd306aa95f8ea248464763cfe63fbae83f4cfb68f8bb5cfd509"} Dec 09 10:18:37 crc kubenswrapper[5002]: I1209 10:18:37.648143 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2c2f2646-d609-433d-894e-5577cc497adc-webhook-certs\") pod \"openstack-operator-controller-manager-7f8d8fb65b-6gj42\" (UID: \"2c2f2646-d609-433d-894e-5577cc497adc\") " pod="openstack-operators/openstack-operator-controller-manager-7f8d8fb65b-6gj42" Dec 09 10:18:37 crc kubenswrapper[5002]: I1209 10:18:37.657772 5002 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2c2f2646-d609-433d-894e-5577cc497adc-webhook-certs\") pod \"openstack-operator-controller-manager-7f8d8fb65b-6gj42\" (UID: \"2c2f2646-d609-433d-894e-5577cc497adc\") " pod="openstack-operators/openstack-operator-controller-manager-7f8d8fb65b-6gj42" Dec 09 10:18:37 crc kubenswrapper[5002]: E1209 10:18:37.685398 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5pq7d" podUID="f0f92bf3-053d-4ea9-bd30-4e6724c414c8" Dec 09 10:18:37 crc kubenswrapper[5002]: I1209 10:18:37.812838 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-qj9nr" Dec 09 10:18:37 crc kubenswrapper[5002]: I1209 10:18:37.821202 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7f8d8fb65b-6gj42" Dec 09 10:18:37 crc kubenswrapper[5002]: I1209 10:18:37.964443 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 10:18:37 crc kubenswrapper[5002]: I1209 10:18:37.964518 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 10:18:37 crc kubenswrapper[5002]: I1209 10:18:37.964571 5002 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" Dec 09 10:18:37 crc kubenswrapper[5002]: I1209 10:18:37.965288 5002 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9580dfbfe31ac43f61d3b220c2620f364cbabb0180f5e1a555d93ea2015032be"} pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 10:18:37 crc kubenswrapper[5002]: I1209 10:18:37.965361 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" containerID="cri-o://9580dfbfe31ac43f61d3b220c2620f364cbabb0180f5e1a555d93ea2015032be" gracePeriod=600 Dec 09 10:18:37 crc kubenswrapper[5002]: I1209 10:18:37.990955 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-rdjzk" event={"ID":"276be0c6-fc03-4693-91c7-5a999e6f0d89","Type":"ContainerStarted","Data":"7eea7e0c84765603d751d3d4ab6f3353ff5ca5e44d3e19e3ac4986b3dc83b110"} Dec 09 10:18:37 crc kubenswrapper[5002]: I1209 10:18:37.993635 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-8n5z8" 
event={"ID":"048abed6-46b3-4dba-bf28-8c0ee1685817","Type":"ContainerStarted","Data":"26e26b54be35c4503e635ca49e91686e4dba3608d5166d8672ae165f3d333186"} Dec 09 10:18:37 crc kubenswrapper[5002]: I1209 10:18:37.995082 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-4vg8l" event={"ID":"c66676bd-3461-405a-bce7-60c91858b55e","Type":"ContainerStarted","Data":"90e86c28869f8c15df3d5e0b428f5e30832c9a309c7cc486b13a1f7803c8f696"} Dec 09 10:18:37 crc kubenswrapper[5002]: I1209 10:18:37.996350 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5pq7d" event={"ID":"f0f92bf3-053d-4ea9-bd30-4e6724c414c8","Type":"ContainerStarted","Data":"cf6a9d8ca09668d63c332d19d6048c136c8c208bae097f97ebd45bcadf70dda0"} Dec 09 10:18:37 crc kubenswrapper[5002]: I1209 10:18:37.998635 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-vvm2b" event={"ID":"1c8db604-51f8-4fe5-b659-fc14b7c706f1","Type":"ContainerStarted","Data":"a56041bddf387fde558d3337a3363811e2d426fe6b26b4072b80dacc625ef4fa"} Dec 09 10:18:38 crc kubenswrapper[5002]: I1209 10:18:38.001037 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7kqxz" event={"ID":"51978e4b-eb47-4e87-8528-38631838226a","Type":"ContainerStarted","Data":"0f400ceae9fa2be8a88d0cdc92fdd8b8735021583f462a5539f511d83517b591"} Dec 09 10:18:39 crc kubenswrapper[5002]: I1209 10:18:39.008853 5002 generic.go:334] "Generic (PLEG): container finished" podID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerID="9580dfbfe31ac43f61d3b220c2620f364cbabb0180f5e1a555d93ea2015032be" exitCode=0 Dec 09 10:18:39 crc kubenswrapper[5002]: I1209 10:18:39.008909 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerDied","Data":"9580dfbfe31ac43f61d3b220c2620f364cbabb0180f5e1a555d93ea2015032be"} Dec 09 10:18:39 crc kubenswrapper[5002]: I1209 10:18:39.009222 5002 scope.go:117] "RemoveContainer" containerID="637ff8a569cd1216521e8fa15ac9579d9708df3205b27a6dad02b958376b5a55" Dec 09 10:18:39 crc kubenswrapper[5002]: E1209 10:18:39.739148 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-998648c74-jb68h" podUID="097e947b-8923-4155-9d10-f0241f751ad8" Dec 09 10:18:39 crc kubenswrapper[5002]: I1209 10:18:39.866735 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7f8d8fb65b-6gj42"] Dec 09 10:18:40 crc kubenswrapper[5002]: I1209 10:18:40.018617 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-rdjzk" event={"ID":"276be0c6-fc03-4693-91c7-5a999e6f0d89","Type":"ContainerStarted","Data":"389e46ef6373570ae3fdf496398957257a1c16a6833c2a1c4308fb1592938e55"} Dec 09 10:18:40 crc kubenswrapper[5002]: I1209 10:18:40.018920 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-rdjzk" Dec 09 10:18:40 crc kubenswrapper[5002]: I1209 10:18:40.020282 5002 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack-operators/openstack-operator-controller-manager-7f8d8fb65b-6gj42" event={"ID":"2c2f2646-d609-433d-894e-5577cc497adc","Type":"ContainerStarted","Data":"0a54358c1e372215f8099a8322770fedfd4ecfb99d98c31e80223daf7e92710a"} Dec 09 10:18:40 crc kubenswrapper[5002]: I1209 10:18:40.023053 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerStarted","Data":"3884e46cf25151268d65649fde8e75f33e599a76a13b5c73816d374f2399025a"} Dec 09 10:18:40 crc kubenswrapper[5002]: I1209 10:18:40.024277 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-jb68h" event={"ID":"097e947b-8923-4155-9d10-f0241f751ad8","Type":"ContainerStarted","Data":"b8436497690095f4e6a51edfe2f2c5952d8bd20dfa1da4af9c72c417b4f20794"} Dec 09 10:18:40 crc kubenswrapper[5002]: I1209 10:18:40.025661 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-vvm2b" event={"ID":"1c8db604-51f8-4fe5-b659-fc14b7c706f1","Type":"ContainerStarted","Data":"94f60d6d8821266ab818f75f50d3ab8971bfdb86ae8209772fcd4ea71ee0366d"} Dec 09 10:18:40 crc kubenswrapper[5002]: I1209 10:18:40.025790 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-vvm2b" Dec 09 10:18:40 crc kubenswrapper[5002]: I1209 10:18:40.037424 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-rdjzk" podStartSLOduration=8.274383301 podStartE2EDuration="36.037406381s" podCreationTimestamp="2025-12-09 10:18:04 +0000 UTC" firstStartedPulling="2025-12-09 10:18:06.69539212 +0000 UTC m=+1019.087443201" lastFinishedPulling="2025-12-09 10:18:34.4584152 +0000 UTC m=+1046.850466281" observedRunningTime="2025-12-09 10:18:40.032888139 +0000 UTC m=+1052.424939230" watchObservedRunningTime="2025-12-09 10:18:40.037406381 +0000 UTC m=+1052.429457462" Dec 09 10:18:40 crc kubenswrapper[5002]: I1209 10:18:40.094648 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-vvm2b" podStartSLOduration=7.697405444 podStartE2EDuration="36.094625563s" podCreationTimestamp="2025-12-09 10:18:04 +0000 UTC" firstStartedPulling="2025-12-09 10:18:06.694996819 +0000 UTC m=+1019.087047900" lastFinishedPulling="2025-12-09 10:18:35.092216938 +0000 UTC m=+1047.484268019" observedRunningTime="2025-12-09 10:18:40.091416636 +0000 UTC m=+1052.483467717" watchObservedRunningTime="2025-12-09 10:18:40.094625563 +0000 UTC m=+1052.486676644" Dec 09 10:18:40 crc kubenswrapper[5002]: E1209 10:18:40.256515 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-495q6" podUID="a3c0ed3b-56a5-490b-bca4-9541eeb8e2cf" Dec 09 10:18:40 crc kubenswrapper[5002]: E1209 10:18:40.444869 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-vkkw9" 
podUID="53e6de02-2fb5-4491-aa93-7ad8bd2a190c" Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.036282 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-495q6" event={"ID":"a3c0ed3b-56a5-490b-bca4-9541eeb8e2cf","Type":"ContainerStarted","Data":"4ddfdee07059519da6024c5f03c62d063b1751b56f36e55c7dbfe09a672c7f83"} Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.050754 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5pq7d" event={"ID":"f0f92bf3-053d-4ea9-bd30-4e6724c414c8","Type":"ContainerStarted","Data":"dcfd5e4d04a1eac6207346eb62d689fbf449aff8f723b0fc37a097bc879b3ae8"} Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.051510 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5pq7d" Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.059946 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-vkkw9" event={"ID":"53e6de02-2fb5-4491-aa93-7ad8bd2a190c","Type":"ContainerStarted","Data":"23224b3ecc35d57928ebb14bb979197da20747d219082cee14cb20bf00d50065"} Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.072033 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-967d97867-qvnbd" event={"ID":"d3588b02-67b1-4172-9ab8-9da4cb7b09dc","Type":"ContainerStarted","Data":"69f694aa57bd8d5df3b0a1cc704a021d76f81476325549e2f7a5c1557433b557"} Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.074348 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-967d97867-qvnbd" Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.080002 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-967d97867-qvnbd" Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.095011 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7kqxz" event={"ID":"51978e4b-eb47-4e87-8528-38631838226a","Type":"ContainerStarted","Data":"727f0ee52fa9f06546f7587181f3bfa9907dcc830e6bdd13ae0a9f4be3d5960f"} Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.095683 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7kqxz" Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.133255 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5pq7d" podStartSLOduration=3.885410995 podStartE2EDuration="37.13323626s" podCreationTimestamp="2025-12-09 10:18:04 +0000 UTC" firstStartedPulling="2025-12-09 10:18:06.120866978 +0000 UTC m=+1018.512918059" lastFinishedPulling="2025-12-09 10:18:39.368692213 +0000 UTC m=+1051.760743324" observedRunningTime="2025-12-09 10:18:41.130697241 +0000 UTC m=+1053.522748322" watchObservedRunningTime="2025-12-09 10:18:41.13323626 +0000 UTC m=+1053.525287341" Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.135080 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-8n5z8" 
event={"ID":"048abed6-46b3-4dba-bf28-8c0ee1685817","Type":"ContainerStarted","Data":"2562595c0a9d212105533c085e0d97093126f2652a7e046af18cba43dcf4f75f"} Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.135836 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-8n5z8" Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.150041 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-4vg8l" event={"ID":"c66676bd-3461-405a-bce7-60c91858b55e","Type":"ContainerStarted","Data":"e4e08fb6cacb492d7ba614c410692187d0b8d0597b4424f80ee45141900f1540"} Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.150675 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-4vg8l" Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.222548 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-8jgwr" event={"ID":"2e565997-148f-4d70-ae78-89f194f791f8","Type":"ContainerStarted","Data":"50294cb6f534cf11fef74d88fcd1fb6e7cf0ea272e8928c4066364e69b577567"} Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.223189 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-8jgwr" Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.223648 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-967d97867-qvnbd" podStartSLOduration=4.082958418 podStartE2EDuration="37.223638225s" podCreationTimestamp="2025-12-09 10:18:04 +0000 UTC" firstStartedPulling="2025-12-09 10:18:06.319557822 +0000 UTC m=+1018.711608903" lastFinishedPulling="2025-12-09 10:18:39.460237629 +0000 UTC m=+1051.852288710" observedRunningTime="2025-12-09 10:18:41.158172121 +0000 UTC m=+1053.550223202" watchObservedRunningTime="2025-12-09 10:18:41.223638225 +0000 UTC m=+1053.615689306" Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.238010 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-8jgwr" Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.245167 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-b8mpv" event={"ID":"8519dc96-e20f-47d9-9b48-e64a07393d39","Type":"ContainerStarted","Data":"cdba35a628dce2124e7c6301a88068bdd7970ed362466220a2a3e6eedcc9e9fa"} Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.246136 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7kqxz" podStartSLOduration=9.466434781 podStartE2EDuration="37.246109161s" podCreationTimestamp="2025-12-09 10:18:04 +0000 UTC" firstStartedPulling="2025-12-09 10:18:06.677839476 +0000 UTC m=+1019.069890547" lastFinishedPulling="2025-12-09 10:18:34.457513836 +0000 UTC m=+1046.849564927" observedRunningTime="2025-12-09 10:18:41.209217847 +0000 UTC m=+1053.601268928" watchObservedRunningTime="2025-12-09 10:18:41.246109161 +0000 UTC m=+1053.638160242" Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.246246 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-b8mpv" Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.284107 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-b8mpv" Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.288227 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-4vg8l" podStartSLOduration=7.891389366 podStartE2EDuration="36.288211035s" podCreationTimestamp="2025-12-09 10:18:05 +0000 UTC" firstStartedPulling="2025-12-09 10:18:06.695251206 +0000 UTC m=+1019.087302287" lastFinishedPulling="2025-12-09 10:18:35.092072885 +0000 UTC m=+1047.484123956" observedRunningTime="2025-12-09 10:18:41.244627031 +0000 UTC m=+1053.636678112" watchObservedRunningTime="2025-12-09 10:18:41.288211035 +0000 UTC m=+1053.680262116" Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.309044 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-q294d" event={"ID":"5f6d862d-d775-4712-b455-ad5110968d19","Type":"ContainerStarted","Data":"44bc239a1849d4cdd8a43f9f1fb11670b324d1be40961c81cef0329a1119902d"} Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.311236 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-q294d" Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.319041 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-8n5z8" podStartSLOduration=8.55844492 podStartE2EDuration="36.319020486s" podCreationTimestamp="2025-12-09 10:18:05 +0000 UTC" firstStartedPulling="2025-12-09 10:18:06.697804833 +0000 UTC m=+1019.089855914" lastFinishedPulling="2025-12-09 10:18:34.458380399 +0000 UTC m=+1046.850431480" observedRunningTime="2025-12-09 10:18:41.312662314 +0000 UTC m=+1053.704713405" watchObservedRunningTime="2025-12-09 10:18:41.319020486 +0000 UTC m=+1053.711071567" Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.325766 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-q294d" Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.327746 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-9th65" event={"ID":"80762edf-2f88-41a7-b96b-2856b2e2c00e","Type":"ContainerStarted","Data":"8e3a0336788c537dbce1b0219814405ac8c3ab8adb5740a6133186043f947252"} Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.330014 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-9th65" Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.341015 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-kj2xr" event={"ID":"7fce722c-f3f5-48b6-a567-e867e4e1f27b","Type":"ContainerStarted","Data":"9b43d2bc231aa60d3e39bf18c6e6338bdefb76227a7dc1adbd5892f7404e4f91"} Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.341827 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-kj2xr" Dec 09 10:18:41 crc 
kubenswrapper[5002]: I1209 10:18:41.341976 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-9th65" Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.347204 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-kj2xr" Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.358798 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-q294d" podStartSLOduration=4.477392781 podStartE2EDuration="37.358775457s" podCreationTimestamp="2025-12-09 10:18:04 +0000 UTC" firstStartedPulling="2025-12-09 10:18:06.56203081 +0000 UTC m=+1018.954081901" lastFinishedPulling="2025-12-09 10:18:39.443413496 +0000 UTC m=+1051.835464577" observedRunningTime="2025-12-09 10:18:41.348407718 +0000 UTC m=+1053.740458799" watchObservedRunningTime="2025-12-09 10:18:41.358775457 +0000 UTC m=+1053.750826538" Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.363022 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7f8d8fb65b-6gj42" event={"ID":"2c2f2646-d609-433d-894e-5577cc497adc","Type":"ContainerStarted","Data":"0ee99e13d9715ba3213179bac25caa2a44503394d337eb15708722a1beed2213"} Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.363590 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7f8d8fb65b-6gj42" Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.373792 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-8jgwr" podStartSLOduration=4.258216615 podStartE2EDuration="37.373776731s" podCreationTimestamp="2025-12-09 10:18:04 +0000 UTC" firstStartedPulling="2025-12-09 10:18:06.292152438 +0000 UTC m=+1018.684203509" lastFinishedPulling="2025-12-09 10:18:39.407712544 +0000 UTC m=+1051.799763625" observedRunningTime="2025-12-09 10:18:41.370167364 +0000 UTC m=+1053.762218455" watchObservedRunningTime="2025-12-09 10:18:41.373776731 +0000 UTC m=+1053.765827812" Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.386355 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-qqkqr" event={"ID":"531ff665-3733-405e-b178-0f185d8cb22e","Type":"ContainerStarted","Data":"4822dcddc287ef92033ab5f60441824423eef55a892e963a9294faf82af9227a"} Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.386400 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-qqkqr" event={"ID":"531ff665-3733-405e-b178-0f185d8cb22e","Type":"ContainerStarted","Data":"7276e099a518bde0a6e52a95b8364ad9ecfa892d78efdff8a876870401c2d18b"} Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.386457 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-qqkqr" Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.387744 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-vcspp" event={"ID":"ce6b14b8-1656-4ac0-b6e2-24fd9329faf4","Type":"ContainerStarted","Data":"acf6ba89b7350a774eea9f2d74a45210e8f542cf9dbb0701feb3fa4441568140"} Dec 
09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.388697 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-vcspp" Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.396904 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-vcspp" Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.398199 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-4dzfs" event={"ID":"78d17849-0dbc-4b43-b5e6-49bd7c766aa1","Type":"ContainerStarted","Data":"4468b6992b596b7887e80e05d4338e33bfd50ef4bc68d2f3a64684638a4604f9"} Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.400043 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-4dzfs" Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.405076 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-4dzfs" Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.412758 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-25fmz" event={"ID":"2eeed466-946c-49a5-9fe9-b393629c3394","Type":"ContainerStarted","Data":"7c12a3cadd3ee820fc2eae2a0b58bad635838ccdc964fd7068f5f2d28e45a65f"} Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.413737 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-b8mpv" podStartSLOduration=4.095244148 podStartE2EDuration="37.413707487s" podCreationTimestamp="2025-12-09 10:18:04 +0000 UTC" firstStartedPulling="2025-12-09 10:18:06.123050496 +0000 UTC m=+1018.515101577" lastFinishedPulling="2025-12-09 10:18:39.441513835 +0000 UTC m=+1051.833564916" observedRunningTime="2025-12-09 10:18:41.396211156 +0000 UTC m=+1053.788262237" watchObservedRunningTime="2025-12-09 10:18:41.413707487 +0000 UTC m=+1053.805758568" Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.414642 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-25fmz" Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.418260 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-25fmz" Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.420291 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-vcspp" podStartSLOduration=4.052979529 podStartE2EDuration="37.420270704s" podCreationTimestamp="2025-12-09 10:18:04 +0000 UTC" firstStartedPulling="2025-12-09 10:18:06.10689467 +0000 UTC m=+1018.498945741" lastFinishedPulling="2025-12-09 10:18:39.474185835 +0000 UTC m=+1051.866236916" observedRunningTime="2025-12-09 10:18:41.414523469 +0000 UTC m=+1053.806574550" watchObservedRunningTime="2025-12-09 10:18:41.420270704 +0000 UTC m=+1053.812321785" Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.420520 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-hnnmt" 
event={"ID":"d172ce03-c24c-41ca-a61a-eecdd9572f5d","Type":"ContainerStarted","Data":"66453574da0590dd833015c9601fd3ae751ce9158a30905bbc238c46c750fc09"} Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.420925 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-hnnmt" Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.426591 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-hnnmt" Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.447981 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fm85h8" event={"ID":"3f8701c7-f70e-4584-8cf1-ed40adda7b84","Type":"ContainerStarted","Data":"f74e05f4cc9e36f421ec7e4642015035b074dc77fd18527b7cd481d76491e6dc"} Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.448039 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fm85h8" event={"ID":"3f8701c7-f70e-4584-8cf1-ed40adda7b84","Type":"ContainerStarted","Data":"54f104f3662e810fef286c2b2ecd3f1b60872b5fc4a8d150ae8369f42c2143c3"} Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.470540 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-qqkqr" podStartSLOduration=33.212759488 podStartE2EDuration="37.470520688s" podCreationTimestamp="2025-12-09 10:18:04 +0000 UTC" firstStartedPulling="2025-12-09 10:18:35.093487063 +0000 UTC m=+1047.485538154" lastFinishedPulling="2025-12-09 10:18:39.351248253 +0000 UTC m=+1051.743299354" observedRunningTime="2025-12-09 10:18:41.46726216 +0000 UTC m=+1053.859313261" watchObservedRunningTime="2025-12-09 10:18:41.470520688 +0000 UTC m=+1053.862571779" Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.496184 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-7f8d8fb65b-6gj42" podStartSLOduration=36.496164699 podStartE2EDuration="36.496164699s" podCreationTimestamp="2025-12-09 10:18:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:18:41.495415249 +0000 UTC m=+1053.887466330" watchObservedRunningTime="2025-12-09 10:18:41.496164699 +0000 UTC m=+1053.888215780" Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.526387 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-9th65" podStartSLOduration=4.442307022 podStartE2EDuration="37.526362933s" podCreationTimestamp="2025-12-09 10:18:04 +0000 UTC" firstStartedPulling="2025-12-09 10:18:06.353948489 +0000 UTC m=+1018.745999570" lastFinishedPulling="2025-12-09 10:18:39.4380044 +0000 UTC m=+1051.830055481" observedRunningTime="2025-12-09 10:18:41.514599076 +0000 UTC m=+1053.906650157" watchObservedRunningTime="2025-12-09 10:18:41.526362933 +0000 UTC m=+1053.918414024" Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.567340 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-4dzfs" podStartSLOduration=4.043477777 podStartE2EDuration="37.567322267s" podCreationTimestamp="2025-12-09 10:18:04 
+0000 UTC" firstStartedPulling="2025-12-09 10:18:05.776805449 +0000 UTC m=+1018.168856530" lastFinishedPulling="2025-12-09 10:18:39.300649939 +0000 UTC m=+1051.692701020" observedRunningTime="2025-12-09 10:18:41.566297179 +0000 UTC m=+1053.958348270" watchObservedRunningTime="2025-12-09 10:18:41.567322267 +0000 UTC m=+1053.959373348" Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.570412 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-kj2xr" podStartSLOduration=4.043613232 podStartE2EDuration="37.570394149s" podCreationTimestamp="2025-12-09 10:18:04 +0000 UTC" firstStartedPulling="2025-12-09 10:18:05.915321494 +0000 UTC m=+1018.307372575" lastFinishedPulling="2025-12-09 10:18:39.442102401 +0000 UTC m=+1051.834153492" observedRunningTime="2025-12-09 10:18:41.548400637 +0000 UTC m=+1053.940451728" watchObservedRunningTime="2025-12-09 10:18:41.570394149 +0000 UTC m=+1053.962445230" Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.619911 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-25fmz" podStartSLOduration=4.741433787 podStartE2EDuration="37.619894253s" podCreationTimestamp="2025-12-09 10:18:04 +0000 UTC" firstStartedPulling="2025-12-09 10:18:06.560385217 +0000 UTC m=+1018.952436308" lastFinishedPulling="2025-12-09 10:18:39.438845693 +0000 UTC m=+1051.830896774" observedRunningTime="2025-12-09 10:18:41.618963608 +0000 UTC m=+1054.011014699" watchObservedRunningTime="2025-12-09 10:18:41.619894253 +0000 UTC m=+1054.011945344" Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.665395 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fm85h8" podStartSLOduration=33.356262655 podStartE2EDuration="37.665109141s" podCreationTimestamp="2025-12-09 10:18:04 +0000 UTC" firstStartedPulling="2025-12-09 10:18:35.092183368 +0000 UTC m=+1047.484234459" lastFinishedPulling="2025-12-09 10:18:39.401029844 +0000 UTC m=+1051.793080945" observedRunningTime="2025-12-09 10:18:41.662288435 +0000 UTC m=+1054.054339536" watchObservedRunningTime="2025-12-09 10:18:41.665109141 +0000 UTC m=+1054.057160232" Dec 09 10:18:41 crc kubenswrapper[5002]: I1209 10:18:41.689292 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-hnnmt" podStartSLOduration=4.924065373 podStartE2EDuration="37.689273253s" podCreationTimestamp="2025-12-09 10:18:04 +0000 UTC" firstStartedPulling="2025-12-09 10:18:06.677158698 +0000 UTC m=+1019.069209779" lastFinishedPulling="2025-12-09 10:18:39.442366578 +0000 UTC m=+1051.834417659" observedRunningTime="2025-12-09 10:18:41.678650386 +0000 UTC m=+1054.070701487" watchObservedRunningTime="2025-12-09 10:18:41.689273253 +0000 UTC m=+1054.081324334" Dec 09 10:18:42 crc kubenswrapper[5002]: I1209 10:18:42.459080 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-vkkw9" event={"ID":"53e6de02-2fb5-4491-aa93-7ad8bd2a190c","Type":"ContainerStarted","Data":"83b2d08ba8a091e0354af9f1d0457c93f2a1a69271687c6864439cf96c040a09"} Dec 09 10:18:42 crc kubenswrapper[5002]: I1209 10:18:42.459956 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-vkkw9" 
Dec 09 10:18:42 crc kubenswrapper[5002]: I1209 10:18:42.462131 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-495q6" event={"ID":"a3c0ed3b-56a5-490b-bca4-9541eeb8e2cf","Type":"ContainerStarted","Data":"7a324bfa4b994f3109e7011c0d5ae23764f50d81e59891df5c3a24988555676d"}
Dec 09 10:18:42 crc kubenswrapper[5002]: I1209 10:18:42.462946 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-495q6"
Dec 09 10:18:42 crc kubenswrapper[5002]: I1209 10:18:42.465442 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-jb68h" event={"ID":"097e947b-8923-4155-9d10-f0241f751ad8","Type":"ContainerStarted","Data":"0c183f24864ff916f879bfdc81f70ae8a0693f13e1da6282f54d738ffc8223fe"}
Dec 09 10:18:42 crc kubenswrapper[5002]: I1209 10:18:42.467037 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fm85h8"
Dec 09 10:18:42 crc kubenswrapper[5002]: I1209 10:18:42.468737 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-4vg8l"
Dec 09 10:18:42 crc kubenswrapper[5002]: I1209 10:18:42.469056 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7kqxz"
Dec 09 10:18:42 crc kubenswrapper[5002]: I1209 10:18:42.469439 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-8n5z8"
Dec 09 10:18:42 crc kubenswrapper[5002]: I1209 10:18:42.478336 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-vkkw9" podStartSLOduration=2.517208731 podStartE2EDuration="37.478324925s" podCreationTimestamp="2025-12-09 10:18:05 +0000 UTC" firstStartedPulling="2025-12-09 10:18:06.665400908 +0000 UTC m=+1019.057451979" lastFinishedPulling="2025-12-09 10:18:41.626517092 +0000 UTC m=+1054.018568173" observedRunningTime="2025-12-09 10:18:42.477598725 +0000 UTC m=+1054.869649846" watchObservedRunningTime="2025-12-09 10:18:42.478324925 +0000 UTC m=+1054.870376006"
Dec 09 10:18:42 crc kubenswrapper[5002]: I1209 10:18:42.540219 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-495q6" podStartSLOduration=3.230939796 podStartE2EDuration="38.540206022s" podCreationTimestamp="2025-12-09 10:18:04 +0000 UTC" firstStartedPulling="2025-12-09 10:18:06.308805928 +0000 UTC m=+1018.700857009" lastFinishedPulling="2025-12-09 10:18:41.618072154 +0000 UTC m=+1054.010123235" observedRunningTime="2025-12-09 10:18:42.537427967 +0000 UTC m=+1054.929479048" watchObservedRunningTime="2025-12-09 10:18:42.540206022 +0000 UTC m=+1054.932257103"
Dec 09 10:18:42 crc kubenswrapper[5002]: I1209 10:18:42.587270 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-jb68h" podStartSLOduration=4.25859936 podStartE2EDuration="38.58724768s" podCreationTimestamp="2025-12-09 10:18:04 +0000 UTC" firstStartedPulling="2025-12-09 10:18:06.556389441 +0000 UTC m=+1018.948440532" lastFinishedPulling="2025-12-09 10:18:40.885037771 +0000 UTC m=+1053.277088852" observedRunningTime="2025-12-09 10:18:42.582352508 +0000 UTC m=+1054.974403589" watchObservedRunningTime="2025-12-09 10:18:42.58724768 +0000 UTC m=+1054.979298761"
Dec 09 10:18:43 crc kubenswrapper[5002]: I1209 10:18:43.480467 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-jb68h"
Dec 09 10:18:45 crc kubenswrapper[5002]: I1209 10:18:45.108634 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5pq7d"
Dec 09 10:18:45 crc kubenswrapper[5002]: I1209 10:18:45.631108 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-rdjzk"
Dec 09 10:18:45 crc kubenswrapper[5002]: I1209 10:18:45.853248 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-vvm2b"
Dec 09 10:18:46 crc kubenswrapper[5002]: I1209 10:18:46.062724 5002 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 09 10:18:47 crc kubenswrapper[5002]: I1209 10:18:47.518197 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-x578z" event={"ID":"cecd7de4-71b0-4914-a5c5-68b8da7704cf","Type":"ContainerStarted","Data":"b9afc943a364c68636684a30cb47b32ebf164add08d74c7463c22cbfc1b1036f"}
Dec 09 10:18:47 crc kubenswrapper[5002]: I1209 10:18:47.553059 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-x578z" podStartSLOduration=2.91079983 podStartE2EDuration="42.553029267s" podCreationTimestamp="2025-12-09 10:18:05 +0000 UTC" firstStartedPulling="2025-12-09 10:18:06.839353839 +0000 UTC m=+1019.231404920" lastFinishedPulling="2025-12-09 10:18:46.481583236 +0000 UTC m=+1058.873634357" observedRunningTime="2025-12-09 10:18:47.541477626 +0000 UTC m=+1059.933528767" watchObservedRunningTime="2025-12-09 10:18:47.553029267 +0000 UTC m=+1059.945080378"
Dec 09 10:18:47 crc kubenswrapper[5002]: I1209 10:18:47.829618 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-7f8d8fb65b-6gj42"
Dec 09 10:18:51 crc kubenswrapper[5002]: I1209 10:18:51.128619 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-qqkqr"
Dec 09 10:18:51 crc kubenswrapper[5002]: I1209 10:18:51.423498 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fm85h8"
Dec 09 10:18:55 crc kubenswrapper[5002]: I1209 10:18:55.403123 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-495q6"
Dec 09 10:18:55 crc kubenswrapper[5002]: I1209 10:18:55.489334 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-jb68h"
Dec 09 10:18:55 crc kubenswrapper[5002]: I1209 10:18:55.981400 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-vkkw9"
Dec 09 10:19:14 crc
kubenswrapper[5002]: I1209 10:19:14.151336 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-g9btt"] Dec 09 10:19:14 crc kubenswrapper[5002]: I1209 10:19:14.152831 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-g9btt" Dec 09 10:19:14 crc kubenswrapper[5002]: I1209 10:19:14.156419 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 09 10:19:14 crc kubenswrapper[5002]: I1209 10:19:14.156601 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 09 10:19:14 crc kubenswrapper[5002]: I1209 10:19:14.156717 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-2p88k" Dec 09 10:19:14 crc kubenswrapper[5002]: I1209 10:19:14.156855 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 09 10:19:14 crc kubenswrapper[5002]: I1209 10:19:14.165217 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-g9btt"] Dec 09 10:19:14 crc kubenswrapper[5002]: I1209 10:19:14.209204 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-g57t2"] Dec 09 10:19:14 crc kubenswrapper[5002]: I1209 10:19:14.210391 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-g57t2" Dec 09 10:19:14 crc kubenswrapper[5002]: I1209 10:19:14.212527 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 09 10:19:14 crc kubenswrapper[5002]: I1209 10:19:14.217915 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-g57t2"] Dec 09 10:19:14 crc kubenswrapper[5002]: I1209 10:19:14.239865 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/add75f96-a4a7-43ac-8d6d-d7acd7c19f0e-config\") pod \"dnsmasq-dns-675f4bcbfc-g9btt\" (UID: \"add75f96-a4a7-43ac-8d6d-d7acd7c19f0e\") " pod="openstack/dnsmasq-dns-675f4bcbfc-g9btt" Dec 09 10:19:14 crc kubenswrapper[5002]: I1209 10:19:14.240225 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z459\" (UniqueName: \"kubernetes.io/projected/add75f96-a4a7-43ac-8d6d-d7acd7c19f0e-kube-api-access-7z459\") pod \"dnsmasq-dns-675f4bcbfc-g9btt\" (UID: \"add75f96-a4a7-43ac-8d6d-d7acd7c19f0e\") " pod="openstack/dnsmasq-dns-675f4bcbfc-g9btt" Dec 09 10:19:14 crc kubenswrapper[5002]: I1209 10:19:14.341664 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b475e53a-d6c5-454d-98cf-8eb43bf60438-config\") pod \"dnsmasq-dns-78dd6ddcc-g57t2\" (UID: \"b475e53a-d6c5-454d-98cf-8eb43bf60438\") " pod="openstack/dnsmasq-dns-78dd6ddcc-g57t2" Dec 09 10:19:14 crc kubenswrapper[5002]: I1209 10:19:14.341716 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b475e53a-d6c5-454d-98cf-8eb43bf60438-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-g57t2\" (UID: \"b475e53a-d6c5-454d-98cf-8eb43bf60438\") " pod="openstack/dnsmasq-dns-78dd6ddcc-g57t2" Dec 09 10:19:14 crc kubenswrapper[5002]: I1209 10:19:14.341755 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-bxwds\" (UniqueName: \"kubernetes.io/projected/b475e53a-d6c5-454d-98cf-8eb43bf60438-kube-api-access-bxwds\") pod \"dnsmasq-dns-78dd6ddcc-g57t2\" (UID: \"b475e53a-d6c5-454d-98cf-8eb43bf60438\") " pod="openstack/dnsmasq-dns-78dd6ddcc-g57t2" Dec 09 10:19:14 crc kubenswrapper[5002]: I1209 10:19:14.341841 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/add75f96-a4a7-43ac-8d6d-d7acd7c19f0e-config\") pod \"dnsmasq-dns-675f4bcbfc-g9btt\" (UID: \"add75f96-a4a7-43ac-8d6d-d7acd7c19f0e\") " pod="openstack/dnsmasq-dns-675f4bcbfc-g9btt" Dec 09 10:19:14 crc kubenswrapper[5002]: I1209 10:19:14.341893 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7z459\" (UniqueName: \"kubernetes.io/projected/add75f96-a4a7-43ac-8d6d-d7acd7c19f0e-kube-api-access-7z459\") pod \"dnsmasq-dns-675f4bcbfc-g9btt\" (UID: \"add75f96-a4a7-43ac-8d6d-d7acd7c19f0e\") " pod="openstack/dnsmasq-dns-675f4bcbfc-g9btt" Dec 09 10:19:14 crc kubenswrapper[5002]: I1209 10:19:14.342966 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/add75f96-a4a7-43ac-8d6d-d7acd7c19f0e-config\") pod \"dnsmasq-dns-675f4bcbfc-g9btt\" (UID: \"add75f96-a4a7-43ac-8d6d-d7acd7c19f0e\") " pod="openstack/dnsmasq-dns-675f4bcbfc-g9btt" Dec 09 10:19:14 crc kubenswrapper[5002]: I1209 10:19:14.357777 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z459\" (UniqueName: \"kubernetes.io/projected/add75f96-a4a7-43ac-8d6d-d7acd7c19f0e-kube-api-access-7z459\") pod \"dnsmasq-dns-675f4bcbfc-g9btt\" (UID: \"add75f96-a4a7-43ac-8d6d-d7acd7c19f0e\") " pod="openstack/dnsmasq-dns-675f4bcbfc-g9btt" Dec 09 10:19:14 crc kubenswrapper[5002]: I1209 10:19:14.442947 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b475e53a-d6c5-454d-98cf-8eb43bf60438-config\") pod \"dnsmasq-dns-78dd6ddcc-g57t2\" (UID: \"b475e53a-d6c5-454d-98cf-8eb43bf60438\") " pod="openstack/dnsmasq-dns-78dd6ddcc-g57t2" Dec 09 10:19:14 crc kubenswrapper[5002]: I1209 10:19:14.443223 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b475e53a-d6c5-454d-98cf-8eb43bf60438-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-g57t2\" (UID: \"b475e53a-d6c5-454d-98cf-8eb43bf60438\") " pod="openstack/dnsmasq-dns-78dd6ddcc-g57t2" Dec 09 10:19:14 crc kubenswrapper[5002]: I1209 10:19:14.443345 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxwds\" (UniqueName: \"kubernetes.io/projected/b475e53a-d6c5-454d-98cf-8eb43bf60438-kube-api-access-bxwds\") pod \"dnsmasq-dns-78dd6ddcc-g57t2\" (UID: \"b475e53a-d6c5-454d-98cf-8eb43bf60438\") " pod="openstack/dnsmasq-dns-78dd6ddcc-g57t2" Dec 09 10:19:14 crc kubenswrapper[5002]: I1209 10:19:14.443713 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b475e53a-d6c5-454d-98cf-8eb43bf60438-config\") pod \"dnsmasq-dns-78dd6ddcc-g57t2\" (UID: \"b475e53a-d6c5-454d-98cf-8eb43bf60438\") " pod="openstack/dnsmasq-dns-78dd6ddcc-g57t2" Dec 09 10:19:14 crc kubenswrapper[5002]: I1209 10:19:14.444237 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b475e53a-d6c5-454d-98cf-8eb43bf60438-dns-svc\") 
pod \"dnsmasq-dns-78dd6ddcc-g57t2\" (UID: \"b475e53a-d6c5-454d-98cf-8eb43bf60438\") " pod="openstack/dnsmasq-dns-78dd6ddcc-g57t2" Dec 09 10:19:14 crc kubenswrapper[5002]: I1209 10:19:14.464851 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxwds\" (UniqueName: \"kubernetes.io/projected/b475e53a-d6c5-454d-98cf-8eb43bf60438-kube-api-access-bxwds\") pod \"dnsmasq-dns-78dd6ddcc-g57t2\" (UID: \"b475e53a-d6c5-454d-98cf-8eb43bf60438\") " pod="openstack/dnsmasq-dns-78dd6ddcc-g57t2" Dec 09 10:19:14 crc kubenswrapper[5002]: I1209 10:19:14.476574 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-g9btt" Dec 09 10:19:14 crc kubenswrapper[5002]: I1209 10:19:14.526168 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-g57t2" Dec 09 10:19:14 crc kubenswrapper[5002]: I1209 10:19:14.914323 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-g9btt"] Dec 09 10:19:14 crc kubenswrapper[5002]: I1209 10:19:14.983581 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-g57t2"] Dec 09 10:19:14 crc kubenswrapper[5002]: W1209 10:19:14.983695 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb475e53a_d6c5_454d_98cf_8eb43bf60438.slice/crio-24c2548a67b96de0bb15e05e1200da289724ec8421f9bebb408f24a5d8567b81 WatchSource:0}: Error finding container 24c2548a67b96de0bb15e05e1200da289724ec8421f9bebb408f24a5d8567b81: Status 404 returned error can't find the container with id 24c2548a67b96de0bb15e05e1200da289724ec8421f9bebb408f24a5d8567b81 Dec 09 10:19:15 crc kubenswrapper[5002]: I1209 10:19:15.763171 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-g9btt" event={"ID":"add75f96-a4a7-43ac-8d6d-d7acd7c19f0e","Type":"ContainerStarted","Data":"2dfcdb7068ee8f89697af61d2244ee70f7c3da10e9ea2c6661fcfa1de354f775"} Dec 09 10:19:15 crc kubenswrapper[5002]: I1209 10:19:15.764453 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-g57t2" event={"ID":"b475e53a-d6c5-454d-98cf-8eb43bf60438","Type":"ContainerStarted","Data":"24c2548a67b96de0bb15e05e1200da289724ec8421f9bebb408f24a5d8567b81"} Dec 09 10:19:16 crc kubenswrapper[5002]: I1209 10:19:16.733905 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-g9btt"] Dec 09 10:19:16 crc kubenswrapper[5002]: I1209 10:19:16.749241 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-vqhpp"] Dec 09 10:19:16 crc kubenswrapper[5002]: I1209 10:19:16.750612 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-vqhpp" Dec 09 10:19:16 crc kubenswrapper[5002]: I1209 10:19:16.770058 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-vqhpp"] Dec 09 10:19:16 crc kubenswrapper[5002]: I1209 10:19:16.886637 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de34f23b-efc3-41e6-a76c-04bfe4eb6494-config\") pod \"dnsmasq-dns-666b6646f7-vqhpp\" (UID: \"de34f23b-efc3-41e6-a76c-04bfe4eb6494\") " pod="openstack/dnsmasq-dns-666b6646f7-vqhpp" Dec 09 10:19:16 crc kubenswrapper[5002]: I1209 10:19:16.886757 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72qmh\" (UniqueName: \"kubernetes.io/projected/de34f23b-efc3-41e6-a76c-04bfe4eb6494-kube-api-access-72qmh\") pod \"dnsmasq-dns-666b6646f7-vqhpp\" (UID: \"de34f23b-efc3-41e6-a76c-04bfe4eb6494\") " pod="openstack/dnsmasq-dns-666b6646f7-vqhpp" Dec 09 10:19:16 crc kubenswrapper[5002]: I1209 10:19:16.886847 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de34f23b-efc3-41e6-a76c-04bfe4eb6494-dns-svc\") pod \"dnsmasq-dns-666b6646f7-vqhpp\" (UID: \"de34f23b-efc3-41e6-a76c-04bfe4eb6494\") " pod="openstack/dnsmasq-dns-666b6646f7-vqhpp" Dec 09 10:19:16 crc kubenswrapper[5002]: I1209 10:19:16.988502 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72qmh\" (UniqueName: \"kubernetes.io/projected/de34f23b-efc3-41e6-a76c-04bfe4eb6494-kube-api-access-72qmh\") pod \"dnsmasq-dns-666b6646f7-vqhpp\" (UID: \"de34f23b-efc3-41e6-a76c-04bfe4eb6494\") " pod="openstack/dnsmasq-dns-666b6646f7-vqhpp" Dec 09 10:19:16 crc kubenswrapper[5002]: I1209 10:19:16.988934 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de34f23b-efc3-41e6-a76c-04bfe4eb6494-dns-svc\") pod \"dnsmasq-dns-666b6646f7-vqhpp\" (UID: \"de34f23b-efc3-41e6-a76c-04bfe4eb6494\") " pod="openstack/dnsmasq-dns-666b6646f7-vqhpp" Dec 09 10:19:16 crc kubenswrapper[5002]: I1209 10:19:16.990217 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de34f23b-efc3-41e6-a76c-04bfe4eb6494-dns-svc\") pod \"dnsmasq-dns-666b6646f7-vqhpp\" (UID: \"de34f23b-efc3-41e6-a76c-04bfe4eb6494\") " pod="openstack/dnsmasq-dns-666b6646f7-vqhpp" Dec 09 10:19:16 crc kubenswrapper[5002]: I1209 10:19:16.990384 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de34f23b-efc3-41e6-a76c-04bfe4eb6494-config\") pod \"dnsmasq-dns-666b6646f7-vqhpp\" (UID: \"de34f23b-efc3-41e6-a76c-04bfe4eb6494\") " pod="openstack/dnsmasq-dns-666b6646f7-vqhpp" Dec 09 10:19:16 crc kubenswrapper[5002]: I1209 10:19:16.991096 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de34f23b-efc3-41e6-a76c-04bfe4eb6494-config\") pod \"dnsmasq-dns-666b6646f7-vqhpp\" (UID: \"de34f23b-efc3-41e6-a76c-04bfe4eb6494\") " pod="openstack/dnsmasq-dns-666b6646f7-vqhpp" Dec 09 10:19:16 crc kubenswrapper[5002]: I1209 10:19:16.999178 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-g57t2"] Dec 09 10:19:17 crc kubenswrapper[5002]: I1209 10:19:17.019080 
5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72qmh\" (UniqueName: \"kubernetes.io/projected/de34f23b-efc3-41e6-a76c-04bfe4eb6494-kube-api-access-72qmh\") pod \"dnsmasq-dns-666b6646f7-vqhpp\" (UID: \"de34f23b-efc3-41e6-a76c-04bfe4eb6494\") " pod="openstack/dnsmasq-dns-666b6646f7-vqhpp" Dec 09 10:19:17 crc kubenswrapper[5002]: I1209 10:19:17.024178 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-5hpxc"] Dec 09 10:19:17 crc kubenswrapper[5002]: I1209 10:19:17.027378 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-5hpxc" Dec 09 10:19:17 crc kubenswrapper[5002]: I1209 10:19:17.038572 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-5hpxc"] Dec 09 10:19:17 crc kubenswrapper[5002]: I1209 10:19:17.076392 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-vqhpp" Dec 09 10:19:17 crc kubenswrapper[5002]: I1209 10:19:17.092375 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plptv\" (UniqueName: \"kubernetes.io/projected/2db56b85-ed32-480a-a20d-0610a73a1bae-kube-api-access-plptv\") pod \"dnsmasq-dns-57d769cc4f-5hpxc\" (UID: \"2db56b85-ed32-480a-a20d-0610a73a1bae\") " pod="openstack/dnsmasq-dns-57d769cc4f-5hpxc" Dec 09 10:19:17 crc kubenswrapper[5002]: I1209 10:19:17.092418 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2db56b85-ed32-480a-a20d-0610a73a1bae-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-5hpxc\" (UID: \"2db56b85-ed32-480a-a20d-0610a73a1bae\") " pod="openstack/dnsmasq-dns-57d769cc4f-5hpxc" Dec 09 10:19:17 crc kubenswrapper[5002]: I1209 10:19:17.092439 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2db56b85-ed32-480a-a20d-0610a73a1bae-config\") pod \"dnsmasq-dns-57d769cc4f-5hpxc\" (UID: \"2db56b85-ed32-480a-a20d-0610a73a1bae\") " pod="openstack/dnsmasq-dns-57d769cc4f-5hpxc" Dec 09 10:19:17 crc kubenswrapper[5002]: I1209 10:19:17.194532 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plptv\" (UniqueName: \"kubernetes.io/projected/2db56b85-ed32-480a-a20d-0610a73a1bae-kube-api-access-plptv\") pod \"dnsmasq-dns-57d769cc4f-5hpxc\" (UID: \"2db56b85-ed32-480a-a20d-0610a73a1bae\") " pod="openstack/dnsmasq-dns-57d769cc4f-5hpxc" Dec 09 10:19:17 crc kubenswrapper[5002]: I1209 10:19:17.194578 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2db56b85-ed32-480a-a20d-0610a73a1bae-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-5hpxc\" (UID: \"2db56b85-ed32-480a-a20d-0610a73a1bae\") " pod="openstack/dnsmasq-dns-57d769cc4f-5hpxc" Dec 09 10:19:17 crc kubenswrapper[5002]: I1209 10:19:17.194597 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2db56b85-ed32-480a-a20d-0610a73a1bae-config\") pod \"dnsmasq-dns-57d769cc4f-5hpxc\" (UID: \"2db56b85-ed32-480a-a20d-0610a73a1bae\") " pod="openstack/dnsmasq-dns-57d769cc4f-5hpxc" Dec 09 10:19:17 crc kubenswrapper[5002]: I1209 10:19:17.196247 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2db56b85-ed32-480a-a20d-0610a73a1bae-config\") pod \"dnsmasq-dns-57d769cc4f-5hpxc\" (UID: \"2db56b85-ed32-480a-a20d-0610a73a1bae\") " pod="openstack/dnsmasq-dns-57d769cc4f-5hpxc" Dec 09 10:19:17 crc kubenswrapper[5002]: I1209 10:19:17.196854 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2db56b85-ed32-480a-a20d-0610a73a1bae-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-5hpxc\" (UID: \"2db56b85-ed32-480a-a20d-0610a73a1bae\") " pod="openstack/dnsmasq-dns-57d769cc4f-5hpxc" Dec 09 10:19:17 crc kubenswrapper[5002]: I1209 10:19:17.215218 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plptv\" (UniqueName: \"kubernetes.io/projected/2db56b85-ed32-480a-a20d-0610a73a1bae-kube-api-access-plptv\") pod \"dnsmasq-dns-57d769cc4f-5hpxc\" (UID: \"2db56b85-ed32-480a-a20d-0610a73a1bae\") " pod="openstack/dnsmasq-dns-57d769cc4f-5hpxc" Dec 09 10:19:17 crc kubenswrapper[5002]: I1209 10:19:17.366304 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-5hpxc" Dec 09 10:19:17 crc kubenswrapper[5002]: I1209 10:19:17.591453 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-vqhpp"] Dec 09 10:19:17 crc kubenswrapper[5002]: W1209 10:19:17.597603 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde34f23b_efc3_41e6_a76c_04bfe4eb6494.slice/crio-69209a3bb57885baed912f0518158a0b1a500d5f5baf9536ba4fde8dee9555d4 WatchSource:0}: Error finding container 69209a3bb57885baed912f0518158a0b1a500d5f5baf9536ba4fde8dee9555d4: Status 404 returned error can't find the container with id 69209a3bb57885baed912f0518158a0b1a500d5f5baf9536ba4fde8dee9555d4 Dec 09 10:19:17 crc kubenswrapper[5002]: I1209 10:19:17.816669 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-vqhpp" event={"ID":"de34f23b-efc3-41e6-a76c-04bfe4eb6494","Type":"ContainerStarted","Data":"69209a3bb57885baed912f0518158a0b1a500d5f5baf9536ba4fde8dee9555d4"} Dec 09 10:19:17 crc kubenswrapper[5002]: I1209 10:19:17.827863 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-5hpxc"] Dec 09 10:19:17 crc kubenswrapper[5002]: I1209 10:19:17.892562 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 09 10:19:17 crc kubenswrapper[5002]: I1209 10:19:17.893711 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 09 10:19:17 crc kubenswrapper[5002]: I1209 10:19:17.897228 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 09 10:19:17 crc kubenswrapper[5002]: I1209 10:19:17.897473 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 09 10:19:17 crc kubenswrapper[5002]: I1209 10:19:17.897695 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 09 10:19:17 crc kubenswrapper[5002]: I1209 10:19:17.898543 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 09 10:19:17 crc kubenswrapper[5002]: I1209 10:19:17.898777 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 09 10:19:17 crc kubenswrapper[5002]: I1209 10:19:17.898924 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 09 10:19:17 crc kubenswrapper[5002]: I1209 10:19:17.900245 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-9h2t5" Dec 09 10:19:17 crc kubenswrapper[5002]: I1209 10:19:17.962956 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.018791 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w627k\" (UniqueName: \"kubernetes.io/projected/58c08274-46ea-48be-a135-0c1174cd6135-kube-api-access-w627k\") pod \"rabbitmq-server-0\" (UID: \"58c08274-46ea-48be-a135-0c1174cd6135\") " pod="openstack/rabbitmq-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.018853 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/58c08274-46ea-48be-a135-0c1174cd6135-server-conf\") pod \"rabbitmq-server-0\" (UID: \"58c08274-46ea-48be-a135-0c1174cd6135\") " pod="openstack/rabbitmq-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.018871 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/58c08274-46ea-48be-a135-0c1174cd6135-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"58c08274-46ea-48be-a135-0c1174cd6135\") " pod="openstack/rabbitmq-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.019005 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/58c08274-46ea-48be-a135-0c1174cd6135-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"58c08274-46ea-48be-a135-0c1174cd6135\") " pod="openstack/rabbitmq-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.019109 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/58c08274-46ea-48be-a135-0c1174cd6135-config-data\") pod \"rabbitmq-server-0\" (UID: \"58c08274-46ea-48be-a135-0c1174cd6135\") " pod="openstack/rabbitmq-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.019227 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/58c08274-46ea-48be-a135-0c1174cd6135-pod-info\") pod \"rabbitmq-server-0\" (UID: \"58c08274-46ea-48be-a135-0c1174cd6135\") " pod="openstack/rabbitmq-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.019476 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/58c08274-46ea-48be-a135-0c1174cd6135-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"58c08274-46ea-48be-a135-0c1174cd6135\") " pod="openstack/rabbitmq-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.019555 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/58c08274-46ea-48be-a135-0c1174cd6135-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"58c08274-46ea-48be-a135-0c1174cd6135\") " pod="openstack/rabbitmq-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.019644 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/58c08274-46ea-48be-a135-0c1174cd6135-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"58c08274-46ea-48be-a135-0c1174cd6135\") " pod="openstack/rabbitmq-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.019665 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/58c08274-46ea-48be-a135-0c1174cd6135-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"58c08274-46ea-48be-a135-0c1174cd6135\") " pod="openstack/rabbitmq-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.019681 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"58c08274-46ea-48be-a135-0c1174cd6135\") " pod="openstack/rabbitmq-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.121151 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/58c08274-46ea-48be-a135-0c1174cd6135-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"58c08274-46ea-48be-a135-0c1174cd6135\") " pod="openstack/rabbitmq-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.121193 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/58c08274-46ea-48be-a135-0c1174cd6135-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"58c08274-46ea-48be-a135-0c1174cd6135\") " pod="openstack/rabbitmq-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.121209 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"58c08274-46ea-48be-a135-0c1174cd6135\") " pod="openstack/rabbitmq-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.121252 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w627k\" (UniqueName: \"kubernetes.io/projected/58c08274-46ea-48be-a135-0c1174cd6135-kube-api-access-w627k\") pod \"rabbitmq-server-0\" (UID: \"58c08274-46ea-48be-a135-0c1174cd6135\") " 
pod="openstack/rabbitmq-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.121279 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/58c08274-46ea-48be-a135-0c1174cd6135-server-conf\") pod \"rabbitmq-server-0\" (UID: \"58c08274-46ea-48be-a135-0c1174cd6135\") " pod="openstack/rabbitmq-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.121295 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/58c08274-46ea-48be-a135-0c1174cd6135-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"58c08274-46ea-48be-a135-0c1174cd6135\") " pod="openstack/rabbitmq-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.121340 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/58c08274-46ea-48be-a135-0c1174cd6135-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"58c08274-46ea-48be-a135-0c1174cd6135\") " pod="openstack/rabbitmq-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.121360 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/58c08274-46ea-48be-a135-0c1174cd6135-config-data\") pod \"rabbitmq-server-0\" (UID: \"58c08274-46ea-48be-a135-0c1174cd6135\") " pod="openstack/rabbitmq-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.121381 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/58c08274-46ea-48be-a135-0c1174cd6135-pod-info\") pod \"rabbitmq-server-0\" (UID: \"58c08274-46ea-48be-a135-0c1174cd6135\") " pod="openstack/rabbitmq-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.121406 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/58c08274-46ea-48be-a135-0c1174cd6135-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"58c08274-46ea-48be-a135-0c1174cd6135\") " pod="openstack/rabbitmq-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.121438 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/58c08274-46ea-48be-a135-0c1174cd6135-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"58c08274-46ea-48be-a135-0c1174cd6135\") " pod="openstack/rabbitmq-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.122250 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/58c08274-46ea-48be-a135-0c1174cd6135-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"58c08274-46ea-48be-a135-0c1174cd6135\") " pod="openstack/rabbitmq-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.123302 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/58c08274-46ea-48be-a135-0c1174cd6135-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"58c08274-46ea-48be-a135-0c1174cd6135\") " pod="openstack/rabbitmq-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.123623 5002 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") 
pod \"rabbitmq-server-0\" (UID: \"58c08274-46ea-48be-a135-0c1174cd6135\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.124026 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/58c08274-46ea-48be-a135-0c1174cd6135-config-data\") pod \"rabbitmq-server-0\" (UID: \"58c08274-46ea-48be-a135-0c1174cd6135\") " pod="openstack/rabbitmq-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.124532 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/58c08274-46ea-48be-a135-0c1174cd6135-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"58c08274-46ea-48be-a135-0c1174cd6135\") " pod="openstack/rabbitmq-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.124799 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/58c08274-46ea-48be-a135-0c1174cd6135-server-conf\") pod \"rabbitmq-server-0\" (UID: \"58c08274-46ea-48be-a135-0c1174cd6135\") " pod="openstack/rabbitmq-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.129758 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/58c08274-46ea-48be-a135-0c1174cd6135-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"58c08274-46ea-48be-a135-0c1174cd6135\") " pod="openstack/rabbitmq-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.129802 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/58c08274-46ea-48be-a135-0c1174cd6135-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"58c08274-46ea-48be-a135-0c1174cd6135\") " pod="openstack/rabbitmq-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.134295 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/58c08274-46ea-48be-a135-0c1174cd6135-pod-info\") pod \"rabbitmq-server-0\" (UID: \"58c08274-46ea-48be-a135-0c1174cd6135\") " pod="openstack/rabbitmq-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.137049 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/58c08274-46ea-48be-a135-0c1174cd6135-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"58c08274-46ea-48be-a135-0c1174cd6135\") " pod="openstack/rabbitmq-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.144112 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w627k\" (UniqueName: \"kubernetes.io/projected/58c08274-46ea-48be-a135-0c1174cd6135-kube-api-access-w627k\") pod \"rabbitmq-server-0\" (UID: \"58c08274-46ea-48be-a135-0c1174cd6135\") " pod="openstack/rabbitmq-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.150618 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"58c08274-46ea-48be-a135-0c1174cd6135\") " pod="openstack/rabbitmq-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.153599 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 09 10:19:18 crc 
kubenswrapper[5002]: I1209 10:19:18.154773 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.159359 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.159535 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.159634 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-6cmxw" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.159731 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.159893 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.159943 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.160091 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.168466 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.212218 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.327001 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9278e14e-2524-4e42-b870-f493ea02ede8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.327057 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9278e14e-2524-4e42-b870-f493ea02ede8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9278e14e-2524-4e42-b870-f493ea02ede8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.327171 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9278e14e-2524-4e42-b870-f493ea02ede8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9278e14e-2524-4e42-b870-f493ea02ede8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.327203 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9278e14e-2524-4e42-b870-f493ea02ede8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9278e14e-2524-4e42-b870-f493ea02ede8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.327231 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9278e14e-2524-4e42-b870-f493ea02ede8-rabbitmq-tls\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"9278e14e-2524-4e42-b870-f493ea02ede8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.327278 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9278e14e-2524-4e42-b870-f493ea02ede8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9278e14e-2524-4e42-b870-f493ea02ede8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.327313 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9278e14e-2524-4e42-b870-f493ea02ede8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9278e14e-2524-4e42-b870-f493ea02ede8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.327452 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9278e14e-2524-4e42-b870-f493ea02ede8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9278e14e-2524-4e42-b870-f493ea02ede8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.327493 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9278e14e-2524-4e42-b870-f493ea02ede8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9278e14e-2524-4e42-b870-f493ea02ede8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.327566 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tck5z\" (UniqueName: \"kubernetes.io/projected/9278e14e-2524-4e42-b870-f493ea02ede8-kube-api-access-tck5z\") pod \"rabbitmq-cell1-server-0\" (UID: \"9278e14e-2524-4e42-b870-f493ea02ede8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.327609 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9278e14e-2524-4e42-b870-f493ea02ede8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9278e14e-2524-4e42-b870-f493ea02ede8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.428903 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9278e14e-2524-4e42-b870-f493ea02ede8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9278e14e-2524-4e42-b870-f493ea02ede8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.428990 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9278e14e-2524-4e42-b870-f493ea02ede8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9278e14e-2524-4e42-b870-f493ea02ede8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.429033 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/9278e14e-2524-4e42-b870-f493ea02ede8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9278e14e-2524-4e42-b870-f493ea02ede8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.429062 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tck5z\" (UniqueName: \"kubernetes.io/projected/9278e14e-2524-4e42-b870-f493ea02ede8-kube-api-access-tck5z\") pod \"rabbitmq-cell1-server-0\" (UID: \"9278e14e-2524-4e42-b870-f493ea02ede8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.429089 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9278e14e-2524-4e42-b870-f493ea02ede8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9278e14e-2524-4e42-b870-f493ea02ede8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.429110 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9278e14e-2524-4e42-b870-f493ea02ede8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.429125 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9278e14e-2524-4e42-b870-f493ea02ede8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9278e14e-2524-4e42-b870-f493ea02ede8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.429160 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9278e14e-2524-4e42-b870-f493ea02ede8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9278e14e-2524-4e42-b870-f493ea02ede8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.429176 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9278e14e-2524-4e42-b870-f493ea02ede8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9278e14e-2524-4e42-b870-f493ea02ede8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.429192 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9278e14e-2524-4e42-b870-f493ea02ede8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9278e14e-2524-4e42-b870-f493ea02ede8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.429214 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9278e14e-2524-4e42-b870-f493ea02ede8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9278e14e-2524-4e42-b870-f493ea02ede8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.429844 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9278e14e-2524-4e42-b870-f493ea02ede8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"9278e14e-2524-4e42-b870-f493ea02ede8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.429930 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9278e14e-2524-4e42-b870-f493ea02ede8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9278e14e-2524-4e42-b870-f493ea02ede8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.429982 5002 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9278e14e-2524-4e42-b870-f493ea02ede8\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-cell1-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.430620 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9278e14e-2524-4e42-b870-f493ea02ede8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9278e14e-2524-4e42-b870-f493ea02ede8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.431449 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9278e14e-2524-4e42-b870-f493ea02ede8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9278e14e-2524-4e42-b870-f493ea02ede8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.432016 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9278e14e-2524-4e42-b870-f493ea02ede8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9278e14e-2524-4e42-b870-f493ea02ede8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.438737 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9278e14e-2524-4e42-b870-f493ea02ede8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9278e14e-2524-4e42-b870-f493ea02ede8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.441686 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9278e14e-2524-4e42-b870-f493ea02ede8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9278e14e-2524-4e42-b870-f493ea02ede8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.445588 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9278e14e-2524-4e42-b870-f493ea02ede8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9278e14e-2524-4e42-b870-f493ea02ede8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.446972 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9278e14e-2524-4e42-b870-f493ea02ede8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9278e14e-2524-4e42-b870-f493ea02ede8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.488064 5002 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-tck5z\" (UniqueName: \"kubernetes.io/projected/9278e14e-2524-4e42-b870-f493ea02ede8-kube-api-access-tck5z\") pod \"rabbitmq-cell1-server-0\" (UID: \"9278e14e-2524-4e42-b870-f493ea02ede8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.502071 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9278e14e-2524-4e42-b870-f493ea02ede8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.512573 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.782927 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.838100 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-5hpxc" event={"ID":"2db56b85-ed32-480a-a20d-0610a73a1bae","Type":"ContainerStarted","Data":"7135308b31e246dde4c9dcb4ddbe526e2e130484d907dba4d37a71bb8069c007"} Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.842476 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"58c08274-46ea-48be-a135-0c1174cd6135","Type":"ContainerStarted","Data":"faecc1405337e2a042e11321b6db4645aa07d1e217ad5911586ebd7bfc699e93"} Dec 09 10:19:18 crc kubenswrapper[5002]: W1209 10:19:18.964578 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9278e14e_2524_4e42_b870_f493ea02ede8.slice/crio-4b5da4e1754b31552c6dc2cd68bb6dbdef670bc97ab8fcbe670e88f5f8083613 WatchSource:0}: Error finding container 4b5da4e1754b31552c6dc2cd68bb6dbdef670bc97ab8fcbe670e88f5f8083613: Status 404 returned error can't find the container with id 4b5da4e1754b31552c6dc2cd68bb6dbdef670bc97ab8fcbe670e88f5f8083613 Dec 09 10:19:18 crc kubenswrapper[5002]: I1209 10:19:18.966085 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 09 10:19:19 crc kubenswrapper[5002]: I1209 10:19:19.543014 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 09 10:19:19 crc kubenswrapper[5002]: I1209 10:19:19.544299 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 09 10:19:19 crc kubenswrapper[5002]: I1209 10:19:19.546152 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 09 10:19:19 crc kubenswrapper[5002]: I1209 10:19:19.546869 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-r5phn" Dec 09 10:19:19 crc kubenswrapper[5002]: I1209 10:19:19.547592 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 09 10:19:19 crc kubenswrapper[5002]: I1209 10:19:19.548062 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 09 10:19:19 crc kubenswrapper[5002]: I1209 10:19:19.555891 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 09 10:19:19 crc kubenswrapper[5002]: I1209 10:19:19.560887 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 09 10:19:19 crc kubenswrapper[5002]: I1209 10:19:19.653634 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7faabd78-c9ab-4397-aa4d-b8aaff302251-kolla-config\") pod \"openstack-galera-0\" (UID: \"7faabd78-c9ab-4397-aa4d-b8aaff302251\") " pod="openstack/openstack-galera-0" Dec 09 10:19:19 crc kubenswrapper[5002]: I1209 10:19:19.653729 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7faabd78-c9ab-4397-aa4d-b8aaff302251-config-data-generated\") pod \"openstack-galera-0\" (UID: \"7faabd78-c9ab-4397-aa4d-b8aaff302251\") " pod="openstack/openstack-galera-0" Dec 09 10:19:19 crc kubenswrapper[5002]: I1209 10:19:19.653755 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7faabd78-c9ab-4397-aa4d-b8aaff302251-operator-scripts\") pod \"openstack-galera-0\" (UID: \"7faabd78-c9ab-4397-aa4d-b8aaff302251\") " pod="openstack/openstack-galera-0" Dec 09 10:19:19 crc kubenswrapper[5002]: I1209 10:19:19.653778 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7faabd78-c9ab-4397-aa4d-b8aaff302251-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"7faabd78-c9ab-4397-aa4d-b8aaff302251\") " pod="openstack/openstack-galera-0" Dec 09 10:19:19 crc kubenswrapper[5002]: I1209 10:19:19.653834 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"7faabd78-c9ab-4397-aa4d-b8aaff302251\") " pod="openstack/openstack-galera-0" Dec 09 10:19:19 crc kubenswrapper[5002]: I1209 10:19:19.653855 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7faabd78-c9ab-4397-aa4d-b8aaff302251-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"7faabd78-c9ab-4397-aa4d-b8aaff302251\") " pod="openstack/openstack-galera-0" Dec 09 10:19:19 crc kubenswrapper[5002]: I1209 10:19:19.653891 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7faabd78-c9ab-4397-aa4d-b8aaff302251-config-data-default\") pod \"openstack-galera-0\" (UID: \"7faabd78-c9ab-4397-aa4d-b8aaff302251\") " pod="openstack/openstack-galera-0" Dec 09 10:19:19 crc kubenswrapper[5002]: I1209 10:19:19.653929 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz4zx\" (UniqueName: \"kubernetes.io/projected/7faabd78-c9ab-4397-aa4d-b8aaff302251-kube-api-access-wz4zx\") pod \"openstack-galera-0\" (UID: \"7faabd78-c9ab-4397-aa4d-b8aaff302251\") " pod="openstack/openstack-galera-0" Dec 09 10:19:19 crc kubenswrapper[5002]: I1209 10:19:19.755918 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wz4zx\" (UniqueName: \"kubernetes.io/projected/7faabd78-c9ab-4397-aa4d-b8aaff302251-kube-api-access-wz4zx\") pod \"openstack-galera-0\" (UID: \"7faabd78-c9ab-4397-aa4d-b8aaff302251\") " pod="openstack/openstack-galera-0" Dec 09 10:19:19 crc kubenswrapper[5002]: I1209 10:19:19.755975 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7faabd78-c9ab-4397-aa4d-b8aaff302251-kolla-config\") pod \"openstack-galera-0\" (UID: \"7faabd78-c9ab-4397-aa4d-b8aaff302251\") " pod="openstack/openstack-galera-0" Dec 09 10:19:19 crc kubenswrapper[5002]: I1209 10:19:19.756029 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7faabd78-c9ab-4397-aa4d-b8aaff302251-config-data-generated\") pod \"openstack-galera-0\" (UID: \"7faabd78-c9ab-4397-aa4d-b8aaff302251\") " pod="openstack/openstack-galera-0" Dec 09 10:19:19 crc kubenswrapper[5002]: I1209 10:19:19.756044 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7faabd78-c9ab-4397-aa4d-b8aaff302251-operator-scripts\") pod \"openstack-galera-0\" (UID: \"7faabd78-c9ab-4397-aa4d-b8aaff302251\") " pod="openstack/openstack-galera-0" Dec 09 10:19:19 crc kubenswrapper[5002]: I1209 10:19:19.756064 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7faabd78-c9ab-4397-aa4d-b8aaff302251-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"7faabd78-c9ab-4397-aa4d-b8aaff302251\") " pod="openstack/openstack-galera-0" Dec 09 10:19:19 crc kubenswrapper[5002]: I1209 10:19:19.756093 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"7faabd78-c9ab-4397-aa4d-b8aaff302251\") " pod="openstack/openstack-galera-0" Dec 09 10:19:19 crc kubenswrapper[5002]: I1209 10:19:19.756109 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7faabd78-c9ab-4397-aa4d-b8aaff302251-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"7faabd78-c9ab-4397-aa4d-b8aaff302251\") " pod="openstack/openstack-galera-0" Dec 09 10:19:19 crc kubenswrapper[5002]: I1209 10:19:19.756459 5002 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"7faabd78-c9ab-4397-aa4d-b8aaff302251\") device mount path 
\"/mnt/openstack/pv01\"" pod="openstack/openstack-galera-0" Dec 09 10:19:19 crc kubenswrapper[5002]: I1209 10:19:19.756926 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7faabd78-c9ab-4397-aa4d-b8aaff302251-kolla-config\") pod \"openstack-galera-0\" (UID: \"7faabd78-c9ab-4397-aa4d-b8aaff302251\") " pod="openstack/openstack-galera-0" Dec 09 10:19:19 crc kubenswrapper[5002]: I1209 10:19:19.757343 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7faabd78-c9ab-4397-aa4d-b8aaff302251-config-data-generated\") pod \"openstack-galera-0\" (UID: \"7faabd78-c9ab-4397-aa4d-b8aaff302251\") " pod="openstack/openstack-galera-0" Dec 09 10:19:19 crc kubenswrapper[5002]: I1209 10:19:19.757551 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7faabd78-c9ab-4397-aa4d-b8aaff302251-operator-scripts\") pod \"openstack-galera-0\" (UID: \"7faabd78-c9ab-4397-aa4d-b8aaff302251\") " pod="openstack/openstack-galera-0" Dec 09 10:19:19 crc kubenswrapper[5002]: I1209 10:19:19.758614 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7faabd78-c9ab-4397-aa4d-b8aaff302251-config-data-default\") pod \"openstack-galera-0\" (UID: \"7faabd78-c9ab-4397-aa4d-b8aaff302251\") " pod="openstack/openstack-galera-0" Dec 09 10:19:19 crc kubenswrapper[5002]: I1209 10:19:19.759303 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7faabd78-c9ab-4397-aa4d-b8aaff302251-config-data-default\") pod \"openstack-galera-0\" (UID: \"7faabd78-c9ab-4397-aa4d-b8aaff302251\") " pod="openstack/openstack-galera-0" Dec 09 10:19:19 crc kubenswrapper[5002]: I1209 10:19:19.761946 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7faabd78-c9ab-4397-aa4d-b8aaff302251-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"7faabd78-c9ab-4397-aa4d-b8aaff302251\") " pod="openstack/openstack-galera-0" Dec 09 10:19:19 crc kubenswrapper[5002]: I1209 10:19:19.762643 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7faabd78-c9ab-4397-aa4d-b8aaff302251-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"7faabd78-c9ab-4397-aa4d-b8aaff302251\") " pod="openstack/openstack-galera-0" Dec 09 10:19:19 crc kubenswrapper[5002]: I1209 10:19:19.772155 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz4zx\" (UniqueName: \"kubernetes.io/projected/7faabd78-c9ab-4397-aa4d-b8aaff302251-kube-api-access-wz4zx\") pod \"openstack-galera-0\" (UID: \"7faabd78-c9ab-4397-aa4d-b8aaff302251\") " pod="openstack/openstack-galera-0" Dec 09 10:19:19 crc kubenswrapper[5002]: I1209 10:19:19.789475 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"7faabd78-c9ab-4397-aa4d-b8aaff302251\") " pod="openstack/openstack-galera-0" Dec 09 10:19:19 crc kubenswrapper[5002]: I1209 10:19:19.849140 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"9278e14e-2524-4e42-b870-f493ea02ede8","Type":"ContainerStarted","Data":"4b5da4e1754b31552c6dc2cd68bb6dbdef670bc97ab8fcbe670e88f5f8083613"} Dec 09 10:19:19 crc kubenswrapper[5002]: I1209 10:19:19.871858 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.012373 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.013950 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.019631 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.019793 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.020085 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.020119 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-x4jzm" Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.029105 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.082230 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjgz9\" (UniqueName: \"kubernetes.io/projected/d100f321-6fe6-4eb3-a00c-50b9ff5e2861-kube-api-access-wjgz9\") pod \"openstack-cell1-galera-0\" (UID: \"d100f321-6fe6-4eb3-a00c-50b9ff5e2861\") " pod="openstack/openstack-cell1-galera-0" Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.082275 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d100f321-6fe6-4eb3-a00c-50b9ff5e2861-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"d100f321-6fe6-4eb3-a00c-50b9ff5e2861\") " pod="openstack/openstack-cell1-galera-0" Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.082300 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d100f321-6fe6-4eb3-a00c-50b9ff5e2861-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"d100f321-6fe6-4eb3-a00c-50b9ff5e2861\") " pod="openstack/openstack-cell1-galera-0" Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.082328 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"d100f321-6fe6-4eb3-a00c-50b9ff5e2861\") " pod="openstack/openstack-cell1-galera-0" Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.082387 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d100f321-6fe6-4eb3-a00c-50b9ff5e2861-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"d100f321-6fe6-4eb3-a00c-50b9ff5e2861\") " pod="openstack/openstack-cell1-galera-0" Dec 09 10:19:21 crc 
kubenswrapper[5002]: I1209 10:19:21.083074 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d100f321-6fe6-4eb3-a00c-50b9ff5e2861-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"d100f321-6fe6-4eb3-a00c-50b9ff5e2861\") " pod="openstack/openstack-cell1-galera-0" Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.083105 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d100f321-6fe6-4eb3-a00c-50b9ff5e2861-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"d100f321-6fe6-4eb3-a00c-50b9ff5e2861\") " pod="openstack/openstack-cell1-galera-0" Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.083127 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d100f321-6fe6-4eb3-a00c-50b9ff5e2861-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"d100f321-6fe6-4eb3-a00c-50b9ff5e2861\") " pod="openstack/openstack-cell1-galera-0" Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.184769 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"d100f321-6fe6-4eb3-a00c-50b9ff5e2861\") " pod="openstack/openstack-cell1-galera-0" Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.184807 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d100f321-6fe6-4eb3-a00c-50b9ff5e2861-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"d100f321-6fe6-4eb3-a00c-50b9ff5e2861\") " pod="openstack/openstack-cell1-galera-0" Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.184875 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d100f321-6fe6-4eb3-a00c-50b9ff5e2861-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"d100f321-6fe6-4eb3-a00c-50b9ff5e2861\") " pod="openstack/openstack-cell1-galera-0" Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.184899 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d100f321-6fe6-4eb3-a00c-50b9ff5e2861-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"d100f321-6fe6-4eb3-a00c-50b9ff5e2861\") " pod="openstack/openstack-cell1-galera-0" Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.184921 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d100f321-6fe6-4eb3-a00c-50b9ff5e2861-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"d100f321-6fe6-4eb3-a00c-50b9ff5e2861\") " pod="openstack/openstack-cell1-galera-0" Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.184985 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjgz9\" (UniqueName: \"kubernetes.io/projected/d100f321-6fe6-4eb3-a00c-50b9ff5e2861-kube-api-access-wjgz9\") pod \"openstack-cell1-galera-0\" (UID: \"d100f321-6fe6-4eb3-a00c-50b9ff5e2861\") " pod="openstack/openstack-cell1-galera-0" Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 
10:19:21.185017 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d100f321-6fe6-4eb3-a00c-50b9ff5e2861-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"d100f321-6fe6-4eb3-a00c-50b9ff5e2861\") " pod="openstack/openstack-cell1-galera-0" Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.185050 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d100f321-6fe6-4eb3-a00c-50b9ff5e2861-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"d100f321-6fe6-4eb3-a00c-50b9ff5e2861\") " pod="openstack/openstack-cell1-galera-0" Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.186309 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d100f321-6fe6-4eb3-a00c-50b9ff5e2861-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"d100f321-6fe6-4eb3-a00c-50b9ff5e2861\") " pod="openstack/openstack-cell1-galera-0" Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.186676 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d100f321-6fe6-4eb3-a00c-50b9ff5e2861-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"d100f321-6fe6-4eb3-a00c-50b9ff5e2861\") " pod="openstack/openstack-cell1-galera-0" Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.186711 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d100f321-6fe6-4eb3-a00c-50b9ff5e2861-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"d100f321-6fe6-4eb3-a00c-50b9ff5e2861\") " pod="openstack/openstack-cell1-galera-0" Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.186920 5002 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"d100f321-6fe6-4eb3-a00c-50b9ff5e2861\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-cell1-galera-0" Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.186965 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d100f321-6fe6-4eb3-a00c-50b9ff5e2861-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"d100f321-6fe6-4eb3-a00c-50b9ff5e2861\") " pod="openstack/openstack-cell1-galera-0" Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.190136 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d100f321-6fe6-4eb3-a00c-50b9ff5e2861-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"d100f321-6fe6-4eb3-a00c-50b9ff5e2861\") " pod="openstack/openstack-cell1-galera-0" Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.207356 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjgz9\" (UniqueName: \"kubernetes.io/projected/d100f321-6fe6-4eb3-a00c-50b9ff5e2861-kube-api-access-wjgz9\") pod \"openstack-cell1-galera-0\" (UID: \"d100f321-6fe6-4eb3-a00c-50b9ff5e2861\") " pod="openstack/openstack-cell1-galera-0" Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.207859 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d100f321-6fe6-4eb3-a00c-50b9ff5e2861-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"d100f321-6fe6-4eb3-a00c-50b9ff5e2861\") " pod="openstack/openstack-cell1-galera-0" Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.243411 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"d100f321-6fe6-4eb3-a00c-50b9ff5e2861\") " pod="openstack/openstack-cell1-galera-0" Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.329355 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.330319 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.333146 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-pmptp" Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.333398 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.333434 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.344592 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.351435 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.387172 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e36e954-d9c1-41e3-8542-e8f300db90cb-combined-ca-bundle\") pod \"memcached-0\" (UID: \"1e36e954-d9c1-41e3-8542-e8f300db90cb\") " pod="openstack/memcached-0" Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.387219 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1e36e954-d9c1-41e3-8542-e8f300db90cb-kolla-config\") pod \"memcached-0\" (UID: \"1e36e954-d9c1-41e3-8542-e8f300db90cb\") " pod="openstack/memcached-0" Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.387244 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vs79l\" (UniqueName: \"kubernetes.io/projected/1e36e954-d9c1-41e3-8542-e8f300db90cb-kube-api-access-vs79l\") pod \"memcached-0\" (UID: \"1e36e954-d9c1-41e3-8542-e8f300db90cb\") " pod="openstack/memcached-0" Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.387281 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1e36e954-d9c1-41e3-8542-e8f300db90cb-config-data\") pod \"memcached-0\" (UID: \"1e36e954-d9c1-41e3-8542-e8f300db90cb\") " pod="openstack/memcached-0" Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.387298 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e36e954-d9c1-41e3-8542-e8f300db90cb-memcached-tls-certs\") pod \"memcached-0\" (UID: 
\"1e36e954-d9c1-41e3-8542-e8f300db90cb\") " pod="openstack/memcached-0" Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.488169 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vs79l\" (UniqueName: \"kubernetes.io/projected/1e36e954-d9c1-41e3-8542-e8f300db90cb-kube-api-access-vs79l\") pod \"memcached-0\" (UID: \"1e36e954-d9c1-41e3-8542-e8f300db90cb\") " pod="openstack/memcached-0" Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.488241 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1e36e954-d9c1-41e3-8542-e8f300db90cb-config-data\") pod \"memcached-0\" (UID: \"1e36e954-d9c1-41e3-8542-e8f300db90cb\") " pod="openstack/memcached-0" Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.488263 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e36e954-d9c1-41e3-8542-e8f300db90cb-memcached-tls-certs\") pod \"memcached-0\" (UID: \"1e36e954-d9c1-41e3-8542-e8f300db90cb\") " pod="openstack/memcached-0" Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.488341 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e36e954-d9c1-41e3-8542-e8f300db90cb-combined-ca-bundle\") pod \"memcached-0\" (UID: \"1e36e954-d9c1-41e3-8542-e8f300db90cb\") " pod="openstack/memcached-0" Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.488363 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1e36e954-d9c1-41e3-8542-e8f300db90cb-kolla-config\") pod \"memcached-0\" (UID: \"1e36e954-d9c1-41e3-8542-e8f300db90cb\") " pod="openstack/memcached-0" Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.489224 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1e36e954-d9c1-41e3-8542-e8f300db90cb-kolla-config\") pod \"memcached-0\" (UID: \"1e36e954-d9c1-41e3-8542-e8f300db90cb\") " pod="openstack/memcached-0" Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.490037 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1e36e954-d9c1-41e3-8542-e8f300db90cb-config-data\") pod \"memcached-0\" (UID: \"1e36e954-d9c1-41e3-8542-e8f300db90cb\") " pod="openstack/memcached-0" Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.502074 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e36e954-d9c1-41e3-8542-e8f300db90cb-combined-ca-bundle\") pod \"memcached-0\" (UID: \"1e36e954-d9c1-41e3-8542-e8f300db90cb\") " pod="openstack/memcached-0" Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.506050 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vs79l\" (UniqueName: \"kubernetes.io/projected/1e36e954-d9c1-41e3-8542-e8f300db90cb-kube-api-access-vs79l\") pod \"memcached-0\" (UID: \"1e36e954-d9c1-41e3-8542-e8f300db90cb\") " pod="openstack/memcached-0" Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.518429 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e36e954-d9c1-41e3-8542-e8f300db90cb-memcached-tls-certs\") pod \"memcached-0\" (UID: 
\"1e36e954-d9c1-41e3-8542-e8f300db90cb\") " pod="openstack/memcached-0" Dec 09 10:19:21 crc kubenswrapper[5002]: I1209 10:19:21.687964 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 09 10:19:22 crc kubenswrapper[5002]: I1209 10:19:22.821680 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 09 10:19:22 crc kubenswrapper[5002]: I1209 10:19:22.826129 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 09 10:19:22 crc kubenswrapper[5002]: I1209 10:19:22.829384 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-rk5nt" Dec 09 10:19:22 crc kubenswrapper[5002]: I1209 10:19:22.835241 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 09 10:19:22 crc kubenswrapper[5002]: I1209 10:19:22.910611 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66r8s\" (UniqueName: \"kubernetes.io/projected/3fdef05d-e63e-48b6-8d88-6a374294940e-kube-api-access-66r8s\") pod \"kube-state-metrics-0\" (UID: \"3fdef05d-e63e-48b6-8d88-6a374294940e\") " pod="openstack/kube-state-metrics-0" Dec 09 10:19:23 crc kubenswrapper[5002]: I1209 10:19:23.011622 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66r8s\" (UniqueName: \"kubernetes.io/projected/3fdef05d-e63e-48b6-8d88-6a374294940e-kube-api-access-66r8s\") pod \"kube-state-metrics-0\" (UID: \"3fdef05d-e63e-48b6-8d88-6a374294940e\") " pod="openstack/kube-state-metrics-0" Dec 09 10:19:23 crc kubenswrapper[5002]: I1209 10:19:23.041698 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66r8s\" (UniqueName: \"kubernetes.io/projected/3fdef05d-e63e-48b6-8d88-6a374294940e-kube-api-access-66r8s\") pod \"kube-state-metrics-0\" (UID: \"3fdef05d-e63e-48b6-8d88-6a374294940e\") " pod="openstack/kube-state-metrics-0" Dec 09 10:19:23 crc kubenswrapper[5002]: I1209 10:19:23.171625 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 09 10:19:27 crc kubenswrapper[5002]: I1209 10:19:27.230341 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 09 10:19:27 crc kubenswrapper[5002]: I1209 10:19:27.232216 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 09 10:19:27 crc kubenswrapper[5002]: I1209 10:19:27.235272 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 09 10:19:27 crc kubenswrapper[5002]: I1209 10:19:27.235931 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 09 10:19:27 crc kubenswrapper[5002]: I1209 10:19:27.236885 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-fnqv9" Dec 09 10:19:27 crc kubenswrapper[5002]: I1209 10:19:27.237060 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 09 10:19:27 crc kubenswrapper[5002]: I1209 10:19:27.237240 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 09 10:19:27 crc kubenswrapper[5002]: I1209 10:19:27.240231 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 09 10:19:27 crc kubenswrapper[5002]: I1209 10:19:27.282700 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/322c0304-1696-43fb-9225-a709e7e2ea89-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"322c0304-1696-43fb-9225-a709e7e2ea89\") " pod="openstack/ovsdbserver-nb-0" Dec 09 10:19:27 crc kubenswrapper[5002]: I1209 10:19:27.282801 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/322c0304-1696-43fb-9225-a709e7e2ea89-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"322c0304-1696-43fb-9225-a709e7e2ea89\") " pod="openstack/ovsdbserver-nb-0" Dec 09 10:19:27 crc kubenswrapper[5002]: I1209 10:19:27.282884 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/322c0304-1696-43fb-9225-a709e7e2ea89-config\") pod \"ovsdbserver-nb-0\" (UID: \"322c0304-1696-43fb-9225-a709e7e2ea89\") " pod="openstack/ovsdbserver-nb-0" Dec 09 10:19:27 crc kubenswrapper[5002]: I1209 10:19:27.283192 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"322c0304-1696-43fb-9225-a709e7e2ea89\") " pod="openstack/ovsdbserver-nb-0" Dec 09 10:19:27 crc kubenswrapper[5002]: I1209 10:19:27.283223 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjxfk\" (UniqueName: \"kubernetes.io/projected/322c0304-1696-43fb-9225-a709e7e2ea89-kube-api-access-qjxfk\") pod \"ovsdbserver-nb-0\" (UID: \"322c0304-1696-43fb-9225-a709e7e2ea89\") " pod="openstack/ovsdbserver-nb-0" Dec 09 10:19:27 crc kubenswrapper[5002]: I1209 10:19:27.283276 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/322c0304-1696-43fb-9225-a709e7e2ea89-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"322c0304-1696-43fb-9225-a709e7e2ea89\") " pod="openstack/ovsdbserver-nb-0" Dec 09 10:19:27 crc kubenswrapper[5002]: I1209 10:19:27.283315 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/322c0304-1696-43fb-9225-a709e7e2ea89-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"322c0304-1696-43fb-9225-a709e7e2ea89\") " pod="openstack/ovsdbserver-nb-0" Dec 09 10:19:27 crc kubenswrapper[5002]: I1209 10:19:27.283380 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/322c0304-1696-43fb-9225-a709e7e2ea89-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"322c0304-1696-43fb-9225-a709e7e2ea89\") " pod="openstack/ovsdbserver-nb-0" Dec 09 10:19:27 crc kubenswrapper[5002]: I1209 10:19:27.384639 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/322c0304-1696-43fb-9225-a709e7e2ea89-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"322c0304-1696-43fb-9225-a709e7e2ea89\") " pod="openstack/ovsdbserver-nb-0" Dec 09 10:19:27 crc kubenswrapper[5002]: I1209 10:19:27.384694 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/322c0304-1696-43fb-9225-a709e7e2ea89-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"322c0304-1696-43fb-9225-a709e7e2ea89\") " pod="openstack/ovsdbserver-nb-0" Dec 09 10:19:27 crc kubenswrapper[5002]: I1209 10:19:27.384739 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/322c0304-1696-43fb-9225-a709e7e2ea89-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"322c0304-1696-43fb-9225-a709e7e2ea89\") " pod="openstack/ovsdbserver-nb-0" Dec 09 10:19:27 crc kubenswrapper[5002]: I1209 10:19:27.384771 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/322c0304-1696-43fb-9225-a709e7e2ea89-config\") pod \"ovsdbserver-nb-0\" (UID: \"322c0304-1696-43fb-9225-a709e7e2ea89\") " pod="openstack/ovsdbserver-nb-0" Dec 09 10:19:27 crc kubenswrapper[5002]: I1209 10:19:27.384792 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"322c0304-1696-43fb-9225-a709e7e2ea89\") " pod="openstack/ovsdbserver-nb-0" Dec 09 10:19:27 crc kubenswrapper[5002]: I1209 10:19:27.384806 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjxfk\" (UniqueName: \"kubernetes.io/projected/322c0304-1696-43fb-9225-a709e7e2ea89-kube-api-access-qjxfk\") pod \"ovsdbserver-nb-0\" (UID: \"322c0304-1696-43fb-9225-a709e7e2ea89\") " pod="openstack/ovsdbserver-nb-0" Dec 09 10:19:27 crc kubenswrapper[5002]: I1209 10:19:27.384848 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/322c0304-1696-43fb-9225-a709e7e2ea89-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"322c0304-1696-43fb-9225-a709e7e2ea89\") " pod="openstack/ovsdbserver-nb-0" Dec 09 10:19:27 crc kubenswrapper[5002]: I1209 10:19:27.384891 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/322c0304-1696-43fb-9225-a709e7e2ea89-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"322c0304-1696-43fb-9225-a709e7e2ea89\") " pod="openstack/ovsdbserver-nb-0" Dec 09 10:19:27 crc kubenswrapper[5002]: I1209 
10:19:27.385571 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/322c0304-1696-43fb-9225-a709e7e2ea89-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"322c0304-1696-43fb-9225-a709e7e2ea89\") " pod="openstack/ovsdbserver-nb-0" Dec 09 10:19:27 crc kubenswrapper[5002]: I1209 10:19:27.385784 5002 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"322c0304-1696-43fb-9225-a709e7e2ea89\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/ovsdbserver-nb-0" Dec 09 10:19:27 crc kubenswrapper[5002]: I1209 10:19:27.385996 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/322c0304-1696-43fb-9225-a709e7e2ea89-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"322c0304-1696-43fb-9225-a709e7e2ea89\") " pod="openstack/ovsdbserver-nb-0" Dec 09 10:19:27 crc kubenswrapper[5002]: I1209 10:19:27.386219 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/322c0304-1696-43fb-9225-a709e7e2ea89-config\") pod \"ovsdbserver-nb-0\" (UID: \"322c0304-1696-43fb-9225-a709e7e2ea89\") " pod="openstack/ovsdbserver-nb-0" Dec 09 10:19:27 crc kubenswrapper[5002]: I1209 10:19:27.390408 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/322c0304-1696-43fb-9225-a709e7e2ea89-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"322c0304-1696-43fb-9225-a709e7e2ea89\") " pod="openstack/ovsdbserver-nb-0" Dec 09 10:19:27 crc kubenswrapper[5002]: I1209 10:19:27.390672 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/322c0304-1696-43fb-9225-a709e7e2ea89-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"322c0304-1696-43fb-9225-a709e7e2ea89\") " pod="openstack/ovsdbserver-nb-0" Dec 09 10:19:27 crc kubenswrapper[5002]: I1209 10:19:27.401353 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjxfk\" (UniqueName: \"kubernetes.io/projected/322c0304-1696-43fb-9225-a709e7e2ea89-kube-api-access-qjxfk\") pod \"ovsdbserver-nb-0\" (UID: \"322c0304-1696-43fb-9225-a709e7e2ea89\") " pod="openstack/ovsdbserver-nb-0" Dec 09 10:19:27 crc kubenswrapper[5002]: I1209 10:19:27.413792 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/322c0304-1696-43fb-9225-a709e7e2ea89-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"322c0304-1696-43fb-9225-a709e7e2ea89\") " pod="openstack/ovsdbserver-nb-0" Dec 09 10:19:27 crc kubenswrapper[5002]: I1209 10:19:27.418558 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"322c0304-1696-43fb-9225-a709e7e2ea89\") " pod="openstack/ovsdbserver-nb-0" Dec 09 10:19:27 crc kubenswrapper[5002]: I1209 10:19:27.572113 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 09 10:19:27 crc kubenswrapper[5002]: I1209 10:19:27.594204 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 09 10:19:28 crc kubenswrapper[5002]: I1209 10:19:28.396958 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-47b4k"] Dec 09 10:19:28 crc kubenswrapper[5002]: I1209 10:19:28.398776 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-47b4k" Dec 09 10:19:28 crc kubenswrapper[5002]: I1209 10:19:28.401230 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-7h7xm" Dec 09 10:19:28 crc kubenswrapper[5002]: I1209 10:19:28.401368 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 09 10:19:28 crc kubenswrapper[5002]: I1209 10:19:28.401509 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 09 10:19:28 crc kubenswrapper[5002]: I1209 10:19:28.420489 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-47b4k"] Dec 09 10:19:28 crc kubenswrapper[5002]: I1209 10:19:28.462998 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-g4kc8"] Dec 09 10:19:28 crc kubenswrapper[5002]: I1209 10:19:28.464918 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-g4kc8" Dec 09 10:19:28 crc kubenswrapper[5002]: I1209 10:19:28.494260 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-g4kc8"] Dec 09 10:19:28 crc kubenswrapper[5002]: I1209 10:19:28.504758 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6-var-lib\") pod \"ovn-controller-ovs-g4kc8\" (UID: \"26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6\") " pod="openstack/ovn-controller-ovs-g4kc8" Dec 09 10:19:28 crc kubenswrapper[5002]: I1209 10:19:28.504842 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdaeef31-a8f8-478a-86b0-4d0126eb7f3a-ovn-controller-tls-certs\") pod \"ovn-controller-47b4k\" (UID: \"fdaeef31-a8f8-478a-86b0-4d0126eb7f3a\") " pod="openstack/ovn-controller-47b4k" Dec 09 10:19:28 crc kubenswrapper[5002]: I1209 10:19:28.504887 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fdaeef31-a8f8-478a-86b0-4d0126eb7f3a-var-run-ovn\") pod \"ovn-controller-47b4k\" (UID: \"fdaeef31-a8f8-478a-86b0-4d0126eb7f3a\") " pod="openstack/ovn-controller-47b4k" Dec 09 10:19:28 crc kubenswrapper[5002]: I1209 10:19:28.504941 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fdaeef31-a8f8-478a-86b0-4d0126eb7f3a-var-log-ovn\") pod \"ovn-controller-47b4k\" (UID: \"fdaeef31-a8f8-478a-86b0-4d0126eb7f3a\") " pod="openstack/ovn-controller-47b4k" Dec 09 10:19:28 crc kubenswrapper[5002]: I1209 10:19:28.504967 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tf86\" (UniqueName: 
\"kubernetes.io/projected/fdaeef31-a8f8-478a-86b0-4d0126eb7f3a-kube-api-access-2tf86\") pod \"ovn-controller-47b4k\" (UID: \"fdaeef31-a8f8-478a-86b0-4d0126eb7f3a\") " pod="openstack/ovn-controller-47b4k" Dec 09 10:19:28 crc kubenswrapper[5002]: I1209 10:19:28.504992 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6-var-run\") pod \"ovn-controller-ovs-g4kc8\" (UID: \"26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6\") " pod="openstack/ovn-controller-ovs-g4kc8" Dec 09 10:19:28 crc kubenswrapper[5002]: I1209 10:19:28.505022 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fdaeef31-a8f8-478a-86b0-4d0126eb7f3a-var-run\") pod \"ovn-controller-47b4k\" (UID: \"fdaeef31-a8f8-478a-86b0-4d0126eb7f3a\") " pod="openstack/ovn-controller-47b4k" Dec 09 10:19:28 crc kubenswrapper[5002]: I1209 10:19:28.505049 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6-scripts\") pod \"ovn-controller-ovs-g4kc8\" (UID: \"26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6\") " pod="openstack/ovn-controller-ovs-g4kc8" Dec 09 10:19:28 crc kubenswrapper[5002]: I1209 10:19:28.505077 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdaeef31-a8f8-478a-86b0-4d0126eb7f3a-combined-ca-bundle\") pod \"ovn-controller-47b4k\" (UID: \"fdaeef31-a8f8-478a-86b0-4d0126eb7f3a\") " pod="openstack/ovn-controller-47b4k" Dec 09 10:19:28 crc kubenswrapper[5002]: I1209 10:19:28.505098 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fdaeef31-a8f8-478a-86b0-4d0126eb7f3a-scripts\") pod \"ovn-controller-47b4k\" (UID: \"fdaeef31-a8f8-478a-86b0-4d0126eb7f3a\") " pod="openstack/ovn-controller-47b4k" Dec 09 10:19:28 crc kubenswrapper[5002]: I1209 10:19:28.505121 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2xpx\" (UniqueName: \"kubernetes.io/projected/26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6-kube-api-access-x2xpx\") pod \"ovn-controller-ovs-g4kc8\" (UID: \"26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6\") " pod="openstack/ovn-controller-ovs-g4kc8" Dec 09 10:19:28 crc kubenswrapper[5002]: I1209 10:19:28.505149 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6-var-log\") pod \"ovn-controller-ovs-g4kc8\" (UID: \"26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6\") " pod="openstack/ovn-controller-ovs-g4kc8" Dec 09 10:19:28 crc kubenswrapper[5002]: I1209 10:19:28.505190 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6-etc-ovs\") pod \"ovn-controller-ovs-g4kc8\" (UID: \"26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6\") " pod="openstack/ovn-controller-ovs-g4kc8" Dec 09 10:19:28 crc kubenswrapper[5002]: I1209 10:19:28.606980 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6-var-log\") pod \"ovn-controller-ovs-g4kc8\" (UID: \"26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6\") " pod="openstack/ovn-controller-ovs-g4kc8" Dec 09 10:19:28 crc kubenswrapper[5002]: I1209 10:19:28.607052 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6-etc-ovs\") pod \"ovn-controller-ovs-g4kc8\" (UID: \"26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6\") " pod="openstack/ovn-controller-ovs-g4kc8" Dec 09 10:19:28 crc kubenswrapper[5002]: I1209 10:19:28.607079 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6-var-lib\") pod \"ovn-controller-ovs-g4kc8\" (UID: \"26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6\") " pod="openstack/ovn-controller-ovs-g4kc8" Dec 09 10:19:28 crc kubenswrapper[5002]: I1209 10:19:28.607107 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdaeef31-a8f8-478a-86b0-4d0126eb7f3a-ovn-controller-tls-certs\") pod \"ovn-controller-47b4k\" (UID: \"fdaeef31-a8f8-478a-86b0-4d0126eb7f3a\") " pod="openstack/ovn-controller-47b4k" Dec 09 10:19:28 crc kubenswrapper[5002]: I1209 10:19:28.607136 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fdaeef31-a8f8-478a-86b0-4d0126eb7f3a-var-run-ovn\") pod \"ovn-controller-47b4k\" (UID: \"fdaeef31-a8f8-478a-86b0-4d0126eb7f3a\") " pod="openstack/ovn-controller-47b4k" Dec 09 10:19:28 crc kubenswrapper[5002]: I1209 10:19:28.607176 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fdaeef31-a8f8-478a-86b0-4d0126eb7f3a-var-log-ovn\") pod \"ovn-controller-47b4k\" (UID: \"fdaeef31-a8f8-478a-86b0-4d0126eb7f3a\") " pod="openstack/ovn-controller-47b4k" Dec 09 10:19:28 crc kubenswrapper[5002]: I1209 10:19:28.607196 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tf86\" (UniqueName: \"kubernetes.io/projected/fdaeef31-a8f8-478a-86b0-4d0126eb7f3a-kube-api-access-2tf86\") pod \"ovn-controller-47b4k\" (UID: \"fdaeef31-a8f8-478a-86b0-4d0126eb7f3a\") " pod="openstack/ovn-controller-47b4k" Dec 09 10:19:28 crc kubenswrapper[5002]: I1209 10:19:28.607214 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6-var-run\") pod \"ovn-controller-ovs-g4kc8\" (UID: \"26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6\") " pod="openstack/ovn-controller-ovs-g4kc8" Dec 09 10:19:28 crc kubenswrapper[5002]: I1209 10:19:28.607236 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fdaeef31-a8f8-478a-86b0-4d0126eb7f3a-var-run\") pod \"ovn-controller-47b4k\" (UID: \"fdaeef31-a8f8-478a-86b0-4d0126eb7f3a\") " pod="openstack/ovn-controller-47b4k" Dec 09 10:19:28 crc kubenswrapper[5002]: I1209 10:19:28.607257 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6-scripts\") pod \"ovn-controller-ovs-g4kc8\" (UID: \"26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6\") " pod="openstack/ovn-controller-ovs-g4kc8" Dec 09 10:19:28 
crc kubenswrapper[5002]: I1209 10:19:28.607279 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdaeef31-a8f8-478a-86b0-4d0126eb7f3a-combined-ca-bundle\") pod \"ovn-controller-47b4k\" (UID: \"fdaeef31-a8f8-478a-86b0-4d0126eb7f3a\") " pod="openstack/ovn-controller-47b4k" Dec 09 10:19:28 crc kubenswrapper[5002]: I1209 10:19:28.607296 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fdaeef31-a8f8-478a-86b0-4d0126eb7f3a-scripts\") pod \"ovn-controller-47b4k\" (UID: \"fdaeef31-a8f8-478a-86b0-4d0126eb7f3a\") " pod="openstack/ovn-controller-47b4k" Dec 09 10:19:28 crc kubenswrapper[5002]: I1209 10:19:28.607327 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2xpx\" (UniqueName: \"kubernetes.io/projected/26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6-kube-api-access-x2xpx\") pod \"ovn-controller-ovs-g4kc8\" (UID: \"26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6\") " pod="openstack/ovn-controller-ovs-g4kc8" Dec 09 10:19:28 crc kubenswrapper[5002]: I1209 10:19:28.607584 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6-etc-ovs\") pod \"ovn-controller-ovs-g4kc8\" (UID: \"26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6\") " pod="openstack/ovn-controller-ovs-g4kc8" Dec 09 10:19:28 crc kubenswrapper[5002]: I1209 10:19:28.607693 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fdaeef31-a8f8-478a-86b0-4d0126eb7f3a-var-log-ovn\") pod \"ovn-controller-47b4k\" (UID: \"fdaeef31-a8f8-478a-86b0-4d0126eb7f3a\") " pod="openstack/ovn-controller-47b4k" Dec 09 10:19:28 crc kubenswrapper[5002]: I1209 10:19:28.607753 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fdaeef31-a8f8-478a-86b0-4d0126eb7f3a-var-run\") pod \"ovn-controller-47b4k\" (UID: \"fdaeef31-a8f8-478a-86b0-4d0126eb7f3a\") " pod="openstack/ovn-controller-47b4k" Dec 09 10:19:28 crc kubenswrapper[5002]: I1209 10:19:28.607786 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fdaeef31-a8f8-478a-86b0-4d0126eb7f3a-var-run-ovn\") pod \"ovn-controller-47b4k\" (UID: \"fdaeef31-a8f8-478a-86b0-4d0126eb7f3a\") " pod="openstack/ovn-controller-47b4k" Dec 09 10:19:28 crc kubenswrapper[5002]: I1209 10:19:28.607849 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6-var-run\") pod \"ovn-controller-ovs-g4kc8\" (UID: \"26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6\") " pod="openstack/ovn-controller-ovs-g4kc8" Dec 09 10:19:28 crc kubenswrapper[5002]: I1209 10:19:28.608023 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6-var-lib\") pod \"ovn-controller-ovs-g4kc8\" (UID: \"26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6\") " pod="openstack/ovn-controller-ovs-g4kc8" Dec 09 10:19:28 crc kubenswrapper[5002]: I1209 10:19:28.610014 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6-scripts\") pod \"ovn-controller-ovs-g4kc8\" (UID: 
\"26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6\") " pod="openstack/ovn-controller-ovs-g4kc8" Dec 09 10:19:28 crc kubenswrapper[5002]: I1209 10:19:28.610122 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6-var-log\") pod \"ovn-controller-ovs-g4kc8\" (UID: \"26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6\") " pod="openstack/ovn-controller-ovs-g4kc8" Dec 09 10:19:28 crc kubenswrapper[5002]: I1209 10:19:28.612647 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdaeef31-a8f8-478a-86b0-4d0126eb7f3a-combined-ca-bundle\") pod \"ovn-controller-47b4k\" (UID: \"fdaeef31-a8f8-478a-86b0-4d0126eb7f3a\") " pod="openstack/ovn-controller-47b4k" Dec 09 10:19:28 crc kubenswrapper[5002]: I1209 10:19:28.612749 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdaeef31-a8f8-478a-86b0-4d0126eb7f3a-ovn-controller-tls-certs\") pod \"ovn-controller-47b4k\" (UID: \"fdaeef31-a8f8-478a-86b0-4d0126eb7f3a\") " pod="openstack/ovn-controller-47b4k" Dec 09 10:19:28 crc kubenswrapper[5002]: I1209 10:19:28.619278 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fdaeef31-a8f8-478a-86b0-4d0126eb7f3a-scripts\") pod \"ovn-controller-47b4k\" (UID: \"fdaeef31-a8f8-478a-86b0-4d0126eb7f3a\") " pod="openstack/ovn-controller-47b4k" Dec 09 10:19:28 crc kubenswrapper[5002]: I1209 10:19:28.623166 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2xpx\" (UniqueName: \"kubernetes.io/projected/26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6-kube-api-access-x2xpx\") pod \"ovn-controller-ovs-g4kc8\" (UID: \"26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6\") " pod="openstack/ovn-controller-ovs-g4kc8" Dec 09 10:19:28 crc kubenswrapper[5002]: I1209 10:19:28.624843 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tf86\" (UniqueName: \"kubernetes.io/projected/fdaeef31-a8f8-478a-86b0-4d0126eb7f3a-kube-api-access-2tf86\") pod \"ovn-controller-47b4k\" (UID: \"fdaeef31-a8f8-478a-86b0-4d0126eb7f3a\") " pod="openstack/ovn-controller-47b4k" Dec 09 10:19:28 crc kubenswrapper[5002]: I1209 10:19:28.734841 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-47b4k" Dec 09 10:19:28 crc kubenswrapper[5002]: I1209 10:19:28.790437 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-g4kc8" Dec 09 10:19:29 crc kubenswrapper[5002]: I1209 10:19:29.202571 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 09 10:19:29 crc kubenswrapper[5002]: I1209 10:19:29.204265 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 09 10:19:29 crc kubenswrapper[5002]: I1209 10:19:29.206952 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 09 10:19:29 crc kubenswrapper[5002]: I1209 10:19:29.207105 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 09 10:19:29 crc kubenswrapper[5002]: I1209 10:19:29.207147 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 09 10:19:29 crc kubenswrapper[5002]: I1209 10:19:29.207182 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-5lnfh" Dec 09 10:19:29 crc kubenswrapper[5002]: I1209 10:19:29.216664 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 09 10:19:29 crc kubenswrapper[5002]: I1209 10:19:29.317320 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/28cb84ad-b399-4fe4-9631-e481dfa75aed-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"28cb84ad-b399-4fe4-9631-e481dfa75aed\") " pod="openstack/ovsdbserver-sb-0" Dec 09 10:19:29 crc kubenswrapper[5002]: I1209 10:19:29.317373 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28cb84ad-b399-4fe4-9631-e481dfa75aed-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"28cb84ad-b399-4fe4-9631-e481dfa75aed\") " pod="openstack/ovsdbserver-sb-0" Dec 09 10:19:29 crc kubenswrapper[5002]: I1209 10:19:29.317399 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28cb84ad-b399-4fe4-9631-e481dfa75aed-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"28cb84ad-b399-4fe4-9631-e481dfa75aed\") " pod="openstack/ovsdbserver-sb-0" Dec 09 10:19:29 crc kubenswrapper[5002]: I1209 10:19:29.317438 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn4c4\" (UniqueName: \"kubernetes.io/projected/28cb84ad-b399-4fe4-9631-e481dfa75aed-kube-api-access-hn4c4\") pod \"ovsdbserver-sb-0\" (UID: \"28cb84ad-b399-4fe4-9631-e481dfa75aed\") " pod="openstack/ovsdbserver-sb-0" Dec 09 10:19:29 crc kubenswrapper[5002]: I1209 10:19:29.317539 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"28cb84ad-b399-4fe4-9631-e481dfa75aed\") " pod="openstack/ovsdbserver-sb-0" Dec 09 10:19:29 crc kubenswrapper[5002]: I1209 10:19:29.317613 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/28cb84ad-b399-4fe4-9631-e481dfa75aed-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"28cb84ad-b399-4fe4-9631-e481dfa75aed\") " pod="openstack/ovsdbserver-sb-0" Dec 09 10:19:29 crc kubenswrapper[5002]: I1209 10:19:29.317633 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28cb84ad-b399-4fe4-9631-e481dfa75aed-config\") pod \"ovsdbserver-sb-0\" (UID: \"28cb84ad-b399-4fe4-9631-e481dfa75aed\") " 
pod="openstack/ovsdbserver-sb-0" Dec 09 10:19:29 crc kubenswrapper[5002]: I1209 10:19:29.317669 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/28cb84ad-b399-4fe4-9631-e481dfa75aed-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"28cb84ad-b399-4fe4-9631-e481dfa75aed\") " pod="openstack/ovsdbserver-sb-0" Dec 09 10:19:29 crc kubenswrapper[5002]: I1209 10:19:29.419014 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"28cb84ad-b399-4fe4-9631-e481dfa75aed\") " pod="openstack/ovsdbserver-sb-0" Dec 09 10:19:29 crc kubenswrapper[5002]: I1209 10:19:29.419080 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/28cb84ad-b399-4fe4-9631-e481dfa75aed-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"28cb84ad-b399-4fe4-9631-e481dfa75aed\") " pod="openstack/ovsdbserver-sb-0" Dec 09 10:19:29 crc kubenswrapper[5002]: I1209 10:19:29.419100 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28cb84ad-b399-4fe4-9631-e481dfa75aed-config\") pod \"ovsdbserver-sb-0\" (UID: \"28cb84ad-b399-4fe4-9631-e481dfa75aed\") " pod="openstack/ovsdbserver-sb-0" Dec 09 10:19:29 crc kubenswrapper[5002]: I1209 10:19:29.419127 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/28cb84ad-b399-4fe4-9631-e481dfa75aed-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"28cb84ad-b399-4fe4-9631-e481dfa75aed\") " pod="openstack/ovsdbserver-sb-0" Dec 09 10:19:29 crc kubenswrapper[5002]: I1209 10:19:29.419195 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/28cb84ad-b399-4fe4-9631-e481dfa75aed-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"28cb84ad-b399-4fe4-9631-e481dfa75aed\") " pod="openstack/ovsdbserver-sb-0" Dec 09 10:19:29 crc kubenswrapper[5002]: I1209 10:19:29.419212 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28cb84ad-b399-4fe4-9631-e481dfa75aed-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"28cb84ad-b399-4fe4-9631-e481dfa75aed\") " pod="openstack/ovsdbserver-sb-0" Dec 09 10:19:29 crc kubenswrapper[5002]: I1209 10:19:29.419229 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28cb84ad-b399-4fe4-9631-e481dfa75aed-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"28cb84ad-b399-4fe4-9631-e481dfa75aed\") " pod="openstack/ovsdbserver-sb-0" Dec 09 10:19:29 crc kubenswrapper[5002]: I1209 10:19:29.419249 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn4c4\" (UniqueName: \"kubernetes.io/projected/28cb84ad-b399-4fe4-9631-e481dfa75aed-kube-api-access-hn4c4\") pod \"ovsdbserver-sb-0\" (UID: \"28cb84ad-b399-4fe4-9631-e481dfa75aed\") " pod="openstack/ovsdbserver-sb-0" Dec 09 10:19:29 crc kubenswrapper[5002]: I1209 10:19:29.419388 5002 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"28cb84ad-b399-4fe4-9631-e481dfa75aed\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/ovsdbserver-sb-0" Dec 09 10:19:29 crc kubenswrapper[5002]: I1209 10:19:29.420320 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/28cb84ad-b399-4fe4-9631-e481dfa75aed-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"28cb84ad-b399-4fe4-9631-e481dfa75aed\") " pod="openstack/ovsdbserver-sb-0" Dec 09 10:19:29 crc kubenswrapper[5002]: I1209 10:19:29.420878 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28cb84ad-b399-4fe4-9631-e481dfa75aed-config\") pod \"ovsdbserver-sb-0\" (UID: \"28cb84ad-b399-4fe4-9631-e481dfa75aed\") " pod="openstack/ovsdbserver-sb-0" Dec 09 10:19:29 crc kubenswrapper[5002]: I1209 10:19:29.421042 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28cb84ad-b399-4fe4-9631-e481dfa75aed-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"28cb84ad-b399-4fe4-9631-e481dfa75aed\") " pod="openstack/ovsdbserver-sb-0" Dec 09 10:19:29 crc kubenswrapper[5002]: I1209 10:19:29.424619 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/28cb84ad-b399-4fe4-9631-e481dfa75aed-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"28cb84ad-b399-4fe4-9631-e481dfa75aed\") " pod="openstack/ovsdbserver-sb-0" Dec 09 10:19:29 crc kubenswrapper[5002]: I1209 10:19:29.425176 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28cb84ad-b399-4fe4-9631-e481dfa75aed-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"28cb84ad-b399-4fe4-9631-e481dfa75aed\") " pod="openstack/ovsdbserver-sb-0" Dec 09 10:19:29 crc kubenswrapper[5002]: I1209 10:19:29.431527 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/28cb84ad-b399-4fe4-9631-e481dfa75aed-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"28cb84ad-b399-4fe4-9631-e481dfa75aed\") " pod="openstack/ovsdbserver-sb-0" Dec 09 10:19:29 crc kubenswrapper[5002]: I1209 10:19:29.441736 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn4c4\" (UniqueName: \"kubernetes.io/projected/28cb84ad-b399-4fe4-9631-e481dfa75aed-kube-api-access-hn4c4\") pod \"ovsdbserver-sb-0\" (UID: \"28cb84ad-b399-4fe4-9631-e481dfa75aed\") " pod="openstack/ovsdbserver-sb-0" Dec 09 10:19:29 crc kubenswrapper[5002]: I1209 10:19:29.442836 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"28cb84ad-b399-4fe4-9631-e481dfa75aed\") " pod="openstack/ovsdbserver-sb-0" Dec 09 10:19:29 crc kubenswrapper[5002]: I1209 10:19:29.532427 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 09 10:19:42 crc kubenswrapper[5002]: E1209 10:19:42.636634 5002 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Dec 09 10:19:42 crc kubenswrapper[5002]: E1209 10:19:42.637645 5002 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w627k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(58c08274-46ea-48be-a135-0c1174cd6135): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 09 10:19:42 crc kubenswrapper[5002]: E1209 10:19:42.638896 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" 
podUID="58c08274-46ea-48be-a135-0c1174cd6135" Dec 09 10:19:42 crc kubenswrapper[5002]: W1209 10:19:42.641968 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7faabd78_c9ab_4397_aa4d_b8aaff302251.slice/crio-38111b2f8e861bf57c5f0e322034fbc75eba84e214cb51fbd1f99f4f884df28a WatchSource:0}: Error finding container 38111b2f8e861bf57c5f0e322034fbc75eba84e214cb51fbd1f99f4f884df28a: Status 404 returned error can't find the container with id 38111b2f8e861bf57c5f0e322034fbc75eba84e214cb51fbd1f99f4f884df28a Dec 09 10:19:43 crc kubenswrapper[5002]: I1209 10:19:43.062435 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"7faabd78-c9ab-4397-aa4d-b8aaff302251","Type":"ContainerStarted","Data":"38111b2f8e861bf57c5f0e322034fbc75eba84e214cb51fbd1f99f4f884df28a"} Dec 09 10:19:43 crc kubenswrapper[5002]: E1209 10:19:43.064907 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="58c08274-46ea-48be-a135-0c1174cd6135" Dec 09 10:19:43 crc kubenswrapper[5002]: E1209 10:19:43.463423 5002 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 09 10:19:43 crc kubenswrapper[5002]: E1209 10:19:43.463848 5002 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bxwds,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-g57t2_openstack(b475e53a-d6c5-454d-98cf-8eb43bf60438): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 09 10:19:43 crc kubenswrapper[5002]: E1209 10:19:43.465899 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-g57t2" podUID="b475e53a-d6c5-454d-98cf-8eb43bf60438" Dec 09 10:19:43 crc kubenswrapper[5002]: E1209 10:19:43.493653 5002 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 09 10:19:43 crc kubenswrapper[5002]: E1209 10:19:43.493804 5002 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-72qmh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-vqhpp_openstack(de34f23b-efc3-41e6-a76c-04bfe4eb6494): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 09 10:19:43 crc kubenswrapper[5002]: E1209 10:19:43.495590 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-vqhpp" podUID="de34f23b-efc3-41e6-a76c-04bfe4eb6494" Dec 09 10:19:43 crc kubenswrapper[5002]: E1209 10:19:43.503923 5002 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 09 10:19:43 crc kubenswrapper[5002]: E1209 10:19:43.504089 5002 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7z459,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-g9btt_openstack(add75f96-a4a7-43ac-8d6d-d7acd7c19f0e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 09 10:19:43 crc kubenswrapper[5002]: E1209 10:19:43.505834 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-g9btt" podUID="add75f96-a4a7-43ac-8d6d-d7acd7c19f0e" Dec 09 10:19:43 crc kubenswrapper[5002]: E1209 10:19:43.529503 5002 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 09 10:19:43 crc kubenswrapper[5002]: E1209 10:19:43.529633 5002 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-plptv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-5hpxc_openstack(2db56b85-ed32-480a-a20d-0610a73a1bae): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 09 10:19:43 crc kubenswrapper[5002]: E1209 10:19:43.530868 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-5hpxc" podUID="2db56b85-ed32-480a-a20d-0610a73a1bae" Dec 09 10:19:43 crc kubenswrapper[5002]: I1209 10:19:43.916278 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 09 10:19:44 crc kubenswrapper[5002]: I1209 10:19:44.051125 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 09 10:19:44 crc kubenswrapper[5002]: W1209 10:19:44.070219 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3fdef05d_e63e_48b6_8d88_6a374294940e.slice/crio-1f37c7d6fddd41f5cfe2b0a979617fdb72360dec2fb341bccfe819aadcc5b37a WatchSource:0}: Error finding container 1f37c7d6fddd41f5cfe2b0a979617fdb72360dec2fb341bccfe819aadcc5b37a: Status 404 returned error can't find the container with id 1f37c7d6fddd41f5cfe2b0a979617fdb72360dec2fb341bccfe819aadcc5b37a Dec 09 10:19:44 crc kubenswrapper[5002]: I1209 10:19:44.075788 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-47b4k"] Dec 09 10:19:44 crc kubenswrapper[5002]: W1209 10:19:44.078129 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfdaeef31_a8f8_478a_86b0_4d0126eb7f3a.slice/crio-6c3e11365d63634272c07c286c98d3e312d97e5f9e6503b754df302cc4b9bb4c 
WatchSource:0}: Error finding container 6c3e11365d63634272c07c286c98d3e312d97e5f9e6503b754df302cc4b9bb4c: Status 404 returned error can't find the container with id 6c3e11365d63634272c07c286c98d3e312d97e5f9e6503b754df302cc4b9bb4c Dec 09 10:19:44 crc kubenswrapper[5002]: I1209 10:19:44.086233 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"1e36e954-d9c1-41e3-8542-e8f300db90cb","Type":"ContainerStarted","Data":"2f2fe9d583ff861743289c5cbd4368ed28b66a703168bd606e18025b9335fab7"} Dec 09 10:19:44 crc kubenswrapper[5002]: E1209 10:19:44.088597 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-vqhpp" podUID="de34f23b-efc3-41e6-a76c-04bfe4eb6494" Dec 09 10:19:44 crc kubenswrapper[5002]: E1209 10:19:44.088932 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-5hpxc" podUID="2db56b85-ed32-480a-a20d-0610a73a1bae" Dec 09 10:19:44 crc kubenswrapper[5002]: I1209 10:19:44.280204 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 09 10:19:44 crc kubenswrapper[5002]: I1209 10:19:44.399564 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 09 10:19:44 crc kubenswrapper[5002]: W1209 10:19:44.413955 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod322c0304_1696_43fb_9225_a709e7e2ea89.slice/crio-493ef9e1293cf81052f3942296fa2500b1a949f6744a038a176f166c2ff4311c WatchSource:0}: Error finding container 493ef9e1293cf81052f3942296fa2500b1a949f6744a038a176f166c2ff4311c: Status 404 returned error can't find the container with id 493ef9e1293cf81052f3942296fa2500b1a949f6744a038a176f166c2ff4311c Dec 09 10:19:44 crc kubenswrapper[5002]: I1209 10:19:44.493870 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-g4kc8"] Dec 09 10:19:44 crc kubenswrapper[5002]: W1209 10:19:44.523173 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26d7a6fc_ab80_43cf_8ee4_bf9fd88089e6.slice/crio-a56e9525efca8beac81f7fa45f9a7a566642078a4686d2d31f413c7f8c689519 WatchSource:0}: Error finding container a56e9525efca8beac81f7fa45f9a7a566642078a4686d2d31f413c7f8c689519: Status 404 returned error can't find the container with id a56e9525efca8beac81f7fa45f9a7a566642078a4686d2d31f413c7f8c689519 Dec 09 10:19:44 crc kubenswrapper[5002]: I1209 10:19:44.796183 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-g9btt" Dec 09 10:19:44 crc kubenswrapper[5002]: I1209 10:19:44.802368 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-g57t2" Dec 09 10:19:44 crc kubenswrapper[5002]: I1209 10:19:44.836852 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxwds\" (UniqueName: \"kubernetes.io/projected/b475e53a-d6c5-454d-98cf-8eb43bf60438-kube-api-access-bxwds\") pod \"b475e53a-d6c5-454d-98cf-8eb43bf60438\" (UID: \"b475e53a-d6c5-454d-98cf-8eb43bf60438\") " Dec 09 10:19:44 crc kubenswrapper[5002]: I1209 10:19:44.837025 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b475e53a-d6c5-454d-98cf-8eb43bf60438-config\") pod \"b475e53a-d6c5-454d-98cf-8eb43bf60438\" (UID: \"b475e53a-d6c5-454d-98cf-8eb43bf60438\") " Dec 09 10:19:44 crc kubenswrapper[5002]: I1209 10:19:44.837188 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7z459\" (UniqueName: \"kubernetes.io/projected/add75f96-a4a7-43ac-8d6d-d7acd7c19f0e-kube-api-access-7z459\") pod \"add75f96-a4a7-43ac-8d6d-d7acd7c19f0e\" (UID: \"add75f96-a4a7-43ac-8d6d-d7acd7c19f0e\") " Dec 09 10:19:44 crc kubenswrapper[5002]: I1209 10:19:44.837322 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/add75f96-a4a7-43ac-8d6d-d7acd7c19f0e-config\") pod \"add75f96-a4a7-43ac-8d6d-d7acd7c19f0e\" (UID: \"add75f96-a4a7-43ac-8d6d-d7acd7c19f0e\") " Dec 09 10:19:44 crc kubenswrapper[5002]: I1209 10:19:44.837461 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b475e53a-d6c5-454d-98cf-8eb43bf60438-dns-svc\") pod \"b475e53a-d6c5-454d-98cf-8eb43bf60438\" (UID: \"b475e53a-d6c5-454d-98cf-8eb43bf60438\") " Dec 09 10:19:44 crc kubenswrapper[5002]: I1209 10:19:44.837796 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b475e53a-d6c5-454d-98cf-8eb43bf60438-config" (OuterVolumeSpecName: "config") pod "b475e53a-d6c5-454d-98cf-8eb43bf60438" (UID: "b475e53a-d6c5-454d-98cf-8eb43bf60438"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:19:44 crc kubenswrapper[5002]: I1209 10:19:44.838625 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b475e53a-d6c5-454d-98cf-8eb43bf60438-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:19:44 crc kubenswrapper[5002]: I1209 10:19:44.838735 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b475e53a-d6c5-454d-98cf-8eb43bf60438-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b475e53a-d6c5-454d-98cf-8eb43bf60438" (UID: "b475e53a-d6c5-454d-98cf-8eb43bf60438"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:19:44 crc kubenswrapper[5002]: I1209 10:19:44.838942 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/add75f96-a4a7-43ac-8d6d-d7acd7c19f0e-config" (OuterVolumeSpecName: "config") pod "add75f96-a4a7-43ac-8d6d-d7acd7c19f0e" (UID: "add75f96-a4a7-43ac-8d6d-d7acd7c19f0e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:19:44 crc kubenswrapper[5002]: I1209 10:19:44.842986 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b475e53a-d6c5-454d-98cf-8eb43bf60438-kube-api-access-bxwds" (OuterVolumeSpecName: "kube-api-access-bxwds") pod "b475e53a-d6c5-454d-98cf-8eb43bf60438" (UID: "b475e53a-d6c5-454d-98cf-8eb43bf60438"). InnerVolumeSpecName "kube-api-access-bxwds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:19:44 crc kubenswrapper[5002]: I1209 10:19:44.855976 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/add75f96-a4a7-43ac-8d6d-d7acd7c19f0e-kube-api-access-7z459" (OuterVolumeSpecName: "kube-api-access-7z459") pod "add75f96-a4a7-43ac-8d6d-d7acd7c19f0e" (UID: "add75f96-a4a7-43ac-8d6d-d7acd7c19f0e"). InnerVolumeSpecName "kube-api-access-7z459". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:19:44 crc kubenswrapper[5002]: I1209 10:19:44.940029 5002 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b475e53a-d6c5-454d-98cf-8eb43bf60438-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 10:19:44 crc kubenswrapper[5002]: I1209 10:19:44.940056 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxwds\" (UniqueName: \"kubernetes.io/projected/b475e53a-d6c5-454d-98cf-8eb43bf60438-kube-api-access-bxwds\") on node \"crc\" DevicePath \"\"" Dec 09 10:19:44 crc kubenswrapper[5002]: I1209 10:19:44.940066 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7z459\" (UniqueName: \"kubernetes.io/projected/add75f96-a4a7-43ac-8d6d-d7acd7c19f0e-kube-api-access-7z459\") on node \"crc\" DevicePath \"\"" Dec 09 10:19:44 crc kubenswrapper[5002]: I1209 10:19:44.940075 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/add75f96-a4a7-43ac-8d6d-d7acd7c19f0e-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:19:44 crc kubenswrapper[5002]: I1209 10:19:44.985434 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 09 10:19:44 crc kubenswrapper[5002]: W1209 10:19:44.992351 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28cb84ad_b399_4fe4_9631_e481dfa75aed.slice/crio-96faafc2f3c60b72f11fea10579fb5eb0859efde3ad8721d48d167fd9c26d95d WatchSource:0}: Error finding container 96faafc2f3c60b72f11fea10579fb5eb0859efde3ad8721d48d167fd9c26d95d: Status 404 returned error can't find the container with id 96faafc2f3c60b72f11fea10579fb5eb0859efde3ad8721d48d167fd9c26d95d Dec 09 10:19:45 crc kubenswrapper[5002]: I1209 10:19:45.098091 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-g57t2" event={"ID":"b475e53a-d6c5-454d-98cf-8eb43bf60438","Type":"ContainerDied","Data":"24c2548a67b96de0bb15e05e1200da289724ec8421f9bebb408f24a5d8567b81"} Dec 09 10:19:45 crc kubenswrapper[5002]: I1209 10:19:45.098107 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-g57t2" Dec 09 10:19:45 crc kubenswrapper[5002]: I1209 10:19:45.100253 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"d100f321-6fe6-4eb3-a00c-50b9ff5e2861","Type":"ContainerStarted","Data":"40804e59a7145457f96a2a586d17f80f2f2f358bad75b7d0296bbf50d61359d2"} Dec 09 10:19:45 crc kubenswrapper[5002]: I1209 10:19:45.103145 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9278e14e-2524-4e42-b870-f493ea02ede8","Type":"ContainerStarted","Data":"2354f84dc26ea366678ca4f5adfbc4ea21ccc99533838a486cd28b5710f9ea1c"} Dec 09 10:19:45 crc kubenswrapper[5002]: I1209 10:19:45.104247 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"28cb84ad-b399-4fe4-9631-e481dfa75aed","Type":"ContainerStarted","Data":"96faafc2f3c60b72f11fea10579fb5eb0859efde3ad8721d48d167fd9c26d95d"} Dec 09 10:19:45 crc kubenswrapper[5002]: I1209 10:19:45.106120 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-g9btt" event={"ID":"add75f96-a4a7-43ac-8d6d-d7acd7c19f0e","Type":"ContainerDied","Data":"2dfcdb7068ee8f89697af61d2244ee70f7c3da10e9ea2c6661fcfa1de354f775"} Dec 09 10:19:45 crc kubenswrapper[5002]: I1209 10:19:45.106166 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-g9btt" Dec 09 10:19:45 crc kubenswrapper[5002]: I1209 10:19:45.134730 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"322c0304-1696-43fb-9225-a709e7e2ea89","Type":"ContainerStarted","Data":"493ef9e1293cf81052f3942296fa2500b1a949f6744a038a176f166c2ff4311c"} Dec 09 10:19:45 crc kubenswrapper[5002]: I1209 10:19:45.136007 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3fdef05d-e63e-48b6-8d88-6a374294940e","Type":"ContainerStarted","Data":"1f37c7d6fddd41f5cfe2b0a979617fdb72360dec2fb341bccfe819aadcc5b37a"} Dec 09 10:19:45 crc kubenswrapper[5002]: I1209 10:19:45.138785 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-47b4k" event={"ID":"fdaeef31-a8f8-478a-86b0-4d0126eb7f3a","Type":"ContainerStarted","Data":"6c3e11365d63634272c07c286c98d3e312d97e5f9e6503b754df302cc4b9bb4c"} Dec 09 10:19:45 crc kubenswrapper[5002]: I1209 10:19:45.140509 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-g4kc8" event={"ID":"26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6","Type":"ContainerStarted","Data":"a56e9525efca8beac81f7fa45f9a7a566642078a4686d2d31f413c7f8c689519"} Dec 09 10:19:45 crc kubenswrapper[5002]: I1209 10:19:45.165920 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-g57t2"] Dec 09 10:19:45 crc kubenswrapper[5002]: I1209 10:19:45.175974 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-g57t2"] Dec 09 10:19:45 crc kubenswrapper[5002]: I1209 10:19:45.196046 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-g9btt"] Dec 09 10:19:45 crc kubenswrapper[5002]: I1209 10:19:45.201494 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-g9btt"] Dec 09 10:19:46 crc kubenswrapper[5002]: I1209 10:19:46.074447 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="add75f96-a4a7-43ac-8d6d-d7acd7c19f0e" 
path="/var/lib/kubelet/pods/add75f96-a4a7-43ac-8d6d-d7acd7c19f0e/volumes" Dec 09 10:19:46 crc kubenswrapper[5002]: I1209 10:19:46.075588 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b475e53a-d6c5-454d-98cf-8eb43bf60438" path="/var/lib/kubelet/pods/b475e53a-d6c5-454d-98cf-8eb43bf60438/volumes" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.255287 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-ghft5"] Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.257195 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-ghft5" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.261402 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.273373 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-ghft5"] Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.314646 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0f7675b-6614-4e41-86e6-364b7f04664e-config\") pod \"ovn-controller-metrics-ghft5\" (UID: \"e0f7675b-6614-4e41-86e6-364b7f04664e\") " pod="openstack/ovn-controller-metrics-ghft5" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.314746 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxmdr\" (UniqueName: \"kubernetes.io/projected/e0f7675b-6614-4e41-86e6-364b7f04664e-kube-api-access-mxmdr\") pod \"ovn-controller-metrics-ghft5\" (UID: \"e0f7675b-6614-4e41-86e6-364b7f04664e\") " pod="openstack/ovn-controller-metrics-ghft5" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.314768 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e0f7675b-6614-4e41-86e6-364b7f04664e-ovs-rundir\") pod \"ovn-controller-metrics-ghft5\" (UID: \"e0f7675b-6614-4e41-86e6-364b7f04664e\") " pod="openstack/ovn-controller-metrics-ghft5" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.314844 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0f7675b-6614-4e41-86e6-364b7f04664e-combined-ca-bundle\") pod \"ovn-controller-metrics-ghft5\" (UID: \"e0f7675b-6614-4e41-86e6-364b7f04664e\") " pod="openstack/ovn-controller-metrics-ghft5" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.314873 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e0f7675b-6614-4e41-86e6-364b7f04664e-ovn-rundir\") pod \"ovn-controller-metrics-ghft5\" (UID: \"e0f7675b-6614-4e41-86e6-364b7f04664e\") " pod="openstack/ovn-controller-metrics-ghft5" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.315493 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0f7675b-6614-4e41-86e6-364b7f04664e-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-ghft5\" (UID: \"e0f7675b-6614-4e41-86e6-364b7f04664e\") " pod="openstack/ovn-controller-metrics-ghft5" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.394846 5002 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-5hpxc"] Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.418881 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0f7675b-6614-4e41-86e6-364b7f04664e-config\") pod \"ovn-controller-metrics-ghft5\" (UID: \"e0f7675b-6614-4e41-86e6-364b7f04664e\") " pod="openstack/ovn-controller-metrics-ghft5" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.418959 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e0f7675b-6614-4e41-86e6-364b7f04664e-ovs-rundir\") pod \"ovn-controller-metrics-ghft5\" (UID: \"e0f7675b-6614-4e41-86e6-364b7f04664e\") " pod="openstack/ovn-controller-metrics-ghft5" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.418989 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxmdr\" (UniqueName: \"kubernetes.io/projected/e0f7675b-6614-4e41-86e6-364b7f04664e-kube-api-access-mxmdr\") pod \"ovn-controller-metrics-ghft5\" (UID: \"e0f7675b-6614-4e41-86e6-364b7f04664e\") " pod="openstack/ovn-controller-metrics-ghft5" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.419228 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0f7675b-6614-4e41-86e6-364b7f04664e-combined-ca-bundle\") pod \"ovn-controller-metrics-ghft5\" (UID: \"e0f7675b-6614-4e41-86e6-364b7f04664e\") " pod="openstack/ovn-controller-metrics-ghft5" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.419274 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e0f7675b-6614-4e41-86e6-364b7f04664e-ovn-rundir\") pod \"ovn-controller-metrics-ghft5\" (UID: \"e0f7675b-6614-4e41-86e6-364b7f04664e\") " pod="openstack/ovn-controller-metrics-ghft5" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.419402 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0f7675b-6614-4e41-86e6-364b7f04664e-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-ghft5\" (UID: \"e0f7675b-6614-4e41-86e6-364b7f04664e\") " pod="openstack/ovn-controller-metrics-ghft5" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.421369 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e0f7675b-6614-4e41-86e6-364b7f04664e-ovs-rundir\") pod \"ovn-controller-metrics-ghft5\" (UID: \"e0f7675b-6614-4e41-86e6-364b7f04664e\") " pod="openstack/ovn-controller-metrics-ghft5" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.423139 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0f7675b-6614-4e41-86e6-364b7f04664e-config\") pod \"ovn-controller-metrics-ghft5\" (UID: \"e0f7675b-6614-4e41-86e6-364b7f04664e\") " pod="openstack/ovn-controller-metrics-ghft5" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.423287 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e0f7675b-6614-4e41-86e6-364b7f04664e-ovn-rundir\") pod \"ovn-controller-metrics-ghft5\" (UID: \"e0f7675b-6614-4e41-86e6-364b7f04664e\") " pod="openstack/ovn-controller-metrics-ghft5" Dec 09 10:19:49 crc 
kubenswrapper[5002]: I1209 10:19:49.427530 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0f7675b-6614-4e41-86e6-364b7f04664e-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-ghft5\" (UID: \"e0f7675b-6614-4e41-86e6-364b7f04664e\") " pod="openstack/ovn-controller-metrics-ghft5" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.440288 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0f7675b-6614-4e41-86e6-364b7f04664e-combined-ca-bundle\") pod \"ovn-controller-metrics-ghft5\" (UID: \"e0f7675b-6614-4e41-86e6-364b7f04664e\") " pod="openstack/ovn-controller-metrics-ghft5" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.443302 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-52q77"] Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.444946 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-52q77" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.451692 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.455473 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxmdr\" (UniqueName: \"kubernetes.io/projected/e0f7675b-6614-4e41-86e6-364b7f04664e-kube-api-access-mxmdr\") pod \"ovn-controller-metrics-ghft5\" (UID: \"e0f7675b-6614-4e41-86e6-364b7f04664e\") " pod="openstack/ovn-controller-metrics-ghft5" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.488062 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-52q77"] Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.525451 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lkbx\" (UniqueName: \"kubernetes.io/projected/b68e0176-a28e-4b3b-ad77-28d6f5f943f8-kube-api-access-7lkbx\") pod \"dnsmasq-dns-7fd796d7df-52q77\" (UID: \"b68e0176-a28e-4b3b-ad77-28d6f5f943f8\") " pod="openstack/dnsmasq-dns-7fd796d7df-52q77" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.527226 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b68e0176-a28e-4b3b-ad77-28d6f5f943f8-config\") pod \"dnsmasq-dns-7fd796d7df-52q77\" (UID: \"b68e0176-a28e-4b3b-ad77-28d6f5f943f8\") " pod="openstack/dnsmasq-dns-7fd796d7df-52q77" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.527312 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b68e0176-a28e-4b3b-ad77-28d6f5f943f8-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-52q77\" (UID: \"b68e0176-a28e-4b3b-ad77-28d6f5f943f8\") " pod="openstack/dnsmasq-dns-7fd796d7df-52q77" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.527338 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b68e0176-a28e-4b3b-ad77-28d6f5f943f8-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-52q77\" (UID: \"b68e0176-a28e-4b3b-ad77-28d6f5f943f8\") " pod="openstack/dnsmasq-dns-7fd796d7df-52q77" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.559012 5002 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/dnsmasq-dns-666b6646f7-vqhpp"] Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.590369 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-ghft5" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.592212 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-8n4dc"] Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.593865 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-8n4dc" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.599921 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.629824 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdd2q\" (UniqueName: \"kubernetes.io/projected/0471984e-6e43-485b-8ae1-cd7be7951b89-kube-api-access-vdd2q\") pod \"dnsmasq-dns-86db49b7ff-8n4dc\" (UID: \"0471984e-6e43-485b-8ae1-cd7be7951b89\") " pod="openstack/dnsmasq-dns-86db49b7ff-8n4dc" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.629890 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0471984e-6e43-485b-8ae1-cd7be7951b89-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-8n4dc\" (UID: \"0471984e-6e43-485b-8ae1-cd7be7951b89\") " pod="openstack/dnsmasq-dns-86db49b7ff-8n4dc" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.629931 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0471984e-6e43-485b-8ae1-cd7be7951b89-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-8n4dc\" (UID: \"0471984e-6e43-485b-8ae1-cd7be7951b89\") " pod="openstack/dnsmasq-dns-86db49b7ff-8n4dc" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.629979 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b68e0176-a28e-4b3b-ad77-28d6f5f943f8-config\") pod \"dnsmasq-dns-7fd796d7df-52q77\" (UID: \"b68e0176-a28e-4b3b-ad77-28d6f5f943f8\") " pod="openstack/dnsmasq-dns-7fd796d7df-52q77" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.634106 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-8n4dc"] Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.634612 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b68e0176-a28e-4b3b-ad77-28d6f5f943f8-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-52q77\" (UID: \"b68e0176-a28e-4b3b-ad77-28d6f5f943f8\") " pod="openstack/dnsmasq-dns-7fd796d7df-52q77" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.634668 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b68e0176-a28e-4b3b-ad77-28d6f5f943f8-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-52q77\" (UID: \"b68e0176-a28e-4b3b-ad77-28d6f5f943f8\") " pod="openstack/dnsmasq-dns-7fd796d7df-52q77" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.634774 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0471984e-6e43-485b-8ae1-cd7be7951b89-config\") pod 
\"dnsmasq-dns-86db49b7ff-8n4dc\" (UID: \"0471984e-6e43-485b-8ae1-cd7be7951b89\") " pod="openstack/dnsmasq-dns-86db49b7ff-8n4dc" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.634867 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0471984e-6e43-485b-8ae1-cd7be7951b89-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-8n4dc\" (UID: \"0471984e-6e43-485b-8ae1-cd7be7951b89\") " pod="openstack/dnsmasq-dns-86db49b7ff-8n4dc" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.634938 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lkbx\" (UniqueName: \"kubernetes.io/projected/b68e0176-a28e-4b3b-ad77-28d6f5f943f8-kube-api-access-7lkbx\") pod \"dnsmasq-dns-7fd796d7df-52q77\" (UID: \"b68e0176-a28e-4b3b-ad77-28d6f5f943f8\") " pod="openstack/dnsmasq-dns-7fd796d7df-52q77" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.636897 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b68e0176-a28e-4b3b-ad77-28d6f5f943f8-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-52q77\" (UID: \"b68e0176-a28e-4b3b-ad77-28d6f5f943f8\") " pod="openstack/dnsmasq-dns-7fd796d7df-52q77" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.637084 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b68e0176-a28e-4b3b-ad77-28d6f5f943f8-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-52q77\" (UID: \"b68e0176-a28e-4b3b-ad77-28d6f5f943f8\") " pod="openstack/dnsmasq-dns-7fd796d7df-52q77" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.637302 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b68e0176-a28e-4b3b-ad77-28d6f5f943f8-config\") pod \"dnsmasq-dns-7fd796d7df-52q77\" (UID: \"b68e0176-a28e-4b3b-ad77-28d6f5f943f8\") " pod="openstack/dnsmasq-dns-7fd796d7df-52q77" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.676393 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lkbx\" (UniqueName: \"kubernetes.io/projected/b68e0176-a28e-4b3b-ad77-28d6f5f943f8-kube-api-access-7lkbx\") pod \"dnsmasq-dns-7fd796d7df-52q77\" (UID: \"b68e0176-a28e-4b3b-ad77-28d6f5f943f8\") " pod="openstack/dnsmasq-dns-7fd796d7df-52q77" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.740911 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0471984e-6e43-485b-8ae1-cd7be7951b89-config\") pod \"dnsmasq-dns-86db49b7ff-8n4dc\" (UID: \"0471984e-6e43-485b-8ae1-cd7be7951b89\") " pod="openstack/dnsmasq-dns-86db49b7ff-8n4dc" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.740987 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0471984e-6e43-485b-8ae1-cd7be7951b89-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-8n4dc\" (UID: \"0471984e-6e43-485b-8ae1-cd7be7951b89\") " pod="openstack/dnsmasq-dns-86db49b7ff-8n4dc" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.741054 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdd2q\" (UniqueName: \"kubernetes.io/projected/0471984e-6e43-485b-8ae1-cd7be7951b89-kube-api-access-vdd2q\") pod \"dnsmasq-dns-86db49b7ff-8n4dc\" (UID: \"0471984e-6e43-485b-8ae1-cd7be7951b89\") " 
pod="openstack/dnsmasq-dns-86db49b7ff-8n4dc" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.741078 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0471984e-6e43-485b-8ae1-cd7be7951b89-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-8n4dc\" (UID: \"0471984e-6e43-485b-8ae1-cd7be7951b89\") " pod="openstack/dnsmasq-dns-86db49b7ff-8n4dc" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.741106 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0471984e-6e43-485b-8ae1-cd7be7951b89-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-8n4dc\" (UID: \"0471984e-6e43-485b-8ae1-cd7be7951b89\") " pod="openstack/dnsmasq-dns-86db49b7ff-8n4dc" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.742007 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0471984e-6e43-485b-8ae1-cd7be7951b89-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-8n4dc\" (UID: \"0471984e-6e43-485b-8ae1-cd7be7951b89\") " pod="openstack/dnsmasq-dns-86db49b7ff-8n4dc" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.742032 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0471984e-6e43-485b-8ae1-cd7be7951b89-config\") pod \"dnsmasq-dns-86db49b7ff-8n4dc\" (UID: \"0471984e-6e43-485b-8ae1-cd7be7951b89\") " pod="openstack/dnsmasq-dns-86db49b7ff-8n4dc" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.742235 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0471984e-6e43-485b-8ae1-cd7be7951b89-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-8n4dc\" (UID: \"0471984e-6e43-485b-8ae1-cd7be7951b89\") " pod="openstack/dnsmasq-dns-86db49b7ff-8n4dc" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.742790 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0471984e-6e43-485b-8ae1-cd7be7951b89-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-8n4dc\" (UID: \"0471984e-6e43-485b-8ae1-cd7be7951b89\") " pod="openstack/dnsmasq-dns-86db49b7ff-8n4dc" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.760481 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdd2q\" (UniqueName: \"kubernetes.io/projected/0471984e-6e43-485b-8ae1-cd7be7951b89-kube-api-access-vdd2q\") pod \"dnsmasq-dns-86db49b7ff-8n4dc\" (UID: \"0471984e-6e43-485b-8ae1-cd7be7951b89\") " pod="openstack/dnsmasq-dns-86db49b7ff-8n4dc" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.814909 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-52q77" Dec 09 10:19:49 crc kubenswrapper[5002]: I1209 10:19:49.928179 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-8n4dc" Dec 09 10:19:51 crc kubenswrapper[5002]: I1209 10:19:51.016762 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-5hpxc" Dec 09 10:19:51 crc kubenswrapper[5002]: I1209 10:19:51.022618 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-vqhpp" Dec 09 10:19:51 crc kubenswrapper[5002]: I1209 10:19:51.063457 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de34f23b-efc3-41e6-a76c-04bfe4eb6494-dns-svc\") pod \"de34f23b-efc3-41e6-a76c-04bfe4eb6494\" (UID: \"de34f23b-efc3-41e6-a76c-04bfe4eb6494\") " Dec 09 10:19:51 crc kubenswrapper[5002]: I1209 10:19:51.063725 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de34f23b-efc3-41e6-a76c-04bfe4eb6494-config\") pod \"de34f23b-efc3-41e6-a76c-04bfe4eb6494\" (UID: \"de34f23b-efc3-41e6-a76c-04bfe4eb6494\") " Dec 09 10:19:51 crc kubenswrapper[5002]: I1209 10:19:51.063750 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plptv\" (UniqueName: \"kubernetes.io/projected/2db56b85-ed32-480a-a20d-0610a73a1bae-kube-api-access-plptv\") pod \"2db56b85-ed32-480a-a20d-0610a73a1bae\" (UID: \"2db56b85-ed32-480a-a20d-0610a73a1bae\") " Dec 09 10:19:51 crc kubenswrapper[5002]: I1209 10:19:51.063863 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2db56b85-ed32-480a-a20d-0610a73a1bae-dns-svc\") pod \"2db56b85-ed32-480a-a20d-0610a73a1bae\" (UID: \"2db56b85-ed32-480a-a20d-0610a73a1bae\") " Dec 09 10:19:51 crc kubenswrapper[5002]: I1209 10:19:51.063898 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72qmh\" (UniqueName: \"kubernetes.io/projected/de34f23b-efc3-41e6-a76c-04bfe4eb6494-kube-api-access-72qmh\") pod \"de34f23b-efc3-41e6-a76c-04bfe4eb6494\" (UID: \"de34f23b-efc3-41e6-a76c-04bfe4eb6494\") " Dec 09 10:19:51 crc kubenswrapper[5002]: I1209 10:19:51.063968 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2db56b85-ed32-480a-a20d-0610a73a1bae-config\") pod \"2db56b85-ed32-480a-a20d-0610a73a1bae\" (UID: \"2db56b85-ed32-480a-a20d-0610a73a1bae\") " Dec 09 10:19:51 crc kubenswrapper[5002]: I1209 10:19:51.065021 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de34f23b-efc3-41e6-a76c-04bfe4eb6494-config" (OuterVolumeSpecName: "config") pod "de34f23b-efc3-41e6-a76c-04bfe4eb6494" (UID: "de34f23b-efc3-41e6-a76c-04bfe4eb6494"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:19:51 crc kubenswrapper[5002]: I1209 10:19:51.065234 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de34f23b-efc3-41e6-a76c-04bfe4eb6494-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "de34f23b-efc3-41e6-a76c-04bfe4eb6494" (UID: "de34f23b-efc3-41e6-a76c-04bfe4eb6494"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:19:51 crc kubenswrapper[5002]: I1209 10:19:51.065426 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2db56b85-ed32-480a-a20d-0610a73a1bae-config" (OuterVolumeSpecName: "config") pod "2db56b85-ed32-480a-a20d-0610a73a1bae" (UID: "2db56b85-ed32-480a-a20d-0610a73a1bae"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:19:51 crc kubenswrapper[5002]: I1209 10:19:51.069767 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de34f23b-efc3-41e6-a76c-04bfe4eb6494-kube-api-access-72qmh" (OuterVolumeSpecName: "kube-api-access-72qmh") pod "de34f23b-efc3-41e6-a76c-04bfe4eb6494" (UID: "de34f23b-efc3-41e6-a76c-04bfe4eb6494"). InnerVolumeSpecName "kube-api-access-72qmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:19:51 crc kubenswrapper[5002]: I1209 10:19:51.072663 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2db56b85-ed32-480a-a20d-0610a73a1bae-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2db56b85-ed32-480a-a20d-0610a73a1bae" (UID: "2db56b85-ed32-480a-a20d-0610a73a1bae"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:19:51 crc kubenswrapper[5002]: I1209 10:19:51.082076 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2db56b85-ed32-480a-a20d-0610a73a1bae-kube-api-access-plptv" (OuterVolumeSpecName: "kube-api-access-plptv") pod "2db56b85-ed32-480a-a20d-0610a73a1bae" (UID: "2db56b85-ed32-480a-a20d-0610a73a1bae"). InnerVolumeSpecName "kube-api-access-plptv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:19:51 crc kubenswrapper[5002]: I1209 10:19:51.166176 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2db56b85-ed32-480a-a20d-0610a73a1bae-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:19:51 crc kubenswrapper[5002]: I1209 10:19:51.166217 5002 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de34f23b-efc3-41e6-a76c-04bfe4eb6494-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 10:19:51 crc kubenswrapper[5002]: I1209 10:19:51.166232 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de34f23b-efc3-41e6-a76c-04bfe4eb6494-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:19:51 crc kubenswrapper[5002]: I1209 10:19:51.166247 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plptv\" (UniqueName: \"kubernetes.io/projected/2db56b85-ed32-480a-a20d-0610a73a1bae-kube-api-access-plptv\") on node \"crc\" DevicePath \"\"" Dec 09 10:19:51 crc kubenswrapper[5002]: I1209 10:19:51.166267 5002 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2db56b85-ed32-480a-a20d-0610a73a1bae-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 10:19:51 crc kubenswrapper[5002]: I1209 10:19:51.166281 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72qmh\" (UniqueName: \"kubernetes.io/projected/de34f23b-efc3-41e6-a76c-04bfe4eb6494-kube-api-access-72qmh\") on node \"crc\" DevicePath \"\"" Dec 09 10:19:51 crc kubenswrapper[5002]: I1209 10:19:51.192078 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-5hpxc" Dec 09 10:19:51 crc kubenswrapper[5002]: I1209 10:19:51.192071 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-5hpxc" event={"ID":"2db56b85-ed32-480a-a20d-0610a73a1bae","Type":"ContainerDied","Data":"7135308b31e246dde4c9dcb4ddbe526e2e130484d907dba4d37a71bb8069c007"} Dec 09 10:19:51 crc kubenswrapper[5002]: I1209 10:19:51.196287 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-vqhpp" event={"ID":"de34f23b-efc3-41e6-a76c-04bfe4eb6494","Type":"ContainerDied","Data":"69209a3bb57885baed912f0518158a0b1a500d5f5baf9536ba4fde8dee9555d4"} Dec 09 10:19:51 crc kubenswrapper[5002]: I1209 10:19:51.196378 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-vqhpp" Dec 09 10:19:51 crc kubenswrapper[5002]: I1209 10:19:51.273615 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-vqhpp"] Dec 09 10:19:51 crc kubenswrapper[5002]: I1209 10:19:51.281728 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-vqhpp"] Dec 09 10:19:51 crc kubenswrapper[5002]: I1209 10:19:51.294757 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-5hpxc"] Dec 09 10:19:51 crc kubenswrapper[5002]: I1209 10:19:51.300137 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-5hpxc"] Dec 09 10:19:52 crc kubenswrapper[5002]: I1209 10:19:52.053562 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-52q77"] Dec 09 10:19:52 crc kubenswrapper[5002]: I1209 10:19:52.071457 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2db56b85-ed32-480a-a20d-0610a73a1bae" path="/var/lib/kubelet/pods/2db56b85-ed32-480a-a20d-0610a73a1bae/volumes" Dec 09 10:19:52 crc kubenswrapper[5002]: I1209 10:19:52.072683 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de34f23b-efc3-41e6-a76c-04bfe4eb6494" path="/var/lib/kubelet/pods/de34f23b-efc3-41e6-a76c-04bfe4eb6494/volumes" Dec 09 10:19:52 crc kubenswrapper[5002]: I1209 10:19:52.155342 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-8n4dc"] Dec 09 10:19:52 crc kubenswrapper[5002]: I1209 10:19:52.160949 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-ghft5"] Dec 09 10:19:52 crc kubenswrapper[5002]: I1209 10:19:52.205078 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-g4kc8" event={"ID":"26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6","Type":"ContainerStarted","Data":"23d2c226434058f19c40b285671296768eb145dc5740c1444ad73c806ee85ca4"} Dec 09 10:19:52 crc kubenswrapper[5002]: W1209 10:19:52.389589 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0f7675b_6614_4e41_86e6_364b7f04664e.slice/crio-ca41310247f17d665549977b5992bdf6d31b44886b6d9cc390d984e83a71300c WatchSource:0}: Error finding container ca41310247f17d665549977b5992bdf6d31b44886b6d9cc390d984e83a71300c: Status 404 returned error can't find the container with id ca41310247f17d665549977b5992bdf6d31b44886b6d9cc390d984e83a71300c Dec 09 10:19:52 crc kubenswrapper[5002]: W1209 10:19:52.394000 5002 manager.go:1169] Failed to process watch event {EventType:0 
Dec 09 10:19:52 crc kubenswrapper[5002]: W1209 10:19:52.394000 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0471984e_6e43_485b_8ae1_cd7be7951b89.slice/crio-0d7663bb9ceca220b586c60cc11077984640349dd2311f7ec0d5bc426736c709 WatchSource:0}: Error finding container 0d7663bb9ceca220b586c60cc11077984640349dd2311f7ec0d5bc426736c709: Status 404 returned error can't find the container with id 0d7663bb9ceca220b586c60cc11077984640349dd2311f7ec0d5bc426736c709
Dec 09 10:19:52 crc kubenswrapper[5002]: W1209 10:19:52.417064 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb68e0176_a28e_4b3b_ad77_28d6f5f943f8.slice/crio-a78113cfaa8618718893e21e7f955e97ffccef7dc23e3f940de10e0df8765156 WatchSource:0}: Error finding container a78113cfaa8618718893e21e7f955e97ffccef7dc23e3f940de10e0df8765156: Status 404 returned error can't find the container with id a78113cfaa8618718893e21e7f955e97ffccef7dc23e3f940de10e0df8765156
Dec 09 10:19:53 crc kubenswrapper[5002]: I1209 10:19:53.217242 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"28cb84ad-b399-4fe4-9631-e481dfa75aed","Type":"ContainerStarted","Data":"2215494876baf67d40bfc6391dc6cc221f9e14b2fc38cc62efc7ad13c22f507b"}
Dec 09 10:19:53 crc kubenswrapper[5002]: I1209 10:19:53.220121 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-8n4dc" event={"ID":"0471984e-6e43-485b-8ae1-cd7be7951b89","Type":"ContainerStarted","Data":"0d7663bb9ceca220b586c60cc11077984640349dd2311f7ec0d5bc426736c709"}
Dec 09 10:19:53 crc kubenswrapper[5002]: I1209 10:19:53.223140 5002 generic.go:334] "Generic (PLEG): container finished" podID="26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6" containerID="23d2c226434058f19c40b285671296768eb145dc5740c1444ad73c806ee85ca4" exitCode=0
Dec 09 10:19:53 crc kubenswrapper[5002]: I1209 10:19:53.223203 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-g4kc8" event={"ID":"26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6","Type":"ContainerDied","Data":"23d2c226434058f19c40b285671296768eb145dc5740c1444ad73c806ee85ca4"}
Dec 09 10:19:53 crc kubenswrapper[5002]: I1209 10:19:53.226917 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"1e36e954-d9c1-41e3-8542-e8f300db90cb","Type":"ContainerStarted","Data":"b1619efb43e58f8e6a06eb0439aba518ccc393a1be761ea167d429fe1d32c4c8"}
Dec 09 10:19:53 crc kubenswrapper[5002]: I1209 10:19:53.227073 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Dec 09 10:19:53 crc kubenswrapper[5002]: I1209 10:19:53.232165 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-ghft5" event={"ID":"e0f7675b-6614-4e41-86e6-364b7f04664e","Type":"ContainerStarted","Data":"ca41310247f17d665549977b5992bdf6d31b44886b6d9cc390d984e83a71300c"}
Dec 09 10:19:53 crc kubenswrapper[5002]: I1209 10:19:53.233745 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"7faabd78-c9ab-4397-aa4d-b8aaff302251","Type":"ContainerStarted","Data":"a8f939cbf0cbdd995c3c995a86e97bc03e580217c614bac537583c6bbc3bbf65"}
Dec 09 10:19:53 crc kubenswrapper[5002]: I1209 10:19:53.252185 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"322c0304-1696-43fb-9225-a709e7e2ea89","Type":"ContainerStarted","Data":"48d09c2ebf2544131b6474f56670b0b8781e9f927fc4903fb00c40fed41a9050"}
Dec 09 10:19:53 crc kubenswrapper[5002]: I1209 10:19:53.254678 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"d100f321-6fe6-4eb3-a00c-50b9ff5e2861","Type":"ContainerStarted","Data":"c4c4beba225ea5afddb1b6621102042b3f4df2ad80569272c3662253f7a03703"}
Dec 09 10:19:53 crc kubenswrapper[5002]: I1209 10:19:53.262252 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-47b4k" event={"ID":"fdaeef31-a8f8-478a-86b0-4d0126eb7f3a","Type":"ContainerStarted","Data":"ee551edb8c3c440c83f6a20492db39ad0c2a16f4443309c6c4e6687cc8b138cf"}
Dec 09 10:19:53 crc kubenswrapper[5002]: I1209 10:19:53.262388 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-47b4k"
Dec 09 10:19:53 crc kubenswrapper[5002]: I1209 10:19:53.264200 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-52q77" event={"ID":"b68e0176-a28e-4b3b-ad77-28d6f5f943f8","Type":"ContainerStarted","Data":"a78113cfaa8618718893e21e7f955e97ffccef7dc23e3f940de10e0df8765156"}
Dec 09 10:19:53 crc kubenswrapper[5002]: I1209 10:19:53.317907 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=24.709300657 podStartE2EDuration="32.317891427s" podCreationTimestamp="2025-12-09 10:19:21 +0000 UTC" firstStartedPulling="2025-12-09 10:19:43.934386529 +0000 UTC m=+1116.326437610" lastFinishedPulling="2025-12-09 10:19:51.542977289 +0000 UTC m=+1123.935028380" observedRunningTime="2025-12-09 10:19:53.292430111 +0000 UTC m=+1125.684481252" watchObservedRunningTime="2025-12-09 10:19:53.317891427 +0000 UTC m=+1125.709942508"
Dec 09 10:19:53 crc kubenswrapper[5002]: I1209 10:19:53.349603 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-47b4k" podStartSLOduration=17.577598768 podStartE2EDuration="25.349588741s" podCreationTimestamp="2025-12-09 10:19:28 +0000 UTC" firstStartedPulling="2025-12-09 10:19:44.081637087 +0000 UTC m=+1116.473688168" lastFinishedPulling="2025-12-09 10:19:51.85362706 +0000 UTC m=+1124.245678141" observedRunningTime="2025-12-09 10:19:53.3406577 +0000 UTC m=+1125.732708781" watchObservedRunningTime="2025-12-09 10:19:53.349588741 +0000 UTC m=+1125.741639812"
Dec 09 10:19:54 crc kubenswrapper[5002]: I1209 10:19:54.306036 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-g4kc8" event={"ID":"26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6","Type":"ContainerStarted","Data":"3f18346c6d45cdce8933113ee6ff0f64d79183a978ac856ba561f2eb32009782"}
Dec 09 10:19:54 crc kubenswrapper[5002]: I1209 10:19:54.306509 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-g4kc8"
Dec 09 10:19:54 crc kubenswrapper[5002]: I1209 10:19:54.306526 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-g4kc8"
Dec 09 10:19:54 crc kubenswrapper[5002]: I1209 10:19:54.306533 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-g4kc8" event={"ID":"26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6","Type":"ContainerStarted","Data":"5c2d2c6137f09a0249a92e685f777a88211e7a367963f717a3fe59c6940a69a5"}
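[Editor's sketch, not part of the journal.] The pod_startup_latency_tracker entries above encode a simple relationship: podStartE2EDuration is observedRunningTime minus podCreationTimestamp, and podStartSLOduration is the E2E duration with the image-pull window (lastFinishedPulling minus firstStartedPulling) subtracted. Reproducing the memcached-0 numbers from the 10:19:53.317907 entry, using its monotonic m=+ stamps:

    e2e = 32.317891427           # podStartE2EDuration from the log entry
    pull_start = 1116.326437610  # firstStartedPulling, m=+ seconds
    pull_end = 1123.935028380    # lastFinishedPulling, m=+ seconds
    slo = e2e - (pull_end - pull_start)
    print(f"{slo:.9f}")          # ~24.709300657, matching podStartSLOduration

So memcached-0 spent roughly 7.6 of its 32.3 seconds pulling images, and only the remaining ~24.7 s counts against the startup SLO.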
event={"ID":"3fdef05d-e63e-48b6-8d88-6a374294940e","Type":"ContainerStarted","Data":"12da1110ed3b978094ec24674980dfa5543d04422512d3f4ef3ff7e753242848"} Dec 09 10:19:54 crc kubenswrapper[5002]: I1209 10:19:54.310667 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 09 10:19:54 crc kubenswrapper[5002]: I1209 10:19:54.312877 5002 generic.go:334] "Generic (PLEG): container finished" podID="0471984e-6e43-485b-8ae1-cd7be7951b89" containerID="2eb60917a5ad3055737c86a738d3e854d369bc3db6eaace9f147dbb05a1e54b5" exitCode=0 Dec 09 10:19:54 crc kubenswrapper[5002]: I1209 10:19:54.312942 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-8n4dc" event={"ID":"0471984e-6e43-485b-8ae1-cd7be7951b89","Type":"ContainerDied","Data":"2eb60917a5ad3055737c86a738d3e854d369bc3db6eaace9f147dbb05a1e54b5"} Dec 09 10:19:54 crc kubenswrapper[5002]: I1209 10:19:54.315692 5002 generic.go:334] "Generic (PLEG): container finished" podID="b68e0176-a28e-4b3b-ad77-28d6f5f943f8" containerID="4872faca91228f6786252a7cf045fdb023f887e08a21fae602fc4c1d8ea6b323" exitCode=0 Dec 09 10:19:54 crc kubenswrapper[5002]: I1209 10:19:54.316733 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-52q77" event={"ID":"b68e0176-a28e-4b3b-ad77-28d6f5f943f8","Type":"ContainerDied","Data":"4872faca91228f6786252a7cf045fdb023f887e08a21fae602fc4c1d8ea6b323"} Dec 09 10:19:54 crc kubenswrapper[5002]: I1209 10:19:54.333309 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-g4kc8" podStartSLOduration=19.317514011 podStartE2EDuration="26.333282438s" podCreationTimestamp="2025-12-09 10:19:28 +0000 UTC" firstStartedPulling="2025-12-09 10:19:44.525034584 +0000 UTC m=+1116.917085675" lastFinishedPulling="2025-12-09 10:19:51.540803021 +0000 UTC m=+1123.932854102" observedRunningTime="2025-12-09 10:19:54.331619153 +0000 UTC m=+1126.723670254" watchObservedRunningTime="2025-12-09 10:19:54.333282438 +0000 UTC m=+1126.725333519" Dec 09 10:19:54 crc kubenswrapper[5002]: I1209 10:19:54.374433 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=23.51087703 podStartE2EDuration="32.374413156s" podCreationTimestamp="2025-12-09 10:19:22 +0000 UTC" firstStartedPulling="2025-12-09 10:19:44.072864451 +0000 UTC m=+1116.464915532" lastFinishedPulling="2025-12-09 10:19:52.936400577 +0000 UTC m=+1125.328451658" observedRunningTime="2025-12-09 10:19:54.367795038 +0000 UTC m=+1126.759846129" watchObservedRunningTime="2025-12-09 10:19:54.374413156 +0000 UTC m=+1126.766464237" Dec 09 10:19:56 crc kubenswrapper[5002]: I1209 10:19:56.334742 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-52q77" event={"ID":"b68e0176-a28e-4b3b-ad77-28d6f5f943f8","Type":"ContainerStarted","Data":"e7abda2952182e01e3daf7a0c60defb86e901a049f11ffa5d03026700505c1d4"} Dec 09 10:19:56 crc kubenswrapper[5002]: I1209 10:19:56.335097 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fd796d7df-52q77" Dec 09 10:19:56 crc kubenswrapper[5002]: I1209 10:19:56.337454 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"58c08274-46ea-48be-a135-0c1174cd6135","Type":"ContainerStarted","Data":"700088876c2e92d617598571a2ba75be5d5b0ca2bdb88d2688fcecc1a5db9a68"} Dec 09 10:19:56 crc kubenswrapper[5002]: I1209 
10:19:56.339642 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-8n4dc" event={"ID":"0471984e-6e43-485b-8ae1-cd7be7951b89","Type":"ContainerStarted","Data":"3533e415fc3b60a01fbf71e5712338dcea984abc003f940dc73c43c941049ff3"} Dec 09 10:19:56 crc kubenswrapper[5002]: I1209 10:19:56.340031 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-8n4dc" Dec 09 10:19:56 crc kubenswrapper[5002]: I1209 10:19:56.361453 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fd796d7df-52q77" podStartSLOduration=6.717305121 podStartE2EDuration="7.361433428s" podCreationTimestamp="2025-12-09 10:19:49 +0000 UTC" firstStartedPulling="2025-12-09 10:19:52.42587005 +0000 UTC m=+1124.817921151" lastFinishedPulling="2025-12-09 10:19:53.069998377 +0000 UTC m=+1125.462049458" observedRunningTime="2025-12-09 10:19:56.353791812 +0000 UTC m=+1128.745842893" watchObservedRunningTime="2025-12-09 10:19:56.361433428 +0000 UTC m=+1128.753484509" Dec 09 10:19:56 crc kubenswrapper[5002]: I1209 10:19:56.401661 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-8n4dc" podStartSLOduration=6.737002391 podStartE2EDuration="7.40159554s" podCreationTimestamp="2025-12-09 10:19:49 +0000 UTC" firstStartedPulling="2025-12-09 10:19:52.406279092 +0000 UTC m=+1124.798330193" lastFinishedPulling="2025-12-09 10:19:53.070872261 +0000 UTC m=+1125.462923342" observedRunningTime="2025-12-09 10:19:56.400237254 +0000 UTC m=+1128.792288335" watchObservedRunningTime="2025-12-09 10:19:56.40159554 +0000 UTC m=+1128.793646621" Dec 09 10:19:58 crc kubenswrapper[5002]: I1209 10:19:58.358684 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"322c0304-1696-43fb-9225-a709e7e2ea89","Type":"ContainerStarted","Data":"61bf667d89aa459332a0bf66073b7adb7457f78a04b9772762f9e64fcf4753ad"} Dec 09 10:19:58 crc kubenswrapper[5002]: I1209 10:19:58.360737 5002 generic.go:334] "Generic (PLEG): container finished" podID="d100f321-6fe6-4eb3-a00c-50b9ff5e2861" containerID="c4c4beba225ea5afddb1b6621102042b3f4df2ad80569272c3662253f7a03703" exitCode=0 Dec 09 10:19:58 crc kubenswrapper[5002]: I1209 10:19:58.363803 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"d100f321-6fe6-4eb3-a00c-50b9ff5e2861","Type":"ContainerDied","Data":"c4c4beba225ea5afddb1b6621102042b3f4df2ad80569272c3662253f7a03703"} Dec 09 10:19:58 crc kubenswrapper[5002]: I1209 10:19:58.366482 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"28cb84ad-b399-4fe4-9631-e481dfa75aed","Type":"ContainerStarted","Data":"faa9bd08445de88d08d39878352fad3e34339b91a452f594ffc3a3e18843d443"} Dec 09 10:19:58 crc kubenswrapper[5002]: I1209 10:19:58.368878 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-ghft5" event={"ID":"e0f7675b-6614-4e41-86e6-364b7f04664e","Type":"ContainerStarted","Data":"9c44cdbba87ffe7a2f1b4ebc4e3392a164aba9181ae3fb15f68ca8ab66302c4d"} Dec 09 10:19:58 crc kubenswrapper[5002]: I1209 10:19:58.370938 5002 generic.go:334] "Generic (PLEG): container finished" podID="7faabd78-c9ab-4397-aa4d-b8aaff302251" containerID="a8f939cbf0cbdd995c3c995a86e97bc03e580217c614bac537583c6bbc3bbf65" exitCode=0 Dec 09 10:19:58 crc kubenswrapper[5002]: I1209 10:19:58.370978 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/openstack-galera-0" event={"ID":"7faabd78-c9ab-4397-aa4d-b8aaff302251","Type":"ContainerDied","Data":"a8f939cbf0cbdd995c3c995a86e97bc03e580217c614bac537583c6bbc3bbf65"} Dec 09 10:19:58 crc kubenswrapper[5002]: I1209 10:19:58.383528 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=19.303519091 podStartE2EDuration="32.383470484s" podCreationTimestamp="2025-12-09 10:19:26 +0000 UTC" firstStartedPulling="2025-12-09 10:19:44.419424538 +0000 UTC m=+1116.811475619" lastFinishedPulling="2025-12-09 10:19:57.499375931 +0000 UTC m=+1129.891427012" observedRunningTime="2025-12-09 10:19:58.382427146 +0000 UTC m=+1130.774478237" watchObservedRunningTime="2025-12-09 10:19:58.383470484 +0000 UTC m=+1130.775521565" Dec 09 10:19:58 crc kubenswrapper[5002]: I1209 10:19:58.413331 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=17.875924474 podStartE2EDuration="30.413314227s" podCreationTimestamp="2025-12-09 10:19:28 +0000 UTC" firstStartedPulling="2025-12-09 10:19:44.995297136 +0000 UTC m=+1117.387348217" lastFinishedPulling="2025-12-09 10:19:57.532686869 +0000 UTC m=+1129.924737970" observedRunningTime="2025-12-09 10:19:58.409365401 +0000 UTC m=+1130.801416482" watchObservedRunningTime="2025-12-09 10:19:58.413314227 +0000 UTC m=+1130.805365308" Dec 09 10:19:58 crc kubenswrapper[5002]: I1209 10:19:58.462695 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-ghft5" podStartSLOduration=4.349281812 podStartE2EDuration="9.462673358s" podCreationTimestamp="2025-12-09 10:19:49 +0000 UTC" firstStartedPulling="2025-12-09 10:19:52.40619279 +0000 UTC m=+1124.798243881" lastFinishedPulling="2025-12-09 10:19:57.519584316 +0000 UTC m=+1129.911635427" observedRunningTime="2025-12-09 10:19:58.436503522 +0000 UTC m=+1130.828554603" watchObservedRunningTime="2025-12-09 10:19:58.462673358 +0000 UTC m=+1130.854724439" Dec 09 10:19:59 crc kubenswrapper[5002]: I1209 10:19:59.382284 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"7faabd78-c9ab-4397-aa4d-b8aaff302251","Type":"ContainerStarted","Data":"6bf1ad080014bf0300fd0f26c245a94c77d984fb37e7f168c414b8d47ddbdbd5"} Dec 09 10:19:59 crc kubenswrapper[5002]: I1209 10:19:59.385585 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"d100f321-6fe6-4eb3-a00c-50b9ff5e2861","Type":"ContainerStarted","Data":"c364964ff05fc5d33bca8efbcb8e29d176a3c1b08131de3be926d3ea34e48ec9"} Dec 09 10:19:59 crc kubenswrapper[5002]: I1209 10:19:59.431441 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=32.598197592 podStartE2EDuration="41.431411641s" podCreationTimestamp="2025-12-09 10:19:18 +0000 UTC" firstStartedPulling="2025-12-09 10:19:42.67635293 +0000 UTC m=+1115.068404051" lastFinishedPulling="2025-12-09 10:19:51.509567019 +0000 UTC m=+1123.901618100" observedRunningTime="2025-12-09 10:19:59.418192485 +0000 UTC m=+1131.810243576" watchObservedRunningTime="2025-12-09 10:19:59.431411641 +0000 UTC m=+1131.823462762" Dec 09 10:19:59 crc kubenswrapper[5002]: I1209 10:19:59.452211 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=33.15198553 podStartE2EDuration="40.452191671s" podCreationTimestamp="2025-12-09 
10:19:19 +0000 UTC" firstStartedPulling="2025-12-09 10:19:44.341990502 +0000 UTC m=+1116.734041583" lastFinishedPulling="2025-12-09 10:19:51.642196643 +0000 UTC m=+1124.034247724" observedRunningTime="2025-12-09 10:19:59.445981524 +0000 UTC m=+1131.838032615" watchObservedRunningTime="2025-12-09 10:19:59.452191671 +0000 UTC m=+1131.844242762" Dec 09 10:19:59 crc kubenswrapper[5002]: I1209 10:19:59.533741 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 09 10:19:59 crc kubenswrapper[5002]: I1209 10:19:59.533791 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 09 10:19:59 crc kubenswrapper[5002]: I1209 10:19:59.582264 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 09 10:19:59 crc kubenswrapper[5002]: I1209 10:19:59.872799 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 09 10:19:59 crc kubenswrapper[5002]: I1209 10:19:59.872864 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 09 10:20:00 crc kubenswrapper[5002]: I1209 10:20:00.442367 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 09 10:20:00 crc kubenswrapper[5002]: I1209 10:20:00.573012 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 09 10:20:00 crc kubenswrapper[5002]: I1209 10:20:00.628954 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 09 10:20:01 crc kubenswrapper[5002]: I1209 10:20:01.352488 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 09 10:20:01 crc kubenswrapper[5002]: I1209 10:20:01.352538 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 09 10:20:01 crc kubenswrapper[5002]: I1209 10:20:01.400522 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 09 10:20:01 crc kubenswrapper[5002]: I1209 10:20:01.439275 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 09 10:20:01 crc kubenswrapper[5002]: I1209 10:20:01.597318 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 09 10:20:01 crc kubenswrapper[5002]: I1209 10:20:01.599538 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 09 10:20:01 crc kubenswrapper[5002]: I1209 10:20:01.604451 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 09 10:20:01 crc kubenswrapper[5002]: I1209 10:20:01.604668 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 09 10:20:01 crc kubenswrapper[5002]: I1209 10:20:01.608540 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-z77sw" Dec 09 10:20:01 crc kubenswrapper[5002]: I1209 10:20:01.608838 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 09 10:20:01 crc kubenswrapper[5002]: I1209 10:20:01.623325 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 09 10:20:01 crc kubenswrapper[5002]: I1209 10:20:01.650596 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/36fbd6d1-d87d-45a2-9bca-0f25f3daca0c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"36fbd6d1-d87d-45a2-9bca-0f25f3daca0c\") " pod="openstack/ovn-northd-0" Dec 09 10:20:01 crc kubenswrapper[5002]: I1209 10:20:01.650674 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36fbd6d1-d87d-45a2-9bca-0f25f3daca0c-config\") pod \"ovn-northd-0\" (UID: \"36fbd6d1-d87d-45a2-9bca-0f25f3daca0c\") " pod="openstack/ovn-northd-0" Dec 09 10:20:01 crc kubenswrapper[5002]: I1209 10:20:01.650934 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36fbd6d1-d87d-45a2-9bca-0f25f3daca0c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"36fbd6d1-d87d-45a2-9bca-0f25f3daca0c\") " pod="openstack/ovn-northd-0" Dec 09 10:20:01 crc kubenswrapper[5002]: I1209 10:20:01.651035 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swmdj\" (UniqueName: \"kubernetes.io/projected/36fbd6d1-d87d-45a2-9bca-0f25f3daca0c-kube-api-access-swmdj\") pod \"ovn-northd-0\" (UID: \"36fbd6d1-d87d-45a2-9bca-0f25f3daca0c\") " pod="openstack/ovn-northd-0" Dec 09 10:20:01 crc kubenswrapper[5002]: I1209 10:20:01.651094 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/36fbd6d1-d87d-45a2-9bca-0f25f3daca0c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"36fbd6d1-d87d-45a2-9bca-0f25f3daca0c\") " pod="openstack/ovn-northd-0" Dec 09 10:20:01 crc kubenswrapper[5002]: I1209 10:20:01.651373 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/36fbd6d1-d87d-45a2-9bca-0f25f3daca0c-scripts\") pod \"ovn-northd-0\" (UID: \"36fbd6d1-d87d-45a2-9bca-0f25f3daca0c\") " pod="openstack/ovn-northd-0" Dec 09 10:20:01 crc kubenswrapper[5002]: I1209 10:20:01.651525 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/36fbd6d1-d87d-45a2-9bca-0f25f3daca0c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"36fbd6d1-d87d-45a2-9bca-0f25f3daca0c\") " pod="openstack/ovn-northd-0" Dec 09 10:20:01 crc kubenswrapper[5002]: 
I1209 10:20:01.689713 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 09 10:20:01 crc kubenswrapper[5002]: I1209 10:20:01.752532 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36fbd6d1-d87d-45a2-9bca-0f25f3daca0c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"36fbd6d1-d87d-45a2-9bca-0f25f3daca0c\") " pod="openstack/ovn-northd-0" Dec 09 10:20:01 crc kubenswrapper[5002]: I1209 10:20:01.752590 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swmdj\" (UniqueName: \"kubernetes.io/projected/36fbd6d1-d87d-45a2-9bca-0f25f3daca0c-kube-api-access-swmdj\") pod \"ovn-northd-0\" (UID: \"36fbd6d1-d87d-45a2-9bca-0f25f3daca0c\") " pod="openstack/ovn-northd-0" Dec 09 10:20:01 crc kubenswrapper[5002]: I1209 10:20:01.752614 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/36fbd6d1-d87d-45a2-9bca-0f25f3daca0c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"36fbd6d1-d87d-45a2-9bca-0f25f3daca0c\") " pod="openstack/ovn-northd-0" Dec 09 10:20:01 crc kubenswrapper[5002]: I1209 10:20:01.752696 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/36fbd6d1-d87d-45a2-9bca-0f25f3daca0c-scripts\") pod \"ovn-northd-0\" (UID: \"36fbd6d1-d87d-45a2-9bca-0f25f3daca0c\") " pod="openstack/ovn-northd-0" Dec 09 10:20:01 crc kubenswrapper[5002]: I1209 10:20:01.752753 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/36fbd6d1-d87d-45a2-9bca-0f25f3daca0c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"36fbd6d1-d87d-45a2-9bca-0f25f3daca0c\") " pod="openstack/ovn-northd-0" Dec 09 10:20:01 crc kubenswrapper[5002]: I1209 10:20:01.752785 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/36fbd6d1-d87d-45a2-9bca-0f25f3daca0c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"36fbd6d1-d87d-45a2-9bca-0f25f3daca0c\") " pod="openstack/ovn-northd-0" Dec 09 10:20:01 crc kubenswrapper[5002]: I1209 10:20:01.752803 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36fbd6d1-d87d-45a2-9bca-0f25f3daca0c-config\") pod \"ovn-northd-0\" (UID: \"36fbd6d1-d87d-45a2-9bca-0f25f3daca0c\") " pod="openstack/ovn-northd-0" Dec 09 10:20:01 crc kubenswrapper[5002]: I1209 10:20:01.753593 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/36fbd6d1-d87d-45a2-9bca-0f25f3daca0c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"36fbd6d1-d87d-45a2-9bca-0f25f3daca0c\") " pod="openstack/ovn-northd-0" Dec 09 10:20:01 crc kubenswrapper[5002]: I1209 10:20:01.753663 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36fbd6d1-d87d-45a2-9bca-0f25f3daca0c-config\") pod \"ovn-northd-0\" (UID: \"36fbd6d1-d87d-45a2-9bca-0f25f3daca0c\") " pod="openstack/ovn-northd-0" Dec 09 10:20:01 crc kubenswrapper[5002]: I1209 10:20:01.753711 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/36fbd6d1-d87d-45a2-9bca-0f25f3daca0c-scripts\") pod 
\"ovn-northd-0\" (UID: \"36fbd6d1-d87d-45a2-9bca-0f25f3daca0c\") " pod="openstack/ovn-northd-0" Dec 09 10:20:01 crc kubenswrapper[5002]: I1209 10:20:01.758775 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/36fbd6d1-d87d-45a2-9bca-0f25f3daca0c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"36fbd6d1-d87d-45a2-9bca-0f25f3daca0c\") " pod="openstack/ovn-northd-0" Dec 09 10:20:01 crc kubenswrapper[5002]: I1209 10:20:01.760006 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/36fbd6d1-d87d-45a2-9bca-0f25f3daca0c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"36fbd6d1-d87d-45a2-9bca-0f25f3daca0c\") " pod="openstack/ovn-northd-0" Dec 09 10:20:01 crc kubenswrapper[5002]: I1209 10:20:01.768780 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36fbd6d1-d87d-45a2-9bca-0f25f3daca0c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"36fbd6d1-d87d-45a2-9bca-0f25f3daca0c\") " pod="openstack/ovn-northd-0" Dec 09 10:20:01 crc kubenswrapper[5002]: I1209 10:20:01.776194 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swmdj\" (UniqueName: \"kubernetes.io/projected/36fbd6d1-d87d-45a2-9bca-0f25f3daca0c-kube-api-access-swmdj\") pod \"ovn-northd-0\" (UID: \"36fbd6d1-d87d-45a2-9bca-0f25f3daca0c\") " pod="openstack/ovn-northd-0" Dec 09 10:20:01 crc kubenswrapper[5002]: I1209 10:20:01.927886 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 09 10:20:02 crc kubenswrapper[5002]: I1209 10:20:02.360705 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 09 10:20:02 crc kubenswrapper[5002]: W1209 10:20:02.365684 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36fbd6d1_d87d_45a2_9bca_0f25f3daca0c.slice/crio-08b0af9597b2797bac52a9423398ee7af507abb50d93d997131e6ed75aa92eba WatchSource:0}: Error finding container 08b0af9597b2797bac52a9423398ee7af507abb50d93d997131e6ed75aa92eba: Status 404 returned error can't find the container with id 08b0af9597b2797bac52a9423398ee7af507abb50d93d997131e6ed75aa92eba Dec 09 10:20:02 crc kubenswrapper[5002]: I1209 10:20:02.410339 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"36fbd6d1-d87d-45a2-9bca-0f25f3daca0c","Type":"ContainerStarted","Data":"08b0af9597b2797bac52a9423398ee7af507abb50d93d997131e6ed75aa92eba"} Dec 09 10:20:03 crc kubenswrapper[5002]: I1209 10:20:03.121131 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-52q77"] Dec 09 10:20:03 crc kubenswrapper[5002]: I1209 10:20:03.121750 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fd796d7df-52q77" podUID="b68e0176-a28e-4b3b-ad77-28d6f5f943f8" containerName="dnsmasq-dns" containerID="cri-o://e7abda2952182e01e3daf7a0c60defb86e901a049f11ffa5d03026700505c1d4" gracePeriod=10 Dec 09 10:20:03 crc kubenswrapper[5002]: I1209 10:20:03.123061 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fd796d7df-52q77" Dec 09 10:20:03 crc kubenswrapper[5002]: I1209 10:20:03.150731 5002 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-698758b865-bbdqx"] Dec 09 10:20:03 crc kubenswrapper[5002]: I1209 10:20:03.152282 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-bbdqx" Dec 09 10:20:03 crc kubenswrapper[5002]: I1209 10:20:03.175790 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-bbdqx"] Dec 09 10:20:03 crc kubenswrapper[5002]: I1209 10:20:03.204452 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 09 10:20:03 crc kubenswrapper[5002]: I1209 10:20:03.287718 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f36484d9-764c-460b-b48e-f0c36e1145e8-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-bbdqx\" (UID: \"f36484d9-764c-460b-b48e-f0c36e1145e8\") " pod="openstack/dnsmasq-dns-698758b865-bbdqx" Dec 09 10:20:03 crc kubenswrapper[5002]: I1209 10:20:03.288035 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zmpm\" (UniqueName: \"kubernetes.io/projected/f36484d9-764c-460b-b48e-f0c36e1145e8-kube-api-access-9zmpm\") pod \"dnsmasq-dns-698758b865-bbdqx\" (UID: \"f36484d9-764c-460b-b48e-f0c36e1145e8\") " pod="openstack/dnsmasq-dns-698758b865-bbdqx" Dec 09 10:20:03 crc kubenswrapper[5002]: I1209 10:20:03.288152 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f36484d9-764c-460b-b48e-f0c36e1145e8-dns-svc\") pod \"dnsmasq-dns-698758b865-bbdqx\" (UID: \"f36484d9-764c-460b-b48e-f0c36e1145e8\") " pod="openstack/dnsmasq-dns-698758b865-bbdqx" Dec 09 10:20:03 crc kubenswrapper[5002]: I1209 10:20:03.288246 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f36484d9-764c-460b-b48e-f0c36e1145e8-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-bbdqx\" (UID: \"f36484d9-764c-460b-b48e-f0c36e1145e8\") " pod="openstack/dnsmasq-dns-698758b865-bbdqx" Dec 09 10:20:03 crc kubenswrapper[5002]: I1209 10:20:03.288331 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f36484d9-764c-460b-b48e-f0c36e1145e8-config\") pod \"dnsmasq-dns-698758b865-bbdqx\" (UID: \"f36484d9-764c-460b-b48e-f0c36e1145e8\") " pod="openstack/dnsmasq-dns-698758b865-bbdqx" Dec 09 10:20:03 crc kubenswrapper[5002]: I1209 10:20:03.391956 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f36484d9-764c-460b-b48e-f0c36e1145e8-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-bbdqx\" (UID: \"f36484d9-764c-460b-b48e-f0c36e1145e8\") " pod="openstack/dnsmasq-dns-698758b865-bbdqx" Dec 09 10:20:03 crc kubenswrapper[5002]: I1209 10:20:03.392013 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zmpm\" (UniqueName: \"kubernetes.io/projected/f36484d9-764c-460b-b48e-f0c36e1145e8-kube-api-access-9zmpm\") pod \"dnsmasq-dns-698758b865-bbdqx\" (UID: \"f36484d9-764c-460b-b48e-f0c36e1145e8\") " pod="openstack/dnsmasq-dns-698758b865-bbdqx" Dec 09 10:20:03 crc kubenswrapper[5002]: I1209 10:20:03.392060 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/f36484d9-764c-460b-b48e-f0c36e1145e8-dns-svc\") pod \"dnsmasq-dns-698758b865-bbdqx\" (UID: \"f36484d9-764c-460b-b48e-f0c36e1145e8\") " pod="openstack/dnsmasq-dns-698758b865-bbdqx" Dec 09 10:20:03 crc kubenswrapper[5002]: I1209 10:20:03.392115 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f36484d9-764c-460b-b48e-f0c36e1145e8-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-bbdqx\" (UID: \"f36484d9-764c-460b-b48e-f0c36e1145e8\") " pod="openstack/dnsmasq-dns-698758b865-bbdqx" Dec 09 10:20:03 crc kubenswrapper[5002]: I1209 10:20:03.392145 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f36484d9-764c-460b-b48e-f0c36e1145e8-config\") pod \"dnsmasq-dns-698758b865-bbdqx\" (UID: \"f36484d9-764c-460b-b48e-f0c36e1145e8\") " pod="openstack/dnsmasq-dns-698758b865-bbdqx" Dec 09 10:20:03 crc kubenswrapper[5002]: I1209 10:20:03.393100 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f36484d9-764c-460b-b48e-f0c36e1145e8-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-bbdqx\" (UID: \"f36484d9-764c-460b-b48e-f0c36e1145e8\") " pod="openstack/dnsmasq-dns-698758b865-bbdqx" Dec 09 10:20:03 crc kubenswrapper[5002]: I1209 10:20:03.393581 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f36484d9-764c-460b-b48e-f0c36e1145e8-config\") pod \"dnsmasq-dns-698758b865-bbdqx\" (UID: \"f36484d9-764c-460b-b48e-f0c36e1145e8\") " pod="openstack/dnsmasq-dns-698758b865-bbdqx" Dec 09 10:20:03 crc kubenswrapper[5002]: I1209 10:20:03.393842 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f36484d9-764c-460b-b48e-f0c36e1145e8-dns-svc\") pod \"dnsmasq-dns-698758b865-bbdqx\" (UID: \"f36484d9-764c-460b-b48e-f0c36e1145e8\") " pod="openstack/dnsmasq-dns-698758b865-bbdqx" Dec 09 10:20:03 crc kubenswrapper[5002]: I1209 10:20:03.394225 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f36484d9-764c-460b-b48e-f0c36e1145e8-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-bbdqx\" (UID: \"f36484d9-764c-460b-b48e-f0c36e1145e8\") " pod="openstack/dnsmasq-dns-698758b865-bbdqx" Dec 09 10:20:03 crc kubenswrapper[5002]: I1209 10:20:03.413799 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zmpm\" (UniqueName: \"kubernetes.io/projected/f36484d9-764c-460b-b48e-f0c36e1145e8-kube-api-access-9zmpm\") pod \"dnsmasq-dns-698758b865-bbdqx\" (UID: \"f36484d9-764c-460b-b48e-f0c36e1145e8\") " pod="openstack/dnsmasq-dns-698758b865-bbdqx" Dec 09 10:20:03 crc kubenswrapper[5002]: I1209 10:20:03.470241 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-bbdqx" Dec 09 10:20:03 crc kubenswrapper[5002]: I1209 10:20:03.916994 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-bbdqx"] Dec 09 10:20:03 crc kubenswrapper[5002]: W1209 10:20:03.923771 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf36484d9_764c_460b_b48e_f0c36e1145e8.slice/crio-7caf4911096d14f6d61486c85bbc7f7c971fceb139005e01326f0df4e4dba26f WatchSource:0}: Error finding container 7caf4911096d14f6d61486c85bbc7f7c971fceb139005e01326f0df4e4dba26f: Status 404 returned error can't find the container with id 7caf4911096d14f6d61486c85bbc7f7c971fceb139005e01326f0df4e4dba26f Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.294360 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.299509 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.301491 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.301618 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-88njv" Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.301638 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.302305 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.328687 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.409048 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwnbd\" (UniqueName: \"kubernetes.io/projected/dfa166a7-dec2-453d-9cd9-f77d30f1636a-kube-api-access-wwnbd\") pod \"swift-storage-0\" (UID: \"dfa166a7-dec2-453d-9cd9-f77d30f1636a\") " pod="openstack/swift-storage-0" Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.409088 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/dfa166a7-dec2-453d-9cd9-f77d30f1636a-cache\") pod \"swift-storage-0\" (UID: \"dfa166a7-dec2-453d-9cd9-f77d30f1636a\") " pod="openstack/swift-storage-0" Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.409116 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"dfa166a7-dec2-453d-9cd9-f77d30f1636a\") " pod="openstack/swift-storage-0" Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.409137 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/dfa166a7-dec2-453d-9cd9-f77d30f1636a-lock\") pod \"swift-storage-0\" (UID: \"dfa166a7-dec2-453d-9cd9-f77d30f1636a\") " pod="openstack/swift-storage-0" Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.409279 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-swift\" (UniqueName: \"kubernetes.io/projected/dfa166a7-dec2-453d-9cd9-f77d30f1636a-etc-swift\") pod \"swift-storage-0\" (UID: \"dfa166a7-dec2-453d-9cd9-f77d30f1636a\") " pod="openstack/swift-storage-0" Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.461242 5002 generic.go:334] "Generic (PLEG): container finished" podID="b68e0176-a28e-4b3b-ad77-28d6f5f943f8" containerID="e7abda2952182e01e3daf7a0c60defb86e901a049f11ffa5d03026700505c1d4" exitCode=0 Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.461306 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-52q77" event={"ID":"b68e0176-a28e-4b3b-ad77-28d6f5f943f8","Type":"ContainerDied","Data":"e7abda2952182e01e3daf7a0c60defb86e901a049f11ffa5d03026700505c1d4"} Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.462444 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-bbdqx" event={"ID":"f36484d9-764c-460b-b48e-f0c36e1145e8","Type":"ContainerStarted","Data":"7caf4911096d14f6d61486c85bbc7f7c971fceb139005e01326f0df4e4dba26f"} Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.511253 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwnbd\" (UniqueName: \"kubernetes.io/projected/dfa166a7-dec2-453d-9cd9-f77d30f1636a-kube-api-access-wwnbd\") pod \"swift-storage-0\" (UID: \"dfa166a7-dec2-453d-9cd9-f77d30f1636a\") " pod="openstack/swift-storage-0" Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.511307 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/dfa166a7-dec2-453d-9cd9-f77d30f1636a-cache\") pod \"swift-storage-0\" (UID: \"dfa166a7-dec2-453d-9cd9-f77d30f1636a\") " pod="openstack/swift-storage-0" Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.511350 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"dfa166a7-dec2-453d-9cd9-f77d30f1636a\") " pod="openstack/swift-storage-0" Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.511381 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/dfa166a7-dec2-453d-9cd9-f77d30f1636a-lock\") pod \"swift-storage-0\" (UID: \"dfa166a7-dec2-453d-9cd9-f77d30f1636a\") " pod="openstack/swift-storage-0" Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.511424 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/dfa166a7-dec2-453d-9cd9-f77d30f1636a-etc-swift\") pod \"swift-storage-0\" (UID: \"dfa166a7-dec2-453d-9cd9-f77d30f1636a\") " pod="openstack/swift-storage-0" Dec 09 10:20:04 crc kubenswrapper[5002]: E1209 10:20:04.511656 5002 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 09 10:20:04 crc kubenswrapper[5002]: E1209 10:20:04.511683 5002 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 09 10:20:04 crc kubenswrapper[5002]: E1209 10:20:04.511738 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dfa166a7-dec2-453d-9cd9-f77d30f1636a-etc-swift podName:dfa166a7-dec2-453d-9cd9-f77d30f1636a nodeName:}" failed. 
No retries permitted until 2025-12-09 10:20:05.011717796 +0000 UTC m=+1137.403768877 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/dfa166a7-dec2-453d-9cd9-f77d30f1636a-etc-swift") pod "swift-storage-0" (UID: "dfa166a7-dec2-453d-9cd9-f77d30f1636a") : configmap "swift-ring-files" not found Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.511732 5002 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"dfa166a7-dec2-453d-9cd9-f77d30f1636a\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/swift-storage-0" Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.511792 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/dfa166a7-dec2-453d-9cd9-f77d30f1636a-cache\") pod \"swift-storage-0\" (UID: \"dfa166a7-dec2-453d-9cd9-f77d30f1636a\") " pod="openstack/swift-storage-0" Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.511873 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/dfa166a7-dec2-453d-9cd9-f77d30f1636a-lock\") pod \"swift-storage-0\" (UID: \"dfa166a7-dec2-453d-9cd9-f77d30f1636a\") " pod="openstack/swift-storage-0" Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.522091 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-sdzqj"] Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.527447 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-sdzqj" Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.530660 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.534149 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.538241 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"dfa166a7-dec2-453d-9cd9-f77d30f1636a\") " pod="openstack/swift-storage-0" Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.538983 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwnbd\" (UniqueName: \"kubernetes.io/projected/dfa166a7-dec2-453d-9cd9-f77d30f1636a-kube-api-access-wwnbd\") pod \"swift-storage-0\" (UID: \"dfa166a7-dec2-453d-9cd9-f77d30f1636a\") " pod="openstack/swift-storage-0" Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.541391 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-sdzqj"] Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.541518 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.613137 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rczrz\" (UniqueName: \"kubernetes.io/projected/4c575708-ef27-4116-8eb1-9eae1aae903f-kube-api-access-rczrz\") pod \"swift-ring-rebalance-sdzqj\" (UID: \"4c575708-ef27-4116-8eb1-9eae1aae903f\") " pod="openstack/swift-ring-rebalance-sdzqj" Dec 09 
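[Editor's sketch, not part of the journal.] The etc-swift failure above is a startup-ordering race, not a broken pod: etc-swift is a projected volume sourced from the configmap swift-ring-files, which does not exist yet; the swift-ring-rebalance job that produces it is only scheduled moments later (SyncLoop ADD at 10:20:04.522091). Kubelet parks the mount operation and retries after the logged durationBeforeRetry (500ms here, backing off on repeated failures) until the configmap appears. A hypothetical probe for the same condition, assuming cluster access and the Python kubernetes client:

    from kubernetes import client, config
    from kubernetes.client.rest import ApiException

    config.load_kube_config()
    v1 = client.CoreV1Api()
    try:
        v1.read_namespaced_config_map("swift-ring-files", "openstack")
        print("swift-ring-files present; the etc-swift mount retry should succeed")
    except ApiException as e:
        if e.status == 404:
            print("swift-ring-files still missing; MountVolume.SetUp will keep failing")
        else:
            raise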
10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.613579 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4c575708-ef27-4116-8eb1-9eae1aae903f-dispersionconf\") pod \"swift-ring-rebalance-sdzqj\" (UID: \"4c575708-ef27-4116-8eb1-9eae1aae903f\") " pod="openstack/swift-ring-rebalance-sdzqj" Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.613716 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c575708-ef27-4116-8eb1-9eae1aae903f-combined-ca-bundle\") pod \"swift-ring-rebalance-sdzqj\" (UID: \"4c575708-ef27-4116-8eb1-9eae1aae903f\") " pod="openstack/swift-ring-rebalance-sdzqj" Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.614014 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4c575708-ef27-4116-8eb1-9eae1aae903f-ring-data-devices\") pod \"swift-ring-rebalance-sdzqj\" (UID: \"4c575708-ef27-4116-8eb1-9eae1aae903f\") " pod="openstack/swift-ring-rebalance-sdzqj" Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.614136 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4c575708-ef27-4116-8eb1-9eae1aae903f-scripts\") pod \"swift-ring-rebalance-sdzqj\" (UID: \"4c575708-ef27-4116-8eb1-9eae1aae903f\") " pod="openstack/swift-ring-rebalance-sdzqj" Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.614229 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4c575708-ef27-4116-8eb1-9eae1aae903f-etc-swift\") pod \"swift-ring-rebalance-sdzqj\" (UID: \"4c575708-ef27-4116-8eb1-9eae1aae903f\") " pod="openstack/swift-ring-rebalance-sdzqj" Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.614332 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4c575708-ef27-4116-8eb1-9eae1aae903f-swiftconf\") pod \"swift-ring-rebalance-sdzqj\" (UID: \"4c575708-ef27-4116-8eb1-9eae1aae903f\") " pod="openstack/swift-ring-rebalance-sdzqj" Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.715633 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4c575708-ef27-4116-8eb1-9eae1aae903f-ring-data-devices\") pod \"swift-ring-rebalance-sdzqj\" (UID: \"4c575708-ef27-4116-8eb1-9eae1aae903f\") " pod="openstack/swift-ring-rebalance-sdzqj" Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.716000 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4c575708-ef27-4116-8eb1-9eae1aae903f-scripts\") pod \"swift-ring-rebalance-sdzqj\" (UID: \"4c575708-ef27-4116-8eb1-9eae1aae903f\") " pod="openstack/swift-ring-rebalance-sdzqj" Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.716783 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4c575708-ef27-4116-8eb1-9eae1aae903f-etc-swift\") pod \"swift-ring-rebalance-sdzqj\" (UID: \"4c575708-ef27-4116-8eb1-9eae1aae903f\") " pod="openstack/swift-ring-rebalance-sdzqj" Dec 09 10:20:04 crc 
kubenswrapper[5002]: I1209 10:20:04.717162 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4c575708-ef27-4116-8eb1-9eae1aae903f-swiftconf\") pod \"swift-ring-rebalance-sdzqj\" (UID: \"4c575708-ef27-4116-8eb1-9eae1aae903f\") " pod="openstack/swift-ring-rebalance-sdzqj" Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.717711 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczrz\" (UniqueName: \"kubernetes.io/projected/4c575708-ef27-4116-8eb1-9eae1aae903f-kube-api-access-rczrz\") pod \"swift-ring-rebalance-sdzqj\" (UID: \"4c575708-ef27-4116-8eb1-9eae1aae903f\") " pod="openstack/swift-ring-rebalance-sdzqj" Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.717873 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4c575708-ef27-4116-8eb1-9eae1aae903f-dispersionconf\") pod \"swift-ring-rebalance-sdzqj\" (UID: \"4c575708-ef27-4116-8eb1-9eae1aae903f\") " pod="openstack/swift-ring-rebalance-sdzqj" Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.718049 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c575708-ef27-4116-8eb1-9eae1aae903f-combined-ca-bundle\") pod \"swift-ring-rebalance-sdzqj\" (UID: \"4c575708-ef27-4116-8eb1-9eae1aae903f\") " pod="openstack/swift-ring-rebalance-sdzqj" Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.716457 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4c575708-ef27-4116-8eb1-9eae1aae903f-ring-data-devices\") pod \"swift-ring-rebalance-sdzqj\" (UID: \"4c575708-ef27-4116-8eb1-9eae1aae903f\") " pod="openstack/swift-ring-rebalance-sdzqj" Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.717119 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4c575708-ef27-4116-8eb1-9eae1aae903f-etc-swift\") pod \"swift-ring-rebalance-sdzqj\" (UID: \"4c575708-ef27-4116-8eb1-9eae1aae903f\") " pod="openstack/swift-ring-rebalance-sdzqj" Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.716740 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4c575708-ef27-4116-8eb1-9eae1aae903f-scripts\") pod \"swift-ring-rebalance-sdzqj\" (UID: \"4c575708-ef27-4116-8eb1-9eae1aae903f\") " pod="openstack/swift-ring-rebalance-sdzqj" Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.720905 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4c575708-ef27-4116-8eb1-9eae1aae903f-dispersionconf\") pod \"swift-ring-rebalance-sdzqj\" (UID: \"4c575708-ef27-4116-8eb1-9eae1aae903f\") " pod="openstack/swift-ring-rebalance-sdzqj" Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.721559 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c575708-ef27-4116-8eb1-9eae1aae903f-combined-ca-bundle\") pod \"swift-ring-rebalance-sdzqj\" (UID: \"4c575708-ef27-4116-8eb1-9eae1aae903f\") " pod="openstack/swift-ring-rebalance-sdzqj" Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.723011 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/4c575708-ef27-4116-8eb1-9eae1aae903f-swiftconf\") pod \"swift-ring-rebalance-sdzqj\" (UID: \"4c575708-ef27-4116-8eb1-9eae1aae903f\") " pod="openstack/swift-ring-rebalance-sdzqj" Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.733398 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczrz\" (UniqueName: \"kubernetes.io/projected/4c575708-ef27-4116-8eb1-9eae1aae903f-kube-api-access-rczrz\") pod \"swift-ring-rebalance-sdzqj\" (UID: \"4c575708-ef27-4116-8eb1-9eae1aae903f\") " pod="openstack/swift-ring-rebalance-sdzqj" Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.816205 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7fd796d7df-52q77" podUID="b68e0176-a28e-4b3b-ad77-28d6f5f943f8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.112:5353: connect: connection refused" Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.920921 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-sdzqj" Dec 09 10:20:04 crc kubenswrapper[5002]: I1209 10:20:04.930003 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-8n4dc" Dec 09 10:20:05 crc kubenswrapper[5002]: I1209 10:20:05.030284 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/dfa166a7-dec2-453d-9cd9-f77d30f1636a-etc-swift\") pod \"swift-storage-0\" (UID: \"dfa166a7-dec2-453d-9cd9-f77d30f1636a\") " pod="openstack/swift-storage-0" Dec 09 10:20:05 crc kubenswrapper[5002]: E1209 10:20:05.030472 5002 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 09 10:20:05 crc kubenswrapper[5002]: E1209 10:20:05.031073 5002 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 09 10:20:05 crc kubenswrapper[5002]: E1209 10:20:05.031113 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dfa166a7-dec2-453d-9cd9-f77d30f1636a-etc-swift podName:dfa166a7-dec2-453d-9cd9-f77d30f1636a nodeName:}" failed. No retries permitted until 2025-12-09 10:20:06.031098891 +0000 UTC m=+1138.423149962 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/dfa166a7-dec2-453d-9cd9-f77d30f1636a-etc-swift") pod "swift-storage-0" (UID: "dfa166a7-dec2-453d-9cd9-f77d30f1636a") : configmap "swift-ring-files" not found Dec 09 10:20:05 crc kubenswrapper[5002]: I1209 10:20:05.397139 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-sdzqj"] Dec 09 10:20:05 crc kubenswrapper[5002]: I1209 10:20:05.470270 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-sdzqj" event={"ID":"4c575708-ef27-4116-8eb1-9eae1aae903f","Type":"ContainerStarted","Data":"10a1fcf3eebee9467f6df85f740fbe292392c452666f131dfc8edd6a1b31a13c"} Dec 09 10:20:06 crc kubenswrapper[5002]: I1209 10:20:06.054466 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/dfa166a7-dec2-453d-9cd9-f77d30f1636a-etc-swift\") pod \"swift-storage-0\" (UID: \"dfa166a7-dec2-453d-9cd9-f77d30f1636a\") " pod="openstack/swift-storage-0" Dec 09 10:20:06 crc kubenswrapper[5002]: E1209 10:20:06.054662 5002 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 09 10:20:06 crc kubenswrapper[5002]: E1209 10:20:06.054691 5002 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 09 10:20:06 crc kubenswrapper[5002]: E1209 10:20:06.054768 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dfa166a7-dec2-453d-9cd9-f77d30f1636a-etc-swift podName:dfa166a7-dec2-453d-9cd9-f77d30f1636a nodeName:}" failed. No retries permitted until 2025-12-09 10:20:08.054744824 +0000 UTC m=+1140.446795915 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/dfa166a7-dec2-453d-9cd9-f77d30f1636a-etc-swift") pod "swift-storage-0" (UID: "dfa166a7-dec2-453d-9cd9-f77d30f1636a") : configmap "swift-ring-files" not found Dec 09 10:20:08 crc kubenswrapper[5002]: I1209 10:20:08.090892 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/dfa166a7-dec2-453d-9cd9-f77d30f1636a-etc-swift\") pod \"swift-storage-0\" (UID: \"dfa166a7-dec2-453d-9cd9-f77d30f1636a\") " pod="openstack/swift-storage-0" Dec 09 10:20:08 crc kubenswrapper[5002]: E1209 10:20:08.091124 5002 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 09 10:20:08 crc kubenswrapper[5002]: E1209 10:20:08.091269 5002 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 09 10:20:08 crc kubenswrapper[5002]: E1209 10:20:08.091319 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dfa166a7-dec2-453d-9cd9-f77d30f1636a-etc-swift podName:dfa166a7-dec2-453d-9cd9-f77d30f1636a nodeName:}" failed. No retries permitted until 2025-12-09 10:20:12.091304621 +0000 UTC m=+1144.483355702 (durationBeforeRetry 4s). 
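
Note the retry cadence across these entries: durationBeforeRetry doubles on every failed attempt, 500ms, 1s, 2s, 4s so far, then 8s and 16s further down. A sketch of that schedule; the 500ms seed and the doubling are read straight off the log, while the two-minute ceiling is an assumption about upstream kubelet behavior that this excerpt never reaches:

    package main

    import (
        "fmt"
        "time"
    )

    // nextRetry reproduces the doubling visible in the nestedpendingoperations
    // entries above: 500ms -> 1s -> 2s -> 4s -> 8s -> 16s -> ...
    func nextRetry(d time.Duration) time.Duration {
        const ceiling = 2 * time.Minute // assumed cap; not observed in this excerpt
        if d == 0 {
            return 500 * time.Millisecond
        }
        if 2*d > ceiling {
            return ceiling
        }
        return 2 * d
    }

    func main() {
        var d time.Duration
        for i := 0; i < 6; i++ {
            d = nextRetry(d)
            fmt.Println(d) // 500ms 1s 2s 4s 8s 16s, matching the log
        }
    }
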
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/dfa166a7-dec2-453d-9cd9-f77d30f1636a-etc-swift") pod "swift-storage-0" (UID: "dfa166a7-dec2-453d-9cd9-f77d30f1636a") : configmap "swift-ring-files" not found Dec 09 10:20:09 crc kubenswrapper[5002]: I1209 10:20:09.816859 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7fd796d7df-52q77" podUID="b68e0176-a28e-4b3b-ad77-28d6f5f943f8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.112:5353: connect: connection refused" Dec 09 10:20:11 crc kubenswrapper[5002]: I1209 10:20:11.517218 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-bbdqx" event={"ID":"f36484d9-764c-460b-b48e-f0c36e1145e8","Type":"ContainerStarted","Data":"0061a469361fca7875a55c02553365038f8dd7530d1c4f33aecb9ad66561977a"} Dec 09 10:20:11 crc kubenswrapper[5002]: I1209 10:20:11.557947 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 09 10:20:11 crc kubenswrapper[5002]: I1209 10:20:11.640826 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="d100f321-6fe6-4eb3-a00c-50b9ff5e2861" containerName="galera" probeResult="failure" output=< Dec 09 10:20:11 crc kubenswrapper[5002]: wsrep_local_state_comment (Joined) differs from Synced Dec 09 10:20:11 crc kubenswrapper[5002]: > Dec 09 10:20:12 crc kubenswrapper[5002]: I1209 10:20:12.178649 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/dfa166a7-dec2-453d-9cd9-f77d30f1636a-etc-swift\") pod \"swift-storage-0\" (UID: \"dfa166a7-dec2-453d-9cd9-f77d30f1636a\") " pod="openstack/swift-storage-0" Dec 09 10:20:12 crc kubenswrapper[5002]: E1209 10:20:12.178883 5002 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 09 10:20:12 crc kubenswrapper[5002]: E1209 10:20:12.179067 5002 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 09 10:20:12 crc kubenswrapper[5002]: E1209 10:20:12.179121 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dfa166a7-dec2-453d-9cd9-f77d30f1636a-etc-swift podName:dfa166a7-dec2-453d-9cd9-f77d30f1636a nodeName:}" failed. No retries permitted until 2025-12-09 10:20:20.179106221 +0000 UTC m=+1152.571157292 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/dfa166a7-dec2-453d-9cd9-f77d30f1636a-etc-swift") pod "swift-storage-0" (UID: "dfa166a7-dec2-453d-9cd9-f77d30f1636a") : configmap "swift-ring-files" not found Dec 09 10:20:12 crc kubenswrapper[5002]: I1209 10:20:12.534914 5002 generic.go:334] "Generic (PLEG): container finished" podID="f36484d9-764c-460b-b48e-f0c36e1145e8" containerID="0061a469361fca7875a55c02553365038f8dd7530d1c4f33aecb9ad66561977a" exitCode=0 Dec 09 10:20:12 crc kubenswrapper[5002]: I1209 10:20:12.534956 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-bbdqx" event={"ID":"f36484d9-764c-460b-b48e-f0c36e1145e8","Type":"ContainerDied","Data":"0061a469361fca7875a55c02553365038f8dd7530d1c4f33aecb9ad66561977a"} Dec 09 10:20:14 crc kubenswrapper[5002]: I1209 10:20:14.260530 5002 util.go:48] "No ready sandbox for pod can be found. 
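
The dnsmasq-dns readiness failures above ("dial tcp 10.217.0.112:5353: connect: connection refused") carry the signature of a TCP-socket probe against a container that is gone or not yet listening: the kubelet simply attempts the dial and treats any error as a failed probe. A stripped-down equivalent, with the address taken from the log and the one-second timeout assumed:

    package main

    import (
        "fmt"
        "net"
        "time"
    )

    // probeTCP mirrors the core of a tcpSocket readiness probe: open the
    // connection, close it again, and report any error as a probe failure.
    func probeTCP(addr string, timeout time.Duration) error {
        conn, err := net.DialTimeout("tcp", addr, timeout)
        if err != nil {
            return err // e.g. "connect: connection refused", as logged above
        }
        return conn.Close()
    }

    func main() {
        if err := probeTCP("10.217.0.112:5353", time.Second); err != nil {
            fmt.Println("Probe failed:", err)
        }
    }
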
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-52q77" Dec 09 10:20:14 crc kubenswrapper[5002]: I1209 10:20:14.332522 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b68e0176-a28e-4b3b-ad77-28d6f5f943f8-config\") pod \"b68e0176-a28e-4b3b-ad77-28d6f5f943f8\" (UID: \"b68e0176-a28e-4b3b-ad77-28d6f5f943f8\") " Dec 09 10:20:14 crc kubenswrapper[5002]: I1209 10:20:14.332852 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lkbx\" (UniqueName: \"kubernetes.io/projected/b68e0176-a28e-4b3b-ad77-28d6f5f943f8-kube-api-access-7lkbx\") pod \"b68e0176-a28e-4b3b-ad77-28d6f5f943f8\" (UID: \"b68e0176-a28e-4b3b-ad77-28d6f5f943f8\") " Dec 09 10:20:14 crc kubenswrapper[5002]: I1209 10:20:14.332905 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b68e0176-a28e-4b3b-ad77-28d6f5f943f8-dns-svc\") pod \"b68e0176-a28e-4b3b-ad77-28d6f5f943f8\" (UID: \"b68e0176-a28e-4b3b-ad77-28d6f5f943f8\") " Dec 09 10:20:14 crc kubenswrapper[5002]: I1209 10:20:14.332994 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b68e0176-a28e-4b3b-ad77-28d6f5f943f8-ovsdbserver-nb\") pod \"b68e0176-a28e-4b3b-ad77-28d6f5f943f8\" (UID: \"b68e0176-a28e-4b3b-ad77-28d6f5f943f8\") " Dec 09 10:20:14 crc kubenswrapper[5002]: I1209 10:20:14.339689 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b68e0176-a28e-4b3b-ad77-28d6f5f943f8-kube-api-access-7lkbx" (OuterVolumeSpecName: "kube-api-access-7lkbx") pod "b68e0176-a28e-4b3b-ad77-28d6f5f943f8" (UID: "b68e0176-a28e-4b3b-ad77-28d6f5f943f8"). InnerVolumeSpecName "kube-api-access-7lkbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:20:14 crc kubenswrapper[5002]: I1209 10:20:14.402428 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b68e0176-a28e-4b3b-ad77-28d6f5f943f8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b68e0176-a28e-4b3b-ad77-28d6f5f943f8" (UID: "b68e0176-a28e-4b3b-ad77-28d6f5f943f8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:20:14 crc kubenswrapper[5002]: I1209 10:20:14.403665 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b68e0176-a28e-4b3b-ad77-28d6f5f943f8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b68e0176-a28e-4b3b-ad77-28d6f5f943f8" (UID: "b68e0176-a28e-4b3b-ad77-28d6f5f943f8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:20:14 crc kubenswrapper[5002]: I1209 10:20:14.409861 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b68e0176-a28e-4b3b-ad77-28d6f5f943f8-config" (OuterVolumeSpecName: "config") pod "b68e0176-a28e-4b3b-ad77-28d6f5f943f8" (UID: "b68e0176-a28e-4b3b-ad77-28d6f5f943f8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:20:14 crc kubenswrapper[5002]: I1209 10:20:14.435460 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lkbx\" (UniqueName: \"kubernetes.io/projected/b68e0176-a28e-4b3b-ad77-28d6f5f943f8-kube-api-access-7lkbx\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:14 crc kubenswrapper[5002]: I1209 10:20:14.435501 5002 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b68e0176-a28e-4b3b-ad77-28d6f5f943f8-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:14 crc kubenswrapper[5002]: I1209 10:20:14.435510 5002 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b68e0176-a28e-4b3b-ad77-28d6f5f943f8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:14 crc kubenswrapper[5002]: I1209 10:20:14.435519 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b68e0176-a28e-4b3b-ad77-28d6f5f943f8-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:14 crc kubenswrapper[5002]: I1209 10:20:14.519593 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 09 10:20:14 crc kubenswrapper[5002]: I1209 10:20:14.550134 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-52q77" event={"ID":"b68e0176-a28e-4b3b-ad77-28d6f5f943f8","Type":"ContainerDied","Data":"a78113cfaa8618718893e21e7f955e97ffccef7dc23e3f940de10e0df8765156"} Dec 09 10:20:14 crc kubenswrapper[5002]: I1209 10:20:14.550191 5002 scope.go:117] "RemoveContainer" containerID="e7abda2952182e01e3daf7a0c60defb86e901a049f11ffa5d03026700505c1d4" Dec 09 10:20:14 crc kubenswrapper[5002]: I1209 10:20:14.550332 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-52q77" Dec 09 10:20:14 crc kubenswrapper[5002]: I1209 10:20:14.587420 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-52q77"] Dec 09 10:20:14 crc kubenswrapper[5002]: I1209 10:20:14.596983 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-52q77"] Dec 09 10:20:14 crc kubenswrapper[5002]: I1209 10:20:14.607245 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 09 10:20:14 crc kubenswrapper[5002]: I1209 10:20:14.746155 5002 scope.go:117] "RemoveContainer" containerID="4872faca91228f6786252a7cf045fdb023f887e08a21fae602fc4c1d8ea6b323" Dec 09 10:20:16 crc kubenswrapper[5002]: I1209 10:20:16.068799 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b68e0176-a28e-4b3b-ad77-28d6f5f943f8" path="/var/lib/kubelet/pods/b68e0176-a28e-4b3b-ad77-28d6f5f943f8/volumes" Dec 09 10:20:16 crc kubenswrapper[5002]: I1209 10:20:16.574807 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-sdzqj" event={"ID":"4c575708-ef27-4116-8eb1-9eae1aae903f","Type":"ContainerStarted","Data":"a20c1f002f7d8bcb4ece970ae57f34515b9903a80f22bc0993c57aaa705415a2"} Dec 09 10:20:16 crc kubenswrapper[5002]: I1209 10:20:16.577050 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-bbdqx" event={"ID":"f36484d9-764c-460b-b48e-f0c36e1145e8","Type":"ContainerStarted","Data":"0f54f46d5b1dd9e6fe86f181086d298dc0411f77090795c15f7773a0005ebd85"} Dec 09 10:20:16 crc kubenswrapper[5002]: I1209 10:20:16.577172 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-bbdqx" Dec 09 10:20:16 crc kubenswrapper[5002]: I1209 10:20:16.578787 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"36fbd6d1-d87d-45a2-9bca-0f25f3daca0c","Type":"ContainerStarted","Data":"e7405eccb60d4c551738f72265103db45ab534fc28ebd77e569e7e80a729397d"} Dec 09 10:20:16 crc kubenswrapper[5002]: I1209 10:20:16.578839 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"36fbd6d1-d87d-45a2-9bca-0f25f3daca0c","Type":"ContainerStarted","Data":"077beda74ae3e5c25e7ec8cca4e2084bfba25475c005f8a85d1fd2f854c613be"} Dec 09 10:20:16 crc kubenswrapper[5002]: I1209 10:20:16.579016 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 09 10:20:16 crc kubenswrapper[5002]: I1209 10:20:16.598946 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-sdzqj" podStartSLOduration=3.028475083 podStartE2EDuration="12.598926497s" podCreationTimestamp="2025-12-09 10:20:04 +0000 UTC" firstStartedPulling="2025-12-09 10:20:05.406568419 +0000 UTC m=+1137.798619500" lastFinishedPulling="2025-12-09 10:20:14.977019833 +0000 UTC m=+1147.369070914" observedRunningTime="2025-12-09 10:20:16.592147844 +0000 UTC m=+1148.984198955" watchObservedRunningTime="2025-12-09 10:20:16.598926497 +0000 UTC m=+1148.990977578" Dec 09 10:20:16 crc kubenswrapper[5002]: I1209 10:20:16.619172 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.009890462 podStartE2EDuration="15.619153292s" podCreationTimestamp="2025-12-09 10:20:01 +0000 UTC" firstStartedPulling="2025-12-09 10:20:02.369250694 +0000 UTC m=+1134.761301785" 
lastFinishedPulling="2025-12-09 10:20:14.978513534 +0000 UTC m=+1147.370564615" observedRunningTime="2025-12-09 10:20:16.611866865 +0000 UTC m=+1149.003917956" watchObservedRunningTime="2025-12-09 10:20:16.619153292 +0000 UTC m=+1149.011204373" Dec 09 10:20:16 crc kubenswrapper[5002]: I1209 10:20:16.633529 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-bbdqx" podStartSLOduration=13.633513139 podStartE2EDuration="13.633513139s" podCreationTimestamp="2025-12-09 10:20:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:20:16.63021732 +0000 UTC m=+1149.022268411" watchObservedRunningTime="2025-12-09 10:20:16.633513139 +0000 UTC m=+1149.025564220" Dec 09 10:20:17 crc kubenswrapper[5002]: I1209 10:20:17.587258 5002 generic.go:334] "Generic (PLEG): container finished" podID="9278e14e-2524-4e42-b870-f493ea02ede8" containerID="2354f84dc26ea366678ca4f5adfbc4ea21ccc99533838a486cd28b5710f9ea1c" exitCode=0 Dec 09 10:20:17 crc kubenswrapper[5002]: I1209 10:20:17.587359 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9278e14e-2524-4e42-b870-f493ea02ede8","Type":"ContainerDied","Data":"2354f84dc26ea366678ca4f5adfbc4ea21ccc99533838a486cd28b5710f9ea1c"} Dec 09 10:20:18 crc kubenswrapper[5002]: I1209 10:20:18.595658 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9278e14e-2524-4e42-b870-f493ea02ede8","Type":"ContainerStarted","Data":"1faa363b9769f751a8c09fade1d2f2f3b3905666130dc1d039543eef99f84775"} Dec 09 10:20:18 crc kubenswrapper[5002]: I1209 10:20:18.595944 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 09 10:20:18 crc kubenswrapper[5002]: I1209 10:20:18.627614 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.095275445 podStartE2EDuration="1m1.627590952s" podCreationTimestamp="2025-12-09 10:19:17 +0000 UTC" firstStartedPulling="2025-12-09 10:19:18.968442747 +0000 UTC m=+1091.360493818" lastFinishedPulling="2025-12-09 10:19:43.500758244 +0000 UTC m=+1115.892809325" observedRunningTime="2025-12-09 10:20:18.620882001 +0000 UTC m=+1151.012933092" watchObservedRunningTime="2025-12-09 10:20:18.627590952 +0000 UTC m=+1151.019642033" Dec 09 10:20:20 crc kubenswrapper[5002]: I1209 10:20:20.241543 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/dfa166a7-dec2-453d-9cd9-f77d30f1636a-etc-swift\") pod \"swift-storage-0\" (UID: \"dfa166a7-dec2-453d-9cd9-f77d30f1636a\") " pod="openstack/swift-storage-0" Dec 09 10:20:20 crc kubenswrapper[5002]: E1209 10:20:20.241833 5002 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 09 10:20:20 crc kubenswrapper[5002]: E1209 10:20:20.241990 5002 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 09 10:20:20 crc kubenswrapper[5002]: E1209 10:20:20.242057 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dfa166a7-dec2-453d-9cd9-f77d30f1636a-etc-swift podName:dfa166a7-dec2-453d-9cd9-f77d30f1636a nodeName:}" failed. 
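
The pod_startup_latency_tracker entries above are internally consistent: podStartSLOduration is the end-to-end startup time with the image-pull window subtracted. Checking the swift-ring-rebalance-sdzqj figures (the ovn-northd-0 entry works out the same way):

    package main

    import "fmt"

    func main() {
        // Figures copied from the swift-ring-rebalance-sdzqj entry above.
        e2e := 12.598926497                     // podStartE2EDuration, in seconds
        pull := 1147.369070914 - 1137.798619500 // lastFinishedPulling - firstStartedPulling (m=+ offsets)
        fmt.Printf("%.9f\n", e2e-pull)          // prints 3.028475083, the reported podStartSLOduration
    }
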
No retries permitted until 2025-12-09 10:20:36.242041794 +0000 UTC m=+1168.634092875 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/dfa166a7-dec2-453d-9cd9-f77d30f1636a-etc-swift") pod "swift-storage-0" (UID: "dfa166a7-dec2-453d-9cd9-f77d30f1636a") : configmap "swift-ring-files" not found Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.394897 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-10b7-account-create-update-9bwfh"] Dec 09 10:20:21 crc kubenswrapper[5002]: E1209 10:20:21.395525 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b68e0176-a28e-4b3b-ad77-28d6f5f943f8" containerName="init" Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.395539 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="b68e0176-a28e-4b3b-ad77-28d6f5f943f8" containerName="init" Dec 09 10:20:21 crc kubenswrapper[5002]: E1209 10:20:21.395562 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b68e0176-a28e-4b3b-ad77-28d6f5f943f8" containerName="dnsmasq-dns" Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.395569 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="b68e0176-a28e-4b3b-ad77-28d6f5f943f8" containerName="dnsmasq-dns" Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.395731 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="b68e0176-a28e-4b3b-ad77-28d6f5f943f8" containerName="dnsmasq-dns" Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.396267 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-10b7-account-create-update-9bwfh" Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.399039 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.405466 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-9kc52"] Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.407218 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-9kc52" Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.417191 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-10b7-account-create-update-9bwfh"] Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.426946 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-9kc52"] Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.462192 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzsrw\" (UniqueName: \"kubernetes.io/projected/c0d368da-0627-4c5e-ad8e-821bbc205874-kube-api-access-jzsrw\") pod \"keystone-10b7-account-create-update-9bwfh\" (UID: \"c0d368da-0627-4c5e-ad8e-821bbc205874\") " pod="openstack/keystone-10b7-account-create-update-9bwfh" Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.462235 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e4c9601-2d18-4b05-9187-f668fb760808-operator-scripts\") pod \"keystone-db-create-9kc52\" (UID: \"1e4c9601-2d18-4b05-9187-f668fb760808\") " pod="openstack/keystone-db-create-9kc52" Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.462314 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0d368da-0627-4c5e-ad8e-821bbc205874-operator-scripts\") pod \"keystone-10b7-account-create-update-9bwfh\" (UID: \"c0d368da-0627-4c5e-ad8e-821bbc205874\") " pod="openstack/keystone-10b7-account-create-update-9bwfh" Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.462342 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpsc8\" (UniqueName: \"kubernetes.io/projected/1e4c9601-2d18-4b05-9187-f668fb760808-kube-api-access-cpsc8\") pod \"keystone-db-create-9kc52\" (UID: \"1e4c9601-2d18-4b05-9187-f668fb760808\") " pod="openstack/keystone-db-create-9kc52" Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.478033 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.552209 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-qxbxl"] Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.554930 5002 util.go:30] "No sandbox for pod can be found. 
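
"No sandbox for pod can be found. Need to start a new one" means the kubelet is about to ask the container runtime (CRI-O on this node) for a fresh pod sandbox over the CRI gRPC API. A bare-bones sketch of that call, assuming the default CRI-O socket path; the metadata values are copied from the keystone-db-create-9kc52 entries, and a real request would also carry hostname, log directory, DNS, and Linux security settings that are elided here:

    package main

    import (
        "context"
        "fmt"

        "google.golang.org/grpc"
        "google.golang.org/grpc/credentials/insecure"
        runtimev1 "k8s.io/cri-api/pkg/apis/runtime/v1"
    )

    func main() {
        // Assumed socket path; the kubelet is configured with the same endpoint.
        conn, err := grpc.Dial("unix:///var/run/crio/crio.sock",
            grpc.WithTransportCredentials(insecure.NewCredentials()))
        if err != nil {
            panic(err)
        }
        defer conn.Close()

        rt := runtimev1.NewRuntimeServiceClient(conn)
        resp, err := rt.RunPodSandbox(context.TODO(), &runtimev1.RunPodSandboxRequest{
            Config: &runtimev1.PodSandboxConfig{
                Metadata: &runtimev1.PodSandboxMetadata{
                    Name:      "keystone-db-create-9kc52",
                    Namespace: "openstack",
                    Uid:       "1e4c9601-2d18-4b05-9187-f668fb760808", // pod UID from the log
                },
            },
        })
        if err != nil {
            panic(err)
        }
        fmt.Println("new sandbox:", resp.PodSandboxId)
    }
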
Need to start a new one" pod="openstack/placement-db-create-qxbxl" Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.564587 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzsrw\" (UniqueName: \"kubernetes.io/projected/c0d368da-0627-4c5e-ad8e-821bbc205874-kube-api-access-jzsrw\") pod \"keystone-10b7-account-create-update-9bwfh\" (UID: \"c0d368da-0627-4c5e-ad8e-821bbc205874\") " pod="openstack/keystone-10b7-account-create-update-9bwfh" Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.564644 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e4c9601-2d18-4b05-9187-f668fb760808-operator-scripts\") pod \"keystone-db-create-9kc52\" (UID: \"1e4c9601-2d18-4b05-9187-f668fb760808\") " pod="openstack/keystone-db-create-9kc52" Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.564790 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0d368da-0627-4c5e-ad8e-821bbc205874-operator-scripts\") pod \"keystone-10b7-account-create-update-9bwfh\" (UID: \"c0d368da-0627-4c5e-ad8e-821bbc205874\") " pod="openstack/keystone-10b7-account-create-update-9bwfh" Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.564848 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpsc8\" (UniqueName: \"kubernetes.io/projected/1e4c9601-2d18-4b05-9187-f668fb760808-kube-api-access-cpsc8\") pod \"keystone-db-create-9kc52\" (UID: \"1e4c9601-2d18-4b05-9187-f668fb760808\") " pod="openstack/keystone-db-create-9kc52" Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.567666 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e4c9601-2d18-4b05-9187-f668fb760808-operator-scripts\") pod \"keystone-db-create-9kc52\" (UID: \"1e4c9601-2d18-4b05-9187-f668fb760808\") " pod="openstack/keystone-db-create-9kc52" Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.570024 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0d368da-0627-4c5e-ad8e-821bbc205874-operator-scripts\") pod \"keystone-10b7-account-create-update-9bwfh\" (UID: \"c0d368da-0627-4c5e-ad8e-821bbc205874\") " pod="openstack/keystone-10b7-account-create-update-9bwfh" Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.576600 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-qxbxl"] Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.608142 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzsrw\" (UniqueName: \"kubernetes.io/projected/c0d368da-0627-4c5e-ad8e-821bbc205874-kube-api-access-jzsrw\") pod \"keystone-10b7-account-create-update-9bwfh\" (UID: \"c0d368da-0627-4c5e-ad8e-821bbc205874\") " pod="openstack/keystone-10b7-account-create-update-9bwfh" Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.608201 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpsc8\" (UniqueName: \"kubernetes.io/projected/1e4c9601-2d18-4b05-9187-f668fb760808-kube-api-access-cpsc8\") pod \"keystone-db-create-9kc52\" (UID: \"1e4c9601-2d18-4b05-9187-f668fb760808\") " pod="openstack/keystone-db-create-9kc52" Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.664500 5002 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/placement-60c5-account-create-update-wp67w"] Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.666163 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fb9392c-5a4f-4bc2-89b0-2c4b59853cf3-operator-scripts\") pod \"placement-db-create-qxbxl\" (UID: \"8fb9392c-5a4f-4bc2-89b0-2c4b59853cf3\") " pod="openstack/placement-db-create-qxbxl" Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.666246 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgdwp\" (UniqueName: \"kubernetes.io/projected/8fb9392c-5a4f-4bc2-89b0-2c4b59853cf3-kube-api-access-qgdwp\") pod \"placement-db-create-qxbxl\" (UID: \"8fb9392c-5a4f-4bc2-89b0-2c4b59853cf3\") " pod="openstack/placement-db-create-qxbxl" Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.666864 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-60c5-account-create-update-wp67w" Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.673309 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.686678 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-60c5-account-create-update-wp67w"] Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.712832 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-10b7-account-create-update-9bwfh" Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.724426 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-9kc52" Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.768372 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fb9392c-5a4f-4bc2-89b0-2c4b59853cf3-operator-scripts\") pod \"placement-db-create-qxbxl\" (UID: \"8fb9392c-5a4f-4bc2-89b0-2c4b59853cf3\") " pod="openstack/placement-db-create-qxbxl" Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.768430 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgdwp\" (UniqueName: \"kubernetes.io/projected/8fb9392c-5a4f-4bc2-89b0-2c4b59853cf3-kube-api-access-qgdwp\") pod \"placement-db-create-qxbxl\" (UID: \"8fb9392c-5a4f-4bc2-89b0-2c4b59853cf3\") " pod="openstack/placement-db-create-qxbxl" Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.768468 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mvrr\" (UniqueName: \"kubernetes.io/projected/bc666b29-bbbf-4206-ae4d-7d7e52542577-kube-api-access-2mvrr\") pod \"placement-60c5-account-create-update-wp67w\" (UID: \"bc666b29-bbbf-4206-ae4d-7d7e52542577\") " pod="openstack/placement-60c5-account-create-update-wp67w" Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.768560 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc666b29-bbbf-4206-ae4d-7d7e52542577-operator-scripts\") pod \"placement-60c5-account-create-update-wp67w\" (UID: \"bc666b29-bbbf-4206-ae4d-7d7e52542577\") " pod="openstack/placement-60c5-account-create-update-wp67w" Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.769653 
5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fb9392c-5a4f-4bc2-89b0-2c4b59853cf3-operator-scripts\") pod \"placement-db-create-qxbxl\" (UID: \"8fb9392c-5a4f-4bc2-89b0-2c4b59853cf3\") " pod="openstack/placement-db-create-qxbxl" Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.792906 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgdwp\" (UniqueName: \"kubernetes.io/projected/8fb9392c-5a4f-4bc2-89b0-2c4b59853cf3-kube-api-access-qgdwp\") pod \"placement-db-create-qxbxl\" (UID: \"8fb9392c-5a4f-4bc2-89b0-2c4b59853cf3\") " pod="openstack/placement-db-create-qxbxl" Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.868391 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-d7hdr"] Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.870090 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mvrr\" (UniqueName: \"kubernetes.io/projected/bc666b29-bbbf-4206-ae4d-7d7e52542577-kube-api-access-2mvrr\") pod \"placement-60c5-account-create-update-wp67w\" (UID: \"bc666b29-bbbf-4206-ae4d-7d7e52542577\") " pod="openstack/placement-60c5-account-create-update-wp67w" Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.870208 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc666b29-bbbf-4206-ae4d-7d7e52542577-operator-scripts\") pod \"placement-60c5-account-create-update-wp67w\" (UID: \"bc666b29-bbbf-4206-ae4d-7d7e52542577\") " pod="openstack/placement-60c5-account-create-update-wp67w" Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.871312 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc666b29-bbbf-4206-ae4d-7d7e52542577-operator-scripts\") pod \"placement-60c5-account-create-update-wp67w\" (UID: \"bc666b29-bbbf-4206-ae4d-7d7e52542577\") " pod="openstack/placement-60c5-account-create-update-wp67w" Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.873863 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-d7hdr" Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.873922 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-d7hdr"] Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.874920 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-qxbxl" Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.913572 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mvrr\" (UniqueName: \"kubernetes.io/projected/bc666b29-bbbf-4206-ae4d-7d7e52542577-kube-api-access-2mvrr\") pod \"placement-60c5-account-create-update-wp67w\" (UID: \"bc666b29-bbbf-4206-ae4d-7d7e52542577\") " pod="openstack/placement-60c5-account-create-update-wp67w" Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.972550 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mctrk\" (UniqueName: \"kubernetes.io/projected/31e0f58f-0655-466b-90b0-5b0e1887fe75-kube-api-access-mctrk\") pod \"glance-db-create-d7hdr\" (UID: \"31e0f58f-0655-466b-90b0-5b0e1887fe75\") " pod="openstack/glance-db-create-d7hdr" Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.972629 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31e0f58f-0655-466b-90b0-5b0e1887fe75-operator-scripts\") pod \"glance-db-create-d7hdr\" (UID: \"31e0f58f-0655-466b-90b0-5b0e1887fe75\") " pod="openstack/glance-db-create-d7hdr" Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.981994 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-a775-account-create-update-tb8ck"] Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.984652 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-a775-account-create-update-tb8ck" Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.989168 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 09 10:20:21 crc kubenswrapper[5002]: I1209 10:20:21.997985 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-60c5-account-create-update-wp67w" Dec 09 10:20:22 crc kubenswrapper[5002]: I1209 10:20:22.012756 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-a775-account-create-update-tb8ck"] Dec 09 10:20:22 crc kubenswrapper[5002]: I1209 10:20:22.104582 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c14e584f-7d75-42a6-b6a0-1c3931b022ec-operator-scripts\") pod \"glance-a775-account-create-update-tb8ck\" (UID: \"c14e584f-7d75-42a6-b6a0-1c3931b022ec\") " pod="openstack/glance-a775-account-create-update-tb8ck" Dec 09 10:20:22 crc kubenswrapper[5002]: I1209 10:20:22.104680 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mctrk\" (UniqueName: \"kubernetes.io/projected/31e0f58f-0655-466b-90b0-5b0e1887fe75-kube-api-access-mctrk\") pod \"glance-db-create-d7hdr\" (UID: \"31e0f58f-0655-466b-90b0-5b0e1887fe75\") " pod="openstack/glance-db-create-d7hdr" Dec 09 10:20:22 crc kubenswrapper[5002]: I1209 10:20:22.104852 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31e0f58f-0655-466b-90b0-5b0e1887fe75-operator-scripts\") pod \"glance-db-create-d7hdr\" (UID: \"31e0f58f-0655-466b-90b0-5b0e1887fe75\") " pod="openstack/glance-db-create-d7hdr" Dec 09 10:20:22 crc kubenswrapper[5002]: I1209 10:20:22.104871 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgrvr\" (UniqueName: \"kubernetes.io/projected/c14e584f-7d75-42a6-b6a0-1c3931b022ec-kube-api-access-lgrvr\") pod \"glance-a775-account-create-update-tb8ck\" (UID: \"c14e584f-7d75-42a6-b6a0-1c3931b022ec\") " pod="openstack/glance-a775-account-create-update-tb8ck" Dec 09 10:20:22 crc kubenswrapper[5002]: I1209 10:20:22.107312 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31e0f58f-0655-466b-90b0-5b0e1887fe75-operator-scripts\") pod \"glance-db-create-d7hdr\" (UID: \"31e0f58f-0655-466b-90b0-5b0e1887fe75\") " pod="openstack/glance-db-create-d7hdr" Dec 09 10:20:22 crc kubenswrapper[5002]: I1209 10:20:22.171637 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mctrk\" (UniqueName: \"kubernetes.io/projected/31e0f58f-0655-466b-90b0-5b0e1887fe75-kube-api-access-mctrk\") pod \"glance-db-create-d7hdr\" (UID: \"31e0f58f-0655-466b-90b0-5b0e1887fe75\") " pod="openstack/glance-db-create-d7hdr" Dec 09 10:20:22 crc kubenswrapper[5002]: I1209 10:20:22.210053 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c14e584f-7d75-42a6-b6a0-1c3931b022ec-operator-scripts\") pod \"glance-a775-account-create-update-tb8ck\" (UID: \"c14e584f-7d75-42a6-b6a0-1c3931b022ec\") " pod="openstack/glance-a775-account-create-update-tb8ck" Dec 09 10:20:22 crc kubenswrapper[5002]: I1209 10:20:22.210261 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgrvr\" (UniqueName: \"kubernetes.io/projected/c14e584f-7d75-42a6-b6a0-1c3931b022ec-kube-api-access-lgrvr\") pod \"glance-a775-account-create-update-tb8ck\" (UID: \"c14e584f-7d75-42a6-b6a0-1c3931b022ec\") " pod="openstack/glance-a775-account-create-update-tb8ck" Dec 09 10:20:22 crc kubenswrapper[5002]: I1209 
10:20:22.213485 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-d7hdr" Dec 09 10:20:22 crc kubenswrapper[5002]: I1209 10:20:22.221519 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c14e584f-7d75-42a6-b6a0-1c3931b022ec-operator-scripts\") pod \"glance-a775-account-create-update-tb8ck\" (UID: \"c14e584f-7d75-42a6-b6a0-1c3931b022ec\") " pod="openstack/glance-a775-account-create-update-tb8ck" Dec 09 10:20:22 crc kubenswrapper[5002]: I1209 10:20:22.238200 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgrvr\" (UniqueName: \"kubernetes.io/projected/c14e584f-7d75-42a6-b6a0-1c3931b022ec-kube-api-access-lgrvr\") pod \"glance-a775-account-create-update-tb8ck\" (UID: \"c14e584f-7d75-42a6-b6a0-1c3931b022ec\") " pod="openstack/glance-a775-account-create-update-tb8ck" Dec 09 10:20:22 crc kubenswrapper[5002]: I1209 10:20:22.314136 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-a775-account-create-update-tb8ck" Dec 09 10:20:22 crc kubenswrapper[5002]: I1209 10:20:22.373974 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-10b7-account-create-update-9bwfh"] Dec 09 10:20:22 crc kubenswrapper[5002]: W1209 10:20:22.397302 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0d368da_0627_4c5e_ad8e_821bbc205874.slice/crio-a3a69139d2f48455ebb78400f7918e97017c5a96ba41cfcf94fe0d653a0d9119 WatchSource:0}: Error finding container a3a69139d2f48455ebb78400f7918e97017c5a96ba41cfcf94fe0d653a0d9119: Status 404 returned error can't find the container with id a3a69139d2f48455ebb78400f7918e97017c5a96ba41cfcf94fe0d653a0d9119 Dec 09 10:20:22 crc kubenswrapper[5002]: I1209 10:20:22.513780 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-9kc52"] Dec 09 10:20:22 crc kubenswrapper[5002]: I1209 10:20:22.621827 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-60c5-account-create-update-wp67w"] Dec 09 10:20:22 crc kubenswrapper[5002]: I1209 10:20:22.638757 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-9kc52" event={"ID":"1e4c9601-2d18-4b05-9187-f668fb760808","Type":"ContainerStarted","Data":"a4b38a042c5d84e1d106259db02e0f699e7bf41bb04cfdd955804f49acd24c42"} Dec 09 10:20:22 crc kubenswrapper[5002]: W1209 10:20:22.638873 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc666b29_bbbf_4206_ae4d_7d7e52542577.slice/crio-c1917c3584aab604d06d560b43f333f320324a406bdb83a43f7554469d9e15b3 WatchSource:0}: Error finding container c1917c3584aab604d06d560b43f333f320324a406bdb83a43f7554469d9e15b3: Status 404 returned error can't find the container with id c1917c3584aab604d06d560b43f333f320324a406bdb83a43f7554469d9e15b3 Dec 09 10:20:22 crc kubenswrapper[5002]: I1209 10:20:22.640146 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-10b7-account-create-update-9bwfh" event={"ID":"c0d368da-0627-4c5e-ad8e-821bbc205874","Type":"ContainerStarted","Data":"2bc5e6430f12b450876b8742c6de7a12ca3be054e352f48252fd07dd06cf20d1"} Dec 09 10:20:22 crc kubenswrapper[5002]: I1209 10:20:22.640177 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-10b7-account-create-update-9bwfh" event={"ID":"c0d368da-0627-4c5e-ad8e-821bbc205874","Type":"ContainerStarted","Data":"a3a69139d2f48455ebb78400f7918e97017c5a96ba41cfcf94fe0d653a0d9119"} Dec 09 10:20:22 crc kubenswrapper[5002]: I1209 10:20:22.652071 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-qxbxl"] Dec 09 10:20:22 crc kubenswrapper[5002]: I1209 10:20:22.676955 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-10b7-account-create-update-9bwfh" podStartSLOduration=1.676936295 podStartE2EDuration="1.676936295s" podCreationTimestamp="2025-12-09 10:20:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:20:22.676405261 +0000 UTC m=+1155.068456342" watchObservedRunningTime="2025-12-09 10:20:22.676936295 +0000 UTC m=+1155.068987376" Dec 09 10:20:22 crc kubenswrapper[5002]: W1209 10:20:22.692005 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fb9392c_5a4f_4bc2_89b0_2c4b59853cf3.slice/crio-ad82d10f01162f50a29e61cb7b27113087f33c49514b69d5d8b458378c09a456 WatchSource:0}: Error finding container ad82d10f01162f50a29e61cb7b27113087f33c49514b69d5d8b458378c09a456: Status 404 returned error can't find the container with id ad82d10f01162f50a29e61cb7b27113087f33c49514b69d5d8b458378c09a456 Dec 09 10:20:22 crc kubenswrapper[5002]: I1209 10:20:22.751746 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-d7hdr"] Dec 09 10:20:22 crc kubenswrapper[5002]: I1209 10:20:22.759282 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-a775-account-create-update-tb8ck"] Dec 09 10:20:22 crc kubenswrapper[5002]: W1209 10:20:22.776497 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31e0f58f_0655_466b_90b0_5b0e1887fe75.slice/crio-c259b3f6a79941f7e967e04f53355e4f41d3b5d567579e327e2baf98fda39b9f WatchSource:0}: Error finding container c259b3f6a79941f7e967e04f53355e4f41d3b5d567579e327e2baf98fda39b9f: Status 404 returned error can't find the container with id c259b3f6a79941f7e967e04f53355e4f41d3b5d567579e327e2baf98fda39b9f Dec 09 10:20:23 crc kubenswrapper[5002]: I1209 10:20:23.471978 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-bbdqx" Dec 09 10:20:23 crc kubenswrapper[5002]: I1209 10:20:23.553630 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-8n4dc"] Dec 09 10:20:23 crc kubenswrapper[5002]: I1209 10:20:23.553903 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-8n4dc" podUID="0471984e-6e43-485b-8ae1-cd7be7951b89" containerName="dnsmasq-dns" containerID="cri-o://3533e415fc3b60a01fbf71e5712338dcea984abc003f940dc73c43c941049ff3" gracePeriod=10 Dec 09 10:20:23 crc kubenswrapper[5002]: I1209 10:20:23.650396 5002 generic.go:334] "Generic (PLEG): container finished" podID="c14e584f-7d75-42a6-b6a0-1c3931b022ec" containerID="94d514a6ebe6c1eff84f9f88c7ad35fe0027d8ec7ab6c85385089b506517f141" exitCode=0 Dec 09 10:20:23 crc kubenswrapper[5002]: I1209 10:20:23.650469 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a775-account-create-update-tb8ck" 
event={"ID":"c14e584f-7d75-42a6-b6a0-1c3931b022ec","Type":"ContainerDied","Data":"94d514a6ebe6c1eff84f9f88c7ad35fe0027d8ec7ab6c85385089b506517f141"} Dec 09 10:20:23 crc kubenswrapper[5002]: I1209 10:20:23.650499 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a775-account-create-update-tb8ck" event={"ID":"c14e584f-7d75-42a6-b6a0-1c3931b022ec","Type":"ContainerStarted","Data":"bea5cafa1daac96258a19b95a3d122d5a588134bd2984bafc6d5122c0c841326"} Dec 09 10:20:23 crc kubenswrapper[5002]: I1209 10:20:23.652621 5002 generic.go:334] "Generic (PLEG): container finished" podID="8fb9392c-5a4f-4bc2-89b0-2c4b59853cf3" containerID="2b16c2489d05cd8e096c967062632f622a1d7e67f83b1054e468488b01f8969f" exitCode=0 Dec 09 10:20:23 crc kubenswrapper[5002]: I1209 10:20:23.652670 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-qxbxl" event={"ID":"8fb9392c-5a4f-4bc2-89b0-2c4b59853cf3","Type":"ContainerDied","Data":"2b16c2489d05cd8e096c967062632f622a1d7e67f83b1054e468488b01f8969f"} Dec 09 10:20:23 crc kubenswrapper[5002]: I1209 10:20:23.652688 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-qxbxl" event={"ID":"8fb9392c-5a4f-4bc2-89b0-2c4b59853cf3","Type":"ContainerStarted","Data":"ad82d10f01162f50a29e61cb7b27113087f33c49514b69d5d8b458378c09a456"} Dec 09 10:20:23 crc kubenswrapper[5002]: I1209 10:20:23.655239 5002 generic.go:334] "Generic (PLEG): container finished" podID="31e0f58f-0655-466b-90b0-5b0e1887fe75" containerID="13f99fc09c85191228c9afb1481413c4572d18c67027baca455d6c30a57f6d07" exitCode=0 Dec 09 10:20:23 crc kubenswrapper[5002]: I1209 10:20:23.655289 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-d7hdr" event={"ID":"31e0f58f-0655-466b-90b0-5b0e1887fe75","Type":"ContainerDied","Data":"13f99fc09c85191228c9afb1481413c4572d18c67027baca455d6c30a57f6d07"} Dec 09 10:20:23 crc kubenswrapper[5002]: I1209 10:20:23.655305 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-d7hdr" event={"ID":"31e0f58f-0655-466b-90b0-5b0e1887fe75","Type":"ContainerStarted","Data":"c259b3f6a79941f7e967e04f53355e4f41d3b5d567579e327e2baf98fda39b9f"} Dec 09 10:20:23 crc kubenswrapper[5002]: I1209 10:20:23.657156 5002 generic.go:334] "Generic (PLEG): container finished" podID="1e4c9601-2d18-4b05-9187-f668fb760808" containerID="fa7bfbca45b21b5d83224de4e38c3dfca159e2b30e9c84bbb5739ee8d9f10e3d" exitCode=0 Dec 09 10:20:23 crc kubenswrapper[5002]: I1209 10:20:23.657269 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-9kc52" event={"ID":"1e4c9601-2d18-4b05-9187-f668fb760808","Type":"ContainerDied","Data":"fa7bfbca45b21b5d83224de4e38c3dfca159e2b30e9c84bbb5739ee8d9f10e3d"} Dec 09 10:20:23 crc kubenswrapper[5002]: I1209 10:20:23.660615 5002 generic.go:334] "Generic (PLEG): container finished" podID="bc666b29-bbbf-4206-ae4d-7d7e52542577" containerID="468110e57eae6321d8b757c466c1d5d8849d2adc61dbc25d3bbb142325cb38a6" exitCode=0 Dec 09 10:20:23 crc kubenswrapper[5002]: I1209 10:20:23.660712 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-60c5-account-create-update-wp67w" event={"ID":"bc666b29-bbbf-4206-ae4d-7d7e52542577","Type":"ContainerDied","Data":"468110e57eae6321d8b757c466c1d5d8849d2adc61dbc25d3bbb142325cb38a6"} Dec 09 10:20:23 crc kubenswrapper[5002]: I1209 10:20:23.660742 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-60c5-account-create-update-wp67w" 
event={"ID":"bc666b29-bbbf-4206-ae4d-7d7e52542577","Type":"ContainerStarted","Data":"c1917c3584aab604d06d560b43f333f320324a406bdb83a43f7554469d9e15b3"} Dec 09 10:20:23 crc kubenswrapper[5002]: I1209 10:20:23.662511 5002 generic.go:334] "Generic (PLEG): container finished" podID="4c575708-ef27-4116-8eb1-9eae1aae903f" containerID="a20c1f002f7d8bcb4ece970ae57f34515b9903a80f22bc0993c57aaa705415a2" exitCode=0 Dec 09 10:20:23 crc kubenswrapper[5002]: I1209 10:20:23.662563 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-sdzqj" event={"ID":"4c575708-ef27-4116-8eb1-9eae1aae903f","Type":"ContainerDied","Data":"a20c1f002f7d8bcb4ece970ae57f34515b9903a80f22bc0993c57aaa705415a2"} Dec 09 10:20:23 crc kubenswrapper[5002]: I1209 10:20:23.668337 5002 generic.go:334] "Generic (PLEG): container finished" podID="c0d368da-0627-4c5e-ad8e-821bbc205874" containerID="2bc5e6430f12b450876b8742c6de7a12ca3be054e352f48252fd07dd06cf20d1" exitCode=0 Dec 09 10:20:23 crc kubenswrapper[5002]: I1209 10:20:23.668408 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-10b7-account-create-update-9bwfh" event={"ID":"c0d368da-0627-4c5e-ad8e-821bbc205874","Type":"ContainerDied","Data":"2bc5e6430f12b450876b8742c6de7a12ca3be054e352f48252fd07dd06cf20d1"} Dec 09 10:20:23 crc kubenswrapper[5002]: I1209 10:20:23.773470 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-47b4k" podUID="fdaeef31-a8f8-478a-86b0-4d0126eb7f3a" containerName="ovn-controller" probeResult="failure" output=< Dec 09 10:20:23 crc kubenswrapper[5002]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 09 10:20:23 crc kubenswrapper[5002]: > Dec 09 10:20:23 crc kubenswrapper[5002]: I1209 10:20:23.839422 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-g4kc8" Dec 09 10:20:23 crc kubenswrapper[5002]: I1209 10:20:23.852832 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-g4kc8" Dec 09 10:20:24 crc kubenswrapper[5002]: I1209 10:20:24.086774 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-47b4k-config-hdjsd"] Dec 09 10:20:24 crc kubenswrapper[5002]: I1209 10:20:24.088315 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-47b4k-config-hdjsd" Dec 09 10:20:24 crc kubenswrapper[5002]: I1209 10:20:24.092806 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 09 10:20:24 crc kubenswrapper[5002]: I1209 10:20:24.103834 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-47b4k-config-hdjsd"] Dec 09 10:20:24 crc kubenswrapper[5002]: I1209 10:20:24.144763 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/117985ac-d1a7-4cd9-bd99-cd893d5d6761-scripts\") pod \"ovn-controller-47b4k-config-hdjsd\" (UID: \"117985ac-d1a7-4cd9-bd99-cd893d5d6761\") " pod="openstack/ovn-controller-47b4k-config-hdjsd" Dec 09 10:20:24 crc kubenswrapper[5002]: I1209 10:20:24.144907 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/117985ac-d1a7-4cd9-bd99-cd893d5d6761-var-run-ovn\") pod \"ovn-controller-47b4k-config-hdjsd\" (UID: \"117985ac-d1a7-4cd9-bd99-cd893d5d6761\") " pod="openstack/ovn-controller-47b4k-config-hdjsd" Dec 09 10:20:24 crc kubenswrapper[5002]: I1209 10:20:24.145839 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br6sq\" (UniqueName: \"kubernetes.io/projected/117985ac-d1a7-4cd9-bd99-cd893d5d6761-kube-api-access-br6sq\") pod \"ovn-controller-47b4k-config-hdjsd\" (UID: \"117985ac-d1a7-4cd9-bd99-cd893d5d6761\") " pod="openstack/ovn-controller-47b4k-config-hdjsd" Dec 09 10:20:24 crc kubenswrapper[5002]: I1209 10:20:24.146000 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/117985ac-d1a7-4cd9-bd99-cd893d5d6761-additional-scripts\") pod \"ovn-controller-47b4k-config-hdjsd\" (UID: \"117985ac-d1a7-4cd9-bd99-cd893d5d6761\") " pod="openstack/ovn-controller-47b4k-config-hdjsd" Dec 09 10:20:24 crc kubenswrapper[5002]: I1209 10:20:24.146265 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/117985ac-d1a7-4cd9-bd99-cd893d5d6761-var-run\") pod \"ovn-controller-47b4k-config-hdjsd\" (UID: \"117985ac-d1a7-4cd9-bd99-cd893d5d6761\") " pod="openstack/ovn-controller-47b4k-config-hdjsd" Dec 09 10:20:24 crc kubenswrapper[5002]: I1209 10:20:24.146350 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/117985ac-d1a7-4cd9-bd99-cd893d5d6761-var-log-ovn\") pod \"ovn-controller-47b4k-config-hdjsd\" (UID: \"117985ac-d1a7-4cd9-bd99-cd893d5d6761\") " pod="openstack/ovn-controller-47b4k-config-hdjsd" Dec 09 10:20:24 crc kubenswrapper[5002]: I1209 10:20:24.248084 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/117985ac-d1a7-4cd9-bd99-cd893d5d6761-scripts\") pod \"ovn-controller-47b4k-config-hdjsd\" (UID: \"117985ac-d1a7-4cd9-bd99-cd893d5d6761\") " pod="openstack/ovn-controller-47b4k-config-hdjsd" Dec 09 10:20:24 crc kubenswrapper[5002]: I1209 10:20:24.248146 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/117985ac-d1a7-4cd9-bd99-cd893d5d6761-var-run-ovn\") pod 
\"ovn-controller-47b4k-config-hdjsd\" (UID: \"117985ac-d1a7-4cd9-bd99-cd893d5d6761\") " pod="openstack/ovn-controller-47b4k-config-hdjsd" Dec 09 10:20:24 crc kubenswrapper[5002]: I1209 10:20:24.248191 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-br6sq\" (UniqueName: \"kubernetes.io/projected/117985ac-d1a7-4cd9-bd99-cd893d5d6761-kube-api-access-br6sq\") pod \"ovn-controller-47b4k-config-hdjsd\" (UID: \"117985ac-d1a7-4cd9-bd99-cd893d5d6761\") " pod="openstack/ovn-controller-47b4k-config-hdjsd" Dec 09 10:20:24 crc kubenswrapper[5002]: I1209 10:20:24.248223 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/117985ac-d1a7-4cd9-bd99-cd893d5d6761-additional-scripts\") pod \"ovn-controller-47b4k-config-hdjsd\" (UID: \"117985ac-d1a7-4cd9-bd99-cd893d5d6761\") " pod="openstack/ovn-controller-47b4k-config-hdjsd" Dec 09 10:20:24 crc kubenswrapper[5002]: I1209 10:20:24.248301 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/117985ac-d1a7-4cd9-bd99-cd893d5d6761-var-run\") pod \"ovn-controller-47b4k-config-hdjsd\" (UID: \"117985ac-d1a7-4cd9-bd99-cd893d5d6761\") " pod="openstack/ovn-controller-47b4k-config-hdjsd" Dec 09 10:20:24 crc kubenswrapper[5002]: I1209 10:20:24.248323 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/117985ac-d1a7-4cd9-bd99-cd893d5d6761-var-log-ovn\") pod \"ovn-controller-47b4k-config-hdjsd\" (UID: \"117985ac-d1a7-4cd9-bd99-cd893d5d6761\") " pod="openstack/ovn-controller-47b4k-config-hdjsd" Dec 09 10:20:24 crc kubenswrapper[5002]: I1209 10:20:24.248531 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/117985ac-d1a7-4cd9-bd99-cd893d5d6761-var-log-ovn\") pod \"ovn-controller-47b4k-config-hdjsd\" (UID: \"117985ac-d1a7-4cd9-bd99-cd893d5d6761\") " pod="openstack/ovn-controller-47b4k-config-hdjsd" Dec 09 10:20:24 crc kubenswrapper[5002]: I1209 10:20:24.248550 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/117985ac-d1a7-4cd9-bd99-cd893d5d6761-var-run-ovn\") pod \"ovn-controller-47b4k-config-hdjsd\" (UID: \"117985ac-d1a7-4cd9-bd99-cd893d5d6761\") " pod="openstack/ovn-controller-47b4k-config-hdjsd" Dec 09 10:20:24 crc kubenswrapper[5002]: I1209 10:20:24.248839 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/117985ac-d1a7-4cd9-bd99-cd893d5d6761-var-run\") pod \"ovn-controller-47b4k-config-hdjsd\" (UID: \"117985ac-d1a7-4cd9-bd99-cd893d5d6761\") " pod="openstack/ovn-controller-47b4k-config-hdjsd" Dec 09 10:20:24 crc kubenswrapper[5002]: I1209 10:20:24.249349 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/117985ac-d1a7-4cd9-bd99-cd893d5d6761-additional-scripts\") pod \"ovn-controller-47b4k-config-hdjsd\" (UID: \"117985ac-d1a7-4cd9-bd99-cd893d5d6761\") " pod="openstack/ovn-controller-47b4k-config-hdjsd" Dec 09 10:20:24 crc kubenswrapper[5002]: I1209 10:20:24.250475 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/117985ac-d1a7-4cd9-bd99-cd893d5d6761-scripts\") pod 
\"ovn-controller-47b4k-config-hdjsd\" (UID: \"117985ac-d1a7-4cd9-bd99-cd893d5d6761\") " pod="openstack/ovn-controller-47b4k-config-hdjsd" Dec 09 10:20:24 crc kubenswrapper[5002]: I1209 10:20:24.275607 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-br6sq\" (UniqueName: \"kubernetes.io/projected/117985ac-d1a7-4cd9-bd99-cd893d5d6761-kube-api-access-br6sq\") pod \"ovn-controller-47b4k-config-hdjsd\" (UID: \"117985ac-d1a7-4cd9-bd99-cd893d5d6761\") " pod="openstack/ovn-controller-47b4k-config-hdjsd" Dec 09 10:20:24 crc kubenswrapper[5002]: I1209 10:20:24.406905 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-47b4k-config-hdjsd" Dec 09 10:20:24 crc kubenswrapper[5002]: I1209 10:20:24.678165 5002 generic.go:334] "Generic (PLEG): container finished" podID="0471984e-6e43-485b-8ae1-cd7be7951b89" containerID="3533e415fc3b60a01fbf71e5712338dcea984abc003f940dc73c43c941049ff3" exitCode=0 Dec 09 10:20:24 crc kubenswrapper[5002]: I1209 10:20:24.678205 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-8n4dc" event={"ID":"0471984e-6e43-485b-8ae1-cd7be7951b89","Type":"ContainerDied","Data":"3533e415fc3b60a01fbf71e5712338dcea984abc003f940dc73c43c941049ff3"} Dec 09 10:20:24 crc kubenswrapper[5002]: I1209 10:20:24.901462 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-47b4k-config-hdjsd"] Dec 09 10:20:24 crc kubenswrapper[5002]: I1209 10:20:24.930380 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-8n4dc" podUID="0471984e-6e43-485b-8ae1-cd7be7951b89" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.113:5353: connect: connection refused" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.129921 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-qxbxl" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.167464 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fb9392c-5a4f-4bc2-89b0-2c4b59853cf3-operator-scripts\") pod \"8fb9392c-5a4f-4bc2-89b0-2c4b59853cf3\" (UID: \"8fb9392c-5a4f-4bc2-89b0-2c4b59853cf3\") " Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.167712 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgdwp\" (UniqueName: \"kubernetes.io/projected/8fb9392c-5a4f-4bc2-89b0-2c4b59853cf3-kube-api-access-qgdwp\") pod \"8fb9392c-5a4f-4bc2-89b0-2c4b59853cf3\" (UID: \"8fb9392c-5a4f-4bc2-89b0-2c4b59853cf3\") " Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.168367 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fb9392c-5a4f-4bc2-89b0-2c4b59853cf3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8fb9392c-5a4f-4bc2-89b0-2c4b59853cf3" (UID: "8fb9392c-5a4f-4bc2-89b0-2c4b59853cf3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.178125 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fb9392c-5a4f-4bc2-89b0-2c4b59853cf3-kube-api-access-qgdwp" (OuterVolumeSpecName: "kube-api-access-qgdwp") pod "8fb9392c-5a4f-4bc2-89b0-2c4b59853cf3" (UID: "8fb9392c-5a4f-4bc2-89b0-2c4b59853cf3"). 
InnerVolumeSpecName "kube-api-access-qgdwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.273429 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fb9392c-5a4f-4bc2-89b0-2c4b59853cf3-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.273472 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgdwp\" (UniqueName: \"kubernetes.io/projected/8fb9392c-5a4f-4bc2-89b0-2c4b59853cf3-kube-api-access-qgdwp\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.336675 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-60c5-account-create-update-wp67w" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.344928 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-a775-account-create-update-tb8ck" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.349968 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-d7hdr" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.368573 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-sdzqj" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.375496 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgrvr\" (UniqueName: \"kubernetes.io/projected/c14e584f-7d75-42a6-b6a0-1c3931b022ec-kube-api-access-lgrvr\") pod \"c14e584f-7d75-42a6-b6a0-1c3931b022ec\" (UID: \"c14e584f-7d75-42a6-b6a0-1c3931b022ec\") " Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.375603 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31e0f58f-0655-466b-90b0-5b0e1887fe75-operator-scripts\") pod \"31e0f58f-0655-466b-90b0-5b0e1887fe75\" (UID: \"31e0f58f-0655-466b-90b0-5b0e1887fe75\") " Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.375718 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mctrk\" (UniqueName: \"kubernetes.io/projected/31e0f58f-0655-466b-90b0-5b0e1887fe75-kube-api-access-mctrk\") pod \"31e0f58f-0655-466b-90b0-5b0e1887fe75\" (UID: \"31e0f58f-0655-466b-90b0-5b0e1887fe75\") " Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.375765 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc666b29-bbbf-4206-ae4d-7d7e52542577-operator-scripts\") pod \"bc666b29-bbbf-4206-ae4d-7d7e52542577\" (UID: \"bc666b29-bbbf-4206-ae4d-7d7e52542577\") " Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.375787 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c14e584f-7d75-42a6-b6a0-1c3931b022ec-operator-scripts\") pod \"c14e584f-7d75-42a6-b6a0-1c3931b022ec\" (UID: \"c14e584f-7d75-42a6-b6a0-1c3931b022ec\") " Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.375841 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mvrr\" (UniqueName: \"kubernetes.io/projected/bc666b29-bbbf-4206-ae4d-7d7e52542577-kube-api-access-2mvrr\") pod 
\"bc666b29-bbbf-4206-ae4d-7d7e52542577\" (UID: \"bc666b29-bbbf-4206-ae4d-7d7e52542577\") " Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.376210 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31e0f58f-0655-466b-90b0-5b0e1887fe75-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "31e0f58f-0655-466b-90b0-5b0e1887fe75" (UID: "31e0f58f-0655-466b-90b0-5b0e1887fe75"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.380142 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc666b29-bbbf-4206-ae4d-7d7e52542577-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bc666b29-bbbf-4206-ae4d-7d7e52542577" (UID: "bc666b29-bbbf-4206-ae4d-7d7e52542577"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.380498 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c14e584f-7d75-42a6-b6a0-1c3931b022ec-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c14e584f-7d75-42a6-b6a0-1c3931b022ec" (UID: "c14e584f-7d75-42a6-b6a0-1c3931b022ec"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.383258 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-9kc52" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.386115 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc666b29-bbbf-4206-ae4d-7d7e52542577-kube-api-access-2mvrr" (OuterVolumeSpecName: "kube-api-access-2mvrr") pod "bc666b29-bbbf-4206-ae4d-7d7e52542577" (UID: "bc666b29-bbbf-4206-ae4d-7d7e52542577"). InnerVolumeSpecName "kube-api-access-2mvrr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.386377 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c14e584f-7d75-42a6-b6a0-1c3931b022ec-kube-api-access-lgrvr" (OuterVolumeSpecName: "kube-api-access-lgrvr") pod "c14e584f-7d75-42a6-b6a0-1c3931b022ec" (UID: "c14e584f-7d75-42a6-b6a0-1c3931b022ec"). InnerVolumeSpecName "kube-api-access-lgrvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.387658 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-10b7-account-create-update-9bwfh" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.394929 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31e0f58f-0655-466b-90b0-5b0e1887fe75-kube-api-access-mctrk" (OuterVolumeSpecName: "kube-api-access-mctrk") pod "31e0f58f-0655-466b-90b0-5b0e1887fe75" (UID: "31e0f58f-0655-466b-90b0-5b0e1887fe75"). InnerVolumeSpecName "kube-api-access-mctrk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.476602 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4c575708-ef27-4116-8eb1-9eae1aae903f-dispersionconf\") pod \"4c575708-ef27-4116-8eb1-9eae1aae903f\" (UID: \"4c575708-ef27-4116-8eb1-9eae1aae903f\") " Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.476878 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rczrz\" (UniqueName: \"kubernetes.io/projected/4c575708-ef27-4116-8eb1-9eae1aae903f-kube-api-access-rczrz\") pod \"4c575708-ef27-4116-8eb1-9eae1aae903f\" (UID: \"4c575708-ef27-4116-8eb1-9eae1aae903f\") " Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.476959 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4c575708-ef27-4116-8eb1-9eae1aae903f-swiftconf\") pod \"4c575708-ef27-4116-8eb1-9eae1aae903f\" (UID: \"4c575708-ef27-4116-8eb1-9eae1aae903f\") " Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.476976 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4c575708-ef27-4116-8eb1-9eae1aae903f-etc-swift\") pod \"4c575708-ef27-4116-8eb1-9eae1aae903f\" (UID: \"4c575708-ef27-4116-8eb1-9eae1aae903f\") " Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.477044 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzsrw\" (UniqueName: \"kubernetes.io/projected/c0d368da-0627-4c5e-ad8e-821bbc205874-kube-api-access-jzsrw\") pod \"c0d368da-0627-4c5e-ad8e-821bbc205874\" (UID: \"c0d368da-0627-4c5e-ad8e-821bbc205874\") " Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.477061 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0d368da-0627-4c5e-ad8e-821bbc205874-operator-scripts\") pod \"c0d368da-0627-4c5e-ad8e-821bbc205874\" (UID: \"c0d368da-0627-4c5e-ad8e-821bbc205874\") " Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.477135 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4c575708-ef27-4116-8eb1-9eae1aae903f-scripts\") pod \"4c575708-ef27-4116-8eb1-9eae1aae903f\" (UID: \"4c575708-ef27-4116-8eb1-9eae1aae903f\") " Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.477175 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e4c9601-2d18-4b05-9187-f668fb760808-operator-scripts\") pod \"1e4c9601-2d18-4b05-9187-f668fb760808\" (UID: \"1e4c9601-2d18-4b05-9187-f668fb760808\") " Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.477190 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c575708-ef27-4116-8eb1-9eae1aae903f-combined-ca-bundle\") pod \"4c575708-ef27-4116-8eb1-9eae1aae903f\" (UID: \"4c575708-ef27-4116-8eb1-9eae1aae903f\") " Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.477205 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4c575708-ef27-4116-8eb1-9eae1aae903f-ring-data-devices\") pod \"4c575708-ef27-4116-8eb1-9eae1aae903f\" 
(UID: \"4c575708-ef27-4116-8eb1-9eae1aae903f\") " Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.477229 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpsc8\" (UniqueName: \"kubernetes.io/projected/1e4c9601-2d18-4b05-9187-f668fb760808-kube-api-access-cpsc8\") pod \"1e4c9601-2d18-4b05-9187-f668fb760808\" (UID: \"1e4c9601-2d18-4b05-9187-f668fb760808\") " Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.477538 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31e0f58f-0655-466b-90b0-5b0e1887fe75-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.477555 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mctrk\" (UniqueName: \"kubernetes.io/projected/31e0f58f-0655-466b-90b0-5b0e1887fe75-kube-api-access-mctrk\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.477565 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc666b29-bbbf-4206-ae4d-7d7e52542577-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.477575 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c14e584f-7d75-42a6-b6a0-1c3931b022ec-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.477608 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mvrr\" (UniqueName: \"kubernetes.io/projected/bc666b29-bbbf-4206-ae4d-7d7e52542577-kube-api-access-2mvrr\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.477618 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgrvr\" (UniqueName: \"kubernetes.io/projected/c14e584f-7d75-42a6-b6a0-1c3931b022ec-kube-api-access-lgrvr\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.478302 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c575708-ef27-4116-8eb1-9eae1aae903f-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "4c575708-ef27-4116-8eb1-9eae1aae903f" (UID: "4c575708-ef27-4116-8eb1-9eae1aae903f"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.479552 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c575708-ef27-4116-8eb1-9eae1aae903f-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "4c575708-ef27-4116-8eb1-9eae1aae903f" (UID: "4c575708-ef27-4116-8eb1-9eae1aae903f"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.479982 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e4c9601-2d18-4b05-9187-f668fb760808-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1e4c9601-2d18-4b05-9187-f668fb760808" (UID: "1e4c9601-2d18-4b05-9187-f668fb760808"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.480653 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0d368da-0627-4c5e-ad8e-821bbc205874-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c0d368da-0627-4c5e-ad8e-821bbc205874" (UID: "c0d368da-0627-4c5e-ad8e-821bbc205874"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.481209 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c575708-ef27-4116-8eb1-9eae1aae903f-kube-api-access-rczrz" (OuterVolumeSpecName: "kube-api-access-rczrz") pod "4c575708-ef27-4116-8eb1-9eae1aae903f" (UID: "4c575708-ef27-4116-8eb1-9eae1aae903f"). InnerVolumeSpecName "kube-api-access-rczrz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.481906 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e4c9601-2d18-4b05-9187-f668fb760808-kube-api-access-cpsc8" (OuterVolumeSpecName: "kube-api-access-cpsc8") pod "1e4c9601-2d18-4b05-9187-f668fb760808" (UID: "1e4c9601-2d18-4b05-9187-f668fb760808"). InnerVolumeSpecName "kube-api-access-cpsc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.483615 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0d368da-0627-4c5e-ad8e-821bbc205874-kube-api-access-jzsrw" (OuterVolumeSpecName: "kube-api-access-jzsrw") pod "c0d368da-0627-4c5e-ad8e-821bbc205874" (UID: "c0d368da-0627-4c5e-ad8e-821bbc205874"). InnerVolumeSpecName "kube-api-access-jzsrw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.488338 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c575708-ef27-4116-8eb1-9eae1aae903f-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "4c575708-ef27-4116-8eb1-9eae1aae903f" (UID: "4c575708-ef27-4116-8eb1-9eae1aae903f"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.501721 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c575708-ef27-4116-8eb1-9eae1aae903f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4c575708-ef27-4116-8eb1-9eae1aae903f" (UID: "4c575708-ef27-4116-8eb1-9eae1aae903f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.502429 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c575708-ef27-4116-8eb1-9eae1aae903f-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "4c575708-ef27-4116-8eb1-9eae1aae903f" (UID: "4c575708-ef27-4116-8eb1-9eae1aae903f"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.506031 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c575708-ef27-4116-8eb1-9eae1aae903f-scripts" (OuterVolumeSpecName: "scripts") pod "4c575708-ef27-4116-8eb1-9eae1aae903f" (UID: "4c575708-ef27-4116-8eb1-9eae1aae903f"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.579371 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4c575708-ef27-4116-8eb1-9eae1aae903f-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.579406 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e4c9601-2d18-4b05-9187-f668fb760808-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.579416 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c575708-ef27-4116-8eb1-9eae1aae903f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.579428 5002 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4c575708-ef27-4116-8eb1-9eae1aae903f-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.579437 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpsc8\" (UniqueName: \"kubernetes.io/projected/1e4c9601-2d18-4b05-9187-f668fb760808-kube-api-access-cpsc8\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.579447 5002 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4c575708-ef27-4116-8eb1-9eae1aae903f-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.579457 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rczrz\" (UniqueName: \"kubernetes.io/projected/4c575708-ef27-4116-8eb1-9eae1aae903f-kube-api-access-rczrz\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.579468 5002 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4c575708-ef27-4116-8eb1-9eae1aae903f-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.579476 5002 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4c575708-ef27-4116-8eb1-9eae1aae903f-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.579484 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzsrw\" (UniqueName: \"kubernetes.io/projected/c0d368da-0627-4c5e-ad8e-821bbc205874-kube-api-access-jzsrw\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.579492 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0d368da-0627-4c5e-ad8e-821bbc205874-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.687893 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a775-account-create-update-tb8ck" event={"ID":"c14e584f-7d75-42a6-b6a0-1c3931b022ec","Type":"ContainerDied","Data":"bea5cafa1daac96258a19b95a3d122d5a588134bd2984bafc6d5122c0c841326"} Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.687944 5002 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="bea5cafa1daac96258a19b95a3d122d5a588134bd2984bafc6d5122c0c841326" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.688174 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-a775-account-create-update-tb8ck" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.689681 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-qxbxl" event={"ID":"8fb9392c-5a4f-4bc2-89b0-2c4b59853cf3","Type":"ContainerDied","Data":"ad82d10f01162f50a29e61cb7b27113087f33c49514b69d5d8b458378c09a456"} Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.689723 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad82d10f01162f50a29e61cb7b27113087f33c49514b69d5d8b458378c09a456" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.689690 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-qxbxl" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.691497 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-d7hdr" event={"ID":"31e0f58f-0655-466b-90b0-5b0e1887fe75","Type":"ContainerDied","Data":"c259b3f6a79941f7e967e04f53355e4f41d3b5d567579e327e2baf98fda39b9f"} Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.691523 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c259b3f6a79941f7e967e04f53355e4f41d3b5d567579e327e2baf98fda39b9f" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.691553 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-d7hdr" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.706757 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-9kc52" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.706792 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-9kc52" event={"ID":"1e4c9601-2d18-4b05-9187-f668fb760808","Type":"ContainerDied","Data":"a4b38a042c5d84e1d106259db02e0f699e7bf41bb04cfdd955804f49acd24c42"} Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.706860 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4b38a042c5d84e1d106259db02e0f699e7bf41bb04cfdd955804f49acd24c42" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.711410 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-47b4k-config-hdjsd" event={"ID":"117985ac-d1a7-4cd9-bd99-cd893d5d6761","Type":"ContainerStarted","Data":"de78610ea99dbf9186f16c4b2d222c0cca1a210d9c33a9549a45ec2ebebd54c8"} Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.716617 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-60c5-account-create-update-wp67w" event={"ID":"bc666b29-bbbf-4206-ae4d-7d7e52542577","Type":"ContainerDied","Data":"c1917c3584aab604d06d560b43f333f320324a406bdb83a43f7554469d9e15b3"} Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.716664 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1917c3584aab604d06d560b43f333f320324a406bdb83a43f7554469d9e15b3" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.717463 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-60c5-account-create-update-wp67w" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.719304 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-sdzqj" event={"ID":"4c575708-ef27-4116-8eb1-9eae1aae903f","Type":"ContainerDied","Data":"10a1fcf3eebee9467f6df85f740fbe292392c452666f131dfc8edd6a1b31a13c"} Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.719367 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10a1fcf3eebee9467f6df85f740fbe292392c452666f131dfc8edd6a1b31a13c" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.719444 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-sdzqj" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.757159 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-10b7-account-create-update-9bwfh" event={"ID":"c0d368da-0627-4c5e-ad8e-821bbc205874","Type":"ContainerDied","Data":"a3a69139d2f48455ebb78400f7918e97017c5a96ba41cfcf94fe0d653a0d9119"} Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.757211 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3a69139d2f48455ebb78400f7918e97017c5a96ba41cfcf94fe0d653a0d9119" Dec 09 10:20:25 crc kubenswrapper[5002]: I1209 10:20:25.757280 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-10b7-account-create-update-9bwfh" Dec 09 10:20:26 crc kubenswrapper[5002]: I1209 10:20:26.326349 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-8n4dc" Dec 09 10:20:26 crc kubenswrapper[5002]: I1209 10:20:26.404087 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0471984e-6e43-485b-8ae1-cd7be7951b89-config\") pod \"0471984e-6e43-485b-8ae1-cd7be7951b89\" (UID: \"0471984e-6e43-485b-8ae1-cd7be7951b89\") " Dec 09 10:20:26 crc kubenswrapper[5002]: I1209 10:20:26.404470 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdd2q\" (UniqueName: \"kubernetes.io/projected/0471984e-6e43-485b-8ae1-cd7be7951b89-kube-api-access-vdd2q\") pod \"0471984e-6e43-485b-8ae1-cd7be7951b89\" (UID: \"0471984e-6e43-485b-8ae1-cd7be7951b89\") " Dec 09 10:20:26 crc kubenswrapper[5002]: I1209 10:20:26.404545 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0471984e-6e43-485b-8ae1-cd7be7951b89-dns-svc\") pod \"0471984e-6e43-485b-8ae1-cd7be7951b89\" (UID: \"0471984e-6e43-485b-8ae1-cd7be7951b89\") " Dec 09 10:20:26 crc kubenswrapper[5002]: I1209 10:20:26.404574 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0471984e-6e43-485b-8ae1-cd7be7951b89-ovsdbserver-sb\") pod \"0471984e-6e43-485b-8ae1-cd7be7951b89\" (UID: \"0471984e-6e43-485b-8ae1-cd7be7951b89\") " Dec 09 10:20:26 crc kubenswrapper[5002]: I1209 10:20:26.404673 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0471984e-6e43-485b-8ae1-cd7be7951b89-ovsdbserver-nb\") pod \"0471984e-6e43-485b-8ae1-cd7be7951b89\" (UID: \"0471984e-6e43-485b-8ae1-cd7be7951b89\") " Dec 09 10:20:26 crc kubenswrapper[5002]: I1209 10:20:26.414462 
5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0471984e-6e43-485b-8ae1-cd7be7951b89-kube-api-access-vdd2q" (OuterVolumeSpecName: "kube-api-access-vdd2q") pod "0471984e-6e43-485b-8ae1-cd7be7951b89" (UID: "0471984e-6e43-485b-8ae1-cd7be7951b89"). InnerVolumeSpecName "kube-api-access-vdd2q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:20:26 crc kubenswrapper[5002]: I1209 10:20:26.450498 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0471984e-6e43-485b-8ae1-cd7be7951b89-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0471984e-6e43-485b-8ae1-cd7be7951b89" (UID: "0471984e-6e43-485b-8ae1-cd7be7951b89"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:20:26 crc kubenswrapper[5002]: I1209 10:20:26.460575 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0471984e-6e43-485b-8ae1-cd7be7951b89-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0471984e-6e43-485b-8ae1-cd7be7951b89" (UID: "0471984e-6e43-485b-8ae1-cd7be7951b89"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:20:26 crc kubenswrapper[5002]: I1209 10:20:26.462241 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0471984e-6e43-485b-8ae1-cd7be7951b89-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0471984e-6e43-485b-8ae1-cd7be7951b89" (UID: "0471984e-6e43-485b-8ae1-cd7be7951b89"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:20:26 crc kubenswrapper[5002]: I1209 10:20:26.467043 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0471984e-6e43-485b-8ae1-cd7be7951b89-config" (OuterVolumeSpecName: "config") pod "0471984e-6e43-485b-8ae1-cd7be7951b89" (UID: "0471984e-6e43-485b-8ae1-cd7be7951b89"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:20:26 crc kubenswrapper[5002]: I1209 10:20:26.506492 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdd2q\" (UniqueName: \"kubernetes.io/projected/0471984e-6e43-485b-8ae1-cd7be7951b89-kube-api-access-vdd2q\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:26 crc kubenswrapper[5002]: I1209 10:20:26.506525 5002 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0471984e-6e43-485b-8ae1-cd7be7951b89-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:26 crc kubenswrapper[5002]: I1209 10:20:26.506533 5002 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0471984e-6e43-485b-8ae1-cd7be7951b89-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:26 crc kubenswrapper[5002]: I1209 10:20:26.506542 5002 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0471984e-6e43-485b-8ae1-cd7be7951b89-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:26 crc kubenswrapper[5002]: I1209 10:20:26.506550 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0471984e-6e43-485b-8ae1-cd7be7951b89-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:26 crc kubenswrapper[5002]: I1209 10:20:26.766008 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-8n4dc" Dec 09 10:20:26 crc kubenswrapper[5002]: I1209 10:20:26.766031 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-8n4dc" event={"ID":"0471984e-6e43-485b-8ae1-cd7be7951b89","Type":"ContainerDied","Data":"0d7663bb9ceca220b586c60cc11077984640349dd2311f7ec0d5bc426736c709"} Dec 09 10:20:26 crc kubenswrapper[5002]: I1209 10:20:26.766089 5002 scope.go:117] "RemoveContainer" containerID="3533e415fc3b60a01fbf71e5712338dcea984abc003f940dc73c43c941049ff3" Dec 09 10:20:26 crc kubenswrapper[5002]: I1209 10:20:26.768300 5002 generic.go:334] "Generic (PLEG): container finished" podID="117985ac-d1a7-4cd9-bd99-cd893d5d6761" containerID="476934525d50946ec2cb341c1e2b13e82d6ab250a0f2cde25a67c44f24103f24" exitCode=0 Dec 09 10:20:26 crc kubenswrapper[5002]: I1209 10:20:26.768335 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-47b4k-config-hdjsd" event={"ID":"117985ac-d1a7-4cd9-bd99-cd893d5d6761","Type":"ContainerDied","Data":"476934525d50946ec2cb341c1e2b13e82d6ab250a0f2cde25a67c44f24103f24"} Dec 09 10:20:26 crc kubenswrapper[5002]: I1209 10:20:26.784123 5002 scope.go:117] "RemoveContainer" containerID="2eb60917a5ad3055737c86a738d3e854d369bc3db6eaace9f147dbb05a1e54b5" Dec 09 10:20:26 crc kubenswrapper[5002]: I1209 10:20:26.817165 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-8n4dc"] Dec 09 10:20:26 crc kubenswrapper[5002]: I1209 10:20:26.827152 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-8n4dc"] Dec 09 10:20:26 crc kubenswrapper[5002]: I1209 10:20:26.993672 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 09 10:20:27 crc kubenswrapper[5002]: I1209 10:20:27.180927 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-h2xt6"] Dec 09 10:20:27 crc kubenswrapper[5002]: E1209 10:20:27.181334 5002 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="1e4c9601-2d18-4b05-9187-f668fb760808" containerName="mariadb-database-create" Dec 09 10:20:27 crc kubenswrapper[5002]: I1209 10:20:27.181357 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e4c9601-2d18-4b05-9187-f668fb760808" containerName="mariadb-database-create" Dec 09 10:20:27 crc kubenswrapper[5002]: E1209 10:20:27.181368 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0d368da-0627-4c5e-ad8e-821bbc205874" containerName="mariadb-account-create-update" Dec 09 10:20:27 crc kubenswrapper[5002]: I1209 10:20:27.181377 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0d368da-0627-4c5e-ad8e-821bbc205874" containerName="mariadb-account-create-update" Dec 09 10:20:27 crc kubenswrapper[5002]: E1209 10:20:27.181394 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0471984e-6e43-485b-8ae1-cd7be7951b89" containerName="init" Dec 09 10:20:27 crc kubenswrapper[5002]: I1209 10:20:27.181402 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="0471984e-6e43-485b-8ae1-cd7be7951b89" containerName="init" Dec 09 10:20:27 crc kubenswrapper[5002]: E1209 10:20:27.181416 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc666b29-bbbf-4206-ae4d-7d7e52542577" containerName="mariadb-account-create-update" Dec 09 10:20:27 crc kubenswrapper[5002]: I1209 10:20:27.181423 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc666b29-bbbf-4206-ae4d-7d7e52542577" containerName="mariadb-account-create-update" Dec 09 10:20:27 crc kubenswrapper[5002]: E1209 10:20:27.181438 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c575708-ef27-4116-8eb1-9eae1aae903f" containerName="swift-ring-rebalance" Dec 09 10:20:27 crc kubenswrapper[5002]: I1209 10:20:27.181446 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c575708-ef27-4116-8eb1-9eae1aae903f" containerName="swift-ring-rebalance" Dec 09 10:20:27 crc kubenswrapper[5002]: E1209 10:20:27.181462 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31e0f58f-0655-466b-90b0-5b0e1887fe75" containerName="mariadb-database-create" Dec 09 10:20:27 crc kubenswrapper[5002]: I1209 10:20:27.181471 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="31e0f58f-0655-466b-90b0-5b0e1887fe75" containerName="mariadb-database-create" Dec 09 10:20:27 crc kubenswrapper[5002]: E1209 10:20:27.181487 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c14e584f-7d75-42a6-b6a0-1c3931b022ec" containerName="mariadb-account-create-update" Dec 09 10:20:27 crc kubenswrapper[5002]: I1209 10:20:27.181495 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="c14e584f-7d75-42a6-b6a0-1c3931b022ec" containerName="mariadb-account-create-update" Dec 09 10:20:27 crc kubenswrapper[5002]: E1209 10:20:27.181509 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fb9392c-5a4f-4bc2-89b0-2c4b59853cf3" containerName="mariadb-database-create" Dec 09 10:20:27 crc kubenswrapper[5002]: I1209 10:20:27.181516 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fb9392c-5a4f-4bc2-89b0-2c4b59853cf3" containerName="mariadb-database-create" Dec 09 10:20:27 crc kubenswrapper[5002]: E1209 10:20:27.181529 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0471984e-6e43-485b-8ae1-cd7be7951b89" containerName="dnsmasq-dns" Dec 09 10:20:27 crc kubenswrapper[5002]: I1209 10:20:27.181535 5002 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0471984e-6e43-485b-8ae1-cd7be7951b89" containerName="dnsmasq-dns" Dec 09 10:20:27 crc kubenswrapper[5002]: I1209 10:20:27.181721 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c575708-ef27-4116-8eb1-9eae1aae903f" containerName="swift-ring-rebalance" Dec 09 10:20:27 crc kubenswrapper[5002]: I1209 10:20:27.181740 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="31e0f58f-0655-466b-90b0-5b0e1887fe75" containerName="mariadb-database-create" Dec 09 10:20:27 crc kubenswrapper[5002]: I1209 10:20:27.181762 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="0471984e-6e43-485b-8ae1-cd7be7951b89" containerName="dnsmasq-dns" Dec 09 10:20:27 crc kubenswrapper[5002]: I1209 10:20:27.181774 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="c14e584f-7d75-42a6-b6a0-1c3931b022ec" containerName="mariadb-account-create-update" Dec 09 10:20:27 crc kubenswrapper[5002]: I1209 10:20:27.181784 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fb9392c-5a4f-4bc2-89b0-2c4b59853cf3" containerName="mariadb-database-create" Dec 09 10:20:27 crc kubenswrapper[5002]: I1209 10:20:27.181799 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc666b29-bbbf-4206-ae4d-7d7e52542577" containerName="mariadb-account-create-update" Dec 09 10:20:27 crc kubenswrapper[5002]: I1209 10:20:27.181808 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e4c9601-2d18-4b05-9187-f668fb760808" containerName="mariadb-database-create" Dec 09 10:20:27 crc kubenswrapper[5002]: I1209 10:20:27.181840 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0d368da-0627-4c5e-ad8e-821bbc205874" containerName="mariadb-account-create-update" Dec 09 10:20:27 crc kubenswrapper[5002]: I1209 10:20:27.182549 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-h2xt6" Dec 09 10:20:27 crc kubenswrapper[5002]: I1209 10:20:27.186714 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 09 10:20:27 crc kubenswrapper[5002]: I1209 10:20:27.186978 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-55wxd" Dec 09 10:20:27 crc kubenswrapper[5002]: I1209 10:20:27.187847 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-h2xt6"] Dec 09 10:20:27 crc kubenswrapper[5002]: I1209 10:20:27.216259 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d846942e-6d0d-4e42-a584-a910a56d9718-db-sync-config-data\") pod \"glance-db-sync-h2xt6\" (UID: \"d846942e-6d0d-4e42-a584-a910a56d9718\") " pod="openstack/glance-db-sync-h2xt6" Dec 09 10:20:27 crc kubenswrapper[5002]: I1209 10:20:27.216337 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d846942e-6d0d-4e42-a584-a910a56d9718-config-data\") pod \"glance-db-sync-h2xt6\" (UID: \"d846942e-6d0d-4e42-a584-a910a56d9718\") " pod="openstack/glance-db-sync-h2xt6" Dec 09 10:20:27 crc kubenswrapper[5002]: I1209 10:20:27.216363 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d846942e-6d0d-4e42-a584-a910a56d9718-combined-ca-bundle\") pod \"glance-db-sync-h2xt6\" (UID: \"d846942e-6d0d-4e42-a584-a910a56d9718\") " pod="openstack/glance-db-sync-h2xt6" Dec 09 10:20:27 crc kubenswrapper[5002]: I1209 10:20:27.216447 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmfs8\" (UniqueName: \"kubernetes.io/projected/d846942e-6d0d-4e42-a584-a910a56d9718-kube-api-access-zmfs8\") pod \"glance-db-sync-h2xt6\" (UID: \"d846942e-6d0d-4e42-a584-a910a56d9718\") " pod="openstack/glance-db-sync-h2xt6" Dec 09 10:20:27 crc kubenswrapper[5002]: I1209 10:20:27.318224 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d846942e-6d0d-4e42-a584-a910a56d9718-combined-ca-bundle\") pod \"glance-db-sync-h2xt6\" (UID: \"d846942e-6d0d-4e42-a584-a910a56d9718\") " pod="openstack/glance-db-sync-h2xt6" Dec 09 10:20:27 crc kubenswrapper[5002]: I1209 10:20:27.319222 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmfs8\" (UniqueName: \"kubernetes.io/projected/d846942e-6d0d-4e42-a584-a910a56d9718-kube-api-access-zmfs8\") pod \"glance-db-sync-h2xt6\" (UID: \"d846942e-6d0d-4e42-a584-a910a56d9718\") " pod="openstack/glance-db-sync-h2xt6" Dec 09 10:20:27 crc kubenswrapper[5002]: I1209 10:20:27.319310 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d846942e-6d0d-4e42-a584-a910a56d9718-db-sync-config-data\") pod \"glance-db-sync-h2xt6\" (UID: \"d846942e-6d0d-4e42-a584-a910a56d9718\") " pod="openstack/glance-db-sync-h2xt6" Dec 09 10:20:27 crc kubenswrapper[5002]: I1209 10:20:27.319378 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d846942e-6d0d-4e42-a584-a910a56d9718-config-data\") pod 
\"glance-db-sync-h2xt6\" (UID: \"d846942e-6d0d-4e42-a584-a910a56d9718\") " pod="openstack/glance-db-sync-h2xt6" Dec 09 10:20:27 crc kubenswrapper[5002]: I1209 10:20:27.321890 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d846942e-6d0d-4e42-a584-a910a56d9718-combined-ca-bundle\") pod \"glance-db-sync-h2xt6\" (UID: \"d846942e-6d0d-4e42-a584-a910a56d9718\") " pod="openstack/glance-db-sync-h2xt6" Dec 09 10:20:27 crc kubenswrapper[5002]: I1209 10:20:27.323040 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d846942e-6d0d-4e42-a584-a910a56d9718-config-data\") pod \"glance-db-sync-h2xt6\" (UID: \"d846942e-6d0d-4e42-a584-a910a56d9718\") " pod="openstack/glance-db-sync-h2xt6" Dec 09 10:20:27 crc kubenswrapper[5002]: I1209 10:20:27.323638 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d846942e-6d0d-4e42-a584-a910a56d9718-db-sync-config-data\") pod \"glance-db-sync-h2xt6\" (UID: \"d846942e-6d0d-4e42-a584-a910a56d9718\") " pod="openstack/glance-db-sync-h2xt6" Dec 09 10:20:27 crc kubenswrapper[5002]: I1209 10:20:27.334988 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmfs8\" (UniqueName: \"kubernetes.io/projected/d846942e-6d0d-4e42-a584-a910a56d9718-kube-api-access-zmfs8\") pod \"glance-db-sync-h2xt6\" (UID: \"d846942e-6d0d-4e42-a584-a910a56d9718\") " pod="openstack/glance-db-sync-h2xt6" Dec 09 10:20:27 crc kubenswrapper[5002]: I1209 10:20:27.497757 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-h2xt6" Dec 09 10:20:28 crc kubenswrapper[5002]: I1209 10:20:28.045083 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-47b4k-config-hdjsd" Dec 09 10:20:28 crc kubenswrapper[5002]: I1209 10:20:28.070800 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0471984e-6e43-485b-8ae1-cd7be7951b89" path="/var/lib/kubelet/pods/0471984e-6e43-485b-8ae1-cd7be7951b89/volumes" Dec 09 10:20:28 crc kubenswrapper[5002]: W1209 10:20:28.098591 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd846942e_6d0d_4e42_a584_a910a56d9718.slice/crio-56b299b470e773b7eef13a6091808032a0b755b9eb90b13c8655a128cb44c6a3 WatchSource:0}: Error finding container 56b299b470e773b7eef13a6091808032a0b755b9eb90b13c8655a128cb44c6a3: Status 404 returned error can't find the container with id 56b299b470e773b7eef13a6091808032a0b755b9eb90b13c8655a128cb44c6a3 Dec 09 10:20:28 crc kubenswrapper[5002]: I1209 10:20:28.118587 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-h2xt6"] Dec 09 10:20:28 crc kubenswrapper[5002]: I1209 10:20:28.132497 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/117985ac-d1a7-4cd9-bd99-cd893d5d6761-additional-scripts\") pod \"117985ac-d1a7-4cd9-bd99-cd893d5d6761\" (UID: \"117985ac-d1a7-4cd9-bd99-cd893d5d6761\") " Dec 09 10:20:28 crc kubenswrapper[5002]: I1209 10:20:28.132547 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-br6sq\" (UniqueName: \"kubernetes.io/projected/117985ac-d1a7-4cd9-bd99-cd893d5d6761-kube-api-access-br6sq\") pod \"117985ac-d1a7-4cd9-bd99-cd893d5d6761\" (UID: \"117985ac-d1a7-4cd9-bd99-cd893d5d6761\") " Dec 09 10:20:28 crc kubenswrapper[5002]: I1209 10:20:28.132619 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/117985ac-d1a7-4cd9-bd99-cd893d5d6761-var-run-ovn\") pod \"117985ac-d1a7-4cd9-bd99-cd893d5d6761\" (UID: \"117985ac-d1a7-4cd9-bd99-cd893d5d6761\") " Dec 09 10:20:28 crc kubenswrapper[5002]: I1209 10:20:28.132639 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/117985ac-d1a7-4cd9-bd99-cd893d5d6761-var-log-ovn\") pod \"117985ac-d1a7-4cd9-bd99-cd893d5d6761\" (UID: \"117985ac-d1a7-4cd9-bd99-cd893d5d6761\") " Dec 09 10:20:28 crc kubenswrapper[5002]: I1209 10:20:28.132661 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/117985ac-d1a7-4cd9-bd99-cd893d5d6761-scripts\") pod \"117985ac-d1a7-4cd9-bd99-cd893d5d6761\" (UID: \"117985ac-d1a7-4cd9-bd99-cd893d5d6761\") " Dec 09 10:20:28 crc kubenswrapper[5002]: I1209 10:20:28.132708 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/117985ac-d1a7-4cd9-bd99-cd893d5d6761-var-run\") pod \"117985ac-d1a7-4cd9-bd99-cd893d5d6761\" (UID: \"117985ac-d1a7-4cd9-bd99-cd893d5d6761\") " Dec 09 10:20:28 crc kubenswrapper[5002]: I1209 10:20:28.133473 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/117985ac-d1a7-4cd9-bd99-cd893d5d6761-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "117985ac-d1a7-4cd9-bd99-cd893d5d6761" (UID: "117985ac-d1a7-4cd9-bd99-cd893d5d6761"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 10:20:28 crc kubenswrapper[5002]: I1209 10:20:28.133795 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/117985ac-d1a7-4cd9-bd99-cd893d5d6761-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "117985ac-d1a7-4cd9-bd99-cd893d5d6761" (UID: "117985ac-d1a7-4cd9-bd99-cd893d5d6761"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 10:20:28 crc kubenswrapper[5002]: I1209 10:20:28.134057 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/117985ac-d1a7-4cd9-bd99-cd893d5d6761-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "117985ac-d1a7-4cd9-bd99-cd893d5d6761" (UID: "117985ac-d1a7-4cd9-bd99-cd893d5d6761"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:20:28 crc kubenswrapper[5002]: I1209 10:20:28.134403 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/117985ac-d1a7-4cd9-bd99-cd893d5d6761-scripts" (OuterVolumeSpecName: "scripts") pod "117985ac-d1a7-4cd9-bd99-cd893d5d6761" (UID: "117985ac-d1a7-4cd9-bd99-cd893d5d6761"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:20:28 crc kubenswrapper[5002]: I1209 10:20:28.134431 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/117985ac-d1a7-4cd9-bd99-cd893d5d6761-var-run" (OuterVolumeSpecName: "var-run") pod "117985ac-d1a7-4cd9-bd99-cd893d5d6761" (UID: "117985ac-d1a7-4cd9-bd99-cd893d5d6761"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 10:20:28 crc kubenswrapper[5002]: I1209 10:20:28.140563 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/117985ac-d1a7-4cd9-bd99-cd893d5d6761-kube-api-access-br6sq" (OuterVolumeSpecName: "kube-api-access-br6sq") pod "117985ac-d1a7-4cd9-bd99-cd893d5d6761" (UID: "117985ac-d1a7-4cd9-bd99-cd893d5d6761"). InnerVolumeSpecName "kube-api-access-br6sq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:20:28 crc kubenswrapper[5002]: I1209 10:20:28.234794 5002 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/117985ac-d1a7-4cd9-bd99-cd893d5d6761-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:28 crc kubenswrapper[5002]: I1209 10:20:28.234856 5002 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/117985ac-d1a7-4cd9-bd99-cd893d5d6761-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:28 crc kubenswrapper[5002]: I1209 10:20:28.234867 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/117985ac-d1a7-4cd9-bd99-cd893d5d6761-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:28 crc kubenswrapper[5002]: I1209 10:20:28.234877 5002 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/117985ac-d1a7-4cd9-bd99-cd893d5d6761-var-run\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:28 crc kubenswrapper[5002]: I1209 10:20:28.234888 5002 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/117985ac-d1a7-4cd9-bd99-cd893d5d6761-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:28 crc kubenswrapper[5002]: I1209 10:20:28.234900 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-br6sq\" (UniqueName: \"kubernetes.io/projected/117985ac-d1a7-4cd9-bd99-cd893d5d6761-kube-api-access-br6sq\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:28 crc kubenswrapper[5002]: I1209 10:20:28.515222 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="9278e14e-2524-4e42-b870-f493ea02ede8" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Dec 09 10:20:28 crc kubenswrapper[5002]: I1209 10:20:28.778397 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-47b4k" Dec 09 10:20:28 crc kubenswrapper[5002]: I1209 10:20:28.787119 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-h2xt6" event={"ID":"d846942e-6d0d-4e42-a584-a910a56d9718","Type":"ContainerStarted","Data":"56b299b470e773b7eef13a6091808032a0b755b9eb90b13c8655a128cb44c6a3"} Dec 09 10:20:28 crc kubenswrapper[5002]: I1209 10:20:28.789622 5002 generic.go:334] "Generic (PLEG): container finished" podID="58c08274-46ea-48be-a135-0c1174cd6135" containerID="700088876c2e92d617598571a2ba75be5d5b0ca2bdb88d2688fcecc1a5db9a68" exitCode=0 Dec 09 10:20:28 crc kubenswrapper[5002]: I1209 10:20:28.789665 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"58c08274-46ea-48be-a135-0c1174cd6135","Type":"ContainerDied","Data":"700088876c2e92d617598571a2ba75be5d5b0ca2bdb88d2688fcecc1a5db9a68"} Dec 09 10:20:28 crc kubenswrapper[5002]: I1209 10:20:28.793936 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-47b4k-config-hdjsd" event={"ID":"117985ac-d1a7-4cd9-bd99-cd893d5d6761","Type":"ContainerDied","Data":"de78610ea99dbf9186f16c4b2d222c0cca1a210d9c33a9549a45ec2ebebd54c8"} Dec 09 10:20:28 crc kubenswrapper[5002]: I1209 10:20:28.793971 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de78610ea99dbf9186f16c4b2d222c0cca1a210d9c33a9549a45ec2ebebd54c8" Dec 09 10:20:28 
crc kubenswrapper[5002]: I1209 10:20:28.794033 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-47b4k-config-hdjsd" Dec 09 10:20:29 crc kubenswrapper[5002]: I1209 10:20:29.216385 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-47b4k-config-hdjsd"] Dec 09 10:20:29 crc kubenswrapper[5002]: I1209 10:20:29.224599 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-47b4k-config-hdjsd"] Dec 09 10:20:29 crc kubenswrapper[5002]: I1209 10:20:29.295798 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-47b4k-config-jl69s"] Dec 09 10:20:29 crc kubenswrapper[5002]: E1209 10:20:29.296154 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="117985ac-d1a7-4cd9-bd99-cd893d5d6761" containerName="ovn-config" Dec 09 10:20:29 crc kubenswrapper[5002]: I1209 10:20:29.296169 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="117985ac-d1a7-4cd9-bd99-cd893d5d6761" containerName="ovn-config" Dec 09 10:20:29 crc kubenswrapper[5002]: I1209 10:20:29.296329 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="117985ac-d1a7-4cd9-bd99-cd893d5d6761" containerName="ovn-config" Dec 09 10:20:29 crc kubenswrapper[5002]: I1209 10:20:29.296853 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-47b4k-config-jl69s" Dec 09 10:20:29 crc kubenswrapper[5002]: I1209 10:20:29.298595 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 09 10:20:29 crc kubenswrapper[5002]: I1209 10:20:29.320249 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-47b4k-config-jl69s"] Dec 09 10:20:29 crc kubenswrapper[5002]: I1209 10:20:29.349866 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f46cddea-99a5-484d-9287-a59c9056bcf9-var-run-ovn\") pod \"ovn-controller-47b4k-config-jl69s\" (UID: \"f46cddea-99a5-484d-9287-a59c9056bcf9\") " pod="openstack/ovn-controller-47b4k-config-jl69s" Dec 09 10:20:29 crc kubenswrapper[5002]: I1209 10:20:29.349916 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f46cddea-99a5-484d-9287-a59c9056bcf9-scripts\") pod \"ovn-controller-47b4k-config-jl69s\" (UID: \"f46cddea-99a5-484d-9287-a59c9056bcf9\") " pod="openstack/ovn-controller-47b4k-config-jl69s" Dec 09 10:20:29 crc kubenswrapper[5002]: I1209 10:20:29.349966 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f46cddea-99a5-484d-9287-a59c9056bcf9-var-run\") pod \"ovn-controller-47b4k-config-jl69s\" (UID: \"f46cddea-99a5-484d-9287-a59c9056bcf9\") " pod="openstack/ovn-controller-47b4k-config-jl69s" Dec 09 10:20:29 crc kubenswrapper[5002]: I1209 10:20:29.349987 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8q6g\" (UniqueName: \"kubernetes.io/projected/f46cddea-99a5-484d-9287-a59c9056bcf9-kube-api-access-j8q6g\") pod \"ovn-controller-47b4k-config-jl69s\" (UID: \"f46cddea-99a5-484d-9287-a59c9056bcf9\") " pod="openstack/ovn-controller-47b4k-config-jl69s" Dec 09 10:20:29 crc kubenswrapper[5002]: I1209 10:20:29.350079 5002 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f46cddea-99a5-484d-9287-a59c9056bcf9-var-log-ovn\") pod \"ovn-controller-47b4k-config-jl69s\" (UID: \"f46cddea-99a5-484d-9287-a59c9056bcf9\") " pod="openstack/ovn-controller-47b4k-config-jl69s" Dec 09 10:20:29 crc kubenswrapper[5002]: I1209 10:20:29.350097 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f46cddea-99a5-484d-9287-a59c9056bcf9-additional-scripts\") pod \"ovn-controller-47b4k-config-jl69s\" (UID: \"f46cddea-99a5-484d-9287-a59c9056bcf9\") " pod="openstack/ovn-controller-47b4k-config-jl69s" Dec 09 10:20:29 crc kubenswrapper[5002]: I1209 10:20:29.451061 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f46cddea-99a5-484d-9287-a59c9056bcf9-additional-scripts\") pod \"ovn-controller-47b4k-config-jl69s\" (UID: \"f46cddea-99a5-484d-9287-a59c9056bcf9\") " pod="openstack/ovn-controller-47b4k-config-jl69s" Dec 09 10:20:29 crc kubenswrapper[5002]: I1209 10:20:29.451187 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f46cddea-99a5-484d-9287-a59c9056bcf9-var-run-ovn\") pod \"ovn-controller-47b4k-config-jl69s\" (UID: \"f46cddea-99a5-484d-9287-a59c9056bcf9\") " pod="openstack/ovn-controller-47b4k-config-jl69s" Dec 09 10:20:29 crc kubenswrapper[5002]: I1209 10:20:29.451237 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f46cddea-99a5-484d-9287-a59c9056bcf9-scripts\") pod \"ovn-controller-47b4k-config-jl69s\" (UID: \"f46cddea-99a5-484d-9287-a59c9056bcf9\") " pod="openstack/ovn-controller-47b4k-config-jl69s" Dec 09 10:20:29 crc kubenswrapper[5002]: I1209 10:20:29.451292 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f46cddea-99a5-484d-9287-a59c9056bcf9-var-run\") pod \"ovn-controller-47b4k-config-jl69s\" (UID: \"f46cddea-99a5-484d-9287-a59c9056bcf9\") " pod="openstack/ovn-controller-47b4k-config-jl69s" Dec 09 10:20:29 crc kubenswrapper[5002]: I1209 10:20:29.451321 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8q6g\" (UniqueName: \"kubernetes.io/projected/f46cddea-99a5-484d-9287-a59c9056bcf9-kube-api-access-j8q6g\") pod \"ovn-controller-47b4k-config-jl69s\" (UID: \"f46cddea-99a5-484d-9287-a59c9056bcf9\") " pod="openstack/ovn-controller-47b4k-config-jl69s" Dec 09 10:20:29 crc kubenswrapper[5002]: I1209 10:20:29.451397 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f46cddea-99a5-484d-9287-a59c9056bcf9-var-log-ovn\") pod \"ovn-controller-47b4k-config-jl69s\" (UID: \"f46cddea-99a5-484d-9287-a59c9056bcf9\") " pod="openstack/ovn-controller-47b4k-config-jl69s" Dec 09 10:20:29 crc kubenswrapper[5002]: I1209 10:20:29.451586 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f46cddea-99a5-484d-9287-a59c9056bcf9-var-log-ovn\") pod \"ovn-controller-47b4k-config-jl69s\" (UID: \"f46cddea-99a5-484d-9287-a59c9056bcf9\") " pod="openstack/ovn-controller-47b4k-config-jl69s" Dec 09 10:20:29 crc kubenswrapper[5002]: I1209 
10:20:29.451612 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f46cddea-99a5-484d-9287-a59c9056bcf9-var-run-ovn\") pod \"ovn-controller-47b4k-config-jl69s\" (UID: \"f46cddea-99a5-484d-9287-a59c9056bcf9\") " pod="openstack/ovn-controller-47b4k-config-jl69s" Dec 09 10:20:29 crc kubenswrapper[5002]: I1209 10:20:29.451733 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f46cddea-99a5-484d-9287-a59c9056bcf9-additional-scripts\") pod \"ovn-controller-47b4k-config-jl69s\" (UID: \"f46cddea-99a5-484d-9287-a59c9056bcf9\") " pod="openstack/ovn-controller-47b4k-config-jl69s" Dec 09 10:20:29 crc kubenswrapper[5002]: I1209 10:20:29.451737 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f46cddea-99a5-484d-9287-a59c9056bcf9-var-run\") pod \"ovn-controller-47b4k-config-jl69s\" (UID: \"f46cddea-99a5-484d-9287-a59c9056bcf9\") " pod="openstack/ovn-controller-47b4k-config-jl69s" Dec 09 10:20:29 crc kubenswrapper[5002]: I1209 10:20:29.453535 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f46cddea-99a5-484d-9287-a59c9056bcf9-scripts\") pod \"ovn-controller-47b4k-config-jl69s\" (UID: \"f46cddea-99a5-484d-9287-a59c9056bcf9\") " pod="openstack/ovn-controller-47b4k-config-jl69s" Dec 09 10:20:29 crc kubenswrapper[5002]: I1209 10:20:29.476011 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8q6g\" (UniqueName: \"kubernetes.io/projected/f46cddea-99a5-484d-9287-a59c9056bcf9-kube-api-access-j8q6g\") pod \"ovn-controller-47b4k-config-jl69s\" (UID: \"f46cddea-99a5-484d-9287-a59c9056bcf9\") " pod="openstack/ovn-controller-47b4k-config-jl69s" Dec 09 10:20:29 crc kubenswrapper[5002]: I1209 10:20:29.622132 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-47b4k-config-jl69s" Dec 09 10:20:29 crc kubenswrapper[5002]: I1209 10:20:29.804953 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"58c08274-46ea-48be-a135-0c1174cd6135","Type":"ContainerStarted","Data":"b05714ada64dee7eaed39017f863e151b219f928c230aa2f336910df9726668b"} Dec 09 10:20:29 crc kubenswrapper[5002]: I1209 10:20:29.805890 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 09 10:20:29 crc kubenswrapper[5002]: I1209 10:20:29.833518 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=-9223371963.02128 podStartE2EDuration="1m13.833496276s" podCreationTimestamp="2025-12-09 10:19:16 +0000 UTC" firstStartedPulling="2025-12-09 10:19:18.806710239 +0000 UTC m=+1091.198761320" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:20:29.824868993 +0000 UTC m=+1162.216920074" watchObservedRunningTime="2025-12-09 10:20:29.833496276 +0000 UTC m=+1162.225547357" Dec 09 10:20:30 crc kubenswrapper[5002]: I1209 10:20:30.071443 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="117985ac-d1a7-4cd9-bd99-cd893d5d6761" path="/var/lib/kubelet/pods/117985ac-d1a7-4cd9-bd99-cd893d5d6761/volumes" Dec 09 10:20:30 crc kubenswrapper[5002]: I1209 10:20:30.117031 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-47b4k-config-jl69s"] Dec 09 10:20:30 crc kubenswrapper[5002]: W1209 10:20:30.126608 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf46cddea_99a5_484d_9287_a59c9056bcf9.slice/crio-7689990aa458651bd0c6fb2041fc1f278d7ff398082e0c28ab0b2ad359cdfee4 WatchSource:0}: Error finding container 7689990aa458651bd0c6fb2041fc1f278d7ff398082e0c28ab0b2ad359cdfee4: Status 404 returned error can't find the container with id 7689990aa458651bd0c6fb2041fc1f278d7ff398082e0c28ab0b2ad359cdfee4 Dec 09 10:20:30 crc kubenswrapper[5002]: I1209 10:20:30.815846 5002 generic.go:334] "Generic (PLEG): container finished" podID="f46cddea-99a5-484d-9287-a59c9056bcf9" containerID="bf71ec6fe28fd4abe441e77ddd99cb0e8c7339fa9a0888289ab5a90b48904a26" exitCode=0 Dec 09 10:20:30 crc kubenswrapper[5002]: I1209 10:20:30.817348 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-47b4k-config-jl69s" event={"ID":"f46cddea-99a5-484d-9287-a59c9056bcf9","Type":"ContainerDied","Data":"bf71ec6fe28fd4abe441e77ddd99cb0e8c7339fa9a0888289ab5a90b48904a26"} Dec 09 10:20:30 crc kubenswrapper[5002]: I1209 10:20:30.817374 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-47b4k-config-jl69s" event={"ID":"f46cddea-99a5-484d-9287-a59c9056bcf9","Type":"ContainerStarted","Data":"7689990aa458651bd0c6fb2041fc1f278d7ff398082e0c28ab0b2ad359cdfee4"} Dec 09 10:20:32 crc kubenswrapper[5002]: I1209 10:20:32.180289 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-47b4k-config-jl69s" Dec 09 10:20:32 crc kubenswrapper[5002]: I1209 10:20:32.210985 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8q6g\" (UniqueName: \"kubernetes.io/projected/f46cddea-99a5-484d-9287-a59c9056bcf9-kube-api-access-j8q6g\") pod \"f46cddea-99a5-484d-9287-a59c9056bcf9\" (UID: \"f46cddea-99a5-484d-9287-a59c9056bcf9\") " Dec 09 10:20:32 crc kubenswrapper[5002]: I1209 10:20:32.211053 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f46cddea-99a5-484d-9287-a59c9056bcf9-scripts\") pod \"f46cddea-99a5-484d-9287-a59c9056bcf9\" (UID: \"f46cddea-99a5-484d-9287-a59c9056bcf9\") " Dec 09 10:20:32 crc kubenswrapper[5002]: I1209 10:20:32.211097 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f46cddea-99a5-484d-9287-a59c9056bcf9-var-run\") pod \"f46cddea-99a5-484d-9287-a59c9056bcf9\" (UID: \"f46cddea-99a5-484d-9287-a59c9056bcf9\") " Dec 09 10:20:32 crc kubenswrapper[5002]: I1209 10:20:32.211146 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f46cddea-99a5-484d-9287-a59c9056bcf9-var-log-ovn\") pod \"f46cddea-99a5-484d-9287-a59c9056bcf9\" (UID: \"f46cddea-99a5-484d-9287-a59c9056bcf9\") " Dec 09 10:20:32 crc kubenswrapper[5002]: I1209 10:20:32.211175 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f46cddea-99a5-484d-9287-a59c9056bcf9-var-run-ovn\") pod \"f46cddea-99a5-484d-9287-a59c9056bcf9\" (UID: \"f46cddea-99a5-484d-9287-a59c9056bcf9\") " Dec 09 10:20:32 crc kubenswrapper[5002]: I1209 10:20:32.211276 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f46cddea-99a5-484d-9287-a59c9056bcf9-additional-scripts\") pod \"f46cddea-99a5-484d-9287-a59c9056bcf9\" (UID: \"f46cddea-99a5-484d-9287-a59c9056bcf9\") " Dec 09 10:20:32 crc kubenswrapper[5002]: I1209 10:20:32.212577 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f46cddea-99a5-484d-9287-a59c9056bcf9-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "f46cddea-99a5-484d-9287-a59c9056bcf9" (UID: "f46cddea-99a5-484d-9287-a59c9056bcf9"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:20:32 crc kubenswrapper[5002]: I1209 10:20:32.216795 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f46cddea-99a5-484d-9287-a59c9056bcf9-scripts" (OuterVolumeSpecName: "scripts") pod "f46cddea-99a5-484d-9287-a59c9056bcf9" (UID: "f46cddea-99a5-484d-9287-a59c9056bcf9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:20:32 crc kubenswrapper[5002]: I1209 10:20:32.216879 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f46cddea-99a5-484d-9287-a59c9056bcf9-var-run" (OuterVolumeSpecName: "var-run") pod "f46cddea-99a5-484d-9287-a59c9056bcf9" (UID: "f46cddea-99a5-484d-9287-a59c9056bcf9"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 10:20:32 crc kubenswrapper[5002]: I1209 10:20:32.216906 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f46cddea-99a5-484d-9287-a59c9056bcf9-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "f46cddea-99a5-484d-9287-a59c9056bcf9" (UID: "f46cddea-99a5-484d-9287-a59c9056bcf9"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 10:20:32 crc kubenswrapper[5002]: I1209 10:20:32.216929 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f46cddea-99a5-484d-9287-a59c9056bcf9-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "f46cddea-99a5-484d-9287-a59c9056bcf9" (UID: "f46cddea-99a5-484d-9287-a59c9056bcf9"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 10:20:32 crc kubenswrapper[5002]: I1209 10:20:32.291915 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f46cddea-99a5-484d-9287-a59c9056bcf9-kube-api-access-j8q6g" (OuterVolumeSpecName: "kube-api-access-j8q6g") pod "f46cddea-99a5-484d-9287-a59c9056bcf9" (UID: "f46cddea-99a5-484d-9287-a59c9056bcf9"). InnerVolumeSpecName "kube-api-access-j8q6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:20:32 crc kubenswrapper[5002]: I1209 10:20:32.315046 5002 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f46cddea-99a5-484d-9287-a59c9056bcf9-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:32 crc kubenswrapper[5002]: I1209 10:20:32.315405 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8q6g\" (UniqueName: \"kubernetes.io/projected/f46cddea-99a5-484d-9287-a59c9056bcf9-kube-api-access-j8q6g\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:32 crc kubenswrapper[5002]: I1209 10:20:32.315420 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f46cddea-99a5-484d-9287-a59c9056bcf9-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:32 crc kubenswrapper[5002]: I1209 10:20:32.315430 5002 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f46cddea-99a5-484d-9287-a59c9056bcf9-var-run\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:32 crc kubenswrapper[5002]: I1209 10:20:32.315440 5002 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f46cddea-99a5-484d-9287-a59c9056bcf9-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:32 crc kubenswrapper[5002]: I1209 10:20:32.315449 5002 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f46cddea-99a5-484d-9287-a59c9056bcf9-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:32 crc kubenswrapper[5002]: I1209 10:20:32.844109 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-47b4k-config-jl69s" event={"ID":"f46cddea-99a5-484d-9287-a59c9056bcf9","Type":"ContainerDied","Data":"7689990aa458651bd0c6fb2041fc1f278d7ff398082e0c28ab0b2ad359cdfee4"} Dec 09 10:20:32 crc kubenswrapper[5002]: I1209 10:20:32.844150 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7689990aa458651bd0c6fb2041fc1f278d7ff398082e0c28ab0b2ad359cdfee4" Dec 09 10:20:32 crc kubenswrapper[5002]: 
I1209 10:20:32.844176 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-47b4k-config-jl69s" Dec 09 10:20:33 crc kubenswrapper[5002]: I1209 10:20:33.260800 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-47b4k-config-jl69s"] Dec 09 10:20:33 crc kubenswrapper[5002]: I1209 10:20:33.266229 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-47b4k-config-jl69s"] Dec 09 10:20:34 crc kubenswrapper[5002]: I1209 10:20:34.077736 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f46cddea-99a5-484d-9287-a59c9056bcf9" path="/var/lib/kubelet/pods/f46cddea-99a5-484d-9287-a59c9056bcf9/volumes" Dec 09 10:20:36 crc kubenswrapper[5002]: I1209 10:20:36.314418 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/dfa166a7-dec2-453d-9cd9-f77d30f1636a-etc-swift\") pod \"swift-storage-0\" (UID: \"dfa166a7-dec2-453d-9cd9-f77d30f1636a\") " pod="openstack/swift-storage-0" Dec 09 10:20:36 crc kubenswrapper[5002]: I1209 10:20:36.323026 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/dfa166a7-dec2-453d-9cd9-f77d30f1636a-etc-swift\") pod \"swift-storage-0\" (UID: \"dfa166a7-dec2-453d-9cd9-f77d30f1636a\") " pod="openstack/swift-storage-0" Dec 09 10:20:36 crc kubenswrapper[5002]: I1209 10:20:36.418777 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 09 10:20:38 crc kubenswrapper[5002]: I1209 10:20:38.514480 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 09 10:20:43 crc kubenswrapper[5002]: E1209 10:20:43.937261 5002 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Dec 09 10:20:43 crc kubenswrapper[5002]: E1209 10:20:43.938197 5002 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zmfs8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-h2xt6_openstack(d846942e-6d0d-4e42-a584-a910a56d9718): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 09 10:20:43 crc kubenswrapper[5002]: E1209 10:20:43.940340 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-h2xt6" podUID="d846942e-6d0d-4e42-a584-a910a56d9718" Dec 09 10:20:44 crc kubenswrapper[5002]: I1209 10:20:44.334603 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 09 10:20:44 crc kubenswrapper[5002]: I1209 10:20:44.957935 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dfa166a7-dec2-453d-9cd9-f77d30f1636a","Type":"ContainerStarted","Data":"2e0b06393713cb143da6941041054636068a7b1e4845e8f8db452660be757378"} Dec 09 10:20:44 crc kubenswrapper[5002]: E1209 10:20:44.960056 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-h2xt6" podUID="d846942e-6d0d-4e42-a584-a910a56d9718" Dec 09 10:20:46 crc kubenswrapper[5002]: I1209 10:20:46.975571 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dfa166a7-dec2-453d-9cd9-f77d30f1636a","Type":"ContainerStarted","Data":"3f70ac59be273071ab01746bf90703513be9b442ef9be9f9e0ab6ecb0f2e0e47"} Dec 09 
10:20:46 crc kubenswrapper[5002]: I1209 10:20:46.976089 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dfa166a7-dec2-453d-9cd9-f77d30f1636a","Type":"ContainerStarted","Data":"572968954f44aa3432e15bf47ef3b7d45a9cac349101fcd97fb7b56fd86110b3"} Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.215029 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.515391 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-8796r"] Dec 09 10:20:48 crc kubenswrapper[5002]: E1209 10:20:48.515991 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f46cddea-99a5-484d-9287-a59c9056bcf9" containerName="ovn-config" Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.516006 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="f46cddea-99a5-484d-9287-a59c9056bcf9" containerName="ovn-config" Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.516169 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="f46cddea-99a5-484d-9287-a59c9056bcf9" containerName="ovn-config" Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.516663 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-8796r" Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.530328 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-8796r"] Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.618108 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-kl927"] Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.623793 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-kl927" Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.640911 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-e0a9-account-create-update-8w48l"] Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.642264 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-e0a9-account-create-update-8w48l" Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.647163 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.653976 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-kl927"] Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.660931 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37720060-6c72-494c-b89f-9525e48f9f8d-operator-scripts\") pod \"barbican-db-create-8796r\" (UID: \"37720060-6c72-494c-b89f-9525e48f9f8d\") " pod="openstack/barbican-db-create-8796r" Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.661093 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zldw2\" (UniqueName: \"kubernetes.io/projected/37720060-6c72-494c-b89f-9525e48f9f8d-kube-api-access-zldw2\") pod \"barbican-db-create-8796r\" (UID: \"37720060-6c72-494c-b89f-9525e48f9f8d\") " pod="openstack/barbican-db-create-8796r" Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.668860 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-e0a9-account-create-update-8w48l"] Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.727945 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-73fb-account-create-update-49bs8"] Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.732970 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-73fb-account-create-update-49bs8" Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.736131 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.751721 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-73fb-account-create-update-49bs8"] Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.770083 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x58s\" (UniqueName: \"kubernetes.io/projected/4c94da26-e330-42bc-b73c-3c0134b7924d-kube-api-access-9x58s\") pod \"cinder-db-create-kl927\" (UID: \"4c94da26-e330-42bc-b73c-3c0134b7924d\") " pod="openstack/cinder-db-create-kl927" Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.770143 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zldw2\" (UniqueName: \"kubernetes.io/projected/37720060-6c72-494c-b89f-9525e48f9f8d-kube-api-access-zldw2\") pod \"barbican-db-create-8796r\" (UID: \"37720060-6c72-494c-b89f-9525e48f9f8d\") " pod="openstack/barbican-db-create-8796r" Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.770207 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm9nh\" (UniqueName: \"kubernetes.io/projected/518e1d88-71f4-4fe3-9ad6-f938249f1ae3-kube-api-access-vm9nh\") pod \"cinder-e0a9-account-create-update-8w48l\" (UID: \"518e1d88-71f4-4fe3-9ad6-f938249f1ae3\") " pod="openstack/cinder-e0a9-account-create-update-8w48l" Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.770226 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/518e1d88-71f4-4fe3-9ad6-f938249f1ae3-operator-scripts\") pod \"cinder-e0a9-account-create-update-8w48l\" (UID: \"518e1d88-71f4-4fe3-9ad6-f938249f1ae3\") " pod="openstack/cinder-e0a9-account-create-update-8w48l" Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.770258 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37720060-6c72-494c-b89f-9525e48f9f8d-operator-scripts\") pod \"barbican-db-create-8796r\" (UID: \"37720060-6c72-494c-b89f-9525e48f9f8d\") " pod="openstack/barbican-db-create-8796r" Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.770308 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c94da26-e330-42bc-b73c-3c0134b7924d-operator-scripts\") pod \"cinder-db-create-kl927\" (UID: \"4c94da26-e330-42bc-b73c-3c0134b7924d\") " pod="openstack/cinder-db-create-kl927" Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.771833 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37720060-6c72-494c-b89f-9525e48f9f8d-operator-scripts\") pod \"barbican-db-create-8796r\" (UID: \"37720060-6c72-494c-b89f-9525e48f9f8d\") " pod="openstack/barbican-db-create-8796r" Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.799738 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zldw2\" (UniqueName: \"kubernetes.io/projected/37720060-6c72-494c-b89f-9525e48f9f8d-kube-api-access-zldw2\") pod \"barbican-db-create-8796r\" (UID: \"37720060-6c72-494c-b89f-9525e48f9f8d\") " pod="openstack/barbican-db-create-8796r" Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.828371 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-rztff"] Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.829824 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-rztff" Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.842566 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-8796r" Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.847024 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-rztff"] Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.872160 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c94da26-e330-42bc-b73c-3c0134b7924d-operator-scripts\") pod \"cinder-db-create-kl927\" (UID: \"4c94da26-e330-42bc-b73c-3c0134b7924d\") " pod="openstack/cinder-db-create-kl927" Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.873755 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9x58s\" (UniqueName: \"kubernetes.io/projected/4c94da26-e330-42bc-b73c-3c0134b7924d-kube-api-access-9x58s\") pod \"cinder-db-create-kl927\" (UID: \"4c94da26-e330-42bc-b73c-3c0134b7924d\") " pod="openstack/cinder-db-create-kl927" Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.874351 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vm9nh\" (UniqueName: \"kubernetes.io/projected/518e1d88-71f4-4fe3-9ad6-f938249f1ae3-kube-api-access-vm9nh\") pod \"cinder-e0a9-account-create-update-8w48l\" (UID: \"518e1d88-71f4-4fe3-9ad6-f938249f1ae3\") " pod="openstack/cinder-e0a9-account-create-update-8w48l" Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.874567 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff9b20e8-85f5-4a54-8505-50fc885caa71-operator-scripts\") pod \"barbican-73fb-account-create-update-49bs8\" (UID: \"ff9b20e8-85f5-4a54-8505-50fc885caa71\") " pod="openstack/barbican-73fb-account-create-update-49bs8" Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.874617 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/518e1d88-71f4-4fe3-9ad6-f938249f1ae3-operator-scripts\") pod \"cinder-e0a9-account-create-update-8w48l\" (UID: \"518e1d88-71f4-4fe3-9ad6-f938249f1ae3\") " pod="openstack/cinder-e0a9-account-create-update-8w48l" Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.875311 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/518e1d88-71f4-4fe3-9ad6-f938249f1ae3-operator-scripts\") pod \"cinder-e0a9-account-create-update-8w48l\" (UID: \"518e1d88-71f4-4fe3-9ad6-f938249f1ae3\") " pod="openstack/cinder-e0a9-account-create-update-8w48l" Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.875515 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbdlr\" (UniqueName: \"kubernetes.io/projected/ff9b20e8-85f5-4a54-8505-50fc885caa71-kube-api-access-nbdlr\") pod \"barbican-73fb-account-create-update-49bs8\" (UID: \"ff9b20e8-85f5-4a54-8505-50fc885caa71\") " pod="openstack/barbican-73fb-account-create-update-49bs8" Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.875875 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c94da26-e330-42bc-b73c-3c0134b7924d-operator-scripts\") pod \"cinder-db-create-kl927\" (UID: \"4c94da26-e330-42bc-b73c-3c0134b7924d\") " pod="openstack/cinder-db-create-kl927" Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 
10:20:48.911108 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x58s\" (UniqueName: \"kubernetes.io/projected/4c94da26-e330-42bc-b73c-3c0134b7924d-kube-api-access-9x58s\") pod \"cinder-db-create-kl927\" (UID: \"4c94da26-e330-42bc-b73c-3c0134b7924d\") " pod="openstack/cinder-db-create-kl927" Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.922925 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vm9nh\" (UniqueName: \"kubernetes.io/projected/518e1d88-71f4-4fe3-9ad6-f938249f1ae3-kube-api-access-vm9nh\") pod \"cinder-e0a9-account-create-update-8w48l\" (UID: \"518e1d88-71f4-4fe3-9ad6-f938249f1ae3\") " pod="openstack/cinder-e0a9-account-create-update-8w48l" Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.927283 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-7zbg6"] Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.928593 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-7zbg6" Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.933412 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.933900 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.933971 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-q4kd8" Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.935896 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.940881 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-7zbg6"] Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.948007 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-kl927" Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.950375 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-1d25-account-create-update-2kb69"] Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.951651 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1d25-account-create-update-2kb69" Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.957613 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.960756 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-e0a9-account-create-update-8w48l" Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.962533 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-1d25-account-create-update-2kb69"] Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.983499 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cec1e643-1f06-471e-8215-c690f691bb3c-operator-scripts\") pod \"neutron-db-create-rztff\" (UID: \"cec1e643-1f06-471e-8215-c690f691bb3c\") " pod="openstack/neutron-db-create-rztff" Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.983629 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8md9m\" (UniqueName: \"kubernetes.io/projected/cec1e643-1f06-471e-8215-c690f691bb3c-kube-api-access-8md9m\") pod \"neutron-db-create-rztff\" (UID: \"cec1e643-1f06-471e-8215-c690f691bb3c\") " pod="openstack/neutron-db-create-rztff" Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.983995 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff9b20e8-85f5-4a54-8505-50fc885caa71-operator-scripts\") pod \"barbican-73fb-account-create-update-49bs8\" (UID: \"ff9b20e8-85f5-4a54-8505-50fc885caa71\") " pod="openstack/barbican-73fb-account-create-update-49bs8" Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.984116 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbdlr\" (UniqueName: \"kubernetes.io/projected/ff9b20e8-85f5-4a54-8505-50fc885caa71-kube-api-access-nbdlr\") pod \"barbican-73fb-account-create-update-49bs8\" (UID: \"ff9b20e8-85f5-4a54-8505-50fc885caa71\") " pod="openstack/barbican-73fb-account-create-update-49bs8" Dec 09 10:20:48 crc kubenswrapper[5002]: I1209 10:20:48.985512 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff9b20e8-85f5-4a54-8505-50fc885caa71-operator-scripts\") pod \"barbican-73fb-account-create-update-49bs8\" (UID: \"ff9b20e8-85f5-4a54-8505-50fc885caa71\") " pod="openstack/barbican-73fb-account-create-update-49bs8" Dec 09 10:20:49 crc kubenswrapper[5002]: I1209 10:20:49.008610 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbdlr\" (UniqueName: \"kubernetes.io/projected/ff9b20e8-85f5-4a54-8505-50fc885caa71-kube-api-access-nbdlr\") pod \"barbican-73fb-account-create-update-49bs8\" (UID: \"ff9b20e8-85f5-4a54-8505-50fc885caa71\") " pod="openstack/barbican-73fb-account-create-update-49bs8" Dec 09 10:20:49 crc kubenswrapper[5002]: I1209 10:20:49.061602 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-73fb-account-create-update-49bs8" Dec 09 10:20:49 crc kubenswrapper[5002]: I1209 10:20:49.085264 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cec1e643-1f06-471e-8215-c690f691bb3c-operator-scripts\") pod \"neutron-db-create-rztff\" (UID: \"cec1e643-1f06-471e-8215-c690f691bb3c\") " pod="openstack/neutron-db-create-rztff" Dec 09 10:20:49 crc kubenswrapper[5002]: I1209 10:20:49.085307 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8md9m\" (UniqueName: \"kubernetes.io/projected/cec1e643-1f06-471e-8215-c690f691bb3c-kube-api-access-8md9m\") pod \"neutron-db-create-rztff\" (UID: \"cec1e643-1f06-471e-8215-c690f691bb3c\") " pod="openstack/neutron-db-create-rztff" Dec 09 10:20:49 crc kubenswrapper[5002]: I1209 10:20:49.085353 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cb87512-ad2e-4510-ab87-ac4a0a8d09ae-config-data\") pod \"keystone-db-sync-7zbg6\" (UID: \"5cb87512-ad2e-4510-ab87-ac4a0a8d09ae\") " pod="openstack/keystone-db-sync-7zbg6" Dec 09 10:20:49 crc kubenswrapper[5002]: I1209 10:20:49.085391 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgvpf\" (UniqueName: \"kubernetes.io/projected/5cb87512-ad2e-4510-ab87-ac4a0a8d09ae-kube-api-access-kgvpf\") pod \"keystone-db-sync-7zbg6\" (UID: \"5cb87512-ad2e-4510-ab87-ac4a0a8d09ae\") " pod="openstack/keystone-db-sync-7zbg6" Dec 09 10:20:49 crc kubenswrapper[5002]: I1209 10:20:49.085421 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cb87512-ad2e-4510-ab87-ac4a0a8d09ae-combined-ca-bundle\") pod \"keystone-db-sync-7zbg6\" (UID: \"5cb87512-ad2e-4510-ab87-ac4a0a8d09ae\") " pod="openstack/keystone-db-sync-7zbg6" Dec 09 10:20:49 crc kubenswrapper[5002]: I1209 10:20:49.085447 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d84ld\" (UniqueName: \"kubernetes.io/projected/eea79378-54f4-4bc9-9673-04bdf650eb92-kube-api-access-d84ld\") pod \"neutron-1d25-account-create-update-2kb69\" (UID: \"eea79378-54f4-4bc9-9673-04bdf650eb92\") " pod="openstack/neutron-1d25-account-create-update-2kb69" Dec 09 10:20:49 crc kubenswrapper[5002]: I1209 10:20:49.085537 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eea79378-54f4-4bc9-9673-04bdf650eb92-operator-scripts\") pod \"neutron-1d25-account-create-update-2kb69\" (UID: \"eea79378-54f4-4bc9-9673-04bdf650eb92\") " pod="openstack/neutron-1d25-account-create-update-2kb69" Dec 09 10:20:49 crc kubenswrapper[5002]: I1209 10:20:49.086484 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cec1e643-1f06-471e-8215-c690f691bb3c-operator-scripts\") pod \"neutron-db-create-rztff\" (UID: \"cec1e643-1f06-471e-8215-c690f691bb3c\") " pod="openstack/neutron-db-create-rztff" Dec 09 10:20:49 crc kubenswrapper[5002]: I1209 10:20:49.123924 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8md9m\" (UniqueName: 
\"kubernetes.io/projected/cec1e643-1f06-471e-8215-c690f691bb3c-kube-api-access-8md9m\") pod \"neutron-db-create-rztff\" (UID: \"cec1e643-1f06-471e-8215-c690f691bb3c\") " pod="openstack/neutron-db-create-rztff" Dec 09 10:20:49 crc kubenswrapper[5002]: I1209 10:20:49.155596 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-rztff" Dec 09 10:20:49 crc kubenswrapper[5002]: I1209 10:20:49.186788 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eea79378-54f4-4bc9-9673-04bdf650eb92-operator-scripts\") pod \"neutron-1d25-account-create-update-2kb69\" (UID: \"eea79378-54f4-4bc9-9673-04bdf650eb92\") " pod="openstack/neutron-1d25-account-create-update-2kb69" Dec 09 10:20:49 crc kubenswrapper[5002]: I1209 10:20:49.187221 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cb87512-ad2e-4510-ab87-ac4a0a8d09ae-config-data\") pod \"keystone-db-sync-7zbg6\" (UID: \"5cb87512-ad2e-4510-ab87-ac4a0a8d09ae\") " pod="openstack/keystone-db-sync-7zbg6" Dec 09 10:20:49 crc kubenswrapper[5002]: I1209 10:20:49.187251 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgvpf\" (UniqueName: \"kubernetes.io/projected/5cb87512-ad2e-4510-ab87-ac4a0a8d09ae-kube-api-access-kgvpf\") pod \"keystone-db-sync-7zbg6\" (UID: \"5cb87512-ad2e-4510-ab87-ac4a0a8d09ae\") " pod="openstack/keystone-db-sync-7zbg6" Dec 09 10:20:49 crc kubenswrapper[5002]: I1209 10:20:49.187290 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cb87512-ad2e-4510-ab87-ac4a0a8d09ae-combined-ca-bundle\") pod \"keystone-db-sync-7zbg6\" (UID: \"5cb87512-ad2e-4510-ab87-ac4a0a8d09ae\") " pod="openstack/keystone-db-sync-7zbg6" Dec 09 10:20:49 crc kubenswrapper[5002]: I1209 10:20:49.187311 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d84ld\" (UniqueName: \"kubernetes.io/projected/eea79378-54f4-4bc9-9673-04bdf650eb92-kube-api-access-d84ld\") pod \"neutron-1d25-account-create-update-2kb69\" (UID: \"eea79378-54f4-4bc9-9673-04bdf650eb92\") " pod="openstack/neutron-1d25-account-create-update-2kb69" Dec 09 10:20:49 crc kubenswrapper[5002]: I1209 10:20:49.187659 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eea79378-54f4-4bc9-9673-04bdf650eb92-operator-scripts\") pod \"neutron-1d25-account-create-update-2kb69\" (UID: \"eea79378-54f4-4bc9-9673-04bdf650eb92\") " pod="openstack/neutron-1d25-account-create-update-2kb69" Dec 09 10:20:49 crc kubenswrapper[5002]: I1209 10:20:49.194298 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cb87512-ad2e-4510-ab87-ac4a0a8d09ae-combined-ca-bundle\") pod \"keystone-db-sync-7zbg6\" (UID: \"5cb87512-ad2e-4510-ab87-ac4a0a8d09ae\") " pod="openstack/keystone-db-sync-7zbg6" Dec 09 10:20:49 crc kubenswrapper[5002]: I1209 10:20:49.195404 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cb87512-ad2e-4510-ab87-ac4a0a8d09ae-config-data\") pod \"keystone-db-sync-7zbg6\" (UID: \"5cb87512-ad2e-4510-ab87-ac4a0a8d09ae\") " pod="openstack/keystone-db-sync-7zbg6" Dec 09 10:20:49 crc 
kubenswrapper[5002]: I1209 10:20:49.211212 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d84ld\" (UniqueName: \"kubernetes.io/projected/eea79378-54f4-4bc9-9673-04bdf650eb92-kube-api-access-d84ld\") pod \"neutron-1d25-account-create-update-2kb69\" (UID: \"eea79378-54f4-4bc9-9673-04bdf650eb92\") " pod="openstack/neutron-1d25-account-create-update-2kb69" Dec 09 10:20:49 crc kubenswrapper[5002]: I1209 10:20:49.215873 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgvpf\" (UniqueName: \"kubernetes.io/projected/5cb87512-ad2e-4510-ab87-ac4a0a8d09ae-kube-api-access-kgvpf\") pod \"keystone-db-sync-7zbg6\" (UID: \"5cb87512-ad2e-4510-ab87-ac4a0a8d09ae\") " pod="openstack/keystone-db-sync-7zbg6" Dec 09 10:20:49 crc kubenswrapper[5002]: I1209 10:20:49.337130 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-7zbg6" Dec 09 10:20:49 crc kubenswrapper[5002]: I1209 10:20:49.345068 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1d25-account-create-update-2kb69" Dec 09 10:20:49 crc kubenswrapper[5002]: I1209 10:20:49.345298 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-kl927"] Dec 09 10:20:49 crc kubenswrapper[5002]: W1209 10:20:49.347845 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c94da26_e330_42bc_b73c_3c0134b7924d.slice/crio-9d1cc650fd6f5957aff68623373889a3ced8939cfcb2d9e250ee452ac9e4a2eb WatchSource:0}: Error finding container 9d1cc650fd6f5957aff68623373889a3ced8939cfcb2d9e250ee452ac9e4a2eb: Status 404 returned error can't find the container with id 9d1cc650fd6f5957aff68623373889a3ced8939cfcb2d9e250ee452ac9e4a2eb Dec 09 10:20:49 crc kubenswrapper[5002]: I1209 10:20:49.426038 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-8796r"] Dec 09 10:20:49 crc kubenswrapper[5002]: W1209 10:20:49.440714 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37720060_6c72_494c_b89f_9525e48f9f8d.slice/crio-4c1d83724c43b651d712e2a01cd085da6179f3e6f2c56779554fa38f862b4f69 WatchSource:0}: Error finding container 4c1d83724c43b651d712e2a01cd085da6179f3e6f2c56779554fa38f862b4f69: Status 404 returned error can't find the container with id 4c1d83724c43b651d712e2a01cd085da6179f3e6f2c56779554fa38f862b4f69 Dec 09 10:20:49 crc kubenswrapper[5002]: I1209 10:20:49.626890 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-e0a9-account-create-update-8w48l"] Dec 09 10:20:49 crc kubenswrapper[5002]: I1209 10:20:49.650911 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-1d25-account-create-update-2kb69"] Dec 09 10:20:49 crc kubenswrapper[5002]: I1209 10:20:49.667889 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-73fb-account-create-update-49bs8"] Dec 09 10:20:49 crc kubenswrapper[5002]: W1209 10:20:49.669728 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeea79378_54f4_4bc9_9673_04bdf650eb92.slice/crio-3d6669539a7da1a25d32a2556ee75d4756640eb0e2bad91bc83f3bb7642e2420 WatchSource:0}: Error finding container 3d6669539a7da1a25d32a2556ee75d4756640eb0e2bad91bc83f3bb7642e2420: Status 404 returned error can't find the container 
with id 3d6669539a7da1a25d32a2556ee75d4756640eb0e2bad91bc83f3bb7642e2420 Dec 09 10:20:49 crc kubenswrapper[5002]: I1209 10:20:49.785412 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-rztff"] Dec 09 10:20:49 crc kubenswrapper[5002]: I1209 10:20:49.925108 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-7zbg6"] Dec 09 10:20:50 crc kubenswrapper[5002]: I1209 10:20:50.025197 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-8796r" event={"ID":"37720060-6c72-494c-b89f-9525e48f9f8d","Type":"ContainerStarted","Data":"91af6e127a8ca7009fb6fa302ad55f99230b8bcca01665a6d992e98fca89cf49"} Dec 09 10:20:50 crc kubenswrapper[5002]: I1209 10:20:50.025253 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-8796r" event={"ID":"37720060-6c72-494c-b89f-9525e48f9f8d","Type":"ContainerStarted","Data":"4c1d83724c43b651d712e2a01cd085da6179f3e6f2c56779554fa38f862b4f69"} Dec 09 10:20:50 crc kubenswrapper[5002]: I1209 10:20:50.027477 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-rztff" event={"ID":"cec1e643-1f06-471e-8215-c690f691bb3c","Type":"ContainerStarted","Data":"3df43bdb9649ed2eb7fdc58e5208a8cf7d35f42d152916ff55ccef22c8bd2f83"} Dec 09 10:20:50 crc kubenswrapper[5002]: I1209 10:20:50.029549 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e0a9-account-create-update-8w48l" event={"ID":"518e1d88-71f4-4fe3-9ad6-f938249f1ae3","Type":"ContainerStarted","Data":"62986b9fc31c4835230b3d39f29d40e56e1c5bc39d5bdb95ff2621f76b756f46"} Dec 09 10:20:50 crc kubenswrapper[5002]: I1209 10:20:50.029580 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e0a9-account-create-update-8w48l" event={"ID":"518e1d88-71f4-4fe3-9ad6-f938249f1ae3","Type":"ContainerStarted","Data":"011c71e303021d642a0291f4cb708db25e770a91d3a6da9fc09efcd78e008610"} Dec 09 10:20:50 crc kubenswrapper[5002]: I1209 10:20:50.032466 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-73fb-account-create-update-49bs8" event={"ID":"ff9b20e8-85f5-4a54-8505-50fc885caa71","Type":"ContainerStarted","Data":"4bb7e154e5db3f98ebc63925989b215f6fce8ac5f5d6bc5e8751c3904ca59f6a"} Dec 09 10:20:50 crc kubenswrapper[5002]: I1209 10:20:50.035845 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dfa166a7-dec2-453d-9cd9-f77d30f1636a","Type":"ContainerStarted","Data":"2a8bccd609bab5986054b408416dade80cc7b8cf1c6ce51df003879ca5e0a92d"} Dec 09 10:20:50 crc kubenswrapper[5002]: I1209 10:20:50.037108 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1d25-account-create-update-2kb69" event={"ID":"eea79378-54f4-4bc9-9673-04bdf650eb92","Type":"ContainerStarted","Data":"ab9110ac35a8f003734d1fb5eccd9029fa547e73a8c6c4310717cc3e133cd2b9"} Dec 09 10:20:50 crc kubenswrapper[5002]: I1209 10:20:50.037145 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1d25-account-create-update-2kb69" event={"ID":"eea79378-54f4-4bc9-9673-04bdf650eb92","Type":"ContainerStarted","Data":"3d6669539a7da1a25d32a2556ee75d4756640eb0e2bad91bc83f3bb7642e2420"} Dec 09 10:20:50 crc kubenswrapper[5002]: I1209 10:20:50.041154 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-kl927" 
event={"ID":"4c94da26-e330-42bc-b73c-3c0134b7924d","Type":"ContainerStarted","Data":"b33f5e5fc465757e0c888b2e806d84dfbc4b581048d50595ef724d4fadc97425"} Dec 09 10:20:50 crc kubenswrapper[5002]: I1209 10:20:50.041211 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-kl927" event={"ID":"4c94da26-e330-42bc-b73c-3c0134b7924d","Type":"ContainerStarted","Data":"9d1cc650fd6f5957aff68623373889a3ced8939cfcb2d9e250ee452ac9e4a2eb"} Dec 09 10:20:50 crc kubenswrapper[5002]: I1209 10:20:50.043320 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-7zbg6" event={"ID":"5cb87512-ad2e-4510-ab87-ac4a0a8d09ae","Type":"ContainerStarted","Data":"45b0b9bd6d3f8c1a1a8cc37a97ad28daa7553d53acdfb46ca09c5860d3c9a0b4"} Dec 09 10:20:50 crc kubenswrapper[5002]: I1209 10:20:50.044552 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-8796r" podStartSLOduration=2.044532173 podStartE2EDuration="2.044532173s" podCreationTimestamp="2025-12-09 10:20:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:20:50.038206992 +0000 UTC m=+1182.430258083" watchObservedRunningTime="2025-12-09 10:20:50.044532173 +0000 UTC m=+1182.436583264" Dec 09 10:20:50 crc kubenswrapper[5002]: I1209 10:20:50.058264 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-e0a9-account-create-update-8w48l" podStartSLOduration=2.058246344 podStartE2EDuration="2.058246344s" podCreationTimestamp="2025-12-09 10:20:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:20:50.05107319 +0000 UTC m=+1182.443124281" watchObservedRunningTime="2025-12-09 10:20:50.058246344 +0000 UTC m=+1182.450297425" Dec 09 10:20:50 crc kubenswrapper[5002]: I1209 10:20:50.074877 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-1d25-account-create-update-2kb69" podStartSLOduration=2.074853834 podStartE2EDuration="2.074853834s" podCreationTimestamp="2025-12-09 10:20:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:20:50.066439586 +0000 UTC m=+1182.458490677" watchObservedRunningTime="2025-12-09 10:20:50.074853834 +0000 UTC m=+1182.466904915" Dec 09 10:20:50 crc kubenswrapper[5002]: I1209 10:20:50.091681 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-kl927" podStartSLOduration=2.091660378 podStartE2EDuration="2.091660378s" podCreationTimestamp="2025-12-09 10:20:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:20:50.078489982 +0000 UTC m=+1182.470541053" watchObservedRunningTime="2025-12-09 10:20:50.091660378 +0000 UTC m=+1182.483711459" Dec 09 10:20:51 crc kubenswrapper[5002]: I1209 10:20:51.059786 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-rztff" event={"ID":"cec1e643-1f06-471e-8215-c690f691bb3c","Type":"ContainerStarted","Data":"c57d634a0d27647736fc38f55ce26634bc1bc214a5344d82a1f49282d5bfad44"} Dec 09 10:20:51 crc kubenswrapper[5002]: I1209 10:20:51.064296 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-73fb-account-create-update-49bs8" 
event={"ID":"ff9b20e8-85f5-4a54-8505-50fc885caa71","Type":"ContainerStarted","Data":"27d444bfbd27fdffdca54947d3c047948faf97cbca7f3940b0b42fe2b1c91d86"} Dec 09 10:20:51 crc kubenswrapper[5002]: I1209 10:20:51.067975 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dfa166a7-dec2-453d-9cd9-f77d30f1636a","Type":"ContainerStarted","Data":"b75afaa2bb61d58afc5343f8437dbcd68fa039ca4733d2850879069289c78567"} Dec 09 10:20:51 crc kubenswrapper[5002]: I1209 10:20:51.072084 5002 generic.go:334] "Generic (PLEG): container finished" podID="4c94da26-e330-42bc-b73c-3c0134b7924d" containerID="b33f5e5fc465757e0c888b2e806d84dfbc4b581048d50595ef724d4fadc97425" exitCode=0 Dec 09 10:20:51 crc kubenswrapper[5002]: I1209 10:20:51.072159 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-kl927" event={"ID":"4c94da26-e330-42bc-b73c-3c0134b7924d","Type":"ContainerDied","Data":"b33f5e5fc465757e0c888b2e806d84dfbc4b581048d50595ef724d4fadc97425"} Dec 09 10:20:51 crc kubenswrapper[5002]: I1209 10:20:51.073533 5002 generic.go:334] "Generic (PLEG): container finished" podID="37720060-6c72-494c-b89f-9525e48f9f8d" containerID="91af6e127a8ca7009fb6fa302ad55f99230b8bcca01665a6d992e98fca89cf49" exitCode=0 Dec 09 10:20:51 crc kubenswrapper[5002]: I1209 10:20:51.074064 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-8796r" event={"ID":"37720060-6c72-494c-b89f-9525e48f9f8d","Type":"ContainerDied","Data":"91af6e127a8ca7009fb6fa302ad55f99230b8bcca01665a6d992e98fca89cf49"} Dec 09 10:20:51 crc kubenswrapper[5002]: I1209 10:20:51.089896 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-rztff" podStartSLOduration=3.089869723 podStartE2EDuration="3.089869723s" podCreationTimestamp="2025-12-09 10:20:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:20:51.081706672 +0000 UTC m=+1183.473757763" watchObservedRunningTime="2025-12-09 10:20:51.089869723 +0000 UTC m=+1183.481920814" Dec 09 10:20:51 crc kubenswrapper[5002]: I1209 10:20:51.139795 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-73fb-account-create-update-49bs8" podStartSLOduration=3.139774944 podStartE2EDuration="3.139774944s" podCreationTimestamp="2025-12-09 10:20:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:20:51.131419518 +0000 UTC m=+1183.523470599" watchObservedRunningTime="2025-12-09 10:20:51.139774944 +0000 UTC m=+1183.531826025" Dec 09 10:20:52 crc kubenswrapper[5002]: I1209 10:20:52.082503 5002 generic.go:334] "Generic (PLEG): container finished" podID="cec1e643-1f06-471e-8215-c690f691bb3c" containerID="c57d634a0d27647736fc38f55ce26634bc1bc214a5344d82a1f49282d5bfad44" exitCode=0 Dec 09 10:20:52 crc kubenswrapper[5002]: I1209 10:20:52.082570 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-rztff" event={"ID":"cec1e643-1f06-471e-8215-c690f691bb3c","Type":"ContainerDied","Data":"c57d634a0d27647736fc38f55ce26634bc1bc214a5344d82a1f49282d5bfad44"} Dec 09 10:20:53 crc kubenswrapper[5002]: I1209 10:20:53.104981 5002 generic.go:334] "Generic (PLEG): container finished" podID="518e1d88-71f4-4fe3-9ad6-f938249f1ae3" containerID="62986b9fc31c4835230b3d39f29d40e56e1c5bc39d5bdb95ff2621f76b756f46" exitCode=0 Dec 09 
10:20:53 crc kubenswrapper[5002]: I1209 10:20:53.105085 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e0a9-account-create-update-8w48l" event={"ID":"518e1d88-71f4-4fe3-9ad6-f938249f1ae3","Type":"ContainerDied","Data":"62986b9fc31c4835230b3d39f29d40e56e1c5bc39d5bdb95ff2621f76b756f46"} Dec 09 10:20:55 crc kubenswrapper[5002]: I1209 10:20:55.998293 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-8796r" Dec 09 10:20:56 crc kubenswrapper[5002]: I1209 10:20:56.025104 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-kl927" Dec 09 10:20:56 crc kubenswrapper[5002]: I1209 10:20:56.050553 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e0a9-account-create-update-8w48l" Dec 09 10:20:56 crc kubenswrapper[5002]: I1209 10:20:56.078301 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-rztff" Dec 09 10:20:56 crc kubenswrapper[5002]: I1209 10:20:56.121553 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9x58s\" (UniqueName: \"kubernetes.io/projected/4c94da26-e330-42bc-b73c-3c0134b7924d-kube-api-access-9x58s\") pod \"4c94da26-e330-42bc-b73c-3c0134b7924d\" (UID: \"4c94da26-e330-42bc-b73c-3c0134b7924d\") " Dec 09 10:20:56 crc kubenswrapper[5002]: I1209 10:20:56.121621 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zldw2\" (UniqueName: \"kubernetes.io/projected/37720060-6c72-494c-b89f-9525e48f9f8d-kube-api-access-zldw2\") pod \"37720060-6c72-494c-b89f-9525e48f9f8d\" (UID: \"37720060-6c72-494c-b89f-9525e48f9f8d\") " Dec 09 10:20:56 crc kubenswrapper[5002]: I1209 10:20:56.121680 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37720060-6c72-494c-b89f-9525e48f9f8d-operator-scripts\") pod \"37720060-6c72-494c-b89f-9525e48f9f8d\" (UID: \"37720060-6c72-494c-b89f-9525e48f9f8d\") " Dec 09 10:20:56 crc kubenswrapper[5002]: I1209 10:20:56.121749 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c94da26-e330-42bc-b73c-3c0134b7924d-operator-scripts\") pod \"4c94da26-e330-42bc-b73c-3c0134b7924d\" (UID: \"4c94da26-e330-42bc-b73c-3c0134b7924d\") " Dec 09 10:20:56 crc kubenswrapper[5002]: I1209 10:20:56.123112 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c94da26-e330-42bc-b73c-3c0134b7924d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4c94da26-e330-42bc-b73c-3c0134b7924d" (UID: "4c94da26-e330-42bc-b73c-3c0134b7924d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:20:56 crc kubenswrapper[5002]: I1209 10:20:56.123188 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37720060-6c72-494c-b89f-9525e48f9f8d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "37720060-6c72-494c-b89f-9525e48f9f8d" (UID: "37720060-6c72-494c-b89f-9525e48f9f8d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:20:56 crc kubenswrapper[5002]: I1209 10:20:56.127535 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37720060-6c72-494c-b89f-9525e48f9f8d-kube-api-access-zldw2" (OuterVolumeSpecName: "kube-api-access-zldw2") pod "37720060-6c72-494c-b89f-9525e48f9f8d" (UID: "37720060-6c72-494c-b89f-9525e48f9f8d"). InnerVolumeSpecName "kube-api-access-zldw2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:20:56 crc kubenswrapper[5002]: I1209 10:20:56.130364 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c94da26-e330-42bc-b73c-3c0134b7924d-kube-api-access-9x58s" (OuterVolumeSpecName: "kube-api-access-9x58s") pod "4c94da26-e330-42bc-b73c-3c0134b7924d" (UID: "4c94da26-e330-42bc-b73c-3c0134b7924d"). InnerVolumeSpecName "kube-api-access-9x58s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:20:56 crc kubenswrapper[5002]: I1209 10:20:56.137189 5002 generic.go:334] "Generic (PLEG): container finished" podID="eea79378-54f4-4bc9-9673-04bdf650eb92" containerID="ab9110ac35a8f003734d1fb5eccd9029fa547e73a8c6c4310717cc3e133cd2b9" exitCode=0 Dec 09 10:20:56 crc kubenswrapper[5002]: I1209 10:20:56.137252 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1d25-account-create-update-2kb69" event={"ID":"eea79378-54f4-4bc9-9673-04bdf650eb92","Type":"ContainerDied","Data":"ab9110ac35a8f003734d1fb5eccd9029fa547e73a8c6c4310717cc3e133cd2b9"} Dec 09 10:20:56 crc kubenswrapper[5002]: I1209 10:20:56.139932 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-kl927" Dec 09 10:20:56 crc kubenswrapper[5002]: I1209 10:20:56.140243 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-kl927" event={"ID":"4c94da26-e330-42bc-b73c-3c0134b7924d","Type":"ContainerDied","Data":"9d1cc650fd6f5957aff68623373889a3ced8939cfcb2d9e250ee452ac9e4a2eb"} Dec 09 10:20:56 crc kubenswrapper[5002]: I1209 10:20:56.140349 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d1cc650fd6f5957aff68623373889a3ced8939cfcb2d9e250ee452ac9e4a2eb" Dec 09 10:20:56 crc kubenswrapper[5002]: I1209 10:20:56.141338 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-7zbg6" event={"ID":"5cb87512-ad2e-4510-ab87-ac4a0a8d09ae","Type":"ContainerStarted","Data":"20ae5f344909f1778bd970bb441b4e8f054eb7da8d7d4f81a65ab39091b3eb8e"} Dec 09 10:20:56 crc kubenswrapper[5002]: I1209 10:20:56.142835 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-8796r" event={"ID":"37720060-6c72-494c-b89f-9525e48f9f8d","Type":"ContainerDied","Data":"4c1d83724c43b651d712e2a01cd085da6179f3e6f2c56779554fa38f862b4f69"} Dec 09 10:20:56 crc kubenswrapper[5002]: I1209 10:20:56.142927 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c1d83724c43b651d712e2a01cd085da6179f3e6f2c56779554fa38f862b4f69" Dec 09 10:20:56 crc kubenswrapper[5002]: I1209 10:20:56.143223 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-8796r" Dec 09 10:20:56 crc kubenswrapper[5002]: I1209 10:20:56.152020 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-rztff" event={"ID":"cec1e643-1f06-471e-8215-c690f691bb3c","Type":"ContainerDied","Data":"3df43bdb9649ed2eb7fdc58e5208a8cf7d35f42d152916ff55ccef22c8bd2f83"} Dec 09 10:20:56 crc kubenswrapper[5002]: I1209 10:20:56.152075 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3df43bdb9649ed2eb7fdc58e5208a8cf7d35f42d152916ff55ccef22c8bd2f83" Dec 09 10:20:56 crc kubenswrapper[5002]: I1209 10:20:56.152151 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-rztff" Dec 09 10:20:56 crc kubenswrapper[5002]: I1209 10:20:56.159627 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e0a9-account-create-update-8w48l" event={"ID":"518e1d88-71f4-4fe3-9ad6-f938249f1ae3","Type":"ContainerDied","Data":"011c71e303021d642a0291f4cb708db25e770a91d3a6da9fc09efcd78e008610"} Dec 09 10:20:56 crc kubenswrapper[5002]: I1209 10:20:56.159698 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="011c71e303021d642a0291f4cb708db25e770a91d3a6da9fc09efcd78e008610" Dec 09 10:20:56 crc kubenswrapper[5002]: I1209 10:20:56.159628 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e0a9-account-create-update-8w48l" Dec 09 10:20:56 crc kubenswrapper[5002]: I1209 10:20:56.161197 5002 generic.go:334] "Generic (PLEG): container finished" podID="ff9b20e8-85f5-4a54-8505-50fc885caa71" containerID="27d444bfbd27fdffdca54947d3c047948faf97cbca7f3940b0b42fe2b1c91d86" exitCode=0 Dec 09 10:20:56 crc kubenswrapper[5002]: I1209 10:20:56.161279 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-73fb-account-create-update-49bs8" event={"ID":"ff9b20e8-85f5-4a54-8505-50fc885caa71","Type":"ContainerDied","Data":"27d444bfbd27fdffdca54947d3c047948faf97cbca7f3940b0b42fe2b1c91d86"} Dec 09 10:20:56 crc kubenswrapper[5002]: I1209 10:20:56.167992 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dfa166a7-dec2-453d-9cd9-f77d30f1636a","Type":"ContainerStarted","Data":"8ef48e223ac981dd68fad5f23cdc81f9ff45600e033082f26b51348da7a3e364"} Dec 09 10:20:56 crc kubenswrapper[5002]: I1209 10:20:56.179466 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-7zbg6" podStartSLOduration=2.301096187 podStartE2EDuration="8.179444992s" podCreationTimestamp="2025-12-09 10:20:48 +0000 UTC" firstStartedPulling="2025-12-09 10:20:49.934046963 +0000 UTC m=+1182.326098044" lastFinishedPulling="2025-12-09 10:20:55.812395768 +0000 UTC m=+1188.204446849" observedRunningTime="2025-12-09 10:20:56.166931273 +0000 UTC m=+1188.558982354" watchObservedRunningTime="2025-12-09 10:20:56.179444992 +0000 UTC m=+1188.571496073" Dec 09 10:20:56 crc kubenswrapper[5002]: I1209 10:20:56.223695 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8md9m\" (UniqueName: \"kubernetes.io/projected/cec1e643-1f06-471e-8215-c690f691bb3c-kube-api-access-8md9m\") pod \"cec1e643-1f06-471e-8215-c690f691bb3c\" (UID: \"cec1e643-1f06-471e-8215-c690f691bb3c\") " Dec 09 10:20:56 crc kubenswrapper[5002]: I1209 10:20:56.223846 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-vm9nh\" (UniqueName: \"kubernetes.io/projected/518e1d88-71f4-4fe3-9ad6-f938249f1ae3-kube-api-access-vm9nh\") pod \"518e1d88-71f4-4fe3-9ad6-f938249f1ae3\" (UID: \"518e1d88-71f4-4fe3-9ad6-f938249f1ae3\") " Dec 09 10:20:56 crc kubenswrapper[5002]: I1209 10:20:56.224327 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/518e1d88-71f4-4fe3-9ad6-f938249f1ae3-operator-scripts\") pod \"518e1d88-71f4-4fe3-9ad6-f938249f1ae3\" (UID: \"518e1d88-71f4-4fe3-9ad6-f938249f1ae3\") " Dec 09 10:20:56 crc kubenswrapper[5002]: I1209 10:20:56.224382 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cec1e643-1f06-471e-8215-c690f691bb3c-operator-scripts\") pod \"cec1e643-1f06-471e-8215-c690f691bb3c\" (UID: \"cec1e643-1f06-471e-8215-c690f691bb3c\") " Dec 09 10:20:56 crc kubenswrapper[5002]: I1209 10:20:56.224941 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/518e1d88-71f4-4fe3-9ad6-f938249f1ae3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "518e1d88-71f4-4fe3-9ad6-f938249f1ae3" (UID: "518e1d88-71f4-4fe3-9ad6-f938249f1ae3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:20:56 crc kubenswrapper[5002]: I1209 10:20:56.225130 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cec1e643-1f06-471e-8215-c690f691bb3c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cec1e643-1f06-471e-8215-c690f691bb3c" (UID: "cec1e643-1f06-471e-8215-c690f691bb3c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:20:56 crc kubenswrapper[5002]: I1209 10:20:56.225909 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c94da26-e330-42bc-b73c-3c0134b7924d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:56 crc kubenswrapper[5002]: I1209 10:20:56.225928 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9x58s\" (UniqueName: \"kubernetes.io/projected/4c94da26-e330-42bc-b73c-3c0134b7924d-kube-api-access-9x58s\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:56 crc kubenswrapper[5002]: I1209 10:20:56.225939 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/518e1d88-71f4-4fe3-9ad6-f938249f1ae3-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:56 crc kubenswrapper[5002]: I1209 10:20:56.225950 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zldw2\" (UniqueName: \"kubernetes.io/projected/37720060-6c72-494c-b89f-9525e48f9f8d-kube-api-access-zldw2\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:56 crc kubenswrapper[5002]: I1209 10:20:56.225958 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cec1e643-1f06-471e-8215-c690f691bb3c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:56 crc kubenswrapper[5002]: I1209 10:20:56.225966 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37720060-6c72-494c-b89f-9525e48f9f8d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:56 crc kubenswrapper[5002]: I1209 10:20:56.226785 5002 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cec1e643-1f06-471e-8215-c690f691bb3c-kube-api-access-8md9m" (OuterVolumeSpecName: "kube-api-access-8md9m") pod "cec1e643-1f06-471e-8215-c690f691bb3c" (UID: "cec1e643-1f06-471e-8215-c690f691bb3c"). InnerVolumeSpecName "kube-api-access-8md9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:20:56 crc kubenswrapper[5002]: I1209 10:20:56.227107 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/518e1d88-71f4-4fe3-9ad6-f938249f1ae3-kube-api-access-vm9nh" (OuterVolumeSpecName: "kube-api-access-vm9nh") pod "518e1d88-71f4-4fe3-9ad6-f938249f1ae3" (UID: "518e1d88-71f4-4fe3-9ad6-f938249f1ae3"). InnerVolumeSpecName "kube-api-access-vm9nh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:20:56 crc kubenswrapper[5002]: I1209 10:20:56.327446 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8md9m\" (UniqueName: \"kubernetes.io/projected/cec1e643-1f06-471e-8215-c690f691bb3c-kube-api-access-8md9m\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:56 crc kubenswrapper[5002]: I1209 10:20:56.327493 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vm9nh\" (UniqueName: \"kubernetes.io/projected/518e1d88-71f4-4fe3-9ad6-f938249f1ae3-kube-api-access-vm9nh\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:57 crc kubenswrapper[5002]: I1209 10:20:57.179618 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dfa166a7-dec2-453d-9cd9-f77d30f1636a","Type":"ContainerStarted","Data":"8c060664face312f2e3371362872261fcd8f50e7bd540c37d281ee014c188bec"} Dec 09 10:20:57 crc kubenswrapper[5002]: I1209 10:20:57.179674 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dfa166a7-dec2-453d-9cd9-f77d30f1636a","Type":"ContainerStarted","Data":"9c6faf241f54027209b7cce0e0fc9faf46499ba3ee26bb97a888eb1ca008dd04"} Dec 09 10:20:57 crc kubenswrapper[5002]: I1209 10:20:57.179688 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dfa166a7-dec2-453d-9cd9-f77d30f1636a","Type":"ContainerStarted","Data":"6475b21d3689ac93f09493f21c1ff2efd22fc48ca484d69e1663bc5217a41423"} Dec 09 10:20:57 crc kubenswrapper[5002]: I1209 10:20:57.563792 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-73fb-account-create-update-49bs8" Dec 09 10:20:57 crc kubenswrapper[5002]: I1209 10:20:57.570763 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-1d25-account-create-update-2kb69" Dec 09 10:20:57 crc kubenswrapper[5002]: I1209 10:20:57.652805 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff9b20e8-85f5-4a54-8505-50fc885caa71-operator-scripts\") pod \"ff9b20e8-85f5-4a54-8505-50fc885caa71\" (UID: \"ff9b20e8-85f5-4a54-8505-50fc885caa71\") " Dec 09 10:20:57 crc kubenswrapper[5002]: I1209 10:20:57.652902 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eea79378-54f4-4bc9-9673-04bdf650eb92-operator-scripts\") pod \"eea79378-54f4-4bc9-9673-04bdf650eb92\" (UID: \"eea79378-54f4-4bc9-9673-04bdf650eb92\") " Dec 09 10:20:57 crc kubenswrapper[5002]: I1209 10:20:57.652958 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d84ld\" (UniqueName: \"kubernetes.io/projected/eea79378-54f4-4bc9-9673-04bdf650eb92-kube-api-access-d84ld\") pod \"eea79378-54f4-4bc9-9673-04bdf650eb92\" (UID: \"eea79378-54f4-4bc9-9673-04bdf650eb92\") " Dec 09 10:20:57 crc kubenswrapper[5002]: I1209 10:20:57.653052 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbdlr\" (UniqueName: \"kubernetes.io/projected/ff9b20e8-85f5-4a54-8505-50fc885caa71-kube-api-access-nbdlr\") pod \"ff9b20e8-85f5-4a54-8505-50fc885caa71\" (UID: \"ff9b20e8-85f5-4a54-8505-50fc885caa71\") " Dec 09 10:20:57 crc kubenswrapper[5002]: I1209 10:20:57.654060 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff9b20e8-85f5-4a54-8505-50fc885caa71-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ff9b20e8-85f5-4a54-8505-50fc885caa71" (UID: "ff9b20e8-85f5-4a54-8505-50fc885caa71"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:20:57 crc kubenswrapper[5002]: I1209 10:20:57.654169 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eea79378-54f4-4bc9-9673-04bdf650eb92-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eea79378-54f4-4bc9-9673-04bdf650eb92" (UID: "eea79378-54f4-4bc9-9673-04bdf650eb92"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:20:57 crc kubenswrapper[5002]: I1209 10:20:57.657977 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff9b20e8-85f5-4a54-8505-50fc885caa71-kube-api-access-nbdlr" (OuterVolumeSpecName: "kube-api-access-nbdlr") pod "ff9b20e8-85f5-4a54-8505-50fc885caa71" (UID: "ff9b20e8-85f5-4a54-8505-50fc885caa71"). InnerVolumeSpecName "kube-api-access-nbdlr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:20:57 crc kubenswrapper[5002]: I1209 10:20:57.660174 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eea79378-54f4-4bc9-9673-04bdf650eb92-kube-api-access-d84ld" (OuterVolumeSpecName: "kube-api-access-d84ld") pod "eea79378-54f4-4bc9-9673-04bdf650eb92" (UID: "eea79378-54f4-4bc9-9673-04bdf650eb92"). InnerVolumeSpecName "kube-api-access-d84ld". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:20:57 crc kubenswrapper[5002]: I1209 10:20:57.754988 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d84ld\" (UniqueName: \"kubernetes.io/projected/eea79378-54f4-4bc9-9673-04bdf650eb92-kube-api-access-d84ld\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:57 crc kubenswrapper[5002]: I1209 10:20:57.755046 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbdlr\" (UniqueName: \"kubernetes.io/projected/ff9b20e8-85f5-4a54-8505-50fc885caa71-kube-api-access-nbdlr\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:57 crc kubenswrapper[5002]: I1209 10:20:57.755062 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff9b20e8-85f5-4a54-8505-50fc885caa71-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:57 crc kubenswrapper[5002]: I1209 10:20:57.755080 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eea79378-54f4-4bc9-9673-04bdf650eb92-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:20:58 crc kubenswrapper[5002]: I1209 10:20:58.187954 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1d25-account-create-update-2kb69" event={"ID":"eea79378-54f4-4bc9-9673-04bdf650eb92","Type":"ContainerDied","Data":"3d6669539a7da1a25d32a2556ee75d4756640eb0e2bad91bc83f3bb7642e2420"} Dec 09 10:20:58 crc kubenswrapper[5002]: I1209 10:20:58.188237 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d6669539a7da1a25d32a2556ee75d4756640eb0e2bad91bc83f3bb7642e2420" Dec 09 10:20:58 crc kubenswrapper[5002]: I1209 10:20:58.187969 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1d25-account-create-update-2kb69" Dec 09 10:20:58 crc kubenswrapper[5002]: I1209 10:20:58.189241 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-73fb-account-create-update-49bs8" event={"ID":"ff9b20e8-85f5-4a54-8505-50fc885caa71","Type":"ContainerDied","Data":"4bb7e154e5db3f98ebc63925989b215f6fce8ac5f5d6bc5e8751c3904ca59f6a"} Dec 09 10:20:58 crc kubenswrapper[5002]: I1209 10:20:58.189271 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4bb7e154e5db3f98ebc63925989b215f6fce8ac5f5d6bc5e8751c3904ca59f6a" Dec 09 10:20:58 crc kubenswrapper[5002]: I1209 10:20:58.189310 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-73fb-account-create-update-49bs8" Dec 09 10:20:59 crc kubenswrapper[5002]: I1209 10:20:59.233453 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dfa166a7-dec2-453d-9cd9-f77d30f1636a","Type":"ContainerStarted","Data":"e4190f8e2ba7bf2e5668737db294b27ff0a3fe18ec67358a90971c6b810a30cb"} Dec 09 10:20:59 crc kubenswrapper[5002]: I1209 10:20:59.233904 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dfa166a7-dec2-453d-9cd9-f77d30f1636a","Type":"ContainerStarted","Data":"4a883aac8893d883c07d84b09c58fe2a39a0f93405bc123e2e6b73703862158e"} Dec 09 10:20:59 crc kubenswrapper[5002]: I1209 10:20:59.233923 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dfa166a7-dec2-453d-9cd9-f77d30f1636a","Type":"ContainerStarted","Data":"64c1de5a8db5505ebb9ab646a19408256aae76bc990663c6b9d95baed247ddda"} Dec 09 10:20:59 crc kubenswrapper[5002]: I1209 10:20:59.236228 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-h2xt6" event={"ID":"d846942e-6d0d-4e42-a584-a910a56d9718","Type":"ContainerStarted","Data":"b11f0e175cf62d7332d70370bdf186828c934bc8e83952200f4524367cd0c303"} Dec 09 10:20:59 crc kubenswrapper[5002]: I1209 10:20:59.260597 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-h2xt6" podStartSLOduration=2.313524376 podStartE2EDuration="32.260576207s" podCreationTimestamp="2025-12-09 10:20:27 +0000 UTC" firstStartedPulling="2025-12-09 10:20:28.101171806 +0000 UTC m=+1160.493222907" lastFinishedPulling="2025-12-09 10:20:58.048223657 +0000 UTC m=+1190.440274738" observedRunningTime="2025-12-09 10:20:59.25363314 +0000 UTC m=+1191.645684221" watchObservedRunningTime="2025-12-09 10:20:59.260576207 +0000 UTC m=+1191.652627288" Dec 09 10:21:00 crc kubenswrapper[5002]: I1209 10:21:00.252117 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dfa166a7-dec2-453d-9cd9-f77d30f1636a","Type":"ContainerStarted","Data":"76763b7766e9025115e42c4aebcef8bd5282beaf9f41d7c400aa31601a1ff35f"} Dec 09 10:21:00 crc kubenswrapper[5002]: I1209 10:21:00.252464 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dfa166a7-dec2-453d-9cd9-f77d30f1636a","Type":"ContainerStarted","Data":"7f97157518e27503872febf7ddda3d551dbd7a2803115b464389b1571f0d9e20"} Dec 09 10:21:00 crc kubenswrapper[5002]: I1209 10:21:00.252476 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dfa166a7-dec2-453d-9cd9-f77d30f1636a","Type":"ContainerStarted","Data":"4430f2180524a0ee4235580c2b9d8df48e33a7e227458714f054c2d1a1d7f033"} Dec 09 10:21:00 crc kubenswrapper[5002]: I1209 10:21:00.252485 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dfa166a7-dec2-453d-9cd9-f77d30f1636a","Type":"ContainerStarted","Data":"1d1d17700b9060a9639585ef5fdeb4b80245971f6249c58f20c9f2f5a5cba149"} Dec 09 10:21:00 crc kubenswrapper[5002]: I1209 10:21:00.254147 5002 generic.go:334] "Generic (PLEG): container finished" podID="5cb87512-ad2e-4510-ab87-ac4a0a8d09ae" containerID="20ae5f344909f1778bd970bb441b4e8f054eb7da8d7d4f81a65ab39091b3eb8e" exitCode=0 Dec 09 10:21:00 crc kubenswrapper[5002]: I1209 10:21:00.254190 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-7zbg6" 
event={"ID":"5cb87512-ad2e-4510-ab87-ac4a0a8d09ae","Type":"ContainerDied","Data":"20ae5f344909f1778bd970bb441b4e8f054eb7da8d7d4f81a65ab39091b3eb8e"} Dec 09 10:21:00 crc kubenswrapper[5002]: I1209 10:21:00.294404 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=43.271158764 podStartE2EDuration="57.294383685s" podCreationTimestamp="2025-12-09 10:20:03 +0000 UTC" firstStartedPulling="2025-12-09 10:20:44.347959207 +0000 UTC m=+1176.740010288" lastFinishedPulling="2025-12-09 10:20:58.371184128 +0000 UTC m=+1190.763235209" observedRunningTime="2025-12-09 10:21:00.288231138 +0000 UTC m=+1192.680282219" watchObservedRunningTime="2025-12-09 10:21:00.294383685 +0000 UTC m=+1192.686434776" Dec 09 10:21:00 crc kubenswrapper[5002]: I1209 10:21:00.568416 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-4czbs"] Dec 09 10:21:00 crc kubenswrapper[5002]: E1209 10:21:00.568782 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff9b20e8-85f5-4a54-8505-50fc885caa71" containerName="mariadb-account-create-update" Dec 09 10:21:00 crc kubenswrapper[5002]: I1209 10:21:00.568798 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff9b20e8-85f5-4a54-8505-50fc885caa71" containerName="mariadb-account-create-update" Dec 09 10:21:00 crc kubenswrapper[5002]: E1209 10:21:00.568834 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37720060-6c72-494c-b89f-9525e48f9f8d" containerName="mariadb-database-create" Dec 09 10:21:00 crc kubenswrapper[5002]: I1209 10:21:00.568841 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="37720060-6c72-494c-b89f-9525e48f9f8d" containerName="mariadb-database-create" Dec 09 10:21:00 crc kubenswrapper[5002]: E1209 10:21:00.568857 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="518e1d88-71f4-4fe3-9ad6-f938249f1ae3" containerName="mariadb-account-create-update" Dec 09 10:21:00 crc kubenswrapper[5002]: I1209 10:21:00.568866 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="518e1d88-71f4-4fe3-9ad6-f938249f1ae3" containerName="mariadb-account-create-update" Dec 09 10:21:00 crc kubenswrapper[5002]: E1209 10:21:00.568877 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eea79378-54f4-4bc9-9673-04bdf650eb92" containerName="mariadb-account-create-update" Dec 09 10:21:00 crc kubenswrapper[5002]: I1209 10:21:00.568883 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="eea79378-54f4-4bc9-9673-04bdf650eb92" containerName="mariadb-account-create-update" Dec 09 10:21:00 crc kubenswrapper[5002]: E1209 10:21:00.568897 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c94da26-e330-42bc-b73c-3c0134b7924d" containerName="mariadb-database-create" Dec 09 10:21:00 crc kubenswrapper[5002]: I1209 10:21:00.568903 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c94da26-e330-42bc-b73c-3c0134b7924d" containerName="mariadb-database-create" Dec 09 10:21:00 crc kubenswrapper[5002]: E1209 10:21:00.568916 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cec1e643-1f06-471e-8215-c690f691bb3c" containerName="mariadb-database-create" Dec 09 10:21:00 crc kubenswrapper[5002]: I1209 10:21:00.568923 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="cec1e643-1f06-471e-8215-c690f691bb3c" containerName="mariadb-database-create" Dec 09 10:21:00 crc kubenswrapper[5002]: I1209 10:21:00.569072 5002 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="518e1d88-71f4-4fe3-9ad6-f938249f1ae3" containerName="mariadb-account-create-update" Dec 09 10:21:00 crc kubenswrapper[5002]: I1209 10:21:00.569097 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="37720060-6c72-494c-b89f-9525e48f9f8d" containerName="mariadb-database-create" Dec 09 10:21:00 crc kubenswrapper[5002]: I1209 10:21:00.569111 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="eea79378-54f4-4bc9-9673-04bdf650eb92" containerName="mariadb-account-create-update" Dec 09 10:21:00 crc kubenswrapper[5002]: I1209 10:21:00.569120 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff9b20e8-85f5-4a54-8505-50fc885caa71" containerName="mariadb-account-create-update" Dec 09 10:21:00 crc kubenswrapper[5002]: I1209 10:21:00.569129 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="cec1e643-1f06-471e-8215-c690f691bb3c" containerName="mariadb-database-create" Dec 09 10:21:00 crc kubenswrapper[5002]: I1209 10:21:00.569141 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c94da26-e330-42bc-b73c-3c0134b7924d" containerName="mariadb-database-create" Dec 09 10:21:00 crc kubenswrapper[5002]: I1209 10:21:00.570010 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-4czbs" Dec 09 10:21:00 crc kubenswrapper[5002]: I1209 10:21:00.571832 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 09 10:21:00 crc kubenswrapper[5002]: I1209 10:21:00.581399 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-4czbs"] Dec 09 10:21:00 crc kubenswrapper[5002]: I1209 10:21:00.702968 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21e0cdee-587a-4b73-b669-8a0b8e0f79b6-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-4czbs\" (UID: \"21e0cdee-587a-4b73-b669-8a0b8e0f79b6\") " pod="openstack/dnsmasq-dns-764c5664d7-4czbs" Dec 09 10:21:00 crc kubenswrapper[5002]: I1209 10:21:00.703045 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21e0cdee-587a-4b73-b669-8a0b8e0f79b6-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-4czbs\" (UID: \"21e0cdee-587a-4b73-b669-8a0b8e0f79b6\") " pod="openstack/dnsmasq-dns-764c5664d7-4czbs" Dec 09 10:21:00 crc kubenswrapper[5002]: I1209 10:21:00.703183 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjwj2\" (UniqueName: \"kubernetes.io/projected/21e0cdee-587a-4b73-b669-8a0b8e0f79b6-kube-api-access-zjwj2\") pod \"dnsmasq-dns-764c5664d7-4czbs\" (UID: \"21e0cdee-587a-4b73-b669-8a0b8e0f79b6\") " pod="openstack/dnsmasq-dns-764c5664d7-4czbs" Dec 09 10:21:00 crc kubenswrapper[5002]: I1209 10:21:00.703207 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21e0cdee-587a-4b73-b669-8a0b8e0f79b6-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-4czbs\" (UID: \"21e0cdee-587a-4b73-b669-8a0b8e0f79b6\") " pod="openstack/dnsmasq-dns-764c5664d7-4czbs" Dec 09 10:21:00 crc kubenswrapper[5002]: I1209 10:21:00.703239 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/21e0cdee-587a-4b73-b669-8a0b8e0f79b6-config\") pod \"dnsmasq-dns-764c5664d7-4czbs\" (UID: \"21e0cdee-587a-4b73-b669-8a0b8e0f79b6\") " pod="openstack/dnsmasq-dns-764c5664d7-4czbs" Dec 09 10:21:00 crc kubenswrapper[5002]: I1209 10:21:00.703308 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21e0cdee-587a-4b73-b669-8a0b8e0f79b6-dns-svc\") pod \"dnsmasq-dns-764c5664d7-4czbs\" (UID: \"21e0cdee-587a-4b73-b669-8a0b8e0f79b6\") " pod="openstack/dnsmasq-dns-764c5664d7-4czbs" Dec 09 10:21:00 crc kubenswrapper[5002]: I1209 10:21:00.804755 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21e0cdee-587a-4b73-b669-8a0b8e0f79b6-dns-svc\") pod \"dnsmasq-dns-764c5664d7-4czbs\" (UID: \"21e0cdee-587a-4b73-b669-8a0b8e0f79b6\") " pod="openstack/dnsmasq-dns-764c5664d7-4czbs" Dec 09 10:21:00 crc kubenswrapper[5002]: I1209 10:21:00.804833 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21e0cdee-587a-4b73-b669-8a0b8e0f79b6-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-4czbs\" (UID: \"21e0cdee-587a-4b73-b669-8a0b8e0f79b6\") " pod="openstack/dnsmasq-dns-764c5664d7-4czbs" Dec 09 10:21:00 crc kubenswrapper[5002]: I1209 10:21:00.804873 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21e0cdee-587a-4b73-b669-8a0b8e0f79b6-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-4czbs\" (UID: \"21e0cdee-587a-4b73-b669-8a0b8e0f79b6\") " pod="openstack/dnsmasq-dns-764c5664d7-4czbs" Dec 09 10:21:00 crc kubenswrapper[5002]: I1209 10:21:00.804948 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjwj2\" (UniqueName: \"kubernetes.io/projected/21e0cdee-587a-4b73-b669-8a0b8e0f79b6-kube-api-access-zjwj2\") pod \"dnsmasq-dns-764c5664d7-4czbs\" (UID: \"21e0cdee-587a-4b73-b669-8a0b8e0f79b6\") " pod="openstack/dnsmasq-dns-764c5664d7-4czbs" Dec 09 10:21:00 crc kubenswrapper[5002]: I1209 10:21:00.804969 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21e0cdee-587a-4b73-b669-8a0b8e0f79b6-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-4czbs\" (UID: \"21e0cdee-587a-4b73-b669-8a0b8e0f79b6\") " pod="openstack/dnsmasq-dns-764c5664d7-4czbs" Dec 09 10:21:00 crc kubenswrapper[5002]: I1209 10:21:00.804988 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21e0cdee-587a-4b73-b669-8a0b8e0f79b6-config\") pod \"dnsmasq-dns-764c5664d7-4czbs\" (UID: \"21e0cdee-587a-4b73-b669-8a0b8e0f79b6\") " pod="openstack/dnsmasq-dns-764c5664d7-4czbs" Dec 09 10:21:00 crc kubenswrapper[5002]: I1209 10:21:00.805766 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21e0cdee-587a-4b73-b669-8a0b8e0f79b6-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-4czbs\" (UID: \"21e0cdee-587a-4b73-b669-8a0b8e0f79b6\") " pod="openstack/dnsmasq-dns-764c5664d7-4czbs" Dec 09 10:21:00 crc kubenswrapper[5002]: I1209 10:21:00.805934 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21e0cdee-587a-4b73-b669-8a0b8e0f79b6-dns-svc\") pod 
\"dnsmasq-dns-764c5664d7-4czbs\" (UID: \"21e0cdee-587a-4b73-b669-8a0b8e0f79b6\") " pod="openstack/dnsmasq-dns-764c5664d7-4czbs" Dec 09 10:21:00 crc kubenswrapper[5002]: I1209 10:21:00.806000 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21e0cdee-587a-4b73-b669-8a0b8e0f79b6-config\") pod \"dnsmasq-dns-764c5664d7-4czbs\" (UID: \"21e0cdee-587a-4b73-b669-8a0b8e0f79b6\") " pod="openstack/dnsmasq-dns-764c5664d7-4czbs" Dec 09 10:21:00 crc kubenswrapper[5002]: I1209 10:21:00.806077 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21e0cdee-587a-4b73-b669-8a0b8e0f79b6-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-4czbs\" (UID: \"21e0cdee-587a-4b73-b669-8a0b8e0f79b6\") " pod="openstack/dnsmasq-dns-764c5664d7-4czbs" Dec 09 10:21:00 crc kubenswrapper[5002]: I1209 10:21:00.806150 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21e0cdee-587a-4b73-b669-8a0b8e0f79b6-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-4czbs\" (UID: \"21e0cdee-587a-4b73-b669-8a0b8e0f79b6\") " pod="openstack/dnsmasq-dns-764c5664d7-4czbs" Dec 09 10:21:00 crc kubenswrapper[5002]: I1209 10:21:00.824945 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjwj2\" (UniqueName: \"kubernetes.io/projected/21e0cdee-587a-4b73-b669-8a0b8e0f79b6-kube-api-access-zjwj2\") pod \"dnsmasq-dns-764c5664d7-4czbs\" (UID: \"21e0cdee-587a-4b73-b669-8a0b8e0f79b6\") " pod="openstack/dnsmasq-dns-764c5664d7-4czbs" Dec 09 10:21:00 crc kubenswrapper[5002]: I1209 10:21:00.884121 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-4czbs" Dec 09 10:21:01 crc kubenswrapper[5002]: I1209 10:21:01.369653 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-4czbs"] Dec 09 10:21:01 crc kubenswrapper[5002]: W1209 10:21:01.379904 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21e0cdee_587a_4b73_b669_8a0b8e0f79b6.slice/crio-f7bf85391156370fa3410346ec6c8f6cdbb54298bcf80714f2283db03b2940f1 WatchSource:0}: Error finding container f7bf85391156370fa3410346ec6c8f6cdbb54298bcf80714f2283db03b2940f1: Status 404 returned error can't find the container with id f7bf85391156370fa3410346ec6c8f6cdbb54298bcf80714f2283db03b2940f1 Dec 09 10:21:01 crc kubenswrapper[5002]: I1209 10:21:01.504938 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-7zbg6" Dec 09 10:21:01 crc kubenswrapper[5002]: I1209 10:21:01.623024 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cb87512-ad2e-4510-ab87-ac4a0a8d09ae-config-data\") pod \"5cb87512-ad2e-4510-ab87-ac4a0a8d09ae\" (UID: \"5cb87512-ad2e-4510-ab87-ac4a0a8d09ae\") " Dec 09 10:21:01 crc kubenswrapper[5002]: I1209 10:21:01.623413 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cb87512-ad2e-4510-ab87-ac4a0a8d09ae-combined-ca-bundle\") pod \"5cb87512-ad2e-4510-ab87-ac4a0a8d09ae\" (UID: \"5cb87512-ad2e-4510-ab87-ac4a0a8d09ae\") " Dec 09 10:21:01 crc kubenswrapper[5002]: I1209 10:21:01.623442 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgvpf\" (UniqueName: \"kubernetes.io/projected/5cb87512-ad2e-4510-ab87-ac4a0a8d09ae-kube-api-access-kgvpf\") pod \"5cb87512-ad2e-4510-ab87-ac4a0a8d09ae\" (UID: \"5cb87512-ad2e-4510-ab87-ac4a0a8d09ae\") " Dec 09 10:21:01 crc kubenswrapper[5002]: I1209 10:21:01.629311 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cb87512-ad2e-4510-ab87-ac4a0a8d09ae-kube-api-access-kgvpf" (OuterVolumeSpecName: "kube-api-access-kgvpf") pod "5cb87512-ad2e-4510-ab87-ac4a0a8d09ae" (UID: "5cb87512-ad2e-4510-ab87-ac4a0a8d09ae"). InnerVolumeSpecName "kube-api-access-kgvpf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:21:01 crc kubenswrapper[5002]: I1209 10:21:01.668723 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cb87512-ad2e-4510-ab87-ac4a0a8d09ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5cb87512-ad2e-4510-ab87-ac4a0a8d09ae" (UID: "5cb87512-ad2e-4510-ab87-ac4a0a8d09ae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:21:01 crc kubenswrapper[5002]: I1209 10:21:01.694166 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cb87512-ad2e-4510-ab87-ac4a0a8d09ae-config-data" (OuterVolumeSpecName: "config-data") pod "5cb87512-ad2e-4510-ab87-ac4a0a8d09ae" (UID: "5cb87512-ad2e-4510-ab87-ac4a0a8d09ae"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:21:01 crc kubenswrapper[5002]: I1209 10:21:01.725467 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cb87512-ad2e-4510-ab87-ac4a0a8d09ae-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:21:01 crc kubenswrapper[5002]: I1209 10:21:01.725494 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cb87512-ad2e-4510-ab87-ac4a0a8d09ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:21:01 crc kubenswrapper[5002]: I1209 10:21:01.725505 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgvpf\" (UniqueName: \"kubernetes.io/projected/5cb87512-ad2e-4510-ab87-ac4a0a8d09ae-kube-api-access-kgvpf\") on node \"crc\" DevicePath \"\"" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.272472 5002 generic.go:334] "Generic (PLEG): container finished" podID="21e0cdee-587a-4b73-b669-8a0b8e0f79b6" containerID="489c643394704f1bfc4cfa53dcbf2b3de6ef27224974f6af6b19889b7a2aab6d" exitCode=0 Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.272570 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-4czbs" event={"ID":"21e0cdee-587a-4b73-b669-8a0b8e0f79b6","Type":"ContainerDied","Data":"489c643394704f1bfc4cfa53dcbf2b3de6ef27224974f6af6b19889b7a2aab6d"} Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.272915 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-4czbs" event={"ID":"21e0cdee-587a-4b73-b669-8a0b8e0f79b6","Type":"ContainerStarted","Data":"f7bf85391156370fa3410346ec6c8f6cdbb54298bcf80714f2283db03b2940f1"} Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.276456 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-7zbg6" event={"ID":"5cb87512-ad2e-4510-ab87-ac4a0a8d09ae","Type":"ContainerDied","Data":"45b0b9bd6d3f8c1a1a8cc37a97ad28daa7553d53acdfb46ca09c5860d3c9a0b4"} Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.276491 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45b0b9bd6d3f8c1a1a8cc37a97ad28daa7553d53acdfb46ca09c5860d3c9a0b4" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.276546 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-7zbg6" Dec 09 10:21:02 crc kubenswrapper[5002]: E1209 10:21:02.578826 5002 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Dec 09 10:21:02 crc kubenswrapper[5002]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/21e0cdee-587a-4b73-b669-8a0b8e0f79b6/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 09 10:21:02 crc kubenswrapper[5002]: > podSandboxID="f7bf85391156370fa3410346ec6c8f6cdbb54298bcf80714f2283db03b2940f1" Dec 09 10:21:02 crc kubenswrapper[5002]: E1209 10:21:02.579247 5002 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 09 10:21:02 crc kubenswrapper[5002]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n66chbbh56dh7fhfh68chf9hfdhbdh587h5b9h568h68fh77h5b5h559h577h687h574h5d5h584h8chd9hb4h66h566h545h699h564h568h66fhc9q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-swift-storage-0,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-swift-storage-0,SubPath:dns-swift-storage-0,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zjwj2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-764c5664d7-4czbs_openstack(21e0cdee-587a-4b73-b669-8a0b8e0f79b6): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/21e0cdee-587a-4b73-b669-8a0b8e0f79b6/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 09 10:21:02 crc kubenswrapper[5002]: > logger="UnhandledError" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.579545 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-pqcn4"] Dec 09 10:21:02 crc kubenswrapper[5002]: E1209 10:21:02.579904 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cb87512-ad2e-4510-ab87-ac4a0a8d09ae" containerName="keystone-db-sync" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.579919 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cb87512-ad2e-4510-ab87-ac4a0a8d09ae" containerName="keystone-db-sync" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.580102 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cb87512-ad2e-4510-ab87-ac4a0a8d09ae" containerName="keystone-db-sync" Dec 09 10:21:02 crc kubenswrapper[5002]: E1209 10:21:02.580579 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/21e0cdee-587a-4b73-b669-8a0b8e0f79b6/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-764c5664d7-4czbs" podUID="21e0cdee-587a-4b73-b669-8a0b8e0f79b6" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.580653 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-pqcn4" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.599385 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.600163 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-q4kd8" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.600288 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.606662 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.606889 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.613031 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-4czbs"] Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.624917 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-pqcn4"] Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.633881 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-9zdhz"] Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.635665 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-9zdhz" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.645424 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-9zdhz"] Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.729220 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.733512 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.737426 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.740006 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.751481 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a626587d-9f0c-47cb-88e6-c28ab4cd1c51-fernet-keys\") pod \"keystone-bootstrap-pqcn4\" (UID: \"a626587d-9f0c-47cb-88e6-c28ab4cd1c51\") " pod="openstack/keystone-bootstrap-pqcn4" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.751522 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c3ee85ff-9d22-483c-9a01-3366518ca2d3-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-9zdhz\" (UID: \"c3ee85ff-9d22-483c-9a01-3366518ca2d3\") " pod="openstack/dnsmasq-dns-5959f8865f-9zdhz" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.751561 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3ee85ff-9d22-483c-9a01-3366518ca2d3-dns-svc\") pod \"dnsmasq-dns-5959f8865f-9zdhz\" (UID: \"c3ee85ff-9d22-483c-9a01-3366518ca2d3\") " pod="openstack/dnsmasq-dns-5959f8865f-9zdhz" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.751575 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5mtq\" (UniqueName: \"kubernetes.io/projected/a626587d-9f0c-47cb-88e6-c28ab4cd1c51-kube-api-access-s5mtq\") pod \"keystone-bootstrap-pqcn4\" (UID: \"a626587d-9f0c-47cb-88e6-c28ab4cd1c51\") " pod="openstack/keystone-bootstrap-pqcn4" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.751604 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a626587d-9f0c-47cb-88e6-c28ab4cd1c51-scripts\") pod \"keystone-bootstrap-pqcn4\" (UID: \"a626587d-9f0c-47cb-88e6-c28ab4cd1c51\") " pod="openstack/keystone-bootstrap-pqcn4" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.751622 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2ss2\" (UniqueName: \"kubernetes.io/projected/c3ee85ff-9d22-483c-9a01-3366518ca2d3-kube-api-access-p2ss2\") pod \"dnsmasq-dns-5959f8865f-9zdhz\" (UID: \"c3ee85ff-9d22-483c-9a01-3366518ca2d3\") " pod="openstack/dnsmasq-dns-5959f8865f-9zdhz" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.751765 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a626587d-9f0c-47cb-88e6-c28ab4cd1c51-credential-keys\") pod \"keystone-bootstrap-pqcn4\" (UID: \"a626587d-9f0c-47cb-88e6-c28ab4cd1c51\") " pod="openstack/keystone-bootstrap-pqcn4" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.751890 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c3ee85ff-9d22-483c-9a01-3366518ca2d3-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-9zdhz\" (UID: 
\"c3ee85ff-9d22-483c-9a01-3366518ca2d3\") " pod="openstack/dnsmasq-dns-5959f8865f-9zdhz" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.751961 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a626587d-9f0c-47cb-88e6-c28ab4cd1c51-config-data\") pod \"keystone-bootstrap-pqcn4\" (UID: \"a626587d-9f0c-47cb-88e6-c28ab4cd1c51\") " pod="openstack/keystone-bootstrap-pqcn4" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.752018 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3ee85ff-9d22-483c-9a01-3366518ca2d3-config\") pod \"dnsmasq-dns-5959f8865f-9zdhz\" (UID: \"c3ee85ff-9d22-483c-9a01-3366518ca2d3\") " pod="openstack/dnsmasq-dns-5959f8865f-9zdhz" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.752037 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c3ee85ff-9d22-483c-9a01-3366518ca2d3-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-9zdhz\" (UID: \"c3ee85ff-9d22-483c-9a01-3366518ca2d3\") " pod="openstack/dnsmasq-dns-5959f8865f-9zdhz" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.752089 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a626587d-9f0c-47cb-88e6-c28ab4cd1c51-combined-ca-bundle\") pod \"keystone-bootstrap-pqcn4\" (UID: \"a626587d-9f0c-47cb-88e6-c28ab4cd1c51\") " pod="openstack/keystone-bootstrap-pqcn4" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.758634 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-4kmzk"] Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.759696 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-4kmzk" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.764025 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.764028 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.764190 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-xsqqp" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.772458 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-4kmzk"] Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.785996 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.815077 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-52w6p"] Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.817170 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-52w6p" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.822657 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-7tnw8" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.822684 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.822754 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.843062 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-t7vtz"] Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.844334 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-t7vtz" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.850413 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.850584 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-4q7mf" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.856851 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a626587d-9f0c-47cb-88e6-c28ab4cd1c51-credential-keys\") pod \"keystone-bootstrap-pqcn4\" (UID: \"a626587d-9f0c-47cb-88e6-c28ab4cd1c51\") " pod="openstack/keystone-bootstrap-pqcn4" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.857915 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/75879941-b12b-4731-a78c-0a0a98142ec0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"75879941-b12b-4731-a78c-0a0a98142ec0\") " pod="openstack/ceilometer-0" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.857946 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/20ebb6ea-f36b-440a-a437-ff39f9766fca-db-sync-config-data\") pod \"cinder-db-sync-4kmzk\" (UID: \"20ebb6ea-f36b-440a-a437-ff39f9766fca\") " pod="openstack/cinder-db-sync-4kmzk" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.857962 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20ebb6ea-f36b-440a-a437-ff39f9766fca-scripts\") pod \"cinder-db-sync-4kmzk\" (UID: \"20ebb6ea-f36b-440a-a437-ff39f9766fca\") " pod="openstack/cinder-db-sync-4kmzk" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.857986 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20ebb6ea-f36b-440a-a437-ff39f9766fca-config-data\") pod \"cinder-db-sync-4kmzk\" (UID: \"20ebb6ea-f36b-440a-a437-ff39f9766fca\") " pod="openstack/cinder-db-sync-4kmzk" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.858013 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c3ee85ff-9d22-483c-9a01-3366518ca2d3-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-9zdhz\" (UID: \"c3ee85ff-9d22-483c-9a01-3366518ca2d3\") " pod="openstack/dnsmasq-dns-5959f8865f-9zdhz" Dec 09 
Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.858061 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a626587d-9f0c-47cb-88e6-c28ab4cd1c51-config-data\") pod \"keystone-bootstrap-pqcn4\" (UID: \"a626587d-9f0c-47cb-88e6-c28ab4cd1c51\") " pod="openstack/keystone-bootstrap-pqcn4"
Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.858106 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhwbz\" (UniqueName: \"kubernetes.io/projected/75879941-b12b-4731-a78c-0a0a98142ec0-kube-api-access-bhwbz\") pod \"ceilometer-0\" (UID: \"75879941-b12b-4731-a78c-0a0a98142ec0\") " pod="openstack/ceilometer-0"
Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.858129 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3ee85ff-9d22-483c-9a01-3366518ca2d3-config\") pod \"dnsmasq-dns-5959f8865f-9zdhz\" (UID: \"c3ee85ff-9d22-483c-9a01-3366518ca2d3\") " pod="openstack/dnsmasq-dns-5959f8865f-9zdhz"
Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.858148 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c3ee85ff-9d22-483c-9a01-3366518ca2d3-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-9zdhz\" (UID: \"c3ee85ff-9d22-483c-9a01-3366518ca2d3\") " pod="openstack/dnsmasq-dns-5959f8865f-9zdhz"
Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.858181 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf8zz\" (UniqueName: \"kubernetes.io/projected/20ebb6ea-f36b-440a-a437-ff39f9766fca-kube-api-access-hf8zz\") pod \"cinder-db-sync-4kmzk\" (UID: \"20ebb6ea-f36b-440a-a437-ff39f9766fca\") " pod="openstack/cinder-db-sync-4kmzk"
Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.858214 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a626587d-9f0c-47cb-88e6-c28ab4cd1c51-combined-ca-bundle\") pod \"keystone-bootstrap-pqcn4\" (UID: \"a626587d-9f0c-47cb-88e6-c28ab4cd1c51\") " pod="openstack/keystone-bootstrap-pqcn4"
Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.858242 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75879941-b12b-4731-a78c-0a0a98142ec0-run-httpd\") pod \"ceilometer-0\" (UID: \"75879941-b12b-4731-a78c-0a0a98142ec0\") " pod="openstack/ceilometer-0"
Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.858283 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a626587d-9f0c-47cb-88e6-c28ab4cd1c51-fernet-keys\") pod \"keystone-bootstrap-pqcn4\" (UID: \"a626587d-9f0c-47cb-88e6-c28ab4cd1c51\") " pod="openstack/keystone-bootstrap-pqcn4"
Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.858302 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75879941-b12b-4731-a78c-0a0a98142ec0-scripts\") pod \"ceilometer-0\" (UID: \"75879941-b12b-4731-a78c-0a0a98142ec0\") " pod="openstack/ceilometer-0"
Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.858318 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20ebb6ea-f36b-440a-a437-ff39f9766fca-combined-ca-bundle\") pod \"cinder-db-sync-4kmzk\" (UID: \"20ebb6ea-f36b-440a-a437-ff39f9766fca\") " pod="openstack/cinder-db-sync-4kmzk"
Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.858342 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c3ee85ff-9d22-483c-9a01-3366518ca2d3-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-9zdhz\" (UID: \"c3ee85ff-9d22-483c-9a01-3366518ca2d3\") " pod="openstack/dnsmasq-dns-5959f8865f-9zdhz"
Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.858368 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/20ebb6ea-f36b-440a-a437-ff39f9766fca-etc-machine-id\") pod \"cinder-db-sync-4kmzk\" (UID: \"20ebb6ea-f36b-440a-a437-ff39f9766fca\") " pod="openstack/cinder-db-sync-4kmzk"
Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.858397 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75879941-b12b-4731-a78c-0a0a98142ec0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"75879941-b12b-4731-a78c-0a0a98142ec0\") " pod="openstack/ceilometer-0"
Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.858432 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3ee85ff-9d22-483c-9a01-3366518ca2d3-dns-svc\") pod \"dnsmasq-dns-5959f8865f-9zdhz\" (UID: \"c3ee85ff-9d22-483c-9a01-3366518ca2d3\") " pod="openstack/dnsmasq-dns-5959f8865f-9zdhz"
Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.858448 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5mtq\" (UniqueName: \"kubernetes.io/projected/a626587d-9f0c-47cb-88e6-c28ab4cd1c51-kube-api-access-s5mtq\") pod \"keystone-bootstrap-pqcn4\" (UID: \"a626587d-9f0c-47cb-88e6-c28ab4cd1c51\") " pod="openstack/keystone-bootstrap-pqcn4"
Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.858465 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75879941-b12b-4731-a78c-0a0a98142ec0-log-httpd\") pod \"ceilometer-0\" (UID: \"75879941-b12b-4731-a78c-0a0a98142ec0\") " pod="openstack/ceilometer-0"
Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.858505 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75879941-b12b-4731-a78c-0a0a98142ec0-config-data\") pod \"ceilometer-0\" (UID: \"75879941-b12b-4731-a78c-0a0a98142ec0\") " pod="openstack/ceilometer-0"
Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.858525 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a626587d-9f0c-47cb-88e6-c28ab4cd1c51-scripts\") pod \"keystone-bootstrap-pqcn4\" (UID: \"a626587d-9f0c-47cb-88e6-c28ab4cd1c51\") " pod="openstack/keystone-bootstrap-pqcn4"
Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.858547 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2ss2\" (UniqueName: \"kubernetes.io/projected/c3ee85ff-9d22-483c-9a01-3366518ca2d3-kube-api-access-p2ss2\") pod \"dnsmasq-dns-5959f8865f-9zdhz\" (UID: \"c3ee85ff-9d22-483c-9a01-3366518ca2d3\") " pod="openstack/dnsmasq-dns-5959f8865f-9zdhz"
\"kubernetes.io/projected/c3ee85ff-9d22-483c-9a01-3366518ca2d3-kube-api-access-p2ss2\") pod \"dnsmasq-dns-5959f8865f-9zdhz\" (UID: \"c3ee85ff-9d22-483c-9a01-3366518ca2d3\") " pod="openstack/dnsmasq-dns-5959f8865f-9zdhz" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.862305 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3ee85ff-9d22-483c-9a01-3366518ca2d3-dns-svc\") pod \"dnsmasq-dns-5959f8865f-9zdhz\" (UID: \"c3ee85ff-9d22-483c-9a01-3366518ca2d3\") " pod="openstack/dnsmasq-dns-5959f8865f-9zdhz" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.862430 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c3ee85ff-9d22-483c-9a01-3366518ca2d3-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-9zdhz\" (UID: \"c3ee85ff-9d22-483c-9a01-3366518ca2d3\") " pod="openstack/dnsmasq-dns-5959f8865f-9zdhz" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.862850 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c3ee85ff-9d22-483c-9a01-3366518ca2d3-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-9zdhz\" (UID: \"c3ee85ff-9d22-483c-9a01-3366518ca2d3\") " pod="openstack/dnsmasq-dns-5959f8865f-9zdhz" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.864039 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c3ee85ff-9d22-483c-9a01-3366518ca2d3-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-9zdhz\" (UID: \"c3ee85ff-9d22-483c-9a01-3366518ca2d3\") " pod="openstack/dnsmasq-dns-5959f8865f-9zdhz" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.866134 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3ee85ff-9d22-483c-9a01-3366518ca2d3-config\") pod \"dnsmasq-dns-5959f8865f-9zdhz\" (UID: \"c3ee85ff-9d22-483c-9a01-3366518ca2d3\") " pod="openstack/dnsmasq-dns-5959f8865f-9zdhz" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.866723 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a626587d-9f0c-47cb-88e6-c28ab4cd1c51-credential-keys\") pod \"keystone-bootstrap-pqcn4\" (UID: \"a626587d-9f0c-47cb-88e6-c28ab4cd1c51\") " pod="openstack/keystone-bootstrap-pqcn4" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.867714 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a626587d-9f0c-47cb-88e6-c28ab4cd1c51-scripts\") pod \"keystone-bootstrap-pqcn4\" (UID: \"a626587d-9f0c-47cb-88e6-c28ab4cd1c51\") " pod="openstack/keystone-bootstrap-pqcn4" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.868481 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a626587d-9f0c-47cb-88e6-c28ab4cd1c51-config-data\") pod \"keystone-bootstrap-pqcn4\" (UID: \"a626587d-9f0c-47cb-88e6-c28ab4cd1c51\") " pod="openstack/keystone-bootstrap-pqcn4" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.869409 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-52w6p"] Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.877901 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/a626587d-9f0c-47cb-88e6-c28ab4cd1c51-fernet-keys\") pod \"keystone-bootstrap-pqcn4\" (UID: \"a626587d-9f0c-47cb-88e6-c28ab4cd1c51\") " pod="openstack/keystone-bootstrap-pqcn4" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.884316 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-9zdhz"] Dec 09 10:21:02 crc kubenswrapper[5002]: E1209 10:21:02.885030 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-p2ss2], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-5959f8865f-9zdhz" podUID="c3ee85ff-9d22-483c-9a01-3366518ca2d3" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.889227 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5mtq\" (UniqueName: \"kubernetes.io/projected/a626587d-9f0c-47cb-88e6-c28ab4cd1c51-kube-api-access-s5mtq\") pod \"keystone-bootstrap-pqcn4\" (UID: \"a626587d-9f0c-47cb-88e6-c28ab4cd1c51\") " pod="openstack/keystone-bootstrap-pqcn4" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.891094 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a626587d-9f0c-47cb-88e6-c28ab4cd1c51-combined-ca-bundle\") pod \"keystone-bootstrap-pqcn4\" (UID: \"a626587d-9f0c-47cb-88e6-c28ab4cd1c51\") " pod="openstack/keystone-bootstrap-pqcn4" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.898987 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2ss2\" (UniqueName: \"kubernetes.io/projected/c3ee85ff-9d22-483c-9a01-3366518ca2d3-kube-api-access-p2ss2\") pod \"dnsmasq-dns-5959f8865f-9zdhz\" (UID: \"c3ee85ff-9d22-483c-9a01-3366518ca2d3\") " pod="openstack/dnsmasq-dns-5959f8865f-9zdhz" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.903036 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-h7bpf"] Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.904406 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-h7bpf" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.910123 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.910550 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.911528 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-f557k" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.916085 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-h7bpf"] Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.925980 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-t7vtz"] Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.960577 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hf8zz\" (UniqueName: \"kubernetes.io/projected/20ebb6ea-f36b-440a-a437-ff39f9766fca-kube-api-access-hf8zz\") pod \"cinder-db-sync-4kmzk\" (UID: \"20ebb6ea-f36b-440a-a437-ff39f9766fca\") " pod="openstack/cinder-db-sync-4kmzk" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.960627 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75879941-b12b-4731-a78c-0a0a98142ec0-run-httpd\") pod \"ceilometer-0\" (UID: \"75879941-b12b-4731-a78c-0a0a98142ec0\") " pod="openstack/ceilometer-0" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.960658 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75879941-b12b-4731-a78c-0a0a98142ec0-scripts\") pod \"ceilometer-0\" (UID: \"75879941-b12b-4731-a78c-0a0a98142ec0\") " pod="openstack/ceilometer-0" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.960677 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/354d641d-dc9c-4aa4-b821-90ce72ef6d5c-db-sync-config-data\") pod \"barbican-db-sync-t7vtz\" (UID: \"354d641d-dc9c-4aa4-b821-90ce72ef6d5c\") " pod="openstack/barbican-db-sync-t7vtz" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.960695 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20ebb6ea-f36b-440a-a437-ff39f9766fca-combined-ca-bundle\") pod \"cinder-db-sync-4kmzk\" (UID: \"20ebb6ea-f36b-440a-a437-ff39f9766fca\") " pod="openstack/cinder-db-sync-4kmzk" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.960717 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg4pg\" (UniqueName: \"kubernetes.io/projected/354d641d-dc9c-4aa4-b821-90ce72ef6d5c-kube-api-access-sg4pg\") pod \"barbican-db-sync-t7vtz\" (UID: \"354d641d-dc9c-4aa4-b821-90ce72ef6d5c\") " pod="openstack/barbican-db-sync-t7vtz" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.960738 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/20ebb6ea-f36b-440a-a437-ff39f9766fca-etc-machine-id\") pod \"cinder-db-sync-4kmzk\" (UID: \"20ebb6ea-f36b-440a-a437-ff39f9766fca\") " pod="openstack/cinder-db-sync-4kmzk" Dec 09 10:21:02 crc 
kubenswrapper[5002]: I1209 10:21:02.960756 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75879941-b12b-4731-a78c-0a0a98142ec0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"75879941-b12b-4731-a78c-0a0a98142ec0\") " pod="openstack/ceilometer-0" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.960781 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75879941-b12b-4731-a78c-0a0a98142ec0-log-httpd\") pod \"ceilometer-0\" (UID: \"75879941-b12b-4731-a78c-0a0a98142ec0\") " pod="openstack/ceilometer-0" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.960806 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75879941-b12b-4731-a78c-0a0a98142ec0-config-data\") pod \"ceilometer-0\" (UID: \"75879941-b12b-4731-a78c-0a0a98142ec0\") " pod="openstack/ceilometer-0" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.960866 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/75879941-b12b-4731-a78c-0a0a98142ec0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"75879941-b12b-4731-a78c-0a0a98142ec0\") " pod="openstack/ceilometer-0" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.960883 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/20ebb6ea-f36b-440a-a437-ff39f9766fca-db-sync-config-data\") pod \"cinder-db-sync-4kmzk\" (UID: \"20ebb6ea-f36b-440a-a437-ff39f9766fca\") " pod="openstack/cinder-db-sync-4kmzk" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.960897 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20ebb6ea-f36b-440a-a437-ff39f9766fca-scripts\") pod \"cinder-db-sync-4kmzk\" (UID: \"20ebb6ea-f36b-440a-a437-ff39f9766fca\") " pod="openstack/cinder-db-sync-4kmzk" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.960915 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20ebb6ea-f36b-440a-a437-ff39f9766fca-config-data\") pod \"cinder-db-sync-4kmzk\" (UID: \"20ebb6ea-f36b-440a-a437-ff39f9766fca\") " pod="openstack/cinder-db-sync-4kmzk" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.960948 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htx5l\" (UniqueName: \"kubernetes.io/projected/d8f27813-7f40-4967-9e37-34e4ae205cb7-kube-api-access-htx5l\") pod \"neutron-db-sync-52w6p\" (UID: \"d8f27813-7f40-4967-9e37-34e4ae205cb7\") " pod="openstack/neutron-db-sync-52w6p" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.960966 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8f27813-7f40-4967-9e37-34e4ae205cb7-combined-ca-bundle\") pod \"neutron-db-sync-52w6p\" (UID: \"d8f27813-7f40-4967-9e37-34e4ae205cb7\") " pod="openstack/neutron-db-sync-52w6p" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.960984 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/354d641d-dc9c-4aa4-b821-90ce72ef6d5c-combined-ca-bundle\") pod \"barbican-db-sync-t7vtz\" (UID: \"354d641d-dc9c-4aa4-b821-90ce72ef6d5c\") " pod="openstack/barbican-db-sync-t7vtz" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.961000 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d8f27813-7f40-4967-9e37-34e4ae205cb7-config\") pod \"neutron-db-sync-52w6p\" (UID: \"d8f27813-7f40-4967-9e37-34e4ae205cb7\") " pod="openstack/neutron-db-sync-52w6p" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.961012 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/20ebb6ea-f36b-440a-a437-ff39f9766fca-etc-machine-id\") pod \"cinder-db-sync-4kmzk\" (UID: \"20ebb6ea-f36b-440a-a437-ff39f9766fca\") " pod="openstack/cinder-db-sync-4kmzk" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.961872 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75879941-b12b-4731-a78c-0a0a98142ec0-run-httpd\") pod \"ceilometer-0\" (UID: \"75879941-b12b-4731-a78c-0a0a98142ec0\") " pod="openstack/ceilometer-0" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.961018 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhwbz\" (UniqueName: \"kubernetes.io/projected/75879941-b12b-4731-a78c-0a0a98142ec0-kube-api-access-bhwbz\") pod \"ceilometer-0\" (UID: \"75879941-b12b-4731-a78c-0a0a98142ec0\") " pod="openstack/ceilometer-0" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.964834 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20ebb6ea-f36b-440a-a437-ff39f9766fca-combined-ca-bundle\") pod \"cinder-db-sync-4kmzk\" (UID: \"20ebb6ea-f36b-440a-a437-ff39f9766fca\") " pod="openstack/cinder-db-sync-4kmzk" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.965325 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75879941-b12b-4731-a78c-0a0a98142ec0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"75879941-b12b-4731-a78c-0a0a98142ec0\") " pod="openstack/ceilometer-0" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.971754 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75879941-b12b-4731-a78c-0a0a98142ec0-scripts\") pod \"ceilometer-0\" (UID: \"75879941-b12b-4731-a78c-0a0a98142ec0\") " pod="openstack/ceilometer-0" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.972036 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75879941-b12b-4731-a78c-0a0a98142ec0-log-httpd\") pod \"ceilometer-0\" (UID: \"75879941-b12b-4731-a78c-0a0a98142ec0\") " pod="openstack/ceilometer-0" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.972312 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-pqcn4" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.973128 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/75879941-b12b-4731-a78c-0a0a98142ec0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"75879941-b12b-4731-a78c-0a0a98142ec0\") " pod="openstack/ceilometer-0" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.974943 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20ebb6ea-f36b-440a-a437-ff39f9766fca-config-data\") pod \"cinder-db-sync-4kmzk\" (UID: \"20ebb6ea-f36b-440a-a437-ff39f9766fca\") " pod="openstack/cinder-db-sync-4kmzk" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.975973 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-f7nbc"] Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.977629 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-f7nbc" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.978772 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75879941-b12b-4731-a78c-0a0a98142ec0-config-data\") pod \"ceilometer-0\" (UID: \"75879941-b12b-4731-a78c-0a0a98142ec0\") " pod="openstack/ceilometer-0" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.979197 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf8zz\" (UniqueName: \"kubernetes.io/projected/20ebb6ea-f36b-440a-a437-ff39f9766fca-kube-api-access-hf8zz\") pod \"cinder-db-sync-4kmzk\" (UID: \"20ebb6ea-f36b-440a-a437-ff39f9766fca\") " pod="openstack/cinder-db-sync-4kmzk" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.988728 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhwbz\" (UniqueName: \"kubernetes.io/projected/75879941-b12b-4731-a78c-0a0a98142ec0-kube-api-access-bhwbz\") pod \"ceilometer-0\" (UID: \"75879941-b12b-4731-a78c-0a0a98142ec0\") " pod="openstack/ceilometer-0" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.990147 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-f7nbc"] Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.996103 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20ebb6ea-f36b-440a-a437-ff39f9766fca-scripts\") pod \"cinder-db-sync-4kmzk\" (UID: \"20ebb6ea-f36b-440a-a437-ff39f9766fca\") " pod="openstack/cinder-db-sync-4kmzk" Dec 09 10:21:02 crc kubenswrapper[5002]: I1209 10:21:02.996626 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/20ebb6ea-f36b-440a-a437-ff39f9766fca-db-sync-config-data\") pod \"cinder-db-sync-4kmzk\" (UID: \"20ebb6ea-f36b-440a-a437-ff39f9766fca\") " pod="openstack/cinder-db-sync-4kmzk" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.058584 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.063839 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d382de4-ddc6-4781-9815-76b74cbccadc-logs\") pod \"placement-db-sync-h7bpf\" (UID: \"3d382de4-ddc6-4781-9815-76b74cbccadc\") " pod="openstack/placement-db-sync-h7bpf" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.063905 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d382de4-ddc6-4781-9815-76b74cbccadc-combined-ca-bundle\") pod \"placement-db-sync-h7bpf\" (UID: \"3d382de4-ddc6-4781-9815-76b74cbccadc\") " pod="openstack/placement-db-sync-h7bpf" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.064048 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c970f7fd-c67b-4cda-9124-7fb1709c8d73-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-f7nbc\" (UID: \"c970f7fd-c67b-4cda-9124-7fb1709c8d73\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-f7nbc" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.064087 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htx5l\" (UniqueName: \"kubernetes.io/projected/d8f27813-7f40-4967-9e37-34e4ae205cb7-kube-api-access-htx5l\") pod \"neutron-db-sync-52w6p\" (UID: \"d8f27813-7f40-4967-9e37-34e4ae205cb7\") " pod="openstack/neutron-db-sync-52w6p" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.064111 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8f27813-7f40-4967-9e37-34e4ae205cb7-combined-ca-bundle\") pod \"neutron-db-sync-52w6p\" (UID: \"d8f27813-7f40-4967-9e37-34e4ae205cb7\") " pod="openstack/neutron-db-sync-52w6p" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.064127 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c970f7fd-c67b-4cda-9124-7fb1709c8d73-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-f7nbc\" (UID: \"c970f7fd-c67b-4cda-9124-7fb1709c8d73\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-f7nbc" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.064158 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/354d641d-dc9c-4aa4-b821-90ce72ef6d5c-combined-ca-bundle\") pod \"barbican-db-sync-t7vtz\" (UID: \"354d641d-dc9c-4aa4-b821-90ce72ef6d5c\") " pod="openstack/barbican-db-sync-t7vtz" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.064180 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d8f27813-7f40-4967-9e37-34e4ae205cb7-config\") pod \"neutron-db-sync-52w6p\" (UID: \"d8f27813-7f40-4967-9e37-34e4ae205cb7\") " pod="openstack/neutron-db-sync-52w6p" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.064201 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64x56\" (UniqueName: \"kubernetes.io/projected/3d382de4-ddc6-4781-9815-76b74cbccadc-kube-api-access-64x56\") pod \"placement-db-sync-h7bpf\" (UID: \"3d382de4-ddc6-4781-9815-76b74cbccadc\") " 
pod="openstack/placement-db-sync-h7bpf" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.064267 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d382de4-ddc6-4781-9815-76b74cbccadc-scripts\") pod \"placement-db-sync-h7bpf\" (UID: \"3d382de4-ddc6-4781-9815-76b74cbccadc\") " pod="openstack/placement-db-sync-h7bpf" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.064283 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c970f7fd-c67b-4cda-9124-7fb1709c8d73-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-f7nbc\" (UID: \"c970f7fd-c67b-4cda-9124-7fb1709c8d73\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-f7nbc" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.064319 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d382de4-ddc6-4781-9815-76b74cbccadc-config-data\") pod \"placement-db-sync-h7bpf\" (UID: \"3d382de4-ddc6-4781-9815-76b74cbccadc\") " pod="openstack/placement-db-sync-h7bpf" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.064355 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/354d641d-dc9c-4aa4-b821-90ce72ef6d5c-db-sync-config-data\") pod \"barbican-db-sync-t7vtz\" (UID: \"354d641d-dc9c-4aa4-b821-90ce72ef6d5c\") " pod="openstack/barbican-db-sync-t7vtz" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.064371 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c970f7fd-c67b-4cda-9124-7fb1709c8d73-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-f7nbc\" (UID: \"c970f7fd-c67b-4cda-9124-7fb1709c8d73\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-f7nbc" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.064402 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sg4pg\" (UniqueName: \"kubernetes.io/projected/354d641d-dc9c-4aa4-b821-90ce72ef6d5c-kube-api-access-sg4pg\") pod \"barbican-db-sync-t7vtz\" (UID: \"354d641d-dc9c-4aa4-b821-90ce72ef6d5c\") " pod="openstack/barbican-db-sync-t7vtz" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.064429 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2rmc\" (UniqueName: \"kubernetes.io/projected/c970f7fd-c67b-4cda-9124-7fb1709c8d73-kube-api-access-p2rmc\") pod \"dnsmasq-dns-58dd9ff6bc-f7nbc\" (UID: \"c970f7fd-c67b-4cda-9124-7fb1709c8d73\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-f7nbc" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.064460 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c970f7fd-c67b-4cda-9124-7fb1709c8d73-config\") pod \"dnsmasq-dns-58dd9ff6bc-f7nbc\" (UID: \"c970f7fd-c67b-4cda-9124-7fb1709c8d73\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-f7nbc" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.069571 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/354d641d-dc9c-4aa4-b821-90ce72ef6d5c-combined-ca-bundle\") pod \"barbican-db-sync-t7vtz\" (UID: 
\"354d641d-dc9c-4aa4-b821-90ce72ef6d5c\") " pod="openstack/barbican-db-sync-t7vtz" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.072382 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/354d641d-dc9c-4aa4-b821-90ce72ef6d5c-db-sync-config-data\") pod \"barbican-db-sync-t7vtz\" (UID: \"354d641d-dc9c-4aa4-b821-90ce72ef6d5c\") " pod="openstack/barbican-db-sync-t7vtz" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.083192 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-4kmzk" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.086633 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htx5l\" (UniqueName: \"kubernetes.io/projected/d8f27813-7f40-4967-9e37-34e4ae205cb7-kube-api-access-htx5l\") pod \"neutron-db-sync-52w6p\" (UID: \"d8f27813-7f40-4967-9e37-34e4ae205cb7\") " pod="openstack/neutron-db-sync-52w6p" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.087984 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg4pg\" (UniqueName: \"kubernetes.io/projected/354d641d-dc9c-4aa4-b821-90ce72ef6d5c-kube-api-access-sg4pg\") pod \"barbican-db-sync-t7vtz\" (UID: \"354d641d-dc9c-4aa4-b821-90ce72ef6d5c\") " pod="openstack/barbican-db-sync-t7vtz" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.090018 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8f27813-7f40-4967-9e37-34e4ae205cb7-combined-ca-bundle\") pod \"neutron-db-sync-52w6p\" (UID: \"d8f27813-7f40-4967-9e37-34e4ae205cb7\") " pod="openstack/neutron-db-sync-52w6p" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.095097 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d8f27813-7f40-4967-9e37-34e4ae205cb7-config\") pod \"neutron-db-sync-52w6p\" (UID: \"d8f27813-7f40-4967-9e37-34e4ae205cb7\") " pod="openstack/neutron-db-sync-52w6p" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.131580 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-52w6p" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.161044 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-t7vtz" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.165312 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d382de4-ddc6-4781-9815-76b74cbccadc-config-data\") pod \"placement-db-sync-h7bpf\" (UID: \"3d382de4-ddc6-4781-9815-76b74cbccadc\") " pod="openstack/placement-db-sync-h7bpf" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.165346 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c970f7fd-c67b-4cda-9124-7fb1709c8d73-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-f7nbc\" (UID: \"c970f7fd-c67b-4cda-9124-7fb1709c8d73\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-f7nbc" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.165375 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2rmc\" (UniqueName: \"kubernetes.io/projected/c970f7fd-c67b-4cda-9124-7fb1709c8d73-kube-api-access-p2rmc\") pod \"dnsmasq-dns-58dd9ff6bc-f7nbc\" (UID: \"c970f7fd-c67b-4cda-9124-7fb1709c8d73\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-f7nbc" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.165395 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c970f7fd-c67b-4cda-9124-7fb1709c8d73-config\") pod \"dnsmasq-dns-58dd9ff6bc-f7nbc\" (UID: \"c970f7fd-c67b-4cda-9124-7fb1709c8d73\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-f7nbc" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.165436 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d382de4-ddc6-4781-9815-76b74cbccadc-logs\") pod \"placement-db-sync-h7bpf\" (UID: \"3d382de4-ddc6-4781-9815-76b74cbccadc\") " pod="openstack/placement-db-sync-h7bpf" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.165469 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d382de4-ddc6-4781-9815-76b74cbccadc-combined-ca-bundle\") pod \"placement-db-sync-h7bpf\" (UID: \"3d382de4-ddc6-4781-9815-76b74cbccadc\") " pod="openstack/placement-db-sync-h7bpf" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.165510 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c970f7fd-c67b-4cda-9124-7fb1709c8d73-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-f7nbc\" (UID: \"c970f7fd-c67b-4cda-9124-7fb1709c8d73\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-f7nbc" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.165526 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c970f7fd-c67b-4cda-9124-7fb1709c8d73-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-f7nbc\" (UID: \"c970f7fd-c67b-4cda-9124-7fb1709c8d73\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-f7nbc" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.165545 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64x56\" (UniqueName: \"kubernetes.io/projected/3d382de4-ddc6-4781-9815-76b74cbccadc-kube-api-access-64x56\") pod \"placement-db-sync-h7bpf\" (UID: \"3d382de4-ddc6-4781-9815-76b74cbccadc\") " pod="openstack/placement-db-sync-h7bpf" Dec 09 10:21:03 crc kubenswrapper[5002]: 
I1209 10:21:03.165582 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d382de4-ddc6-4781-9815-76b74cbccadc-scripts\") pod \"placement-db-sync-h7bpf\" (UID: \"3d382de4-ddc6-4781-9815-76b74cbccadc\") " pod="openstack/placement-db-sync-h7bpf" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.165596 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c970f7fd-c67b-4cda-9124-7fb1709c8d73-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-f7nbc\" (UID: \"c970f7fd-c67b-4cda-9124-7fb1709c8d73\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-f7nbc" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.166355 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c970f7fd-c67b-4cda-9124-7fb1709c8d73-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-f7nbc\" (UID: \"c970f7fd-c67b-4cda-9124-7fb1709c8d73\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-f7nbc" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.166552 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d382de4-ddc6-4781-9815-76b74cbccadc-logs\") pod \"placement-db-sync-h7bpf\" (UID: \"3d382de4-ddc6-4781-9815-76b74cbccadc\") " pod="openstack/placement-db-sync-h7bpf" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.166574 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c970f7fd-c67b-4cda-9124-7fb1709c8d73-config\") pod \"dnsmasq-dns-58dd9ff6bc-f7nbc\" (UID: \"c970f7fd-c67b-4cda-9124-7fb1709c8d73\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-f7nbc" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.167009 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c970f7fd-c67b-4cda-9124-7fb1709c8d73-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-f7nbc\" (UID: \"c970f7fd-c67b-4cda-9124-7fb1709c8d73\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-f7nbc" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.167194 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c970f7fd-c67b-4cda-9124-7fb1709c8d73-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-f7nbc\" (UID: \"c970f7fd-c67b-4cda-9124-7fb1709c8d73\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-f7nbc" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.167843 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c970f7fd-c67b-4cda-9124-7fb1709c8d73-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-f7nbc\" (UID: \"c970f7fd-c67b-4cda-9124-7fb1709c8d73\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-f7nbc" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.174397 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d382de4-ddc6-4781-9815-76b74cbccadc-combined-ca-bundle\") pod \"placement-db-sync-h7bpf\" (UID: \"3d382de4-ddc6-4781-9815-76b74cbccadc\") " pod="openstack/placement-db-sync-h7bpf" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.174770 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d382de4-ddc6-4781-9815-76b74cbccadc-scripts\") 
pod \"placement-db-sync-h7bpf\" (UID: \"3d382de4-ddc6-4781-9815-76b74cbccadc\") " pod="openstack/placement-db-sync-h7bpf" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.175385 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d382de4-ddc6-4781-9815-76b74cbccadc-config-data\") pod \"placement-db-sync-h7bpf\" (UID: \"3d382de4-ddc6-4781-9815-76b74cbccadc\") " pod="openstack/placement-db-sync-h7bpf" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.185026 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2rmc\" (UniqueName: \"kubernetes.io/projected/c970f7fd-c67b-4cda-9124-7fb1709c8d73-kube-api-access-p2rmc\") pod \"dnsmasq-dns-58dd9ff6bc-f7nbc\" (UID: \"c970f7fd-c67b-4cda-9124-7fb1709c8d73\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-f7nbc" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.187547 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64x56\" (UniqueName: \"kubernetes.io/projected/3d382de4-ddc6-4781-9815-76b74cbccadc-kube-api-access-64x56\") pod \"placement-db-sync-h7bpf\" (UID: \"3d382de4-ddc6-4781-9815-76b74cbccadc\") " pod="openstack/placement-db-sync-h7bpf" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.260973 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-h7bpf" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.286594 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-9zdhz" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.329353 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-9zdhz" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.359228 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-f7nbc" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.476258 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-pqcn4"] Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.481534 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3ee85ff-9d22-483c-9a01-3366518ca2d3-dns-svc\") pod \"c3ee85ff-9d22-483c-9a01-3366518ca2d3\" (UID: \"c3ee85ff-9d22-483c-9a01-3366518ca2d3\") " Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.481590 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c3ee85ff-9d22-483c-9a01-3366518ca2d3-dns-swift-storage-0\") pod \"c3ee85ff-9d22-483c-9a01-3366518ca2d3\" (UID: \"c3ee85ff-9d22-483c-9a01-3366518ca2d3\") " Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.481680 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3ee85ff-9d22-483c-9a01-3366518ca2d3-config\") pod \"c3ee85ff-9d22-483c-9a01-3366518ca2d3\" (UID: \"c3ee85ff-9d22-483c-9a01-3366518ca2d3\") " Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.481835 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2ss2\" (UniqueName: \"kubernetes.io/projected/c3ee85ff-9d22-483c-9a01-3366518ca2d3-kube-api-access-p2ss2\") pod \"c3ee85ff-9d22-483c-9a01-3366518ca2d3\" (UID: \"c3ee85ff-9d22-483c-9a01-3366518ca2d3\") " Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.481869 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c3ee85ff-9d22-483c-9a01-3366518ca2d3-ovsdbserver-nb\") pod \"c3ee85ff-9d22-483c-9a01-3366518ca2d3\" (UID: \"c3ee85ff-9d22-483c-9a01-3366518ca2d3\") " Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.481998 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c3ee85ff-9d22-483c-9a01-3366518ca2d3-ovsdbserver-sb\") pod \"c3ee85ff-9d22-483c-9a01-3366518ca2d3\" (UID: \"c3ee85ff-9d22-483c-9a01-3366518ca2d3\") " Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.482759 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3ee85ff-9d22-483c-9a01-3366518ca2d3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c3ee85ff-9d22-483c-9a01-3366518ca2d3" (UID: "c3ee85ff-9d22-483c-9a01-3366518ca2d3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.483164 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3ee85ff-9d22-483c-9a01-3366518ca2d3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c3ee85ff-9d22-483c-9a01-3366518ca2d3" (UID: "c3ee85ff-9d22-483c-9a01-3366518ca2d3"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.483502 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3ee85ff-9d22-483c-9a01-3366518ca2d3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c3ee85ff-9d22-483c-9a01-3366518ca2d3" (UID: "c3ee85ff-9d22-483c-9a01-3366518ca2d3"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.483843 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3ee85ff-9d22-483c-9a01-3366518ca2d3-config" (OuterVolumeSpecName: "config") pod "c3ee85ff-9d22-483c-9a01-3366518ca2d3" (UID: "c3ee85ff-9d22-483c-9a01-3366518ca2d3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.484650 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3ee85ff-9d22-483c-9a01-3366518ca2d3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c3ee85ff-9d22-483c-9a01-3366518ca2d3" (UID: "c3ee85ff-9d22-483c-9a01-3366518ca2d3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:21:03 crc kubenswrapper[5002]: I1209 10:21:03.487473 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3ee85ff-9d22-483c-9a01-3366518ca2d3-kube-api-access-p2ss2" (OuterVolumeSpecName: "kube-api-access-p2ss2") pod "c3ee85ff-9d22-483c-9a01-3366518ca2d3" (UID: "c3ee85ff-9d22-483c-9a01-3366518ca2d3"). InnerVolumeSpecName "kube-api-access-p2ss2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:21:04 crc kubenswrapper[5002]: I1209 10:21:03.587906 5002 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c3ee85ff-9d22-483c-9a01-3366518ca2d3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 09 10:21:04 crc kubenswrapper[5002]: I1209 10:21:03.587949 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3ee85ff-9d22-483c-9a01-3366518ca2d3-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:21:04 crc kubenswrapper[5002]: I1209 10:21:03.587963 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2ss2\" (UniqueName: \"kubernetes.io/projected/c3ee85ff-9d22-483c-9a01-3366518ca2d3-kube-api-access-p2ss2\") on node \"crc\" DevicePath \"\"" Dec 09 10:21:04 crc kubenswrapper[5002]: I1209 10:21:03.587977 5002 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c3ee85ff-9d22-483c-9a01-3366518ca2d3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 10:21:04 crc kubenswrapper[5002]: I1209 10:21:03.587987 5002 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c3ee85ff-9d22-483c-9a01-3366518ca2d3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 09 10:21:04 crc kubenswrapper[5002]: I1209 10:21:03.587998 5002 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3ee85ff-9d22-483c-9a01-3366518ca2d3-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 10:21:04 crc kubenswrapper[5002]: I1209 10:21:03.670337 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 09 10:21:04 crc kubenswrapper[5002]: E1209 10:21:03.687676 5002 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Dec 09 10:21:04 crc kubenswrapper[5002]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/21e0cdee-587a-4b73-b669-8a0b8e0f79b6/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 09 10:21:04 crc kubenswrapper[5002]: > podSandboxID="f7bf85391156370fa3410346ec6c8f6cdbb54298bcf80714f2283db03b2940f1" Dec 09 10:21:04 crc kubenswrapper[5002]: E1209 10:21:03.688075 5002 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 09 10:21:04 crc kubenswrapper[5002]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n66chbbh56dh7fhfh68chf9hfdhbdh587h5b9h568h68fh77h5b5h559h577h687h574h5d5h584h8chd9hb4h66h566h545h699h564h568h66fhc9q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-swift-storage-0,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-swift-storage-0,SubPath:dns-swift-storage-0,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zjwj2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-764c5664d7-4czbs_openstack(21e0cdee-587a-4b73-b669-8a0b8e0f79b6): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/21e0cdee-587a-4b73-b669-8a0b8e0f79b6/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 09 10:21:04 crc kubenswrapper[5002]: > logger="UnhandledError" Dec 09 10:21:04 crc kubenswrapper[5002]: I1209 10:21:03.688367 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-4kmzk"] Dec 09 10:21:04 crc kubenswrapper[5002]: E1209 10:21:03.689410 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount 
`/var/lib/kubelet/pods/21e0cdee-587a-4b73-b669-8a0b8e0f79b6/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-764c5664d7-4czbs" podUID="21e0cdee-587a-4b73-b669-8a0b8e0f79b6" Dec 09 10:21:04 crc kubenswrapper[5002]: I1209 10:21:03.695269 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-52w6p"] Dec 09 10:21:04 crc kubenswrapper[5002]: W1209 10:21:03.702248 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75879941_b12b_4731_a78c_0a0a98142ec0.slice/crio-486c0ca39358d11caa0e68e7d6cd892da2db57788aa03abb8378ffde555f2603 WatchSource:0}: Error finding container 486c0ca39358d11caa0e68e7d6cd892da2db57788aa03abb8378ffde555f2603: Status 404 returned error can't find the container with id 486c0ca39358d11caa0e68e7d6cd892da2db57788aa03abb8378ffde555f2603 Dec 09 10:21:04 crc kubenswrapper[5002]: W1209 10:21:03.721216 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8f27813_7f40_4967_9e37_34e4ae205cb7.slice/crio-a5cbf5897b9db96ee4c754867f430c82c7025fed0cea4a490c80f9ba179946e2 WatchSource:0}: Error finding container a5cbf5897b9db96ee4c754867f430c82c7025fed0cea4a490c80f9ba179946e2: Status 404 returned error can't find the container with id a5cbf5897b9db96ee4c754867f430c82c7025fed0cea4a490c80f9ba179946e2 Dec 09 10:21:04 crc kubenswrapper[5002]: I1209 10:21:03.835270 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-t7vtz"] Dec 09 10:21:04 crc kubenswrapper[5002]: I1209 10:21:03.953348 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-h7bpf"] Dec 09 10:21:04 crc kubenswrapper[5002]: W1209 10:21:03.959729 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d382de4_ddc6_4781_9815_76b74cbccadc.slice/crio-232e87488380ff257ae17fd57d7b581238edbae3601538af844b23a4915dc98d WatchSource:0}: Error finding container 232e87488380ff257ae17fd57d7b581238edbae3601538af844b23a4915dc98d: Status 404 returned error can't find the container with id 232e87488380ff257ae17fd57d7b581238edbae3601538af844b23a4915dc98d Dec 09 10:21:04 crc kubenswrapper[5002]: I1209 10:21:04.307734 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"75879941-b12b-4731-a78c-0a0a98142ec0","Type":"ContainerStarted","Data":"486c0ca39358d11caa0e68e7d6cd892da2db57788aa03abb8378ffde555f2603"} Dec 09 10:21:04 crc kubenswrapper[5002]: I1209 10:21:04.320512 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-4kmzk" event={"ID":"20ebb6ea-f36b-440a-a437-ff39f9766fca","Type":"ContainerStarted","Data":"589aa57ae28bb3cbe2d4e332bed90a95cc6c91e2463510caa923a4364e2e4192"} Dec 09 10:21:04 crc kubenswrapper[5002]: I1209 10:21:04.323842 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pqcn4" event={"ID":"a626587d-9f0c-47cb-88e6-c28ab4cd1c51","Type":"ContainerStarted","Data":"b04ded4e2cd792d948ee1a8ccbcfbfc4efad912a0818fcb71735aedb71fee7dc"} Dec 09 10:21:04 crc kubenswrapper[5002]: I1209 10:21:04.323872 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pqcn4" 
event={"ID":"a626587d-9f0c-47cb-88e6-c28ab4cd1c51","Type":"ContainerStarted","Data":"5b8827d3d63aea2f932d9bdff7a65e5b0ccd7bea1102bbe2875b5b0964df27bc"} Dec 09 10:21:04 crc kubenswrapper[5002]: I1209 10:21:04.325743 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-h7bpf" event={"ID":"3d382de4-ddc6-4781-9815-76b74cbccadc","Type":"ContainerStarted","Data":"232e87488380ff257ae17fd57d7b581238edbae3601538af844b23a4915dc98d"} Dec 09 10:21:04 crc kubenswrapper[5002]: I1209 10:21:04.328176 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-t7vtz" event={"ID":"354d641d-dc9c-4aa4-b821-90ce72ef6d5c","Type":"ContainerStarted","Data":"e6e9aa3b544c40c5e27fc72016be1fec40be275e0ddd1554eef4aea11e103b02"} Dec 09 10:21:04 crc kubenswrapper[5002]: I1209 10:21:04.331590 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-52w6p" event={"ID":"d8f27813-7f40-4967-9e37-34e4ae205cb7","Type":"ContainerStarted","Data":"51397172bc6109b173eaa3e4fefc729d3fc17a5efc791a6027086db757378d60"} Dec 09 10:21:04 crc kubenswrapper[5002]: I1209 10:21:04.331622 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-52w6p" event={"ID":"d8f27813-7f40-4967-9e37-34e4ae205cb7","Type":"ContainerStarted","Data":"a5cbf5897b9db96ee4c754867f430c82c7025fed0cea4a490c80f9ba179946e2"} Dec 09 10:21:04 crc kubenswrapper[5002]: I1209 10:21:04.333959 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-9zdhz" Dec 09 10:21:04 crc kubenswrapper[5002]: I1209 10:21:04.365403 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-pqcn4" podStartSLOduration=2.365351978 podStartE2EDuration="2.365351978s" podCreationTimestamp="2025-12-09 10:21:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:21:04.363711573 +0000 UTC m=+1196.755762684" watchObservedRunningTime="2025-12-09 10:21:04.365351978 +0000 UTC m=+1196.757403059" Dec 09 10:21:04 crc kubenswrapper[5002]: I1209 10:21:04.440471 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-9zdhz"] Dec 09 10:21:04 crc kubenswrapper[5002]: I1209 10:21:04.455926 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-9zdhz"] Dec 09 10:21:04 crc kubenswrapper[5002]: I1209 10:21:04.465062 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-52w6p" podStartSLOduration=2.465042836 podStartE2EDuration="2.465042836s" podCreationTimestamp="2025-12-09 10:21:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:21:04.462056365 +0000 UTC m=+1196.854107456" watchObservedRunningTime="2025-12-09 10:21:04.465042836 +0000 UTC m=+1196.857093927" Dec 09 10:21:04 crc kubenswrapper[5002]: I1209 10:21:04.557291 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-f7nbc"] Dec 09 10:21:04 crc kubenswrapper[5002]: I1209 10:21:04.820157 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-4czbs" Dec 09 10:21:04 crc kubenswrapper[5002]: I1209 10:21:04.928724 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21e0cdee-587a-4b73-b669-8a0b8e0f79b6-dns-svc\") pod \"21e0cdee-587a-4b73-b669-8a0b8e0f79b6\" (UID: \"21e0cdee-587a-4b73-b669-8a0b8e0f79b6\") " Dec 09 10:21:04 crc kubenswrapper[5002]: I1209 10:21:04.928768 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjwj2\" (UniqueName: \"kubernetes.io/projected/21e0cdee-587a-4b73-b669-8a0b8e0f79b6-kube-api-access-zjwj2\") pod \"21e0cdee-587a-4b73-b669-8a0b8e0f79b6\" (UID: \"21e0cdee-587a-4b73-b669-8a0b8e0f79b6\") " Dec 09 10:21:04 crc kubenswrapper[5002]: I1209 10:21:04.928848 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21e0cdee-587a-4b73-b669-8a0b8e0f79b6-dns-swift-storage-0\") pod \"21e0cdee-587a-4b73-b669-8a0b8e0f79b6\" (UID: \"21e0cdee-587a-4b73-b669-8a0b8e0f79b6\") " Dec 09 10:21:04 crc kubenswrapper[5002]: I1209 10:21:04.928927 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21e0cdee-587a-4b73-b669-8a0b8e0f79b6-ovsdbserver-nb\") pod \"21e0cdee-587a-4b73-b669-8a0b8e0f79b6\" (UID: \"21e0cdee-587a-4b73-b669-8a0b8e0f79b6\") " Dec 09 10:21:04 crc kubenswrapper[5002]: I1209 10:21:04.928961 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21e0cdee-587a-4b73-b669-8a0b8e0f79b6-config\") pod \"21e0cdee-587a-4b73-b669-8a0b8e0f79b6\" (UID: \"21e0cdee-587a-4b73-b669-8a0b8e0f79b6\") " Dec 09 10:21:04 crc kubenswrapper[5002]: I1209 10:21:04.929040 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21e0cdee-587a-4b73-b669-8a0b8e0f79b6-ovsdbserver-sb\") pod \"21e0cdee-587a-4b73-b669-8a0b8e0f79b6\" (UID: \"21e0cdee-587a-4b73-b669-8a0b8e0f79b6\") " Dec 09 10:21:04 crc kubenswrapper[5002]: I1209 10:21:04.937137 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21e0cdee-587a-4b73-b669-8a0b8e0f79b6-kube-api-access-zjwj2" (OuterVolumeSpecName: "kube-api-access-zjwj2") pod "21e0cdee-587a-4b73-b669-8a0b8e0f79b6" (UID: "21e0cdee-587a-4b73-b669-8a0b8e0f79b6"). InnerVolumeSpecName "kube-api-access-zjwj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:21:04 crc kubenswrapper[5002]: I1209 10:21:04.969360 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21e0cdee-587a-4b73-b669-8a0b8e0f79b6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "21e0cdee-587a-4b73-b669-8a0b8e0f79b6" (UID: "21e0cdee-587a-4b73-b669-8a0b8e0f79b6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:21:05 crc kubenswrapper[5002]: I1209 10:21:05.022439 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21e0cdee-587a-4b73-b669-8a0b8e0f79b6-config" (OuterVolumeSpecName: "config") pod "21e0cdee-587a-4b73-b669-8a0b8e0f79b6" (UID: "21e0cdee-587a-4b73-b669-8a0b8e0f79b6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:21:05 crc kubenswrapper[5002]: I1209 10:21:05.032926 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21e0cdee-587a-4b73-b669-8a0b8e0f79b6-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:21:05 crc kubenswrapper[5002]: I1209 10:21:05.032947 5002 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21e0cdee-587a-4b73-b669-8a0b8e0f79b6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 09 10:21:05 crc kubenswrapper[5002]: I1209 10:21:05.032956 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjwj2\" (UniqueName: \"kubernetes.io/projected/21e0cdee-587a-4b73-b669-8a0b8e0f79b6-kube-api-access-zjwj2\") on node \"crc\" DevicePath \"\"" Dec 09 10:21:05 crc kubenswrapper[5002]: I1209 10:21:05.040224 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21e0cdee-587a-4b73-b669-8a0b8e0f79b6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "21e0cdee-587a-4b73-b669-8a0b8e0f79b6" (UID: "21e0cdee-587a-4b73-b669-8a0b8e0f79b6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:21:05 crc kubenswrapper[5002]: I1209 10:21:05.057777 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21e0cdee-587a-4b73-b669-8a0b8e0f79b6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "21e0cdee-587a-4b73-b669-8a0b8e0f79b6" (UID: "21e0cdee-587a-4b73-b669-8a0b8e0f79b6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:21:05 crc kubenswrapper[5002]: I1209 10:21:05.067492 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21e0cdee-587a-4b73-b669-8a0b8e0f79b6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "21e0cdee-587a-4b73-b669-8a0b8e0f79b6" (UID: "21e0cdee-587a-4b73-b669-8a0b8e0f79b6"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:21:05 crc kubenswrapper[5002]: I1209 10:21:05.134727 5002 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21e0cdee-587a-4b73-b669-8a0b8e0f79b6-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 10:21:05 crc kubenswrapper[5002]: I1209 10:21:05.134772 5002 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21e0cdee-587a-4b73-b669-8a0b8e0f79b6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 09 10:21:05 crc kubenswrapper[5002]: I1209 10:21:05.134781 5002 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21e0cdee-587a-4b73-b669-8a0b8e0f79b6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 10:21:05 crc kubenswrapper[5002]: I1209 10:21:05.269514 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 09 10:21:05 crc kubenswrapper[5002]: I1209 10:21:05.355479 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-4czbs" event={"ID":"21e0cdee-587a-4b73-b669-8a0b8e0f79b6","Type":"ContainerDied","Data":"f7bf85391156370fa3410346ec6c8f6cdbb54298bcf80714f2283db03b2940f1"} Dec 09 10:21:05 crc kubenswrapper[5002]: I1209 10:21:05.355508 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-4czbs" Dec 09 10:21:05 crc kubenswrapper[5002]: I1209 10:21:05.355532 5002 scope.go:117] "RemoveContainer" containerID="489c643394704f1bfc4cfa53dcbf2b3de6ef27224974f6af6b19889b7a2aab6d" Dec 09 10:21:05 crc kubenswrapper[5002]: I1209 10:21:05.364027 5002 generic.go:334] "Generic (PLEG): container finished" podID="c970f7fd-c67b-4cda-9124-7fb1709c8d73" containerID="dadfd246b3ac2b0292fc0636f26839560ad077ac4fccca19414d5ff12e33abec" exitCode=0 Dec 09 10:21:05 crc kubenswrapper[5002]: I1209 10:21:05.365191 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-f7nbc" event={"ID":"c970f7fd-c67b-4cda-9124-7fb1709c8d73","Type":"ContainerDied","Data":"dadfd246b3ac2b0292fc0636f26839560ad077ac4fccca19414d5ff12e33abec"} Dec 09 10:21:05 crc kubenswrapper[5002]: I1209 10:21:05.365215 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-f7nbc" event={"ID":"c970f7fd-c67b-4cda-9124-7fb1709c8d73","Type":"ContainerStarted","Data":"1c889b2782a0a8e98e182f9263a87ed7328a8e3f02a7deaa0fd74e86f56ee667"} Dec 09 10:21:05 crc kubenswrapper[5002]: I1209 10:21:05.463723 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-4czbs"] Dec 09 10:21:05 crc kubenswrapper[5002]: I1209 10:21:05.475771 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-4czbs"] Dec 09 10:21:06 crc kubenswrapper[5002]: I1209 10:21:06.094731 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21e0cdee-587a-4b73-b669-8a0b8e0f79b6" path="/var/lib/kubelet/pods/21e0cdee-587a-4b73-b669-8a0b8e0f79b6/volumes" Dec 09 10:21:06 crc kubenswrapper[5002]: I1209 10:21:06.095923 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3ee85ff-9d22-483c-9a01-3366518ca2d3" path="/var/lib/kubelet/pods/c3ee85ff-9d22-483c-9a01-3366518ca2d3/volumes" Dec 09 10:21:06 crc kubenswrapper[5002]: I1209 10:21:06.386562 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-58dd9ff6bc-f7nbc" event={"ID":"c970f7fd-c67b-4cda-9124-7fb1709c8d73","Type":"ContainerStarted","Data":"3b3e416e7518a7d77f215c8aaaec78e67b328f6a7988e8127692752dfbd2bd6c"} Dec 09 10:21:06 crc kubenswrapper[5002]: I1209 10:21:06.386671 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58dd9ff6bc-f7nbc" Dec 09 10:21:06 crc kubenswrapper[5002]: I1209 10:21:06.416585 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58dd9ff6bc-f7nbc" podStartSLOduration=4.416559749 podStartE2EDuration="4.416559749s" podCreationTimestamp="2025-12-09 10:21:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:21:06.412621064 +0000 UTC m=+1198.804672165" watchObservedRunningTime="2025-12-09 10:21:06.416559749 +0000 UTC m=+1198.808610830" Dec 09 10:21:07 crc kubenswrapper[5002]: I1209 10:21:07.964758 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 10:21:07 crc kubenswrapper[5002]: I1209 10:21:07.965137 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 10:21:08 crc kubenswrapper[5002]: I1209 10:21:08.408479 5002 generic.go:334] "Generic (PLEG): container finished" podID="a626587d-9f0c-47cb-88e6-c28ab4cd1c51" containerID="b04ded4e2cd792d948ee1a8ccbcfbfc4efad912a0818fcb71735aedb71fee7dc" exitCode=0 Dec 09 10:21:08 crc kubenswrapper[5002]: I1209 10:21:08.408533 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pqcn4" event={"ID":"a626587d-9f0c-47cb-88e6-c28ab4cd1c51","Type":"ContainerDied","Data":"b04ded4e2cd792d948ee1a8ccbcfbfc4efad912a0818fcb71735aedb71fee7dc"} Dec 09 10:21:13 crc kubenswrapper[5002]: I1209 10:21:13.362240 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58dd9ff6bc-f7nbc" Dec 09 10:21:13 crc kubenswrapper[5002]: I1209 10:21:13.470639 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-bbdqx"] Dec 09 10:21:13 crc kubenswrapper[5002]: I1209 10:21:13.470930 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-bbdqx" podUID="f36484d9-764c-460b-b48e-f0c36e1145e8" containerName="dnsmasq-dns" containerID="cri-o://0f54f46d5b1dd9e6fe86f181086d298dc0411f77090795c15f7773a0005ebd85" gracePeriod=10 Dec 09 10:21:17 crc kubenswrapper[5002]: I1209 10:21:17.492517 5002 generic.go:334] "Generic (PLEG): container finished" podID="f36484d9-764c-460b-b48e-f0c36e1145e8" containerID="0f54f46d5b1dd9e6fe86f181086d298dc0411f77090795c15f7773a0005ebd85" exitCode=0 Dec 09 10:21:17 crc kubenswrapper[5002]: I1209 10:21:17.492595 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-bbdqx" event={"ID":"f36484d9-764c-460b-b48e-f0c36e1145e8","Type":"ContainerDied","Data":"0f54f46d5b1dd9e6fe86f181086d298dc0411f77090795c15f7773a0005ebd85"} Dec 09 10:21:18 crc 
kubenswrapper[5002]: I1209 10:21:18.470757 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-bbdqx" podUID="f36484d9-764c-460b-b48e-f0c36e1145e8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: connect: connection refused" Dec 09 10:21:18 crc kubenswrapper[5002]: E1209 10:21:18.839109 5002 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Dec 09 10:21:18 crc kubenswrapper[5002]: E1209 10:21:18.839287 5002 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n55fh5cch595h568hf6h689h68fh578h4h5c9h5bbh78hb5h675hc8h97h567h676hdch58ch5ddh657h649h576h677h5dch5b8h565h6dh8bhf5h5f9q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bhwbz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(75879941-b12b-4731-a78c-0a0a98142ec0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 09 10:21:21 crc kubenswrapper[5002]: E1209 10:21:21.395755 5002 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Dec 09 10:21:21 crc 
kubenswrapper[5002]: E1209 10:21:21.396522 5002 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-64x56,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-h7bpf_openstack(3d382de4-ddc6-4781-9815-76b74cbccadc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 09 10:21:21 crc kubenswrapper[5002]: E1209 10:21:21.397646 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-h7bpf" podUID="3d382de4-ddc6-4781-9815-76b74cbccadc" Dec 09 10:21:21 crc kubenswrapper[5002]: E1209 10:21:21.526431 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-h7bpf" podUID="3d382de4-ddc6-4781-9815-76b74cbccadc" Dec 09 10:21:22 crc kubenswrapper[5002]: E1209 10:21:22.157963 5002 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Dec 09 10:21:22 crc kubenswrapper[5002]: E1209 10:21:22.158508 5002 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sg4pg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-t7vtz_openstack(354d641d-dc9c-4aa4-b821-90ce72ef6d5c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 09 10:21:22 crc kubenswrapper[5002]: E1209 10:21:22.162116 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-t7vtz" podUID="354d641d-dc9c-4aa4-b821-90ce72ef6d5c" Dec 09 10:21:22 crc kubenswrapper[5002]: I1209 10:21:22.247184 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-pqcn4" Dec 09 10:21:22 crc kubenswrapper[5002]: I1209 10:21:22.325558 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a626587d-9f0c-47cb-88e6-c28ab4cd1c51-credential-keys\") pod \"a626587d-9f0c-47cb-88e6-c28ab4cd1c51\" (UID: \"a626587d-9f0c-47cb-88e6-c28ab4cd1c51\") " Dec 09 10:21:22 crc kubenswrapper[5002]: I1209 10:21:22.325615 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5mtq\" (UniqueName: \"kubernetes.io/projected/a626587d-9f0c-47cb-88e6-c28ab4cd1c51-kube-api-access-s5mtq\") pod \"a626587d-9f0c-47cb-88e6-c28ab4cd1c51\" (UID: \"a626587d-9f0c-47cb-88e6-c28ab4cd1c51\") " Dec 09 10:21:22 crc kubenswrapper[5002]: I1209 10:21:22.325642 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a626587d-9f0c-47cb-88e6-c28ab4cd1c51-scripts\") pod \"a626587d-9f0c-47cb-88e6-c28ab4cd1c51\" (UID: \"a626587d-9f0c-47cb-88e6-c28ab4cd1c51\") " Dec 09 10:21:22 crc kubenswrapper[5002]: I1209 10:21:22.325699 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a626587d-9f0c-47cb-88e6-c28ab4cd1c51-fernet-keys\") pod \"a626587d-9f0c-47cb-88e6-c28ab4cd1c51\" (UID: \"a626587d-9f0c-47cb-88e6-c28ab4cd1c51\") " Dec 09 10:21:22 crc kubenswrapper[5002]: I1209 10:21:22.325799 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a626587d-9f0c-47cb-88e6-c28ab4cd1c51-combined-ca-bundle\") pod \"a626587d-9f0c-47cb-88e6-c28ab4cd1c51\" (UID: \"a626587d-9f0c-47cb-88e6-c28ab4cd1c51\") " Dec 09 10:21:22 crc kubenswrapper[5002]: I1209 10:21:22.325851 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a626587d-9f0c-47cb-88e6-c28ab4cd1c51-config-data\") pod \"a626587d-9f0c-47cb-88e6-c28ab4cd1c51\" (UID: \"a626587d-9f0c-47cb-88e6-c28ab4cd1c51\") " Dec 09 10:21:22 crc kubenswrapper[5002]: I1209 10:21:22.331961 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a626587d-9f0c-47cb-88e6-c28ab4cd1c51-kube-api-access-s5mtq" (OuterVolumeSpecName: "kube-api-access-s5mtq") pod "a626587d-9f0c-47cb-88e6-c28ab4cd1c51" (UID: "a626587d-9f0c-47cb-88e6-c28ab4cd1c51"). InnerVolumeSpecName "kube-api-access-s5mtq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:21:22 crc kubenswrapper[5002]: I1209 10:21:22.332462 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a626587d-9f0c-47cb-88e6-c28ab4cd1c51-scripts" (OuterVolumeSpecName: "scripts") pod "a626587d-9f0c-47cb-88e6-c28ab4cd1c51" (UID: "a626587d-9f0c-47cb-88e6-c28ab4cd1c51"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:21:22 crc kubenswrapper[5002]: I1209 10:21:22.332977 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a626587d-9f0c-47cb-88e6-c28ab4cd1c51-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a626587d-9f0c-47cb-88e6-c28ab4cd1c51" (UID: "a626587d-9f0c-47cb-88e6-c28ab4cd1c51"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:21:22 crc kubenswrapper[5002]: I1209 10:21:22.337904 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a626587d-9f0c-47cb-88e6-c28ab4cd1c51-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "a626587d-9f0c-47cb-88e6-c28ab4cd1c51" (UID: "a626587d-9f0c-47cb-88e6-c28ab4cd1c51"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:21:22 crc kubenswrapper[5002]: I1209 10:21:22.350964 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a626587d-9f0c-47cb-88e6-c28ab4cd1c51-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a626587d-9f0c-47cb-88e6-c28ab4cd1c51" (UID: "a626587d-9f0c-47cb-88e6-c28ab4cd1c51"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:21:22 crc kubenswrapper[5002]: I1209 10:21:22.356107 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a626587d-9f0c-47cb-88e6-c28ab4cd1c51-config-data" (OuterVolumeSpecName: "config-data") pod "a626587d-9f0c-47cb-88e6-c28ab4cd1c51" (UID: "a626587d-9f0c-47cb-88e6-c28ab4cd1c51"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:21:22 crc kubenswrapper[5002]: I1209 10:21:22.427825 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a626587d-9f0c-47cb-88e6-c28ab4cd1c51-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:21:22 crc kubenswrapper[5002]: I1209 10:21:22.427897 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a626587d-9f0c-47cb-88e6-c28ab4cd1c51-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:21:22 crc kubenswrapper[5002]: I1209 10:21:22.427911 5002 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a626587d-9f0c-47cb-88e6-c28ab4cd1c51-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 09 10:21:22 crc kubenswrapper[5002]: I1209 10:21:22.427922 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5mtq\" (UniqueName: \"kubernetes.io/projected/a626587d-9f0c-47cb-88e6-c28ab4cd1c51-kube-api-access-s5mtq\") on node \"crc\" DevicePath \"\"" Dec 09 10:21:22 crc kubenswrapper[5002]: I1209 10:21:22.427936 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a626587d-9f0c-47cb-88e6-c28ab4cd1c51-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:21:22 crc kubenswrapper[5002]: I1209 10:21:22.427946 5002 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a626587d-9f0c-47cb-88e6-c28ab4cd1c51-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 09 10:21:22 crc kubenswrapper[5002]: I1209 10:21:22.533977 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pqcn4" event={"ID":"a626587d-9f0c-47cb-88e6-c28ab4cd1c51","Type":"ContainerDied","Data":"5b8827d3d63aea2f932d9bdff7a65e5b0ccd7bea1102bbe2875b5b0964df27bc"} Dec 09 10:21:22 crc kubenswrapper[5002]: I1209 10:21:22.534005 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-pqcn4" Dec 09 10:21:22 crc kubenswrapper[5002]: I1209 10:21:22.534028 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b8827d3d63aea2f932d9bdff7a65e5b0ccd7bea1102bbe2875b5b0964df27bc" Dec 09 10:21:22 crc kubenswrapper[5002]: E1209 10:21:22.535970 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-t7vtz" podUID="354d641d-dc9c-4aa4-b821-90ce72ef6d5c" Dec 09 10:21:23 crc kubenswrapper[5002]: I1209 10:21:23.340625 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-pqcn4"] Dec 09 10:21:23 crc kubenswrapper[5002]: I1209 10:21:23.350937 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-pqcn4"] Dec 09 10:21:23 crc kubenswrapper[5002]: I1209 10:21:23.440042 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-jrbkq"] Dec 09 10:21:23 crc kubenswrapper[5002]: E1209 10:21:23.440620 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a626587d-9f0c-47cb-88e6-c28ab4cd1c51" containerName="keystone-bootstrap" Dec 09 10:21:23 crc kubenswrapper[5002]: I1209 10:21:23.440907 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="a626587d-9f0c-47cb-88e6-c28ab4cd1c51" containerName="keystone-bootstrap" Dec 09 10:21:23 crc kubenswrapper[5002]: E1209 10:21:23.440940 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21e0cdee-587a-4b73-b669-8a0b8e0f79b6" containerName="init" Dec 09 10:21:23 crc kubenswrapper[5002]: I1209 10:21:23.440946 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="21e0cdee-587a-4b73-b669-8a0b8e0f79b6" containerName="init" Dec 09 10:21:23 crc kubenswrapper[5002]: I1209 10:21:23.441098 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="21e0cdee-587a-4b73-b669-8a0b8e0f79b6" containerName="init" Dec 09 10:21:23 crc kubenswrapper[5002]: I1209 10:21:23.441124 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="a626587d-9f0c-47cb-88e6-c28ab4cd1c51" containerName="keystone-bootstrap" Dec 09 10:21:23 crc kubenswrapper[5002]: I1209 10:21:23.441667 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-jrbkq" Dec 09 10:21:23 crc kubenswrapper[5002]: I1209 10:21:23.444992 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 09 10:21:23 crc kubenswrapper[5002]: I1209 10:21:23.445117 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 09 10:21:23 crc kubenswrapper[5002]: I1209 10:21:23.445141 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 09 10:21:23 crc kubenswrapper[5002]: I1209 10:21:23.445141 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-q4kd8" Dec 09 10:21:23 crc kubenswrapper[5002]: I1209 10:21:23.445999 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 09 10:21:23 crc kubenswrapper[5002]: I1209 10:21:23.448972 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-jrbkq"] Dec 09 10:21:23 crc kubenswrapper[5002]: I1209 10:21:23.546808 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af9df436-38ae-4001-bce6-e97e5e8d9cd2-config-data\") pod \"keystone-bootstrap-jrbkq\" (UID: \"af9df436-38ae-4001-bce6-e97e5e8d9cd2\") " pod="openstack/keystone-bootstrap-jrbkq" Dec 09 10:21:23 crc kubenswrapper[5002]: I1209 10:21:23.546907 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzr7r\" (UniqueName: \"kubernetes.io/projected/af9df436-38ae-4001-bce6-e97e5e8d9cd2-kube-api-access-zzr7r\") pod \"keystone-bootstrap-jrbkq\" (UID: \"af9df436-38ae-4001-bce6-e97e5e8d9cd2\") " pod="openstack/keystone-bootstrap-jrbkq" Dec 09 10:21:23 crc kubenswrapper[5002]: I1209 10:21:23.546945 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af9df436-38ae-4001-bce6-e97e5e8d9cd2-combined-ca-bundle\") pod \"keystone-bootstrap-jrbkq\" (UID: \"af9df436-38ae-4001-bce6-e97e5e8d9cd2\") " pod="openstack/keystone-bootstrap-jrbkq" Dec 09 10:21:23 crc kubenswrapper[5002]: I1209 10:21:23.546976 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/af9df436-38ae-4001-bce6-e97e5e8d9cd2-fernet-keys\") pod \"keystone-bootstrap-jrbkq\" (UID: \"af9df436-38ae-4001-bce6-e97e5e8d9cd2\") " pod="openstack/keystone-bootstrap-jrbkq" Dec 09 10:21:23 crc kubenswrapper[5002]: I1209 10:21:23.547117 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/af9df436-38ae-4001-bce6-e97e5e8d9cd2-credential-keys\") pod \"keystone-bootstrap-jrbkq\" (UID: \"af9df436-38ae-4001-bce6-e97e5e8d9cd2\") " pod="openstack/keystone-bootstrap-jrbkq" Dec 09 10:21:23 crc kubenswrapper[5002]: I1209 10:21:23.547221 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af9df436-38ae-4001-bce6-e97e5e8d9cd2-scripts\") pod \"keystone-bootstrap-jrbkq\" (UID: \"af9df436-38ae-4001-bce6-e97e5e8d9cd2\") " pod="openstack/keystone-bootstrap-jrbkq" Dec 09 10:21:23 crc kubenswrapper[5002]: I1209 10:21:23.648371 5002 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af9df436-38ae-4001-bce6-e97e5e8d9cd2-config-data\") pod \"keystone-bootstrap-jrbkq\" (UID: \"af9df436-38ae-4001-bce6-e97e5e8d9cd2\") " pod="openstack/keystone-bootstrap-jrbkq" Dec 09 10:21:23 crc kubenswrapper[5002]: I1209 10:21:23.648464 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzr7r\" (UniqueName: \"kubernetes.io/projected/af9df436-38ae-4001-bce6-e97e5e8d9cd2-kube-api-access-zzr7r\") pod \"keystone-bootstrap-jrbkq\" (UID: \"af9df436-38ae-4001-bce6-e97e5e8d9cd2\") " pod="openstack/keystone-bootstrap-jrbkq" Dec 09 10:21:23 crc kubenswrapper[5002]: I1209 10:21:23.648502 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af9df436-38ae-4001-bce6-e97e5e8d9cd2-combined-ca-bundle\") pod \"keystone-bootstrap-jrbkq\" (UID: \"af9df436-38ae-4001-bce6-e97e5e8d9cd2\") " pod="openstack/keystone-bootstrap-jrbkq" Dec 09 10:21:23 crc kubenswrapper[5002]: I1209 10:21:23.648534 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/af9df436-38ae-4001-bce6-e97e5e8d9cd2-fernet-keys\") pod \"keystone-bootstrap-jrbkq\" (UID: \"af9df436-38ae-4001-bce6-e97e5e8d9cd2\") " pod="openstack/keystone-bootstrap-jrbkq" Dec 09 10:21:23 crc kubenswrapper[5002]: I1209 10:21:23.648571 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/af9df436-38ae-4001-bce6-e97e5e8d9cd2-credential-keys\") pod \"keystone-bootstrap-jrbkq\" (UID: \"af9df436-38ae-4001-bce6-e97e5e8d9cd2\") " pod="openstack/keystone-bootstrap-jrbkq" Dec 09 10:21:23 crc kubenswrapper[5002]: I1209 10:21:23.648608 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af9df436-38ae-4001-bce6-e97e5e8d9cd2-scripts\") pod \"keystone-bootstrap-jrbkq\" (UID: \"af9df436-38ae-4001-bce6-e97e5e8d9cd2\") " pod="openstack/keystone-bootstrap-jrbkq" Dec 09 10:21:23 crc kubenswrapper[5002]: I1209 10:21:23.653751 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af9df436-38ae-4001-bce6-e97e5e8d9cd2-scripts\") pod \"keystone-bootstrap-jrbkq\" (UID: \"af9df436-38ae-4001-bce6-e97e5e8d9cd2\") " pod="openstack/keystone-bootstrap-jrbkq" Dec 09 10:21:23 crc kubenswrapper[5002]: I1209 10:21:23.655434 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af9df436-38ae-4001-bce6-e97e5e8d9cd2-combined-ca-bundle\") pod \"keystone-bootstrap-jrbkq\" (UID: \"af9df436-38ae-4001-bce6-e97e5e8d9cd2\") " pod="openstack/keystone-bootstrap-jrbkq" Dec 09 10:21:23 crc kubenswrapper[5002]: I1209 10:21:23.655631 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af9df436-38ae-4001-bce6-e97e5e8d9cd2-config-data\") pod \"keystone-bootstrap-jrbkq\" (UID: \"af9df436-38ae-4001-bce6-e97e5e8d9cd2\") " pod="openstack/keystone-bootstrap-jrbkq" Dec 09 10:21:23 crc kubenswrapper[5002]: I1209 10:21:23.656757 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/af9df436-38ae-4001-bce6-e97e5e8d9cd2-credential-keys\") pod \"keystone-bootstrap-jrbkq\" (UID: 
\"af9df436-38ae-4001-bce6-e97e5e8d9cd2\") " pod="openstack/keystone-bootstrap-jrbkq" Dec 09 10:21:23 crc kubenswrapper[5002]: I1209 10:21:23.666252 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/af9df436-38ae-4001-bce6-e97e5e8d9cd2-fernet-keys\") pod \"keystone-bootstrap-jrbkq\" (UID: \"af9df436-38ae-4001-bce6-e97e5e8d9cd2\") " pod="openstack/keystone-bootstrap-jrbkq" Dec 09 10:21:23 crc kubenswrapper[5002]: I1209 10:21:23.671518 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzr7r\" (UniqueName: \"kubernetes.io/projected/af9df436-38ae-4001-bce6-e97e5e8d9cd2-kube-api-access-zzr7r\") pod \"keystone-bootstrap-jrbkq\" (UID: \"af9df436-38ae-4001-bce6-e97e5e8d9cd2\") " pod="openstack/keystone-bootstrap-jrbkq" Dec 09 10:21:23 crc kubenswrapper[5002]: I1209 10:21:23.762572 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jrbkq" Dec 09 10:21:24 crc kubenswrapper[5002]: I1209 10:21:24.097609 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a626587d-9f0c-47cb-88e6-c28ab4cd1c51" path="/var/lib/kubelet/pods/a626587d-9f0c-47cb-88e6-c28ab4cd1c51/volumes" Dec 09 10:21:28 crc kubenswrapper[5002]: I1209 10:21:28.471172 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-bbdqx" podUID="f36484d9-764c-460b-b48e-f0c36e1145e8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: i/o timeout" Dec 09 10:21:33 crc kubenswrapper[5002]: I1209 10:21:33.472077 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-bbdqx" podUID="f36484d9-764c-460b-b48e-f0c36e1145e8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: i/o timeout" Dec 09 10:21:33 crc kubenswrapper[5002]: I1209 10:21:33.473114 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-bbdqx" Dec 09 10:21:34 crc kubenswrapper[5002]: I1209 10:21:34.115183 5002 util.go:48] "No ready sandbox for pod can be found. 
Dec 09 10:21:34 crc kubenswrapper[5002]: I1209 10:21:34.240752 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f36484d9-764c-460b-b48e-f0c36e1145e8-ovsdbserver-nb\") pod \"f36484d9-764c-460b-b48e-f0c36e1145e8\" (UID: \"f36484d9-764c-460b-b48e-f0c36e1145e8\") "
Dec 09 10:21:34 crc kubenswrapper[5002]: I1209 10:21:34.240793 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f36484d9-764c-460b-b48e-f0c36e1145e8-ovsdbserver-sb\") pod \"f36484d9-764c-460b-b48e-f0c36e1145e8\" (UID: \"f36484d9-764c-460b-b48e-f0c36e1145e8\") "
Dec 09 10:21:34 crc kubenswrapper[5002]: I1209 10:21:34.240940 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f36484d9-764c-460b-b48e-f0c36e1145e8-config\") pod \"f36484d9-764c-460b-b48e-f0c36e1145e8\" (UID: \"f36484d9-764c-460b-b48e-f0c36e1145e8\") "
Dec 09 10:21:34 crc kubenswrapper[5002]: I1209 10:21:34.240995 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zmpm\" (UniqueName: \"kubernetes.io/projected/f36484d9-764c-460b-b48e-f0c36e1145e8-kube-api-access-9zmpm\") pod \"f36484d9-764c-460b-b48e-f0c36e1145e8\" (UID: \"f36484d9-764c-460b-b48e-f0c36e1145e8\") "
Dec 09 10:21:34 crc kubenswrapper[5002]: I1209 10:21:34.241057 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f36484d9-764c-460b-b48e-f0c36e1145e8-dns-svc\") pod \"f36484d9-764c-460b-b48e-f0c36e1145e8\" (UID: \"f36484d9-764c-460b-b48e-f0c36e1145e8\") "
Dec 09 10:21:34 crc kubenswrapper[5002]: I1209 10:21:34.247218 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f36484d9-764c-460b-b48e-f0c36e1145e8-kube-api-access-9zmpm" (OuterVolumeSpecName: "kube-api-access-9zmpm") pod "f36484d9-764c-460b-b48e-f0c36e1145e8" (UID: "f36484d9-764c-460b-b48e-f0c36e1145e8"). InnerVolumeSpecName "kube-api-access-9zmpm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 10:21:34 crc kubenswrapper[5002]: I1209 10:21:34.282470 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f36484d9-764c-460b-b48e-f0c36e1145e8-config" (OuterVolumeSpecName: "config") pod "f36484d9-764c-460b-b48e-f0c36e1145e8" (UID: "f36484d9-764c-460b-b48e-f0c36e1145e8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 10:21:34 crc kubenswrapper[5002]: I1209 10:21:34.283050 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f36484d9-764c-460b-b48e-f0c36e1145e8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f36484d9-764c-460b-b48e-f0c36e1145e8" (UID: "f36484d9-764c-460b-b48e-f0c36e1145e8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 10:21:34 crc kubenswrapper[5002]: I1209 10:21:34.287057 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f36484d9-764c-460b-b48e-f0c36e1145e8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f36484d9-764c-460b-b48e-f0c36e1145e8" (UID: "f36484d9-764c-460b-b48e-f0c36e1145e8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 10:21:34 crc kubenswrapper[5002]: I1209 10:21:34.289232 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f36484d9-764c-460b-b48e-f0c36e1145e8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f36484d9-764c-460b-b48e-f0c36e1145e8" (UID: "f36484d9-764c-460b-b48e-f0c36e1145e8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 10:21:34 crc kubenswrapper[5002]: I1209 10:21:34.343588 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zmpm\" (UniqueName: \"kubernetes.io/projected/f36484d9-764c-460b-b48e-f0c36e1145e8-kube-api-access-9zmpm\") on node \"crc\" DevicePath \"\""
Dec 09 10:21:34 crc kubenswrapper[5002]: I1209 10:21:34.343705 5002 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f36484d9-764c-460b-b48e-f0c36e1145e8-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 09 10:21:34 crc kubenswrapper[5002]: I1209 10:21:34.343719 5002 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f36484d9-764c-460b-b48e-f0c36e1145e8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 09 10:21:34 crc kubenswrapper[5002]: I1209 10:21:34.343733 5002 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f36484d9-764c-460b-b48e-f0c36e1145e8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 09 10:21:34 crc kubenswrapper[5002]: I1209 10:21:34.343801 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f36484d9-764c-460b-b48e-f0c36e1145e8-config\") on node \"crc\" DevicePath \"\""
Dec 09 10:21:34 crc kubenswrapper[5002]: I1209 10:21:34.637882 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-bbdqx" event={"ID":"f36484d9-764c-460b-b48e-f0c36e1145e8","Type":"ContainerDied","Data":"7caf4911096d14f6d61486c85bbc7f7c971fceb139005e01326f0df4e4dba26f"}
Dec 09 10:21:34 crc kubenswrapper[5002]: I1209 10:21:34.637933 5002 scope.go:117] "RemoveContainer" containerID="0f54f46d5b1dd9e6fe86f181086d298dc0411f77090795c15f7773a0005ebd85"
Dec 09 10:21:34 crc kubenswrapper[5002]: I1209 10:21:34.637931 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-bbdqx"
Dec 09 10:21:34 crc kubenswrapper[5002]: I1209 10:21:34.689226 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-bbdqx"]
Dec 09 10:21:34 crc kubenswrapper[5002]: I1209 10:21:34.699925 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-bbdqx"]
Dec 09 10:21:36 crc kubenswrapper[5002]: I1209 10:21:36.079868 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f36484d9-764c-460b-b48e-f0c36e1145e8" path="/var/lib/kubelet/pods/f36484d9-764c-460b-b48e-f0c36e1145e8/volumes"
Dec 09 10:21:37 crc kubenswrapper[5002]: I1209 10:21:37.964459 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 09 10:21:37 crc kubenswrapper[5002]: I1209 10:21:37.964541 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 09 10:21:38 crc kubenswrapper[5002]: I1209 10:21:38.473798 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-bbdqx" podUID="f36484d9-764c-460b-b48e-f0c36e1145e8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: i/o timeout"
Dec 09 10:21:40 crc kubenswrapper[5002]: I1209 10:21:40.298321 5002 scope.go:117] "RemoveContainer" containerID="0061a469361fca7875a55c02553365038f8dd7530d1c4f33aecb9ad66561977a"
Dec 09 10:21:40 crc kubenswrapper[5002]: E1209 10:21:40.354168 5002 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified"
Dec 09 10:21:40 crc kubenswrapper[5002]: E1209 10:21:40.354325 5002 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hf8zz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-4kmzk_openstack(20ebb6ea-f36b-440a-a437-ff39f9766fca): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 09 10:21:40 crc kubenswrapper[5002]: E1209 10:21:40.355868 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-4kmzk" podUID="20ebb6ea-f36b-440a-a437-ff39f9766fca"
Dec 09 10:21:40 crc kubenswrapper[5002]: E1209 10:21:40.696097 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-4kmzk" podUID="20ebb6ea-f36b-440a-a437-ff39f9766fca"
Dec 09 10:21:40 crc kubenswrapper[5002]: I1209 10:21:40.737456 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-jrbkq"]
Dec 09 10:21:40 crc kubenswrapper[5002]: E1209 10:21:40.902372 5002 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified"
Dec 09 10:21:40 crc kubenswrapper[5002]: E1209 10:21:40.902799 5002 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-notification-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n55fh5cch595h568hf6h689h68fh578h4h5c9h5bbh78hb5h675hc8h97h567h676hdch58ch5ddh657h649h576h677h5dch5b8h565h6dh8bhf5h5f9q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-notification-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bhwbz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/notificationhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(75879941-b12b-4731-a78c-0a0a98142ec0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 09 10:21:41 crc kubenswrapper[5002]: I1209 10:21:41.704561 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-h7bpf" event={"ID":"3d382de4-ddc6-4781-9815-76b74cbccadc","Type":"ContainerStarted","Data":"3e5dbfb4b382b7b6d2a3eddf1c13961cc5734c5034d5bf7a485b13ed5c7407e1"}
Dec 09 10:21:41 crc kubenswrapper[5002]: I1209 10:21:41.709369 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-t7vtz" event={"ID":"354d641d-dc9c-4aa4-b821-90ce72ef6d5c","Type":"ContainerStarted","Data":"75a1a6ebfe287494a07162022a2dce2dfa8561b0452360830b376abd08ad6489"}
Dec 09 10:21:41 crc kubenswrapper[5002]: I1209 10:21:41.711155 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jrbkq" event={"ID":"af9df436-38ae-4001-bce6-e97e5e8d9cd2","Type":"ContainerStarted","Data":"e7cb3a713e4627a3bf966f071a06f11e0cf188dfd353e6767a883425d62ba995"}
Dec 09 10:21:41 crc kubenswrapper[5002]: I1209 10:21:41.711186 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jrbkq" event={"ID":"af9df436-38ae-4001-bce6-e97e5e8d9cd2","Type":"ContainerStarted","Data":"8cfe47444b74dc34134a9be9f94d49ad5b4ee47936bb5a626f52c8534d1f3b37"}
Dec 09 10:21:41 crc kubenswrapper[5002]: I1209 10:21:41.734842 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-h7bpf" podStartSLOduration=2.829628392 podStartE2EDuration="39.73480395s" podCreationTimestamp="2025-12-09 10:21:02 +0000 UTC" firstStartedPulling="2025-12-09 10:21:03.962095694 +0000 UTC m=+1196.354146775" lastFinishedPulling="2025-12-09 10:21:40.867271252 +0000 UTC m=+1233.259322333" observedRunningTime="2025-12-09 10:21:41.718578031 +0000 UTC m=+1234.110629122" watchObservedRunningTime="2025-12-09 10:21:41.73480395 +0000 UTC m=+1234.126855041"
Dec 09 10:21:41 crc kubenswrapper[5002]: I1209 10:21:41.752345 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-jrbkq" podStartSLOduration=18.752328184 podStartE2EDuration="18.752328184s" podCreationTimestamp="2025-12-09 10:21:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:21:41.743784383 +0000 UTC m=+1234.135835464" watchObservedRunningTime="2025-12-09 10:21:41.752328184 +0000 UTC m=+1234.144379265"
Dec 09 10:21:41 crc kubenswrapper[5002]: I1209 10:21:41.772544 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-t7vtz" podStartSLOduration=2.74934314 podStartE2EDuration="39.772525911s" podCreationTimestamp="2025-12-09 10:21:02 +0000 UTC" firstStartedPulling="2025-12-09 10:21:03.844091401 +0000 UTC m=+1196.236142482" lastFinishedPulling="2025-12-09 10:21:40.867274152 +0000 UTC m=+1233.259325253" observedRunningTime="2025-12-09 10:21:41.764162384 +0000 UTC m=+1234.156213485" watchObservedRunningTime="2025-12-09 10:21:41.772525911 +0000 UTC m=+1234.164576982"
Dec 09 10:21:42 crc kubenswrapper[5002]: I1209 10:21:42.723659 5002 generic.go:334] "Generic (PLEG): container finished" podID="d846942e-6d0d-4e42-a584-a910a56d9718" containerID="b11f0e175cf62d7332d70370bdf186828c934bc8e83952200f4524367cd0c303" exitCode=0
Dec 09 10:21:42 crc kubenswrapper[5002]: I1209 10:21:42.724410 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-h2xt6" event={"ID":"d846942e-6d0d-4e42-a584-a910a56d9718","Type":"ContainerDied","Data":"b11f0e175cf62d7332d70370bdf186828c934bc8e83952200f4524367cd0c303"}
Dec 09 10:21:46 crc kubenswrapper[5002]: I1209 10:21:46.760232 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-h2xt6" event={"ID":"d846942e-6d0d-4e42-a584-a910a56d9718","Type":"ContainerDied","Data":"56b299b470e773b7eef13a6091808032a0b755b9eb90b13c8655a128cb44c6a3"}
Dec 09 10:21:46 crc kubenswrapper[5002]: I1209 10:21:46.760719 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56b299b470e773b7eef13a6091808032a0b755b9eb90b13c8655a128cb44c6a3"
Dec 09 10:21:46 crc kubenswrapper[5002]: I1209 10:21:46.832710 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-h2xt6"
Dec 09 10:21:46 crc kubenswrapper[5002]: I1209 10:21:46.961047 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d846942e-6d0d-4e42-a584-a910a56d9718-config-data\") pod \"d846942e-6d0d-4e42-a584-a910a56d9718\" (UID: \"d846942e-6d0d-4e42-a584-a910a56d9718\") "
Dec 09 10:21:46 crc kubenswrapper[5002]: I1209 10:21:46.961117 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmfs8\" (UniqueName: \"kubernetes.io/projected/d846942e-6d0d-4e42-a584-a910a56d9718-kube-api-access-zmfs8\") pod \"d846942e-6d0d-4e42-a584-a910a56d9718\" (UID: \"d846942e-6d0d-4e42-a584-a910a56d9718\") "
Dec 09 10:21:46 crc kubenswrapper[5002]: I1209 10:21:46.961246 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d846942e-6d0d-4e42-a584-a910a56d9718-db-sync-config-data\") pod \"d846942e-6d0d-4e42-a584-a910a56d9718\" (UID: \"d846942e-6d0d-4e42-a584-a910a56d9718\") "
Dec 09 10:21:46 crc kubenswrapper[5002]: I1209 10:21:46.961278 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d846942e-6d0d-4e42-a584-a910a56d9718-combined-ca-bundle\") pod \"d846942e-6d0d-4e42-a584-a910a56d9718\" (UID: \"d846942e-6d0d-4e42-a584-a910a56d9718\") "
Dec 09 10:21:46 crc kubenswrapper[5002]: I1209 10:21:46.966722 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d846942e-6d0d-4e42-a584-a910a56d9718-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d846942e-6d0d-4e42-a584-a910a56d9718" (UID: "d846942e-6d0d-4e42-a584-a910a56d9718"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 10:21:46 crc kubenswrapper[5002]: I1209 10:21:46.967535 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d846942e-6d0d-4e42-a584-a910a56d9718-kube-api-access-zmfs8" (OuterVolumeSpecName: "kube-api-access-zmfs8") pod "d846942e-6d0d-4e42-a584-a910a56d9718" (UID: "d846942e-6d0d-4e42-a584-a910a56d9718"). InnerVolumeSpecName "kube-api-access-zmfs8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 10:21:46 crc kubenswrapper[5002]: I1209 10:21:46.989756 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d846942e-6d0d-4e42-a584-a910a56d9718-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d846942e-6d0d-4e42-a584-a910a56d9718" (UID: "d846942e-6d0d-4e42-a584-a910a56d9718"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 10:21:47 crc kubenswrapper[5002]: I1209 10:21:47.012179 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d846942e-6d0d-4e42-a584-a910a56d9718-config-data" (OuterVolumeSpecName: "config-data") pod "d846942e-6d0d-4e42-a584-a910a56d9718" (UID: "d846942e-6d0d-4e42-a584-a910a56d9718"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 10:21:47 crc kubenswrapper[5002]: I1209 10:21:47.063420 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d846942e-6d0d-4e42-a584-a910a56d9718-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 09 10:21:47 crc kubenswrapper[5002]: I1209 10:21:47.063480 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d846942e-6d0d-4e42-a584-a910a56d9718-config-data\") on node \"crc\" DevicePath \"\""
Dec 09 10:21:47 crc kubenswrapper[5002]: I1209 10:21:47.063499 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmfs8\" (UniqueName: \"kubernetes.io/projected/d846942e-6d0d-4e42-a584-a910a56d9718-kube-api-access-zmfs8\") on node \"crc\" DevicePath \"\""
Dec 09 10:21:47 crc kubenswrapper[5002]: I1209 10:21:47.063521 5002 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d846942e-6d0d-4e42-a584-a910a56d9718-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Dec 09 10:21:47 crc kubenswrapper[5002]: I1209 10:21:47.773008 5002 generic.go:334] "Generic (PLEG): container finished" podID="af9df436-38ae-4001-bce6-e97e5e8d9cd2" containerID="e7cb3a713e4627a3bf966f071a06f11e0cf188dfd353e6767a883425d62ba995" exitCode=0
Dec 09 10:21:47 crc kubenswrapper[5002]: I1209 10:21:47.773093 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jrbkq" event={"ID":"af9df436-38ae-4001-bce6-e97e5e8d9cd2","Type":"ContainerDied","Data":"e7cb3a713e4627a3bf966f071a06f11e0cf188dfd353e6767a883425d62ba995"}
Dec 09 10:21:47 crc kubenswrapper[5002]: I1209 10:21:47.775913 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-h2xt6"
Dec 09 10:21:47 crc kubenswrapper[5002]: I1209 10:21:47.775911 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"75879941-b12b-4731-a78c-0a0a98142ec0","Type":"ContainerStarted","Data":"e8e7f162ad7d3f433f73ef45ae73730565d0f4530dc8afec383b2535046455a7"}
Dec 09 10:21:48 crc kubenswrapper[5002]: I1209 10:21:48.241259 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-jw5tg"]
Dec 09 10:21:48 crc kubenswrapper[5002]: E1209 10:21:48.241606 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f36484d9-764c-460b-b48e-f0c36e1145e8" containerName="init"
Dec 09 10:21:48 crc kubenswrapper[5002]: I1209 10:21:48.241626 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="f36484d9-764c-460b-b48e-f0c36e1145e8" containerName="init"
Dec 09 10:21:48 crc kubenswrapper[5002]: E1209 10:21:48.241669 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d846942e-6d0d-4e42-a584-a910a56d9718" containerName="glance-db-sync"
Dec 09 10:21:48 crc kubenswrapper[5002]: I1209 10:21:48.241676 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="d846942e-6d0d-4e42-a584-a910a56d9718" containerName="glance-db-sync"
Dec 09 10:21:48 crc kubenswrapper[5002]: E1209 10:21:48.241707 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f36484d9-764c-460b-b48e-f0c36e1145e8" containerName="dnsmasq-dns"
Dec 09 10:21:48 crc kubenswrapper[5002]: I1209 10:21:48.241716 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="f36484d9-764c-460b-b48e-f0c36e1145e8" containerName="dnsmasq-dns"
Dec 09 10:21:48 crc kubenswrapper[5002]: I1209 10:21:48.243221 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="f36484d9-764c-460b-b48e-f0c36e1145e8" containerName="dnsmasq-dns"
Dec 09 10:21:48 crc kubenswrapper[5002]: I1209 10:21:48.243256 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="d846942e-6d0d-4e42-a584-a910a56d9718" containerName="glance-db-sync"
Dec 09 10:21:48 crc kubenswrapper[5002]: I1209 10:21:48.244115 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-jw5tg"
Dec 09 10:21:48 crc kubenswrapper[5002]: I1209 10:21:48.268615 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-jw5tg"]
Dec 09 10:21:48 crc kubenswrapper[5002]: I1209 10:21:48.384751 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62-config\") pod \"dnsmasq-dns-785d8bcb8c-jw5tg\" (UID: \"bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62\") " pod="openstack/dnsmasq-dns-785d8bcb8c-jw5tg"
Dec 09 10:21:48 crc kubenswrapper[5002]: I1209 10:21:48.385127 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-jw5tg\" (UID: \"bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62\") " pod="openstack/dnsmasq-dns-785d8bcb8c-jw5tg"
Dec 09 10:21:48 crc kubenswrapper[5002]: I1209 10:21:48.385190 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-jw5tg\" (UID: \"bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62\") " pod="openstack/dnsmasq-dns-785d8bcb8c-jw5tg"
Dec 09 10:21:48 crc kubenswrapper[5002]: I1209 10:21:48.385224 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zqvc\" (UniqueName: \"kubernetes.io/projected/bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62-kube-api-access-4zqvc\") pod \"dnsmasq-dns-785d8bcb8c-jw5tg\" (UID: \"bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62\") " pod="openstack/dnsmasq-dns-785d8bcb8c-jw5tg"
Dec 09 10:21:48 crc kubenswrapper[5002]: I1209 10:21:48.385252 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-jw5tg\" (UID: \"bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62\") " pod="openstack/dnsmasq-dns-785d8bcb8c-jw5tg"
Dec 09 10:21:48 crc kubenswrapper[5002]: I1209 10:21:48.385283 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-jw5tg\" (UID: \"bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62\") " pod="openstack/dnsmasq-dns-785d8bcb8c-jw5tg"
Dec 09 10:21:48 crc kubenswrapper[5002]: I1209 10:21:48.486376 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-jw5tg\" (UID: \"bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62\") " pod="openstack/dnsmasq-dns-785d8bcb8c-jw5tg"
Dec 09 10:21:48 crc kubenswrapper[5002]: I1209 10:21:48.486442 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zqvc\" (UniqueName: \"kubernetes.io/projected/bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62-kube-api-access-4zqvc\") pod \"dnsmasq-dns-785d8bcb8c-jw5tg\" (UID: \"bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62\") " pod="openstack/dnsmasq-dns-785d8bcb8c-jw5tg"
Dec 09 10:21:48 crc kubenswrapper[5002]: I1209 10:21:48.486489 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-jw5tg\" (UID: \"bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62\") " pod="openstack/dnsmasq-dns-785d8bcb8c-jw5tg"
Dec 09 10:21:48 crc kubenswrapper[5002]: I1209 10:21:48.486529 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-jw5tg\" (UID: \"bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62\") " pod="openstack/dnsmasq-dns-785d8bcb8c-jw5tg"
Dec 09 10:21:48 crc kubenswrapper[5002]: I1209 10:21:48.486623 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62-config\") pod \"dnsmasq-dns-785d8bcb8c-jw5tg\" (UID: \"bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62\") " pod="openstack/dnsmasq-dns-785d8bcb8c-jw5tg"
Dec 09 10:21:48 crc kubenswrapper[5002]: I1209 10:21:48.486677 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-jw5tg\" (UID: \"bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62\") " pod="openstack/dnsmasq-dns-785d8bcb8c-jw5tg"
Dec 09 10:21:48 crc kubenswrapper[5002]: I1209 10:21:48.487697 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-jw5tg\" (UID: \"bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62\") " pod="openstack/dnsmasq-dns-785d8bcb8c-jw5tg"
Dec 09 10:21:48 crc kubenswrapper[5002]: I1209 10:21:48.487733 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-jw5tg\" (UID: \"bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62\") " pod="openstack/dnsmasq-dns-785d8bcb8c-jw5tg"
Dec 09 10:21:48 crc kubenswrapper[5002]: I1209 10:21:48.487723 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-jw5tg\" (UID: \"bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62\") " pod="openstack/dnsmasq-dns-785d8bcb8c-jw5tg"
Dec 09 10:21:48 crc kubenswrapper[5002]: I1209 10:21:48.487727 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-jw5tg\" (UID: \"bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62\") " pod="openstack/dnsmasq-dns-785d8bcb8c-jw5tg"
Dec 09 10:21:48 crc kubenswrapper[5002]: I1209 10:21:48.488120 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62-config\") pod \"dnsmasq-dns-785d8bcb8c-jw5tg\" (UID: \"bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62\") " pod="openstack/dnsmasq-dns-785d8bcb8c-jw5tg"
Dec 09 10:21:48 crc kubenswrapper[5002]: I1209 10:21:48.504856 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zqvc\" (UniqueName: \"kubernetes.io/projected/bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62-kube-api-access-4zqvc\") pod \"dnsmasq-dns-785d8bcb8c-jw5tg\" (UID: \"bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62\") " pod="openstack/dnsmasq-dns-785d8bcb8c-jw5tg"
Dec 09 10:21:48 crc kubenswrapper[5002]: I1209 10:21:48.572944 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-jw5tg"
Dec 09 10:21:48 crc kubenswrapper[5002]: I1209 10:21:48.867618 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-jw5tg"]
Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.110308 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jrbkq"
Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.145952 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 09 10:21:49 crc kubenswrapper[5002]: E1209 10:21:49.146313 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af9df436-38ae-4001-bce6-e97e5e8d9cd2" containerName="keystone-bootstrap"
Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.146325 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="af9df436-38ae-4001-bce6-e97e5e8d9cd2" containerName="keystone-bootstrap"
Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.146505 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="af9df436-38ae-4001-bce6-e97e5e8d9cd2" containerName="keystone-bootstrap"
Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.147408 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.168801 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.168833 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.168853 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-55wxd"
Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.184697 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.305588 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af9df436-38ae-4001-bce6-e97e5e8d9cd2-combined-ca-bundle\") pod \"af9df436-38ae-4001-bce6-e97e5e8d9cd2\" (UID: \"af9df436-38ae-4001-bce6-e97e5e8d9cd2\") "
Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.305687 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/af9df436-38ae-4001-bce6-e97e5e8d9cd2-fernet-keys\") pod \"af9df436-38ae-4001-bce6-e97e5e8d9cd2\" (UID: \"af9df436-38ae-4001-bce6-e97e5e8d9cd2\") "
Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.305728 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af9df436-38ae-4001-bce6-e97e5e8d9cd2-scripts\") pod \"af9df436-38ae-4001-bce6-e97e5e8d9cd2\" (UID: \"af9df436-38ae-4001-bce6-e97e5e8d9cd2\") "
Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.305857 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzr7r\" (UniqueName: \"kubernetes.io/projected/af9df436-38ae-4001-bce6-e97e5e8d9cd2-kube-api-access-zzr7r\") pod \"af9df436-38ae-4001-bce6-e97e5e8d9cd2\" (UID: \"af9df436-38ae-4001-bce6-e97e5e8d9cd2\") "
"operationExecutor.UnmountVolume started for volume \"kube-api-access-zzr7r\" (UniqueName: \"kubernetes.io/projected/af9df436-38ae-4001-bce6-e97e5e8d9cd2-kube-api-access-zzr7r\") pod \"af9df436-38ae-4001-bce6-e97e5e8d9cd2\" (UID: \"af9df436-38ae-4001-bce6-e97e5e8d9cd2\") " Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.305929 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/af9df436-38ae-4001-bce6-e97e5e8d9cd2-credential-keys\") pod \"af9df436-38ae-4001-bce6-e97e5e8d9cd2\" (UID: \"af9df436-38ae-4001-bce6-e97e5e8d9cd2\") " Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.306034 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af9df436-38ae-4001-bce6-e97e5e8d9cd2-config-data\") pod \"af9df436-38ae-4001-bce6-e97e5e8d9cd2\" (UID: \"af9df436-38ae-4001-bce6-e97e5e8d9cd2\") " Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.306425 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7d18f09-a22c-4b80-bada-ccf2af571218-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d7d18f09-a22c-4b80-bada-ccf2af571218\") " pod="openstack/glance-default-external-api-0" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.306485 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d7d18f09-a22c-4b80-bada-ccf2af571218-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d7d18f09-a22c-4b80-bada-ccf2af571218\") " pod="openstack/glance-default-external-api-0" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.306509 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7d18f09-a22c-4b80-bada-ccf2af571218-logs\") pod \"glance-default-external-api-0\" (UID: \"d7d18f09-a22c-4b80-bada-ccf2af571218\") " pod="openstack/glance-default-external-api-0" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.306546 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7d18f09-a22c-4b80-bada-ccf2af571218-config-data\") pod \"glance-default-external-api-0\" (UID: \"d7d18f09-a22c-4b80-bada-ccf2af571218\") " pod="openstack/glance-default-external-api-0" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.306688 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjkz8\" (UniqueName: \"kubernetes.io/projected/d7d18f09-a22c-4b80-bada-ccf2af571218-kube-api-access-sjkz8\") pod \"glance-default-external-api-0\" (UID: \"d7d18f09-a22c-4b80-bada-ccf2af571218\") " pod="openstack/glance-default-external-api-0" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.306727 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7d18f09-a22c-4b80-bada-ccf2af571218-scripts\") pod \"glance-default-external-api-0\" (UID: \"d7d18f09-a22c-4b80-bada-ccf2af571218\") " pod="openstack/glance-default-external-api-0" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.307297 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"d7d18f09-a22c-4b80-bada-ccf2af571218\") " pod="openstack/glance-default-external-api-0" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.310620 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af9df436-38ae-4001-bce6-e97e5e8d9cd2-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "af9df436-38ae-4001-bce6-e97e5e8d9cd2" (UID: "af9df436-38ae-4001-bce6-e97e5e8d9cd2"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.311066 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af9df436-38ae-4001-bce6-e97e5e8d9cd2-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "af9df436-38ae-4001-bce6-e97e5e8d9cd2" (UID: "af9df436-38ae-4001-bce6-e97e5e8d9cd2"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.312161 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af9df436-38ae-4001-bce6-e97e5e8d9cd2-scripts" (OuterVolumeSpecName: "scripts") pod "af9df436-38ae-4001-bce6-e97e5e8d9cd2" (UID: "af9df436-38ae-4001-bce6-e97e5e8d9cd2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.312520 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af9df436-38ae-4001-bce6-e97e5e8d9cd2-kube-api-access-zzr7r" (OuterVolumeSpecName: "kube-api-access-zzr7r") pod "af9df436-38ae-4001-bce6-e97e5e8d9cd2" (UID: "af9df436-38ae-4001-bce6-e97e5e8d9cd2"). InnerVolumeSpecName "kube-api-access-zzr7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.337287 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af9df436-38ae-4001-bce6-e97e5e8d9cd2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af9df436-38ae-4001-bce6-e97e5e8d9cd2" (UID: "af9df436-38ae-4001-bce6-e97e5e8d9cd2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.337924 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af9df436-38ae-4001-bce6-e97e5e8d9cd2-config-data" (OuterVolumeSpecName: "config-data") pod "af9df436-38ae-4001-bce6-e97e5e8d9cd2" (UID: "af9df436-38ae-4001-bce6-e97e5e8d9cd2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.369271 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.370920 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.381106 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.389308 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.411944 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7d18f09-a22c-4b80-bada-ccf2af571218-logs\") pod \"glance-default-external-api-0\" (UID: \"d7d18f09-a22c-4b80-bada-ccf2af571218\") " pod="openstack/glance-default-external-api-0" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.412091 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d7d18f09-a22c-4b80-bada-ccf2af571218-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d7d18f09-a22c-4b80-bada-ccf2af571218\") " pod="openstack/glance-default-external-api-0" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.412288 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7d18f09-a22c-4b80-bada-ccf2af571218-config-data\") pod \"glance-default-external-api-0\" (UID: \"d7d18f09-a22c-4b80-bada-ccf2af571218\") " pod="openstack/glance-default-external-api-0" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.412353 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjkz8\" (UniqueName: \"kubernetes.io/projected/d7d18f09-a22c-4b80-bada-ccf2af571218-kube-api-access-sjkz8\") pod \"glance-default-external-api-0\" (UID: \"d7d18f09-a22c-4b80-bada-ccf2af571218\") " pod="openstack/glance-default-external-api-0" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.412378 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7d18f09-a22c-4b80-bada-ccf2af571218-scripts\") pod \"glance-default-external-api-0\" (UID: \"d7d18f09-a22c-4b80-bada-ccf2af571218\") " pod="openstack/glance-default-external-api-0" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.412438 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"d7d18f09-a22c-4b80-bada-ccf2af571218\") " pod="openstack/glance-default-external-api-0" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.412524 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7d18f09-a22c-4b80-bada-ccf2af571218-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d7d18f09-a22c-4b80-bada-ccf2af571218\") " pod="openstack/glance-default-external-api-0" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.412579 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af9df436-38ae-4001-bce6-e97e5e8d9cd2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.412594 5002 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/af9df436-38ae-4001-bce6-e97e5e8d9cd2-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.412618 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af9df436-38ae-4001-bce6-e97e5e8d9cd2-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.412630 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzr7r\" (UniqueName: \"kubernetes.io/projected/af9df436-38ae-4001-bce6-e97e5e8d9cd2-kube-api-access-zzr7r\") on node \"crc\" DevicePath \"\"" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.412644 5002 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/af9df436-38ae-4001-bce6-e97e5e8d9cd2-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.412654 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af9df436-38ae-4001-bce6-e97e5e8d9cd2-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.414310 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7d18f09-a22c-4b80-bada-ccf2af571218-logs\") pod \"glance-default-external-api-0\" (UID: \"d7d18f09-a22c-4b80-bada-ccf2af571218\") " pod="openstack/glance-default-external-api-0" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.414561 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d7d18f09-a22c-4b80-bada-ccf2af571218-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d7d18f09-a22c-4b80-bada-ccf2af571218\") " pod="openstack/glance-default-external-api-0" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.415517 5002 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"d7d18f09-a22c-4b80-bada-ccf2af571218\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.418824 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7d18f09-a22c-4b80-bada-ccf2af571218-scripts\") pod \"glance-default-external-api-0\" (UID: \"d7d18f09-a22c-4b80-bada-ccf2af571218\") " pod="openstack/glance-default-external-api-0" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.419467 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7d18f09-a22c-4b80-bada-ccf2af571218-config-data\") pod \"glance-default-external-api-0\" (UID: \"d7d18f09-a22c-4b80-bada-ccf2af571218\") " pod="openstack/glance-default-external-api-0" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.419696 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7d18f09-a22c-4b80-bada-ccf2af571218-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d7d18f09-a22c-4b80-bada-ccf2af571218\") " pod="openstack/glance-default-external-api-0" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.433357 5002 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-sjkz8\" (UniqueName: \"kubernetes.io/projected/d7d18f09-a22c-4b80-bada-ccf2af571218-kube-api-access-sjkz8\") pod \"glance-default-external-api-0\" (UID: \"d7d18f09-a22c-4b80-bada-ccf2af571218\") " pod="openstack/glance-default-external-api-0" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.441062 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"d7d18f09-a22c-4b80-bada-ccf2af571218\") " pod="openstack/glance-default-external-api-0" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.489410 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.515821 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"3680e6da-327f-4b27-b912-307ee92b8b3c\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.515868 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3680e6da-327f-4b27-b912-307ee92b8b3c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3680e6da-327f-4b27-b912-307ee92b8b3c\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.515893 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3680e6da-327f-4b27-b912-307ee92b8b3c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3680e6da-327f-4b27-b912-307ee92b8b3c\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.515940 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3680e6da-327f-4b27-b912-307ee92b8b3c-logs\") pod \"glance-default-internal-api-0\" (UID: \"3680e6da-327f-4b27-b912-307ee92b8b3c\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.515973 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3680e6da-327f-4b27-b912-307ee92b8b3c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3680e6da-327f-4b27-b912-307ee92b8b3c\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.515990 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28d5n\" (UniqueName: \"kubernetes.io/projected/3680e6da-327f-4b27-b912-307ee92b8b3c-kube-api-access-28d5n\") pod \"glance-default-internal-api-0\" (UID: \"3680e6da-327f-4b27-b912-307ee92b8b3c\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.516009 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3680e6da-327f-4b27-b912-307ee92b8b3c-scripts\") pod \"glance-default-internal-api-0\" (UID: 
\"3680e6da-327f-4b27-b912-307ee92b8b3c\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.618531 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"3680e6da-327f-4b27-b912-307ee92b8b3c\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.618882 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3680e6da-327f-4b27-b912-307ee92b8b3c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3680e6da-327f-4b27-b912-307ee92b8b3c\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.618909 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3680e6da-327f-4b27-b912-307ee92b8b3c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3680e6da-327f-4b27-b912-307ee92b8b3c\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.618961 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3680e6da-327f-4b27-b912-307ee92b8b3c-logs\") pod \"glance-default-internal-api-0\" (UID: \"3680e6da-327f-4b27-b912-307ee92b8b3c\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.618992 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3680e6da-327f-4b27-b912-307ee92b8b3c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3680e6da-327f-4b27-b912-307ee92b8b3c\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.619011 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28d5n\" (UniqueName: \"kubernetes.io/projected/3680e6da-327f-4b27-b912-307ee92b8b3c-kube-api-access-28d5n\") pod \"glance-default-internal-api-0\" (UID: \"3680e6da-327f-4b27-b912-307ee92b8b3c\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.619008 5002 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"3680e6da-327f-4b27-b912-307ee92b8b3c\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.619033 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3680e6da-327f-4b27-b912-307ee92b8b3c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3680e6da-327f-4b27-b912-307ee92b8b3c\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.619527 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3680e6da-327f-4b27-b912-307ee92b8b3c-logs\") pod \"glance-default-internal-api-0\" (UID: \"3680e6da-327f-4b27-b912-307ee92b8b3c\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:21:49 crc 
kubenswrapper[5002]: I1209 10:21:49.619563 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3680e6da-327f-4b27-b912-307ee92b8b3c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3680e6da-327f-4b27-b912-307ee92b8b3c\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.626637 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3680e6da-327f-4b27-b912-307ee92b8b3c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3680e6da-327f-4b27-b912-307ee92b8b3c\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.626936 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3680e6da-327f-4b27-b912-307ee92b8b3c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3680e6da-327f-4b27-b912-307ee92b8b3c\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.627197 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3680e6da-327f-4b27-b912-307ee92b8b3c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3680e6da-327f-4b27-b912-307ee92b8b3c\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.641585 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28d5n\" (UniqueName: \"kubernetes.io/projected/3680e6da-327f-4b27-b912-307ee92b8b3c-kube-api-access-28d5n\") pod \"glance-default-internal-api-0\" (UID: \"3680e6da-327f-4b27-b912-307ee92b8b3c\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.641603 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"3680e6da-327f-4b27-b912-307ee92b8b3c\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.705904 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.868661 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jrbkq" event={"ID":"af9df436-38ae-4001-bce6-e97e5e8d9cd2","Type":"ContainerDied","Data":"8cfe47444b74dc34134a9be9f94d49ad5b4ee47936bb5a626f52c8534d1f3b37"} Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.869002 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8cfe47444b74dc34134a9be9f94d49ad5b4ee47936bb5a626f52c8534d1f3b37" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.869080 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jrbkq" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.885879 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6d565f9c5b-d7trd"] Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.887354 5002 util.go:30] "No sandbox for pod can be found. 
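Each entry above has a fixed shape: a journald prefix (timestamp, host crc, unit kubenswrapper[5002]) followed by a klog header such as I1209 10:21:49.618531 5002 reconciler_common.go:218], where the leading letter is the severity (I = info, E = error), 1209 is the month/day, 5002 the logging PID, and reconciler_common.go:218 the source location that emitted the message. A minimal parsing sketch in Python; the file name kubelet.log is a hypothetical export of this journal, not something named in the log itself:

    import re

    # journald prefix + klog header, e.g.:
    # "Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.618531 5002 reconciler_common.go:218] ..."
    KLOG = re.compile(
        r'^(?P<month>\w{3}) (?P<day>\d{2}) (?P<walltime>[\d:]{8}) (?P<host>\S+) '
        r'kubenswrapper\[(?P<unit_pid>\d+)\]: '
        r'(?P<sev>[IWEF])(?P<mmdd>\d{4}) (?P<klog_time>\d{2}:\d{2}:\d{2}\.\d{6}) '
        r'(?P<pid>\d+) (?P<src>[\w.]+:\d+)\] (?P<msg>.*)$'
    )

    def parse(line: str):
        """Split one journal line into its journald and klog header fields."""
        m = KLOG.match(line)
        return m.groupdict() if m else None

    with open("kubelet.log") as fh:          # hypothetical capture of this journal
        for line in fh:
            rec = parse(line.rstrip("\n"))
            if rec and rec["sev"] == "E":    # surface only error-severity entries
                print(rec["klog_time"], rec["src"], rec["msg"][:80])

The later sketches below assume lines have been read and parsed this way.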
Need to start a new one" pod="openstack/keystone-6d565f9c5b-d7trd" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.891045 5002 generic.go:334] "Generic (PLEG): container finished" podID="bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62" containerID="38c2b87abaa0f4cd658a92e5537f9dc6e5d0715ab297e11b8a8eb163590eba9f" exitCode=0 Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.891102 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-jw5tg" event={"ID":"bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62","Type":"ContainerDied","Data":"38c2b87abaa0f4cd658a92e5537f9dc6e5d0715ab297e11b8a8eb163590eba9f"} Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.891133 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-jw5tg" event={"ID":"bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62","Type":"ContainerStarted","Data":"68e63d63ece6441c6edeac172f450d5d628db113e8c081c1d369b8d9be939350"} Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.895660 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.896031 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.896375 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-q4kd8" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.897013 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.900614 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.900732 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.905593 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6d565f9c5b-d7trd"] Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.929449 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f514395b-6067-4e42-98e6-f3c5ac427982-fernet-keys\") pod \"keystone-6d565f9c5b-d7trd\" (UID: \"f514395b-6067-4e42-98e6-f3c5ac427982\") " pod="openstack/keystone-6d565f9c5b-d7trd" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.929529 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v2fw\" (UniqueName: \"kubernetes.io/projected/f514395b-6067-4e42-98e6-f3c5ac427982-kube-api-access-7v2fw\") pod \"keystone-6d565f9c5b-d7trd\" (UID: \"f514395b-6067-4e42-98e6-f3c5ac427982\") " pod="openstack/keystone-6d565f9c5b-d7trd" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.929598 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f514395b-6067-4e42-98e6-f3c5ac427982-combined-ca-bundle\") pod \"keystone-6d565f9c5b-d7trd\" (UID: \"f514395b-6067-4e42-98e6-f3c5ac427982\") " pod="openstack/keystone-6d565f9c5b-d7trd" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.929653 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f514395b-6067-4e42-98e6-f3c5ac427982-internal-tls-certs\") pod \"keystone-6d565f9c5b-d7trd\" (UID: \"f514395b-6067-4e42-98e6-f3c5ac427982\") " pod="openstack/keystone-6d565f9c5b-d7trd" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.929698 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f514395b-6067-4e42-98e6-f3c5ac427982-credential-keys\") pod \"keystone-6d565f9c5b-d7trd\" (UID: \"f514395b-6067-4e42-98e6-f3c5ac427982\") " pod="openstack/keystone-6d565f9c5b-d7trd" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.929735 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f514395b-6067-4e42-98e6-f3c5ac427982-public-tls-certs\") pod \"keystone-6d565f9c5b-d7trd\" (UID: \"f514395b-6067-4e42-98e6-f3c5ac427982\") " pod="openstack/keystone-6d565f9c5b-d7trd" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.929789 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f514395b-6067-4e42-98e6-f3c5ac427982-scripts\") pod \"keystone-6d565f9c5b-d7trd\" (UID: \"f514395b-6067-4e42-98e6-f3c5ac427982\") " pod="openstack/keystone-6d565f9c5b-d7trd" Dec 09 10:21:49 crc kubenswrapper[5002]: I1209 10:21:49.929930 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f514395b-6067-4e42-98e6-f3c5ac427982-config-data\") pod \"keystone-6d565f9c5b-d7trd\" (UID: \"f514395b-6067-4e42-98e6-f3c5ac427982\") " pod="openstack/keystone-6d565f9c5b-d7trd" Dec 09 10:21:50 crc kubenswrapper[5002]: I1209 10:21:50.030770 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v2fw\" (UniqueName: \"kubernetes.io/projected/f514395b-6067-4e42-98e6-f3c5ac427982-kube-api-access-7v2fw\") pod \"keystone-6d565f9c5b-d7trd\" (UID: \"f514395b-6067-4e42-98e6-f3c5ac427982\") " pod="openstack/keystone-6d565f9c5b-d7trd" Dec 09 10:21:50 crc kubenswrapper[5002]: I1209 10:21:50.030861 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f514395b-6067-4e42-98e6-f3c5ac427982-combined-ca-bundle\") pod \"keystone-6d565f9c5b-d7trd\" (UID: \"f514395b-6067-4e42-98e6-f3c5ac427982\") " pod="openstack/keystone-6d565f9c5b-d7trd" Dec 09 10:21:50 crc kubenswrapper[5002]: I1209 10:21:50.030900 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f514395b-6067-4e42-98e6-f3c5ac427982-internal-tls-certs\") pod \"keystone-6d565f9c5b-d7trd\" (UID: \"f514395b-6067-4e42-98e6-f3c5ac427982\") " pod="openstack/keystone-6d565f9c5b-d7trd" Dec 09 10:21:50 crc kubenswrapper[5002]: I1209 10:21:50.030932 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f514395b-6067-4e42-98e6-f3c5ac427982-credential-keys\") pod \"keystone-6d565f9c5b-d7trd\" (UID: \"f514395b-6067-4e42-98e6-f3c5ac427982\") " pod="openstack/keystone-6d565f9c5b-d7trd" Dec 09 10:21:50 crc kubenswrapper[5002]: I1209 10:21:50.030954 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f514395b-6067-4e42-98e6-f3c5ac427982-public-tls-certs\") pod \"keystone-6d565f9c5b-d7trd\" (UID: \"f514395b-6067-4e42-98e6-f3c5ac427982\") " pod="openstack/keystone-6d565f9c5b-d7trd" Dec 09 10:21:50 crc kubenswrapper[5002]: I1209 10:21:50.030994 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f514395b-6067-4e42-98e6-f3c5ac427982-scripts\") pod \"keystone-6d565f9c5b-d7trd\" (UID: \"f514395b-6067-4e42-98e6-f3c5ac427982\") " pod="openstack/keystone-6d565f9c5b-d7trd" Dec 09 10:21:50 crc kubenswrapper[5002]: I1209 10:21:50.031075 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f514395b-6067-4e42-98e6-f3c5ac427982-config-data\") pod \"keystone-6d565f9c5b-d7trd\" (UID: \"f514395b-6067-4e42-98e6-f3c5ac427982\") " pod="openstack/keystone-6d565f9c5b-d7trd" Dec 09 10:21:50 crc kubenswrapper[5002]: I1209 10:21:50.031101 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f514395b-6067-4e42-98e6-f3c5ac427982-fernet-keys\") pod \"keystone-6d565f9c5b-d7trd\" (UID: \"f514395b-6067-4e42-98e6-f3c5ac427982\") " pod="openstack/keystone-6d565f9c5b-d7trd" Dec 09 10:21:50 crc kubenswrapper[5002]: I1209 10:21:50.045044 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f514395b-6067-4e42-98e6-f3c5ac427982-combined-ca-bundle\") pod \"keystone-6d565f9c5b-d7trd\" (UID: \"f514395b-6067-4e42-98e6-f3c5ac427982\") " pod="openstack/keystone-6d565f9c5b-d7trd" Dec 09 10:21:50 crc kubenswrapper[5002]: I1209 10:21:50.050989 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 10:21:50 crc kubenswrapper[5002]: I1209 10:21:50.052717 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f514395b-6067-4e42-98e6-f3c5ac427982-config-data\") pod \"keystone-6d565f9c5b-d7trd\" (UID: \"f514395b-6067-4e42-98e6-f3c5ac427982\") " pod="openstack/keystone-6d565f9c5b-d7trd" Dec 09 10:21:50 crc kubenswrapper[5002]: I1209 10:21:50.053399 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f514395b-6067-4e42-98e6-f3c5ac427982-public-tls-certs\") pod \"keystone-6d565f9c5b-d7trd\" (UID: \"f514395b-6067-4e42-98e6-f3c5ac427982\") " pod="openstack/keystone-6d565f9c5b-d7trd" Dec 09 10:21:50 crc kubenswrapper[5002]: I1209 10:21:50.054358 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f514395b-6067-4e42-98e6-f3c5ac427982-internal-tls-certs\") pod \"keystone-6d565f9c5b-d7trd\" (UID: \"f514395b-6067-4e42-98e6-f3c5ac427982\") " pod="openstack/keystone-6d565f9c5b-d7trd" Dec 09 10:21:50 crc kubenswrapper[5002]: I1209 10:21:50.054960 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f514395b-6067-4e42-98e6-f3c5ac427982-credential-keys\") pod \"keystone-6d565f9c5b-d7trd\" (UID: \"f514395b-6067-4e42-98e6-f3c5ac427982\") " pod="openstack/keystone-6d565f9c5b-d7trd" Dec 09 10:21:50 crc kubenswrapper[5002]: I1209 10:21:50.056339 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/f514395b-6067-4e42-98e6-f3c5ac427982-fernet-keys\") pod \"keystone-6d565f9c5b-d7trd\" (UID: \"f514395b-6067-4e42-98e6-f3c5ac427982\") " pod="openstack/keystone-6d565f9c5b-d7trd" Dec 09 10:21:50 crc kubenswrapper[5002]: I1209 10:21:50.062116 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f514395b-6067-4e42-98e6-f3c5ac427982-scripts\") pod \"keystone-6d565f9c5b-d7trd\" (UID: \"f514395b-6067-4e42-98e6-f3c5ac427982\") " pod="openstack/keystone-6d565f9c5b-d7trd" Dec 09 10:21:50 crc kubenswrapper[5002]: I1209 10:21:50.089487 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v2fw\" (UniqueName: \"kubernetes.io/projected/f514395b-6067-4e42-98e6-f3c5ac427982-kube-api-access-7v2fw\") pod \"keystone-6d565f9c5b-d7trd\" (UID: \"f514395b-6067-4e42-98e6-f3c5ac427982\") " pod="openstack/keystone-6d565f9c5b-d7trd" Dec 09 10:21:50 crc kubenswrapper[5002]: I1209 10:21:50.236014 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6d565f9c5b-d7trd" Dec 09 10:21:50 crc kubenswrapper[5002]: I1209 10:21:50.327564 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 10:21:50 crc kubenswrapper[5002]: I1209 10:21:50.709184 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6d565f9c5b-d7trd"] Dec 09 10:21:50 crc kubenswrapper[5002]: I1209 10:21:50.914730 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3680e6da-327f-4b27-b912-307ee92b8b3c","Type":"ContainerStarted","Data":"c85127b1a63299183a99026f7dc1c9ae06d3713f7bb1c5d4c8dc8f3065cf5a72"} Dec 09 10:21:50 crc kubenswrapper[5002]: I1209 10:21:50.916444 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d7d18f09-a22c-4b80-bada-ccf2af571218","Type":"ContainerStarted","Data":"5150fe2770c3343f2bce77e39927565d446a081c8f210fe4c24951fa04891b5a"} Dec 09 10:21:50 crc kubenswrapper[5002]: I1209 10:21:50.919560 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6d565f9c5b-d7trd" event={"ID":"f514395b-6067-4e42-98e6-f3c5ac427982","Type":"ContainerStarted","Data":"4db14adbffe643510f2f241ba1a2554a6f2f31368c7be043902223832098dcb1"} Dec 09 10:21:50 crc kubenswrapper[5002]: I1209 10:21:50.923744 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-jw5tg" event={"ID":"bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62","Type":"ContainerStarted","Data":"06b2d93793641692f807caaff4a3e07155c43cb5b30ab6012f124dd348926e3b"} Dec 09 10:21:50 crc kubenswrapper[5002]: I1209 10:21:50.925092 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-jw5tg" Dec 09 10:21:50 crc kubenswrapper[5002]: I1209 10:21:50.942591 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-jw5tg" podStartSLOduration=2.942568669 podStartE2EDuration="2.942568669s" podCreationTimestamp="2025-12-09 10:21:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:21:50.940924785 +0000 UTC m=+1243.332975886" watchObservedRunningTime="2025-12-09 10:21:50.942568669 +0000 UTC m=+1243.334619750" Dec 09 10:21:51 crc kubenswrapper[5002]: I1209 10:21:51.593428 5002 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 10:21:51 crc kubenswrapper[5002]: I1209 10:21:51.646979 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 10:21:51 crc kubenswrapper[5002]: I1209 10:21:51.940741 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3680e6da-327f-4b27-b912-307ee92b8b3c","Type":"ContainerStarted","Data":"343bf0f00e7d5ff02daa6ab4c861dfa5f5d856789d91797a06ba2c289542d5a4"} Dec 09 10:21:51 crc kubenswrapper[5002]: I1209 10:21:51.944127 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d7d18f09-a22c-4b80-bada-ccf2af571218","Type":"ContainerStarted","Data":"ccfe910742e430d2244c08e5d19f308204730fa199e05b3ffafb774142544899"} Dec 09 10:21:51 crc kubenswrapper[5002]: I1209 10:21:51.946020 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6d565f9c5b-d7trd" event={"ID":"f514395b-6067-4e42-98e6-f3c5ac427982","Type":"ContainerStarted","Data":"6037ffe3713fc44574c5f932602a50f9c94b830e5d71df9a677e79d52c3571ba"} Dec 09 10:21:51 crc kubenswrapper[5002]: I1209 10:21:51.946177 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-6d565f9c5b-d7trd" Dec 09 10:21:51 crc kubenswrapper[5002]: I1209 10:21:51.967570 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6d565f9c5b-d7trd" podStartSLOduration=2.967548569 podStartE2EDuration="2.967548569s" podCreationTimestamp="2025-12-09 10:21:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:21:51.963693714 +0000 UTC m=+1244.355744795" watchObservedRunningTime="2025-12-09 10:21:51.967548569 +0000 UTC m=+1244.359599650" Dec 09 10:21:55 crc kubenswrapper[5002]: I1209 10:21:55.987338 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3680e6da-327f-4b27-b912-307ee92b8b3c","Type":"ContainerStarted","Data":"8bd12516058a654ec2af00a29d00a8d21b1037635dc36ad3d93e623fbf95b1a3"} Dec 09 10:21:55 crc kubenswrapper[5002]: I1209 10:21:55.987453 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3680e6da-327f-4b27-b912-307ee92b8b3c" containerName="glance-log" containerID="cri-o://343bf0f00e7d5ff02daa6ab4c861dfa5f5d856789d91797a06ba2c289542d5a4" gracePeriod=30 Dec 09 10:21:55 crc kubenswrapper[5002]: I1209 10:21:55.987497 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3680e6da-327f-4b27-b912-307ee92b8b3c" containerName="glance-httpd" containerID="cri-o://8bd12516058a654ec2af00a29d00a8d21b1037635dc36ad3d93e623fbf95b1a3" gracePeriod=30 Dec 09 10:21:55 crc kubenswrapper[5002]: I1209 10:21:55.991787 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d7d18f09-a22c-4b80-bada-ccf2af571218","Type":"ContainerStarted","Data":"a292317375c9fce2a951dd64047a1cb400be86eeec2cd87b898d31b7f28e226c"} Dec 09 10:21:55 crc kubenswrapper[5002]: I1209 10:21:55.992713 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d7d18f09-a22c-4b80-bada-ccf2af571218" containerName="glance-log" 
containerID="cri-o://ccfe910742e430d2244c08e5d19f308204730fa199e05b3ffafb774142544899" gracePeriod=30 Dec 09 10:21:55 crc kubenswrapper[5002]: I1209 10:21:55.992762 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d7d18f09-a22c-4b80-bada-ccf2af571218" containerName="glance-httpd" containerID="cri-o://a292317375c9fce2a951dd64047a1cb400be86eeec2cd87b898d31b7f28e226c" gracePeriod=30 Dec 09 10:21:56 crc kubenswrapper[5002]: I1209 10:21:56.027852 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=8.027832712 podStartE2EDuration="8.027832712s" podCreationTimestamp="2025-12-09 10:21:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:21:56.015768646 +0000 UTC m=+1248.407819737" watchObservedRunningTime="2025-12-09 10:21:56.027832712 +0000 UTC m=+1248.419883793" Dec 09 10:21:56 crc kubenswrapper[5002]: I1209 10:21:56.044969 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=8.044939165 podStartE2EDuration="8.044939165s" podCreationTimestamp="2025-12-09 10:21:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:21:56.035665124 +0000 UTC m=+1248.427716225" watchObservedRunningTime="2025-12-09 10:21:56.044939165 +0000 UTC m=+1248.436990266" Dec 09 10:21:57 crc kubenswrapper[5002]: I1209 10:21:57.001548 5002 generic.go:334] "Generic (PLEG): container finished" podID="3680e6da-327f-4b27-b912-307ee92b8b3c" containerID="8bd12516058a654ec2af00a29d00a8d21b1037635dc36ad3d93e623fbf95b1a3" exitCode=0 Dec 09 10:21:57 crc kubenswrapper[5002]: I1209 10:21:57.001988 5002 generic.go:334] "Generic (PLEG): container finished" podID="3680e6da-327f-4b27-b912-307ee92b8b3c" containerID="343bf0f00e7d5ff02daa6ab4c861dfa5f5d856789d91797a06ba2c289542d5a4" exitCode=143 Dec 09 10:21:57 crc kubenswrapper[5002]: I1209 10:21:57.001625 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3680e6da-327f-4b27-b912-307ee92b8b3c","Type":"ContainerDied","Data":"8bd12516058a654ec2af00a29d00a8d21b1037635dc36ad3d93e623fbf95b1a3"} Dec 09 10:21:57 crc kubenswrapper[5002]: I1209 10:21:57.002079 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3680e6da-327f-4b27-b912-307ee92b8b3c","Type":"ContainerDied","Data":"343bf0f00e7d5ff02daa6ab4c861dfa5f5d856789d91797a06ba2c289542d5a4"} Dec 09 10:21:57 crc kubenswrapper[5002]: I1209 10:21:57.004488 5002 generic.go:334] "Generic (PLEG): container finished" podID="d7d18f09-a22c-4b80-bada-ccf2af571218" containerID="a292317375c9fce2a951dd64047a1cb400be86eeec2cd87b898d31b7f28e226c" exitCode=0 Dec 09 10:21:57 crc kubenswrapper[5002]: I1209 10:21:57.004528 5002 generic.go:334] "Generic (PLEG): container finished" podID="d7d18f09-a22c-4b80-bada-ccf2af571218" containerID="ccfe910742e430d2244c08e5d19f308204730fa199e05b3ffafb774142544899" exitCode=143 Dec 09 10:21:57 crc kubenswrapper[5002]: I1209 10:21:57.004547 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"d7d18f09-a22c-4b80-bada-ccf2af571218","Type":"ContainerDied","Data":"a292317375c9fce2a951dd64047a1cb400be86eeec2cd87b898d31b7f28e226c"} Dec 09 10:21:57 crc kubenswrapper[5002]: I1209 10:21:57.004566 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d7d18f09-a22c-4b80-bada-ccf2af571218","Type":"ContainerDied","Data":"ccfe910742e430d2244c08e5d19f308204730fa199e05b3ffafb774142544899"} Dec 09 10:21:58 crc kubenswrapper[5002]: I1209 10:21:58.575107 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-jw5tg" Dec 09 10:21:58 crc kubenswrapper[5002]: I1209 10:21:58.684530 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-f7nbc"] Dec 09 10:21:58 crc kubenswrapper[5002]: I1209 10:21:58.684837 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58dd9ff6bc-f7nbc" podUID="c970f7fd-c67b-4cda-9124-7fb1709c8d73" containerName="dnsmasq-dns" containerID="cri-o://3b3e416e7518a7d77f215c8aaaec78e67b328f6a7988e8127692752dfbd2bd6c" gracePeriod=10 Dec 09 10:22:00 crc kubenswrapper[5002]: I1209 10:22:00.034045 5002 generic.go:334] "Generic (PLEG): container finished" podID="c970f7fd-c67b-4cda-9124-7fb1709c8d73" containerID="3b3e416e7518a7d77f215c8aaaec78e67b328f6a7988e8127692752dfbd2bd6c" exitCode=0 Dec 09 10:22:00 crc kubenswrapper[5002]: I1209 10:22:00.034121 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-f7nbc" event={"ID":"c970f7fd-c67b-4cda-9124-7fb1709c8d73","Type":"ContainerDied","Data":"3b3e416e7518a7d77f215c8aaaec78e67b328f6a7988e8127692752dfbd2bd6c"} Dec 09 10:22:02 crc kubenswrapper[5002]: I1209 10:22:02.052484 5002 generic.go:334] "Generic (PLEG): container finished" podID="3d382de4-ddc6-4781-9815-76b74cbccadc" containerID="3e5dbfb4b382b7b6d2a3eddf1c13961cc5734c5034d5bf7a485b13ed5c7407e1" exitCode=0 Dec 09 10:22:02 crc kubenswrapper[5002]: I1209 10:22:02.052613 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-h7bpf" event={"ID":"3d382de4-ddc6-4781-9815-76b74cbccadc","Type":"ContainerDied","Data":"3e5dbfb4b382b7b6d2a3eddf1c13961cc5734c5034d5bf7a485b13ed5c7407e1"} Dec 09 10:22:03 crc kubenswrapper[5002]: E1209 10:22:03.887729 5002 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/ubi9/httpd-24:latest" Dec 09 10:22:03 crc kubenswrapper[5002]: E1209 10:22:03.888342 5002 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:proxy-httpd,Image:registry.redhat.io/ubi9/httpd-24:latest,Command:[/usr/sbin/httpd],Args:[-DFOREGROUND],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:proxy-httpd,HostPort:0,ContainerPort:3000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf/httpd.conf,SubPath:httpd.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf.d/ssl.conf,SubPath:ssl.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run-httpd,ReadOnly:false,MountPath:/run/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log-httpd,ReadOnly:false,MountPath:/var/log/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bhwbz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(75879941-b12b-4731-a78c-0a0a98142ec0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 09 10:22:03 crc kubenswrapper[5002]: E1209 10:22:03.889924 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"ceilometer-notification-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"proxy-httpd\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"]" pod="openstack/ceilometer-0" podUID="75879941-b12b-4731-a78c-0a0a98142ec0" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.099905 5002 kuberuntime_container.go:808] 
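The E-severity entries above record one failure three ways: the CRI pull failure itself (log.go), the unhandled start error carrying the full &Container{...} spec dump (kuberuntime_manager.go), and the aggregated retry decision for openstack/ceilometer-0 (pod_workers.go). A sketch that reduces such clusters to a short report; the regexes only rely on the fields visible above, and the lazy match in SYNC_FAIL is needed because the err value itself contains escaped quotes:

    import re

    PULL_FAIL = re.compile(
        r'"PullImage from image service failed" err="(?P<err>[^"]+)" image="(?P<image>[^"]+)"'
    )
    SYNC_FAIL = re.compile(
        r'"Error syncing pod, skipping" err="(?P<err>.*?)" pod="(?P<pod>[^"]+)"'
    )

    def pull_failures(lines):
        """Summarize image-pull and pod-sync failures from kubelet error entries."""
        report = []
        for line in lines:
            if (m := PULL_FAIL.search(line)):
                report.append(("pull", m["image"], m["err"]))
            elif (m := SYNC_FAIL.search(line)):
                report.append(("sync", m["pod"], m["err"][:60]))
        return report

    # For the entries above: one ("pull", "registry.redhat.io/ubi9/httpd-24:latest", ...)
    # and one ("sync", "openstack/ceilometer-0", ...) record.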
"Killing container with a grace period" pod="openstack/ceilometer-0" podUID="75879941-b12b-4731-a78c-0a0a98142ec0" containerName="sg-core" containerID="cri-o://e8e7f162ad7d3f433f73ef45ae73730565d0f4530dc8afec383b2535046455a7" gracePeriod=30 Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.634656 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.644488 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-f7nbc" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.645599 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.702772 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-h7bpf" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.741113 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7d18f09-a22c-4b80-bada-ccf2af571218-scripts\") pod \"d7d18f09-a22c-4b80-bada-ccf2af571218\" (UID: \"d7d18f09-a22c-4b80-bada-ccf2af571218\") " Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.741180 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3680e6da-327f-4b27-b912-307ee92b8b3c-logs\") pod \"3680e6da-327f-4b27-b912-307ee92b8b3c\" (UID: \"3680e6da-327f-4b27-b912-307ee92b8b3c\") " Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.741218 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"3680e6da-327f-4b27-b912-307ee92b8b3c\" (UID: \"3680e6da-327f-4b27-b912-307ee92b8b3c\") " Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.741251 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjkz8\" (UniqueName: \"kubernetes.io/projected/d7d18f09-a22c-4b80-bada-ccf2af571218-kube-api-access-sjkz8\") pod \"d7d18f09-a22c-4b80-bada-ccf2af571218\" (UID: \"d7d18f09-a22c-4b80-bada-ccf2af571218\") " Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.741290 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7d18f09-a22c-4b80-bada-ccf2af571218-combined-ca-bundle\") pod \"d7d18f09-a22c-4b80-bada-ccf2af571218\" (UID: \"d7d18f09-a22c-4b80-bada-ccf2af571218\") " Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.741336 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c970f7fd-c67b-4cda-9124-7fb1709c8d73-dns-swift-storage-0\") pod \"c970f7fd-c67b-4cda-9124-7fb1709c8d73\" (UID: \"c970f7fd-c67b-4cda-9124-7fb1709c8d73\") " Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.741372 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7d18f09-a22c-4b80-bada-ccf2af571218-config-data\") pod \"d7d18f09-a22c-4b80-bada-ccf2af571218\" (UID: \"d7d18f09-a22c-4b80-bada-ccf2af571218\") " Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.741400 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-64x56\" (UniqueName: \"kubernetes.io/projected/3d382de4-ddc6-4781-9815-76b74cbccadc-kube-api-access-64x56\") pod \"3d382de4-ddc6-4781-9815-76b74cbccadc\" (UID: \"3d382de4-ddc6-4781-9815-76b74cbccadc\") " Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.741434 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c970f7fd-c67b-4cda-9124-7fb1709c8d73-dns-svc\") pod \"c970f7fd-c67b-4cda-9124-7fb1709c8d73\" (UID: \"c970f7fd-c67b-4cda-9124-7fb1709c8d73\") " Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.741472 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7d18f09-a22c-4b80-bada-ccf2af571218-logs\") pod \"d7d18f09-a22c-4b80-bada-ccf2af571218\" (UID: \"d7d18f09-a22c-4b80-bada-ccf2af571218\") " Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.741498 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3680e6da-327f-4b27-b912-307ee92b8b3c-config-data\") pod \"3680e6da-327f-4b27-b912-307ee92b8b3c\" (UID: \"3680e6da-327f-4b27-b912-307ee92b8b3c\") " Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.741530 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c970f7fd-c67b-4cda-9124-7fb1709c8d73-ovsdbserver-sb\") pod \"c970f7fd-c67b-4cda-9124-7fb1709c8d73\" (UID: \"c970f7fd-c67b-4cda-9124-7fb1709c8d73\") " Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.741566 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28d5n\" (UniqueName: \"kubernetes.io/projected/3680e6da-327f-4b27-b912-307ee92b8b3c-kube-api-access-28d5n\") pod \"3680e6da-327f-4b27-b912-307ee92b8b3c\" (UID: \"3680e6da-327f-4b27-b912-307ee92b8b3c\") " Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.741596 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3680e6da-327f-4b27-b912-307ee92b8b3c-scripts\") pod \"3680e6da-327f-4b27-b912-307ee92b8b3c\" (UID: \"3680e6da-327f-4b27-b912-307ee92b8b3c\") " Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.741616 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d382de4-ddc6-4781-9815-76b74cbccadc-combined-ca-bundle\") pod \"3d382de4-ddc6-4781-9815-76b74cbccadc\" (UID: \"3d382de4-ddc6-4781-9815-76b74cbccadc\") " Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.741634 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c970f7fd-c67b-4cda-9124-7fb1709c8d73-ovsdbserver-nb\") pod \"c970f7fd-c67b-4cda-9124-7fb1709c8d73\" (UID: \"c970f7fd-c67b-4cda-9124-7fb1709c8d73\") " Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.741657 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d382de4-ddc6-4781-9815-76b74cbccadc-config-data\") pod \"3d382de4-ddc6-4781-9815-76b74cbccadc\" (UID: \"3d382de4-ddc6-4781-9815-76b74cbccadc\") " Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.741727 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/3d382de4-ddc6-4781-9815-76b74cbccadc-logs\") pod \"3d382de4-ddc6-4781-9815-76b74cbccadc\" (UID: \"3d382de4-ddc6-4781-9815-76b74cbccadc\") " Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.741755 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2rmc\" (UniqueName: \"kubernetes.io/projected/c970f7fd-c67b-4cda-9124-7fb1709c8d73-kube-api-access-p2rmc\") pod \"c970f7fd-c67b-4cda-9124-7fb1709c8d73\" (UID: \"c970f7fd-c67b-4cda-9124-7fb1709c8d73\") " Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.741777 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c970f7fd-c67b-4cda-9124-7fb1709c8d73-config\") pod \"c970f7fd-c67b-4cda-9124-7fb1709c8d73\" (UID: \"c970f7fd-c67b-4cda-9124-7fb1709c8d73\") " Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.741825 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3680e6da-327f-4b27-b912-307ee92b8b3c-combined-ca-bundle\") pod \"3680e6da-327f-4b27-b912-307ee92b8b3c\" (UID: \"3680e6da-327f-4b27-b912-307ee92b8b3c\") " Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.741852 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d382de4-ddc6-4781-9815-76b74cbccadc-scripts\") pod \"3d382de4-ddc6-4781-9815-76b74cbccadc\" (UID: \"3d382de4-ddc6-4781-9815-76b74cbccadc\") " Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.741870 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"d7d18f09-a22c-4b80-bada-ccf2af571218\" (UID: \"d7d18f09-a22c-4b80-bada-ccf2af571218\") " Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.741911 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3680e6da-327f-4b27-b912-307ee92b8b3c-httpd-run\") pod \"3680e6da-327f-4b27-b912-307ee92b8b3c\" (UID: \"3680e6da-327f-4b27-b912-307ee92b8b3c\") " Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.741937 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d7d18f09-a22c-4b80-bada-ccf2af571218-httpd-run\") pod \"d7d18f09-a22c-4b80-bada-ccf2af571218\" (UID: \"d7d18f09-a22c-4b80-bada-ccf2af571218\") " Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.742652 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7d18f09-a22c-4b80-bada-ccf2af571218-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d7d18f09-a22c-4b80-bada-ccf2af571218" (UID: "d7d18f09-a22c-4b80-bada-ccf2af571218"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.751423 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3680e6da-327f-4b27-b912-307ee92b8b3c-scripts" (OuterVolumeSpecName: "scripts") pod "3680e6da-327f-4b27-b912-307ee92b8b3c" (UID: "3680e6da-327f-4b27-b912-307ee92b8b3c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.752512 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7d18f09-a22c-4b80-bada-ccf2af571218-logs" (OuterVolumeSpecName: "logs") pod "d7d18f09-a22c-4b80-bada-ccf2af571218" (UID: "d7d18f09-a22c-4b80-bada-ccf2af571218"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.757617 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3680e6da-327f-4b27-b912-307ee92b8b3c-kube-api-access-28d5n" (OuterVolumeSpecName: "kube-api-access-28d5n") pod "3680e6da-327f-4b27-b912-307ee92b8b3c" (UID: "3680e6da-327f-4b27-b912-307ee92b8b3c"). InnerVolumeSpecName "kube-api-access-28d5n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.757650 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d382de4-ddc6-4781-9815-76b74cbccadc-logs" (OuterVolumeSpecName: "logs") pod "3d382de4-ddc6-4781-9815-76b74cbccadc" (UID: "3d382de4-ddc6-4781-9815-76b74cbccadc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.757904 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3680e6da-327f-4b27-b912-307ee92b8b3c-logs" (OuterVolumeSpecName: "logs") pod "3680e6da-327f-4b27-b912-307ee92b8b3c" (UID: "3680e6da-327f-4b27-b912-307ee92b8b3c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.758072 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3680e6da-327f-4b27-b912-307ee92b8b3c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3680e6da-327f-4b27-b912-307ee92b8b3c" (UID: "3680e6da-327f-4b27-b912-307ee92b8b3c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.758184 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d382de4-ddc6-4781-9815-76b74cbccadc-kube-api-access-64x56" (OuterVolumeSpecName: "kube-api-access-64x56") pod "3d382de4-ddc6-4781-9815-76b74cbccadc" (UID: "3d382de4-ddc6-4781-9815-76b74cbccadc"). InnerVolumeSpecName "kube-api-access-64x56". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.758528 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c970f7fd-c67b-4cda-9124-7fb1709c8d73-kube-api-access-p2rmc" (OuterVolumeSpecName: "kube-api-access-p2rmc") pod "c970f7fd-c67b-4cda-9124-7fb1709c8d73" (UID: "c970f7fd-c67b-4cda-9124-7fb1709c8d73"). InnerVolumeSpecName "kube-api-access-p2rmc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.759887 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7d18f09-a22c-4b80-bada-ccf2af571218-kube-api-access-sjkz8" (OuterVolumeSpecName: "kube-api-access-sjkz8") pod "d7d18f09-a22c-4b80-bada-ccf2af571218" (UID: "d7d18f09-a22c-4b80-bada-ccf2af571218"). InnerVolumeSpecName "kube-api-access-sjkz8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.760385 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "3680e6da-327f-4b27-b912-307ee92b8b3c" (UID: "3680e6da-327f-4b27-b912-307ee92b8b3c"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.766536 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7d18f09-a22c-4b80-bada-ccf2af571218-scripts" (OuterVolumeSpecName: "scripts") pod "d7d18f09-a22c-4b80-bada-ccf2af571218" (UID: "d7d18f09-a22c-4b80-bada-ccf2af571218"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.766539 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "d7d18f09-a22c-4b80-bada-ccf2af571218" (UID: "d7d18f09-a22c-4b80-bada-ccf2af571218"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.766551 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d382de4-ddc6-4781-9815-76b74cbccadc-scripts" (OuterVolumeSpecName: "scripts") pod "3d382de4-ddc6-4781-9815-76b74cbccadc" (UID: "3d382de4-ddc6-4781-9815-76b74cbccadc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.801919 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7d18f09-a22c-4b80-bada-ccf2af571218-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d7d18f09-a22c-4b80-bada-ccf2af571218" (UID: "d7d18f09-a22c-4b80-bada-ccf2af571218"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.806069 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d382de4-ddc6-4781-9815-76b74cbccadc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d382de4-ddc6-4781-9815-76b74cbccadc" (UID: "3d382de4-ddc6-4781-9815-76b74cbccadc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.815137 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7d18f09-a22c-4b80-bada-ccf2af571218-config-data" (OuterVolumeSpecName: "config-data") pod "d7d18f09-a22c-4b80-bada-ccf2af571218" (UID: "d7d18f09-a22c-4b80-bada-ccf2af571218"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.830528 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3680e6da-327f-4b27-b912-307ee92b8b3c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3680e6da-327f-4b27-b912-307ee92b8b3c" (UID: "3680e6da-327f-4b27-b912-307ee92b8b3c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.832172 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d382de4-ddc6-4781-9815-76b74cbccadc-config-data" (OuterVolumeSpecName: "config-data") pod "3d382de4-ddc6-4781-9815-76b74cbccadc" (UID: "3d382de4-ddc6-4781-9815-76b74cbccadc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.834850 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c970f7fd-c67b-4cda-9124-7fb1709c8d73-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c970f7fd-c67b-4cda-9124-7fb1709c8d73" (UID: "c970f7fd-c67b-4cda-9124-7fb1709c8d73"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.834943 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c970f7fd-c67b-4cda-9124-7fb1709c8d73-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c970f7fd-c67b-4cda-9124-7fb1709c8d73" (UID: "c970f7fd-c67b-4cda-9124-7fb1709c8d73"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.843906 5002 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d382de4-ddc6-4781-9815-76b74cbccadc-logs\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.843944 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2rmc\" (UniqueName: \"kubernetes.io/projected/c970f7fd-c67b-4cda-9124-7fb1709c8d73-kube-api-access-p2rmc\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.843961 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3680e6da-327f-4b27-b912-307ee92b8b3c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.843973 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d382de4-ddc6-4781-9815-76b74cbccadc-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.844008 5002 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.844024 5002 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3680e6da-327f-4b27-b912-307ee92b8b3c-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.844036 5002 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d7d18f09-a22c-4b80-bada-ccf2af571218-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.844046 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7d18f09-a22c-4b80-bada-ccf2af571218-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.844056 5002 reconciler_common.go:293] "Volume detached for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/3680e6da-327f-4b27-b912-307ee92b8b3c-logs\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.844082 5002 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.844095 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjkz8\" (UniqueName: \"kubernetes.io/projected/d7d18f09-a22c-4b80-bada-ccf2af571218-kube-api-access-sjkz8\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.844108 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7d18f09-a22c-4b80-bada-ccf2af571218-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.844120 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7d18f09-a22c-4b80-bada-ccf2af571218-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.844131 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64x56\" (UniqueName: \"kubernetes.io/projected/3d382de4-ddc6-4781-9815-76b74cbccadc-kube-api-access-64x56\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.844145 5002 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c970f7fd-c67b-4cda-9124-7fb1709c8d73-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.844158 5002 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7d18f09-a22c-4b80-bada-ccf2af571218-logs\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.844170 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28d5n\" (UniqueName: \"kubernetes.io/projected/3680e6da-327f-4b27-b912-307ee92b8b3c-kube-api-access-28d5n\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.844181 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3680e6da-327f-4b27-b912-307ee92b8b3c-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.844191 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d382de4-ddc6-4781-9815-76b74cbccadc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.844203 5002 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c970f7fd-c67b-4cda-9124-7fb1709c8d73-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.844215 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d382de4-ddc6-4781-9815-76b74cbccadc-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.844308 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c970f7fd-c67b-4cda-9124-7fb1709c8d73-config" (OuterVolumeSpecName: "config") pod 
"c970f7fd-c67b-4cda-9124-7fb1709c8d73" (UID: "c970f7fd-c67b-4cda-9124-7fb1709c8d73"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.855831 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c970f7fd-c67b-4cda-9124-7fb1709c8d73-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c970f7fd-c67b-4cda-9124-7fb1709c8d73" (UID: "c970f7fd-c67b-4cda-9124-7fb1709c8d73"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.856057 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3680e6da-327f-4b27-b912-307ee92b8b3c-config-data" (OuterVolumeSpecName: "config-data") pod "3680e6da-327f-4b27-b912-307ee92b8b3c" (UID: "3680e6da-327f-4b27-b912-307ee92b8b3c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.865308 5002 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.868207 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c970f7fd-c67b-4cda-9124-7fb1709c8d73-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c970f7fd-c67b-4cda-9124-7fb1709c8d73" (UID: "c970f7fd-c67b-4cda-9124-7fb1709c8d73"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.873042 5002 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.884882 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.945758 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75879941-b12b-4731-a78c-0a0a98142ec0-log-httpd\") pod \"75879941-b12b-4731-a78c-0a0a98142ec0\" (UID: \"75879941-b12b-4731-a78c-0a0a98142ec0\") " Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.945807 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75879941-b12b-4731-a78c-0a0a98142ec0-run-httpd\") pod \"75879941-b12b-4731-a78c-0a0a98142ec0\" (UID: \"75879941-b12b-4731-a78c-0a0a98142ec0\") " Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.945936 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75879941-b12b-4731-a78c-0a0a98142ec0-config-data\") pod \"75879941-b12b-4731-a78c-0a0a98142ec0\" (UID: \"75879941-b12b-4731-a78c-0a0a98142ec0\") " Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.945980 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75879941-b12b-4731-a78c-0a0a98142ec0-scripts\") pod \"75879941-b12b-4731-a78c-0a0a98142ec0\" (UID: \"75879941-b12b-4731-a78c-0a0a98142ec0\") " Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.946054 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75879941-b12b-4731-a78c-0a0a98142ec0-combined-ca-bundle\") pod \"75879941-b12b-4731-a78c-0a0a98142ec0\" (UID: \"75879941-b12b-4731-a78c-0a0a98142ec0\") " Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.946093 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/75879941-b12b-4731-a78c-0a0a98142ec0-sg-core-conf-yaml\") pod \"75879941-b12b-4731-a78c-0a0a98142ec0\" (UID: \"75879941-b12b-4731-a78c-0a0a98142ec0\") " Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.946119 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhwbz\" (UniqueName: \"kubernetes.io/projected/75879941-b12b-4731-a78c-0a0a98142ec0-kube-api-access-bhwbz\") pod \"75879941-b12b-4731-a78c-0a0a98142ec0\" (UID: \"75879941-b12b-4731-a78c-0a0a98142ec0\") " Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.946581 5002 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.946599 5002 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c970f7fd-c67b-4cda-9124-7fb1709c8d73-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.946612 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3680e6da-327f-4b27-b912-307ee92b8b3c-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.946622 5002 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c970f7fd-c67b-4cda-9124-7fb1709c8d73-ovsdbserver-sb\") on node \"crc\" DevicePath 
\"\"" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.946633 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c970f7fd-c67b-4cda-9124-7fb1709c8d73-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.946644 5002 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.947301 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75879941-b12b-4731-a78c-0a0a98142ec0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "75879941-b12b-4731-a78c-0a0a98142ec0" (UID: "75879941-b12b-4731-a78c-0a0a98142ec0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.947487 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75879941-b12b-4731-a78c-0a0a98142ec0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "75879941-b12b-4731-a78c-0a0a98142ec0" (UID: "75879941-b12b-4731-a78c-0a0a98142ec0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.949371 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75879941-b12b-4731-a78c-0a0a98142ec0-config-data" (OuterVolumeSpecName: "config-data") pod "75879941-b12b-4731-a78c-0a0a98142ec0" (UID: "75879941-b12b-4731-a78c-0a0a98142ec0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.949440 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75879941-b12b-4731-a78c-0a0a98142ec0-kube-api-access-bhwbz" (OuterVolumeSpecName: "kube-api-access-bhwbz") pod "75879941-b12b-4731-a78c-0a0a98142ec0" (UID: "75879941-b12b-4731-a78c-0a0a98142ec0"). InnerVolumeSpecName "kube-api-access-bhwbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.949678 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75879941-b12b-4731-a78c-0a0a98142ec0-scripts" (OuterVolumeSpecName: "scripts") pod "75879941-b12b-4731-a78c-0a0a98142ec0" (UID: "75879941-b12b-4731-a78c-0a0a98142ec0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.949985 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75879941-b12b-4731-a78c-0a0a98142ec0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "75879941-b12b-4731-a78c-0a0a98142ec0" (UID: "75879941-b12b-4731-a78c-0a0a98142ec0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:22:04 crc kubenswrapper[5002]: I1209 10:22:04.966914 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75879941-b12b-4731-a78c-0a0a98142ec0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "75879941-b12b-4731-a78c-0a0a98142ec0" (UID: "75879941-b12b-4731-a78c-0a0a98142ec0"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.048174 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75879941-b12b-4731-a78c-0a0a98142ec0-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.048208 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75879941-b12b-4731-a78c-0a0a98142ec0-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.048220 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75879941-b12b-4731-a78c-0a0a98142ec0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.048234 5002 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/75879941-b12b-4731-a78c-0a0a98142ec0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.048244 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhwbz\" (UniqueName: \"kubernetes.io/projected/75879941-b12b-4731-a78c-0a0a98142ec0-kube-api-access-bhwbz\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.048252 5002 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75879941-b12b-4731-a78c-0a0a98142ec0-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.048259 5002 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75879941-b12b-4731-a78c-0a0a98142ec0-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.119197 5002 generic.go:334] "Generic (PLEG): container finished" podID="75879941-b12b-4731-a78c-0a0a98142ec0" containerID="e8e7f162ad7d3f433f73ef45ae73730565d0f4530dc8afec383b2535046455a7" exitCode=2 Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.119318 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"75879941-b12b-4731-a78c-0a0a98142ec0","Type":"ContainerDied","Data":"e8e7f162ad7d3f433f73ef45ae73730565d0f4530dc8afec383b2535046455a7"} Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.119367 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"75879941-b12b-4731-a78c-0a0a98142ec0","Type":"ContainerDied","Data":"486c0ca39358d11caa0e68e7d6cd892da2db57788aa03abb8378ffde555f2603"} Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.119404 5002 scope.go:117] "RemoveContainer" containerID="e8e7f162ad7d3f433f73ef45ae73730565d0f4530dc8afec383b2535046455a7" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.119630 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.127742 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-h7bpf" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.127751 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-h7bpf" event={"ID":"3d382de4-ddc6-4781-9815-76b74cbccadc","Type":"ContainerDied","Data":"232e87488380ff257ae17fd57d7b581238edbae3601538af844b23a4915dc98d"} Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.127806 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="232e87488380ff257ae17fd57d7b581238edbae3601538af844b23a4915dc98d" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.134524 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.136116 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3680e6da-327f-4b27-b912-307ee92b8b3c","Type":"ContainerDied","Data":"c85127b1a63299183a99026f7dc1c9ae06d3713f7bb1c5d4c8dc8f3065cf5a72"} Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.139223 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-f7nbc" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.139211 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-f7nbc" event={"ID":"c970f7fd-c67b-4cda-9124-7fb1709c8d73","Type":"ContainerDied","Data":"1c889b2782a0a8e98e182f9263a87ed7328a8e3f02a7deaa0fd74e86f56ee667"} Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.142184 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d7d18f09-a22c-4b80-bada-ccf2af571218","Type":"ContainerDied","Data":"5150fe2770c3343f2bce77e39927565d446a081c8f210fe4c24951fa04891b5a"} Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.142289 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.171044 5002 scope.go:117] "RemoveContainer" containerID="e8e7f162ad7d3f433f73ef45ae73730565d0f4530dc8afec383b2535046455a7" Dec 09 10:22:05 crc kubenswrapper[5002]: E1209 10:22:05.171574 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8e7f162ad7d3f433f73ef45ae73730565d0f4530dc8afec383b2535046455a7\": container with ID starting with e8e7f162ad7d3f433f73ef45ae73730565d0f4530dc8afec383b2535046455a7 not found: ID does not exist" containerID="e8e7f162ad7d3f433f73ef45ae73730565d0f4530dc8afec383b2535046455a7" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.171618 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8e7f162ad7d3f433f73ef45ae73730565d0f4530dc8afec383b2535046455a7"} err="failed to get container status \"e8e7f162ad7d3f433f73ef45ae73730565d0f4530dc8afec383b2535046455a7\": rpc error: code = NotFound desc = could not find container \"e8e7f162ad7d3f433f73ef45ae73730565d0f4530dc8afec383b2535046455a7\": container with ID starting with e8e7f162ad7d3f433f73ef45ae73730565d0f4530dc8afec383b2535046455a7 not found: ID does not exist" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.171648 5002 scope.go:117] "RemoveContainer" containerID="8bd12516058a654ec2af00a29d00a8d21b1037635dc36ad3d93e623fbf95b1a3" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.232744 5002 scope.go:117] "RemoveContainer" containerID="343bf0f00e7d5ff02daa6ab4c861dfa5f5d856789d91797a06ba2c289542d5a4" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.248716 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.256413 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.264065 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.271128 5002 scope.go:117] "RemoveContainer" containerID="3b3e416e7518a7d77f215c8aaaec78e67b328f6a7988e8127692752dfbd2bd6c" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.285516 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.285587 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 09 10:22:05 crc kubenswrapper[5002]: E1209 10:22:05.285863 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3680e6da-327f-4b27-b912-307ee92b8b3c" containerName="glance-log" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.285878 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="3680e6da-327f-4b27-b912-307ee92b8b3c" containerName="glance-log" Dec 09 10:22:05 crc kubenswrapper[5002]: E1209 10:22:05.285896 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c970f7fd-c67b-4cda-9124-7fb1709c8d73" containerName="dnsmasq-dns" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.285903 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="c970f7fd-c67b-4cda-9124-7fb1709c8d73" containerName="dnsmasq-dns" Dec 09 10:22:05 crc kubenswrapper[5002]: E1209 10:22:05.285915 5002 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3680e6da-327f-4b27-b912-307ee92b8b3c" containerName="glance-httpd" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.285921 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="3680e6da-327f-4b27-b912-307ee92b8b3c" containerName="glance-httpd" Dec 09 10:22:05 crc kubenswrapper[5002]: E1209 10:22:05.285931 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c970f7fd-c67b-4cda-9124-7fb1709c8d73" containerName="init" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.285936 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="c970f7fd-c67b-4cda-9124-7fb1709c8d73" containerName="init" Dec 09 10:22:05 crc kubenswrapper[5002]: E1209 10:22:05.285946 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75879941-b12b-4731-a78c-0a0a98142ec0" containerName="sg-core" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.285952 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="75879941-b12b-4731-a78c-0a0a98142ec0" containerName="sg-core" Dec 09 10:22:05 crc kubenswrapper[5002]: E1209 10:22:05.285965 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7d18f09-a22c-4b80-bada-ccf2af571218" containerName="glance-log" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.285971 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7d18f09-a22c-4b80-bada-ccf2af571218" containerName="glance-log" Dec 09 10:22:05 crc kubenswrapper[5002]: E1209 10:22:05.285981 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7d18f09-a22c-4b80-bada-ccf2af571218" containerName="glance-httpd" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.285987 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7d18f09-a22c-4b80-bada-ccf2af571218" containerName="glance-httpd" Dec 09 10:22:05 crc kubenswrapper[5002]: E1209 10:22:05.285998 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d382de4-ddc6-4781-9815-76b74cbccadc" containerName="placement-db-sync" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.286003 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d382de4-ddc6-4781-9815-76b74cbccadc" containerName="placement-db-sync" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.286144 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d382de4-ddc6-4781-9815-76b74cbccadc" containerName="placement-db-sync" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.286158 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7d18f09-a22c-4b80-bada-ccf2af571218" containerName="glance-log" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.286168 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="3680e6da-327f-4b27-b912-307ee92b8b3c" containerName="glance-log" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.286179 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="c970f7fd-c67b-4cda-9124-7fb1709c8d73" containerName="dnsmasq-dns" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.286189 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="75879941-b12b-4731-a78c-0a0a98142ec0" containerName="sg-core" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.286195 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7d18f09-a22c-4b80-bada-ccf2af571218" containerName="glance-httpd" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.286206 5002 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3680e6da-327f-4b27-b912-307ee92b8b3c" containerName="glance-httpd" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.287555 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.287637 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.299731 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.300112 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.301412 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.309383 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.316163 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.317431 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.318021 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-55wxd" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.318353 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.335115 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-f7nbc"] Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.335317 5002 scope.go:117] "RemoveContainer" containerID="dadfd246b3ac2b0292fc0636f26839560ad077ac4fccca19414d5ff12e33abec" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.354735 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-f7nbc"] Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.356923 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/291d30af-cc1c-4fa3-9496-9695e50f0c6d-scripts\") pod \"ceilometer-0\" (UID: \"291d30af-cc1c-4fa3-9496-9695e50f0c6d\") " pod="openstack/ceilometer-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.356979 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/291d30af-cc1c-4fa3-9496-9695e50f0c6d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"291d30af-cc1c-4fa3-9496-9695e50f0c6d\") " pod="openstack/ceilometer-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.357100 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"c75244ea-a44f-4b49-b1dc-05a025bda463\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.357425 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/291d30af-cc1c-4fa3-9496-9695e50f0c6d-run-httpd\") pod \"ceilometer-0\" (UID: \"291d30af-cc1c-4fa3-9496-9695e50f0c6d\") " pod="openstack/ceilometer-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.357461 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/291d30af-cc1c-4fa3-9496-9695e50f0c6d-log-httpd\") pod \"ceilometer-0\" (UID: \"291d30af-cc1c-4fa3-9496-9695e50f0c6d\") " pod="openstack/ceilometer-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.357513 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4q954\" (UniqueName: \"kubernetes.io/projected/c75244ea-a44f-4b49-b1dc-05a025bda463-kube-api-access-4q954\") pod \"glance-default-internal-api-0\" (UID: \"c75244ea-a44f-4b49-b1dc-05a025bda463\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.357545 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvsrm\" (UniqueName: \"kubernetes.io/projected/291d30af-cc1c-4fa3-9496-9695e50f0c6d-kube-api-access-mvsrm\") pod \"ceilometer-0\" (UID: \"291d30af-cc1c-4fa3-9496-9695e50f0c6d\") " pod="openstack/ceilometer-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.357605 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/291d30af-cc1c-4fa3-9496-9695e50f0c6d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"291d30af-cc1c-4fa3-9496-9695e50f0c6d\") " pod="openstack/ceilometer-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.357688 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c75244ea-a44f-4b49-b1dc-05a025bda463-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c75244ea-a44f-4b49-b1dc-05a025bda463\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.357745 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/291d30af-cc1c-4fa3-9496-9695e50f0c6d-config-data\") pod \"ceilometer-0\" (UID: \"291d30af-cc1c-4fa3-9496-9695e50f0c6d\") " pod="openstack/ceilometer-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.357789 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c75244ea-a44f-4b49-b1dc-05a025bda463-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c75244ea-a44f-4b49-b1dc-05a025bda463\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.357835 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c75244ea-a44f-4b49-b1dc-05a025bda463-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c75244ea-a44f-4b49-b1dc-05a025bda463\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.357989 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c75244ea-a44f-4b49-b1dc-05a025bda463-logs\") pod \"glance-default-internal-api-0\" (UID: \"c75244ea-a44f-4b49-b1dc-05a025bda463\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.358109 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c75244ea-a44f-4b49-b1dc-05a025bda463-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c75244ea-a44f-4b49-b1dc-05a025bda463\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.358147 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c75244ea-a44f-4b49-b1dc-05a025bda463-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c75244ea-a44f-4b49-b1dc-05a025bda463\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.373405 5002 scope.go:117] "RemoveContainer" containerID="a292317375c9fce2a951dd64047a1cb400be86eeec2cd87b898d31b7f28e226c" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.378265 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.396312 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.404341 5002 scope.go:117] "RemoveContainer" containerID="ccfe910742e430d2244c08e5d19f308204730fa199e05b3ffafb774142544899" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.406507 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.417097 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.418736 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.420752 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.421401 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.438781 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.461358 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4q954\" (UniqueName: \"kubernetes.io/projected/c75244ea-a44f-4b49-b1dc-05a025bda463-kube-api-access-4q954\") pod \"glance-default-internal-api-0\" (UID: \"c75244ea-a44f-4b49-b1dc-05a025bda463\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.461407 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvsrm\" (UniqueName: \"kubernetes.io/projected/291d30af-cc1c-4fa3-9496-9695e50f0c6d-kube-api-access-mvsrm\") pod \"ceilometer-0\" (UID: \"291d30af-cc1c-4fa3-9496-9695e50f0c6d\") " pod="openstack/ceilometer-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.461438 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/291d30af-cc1c-4fa3-9496-9695e50f0c6d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"291d30af-cc1c-4fa3-9496-9695e50f0c6d\") " pod="openstack/ceilometer-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.461461 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"80f62273-a2cb-48fc-9dc4-e3bbe09bb517\") " pod="openstack/glance-default-external-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.461480 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80f62273-a2cb-48fc-9dc4-e3bbe09bb517-config-data\") pod \"glance-default-external-api-0\" (UID: \"80f62273-a2cb-48fc-9dc4-e3bbe09bb517\") " pod="openstack/glance-default-external-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.461512 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c75244ea-a44f-4b49-b1dc-05a025bda463-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c75244ea-a44f-4b49-b1dc-05a025bda463\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.461531 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/80f62273-a2cb-48fc-9dc4-e3bbe09bb517-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"80f62273-a2cb-48fc-9dc4-e3bbe09bb517\") " pod="openstack/glance-default-external-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.461556 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/291d30af-cc1c-4fa3-9496-9695e50f0c6d-config-data\") pod \"ceilometer-0\" (UID: \"291d30af-cc1c-4fa3-9496-9695e50f0c6d\") " pod="openstack/ceilometer-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.461579 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c75244ea-a44f-4b49-b1dc-05a025bda463-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c75244ea-a44f-4b49-b1dc-05a025bda463\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.461598 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c75244ea-a44f-4b49-b1dc-05a025bda463-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c75244ea-a44f-4b49-b1dc-05a025bda463\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.461640 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f62273-a2cb-48fc-9dc4-e3bbe09bb517-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"80f62273-a2cb-48fc-9dc4-e3bbe09bb517\") " pod="openstack/glance-default-external-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.461660 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c75244ea-a44f-4b49-b1dc-05a025bda463-logs\") pod \"glance-default-internal-api-0\" (UID: \"c75244ea-a44f-4b49-b1dc-05a025bda463\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.461690 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80f62273-a2cb-48fc-9dc4-e3bbe09bb517-scripts\") pod \"glance-default-external-api-0\" (UID: \"80f62273-a2cb-48fc-9dc4-e3bbe09bb517\") " pod="openstack/glance-default-external-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.461705 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96lf6\" (UniqueName: \"kubernetes.io/projected/80f62273-a2cb-48fc-9dc4-e3bbe09bb517-kube-api-access-96lf6\") pod \"glance-default-external-api-0\" (UID: \"80f62273-a2cb-48fc-9dc4-e3bbe09bb517\") " pod="openstack/glance-default-external-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.461726 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80f62273-a2cb-48fc-9dc4-e3bbe09bb517-logs\") pod \"glance-default-external-api-0\" (UID: \"80f62273-a2cb-48fc-9dc4-e3bbe09bb517\") " pod="openstack/glance-default-external-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.461748 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c75244ea-a44f-4b49-b1dc-05a025bda463-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c75244ea-a44f-4b49-b1dc-05a025bda463\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.461764 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/80f62273-a2cb-48fc-9dc4-e3bbe09bb517-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"80f62273-a2cb-48fc-9dc4-e3bbe09bb517\") " pod="openstack/glance-default-external-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.461779 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c75244ea-a44f-4b49-b1dc-05a025bda463-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c75244ea-a44f-4b49-b1dc-05a025bda463\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.461795 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/291d30af-cc1c-4fa3-9496-9695e50f0c6d-scripts\") pod \"ceilometer-0\" (UID: \"291d30af-cc1c-4fa3-9496-9695e50f0c6d\") " pod="openstack/ceilometer-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.461873 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/291d30af-cc1c-4fa3-9496-9695e50f0c6d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"291d30af-cc1c-4fa3-9496-9695e50f0c6d\") " pod="openstack/ceilometer-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.461902 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"c75244ea-a44f-4b49-b1dc-05a025bda463\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.461919 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/291d30af-cc1c-4fa3-9496-9695e50f0c6d-run-httpd\") pod \"ceilometer-0\" (UID: \"291d30af-cc1c-4fa3-9496-9695e50f0c6d\") " pod="openstack/ceilometer-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.461936 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/291d30af-cc1c-4fa3-9496-9695e50f0c6d-log-httpd\") pod \"ceilometer-0\" (UID: \"291d30af-cc1c-4fa3-9496-9695e50f0c6d\") " pod="openstack/ceilometer-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.463035 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/291d30af-cc1c-4fa3-9496-9695e50f0c6d-log-httpd\") pod \"ceilometer-0\" (UID: \"291d30af-cc1c-4fa3-9496-9695e50f0c6d\") " pod="openstack/ceilometer-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.463165 5002 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"c75244ea-a44f-4b49-b1dc-05a025bda463\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.463441 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/291d30af-cc1c-4fa3-9496-9695e50f0c6d-run-httpd\") pod \"ceilometer-0\" (UID: \"291d30af-cc1c-4fa3-9496-9695e50f0c6d\") " pod="openstack/ceilometer-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.466630 5002 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c75244ea-a44f-4b49-b1dc-05a025bda463-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c75244ea-a44f-4b49-b1dc-05a025bda463\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.467182 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/291d30af-cc1c-4fa3-9496-9695e50f0c6d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"291d30af-cc1c-4fa3-9496-9695e50f0c6d\") " pod="openstack/ceilometer-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.467296 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c75244ea-a44f-4b49-b1dc-05a025bda463-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c75244ea-a44f-4b49-b1dc-05a025bda463\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.467665 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c75244ea-a44f-4b49-b1dc-05a025bda463-logs\") pod \"glance-default-internal-api-0\" (UID: \"c75244ea-a44f-4b49-b1dc-05a025bda463\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.467928 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/291d30af-cc1c-4fa3-9496-9695e50f0c6d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"291d30af-cc1c-4fa3-9496-9695e50f0c6d\") " pod="openstack/ceilometer-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.469486 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c75244ea-a44f-4b49-b1dc-05a025bda463-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c75244ea-a44f-4b49-b1dc-05a025bda463\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.469875 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c75244ea-a44f-4b49-b1dc-05a025bda463-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c75244ea-a44f-4b49-b1dc-05a025bda463\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.470406 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/291d30af-cc1c-4fa3-9496-9695e50f0c6d-config-data\") pod \"ceilometer-0\" (UID: \"291d30af-cc1c-4fa3-9496-9695e50f0c6d\") " pod="openstack/ceilometer-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.470472 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c75244ea-a44f-4b49-b1dc-05a025bda463-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c75244ea-a44f-4b49-b1dc-05a025bda463\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.470867 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/291d30af-cc1c-4fa3-9496-9695e50f0c6d-scripts\") pod \"ceilometer-0\" (UID: \"291d30af-cc1c-4fa3-9496-9695e50f0c6d\") " pod="openstack/ceilometer-0" Dec 09 10:22:05 crc 
kubenswrapper[5002]: I1209 10:22:05.477760 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4q954\" (UniqueName: \"kubernetes.io/projected/c75244ea-a44f-4b49-b1dc-05a025bda463-kube-api-access-4q954\") pod \"glance-default-internal-api-0\" (UID: \"c75244ea-a44f-4b49-b1dc-05a025bda463\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.484744 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvsrm\" (UniqueName: \"kubernetes.io/projected/291d30af-cc1c-4fa3-9496-9695e50f0c6d-kube-api-access-mvsrm\") pod \"ceilometer-0\" (UID: \"291d30af-cc1c-4fa3-9496-9695e50f0c6d\") " pod="openstack/ceilometer-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.491225 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"c75244ea-a44f-4b49-b1dc-05a025bda463\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.563348 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"80f62273-a2cb-48fc-9dc4-e3bbe09bb517\") " pod="openstack/glance-default-external-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.563585 5002 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"80f62273-a2cb-48fc-9dc4-e3bbe09bb517\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.563856 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80f62273-a2cb-48fc-9dc4-e3bbe09bb517-config-data\") pod \"glance-default-external-api-0\" (UID: \"80f62273-a2cb-48fc-9dc4-e3bbe09bb517\") " pod="openstack/glance-default-external-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.563942 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/80f62273-a2cb-48fc-9dc4-e3bbe09bb517-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"80f62273-a2cb-48fc-9dc4-e3bbe09bb517\") " pod="openstack/glance-default-external-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.564002 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f62273-a2cb-48fc-9dc4-e3bbe09bb517-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"80f62273-a2cb-48fc-9dc4-e3bbe09bb517\") " pod="openstack/glance-default-external-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.564049 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80f62273-a2cb-48fc-9dc4-e3bbe09bb517-scripts\") pod \"glance-default-external-api-0\" (UID: \"80f62273-a2cb-48fc-9dc4-e3bbe09bb517\") " pod="openstack/glance-default-external-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.564069 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-96lf6\" (UniqueName: \"kubernetes.io/projected/80f62273-a2cb-48fc-9dc4-e3bbe09bb517-kube-api-access-96lf6\") pod \"glance-default-external-api-0\" (UID: \"80f62273-a2cb-48fc-9dc4-e3bbe09bb517\") " pod="openstack/glance-default-external-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.564093 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80f62273-a2cb-48fc-9dc4-e3bbe09bb517-logs\") pod \"glance-default-external-api-0\" (UID: \"80f62273-a2cb-48fc-9dc4-e3bbe09bb517\") " pod="openstack/glance-default-external-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.564122 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/80f62273-a2cb-48fc-9dc4-e3bbe09bb517-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"80f62273-a2cb-48fc-9dc4-e3bbe09bb517\") " pod="openstack/glance-default-external-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.564655 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/80f62273-a2cb-48fc-9dc4-e3bbe09bb517-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"80f62273-a2cb-48fc-9dc4-e3bbe09bb517\") " pod="openstack/glance-default-external-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.566700 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80f62273-a2cb-48fc-9dc4-e3bbe09bb517-logs\") pod \"glance-default-external-api-0\" (UID: \"80f62273-a2cb-48fc-9dc4-e3bbe09bb517\") " pod="openstack/glance-default-external-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.569655 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/80f62273-a2cb-48fc-9dc4-e3bbe09bb517-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"80f62273-a2cb-48fc-9dc4-e3bbe09bb517\") " pod="openstack/glance-default-external-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.569943 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80f62273-a2cb-48fc-9dc4-e3bbe09bb517-config-data\") pod \"glance-default-external-api-0\" (UID: \"80f62273-a2cb-48fc-9dc4-e3bbe09bb517\") " pod="openstack/glance-default-external-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.570852 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f62273-a2cb-48fc-9dc4-e3bbe09bb517-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"80f62273-a2cb-48fc-9dc4-e3bbe09bb517\") " pod="openstack/glance-default-external-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.576328 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80f62273-a2cb-48fc-9dc4-e3bbe09bb517-scripts\") pod \"glance-default-external-api-0\" (UID: \"80f62273-a2cb-48fc-9dc4-e3bbe09bb517\") " pod="openstack/glance-default-external-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.581612 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96lf6\" (UniqueName: \"kubernetes.io/projected/80f62273-a2cb-48fc-9dc4-e3bbe09bb517-kube-api-access-96lf6\") pod 
\"glance-default-external-api-0\" (UID: \"80f62273-a2cb-48fc-9dc4-e3bbe09bb517\") " pod="openstack/glance-default-external-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.600207 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"80f62273-a2cb-48fc-9dc4-e3bbe09bb517\") " pod="openstack/glance-default-external-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.648996 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.669329 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.736190 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.874268 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-56f74754d8-5pd9q"] Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.876412 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-56f74754d8-5pd9q" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.879763 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.879835 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.880137 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-f557k" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.880264 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.884275 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.896777 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-56f74754d8-5pd9q"] Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.970265 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0172d8ed-9ef1-4aac-b246-1b1ed0df87fc-public-tls-certs\") pod \"placement-56f74754d8-5pd9q\" (UID: \"0172d8ed-9ef1-4aac-b246-1b1ed0df87fc\") " pod="openstack/placement-56f74754d8-5pd9q" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.970355 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0172d8ed-9ef1-4aac-b246-1b1ed0df87fc-config-data\") pod \"placement-56f74754d8-5pd9q\" (UID: \"0172d8ed-9ef1-4aac-b246-1b1ed0df87fc\") " pod="openstack/placement-56f74754d8-5pd9q" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.970393 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0172d8ed-9ef1-4aac-b246-1b1ed0df87fc-logs\") pod \"placement-56f74754d8-5pd9q\" (UID: \"0172d8ed-9ef1-4aac-b246-1b1ed0df87fc\") " 
pod="openstack/placement-56f74754d8-5pd9q" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.970409 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lln2n\" (UniqueName: \"kubernetes.io/projected/0172d8ed-9ef1-4aac-b246-1b1ed0df87fc-kube-api-access-lln2n\") pod \"placement-56f74754d8-5pd9q\" (UID: \"0172d8ed-9ef1-4aac-b246-1b1ed0df87fc\") " pod="openstack/placement-56f74754d8-5pd9q" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.970454 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0172d8ed-9ef1-4aac-b246-1b1ed0df87fc-internal-tls-certs\") pod \"placement-56f74754d8-5pd9q\" (UID: \"0172d8ed-9ef1-4aac-b246-1b1ed0df87fc\") " pod="openstack/placement-56f74754d8-5pd9q" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.970482 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0172d8ed-9ef1-4aac-b246-1b1ed0df87fc-scripts\") pod \"placement-56f74754d8-5pd9q\" (UID: \"0172d8ed-9ef1-4aac-b246-1b1ed0df87fc\") " pod="openstack/placement-56f74754d8-5pd9q" Dec 09 10:22:05 crc kubenswrapper[5002]: I1209 10:22:05.970533 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0172d8ed-9ef1-4aac-b246-1b1ed0df87fc-combined-ca-bundle\") pod \"placement-56f74754d8-5pd9q\" (UID: \"0172d8ed-9ef1-4aac-b246-1b1ed0df87fc\") " pod="openstack/placement-56f74754d8-5pd9q" Dec 09 10:22:06 crc kubenswrapper[5002]: I1209 10:22:06.071885 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0172d8ed-9ef1-4aac-b246-1b1ed0df87fc-config-data\") pod \"placement-56f74754d8-5pd9q\" (UID: \"0172d8ed-9ef1-4aac-b246-1b1ed0df87fc\") " pod="openstack/placement-56f74754d8-5pd9q" Dec 09 10:22:06 crc kubenswrapper[5002]: I1209 10:22:06.071951 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0172d8ed-9ef1-4aac-b246-1b1ed0df87fc-logs\") pod \"placement-56f74754d8-5pd9q\" (UID: \"0172d8ed-9ef1-4aac-b246-1b1ed0df87fc\") " pod="openstack/placement-56f74754d8-5pd9q" Dec 09 10:22:06 crc kubenswrapper[5002]: I1209 10:22:06.071977 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lln2n\" (UniqueName: \"kubernetes.io/projected/0172d8ed-9ef1-4aac-b246-1b1ed0df87fc-kube-api-access-lln2n\") pod \"placement-56f74754d8-5pd9q\" (UID: \"0172d8ed-9ef1-4aac-b246-1b1ed0df87fc\") " pod="openstack/placement-56f74754d8-5pd9q" Dec 09 10:22:06 crc kubenswrapper[5002]: I1209 10:22:06.072041 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0172d8ed-9ef1-4aac-b246-1b1ed0df87fc-internal-tls-certs\") pod \"placement-56f74754d8-5pd9q\" (UID: \"0172d8ed-9ef1-4aac-b246-1b1ed0df87fc\") " pod="openstack/placement-56f74754d8-5pd9q" Dec 09 10:22:06 crc kubenswrapper[5002]: I1209 10:22:06.072076 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0172d8ed-9ef1-4aac-b246-1b1ed0df87fc-scripts\") pod \"placement-56f74754d8-5pd9q\" (UID: \"0172d8ed-9ef1-4aac-b246-1b1ed0df87fc\") " 
pod="openstack/placement-56f74754d8-5pd9q" Dec 09 10:22:06 crc kubenswrapper[5002]: I1209 10:22:06.072129 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0172d8ed-9ef1-4aac-b246-1b1ed0df87fc-combined-ca-bundle\") pod \"placement-56f74754d8-5pd9q\" (UID: \"0172d8ed-9ef1-4aac-b246-1b1ed0df87fc\") " pod="openstack/placement-56f74754d8-5pd9q" Dec 09 10:22:06 crc kubenswrapper[5002]: I1209 10:22:06.072154 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0172d8ed-9ef1-4aac-b246-1b1ed0df87fc-public-tls-certs\") pod \"placement-56f74754d8-5pd9q\" (UID: \"0172d8ed-9ef1-4aac-b246-1b1ed0df87fc\") " pod="openstack/placement-56f74754d8-5pd9q" Dec 09 10:22:06 crc kubenswrapper[5002]: I1209 10:22:06.073233 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0172d8ed-9ef1-4aac-b246-1b1ed0df87fc-logs\") pod \"placement-56f74754d8-5pd9q\" (UID: \"0172d8ed-9ef1-4aac-b246-1b1ed0df87fc\") " pod="openstack/placement-56f74754d8-5pd9q" Dec 09 10:22:06 crc kubenswrapper[5002]: I1209 10:22:06.074940 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3680e6da-327f-4b27-b912-307ee92b8b3c" path="/var/lib/kubelet/pods/3680e6da-327f-4b27-b912-307ee92b8b3c/volumes" Dec 09 10:22:06 crc kubenswrapper[5002]: I1209 10:22:06.075999 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75879941-b12b-4731-a78c-0a0a98142ec0" path="/var/lib/kubelet/pods/75879941-b12b-4731-a78c-0a0a98142ec0/volumes" Dec 09 10:22:06 crc kubenswrapper[5002]: I1209 10:22:06.076763 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0172d8ed-9ef1-4aac-b246-1b1ed0df87fc-internal-tls-certs\") pod \"placement-56f74754d8-5pd9q\" (UID: \"0172d8ed-9ef1-4aac-b246-1b1ed0df87fc\") " pod="openstack/placement-56f74754d8-5pd9q" Dec 09 10:22:06 crc kubenswrapper[5002]: I1209 10:22:06.077471 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0172d8ed-9ef1-4aac-b246-1b1ed0df87fc-config-data\") pod \"placement-56f74754d8-5pd9q\" (UID: \"0172d8ed-9ef1-4aac-b246-1b1ed0df87fc\") " pod="openstack/placement-56f74754d8-5pd9q" Dec 09 10:22:06 crc kubenswrapper[5002]: I1209 10:22:06.077674 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0172d8ed-9ef1-4aac-b246-1b1ed0df87fc-combined-ca-bundle\") pod \"placement-56f74754d8-5pd9q\" (UID: \"0172d8ed-9ef1-4aac-b246-1b1ed0df87fc\") " pod="openstack/placement-56f74754d8-5pd9q" Dec 09 10:22:06 crc kubenswrapper[5002]: I1209 10:22:06.078169 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c970f7fd-c67b-4cda-9124-7fb1709c8d73" path="/var/lib/kubelet/pods/c970f7fd-c67b-4cda-9124-7fb1709c8d73/volumes" Dec 09 10:22:06 crc kubenswrapper[5002]: I1209 10:22:06.079557 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7d18f09-a22c-4b80-bada-ccf2af571218" path="/var/lib/kubelet/pods/d7d18f09-a22c-4b80-bada-ccf2af571218/volumes" Dec 09 10:22:06 crc kubenswrapper[5002]: I1209 10:22:06.080836 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0172d8ed-9ef1-4aac-b246-1b1ed0df87fc-public-tls-certs\") pod \"placement-56f74754d8-5pd9q\" (UID: \"0172d8ed-9ef1-4aac-b246-1b1ed0df87fc\") " pod="openstack/placement-56f74754d8-5pd9q" Dec 09 10:22:06 crc kubenswrapper[5002]: I1209 10:22:06.082264 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0172d8ed-9ef1-4aac-b246-1b1ed0df87fc-scripts\") pod \"placement-56f74754d8-5pd9q\" (UID: \"0172d8ed-9ef1-4aac-b246-1b1ed0df87fc\") " pod="openstack/placement-56f74754d8-5pd9q" Dec 09 10:22:06 crc kubenswrapper[5002]: I1209 10:22:06.091704 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lln2n\" (UniqueName: \"kubernetes.io/projected/0172d8ed-9ef1-4aac-b246-1b1ed0df87fc-kube-api-access-lln2n\") pod \"placement-56f74754d8-5pd9q\" (UID: \"0172d8ed-9ef1-4aac-b246-1b1ed0df87fc\") " pod="openstack/placement-56f74754d8-5pd9q" Dec 09 10:22:06 crc kubenswrapper[5002]: I1209 10:22:06.220513 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-56f74754d8-5pd9q" Dec 09 10:22:06 crc kubenswrapper[5002]: I1209 10:22:06.380267 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 10:22:06 crc kubenswrapper[5002]: I1209 10:22:06.390726 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 09 10:22:06 crc kubenswrapper[5002]: I1209 10:22:06.482313 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 10:22:06 crc kubenswrapper[5002]: W1209 10:22:06.485482 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80f62273_a2cb_48fc_9dc4_e3bbe09bb517.slice/crio-35b3b18d0464130b0caf24ef40d7ddaa70838f5a6d56273b6211e485d18b1c4e WatchSource:0}: Error finding container 35b3b18d0464130b0caf24ef40d7ddaa70838f5a6d56273b6211e485d18b1c4e: Status 404 returned error can't find the container with id 35b3b18d0464130b0caf24ef40d7ddaa70838f5a6d56273b6211e485d18b1c4e Dec 09 10:22:06 crc kubenswrapper[5002]: W1209 10:22:06.683253 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0172d8ed_9ef1_4aac_b246_1b1ed0df87fc.slice/crio-aea051762e75525f31b492676d261ace96e48ef4c3f11cdd6c08c119ff71d01a WatchSource:0}: Error finding container aea051762e75525f31b492676d261ace96e48ef4c3f11cdd6c08c119ff71d01a: Status 404 returned error can't find the container with id aea051762e75525f31b492676d261ace96e48ef4c3f11cdd6c08c119ff71d01a Dec 09 10:22:06 crc kubenswrapper[5002]: I1209 10:22:06.691807 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-56f74754d8-5pd9q"] Dec 09 10:22:07 crc kubenswrapper[5002]: I1209 10:22:07.172168 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"80f62273-a2cb-48fc-9dc4-e3bbe09bb517","Type":"ContainerStarted","Data":"8184adac7ec070e2ec700e89b74a2520d43784f0369369c0ef173edb3c57e1cf"} Dec 09 10:22:07 crc kubenswrapper[5002]: I1209 10:22:07.172513 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"80f62273-a2cb-48fc-9dc4-e3bbe09bb517","Type":"ContainerStarted","Data":"35b3b18d0464130b0caf24ef40d7ddaa70838f5a6d56273b6211e485d18b1c4e"} Dec 09 10:22:07 crc kubenswrapper[5002]: I1209 
10:22:07.175374 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-4kmzk" event={"ID":"20ebb6ea-f36b-440a-a437-ff39f9766fca","Type":"ContainerStarted","Data":"0dd06cfd3c38da8c60bac35260c461aa9a32defee6ab2c78a1bf7739889b67d1"} Dec 09 10:22:07 crc kubenswrapper[5002]: I1209 10:22:07.181354 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c75244ea-a44f-4b49-b1dc-05a025bda463","Type":"ContainerStarted","Data":"6ef1103090d4b3937fed5f5bb7e4f7971545d5a53212bed951a69d369cd8977f"} Dec 09 10:22:07 crc kubenswrapper[5002]: I1209 10:22:07.181389 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c75244ea-a44f-4b49-b1dc-05a025bda463","Type":"ContainerStarted","Data":"d0ed4bc24d24b47504be690fd23fabf977bc8994658940bb1646e023334390b9"} Dec 09 10:22:07 crc kubenswrapper[5002]: I1209 10:22:07.183212 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-56f74754d8-5pd9q" event={"ID":"0172d8ed-9ef1-4aac-b246-1b1ed0df87fc","Type":"ContainerStarted","Data":"e31fd05bd7e517de1ef420971d431d6a2f3089efe18a744cd769acc8bbc26a81"} Dec 09 10:22:07 crc kubenswrapper[5002]: I1209 10:22:07.183237 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-56f74754d8-5pd9q" event={"ID":"0172d8ed-9ef1-4aac-b246-1b1ed0df87fc","Type":"ContainerStarted","Data":"aea051762e75525f31b492676d261ace96e48ef4c3f11cdd6c08c119ff71d01a"} Dec 09 10:22:07 crc kubenswrapper[5002]: I1209 10:22:07.188505 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"291d30af-cc1c-4fa3-9496-9695e50f0c6d","Type":"ContainerStarted","Data":"fe819e8b4ac5ab36a4e3321f6ac6e5980c36812e86424f20003abaefcd42d0c9"} Dec 09 10:22:07 crc kubenswrapper[5002]: I1209 10:22:07.964713 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 10:22:07 crc kubenswrapper[5002]: I1209 10:22:07.965315 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 10:22:07 crc kubenswrapper[5002]: I1209 10:22:07.965357 5002 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" Dec 09 10:22:07 crc kubenswrapper[5002]: I1209 10:22:07.965974 5002 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3884e46cf25151268d65649fde8e75f33e599a76a13b5c73816d374f2399025a"} pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 10:22:07 crc kubenswrapper[5002]: I1209 10:22:07.966037 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" 
containerID="cri-o://3884e46cf25151268d65649fde8e75f33e599a76a13b5c73816d374f2399025a" gracePeriod=600 Dec 09 10:22:08 crc kubenswrapper[5002]: I1209 10:22:08.105418 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-4kmzk" podStartSLOduration=5.288680271 podStartE2EDuration="1m6.105403617s" podCreationTimestamp="2025-12-09 10:21:02 +0000 UTC" firstStartedPulling="2025-12-09 10:21:03.70179981 +0000 UTC m=+1196.093850891" lastFinishedPulling="2025-12-09 10:22:04.518523156 +0000 UTC m=+1256.910574237" observedRunningTime="2025-12-09 10:22:07.198157485 +0000 UTC m=+1259.590208566" watchObservedRunningTime="2025-12-09 10:22:08.105403617 +0000 UTC m=+1260.497454698" Dec 09 10:22:08 crc kubenswrapper[5002]: I1209 10:22:08.197744 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c75244ea-a44f-4b49-b1dc-05a025bda463","Type":"ContainerStarted","Data":"473e3dfd3dbf213aeb987539706506bc04d20d03a4539c96c16ec84e35a95b56"} Dec 09 10:22:08 crc kubenswrapper[5002]: I1209 10:22:08.199729 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-56f74754d8-5pd9q" event={"ID":"0172d8ed-9ef1-4aac-b246-1b1ed0df87fc","Type":"ContainerStarted","Data":"1767450d54e834d07c9f4e17540dd48734de123d1bb880696d3cd80a533970df"} Dec 09 10:22:08 crc kubenswrapper[5002]: I1209 10:22:08.200680 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-56f74754d8-5pd9q" Dec 09 10:22:08 crc kubenswrapper[5002]: I1209 10:22:08.200714 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-56f74754d8-5pd9q" Dec 09 10:22:08 crc kubenswrapper[5002]: I1209 10:22:08.203035 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"291d30af-cc1c-4fa3-9496-9695e50f0c6d","Type":"ContainerStarted","Data":"2b2f21b165e9b00eb03e02f6f6c0144e56f440afcbd50f22076c2a201b9d2b99"} Dec 09 10:22:08 crc kubenswrapper[5002]: I1209 10:22:08.204966 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"80f62273-a2cb-48fc-9dc4-e3bbe09bb517","Type":"ContainerStarted","Data":"5ca2ef47d14df1b7854d3d7794e4e04fdddf2bb4f558c18e3a96da0d4a26ce8c"} Dec 09 10:22:08 crc kubenswrapper[5002]: I1209 10:22:08.257686 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.2576651180000002 podStartE2EDuration="3.257665118s" podCreationTimestamp="2025-12-09 10:22:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:22:08.232186369 +0000 UTC m=+1260.624237460" watchObservedRunningTime="2025-12-09 10:22:08.257665118 +0000 UTC m=+1260.649716199" Dec 09 10:22:08 crc kubenswrapper[5002]: I1209 10:22:08.260527 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.260519295 podStartE2EDuration="3.260519295s" podCreationTimestamp="2025-12-09 10:22:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:22:08.253071264 +0000 UTC m=+1260.645122345" watchObservedRunningTime="2025-12-09 10:22:08.260519295 +0000 UTC m=+1260.652570376" Dec 09 10:22:08 crc kubenswrapper[5002]: I1209 10:22:08.275450 5002 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-56f74754d8-5pd9q" podStartSLOduration=3.275411438 podStartE2EDuration="3.275411438s" podCreationTimestamp="2025-12-09 10:22:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:22:08.274210806 +0000 UTC m=+1260.666261897" watchObservedRunningTime="2025-12-09 10:22:08.275411438 +0000 UTC m=+1260.667462539" Dec 09 10:22:08 crc kubenswrapper[5002]: I1209 10:22:08.360408 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-58dd9ff6bc-f7nbc" podUID="c970f7fd-c67b-4cda-9124-7fb1709c8d73" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.142:5353: i/o timeout" Dec 09 10:22:14 crc kubenswrapper[5002]: I1209 10:22:14.958374 5002 generic.go:334] "Generic (PLEG): container finished" podID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerID="3884e46cf25151268d65649fde8e75f33e599a76a13b5c73816d374f2399025a" exitCode=0 Dec 09 10:22:14 crc kubenswrapper[5002]: I1209 10:22:14.958439 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerDied","Data":"3884e46cf25151268d65649fde8e75f33e599a76a13b5c73816d374f2399025a"} Dec 09 10:22:14 crc kubenswrapper[5002]: I1209 10:22:14.959126 5002 scope.go:117] "RemoveContainer" containerID="9580dfbfe31ac43f61d3b220c2620f364cbabb0180f5e1a555d93ea2015032be" Dec 09 10:22:15 crc kubenswrapper[5002]: I1209 10:22:15.670662 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 09 10:22:15 crc kubenswrapper[5002]: I1209 10:22:15.670761 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 09 10:22:15 crc kubenswrapper[5002]: I1209 10:22:15.714627 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 09 10:22:15 crc kubenswrapper[5002]: I1209 10:22:15.731970 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 09 10:22:15 crc kubenswrapper[5002]: I1209 10:22:15.736627 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 09 10:22:15 crc kubenswrapper[5002]: I1209 10:22:15.736659 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 09 10:22:15 crc kubenswrapper[5002]: I1209 10:22:15.779881 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 09 10:22:15 crc kubenswrapper[5002]: I1209 10:22:15.787928 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 09 10:22:15 crc kubenswrapper[5002]: I1209 10:22:15.967831 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 09 10:22:15 crc kubenswrapper[5002]: I1209 10:22:15.968158 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 09 10:22:15 crc kubenswrapper[5002]: I1209 10:22:15.968170 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/glance-default-external-api-0" Dec 09 10:22:15 crc kubenswrapper[5002]: I1209 10:22:15.968181 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 09 10:22:17 crc kubenswrapper[5002]: I1209 10:22:17.899463 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 09 10:22:17 crc kubenswrapper[5002]: I1209 10:22:17.923852 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 09 10:22:17 crc kubenswrapper[5002]: I1209 10:22:17.924882 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 09 10:22:17 crc kubenswrapper[5002]: I1209 10:22:17.995366 5002 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 09 10:22:18 crc kubenswrapper[5002]: I1209 10:22:18.418025 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 09 10:22:21 crc kubenswrapper[5002]: I1209 10:22:21.026107 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerStarted","Data":"8882b3e4cc037c99de652a814b6e830546393f19945b2204e6e01c0052e460f5"} Dec 09 10:22:21 crc kubenswrapper[5002]: I1209 10:22:21.760479 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-6d565f9c5b-d7trd" Dec 09 10:22:22 crc kubenswrapper[5002]: I1209 10:22:22.037877 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"291d30af-cc1c-4fa3-9496-9695e50f0c6d","Type":"ContainerStarted","Data":"3b876ca1c28b7a2bb5b33630fdff22c4cd075a5e2f05e18657738788deaa38d5"} Dec 09 10:22:22 crc kubenswrapper[5002]: I1209 10:22:22.576494 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 09 10:22:22 crc kubenswrapper[5002]: I1209 10:22:22.577756 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 09 10:22:22 crc kubenswrapper[5002]: I1209 10:22:22.580756 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-fxjk2" Dec 09 10:22:22 crc kubenswrapper[5002]: I1209 10:22:22.580757 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 09 10:22:22 crc kubenswrapper[5002]: I1209 10:22:22.581018 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 09 10:22:22 crc kubenswrapper[5002]: I1209 10:22:22.604529 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 09 10:22:22 crc kubenswrapper[5002]: I1209 10:22:22.688106 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6x44\" (UniqueName: \"kubernetes.io/projected/c9e681d8-0720-4f5e-8893-ec4f1cf43edf-kube-api-access-f6x44\") pod \"openstackclient\" (UID: \"c9e681d8-0720-4f5e-8893-ec4f1cf43edf\") " pod="openstack/openstackclient" Dec 09 10:22:22 crc kubenswrapper[5002]: I1209 10:22:22.688285 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c9e681d8-0720-4f5e-8893-ec4f1cf43edf-openstack-config-secret\") pod \"openstackclient\" (UID: \"c9e681d8-0720-4f5e-8893-ec4f1cf43edf\") " pod="openstack/openstackclient" Dec 09 10:22:22 crc kubenswrapper[5002]: I1209 10:22:22.688353 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c9e681d8-0720-4f5e-8893-ec4f1cf43edf-openstack-config\") pod \"openstackclient\" (UID: \"c9e681d8-0720-4f5e-8893-ec4f1cf43edf\") " pod="openstack/openstackclient" Dec 09 10:22:22 crc kubenswrapper[5002]: I1209 10:22:22.688391 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e681d8-0720-4f5e-8893-ec4f1cf43edf-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c9e681d8-0720-4f5e-8893-ec4f1cf43edf\") " pod="openstack/openstackclient" Dec 09 10:22:22 crc kubenswrapper[5002]: I1209 10:22:22.790169 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c9e681d8-0720-4f5e-8893-ec4f1cf43edf-openstack-config-secret\") pod \"openstackclient\" (UID: \"c9e681d8-0720-4f5e-8893-ec4f1cf43edf\") " pod="openstack/openstackclient" Dec 09 10:22:22 crc kubenswrapper[5002]: I1209 10:22:22.790613 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c9e681d8-0720-4f5e-8893-ec4f1cf43edf-openstack-config\") pod \"openstackclient\" (UID: \"c9e681d8-0720-4f5e-8893-ec4f1cf43edf\") " pod="openstack/openstackclient" Dec 09 10:22:22 crc kubenswrapper[5002]: I1209 10:22:22.790647 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e681d8-0720-4f5e-8893-ec4f1cf43edf-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c9e681d8-0720-4f5e-8893-ec4f1cf43edf\") " pod="openstack/openstackclient" Dec 09 10:22:22 crc kubenswrapper[5002]: I1209 10:22:22.790686 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-f6x44\" (UniqueName: \"kubernetes.io/projected/c9e681d8-0720-4f5e-8893-ec4f1cf43edf-kube-api-access-f6x44\") pod \"openstackclient\" (UID: \"c9e681d8-0720-4f5e-8893-ec4f1cf43edf\") " pod="openstack/openstackclient" Dec 09 10:22:22 crc kubenswrapper[5002]: I1209 10:22:22.791693 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c9e681d8-0720-4f5e-8893-ec4f1cf43edf-openstack-config\") pod \"openstackclient\" (UID: \"c9e681d8-0720-4f5e-8893-ec4f1cf43edf\") " pod="openstack/openstackclient" Dec 09 10:22:22 crc kubenswrapper[5002]: I1209 10:22:22.795019 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c9e681d8-0720-4f5e-8893-ec4f1cf43edf-openstack-config-secret\") pod \"openstackclient\" (UID: \"c9e681d8-0720-4f5e-8893-ec4f1cf43edf\") " pod="openstack/openstackclient" Dec 09 10:22:22 crc kubenswrapper[5002]: I1209 10:22:22.795088 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e681d8-0720-4f5e-8893-ec4f1cf43edf-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c9e681d8-0720-4f5e-8893-ec4f1cf43edf\") " pod="openstack/openstackclient" Dec 09 10:22:22 crc kubenswrapper[5002]: I1209 10:22:22.813339 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6x44\" (UniqueName: \"kubernetes.io/projected/c9e681d8-0720-4f5e-8893-ec4f1cf43edf-kube-api-access-f6x44\") pod \"openstackclient\" (UID: \"c9e681d8-0720-4f5e-8893-ec4f1cf43edf\") " pod="openstack/openstackclient" Dec 09 10:22:22 crc kubenswrapper[5002]: I1209 10:22:22.899761 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 09 10:22:23 crc kubenswrapper[5002]: I1209 10:22:23.591172 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 09 10:22:24 crc kubenswrapper[5002]: I1209 10:22:24.091606 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"c9e681d8-0720-4f5e-8893-ec4f1cf43edf","Type":"ContainerStarted","Data":"2305a84a5ea162da296daaee3ea6614a3ead7d60294c32712a2f4ec4e2341378"} Dec 09 10:22:24 crc kubenswrapper[5002]: I1209 10:22:24.092003 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"291d30af-cc1c-4fa3-9496-9695e50f0c6d","Type":"ContainerStarted","Data":"32f9dacd4134aa0f7de86368ebffb600dbe712eecaabcbb9ac009df587d26dbd"} Dec 09 10:22:24 crc kubenswrapper[5002]: I1209 10:22:24.488300 5002 generic.go:334] "Generic (PLEG): container finished" podID="354d641d-dc9c-4aa4-b821-90ce72ef6d5c" containerID="75a1a6ebfe287494a07162022a2dce2dfa8561b0452360830b376abd08ad6489" exitCode=0 Dec 09 10:22:24 crc kubenswrapper[5002]: I1209 10:22:24.488388 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-t7vtz" event={"ID":"354d641d-dc9c-4aa4-b821-90ce72ef6d5c","Type":"ContainerDied","Data":"75a1a6ebfe287494a07162022a2dce2dfa8561b0452360830b376abd08ad6489"} Dec 09 10:22:25 crc kubenswrapper[5002]: I1209 10:22:25.883405 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-t7vtz" Dec 09 10:22:26 crc kubenswrapper[5002]: I1209 10:22:26.014106 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/354d641d-dc9c-4aa4-b821-90ce72ef6d5c-combined-ca-bundle\") pod \"354d641d-dc9c-4aa4-b821-90ce72ef6d5c\" (UID: \"354d641d-dc9c-4aa4-b821-90ce72ef6d5c\") " Dec 09 10:22:26 crc kubenswrapper[5002]: I1209 10:22:26.014186 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sg4pg\" (UniqueName: \"kubernetes.io/projected/354d641d-dc9c-4aa4-b821-90ce72ef6d5c-kube-api-access-sg4pg\") pod \"354d641d-dc9c-4aa4-b821-90ce72ef6d5c\" (UID: \"354d641d-dc9c-4aa4-b821-90ce72ef6d5c\") " Dec 09 10:22:26 crc kubenswrapper[5002]: I1209 10:22:26.014387 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/354d641d-dc9c-4aa4-b821-90ce72ef6d5c-db-sync-config-data\") pod \"354d641d-dc9c-4aa4-b821-90ce72ef6d5c\" (UID: \"354d641d-dc9c-4aa4-b821-90ce72ef6d5c\") " Dec 09 10:22:26 crc kubenswrapper[5002]: I1209 10:22:26.022085 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/354d641d-dc9c-4aa4-b821-90ce72ef6d5c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "354d641d-dc9c-4aa4-b821-90ce72ef6d5c" (UID: "354d641d-dc9c-4aa4-b821-90ce72ef6d5c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:22:26 crc kubenswrapper[5002]: I1209 10:22:26.022714 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/354d641d-dc9c-4aa4-b821-90ce72ef6d5c-kube-api-access-sg4pg" (OuterVolumeSpecName: "kube-api-access-sg4pg") pod "354d641d-dc9c-4aa4-b821-90ce72ef6d5c" (UID: "354d641d-dc9c-4aa4-b821-90ce72ef6d5c"). InnerVolumeSpecName "kube-api-access-sg4pg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:22:26 crc kubenswrapper[5002]: I1209 10:22:26.050492 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/354d641d-dc9c-4aa4-b821-90ce72ef6d5c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "354d641d-dc9c-4aa4-b821-90ce72ef6d5c" (UID: "354d641d-dc9c-4aa4-b821-90ce72ef6d5c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:22:26 crc kubenswrapper[5002]: I1209 10:22:26.116062 5002 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/354d641d-dc9c-4aa4-b821-90ce72ef6d5c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:26 crc kubenswrapper[5002]: I1209 10:22:26.116099 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/354d641d-dc9c-4aa4-b821-90ce72ef6d5c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:26 crc kubenswrapper[5002]: I1209 10:22:26.116111 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sg4pg\" (UniqueName: \"kubernetes.io/projected/354d641d-dc9c-4aa4-b821-90ce72ef6d5c-kube-api-access-sg4pg\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:26 crc kubenswrapper[5002]: E1209 10:22:26.177656 5002 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod354d641d_dc9c_4aa4_b821_90ce72ef6d5c.slice\": RecentStats: unable to find data in memory cache]" Dec 09 10:22:26 crc kubenswrapper[5002]: I1209 10:22:26.505415 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-t7vtz" event={"ID":"354d641d-dc9c-4aa4-b821-90ce72ef6d5c","Type":"ContainerDied","Data":"e6e9aa3b544c40c5e27fc72016be1fec40be275e0ddd1554eef4aea11e103b02"} Dec 09 10:22:26 crc kubenswrapper[5002]: I1209 10:22:26.505453 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6e9aa3b544c40c5e27fc72016be1fec40be275e0ddd1554eef4aea11e103b02" Dec 09 10:22:26 crc kubenswrapper[5002]: I1209 10:22:26.505503 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-t7vtz" Dec 09 10:22:26 crc kubenswrapper[5002]: I1209 10:22:26.830998 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-5855d5f975-nmr2s"] Dec 09 10:22:26 crc kubenswrapper[5002]: E1209 10:22:26.831561 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="354d641d-dc9c-4aa4-b821-90ce72ef6d5c" containerName="barbican-db-sync" Dec 09 10:22:26 crc kubenswrapper[5002]: I1209 10:22:26.831577 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="354d641d-dc9c-4aa4-b821-90ce72ef6d5c" containerName="barbican-db-sync" Dec 09 10:22:26 crc kubenswrapper[5002]: I1209 10:22:26.831758 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="354d641d-dc9c-4aa4-b821-90ce72ef6d5c" containerName="barbican-db-sync" Dec 09 10:22:26 crc kubenswrapper[5002]: I1209 10:22:26.832617 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5855d5f975-nmr2s" Dec 09 10:22:26 crc kubenswrapper[5002]: I1209 10:22:26.838096 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-776dccf8bb-k9gt4"] Dec 09 10:22:26 crc kubenswrapper[5002]: I1209 10:22:26.839402 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 09 10:22:26 crc kubenswrapper[5002]: I1209 10:22:26.839880 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-776dccf8bb-k9gt4" Dec 09 10:22:26 crc kubenswrapper[5002]: I1209 10:22:26.841638 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 09 10:22:26 crc kubenswrapper[5002]: I1209 10:22:26.843548 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 09 10:22:26 crc kubenswrapper[5002]: I1209 10:22:26.848924 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-4q7mf" Dec 09 10:22:26 crc kubenswrapper[5002]: I1209 10:22:26.861967 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-776dccf8bb-k9gt4"] Dec 09 10:22:26 crc kubenswrapper[5002]: I1209 10:22:26.881953 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5855d5f975-nmr2s"] Dec 09 10:22:26 crc kubenswrapper[5002]: I1209 10:22:26.921463 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-ld5mf"] Dec 09 10:22:26 crc kubenswrapper[5002]: I1209 10:22:26.925406 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-ld5mf" Dec 09 10:22:26 crc kubenswrapper[5002]: I1209 10:22:26.931738 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9198258-4919-4ade-88ba-4a0773b32012-config-data\") pod \"barbican-worker-5855d5f975-nmr2s\" (UID: \"c9198258-4919-4ade-88ba-4a0773b32012\") " pod="openstack/barbican-worker-5855d5f975-nmr2s" Dec 09 10:22:26 crc kubenswrapper[5002]: I1209 10:22:26.931884 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whgq5\" (UniqueName: \"kubernetes.io/projected/c9198258-4919-4ade-88ba-4a0773b32012-kube-api-access-whgq5\") pod \"barbican-worker-5855d5f975-nmr2s\" (UID: \"c9198258-4919-4ade-88ba-4a0773b32012\") " pod="openstack/barbican-worker-5855d5f975-nmr2s" Dec 09 10:22:26 crc kubenswrapper[5002]: I1209 10:22:26.931927 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c9198258-4919-4ade-88ba-4a0773b32012-config-data-custom\") pod \"barbican-worker-5855d5f975-nmr2s\" (UID: \"c9198258-4919-4ade-88ba-4a0773b32012\") " pod="openstack/barbican-worker-5855d5f975-nmr2s" Dec 09 10:22:26 crc kubenswrapper[5002]: I1209 10:22:26.931970 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9198258-4919-4ade-88ba-4a0773b32012-logs\") pod \"barbican-worker-5855d5f975-nmr2s\" (UID: \"c9198258-4919-4ade-88ba-4a0773b32012\") " pod="openstack/barbican-worker-5855d5f975-nmr2s" Dec 09 10:22:26 crc kubenswrapper[5002]: I1209 10:22:26.931988 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9198258-4919-4ade-88ba-4a0773b32012-combined-ca-bundle\") pod \"barbican-worker-5855d5f975-nmr2s\" (UID: \"c9198258-4919-4ade-88ba-4a0773b32012\") " pod="openstack/barbican-worker-5855d5f975-nmr2s" Dec 09 10:22:26 crc kubenswrapper[5002]: I1209 10:22:26.945535 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-ld5mf"] Dec 09 
10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.001873 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6b88b97456-wb29h"] Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.003481 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6b88b97456-wb29h" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.007136 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.010782 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6b88b97456-wb29h"] Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.035804 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4ddce94-6333-4233-951d-571a761b708f-combined-ca-bundle\") pod \"barbican-keystone-listener-776dccf8bb-k9gt4\" (UID: \"c4ddce94-6333-4233-951d-571a761b708f\") " pod="openstack/barbican-keystone-listener-776dccf8bb-k9gt4" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.035910 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9198258-4919-4ade-88ba-4a0773b32012-config-data\") pod \"barbican-worker-5855d5f975-nmr2s\" (UID: \"c9198258-4919-4ade-88ba-4a0773b32012\") " pod="openstack/barbican-worker-5855d5f975-nmr2s" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.035954 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/018e2746-8f8d-47a4-9da1-96b398185024-config\") pod \"dnsmasq-dns-586bdc5f9-ld5mf\" (UID: \"018e2746-8f8d-47a4-9da1-96b398185024\") " pod="openstack/dnsmasq-dns-586bdc5f9-ld5mf" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.035981 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6p84\" (UniqueName: \"kubernetes.io/projected/018e2746-8f8d-47a4-9da1-96b398185024-kube-api-access-t6p84\") pod \"dnsmasq-dns-586bdc5f9-ld5mf\" (UID: \"018e2746-8f8d-47a4-9da1-96b398185024\") " pod="openstack/dnsmasq-dns-586bdc5f9-ld5mf" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.036015 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv9wj\" (UniqueName: \"kubernetes.io/projected/c4ddce94-6333-4233-951d-571a761b708f-kube-api-access-vv9wj\") pod \"barbican-keystone-listener-776dccf8bb-k9gt4\" (UID: \"c4ddce94-6333-4233-951d-571a761b708f\") " pod="openstack/barbican-keystone-listener-776dccf8bb-k9gt4" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.036077 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whgq5\" (UniqueName: \"kubernetes.io/projected/c9198258-4919-4ade-88ba-4a0773b32012-kube-api-access-whgq5\") pod \"barbican-worker-5855d5f975-nmr2s\" (UID: \"c9198258-4919-4ade-88ba-4a0773b32012\") " pod="openstack/barbican-worker-5855d5f975-nmr2s" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.036107 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/018e2746-8f8d-47a4-9da1-96b398185024-dns-swift-storage-0\") pod \"dnsmasq-dns-586bdc5f9-ld5mf\" (UID: \"018e2746-8f8d-47a4-9da1-96b398185024\") " 
pod="openstack/dnsmasq-dns-586bdc5f9-ld5mf" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.036146 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4ddce94-6333-4233-951d-571a761b708f-logs\") pod \"barbican-keystone-listener-776dccf8bb-k9gt4\" (UID: \"c4ddce94-6333-4233-951d-571a761b708f\") " pod="openstack/barbican-keystone-listener-776dccf8bb-k9gt4" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.036177 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/018e2746-8f8d-47a4-9da1-96b398185024-ovsdbserver-sb\") pod \"dnsmasq-dns-586bdc5f9-ld5mf\" (UID: \"018e2746-8f8d-47a4-9da1-96b398185024\") " pod="openstack/dnsmasq-dns-586bdc5f9-ld5mf" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.036206 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/018e2746-8f8d-47a4-9da1-96b398185024-ovsdbserver-nb\") pod \"dnsmasq-dns-586bdc5f9-ld5mf\" (UID: \"018e2746-8f8d-47a4-9da1-96b398185024\") " pod="openstack/dnsmasq-dns-586bdc5f9-ld5mf" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.036343 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c9198258-4919-4ade-88ba-4a0773b32012-config-data-custom\") pod \"barbican-worker-5855d5f975-nmr2s\" (UID: \"c9198258-4919-4ade-88ba-4a0773b32012\") " pod="openstack/barbican-worker-5855d5f975-nmr2s" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.036418 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9198258-4919-4ade-88ba-4a0773b32012-logs\") pod \"barbican-worker-5855d5f975-nmr2s\" (UID: \"c9198258-4919-4ade-88ba-4a0773b32012\") " pod="openstack/barbican-worker-5855d5f975-nmr2s" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.036456 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9198258-4919-4ade-88ba-4a0773b32012-combined-ca-bundle\") pod \"barbican-worker-5855d5f975-nmr2s\" (UID: \"c9198258-4919-4ade-88ba-4a0773b32012\") " pod="openstack/barbican-worker-5855d5f975-nmr2s" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.036492 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c4ddce94-6333-4233-951d-571a761b708f-config-data-custom\") pod \"barbican-keystone-listener-776dccf8bb-k9gt4\" (UID: \"c4ddce94-6333-4233-951d-571a761b708f\") " pod="openstack/barbican-keystone-listener-776dccf8bb-k9gt4" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.036577 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/018e2746-8f8d-47a4-9da1-96b398185024-dns-svc\") pod \"dnsmasq-dns-586bdc5f9-ld5mf\" (UID: \"018e2746-8f8d-47a4-9da1-96b398185024\") " pod="openstack/dnsmasq-dns-586bdc5f9-ld5mf" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.036630 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c4ddce94-6333-4233-951d-571a761b708f-config-data\") pod \"barbican-keystone-listener-776dccf8bb-k9gt4\" (UID: \"c4ddce94-6333-4233-951d-571a761b708f\") " pod="openstack/barbican-keystone-listener-776dccf8bb-k9gt4" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.038447 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9198258-4919-4ade-88ba-4a0773b32012-logs\") pod \"barbican-worker-5855d5f975-nmr2s\" (UID: \"c9198258-4919-4ade-88ba-4a0773b32012\") " pod="openstack/barbican-worker-5855d5f975-nmr2s" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.042962 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9198258-4919-4ade-88ba-4a0773b32012-combined-ca-bundle\") pod \"barbican-worker-5855d5f975-nmr2s\" (UID: \"c9198258-4919-4ade-88ba-4a0773b32012\") " pod="openstack/barbican-worker-5855d5f975-nmr2s" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.047410 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c9198258-4919-4ade-88ba-4a0773b32012-config-data-custom\") pod \"barbican-worker-5855d5f975-nmr2s\" (UID: \"c9198258-4919-4ade-88ba-4a0773b32012\") " pod="openstack/barbican-worker-5855d5f975-nmr2s" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.047935 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9198258-4919-4ade-88ba-4a0773b32012-config-data\") pod \"barbican-worker-5855d5f975-nmr2s\" (UID: \"c9198258-4919-4ade-88ba-4a0773b32012\") " pod="openstack/barbican-worker-5855d5f975-nmr2s" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.073363 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whgq5\" (UniqueName: \"kubernetes.io/projected/c9198258-4919-4ade-88ba-4a0773b32012-kube-api-access-whgq5\") pod \"barbican-worker-5855d5f975-nmr2s\" (UID: \"c9198258-4919-4ade-88ba-4a0773b32012\") " pod="openstack/barbican-worker-5855d5f975-nmr2s" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.138279 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4ddce94-6333-4233-951d-571a761b708f-config-data\") pod \"barbican-keystone-listener-776dccf8bb-k9gt4\" (UID: \"c4ddce94-6333-4233-951d-571a761b708f\") " pod="openstack/barbican-keystone-listener-776dccf8bb-k9gt4" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.138350 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/735c673a-9bc6-4a3c-b553-dea3ba521606-config-data-custom\") pod \"barbican-api-6b88b97456-wb29h\" (UID: \"735c673a-9bc6-4a3c-b553-dea3ba521606\") " pod="openstack/barbican-api-6b88b97456-wb29h" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.138384 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4ddce94-6333-4233-951d-571a761b708f-combined-ca-bundle\") pod \"barbican-keystone-listener-776dccf8bb-k9gt4\" (UID: \"c4ddce94-6333-4233-951d-571a761b708f\") " pod="openstack/barbican-keystone-listener-776dccf8bb-k9gt4" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.138424 5002 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzpx4\" (UniqueName: \"kubernetes.io/projected/735c673a-9bc6-4a3c-b553-dea3ba521606-kube-api-access-bzpx4\") pod \"barbican-api-6b88b97456-wb29h\" (UID: \"735c673a-9bc6-4a3c-b553-dea3ba521606\") " pod="openstack/barbican-api-6b88b97456-wb29h" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.138456 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/018e2746-8f8d-47a4-9da1-96b398185024-config\") pod \"dnsmasq-dns-586bdc5f9-ld5mf\" (UID: \"018e2746-8f8d-47a4-9da1-96b398185024\") " pod="openstack/dnsmasq-dns-586bdc5f9-ld5mf" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.138587 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6p84\" (UniqueName: \"kubernetes.io/projected/018e2746-8f8d-47a4-9da1-96b398185024-kube-api-access-t6p84\") pod \"dnsmasq-dns-586bdc5f9-ld5mf\" (UID: \"018e2746-8f8d-47a4-9da1-96b398185024\") " pod="openstack/dnsmasq-dns-586bdc5f9-ld5mf" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.138650 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vv9wj\" (UniqueName: \"kubernetes.io/projected/c4ddce94-6333-4233-951d-571a761b708f-kube-api-access-vv9wj\") pod \"barbican-keystone-listener-776dccf8bb-k9gt4\" (UID: \"c4ddce94-6333-4233-951d-571a761b708f\") " pod="openstack/barbican-keystone-listener-776dccf8bb-k9gt4" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.138750 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/018e2746-8f8d-47a4-9da1-96b398185024-dns-swift-storage-0\") pod \"dnsmasq-dns-586bdc5f9-ld5mf\" (UID: \"018e2746-8f8d-47a4-9da1-96b398185024\") " pod="openstack/dnsmasq-dns-586bdc5f9-ld5mf" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.138823 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4ddce94-6333-4233-951d-571a761b708f-logs\") pod \"barbican-keystone-listener-776dccf8bb-k9gt4\" (UID: \"c4ddce94-6333-4233-951d-571a761b708f\") " pod="openstack/barbican-keystone-listener-776dccf8bb-k9gt4" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.138850 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/018e2746-8f8d-47a4-9da1-96b398185024-ovsdbserver-sb\") pod \"dnsmasq-dns-586bdc5f9-ld5mf\" (UID: \"018e2746-8f8d-47a4-9da1-96b398185024\") " pod="openstack/dnsmasq-dns-586bdc5f9-ld5mf" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.138867 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/018e2746-8f8d-47a4-9da1-96b398185024-ovsdbserver-nb\") pod \"dnsmasq-dns-586bdc5f9-ld5mf\" (UID: \"018e2746-8f8d-47a4-9da1-96b398185024\") " pod="openstack/dnsmasq-dns-586bdc5f9-ld5mf" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.138932 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/735c673a-9bc6-4a3c-b553-dea3ba521606-combined-ca-bundle\") pod \"barbican-api-6b88b97456-wb29h\" (UID: \"735c673a-9bc6-4a3c-b553-dea3ba521606\") " pod="openstack/barbican-api-6b88b97456-wb29h" Dec 09 10:22:27 crc 
kubenswrapper[5002]: I1209 10:22:27.139023 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c4ddce94-6333-4233-951d-571a761b708f-config-data-custom\") pod \"barbican-keystone-listener-776dccf8bb-k9gt4\" (UID: \"c4ddce94-6333-4233-951d-571a761b708f\") " pod="openstack/barbican-keystone-listener-776dccf8bb-k9gt4" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.139097 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/735c673a-9bc6-4a3c-b553-dea3ba521606-logs\") pod \"barbican-api-6b88b97456-wb29h\" (UID: \"735c673a-9bc6-4a3c-b553-dea3ba521606\") " pod="openstack/barbican-api-6b88b97456-wb29h" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.139183 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/018e2746-8f8d-47a4-9da1-96b398185024-dns-svc\") pod \"dnsmasq-dns-586bdc5f9-ld5mf\" (UID: \"018e2746-8f8d-47a4-9da1-96b398185024\") " pod="openstack/dnsmasq-dns-586bdc5f9-ld5mf" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.139238 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/735c673a-9bc6-4a3c-b553-dea3ba521606-config-data\") pod \"barbican-api-6b88b97456-wb29h\" (UID: \"735c673a-9bc6-4a3c-b553-dea3ba521606\") " pod="openstack/barbican-api-6b88b97456-wb29h" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.139561 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/018e2746-8f8d-47a4-9da1-96b398185024-config\") pod \"dnsmasq-dns-586bdc5f9-ld5mf\" (UID: \"018e2746-8f8d-47a4-9da1-96b398185024\") " pod="openstack/dnsmasq-dns-586bdc5f9-ld5mf" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.140040 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/018e2746-8f8d-47a4-9da1-96b398185024-ovsdbserver-sb\") pod \"dnsmasq-dns-586bdc5f9-ld5mf\" (UID: \"018e2746-8f8d-47a4-9da1-96b398185024\") " pod="openstack/dnsmasq-dns-586bdc5f9-ld5mf" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.140795 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/018e2746-8f8d-47a4-9da1-96b398185024-ovsdbserver-nb\") pod \"dnsmasq-dns-586bdc5f9-ld5mf\" (UID: \"018e2746-8f8d-47a4-9da1-96b398185024\") " pod="openstack/dnsmasq-dns-586bdc5f9-ld5mf" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.141050 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4ddce94-6333-4233-951d-571a761b708f-logs\") pod \"barbican-keystone-listener-776dccf8bb-k9gt4\" (UID: \"c4ddce94-6333-4233-951d-571a761b708f\") " pod="openstack/barbican-keystone-listener-776dccf8bb-k9gt4" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.141476 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/018e2746-8f8d-47a4-9da1-96b398185024-dns-svc\") pod \"dnsmasq-dns-586bdc5f9-ld5mf\" (UID: \"018e2746-8f8d-47a4-9da1-96b398185024\") " pod="openstack/dnsmasq-dns-586bdc5f9-ld5mf" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.141685 5002 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/018e2746-8f8d-47a4-9da1-96b398185024-dns-swift-storage-0\") pod \"dnsmasq-dns-586bdc5f9-ld5mf\" (UID: \"018e2746-8f8d-47a4-9da1-96b398185024\") " pod="openstack/dnsmasq-dns-586bdc5f9-ld5mf" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.145663 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4ddce94-6333-4233-951d-571a761b708f-config-data\") pod \"barbican-keystone-listener-776dccf8bb-k9gt4\" (UID: \"c4ddce94-6333-4233-951d-571a761b708f\") " pod="openstack/barbican-keystone-listener-776dccf8bb-k9gt4" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.148657 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c4ddce94-6333-4233-951d-571a761b708f-config-data-custom\") pod \"barbican-keystone-listener-776dccf8bb-k9gt4\" (UID: \"c4ddce94-6333-4233-951d-571a761b708f\") " pod="openstack/barbican-keystone-listener-776dccf8bb-k9gt4" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.153964 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4ddce94-6333-4233-951d-571a761b708f-combined-ca-bundle\") pod \"barbican-keystone-listener-776dccf8bb-k9gt4\" (UID: \"c4ddce94-6333-4233-951d-571a761b708f\") " pod="openstack/barbican-keystone-listener-776dccf8bb-k9gt4" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.157767 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vv9wj\" (UniqueName: \"kubernetes.io/projected/c4ddce94-6333-4233-951d-571a761b708f-kube-api-access-vv9wj\") pod \"barbican-keystone-listener-776dccf8bb-k9gt4\" (UID: \"c4ddce94-6333-4233-951d-571a761b708f\") " pod="openstack/barbican-keystone-listener-776dccf8bb-k9gt4" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.161055 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6p84\" (UniqueName: \"kubernetes.io/projected/018e2746-8f8d-47a4-9da1-96b398185024-kube-api-access-t6p84\") pod \"dnsmasq-dns-586bdc5f9-ld5mf\" (UID: \"018e2746-8f8d-47a4-9da1-96b398185024\") " pod="openstack/dnsmasq-dns-586bdc5f9-ld5mf" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.171241 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5855d5f975-nmr2s" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.187039 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-776dccf8bb-k9gt4" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.241538 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/735c673a-9bc6-4a3c-b553-dea3ba521606-logs\") pod \"barbican-api-6b88b97456-wb29h\" (UID: \"735c673a-9bc6-4a3c-b553-dea3ba521606\") " pod="openstack/barbican-api-6b88b97456-wb29h" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.241893 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/735c673a-9bc6-4a3c-b553-dea3ba521606-logs\") pod \"barbican-api-6b88b97456-wb29h\" (UID: \"735c673a-9bc6-4a3c-b553-dea3ba521606\") " pod="openstack/barbican-api-6b88b97456-wb29h" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.241963 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/735c673a-9bc6-4a3c-b553-dea3ba521606-config-data\") pod \"barbican-api-6b88b97456-wb29h\" (UID: \"735c673a-9bc6-4a3c-b553-dea3ba521606\") " pod="openstack/barbican-api-6b88b97456-wb29h" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.242733 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/735c673a-9bc6-4a3c-b553-dea3ba521606-config-data-custom\") pod \"barbican-api-6b88b97456-wb29h\" (UID: \"735c673a-9bc6-4a3c-b553-dea3ba521606\") " pod="openstack/barbican-api-6b88b97456-wb29h" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.243086 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzpx4\" (UniqueName: \"kubernetes.io/projected/735c673a-9bc6-4a3c-b553-dea3ba521606-kube-api-access-bzpx4\") pod \"barbican-api-6b88b97456-wb29h\" (UID: \"735c673a-9bc6-4a3c-b553-dea3ba521606\") " pod="openstack/barbican-api-6b88b97456-wb29h" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.243199 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/735c673a-9bc6-4a3c-b553-dea3ba521606-combined-ca-bundle\") pod \"barbican-api-6b88b97456-wb29h\" (UID: \"735c673a-9bc6-4a3c-b553-dea3ba521606\") " pod="openstack/barbican-api-6b88b97456-wb29h" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.246230 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/735c673a-9bc6-4a3c-b553-dea3ba521606-combined-ca-bundle\") pod \"barbican-api-6b88b97456-wb29h\" (UID: \"735c673a-9bc6-4a3c-b553-dea3ba521606\") " pod="openstack/barbican-api-6b88b97456-wb29h" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.247542 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/735c673a-9bc6-4a3c-b553-dea3ba521606-config-data\") pod \"barbican-api-6b88b97456-wb29h\" (UID: \"735c673a-9bc6-4a3c-b553-dea3ba521606\") " pod="openstack/barbican-api-6b88b97456-wb29h" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.253543 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/735c673a-9bc6-4a3c-b553-dea3ba521606-config-data-custom\") pod \"barbican-api-6b88b97456-wb29h\" (UID: \"735c673a-9bc6-4a3c-b553-dea3ba521606\") " pod="openstack/barbican-api-6b88b97456-wb29h" Dec 09 
10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.260270 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-ld5mf" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.263382 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzpx4\" (UniqueName: \"kubernetes.io/projected/735c673a-9bc6-4a3c-b553-dea3ba521606-kube-api-access-bzpx4\") pod \"barbican-api-6b88b97456-wb29h\" (UID: \"735c673a-9bc6-4a3c-b553-dea3ba521606\") " pod="openstack/barbican-api-6b88b97456-wb29h" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.330467 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6b88b97456-wb29h" Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.722385 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5855d5f975-nmr2s"] Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.754178 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6b88b97456-wb29h"] Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.781191 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-ld5mf"] Dec 09 10:22:27 crc kubenswrapper[5002]: W1209 10:22:27.792901 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod018e2746_8f8d_47a4_9da1_96b398185024.slice/crio-d04e99a64e236bde486bf8dd4c49f3c3745c6aee050572e964b62b2246badbae WatchSource:0}: Error finding container d04e99a64e236bde486bf8dd4c49f3c3745c6aee050572e964b62b2246badbae: Status 404 returned error can't find the container with id d04e99a64e236bde486bf8dd4c49f3c3745c6aee050572e964b62b2246badbae Dec 09 10:22:27 crc kubenswrapper[5002]: I1209 10:22:27.843948 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-776dccf8bb-k9gt4"] Dec 09 10:22:27 crc kubenswrapper[5002]: W1209 10:22:27.850143 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4ddce94_6333_4233_951d_571a761b708f.slice/crio-4b3b23b6fc197a312b7003fe33b10441bdd1ab7558d8a61ed4717d43a14f0af2 WatchSource:0}: Error finding container 4b3b23b6fc197a312b7003fe33b10441bdd1ab7558d8a61ed4717d43a14f0af2: Status 404 returned error can't find the container with id 4b3b23b6fc197a312b7003fe33b10441bdd1ab7558d8a61ed4717d43a14f0af2 Dec 09 10:22:28 crc kubenswrapper[5002]: I1209 10:22:28.538026 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-776dccf8bb-k9gt4" event={"ID":"c4ddce94-6333-4233-951d-571a761b708f","Type":"ContainerStarted","Data":"4b3b23b6fc197a312b7003fe33b10441bdd1ab7558d8a61ed4717d43a14f0af2"} Dec 09 10:22:28 crc kubenswrapper[5002]: I1209 10:22:28.542945 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b88b97456-wb29h" event={"ID":"735c673a-9bc6-4a3c-b553-dea3ba521606","Type":"ContainerStarted","Data":"1a628fa6f0a45b5e0de19404d3a0514188a865e101734a1d508f7f6e532a9e3e"} Dec 09 10:22:28 crc kubenswrapper[5002]: I1209 10:22:28.543066 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b88b97456-wb29h" event={"ID":"735c673a-9bc6-4a3c-b553-dea3ba521606","Type":"ContainerStarted","Data":"128dbd0c6d6c70bb16748e87cc8313e3b27d9a7ea5f00558e5eda5914f68ae2f"} Dec 09 10:22:28 crc kubenswrapper[5002]: I1209 10:22:28.543079 5002 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b88b97456-wb29h" event={"ID":"735c673a-9bc6-4a3c-b553-dea3ba521606","Type":"ContainerStarted","Data":"b9211056914c225f79c54432f31562b41520803d0cbee2682c1882ea8a90c11d"} Dec 09 10:22:28 crc kubenswrapper[5002]: I1209 10:22:28.544705 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6b88b97456-wb29h" Dec 09 10:22:28 crc kubenswrapper[5002]: I1209 10:22:28.544762 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6b88b97456-wb29h" Dec 09 10:22:28 crc kubenswrapper[5002]: I1209 10:22:28.560289 5002 generic.go:334] "Generic (PLEG): container finished" podID="018e2746-8f8d-47a4-9da1-96b398185024" containerID="a60290bc2cab4a6248037321998fe19da156637feb660eb827cf1143edb3e0d8" exitCode=0 Dec 09 10:22:28 crc kubenswrapper[5002]: I1209 10:22:28.560372 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-ld5mf" event={"ID":"018e2746-8f8d-47a4-9da1-96b398185024","Type":"ContainerDied","Data":"a60290bc2cab4a6248037321998fe19da156637feb660eb827cf1143edb3e0d8"} Dec 09 10:22:28 crc kubenswrapper[5002]: I1209 10:22:28.560398 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-ld5mf" event={"ID":"018e2746-8f8d-47a4-9da1-96b398185024","Type":"ContainerStarted","Data":"d04e99a64e236bde486bf8dd4c49f3c3745c6aee050572e964b62b2246badbae"} Dec 09 10:22:28 crc kubenswrapper[5002]: I1209 10:22:28.584075 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"291d30af-cc1c-4fa3-9496-9695e50f0c6d","Type":"ContainerStarted","Data":"033c0ec92f452a7bfeaaa22b0a7488d2d5cfdd1b66daf781e92b4dca936885a1"} Dec 09 10:22:28 crc kubenswrapper[5002]: I1209 10:22:28.585111 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 09 10:22:28 crc kubenswrapper[5002]: I1209 10:22:28.587641 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6b88b97456-wb29h" podStartSLOduration=2.5876192590000002 podStartE2EDuration="2.587619259s" podCreationTimestamp="2025-12-09 10:22:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:22:28.567686959 +0000 UTC m=+1280.959738050" watchObservedRunningTime="2025-12-09 10:22:28.587619259 +0000 UTC m=+1280.979670330" Dec 09 10:22:28 crc kubenswrapper[5002]: I1209 10:22:28.590612 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5855d5f975-nmr2s" event={"ID":"c9198258-4919-4ade-88ba-4a0773b32012","Type":"ContainerStarted","Data":"6010a8767b6a0095cf0b09e1064ec844493933934ec0fb3b6282edd13fde39be"} Dec 09 10:22:28 crc kubenswrapper[5002]: I1209 10:22:28.671692 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.938269865 podStartE2EDuration="23.671676524s" podCreationTimestamp="2025-12-09 10:22:05 +0000 UTC" firstStartedPulling="2025-12-09 10:22:06.391014271 +0000 UTC m=+1258.783065352" lastFinishedPulling="2025-12-09 10:22:27.12442093 +0000 UTC m=+1279.516472011" observedRunningTime="2025-12-09 10:22:28.616952953 +0000 UTC m=+1281.009004034" watchObservedRunningTime="2025-12-09 10:22:28.671676524 +0000 UTC m=+1281.063727605" Dec 09 10:22:29 crc kubenswrapper[5002]: I1209 10:22:29.602390 5002 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-ld5mf" event={"ID":"018e2746-8f8d-47a4-9da1-96b398185024","Type":"ContainerStarted","Data":"6a8fb6e1d1004c319add636d77cefaef670741428f0596cff090a5aa87c34c97"} Dec 09 10:22:29 crc kubenswrapper[5002]: I1209 10:22:29.603258 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-586bdc5f9-ld5mf" Dec 09 10:22:29 crc kubenswrapper[5002]: I1209 10:22:29.646877 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-586bdc5f9-ld5mf" podStartSLOduration=3.646856324 podStartE2EDuration="3.646856324s" podCreationTimestamp="2025-12-09 10:22:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:22:29.645048405 +0000 UTC m=+1282.037099496" watchObservedRunningTime="2025-12-09 10:22:29.646856324 +0000 UTC m=+1282.038907425" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.128549 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5c454948fd-lwcxn"] Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.153150 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5c454948fd-lwcxn"] Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.153255 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5c454948fd-lwcxn" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.154933 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.155778 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.229722 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4061af7-7669-4bd4-a36c-6ec982e86753-public-tls-certs\") pod \"barbican-api-5c454948fd-lwcxn\" (UID: \"a4061af7-7669-4bd4-a36c-6ec982e86753\") " pod="openstack/barbican-api-5c454948fd-lwcxn" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.229842 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4061af7-7669-4bd4-a36c-6ec982e86753-config-data\") pod \"barbican-api-5c454948fd-lwcxn\" (UID: \"a4061af7-7669-4bd4-a36c-6ec982e86753\") " pod="openstack/barbican-api-5c454948fd-lwcxn" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.229873 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4061af7-7669-4bd4-a36c-6ec982e86753-internal-tls-certs\") pod \"barbican-api-5c454948fd-lwcxn\" (UID: \"a4061af7-7669-4bd4-a36c-6ec982e86753\") " pod="openstack/barbican-api-5c454948fd-lwcxn" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.229898 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcfbp\" (UniqueName: \"kubernetes.io/projected/a4061af7-7669-4bd4-a36c-6ec982e86753-kube-api-access-tcfbp\") pod \"barbican-api-5c454948fd-lwcxn\" (UID: \"a4061af7-7669-4bd4-a36c-6ec982e86753\") " pod="openstack/barbican-api-5c454948fd-lwcxn" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.229950 5002 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4061af7-7669-4bd4-a36c-6ec982e86753-logs\") pod \"barbican-api-5c454948fd-lwcxn\" (UID: \"a4061af7-7669-4bd4-a36c-6ec982e86753\") " pod="openstack/barbican-api-5c454948fd-lwcxn" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.229977 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4061af7-7669-4bd4-a36c-6ec982e86753-config-data-custom\") pod \"barbican-api-5c454948fd-lwcxn\" (UID: \"a4061af7-7669-4bd4-a36c-6ec982e86753\") " pod="openstack/barbican-api-5c454948fd-lwcxn" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.230012 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4061af7-7669-4bd4-a36c-6ec982e86753-combined-ca-bundle\") pod \"barbican-api-5c454948fd-lwcxn\" (UID: \"a4061af7-7669-4bd4-a36c-6ec982e86753\") " pod="openstack/barbican-api-5c454948fd-lwcxn" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.332025 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4061af7-7669-4bd4-a36c-6ec982e86753-combined-ca-bundle\") pod \"barbican-api-5c454948fd-lwcxn\" (UID: \"a4061af7-7669-4bd4-a36c-6ec982e86753\") " pod="openstack/barbican-api-5c454948fd-lwcxn" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.332096 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4061af7-7669-4bd4-a36c-6ec982e86753-public-tls-certs\") pod \"barbican-api-5c454948fd-lwcxn\" (UID: \"a4061af7-7669-4bd4-a36c-6ec982e86753\") " pod="openstack/barbican-api-5c454948fd-lwcxn" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.332166 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4061af7-7669-4bd4-a36c-6ec982e86753-config-data\") pod \"barbican-api-5c454948fd-lwcxn\" (UID: \"a4061af7-7669-4bd4-a36c-6ec982e86753\") " pod="openstack/barbican-api-5c454948fd-lwcxn" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.332198 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4061af7-7669-4bd4-a36c-6ec982e86753-internal-tls-certs\") pod \"barbican-api-5c454948fd-lwcxn\" (UID: \"a4061af7-7669-4bd4-a36c-6ec982e86753\") " pod="openstack/barbican-api-5c454948fd-lwcxn" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.332227 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcfbp\" (UniqueName: \"kubernetes.io/projected/a4061af7-7669-4bd4-a36c-6ec982e86753-kube-api-access-tcfbp\") pod \"barbican-api-5c454948fd-lwcxn\" (UID: \"a4061af7-7669-4bd4-a36c-6ec982e86753\") " pod="openstack/barbican-api-5c454948fd-lwcxn" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.332298 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4061af7-7669-4bd4-a36c-6ec982e86753-logs\") pod \"barbican-api-5c454948fd-lwcxn\" (UID: \"a4061af7-7669-4bd4-a36c-6ec982e86753\") " pod="openstack/barbican-api-5c454948fd-lwcxn" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.332336 5002 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4061af7-7669-4bd4-a36c-6ec982e86753-config-data-custom\") pod \"barbican-api-5c454948fd-lwcxn\" (UID: \"a4061af7-7669-4bd4-a36c-6ec982e86753\") " pod="openstack/barbican-api-5c454948fd-lwcxn" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.333526 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4061af7-7669-4bd4-a36c-6ec982e86753-logs\") pod \"barbican-api-5c454948fd-lwcxn\" (UID: \"a4061af7-7669-4bd4-a36c-6ec982e86753\") " pod="openstack/barbican-api-5c454948fd-lwcxn" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.340427 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4061af7-7669-4bd4-a36c-6ec982e86753-config-data-custom\") pod \"barbican-api-5c454948fd-lwcxn\" (UID: \"a4061af7-7669-4bd4-a36c-6ec982e86753\") " pod="openstack/barbican-api-5c454948fd-lwcxn" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.340720 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4061af7-7669-4bd4-a36c-6ec982e86753-combined-ca-bundle\") pod \"barbican-api-5c454948fd-lwcxn\" (UID: \"a4061af7-7669-4bd4-a36c-6ec982e86753\") " pod="openstack/barbican-api-5c454948fd-lwcxn" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.342619 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4061af7-7669-4bd4-a36c-6ec982e86753-public-tls-certs\") pod \"barbican-api-5c454948fd-lwcxn\" (UID: \"a4061af7-7669-4bd4-a36c-6ec982e86753\") " pod="openstack/barbican-api-5c454948fd-lwcxn" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.343115 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4061af7-7669-4bd4-a36c-6ec982e86753-config-data\") pod \"barbican-api-5c454948fd-lwcxn\" (UID: \"a4061af7-7669-4bd4-a36c-6ec982e86753\") " pod="openstack/barbican-api-5c454948fd-lwcxn" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.346294 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4061af7-7669-4bd4-a36c-6ec982e86753-internal-tls-certs\") pod \"barbican-api-5c454948fd-lwcxn\" (UID: \"a4061af7-7669-4bd4-a36c-6ec982e86753\") " pod="openstack/barbican-api-5c454948fd-lwcxn" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.355655 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcfbp\" (UniqueName: \"kubernetes.io/projected/a4061af7-7669-4bd4-a36c-6ec982e86753-kube-api-access-tcfbp\") pod \"barbican-api-5c454948fd-lwcxn\" (UID: \"a4061af7-7669-4bd4-a36c-6ec982e86753\") " pod="openstack/barbican-api-5c454948fd-lwcxn" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.472620 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5c454948fd-lwcxn" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.711642 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-5c99967b8c-vjq4g"] Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.713349 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5c99967b8c-vjq4g" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.716127 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.717234 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.718044 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.736494 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5c99967b8c-vjq4g"] Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.740560 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bf056c0-a496-4499-92c7-3b1300b4a29d-config-data\") pod \"swift-proxy-5c99967b8c-vjq4g\" (UID: \"1bf056c0-a496-4499-92c7-3b1300b4a29d\") " pod="openstack/swift-proxy-5c99967b8c-vjq4g" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.740632 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1bf056c0-a496-4499-92c7-3b1300b4a29d-etc-swift\") pod \"swift-proxy-5c99967b8c-vjq4g\" (UID: \"1bf056c0-a496-4499-92c7-3b1300b4a29d\") " pod="openstack/swift-proxy-5c99967b8c-vjq4g" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.740852 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bf056c0-a496-4499-92c7-3b1300b4a29d-combined-ca-bundle\") pod \"swift-proxy-5c99967b8c-vjq4g\" (UID: \"1bf056c0-a496-4499-92c7-3b1300b4a29d\") " pod="openstack/swift-proxy-5c99967b8c-vjq4g" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.740946 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bf056c0-a496-4499-92c7-3b1300b4a29d-public-tls-certs\") pod \"swift-proxy-5c99967b8c-vjq4g\" (UID: \"1bf056c0-a496-4499-92c7-3b1300b4a29d\") " pod="openstack/swift-proxy-5c99967b8c-vjq4g" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.741180 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bf056c0-a496-4499-92c7-3b1300b4a29d-internal-tls-certs\") pod \"swift-proxy-5c99967b8c-vjq4g\" (UID: \"1bf056c0-a496-4499-92c7-3b1300b4a29d\") " pod="openstack/swift-proxy-5c99967b8c-vjq4g" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.741214 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1bf056c0-a496-4499-92c7-3b1300b4a29d-run-httpd\") pod \"swift-proxy-5c99967b8c-vjq4g\" (UID: \"1bf056c0-a496-4499-92c7-3b1300b4a29d\") " pod="openstack/swift-proxy-5c99967b8c-vjq4g" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.741265 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1bf056c0-a496-4499-92c7-3b1300b4a29d-log-httpd\") pod \"swift-proxy-5c99967b8c-vjq4g\" (UID: \"1bf056c0-a496-4499-92c7-3b1300b4a29d\") " 
pod="openstack/swift-proxy-5c99967b8c-vjq4g" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.741305 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqgc2\" (UniqueName: \"kubernetes.io/projected/1bf056c0-a496-4499-92c7-3b1300b4a29d-kube-api-access-tqgc2\") pod \"swift-proxy-5c99967b8c-vjq4g\" (UID: \"1bf056c0-a496-4499-92c7-3b1300b4a29d\") " pod="openstack/swift-proxy-5c99967b8c-vjq4g" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.843002 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqgc2\" (UniqueName: \"kubernetes.io/projected/1bf056c0-a496-4499-92c7-3b1300b4a29d-kube-api-access-tqgc2\") pod \"swift-proxy-5c99967b8c-vjq4g\" (UID: \"1bf056c0-a496-4499-92c7-3b1300b4a29d\") " pod="openstack/swift-proxy-5c99967b8c-vjq4g" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.843104 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bf056c0-a496-4499-92c7-3b1300b4a29d-config-data\") pod \"swift-proxy-5c99967b8c-vjq4g\" (UID: \"1bf056c0-a496-4499-92c7-3b1300b4a29d\") " pod="openstack/swift-proxy-5c99967b8c-vjq4g" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.843141 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1bf056c0-a496-4499-92c7-3b1300b4a29d-etc-swift\") pod \"swift-proxy-5c99967b8c-vjq4g\" (UID: \"1bf056c0-a496-4499-92c7-3b1300b4a29d\") " pod="openstack/swift-proxy-5c99967b8c-vjq4g" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.843176 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bf056c0-a496-4499-92c7-3b1300b4a29d-combined-ca-bundle\") pod \"swift-proxy-5c99967b8c-vjq4g\" (UID: \"1bf056c0-a496-4499-92c7-3b1300b4a29d\") " pod="openstack/swift-proxy-5c99967b8c-vjq4g" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.843199 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bf056c0-a496-4499-92c7-3b1300b4a29d-public-tls-certs\") pod \"swift-proxy-5c99967b8c-vjq4g\" (UID: \"1bf056c0-a496-4499-92c7-3b1300b4a29d\") " pod="openstack/swift-proxy-5c99967b8c-vjq4g" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.843274 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bf056c0-a496-4499-92c7-3b1300b4a29d-internal-tls-certs\") pod \"swift-proxy-5c99967b8c-vjq4g\" (UID: \"1bf056c0-a496-4499-92c7-3b1300b4a29d\") " pod="openstack/swift-proxy-5c99967b8c-vjq4g" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.843293 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1bf056c0-a496-4499-92c7-3b1300b4a29d-run-httpd\") pod \"swift-proxy-5c99967b8c-vjq4g\" (UID: \"1bf056c0-a496-4499-92c7-3b1300b4a29d\") " pod="openstack/swift-proxy-5c99967b8c-vjq4g" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.843322 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1bf056c0-a496-4499-92c7-3b1300b4a29d-log-httpd\") pod \"swift-proxy-5c99967b8c-vjq4g\" (UID: \"1bf056c0-a496-4499-92c7-3b1300b4a29d\") " 
pod="openstack/swift-proxy-5c99967b8c-vjq4g" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.843944 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1bf056c0-a496-4499-92c7-3b1300b4a29d-run-httpd\") pod \"swift-proxy-5c99967b8c-vjq4g\" (UID: \"1bf056c0-a496-4499-92c7-3b1300b4a29d\") " pod="openstack/swift-proxy-5c99967b8c-vjq4g" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.844143 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1bf056c0-a496-4499-92c7-3b1300b4a29d-log-httpd\") pod \"swift-proxy-5c99967b8c-vjq4g\" (UID: \"1bf056c0-a496-4499-92c7-3b1300b4a29d\") " pod="openstack/swift-proxy-5c99967b8c-vjq4g" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.848394 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bf056c0-a496-4499-92c7-3b1300b4a29d-internal-tls-certs\") pod \"swift-proxy-5c99967b8c-vjq4g\" (UID: \"1bf056c0-a496-4499-92c7-3b1300b4a29d\") " pod="openstack/swift-proxy-5c99967b8c-vjq4g" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.848495 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bf056c0-a496-4499-92c7-3b1300b4a29d-combined-ca-bundle\") pod \"swift-proxy-5c99967b8c-vjq4g\" (UID: \"1bf056c0-a496-4499-92c7-3b1300b4a29d\") " pod="openstack/swift-proxy-5c99967b8c-vjq4g" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.852142 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bf056c0-a496-4499-92c7-3b1300b4a29d-public-tls-certs\") pod \"swift-proxy-5c99967b8c-vjq4g\" (UID: \"1bf056c0-a496-4499-92c7-3b1300b4a29d\") " pod="openstack/swift-proxy-5c99967b8c-vjq4g" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.853024 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bf056c0-a496-4499-92c7-3b1300b4a29d-config-data\") pod \"swift-proxy-5c99967b8c-vjq4g\" (UID: \"1bf056c0-a496-4499-92c7-3b1300b4a29d\") " pod="openstack/swift-proxy-5c99967b8c-vjq4g" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.861562 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1bf056c0-a496-4499-92c7-3b1300b4a29d-etc-swift\") pod \"swift-proxy-5c99967b8c-vjq4g\" (UID: \"1bf056c0-a496-4499-92c7-3b1300b4a29d\") " pod="openstack/swift-proxy-5c99967b8c-vjq4g" Dec 09 10:22:30 crc kubenswrapper[5002]: I1209 10:22:30.862457 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqgc2\" (UniqueName: \"kubernetes.io/projected/1bf056c0-a496-4499-92c7-3b1300b4a29d-kube-api-access-tqgc2\") pod \"swift-proxy-5c99967b8c-vjq4g\" (UID: \"1bf056c0-a496-4499-92c7-3b1300b4a29d\") " pod="openstack/swift-proxy-5c99967b8c-vjq4g" Dec 09 10:22:31 crc kubenswrapper[5002]: I1209 10:22:31.033586 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5c99967b8c-vjq4g" Dec 09 10:22:31 crc kubenswrapper[5002]: I1209 10:22:31.298026 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 09 10:22:31 crc kubenswrapper[5002]: I1209 10:22:31.619009 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="291d30af-cc1c-4fa3-9496-9695e50f0c6d" containerName="ceilometer-central-agent" containerID="cri-o://2b2f21b165e9b00eb03e02f6f6c0144e56f440afcbd50f22076c2a201b9d2b99" gracePeriod=30 Dec 09 10:22:31 crc kubenswrapper[5002]: I1209 10:22:31.619040 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="291d30af-cc1c-4fa3-9496-9695e50f0c6d" containerName="proxy-httpd" containerID="cri-o://033c0ec92f452a7bfeaaa22b0a7488d2d5cfdd1b66daf781e92b4dca936885a1" gracePeriod=30 Dec 09 10:22:31 crc kubenswrapper[5002]: I1209 10:22:31.619055 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="291d30af-cc1c-4fa3-9496-9695e50f0c6d" containerName="sg-core" containerID="cri-o://32f9dacd4134aa0f7de86368ebffb600dbe712eecaabcbb9ac009df587d26dbd" gracePeriod=30 Dec 09 10:22:31 crc kubenswrapper[5002]: I1209 10:22:31.619140 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="291d30af-cc1c-4fa3-9496-9695e50f0c6d" containerName="ceilometer-notification-agent" containerID="cri-o://3b876ca1c28b7a2bb5b33630fdff22c4cd075a5e2f05e18657738788deaa38d5" gracePeriod=30 Dec 09 10:22:32 crc kubenswrapper[5002]: I1209 10:22:32.631753 5002 generic.go:334] "Generic (PLEG): container finished" podID="291d30af-cc1c-4fa3-9496-9695e50f0c6d" containerID="033c0ec92f452a7bfeaaa22b0a7488d2d5cfdd1b66daf781e92b4dca936885a1" exitCode=0 Dec 09 10:22:32 crc kubenswrapper[5002]: I1209 10:22:32.632125 5002 generic.go:334] "Generic (PLEG): container finished" podID="291d30af-cc1c-4fa3-9496-9695e50f0c6d" containerID="32f9dacd4134aa0f7de86368ebffb600dbe712eecaabcbb9ac009df587d26dbd" exitCode=2 Dec 09 10:22:32 crc kubenswrapper[5002]: I1209 10:22:32.632140 5002 generic.go:334] "Generic (PLEG): container finished" podID="291d30af-cc1c-4fa3-9496-9695e50f0c6d" containerID="2b2f21b165e9b00eb03e02f6f6c0144e56f440afcbd50f22076c2a201b9d2b99" exitCode=0 Dec 09 10:22:32 crc kubenswrapper[5002]: I1209 10:22:32.631843 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"291d30af-cc1c-4fa3-9496-9695e50f0c6d","Type":"ContainerDied","Data":"033c0ec92f452a7bfeaaa22b0a7488d2d5cfdd1b66daf781e92b4dca936885a1"} Dec 09 10:22:32 crc kubenswrapper[5002]: I1209 10:22:32.632226 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"291d30af-cc1c-4fa3-9496-9695e50f0c6d","Type":"ContainerDied","Data":"32f9dacd4134aa0f7de86368ebffb600dbe712eecaabcbb9ac009df587d26dbd"} Dec 09 10:22:32 crc kubenswrapper[5002]: I1209 10:22:32.632243 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"291d30af-cc1c-4fa3-9496-9695e50f0c6d","Type":"ContainerDied","Data":"2b2f21b165e9b00eb03e02f6f6c0144e56f440afcbd50f22076c2a201b9d2b99"} Dec 09 10:22:32 crc kubenswrapper[5002]: I1209 10:22:32.635182 5002 generic.go:334] "Generic (PLEG): container finished" podID="20ebb6ea-f36b-440a-a437-ff39f9766fca" containerID="0dd06cfd3c38da8c60bac35260c461aa9a32defee6ab2c78a1bf7739889b67d1" exitCode=0 Dec 09 10:22:32 crc 
kubenswrapper[5002]: I1209 10:22:32.635216 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-4kmzk" event={"ID":"20ebb6ea-f36b-440a-a437-ff39f9766fca","Type":"ContainerDied","Data":"0dd06cfd3c38da8c60bac35260c461aa9a32defee6ab2c78a1bf7739889b67d1"} Dec 09 10:22:33 crc kubenswrapper[5002]: I1209 10:22:33.651517 5002 generic.go:334] "Generic (PLEG): container finished" podID="291d30af-cc1c-4fa3-9496-9695e50f0c6d" containerID="3b876ca1c28b7a2bb5b33630fdff22c4cd075a5e2f05e18657738788deaa38d5" exitCode=0 Dec 09 10:22:33 crc kubenswrapper[5002]: I1209 10:22:33.651748 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"291d30af-cc1c-4fa3-9496-9695e50f0c6d","Type":"ContainerDied","Data":"3b876ca1c28b7a2bb5b33630fdff22c4cd075a5e2f05e18657738788deaa38d5"} Dec 09 10:22:34 crc kubenswrapper[5002]: I1209 10:22:34.742715 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6b88b97456-wb29h" Dec 09 10:22:35 crc kubenswrapper[5002]: I1209 10:22:35.581489 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-fb7r8"] Dec 09 10:22:35 crc kubenswrapper[5002]: I1209 10:22:35.583003 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-fb7r8" Dec 09 10:22:35 crc kubenswrapper[5002]: I1209 10:22:35.593539 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-fb7r8"] Dec 09 10:22:35 crc kubenswrapper[5002]: I1209 10:22:35.647976 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q57qn\" (UniqueName: \"kubernetes.io/projected/73bcef93-39f3-4f68-b3f2-cc78b4698e3a-kube-api-access-q57qn\") pod \"nova-api-db-create-fb7r8\" (UID: \"73bcef93-39f3-4f68-b3f2-cc78b4698e3a\") " pod="openstack/nova-api-db-create-fb7r8" Dec 09 10:22:35 crc kubenswrapper[5002]: I1209 10:22:35.648131 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73bcef93-39f3-4f68-b3f2-cc78b4698e3a-operator-scripts\") pod \"nova-api-db-create-fb7r8\" (UID: \"73bcef93-39f3-4f68-b3f2-cc78b4698e3a\") " pod="openstack/nova-api-db-create-fb7r8" Dec 09 10:22:35 crc kubenswrapper[5002]: I1209 10:22:35.683476 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-kth45"] Dec 09 10:22:35 crc kubenswrapper[5002]: I1209 10:22:35.685631 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-kth45" Dec 09 10:22:35 crc kubenswrapper[5002]: I1209 10:22:35.705873 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-kth45"] Dec 09 10:22:35 crc kubenswrapper[5002]: I1209 10:22:35.750774 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxppl\" (UniqueName: \"kubernetes.io/projected/c5afe93e-c94d-4e57-987b-956d67b03621-kube-api-access-hxppl\") pod \"nova-cell0-db-create-kth45\" (UID: \"c5afe93e-c94d-4e57-987b-956d67b03621\") " pod="openstack/nova-cell0-db-create-kth45" Dec 09 10:22:35 crc kubenswrapper[5002]: I1209 10:22:35.750886 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73bcef93-39f3-4f68-b3f2-cc78b4698e3a-operator-scripts\") pod \"nova-api-db-create-fb7r8\" (UID: \"73bcef93-39f3-4f68-b3f2-cc78b4698e3a\") " pod="openstack/nova-api-db-create-fb7r8" Dec 09 10:22:35 crc kubenswrapper[5002]: I1209 10:22:35.750936 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q57qn\" (UniqueName: \"kubernetes.io/projected/73bcef93-39f3-4f68-b3f2-cc78b4698e3a-kube-api-access-q57qn\") pod \"nova-api-db-create-fb7r8\" (UID: \"73bcef93-39f3-4f68-b3f2-cc78b4698e3a\") " pod="openstack/nova-api-db-create-fb7r8" Dec 09 10:22:35 crc kubenswrapper[5002]: I1209 10:22:35.750970 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5afe93e-c94d-4e57-987b-956d67b03621-operator-scripts\") pod \"nova-cell0-db-create-kth45\" (UID: \"c5afe93e-c94d-4e57-987b-956d67b03621\") " pod="openstack/nova-cell0-db-create-kth45" Dec 09 10:22:35 crc kubenswrapper[5002]: I1209 10:22:35.751655 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73bcef93-39f3-4f68-b3f2-cc78b4698e3a-operator-scripts\") pod \"nova-api-db-create-fb7r8\" (UID: \"73bcef93-39f3-4f68-b3f2-cc78b4698e3a\") " pod="openstack/nova-api-db-create-fb7r8" Dec 09 10:22:35 crc kubenswrapper[5002]: I1209 10:22:35.781678 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q57qn\" (UniqueName: \"kubernetes.io/projected/73bcef93-39f3-4f68-b3f2-cc78b4698e3a-kube-api-access-q57qn\") pod \"nova-api-db-create-fb7r8\" (UID: \"73bcef93-39f3-4f68-b3f2-cc78b4698e3a\") " pod="openstack/nova-api-db-create-fb7r8" Dec 09 10:22:35 crc kubenswrapper[5002]: I1209 10:22:35.794455 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-5809-account-create-update-2sfhw"] Dec 09 10:22:35 crc kubenswrapper[5002]: I1209 10:22:35.795724 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-5809-account-create-update-2sfhw" Dec 09 10:22:35 crc kubenswrapper[5002]: I1209 10:22:35.800886 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 09 10:22:35 crc kubenswrapper[5002]: I1209 10:22:35.820911 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-5809-account-create-update-2sfhw"] Dec 09 10:22:35 crc kubenswrapper[5002]: I1209 10:22:35.852122 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxppl\" (UniqueName: \"kubernetes.io/projected/c5afe93e-c94d-4e57-987b-956d67b03621-kube-api-access-hxppl\") pod \"nova-cell0-db-create-kth45\" (UID: \"c5afe93e-c94d-4e57-987b-956d67b03621\") " pod="openstack/nova-cell0-db-create-kth45" Dec 09 10:22:35 crc kubenswrapper[5002]: I1209 10:22:35.852167 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lstcr\" (UniqueName: \"kubernetes.io/projected/955f29ee-6405-41b4-b905-3438ed1344fd-kube-api-access-lstcr\") pod \"nova-api-5809-account-create-update-2sfhw\" (UID: \"955f29ee-6405-41b4-b905-3438ed1344fd\") " pod="openstack/nova-api-5809-account-create-update-2sfhw" Dec 09 10:22:35 crc kubenswrapper[5002]: I1209 10:22:35.852262 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5afe93e-c94d-4e57-987b-956d67b03621-operator-scripts\") pod \"nova-cell0-db-create-kth45\" (UID: \"c5afe93e-c94d-4e57-987b-956d67b03621\") " pod="openstack/nova-cell0-db-create-kth45" Dec 09 10:22:35 crc kubenswrapper[5002]: I1209 10:22:35.852285 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/955f29ee-6405-41b4-b905-3438ed1344fd-operator-scripts\") pod \"nova-api-5809-account-create-update-2sfhw\" (UID: \"955f29ee-6405-41b4-b905-3438ed1344fd\") " pod="openstack/nova-api-5809-account-create-update-2sfhw" Dec 09 10:22:35 crc kubenswrapper[5002]: I1209 10:22:35.853241 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5afe93e-c94d-4e57-987b-956d67b03621-operator-scripts\") pod \"nova-cell0-db-create-kth45\" (UID: \"c5afe93e-c94d-4e57-987b-956d67b03621\") " pod="openstack/nova-cell0-db-create-kth45" Dec 09 10:22:35 crc kubenswrapper[5002]: I1209 10:22:35.878003 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxppl\" (UniqueName: \"kubernetes.io/projected/c5afe93e-c94d-4e57-987b-956d67b03621-kube-api-access-hxppl\") pod \"nova-cell0-db-create-kth45\" (UID: \"c5afe93e-c94d-4e57-987b-956d67b03621\") " pod="openstack/nova-cell0-db-create-kth45" Dec 09 10:22:35 crc kubenswrapper[5002]: I1209 10:22:35.898226 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-mvvcv"] Dec 09 10:22:35 crc kubenswrapper[5002]: I1209 10:22:35.901720 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-mvvcv" Dec 09 10:22:35 crc kubenswrapper[5002]: I1209 10:22:35.907085 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-mvvcv"] Dec 09 10:22:35 crc kubenswrapper[5002]: I1209 10:22:35.922843 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-fb7r8" Dec 09 10:22:35 crc kubenswrapper[5002]: I1209 10:22:35.954019 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lstcr\" (UniqueName: \"kubernetes.io/projected/955f29ee-6405-41b4-b905-3438ed1344fd-kube-api-access-lstcr\") pod \"nova-api-5809-account-create-update-2sfhw\" (UID: \"955f29ee-6405-41b4-b905-3438ed1344fd\") " pod="openstack/nova-api-5809-account-create-update-2sfhw" Dec 09 10:22:35 crc kubenswrapper[5002]: I1209 10:22:35.954077 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d71308aa-e2b5-4775-917e-1b100ff8969c-operator-scripts\") pod \"nova-cell1-db-create-mvvcv\" (UID: \"d71308aa-e2b5-4775-917e-1b100ff8969c\") " pod="openstack/nova-cell1-db-create-mvvcv" Dec 09 10:22:35 crc kubenswrapper[5002]: I1209 10:22:35.954159 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk4pd\" (UniqueName: \"kubernetes.io/projected/d71308aa-e2b5-4775-917e-1b100ff8969c-kube-api-access-xk4pd\") pod \"nova-cell1-db-create-mvvcv\" (UID: \"d71308aa-e2b5-4775-917e-1b100ff8969c\") " pod="openstack/nova-cell1-db-create-mvvcv" Dec 09 10:22:35 crc kubenswrapper[5002]: I1209 10:22:35.954200 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/955f29ee-6405-41b4-b905-3438ed1344fd-operator-scripts\") pod \"nova-api-5809-account-create-update-2sfhw\" (UID: \"955f29ee-6405-41b4-b905-3438ed1344fd\") " pod="openstack/nova-api-5809-account-create-update-2sfhw" Dec 09 10:22:35 crc kubenswrapper[5002]: I1209 10:22:35.954860 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/955f29ee-6405-41b4-b905-3438ed1344fd-operator-scripts\") pod \"nova-api-5809-account-create-update-2sfhw\" (UID: \"955f29ee-6405-41b4-b905-3438ed1344fd\") " pod="openstack/nova-api-5809-account-create-update-2sfhw" Dec 09 10:22:35 crc kubenswrapper[5002]: I1209 10:22:35.975231 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lstcr\" (UniqueName: \"kubernetes.io/projected/955f29ee-6405-41b4-b905-3438ed1344fd-kube-api-access-lstcr\") pod \"nova-api-5809-account-create-update-2sfhw\" (UID: \"955f29ee-6405-41b4-b905-3438ed1344fd\") " pod="openstack/nova-api-5809-account-create-update-2sfhw" Dec 09 10:22:35 crc kubenswrapper[5002]: I1209 10:22:35.989699 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-539b-account-create-update-d4qnv"] Dec 09 10:22:35 crc kubenswrapper[5002]: I1209 10:22:35.990840 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-539b-account-create-update-d4qnv" Dec 09 10:22:35 crc kubenswrapper[5002]: I1209 10:22:35.993752 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.004849 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-kth45" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.004868 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-539b-account-create-update-d4qnv"] Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.055943 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d71308aa-e2b5-4775-917e-1b100ff8969c-operator-scripts\") pod \"nova-cell1-db-create-mvvcv\" (UID: \"d71308aa-e2b5-4775-917e-1b100ff8969c\") " pod="openstack/nova-cell1-db-create-mvvcv" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.056007 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22nb2\" (UniqueName: \"kubernetes.io/projected/f12e361b-e5e4-4c7c-8c4f-fe266937ffda-kube-api-access-22nb2\") pod \"nova-cell0-539b-account-create-update-d4qnv\" (UID: \"f12e361b-e5e4-4c7c-8c4f-fe266937ffda\") " pod="openstack/nova-cell0-539b-account-create-update-d4qnv" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.056057 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f12e361b-e5e4-4c7c-8c4f-fe266937ffda-operator-scripts\") pod \"nova-cell0-539b-account-create-update-d4qnv\" (UID: \"f12e361b-e5e4-4c7c-8c4f-fe266937ffda\") " pod="openstack/nova-cell0-539b-account-create-update-d4qnv" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.056098 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xk4pd\" (UniqueName: \"kubernetes.io/projected/d71308aa-e2b5-4775-917e-1b100ff8969c-kube-api-access-xk4pd\") pod \"nova-cell1-db-create-mvvcv\" (UID: \"d71308aa-e2b5-4775-917e-1b100ff8969c\") " pod="openstack/nova-cell1-db-create-mvvcv" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.065893 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d71308aa-e2b5-4775-917e-1b100ff8969c-operator-scripts\") pod \"nova-cell1-db-create-mvvcv\" (UID: \"d71308aa-e2b5-4775-917e-1b100ff8969c\") " pod="openstack/nova-cell1-db-create-mvvcv" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.090496 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk4pd\" (UniqueName: \"kubernetes.io/projected/d71308aa-e2b5-4775-917e-1b100ff8969c-kube-api-access-xk4pd\") pod \"nova-cell1-db-create-mvvcv\" (UID: \"d71308aa-e2b5-4775-917e-1b100ff8969c\") " pod="openstack/nova-cell1-db-create-mvvcv" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.144025 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-5809-account-create-update-2sfhw" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.158110 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f12e361b-e5e4-4c7c-8c4f-fe266937ffda-operator-scripts\") pod \"nova-cell0-539b-account-create-update-d4qnv\" (UID: \"f12e361b-e5e4-4c7c-8c4f-fe266937ffda\") " pod="openstack/nova-cell0-539b-account-create-update-d4qnv" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.158269 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22nb2\" (UniqueName: \"kubernetes.io/projected/f12e361b-e5e4-4c7c-8c4f-fe266937ffda-kube-api-access-22nb2\") pod \"nova-cell0-539b-account-create-update-d4qnv\" (UID: \"f12e361b-e5e4-4c7c-8c4f-fe266937ffda\") " pod="openstack/nova-cell0-539b-account-create-update-d4qnv" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.162920 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f12e361b-e5e4-4c7c-8c4f-fe266937ffda-operator-scripts\") pod \"nova-cell0-539b-account-create-update-d4qnv\" (UID: \"f12e361b-e5e4-4c7c-8c4f-fe266937ffda\") " pod="openstack/nova-cell0-539b-account-create-update-d4qnv" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.183367 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22nb2\" (UniqueName: \"kubernetes.io/projected/f12e361b-e5e4-4c7c-8c4f-fe266937ffda-kube-api-access-22nb2\") pod \"nova-cell0-539b-account-create-update-d4qnv\" (UID: \"f12e361b-e5e4-4c7c-8c4f-fe266937ffda\") " pod="openstack/nova-cell0-539b-account-create-update-d4qnv" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.202879 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-05d2-account-create-update-2lrlj"] Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.204870 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-05d2-account-create-update-2lrlj" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.206917 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.234295 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-05d2-account-create-update-2lrlj"] Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.236210 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-mvvcv" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.269910 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6f5f4d8-2b06-4eb4-92d5-313b5fdfeab6-operator-scripts\") pod \"nova-cell1-05d2-account-create-update-2lrlj\" (UID: \"f6f5f4d8-2b06-4eb4-92d5-313b5fdfeab6\") " pod="openstack/nova-cell1-05d2-account-create-update-2lrlj" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.270001 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqxn9\" (UniqueName: \"kubernetes.io/projected/f6f5f4d8-2b06-4eb4-92d5-313b5fdfeab6-kube-api-access-lqxn9\") pod \"nova-cell1-05d2-account-create-update-2lrlj\" (UID: \"f6f5f4d8-2b06-4eb4-92d5-313b5fdfeab6\") " pod="openstack/nova-cell1-05d2-account-create-update-2lrlj" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.296572 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-4kmzk" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.308373 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-539b-account-create-update-d4qnv" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.370909 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/20ebb6ea-f36b-440a-a437-ff39f9766fca-etc-machine-id\") pod \"20ebb6ea-f36b-440a-a437-ff39f9766fca\" (UID: \"20ebb6ea-f36b-440a-a437-ff39f9766fca\") " Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.370987 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20ebb6ea-f36b-440a-a437-ff39f9766fca-combined-ca-bundle\") pod \"20ebb6ea-f36b-440a-a437-ff39f9766fca\" (UID: \"20ebb6ea-f36b-440a-a437-ff39f9766fca\") " Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.371217 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20ebb6ea-f36b-440a-a437-ff39f9766fca-scripts\") pod \"20ebb6ea-f36b-440a-a437-ff39f9766fca\" (UID: \"20ebb6ea-f36b-440a-a437-ff39f9766fca\") " Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.371284 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hf8zz\" (UniqueName: \"kubernetes.io/projected/20ebb6ea-f36b-440a-a437-ff39f9766fca-kube-api-access-hf8zz\") pod \"20ebb6ea-f36b-440a-a437-ff39f9766fca\" (UID: \"20ebb6ea-f36b-440a-a437-ff39f9766fca\") " Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.371324 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/20ebb6ea-f36b-440a-a437-ff39f9766fca-db-sync-config-data\") pod \"20ebb6ea-f36b-440a-a437-ff39f9766fca\" (UID: \"20ebb6ea-f36b-440a-a437-ff39f9766fca\") " Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.371356 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20ebb6ea-f36b-440a-a437-ff39f9766fca-config-data\") pod \"20ebb6ea-f36b-440a-a437-ff39f9766fca\" (UID: \"20ebb6ea-f36b-440a-a437-ff39f9766fca\") " Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.371542 5002 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6f5f4d8-2b06-4eb4-92d5-313b5fdfeab6-operator-scripts\") pod \"nova-cell1-05d2-account-create-update-2lrlj\" (UID: \"f6f5f4d8-2b06-4eb4-92d5-313b5fdfeab6\") " pod="openstack/nova-cell1-05d2-account-create-update-2lrlj" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.371603 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqxn9\" (UniqueName: \"kubernetes.io/projected/f6f5f4d8-2b06-4eb4-92d5-313b5fdfeab6-kube-api-access-lqxn9\") pod \"nova-cell1-05d2-account-create-update-2lrlj\" (UID: \"f6f5f4d8-2b06-4eb4-92d5-313b5fdfeab6\") " pod="openstack/nova-cell1-05d2-account-create-update-2lrlj" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.376793 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20ebb6ea-f36b-440a-a437-ff39f9766fca-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "20ebb6ea-f36b-440a-a437-ff39f9766fca" (UID: "20ebb6ea-f36b-440a-a437-ff39f9766fca"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.392003 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20ebb6ea-f36b-440a-a437-ff39f9766fca-scripts" (OuterVolumeSpecName: "scripts") pod "20ebb6ea-f36b-440a-a437-ff39f9766fca" (UID: "20ebb6ea-f36b-440a-a437-ff39f9766fca"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.392781 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6f5f4d8-2b06-4eb4-92d5-313b5fdfeab6-operator-scripts\") pod \"nova-cell1-05d2-account-create-update-2lrlj\" (UID: \"f6f5f4d8-2b06-4eb4-92d5-313b5fdfeab6\") " pod="openstack/nova-cell1-05d2-account-create-update-2lrlj" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.393253 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20ebb6ea-f36b-440a-a437-ff39f9766fca-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "20ebb6ea-f36b-440a-a437-ff39f9766fca" (UID: "20ebb6ea-f36b-440a-a437-ff39f9766fca"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.416015 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20ebb6ea-f36b-440a-a437-ff39f9766fca-kube-api-access-hf8zz" (OuterVolumeSpecName: "kube-api-access-hf8zz") pod "20ebb6ea-f36b-440a-a437-ff39f9766fca" (UID: "20ebb6ea-f36b-440a-a437-ff39f9766fca"). InnerVolumeSpecName "kube-api-access-hf8zz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.425637 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqxn9\" (UniqueName: \"kubernetes.io/projected/f6f5f4d8-2b06-4eb4-92d5-313b5fdfeab6-kube-api-access-lqxn9\") pod \"nova-cell1-05d2-account-create-update-2lrlj\" (UID: \"f6f5f4d8-2b06-4eb4-92d5-313b5fdfeab6\") " pod="openstack/nova-cell1-05d2-account-create-update-2lrlj" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.459038 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20ebb6ea-f36b-440a-a437-ff39f9766fca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "20ebb6ea-f36b-440a-a437-ff39f9766fca" (UID: "20ebb6ea-f36b-440a-a437-ff39f9766fca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.473124 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20ebb6ea-f36b-440a-a437-ff39f9766fca-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.473349 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hf8zz\" (UniqueName: \"kubernetes.io/projected/20ebb6ea-f36b-440a-a437-ff39f9766fca-kube-api-access-hf8zz\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.473419 5002 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/20ebb6ea-f36b-440a-a437-ff39f9766fca-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.473477 5002 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/20ebb6ea-f36b-440a-a437-ff39f9766fca-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.473533 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20ebb6ea-f36b-440a-a437-ff39f9766fca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.518128 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20ebb6ea-f36b-440a-a437-ff39f9766fca-config-data" (OuterVolumeSpecName: "config-data") pod "20ebb6ea-f36b-440a-a437-ff39f9766fca" (UID: "20ebb6ea-f36b-440a-a437-ff39f9766fca"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.585747 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20ebb6ea-f36b-440a-a437-ff39f9766fca-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.634703 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.696947 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/291d30af-cc1c-4fa3-9496-9695e50f0c6d-scripts\") pod \"291d30af-cc1c-4fa3-9496-9695e50f0c6d\" (UID: \"291d30af-cc1c-4fa3-9496-9695e50f0c6d\") " Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.697266 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvsrm\" (UniqueName: \"kubernetes.io/projected/291d30af-cc1c-4fa3-9496-9695e50f0c6d-kube-api-access-mvsrm\") pod \"291d30af-cc1c-4fa3-9496-9695e50f0c6d\" (UID: \"291d30af-cc1c-4fa3-9496-9695e50f0c6d\") " Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.697302 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/291d30af-cc1c-4fa3-9496-9695e50f0c6d-config-data\") pod \"291d30af-cc1c-4fa3-9496-9695e50f0c6d\" (UID: \"291d30af-cc1c-4fa3-9496-9695e50f0c6d\") " Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.697335 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/291d30af-cc1c-4fa3-9496-9695e50f0c6d-log-httpd\") pod \"291d30af-cc1c-4fa3-9496-9695e50f0c6d\" (UID: \"291d30af-cc1c-4fa3-9496-9695e50f0c6d\") " Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.697364 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/291d30af-cc1c-4fa3-9496-9695e50f0c6d-sg-core-conf-yaml\") pod \"291d30af-cc1c-4fa3-9496-9695e50f0c6d\" (UID: \"291d30af-cc1c-4fa3-9496-9695e50f0c6d\") " Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.697411 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/291d30af-cc1c-4fa3-9496-9695e50f0c6d-combined-ca-bundle\") pod \"291d30af-cc1c-4fa3-9496-9695e50f0c6d\" (UID: \"291d30af-cc1c-4fa3-9496-9695e50f0c6d\") " Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.697487 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/291d30af-cc1c-4fa3-9496-9695e50f0c6d-run-httpd\") pod \"291d30af-cc1c-4fa3-9496-9695e50f0c6d\" (UID: \"291d30af-cc1c-4fa3-9496-9695e50f0c6d\") " Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.705240 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/291d30af-cc1c-4fa3-9496-9695e50f0c6d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "291d30af-cc1c-4fa3-9496-9695e50f0c6d" (UID: "291d30af-cc1c-4fa3-9496-9695e50f0c6d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.705500 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/291d30af-cc1c-4fa3-9496-9695e50f0c6d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "291d30af-cc1c-4fa3-9496-9695e50f0c6d" (UID: "291d30af-cc1c-4fa3-9496-9695e50f0c6d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.709144 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-05d2-account-create-update-2lrlj" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.724573 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6b88b97456-wb29h" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.739624 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/291d30af-cc1c-4fa3-9496-9695e50f0c6d-kube-api-access-mvsrm" (OuterVolumeSpecName: "kube-api-access-mvsrm") pod "291d30af-cc1c-4fa3-9496-9695e50f0c6d" (UID: "291d30af-cc1c-4fa3-9496-9695e50f0c6d"). InnerVolumeSpecName "kube-api-access-mvsrm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.739778 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/291d30af-cc1c-4fa3-9496-9695e50f0c6d-scripts" (OuterVolumeSpecName: "scripts") pod "291d30af-cc1c-4fa3-9496-9695e50f0c6d" (UID: "291d30af-cc1c-4fa3-9496-9695e50f0c6d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.802255 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/291d30af-cc1c-4fa3-9496-9695e50f0c6d-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.802283 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvsrm\" (UniqueName: \"kubernetes.io/projected/291d30af-cc1c-4fa3-9496-9695e50f0c6d-kube-api-access-mvsrm\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.802295 5002 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/291d30af-cc1c-4fa3-9496-9695e50f0c6d-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.802306 5002 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/291d30af-cc1c-4fa3-9496-9695e50f0c6d-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.806099 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-4kmzk" event={"ID":"20ebb6ea-f36b-440a-a437-ff39f9766fca","Type":"ContainerDied","Data":"589aa57ae28bb3cbe2d4e332bed90a95cc6c91e2463510caa923a4364e2e4192"} Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.806136 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="589aa57ae28bb3cbe2d4e332bed90a95cc6c91e2463510caa923a4364e2e4192" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.806207 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-4kmzk" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.852433 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"291d30af-cc1c-4fa3-9496-9695e50f0c6d","Type":"ContainerDied","Data":"fe819e8b4ac5ab36a4e3321f6ac6e5980c36812e86424f20003abaefcd42d0c9"} Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.852497 5002 scope.go:117] "RemoveContainer" containerID="033c0ec92f452a7bfeaaa22b0a7488d2d5cfdd1b66daf781e92b4dca936885a1" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.852681 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.909904 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/291d30af-cc1c-4fa3-9496-9695e50f0c6d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "291d30af-cc1c-4fa3-9496-9695e50f0c6d" (UID: "291d30af-cc1c-4fa3-9496-9695e50f0c6d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:22:36 crc kubenswrapper[5002]: I1209 10:22:36.938965 5002 scope.go:117] "RemoveContainer" containerID="32f9dacd4134aa0f7de86368ebffb600dbe712eecaabcbb9ac009df587d26dbd" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.006372 5002 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/291d30af-cc1c-4fa3-9496-9695e50f0c6d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.006438 5002 scope.go:117] "RemoveContainer" containerID="3b876ca1c28b7a2bb5b33630fdff22c4cd075a5e2f05e18657738788deaa38d5" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.073467 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/291d30af-cc1c-4fa3-9496-9695e50f0c6d-config-data" (OuterVolumeSpecName: "config-data") pod "291d30af-cc1c-4fa3-9496-9695e50f0c6d" (UID: "291d30af-cc1c-4fa3-9496-9695e50f0c6d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.077483 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/291d30af-cc1c-4fa3-9496-9695e50f0c6d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "291d30af-cc1c-4fa3-9496-9695e50f0c6d" (UID: "291d30af-cc1c-4fa3-9496-9695e50f0c6d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.109661 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/291d30af-cc1c-4fa3-9496-9695e50f0c6d-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.109690 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/291d30af-cc1c-4fa3-9496-9695e50f0c6d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.166264 5002 scope.go:117] "RemoveContainer" containerID="2b2f21b165e9b00eb03e02f6f6c0144e56f440afcbd50f22076c2a201b9d2b99" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.169233 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-fb7r8"] Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.264984 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-586bdc5f9-ld5mf" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.298910 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.299168 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="80f62273-a2cb-48fc-9dc4-e3bbe09bb517" containerName="glance-log" containerID="cri-o://8184adac7ec070e2ec700e89b74a2520d43784f0369369c0ef173edb3c57e1cf" gracePeriod=30 Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.299221 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="80f62273-a2cb-48fc-9dc4-e3bbe09bb517" containerName="glance-httpd" containerID="cri-o://5ca2ef47d14df1b7854d3d7794e4e04fdddf2bb4f558c18e3a96da0d4a26ce8c" gracePeriod=30 Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.305198 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="80f62273-a2cb-48fc-9dc4-e3bbe09bb517" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.150:9292/healthcheck\": EOF" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.305332 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/glance-default-external-api-0" podUID="80f62273-a2cb-48fc-9dc4-e3bbe09bb517" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.150:9292/healthcheck\": EOF" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.310583 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/glance-default-external-api-0" podUID="80f62273-a2cb-48fc-9dc4-e3bbe09bb517" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.150:9292/healthcheck\": EOF" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.415302 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-jw5tg"] Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.415565 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-jw5tg" podUID="bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62" containerName="dnsmasq-dns" containerID="cri-o://06b2d93793641692f807caaff4a3e07155c43cb5b30ab6012f124dd348926e3b" gracePeriod=10 Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.612228 5002 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 09 10:22:37 crc kubenswrapper[5002]: E1209 10:22:37.612849 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="291d30af-cc1c-4fa3-9496-9695e50f0c6d" containerName="ceilometer-central-agent" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.612860 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="291d30af-cc1c-4fa3-9496-9695e50f0c6d" containerName="ceilometer-central-agent" Dec 09 10:22:37 crc kubenswrapper[5002]: E1209 10:22:37.612870 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="291d30af-cc1c-4fa3-9496-9695e50f0c6d" containerName="ceilometer-notification-agent" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.612876 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="291d30af-cc1c-4fa3-9496-9695e50f0c6d" containerName="ceilometer-notification-agent" Dec 09 10:22:37 crc kubenswrapper[5002]: E1209 10:22:37.612884 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="291d30af-cc1c-4fa3-9496-9695e50f0c6d" containerName="proxy-httpd" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.612890 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="291d30af-cc1c-4fa3-9496-9695e50f0c6d" containerName="proxy-httpd" Dec 09 10:22:37 crc kubenswrapper[5002]: E1209 10:22:37.612925 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20ebb6ea-f36b-440a-a437-ff39f9766fca" containerName="cinder-db-sync" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.612930 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="20ebb6ea-f36b-440a-a437-ff39f9766fca" containerName="cinder-db-sync" Dec 09 10:22:37 crc kubenswrapper[5002]: E1209 10:22:37.612940 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="291d30af-cc1c-4fa3-9496-9695e50f0c6d" containerName="sg-core" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.612945 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="291d30af-cc1c-4fa3-9496-9695e50f0c6d" containerName="sg-core" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.613097 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="291d30af-cc1c-4fa3-9496-9695e50f0c6d" containerName="sg-core" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.613111 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="291d30af-cc1c-4fa3-9496-9695e50f0c6d" containerName="ceilometer-central-agent" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.613123 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="291d30af-cc1c-4fa3-9496-9695e50f0c6d" containerName="ceilometer-notification-agent" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.613135 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="20ebb6ea-f36b-440a-a437-ff39f9766fca" containerName="cinder-db-sync" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.613147 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="291d30af-cc1c-4fa3-9496-9695e50f0c6d" containerName="proxy-httpd" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.614040 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.616983 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.617677 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.617996 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-xsqqp" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.618106 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.622150 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89r8f\" (UniqueName: \"kubernetes.io/projected/58839cfb-488d-4d08-b077-bf23cad0fedb-kube-api-access-89r8f\") pod \"cinder-scheduler-0\" (UID: \"58839cfb-488d-4d08-b077-bf23cad0fedb\") " pod="openstack/cinder-scheduler-0" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.622186 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/58839cfb-488d-4d08-b077-bf23cad0fedb-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"58839cfb-488d-4d08-b077-bf23cad0fedb\") " pod="openstack/cinder-scheduler-0" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.622204 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58839cfb-488d-4d08-b077-bf23cad0fedb-config-data\") pod \"cinder-scheduler-0\" (UID: \"58839cfb-488d-4d08-b077-bf23cad0fedb\") " pod="openstack/cinder-scheduler-0" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.622225 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58839cfb-488d-4d08-b077-bf23cad0fedb-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"58839cfb-488d-4d08-b077-bf23cad0fedb\") " pod="openstack/cinder-scheduler-0" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.622315 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58839cfb-488d-4d08-b077-bf23cad0fedb-scripts\") pod \"cinder-scheduler-0\" (UID: \"58839cfb-488d-4d08-b077-bf23cad0fedb\") " pod="openstack/cinder-scheduler-0" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.622360 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58839cfb-488d-4d08-b077-bf23cad0fedb-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"58839cfb-488d-4d08-b077-bf23cad0fedb\") " pod="openstack/cinder-scheduler-0" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.651312 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.675216 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-795f4db4bc-rlkkn"] Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.681964 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-795f4db4bc-rlkkn" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.716487 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-795f4db4bc-rlkkn"] Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.730067 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58839cfb-488d-4d08-b077-bf23cad0fedb-scripts\") pod \"cinder-scheduler-0\" (UID: \"58839cfb-488d-4d08-b077-bf23cad0fedb\") " pod="openstack/cinder-scheduler-0" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.730130 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6-config\") pod \"dnsmasq-dns-795f4db4bc-rlkkn\" (UID: \"bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6\") " pod="openstack/dnsmasq-dns-795f4db4bc-rlkkn" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.730172 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58839cfb-488d-4d08-b077-bf23cad0fedb-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"58839cfb-488d-4d08-b077-bf23cad0fedb\") " pod="openstack/cinder-scheduler-0" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.730244 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6-ovsdbserver-sb\") pod \"dnsmasq-dns-795f4db4bc-rlkkn\" (UID: \"bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6\") " pod="openstack/dnsmasq-dns-795f4db4bc-rlkkn" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.730299 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89r8f\" (UniqueName: \"kubernetes.io/projected/58839cfb-488d-4d08-b077-bf23cad0fedb-kube-api-access-89r8f\") pod \"cinder-scheduler-0\" (UID: \"58839cfb-488d-4d08-b077-bf23cad0fedb\") " pod="openstack/cinder-scheduler-0" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.730322 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/58839cfb-488d-4d08-b077-bf23cad0fedb-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"58839cfb-488d-4d08-b077-bf23cad0fedb\") " pod="openstack/cinder-scheduler-0" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.730350 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6-ovsdbserver-nb\") pod \"dnsmasq-dns-795f4db4bc-rlkkn\" (UID: \"bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6\") " pod="openstack/dnsmasq-dns-795f4db4bc-rlkkn" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.730373 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58839cfb-488d-4d08-b077-bf23cad0fedb-config-data\") pod \"cinder-scheduler-0\" (UID: \"58839cfb-488d-4d08-b077-bf23cad0fedb\") " pod="openstack/cinder-scheduler-0" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.730401 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58839cfb-488d-4d08-b077-bf23cad0fedb-combined-ca-bundle\") pod 
\"cinder-scheduler-0\" (UID: \"58839cfb-488d-4d08-b077-bf23cad0fedb\") " pod="openstack/cinder-scheduler-0" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.730444 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5rv4\" (UniqueName: \"kubernetes.io/projected/bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6-kube-api-access-n5rv4\") pod \"dnsmasq-dns-795f4db4bc-rlkkn\" (UID: \"bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6\") " pod="openstack/dnsmasq-dns-795f4db4bc-rlkkn" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.730506 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6-dns-swift-storage-0\") pod \"dnsmasq-dns-795f4db4bc-rlkkn\" (UID: \"bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6\") " pod="openstack/dnsmasq-dns-795f4db4bc-rlkkn" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.730530 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6-dns-svc\") pod \"dnsmasq-dns-795f4db4bc-rlkkn\" (UID: \"bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6\") " pod="openstack/dnsmasq-dns-795f4db4bc-rlkkn" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.733055 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/58839cfb-488d-4d08-b077-bf23cad0fedb-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"58839cfb-488d-4d08-b077-bf23cad0fedb\") " pod="openstack/cinder-scheduler-0" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.743698 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58839cfb-488d-4d08-b077-bf23cad0fedb-config-data\") pod \"cinder-scheduler-0\" (UID: \"58839cfb-488d-4d08-b077-bf23cad0fedb\") " pod="openstack/cinder-scheduler-0" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.744477 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58839cfb-488d-4d08-b077-bf23cad0fedb-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"58839cfb-488d-4d08-b077-bf23cad0fedb\") " pod="openstack/cinder-scheduler-0" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.745332 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58839cfb-488d-4d08-b077-bf23cad0fedb-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"58839cfb-488d-4d08-b077-bf23cad0fedb\") " pod="openstack/cinder-scheduler-0" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.770319 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58839cfb-488d-4d08-b077-bf23cad0fedb-scripts\") pod \"cinder-scheduler-0\" (UID: \"58839cfb-488d-4d08-b077-bf23cad0fedb\") " pod="openstack/cinder-scheduler-0" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.773662 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89r8f\" (UniqueName: \"kubernetes.io/projected/58839cfb-488d-4d08-b077-bf23cad0fedb-kube-api-access-89r8f\") pod \"cinder-scheduler-0\" (UID: \"58839cfb-488d-4d08-b077-bf23cad0fedb\") " pod="openstack/cinder-scheduler-0" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 
10:22:37.826036 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.827569 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 09 10:22:37 crc kubenswrapper[5002]: I1209 10:22:37.831704 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:37.832493 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a200aca9-aa0e-44c1-a2da-bab4719bf8b0-config-data-custom\") pod \"cinder-api-0\" (UID: \"a200aca9-aa0e-44c1-a2da-bab4719bf8b0\") " pod="openstack/cinder-api-0" Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:37.832528 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6-dns-swift-storage-0\") pod \"dnsmasq-dns-795f4db4bc-rlkkn\" (UID: \"bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6\") " pod="openstack/dnsmasq-dns-795f4db4bc-rlkkn" Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:37.832559 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6-dns-svc\") pod \"dnsmasq-dns-795f4db4bc-rlkkn\" (UID: \"bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6\") " pod="openstack/dnsmasq-dns-795f4db4bc-rlkkn" Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:37.832603 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a200aca9-aa0e-44c1-a2da-bab4719bf8b0-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a200aca9-aa0e-44c1-a2da-bab4719bf8b0\") " pod="openstack/cinder-api-0" Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:37.832643 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6-config\") pod \"dnsmasq-dns-795f4db4bc-rlkkn\" (UID: \"bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6\") " pod="openstack/dnsmasq-dns-795f4db4bc-rlkkn" Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:37.832670 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a200aca9-aa0e-44c1-a2da-bab4719bf8b0-scripts\") pod \"cinder-api-0\" (UID: \"a200aca9-aa0e-44c1-a2da-bab4719bf8b0\") " pod="openstack/cinder-api-0" Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:37.832706 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2tbc\" (UniqueName: \"kubernetes.io/projected/a200aca9-aa0e-44c1-a2da-bab4719bf8b0-kube-api-access-b2tbc\") pod \"cinder-api-0\" (UID: \"a200aca9-aa0e-44c1-a2da-bab4719bf8b0\") " pod="openstack/cinder-api-0" Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:37.832728 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6-ovsdbserver-sb\") pod \"dnsmasq-dns-795f4db4bc-rlkkn\" (UID: \"bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6\") " pod="openstack/dnsmasq-dns-795f4db4bc-rlkkn" Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:37.832754 5002 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a200aca9-aa0e-44c1-a2da-bab4719bf8b0-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a200aca9-aa0e-44c1-a2da-bab4719bf8b0\") " pod="openstack/cinder-api-0" Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:37.832779 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a200aca9-aa0e-44c1-a2da-bab4719bf8b0-config-data\") pod \"cinder-api-0\" (UID: \"a200aca9-aa0e-44c1-a2da-bab4719bf8b0\") " pod="openstack/cinder-api-0" Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:37.832828 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6-ovsdbserver-nb\") pod \"dnsmasq-dns-795f4db4bc-rlkkn\" (UID: \"bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6\") " pod="openstack/dnsmasq-dns-795f4db4bc-rlkkn" Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:37.832882 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5rv4\" (UniqueName: \"kubernetes.io/projected/bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6-kube-api-access-n5rv4\") pod \"dnsmasq-dns-795f4db4bc-rlkkn\" (UID: \"bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6\") " pod="openstack/dnsmasq-dns-795f4db4bc-rlkkn" Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:37.832908 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a200aca9-aa0e-44c1-a2da-bab4719bf8b0-logs\") pod \"cinder-api-0\" (UID: \"a200aca9-aa0e-44c1-a2da-bab4719bf8b0\") " pod="openstack/cinder-api-0" Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:37.833991 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6-dns-swift-storage-0\") pod \"dnsmasq-dns-795f4db4bc-rlkkn\" (UID: \"bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6\") " pod="openstack/dnsmasq-dns-795f4db4bc-rlkkn" Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:37.834725 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6-dns-svc\") pod \"dnsmasq-dns-795f4db4bc-rlkkn\" (UID: \"bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6\") " pod="openstack/dnsmasq-dns-795f4db4bc-rlkkn" Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:37.835873 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6-config\") pod \"dnsmasq-dns-795f4db4bc-rlkkn\" (UID: \"bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6\") " pod="openstack/dnsmasq-dns-795f4db4bc-rlkkn" Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:37.836624 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6-ovsdbserver-sb\") pod \"dnsmasq-dns-795f4db4bc-rlkkn\" (UID: \"bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6\") " pod="openstack/dnsmasq-dns-795f4db4bc-rlkkn" Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:37.837360 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6-ovsdbserver-nb\") pod \"dnsmasq-dns-795f4db4bc-rlkkn\" (UID: \"bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6\") " pod="openstack/dnsmasq-dns-795f4db4bc-rlkkn" Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:37.843757 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:37.891670 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5rv4\" (UniqueName: \"kubernetes.io/projected/bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6-kube-api-access-n5rv4\") pod \"dnsmasq-dns-795f4db4bc-rlkkn\" (UID: \"bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6\") " pod="openstack/dnsmasq-dns-795f4db4bc-rlkkn" Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:37.894847 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-fb7r8" event={"ID":"73bcef93-39f3-4f68-b3f2-cc78b4698e3a","Type":"ContainerStarted","Data":"b065f6405cdb49ab94b987f7efe04e86e65995435ebf64239edb19601ac81bd5"} Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:37.934225 5002 generic.go:334] "Generic (PLEG): container finished" podID="80f62273-a2cb-48fc-9dc4-e3bbe09bb517" containerID="8184adac7ec070e2ec700e89b74a2520d43784f0369369c0ef173edb3c57e1cf" exitCode=143 Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:37.934285 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"80f62273-a2cb-48fc-9dc4-e3bbe09bb517","Type":"ContainerDied","Data":"8184adac7ec070e2ec700e89b74a2520d43784f0369369c0ef173edb3c57e1cf"} Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:37.935211 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a200aca9-aa0e-44c1-a2da-bab4719bf8b0-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a200aca9-aa0e-44c1-a2da-bab4719bf8b0\") " pod="openstack/cinder-api-0" Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:37.935274 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a200aca9-aa0e-44c1-a2da-bab4719bf8b0-scripts\") pod \"cinder-api-0\" (UID: \"a200aca9-aa0e-44c1-a2da-bab4719bf8b0\") " pod="openstack/cinder-api-0" Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:37.935325 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2tbc\" (UniqueName: \"kubernetes.io/projected/a200aca9-aa0e-44c1-a2da-bab4719bf8b0-kube-api-access-b2tbc\") pod \"cinder-api-0\" (UID: \"a200aca9-aa0e-44c1-a2da-bab4719bf8b0\") " pod="openstack/cinder-api-0" Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:37.935357 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a200aca9-aa0e-44c1-a2da-bab4719bf8b0-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a200aca9-aa0e-44c1-a2da-bab4719bf8b0\") " pod="openstack/cinder-api-0" Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:37.935386 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a200aca9-aa0e-44c1-a2da-bab4719bf8b0-config-data\") pod \"cinder-api-0\" (UID: \"a200aca9-aa0e-44c1-a2da-bab4719bf8b0\") " pod="openstack/cinder-api-0" Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:37.935457 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a200aca9-aa0e-44c1-a2da-bab4719bf8b0-logs\") pod \"cinder-api-0\" (UID: \"a200aca9-aa0e-44c1-a2da-bab4719bf8b0\") " pod="openstack/cinder-api-0" Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:37.935525 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a200aca9-aa0e-44c1-a2da-bab4719bf8b0-config-data-custom\") pod \"cinder-api-0\" (UID: \"a200aca9-aa0e-44c1-a2da-bab4719bf8b0\") " pod="openstack/cinder-api-0" Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:37.936981 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a200aca9-aa0e-44c1-a2da-bab4719bf8b0-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a200aca9-aa0e-44c1-a2da-bab4719bf8b0\") " pod="openstack/cinder-api-0" Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:37.937459 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a200aca9-aa0e-44c1-a2da-bab4719bf8b0-logs\") pod \"cinder-api-0\" (UID: \"a200aca9-aa0e-44c1-a2da-bab4719bf8b0\") " pod="openstack/cinder-api-0" Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:37.941571 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a200aca9-aa0e-44c1-a2da-bab4719bf8b0-config-data-custom\") pod \"cinder-api-0\" (UID: \"a200aca9-aa0e-44c1-a2da-bab4719bf8b0\") " pod="openstack/cinder-api-0" Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:37.943409 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a200aca9-aa0e-44c1-a2da-bab4719bf8b0-scripts\") pod \"cinder-api-0\" (UID: \"a200aca9-aa0e-44c1-a2da-bab4719bf8b0\") " pod="openstack/cinder-api-0" Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:37.943410 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a200aca9-aa0e-44c1-a2da-bab4719bf8b0-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a200aca9-aa0e-44c1-a2da-bab4719bf8b0\") " pod="openstack/cinder-api-0" Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:37.943636 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a200aca9-aa0e-44c1-a2da-bab4719bf8b0-config-data\") pod \"cinder-api-0\" (UID: \"a200aca9-aa0e-44c1-a2da-bab4719bf8b0\") " pod="openstack/cinder-api-0" Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:37.944418 5002 generic.go:334] "Generic (PLEG): container finished" podID="bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62" containerID="06b2d93793641692f807caaff4a3e07155c43cb5b30ab6012f124dd348926e3b" exitCode=0 Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:37.944450 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-jw5tg" event={"ID":"bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62","Type":"ContainerDied","Data":"06b2d93793641692f807caaff4a3e07155c43cb5b30ab6012f124dd348926e3b"} Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:37.953303 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2tbc\" (UniqueName: \"kubernetes.io/projected/a200aca9-aa0e-44c1-a2da-bab4719bf8b0-kube-api-access-b2tbc\") pod \"cinder-api-0\" (UID: \"a200aca9-aa0e-44c1-a2da-bab4719bf8b0\") " pod="openstack/cinder-api-0" Dec 
09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:38.307867 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-5809-account-create-update-2sfhw"] Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:38.494900 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-539b-account-create-update-d4qnv"] Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:38.508103 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5c454948fd-lwcxn"] Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:38.520217 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:38.617224 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5c99967b8c-vjq4g"] Dec 09 10:22:38 crc kubenswrapper[5002]: W1209 10:22:38.669670 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1bf056c0_a496_4499_92c7_3b1300b4a29d.slice/crio-4727074ce1552a7dbd1cb8de2c29fcabc3222bf51f6752e91c19c7c391aab0f5 WatchSource:0}: Error finding container 4727074ce1552a7dbd1cb8de2c29fcabc3222bf51f6752e91c19c7c391aab0f5: Status 404 returned error can't find the container with id 4727074ce1552a7dbd1cb8de2c29fcabc3222bf51f6752e91c19c7c391aab0f5 Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:38.791519 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-795f4db4bc-rlkkn" Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:38.823462 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-05d2-account-create-update-2lrlj"] Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:38.839990 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:38.852241 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-mvvcv"] Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:38.885802 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-jw5tg" Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:38.890104 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-kth45"] Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:38.890409 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-56f74754d8-5pd9q" Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:38.904160 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-56f74754d8-5pd9q" Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:38.974034 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62-dns-svc\") pod \"bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62\" (UID: \"bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62\") " Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:38.974116 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62-ovsdbserver-sb\") pod \"bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62\" (UID: \"bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62\") " Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:38.974173 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62-ovsdbserver-nb\") pod \"bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62\" (UID: \"bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62\") " Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:38.974267 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zqvc\" (UniqueName: \"kubernetes.io/projected/bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62-kube-api-access-4zqvc\") pod \"bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62\" (UID: \"bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62\") " Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:38.974307 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62-dns-swift-storage-0\") pod \"bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62\" (UID: \"bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62\") " Dec 09 10:22:38 crc kubenswrapper[5002]: I1209 10:22:38.974358 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62-config\") pod \"bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62\" (UID: \"bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62\") " Dec 09 10:22:39 crc kubenswrapper[5002]: I1209 10:22:39.003391 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5809-account-create-update-2sfhw" event={"ID":"955f29ee-6405-41b4-b905-3438ed1344fd","Type":"ContainerStarted","Data":"12b0ab512fb447d85b7abae625606d163e6e5420c42bf1270c48144732b02ab3"} Dec 09 10:22:39 crc kubenswrapper[5002]: I1209 10:22:39.028411 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-jw5tg" Dec 09 10:22:39 crc kubenswrapper[5002]: I1209 10:22:39.029925 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-jw5tg" event={"ID":"bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62","Type":"ContainerDied","Data":"68e63d63ece6441c6edeac172f450d5d628db113e8c081c1d369b8d9be939350"} Dec 09 10:22:39 crc kubenswrapper[5002]: I1209 10:22:39.029974 5002 scope.go:117] "RemoveContainer" containerID="06b2d93793641692f807caaff4a3e07155c43cb5b30ab6012f124dd348926e3b" Dec 09 10:22:39 crc kubenswrapper[5002]: I1209 10:22:39.048998 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c454948fd-lwcxn" event={"ID":"a4061af7-7669-4bd4-a36c-6ec982e86753","Type":"ContainerStarted","Data":"31693255ef31e683e54359ba3df04abd53715c4072183470e51814121ba2ebfc"} Dec 09 10:22:39 crc kubenswrapper[5002]: I1209 10:22:39.072447 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62-kube-api-access-4zqvc" (OuterVolumeSpecName: "kube-api-access-4zqvc") pod "bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62" (UID: "bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62"). InnerVolumeSpecName "kube-api-access-4zqvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:22:39 crc kubenswrapper[5002]: I1209 10:22:39.078844 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zqvc\" (UniqueName: \"kubernetes.io/projected/bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62-kube-api-access-4zqvc\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:39 crc kubenswrapper[5002]: I1209 10:22:39.092440 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5c99967b8c-vjq4g" event={"ID":"1bf056c0-a496-4499-92c7-3b1300b4a29d","Type":"ContainerStarted","Data":"4727074ce1552a7dbd1cb8de2c29fcabc3222bf51f6752e91c19c7c391aab0f5"} Dec 09 10:22:39 crc kubenswrapper[5002]: I1209 10:22:39.105514 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-539b-account-create-update-d4qnv" event={"ID":"f12e361b-e5e4-4c7c-8c4f-fe266937ffda","Type":"ContainerStarted","Data":"a562ac4ea0dec205f8fed9a530ea4996b65c6f83e1136d50347da2b80da44de8"} Dec 09 10:22:39 crc kubenswrapper[5002]: I1209 10:22:39.240036 5002 scope.go:117] "RemoveContainer" containerID="38c2b87abaa0f4cd658a92e5537f9dc6e5d0715ab297e11b8a8eb163590eba9f" Dec 09 10:22:39 crc kubenswrapper[5002]: I1209 10:22:39.290616 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 09 10:22:39 crc kubenswrapper[5002]: W1209 10:22:39.372749 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58839cfb_488d_4d08_b077_bf23cad0fedb.slice/crio-011aaa4f8516de7e7c2451312e762de09c7d16db6c9e890223ccc3a9feb85426 WatchSource:0}: Error finding container 011aaa4f8516de7e7c2451312e762de09c7d16db6c9e890223ccc3a9feb85426: Status 404 returned error can't find the container with id 011aaa4f8516de7e7c2451312e762de09c7d16db6c9e890223ccc3a9feb85426 Dec 09 10:22:39 crc kubenswrapper[5002]: I1209 10:22:39.797889 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-795f4db4bc-rlkkn"] Dec 09 10:22:39 crc kubenswrapper[5002]: I1209 10:22:39.971324 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 09 10:22:40 crc kubenswrapper[5002]: I1209 10:22:40.250532 5002 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=5.11261039 podStartE2EDuration="18.250511741s" podCreationTimestamp="2025-12-09 10:22:22 +0000 UTC" firstStartedPulling="2025-12-09 10:22:23.6015513 +0000 UTC m=+1275.993602381" lastFinishedPulling="2025-12-09 10:22:36.739452651 +0000 UTC m=+1289.131503732" observedRunningTime="2025-12-09 10:22:40.221186208 +0000 UTC m=+1292.613237279" watchObservedRunningTime="2025-12-09 10:22:40.250511741 +0000 UTC m=+1292.642562822" Dec 09 10:22:40 crc kubenswrapper[5002]: I1209 10:22:40.327882 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-5809-account-create-update-2sfhw" podStartSLOduration=5.327860065 podStartE2EDuration="5.327860065s" podCreationTimestamp="2025-12-09 10:22:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:22:40.325585533 +0000 UTC m=+1292.717636624" watchObservedRunningTime="2025-12-09 10:22:40.327860065 +0000 UTC m=+1292.719911156" Dec 09 10:22:40 crc kubenswrapper[5002]: I1209 10:22:40.443629 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62" (UID: "bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:22:40 crc kubenswrapper[5002]: I1209 10:22:40.451514 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62-config" (OuterVolumeSpecName: "config") pod "bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62" (UID: "bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:22:40 crc kubenswrapper[5002]: I1209 10:22:40.470668 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62" (UID: "bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:22:40 crc kubenswrapper[5002]: I1209 10:22:40.489545 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62" (UID: "bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:22:40 crc kubenswrapper[5002]: I1209 10:22:40.513651 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62" (UID: "bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:22:40 crc kubenswrapper[5002]: I1209 10:22:40.536399 5002 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:40 crc kubenswrapper[5002]: I1209 10:22:40.536435 5002 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:40 crc kubenswrapper[5002]: I1209 10:22:40.536451 5002 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:40 crc kubenswrapper[5002]: I1209 10:22:40.536463 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:40 crc kubenswrapper[5002]: I1209 10:22:40.536474 5002 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:40 crc kubenswrapper[5002]: I1209 10:22:40.564107 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-kth45" event={"ID":"c5afe93e-c94d-4e57-987b-956d67b03621","Type":"ContainerStarted","Data":"2dadbfd4e1c8c792526478c5602009f135a917d70db180e2a8846460371d325f"} Dec 09 10:22:40 crc kubenswrapper[5002]: I1209 10:22:40.564304 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 09 10:22:40 crc kubenswrapper[5002]: I1209 10:22:40.564380 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5855d5f975-nmr2s" event={"ID":"c9198258-4919-4ade-88ba-4a0773b32012","Type":"ContainerStarted","Data":"3cfc9050975f650d9997515f3f47032beccc8afe01479ddcb1d077ee3deff954"} Dec 09 10:22:40 crc kubenswrapper[5002]: I1209 10:22:40.564438 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"c9e681d8-0720-4f5e-8893-ec4f1cf43edf","Type":"ContainerStarted","Data":"bdb3e6450b2a8a07c44a8a4e234bdae3f558edb01f57c75bf896f882589094c5"} Dec 09 10:22:40 crc kubenswrapper[5002]: I1209 10:22:40.564507 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-795f4db4bc-rlkkn" event={"ID":"bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6","Type":"ContainerStarted","Data":"52af199ae9f945bb914dcd7171224e5e1eff8a4b2cb3769eec68bf05602b0f3a"} Dec 09 10:22:40 crc kubenswrapper[5002]: I1209 10:22:40.564592 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-fb7r8" event={"ID":"73bcef93-39f3-4f68-b3f2-cc78b4698e3a","Type":"ContainerStarted","Data":"80fb2d8343721af50993af7f62d309f89b78d2c32763dd86d1a7cc7005ef0630"} Dec 09 10:22:40 crc kubenswrapper[5002]: I1209 10:22:40.564650 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-mvvcv" event={"ID":"d71308aa-e2b5-4775-917e-1b100ff8969c","Type":"ContainerStarted","Data":"48901c74b8988245f232b5d839a9b44a80d2abda09d1f7cb90956eb34b89cc6b"} Dec 09 10:22:40 crc kubenswrapper[5002]: I1209 10:22:40.564711 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-5809-account-create-update-2sfhw" event={"ID":"955f29ee-6405-41b4-b905-3438ed1344fd","Type":"ContainerStarted","Data":"35dbaf5da172afd498a3a3aa82dd037ea9d3b17dd4326cebfa8d0dd4cd5c6087"} Dec 09 10:22:40 crc kubenswrapper[5002]: I1209 10:22:40.564782 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"58839cfb-488d-4d08-b077-bf23cad0fedb","Type":"ContainerStarted","Data":"011aaa4f8516de7e7c2451312e762de09c7d16db6c9e890223ccc3a9feb85426"} Dec 09 10:22:40 crc kubenswrapper[5002]: I1209 10:22:40.564932 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a200aca9-aa0e-44c1-a2da-bab4719bf8b0","Type":"ContainerStarted","Data":"3f13c9a2d984958c7b71b1a7e67c4e8dfe788e400fd4b800711dcfa6d573bff6"} Dec 09 10:22:40 crc kubenswrapper[5002]: I1209 10:22:40.565012 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-05d2-account-create-update-2lrlj" event={"ID":"f6f5f4d8-2b06-4eb4-92d5-313b5fdfeab6","Type":"ContainerStarted","Data":"e6528aad660e0f16bf83f8399154a19e3e32ef0843c1a9127855bcac6bd02b05"} Dec 09 10:22:40 crc kubenswrapper[5002]: I1209 10:22:40.643882 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-jw5tg"] Dec 09 10:22:40 crc kubenswrapper[5002]: I1209 10:22:40.655400 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-jw5tg"] Dec 09 10:22:41 crc kubenswrapper[5002]: I1209 10:22:41.139964 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 10:22:41 crc kubenswrapper[5002]: I1209 10:22:41.140583 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c75244ea-a44f-4b49-b1dc-05a025bda463" containerName="glance-log" containerID="cri-o://6ef1103090d4b3937fed5f5bb7e4f7971545d5a53212bed951a69d369cd8977f" gracePeriod=30 Dec 09 10:22:41 crc kubenswrapper[5002]: I1209 10:22:41.141050 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c75244ea-a44f-4b49-b1dc-05a025bda463" containerName="glance-httpd" containerID="cri-o://473e3dfd3dbf213aeb987539706506bc04d20d03a4539c96c16ec84e35a95b56" gracePeriod=30 Dec 09 10:22:41 crc kubenswrapper[5002]: I1209 10:22:41.330940 5002 generic.go:334] "Generic (PLEG): container finished" podID="c75244ea-a44f-4b49-b1dc-05a025bda463" containerID="6ef1103090d4b3937fed5f5bb7e4f7971545d5a53212bed951a69d369cd8977f" exitCode=143 Dec 09 10:22:41 crc kubenswrapper[5002]: I1209 10:22:41.331053 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c75244ea-a44f-4b49-b1dc-05a025bda463","Type":"ContainerDied","Data":"6ef1103090d4b3937fed5f5bb7e4f7971545d5a53212bed951a69d369cd8977f"} Dec 09 10:22:41 crc kubenswrapper[5002]: I1209 10:22:41.342033 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c454948fd-lwcxn" event={"ID":"a4061af7-7669-4bd4-a36c-6ec982e86753","Type":"ContainerStarted","Data":"6cb449e2adfcabb9641ca2b98611d189bd76e94105b2edc6a7b7b41e8dbf68a0"} Dec 09 10:22:41 crc kubenswrapper[5002]: I1209 10:22:41.344306 5002 generic.go:334] "Generic (PLEG): container finished" podID="bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6" containerID="eefe11b215226c75def8b0bdde5ef0b64c77c98567620951ced7374a977a9ca2" exitCode=0 Dec 09 10:22:41 crc kubenswrapper[5002]: I1209 
10:22:41.344367 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-795f4db4bc-rlkkn" event={"ID":"bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6","Type":"ContainerDied","Data":"eefe11b215226c75def8b0bdde5ef0b64c77c98567620951ced7374a977a9ca2"} Dec 09 10:22:41 crc kubenswrapper[5002]: I1209 10:22:41.355990 5002 generic.go:334] "Generic (PLEG): container finished" podID="f12e361b-e5e4-4c7c-8c4f-fe266937ffda" containerID="132aa42ef57343a9c295da67543e7ab3de22debb6d40a328a3c23a77cc245928" exitCode=0 Dec 09 10:22:41 crc kubenswrapper[5002]: I1209 10:22:41.356058 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-539b-account-create-update-d4qnv" event={"ID":"f12e361b-e5e4-4c7c-8c4f-fe266937ffda","Type":"ContainerDied","Data":"132aa42ef57343a9c295da67543e7ab3de22debb6d40a328a3c23a77cc245928"} Dec 09 10:22:41 crc kubenswrapper[5002]: I1209 10:22:41.363732 5002 generic.go:334] "Generic (PLEG): container finished" podID="c5afe93e-c94d-4e57-987b-956d67b03621" containerID="d7e7a1030d81d816b82f4b7af798ec8f997a497e96c6a80bd2f695c81a769f47" exitCode=0 Dec 09 10:22:41 crc kubenswrapper[5002]: I1209 10:22:41.363805 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-kth45" event={"ID":"c5afe93e-c94d-4e57-987b-956d67b03621","Type":"ContainerDied","Data":"d7e7a1030d81d816b82f4b7af798ec8f997a497e96c6a80bd2f695c81a769f47"} Dec 09 10:22:41 crc kubenswrapper[5002]: I1209 10:22:41.375325 5002 generic.go:334] "Generic (PLEG): container finished" podID="955f29ee-6405-41b4-b905-3438ed1344fd" containerID="35dbaf5da172afd498a3a3aa82dd037ea9d3b17dd4326cebfa8d0dd4cd5c6087" exitCode=0 Dec 09 10:22:41 crc kubenswrapper[5002]: I1209 10:22:41.375443 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5809-account-create-update-2sfhw" event={"ID":"955f29ee-6405-41b4-b905-3438ed1344fd","Type":"ContainerDied","Data":"35dbaf5da172afd498a3a3aa82dd037ea9d3b17dd4326cebfa8d0dd4cd5c6087"} Dec 09 10:22:41 crc kubenswrapper[5002]: I1209 10:22:41.398973 5002 generic.go:334] "Generic (PLEG): container finished" podID="73bcef93-39f3-4f68-b3f2-cc78b4698e3a" containerID="80fb2d8343721af50993af7f62d309f89b78d2c32763dd86d1a7cc7005ef0630" exitCode=0 Dec 09 10:22:41 crc kubenswrapper[5002]: I1209 10:22:41.399082 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-fb7r8" event={"ID":"73bcef93-39f3-4f68-b3f2-cc78b4698e3a","Type":"ContainerDied","Data":"80fb2d8343721af50993af7f62d309f89b78d2c32763dd86d1a7cc7005ef0630"} Dec 09 10:22:41 crc kubenswrapper[5002]: I1209 10:22:41.408367 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5c99967b8c-vjq4g" event={"ID":"1bf056c0-a496-4499-92c7-3b1300b4a29d","Type":"ContainerStarted","Data":"5f516788aa2a399cee7b3aa95c438e477b373a2ef4a7033783be06cdfa843ec6"} Dec 09 10:22:41 crc kubenswrapper[5002]: I1209 10:22:41.414944 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-776dccf8bb-k9gt4" event={"ID":"c4ddce94-6333-4233-951d-571a761b708f","Type":"ContainerStarted","Data":"138deb878d9eae8db13bac5892a85b4950d565f4c2cf09cbe03061fbf965d6a2"} Dec 09 10:22:41 crc kubenswrapper[5002]: I1209 10:22:41.423243 5002 generic.go:334] "Generic (PLEG): container finished" podID="d71308aa-e2b5-4775-917e-1b100ff8969c" containerID="b258185414bf070e956ceb655c2f9a8dbe7006c691135bd2c0d10ae21cf772c6" exitCode=0 Dec 09 10:22:41 crc kubenswrapper[5002]: I1209 10:22:41.423320 5002 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-cell1-db-create-mvvcv" event={"ID":"d71308aa-e2b5-4775-917e-1b100ff8969c","Type":"ContainerDied","Data":"b258185414bf070e956ceb655c2f9a8dbe7006c691135bd2c0d10ae21cf772c6"} Dec 09 10:22:41 crc kubenswrapper[5002]: I1209 10:22:41.427973 5002 generic.go:334] "Generic (PLEG): container finished" podID="f6f5f4d8-2b06-4eb4-92d5-313b5fdfeab6" containerID="fe070c5525f3715e03b76a0a5d8f0d0b37ad9652906b4ae316a2dffee62d2026" exitCode=0 Dec 09 10:22:41 crc kubenswrapper[5002]: I1209 10:22:41.428625 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-05d2-account-create-update-2lrlj" event={"ID":"f6f5f4d8-2b06-4eb4-92d5-313b5fdfeab6","Type":"ContainerDied","Data":"fe070c5525f3715e03b76a0a5d8f0d0b37ad9652906b4ae316a2dffee62d2026"} Dec 09 10:22:41 crc kubenswrapper[5002]: I1209 10:22:41.803525 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-fb7r8" Dec 09 10:22:41 crc kubenswrapper[5002]: I1209 10:22:41.967013 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q57qn\" (UniqueName: \"kubernetes.io/projected/73bcef93-39f3-4f68-b3f2-cc78b4698e3a-kube-api-access-q57qn\") pod \"73bcef93-39f3-4f68-b3f2-cc78b4698e3a\" (UID: \"73bcef93-39f3-4f68-b3f2-cc78b4698e3a\") " Dec 09 10:22:41 crc kubenswrapper[5002]: I1209 10:22:41.967458 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73bcef93-39f3-4f68-b3f2-cc78b4698e3a-operator-scripts\") pod \"73bcef93-39f3-4f68-b3f2-cc78b4698e3a\" (UID: \"73bcef93-39f3-4f68-b3f2-cc78b4698e3a\") " Dec 09 10:22:41 crc kubenswrapper[5002]: I1209 10:22:41.970048 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73bcef93-39f3-4f68-b3f2-cc78b4698e3a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "73bcef93-39f3-4f68-b3f2-cc78b4698e3a" (UID: "73bcef93-39f3-4f68-b3f2-cc78b4698e3a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:22:41 crc kubenswrapper[5002]: I1209 10:22:41.976262 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73bcef93-39f3-4f68-b3f2-cc78b4698e3a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:41 crc kubenswrapper[5002]: I1209 10:22:41.979867 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73bcef93-39f3-4f68-b3f2-cc78b4698e3a-kube-api-access-q57qn" (OuterVolumeSpecName: "kube-api-access-q57qn") pod "73bcef93-39f3-4f68-b3f2-cc78b4698e3a" (UID: "73bcef93-39f3-4f68-b3f2-cc78b4698e3a"). InnerVolumeSpecName "kube-api-access-q57qn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:22:42 crc kubenswrapper[5002]: I1209 10:22:42.091632 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q57qn\" (UniqueName: \"kubernetes.io/projected/73bcef93-39f3-4f68-b3f2-cc78b4698e3a-kube-api-access-q57qn\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:42 crc kubenswrapper[5002]: I1209 10:22:42.106970 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62" path="/var/lib/kubelet/pods/bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62/volumes" Dec 09 10:22:42 crc kubenswrapper[5002]: I1209 10:22:42.446064 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-776dccf8bb-k9gt4" event={"ID":"c4ddce94-6333-4233-951d-571a761b708f","Type":"ContainerStarted","Data":"1685138d02316a9a12cf58ddcc259dbabe79ffe5bb459f0ae7a6a2cbefac195b"} Dec 09 10:22:42 crc kubenswrapper[5002]: I1209 10:22:42.452577 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5c99967b8c-vjq4g" event={"ID":"1bf056c0-a496-4499-92c7-3b1300b4a29d","Type":"ContainerStarted","Data":"3edf9b4007e80e9e88d05e62a1daa140acdad44a7fdd6235ed0a8bb73242f7e0"} Dec 09 10:22:42 crc kubenswrapper[5002]: I1209 10:22:42.452932 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5c99967b8c-vjq4g" Dec 09 10:22:42 crc kubenswrapper[5002]: I1209 10:22:42.453045 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5c99967b8c-vjq4g" Dec 09 10:22:42 crc kubenswrapper[5002]: I1209 10:22:42.454501 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"58839cfb-488d-4d08-b077-bf23cad0fedb","Type":"ContainerStarted","Data":"d839cbd7dc24cd99b6bb542f2309c20c1ce28650b857540ee941d95b26b6625f"} Dec 09 10:22:42 crc kubenswrapper[5002]: I1209 10:22:42.455502 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a200aca9-aa0e-44c1-a2da-bab4719bf8b0","Type":"ContainerStarted","Data":"096fdd6b53668d812090b7ef62862cd5342edfc3f386b08c4d32f44285dbfa66"} Dec 09 10:22:42 crc kubenswrapper[5002]: I1209 10:22:42.456596 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c454948fd-lwcxn" event={"ID":"a4061af7-7669-4bd4-a36c-6ec982e86753","Type":"ContainerStarted","Data":"2944a25a7c0f087015f80b3d4d12a8b2ffabefdcb9d6f6b684e88e4e6b57e2db"} Dec 09 10:22:42 crc kubenswrapper[5002]: I1209 10:22:42.457196 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5c454948fd-lwcxn" Dec 09 10:22:42 crc kubenswrapper[5002]: I1209 10:22:42.457223 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5c454948fd-lwcxn" Dec 09 10:22:42 crc kubenswrapper[5002]: I1209 10:22:42.460635 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-795f4db4bc-rlkkn" event={"ID":"bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6","Type":"ContainerStarted","Data":"43937b2161c0e1c7323f5ff469253468d38851867108991fec8c4f21c0c334a6"} Dec 09 10:22:42 crc kubenswrapper[5002]: I1209 10:22:42.460891 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-795f4db4bc-rlkkn" Dec 09 10:22:42 crc kubenswrapper[5002]: I1209 10:22:42.463125 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-fb7r8" 
event={"ID":"73bcef93-39f3-4f68-b3f2-cc78b4698e3a","Type":"ContainerDied","Data":"b065f6405cdb49ab94b987f7efe04e86e65995435ebf64239edb19601ac81bd5"} Dec 09 10:22:42 crc kubenswrapper[5002]: I1209 10:22:42.463155 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b065f6405cdb49ab94b987f7efe04e86e65995435ebf64239edb19601ac81bd5" Dec 09 10:22:42 crc kubenswrapper[5002]: I1209 10:22:42.463223 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-fb7r8" Dec 09 10:22:42 crc kubenswrapper[5002]: I1209 10:22:42.469312 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5855d5f975-nmr2s" event={"ID":"c9198258-4919-4ade-88ba-4a0773b32012","Type":"ContainerStarted","Data":"9ce8868f1af38995fb075822c7445d85a0cf6501e86e92fde84885aaa86d36ad"} Dec 09 10:22:42 crc kubenswrapper[5002]: I1209 10:22:42.472129 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-776dccf8bb-k9gt4" podStartSLOduration=7.731733284 podStartE2EDuration="16.472109845s" podCreationTimestamp="2025-12-09 10:22:26 +0000 UTC" firstStartedPulling="2025-12-09 10:22:27.860715746 +0000 UTC m=+1280.252766837" lastFinishedPulling="2025-12-09 10:22:36.601092307 +0000 UTC m=+1288.993143398" observedRunningTime="2025-12-09 10:22:42.465901707 +0000 UTC m=+1294.857952788" watchObservedRunningTime="2025-12-09 10:22:42.472109845 +0000 UTC m=+1294.864160926" Dec 09 10:22:42 crc kubenswrapper[5002]: I1209 10:22:42.496330 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-795f4db4bc-rlkkn" podStartSLOduration=5.4963107 podStartE2EDuration="5.4963107s" podCreationTimestamp="2025-12-09 10:22:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:22:42.493978997 +0000 UTC m=+1294.886030088" watchObservedRunningTime="2025-12-09 10:22:42.4963107 +0000 UTC m=+1294.888361771" Dec 09 10:22:42 crc kubenswrapper[5002]: I1209 10:22:42.525046 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5c454948fd-lwcxn" podStartSLOduration=12.525027277 podStartE2EDuration="12.525027277s" podCreationTimestamp="2025-12-09 10:22:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:22:42.514089681 +0000 UTC m=+1294.906140772" watchObservedRunningTime="2025-12-09 10:22:42.525027277 +0000 UTC m=+1294.917078348" Dec 09 10:22:42 crc kubenswrapper[5002]: I1209 10:22:42.546687 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-5c99967b8c-vjq4g" podStartSLOduration=12.546662763 podStartE2EDuration="12.546662763s" podCreationTimestamp="2025-12-09 10:22:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:22:42.537526555 +0000 UTC m=+1294.929577656" watchObservedRunningTime="2025-12-09 10:22:42.546662763 +0000 UTC m=+1294.938713844" Dec 09 10:22:42 crc kubenswrapper[5002]: I1209 10:22:42.571420 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-5855d5f975-nmr2s" podStartSLOduration=7.723473501 podStartE2EDuration="16.571396982s" podCreationTimestamp="2025-12-09 10:22:26 +0000 UTC" 
firstStartedPulling="2025-12-09 10:22:27.746057303 +0000 UTC m=+1280.138108384" lastFinishedPulling="2025-12-09 10:22:36.593980784 +0000 UTC m=+1288.986031865" observedRunningTime="2025-12-09 10:22:42.562151092 +0000 UTC m=+1294.954202173" watchObservedRunningTime="2025-12-09 10:22:42.571396982 +0000 UTC m=+1294.963448083" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.065689 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-kth45" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.217728 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5afe93e-c94d-4e57-987b-956d67b03621-operator-scripts\") pod \"c5afe93e-c94d-4e57-987b-956d67b03621\" (UID: \"c5afe93e-c94d-4e57-987b-956d67b03621\") " Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.217832 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxppl\" (UniqueName: \"kubernetes.io/projected/c5afe93e-c94d-4e57-987b-956d67b03621-kube-api-access-hxppl\") pod \"c5afe93e-c94d-4e57-987b-956d67b03621\" (UID: \"c5afe93e-c94d-4e57-987b-956d67b03621\") " Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.220079 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5afe93e-c94d-4e57-987b-956d67b03621-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c5afe93e-c94d-4e57-987b-956d67b03621" (UID: "c5afe93e-c94d-4e57-987b-956d67b03621"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.228637 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5afe93e-c94d-4e57-987b-956d67b03621-kube-api-access-hxppl" (OuterVolumeSpecName: "kube-api-access-hxppl") pod "c5afe93e-c94d-4e57-987b-956d67b03621" (UID: "c5afe93e-c94d-4e57-987b-956d67b03621"). InnerVolumeSpecName "kube-api-access-hxppl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.322479 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5afe93e-c94d-4e57-987b-956d67b03621-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.322499 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxppl\" (UniqueName: \"kubernetes.io/projected/c5afe93e-c94d-4e57-987b-956d67b03621-kube-api-access-hxppl\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.523547 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5809-account-create-update-2sfhw" event={"ID":"955f29ee-6405-41b4-b905-3438ed1344fd","Type":"ContainerDied","Data":"12b0ab512fb447d85b7abae625606d163e6e5420c42bf1270c48144732b02ab3"} Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.523604 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12b0ab512fb447d85b7abae625606d163e6e5420c42bf1270c48144732b02ab3" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.523755 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-539b-account-create-update-d4qnv" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.538333 5002 generic.go:334] "Generic (PLEG): container finished" podID="80f62273-a2cb-48fc-9dc4-e3bbe09bb517" containerID="5ca2ef47d14df1b7854d3d7794e4e04fdddf2bb4f558c18e3a96da0d4a26ce8c" exitCode=0 Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.538400 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"80f62273-a2cb-48fc-9dc4-e3bbe09bb517","Type":"ContainerDied","Data":"5ca2ef47d14df1b7854d3d7794e4e04fdddf2bb4f558c18e3a96da0d4a26ce8c"} Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.542532 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-mvvcv" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.548021 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-05d2-account-create-update-2lrlj" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.548117 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a200aca9-aa0e-44c1-a2da-bab4719bf8b0","Type":"ContainerStarted","Data":"2bc2096ebb67df1b975face6df6e04a7f42de3913bca881d883f43f451e09829"} Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.563315 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-05d2-account-create-update-2lrlj" event={"ID":"f6f5f4d8-2b06-4eb4-92d5-313b5fdfeab6","Type":"ContainerDied","Data":"e6528aad660e0f16bf83f8399154a19e3e32ef0843c1a9127855bcac6bd02b05"} Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.563354 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6528aad660e0f16bf83f8399154a19e3e32ef0843c1a9127855bcac6bd02b05" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.563491 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-5809-account-create-update-2sfhw" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.583947 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-785d8bcb8c-jw5tg" podUID="bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.144:5353: i/o timeout" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.586196 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-mvvcv" event={"ID":"d71308aa-e2b5-4775-917e-1b100ff8969c","Type":"ContainerDied","Data":"48901c74b8988245f232b5d839a9b44a80d2abda09d1f7cb90956eb34b89cc6b"} Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.586223 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48901c74b8988245f232b5d839a9b44a80d2abda09d1f7cb90956eb34b89cc6b" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.586305 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-mvvcv" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.615403 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-539b-account-create-update-d4qnv" event={"ID":"f12e361b-e5e4-4c7c-8c4f-fe266937ffda","Type":"ContainerDied","Data":"a562ac4ea0dec205f8fed9a530ea4996b65c6f83e1136d50347da2b80da44de8"} Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.615451 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a562ac4ea0dec205f8fed9a530ea4996b65c6f83e1136d50347da2b80da44de8" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.615514 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-539b-account-create-update-d4qnv" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.628626 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22nb2\" (UniqueName: \"kubernetes.io/projected/f12e361b-e5e4-4c7c-8c4f-fe266937ffda-kube-api-access-22nb2\") pod \"f12e361b-e5e4-4c7c-8c4f-fe266937ffda\" (UID: \"f12e361b-e5e4-4c7c-8c4f-fe266937ffda\") " Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.628911 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f12e361b-e5e4-4c7c-8c4f-fe266937ffda-operator-scripts\") pod \"f12e361b-e5e4-4c7c-8c4f-fe266937ffda\" (UID: \"f12e361b-e5e4-4c7c-8c4f-fe266937ffda\") " Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.629809 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f12e361b-e5e4-4c7c-8c4f-fe266937ffda-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f12e361b-e5e4-4c7c-8c4f-fe266937ffda" (UID: "f12e361b-e5e4-4c7c-8c4f-fe266937ffda"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.636108 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f12e361b-e5e4-4c7c-8c4f-fe266937ffda-kube-api-access-22nb2" (OuterVolumeSpecName: "kube-api-access-22nb2") pod "f12e361b-e5e4-4c7c-8c4f-fe266937ffda" (UID: "f12e361b-e5e4-4c7c-8c4f-fe266937ffda"). InnerVolumeSpecName "kube-api-access-22nb2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.689982 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-kth45" event={"ID":"c5afe93e-c94d-4e57-987b-956d67b03621","Type":"ContainerDied","Data":"2dadbfd4e1c8c792526478c5602009f135a917d70db180e2a8846460371d325f"} Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.692071 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2dadbfd4e1c8c792526478c5602009f135a917d70db180e2a8846460371d325f" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.690116 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-kth45" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.693602 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.732149 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6f5f4d8-2b06-4eb4-92d5-313b5fdfeab6-operator-scripts\") pod \"f6f5f4d8-2b06-4eb4-92d5-313b5fdfeab6\" (UID: \"f6f5f4d8-2b06-4eb4-92d5-313b5fdfeab6\") " Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.732190 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xk4pd\" (UniqueName: \"kubernetes.io/projected/d71308aa-e2b5-4775-917e-1b100ff8969c-kube-api-access-xk4pd\") pod \"d71308aa-e2b5-4775-917e-1b100ff8969c\" (UID: \"d71308aa-e2b5-4775-917e-1b100ff8969c\") " Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.732227 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/955f29ee-6405-41b4-b905-3438ed1344fd-operator-scripts\") pod \"955f29ee-6405-41b4-b905-3438ed1344fd\" (UID: \"955f29ee-6405-41b4-b905-3438ed1344fd\") " Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.732253 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f62273-a2cb-48fc-9dc4-e3bbe09bb517-combined-ca-bundle\") pod \"80f62273-a2cb-48fc-9dc4-e3bbe09bb517\" (UID: \"80f62273-a2cb-48fc-9dc4-e3bbe09bb517\") " Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.732277 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"80f62273-a2cb-48fc-9dc4-e3bbe09bb517\" (UID: \"80f62273-a2cb-48fc-9dc4-e3bbe09bb517\") " Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.732295 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqxn9\" (UniqueName: \"kubernetes.io/projected/f6f5f4d8-2b06-4eb4-92d5-313b5fdfeab6-kube-api-access-lqxn9\") pod \"f6f5f4d8-2b06-4eb4-92d5-313b5fdfeab6\" (UID: \"f6f5f4d8-2b06-4eb4-92d5-313b5fdfeab6\") " Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.732326 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/80f62273-a2cb-48fc-9dc4-e3bbe09bb517-httpd-run\") pod \"80f62273-a2cb-48fc-9dc4-e3bbe09bb517\" (UID: \"80f62273-a2cb-48fc-9dc4-e3bbe09bb517\") " Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.732343 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80f62273-a2cb-48fc-9dc4-e3bbe09bb517-logs\") pod \"80f62273-a2cb-48fc-9dc4-e3bbe09bb517\" (UID: \"80f62273-a2cb-48fc-9dc4-e3bbe09bb517\") " Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.732366 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d71308aa-e2b5-4775-917e-1b100ff8969c-operator-scripts\") pod \"d71308aa-e2b5-4775-917e-1b100ff8969c\" (UID: \"d71308aa-e2b5-4775-917e-1b100ff8969c\") " Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.732388 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80f62273-a2cb-48fc-9dc4-e3bbe09bb517-scripts\") pod \"80f62273-a2cb-48fc-9dc4-e3bbe09bb517\" (UID: 
\"80f62273-a2cb-48fc-9dc4-e3bbe09bb517\") " Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.732415 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/80f62273-a2cb-48fc-9dc4-e3bbe09bb517-public-tls-certs\") pod \"80f62273-a2cb-48fc-9dc4-e3bbe09bb517\" (UID: \"80f62273-a2cb-48fc-9dc4-e3bbe09bb517\") " Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.732430 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lstcr\" (UniqueName: \"kubernetes.io/projected/955f29ee-6405-41b4-b905-3438ed1344fd-kube-api-access-lstcr\") pod \"955f29ee-6405-41b4-b905-3438ed1344fd\" (UID: \"955f29ee-6405-41b4-b905-3438ed1344fd\") " Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.732448 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80f62273-a2cb-48fc-9dc4-e3bbe09bb517-config-data\") pod \"80f62273-a2cb-48fc-9dc4-e3bbe09bb517\" (UID: \"80f62273-a2cb-48fc-9dc4-e3bbe09bb517\") " Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.732472 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96lf6\" (UniqueName: \"kubernetes.io/projected/80f62273-a2cb-48fc-9dc4-e3bbe09bb517-kube-api-access-96lf6\") pod \"80f62273-a2cb-48fc-9dc4-e3bbe09bb517\" (UID: \"80f62273-a2cb-48fc-9dc4-e3bbe09bb517\") " Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.732969 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f12e361b-e5e4-4c7c-8c4f-fe266937ffda-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.732982 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22nb2\" (UniqueName: \"kubernetes.io/projected/f12e361b-e5e4-4c7c-8c4f-fe266937ffda-kube-api-access-22nb2\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.735097 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6f5f4d8-2b06-4eb4-92d5-313b5fdfeab6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f6f5f4d8-2b06-4eb4-92d5-313b5fdfeab6" (UID: "f6f5f4d8-2b06-4eb4-92d5-313b5fdfeab6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.739954 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80f62273-a2cb-48fc-9dc4-e3bbe09bb517-logs" (OuterVolumeSpecName: "logs") pod "80f62273-a2cb-48fc-9dc4-e3bbe09bb517" (UID: "80f62273-a2cb-48fc-9dc4-e3bbe09bb517"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.740177 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80f62273-a2cb-48fc-9dc4-e3bbe09bb517-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "80f62273-a2cb-48fc-9dc4-e3bbe09bb517" (UID: "80f62273-a2cb-48fc-9dc4-e3bbe09bb517"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.740449 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d71308aa-e2b5-4775-917e-1b100ff8969c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d71308aa-e2b5-4775-917e-1b100ff8969c" (UID: "d71308aa-e2b5-4775-917e-1b100ff8969c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.741528 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/955f29ee-6405-41b4-b905-3438ed1344fd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "955f29ee-6405-41b4-b905-3438ed1344fd" (UID: "955f29ee-6405-41b4-b905-3438ed1344fd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.748758 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d71308aa-e2b5-4775-917e-1b100ff8969c-kube-api-access-xk4pd" (OuterVolumeSpecName: "kube-api-access-xk4pd") pod "d71308aa-e2b5-4775-917e-1b100ff8969c" (UID: "d71308aa-e2b5-4775-917e-1b100ff8969c"). InnerVolumeSpecName "kube-api-access-xk4pd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.757239 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6f5f4d8-2b06-4eb4-92d5-313b5fdfeab6-kube-api-access-lqxn9" (OuterVolumeSpecName: "kube-api-access-lqxn9") pod "f6f5f4d8-2b06-4eb4-92d5-313b5fdfeab6" (UID: "f6f5f4d8-2b06-4eb4-92d5-313b5fdfeab6"). InnerVolumeSpecName "kube-api-access-lqxn9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.762999 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80f62273-a2cb-48fc-9dc4-e3bbe09bb517-kube-api-access-96lf6" (OuterVolumeSpecName: "kube-api-access-96lf6") pod "80f62273-a2cb-48fc-9dc4-e3bbe09bb517" (UID: "80f62273-a2cb-48fc-9dc4-e3bbe09bb517"). InnerVolumeSpecName "kube-api-access-96lf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.763771 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/955f29ee-6405-41b4-b905-3438ed1344fd-kube-api-access-lstcr" (OuterVolumeSpecName: "kube-api-access-lstcr") pod "955f29ee-6405-41b4-b905-3438ed1344fd" (UID: "955f29ee-6405-41b4-b905-3438ed1344fd"). InnerVolumeSpecName "kube-api-access-lstcr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.764188 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80f62273-a2cb-48fc-9dc4-e3bbe09bb517-scripts" (OuterVolumeSpecName: "scripts") pod "80f62273-a2cb-48fc-9dc4-e3bbe09bb517" (UID: "80f62273-a2cb-48fc-9dc4-e3bbe09bb517"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.785345 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "80f62273-a2cb-48fc-9dc4-e3bbe09bb517" (UID: "80f62273-a2cb-48fc-9dc4-e3bbe09bb517"). 
InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.828925 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80f62273-a2cb-48fc-9dc4-e3bbe09bb517-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "80f62273-a2cb-48fc-9dc4-e3bbe09bb517" (UID: "80f62273-a2cb-48fc-9dc4-e3bbe09bb517"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.836058 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f62273-a2cb-48fc-9dc4-e3bbe09bb517-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.836108 5002 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.836122 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqxn9\" (UniqueName: \"kubernetes.io/projected/f6f5f4d8-2b06-4eb4-92d5-313b5fdfeab6-kube-api-access-lqxn9\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.836138 5002 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/80f62273-a2cb-48fc-9dc4-e3bbe09bb517-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.836149 5002 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80f62273-a2cb-48fc-9dc4-e3bbe09bb517-logs\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.836160 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d71308aa-e2b5-4775-917e-1b100ff8969c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.836171 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80f62273-a2cb-48fc-9dc4-e3bbe09bb517-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.836182 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lstcr\" (UniqueName: \"kubernetes.io/projected/955f29ee-6405-41b4-b905-3438ed1344fd-kube-api-access-lstcr\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.836192 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96lf6\" (UniqueName: \"kubernetes.io/projected/80f62273-a2cb-48fc-9dc4-e3bbe09bb517-kube-api-access-96lf6\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.836206 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6f5f4d8-2b06-4eb4-92d5-313b5fdfeab6-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.836217 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xk4pd\" (UniqueName: \"kubernetes.io/projected/d71308aa-e2b5-4775-917e-1b100ff8969c-kube-api-access-xk4pd\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 
10:22:43.836227 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/955f29ee-6405-41b4-b905-3438ed1344fd-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.885058 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80f62273-a2cb-48fc-9dc4-e3bbe09bb517-config-data" (OuterVolumeSpecName: "config-data") pod "80f62273-a2cb-48fc-9dc4-e3bbe09bb517" (UID: "80f62273-a2cb-48fc-9dc4-e3bbe09bb517"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.886937 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80f62273-a2cb-48fc-9dc4-e3bbe09bb517-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "80f62273-a2cb-48fc-9dc4-e3bbe09bb517" (UID: "80f62273-a2cb-48fc-9dc4-e3bbe09bb517"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.903920 5002 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.937774 5002 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.937804 5002 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/80f62273-a2cb-48fc-9dc4-e3bbe09bb517-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:43 crc kubenswrapper[5002]: I1209 10:22:43.937829 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80f62273-a2cb-48fc-9dc4-e3bbe09bb517-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:44 crc kubenswrapper[5002]: I1209 10:22:44.699351 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"80f62273-a2cb-48fc-9dc4-e3bbe09bb517","Type":"ContainerDied","Data":"35b3b18d0464130b0caf24ef40d7ddaa70838f5a6d56273b6211e485d18b1c4e"} Dec 09 10:22:44 crc kubenswrapper[5002]: I1209 10:22:44.699680 5002 scope.go:117] "RemoveContainer" containerID="5ca2ef47d14df1b7854d3d7794e4e04fdddf2bb4f558c18e3a96da0d4a26ce8c" Dec 09 10:22:44 crc kubenswrapper[5002]: I1209 10:22:44.699792 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 09 10:22:44 crc kubenswrapper[5002]: I1209 10:22:44.702981 5002 generic.go:334] "Generic (PLEG): container finished" podID="c75244ea-a44f-4b49-b1dc-05a025bda463" containerID="473e3dfd3dbf213aeb987539706506bc04d20d03a4539c96c16ec84e35a95b56" exitCode=0 Dec 09 10:22:44 crc kubenswrapper[5002]: I1209 10:22:44.703054 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-5809-account-create-update-2sfhw" Dec 09 10:22:44 crc kubenswrapper[5002]: I1209 10:22:44.703571 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="a200aca9-aa0e-44c1-a2da-bab4719bf8b0" containerName="cinder-api-log" containerID="cri-o://096fdd6b53668d812090b7ef62862cd5342edfc3f386b08c4d32f44285dbfa66" gracePeriod=30 Dec 09 10:22:44 crc kubenswrapper[5002]: I1209 10:22:44.703653 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-05d2-account-create-update-2lrlj" Dec 09 10:22:44 crc kubenswrapper[5002]: I1209 10:22:44.704110 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="a200aca9-aa0e-44c1-a2da-bab4719bf8b0" containerName="cinder-api" containerID="cri-o://2bc2096ebb67df1b975face6df6e04a7f42de3913bca881d883f43f451e09829" gracePeriod=30 Dec 09 10:22:44 crc kubenswrapper[5002]: I1209 10:22:44.704242 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c75244ea-a44f-4b49-b1dc-05a025bda463","Type":"ContainerDied","Data":"473e3dfd3dbf213aeb987539706506bc04d20d03a4539c96c16ec84e35a95b56"} Dec 09 10:22:44 crc kubenswrapper[5002]: I1209 10:22:44.704283 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 09 10:22:44 crc kubenswrapper[5002]: I1209 10:22:44.753420 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=7.753394053 podStartE2EDuration="7.753394053s" podCreationTimestamp="2025-12-09 10:22:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:22:44.737318188 +0000 UTC m=+1297.129369279" watchObservedRunningTime="2025-12-09 10:22:44.753394053 +0000 UTC m=+1297.145445134" Dec 09 10:22:44 crc kubenswrapper[5002]: I1209 10:22:44.757498 5002 scope.go:117] "RemoveContainer" containerID="8184adac7ec070e2ec700e89b74a2520d43784f0369369c0ef173edb3c57e1cf" Dec 09 10:22:44 crc kubenswrapper[5002]: I1209 10:22:44.781183 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 10:22:44 crc kubenswrapper[5002]: I1209 10:22:44.798452 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 10:22:44 crc kubenswrapper[5002]: I1209 10:22:44.820156 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 10:22:44 crc kubenswrapper[5002]: E1209 10:22:44.820699 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f12e361b-e5e4-4c7c-8c4f-fe266937ffda" containerName="mariadb-account-create-update" Dec 09 10:22:44 crc kubenswrapper[5002]: I1209 10:22:44.820723 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="f12e361b-e5e4-4c7c-8c4f-fe266937ffda" containerName="mariadb-account-create-update" Dec 09 10:22:44 crc kubenswrapper[5002]: E1209 10:22:44.820740 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62" containerName="init" Dec 09 10:22:44 crc kubenswrapper[5002]: I1209 10:22:44.820748 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62" containerName="init" Dec 09 10:22:44 crc kubenswrapper[5002]: E1209 10:22:44.820769 5002 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80f62273-a2cb-48fc-9dc4-e3bbe09bb517" containerName="glance-log" Dec 09 10:22:44 crc kubenswrapper[5002]: I1209 10:22:44.820776 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="80f62273-a2cb-48fc-9dc4-e3bbe09bb517" containerName="glance-log" Dec 09 10:22:44 crc kubenswrapper[5002]: E1209 10:22:44.820798 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62" containerName="dnsmasq-dns" Dec 09 10:22:44 crc kubenswrapper[5002]: I1209 10:22:44.820806 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62" containerName="dnsmasq-dns" Dec 09 10:22:44 crc kubenswrapper[5002]: E1209 10:22:44.820839 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d71308aa-e2b5-4775-917e-1b100ff8969c" containerName="mariadb-database-create" Dec 09 10:22:44 crc kubenswrapper[5002]: I1209 10:22:44.820848 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="d71308aa-e2b5-4775-917e-1b100ff8969c" containerName="mariadb-database-create" Dec 09 10:22:44 crc kubenswrapper[5002]: E1209 10:22:44.820859 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="955f29ee-6405-41b4-b905-3438ed1344fd" containerName="mariadb-account-create-update" Dec 09 10:22:44 crc kubenswrapper[5002]: I1209 10:22:44.820866 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="955f29ee-6405-41b4-b905-3438ed1344fd" containerName="mariadb-account-create-update" Dec 09 10:22:44 crc kubenswrapper[5002]: E1209 10:22:44.820889 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73bcef93-39f3-4f68-b3f2-cc78b4698e3a" containerName="mariadb-database-create" Dec 09 10:22:44 crc kubenswrapper[5002]: I1209 10:22:44.820897 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="73bcef93-39f3-4f68-b3f2-cc78b4698e3a" containerName="mariadb-database-create" Dec 09 10:22:44 crc kubenswrapper[5002]: E1209 10:22:44.820908 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5afe93e-c94d-4e57-987b-956d67b03621" containerName="mariadb-database-create" Dec 09 10:22:44 crc kubenswrapper[5002]: I1209 10:22:44.820916 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5afe93e-c94d-4e57-987b-956d67b03621" containerName="mariadb-database-create" Dec 09 10:22:44 crc kubenswrapper[5002]: E1209 10:22:44.820929 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80f62273-a2cb-48fc-9dc4-e3bbe09bb517" containerName="glance-httpd" Dec 09 10:22:44 crc kubenswrapper[5002]: I1209 10:22:44.820936 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="80f62273-a2cb-48fc-9dc4-e3bbe09bb517" containerName="glance-httpd" Dec 09 10:22:44 crc kubenswrapper[5002]: E1209 10:22:44.820948 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6f5f4d8-2b06-4eb4-92d5-313b5fdfeab6" containerName="mariadb-account-create-update" Dec 09 10:22:44 crc kubenswrapper[5002]: I1209 10:22:44.820955 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6f5f4d8-2b06-4eb4-92d5-313b5fdfeab6" containerName="mariadb-account-create-update" Dec 09 10:22:44 crc kubenswrapper[5002]: I1209 10:22:44.821195 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="bafe96d5-c4bd-45a8-a9cc-1ee5ef780f62" containerName="dnsmasq-dns" Dec 09 10:22:44 crc kubenswrapper[5002]: I1209 10:22:44.821208 5002 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="80f62273-a2cb-48fc-9dc4-e3bbe09bb517" containerName="glance-httpd" Dec 09 10:22:44 crc kubenswrapper[5002]: I1209 10:22:44.821217 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="73bcef93-39f3-4f68-b3f2-cc78b4698e3a" containerName="mariadb-database-create" Dec 09 10:22:44 crc kubenswrapper[5002]: I1209 10:22:44.821231 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="955f29ee-6405-41b4-b905-3438ed1344fd" containerName="mariadb-account-create-update" Dec 09 10:22:44 crc kubenswrapper[5002]: I1209 10:22:44.821245 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="80f62273-a2cb-48fc-9dc4-e3bbe09bb517" containerName="glance-log" Dec 09 10:22:44 crc kubenswrapper[5002]: I1209 10:22:44.821268 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5afe93e-c94d-4e57-987b-956d67b03621" containerName="mariadb-database-create" Dec 09 10:22:44 crc kubenswrapper[5002]: I1209 10:22:44.821283 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6f5f4d8-2b06-4eb4-92d5-313b5fdfeab6" containerName="mariadb-account-create-update" Dec 09 10:22:44 crc kubenswrapper[5002]: I1209 10:22:44.821296 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="f12e361b-e5e4-4c7c-8c4f-fe266937ffda" containerName="mariadb-account-create-update" Dec 09 10:22:44 crc kubenswrapper[5002]: I1209 10:22:44.821311 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="d71308aa-e2b5-4775-917e-1b100ff8969c" containerName="mariadb-database-create" Dec 09 10:22:44 crc kubenswrapper[5002]: I1209 10:22:44.822652 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 09 10:22:44 crc kubenswrapper[5002]: I1209 10:22:44.828999 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 09 10:22:44 crc kubenswrapper[5002]: I1209 10:22:44.829270 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 09 10:22:44 crc kubenswrapper[5002]: I1209 10:22:44.859393 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 10:22:44 crc kubenswrapper[5002]: I1209 10:22:44.963151 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"65df60b6-4049-47b6-9907-ebf76c151213\") " pod="openstack/glance-default-external-api-0" Dec 09 10:22:44 crc kubenswrapper[5002]: I1209 10:22:44.963456 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65df60b6-4049-47b6-9907-ebf76c151213-config-data\") pod \"glance-default-external-api-0\" (UID: \"65df60b6-4049-47b6-9907-ebf76c151213\") " pod="openstack/glance-default-external-api-0" Dec 09 10:22:44 crc kubenswrapper[5002]: I1209 10:22:44.963486 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/65df60b6-4049-47b6-9907-ebf76c151213-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"65df60b6-4049-47b6-9907-ebf76c151213\") " pod="openstack/glance-default-external-api-0" Dec 09 10:22:44 crc kubenswrapper[5002]: I1209 10:22:44.963508 5002 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/65df60b6-4049-47b6-9907-ebf76c151213-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"65df60b6-4049-47b6-9907-ebf76c151213\") " pod="openstack/glance-default-external-api-0" Dec 09 10:22:44 crc kubenswrapper[5002]: I1209 10:22:44.963538 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65df60b6-4049-47b6-9907-ebf76c151213-logs\") pod \"glance-default-external-api-0\" (UID: \"65df60b6-4049-47b6-9907-ebf76c151213\") " pod="openstack/glance-default-external-api-0" Dec 09 10:22:44 crc kubenswrapper[5002]: I1209 10:22:44.963556 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65df60b6-4049-47b6-9907-ebf76c151213-scripts\") pod \"glance-default-external-api-0\" (UID: \"65df60b6-4049-47b6-9907-ebf76c151213\") " pod="openstack/glance-default-external-api-0" Dec 09 10:22:44 crc kubenswrapper[5002]: I1209 10:22:44.963581 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvm5x\" (UniqueName: \"kubernetes.io/projected/65df60b6-4049-47b6-9907-ebf76c151213-kube-api-access-lvm5x\") pod \"glance-default-external-api-0\" (UID: \"65df60b6-4049-47b6-9907-ebf76c151213\") " pod="openstack/glance-default-external-api-0" Dec 09 10:22:44 crc kubenswrapper[5002]: I1209 10:22:44.963637 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65df60b6-4049-47b6-9907-ebf76c151213-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"65df60b6-4049-47b6-9907-ebf76c151213\") " pod="openstack/glance-default-external-api-0" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.065900 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"65df60b6-4049-47b6-9907-ebf76c151213\") " pod="openstack/glance-default-external-api-0" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.065981 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65df60b6-4049-47b6-9907-ebf76c151213-config-data\") pod \"glance-default-external-api-0\" (UID: \"65df60b6-4049-47b6-9907-ebf76c151213\") " pod="openstack/glance-default-external-api-0" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.066021 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/65df60b6-4049-47b6-9907-ebf76c151213-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"65df60b6-4049-47b6-9907-ebf76c151213\") " pod="openstack/glance-default-external-api-0" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.066052 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/65df60b6-4049-47b6-9907-ebf76c151213-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"65df60b6-4049-47b6-9907-ebf76c151213\") " pod="openstack/glance-default-external-api-0" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.066088 5002 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65df60b6-4049-47b6-9907-ebf76c151213-logs\") pod \"glance-default-external-api-0\" (UID: \"65df60b6-4049-47b6-9907-ebf76c151213\") " pod="openstack/glance-default-external-api-0" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.066117 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65df60b6-4049-47b6-9907-ebf76c151213-scripts\") pod \"glance-default-external-api-0\" (UID: \"65df60b6-4049-47b6-9907-ebf76c151213\") " pod="openstack/glance-default-external-api-0" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.066154 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvm5x\" (UniqueName: \"kubernetes.io/projected/65df60b6-4049-47b6-9907-ebf76c151213-kube-api-access-lvm5x\") pod \"glance-default-external-api-0\" (UID: \"65df60b6-4049-47b6-9907-ebf76c151213\") " pod="openstack/glance-default-external-api-0" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.066233 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65df60b6-4049-47b6-9907-ebf76c151213-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"65df60b6-4049-47b6-9907-ebf76c151213\") " pod="openstack/glance-default-external-api-0" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.066667 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/65df60b6-4049-47b6-9907-ebf76c151213-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"65df60b6-4049-47b6-9907-ebf76c151213\") " pod="openstack/glance-default-external-api-0" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.066689 5002 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"65df60b6-4049-47b6-9907-ebf76c151213\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.067893 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65df60b6-4049-47b6-9907-ebf76c151213-logs\") pod \"glance-default-external-api-0\" (UID: \"65df60b6-4049-47b6-9907-ebf76c151213\") " pod="openstack/glance-default-external-api-0" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.075094 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/65df60b6-4049-47b6-9907-ebf76c151213-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"65df60b6-4049-47b6-9907-ebf76c151213\") " pod="openstack/glance-default-external-api-0" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.075465 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65df60b6-4049-47b6-9907-ebf76c151213-config-data\") pod \"glance-default-external-api-0\" (UID: \"65df60b6-4049-47b6-9907-ebf76c151213\") " pod="openstack/glance-default-external-api-0" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.079047 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/65df60b6-4049-47b6-9907-ebf76c151213-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"65df60b6-4049-47b6-9907-ebf76c151213\") " pod="openstack/glance-default-external-api-0" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.097927 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65df60b6-4049-47b6-9907-ebf76c151213-scripts\") pod \"glance-default-external-api-0\" (UID: \"65df60b6-4049-47b6-9907-ebf76c151213\") " pod="openstack/glance-default-external-api-0" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.108526 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvm5x\" (UniqueName: \"kubernetes.io/projected/65df60b6-4049-47b6-9907-ebf76c151213-kube-api-access-lvm5x\") pod \"glance-default-external-api-0\" (UID: \"65df60b6-4049-47b6-9907-ebf76c151213\") " pod="openstack/glance-default-external-api-0" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.118328 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"65df60b6-4049-47b6-9907-ebf76c151213\") " pod="openstack/glance-default-external-api-0" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.148232 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.240582 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.370799 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c75244ea-a44f-4b49-b1dc-05a025bda463-config-data\") pod \"c75244ea-a44f-4b49-b1dc-05a025bda463\" (UID: \"c75244ea-a44f-4b49-b1dc-05a025bda463\") " Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.371945 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c75244ea-a44f-4b49-b1dc-05a025bda463-httpd-run\") pod \"c75244ea-a44f-4b49-b1dc-05a025bda463\" (UID: \"c75244ea-a44f-4b49-b1dc-05a025bda463\") " Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.371975 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"c75244ea-a44f-4b49-b1dc-05a025bda463\" (UID: \"c75244ea-a44f-4b49-b1dc-05a025bda463\") " Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.372091 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c75244ea-a44f-4b49-b1dc-05a025bda463-combined-ca-bundle\") pod \"c75244ea-a44f-4b49-b1dc-05a025bda463\" (UID: \"c75244ea-a44f-4b49-b1dc-05a025bda463\") " Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.372114 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c75244ea-a44f-4b49-b1dc-05a025bda463-scripts\") pod \"c75244ea-a44f-4b49-b1dc-05a025bda463\" (UID: \"c75244ea-a44f-4b49-b1dc-05a025bda463\") " Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.372166 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-4q954\" (UniqueName: \"kubernetes.io/projected/c75244ea-a44f-4b49-b1dc-05a025bda463-kube-api-access-4q954\") pod \"c75244ea-a44f-4b49-b1dc-05a025bda463\" (UID: \"c75244ea-a44f-4b49-b1dc-05a025bda463\") " Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.372186 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c75244ea-a44f-4b49-b1dc-05a025bda463-logs\") pod \"c75244ea-a44f-4b49-b1dc-05a025bda463\" (UID: \"c75244ea-a44f-4b49-b1dc-05a025bda463\") " Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.372265 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c75244ea-a44f-4b49-b1dc-05a025bda463-internal-tls-certs\") pod \"c75244ea-a44f-4b49-b1dc-05a025bda463\" (UID: \"c75244ea-a44f-4b49-b1dc-05a025bda463\") " Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.372690 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c75244ea-a44f-4b49-b1dc-05a025bda463-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c75244ea-a44f-4b49-b1dc-05a025bda463" (UID: "c75244ea-a44f-4b49-b1dc-05a025bda463"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.373617 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c75244ea-a44f-4b49-b1dc-05a025bda463-logs" (OuterVolumeSpecName: "logs") pod "c75244ea-a44f-4b49-b1dc-05a025bda463" (UID: "c75244ea-a44f-4b49-b1dc-05a025bda463"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.385195 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "c75244ea-a44f-4b49-b1dc-05a025bda463" (UID: "c75244ea-a44f-4b49-b1dc-05a025bda463"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.398332 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c75244ea-a44f-4b49-b1dc-05a025bda463-kube-api-access-4q954" (OuterVolumeSpecName: "kube-api-access-4q954") pod "c75244ea-a44f-4b49-b1dc-05a025bda463" (UID: "c75244ea-a44f-4b49-b1dc-05a025bda463"). InnerVolumeSpecName "kube-api-access-4q954". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.401357 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c75244ea-a44f-4b49-b1dc-05a025bda463-scripts" (OuterVolumeSpecName: "scripts") pod "c75244ea-a44f-4b49-b1dc-05a025bda463" (UID: "c75244ea-a44f-4b49-b1dc-05a025bda463"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.417665 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c75244ea-a44f-4b49-b1dc-05a025bda463-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c75244ea-a44f-4b49-b1dc-05a025bda463" (UID: "c75244ea-a44f-4b49-b1dc-05a025bda463"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.474130 5002 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c75244ea-a44f-4b49-b1dc-05a025bda463-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.474186 5002 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.474199 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c75244ea-a44f-4b49-b1dc-05a025bda463-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.474214 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c75244ea-a44f-4b49-b1dc-05a025bda463-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.474226 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4q954\" (UniqueName: \"kubernetes.io/projected/c75244ea-a44f-4b49-b1dc-05a025bda463-kube-api-access-4q954\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.474238 5002 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c75244ea-a44f-4b49-b1dc-05a025bda463-logs\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.488559 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c75244ea-a44f-4b49-b1dc-05a025bda463-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c75244ea-a44f-4b49-b1dc-05a025bda463" (UID: "c75244ea-a44f-4b49-b1dc-05a025bda463"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.501707 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c75244ea-a44f-4b49-b1dc-05a025bda463-config-data" (OuterVolumeSpecName: "config-data") pod "c75244ea-a44f-4b49-b1dc-05a025bda463" (UID: "c75244ea-a44f-4b49-b1dc-05a025bda463"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.526394 5002 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.563400 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.578279 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c75244ea-a44f-4b49-b1dc-05a025bda463-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.578560 5002 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.578642 5002 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c75244ea-a44f-4b49-b1dc-05a025bda463-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.680366 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a200aca9-aa0e-44c1-a2da-bab4719bf8b0-logs\") pod \"a200aca9-aa0e-44c1-a2da-bab4719bf8b0\" (UID: \"a200aca9-aa0e-44c1-a2da-bab4719bf8b0\") " Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.680853 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a200aca9-aa0e-44c1-a2da-bab4719bf8b0-config-data\") pod \"a200aca9-aa0e-44c1-a2da-bab4719bf8b0\" (UID: \"a200aca9-aa0e-44c1-a2da-bab4719bf8b0\") " Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.680906 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2tbc\" (UniqueName: \"kubernetes.io/projected/a200aca9-aa0e-44c1-a2da-bab4719bf8b0-kube-api-access-b2tbc\") pod \"a200aca9-aa0e-44c1-a2da-bab4719bf8b0\" (UID: \"a200aca9-aa0e-44c1-a2da-bab4719bf8b0\") " Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.680942 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a200aca9-aa0e-44c1-a2da-bab4719bf8b0-combined-ca-bundle\") pod \"a200aca9-aa0e-44c1-a2da-bab4719bf8b0\" (UID: \"a200aca9-aa0e-44c1-a2da-bab4719bf8b0\") " Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.681006 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a200aca9-aa0e-44c1-a2da-bab4719bf8b0-config-data-custom\") pod \"a200aca9-aa0e-44c1-a2da-bab4719bf8b0\" (UID: \"a200aca9-aa0e-44c1-a2da-bab4719bf8b0\") " Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.681027 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a200aca9-aa0e-44c1-a2da-bab4719bf8b0-etc-machine-id\") pod \"a200aca9-aa0e-44c1-a2da-bab4719bf8b0\" (UID: \"a200aca9-aa0e-44c1-a2da-bab4719bf8b0\") " Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.681051 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a200aca9-aa0e-44c1-a2da-bab4719bf8b0-logs" (OuterVolumeSpecName: "logs") pod "a200aca9-aa0e-44c1-a2da-bab4719bf8b0" (UID: "a200aca9-aa0e-44c1-a2da-bab4719bf8b0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.681098 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a200aca9-aa0e-44c1-a2da-bab4719bf8b0-scripts\") pod \"a200aca9-aa0e-44c1-a2da-bab4719bf8b0\" (UID: \"a200aca9-aa0e-44c1-a2da-bab4719bf8b0\") " Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.681480 5002 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a200aca9-aa0e-44c1-a2da-bab4719bf8b0-logs\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.681646 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a200aca9-aa0e-44c1-a2da-bab4719bf8b0-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a200aca9-aa0e-44c1-a2da-bab4719bf8b0" (UID: "a200aca9-aa0e-44c1-a2da-bab4719bf8b0"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.687357 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a200aca9-aa0e-44c1-a2da-bab4719bf8b0-kube-api-access-b2tbc" (OuterVolumeSpecName: "kube-api-access-b2tbc") pod "a200aca9-aa0e-44c1-a2da-bab4719bf8b0" (UID: "a200aca9-aa0e-44c1-a2da-bab4719bf8b0"). InnerVolumeSpecName "kube-api-access-b2tbc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.689047 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a200aca9-aa0e-44c1-a2da-bab4719bf8b0-scripts" (OuterVolumeSpecName: "scripts") pod "a200aca9-aa0e-44c1-a2da-bab4719bf8b0" (UID: "a200aca9-aa0e-44c1-a2da-bab4719bf8b0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.706025 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a200aca9-aa0e-44c1-a2da-bab4719bf8b0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a200aca9-aa0e-44c1-a2da-bab4719bf8b0" (UID: "a200aca9-aa0e-44c1-a2da-bab4719bf8b0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.721518 5002 generic.go:334] "Generic (PLEG): container finished" podID="a200aca9-aa0e-44c1-a2da-bab4719bf8b0" containerID="2bc2096ebb67df1b975face6df6e04a7f42de3913bca881d883f43f451e09829" exitCode=0 Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.721561 5002 generic.go:334] "Generic (PLEG): container finished" podID="a200aca9-aa0e-44c1-a2da-bab4719bf8b0" containerID="096fdd6b53668d812090b7ef62862cd5342edfc3f386b08c4d32f44285dbfa66" exitCode=143 Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.721632 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.721659 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a200aca9-aa0e-44c1-a2da-bab4719bf8b0","Type":"ContainerDied","Data":"2bc2096ebb67df1b975face6df6e04a7f42de3913bca881d883f43f451e09829"} Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.721692 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a200aca9-aa0e-44c1-a2da-bab4719bf8b0","Type":"ContainerDied","Data":"096fdd6b53668d812090b7ef62862cd5342edfc3f386b08c4d32f44285dbfa66"} Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.721705 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a200aca9-aa0e-44c1-a2da-bab4719bf8b0","Type":"ContainerDied","Data":"3f13c9a2d984958c7b71b1a7e67c4e8dfe788e400fd4b800711dcfa6d573bff6"} Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.721723 5002 scope.go:117] "RemoveContainer" containerID="2bc2096ebb67df1b975face6df6e04a7f42de3913bca881d883f43f451e09829" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.736288 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c75244ea-a44f-4b49-b1dc-05a025bda463","Type":"ContainerDied","Data":"d0ed4bc24d24b47504be690fd23fabf977bc8994658940bb1646e023334390b9"} Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.736512 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.743989 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a200aca9-aa0e-44c1-a2da-bab4719bf8b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a200aca9-aa0e-44c1-a2da-bab4719bf8b0" (UID: "a200aca9-aa0e-44c1-a2da-bab4719bf8b0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.750341 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"58839cfb-488d-4d08-b077-bf23cad0fedb","Type":"ContainerStarted","Data":"7c54fd7ef2d21fcdd7e51d8f37703e839fb35843258561382844b90dc6812e2c"} Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.763517 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a200aca9-aa0e-44c1-a2da-bab4719bf8b0-config-data" (OuterVolumeSpecName: "config-data") pod "a200aca9-aa0e-44c1-a2da-bab4719bf8b0" (UID: "a200aca9-aa0e-44c1-a2da-bab4719bf8b0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.784833 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a200aca9-aa0e-44c1-a2da-bab4719bf8b0-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.784869 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2tbc\" (UniqueName: \"kubernetes.io/projected/a200aca9-aa0e-44c1-a2da-bab4719bf8b0-kube-api-access-b2tbc\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.784879 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a200aca9-aa0e-44c1-a2da-bab4719bf8b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.784889 5002 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a200aca9-aa0e-44c1-a2da-bab4719bf8b0-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.784898 5002 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a200aca9-aa0e-44c1-a2da-bab4719bf8b0-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.784906 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a200aca9-aa0e-44c1-a2da-bab4719bf8b0-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.799346 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=7.311834004 podStartE2EDuration="8.799325369s" podCreationTimestamp="2025-12-09 10:22:37 +0000 UTC" firstStartedPulling="2025-12-09 10:22:39.405848253 +0000 UTC m=+1291.797899334" lastFinishedPulling="2025-12-09 10:22:40.893339618 +0000 UTC m=+1293.285390699" observedRunningTime="2025-12-09 10:22:45.784436877 +0000 UTC m=+1298.176487958" watchObservedRunningTime="2025-12-09 10:22:45.799325369 +0000 UTC m=+1298.191376450" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.828047 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.846036 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.853571 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 10:22:45 crc kubenswrapper[5002]: E1209 10:22:45.853913 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c75244ea-a44f-4b49-b1dc-05a025bda463" containerName="glance-httpd" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.853924 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="c75244ea-a44f-4b49-b1dc-05a025bda463" containerName="glance-httpd" Dec 09 10:22:45 crc kubenswrapper[5002]: E1209 10:22:45.853939 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a200aca9-aa0e-44c1-a2da-bab4719bf8b0" containerName="cinder-api-log" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.853947 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="a200aca9-aa0e-44c1-a2da-bab4719bf8b0" containerName="cinder-api-log" 
Dec 09 10:22:45 crc kubenswrapper[5002]: E1209 10:22:45.853960 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a200aca9-aa0e-44c1-a2da-bab4719bf8b0" containerName="cinder-api" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.853967 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="a200aca9-aa0e-44c1-a2da-bab4719bf8b0" containerName="cinder-api" Dec 09 10:22:45 crc kubenswrapper[5002]: E1209 10:22:45.853981 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c75244ea-a44f-4b49-b1dc-05a025bda463" containerName="glance-log" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.853986 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="c75244ea-a44f-4b49-b1dc-05a025bda463" containerName="glance-log" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.854139 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="c75244ea-a44f-4b49-b1dc-05a025bda463" containerName="glance-log" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.854150 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="c75244ea-a44f-4b49-b1dc-05a025bda463" containerName="glance-httpd" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.854162 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="a200aca9-aa0e-44c1-a2da-bab4719bf8b0" containerName="cinder-api-log" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.854181 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="a200aca9-aa0e-44c1-a2da-bab4719bf8b0" containerName="cinder-api" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.855146 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.863277 5002 scope.go:117] "RemoveContainer" containerID="096fdd6b53668d812090b7ef62862cd5342edfc3f386b08c4d32f44285dbfa66" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.864312 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.871184 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.892598 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.914371 5002 scope.go:117] "RemoveContainer" containerID="2bc2096ebb67df1b975face6df6e04a7f42de3913bca881d883f43f451e09829" Dec 09 10:22:45 crc kubenswrapper[5002]: E1209 10:22:45.917562 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bc2096ebb67df1b975face6df6e04a7f42de3913bca881d883f43f451e09829\": container with ID starting with 2bc2096ebb67df1b975face6df6e04a7f42de3913bca881d883f43f451e09829 not found: ID does not exist" containerID="2bc2096ebb67df1b975face6df6e04a7f42de3913bca881d883f43f451e09829" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.917775 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bc2096ebb67df1b975face6df6e04a7f42de3913bca881d883f43f451e09829"} err="failed to get container status \"2bc2096ebb67df1b975face6df6e04a7f42de3913bca881d883f43f451e09829\": rpc error: code = NotFound desc = could not find container 
\"2bc2096ebb67df1b975face6df6e04a7f42de3913bca881d883f43f451e09829\": container with ID starting with 2bc2096ebb67df1b975face6df6e04a7f42de3913bca881d883f43f451e09829 not found: ID does not exist" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.920260 5002 scope.go:117] "RemoveContainer" containerID="096fdd6b53668d812090b7ef62862cd5342edfc3f386b08c4d32f44285dbfa66" Dec 09 10:22:45 crc kubenswrapper[5002]: E1209 10:22:45.927228 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"096fdd6b53668d812090b7ef62862cd5342edfc3f386b08c4d32f44285dbfa66\": container with ID starting with 096fdd6b53668d812090b7ef62862cd5342edfc3f386b08c4d32f44285dbfa66 not found: ID does not exist" containerID="096fdd6b53668d812090b7ef62862cd5342edfc3f386b08c4d32f44285dbfa66" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.927286 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"096fdd6b53668d812090b7ef62862cd5342edfc3f386b08c4d32f44285dbfa66"} err="failed to get container status \"096fdd6b53668d812090b7ef62862cd5342edfc3f386b08c4d32f44285dbfa66\": rpc error: code = NotFound desc = could not find container \"096fdd6b53668d812090b7ef62862cd5342edfc3f386b08c4d32f44285dbfa66\": container with ID starting with 096fdd6b53668d812090b7ef62862cd5342edfc3f386b08c4d32f44285dbfa66 not found: ID does not exist" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.927335 5002 scope.go:117] "RemoveContainer" containerID="2bc2096ebb67df1b975face6df6e04a7f42de3913bca881d883f43f451e09829" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.928095 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bc2096ebb67df1b975face6df6e04a7f42de3913bca881d883f43f451e09829"} err="failed to get container status \"2bc2096ebb67df1b975face6df6e04a7f42de3913bca881d883f43f451e09829\": rpc error: code = NotFound desc = could not find container \"2bc2096ebb67df1b975face6df6e04a7f42de3913bca881d883f43f451e09829\": container with ID starting with 2bc2096ebb67df1b975face6df6e04a7f42de3913bca881d883f43f451e09829 not found: ID does not exist" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.928156 5002 scope.go:117] "RemoveContainer" containerID="096fdd6b53668d812090b7ef62862cd5342edfc3f386b08c4d32f44285dbfa66" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.930987 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"096fdd6b53668d812090b7ef62862cd5342edfc3f386b08c4d32f44285dbfa66"} err="failed to get container status \"096fdd6b53668d812090b7ef62862cd5342edfc3f386b08c4d32f44285dbfa66\": rpc error: code = NotFound desc = could not find container \"096fdd6b53668d812090b7ef62862cd5342edfc3f386b08c4d32f44285dbfa66\": container with ID starting with 096fdd6b53668d812090b7ef62862cd5342edfc3f386b08c4d32f44285dbfa66 not found: ID does not exist" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.931024 5002 scope.go:117] "RemoveContainer" containerID="473e3dfd3dbf213aeb987539706506bc04d20d03a4539c96c16ec84e35a95b56" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.972780 5002 scope.go:117] "RemoveContainer" containerID="6ef1103090d4b3937fed5f5bb7e4f7971545d5a53212bed951a69d369cd8977f" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.980131 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 
10:22:45.997265 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54351653-7ebd-40ba-8181-bb1023f18190-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"54351653-7ebd-40ba-8181-bb1023f18190\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.997307 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54351653-7ebd-40ba-8181-bb1023f18190-logs\") pod \"glance-default-internal-api-0\" (UID: \"54351653-7ebd-40ba-8181-bb1023f18190\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.997343 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/54351653-7ebd-40ba-8181-bb1023f18190-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"54351653-7ebd-40ba-8181-bb1023f18190\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.997400 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/54351653-7ebd-40ba-8181-bb1023f18190-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"54351653-7ebd-40ba-8181-bb1023f18190\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.997425 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww85p\" (UniqueName: \"kubernetes.io/projected/54351653-7ebd-40ba-8181-bb1023f18190-kube-api-access-ww85p\") pod \"glance-default-internal-api-0\" (UID: \"54351653-7ebd-40ba-8181-bb1023f18190\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.997449 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54351653-7ebd-40ba-8181-bb1023f18190-scripts\") pod \"glance-default-internal-api-0\" (UID: \"54351653-7ebd-40ba-8181-bb1023f18190\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.997470 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"54351653-7ebd-40ba-8181-bb1023f18190\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:22:45 crc kubenswrapper[5002]: I1209 10:22:45.997593 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54351653-7ebd-40ba-8181-bb1023f18190-config-data\") pod \"glance-default-internal-api-0\" (UID: \"54351653-7ebd-40ba-8181-bb1023f18190\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.048002 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5c99967b8c-vjq4g" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.101542 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/54351653-7ebd-40ba-8181-bb1023f18190-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"54351653-7ebd-40ba-8181-bb1023f18190\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.101768 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54351653-7ebd-40ba-8181-bb1023f18190-logs\") pod \"glance-default-internal-api-0\" (UID: \"54351653-7ebd-40ba-8181-bb1023f18190\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.101839 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/54351653-7ebd-40ba-8181-bb1023f18190-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"54351653-7ebd-40ba-8181-bb1023f18190\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.101966 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/54351653-7ebd-40ba-8181-bb1023f18190-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"54351653-7ebd-40ba-8181-bb1023f18190\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.102094 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww85p\" (UniqueName: \"kubernetes.io/projected/54351653-7ebd-40ba-8181-bb1023f18190-kube-api-access-ww85p\") pod \"glance-default-internal-api-0\" (UID: \"54351653-7ebd-40ba-8181-bb1023f18190\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.102135 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54351653-7ebd-40ba-8181-bb1023f18190-scripts\") pod \"glance-default-internal-api-0\" (UID: \"54351653-7ebd-40ba-8181-bb1023f18190\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.102158 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"54351653-7ebd-40ba-8181-bb1023f18190\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.102303 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54351653-7ebd-40ba-8181-bb1023f18190-config-data\") pod \"glance-default-internal-api-0\" (UID: \"54351653-7ebd-40ba-8181-bb1023f18190\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.102421 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54351653-7ebd-40ba-8181-bb1023f18190-logs\") pod \"glance-default-internal-api-0\" (UID: \"54351653-7ebd-40ba-8181-bb1023f18190\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.103022 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/54351653-7ebd-40ba-8181-bb1023f18190-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"54351653-7ebd-40ba-8181-bb1023f18190\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.106283 5002 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"54351653-7ebd-40ba-8181-bb1023f18190\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.117593 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54351653-7ebd-40ba-8181-bb1023f18190-scripts\") pod \"glance-default-internal-api-0\" (UID: \"54351653-7ebd-40ba-8181-bb1023f18190\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.123721 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54351653-7ebd-40ba-8181-bb1023f18190-config-data\") pod \"glance-default-internal-api-0\" (UID: \"54351653-7ebd-40ba-8181-bb1023f18190\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.128418 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80f62273-a2cb-48fc-9dc4-e3bbe09bb517" path="/var/lib/kubelet/pods/80f62273-a2cb-48fc-9dc4-e3bbe09bb517/volumes" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.134411 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/54351653-7ebd-40ba-8181-bb1023f18190-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"54351653-7ebd-40ba-8181-bb1023f18190\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.135638 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c75244ea-a44f-4b49-b1dc-05a025bda463" path="/var/lib/kubelet/pods/c75244ea-a44f-4b49-b1dc-05a025bda463/volumes" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.136261 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5c99967b8c-vjq4g" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.136286 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.136303 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.136319 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.137466 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww85p\" (UniqueName: \"kubernetes.io/projected/54351653-7ebd-40ba-8181-bb1023f18190-kube-api-access-ww85p\") pod \"glance-default-internal-api-0\" (UID: \"54351653-7ebd-40ba-8181-bb1023f18190\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.142009 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.145177 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.145487 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.145561 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54351653-7ebd-40ba-8181-bb1023f18190-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"54351653-7ebd-40ba-8181-bb1023f18190\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.149576 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.175576 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.216455 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"54351653-7ebd-40ba-8181-bb1023f18190\") " pod="openstack/glance-default-internal-api-0" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.320562 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7tg9\" (UniqueName: \"kubernetes.io/projected/02c94bee-a522-4ea6-85af-1ba68e174203-kube-api-access-q7tg9\") pod \"cinder-api-0\" (UID: \"02c94bee-a522-4ea6-85af-1ba68e174203\") " pod="openstack/cinder-api-0" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.320649 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/02c94bee-a522-4ea6-85af-1ba68e174203-public-tls-certs\") pod \"cinder-api-0\" (UID: \"02c94bee-a522-4ea6-85af-1ba68e174203\") " pod="openstack/cinder-api-0" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.320682 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/02c94bee-a522-4ea6-85af-1ba68e174203-config-data-custom\") pod \"cinder-api-0\" (UID: \"02c94bee-a522-4ea6-85af-1ba68e174203\") " pod="openstack/cinder-api-0" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.320726 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02c94bee-a522-4ea6-85af-1ba68e174203-scripts\") pod \"cinder-api-0\" (UID: \"02c94bee-a522-4ea6-85af-1ba68e174203\") " pod="openstack/cinder-api-0" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.320748 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02c94bee-a522-4ea6-85af-1ba68e174203-logs\") pod \"cinder-api-0\" (UID: \"02c94bee-a522-4ea6-85af-1ba68e174203\") " pod="openstack/cinder-api-0" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.320838 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/02c94bee-a522-4ea6-85af-1ba68e174203-etc-machine-id\") pod \"cinder-api-0\" (UID: \"02c94bee-a522-4ea6-85af-1ba68e174203\") " pod="openstack/cinder-api-0" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.320956 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02c94bee-a522-4ea6-85af-1ba68e174203-config-data\") pod \"cinder-api-0\" (UID: \"02c94bee-a522-4ea6-85af-1ba68e174203\") " pod="openstack/cinder-api-0" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.320999 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/02c94bee-a522-4ea6-85af-1ba68e174203-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"02c94bee-a522-4ea6-85af-1ba68e174203\") " pod="openstack/cinder-api-0" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.321104 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02c94bee-a522-4ea6-85af-1ba68e174203-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"02c94bee-a522-4ea6-85af-1ba68e174203\") " pod="openstack/cinder-api-0" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.344124 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-k7b9g"] Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.345548 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-k7b9g" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.348993 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.359342 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.359573 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-xzzv7" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.373438 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-k7b9g"] Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.423108 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02c94bee-a522-4ea6-85af-1ba68e174203-config-data\") pod \"cinder-api-0\" (UID: \"02c94bee-a522-4ea6-85af-1ba68e174203\") " pod="openstack/cinder-api-0" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.423157 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/02c94bee-a522-4ea6-85af-1ba68e174203-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"02c94bee-a522-4ea6-85af-1ba68e174203\") " pod="openstack/cinder-api-0" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.423207 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02c94bee-a522-4ea6-85af-1ba68e174203-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"02c94bee-a522-4ea6-85af-1ba68e174203\") " pod="openstack/cinder-api-0" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.423254 5002 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-q7tg9\" (UniqueName: \"kubernetes.io/projected/02c94bee-a522-4ea6-85af-1ba68e174203-kube-api-access-q7tg9\") pod \"cinder-api-0\" (UID: \"02c94bee-a522-4ea6-85af-1ba68e174203\") " pod="openstack/cinder-api-0" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.423280 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/02c94bee-a522-4ea6-85af-1ba68e174203-public-tls-certs\") pod \"cinder-api-0\" (UID: \"02c94bee-a522-4ea6-85af-1ba68e174203\") " pod="openstack/cinder-api-0" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.423297 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/02c94bee-a522-4ea6-85af-1ba68e174203-config-data-custom\") pod \"cinder-api-0\" (UID: \"02c94bee-a522-4ea6-85af-1ba68e174203\") " pod="openstack/cinder-api-0" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.423311 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02c94bee-a522-4ea6-85af-1ba68e174203-scripts\") pod \"cinder-api-0\" (UID: \"02c94bee-a522-4ea6-85af-1ba68e174203\") " pod="openstack/cinder-api-0" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.423327 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02c94bee-a522-4ea6-85af-1ba68e174203-logs\") pod \"cinder-api-0\" (UID: \"02c94bee-a522-4ea6-85af-1ba68e174203\") " pod="openstack/cinder-api-0" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.423358 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/02c94bee-a522-4ea6-85af-1ba68e174203-etc-machine-id\") pod \"cinder-api-0\" (UID: \"02c94bee-a522-4ea6-85af-1ba68e174203\") " pod="openstack/cinder-api-0" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.423462 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/02c94bee-a522-4ea6-85af-1ba68e174203-etc-machine-id\") pod \"cinder-api-0\" (UID: \"02c94bee-a522-4ea6-85af-1ba68e174203\") " pod="openstack/cinder-api-0" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.431803 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02c94bee-a522-4ea6-85af-1ba68e174203-logs\") pod \"cinder-api-0\" (UID: \"02c94bee-a522-4ea6-85af-1ba68e174203\") " pod="openstack/cinder-api-0" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.433226 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02c94bee-a522-4ea6-85af-1ba68e174203-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"02c94bee-a522-4ea6-85af-1ba68e174203\") " pod="openstack/cinder-api-0" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.433621 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02c94bee-a522-4ea6-85af-1ba68e174203-scripts\") pod \"cinder-api-0\" (UID: \"02c94bee-a522-4ea6-85af-1ba68e174203\") " pod="openstack/cinder-api-0" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.440386 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/02c94bee-a522-4ea6-85af-1ba68e174203-config-data-custom\") pod \"cinder-api-0\" (UID: \"02c94bee-a522-4ea6-85af-1ba68e174203\") " pod="openstack/cinder-api-0" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.451168 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7tg9\" (UniqueName: \"kubernetes.io/projected/02c94bee-a522-4ea6-85af-1ba68e174203-kube-api-access-q7tg9\") pod \"cinder-api-0\" (UID: \"02c94bee-a522-4ea6-85af-1ba68e174203\") " pod="openstack/cinder-api-0" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.453216 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02c94bee-a522-4ea6-85af-1ba68e174203-config-data\") pod \"cinder-api-0\" (UID: \"02c94bee-a522-4ea6-85af-1ba68e174203\") " pod="openstack/cinder-api-0" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.462248 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/02c94bee-a522-4ea6-85af-1ba68e174203-public-tls-certs\") pod \"cinder-api-0\" (UID: \"02c94bee-a522-4ea6-85af-1ba68e174203\") " pod="openstack/cinder-api-0" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.465088 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/02c94bee-a522-4ea6-85af-1ba68e174203-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"02c94bee-a522-4ea6-85af-1ba68e174203\") " pod="openstack/cinder-api-0" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.487264 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.505036 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.524890 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26e2d58a-f6d2-4e30-a327-042f181b7ba0-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-k7b9g\" (UID: \"26e2d58a-f6d2-4e30-a327-042f181b7ba0\") " pod="openstack/nova-cell0-conductor-db-sync-k7b9g" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.524944 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26e2d58a-f6d2-4e30-a327-042f181b7ba0-config-data\") pod \"nova-cell0-conductor-db-sync-k7b9g\" (UID: \"26e2d58a-f6d2-4e30-a327-042f181b7ba0\") " pod="openstack/nova-cell0-conductor-db-sync-k7b9g" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.525011 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26e2d58a-f6d2-4e30-a327-042f181b7ba0-scripts\") pod \"nova-cell0-conductor-db-sync-k7b9g\" (UID: \"26e2d58a-f6d2-4e30-a327-042f181b7ba0\") " pod="openstack/nova-cell0-conductor-db-sync-k7b9g" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.525058 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbbdc\" (UniqueName: \"kubernetes.io/projected/26e2d58a-f6d2-4e30-a327-042f181b7ba0-kube-api-access-pbbdc\") pod \"nova-cell0-conductor-db-sync-k7b9g\" (UID: \"26e2d58a-f6d2-4e30-a327-042f181b7ba0\") " pod="openstack/nova-cell0-conductor-db-sync-k7b9g" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.626659 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26e2d58a-f6d2-4e30-a327-042f181b7ba0-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-k7b9g\" (UID: \"26e2d58a-f6d2-4e30-a327-042f181b7ba0\") " pod="openstack/nova-cell0-conductor-db-sync-k7b9g" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.626718 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26e2d58a-f6d2-4e30-a327-042f181b7ba0-config-data\") pod \"nova-cell0-conductor-db-sync-k7b9g\" (UID: \"26e2d58a-f6d2-4e30-a327-042f181b7ba0\") " pod="openstack/nova-cell0-conductor-db-sync-k7b9g" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.626778 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26e2d58a-f6d2-4e30-a327-042f181b7ba0-scripts\") pod \"nova-cell0-conductor-db-sync-k7b9g\" (UID: \"26e2d58a-f6d2-4e30-a327-042f181b7ba0\") " pod="openstack/nova-cell0-conductor-db-sync-k7b9g" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.626840 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbbdc\" (UniqueName: \"kubernetes.io/projected/26e2d58a-f6d2-4e30-a327-042f181b7ba0-kube-api-access-pbbdc\") pod \"nova-cell0-conductor-db-sync-k7b9g\" (UID: \"26e2d58a-f6d2-4e30-a327-042f181b7ba0\") " pod="openstack/nova-cell0-conductor-db-sync-k7b9g" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.634550 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/26e2d58a-f6d2-4e30-a327-042f181b7ba0-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-k7b9g\" (UID: \"26e2d58a-f6d2-4e30-a327-042f181b7ba0\") " pod="openstack/nova-cell0-conductor-db-sync-k7b9g" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.634759 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26e2d58a-f6d2-4e30-a327-042f181b7ba0-scripts\") pod \"nova-cell0-conductor-db-sync-k7b9g\" (UID: \"26e2d58a-f6d2-4e30-a327-042f181b7ba0\") " pod="openstack/nova-cell0-conductor-db-sync-k7b9g" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.635036 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26e2d58a-f6d2-4e30-a327-042f181b7ba0-config-data\") pod \"nova-cell0-conductor-db-sync-k7b9g\" (UID: \"26e2d58a-f6d2-4e30-a327-042f181b7ba0\") " pod="openstack/nova-cell0-conductor-db-sync-k7b9g" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.654282 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbbdc\" (UniqueName: \"kubernetes.io/projected/26e2d58a-f6d2-4e30-a327-042f181b7ba0-kube-api-access-pbbdc\") pod \"nova-cell0-conductor-db-sync-k7b9g\" (UID: \"26e2d58a-f6d2-4e30-a327-042f181b7ba0\") " pod="openstack/nova-cell0-conductor-db-sync-k7b9g" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.676205 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-k7b9g" Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.810865 5002 generic.go:334] "Generic (PLEG): container finished" podID="d8f27813-7f40-4967-9e37-34e4ae205cb7" containerID="51397172bc6109b173eaa3e4fefc729d3fc17a5efc791a6027086db757378d60" exitCode=0 Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.810917 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-52w6p" event={"ID":"d8f27813-7f40-4967-9e37-34e4ae205cb7","Type":"ContainerDied","Data":"51397172bc6109b173eaa3e4fefc729d3fc17a5efc791a6027086db757378d60"} Dec 09 10:22:46 crc kubenswrapper[5002]: I1209 10:22:46.834697 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"65df60b6-4049-47b6-9907-ebf76c151213","Type":"ContainerStarted","Data":"dfb3a50b874247de5cc4a9c4ef2098e5c6f64cce51b380cdfee9c6796737e449"} Dec 09 10:22:47 crc kubenswrapper[5002]: W1209 10:22:47.143203 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54351653_7ebd_40ba_8181_bb1023f18190.slice/crio-6ac47bbc9469d9515e9d41c57c9542c548a6ab9b8c0c065b01deb61a1d008418 WatchSource:0}: Error finding container 6ac47bbc9469d9515e9d41c57c9542c548a6ab9b8c0c065b01deb61a1d008418: Status 404 returned error can't find the container with id 6ac47bbc9469d9515e9d41c57c9542c548a6ab9b8c0c065b01deb61a1d008418 Dec 09 10:22:47 crc kubenswrapper[5002]: I1209 10:22:47.147227 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 10:22:47 crc kubenswrapper[5002]: I1209 10:22:47.207598 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 09 10:22:47 crc kubenswrapper[5002]: I1209 10:22:47.334476 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-k7b9g"] Dec 09 10:22:47 crc kubenswrapper[5002]: I1209 10:22:47.425692 5002 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5c454948fd-lwcxn" Dec 09 10:22:47 crc kubenswrapper[5002]: I1209 10:22:47.858888 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"65df60b6-4049-47b6-9907-ebf76c151213","Type":"ContainerStarted","Data":"165a1620604d830dded2bfca6d82a903dd8294bcf0db3c611e329f3f40e88ace"} Dec 09 10:22:47 crc kubenswrapper[5002]: I1209 10:22:47.859283 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"65df60b6-4049-47b6-9907-ebf76c151213","Type":"ContainerStarted","Data":"1dd5020ee445c45b526aafb1a2e0ba3b4c5c0fb89e017224140c385ac48d4a20"} Dec 09 10:22:47 crc kubenswrapper[5002]: I1209 10:22:47.860487 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"02c94bee-a522-4ea6-85af-1ba68e174203","Type":"ContainerStarted","Data":"9e5966a72eb7d56f519b1957f1928609fb0ddbbdba8bfc272c163fdf3153bcf9"} Dec 09 10:22:47 crc kubenswrapper[5002]: I1209 10:22:47.862919 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"54351653-7ebd-40ba-8181-bb1023f18190","Type":"ContainerStarted","Data":"6ac47bbc9469d9515e9d41c57c9542c548a6ab9b8c0c065b01deb61a1d008418"} Dec 09 10:22:47 crc kubenswrapper[5002]: I1209 10:22:47.867372 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-k7b9g" event={"ID":"26e2d58a-f6d2-4e30-a327-042f181b7ba0","Type":"ContainerStarted","Data":"593a13a5551e610250f921ade7992eebca1b77088c7d85ad442799a1ff12c7ac"} Dec 09 10:22:48 crc kubenswrapper[5002]: I1209 10:22:48.135990 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a200aca9-aa0e-44c1-a2da-bab4719bf8b0" path="/var/lib/kubelet/pods/a200aca9-aa0e-44c1-a2da-bab4719bf8b0/volumes" Dec 09 10:22:48 crc kubenswrapper[5002]: E1209 10:22:48.137630 5002 kubelet_node_status.go:756] "Failed to set some node status fields" err="failed to validate nodeIP: route ip+net: no such network interface" node="crc" Dec 09 10:22:48 crc kubenswrapper[5002]: I1209 10:22:48.436420 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-52w6p" Dec 09 10:22:48 crc kubenswrapper[5002]: I1209 10:22:48.522114 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 09 10:22:48 crc kubenswrapper[5002]: I1209 10:22:48.574281 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d8f27813-7f40-4967-9e37-34e4ae205cb7-config\") pod \"d8f27813-7f40-4967-9e37-34e4ae205cb7\" (UID: \"d8f27813-7f40-4967-9e37-34e4ae205cb7\") " Dec 09 10:22:48 crc kubenswrapper[5002]: I1209 10:22:48.574356 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htx5l\" (UniqueName: \"kubernetes.io/projected/d8f27813-7f40-4967-9e37-34e4ae205cb7-kube-api-access-htx5l\") pod \"d8f27813-7f40-4967-9e37-34e4ae205cb7\" (UID: \"d8f27813-7f40-4967-9e37-34e4ae205cb7\") " Dec 09 10:22:48 crc kubenswrapper[5002]: I1209 10:22:48.574421 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8f27813-7f40-4967-9e37-34e4ae205cb7-combined-ca-bundle\") pod \"d8f27813-7f40-4967-9e37-34e4ae205cb7\" (UID: \"d8f27813-7f40-4967-9e37-34e4ae205cb7\") " Dec 09 10:22:48 crc kubenswrapper[5002]: I1209 10:22:48.578250 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8f27813-7f40-4967-9e37-34e4ae205cb7-kube-api-access-htx5l" (OuterVolumeSpecName: "kube-api-access-htx5l") pod "d8f27813-7f40-4967-9e37-34e4ae205cb7" (UID: "d8f27813-7f40-4967-9e37-34e4ae205cb7"). InnerVolumeSpecName "kube-api-access-htx5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:22:48 crc kubenswrapper[5002]: I1209 10:22:48.608046 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8f27813-7f40-4967-9e37-34e4ae205cb7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d8f27813-7f40-4967-9e37-34e4ae205cb7" (UID: "d8f27813-7f40-4967-9e37-34e4ae205cb7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:22:48 crc kubenswrapper[5002]: I1209 10:22:48.650766 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8f27813-7f40-4967-9e37-34e4ae205cb7-config" (OuterVolumeSpecName: "config") pod "d8f27813-7f40-4967-9e37-34e4ae205cb7" (UID: "d8f27813-7f40-4967-9e37-34e4ae205cb7"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:22:48 crc kubenswrapper[5002]: I1209 10:22:48.680406 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d8f27813-7f40-4967-9e37-34e4ae205cb7-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:48 crc kubenswrapper[5002]: I1209 10:22:48.680451 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htx5l\" (UniqueName: \"kubernetes.io/projected/d8f27813-7f40-4967-9e37-34e4ae205cb7-kube-api-access-htx5l\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:48 crc kubenswrapper[5002]: I1209 10:22:48.680463 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8f27813-7f40-4967-9e37-34e4ae205cb7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:48 crc kubenswrapper[5002]: I1209 10:22:48.792977 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-795f4db4bc-rlkkn" Dec 09 10:22:48 crc kubenswrapper[5002]: I1209 10:22:48.861558 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 09 10:22:48 crc kubenswrapper[5002]: I1209 10:22:48.926225 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-52w6p" event={"ID":"d8f27813-7f40-4967-9e37-34e4ae205cb7","Type":"ContainerDied","Data":"a5cbf5897b9db96ee4c754867f430c82c7025fed0cea4a490c80f9ba179946e2"} Dec 09 10:22:48 crc kubenswrapper[5002]: I1209 10:22:48.926274 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5cbf5897b9db96ee4c754867f430c82c7025fed0cea4a490c80f9ba179946e2" Dec 09 10:22:48 crc kubenswrapper[5002]: I1209 10:22:48.926354 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-52w6p" Dec 09 10:22:48 crc kubenswrapper[5002]: I1209 10:22:48.951244 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"02c94bee-a522-4ea6-85af-1ba68e174203","Type":"ContainerStarted","Data":"63586c424dc327bbd0545721f624a8d5562a690863d2d8565ad85890deded297"} Dec 09 10:22:48 crc kubenswrapper[5002]: I1209 10:22:48.951458 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-ld5mf"] Dec 09 10:22:48 crc kubenswrapper[5002]: I1209 10:22:48.951682 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-586bdc5f9-ld5mf" podUID="018e2746-8f8d-47a4-9da1-96b398185024" containerName="dnsmasq-dns" containerID="cri-o://6a8fb6e1d1004c319add636d77cefaef670741428f0596cff090a5aa87c34c97" gracePeriod=10 Dec 09 10:22:48 crc kubenswrapper[5002]: I1209 10:22:48.961230 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"54351653-7ebd-40ba-8181-bb1023f18190","Type":"ContainerStarted","Data":"1df4bb922098dea87efcb2c4b87131e9b7b7222c65f32b527e20b6017dad1370"} Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.040892 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.047888 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.047869416 podStartE2EDuration="5.047869416s" podCreationTimestamp="2025-12-09 10:22:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:22:49.03328639 +0000 UTC m=+1301.425337471" watchObservedRunningTime="2025-12-09 10:22:49.047869416 +0000 UTC m=+1301.439920497" Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.148259 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-7xtz7"] Dec 09 10:22:49 crc kubenswrapper[5002]: E1209 10:22:49.154015 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8f27813-7f40-4967-9e37-34e4ae205cb7" containerName="neutron-db-sync" Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.154062 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8f27813-7f40-4967-9e37-34e4ae205cb7" containerName="neutron-db-sync" Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.154320 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8f27813-7f40-4967-9e37-34e4ae205cb7" containerName="neutron-db-sync" Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.156319 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-7xtz7" Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.188378 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-7xtz7"] Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.310213 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-67fdf7874b-b25q8"] Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.319839 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7906530a-fb62-45dc-8b2c-f4c52045fa70-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-7xtz7\" (UID: \"7906530a-fb62-45dc-8b2c-f4c52045fa70\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7xtz7" Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.319929 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7906530a-fb62-45dc-8b2c-f4c52045fa70-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-7xtz7\" (UID: \"7906530a-fb62-45dc-8b2c-f4c52045fa70\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7xtz7" Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.319968 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7906530a-fb62-45dc-8b2c-f4c52045fa70-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-7xtz7\" (UID: \"7906530a-fb62-45dc-8b2c-f4c52045fa70\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7xtz7" Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.320006 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7906530a-fb62-45dc-8b2c-f4c52045fa70-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-7xtz7\" (UID: \"7906530a-fb62-45dc-8b2c-f4c52045fa70\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7xtz7" Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.320069 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpp4w\" (UniqueName: \"kubernetes.io/projected/7906530a-fb62-45dc-8b2c-f4c52045fa70-kube-api-access-dpp4w\") pod \"dnsmasq-dns-5c9776ccc5-7xtz7\" (UID: \"7906530a-fb62-45dc-8b2c-f4c52045fa70\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7xtz7" Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.320108 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7906530a-fb62-45dc-8b2c-f4c52045fa70-config\") pod \"dnsmasq-dns-5c9776ccc5-7xtz7\" (UID: \"7906530a-fb62-45dc-8b2c-f4c52045fa70\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7xtz7" Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.357935 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-67fdf7874b-b25q8" Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.362505 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.363047 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.363267 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.363449 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-7tnw8" Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.371539 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-67fdf7874b-b25q8"] Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.421664 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpp4w\" (UniqueName: \"kubernetes.io/projected/7906530a-fb62-45dc-8b2c-f4c52045fa70-kube-api-access-dpp4w\") pod \"dnsmasq-dns-5c9776ccc5-7xtz7\" (UID: \"7906530a-fb62-45dc-8b2c-f4c52045fa70\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7xtz7" Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.421731 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7906530a-fb62-45dc-8b2c-f4c52045fa70-config\") pod \"dnsmasq-dns-5c9776ccc5-7xtz7\" (UID: \"7906530a-fb62-45dc-8b2c-f4c52045fa70\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7xtz7" Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.421857 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7906530a-fb62-45dc-8b2c-f4c52045fa70-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-7xtz7\" (UID: \"7906530a-fb62-45dc-8b2c-f4c52045fa70\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7xtz7" Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.421927 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7906530a-fb62-45dc-8b2c-f4c52045fa70-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-7xtz7\" (UID: \"7906530a-fb62-45dc-8b2c-f4c52045fa70\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7xtz7" Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.421951 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7906530a-fb62-45dc-8b2c-f4c52045fa70-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-7xtz7\" (UID: \"7906530a-fb62-45dc-8b2c-f4c52045fa70\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7xtz7" Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.422000 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7906530a-fb62-45dc-8b2c-f4c52045fa70-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-7xtz7\" (UID: \"7906530a-fb62-45dc-8b2c-f4c52045fa70\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7xtz7" Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.423021 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7906530a-fb62-45dc-8b2c-f4c52045fa70-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-7xtz7\" (UID: 
\"7906530a-fb62-45dc-8b2c-f4c52045fa70\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7xtz7" Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.423063 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7906530a-fb62-45dc-8b2c-f4c52045fa70-config\") pod \"dnsmasq-dns-5c9776ccc5-7xtz7\" (UID: \"7906530a-fb62-45dc-8b2c-f4c52045fa70\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7xtz7" Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.423029 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7906530a-fb62-45dc-8b2c-f4c52045fa70-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-7xtz7\" (UID: \"7906530a-fb62-45dc-8b2c-f4c52045fa70\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7xtz7" Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.423683 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7906530a-fb62-45dc-8b2c-f4c52045fa70-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-7xtz7\" (UID: \"7906530a-fb62-45dc-8b2c-f4c52045fa70\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7xtz7" Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.423734 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7906530a-fb62-45dc-8b2c-f4c52045fa70-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-7xtz7\" (UID: \"7906530a-fb62-45dc-8b2c-f4c52045fa70\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7xtz7" Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.445475 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpp4w\" (UniqueName: \"kubernetes.io/projected/7906530a-fb62-45dc-8b2c-f4c52045fa70-kube-api-access-dpp4w\") pod \"dnsmasq-dns-5c9776ccc5-7xtz7\" (UID: \"7906530a-fb62-45dc-8b2c-f4c52045fa70\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7xtz7" Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.525012 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/203af364-55a8-46c1-be90-22ed4849dec0-combined-ca-bundle\") pod \"neutron-67fdf7874b-b25q8\" (UID: \"203af364-55a8-46c1-be90-22ed4849dec0\") " pod="openstack/neutron-67fdf7874b-b25q8" Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.525373 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/203af364-55a8-46c1-be90-22ed4849dec0-ovndb-tls-certs\") pod \"neutron-67fdf7874b-b25q8\" (UID: \"203af364-55a8-46c1-be90-22ed4849dec0\") " pod="openstack/neutron-67fdf7874b-b25q8" Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.525427 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/203af364-55a8-46c1-be90-22ed4849dec0-config\") pod \"neutron-67fdf7874b-b25q8\" (UID: \"203af364-55a8-46c1-be90-22ed4849dec0\") " pod="openstack/neutron-67fdf7874b-b25q8" Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.525475 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sz6w\" (UniqueName: \"kubernetes.io/projected/203af364-55a8-46c1-be90-22ed4849dec0-kube-api-access-6sz6w\") pod \"neutron-67fdf7874b-b25q8\" (UID: \"203af364-55a8-46c1-be90-22ed4849dec0\") " 
pod="openstack/neutron-67fdf7874b-b25q8" Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.525560 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/203af364-55a8-46c1-be90-22ed4849dec0-httpd-config\") pod \"neutron-67fdf7874b-b25q8\" (UID: \"203af364-55a8-46c1-be90-22ed4849dec0\") " pod="openstack/neutron-67fdf7874b-b25q8" Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.533226 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-7xtz7" Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.630226 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sz6w\" (UniqueName: \"kubernetes.io/projected/203af364-55a8-46c1-be90-22ed4849dec0-kube-api-access-6sz6w\") pod \"neutron-67fdf7874b-b25q8\" (UID: \"203af364-55a8-46c1-be90-22ed4849dec0\") " pod="openstack/neutron-67fdf7874b-b25q8" Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.630346 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/203af364-55a8-46c1-be90-22ed4849dec0-httpd-config\") pod \"neutron-67fdf7874b-b25q8\" (UID: \"203af364-55a8-46c1-be90-22ed4849dec0\") " pod="openstack/neutron-67fdf7874b-b25q8" Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.630574 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/203af364-55a8-46c1-be90-22ed4849dec0-combined-ca-bundle\") pod \"neutron-67fdf7874b-b25q8\" (UID: \"203af364-55a8-46c1-be90-22ed4849dec0\") " pod="openstack/neutron-67fdf7874b-b25q8" Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.630597 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/203af364-55a8-46c1-be90-22ed4849dec0-ovndb-tls-certs\") pod \"neutron-67fdf7874b-b25q8\" (UID: \"203af364-55a8-46c1-be90-22ed4849dec0\") " pod="openstack/neutron-67fdf7874b-b25q8" Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.630628 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/203af364-55a8-46c1-be90-22ed4849dec0-config\") pod \"neutron-67fdf7874b-b25q8\" (UID: \"203af364-55a8-46c1-be90-22ed4849dec0\") " pod="openstack/neutron-67fdf7874b-b25q8" Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.638613 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/203af364-55a8-46c1-be90-22ed4849dec0-config\") pod \"neutron-67fdf7874b-b25q8\" (UID: \"203af364-55a8-46c1-be90-22ed4849dec0\") " pod="openstack/neutron-67fdf7874b-b25q8" Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.643139 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/203af364-55a8-46c1-be90-22ed4849dec0-ovndb-tls-certs\") pod \"neutron-67fdf7874b-b25q8\" (UID: \"203af364-55a8-46c1-be90-22ed4849dec0\") " pod="openstack/neutron-67fdf7874b-b25q8" Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.645960 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/203af364-55a8-46c1-be90-22ed4849dec0-httpd-config\") pod \"neutron-67fdf7874b-b25q8\" (UID: 
\"203af364-55a8-46c1-be90-22ed4849dec0\") " pod="openstack/neutron-67fdf7874b-b25q8" Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.668348 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/203af364-55a8-46c1-be90-22ed4849dec0-combined-ca-bundle\") pod \"neutron-67fdf7874b-b25q8\" (UID: \"203af364-55a8-46c1-be90-22ed4849dec0\") " pod="openstack/neutron-67fdf7874b-b25q8" Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.681116 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sz6w\" (UniqueName: \"kubernetes.io/projected/203af364-55a8-46c1-be90-22ed4849dec0-kube-api-access-6sz6w\") pod \"neutron-67fdf7874b-b25q8\" (UID: \"203af364-55a8-46c1-be90-22ed4849dec0\") " pod="openstack/neutron-67fdf7874b-b25q8" Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.718969 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-67fdf7874b-b25q8" Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.777531 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-ld5mf" Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.935093 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/018e2746-8f8d-47a4-9da1-96b398185024-ovsdbserver-sb\") pod \"018e2746-8f8d-47a4-9da1-96b398185024\" (UID: \"018e2746-8f8d-47a4-9da1-96b398185024\") " Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.935516 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/018e2746-8f8d-47a4-9da1-96b398185024-config\") pod \"018e2746-8f8d-47a4-9da1-96b398185024\" (UID: \"018e2746-8f8d-47a4-9da1-96b398185024\") " Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.935637 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/018e2746-8f8d-47a4-9da1-96b398185024-dns-svc\") pod \"018e2746-8f8d-47a4-9da1-96b398185024\" (UID: \"018e2746-8f8d-47a4-9da1-96b398185024\") " Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.935670 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/018e2746-8f8d-47a4-9da1-96b398185024-dns-swift-storage-0\") pod \"018e2746-8f8d-47a4-9da1-96b398185024\" (UID: \"018e2746-8f8d-47a4-9da1-96b398185024\") " Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.935697 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/018e2746-8f8d-47a4-9da1-96b398185024-ovsdbserver-nb\") pod \"018e2746-8f8d-47a4-9da1-96b398185024\" (UID: \"018e2746-8f8d-47a4-9da1-96b398185024\") " Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.935862 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6p84\" (UniqueName: \"kubernetes.io/projected/018e2746-8f8d-47a4-9da1-96b398185024-kube-api-access-t6p84\") pod \"018e2746-8f8d-47a4-9da1-96b398185024\" (UID: \"018e2746-8f8d-47a4-9da1-96b398185024\") " Dec 09 10:22:49 crc kubenswrapper[5002]: I1209 10:22:49.939174 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5c454948fd-lwcxn" Dec 09 10:22:49 crc 
kubenswrapper[5002]: I1209 10:22:49.942628 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/018e2746-8f8d-47a4-9da1-96b398185024-kube-api-access-t6p84" (OuterVolumeSpecName: "kube-api-access-t6p84") pod "018e2746-8f8d-47a4-9da1-96b398185024" (UID: "018e2746-8f8d-47a4-9da1-96b398185024"). InnerVolumeSpecName "kube-api-access-t6p84". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:22:50 crc kubenswrapper[5002]: I1209 10:22:50.027247 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/018e2746-8f8d-47a4-9da1-96b398185024-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "018e2746-8f8d-47a4-9da1-96b398185024" (UID: "018e2746-8f8d-47a4-9da1-96b398185024"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:22:50 crc kubenswrapper[5002]: I1209 10:22:50.038618 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6b88b97456-wb29h"] Dec 09 10:22:50 crc kubenswrapper[5002]: I1209 10:22:50.038865 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6b88b97456-wb29h" podUID="735c673a-9bc6-4a3c-b553-dea3ba521606" containerName="barbican-api-log" containerID="cri-o://128dbd0c6d6c70bb16748e87cc8313e3b27d9a7ea5f00558e5eda5914f68ae2f" gracePeriod=30 Dec 09 10:22:50 crc kubenswrapper[5002]: I1209 10:22:50.039271 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6b88b97456-wb29h" podUID="735c673a-9bc6-4a3c-b553-dea3ba521606" containerName="barbican-api" containerID="cri-o://1a628fa6f0a45b5e0de19404d3a0514188a865e101734a1d508f7f6e532a9e3e" gracePeriod=30 Dec 09 10:22:50 crc kubenswrapper[5002]: I1209 10:22:50.043373 5002 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/018e2746-8f8d-47a4-9da1-96b398185024-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:50 crc kubenswrapper[5002]: I1209 10:22:50.043420 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6p84\" (UniqueName: \"kubernetes.io/projected/018e2746-8f8d-47a4-9da1-96b398185024-kube-api-access-t6p84\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:50 crc kubenswrapper[5002]: I1209 10:22:50.045478 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/018e2746-8f8d-47a4-9da1-96b398185024-config" (OuterVolumeSpecName: "config") pod "018e2746-8f8d-47a4-9da1-96b398185024" (UID: "018e2746-8f8d-47a4-9da1-96b398185024"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:22:50 crc kubenswrapper[5002]: I1209 10:22:50.071400 5002 generic.go:334] "Generic (PLEG): container finished" podID="018e2746-8f8d-47a4-9da1-96b398185024" containerID="6a8fb6e1d1004c319add636d77cefaef670741428f0596cff090a5aa87c34c97" exitCode=0 Dec 09 10:22:50 crc kubenswrapper[5002]: I1209 10:22:50.071541 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-ld5mf" Dec 09 10:22:50 crc kubenswrapper[5002]: I1209 10:22:50.074018 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="58839cfb-488d-4d08-b077-bf23cad0fedb" containerName="cinder-scheduler" containerID="cri-o://d839cbd7dc24cd99b6bb542f2309c20c1ce28650b857540ee941d95b26b6625f" gracePeriod=30 Dec 09 10:22:50 crc kubenswrapper[5002]: I1209 10:22:50.075073 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="58839cfb-488d-4d08-b077-bf23cad0fedb" containerName="probe" containerID="cri-o://7c54fd7ef2d21fcdd7e51d8f37703e839fb35843258561382844b90dc6812e2c" gracePeriod=30 Dec 09 10:22:50 crc kubenswrapper[5002]: I1209 10:22:50.104606 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"54351653-7ebd-40ba-8181-bb1023f18190","Type":"ContainerStarted","Data":"1e6f6d93e3bd5dcd93496f1389572c1e7e7d5c754f5b03c5cf8bcd4e3532e5b0"} Dec 09 10:22:50 crc kubenswrapper[5002]: I1209 10:22:50.104651 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-ld5mf" event={"ID":"018e2746-8f8d-47a4-9da1-96b398185024","Type":"ContainerDied","Data":"6a8fb6e1d1004c319add636d77cefaef670741428f0596cff090a5aa87c34c97"} Dec 09 10:22:50 crc kubenswrapper[5002]: I1209 10:22:50.104667 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-ld5mf" event={"ID":"018e2746-8f8d-47a4-9da1-96b398185024","Type":"ContainerDied","Data":"d04e99a64e236bde486bf8dd4c49f3c3745c6aee050572e964b62b2246badbae"} Dec 09 10:22:50 crc kubenswrapper[5002]: I1209 10:22:50.104677 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"02c94bee-a522-4ea6-85af-1ba68e174203","Type":"ContainerStarted","Data":"d1c65c897a0495448b0bba138435b4e6ce0da36de283bd456d1269f9d3226c84"} Dec 09 10:22:50 crc kubenswrapper[5002]: I1209 10:22:50.104695 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 09 10:22:50 crc kubenswrapper[5002]: I1209 10:22:50.104729 5002 scope.go:117] "RemoveContainer" containerID="6a8fb6e1d1004c319add636d77cefaef670741428f0596cff090a5aa87c34c97" Dec 09 10:22:50 crc kubenswrapper[5002]: I1209 10:22:50.106682 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/018e2746-8f8d-47a4-9da1-96b398185024-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "018e2746-8f8d-47a4-9da1-96b398185024" (UID: "018e2746-8f8d-47a4-9da1-96b398185024"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:22:50 crc kubenswrapper[5002]: I1209 10:22:50.111863 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.11184081 podStartE2EDuration="5.11184081s" podCreationTimestamp="2025-12-09 10:22:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:22:50.105667211 +0000 UTC m=+1302.497718292" watchObservedRunningTime="2025-12-09 10:22:50.11184081 +0000 UTC m=+1302.503891891" Dec 09 10:22:50 crc kubenswrapper[5002]: I1209 10:22:50.124113 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/018e2746-8f8d-47a4-9da1-96b398185024-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "018e2746-8f8d-47a4-9da1-96b398185024" (UID: "018e2746-8f8d-47a4-9da1-96b398185024"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:22:50 crc kubenswrapper[5002]: I1209 10:22:50.144764 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/018e2746-8f8d-47a4-9da1-96b398185024-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:50 crc kubenswrapper[5002]: I1209 10:22:50.144966 5002 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/018e2746-8f8d-47a4-9da1-96b398185024-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:50 crc kubenswrapper[5002]: I1209 10:22:50.144980 5002 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/018e2746-8f8d-47a4-9da1-96b398185024-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:50 crc kubenswrapper[5002]: I1209 10:22:50.168680 5002 scope.go:117] "RemoveContainer" containerID="a60290bc2cab4a6248037321998fe19da156637feb660eb827cf1143edb3e0d8" Dec 09 10:22:50 crc kubenswrapper[5002]: I1209 10:22:50.177898 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.177876175 podStartE2EDuration="4.177876175s" podCreationTimestamp="2025-12-09 10:22:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:22:50.135426719 +0000 UTC m=+1302.527477800" watchObservedRunningTime="2025-12-09 10:22:50.177876175 +0000 UTC m=+1302.569927256" Dec 09 10:22:50 crc kubenswrapper[5002]: I1209 10:22:50.180463 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/018e2746-8f8d-47a4-9da1-96b398185024-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "018e2746-8f8d-47a4-9da1-96b398185024" (UID: "018e2746-8f8d-47a4-9da1-96b398185024"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:22:50 crc kubenswrapper[5002]: I1209 10:22:50.197184 5002 scope.go:117] "RemoveContainer" containerID="6a8fb6e1d1004c319add636d77cefaef670741428f0596cff090a5aa87c34c97" Dec 09 10:22:50 crc kubenswrapper[5002]: E1209 10:22:50.208956 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a8fb6e1d1004c319add636d77cefaef670741428f0596cff090a5aa87c34c97\": container with ID starting with 6a8fb6e1d1004c319add636d77cefaef670741428f0596cff090a5aa87c34c97 not found: ID does not exist" containerID="6a8fb6e1d1004c319add636d77cefaef670741428f0596cff090a5aa87c34c97" Dec 09 10:22:50 crc kubenswrapper[5002]: I1209 10:22:50.209297 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a8fb6e1d1004c319add636d77cefaef670741428f0596cff090a5aa87c34c97"} err="failed to get container status \"6a8fb6e1d1004c319add636d77cefaef670741428f0596cff090a5aa87c34c97\": rpc error: code = NotFound desc = could not find container \"6a8fb6e1d1004c319add636d77cefaef670741428f0596cff090a5aa87c34c97\": container with ID starting with 6a8fb6e1d1004c319add636d77cefaef670741428f0596cff090a5aa87c34c97 not found: ID does not exist" Dec 09 10:22:50 crc kubenswrapper[5002]: I1209 10:22:50.209323 5002 scope.go:117] "RemoveContainer" containerID="a60290bc2cab4a6248037321998fe19da156637feb660eb827cf1143edb3e0d8" Dec 09 10:22:50 crc kubenswrapper[5002]: I1209 10:22:50.210216 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-7xtz7"] Dec 09 10:22:50 crc kubenswrapper[5002]: E1209 10:22:50.222121 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a60290bc2cab4a6248037321998fe19da156637feb660eb827cf1143edb3e0d8\": container with ID starting with a60290bc2cab4a6248037321998fe19da156637feb660eb827cf1143edb3e0d8 not found: ID does not exist" containerID="a60290bc2cab4a6248037321998fe19da156637feb660eb827cf1143edb3e0d8" Dec 09 10:22:50 crc kubenswrapper[5002]: I1209 10:22:50.222166 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a60290bc2cab4a6248037321998fe19da156637feb660eb827cf1143edb3e0d8"} err="failed to get container status \"a60290bc2cab4a6248037321998fe19da156637feb660eb827cf1143edb3e0d8\": rpc error: code = NotFound desc = could not find container \"a60290bc2cab4a6248037321998fe19da156637feb660eb827cf1143edb3e0d8\": container with ID starting with a60290bc2cab4a6248037321998fe19da156637feb660eb827cf1143edb3e0d8 not found: ID does not exist" Dec 09 10:22:50 crc kubenswrapper[5002]: I1209 10:22:50.246583 5002 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/018e2746-8f8d-47a4-9da1-96b398185024-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:50 crc kubenswrapper[5002]: I1209 10:22:50.462611 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-ld5mf"] Dec 09 10:22:50 crc kubenswrapper[5002]: I1209 10:22:50.471391 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-ld5mf"] Dec 09 10:22:50 crc kubenswrapper[5002]: I1209 10:22:50.570867 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-67fdf7874b-b25q8"] Dec 09 10:22:51 crc kubenswrapper[5002]: I1209 10:22:51.097136 5002 generic.go:334] "Generic (PLEG): 
container finished" podID="735c673a-9bc6-4a3c-b553-dea3ba521606" containerID="128dbd0c6d6c70bb16748e87cc8313e3b27d9a7ea5f00558e5eda5914f68ae2f" exitCode=143 Dec 09 10:22:51 crc kubenswrapper[5002]: I1209 10:22:51.097571 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b88b97456-wb29h" event={"ID":"735c673a-9bc6-4a3c-b553-dea3ba521606","Type":"ContainerDied","Data":"128dbd0c6d6c70bb16748e87cc8313e3b27d9a7ea5f00558e5eda5914f68ae2f"} Dec 09 10:22:51 crc kubenswrapper[5002]: I1209 10:22:51.099589 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67fdf7874b-b25q8" event={"ID":"203af364-55a8-46c1-be90-22ed4849dec0","Type":"ContainerStarted","Data":"66f50d7fd24bb1f772b27308824711561e85c01d58e40c73bede64cc9ad62d06"} Dec 09 10:22:51 crc kubenswrapper[5002]: I1209 10:22:51.099618 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67fdf7874b-b25q8" event={"ID":"203af364-55a8-46c1-be90-22ed4849dec0","Type":"ContainerStarted","Data":"7710f7d4c6e1d9402e7b5533673611452772302c2eb8ee6d2afb38c09c6a83c9"} Dec 09 10:22:51 crc kubenswrapper[5002]: I1209 10:22:51.108073 5002 generic.go:334] "Generic (PLEG): container finished" podID="7906530a-fb62-45dc-8b2c-f4c52045fa70" containerID="c82c4f8c5fb3dcb52f79b0e7b2fda9f4362e5b057a2b7bf1f7117129a50248b8" exitCode=0 Dec 09 10:22:51 crc kubenswrapper[5002]: I1209 10:22:51.109624 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-7xtz7" event={"ID":"7906530a-fb62-45dc-8b2c-f4c52045fa70","Type":"ContainerDied","Data":"c82c4f8c5fb3dcb52f79b0e7b2fda9f4362e5b057a2b7bf1f7117129a50248b8"} Dec 09 10:22:51 crc kubenswrapper[5002]: I1209 10:22:51.109668 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-7xtz7" event={"ID":"7906530a-fb62-45dc-8b2c-f4c52045fa70","Type":"ContainerStarted","Data":"0df69a12aef296c6624d0f0bf0a7d1ab732d06e3ccb07659252efca566e9a74a"} Dec 09 10:22:52 crc kubenswrapper[5002]: I1209 10:22:52.077651 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="018e2746-8f8d-47a4-9da1-96b398185024" path="/var/lib/kubelet/pods/018e2746-8f8d-47a4-9da1-96b398185024/volumes" Dec 09 10:22:52 crc kubenswrapper[5002]: I1209 10:22:52.132133 5002 generic.go:334] "Generic (PLEG): container finished" podID="58839cfb-488d-4d08-b077-bf23cad0fedb" containerID="7c54fd7ef2d21fcdd7e51d8f37703e839fb35843258561382844b90dc6812e2c" exitCode=0 Dec 09 10:22:52 crc kubenswrapper[5002]: I1209 10:22:52.132223 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"58839cfb-488d-4d08-b077-bf23cad0fedb","Type":"ContainerDied","Data":"7c54fd7ef2d21fcdd7e51d8f37703e839fb35843258561382844b90dc6812e2c"} Dec 09 10:22:52 crc kubenswrapper[5002]: I1209 10:22:52.140571 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67fdf7874b-b25q8" event={"ID":"203af364-55a8-46c1-be90-22ed4849dec0","Type":"ContainerStarted","Data":"5485d6fc5b977bb2cf23ebce8d5bc7c9e77dcc1b4353f2738fe906eb7118c90b"} Dec 09 10:22:52 crc kubenswrapper[5002]: I1209 10:22:52.141686 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-67fdf7874b-b25q8" Dec 09 10:22:52 crc kubenswrapper[5002]: I1209 10:22:52.153979 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-7xtz7" 
event={"ID":"7906530a-fb62-45dc-8b2c-f4c52045fa70","Type":"ContainerStarted","Data":"2af731cf29a9b1447cebdde90c9d283671a3fa56e1feac1b8bda802337a372d2"} Dec 09 10:22:52 crc kubenswrapper[5002]: I1209 10:22:52.154735 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-7xtz7" Dec 09 10:22:52 crc kubenswrapper[5002]: I1209 10:22:52.188568 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-67fdf7874b-b25q8" podStartSLOduration=3.188544435 podStartE2EDuration="3.188544435s" podCreationTimestamp="2025-12-09 10:22:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:22:52.165198642 +0000 UTC m=+1304.557249723" watchObservedRunningTime="2025-12-09 10:22:52.188544435 +0000 UTC m=+1304.580595516" Dec 09 10:22:52 crc kubenswrapper[5002]: I1209 10:22:52.198798 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-7xtz7" podStartSLOduration=3.198776569 podStartE2EDuration="3.198776569s" podCreationTimestamp="2025-12-09 10:22:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:22:52.186399489 +0000 UTC m=+1304.578450570" watchObservedRunningTime="2025-12-09 10:22:52.198776569 +0000 UTC m=+1304.590827650" Dec 09 10:22:52 crc kubenswrapper[5002]: I1209 10:22:52.505426 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-857f77df5c-skx8f"] Dec 09 10:22:52 crc kubenswrapper[5002]: E1209 10:22:52.505786 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="018e2746-8f8d-47a4-9da1-96b398185024" containerName="dnsmasq-dns" Dec 09 10:22:52 crc kubenswrapper[5002]: I1209 10:22:52.505802 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="018e2746-8f8d-47a4-9da1-96b398185024" containerName="dnsmasq-dns" Dec 09 10:22:52 crc kubenswrapper[5002]: E1209 10:22:52.505847 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="018e2746-8f8d-47a4-9da1-96b398185024" containerName="init" Dec 09 10:22:52 crc kubenswrapper[5002]: I1209 10:22:52.505853 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="018e2746-8f8d-47a4-9da1-96b398185024" containerName="init" Dec 09 10:22:52 crc kubenswrapper[5002]: I1209 10:22:52.506022 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="018e2746-8f8d-47a4-9da1-96b398185024" containerName="dnsmasq-dns" Dec 09 10:22:52 crc kubenswrapper[5002]: I1209 10:22:52.506923 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-857f77df5c-skx8f" Dec 09 10:22:52 crc kubenswrapper[5002]: I1209 10:22:52.509458 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 09 10:22:52 crc kubenswrapper[5002]: I1209 10:22:52.509906 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 09 10:22:52 crc kubenswrapper[5002]: I1209 10:22:52.528114 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-857f77df5c-skx8f"] Dec 09 10:22:52 crc kubenswrapper[5002]: I1209 10:22:52.595856 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/41f46a2d-f158-497f-b61b-60f39c64149b-ovndb-tls-certs\") pod \"neutron-857f77df5c-skx8f\" (UID: \"41f46a2d-f158-497f-b61b-60f39c64149b\") " pod="openstack/neutron-857f77df5c-skx8f" Dec 09 10:22:52 crc kubenswrapper[5002]: I1209 10:22:52.595932 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41f46a2d-f158-497f-b61b-60f39c64149b-combined-ca-bundle\") pod \"neutron-857f77df5c-skx8f\" (UID: \"41f46a2d-f158-497f-b61b-60f39c64149b\") " pod="openstack/neutron-857f77df5c-skx8f" Dec 09 10:22:52 crc kubenswrapper[5002]: I1209 10:22:52.595970 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/41f46a2d-f158-497f-b61b-60f39c64149b-httpd-config\") pod \"neutron-857f77df5c-skx8f\" (UID: \"41f46a2d-f158-497f-b61b-60f39c64149b\") " pod="openstack/neutron-857f77df5c-skx8f" Dec 09 10:22:52 crc kubenswrapper[5002]: I1209 10:22:52.596007 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/41f46a2d-f158-497f-b61b-60f39c64149b-internal-tls-certs\") pod \"neutron-857f77df5c-skx8f\" (UID: \"41f46a2d-f158-497f-b61b-60f39c64149b\") " pod="openstack/neutron-857f77df5c-skx8f" Dec 09 10:22:52 crc kubenswrapper[5002]: I1209 10:22:52.596133 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/41f46a2d-f158-497f-b61b-60f39c64149b-public-tls-certs\") pod \"neutron-857f77df5c-skx8f\" (UID: \"41f46a2d-f158-497f-b61b-60f39c64149b\") " pod="openstack/neutron-857f77df5c-skx8f" Dec 09 10:22:52 crc kubenswrapper[5002]: I1209 10:22:52.596211 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/41f46a2d-f158-497f-b61b-60f39c64149b-config\") pod \"neutron-857f77df5c-skx8f\" (UID: \"41f46a2d-f158-497f-b61b-60f39c64149b\") " pod="openstack/neutron-857f77df5c-skx8f" Dec 09 10:22:52 crc kubenswrapper[5002]: I1209 10:22:52.596244 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpvgz\" (UniqueName: \"kubernetes.io/projected/41f46a2d-f158-497f-b61b-60f39c64149b-kube-api-access-dpvgz\") pod \"neutron-857f77df5c-skx8f\" (UID: \"41f46a2d-f158-497f-b61b-60f39c64149b\") " pod="openstack/neutron-857f77df5c-skx8f" Dec 09 10:22:52 crc kubenswrapper[5002]: I1209 10:22:52.697538 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/41f46a2d-f158-497f-b61b-60f39c64149b-ovndb-tls-certs\") pod \"neutron-857f77df5c-skx8f\" (UID: \"41f46a2d-f158-497f-b61b-60f39c64149b\") " pod="openstack/neutron-857f77df5c-skx8f" Dec 09 10:22:52 crc kubenswrapper[5002]: I1209 10:22:52.697998 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41f46a2d-f158-497f-b61b-60f39c64149b-combined-ca-bundle\") pod \"neutron-857f77df5c-skx8f\" (UID: \"41f46a2d-f158-497f-b61b-60f39c64149b\") " pod="openstack/neutron-857f77df5c-skx8f" Dec 09 10:22:52 crc kubenswrapper[5002]: I1209 10:22:52.698036 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/41f46a2d-f158-497f-b61b-60f39c64149b-httpd-config\") pod \"neutron-857f77df5c-skx8f\" (UID: \"41f46a2d-f158-497f-b61b-60f39c64149b\") " pod="openstack/neutron-857f77df5c-skx8f" Dec 09 10:22:52 crc kubenswrapper[5002]: I1209 10:22:52.698104 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/41f46a2d-f158-497f-b61b-60f39c64149b-internal-tls-certs\") pod \"neutron-857f77df5c-skx8f\" (UID: \"41f46a2d-f158-497f-b61b-60f39c64149b\") " pod="openstack/neutron-857f77df5c-skx8f" Dec 09 10:22:52 crc kubenswrapper[5002]: I1209 10:22:52.698636 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/41f46a2d-f158-497f-b61b-60f39c64149b-public-tls-certs\") pod \"neutron-857f77df5c-skx8f\" (UID: \"41f46a2d-f158-497f-b61b-60f39c64149b\") " pod="openstack/neutron-857f77df5c-skx8f" Dec 09 10:22:52 crc kubenswrapper[5002]: I1209 10:22:52.699177 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/41f46a2d-f158-497f-b61b-60f39c64149b-config\") pod \"neutron-857f77df5c-skx8f\" (UID: \"41f46a2d-f158-497f-b61b-60f39c64149b\") " pod="openstack/neutron-857f77df5c-skx8f" Dec 09 10:22:52 crc kubenswrapper[5002]: I1209 10:22:52.699235 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpvgz\" (UniqueName: \"kubernetes.io/projected/41f46a2d-f158-497f-b61b-60f39c64149b-kube-api-access-dpvgz\") pod \"neutron-857f77df5c-skx8f\" (UID: \"41f46a2d-f158-497f-b61b-60f39c64149b\") " pod="openstack/neutron-857f77df5c-skx8f" Dec 09 10:22:52 crc kubenswrapper[5002]: I1209 10:22:52.703962 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/41f46a2d-f158-497f-b61b-60f39c64149b-public-tls-certs\") pod \"neutron-857f77df5c-skx8f\" (UID: \"41f46a2d-f158-497f-b61b-60f39c64149b\") " pod="openstack/neutron-857f77df5c-skx8f" Dec 09 10:22:52 crc kubenswrapper[5002]: I1209 10:22:52.704457 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/41f46a2d-f158-497f-b61b-60f39c64149b-httpd-config\") pod \"neutron-857f77df5c-skx8f\" (UID: \"41f46a2d-f158-497f-b61b-60f39c64149b\") " pod="openstack/neutron-857f77df5c-skx8f" Dec 09 10:22:52 crc kubenswrapper[5002]: I1209 10:22:52.704657 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/41f46a2d-f158-497f-b61b-60f39c64149b-config\") pod \"neutron-857f77df5c-skx8f\" (UID: \"41f46a2d-f158-497f-b61b-60f39c64149b\") " 
pod="openstack/neutron-857f77df5c-skx8f" Dec 09 10:22:52 crc kubenswrapper[5002]: I1209 10:22:52.704772 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/41f46a2d-f158-497f-b61b-60f39c64149b-internal-tls-certs\") pod \"neutron-857f77df5c-skx8f\" (UID: \"41f46a2d-f158-497f-b61b-60f39c64149b\") " pod="openstack/neutron-857f77df5c-skx8f" Dec 09 10:22:52 crc kubenswrapper[5002]: I1209 10:22:52.705064 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/41f46a2d-f158-497f-b61b-60f39c64149b-ovndb-tls-certs\") pod \"neutron-857f77df5c-skx8f\" (UID: \"41f46a2d-f158-497f-b61b-60f39c64149b\") " pod="openstack/neutron-857f77df5c-skx8f" Dec 09 10:22:52 crc kubenswrapper[5002]: I1209 10:22:52.708640 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41f46a2d-f158-497f-b61b-60f39c64149b-combined-ca-bundle\") pod \"neutron-857f77df5c-skx8f\" (UID: \"41f46a2d-f158-497f-b61b-60f39c64149b\") " pod="openstack/neutron-857f77df5c-skx8f" Dec 09 10:22:52 crc kubenswrapper[5002]: I1209 10:22:52.722674 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpvgz\" (UniqueName: \"kubernetes.io/projected/41f46a2d-f158-497f-b61b-60f39c64149b-kube-api-access-dpvgz\") pod \"neutron-857f77df5c-skx8f\" (UID: \"41f46a2d-f158-497f-b61b-60f39c64149b\") " pod="openstack/neutron-857f77df5c-skx8f" Dec 09 10:22:52 crc kubenswrapper[5002]: I1209 10:22:52.831626 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-857f77df5c-skx8f" Dec 09 10:22:53 crc kubenswrapper[5002]: I1209 10:22:53.271556 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6b88b97456-wb29h" podUID="735c673a-9bc6-4a3c-b553-dea3ba521606" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.156:9311/healthcheck\": read tcp 10.217.0.2:57800->10.217.0.156:9311: read: connection reset by peer" Dec 09 10:22:53 crc kubenswrapper[5002]: I1209 10:22:53.271689 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6b88b97456-wb29h" podUID="735c673a-9bc6-4a3c-b553-dea3ba521606" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.156:9311/healthcheck\": read tcp 10.217.0.2:57804->10.217.0.156:9311: read: connection reset by peer" Dec 09 10:22:53 crc kubenswrapper[5002]: I1209 10:22:53.435360 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-857f77df5c-skx8f"] Dec 09 10:22:53 crc kubenswrapper[5002]: W1209 10:22:53.440996 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41f46a2d_f158_497f_b61b_60f39c64149b.slice/crio-f0ade6f71ccd72abf043b302815419aaf0c2592c32a3cad155f97678f1d96565 WatchSource:0}: Error finding container f0ade6f71ccd72abf043b302815419aaf0c2592c32a3cad155f97678f1d96565: Status 404 returned error can't find the container with id f0ade6f71ccd72abf043b302815419aaf0c2592c32a3cad155f97678f1d96565 Dec 09 10:22:53 crc kubenswrapper[5002]: I1209 10:22:53.851447 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6b88b97456-wb29h" Dec 09 10:22:53 crc kubenswrapper[5002]: I1209 10:22:53.921964 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/735c673a-9bc6-4a3c-b553-dea3ba521606-config-data-custom\") pod \"735c673a-9bc6-4a3c-b553-dea3ba521606\" (UID: \"735c673a-9bc6-4a3c-b553-dea3ba521606\") " Dec 09 10:22:53 crc kubenswrapper[5002]: I1209 10:22:53.922068 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzpx4\" (UniqueName: \"kubernetes.io/projected/735c673a-9bc6-4a3c-b553-dea3ba521606-kube-api-access-bzpx4\") pod \"735c673a-9bc6-4a3c-b553-dea3ba521606\" (UID: \"735c673a-9bc6-4a3c-b553-dea3ba521606\") " Dec 09 10:22:53 crc kubenswrapper[5002]: I1209 10:22:53.922199 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/735c673a-9bc6-4a3c-b553-dea3ba521606-logs\") pod \"735c673a-9bc6-4a3c-b553-dea3ba521606\" (UID: \"735c673a-9bc6-4a3c-b553-dea3ba521606\") " Dec 09 10:22:53 crc kubenswrapper[5002]: I1209 10:22:53.922228 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/735c673a-9bc6-4a3c-b553-dea3ba521606-combined-ca-bundle\") pod \"735c673a-9bc6-4a3c-b553-dea3ba521606\" (UID: \"735c673a-9bc6-4a3c-b553-dea3ba521606\") " Dec 09 10:22:53 crc kubenswrapper[5002]: I1209 10:22:53.922297 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/735c673a-9bc6-4a3c-b553-dea3ba521606-config-data\") pod \"735c673a-9bc6-4a3c-b553-dea3ba521606\" (UID: \"735c673a-9bc6-4a3c-b553-dea3ba521606\") " Dec 09 10:22:53 crc kubenswrapper[5002]: I1209 10:22:53.923460 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/735c673a-9bc6-4a3c-b553-dea3ba521606-logs" (OuterVolumeSpecName: "logs") pod "735c673a-9bc6-4a3c-b553-dea3ba521606" (UID: "735c673a-9bc6-4a3c-b553-dea3ba521606"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:22:53 crc kubenswrapper[5002]: I1209 10:22:53.927924 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/735c673a-9bc6-4a3c-b553-dea3ba521606-kube-api-access-bzpx4" (OuterVolumeSpecName: "kube-api-access-bzpx4") pod "735c673a-9bc6-4a3c-b553-dea3ba521606" (UID: "735c673a-9bc6-4a3c-b553-dea3ba521606"). InnerVolumeSpecName "kube-api-access-bzpx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:22:53 crc kubenswrapper[5002]: I1209 10:22:53.932273 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/735c673a-9bc6-4a3c-b553-dea3ba521606-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "735c673a-9bc6-4a3c-b553-dea3ba521606" (UID: "735c673a-9bc6-4a3c-b553-dea3ba521606"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:22:53 crc kubenswrapper[5002]: I1209 10:22:53.970719 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/735c673a-9bc6-4a3c-b553-dea3ba521606-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "735c673a-9bc6-4a3c-b553-dea3ba521606" (UID: "735c673a-9bc6-4a3c-b553-dea3ba521606"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:22:53 crc kubenswrapper[5002]: I1209 10:22:53.993981 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/735c673a-9bc6-4a3c-b553-dea3ba521606-config-data" (OuterVolumeSpecName: "config-data") pod "735c673a-9bc6-4a3c-b553-dea3ba521606" (UID: "735c673a-9bc6-4a3c-b553-dea3ba521606"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:22:54 crc kubenswrapper[5002]: I1209 10:22:54.025255 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/735c673a-9bc6-4a3c-b553-dea3ba521606-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:54 crc kubenswrapper[5002]: I1209 10:22:54.025291 5002 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/735c673a-9bc6-4a3c-b553-dea3ba521606-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:54 crc kubenswrapper[5002]: I1209 10:22:54.025304 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzpx4\" (UniqueName: \"kubernetes.io/projected/735c673a-9bc6-4a3c-b553-dea3ba521606-kube-api-access-bzpx4\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:54 crc kubenswrapper[5002]: I1209 10:22:54.025313 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/735c673a-9bc6-4a3c-b553-dea3ba521606-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:54 crc kubenswrapper[5002]: I1209 10:22:54.025321 5002 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/735c673a-9bc6-4a3c-b553-dea3ba521606-logs\") on node \"crc\" DevicePath \"\"" Dec 09 10:22:54 crc kubenswrapper[5002]: I1209 10:22:54.174052 5002 generic.go:334] "Generic (PLEG): container finished" podID="735c673a-9bc6-4a3c-b553-dea3ba521606" containerID="1a628fa6f0a45b5e0de19404d3a0514188a865e101734a1d508f7f6e532a9e3e" exitCode=0 Dec 09 10:22:54 crc kubenswrapper[5002]: I1209 10:22:54.174110 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b88b97456-wb29h" event={"ID":"735c673a-9bc6-4a3c-b553-dea3ba521606","Type":"ContainerDied","Data":"1a628fa6f0a45b5e0de19404d3a0514188a865e101734a1d508f7f6e532a9e3e"} Dec 09 10:22:54 crc kubenswrapper[5002]: I1209 10:22:54.174136 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b88b97456-wb29h" event={"ID":"735c673a-9bc6-4a3c-b553-dea3ba521606","Type":"ContainerDied","Data":"b9211056914c225f79c54432f31562b41520803d0cbee2682c1882ea8a90c11d"} Dec 09 10:22:54 crc kubenswrapper[5002]: I1209 10:22:54.174153 5002 scope.go:117] "RemoveContainer" containerID="1a628fa6f0a45b5e0de19404d3a0514188a865e101734a1d508f7f6e532a9e3e" Dec 09 10:22:54 crc kubenswrapper[5002]: I1209 10:22:54.174438 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6b88b97456-wb29h" Dec 09 10:22:54 crc kubenswrapper[5002]: I1209 10:22:54.175917 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-857f77df5c-skx8f" event={"ID":"41f46a2d-f158-497f-b61b-60f39c64149b","Type":"ContainerStarted","Data":"a034c8a435eb9196808c0d7f0f523d44f798925554bc10f46ac57bff50643ec3"} Dec 09 10:22:54 crc kubenswrapper[5002]: I1209 10:22:54.176027 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-857f77df5c-skx8f" event={"ID":"41f46a2d-f158-497f-b61b-60f39c64149b","Type":"ContainerStarted","Data":"f0ade6f71ccd72abf043b302815419aaf0c2592c32a3cad155f97678f1d96565"} Dec 09 10:22:54 crc kubenswrapper[5002]: I1209 10:22:54.199208 5002 scope.go:117] "RemoveContainer" containerID="128dbd0c6d6c70bb16748e87cc8313e3b27d9a7ea5f00558e5eda5914f68ae2f" Dec 09 10:22:54 crc kubenswrapper[5002]: I1209 10:22:54.202769 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6b88b97456-wb29h"] Dec 09 10:22:54 crc kubenswrapper[5002]: I1209 10:22:54.210727 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6b88b97456-wb29h"] Dec 09 10:22:54 crc kubenswrapper[5002]: I1209 10:22:54.221245 5002 scope.go:117] "RemoveContainer" containerID="1a628fa6f0a45b5e0de19404d3a0514188a865e101734a1d508f7f6e532a9e3e" Dec 09 10:22:54 crc kubenswrapper[5002]: E1209 10:22:54.221702 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a628fa6f0a45b5e0de19404d3a0514188a865e101734a1d508f7f6e532a9e3e\": container with ID starting with 1a628fa6f0a45b5e0de19404d3a0514188a865e101734a1d508f7f6e532a9e3e not found: ID does not exist" containerID="1a628fa6f0a45b5e0de19404d3a0514188a865e101734a1d508f7f6e532a9e3e" Dec 09 10:22:54 crc kubenswrapper[5002]: I1209 10:22:54.221734 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a628fa6f0a45b5e0de19404d3a0514188a865e101734a1d508f7f6e532a9e3e"} err="failed to get container status \"1a628fa6f0a45b5e0de19404d3a0514188a865e101734a1d508f7f6e532a9e3e\": rpc error: code = NotFound desc = could not find container \"1a628fa6f0a45b5e0de19404d3a0514188a865e101734a1d508f7f6e532a9e3e\": container with ID starting with 1a628fa6f0a45b5e0de19404d3a0514188a865e101734a1d508f7f6e532a9e3e not found: ID does not exist" Dec 09 10:22:54 crc kubenswrapper[5002]: I1209 10:22:54.221759 5002 scope.go:117] "RemoveContainer" containerID="128dbd0c6d6c70bb16748e87cc8313e3b27d9a7ea5f00558e5eda5914f68ae2f" Dec 09 10:22:54 crc kubenswrapper[5002]: E1209 10:22:54.222180 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"128dbd0c6d6c70bb16748e87cc8313e3b27d9a7ea5f00558e5eda5914f68ae2f\": container with ID starting with 128dbd0c6d6c70bb16748e87cc8313e3b27d9a7ea5f00558e5eda5914f68ae2f not found: ID does not exist" containerID="128dbd0c6d6c70bb16748e87cc8313e3b27d9a7ea5f00558e5eda5914f68ae2f" Dec 09 10:22:54 crc kubenswrapper[5002]: I1209 10:22:54.222231 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"128dbd0c6d6c70bb16748e87cc8313e3b27d9a7ea5f00558e5eda5914f68ae2f"} err="failed to get container status \"128dbd0c6d6c70bb16748e87cc8313e3b27d9a7ea5f00558e5eda5914f68ae2f\": rpc error: code = NotFound desc = could not find container \"128dbd0c6d6c70bb16748e87cc8313e3b27d9a7ea5f00558e5eda5914f68ae2f\": 
container with ID starting with 128dbd0c6d6c70bb16748e87cc8313e3b27d9a7ea5f00558e5eda5914f68ae2f not found: ID does not exist" Dec 09 10:22:55 crc kubenswrapper[5002]: I1209 10:22:55.150841 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 09 10:22:55 crc kubenswrapper[5002]: I1209 10:22:55.152068 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 09 10:22:55 crc kubenswrapper[5002]: I1209 10:22:55.194226 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 09 10:22:55 crc kubenswrapper[5002]: I1209 10:22:55.195351 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 09 10:22:55 crc kubenswrapper[5002]: I1209 10:22:55.195684 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 09 10:22:56 crc kubenswrapper[5002]: I1209 10:22:56.075255 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="735c673a-9bc6-4a3c-b553-dea3ba521606" path="/var/lib/kubelet/pods/735c673a-9bc6-4a3c-b553-dea3ba521606/volumes" Dec 09 10:22:56 crc kubenswrapper[5002]: I1209 10:22:56.203789 5002 generic.go:334] "Generic (PLEG): container finished" podID="58839cfb-488d-4d08-b077-bf23cad0fedb" containerID="d839cbd7dc24cd99b6bb542f2309c20c1ce28650b857540ee941d95b26b6625f" exitCode=0 Dec 09 10:22:56 crc kubenswrapper[5002]: I1209 10:22:56.203871 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"58839cfb-488d-4d08-b077-bf23cad0fedb","Type":"ContainerDied","Data":"d839cbd7dc24cd99b6bb542f2309c20c1ce28650b857540ee941d95b26b6625f"} Dec 09 10:22:56 crc kubenswrapper[5002]: I1209 10:22:56.204244 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 09 10:22:56 crc kubenswrapper[5002]: I1209 10:22:56.488500 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 09 10:22:56 crc kubenswrapper[5002]: I1209 10:22:56.488552 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 09 10:22:56 crc kubenswrapper[5002]: I1209 10:22:56.531259 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 09 10:22:56 crc kubenswrapper[5002]: I1209 10:22:56.548317 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 09 10:22:57 crc kubenswrapper[5002]: I1209 10:22:57.217893 5002 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 09 10:22:57 crc kubenswrapper[5002]: I1209 10:22:57.218774 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 09 10:22:57 crc kubenswrapper[5002]: I1209 10:22:57.218804 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 09 10:22:57 crc kubenswrapper[5002]: I1209 10:22:57.617518 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
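
The paired "ContainerStatus from runtime service failed" / "DeleteContainer returned error" records for the barbican-api containers above are benign: the containers were already gone from CRI-O by the time scope.go retried RemoveContainer, and the kubelet treats NotFound as "nothing left to clean up". A minimal sketch of that idempotent-cleanup pattern, using a stdlib sentinel error as a stand-in for the runtime's gRPC NotFound status (removeContainer and cleanupContainer are illustrative names, not kubelet functions):

```go
package main

import (
	"errors"
	"fmt"
)

// errNotFound stands in for the CRI runtime's gRPC NotFound status.
var errNotFound = errors.New("ID does not exist")

// removeContainer simulates a CRI RemoveContainer call racing a concurrent
// cleanup path that already deleted the container.
func removeContainer(id string) error {
	return fmt.Errorf("could not find container %q: %w", id, errNotFound)
}

// cleanupContainer treats "already gone" as success, which is why the kubelet
// logs the NotFound error above and then simply continues.
func cleanupContainer(id string) error {
	if err := removeContainer(id); err != nil && !errors.Is(err, errNotFound) {
		return err
	}
	return nil
}

func main() {
	if err := cleanupContainer("1a628fa6f0a4"); err != nil {
		fmt.Println("cleanup failed:", err)
		return
	}
	fmt.Println("cleanup ok: NotFound tolerated")
}
```

In the real code path the check is against the status code returned by the CRI runtime service rather than a wrapped sentinel, but the effect is the same: a repeated or concurrent delete is not an error.
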
Dec 09 10:22:57 crc kubenswrapper[5002]: I1209 10:22:57.620339 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 09 10:22:58 crc kubenswrapper[5002]: I1209 10:22:58.690117 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 09 10:22:59 crc kubenswrapper[5002]: I1209 10:22:59.322401 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 09 10:22:59 crc kubenswrapper[5002]: I1209 10:22:59.322490 5002 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 09 10:22:59 crc kubenswrapper[5002]: I1209 10:22:59.373893 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 09 10:22:59 crc kubenswrapper[5002]: I1209 10:22:59.534948 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-7xtz7" Dec 09 10:22:59 crc kubenswrapper[5002]: I1209 10:22:59.592614 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-795f4db4bc-rlkkn"] Dec 09 10:22:59 crc kubenswrapper[5002]: I1209 10:22:59.593351 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-795f4db4bc-rlkkn" podUID="bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6" containerName="dnsmasq-dns" containerID="cri-o://43937b2161c0e1c7323f5ff469253468d38851867108991fec8c4f21c0c334a6" gracePeriod=10 Dec 09 10:23:03 crc kubenswrapper[5002]: I1209 10:23:03.286297 5002 generic.go:334] "Generic (PLEG): container finished" podID="bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6" containerID="43937b2161c0e1c7323f5ff469253468d38851867108991fec8c4f21c0c334a6" exitCode=0 Dec 09 10:23:03 crc kubenswrapper[5002]: I1209 10:23:03.286410 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-795f4db4bc-rlkkn" event={"ID":"bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6","Type":"ContainerDied","Data":"43937b2161c0e1c7323f5ff469253468d38851867108991fec8c4f21c0c334a6"} Dec 09 10:23:03 crc kubenswrapper[5002]: I1209 10:23:03.926692 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-795f4db4bc-rlkkn" Dec 09 10:23:03 crc kubenswrapper[5002]: I1209 10:23:03.992211 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.056643 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6-dns-svc\") pod \"bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6\" (UID: \"bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6\") " Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.056793 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6-ovsdbserver-sb\") pod \"bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6\" (UID: \"bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6\") " Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.056854 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6-ovsdbserver-nb\") pod \"bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6\" (UID: \"bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6\") " Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.056883 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6-config\") pod \"bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6\" (UID: \"bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6\") " Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.056918 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5rv4\" (UniqueName: \"kubernetes.io/projected/bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6-kube-api-access-n5rv4\") pod \"bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6\" (UID: \"bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6\") " Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.059262 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6-dns-swift-storage-0\") pod \"bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6\" (UID: \"bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6\") " Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.078049 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6-kube-api-access-n5rv4" (OuterVolumeSpecName: "kube-api-access-n5rv4") pod "bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6" (UID: "bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6"). InnerVolumeSpecName "kube-api-access-n5rv4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.120702 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6" (UID: "bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.140001 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6" (UID: "bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.140645 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6-config" (OuterVolumeSpecName: "config") pod "bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6" (UID: "bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.154889 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6" (UID: "bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.161642 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/58839cfb-488d-4d08-b077-bf23cad0fedb-etc-machine-id\") pod \"58839cfb-488d-4d08-b077-bf23cad0fedb\" (UID: \"58839cfb-488d-4d08-b077-bf23cad0fedb\") " Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.162021 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89r8f\" (UniqueName: \"kubernetes.io/projected/58839cfb-488d-4d08-b077-bf23cad0fedb-kube-api-access-89r8f\") pod \"58839cfb-488d-4d08-b077-bf23cad0fedb\" (UID: \"58839cfb-488d-4d08-b077-bf23cad0fedb\") " Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.162655 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58839cfb-488d-4d08-b077-bf23cad0fedb-config-data\") pod \"58839cfb-488d-4d08-b077-bf23cad0fedb\" (UID: \"58839cfb-488d-4d08-b077-bf23cad0fedb\") " Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.163115 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58839cfb-488d-4d08-b077-bf23cad0fedb-config-data-custom\") pod \"58839cfb-488d-4d08-b077-bf23cad0fedb\" (UID: \"58839cfb-488d-4d08-b077-bf23cad0fedb\") " Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.163653 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58839cfb-488d-4d08-b077-bf23cad0fedb-scripts\") pod \"58839cfb-488d-4d08-b077-bf23cad0fedb\" (UID: \"58839cfb-488d-4d08-b077-bf23cad0fedb\") " Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.163921 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58839cfb-488d-4d08-b077-bf23cad0fedb-combined-ca-bundle\") pod \"58839cfb-488d-4d08-b077-bf23cad0fedb\" (UID: \"58839cfb-488d-4d08-b077-bf23cad0fedb\") " Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.164789 5002 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.165067 5002 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6-ovsdbserver-sb\") on node 
\"crc\" DevicePath \"\"" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.165091 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.165105 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5rv4\" (UniqueName: \"kubernetes.io/projected/bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6-kube-api-access-n5rv4\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.165126 5002 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.161765 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/58839cfb-488d-4d08-b077-bf23cad0fedb-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "58839cfb-488d-4d08-b077-bf23cad0fedb" (UID: "58839cfb-488d-4d08-b077-bf23cad0fedb"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.164977 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6" (UID: "bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.170938 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58839cfb-488d-4d08-b077-bf23cad0fedb-kube-api-access-89r8f" (OuterVolumeSpecName: "kube-api-access-89r8f") pod "58839cfb-488d-4d08-b077-bf23cad0fedb" (UID: "58839cfb-488d-4d08-b077-bf23cad0fedb"). InnerVolumeSpecName "kube-api-access-89r8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.171727 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58839cfb-488d-4d08-b077-bf23cad0fedb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "58839cfb-488d-4d08-b077-bf23cad0fedb" (UID: "58839cfb-488d-4d08-b077-bf23cad0fedb"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.174172 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58839cfb-488d-4d08-b077-bf23cad0fedb-scripts" (OuterVolumeSpecName: "scripts") pod "58839cfb-488d-4d08-b077-bf23cad0fedb" (UID: "58839cfb-488d-4d08-b077-bf23cad0fedb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.224955 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58839cfb-488d-4d08-b077-bf23cad0fedb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58839cfb-488d-4d08-b077-bf23cad0fedb" (UID: "58839cfb-488d-4d08-b077-bf23cad0fedb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.267418 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58839cfb-488d-4d08-b077-bf23cad0fedb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.267467 5002 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/58839cfb-488d-4d08-b077-bf23cad0fedb-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.267483 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89r8f\" (UniqueName: \"kubernetes.io/projected/58839cfb-488d-4d08-b077-bf23cad0fedb-kube-api-access-89r8f\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.267498 5002 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.267508 5002 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58839cfb-488d-4d08-b077-bf23cad0fedb-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.267519 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58839cfb-488d-4d08-b077-bf23cad0fedb-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.282988 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58839cfb-488d-4d08-b077-bf23cad0fedb-config-data" (OuterVolumeSpecName: "config-data") pod "58839cfb-488d-4d08-b077-bf23cad0fedb" (UID: "58839cfb-488d-4d08-b077-bf23cad0fedb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.308588 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.308614 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"58839cfb-488d-4d08-b077-bf23cad0fedb","Type":"ContainerDied","Data":"011aaa4f8516de7e7c2451312e762de09c7d16db6c9e890223ccc3a9feb85426"} Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.308670 5002 scope.go:117] "RemoveContainer" containerID="7c54fd7ef2d21fcdd7e51d8f37703e839fb35843258561382844b90dc6812e2c" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.317252 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-795f4db4bc-rlkkn" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.317251 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-795f4db4bc-rlkkn" event={"ID":"bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6","Type":"ContainerDied","Data":"52af199ae9f945bb914dcd7171224e5e1eff8a4b2cb3769eec68bf05602b0f3a"} Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.319714 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-k7b9g" event={"ID":"26e2d58a-f6d2-4e30-a327-042f181b7ba0","Type":"ContainerStarted","Data":"1c44f0683b18f3fea9dd649810fa726f3f7e8a9a0fc2da1309f38beff1d26e5b"} Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.322767 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-857f77df5c-skx8f" event={"ID":"41f46a2d-f158-497f-b61b-60f39c64149b","Type":"ContainerStarted","Data":"a9f9119ac359c6cfc08e9e4ec057f2b160949a68cbe68fb1fa38fc53c004e69a"} Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.325964 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-857f77df5c-skx8f" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.339572 5002 scope.go:117] "RemoveContainer" containerID="d839cbd7dc24cd99b6bb542f2309c20c1ce28650b857540ee941d95b26b6625f" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.356060 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-857f77df5c-skx8f" podStartSLOduration=12.356031657 podStartE2EDuration="12.356031657s" podCreationTimestamp="2025-12-09 10:22:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:23:04.352162547 +0000 UTC m=+1316.744213628" watchObservedRunningTime="2025-12-09 10:23:04.356031657 +0000 UTC m=+1316.748082738" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.370012 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58839cfb-488d-4d08-b077-bf23cad0fedb-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.379350 5002 scope.go:117] "RemoveContainer" containerID="43937b2161c0e1c7323f5ff469253468d38851867108991fec8c4f21c0c334a6" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.402846 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.421055 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.424860 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-795f4db4bc-rlkkn"] Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.444012 5002 scope.go:117] "RemoveContainer" containerID="eefe11b215226c75def8b0bdde5ef0b64c77c98567620951ced7374a977a9ca2" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.448940 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-795f4db4bc-rlkkn"] Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.461759 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 09 10:23:04 crc kubenswrapper[5002]: E1209 10:23:04.462206 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6" containerName="init" Dec 09 10:23:04 crc 
kubenswrapper[5002]: I1209 10:23:04.462225 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6" containerName="init" Dec 09 10:23:04 crc kubenswrapper[5002]: E1209 10:23:04.462238 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58839cfb-488d-4d08-b077-bf23cad0fedb" containerName="cinder-scheduler" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.462244 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="58839cfb-488d-4d08-b077-bf23cad0fedb" containerName="cinder-scheduler" Dec 09 10:23:04 crc kubenswrapper[5002]: E1209 10:23:04.462253 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="735c673a-9bc6-4a3c-b553-dea3ba521606" containerName="barbican-api" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.462259 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="735c673a-9bc6-4a3c-b553-dea3ba521606" containerName="barbican-api" Dec 09 10:23:04 crc kubenswrapper[5002]: E1209 10:23:04.462271 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58839cfb-488d-4d08-b077-bf23cad0fedb" containerName="probe" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.462276 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="58839cfb-488d-4d08-b077-bf23cad0fedb" containerName="probe" Dec 09 10:23:04 crc kubenswrapper[5002]: E1209 10:23:04.462293 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="735c673a-9bc6-4a3c-b553-dea3ba521606" containerName="barbican-api-log" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.462299 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="735c673a-9bc6-4a3c-b553-dea3ba521606" containerName="barbican-api-log" Dec 09 10:23:04 crc kubenswrapper[5002]: E1209 10:23:04.462308 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6" containerName="dnsmasq-dns" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.462313 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6" containerName="dnsmasq-dns" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.462468 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6" containerName="dnsmasq-dns" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.462482 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="58839cfb-488d-4d08-b077-bf23cad0fedb" containerName="probe" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.462497 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="735c673a-9bc6-4a3c-b553-dea3ba521606" containerName="barbican-api" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.462510 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="58839cfb-488d-4d08-b077-bf23cad0fedb" containerName="cinder-scheduler" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.462519 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="735c673a-9bc6-4a3c-b553-dea3ba521606" containerName="barbican-api-log" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.463555 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.468402 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.471407 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.475453 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlfpl\" (UniqueName: \"kubernetes.io/projected/f702a539-ec25-44d4-8629-97b3c5499b96-kube-api-access-xlfpl\") pod \"cinder-scheduler-0\" (UID: \"f702a539-ec25-44d4-8629-97b3c5499b96\") " pod="openstack/cinder-scheduler-0" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.475496 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f702a539-ec25-44d4-8629-97b3c5499b96-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f702a539-ec25-44d4-8629-97b3c5499b96\") " pod="openstack/cinder-scheduler-0" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.475531 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f702a539-ec25-44d4-8629-97b3c5499b96-config-data\") pod \"cinder-scheduler-0\" (UID: \"f702a539-ec25-44d4-8629-97b3c5499b96\") " pod="openstack/cinder-scheduler-0" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.475569 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f702a539-ec25-44d4-8629-97b3c5499b96-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f702a539-ec25-44d4-8629-97b3c5499b96\") " pod="openstack/cinder-scheduler-0" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.475604 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f702a539-ec25-44d4-8629-97b3c5499b96-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f702a539-ec25-44d4-8629-97b3c5499b96\") " pod="openstack/cinder-scheduler-0" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.475627 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f702a539-ec25-44d4-8629-97b3c5499b96-scripts\") pod \"cinder-scheduler-0\" (UID: \"f702a539-ec25-44d4-8629-97b3c5499b96\") " pod="openstack/cinder-scheduler-0" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.576899 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlfpl\" (UniqueName: \"kubernetes.io/projected/f702a539-ec25-44d4-8629-97b3c5499b96-kube-api-access-xlfpl\") pod \"cinder-scheduler-0\" (UID: \"f702a539-ec25-44d4-8629-97b3c5499b96\") " pod="openstack/cinder-scheduler-0" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.577250 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f702a539-ec25-44d4-8629-97b3c5499b96-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f702a539-ec25-44d4-8629-97b3c5499b96\") " pod="openstack/cinder-scheduler-0" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.577320 5002 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f702a539-ec25-44d4-8629-97b3c5499b96-config-data\") pod \"cinder-scheduler-0\" (UID: \"f702a539-ec25-44d4-8629-97b3c5499b96\") " pod="openstack/cinder-scheduler-0" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.577383 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f702a539-ec25-44d4-8629-97b3c5499b96-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f702a539-ec25-44d4-8629-97b3c5499b96\") " pod="openstack/cinder-scheduler-0" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.577840 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f702a539-ec25-44d4-8629-97b3c5499b96-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f702a539-ec25-44d4-8629-97b3c5499b96\") " pod="openstack/cinder-scheduler-0" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.579267 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f702a539-ec25-44d4-8629-97b3c5499b96-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f702a539-ec25-44d4-8629-97b3c5499b96\") " pod="openstack/cinder-scheduler-0" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.579337 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f702a539-ec25-44d4-8629-97b3c5499b96-scripts\") pod \"cinder-scheduler-0\" (UID: \"f702a539-ec25-44d4-8629-97b3c5499b96\") " pod="openstack/cinder-scheduler-0" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.585738 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f702a539-ec25-44d4-8629-97b3c5499b96-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f702a539-ec25-44d4-8629-97b3c5499b96\") " pod="openstack/cinder-scheduler-0" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.585923 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f702a539-ec25-44d4-8629-97b3c5499b96-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f702a539-ec25-44d4-8629-97b3c5499b96\") " pod="openstack/cinder-scheduler-0" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.586983 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f702a539-ec25-44d4-8629-97b3c5499b96-config-data\") pod \"cinder-scheduler-0\" (UID: \"f702a539-ec25-44d4-8629-97b3c5499b96\") " pod="openstack/cinder-scheduler-0" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.587355 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f702a539-ec25-44d4-8629-97b3c5499b96-scripts\") pod \"cinder-scheduler-0\" (UID: \"f702a539-ec25-44d4-8629-97b3c5499b96\") " pod="openstack/cinder-scheduler-0" Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.597402 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlfpl\" (UniqueName: \"kubernetes.io/projected/f702a539-ec25-44d4-8629-97b3c5499b96-kube-api-access-xlfpl\") pod \"cinder-scheduler-0\" (UID: \"f702a539-ec25-44d4-8629-97b3c5499b96\") " pod="openstack/cinder-scheduler-0"
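
The records above are the volume half of the cinder-scheduler-0 replacement: reconciler_common.go walks every volume the new pod (UID f702a539-...) declares, first VerifyControllerAttachedVolume, then MountVolume, while the old pod's volumes (UID 58839cfb-...) were unmounted and reported "Volume detached" earlier. A toy version of that desired-state/actual-state reconcile step, using plain string sets instead of the kubelet's state-of-world caches (all names here are illustrative):

```go
package main

import (
	"fmt"
	"sort"
)

// reconcile compares the desired volume set (from admitted pod specs) with
// the actually mounted set, yielding unmounts for stale volumes and mounts
// for missing ones.
func reconcile(desired, actual map[string]bool) (mounts, unmounts []string) {
	for v := range actual {
		if !desired[v] {
			unmounts = append(unmounts, v) // operationExecutor.UnmountVolume
		}
	}
	for v := range desired {
		if !actual[v] {
			mounts = append(mounts, v) // operationExecutor.MountVolume
		}
	}
	sort.Strings(mounts)
	sort.Strings(unmounts)
	return
}

func main() {
	// Old cinder-scheduler-0 (58839cfb-...) is gone; the replacement pod
	// (f702a539-...) declares the same logical volumes under its own UID.
	actual := map[string]bool{
		"58839cfb/config-data": true,
		"58839cfb/scripts":     true,
	}
	desired := map[string]bool{
		"f702a539/config-data": true,
		"f702a539/scripts":     true,
	}
	mounts, unmounts := reconcile(desired, actual)
	fmt.Println("unmount:", unmounts)
	fmt.Println("mount:  ", mounts)
}
```

Keying volumes by pod UID is why an identically named volume on a replacement pod counts as a distinct mount rather than a reuse of the old one.
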
Dec 09 10:23:04 crc kubenswrapper[5002]: I1209 10:23:04.847461 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 09 10:23:05 crc kubenswrapper[5002]: I1209 10:23:05.364710 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-k7b9g" podStartSLOduration=3.168786327 podStartE2EDuration="19.364681622s" podCreationTimestamp="2025-12-09 10:22:46 +0000 UTC" firstStartedPulling="2025-12-09 10:22:47.349106897 +0000 UTC m=+1299.741157978" lastFinishedPulling="2025-12-09 10:23:03.545002192 +0000 UTC m=+1315.937053273" observedRunningTime="2025-12-09 10:23:05.357476836 +0000 UTC m=+1317.749527917" watchObservedRunningTime="2025-12-09 10:23:05.364681622 +0000 UTC m=+1317.756732713" Dec 09 10:23:05 crc kubenswrapper[5002]: I1209 10:23:05.385258 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 09 10:23:06 crc kubenswrapper[5002]: I1209 10:23:06.332440 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58839cfb-488d-4d08-b077-bf23cad0fedb" path="/var/lib/kubelet/pods/58839cfb-488d-4d08-b077-bf23cad0fedb/volumes" Dec 09 10:23:06 crc kubenswrapper[5002]: I1209 10:23:06.333629 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6" path="/var/lib/kubelet/pods/bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6/volumes" Dec 09 10:23:06 crc kubenswrapper[5002]: I1209 10:23:06.348970 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f702a539-ec25-44d4-8629-97b3c5499b96","Type":"ContainerStarted","Data":"a69e04a22e415316847080019084ca4991a80ceb03562e9f746818a4d0689f80"} Dec 09 10:23:07 crc kubenswrapper[5002]: I1209 10:23:07.366706 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f702a539-ec25-44d4-8629-97b3c5499b96","Type":"ContainerStarted","Data":"87bebcf10614da44af2b08b3844e8a098da235879ef6a0ce2fdbe6d780cb77c8"} Dec 09 10:23:08 crc kubenswrapper[5002]: I1209 10:23:08.206053 5002 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod291d30af-cc1c-4fa3-9496-9695e50f0c6d"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod291d30af-cc1c-4fa3-9496-9695e50f0c6d] : Timed out while waiting for systemd to remove kubepods-besteffort-pod291d30af_cc1c_4fa3_9496_9695e50f0c6d.slice" Dec 09 10:23:08 crc kubenswrapper[5002]: E1209 10:23:08.206709 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod291d30af-cc1c-4fa3-9496-9695e50f0c6d] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod291d30af-cc1c-4fa3-9496-9695e50f0c6d] : Timed out while waiting for systemd to remove kubepods-besteffort-pod291d30af_cc1c_4fa3_9496_9695e50f0c6d.slice" pod="openstack/ceilometer-0" podUID="291d30af-cc1c-4fa3-9496-9695e50f0c6d" Dec 09 10:23:08 crc kubenswrapper[5002]: I1209 10:23:08.375960 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 09 10:23:08 crc kubenswrapper[5002]: I1209 10:23:08.376581 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f702a539-ec25-44d4-8629-97b3c5499b96","Type":"ContainerStarted","Data":"346c243539a09bb0cf2ecabd1fa68b92b5e3b4d887823c3ba0eff1a45067f934"} Dec 09 10:23:08 crc kubenswrapper[5002]: I1209 10:23:08.429528 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.429511232 podStartE2EDuration="4.429511232s" podCreationTimestamp="2025-12-09 10:23:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:23:08.424609406 +0000 UTC m=+1320.816660507" watchObservedRunningTime="2025-12-09 10:23:08.429511232 +0000 UTC m=+1320.821562313" Dec 09 10:23:08 crc kubenswrapper[5002]: I1209 10:23:08.450141 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 09 10:23:08 crc kubenswrapper[5002]: I1209 10:23:08.473255 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 09 10:23:08 crc kubenswrapper[5002]: I1209 10:23:08.489147 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 09 10:23:08 crc kubenswrapper[5002]: I1209 10:23:08.491533 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 09 10:23:08 crc kubenswrapper[5002]: I1209 10:23:08.498865 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 09 10:23:08 crc kubenswrapper[5002]: I1209 10:23:08.527271 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 09 10:23:08 crc kubenswrapper[5002]: I1209 10:23:08.527799 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 09 10:23:08 crc kubenswrapper[5002]: I1209 10:23:08.663609 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93a9e9ee-7e52-4ff9-a0a4-d536a52f9156-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"93a9e9ee-7e52-4ff9-a0a4-d536a52f9156\") " pod="openstack/ceilometer-0" Dec 09 10:23:08 crc kubenswrapper[5002]: I1209 10:23:08.663665 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93a9e9ee-7e52-4ff9-a0a4-d536a52f9156-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"93a9e9ee-7e52-4ff9-a0a4-d536a52f9156\") " pod="openstack/ceilometer-0" Dec 09 10:23:08 crc kubenswrapper[5002]: I1209 10:23:08.663690 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lflx8\" (UniqueName: \"kubernetes.io/projected/93a9e9ee-7e52-4ff9-a0a4-d536a52f9156-kube-api-access-lflx8\") pod \"ceilometer-0\" (UID: \"93a9e9ee-7e52-4ff9-a0a4-d536a52f9156\") " pod="openstack/ceilometer-0" Dec 09 10:23:08 crc kubenswrapper[5002]: I1209 10:23:08.664166 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93a9e9ee-7e52-4ff9-a0a4-d536a52f9156-log-httpd\") pod \"ceilometer-0\" (UID: \"93a9e9ee-7e52-4ff9-a0a4-d536a52f9156\") " pod="openstack/ceilometer-0" Dec 09 10:23:08 crc 
kubenswrapper[5002]: I1209 10:23:08.664194 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93a9e9ee-7e52-4ff9-a0a4-d536a52f9156-config-data\") pod \"ceilometer-0\" (UID: \"93a9e9ee-7e52-4ff9-a0a4-d536a52f9156\") " pod="openstack/ceilometer-0" Dec 09 10:23:08 crc kubenswrapper[5002]: I1209 10:23:08.664228 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93a9e9ee-7e52-4ff9-a0a4-d536a52f9156-scripts\") pod \"ceilometer-0\" (UID: \"93a9e9ee-7e52-4ff9-a0a4-d536a52f9156\") " pod="openstack/ceilometer-0" Dec 09 10:23:08 crc kubenswrapper[5002]: I1209 10:23:08.664245 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93a9e9ee-7e52-4ff9-a0a4-d536a52f9156-run-httpd\") pod \"ceilometer-0\" (UID: \"93a9e9ee-7e52-4ff9-a0a4-d536a52f9156\") " pod="openstack/ceilometer-0" Dec 09 10:23:08 crc kubenswrapper[5002]: I1209 10:23:08.766177 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lflx8\" (UniqueName: \"kubernetes.io/projected/93a9e9ee-7e52-4ff9-a0a4-d536a52f9156-kube-api-access-lflx8\") pod \"ceilometer-0\" (UID: \"93a9e9ee-7e52-4ff9-a0a4-d536a52f9156\") " pod="openstack/ceilometer-0" Dec 09 10:23:08 crc kubenswrapper[5002]: I1209 10:23:08.766231 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93a9e9ee-7e52-4ff9-a0a4-d536a52f9156-log-httpd\") pod \"ceilometer-0\" (UID: \"93a9e9ee-7e52-4ff9-a0a4-d536a52f9156\") " pod="openstack/ceilometer-0" Dec 09 10:23:08 crc kubenswrapper[5002]: I1209 10:23:08.766273 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93a9e9ee-7e52-4ff9-a0a4-d536a52f9156-config-data\") pod \"ceilometer-0\" (UID: \"93a9e9ee-7e52-4ff9-a0a4-d536a52f9156\") " pod="openstack/ceilometer-0" Dec 09 10:23:08 crc kubenswrapper[5002]: I1209 10:23:08.766313 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93a9e9ee-7e52-4ff9-a0a4-d536a52f9156-scripts\") pod \"ceilometer-0\" (UID: \"93a9e9ee-7e52-4ff9-a0a4-d536a52f9156\") " pod="openstack/ceilometer-0" Dec 09 10:23:08 crc kubenswrapper[5002]: I1209 10:23:08.766334 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93a9e9ee-7e52-4ff9-a0a4-d536a52f9156-run-httpd\") pod \"ceilometer-0\" (UID: \"93a9e9ee-7e52-4ff9-a0a4-d536a52f9156\") " pod="openstack/ceilometer-0" Dec 09 10:23:08 crc kubenswrapper[5002]: I1209 10:23:08.766478 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93a9e9ee-7e52-4ff9-a0a4-d536a52f9156-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"93a9e9ee-7e52-4ff9-a0a4-d536a52f9156\") " pod="openstack/ceilometer-0" Dec 09 10:23:08 crc kubenswrapper[5002]: I1209 10:23:08.766522 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93a9e9ee-7e52-4ff9-a0a4-d536a52f9156-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"93a9e9ee-7e52-4ff9-a0a4-d536a52f9156\") " pod="openstack/ceilometer-0" Dec 09 
10:23:08 crc kubenswrapper[5002]: I1209 10:23:08.767241 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93a9e9ee-7e52-4ff9-a0a4-d536a52f9156-log-httpd\") pod \"ceilometer-0\" (UID: \"93a9e9ee-7e52-4ff9-a0a4-d536a52f9156\") " pod="openstack/ceilometer-0" Dec 09 10:23:08 crc kubenswrapper[5002]: I1209 10:23:08.767335 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93a9e9ee-7e52-4ff9-a0a4-d536a52f9156-run-httpd\") pod \"ceilometer-0\" (UID: \"93a9e9ee-7e52-4ff9-a0a4-d536a52f9156\") " pod="openstack/ceilometer-0" Dec 09 10:23:08 crc kubenswrapper[5002]: I1209 10:23:08.773489 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93a9e9ee-7e52-4ff9-a0a4-d536a52f9156-config-data\") pod \"ceilometer-0\" (UID: \"93a9e9ee-7e52-4ff9-a0a4-d536a52f9156\") " pod="openstack/ceilometer-0" Dec 09 10:23:08 crc kubenswrapper[5002]: I1209 10:23:08.773647 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93a9e9ee-7e52-4ff9-a0a4-d536a52f9156-scripts\") pod \"ceilometer-0\" (UID: \"93a9e9ee-7e52-4ff9-a0a4-d536a52f9156\") " pod="openstack/ceilometer-0" Dec 09 10:23:08 crc kubenswrapper[5002]: I1209 10:23:08.777545 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93a9e9ee-7e52-4ff9-a0a4-d536a52f9156-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"93a9e9ee-7e52-4ff9-a0a4-d536a52f9156\") " pod="openstack/ceilometer-0" Dec 09 10:23:08 crc kubenswrapper[5002]: I1209 10:23:08.781503 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93a9e9ee-7e52-4ff9-a0a4-d536a52f9156-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"93a9e9ee-7e52-4ff9-a0a4-d536a52f9156\") " pod="openstack/ceilometer-0" Dec 09 10:23:08 crc kubenswrapper[5002]: I1209 10:23:08.782404 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lflx8\" (UniqueName: \"kubernetes.io/projected/93a9e9ee-7e52-4ff9-a0a4-d536a52f9156-kube-api-access-lflx8\") pod \"ceilometer-0\" (UID: \"93a9e9ee-7e52-4ff9-a0a4-d536a52f9156\") " pod="openstack/ceilometer-0" Dec 09 10:23:08 crc kubenswrapper[5002]: I1209 10:23:08.793122 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-795f4db4bc-rlkkn" podUID="bd5d5044-8f5e-4cd7-9e83-edc900a0e6d6" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.166:5353: i/o timeout" Dec 09 10:23:08 crc kubenswrapper[5002]: I1209 10:23:08.838506 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 09 10:23:09 crc kubenswrapper[5002]: I1209 10:23:09.327709 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 09 10:23:09 crc kubenswrapper[5002]: W1209 10:23:09.333332 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93a9e9ee_7e52_4ff9_a0a4_d536a52f9156.slice/crio-7a5bf3d5430c1a0fc2805ba7c537bda32b5f6110be65d5be7927c6ce0c24a658 WatchSource:0}: Error finding container 7a5bf3d5430c1a0fc2805ba7c537bda32b5f6110be65d5be7927c6ce0c24a658: Status 404 returned error can't find the container with id 7a5bf3d5430c1a0fc2805ba7c537bda32b5f6110be65d5be7927c6ce0c24a658 Dec 09 10:23:09 crc kubenswrapper[5002]: I1209 10:23:09.386930 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93a9e9ee-7e52-4ff9-a0a4-d536a52f9156","Type":"ContainerStarted","Data":"7a5bf3d5430c1a0fc2805ba7c537bda32b5f6110be65d5be7927c6ce0c24a658"} Dec 09 10:23:09 crc kubenswrapper[5002]: I1209 10:23:09.847791 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 09 10:23:10 crc kubenswrapper[5002]: I1209 10:23:10.077010 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="291d30af-cc1c-4fa3-9496-9695e50f0c6d" path="/var/lib/kubelet/pods/291d30af-cc1c-4fa3-9496-9695e50f0c6d/volumes" Dec 09 10:23:10 crc kubenswrapper[5002]: I1209 10:23:10.396511 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93a9e9ee-7e52-4ff9-a0a4-d536a52f9156","Type":"ContainerStarted","Data":"cc64a9da30fd78357ae54ce6c0d3970c32e74006434774f8ead28296a23669af"} Dec 09 10:23:10 crc kubenswrapper[5002]: I1209 10:23:10.650296 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 09 10:23:11 crc kubenswrapper[5002]: I1209 10:23:11.411766 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93a9e9ee-7e52-4ff9-a0a4-d536a52f9156","Type":"ContainerStarted","Data":"00ca69a841ff13760c6d3f2ca274f6a7782feebc4affa28cfca224ea5d191e7e"} Dec 09 10:23:13 crc kubenswrapper[5002]: I1209 10:23:13.453209 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93a9e9ee-7e52-4ff9-a0a4-d536a52f9156","Type":"ContainerStarted","Data":"274ff1c4d533256aa5cc49891323aa312bd8a76fac0fdd99649729dce3ed1ac0"} Dec 09 10:23:14 crc kubenswrapper[5002]: I1209 10:23:14.467315 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93a9e9ee-7e52-4ff9-a0a4-d536a52f9156","Type":"ContainerStarted","Data":"466341f2e41d849af2bd6e5c12403fe104885c028236b3aaafe8ac44cb910134"} Dec 09 10:23:14 crc kubenswrapper[5002]: I1209 10:23:14.467899 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="93a9e9ee-7e52-4ff9-a0a4-d536a52f9156" containerName="ceilometer-central-agent" containerID="cri-o://cc64a9da30fd78357ae54ce6c0d3970c32e74006434774f8ead28296a23669af" gracePeriod=30 Dec 09 10:23:14 crc kubenswrapper[5002]: I1209 10:23:14.468216 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 09 10:23:14 crc kubenswrapper[5002]: I1209 10:23:14.468664 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="93a9e9ee-7e52-4ff9-a0a4-d536a52f9156" 
containerName="proxy-httpd" containerID="cri-o://466341f2e41d849af2bd6e5c12403fe104885c028236b3aaafe8ac44cb910134" gracePeriod=30 Dec 09 10:23:14 crc kubenswrapper[5002]: I1209 10:23:14.468756 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="93a9e9ee-7e52-4ff9-a0a4-d536a52f9156" containerName="sg-core" containerID="cri-o://274ff1c4d533256aa5cc49891323aa312bd8a76fac0fdd99649729dce3ed1ac0" gracePeriod=30 Dec 09 10:23:14 crc kubenswrapper[5002]: I1209 10:23:14.468852 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="93a9e9ee-7e52-4ff9-a0a4-d536a52f9156" containerName="ceilometer-notification-agent" containerID="cri-o://00ca69a841ff13760c6d3f2ca274f6a7782feebc4affa28cfca224ea5d191e7e" gracePeriod=30 Dec 09 10:23:14 crc kubenswrapper[5002]: I1209 10:23:14.507490 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.259887369 podStartE2EDuration="6.50746583s" podCreationTimestamp="2025-12-09 10:23:08 +0000 UTC" firstStartedPulling="2025-12-09 10:23:09.335312793 +0000 UTC m=+1321.727363874" lastFinishedPulling="2025-12-09 10:23:13.582891214 +0000 UTC m=+1325.974942335" observedRunningTime="2025-12-09 10:23:14.502803459 +0000 UTC m=+1326.894854560" watchObservedRunningTime="2025-12-09 10:23:14.50746583 +0000 UTC m=+1326.899516921" Dec 09 10:23:15 crc kubenswrapper[5002]: I1209 10:23:15.068692 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 09 10:23:15 crc kubenswrapper[5002]: I1209 10:23:15.479150 5002 generic.go:334] "Generic (PLEG): container finished" podID="93a9e9ee-7e52-4ff9-a0a4-d536a52f9156" containerID="466341f2e41d849af2bd6e5c12403fe104885c028236b3aaafe8ac44cb910134" exitCode=0 Dec 09 10:23:15 crc kubenswrapper[5002]: I1209 10:23:15.479182 5002 generic.go:334] "Generic (PLEG): container finished" podID="93a9e9ee-7e52-4ff9-a0a4-d536a52f9156" containerID="274ff1c4d533256aa5cc49891323aa312bd8a76fac0fdd99649729dce3ed1ac0" exitCode=2 Dec 09 10:23:15 crc kubenswrapper[5002]: I1209 10:23:15.479194 5002 generic.go:334] "Generic (PLEG): container finished" podID="93a9e9ee-7e52-4ff9-a0a4-d536a52f9156" containerID="00ca69a841ff13760c6d3f2ca274f6a7782feebc4affa28cfca224ea5d191e7e" exitCode=0 Dec 09 10:23:15 crc kubenswrapper[5002]: I1209 10:23:15.479216 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93a9e9ee-7e52-4ff9-a0a4-d536a52f9156","Type":"ContainerDied","Data":"466341f2e41d849af2bd6e5c12403fe104885c028236b3aaafe8ac44cb910134"} Dec 09 10:23:15 crc kubenswrapper[5002]: I1209 10:23:15.479245 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93a9e9ee-7e52-4ff9-a0a4-d536a52f9156","Type":"ContainerDied","Data":"274ff1c4d533256aa5cc49891323aa312bd8a76fac0fdd99649729dce3ed1ac0"} Dec 09 10:23:15 crc kubenswrapper[5002]: I1209 10:23:15.479256 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93a9e9ee-7e52-4ff9-a0a4-d536a52f9156","Type":"ContainerDied","Data":"00ca69a841ff13760c6d3f2ca274f6a7782feebc4affa28cfca224ea5d191e7e"} Dec 09 10:23:17 crc kubenswrapper[5002]: I1209 10:23:17.496959 5002 generic.go:334] "Generic (PLEG): container finished" podID="26e2d58a-f6d2-4e30-a327-042f181b7ba0" containerID="1c44f0683b18f3fea9dd649810fa726f3f7e8a9a0fc2da1309f38beff1d26e5b" exitCode=0
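
The pod_startup_latency_tracker.go record for ceilometer-0 above carries enough data to rederive its numbers: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration additionally discounts the image-pull window. A back-of-the-envelope check in Go, with the timestamps copied from that record (the kubelet's exact bookkeeping differs in the last decimal places, so treat this as an approximation, not the tracker's code):

```go
package main

import (
	"fmt"
	"time"
)

// layout matches Go's default time.Time formatting, which is how these
// timestamps appear in the log.
const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

func mustParse(s string) time.Time {
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	// Timestamps copied from the ceilometer-0 startup-latency record above.
	created := mustParse("2025-12-09 10:23:08 +0000 UTC")
	firstPull := mustParse("2025-12-09 10:23:09.335312793 +0000 UTC")
	lastPull := mustParse("2025-12-09 10:23:13.582891214 +0000 UTC")
	observed := mustParse("2025-12-09 10:23:14.50746583 +0000 UTC")

	e2e := observed.Sub(created)    // matches podStartE2EDuration="6.50746583s"
	pull := lastPull.Sub(firstPull) // time spent pulling images
	fmt.Println("podStartE2EDuration =", e2e)
	fmt.Println("podStartSLOduration ≈", e2e-pull) // ≈ 2.2598874s (logged: 2.259887369)
}
```

The same arithmetic is exact for the earlier nova-cell0-conductor-db-sync-k7b9g record: 19.364681622s end-to-end minus a 16.195895295s pull window gives the logged podStartSLOduration of 3.168786327.
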
Dec 09 10:23:17 crc kubenswrapper[5002]: I1209 10:23:17.497244 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-k7b9g" event={"ID":"26e2d58a-f6d2-4e30-a327-042f181b7ba0","Type":"ContainerDied","Data":"1c44f0683b18f3fea9dd649810fa726f3f7e8a9a0fc2da1309f38beff1d26e5b"} Dec 09 10:23:18 crc kubenswrapper[5002]: I1209 10:23:18.919279 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-k7b9g" Dec 09 10:23:19 crc kubenswrapper[5002]: I1209 10:23:19.048464 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26e2d58a-f6d2-4e30-a327-042f181b7ba0-config-data\") pod \"26e2d58a-f6d2-4e30-a327-042f181b7ba0\" (UID: \"26e2d58a-f6d2-4e30-a327-042f181b7ba0\") " Dec 09 10:23:19 crc kubenswrapper[5002]: I1209 10:23:19.048518 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26e2d58a-f6d2-4e30-a327-042f181b7ba0-combined-ca-bundle\") pod \"26e2d58a-f6d2-4e30-a327-042f181b7ba0\" (UID: \"26e2d58a-f6d2-4e30-a327-042f181b7ba0\") " Dec 09 10:23:19 crc kubenswrapper[5002]: I1209 10:23:19.048624 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbbdc\" (UniqueName: \"kubernetes.io/projected/26e2d58a-f6d2-4e30-a327-042f181b7ba0-kube-api-access-pbbdc\") pod \"26e2d58a-f6d2-4e30-a327-042f181b7ba0\" (UID: \"26e2d58a-f6d2-4e30-a327-042f181b7ba0\") " Dec 09 10:23:19 crc kubenswrapper[5002]: I1209 10:23:19.048685 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26e2d58a-f6d2-4e30-a327-042f181b7ba0-scripts\") pod \"26e2d58a-f6d2-4e30-a327-042f181b7ba0\" (UID: \"26e2d58a-f6d2-4e30-a327-042f181b7ba0\") " Dec 09 10:23:19 crc kubenswrapper[5002]: I1209 10:23:19.055051 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26e2d58a-f6d2-4e30-a327-042f181b7ba0-kube-api-access-pbbdc" (OuterVolumeSpecName: "kube-api-access-pbbdc") pod "26e2d58a-f6d2-4e30-a327-042f181b7ba0" (UID: "26e2d58a-f6d2-4e30-a327-042f181b7ba0"). InnerVolumeSpecName "kube-api-access-pbbdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:23:19 crc kubenswrapper[5002]: I1209 10:23:19.055424 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26e2d58a-f6d2-4e30-a327-042f181b7ba0-scripts" (OuterVolumeSpecName: "scripts") pod "26e2d58a-f6d2-4e30-a327-042f181b7ba0" (UID: "26e2d58a-f6d2-4e30-a327-042f181b7ba0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:23:19 crc kubenswrapper[5002]: I1209 10:23:19.080868 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26e2d58a-f6d2-4e30-a327-042f181b7ba0-config-data" (OuterVolumeSpecName: "config-data") pod "26e2d58a-f6d2-4e30-a327-042f181b7ba0" (UID: "26e2d58a-f6d2-4e30-a327-042f181b7ba0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:23:19 crc kubenswrapper[5002]: I1209 10:23:19.092936 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26e2d58a-f6d2-4e30-a327-042f181b7ba0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "26e2d58a-f6d2-4e30-a327-042f181b7ba0" (UID: "26e2d58a-f6d2-4e30-a327-042f181b7ba0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:23:19 crc kubenswrapper[5002]: I1209 10:23:19.150647 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26e2d58a-f6d2-4e30-a327-042f181b7ba0-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:19 crc kubenswrapper[5002]: I1209 10:23:19.150695 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26e2d58a-f6d2-4e30-a327-042f181b7ba0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:19 crc kubenswrapper[5002]: I1209 10:23:19.150710 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbbdc\" (UniqueName: \"kubernetes.io/projected/26e2d58a-f6d2-4e30-a327-042f181b7ba0-kube-api-access-pbbdc\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:19 crc kubenswrapper[5002]: I1209 10:23:19.150721 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26e2d58a-f6d2-4e30-a327-042f181b7ba0-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:19 crc kubenswrapper[5002]: I1209 10:23:19.517537 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-k7b9g" event={"ID":"26e2d58a-f6d2-4e30-a327-042f181b7ba0","Type":"ContainerDied","Data":"593a13a5551e610250f921ade7992eebca1b77088c7d85ad442799a1ff12c7ac"} Dec 09 10:23:19 crc kubenswrapper[5002]: I1209 10:23:19.517962 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="593a13a5551e610250f921ade7992eebca1b77088c7d85ad442799a1ff12c7ac" Dec 09 10:23:19 crc kubenswrapper[5002]: I1209 10:23:19.518241 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-k7b9g" Dec 09 10:23:19 crc kubenswrapper[5002]: I1209 10:23:19.595023 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 09 10:23:19 crc kubenswrapper[5002]: E1209 10:23:19.595564 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26e2d58a-f6d2-4e30-a327-042f181b7ba0" containerName="nova-cell0-conductor-db-sync" Dec 09 10:23:19 crc kubenswrapper[5002]: I1209 10:23:19.595592 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="26e2d58a-f6d2-4e30-a327-042f181b7ba0" containerName="nova-cell0-conductor-db-sync" Dec 09 10:23:19 crc kubenswrapper[5002]: I1209 10:23:19.595873 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="26e2d58a-f6d2-4e30-a327-042f181b7ba0" containerName="nova-cell0-conductor-db-sync" Dec 09 10:23:19 crc kubenswrapper[5002]: I1209 10:23:19.596807 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 09 10:23:19 crc kubenswrapper[5002]: I1209 10:23:19.598995 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-xzzv7" Dec 09 10:23:19 crc kubenswrapper[5002]: I1209 10:23:19.599418 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 09 10:23:19 crc kubenswrapper[5002]: I1209 10:23:19.618662 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 09 10:23:19 crc kubenswrapper[5002]: I1209 10:23:19.735170 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-67fdf7874b-b25q8" Dec 09 10:23:19 crc kubenswrapper[5002]: I1209 10:23:19.767746 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ae47b25-e6fd-451f-9827-72ee4e12e526-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"7ae47b25-e6fd-451f-9827-72ee4e12e526\") " pod="openstack/nova-cell0-conductor-0" Dec 09 10:23:19 crc kubenswrapper[5002]: I1209 10:23:19.768122 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ae47b25-e6fd-451f-9827-72ee4e12e526-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"7ae47b25-e6fd-451f-9827-72ee4e12e526\") " pod="openstack/nova-cell0-conductor-0" Dec 09 10:23:19 crc kubenswrapper[5002]: I1209 10:23:19.768290 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p799z\" (UniqueName: \"kubernetes.io/projected/7ae47b25-e6fd-451f-9827-72ee4e12e526-kube-api-access-p799z\") pod \"nova-cell0-conductor-0\" (UID: \"7ae47b25-e6fd-451f-9827-72ee4e12e526\") " pod="openstack/nova-cell0-conductor-0" Dec 09 10:23:19 crc kubenswrapper[5002]: I1209 10:23:19.870267 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ae47b25-e6fd-451f-9827-72ee4e12e526-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"7ae47b25-e6fd-451f-9827-72ee4e12e526\") " pod="openstack/nova-cell0-conductor-0" Dec 09 10:23:19 crc kubenswrapper[5002]: I1209 10:23:19.870345 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p799z\" (UniqueName: \"kubernetes.io/projected/7ae47b25-e6fd-451f-9827-72ee4e12e526-kube-api-access-p799z\") pod \"nova-cell0-conductor-0\" (UID: \"7ae47b25-e6fd-451f-9827-72ee4e12e526\") " pod="openstack/nova-cell0-conductor-0" Dec 09 10:23:19 crc kubenswrapper[5002]: I1209 10:23:19.870454 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ae47b25-e6fd-451f-9827-72ee4e12e526-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"7ae47b25-e6fd-451f-9827-72ee4e12e526\") " pod="openstack/nova-cell0-conductor-0" Dec 09 10:23:19 crc kubenswrapper[5002]: I1209 10:23:19.877012 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ae47b25-e6fd-451f-9827-72ee4e12e526-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"7ae47b25-e6fd-451f-9827-72ee4e12e526\") " pod="openstack/nova-cell0-conductor-0" Dec 09 10:23:19 crc kubenswrapper[5002]: I1209 10:23:19.881007 5002 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ae47b25-e6fd-451f-9827-72ee4e12e526-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"7ae47b25-e6fd-451f-9827-72ee4e12e526\") " pod="openstack/nova-cell0-conductor-0" Dec 09 10:23:19 crc kubenswrapper[5002]: I1209 10:23:19.888899 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p799z\" (UniqueName: \"kubernetes.io/projected/7ae47b25-e6fd-451f-9827-72ee4e12e526-kube-api-access-p799z\") pod \"nova-cell0-conductor-0\" (UID: \"7ae47b25-e6fd-451f-9827-72ee4e12e526\") " pod="openstack/nova-cell0-conductor-0" Dec 09 10:23:19 crc kubenswrapper[5002]: I1209 10:23:19.921645 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 09 10:23:20 crc kubenswrapper[5002]: W1209 10:23:20.370038 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ae47b25_e6fd_451f_9827_72ee4e12e526.slice/crio-da23d51b564af3329adb4eb9eebdd7e2460c0512a6b09a65189ec881705e17cf WatchSource:0}: Error finding container da23d51b564af3329adb4eb9eebdd7e2460c0512a6b09a65189ec881705e17cf: Status 404 returned error can't find the container with id da23d51b564af3329adb4eb9eebdd7e2460c0512a6b09a65189ec881705e17cf Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.377288 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.413394 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.529498 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"7ae47b25-e6fd-451f-9827-72ee4e12e526","Type":"ContainerStarted","Data":"da23d51b564af3329adb4eb9eebdd7e2460c0512a6b09a65189ec881705e17cf"} Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.532726 5002 generic.go:334] "Generic (PLEG): container finished" podID="93a9e9ee-7e52-4ff9-a0a4-d536a52f9156" containerID="cc64a9da30fd78357ae54ce6c0d3970c32e74006434774f8ead28296a23669af" exitCode=0 Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.532772 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93a9e9ee-7e52-4ff9-a0a4-d536a52f9156","Type":"ContainerDied","Data":"cc64a9da30fd78357ae54ce6c0d3970c32e74006434774f8ead28296a23669af"} Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.532803 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93a9e9ee-7e52-4ff9-a0a4-d536a52f9156","Type":"ContainerDied","Data":"7a5bf3d5430c1a0fc2805ba7c537bda32b5f6110be65d5be7927c6ce0c24a658"} Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.532864 5002 scope.go:117] "RemoveContainer" containerID="466341f2e41d849af2bd6e5c12403fe104885c028236b3aaafe8ac44cb910134" Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.533059 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.557214 5002 scope.go:117] "RemoveContainer" containerID="274ff1c4d533256aa5cc49891323aa312bd8a76fac0fdd99649729dce3ed1ac0" Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.581009 5002 scope.go:117] "RemoveContainer" containerID="00ca69a841ff13760c6d3f2ca274f6a7782feebc4affa28cfca224ea5d191e7e" Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.583435 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93a9e9ee-7e52-4ff9-a0a4-d536a52f9156-scripts\") pod \"93a9e9ee-7e52-4ff9-a0a4-d536a52f9156\" (UID: \"93a9e9ee-7e52-4ff9-a0a4-d536a52f9156\") " Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.583724 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lflx8\" (UniqueName: \"kubernetes.io/projected/93a9e9ee-7e52-4ff9-a0a4-d536a52f9156-kube-api-access-lflx8\") pod \"93a9e9ee-7e52-4ff9-a0a4-d536a52f9156\" (UID: \"93a9e9ee-7e52-4ff9-a0a4-d536a52f9156\") " Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.583901 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93a9e9ee-7e52-4ff9-a0a4-d536a52f9156-combined-ca-bundle\") pod \"93a9e9ee-7e52-4ff9-a0a4-d536a52f9156\" (UID: \"93a9e9ee-7e52-4ff9-a0a4-d536a52f9156\") " Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.584568 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93a9e9ee-7e52-4ff9-a0a4-d536a52f9156-config-data\") pod \"93a9e9ee-7e52-4ff9-a0a4-d536a52f9156\" (UID: \"93a9e9ee-7e52-4ff9-a0a4-d536a52f9156\") " Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.584702 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93a9e9ee-7e52-4ff9-a0a4-d536a52f9156-sg-core-conf-yaml\") pod \"93a9e9ee-7e52-4ff9-a0a4-d536a52f9156\" (UID: \"93a9e9ee-7e52-4ff9-a0a4-d536a52f9156\") " Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.585183 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93a9e9ee-7e52-4ff9-a0a4-d536a52f9156-run-httpd\") pod \"93a9e9ee-7e52-4ff9-a0a4-d536a52f9156\" (UID: \"93a9e9ee-7e52-4ff9-a0a4-d536a52f9156\") " Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.585595 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93a9e9ee-7e52-4ff9-a0a4-d536a52f9156-log-httpd\") pod \"93a9e9ee-7e52-4ff9-a0a4-d536a52f9156\" (UID: \"93a9e9ee-7e52-4ff9-a0a4-d536a52f9156\") " Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.585626 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93a9e9ee-7e52-4ff9-a0a4-d536a52f9156-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "93a9e9ee-7e52-4ff9-a0a4-d536a52f9156" (UID: "93a9e9ee-7e52-4ff9-a0a4-d536a52f9156"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.586041 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93a9e9ee-7e52-4ff9-a0a4-d536a52f9156-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "93a9e9ee-7e52-4ff9-a0a4-d536a52f9156" (UID: "93a9e9ee-7e52-4ff9-a0a4-d536a52f9156"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.588032 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93a9e9ee-7e52-4ff9-a0a4-d536a52f9156-scripts" (OuterVolumeSpecName: "scripts") pod "93a9e9ee-7e52-4ff9-a0a4-d536a52f9156" (UID: "93a9e9ee-7e52-4ff9-a0a4-d536a52f9156"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.588960 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93a9e9ee-7e52-4ff9-a0a4-d536a52f9156-kube-api-access-lflx8" (OuterVolumeSpecName: "kube-api-access-lflx8") pod "93a9e9ee-7e52-4ff9-a0a4-d536a52f9156" (UID: "93a9e9ee-7e52-4ff9-a0a4-d536a52f9156"). InnerVolumeSpecName "kube-api-access-lflx8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.611143 5002 scope.go:117] "RemoveContainer" containerID="cc64a9da30fd78357ae54ce6c0d3970c32e74006434774f8ead28296a23669af" Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.628308 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93a9e9ee-7e52-4ff9-a0a4-d536a52f9156-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "93a9e9ee-7e52-4ff9-a0a4-d536a52f9156" (UID: "93a9e9ee-7e52-4ff9-a0a4-d536a52f9156"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.639688 5002 scope.go:117] "RemoveContainer" containerID="466341f2e41d849af2bd6e5c12403fe104885c028236b3aaafe8ac44cb910134" Dec 09 10:23:20 crc kubenswrapper[5002]: E1209 10:23:20.640274 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"466341f2e41d849af2bd6e5c12403fe104885c028236b3aaafe8ac44cb910134\": container with ID starting with 466341f2e41d849af2bd6e5c12403fe104885c028236b3aaafe8ac44cb910134 not found: ID does not exist" containerID="466341f2e41d849af2bd6e5c12403fe104885c028236b3aaafe8ac44cb910134" Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.640342 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"466341f2e41d849af2bd6e5c12403fe104885c028236b3aaafe8ac44cb910134"} err="failed to get container status \"466341f2e41d849af2bd6e5c12403fe104885c028236b3aaafe8ac44cb910134\": rpc error: code = NotFound desc = could not find container \"466341f2e41d849af2bd6e5c12403fe104885c028236b3aaafe8ac44cb910134\": container with ID starting with 466341f2e41d849af2bd6e5c12403fe104885c028236b3aaafe8ac44cb910134 not found: ID does not exist" Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.640370 5002 scope.go:117] "RemoveContainer" containerID="274ff1c4d533256aa5cc49891323aa312bd8a76fac0fdd99649729dce3ed1ac0" Dec 09 10:23:20 crc kubenswrapper[5002]: E1209 10:23:20.640844 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"274ff1c4d533256aa5cc49891323aa312bd8a76fac0fdd99649729dce3ed1ac0\": container with ID starting with 274ff1c4d533256aa5cc49891323aa312bd8a76fac0fdd99649729dce3ed1ac0 not found: ID does not exist" containerID="274ff1c4d533256aa5cc49891323aa312bd8a76fac0fdd99649729dce3ed1ac0" Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.640907 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"274ff1c4d533256aa5cc49891323aa312bd8a76fac0fdd99649729dce3ed1ac0"} err="failed to get container status \"274ff1c4d533256aa5cc49891323aa312bd8a76fac0fdd99649729dce3ed1ac0\": rpc error: code = NotFound desc = could not find container \"274ff1c4d533256aa5cc49891323aa312bd8a76fac0fdd99649729dce3ed1ac0\": container with ID starting with 274ff1c4d533256aa5cc49891323aa312bd8a76fac0fdd99649729dce3ed1ac0 not found: ID does not exist" Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.640946 5002 scope.go:117] "RemoveContainer" containerID="00ca69a841ff13760c6d3f2ca274f6a7782feebc4affa28cfca224ea5d191e7e" Dec 09 10:23:20 crc kubenswrapper[5002]: E1209 10:23:20.641547 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00ca69a841ff13760c6d3f2ca274f6a7782feebc4affa28cfca224ea5d191e7e\": container with ID starting with 00ca69a841ff13760c6d3f2ca274f6a7782feebc4affa28cfca224ea5d191e7e not found: ID does not exist" containerID="00ca69a841ff13760c6d3f2ca274f6a7782feebc4affa28cfca224ea5d191e7e" Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.641583 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00ca69a841ff13760c6d3f2ca274f6a7782feebc4affa28cfca224ea5d191e7e"} err="failed to get container status \"00ca69a841ff13760c6d3f2ca274f6a7782feebc4affa28cfca224ea5d191e7e\": rpc error: code = NotFound desc = could not 
find container \"00ca69a841ff13760c6d3f2ca274f6a7782feebc4affa28cfca224ea5d191e7e\": container with ID starting with 00ca69a841ff13760c6d3f2ca274f6a7782feebc4affa28cfca224ea5d191e7e not found: ID does not exist" Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.641604 5002 scope.go:117] "RemoveContainer" containerID="cc64a9da30fd78357ae54ce6c0d3970c32e74006434774f8ead28296a23669af" Dec 09 10:23:20 crc kubenswrapper[5002]: E1209 10:23:20.641984 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc64a9da30fd78357ae54ce6c0d3970c32e74006434774f8ead28296a23669af\": container with ID starting with cc64a9da30fd78357ae54ce6c0d3970c32e74006434774f8ead28296a23669af not found: ID does not exist" containerID="cc64a9da30fd78357ae54ce6c0d3970c32e74006434774f8ead28296a23669af" Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.642010 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc64a9da30fd78357ae54ce6c0d3970c32e74006434774f8ead28296a23669af"} err="failed to get container status \"cc64a9da30fd78357ae54ce6c0d3970c32e74006434774f8ead28296a23669af\": rpc error: code = NotFound desc = could not find container \"cc64a9da30fd78357ae54ce6c0d3970c32e74006434774f8ead28296a23669af\": container with ID starting with cc64a9da30fd78357ae54ce6c0d3970c32e74006434774f8ead28296a23669af not found: ID does not exist" Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.662778 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93a9e9ee-7e52-4ff9-a0a4-d536a52f9156-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "93a9e9ee-7e52-4ff9-a0a4-d536a52f9156" (UID: "93a9e9ee-7e52-4ff9-a0a4-d536a52f9156"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.674970 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93a9e9ee-7e52-4ff9-a0a4-d536a52f9156-config-data" (OuterVolumeSpecName: "config-data") pod "93a9e9ee-7e52-4ff9-a0a4-d536a52f9156" (UID: "93a9e9ee-7e52-4ff9-a0a4-d536a52f9156"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.688655 5002 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93a9e9ee-7e52-4ff9-a0a4-d536a52f9156-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.688836 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93a9e9ee-7e52-4ff9-a0a4-d536a52f9156-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.688894 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lflx8\" (UniqueName: \"kubernetes.io/projected/93a9e9ee-7e52-4ff9-a0a4-d536a52f9156-kube-api-access-lflx8\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.689537 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93a9e9ee-7e52-4ff9-a0a4-d536a52f9156-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.689626 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93a9e9ee-7e52-4ff9-a0a4-d536a52f9156-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.689686 5002 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93a9e9ee-7e52-4ff9-a0a4-d536a52f9156-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.689738 5002 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93a9e9ee-7e52-4ff9-a0a4-d536a52f9156-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.893926 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.904959 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.927760 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 09 10:23:20 crc kubenswrapper[5002]: E1209 10:23:20.928139 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93a9e9ee-7e52-4ff9-a0a4-d536a52f9156" containerName="sg-core" Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.928151 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="93a9e9ee-7e52-4ff9-a0a4-d536a52f9156" containerName="sg-core" Dec 09 10:23:20 crc kubenswrapper[5002]: E1209 10:23:20.928164 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93a9e9ee-7e52-4ff9-a0a4-d536a52f9156" containerName="proxy-httpd" Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.928171 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="93a9e9ee-7e52-4ff9-a0a4-d536a52f9156" containerName="proxy-httpd" Dec 09 10:23:20 crc kubenswrapper[5002]: E1209 10:23:20.928192 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93a9e9ee-7e52-4ff9-a0a4-d536a52f9156" containerName="ceilometer-central-agent" Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.928197 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="93a9e9ee-7e52-4ff9-a0a4-d536a52f9156" containerName="ceilometer-central-agent" Dec 09 10:23:20 crc 
kubenswrapper[5002]: E1209 10:23:20.928205 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93a9e9ee-7e52-4ff9-a0a4-d536a52f9156" containerName="ceilometer-notification-agent" Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.928210 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="93a9e9ee-7e52-4ff9-a0a4-d536a52f9156" containerName="ceilometer-notification-agent" Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.928387 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="93a9e9ee-7e52-4ff9-a0a4-d536a52f9156" containerName="proxy-httpd" Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.928407 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="93a9e9ee-7e52-4ff9-a0a4-d536a52f9156" containerName="ceilometer-central-agent" Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.928416 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="93a9e9ee-7e52-4ff9-a0a4-d536a52f9156" containerName="ceilometer-notification-agent" Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.928428 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="93a9e9ee-7e52-4ff9-a0a4-d536a52f9156" containerName="sg-core" Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.930046 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.935289 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.935619 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 09 10:23:20 crc kubenswrapper[5002]: I1209 10:23:20.963763 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 09 10:23:21 crc kubenswrapper[5002]: I1209 10:23:21.099657 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cs8gb\" (UniqueName: \"kubernetes.io/projected/863f7cf2-76b4-4780-8c7e-797bd1a52ea0-kube-api-access-cs8gb\") pod \"ceilometer-0\" (UID: \"863f7cf2-76b4-4780-8c7e-797bd1a52ea0\") " pod="openstack/ceilometer-0" Dec 09 10:23:21 crc kubenswrapper[5002]: I1209 10:23:21.099726 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/863f7cf2-76b4-4780-8c7e-797bd1a52ea0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"863f7cf2-76b4-4780-8c7e-797bd1a52ea0\") " pod="openstack/ceilometer-0" Dec 09 10:23:21 crc kubenswrapper[5002]: I1209 10:23:21.099861 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/863f7cf2-76b4-4780-8c7e-797bd1a52ea0-log-httpd\") pod \"ceilometer-0\" (UID: \"863f7cf2-76b4-4780-8c7e-797bd1a52ea0\") " pod="openstack/ceilometer-0" Dec 09 10:23:21 crc kubenswrapper[5002]: I1209 10:23:21.099906 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/863f7cf2-76b4-4780-8c7e-797bd1a52ea0-config-data\") pod \"ceilometer-0\" (UID: \"863f7cf2-76b4-4780-8c7e-797bd1a52ea0\") " pod="openstack/ceilometer-0" Dec 09 10:23:21 crc kubenswrapper[5002]: I1209 10:23:21.100018 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/863f7cf2-76b4-4780-8c7e-797bd1a52ea0-run-httpd\") pod \"ceilometer-0\" (UID: \"863f7cf2-76b4-4780-8c7e-797bd1a52ea0\") " pod="openstack/ceilometer-0" Dec 09 10:23:21 crc kubenswrapper[5002]: I1209 10:23:21.100044 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/863f7cf2-76b4-4780-8c7e-797bd1a52ea0-scripts\") pod \"ceilometer-0\" (UID: \"863f7cf2-76b4-4780-8c7e-797bd1a52ea0\") " pod="openstack/ceilometer-0" Dec 09 10:23:21 crc kubenswrapper[5002]: I1209 10:23:21.100202 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/863f7cf2-76b4-4780-8c7e-797bd1a52ea0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"863f7cf2-76b4-4780-8c7e-797bd1a52ea0\") " pod="openstack/ceilometer-0" Dec 09 10:23:21 crc kubenswrapper[5002]: I1209 10:23:21.201603 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/863f7cf2-76b4-4780-8c7e-797bd1a52ea0-run-httpd\") pod \"ceilometer-0\" (UID: \"863f7cf2-76b4-4780-8c7e-797bd1a52ea0\") " pod="openstack/ceilometer-0" Dec 09 10:23:21 crc kubenswrapper[5002]: I1209 10:23:21.201938 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/863f7cf2-76b4-4780-8c7e-797bd1a52ea0-scripts\") pod \"ceilometer-0\" (UID: \"863f7cf2-76b4-4780-8c7e-797bd1a52ea0\") " pod="openstack/ceilometer-0" Dec 09 10:23:21 crc kubenswrapper[5002]: I1209 10:23:21.202011 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/863f7cf2-76b4-4780-8c7e-797bd1a52ea0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"863f7cf2-76b4-4780-8c7e-797bd1a52ea0\") " pod="openstack/ceilometer-0" Dec 09 10:23:21 crc kubenswrapper[5002]: I1209 10:23:21.202062 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cs8gb\" (UniqueName: \"kubernetes.io/projected/863f7cf2-76b4-4780-8c7e-797bd1a52ea0-kube-api-access-cs8gb\") pod \"ceilometer-0\" (UID: \"863f7cf2-76b4-4780-8c7e-797bd1a52ea0\") " pod="openstack/ceilometer-0" Dec 09 10:23:21 crc kubenswrapper[5002]: I1209 10:23:21.202098 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/863f7cf2-76b4-4780-8c7e-797bd1a52ea0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"863f7cf2-76b4-4780-8c7e-797bd1a52ea0\") " pod="openstack/ceilometer-0" Dec 09 10:23:21 crc kubenswrapper[5002]: I1209 10:23:21.202130 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/863f7cf2-76b4-4780-8c7e-797bd1a52ea0-log-httpd\") pod \"ceilometer-0\" (UID: \"863f7cf2-76b4-4780-8c7e-797bd1a52ea0\") " pod="openstack/ceilometer-0" Dec 09 10:23:21 crc kubenswrapper[5002]: I1209 10:23:21.202147 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/863f7cf2-76b4-4780-8c7e-797bd1a52ea0-config-data\") pod \"ceilometer-0\" (UID: \"863f7cf2-76b4-4780-8c7e-797bd1a52ea0\") " pod="openstack/ceilometer-0" Dec 09 10:23:21 crc kubenswrapper[5002]: I1209 10:23:21.203135 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/863f7cf2-76b4-4780-8c7e-797bd1a52ea0-run-httpd\") pod \"ceilometer-0\" (UID: \"863f7cf2-76b4-4780-8c7e-797bd1a52ea0\") " pod="openstack/ceilometer-0" Dec 09 10:23:21 crc kubenswrapper[5002]: I1209 10:23:21.204258 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/863f7cf2-76b4-4780-8c7e-797bd1a52ea0-log-httpd\") pod \"ceilometer-0\" (UID: \"863f7cf2-76b4-4780-8c7e-797bd1a52ea0\") " pod="openstack/ceilometer-0" Dec 09 10:23:21 crc kubenswrapper[5002]: I1209 10:23:21.206969 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/863f7cf2-76b4-4780-8c7e-797bd1a52ea0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"863f7cf2-76b4-4780-8c7e-797bd1a52ea0\") " pod="openstack/ceilometer-0" Dec 09 10:23:21 crc kubenswrapper[5002]: I1209 10:23:21.207318 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/863f7cf2-76b4-4780-8c7e-797bd1a52ea0-scripts\") pod \"ceilometer-0\" (UID: \"863f7cf2-76b4-4780-8c7e-797bd1a52ea0\") " pod="openstack/ceilometer-0" Dec 09 10:23:21 crc kubenswrapper[5002]: I1209 10:23:21.208299 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/863f7cf2-76b4-4780-8c7e-797bd1a52ea0-config-data\") pod \"ceilometer-0\" (UID: \"863f7cf2-76b4-4780-8c7e-797bd1a52ea0\") " pod="openstack/ceilometer-0" Dec 09 10:23:21 crc kubenswrapper[5002]: I1209 10:23:21.208572 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/863f7cf2-76b4-4780-8c7e-797bd1a52ea0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"863f7cf2-76b4-4780-8c7e-797bd1a52ea0\") " pod="openstack/ceilometer-0" Dec 09 10:23:21 crc kubenswrapper[5002]: I1209 10:23:21.240294 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cs8gb\" (UniqueName: \"kubernetes.io/projected/863f7cf2-76b4-4780-8c7e-797bd1a52ea0-kube-api-access-cs8gb\") pod \"ceilometer-0\" (UID: \"863f7cf2-76b4-4780-8c7e-797bd1a52ea0\") " pod="openstack/ceilometer-0" Dec 09 10:23:21 crc kubenswrapper[5002]: I1209 10:23:21.284749 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 09 10:23:21 crc kubenswrapper[5002]: I1209 10:23:21.544212 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"7ae47b25-e6fd-451f-9827-72ee4e12e526","Type":"ContainerStarted","Data":"b248460156a9f8eb7f40491f548be924840ee184fd17bcff0402ca92134b5847"} Dec 09 10:23:21 crc kubenswrapper[5002]: I1209 10:23:21.544613 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 09 10:23:21 crc kubenswrapper[5002]: I1209 10:23:21.569573 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.569553069 podStartE2EDuration="2.569553069s" podCreationTimestamp="2025-12-09 10:23:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:23:21.55640921 +0000 UTC m=+1333.948460331" watchObservedRunningTime="2025-12-09 10:23:21.569553069 +0000 UTC m=+1333.961604160" Dec 09 10:23:21 crc kubenswrapper[5002]: I1209 10:23:21.752069 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 09 10:23:21 crc kubenswrapper[5002]: W1209 10:23:21.754493 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod863f7cf2_76b4_4780_8c7e_797bd1a52ea0.slice/crio-9b1f687954ff90dd411d3f05c2f108ff8221a863b8138b70b3515ca274005b86 WatchSource:0}: Error finding container 9b1f687954ff90dd411d3f05c2f108ff8221a863b8138b70b3515ca274005b86: Status 404 returned error can't find the container with id 9b1f687954ff90dd411d3f05c2f108ff8221a863b8138b70b3515ca274005b86 Dec 09 10:23:22 crc kubenswrapper[5002]: I1209 10:23:22.078297 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93a9e9ee-7e52-4ff9-a0a4-d536a52f9156" path="/var/lib/kubelet/pods/93a9e9ee-7e52-4ff9-a0a4-d536a52f9156/volumes" Dec 09 10:23:22 crc kubenswrapper[5002]: I1209 10:23:22.554648 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"863f7cf2-76b4-4780-8c7e-797bd1a52ea0","Type":"ContainerStarted","Data":"29a3a70981ffc1c78ea042ff208a86bb4d0c41cf67ee7a5b6de71def9da9f55b"} Dec 09 10:23:22 crc kubenswrapper[5002]: I1209 10:23:22.555002 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"863f7cf2-76b4-4780-8c7e-797bd1a52ea0","Type":"ContainerStarted","Data":"9b1f687954ff90dd411d3f05c2f108ff8221a863b8138b70b3515ca274005b86"} Dec 09 10:23:22 crc kubenswrapper[5002]: I1209 10:23:22.850550 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-857f77df5c-skx8f" Dec 09 10:23:22 crc kubenswrapper[5002]: I1209 10:23:22.916468 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-67fdf7874b-b25q8"] Dec 09 10:23:22 crc kubenswrapper[5002]: I1209 10:23:22.916678 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-67fdf7874b-b25q8" podUID="203af364-55a8-46c1-be90-22ed4849dec0" containerName="neutron-api" containerID="cri-o://66f50d7fd24bb1f772b27308824711561e85c01d58e40c73bede64cc9ad62d06" gracePeriod=30 Dec 09 10:23:22 crc kubenswrapper[5002]: I1209 10:23:22.916801 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-67fdf7874b-b25q8" podUID="203af364-55a8-46c1-be90-22ed4849dec0" 
containerName="neutron-httpd" containerID="cri-o://5485d6fc5b977bb2cf23ebce8d5bc7c9e77dcc1b4353f2738fe906eb7118c90b" gracePeriod=30 Dec 09 10:23:23 crc kubenswrapper[5002]: I1209 10:23:23.574399 5002 generic.go:334] "Generic (PLEG): container finished" podID="203af364-55a8-46c1-be90-22ed4849dec0" containerID="5485d6fc5b977bb2cf23ebce8d5bc7c9e77dcc1b4353f2738fe906eb7118c90b" exitCode=0 Dec 09 10:23:23 crc kubenswrapper[5002]: I1209 10:23:23.574469 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67fdf7874b-b25q8" event={"ID":"203af364-55a8-46c1-be90-22ed4849dec0","Type":"ContainerDied","Data":"5485d6fc5b977bb2cf23ebce8d5bc7c9e77dcc1b4353f2738fe906eb7118c90b"} Dec 09 10:23:23 crc kubenswrapper[5002]: I1209 10:23:23.576498 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"863f7cf2-76b4-4780-8c7e-797bd1a52ea0","Type":"ContainerStarted","Data":"667fe254c514762feed8f00659b9972ab6e5bc6dad1e2e2312a29b3309eb39b0"} Dec 09 10:23:24 crc kubenswrapper[5002]: I1209 10:23:24.597546 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"863f7cf2-76b4-4780-8c7e-797bd1a52ea0","Type":"ContainerStarted","Data":"906207b6ee5d87fa339a261555ebad22b0b552623b98629bf7b1a5247286b94a"} Dec 09 10:23:25 crc kubenswrapper[5002]: I1209 10:23:25.607372 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"863f7cf2-76b4-4780-8c7e-797bd1a52ea0","Type":"ContainerStarted","Data":"a7d92c4732c10b0253989840b364e50476dfcce6b63ceef50d6e3a8462cc0004"} Dec 09 10:23:25 crc kubenswrapper[5002]: I1209 10:23:25.607870 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 09 10:23:25 crc kubenswrapper[5002]: I1209 10:23:25.638471 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.340720155 podStartE2EDuration="5.638456088s" podCreationTimestamp="2025-12-09 10:23:20 +0000 UTC" firstStartedPulling="2025-12-09 10:23:21.756740151 +0000 UTC m=+1334.148791242" lastFinishedPulling="2025-12-09 10:23:25.054476094 +0000 UTC m=+1337.446527175" observedRunningTime="2025-12-09 10:23:25.636706633 +0000 UTC m=+1338.028757714" watchObservedRunningTime="2025-12-09 10:23:25.638456088 +0000 UTC m=+1338.030507159" Dec 09 10:23:27 crc kubenswrapper[5002]: I1209 10:23:27.626965 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-67fdf7874b-b25q8" Dec 09 10:23:27 crc kubenswrapper[5002]: I1209 10:23:27.626958 5002 generic.go:334] "Generic (PLEG): container finished" podID="203af364-55a8-46c1-be90-22ed4849dec0" containerID="66f50d7fd24bb1f772b27308824711561e85c01d58e40c73bede64cc9ad62d06" exitCode=0 Dec 09 10:23:27 crc kubenswrapper[5002]: I1209 10:23:27.626985 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67fdf7874b-b25q8" event={"ID":"203af364-55a8-46c1-be90-22ed4849dec0","Type":"ContainerDied","Data":"66f50d7fd24bb1f772b27308824711561e85c01d58e40c73bede64cc9ad62d06"} Dec 09 10:23:27 crc kubenswrapper[5002]: I1209 10:23:27.627382 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67fdf7874b-b25q8" event={"ID":"203af364-55a8-46c1-be90-22ed4849dec0","Type":"ContainerDied","Data":"7710f7d4c6e1d9402e7b5533673611452772302c2eb8ee6d2afb38c09c6a83c9"} Dec 09 10:23:27 crc kubenswrapper[5002]: I1209 10:23:27.627405 5002 scope.go:117] "RemoveContainer" containerID="5485d6fc5b977bb2cf23ebce8d5bc7c9e77dcc1b4353f2738fe906eb7118c90b" Dec 09 10:23:27 crc kubenswrapper[5002]: I1209 10:23:27.650861 5002 scope.go:117] "RemoveContainer" containerID="66f50d7fd24bb1f772b27308824711561e85c01d58e40c73bede64cc9ad62d06" Dec 09 10:23:27 crc kubenswrapper[5002]: I1209 10:23:27.671807 5002 scope.go:117] "RemoveContainer" containerID="5485d6fc5b977bb2cf23ebce8d5bc7c9e77dcc1b4353f2738fe906eb7118c90b" Dec 09 10:23:27 crc kubenswrapper[5002]: E1209 10:23:27.673313 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5485d6fc5b977bb2cf23ebce8d5bc7c9e77dcc1b4353f2738fe906eb7118c90b\": container with ID starting with 5485d6fc5b977bb2cf23ebce8d5bc7c9e77dcc1b4353f2738fe906eb7118c90b not found: ID does not exist" containerID="5485d6fc5b977bb2cf23ebce8d5bc7c9e77dcc1b4353f2738fe906eb7118c90b" Dec 09 10:23:27 crc kubenswrapper[5002]: I1209 10:23:27.673338 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5485d6fc5b977bb2cf23ebce8d5bc7c9e77dcc1b4353f2738fe906eb7118c90b"} err="failed to get container status \"5485d6fc5b977bb2cf23ebce8d5bc7c9e77dcc1b4353f2738fe906eb7118c90b\": rpc error: code = NotFound desc = could not find container \"5485d6fc5b977bb2cf23ebce8d5bc7c9e77dcc1b4353f2738fe906eb7118c90b\": container with ID starting with 5485d6fc5b977bb2cf23ebce8d5bc7c9e77dcc1b4353f2738fe906eb7118c90b not found: ID does not exist" Dec 09 10:23:27 crc kubenswrapper[5002]: I1209 10:23:27.673357 5002 scope.go:117] "RemoveContainer" containerID="66f50d7fd24bb1f772b27308824711561e85c01d58e40c73bede64cc9ad62d06" Dec 09 10:23:27 crc kubenswrapper[5002]: E1209 10:23:27.673544 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66f50d7fd24bb1f772b27308824711561e85c01d58e40c73bede64cc9ad62d06\": container with ID starting with 66f50d7fd24bb1f772b27308824711561e85c01d58e40c73bede64cc9ad62d06 not found: ID does not exist" containerID="66f50d7fd24bb1f772b27308824711561e85c01d58e40c73bede64cc9ad62d06" Dec 09 10:23:27 crc kubenswrapper[5002]: I1209 10:23:27.673564 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66f50d7fd24bb1f772b27308824711561e85c01d58e40c73bede64cc9ad62d06"} err="failed to get container status \"66f50d7fd24bb1f772b27308824711561e85c01d58e40c73bede64cc9ad62d06\": rpc error: code = NotFound desc = could 
not find container \"66f50d7fd24bb1f772b27308824711561e85c01d58e40c73bede64cc9ad62d06\": container with ID starting with 66f50d7fd24bb1f772b27308824711561e85c01d58e40c73bede64cc9ad62d06 not found: ID does not exist" Dec 09 10:23:27 crc kubenswrapper[5002]: I1209 10:23:27.723786 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/203af364-55a8-46c1-be90-22ed4849dec0-httpd-config\") pod \"203af364-55a8-46c1-be90-22ed4849dec0\" (UID: \"203af364-55a8-46c1-be90-22ed4849dec0\") " Dec 09 10:23:27 crc kubenswrapper[5002]: I1209 10:23:27.724004 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/203af364-55a8-46c1-be90-22ed4849dec0-combined-ca-bundle\") pod \"203af364-55a8-46c1-be90-22ed4849dec0\" (UID: \"203af364-55a8-46c1-be90-22ed4849dec0\") " Dec 09 10:23:27 crc kubenswrapper[5002]: I1209 10:23:27.724047 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6sz6w\" (UniqueName: \"kubernetes.io/projected/203af364-55a8-46c1-be90-22ed4849dec0-kube-api-access-6sz6w\") pod \"203af364-55a8-46c1-be90-22ed4849dec0\" (UID: \"203af364-55a8-46c1-be90-22ed4849dec0\") " Dec 09 10:23:27 crc kubenswrapper[5002]: I1209 10:23:27.724095 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/203af364-55a8-46c1-be90-22ed4849dec0-ovndb-tls-certs\") pod \"203af364-55a8-46c1-be90-22ed4849dec0\" (UID: \"203af364-55a8-46c1-be90-22ed4849dec0\") " Dec 09 10:23:27 crc kubenswrapper[5002]: I1209 10:23:27.724262 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/203af364-55a8-46c1-be90-22ed4849dec0-config\") pod \"203af364-55a8-46c1-be90-22ed4849dec0\" (UID: \"203af364-55a8-46c1-be90-22ed4849dec0\") " Dec 09 10:23:27 crc kubenswrapper[5002]: I1209 10:23:27.731528 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/203af364-55a8-46c1-be90-22ed4849dec0-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "203af364-55a8-46c1-be90-22ed4849dec0" (UID: "203af364-55a8-46c1-be90-22ed4849dec0"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:23:27 crc kubenswrapper[5002]: I1209 10:23:27.737141 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/203af364-55a8-46c1-be90-22ed4849dec0-kube-api-access-6sz6w" (OuterVolumeSpecName: "kube-api-access-6sz6w") pod "203af364-55a8-46c1-be90-22ed4849dec0" (UID: "203af364-55a8-46c1-be90-22ed4849dec0"). InnerVolumeSpecName "kube-api-access-6sz6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:23:27 crc kubenswrapper[5002]: I1209 10:23:27.780576 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/203af364-55a8-46c1-be90-22ed4849dec0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "203af364-55a8-46c1-be90-22ed4849dec0" (UID: "203af364-55a8-46c1-be90-22ed4849dec0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:23:27 crc kubenswrapper[5002]: I1209 10:23:27.826589 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/203af364-55a8-46c1-be90-22ed4849dec0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:27 crc kubenswrapper[5002]: I1209 10:23:27.826634 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6sz6w\" (UniqueName: \"kubernetes.io/projected/203af364-55a8-46c1-be90-22ed4849dec0-kube-api-access-6sz6w\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:27 crc kubenswrapper[5002]: I1209 10:23:27.826647 5002 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/203af364-55a8-46c1-be90-22ed4849dec0-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:27 crc kubenswrapper[5002]: I1209 10:23:27.826730 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/203af364-55a8-46c1-be90-22ed4849dec0-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "203af364-55a8-46c1-be90-22ed4849dec0" (UID: "203af364-55a8-46c1-be90-22ed4849dec0"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:23:27 crc kubenswrapper[5002]: I1209 10:23:27.827186 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/203af364-55a8-46c1-be90-22ed4849dec0-config" (OuterVolumeSpecName: "config") pod "203af364-55a8-46c1-be90-22ed4849dec0" (UID: "203af364-55a8-46c1-be90-22ed4849dec0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:23:27 crc kubenswrapper[5002]: I1209 10:23:27.928036 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/203af364-55a8-46c1-be90-22ed4849dec0-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:27 crc kubenswrapper[5002]: I1209 10:23:27.928080 5002 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/203af364-55a8-46c1-be90-22ed4849dec0-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:28 crc kubenswrapper[5002]: I1209 10:23:28.637783 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-67fdf7874b-b25q8" Dec 09 10:23:28 crc kubenswrapper[5002]: I1209 10:23:28.671396 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-67fdf7874b-b25q8"] Dec 09 10:23:28 crc kubenswrapper[5002]: I1209 10:23:28.680051 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-67fdf7874b-b25q8"] Dec 09 10:23:29 crc kubenswrapper[5002]: I1209 10:23:29.970638 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.073714 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="203af364-55a8-46c1-be90-22ed4849dec0" path="/var/lib/kubelet/pods/203af364-55a8-46c1-be90-22ed4849dec0/volumes" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.532158 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-s99q2"] Dec 09 10:23:30 crc kubenswrapper[5002]: E1209 10:23:30.532701 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="203af364-55a8-46c1-be90-22ed4849dec0" containerName="neutron-httpd" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.532729 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="203af364-55a8-46c1-be90-22ed4849dec0" containerName="neutron-httpd" Dec 09 10:23:30 crc kubenswrapper[5002]: E1209 10:23:30.532754 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="203af364-55a8-46c1-be90-22ed4849dec0" containerName="neutron-api" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.532763 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="203af364-55a8-46c1-be90-22ed4849dec0" containerName="neutron-api" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.533061 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="203af364-55a8-46c1-be90-22ed4849dec0" containerName="neutron-api" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.533091 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="203af364-55a8-46c1-be90-22ed4849dec0" containerName="neutron-httpd" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.533895 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-s99q2" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.536707 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.545953 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-s99q2"] Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.553576 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.607030 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/759658bf-fbc8-40b6-96a5-b691f2ecec64-config-data\") pod \"nova-cell0-cell-mapping-s99q2\" (UID: \"759658bf-fbc8-40b6-96a5-b691f2ecec64\") " pod="openstack/nova-cell0-cell-mapping-s99q2" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.607119 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/759658bf-fbc8-40b6-96a5-b691f2ecec64-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-s99q2\" (UID: \"759658bf-fbc8-40b6-96a5-b691f2ecec64\") " pod="openstack/nova-cell0-cell-mapping-s99q2" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.607157 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4dkq\" (UniqueName: \"kubernetes.io/projected/759658bf-fbc8-40b6-96a5-b691f2ecec64-kube-api-access-k4dkq\") pod \"nova-cell0-cell-mapping-s99q2\" (UID: \"759658bf-fbc8-40b6-96a5-b691f2ecec64\") " pod="openstack/nova-cell0-cell-mapping-s99q2" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.607207 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/759658bf-fbc8-40b6-96a5-b691f2ecec64-scripts\") pod \"nova-cell0-cell-mapping-s99q2\" (UID: \"759658bf-fbc8-40b6-96a5-b691f2ecec64\") " pod="openstack/nova-cell0-cell-mapping-s99q2" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.710168 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/759658bf-fbc8-40b6-96a5-b691f2ecec64-config-data\") pod \"nova-cell0-cell-mapping-s99q2\" (UID: \"759658bf-fbc8-40b6-96a5-b691f2ecec64\") " pod="openstack/nova-cell0-cell-mapping-s99q2" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.710262 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/759658bf-fbc8-40b6-96a5-b691f2ecec64-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-s99q2\" (UID: \"759658bf-fbc8-40b6-96a5-b691f2ecec64\") " pod="openstack/nova-cell0-cell-mapping-s99q2" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.710289 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4dkq\" (UniqueName: \"kubernetes.io/projected/759658bf-fbc8-40b6-96a5-b691f2ecec64-kube-api-access-k4dkq\") pod \"nova-cell0-cell-mapping-s99q2\" (UID: \"759658bf-fbc8-40b6-96a5-b691f2ecec64\") " pod="openstack/nova-cell0-cell-mapping-s99q2" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.710389 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/759658bf-fbc8-40b6-96a5-b691f2ecec64-scripts\") pod \"nova-cell0-cell-mapping-s99q2\" (UID: \"759658bf-fbc8-40b6-96a5-b691f2ecec64\") " pod="openstack/nova-cell0-cell-mapping-s99q2" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.718602 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/759658bf-fbc8-40b6-96a5-b691f2ecec64-config-data\") pod \"nova-cell0-cell-mapping-s99q2\" (UID: \"759658bf-fbc8-40b6-96a5-b691f2ecec64\") " pod="openstack/nova-cell0-cell-mapping-s99q2" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.718750 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/759658bf-fbc8-40b6-96a5-b691f2ecec64-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-s99q2\" (UID: \"759658bf-fbc8-40b6-96a5-b691f2ecec64\") " pod="openstack/nova-cell0-cell-mapping-s99q2" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.721593 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/759658bf-fbc8-40b6-96a5-b691f2ecec64-scripts\") pod \"nova-cell0-cell-mapping-s99q2\" (UID: \"759658bf-fbc8-40b6-96a5-b691f2ecec64\") " pod="openstack/nova-cell0-cell-mapping-s99q2" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.724993 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.726888 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.749751 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.752517 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4dkq\" (UniqueName: \"kubernetes.io/projected/759658bf-fbc8-40b6-96a5-b691f2ecec64-kube-api-access-k4dkq\") pod \"nova-cell0-cell-mapping-s99q2\" (UID: \"759658bf-fbc8-40b6-96a5-b691f2ecec64\") " pod="openstack/nova-cell0-cell-mapping-s99q2" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.772797 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.809070 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.811185 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.815399 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.818734 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/845e4d85-e58c-4ceb-a26f-5c918422c6a6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"845e4d85-e58c-4ceb-a26f-5c918422c6a6\") " pod="openstack/nova-metadata-0" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.818786 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0172af5-b021-469d-ac3a-fb73a0651f27-logs\") pod \"nova-api-0\" (UID: \"a0172af5-b021-469d-ac3a-fb73a0651f27\") " pod="openstack/nova-api-0" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.818847 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/845e4d85-e58c-4ceb-a26f-5c918422c6a6-config-data\") pod \"nova-metadata-0\" (UID: \"845e4d85-e58c-4ceb-a26f-5c918422c6a6\") " pod="openstack/nova-metadata-0" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.818898 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0172af5-b021-469d-ac3a-fb73a0651f27-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a0172af5-b021-469d-ac3a-fb73a0651f27\") " pod="openstack/nova-api-0" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.818951 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0172af5-b021-469d-ac3a-fb73a0651f27-config-data\") pod \"nova-api-0\" (UID: \"a0172af5-b021-469d-ac3a-fb73a0651f27\") " pod="openstack/nova-api-0" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.818993 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhd9f\" (UniqueName: \"kubernetes.io/projected/a0172af5-b021-469d-ac3a-fb73a0651f27-kube-api-access-rhd9f\") pod \"nova-api-0\" (UID: \"a0172af5-b021-469d-ac3a-fb73a0651f27\") " pod="openstack/nova-api-0" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.819061 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdmgh\" (UniqueName: \"kubernetes.io/projected/845e4d85-e58c-4ceb-a26f-5c918422c6a6-kube-api-access-wdmgh\") pod \"nova-metadata-0\" (UID: \"845e4d85-e58c-4ceb-a26f-5c918422c6a6\") " pod="openstack/nova-metadata-0" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.819093 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/845e4d85-e58c-4ceb-a26f-5c918422c6a6-logs\") pod \"nova-metadata-0\" (UID: \"845e4d85-e58c-4ceb-a26f-5c918422c6a6\") " pod="openstack/nova-metadata-0" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.822187 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.858933 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-s99q2" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.881875 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-7dbbg"] Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.883390 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-7dbbg" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.911110 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.912307 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.915277 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.922806 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ea94dfeb-8659-48ad-9f5a-da8202588f0f-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-7dbbg\" (UID: \"ea94dfeb-8659-48ad-9f5a-da8202588f0f\") " pod="openstack/dnsmasq-dns-757b4f8459-7dbbg" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.922957 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0172af5-b021-469d-ac3a-fb73a0651f27-config-data\") pod \"nova-api-0\" (UID: \"a0172af5-b021-469d-ac3a-fb73a0651f27\") " pod="openstack/nova-api-0" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.922982 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ea94dfeb-8659-48ad-9f5a-da8202588f0f-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-7dbbg\" (UID: \"ea94dfeb-8659-48ad-9f5a-da8202588f0f\") " pod="openstack/dnsmasq-dns-757b4f8459-7dbbg" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.923016 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhd9f\" (UniqueName: \"kubernetes.io/projected/a0172af5-b021-469d-ac3a-fb73a0651f27-kube-api-access-rhd9f\") pod \"nova-api-0\" (UID: \"a0172af5-b021-469d-ac3a-fb73a0651f27\") " pod="openstack/nova-api-0" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.923060 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d1c1529-5c04-4bea-b054-5a73781b0e56-config-data\") pod \"nova-scheduler-0\" (UID: \"3d1c1529-5c04-4bea-b054-5a73781b0e56\") " pod="openstack/nova-scheduler-0" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.923122 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdmgh\" (UniqueName: \"kubernetes.io/projected/845e4d85-e58c-4ceb-a26f-5c918422c6a6-kube-api-access-wdmgh\") pod \"nova-metadata-0\" (UID: \"845e4d85-e58c-4ceb-a26f-5c918422c6a6\") " pod="openstack/nova-metadata-0" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.923147 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqhxf\" (UniqueName: \"kubernetes.io/projected/ea94dfeb-8659-48ad-9f5a-da8202588f0f-kube-api-access-fqhxf\") pod \"dnsmasq-dns-757b4f8459-7dbbg\" (UID: 
\"ea94dfeb-8659-48ad-9f5a-da8202588f0f\") " pod="openstack/dnsmasq-dns-757b4f8459-7dbbg" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.923170 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1c1529-5c04-4bea-b054-5a73781b0e56-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3d1c1529-5c04-4bea-b054-5a73781b0e56\") " pod="openstack/nova-scheduler-0" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.923192 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/845e4d85-e58c-4ceb-a26f-5c918422c6a6-logs\") pod \"nova-metadata-0\" (UID: \"845e4d85-e58c-4ceb-a26f-5c918422c6a6\") " pod="openstack/nova-metadata-0" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.923218 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ea94dfeb-8659-48ad-9f5a-da8202588f0f-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-7dbbg\" (UID: \"ea94dfeb-8659-48ad-9f5a-da8202588f0f\") " pod="openstack/dnsmasq-dns-757b4f8459-7dbbg" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.923250 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/845e4d85-e58c-4ceb-a26f-5c918422c6a6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"845e4d85-e58c-4ceb-a26f-5c918422c6a6\") " pod="openstack/nova-metadata-0" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.923272 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqrdk\" (UniqueName: \"kubernetes.io/projected/3d1c1529-5c04-4bea-b054-5a73781b0e56-kube-api-access-xqrdk\") pod \"nova-scheduler-0\" (UID: \"3d1c1529-5c04-4bea-b054-5a73781b0e56\") " pod="openstack/nova-scheduler-0" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.923299 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/845e4d85-e58c-4ceb-a26f-5c918422c6a6-config-data\") pod \"nova-metadata-0\" (UID: \"845e4d85-e58c-4ceb-a26f-5c918422c6a6\") " pod="openstack/nova-metadata-0" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.923322 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0172af5-b021-469d-ac3a-fb73a0651f27-logs\") pod \"nova-api-0\" (UID: \"a0172af5-b021-469d-ac3a-fb73a0651f27\") " pod="openstack/nova-api-0" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.923377 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0172af5-b021-469d-ac3a-fb73a0651f27-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a0172af5-b021-469d-ac3a-fb73a0651f27\") " pod="openstack/nova-api-0" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.923415 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea94dfeb-8659-48ad-9f5a-da8202588f0f-config\") pod \"dnsmasq-dns-757b4f8459-7dbbg\" (UID: \"ea94dfeb-8659-48ad-9f5a-da8202588f0f\") " pod="openstack/dnsmasq-dns-757b4f8459-7dbbg" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.923452 5002 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea94dfeb-8659-48ad-9f5a-da8202588f0f-dns-svc\") pod \"dnsmasq-dns-757b4f8459-7dbbg\" (UID: \"ea94dfeb-8659-48ad-9f5a-da8202588f0f\") " pod="openstack/dnsmasq-dns-757b4f8459-7dbbg" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.923864 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-7dbbg"] Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.924289 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/845e4d85-e58c-4ceb-a26f-5c918422c6a6-logs\") pod \"nova-metadata-0\" (UID: \"845e4d85-e58c-4ceb-a26f-5c918422c6a6\") " pod="openstack/nova-metadata-0" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.927662 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0172af5-b021-469d-ac3a-fb73a0651f27-logs\") pod \"nova-api-0\" (UID: \"a0172af5-b021-469d-ac3a-fb73a0651f27\") " pod="openstack/nova-api-0" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.932967 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0172af5-b021-469d-ac3a-fb73a0651f27-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a0172af5-b021-469d-ac3a-fb73a0651f27\") " pod="openstack/nova-api-0" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.939597 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/845e4d85-e58c-4ceb-a26f-5c918422c6a6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"845e4d85-e58c-4ceb-a26f-5c918422c6a6\") " pod="openstack/nova-metadata-0" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.957647 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/845e4d85-e58c-4ceb-a26f-5c918422c6a6-config-data\") pod \"nova-metadata-0\" (UID: \"845e4d85-e58c-4ceb-a26f-5c918422c6a6\") " pod="openstack/nova-metadata-0" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.958232 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0172af5-b021-469d-ac3a-fb73a0651f27-config-data\") pod \"nova-api-0\" (UID: \"a0172af5-b021-469d-ac3a-fb73a0651f27\") " pod="openstack/nova-api-0" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.970443 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhd9f\" (UniqueName: \"kubernetes.io/projected/a0172af5-b021-469d-ac3a-fb73a0651f27-kube-api-access-rhd9f\") pod \"nova-api-0\" (UID: \"a0172af5-b021-469d-ac3a-fb73a0651f27\") " pod="openstack/nova-api-0" Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.970728 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 10:23:30 crc kubenswrapper[5002]: I1209 10:23:30.978633 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdmgh\" (UniqueName: \"kubernetes.io/projected/845e4d85-e58c-4ceb-a26f-5c918422c6a6-kube-api-access-wdmgh\") pod \"nova-metadata-0\" (UID: \"845e4d85-e58c-4ceb-a26f-5c918422c6a6\") " pod="openstack/nova-metadata-0" Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.026465 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/3d1c1529-5c04-4bea-b054-5a73781b0e56-config-data\") pod \"nova-scheduler-0\" (UID: \"3d1c1529-5c04-4bea-b054-5a73781b0e56\") " pod="openstack/nova-scheduler-0" Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.026565 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqhxf\" (UniqueName: \"kubernetes.io/projected/ea94dfeb-8659-48ad-9f5a-da8202588f0f-kube-api-access-fqhxf\") pod \"dnsmasq-dns-757b4f8459-7dbbg\" (UID: \"ea94dfeb-8659-48ad-9f5a-da8202588f0f\") " pod="openstack/dnsmasq-dns-757b4f8459-7dbbg" Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.026588 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1c1529-5c04-4bea-b054-5a73781b0e56-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3d1c1529-5c04-4bea-b054-5a73781b0e56\") " pod="openstack/nova-scheduler-0" Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.026623 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ea94dfeb-8659-48ad-9f5a-da8202588f0f-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-7dbbg\" (UID: \"ea94dfeb-8659-48ad-9f5a-da8202588f0f\") " pod="openstack/dnsmasq-dns-757b4f8459-7dbbg" Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.026664 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqrdk\" (UniqueName: \"kubernetes.io/projected/3d1c1529-5c04-4bea-b054-5a73781b0e56-kube-api-access-xqrdk\") pod \"nova-scheduler-0\" (UID: \"3d1c1529-5c04-4bea-b054-5a73781b0e56\") " pod="openstack/nova-scheduler-0" Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.026740 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea94dfeb-8659-48ad-9f5a-da8202588f0f-config\") pod \"dnsmasq-dns-757b4f8459-7dbbg\" (UID: \"ea94dfeb-8659-48ad-9f5a-da8202588f0f\") " pod="openstack/dnsmasq-dns-757b4f8459-7dbbg" Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.026778 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea94dfeb-8659-48ad-9f5a-da8202588f0f-dns-svc\") pod \"dnsmasq-dns-757b4f8459-7dbbg\" (UID: \"ea94dfeb-8659-48ad-9f5a-da8202588f0f\") " pod="openstack/dnsmasq-dns-757b4f8459-7dbbg" Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.026807 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ea94dfeb-8659-48ad-9f5a-da8202588f0f-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-7dbbg\" (UID: \"ea94dfeb-8659-48ad-9f5a-da8202588f0f\") " pod="openstack/dnsmasq-dns-757b4f8459-7dbbg" Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.026855 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ea94dfeb-8659-48ad-9f5a-da8202588f0f-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-7dbbg\" (UID: \"ea94dfeb-8659-48ad-9f5a-da8202588f0f\") " pod="openstack/dnsmasq-dns-757b4f8459-7dbbg" Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.027318 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.028493 5002 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.029871 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea94dfeb-8659-48ad-9f5a-da8202588f0f-dns-svc\") pod \"dnsmasq-dns-757b4f8459-7dbbg\" (UID: \"ea94dfeb-8659-48ad-9f5a-da8202588f0f\") " pod="openstack/dnsmasq-dns-757b4f8459-7dbbg" Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.030303 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ea94dfeb-8659-48ad-9f5a-da8202588f0f-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-7dbbg\" (UID: \"ea94dfeb-8659-48ad-9f5a-da8202588f0f\") " pod="openstack/dnsmasq-dns-757b4f8459-7dbbg" Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.030648 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea94dfeb-8659-48ad-9f5a-da8202588f0f-config\") pod \"dnsmasq-dns-757b4f8459-7dbbg\" (UID: \"ea94dfeb-8659-48ad-9f5a-da8202588f0f\") " pod="openstack/dnsmasq-dns-757b4f8459-7dbbg" Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.031235 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.032065 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ea94dfeb-8659-48ad-9f5a-da8202588f0f-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-7dbbg\" (UID: \"ea94dfeb-8659-48ad-9f5a-da8202588f0f\") " pod="openstack/dnsmasq-dns-757b4f8459-7dbbg" Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.036446 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d1c1529-5c04-4bea-b054-5a73781b0e56-config-data\") pod \"nova-scheduler-0\" (UID: \"3d1c1529-5c04-4bea-b054-5a73781b0e56\") " pod="openstack/nova-scheduler-0" Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.037493 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1c1529-5c04-4bea-b054-5a73781b0e56-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3d1c1529-5c04-4bea-b054-5a73781b0e56\") " pod="openstack/nova-scheduler-0" Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.040658 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ea94dfeb-8659-48ad-9f5a-da8202588f0f-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-7dbbg\" (UID: \"ea94dfeb-8659-48ad-9f5a-da8202588f0f\") " pod="openstack/dnsmasq-dns-757b4f8459-7dbbg" Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.045912 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqhxf\" (UniqueName: \"kubernetes.io/projected/ea94dfeb-8659-48ad-9f5a-da8202588f0f-kube-api-access-fqhxf\") pod \"dnsmasq-dns-757b4f8459-7dbbg\" (UID: \"ea94dfeb-8659-48ad-9f5a-da8202588f0f\") " pod="openstack/dnsmasq-dns-757b4f8459-7dbbg" Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.053016 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-7dbbg" Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.062288 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqrdk\" (UniqueName: \"kubernetes.io/projected/3d1c1529-5c04-4bea-b054-5a73781b0e56-kube-api-access-xqrdk\") pod \"nova-scheduler-0\" (UID: \"3d1c1529-5c04-4bea-b054-5a73781b0e56\") " pod="openstack/nova-scheduler-0" Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.074138 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.077593 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.131314 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nk5km\" (UniqueName: \"kubernetes.io/projected/8b839aa4-a8e5-4b00-a9d6-3aea41a707f4-kube-api-access-nk5km\") pod \"nova-cell1-novncproxy-0\" (UID: \"8b839aa4-a8e5-4b00-a9d6-3aea41a707f4\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.131699 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b839aa4-a8e5-4b00-a9d6-3aea41a707f4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8b839aa4-a8e5-4b00-a9d6-3aea41a707f4\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.131741 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b839aa4-a8e5-4b00-a9d6-3aea41a707f4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8b839aa4-a8e5-4b00-a9d6-3aea41a707f4\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.146287 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.146629 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.234748 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nk5km\" (UniqueName: \"kubernetes.io/projected/8b839aa4-a8e5-4b00-a9d6-3aea41a707f4-kube-api-access-nk5km\") pod \"nova-cell1-novncproxy-0\" (UID: \"8b839aa4-a8e5-4b00-a9d6-3aea41a707f4\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.234820 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b839aa4-a8e5-4b00-a9d6-3aea41a707f4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8b839aa4-a8e5-4b00-a9d6-3aea41a707f4\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.234852 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b839aa4-a8e5-4b00-a9d6-3aea41a707f4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8b839aa4-a8e5-4b00-a9d6-3aea41a707f4\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.240678 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b839aa4-a8e5-4b00-a9d6-3aea41a707f4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8b839aa4-a8e5-4b00-a9d6-3aea41a707f4\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.245458 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b839aa4-a8e5-4b00-a9d6-3aea41a707f4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8b839aa4-a8e5-4b00-a9d6-3aea41a707f4\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.260951 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nk5km\" (UniqueName: \"kubernetes.io/projected/8b839aa4-a8e5-4b00-a9d6-3aea41a707f4-kube-api-access-nk5km\") pod \"nova-cell1-novncproxy-0\" (UID: \"8b839aa4-a8e5-4b00-a9d6-3aea41a707f4\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.391465 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.524003 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-s99q2"] Dec 09 10:23:31 crc kubenswrapper[5002]: W1209 10:23:31.545312 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod759658bf_fbc8_40b6_96a5_b691f2ecec64.slice/crio-0e33b126f382ea5ba98ff2305db64139c035df475787582194e452a5c649f2c5 WatchSource:0}: Error finding container 0e33b126f382ea5ba98ff2305db64139c035df475787582194e452a5c649f2c5: Status 404 returned error can't find the container with id 0e33b126f382ea5ba98ff2305db64139c035df475787582194e452a5c649f2c5 Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.689398 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-s99q2" event={"ID":"759658bf-fbc8-40b6-96a5-b691f2ecec64","Type":"ContainerStarted","Data":"0e33b126f382ea5ba98ff2305db64139c035df475787582194e452a5c649f2c5"} Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.710331 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-6ck56"] Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.711566 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-6ck56" Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.715880 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.716405 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.744398 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.759847 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qftg\" (UniqueName: \"kubernetes.io/projected/15a833c4-f8ea-4259-a659-a11ea55a8f88-kube-api-access-4qftg\") pod \"nova-cell1-conductor-db-sync-6ck56\" (UID: \"15a833c4-f8ea-4259-a659-a11ea55a8f88\") " pod="openstack/nova-cell1-conductor-db-sync-6ck56" Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.759982 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15a833c4-f8ea-4259-a659-a11ea55a8f88-scripts\") pod \"nova-cell1-conductor-db-sync-6ck56\" (UID: \"15a833c4-f8ea-4259-a659-a11ea55a8f88\") " pod="openstack/nova-cell1-conductor-db-sync-6ck56" Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.760019 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15a833c4-f8ea-4259-a659-a11ea55a8f88-config-data\") pod \"nova-cell1-conductor-db-sync-6ck56\" (UID: \"15a833c4-f8ea-4259-a659-a11ea55a8f88\") " pod="openstack/nova-cell1-conductor-db-sync-6ck56" Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.760071 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15a833c4-f8ea-4259-a659-a11ea55a8f88-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-6ck56\" (UID: 
\"15a833c4-f8ea-4259-a659-a11ea55a8f88\") " pod="openstack/nova-cell1-conductor-db-sync-6ck56" Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.774453 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-6ck56"] Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.802948 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-7dbbg"] Dec 09 10:23:31 crc kubenswrapper[5002]: W1209 10:23:31.835992 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0172af5_b021_469d_ac3a_fb73a0651f27.slice/crio-5dbf1c652d02050e856e7d5ddc49e119822c77aa7b7f7c7d56120fa1bcb45fbe WatchSource:0}: Error finding container 5dbf1c652d02050e856e7d5ddc49e119822c77aa7b7f7c7d56120fa1bcb45fbe: Status 404 returned error can't find the container with id 5dbf1c652d02050e856e7d5ddc49e119822c77aa7b7f7c7d56120fa1bcb45fbe Dec 09 10:23:31 crc kubenswrapper[5002]: W1209 10:23:31.837913 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod845e4d85_e58c_4ceb_a26f_5c918422c6a6.slice/crio-46ceec4db445b3a77e0becf039547aefbb678dcd96a1714e8b29224c66ff5ef7 WatchSource:0}: Error finding container 46ceec4db445b3a77e0becf039547aefbb678dcd96a1714e8b29224c66ff5ef7: Status 404 returned error can't find the container with id 46ceec4db445b3a77e0becf039547aefbb678dcd96a1714e8b29224c66ff5ef7 Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.839466 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.862192 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15a833c4-f8ea-4259-a659-a11ea55a8f88-scripts\") pod \"nova-cell1-conductor-db-sync-6ck56\" (UID: \"15a833c4-f8ea-4259-a659-a11ea55a8f88\") " pod="openstack/nova-cell1-conductor-db-sync-6ck56" Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.862233 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15a833c4-f8ea-4259-a659-a11ea55a8f88-config-data\") pod \"nova-cell1-conductor-db-sync-6ck56\" (UID: \"15a833c4-f8ea-4259-a659-a11ea55a8f88\") " pod="openstack/nova-cell1-conductor-db-sync-6ck56" Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.862264 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15a833c4-f8ea-4259-a659-a11ea55a8f88-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-6ck56\" (UID: \"15a833c4-f8ea-4259-a659-a11ea55a8f88\") " pod="openstack/nova-cell1-conductor-db-sync-6ck56" Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.862391 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qftg\" (UniqueName: \"kubernetes.io/projected/15a833c4-f8ea-4259-a659-a11ea55a8f88-kube-api-access-4qftg\") pod \"nova-cell1-conductor-db-sync-6ck56\" (UID: \"15a833c4-f8ea-4259-a659-a11ea55a8f88\") " pod="openstack/nova-cell1-conductor-db-sync-6ck56" Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.866699 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.868372 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/15a833c4-f8ea-4259-a659-a11ea55a8f88-scripts\") pod \"nova-cell1-conductor-db-sync-6ck56\" (UID: \"15a833c4-f8ea-4259-a659-a11ea55a8f88\") " pod="openstack/nova-cell1-conductor-db-sync-6ck56" Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.868589 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15a833c4-f8ea-4259-a659-a11ea55a8f88-config-data\") pod \"nova-cell1-conductor-db-sync-6ck56\" (UID: \"15a833c4-f8ea-4259-a659-a11ea55a8f88\") " pod="openstack/nova-cell1-conductor-db-sync-6ck56" Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.872910 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15a833c4-f8ea-4259-a659-a11ea55a8f88-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-6ck56\" (UID: \"15a833c4-f8ea-4259-a659-a11ea55a8f88\") " pod="openstack/nova-cell1-conductor-db-sync-6ck56" Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.879760 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qftg\" (UniqueName: \"kubernetes.io/projected/15a833c4-f8ea-4259-a659-a11ea55a8f88-kube-api-access-4qftg\") pod \"nova-cell1-conductor-db-sync-6ck56\" (UID: \"15a833c4-f8ea-4259-a659-a11ea55a8f88\") " pod="openstack/nova-cell1-conductor-db-sync-6ck56" Dec 09 10:23:31 crc kubenswrapper[5002]: I1209 10:23:31.995877 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 09 10:23:32 crc kubenswrapper[5002]: I1209 10:23:32.032087 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-6ck56" Dec 09 10:23:32 crc kubenswrapper[5002]: I1209 10:23:32.498436 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-6ck56"] Dec 09 10:23:32 crc kubenswrapper[5002]: W1209 10:23:32.513421 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15a833c4_f8ea_4259_a659_a11ea55a8f88.slice/crio-85ced4ea064aa41717fd1b683ad7794bc97a1c10fe6e2455e84789028720a02a WatchSource:0}: Error finding container 85ced4ea064aa41717fd1b683ad7794bc97a1c10fe6e2455e84789028720a02a: Status 404 returned error can't find the container with id 85ced4ea064aa41717fd1b683ad7794bc97a1c10fe6e2455e84789028720a02a Dec 09 10:23:32 crc kubenswrapper[5002]: I1209 10:23:32.711723 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-s99q2" event={"ID":"759658bf-fbc8-40b6-96a5-b691f2ecec64","Type":"ContainerStarted","Data":"53b30c4b17869586d3e315fd81ac0d1c658ddca2fa36d24ba53232a45f2431eb"} Dec 09 10:23:32 crc kubenswrapper[5002]: I1209 10:23:32.714220 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3d1c1529-5c04-4bea-b054-5a73781b0e56","Type":"ContainerStarted","Data":"e9de674912b9182c1dbf5ebddaa6e247523e04c4a231e5de133da3524affc0e4"} Dec 09 10:23:32 crc kubenswrapper[5002]: I1209 10:23:32.715454 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"845e4d85-e58c-4ceb-a26f-5c918422c6a6","Type":"ContainerStarted","Data":"46ceec4db445b3a77e0becf039547aefbb678dcd96a1714e8b29224c66ff5ef7"} Dec 09 10:23:32 crc kubenswrapper[5002]: I1209 10:23:32.719592 5002 generic.go:334] "Generic (PLEG): container finished" 
podID="ea94dfeb-8659-48ad-9f5a-da8202588f0f" containerID="f0293e5b5f7f5a7d5ec9c6b4dfce8e4f7f1b1faa0700b8fb01f00a08e3aa8546" exitCode=0 Dec 09 10:23:32 crc kubenswrapper[5002]: I1209 10:23:32.719719 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-7dbbg" event={"ID":"ea94dfeb-8659-48ad-9f5a-da8202588f0f","Type":"ContainerDied","Data":"f0293e5b5f7f5a7d5ec9c6b4dfce8e4f7f1b1faa0700b8fb01f00a08e3aa8546"} Dec 09 10:23:32 crc kubenswrapper[5002]: I1209 10:23:32.719776 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-7dbbg" event={"ID":"ea94dfeb-8659-48ad-9f5a-da8202588f0f","Type":"ContainerStarted","Data":"805aa91bb48d32efc4b8adc7fc3487b23a26eb3df9bfdfdab943bd052a42f4e4"} Dec 09 10:23:32 crc kubenswrapper[5002]: I1209 10:23:32.729733 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-s99q2" podStartSLOduration=2.72971196 podStartE2EDuration="2.72971196s" podCreationTimestamp="2025-12-09 10:23:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:23:32.72661718 +0000 UTC m=+1345.118668261" watchObservedRunningTime="2025-12-09 10:23:32.72971196 +0000 UTC m=+1345.121763041" Dec 09 10:23:32 crc kubenswrapper[5002]: I1209 10:23:32.730589 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a0172af5-b021-469d-ac3a-fb73a0651f27","Type":"ContainerStarted","Data":"5dbf1c652d02050e856e7d5ddc49e119822c77aa7b7f7c7d56120fa1bcb45fbe"} Dec 09 10:23:32 crc kubenswrapper[5002]: I1209 10:23:32.735983 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-6ck56" event={"ID":"15a833c4-f8ea-4259-a659-a11ea55a8f88","Type":"ContainerStarted","Data":"85ced4ea064aa41717fd1b683ad7794bc97a1c10fe6e2455e84789028720a02a"} Dec 09 10:23:32 crc kubenswrapper[5002]: I1209 10:23:32.737554 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8b839aa4-a8e5-4b00-a9d6-3aea41a707f4","Type":"ContainerStarted","Data":"5b58407fd49e574e6f6bfcabd307c75067505b4978b34c4f8e4c0fad822eece6"} Dec 09 10:23:33 crc kubenswrapper[5002]: I1209 10:23:33.747012 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-6ck56" event={"ID":"15a833c4-f8ea-4259-a659-a11ea55a8f88","Type":"ContainerStarted","Data":"a481319e648432bf60e9db982024715b443dfc82b14f612845eaada599692612"} Dec 09 10:23:33 crc kubenswrapper[5002]: I1209 10:23:33.748739 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-7dbbg" event={"ID":"ea94dfeb-8659-48ad-9f5a-da8202588f0f","Type":"ContainerStarted","Data":"31c0b2dcf0f9a9403dc1e061b2386dbfb708c1ba3620f0a47335b55cf6589902"} Dec 09 10:23:33 crc kubenswrapper[5002]: I1209 10:23:33.781609 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757b4f8459-7dbbg" podStartSLOduration=3.7815903 podStartE2EDuration="3.7815903s" podCreationTimestamp="2025-12-09 10:23:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:23:33.781031356 +0000 UTC m=+1346.173082437" watchObservedRunningTime="2025-12-09 10:23:33.7815903 +0000 UTC m=+1346.173641381" Dec 09 10:23:33 crc kubenswrapper[5002]: I1209 10:23:33.788127 5002 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-6ck56" podStartSLOduration=2.788112629 podStartE2EDuration="2.788112629s" podCreationTimestamp="2025-12-09 10:23:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:23:33.763731709 +0000 UTC m=+1346.155782800" watchObservedRunningTime="2025-12-09 10:23:33.788112629 +0000 UTC m=+1346.180163710" Dec 09 10:23:34 crc kubenswrapper[5002]: I1209 10:23:34.611364 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 10:23:34 crc kubenswrapper[5002]: I1209 10:23:34.631215 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 09 10:23:34 crc kubenswrapper[5002]: I1209 10:23:34.756720 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757b4f8459-7dbbg" Dec 09 10:23:36 crc kubenswrapper[5002]: I1209 10:23:36.799889 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8b839aa4-a8e5-4b00-a9d6-3aea41a707f4","Type":"ContainerStarted","Data":"4d48b17a98ab38f888972168c3d3c5347497af04bf3792b5903ed476cfc6eb00"} Dec 09 10:23:36 crc kubenswrapper[5002]: I1209 10:23:36.801937 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3d1c1529-5c04-4bea-b054-5a73781b0e56","Type":"ContainerStarted","Data":"08d2ebe98f25abdf69cdfccc2c50835449a00d2a32e249926fa0e11b23d79582"} Dec 09 10:23:36 crc kubenswrapper[5002]: I1209 10:23:36.804070 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"845e4d85-e58c-4ceb-a26f-5c918422c6a6","Type":"ContainerStarted","Data":"79384b122aa44c18780aee437bdeb1a0621649de1b77bc5c23b4cd94eb84ae33"} Dec 09 10:23:36 crc kubenswrapper[5002]: I1209 10:23:36.805894 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a0172af5-b021-469d-ac3a-fb73a0651f27","Type":"ContainerStarted","Data":"5b8a86d6555d11247d955b123911f432c8bf22e6801542802949c7fe12ad5569"} Dec 09 10:23:37 crc kubenswrapper[5002]: I1209 10:23:37.838113 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"845e4d85-e58c-4ceb-a26f-5c918422c6a6","Type":"ContainerStarted","Data":"8a4518abf93de8bdd2719c1c8f32badfc0d9d51af6d4f4eb804414d47cabacfa"} Dec 09 10:23:37 crc kubenswrapper[5002]: I1209 10:23:37.838280 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="845e4d85-e58c-4ceb-a26f-5c918422c6a6" containerName="nova-metadata-log" containerID="cri-o://79384b122aa44c18780aee437bdeb1a0621649de1b77bc5c23b4cd94eb84ae33" gracePeriod=30 Dec 09 10:23:37 crc kubenswrapper[5002]: I1209 10:23:37.838358 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="845e4d85-e58c-4ceb-a26f-5c918422c6a6" containerName="nova-metadata-metadata" containerID="cri-o://8a4518abf93de8bdd2719c1c8f32badfc0d9d51af6d4f4eb804414d47cabacfa" gracePeriod=30 Dec 09 10:23:37 crc kubenswrapper[5002]: I1209 10:23:37.846481 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a0172af5-b021-469d-ac3a-fb73a0651f27","Type":"ContainerStarted","Data":"665f9d1bc6efa80141e1dd38fd740fc3ef39099c939bb5863973c24f95545acf"} Dec 09 10:23:37 crc 
kubenswrapper[5002]: I1209 10:23:37.846630 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="8b839aa4-a8e5-4b00-a9d6-3aea41a707f4" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://4d48b17a98ab38f888972168c3d3c5347497af04bf3792b5903ed476cfc6eb00" gracePeriod=30 Dec 09 10:23:37 crc kubenswrapper[5002]: I1209 10:23:37.869439 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=4.025219369 podStartE2EDuration="7.869423287s" podCreationTimestamp="2025-12-09 10:23:30 +0000 UTC" firstStartedPulling="2025-12-09 10:23:31.842797136 +0000 UTC m=+1344.234848217" lastFinishedPulling="2025-12-09 10:23:35.687001054 +0000 UTC m=+1348.079052135" observedRunningTime="2025-12-09 10:23:37.864288865 +0000 UTC m=+1350.256339956" watchObservedRunningTime="2025-12-09 10:23:37.869423287 +0000 UTC m=+1350.261474358" Dec 09 10:23:37 crc kubenswrapper[5002]: I1209 10:23:37.883897 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=4.039826727 podStartE2EDuration="7.883879711s" podCreationTimestamp="2025-12-09 10:23:30 +0000 UTC" firstStartedPulling="2025-12-09 10:23:31.838150696 +0000 UTC m=+1344.230201767" lastFinishedPulling="2025-12-09 10:23:35.68220366 +0000 UTC m=+1348.074254751" observedRunningTime="2025-12-09 10:23:37.882453134 +0000 UTC m=+1350.274504235" watchObservedRunningTime="2025-12-09 10:23:37.883879711 +0000 UTC m=+1350.275930792" Dec 09 10:23:37 crc kubenswrapper[5002]: I1209 10:23:37.899803 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.948312284 podStartE2EDuration="7.899782071s" podCreationTimestamp="2025-12-09 10:23:30 +0000 UTC" firstStartedPulling="2025-12-09 10:23:31.732294474 +0000 UTC m=+1344.124345555" lastFinishedPulling="2025-12-09 10:23:35.683764241 +0000 UTC m=+1348.075815342" observedRunningTime="2025-12-09 10:23:37.895698106 +0000 UTC m=+1350.287749187" watchObservedRunningTime="2025-12-09 10:23:37.899782071 +0000 UTC m=+1350.291833152" Dec 09 10:23:37 crc kubenswrapper[5002]: I1209 10:23:37.919825 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=4.268319595 podStartE2EDuration="7.919796528s" podCreationTimestamp="2025-12-09 10:23:30 +0000 UTC" firstStartedPulling="2025-12-09 10:23:32.015254728 +0000 UTC m=+1344.407305809" lastFinishedPulling="2025-12-09 10:23:35.666731661 +0000 UTC m=+1348.058782742" observedRunningTime="2025-12-09 10:23:37.917309074 +0000 UTC m=+1350.309360155" watchObservedRunningTime="2025-12-09 10:23:37.919796528 +0000 UTC m=+1350.311847609" Dec 09 10:23:38 crc kubenswrapper[5002]: E1209 10:23:38.236605 5002 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod845e4d85_e58c_4ceb_a26f_5c918422c6a6.slice/crio-8a4518abf93de8bdd2719c1c8f32badfc0d9d51af6d4f4eb804414d47cabacfa.scope\": RecentStats: unable to find data in memory cache]" Dec 09 10:23:38 crc kubenswrapper[5002]: I1209 10:23:38.723575 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 09 10:23:38 crc kubenswrapper[5002]: I1209 10:23:38.810527 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b839aa4-a8e5-4b00-a9d6-3aea41a707f4-combined-ca-bundle\") pod \"8b839aa4-a8e5-4b00-a9d6-3aea41a707f4\" (UID: \"8b839aa4-a8e5-4b00-a9d6-3aea41a707f4\") " Dec 09 10:23:38 crc kubenswrapper[5002]: I1209 10:23:38.811001 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nk5km\" (UniqueName: \"kubernetes.io/projected/8b839aa4-a8e5-4b00-a9d6-3aea41a707f4-kube-api-access-nk5km\") pod \"8b839aa4-a8e5-4b00-a9d6-3aea41a707f4\" (UID: \"8b839aa4-a8e5-4b00-a9d6-3aea41a707f4\") " Dec 09 10:23:38 crc kubenswrapper[5002]: I1209 10:23:38.811174 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b839aa4-a8e5-4b00-a9d6-3aea41a707f4-config-data\") pod \"8b839aa4-a8e5-4b00-a9d6-3aea41a707f4\" (UID: \"8b839aa4-a8e5-4b00-a9d6-3aea41a707f4\") " Dec 09 10:23:38 crc kubenswrapper[5002]: I1209 10:23:38.829305 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b839aa4-a8e5-4b00-a9d6-3aea41a707f4-kube-api-access-nk5km" (OuterVolumeSpecName: "kube-api-access-nk5km") pod "8b839aa4-a8e5-4b00-a9d6-3aea41a707f4" (UID: "8b839aa4-a8e5-4b00-a9d6-3aea41a707f4"). InnerVolumeSpecName "kube-api-access-nk5km". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:23:38 crc kubenswrapper[5002]: I1209 10:23:38.850465 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b839aa4-a8e5-4b00-a9d6-3aea41a707f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b839aa4-a8e5-4b00-a9d6-3aea41a707f4" (UID: "8b839aa4-a8e5-4b00-a9d6-3aea41a707f4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:23:38 crc kubenswrapper[5002]: I1209 10:23:38.865028 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b839aa4-a8e5-4b00-a9d6-3aea41a707f4-config-data" (OuterVolumeSpecName: "config-data") pod "8b839aa4-a8e5-4b00-a9d6-3aea41a707f4" (UID: "8b839aa4-a8e5-4b00-a9d6-3aea41a707f4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:23:38 crc kubenswrapper[5002]: I1209 10:23:38.870646 5002 generic.go:334] "Generic (PLEG): container finished" podID="845e4d85-e58c-4ceb-a26f-5c918422c6a6" containerID="8a4518abf93de8bdd2719c1c8f32badfc0d9d51af6d4f4eb804414d47cabacfa" exitCode=0 Dec 09 10:23:38 crc kubenswrapper[5002]: I1209 10:23:38.870669 5002 generic.go:334] "Generic (PLEG): container finished" podID="845e4d85-e58c-4ceb-a26f-5c918422c6a6" containerID="79384b122aa44c18780aee437bdeb1a0621649de1b77bc5c23b4cd94eb84ae33" exitCode=143 Dec 09 10:23:38 crc kubenswrapper[5002]: I1209 10:23:38.870685 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"845e4d85-e58c-4ceb-a26f-5c918422c6a6","Type":"ContainerDied","Data":"8a4518abf93de8bdd2719c1c8f32badfc0d9d51af6d4f4eb804414d47cabacfa"} Dec 09 10:23:38 crc kubenswrapper[5002]: I1209 10:23:38.870732 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"845e4d85-e58c-4ceb-a26f-5c918422c6a6","Type":"ContainerDied","Data":"79384b122aa44c18780aee437bdeb1a0621649de1b77bc5c23b4cd94eb84ae33"} Dec 09 10:23:38 crc kubenswrapper[5002]: I1209 10:23:38.873001 5002 generic.go:334] "Generic (PLEG): container finished" podID="8b839aa4-a8e5-4b00-a9d6-3aea41a707f4" containerID="4d48b17a98ab38f888972168c3d3c5347497af04bf3792b5903ed476cfc6eb00" exitCode=0 Dec 09 10:23:38 crc kubenswrapper[5002]: I1209 10:23:38.873082 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 09 10:23:38 crc kubenswrapper[5002]: I1209 10:23:38.873160 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8b839aa4-a8e5-4b00-a9d6-3aea41a707f4","Type":"ContainerDied","Data":"4d48b17a98ab38f888972168c3d3c5347497af04bf3792b5903ed476cfc6eb00"} Dec 09 10:23:38 crc kubenswrapper[5002]: I1209 10:23:38.873207 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8b839aa4-a8e5-4b00-a9d6-3aea41a707f4","Type":"ContainerDied","Data":"5b58407fd49e574e6f6bfcabd307c75067505b4978b34c4f8e4c0fad822eece6"} Dec 09 10:23:38 crc kubenswrapper[5002]: I1209 10:23:38.873231 5002 scope.go:117] "RemoveContainer" containerID="4d48b17a98ab38f888972168c3d3c5347497af04bf3792b5903ed476cfc6eb00" Dec 09 10:23:38 crc kubenswrapper[5002]: I1209 10:23:38.910963 5002 scope.go:117] "RemoveContainer" containerID="4d48b17a98ab38f888972168c3d3c5347497af04bf3792b5903ed476cfc6eb00" Dec 09 10:23:38 crc kubenswrapper[5002]: I1209 10:23:38.913105 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b839aa4-a8e5-4b00-a9d6-3aea41a707f4-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:38 crc kubenswrapper[5002]: I1209 10:23:38.913126 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b839aa4-a8e5-4b00-a9d6-3aea41a707f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:38 crc kubenswrapper[5002]: I1209 10:23:38.913137 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nk5km\" (UniqueName: \"kubernetes.io/projected/8b839aa4-a8e5-4b00-a9d6-3aea41a707f4-kube-api-access-nk5km\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:38 crc kubenswrapper[5002]: E1209 10:23:38.915756 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc 
= could not find container \"4d48b17a98ab38f888972168c3d3c5347497af04bf3792b5903ed476cfc6eb00\": container with ID starting with 4d48b17a98ab38f888972168c3d3c5347497af04bf3792b5903ed476cfc6eb00 not found: ID does not exist" containerID="4d48b17a98ab38f888972168c3d3c5347497af04bf3792b5903ed476cfc6eb00"
Dec 09 10:23:38 crc kubenswrapper[5002]: I1209 10:23:38.915798 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d48b17a98ab38f888972168c3d3c5347497af04bf3792b5903ed476cfc6eb00"} err="failed to get container status \"4d48b17a98ab38f888972168c3d3c5347497af04bf3792b5903ed476cfc6eb00\": rpc error: code = NotFound desc = could not find container \"4d48b17a98ab38f888972168c3d3c5347497af04bf3792b5903ed476cfc6eb00\": container with ID starting with 4d48b17a98ab38f888972168c3d3c5347497af04bf3792b5903ed476cfc6eb00 not found: ID does not exist"
Dec 09 10:23:38 crc kubenswrapper[5002]: I1209 10:23:38.916434 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 09 10:23:38 crc kubenswrapper[5002]: I1209 10:23:38.951569 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 09 10:23:38 crc kubenswrapper[5002]: I1209 10:23:38.961375 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 09 10:23:38 crc kubenswrapper[5002]: E1209 10:23:38.962003 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b839aa4-a8e5-4b00-a9d6-3aea41a707f4" containerName="nova-cell1-novncproxy-novncproxy"
Dec 09 10:23:38 crc kubenswrapper[5002]: I1209 10:23:38.962096 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b839aa4-a8e5-4b00-a9d6-3aea41a707f4" containerName="nova-cell1-novncproxy-novncproxy"
Dec 09 10:23:38 crc kubenswrapper[5002]: I1209 10:23:38.962334 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b839aa4-a8e5-4b00-a9d6-3aea41a707f4" containerName="nova-cell1-novncproxy-novncproxy"
Dec 09 10:23:38 crc kubenswrapper[5002]: I1209 10:23:38.963044 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Dec 09 10:23:38 crc kubenswrapper[5002]: I1209 10:23:38.967898 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Dec 09 10:23:38 crc kubenswrapper[5002]: I1209 10:23:38.968064 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Dec 09 10:23:38 crc kubenswrapper[5002]: I1209 10:23:38.971308 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 09 10:23:38 crc kubenswrapper[5002]: I1209 10:23:38.972050 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Dec 09 10:23:39 crc kubenswrapper[5002]: I1209 10:23:39.015510 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b613f5a4-9369-45ae-8c2c-10e16e639999-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b613f5a4-9369-45ae-8c2c-10e16e639999\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 09 10:23:39 crc kubenswrapper[5002]: I1209 10:23:39.015595 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b613f5a4-9369-45ae-8c2c-10e16e639999-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b613f5a4-9369-45ae-8c2c-10e16e639999\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 09 10:23:39 crc kubenswrapper[5002]: I1209 10:23:39.015619 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b613f5a4-9369-45ae-8c2c-10e16e639999-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b613f5a4-9369-45ae-8c2c-10e16e639999\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 09 10:23:39 crc kubenswrapper[5002]: I1209 10:23:39.015682 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbmfh\" (UniqueName: \"kubernetes.io/projected/b613f5a4-9369-45ae-8c2c-10e16e639999-kube-api-access-mbmfh\") pod \"nova-cell1-novncproxy-0\" (UID: \"b613f5a4-9369-45ae-8c2c-10e16e639999\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 09 10:23:39 crc kubenswrapper[5002]: I1209 10:23:39.015712 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b613f5a4-9369-45ae-8c2c-10e16e639999-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b613f5a4-9369-45ae-8c2c-10e16e639999\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 09 10:23:39 crc kubenswrapper[5002]: I1209 10:23:39.084289 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 09 10:23:39 crc kubenswrapper[5002]: I1209 10:23:39.117337 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b613f5a4-9369-45ae-8c2c-10e16e639999-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b613f5a4-9369-45ae-8c2c-10e16e639999\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 09 10:23:39 crc kubenswrapper[5002]: I1209 10:23:39.117404 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b613f5a4-9369-45ae-8c2c-10e16e639999-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b613f5a4-9369-45ae-8c2c-10e16e639999\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 09 10:23:39 crc kubenswrapper[5002]: I1209 10:23:39.117437 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b613f5a4-9369-45ae-8c2c-10e16e639999-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b613f5a4-9369-45ae-8c2c-10e16e639999\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 09 10:23:39 crc kubenswrapper[5002]: I1209 10:23:39.117552 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbmfh\" (UniqueName: \"kubernetes.io/projected/b613f5a4-9369-45ae-8c2c-10e16e639999-kube-api-access-mbmfh\") pod \"nova-cell1-novncproxy-0\" (UID: \"b613f5a4-9369-45ae-8c2c-10e16e639999\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 09 10:23:39 crc kubenswrapper[5002]: I1209 10:23:39.117604 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b613f5a4-9369-45ae-8c2c-10e16e639999-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b613f5a4-9369-45ae-8c2c-10e16e639999\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 09 10:23:39 crc kubenswrapper[5002]: I1209 10:23:39.123639 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b613f5a4-9369-45ae-8c2c-10e16e639999-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b613f5a4-9369-45ae-8c2c-10e16e639999\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 09 10:23:39 crc kubenswrapper[5002]: I1209 10:23:39.127686 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b613f5a4-9369-45ae-8c2c-10e16e639999-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b613f5a4-9369-45ae-8c2c-10e16e639999\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 09 10:23:39 crc kubenswrapper[5002]: I1209 10:23:39.128021 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b613f5a4-9369-45ae-8c2c-10e16e639999-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b613f5a4-9369-45ae-8c2c-10e16e639999\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 09 10:23:39 crc kubenswrapper[5002]: I1209 10:23:39.128578 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b613f5a4-9369-45ae-8c2c-10e16e639999-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b613f5a4-9369-45ae-8c2c-10e16e639999\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 09 10:23:39 crc kubenswrapper[5002]: I1209 10:23:39.145541 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbmfh\" (UniqueName: \"kubernetes.io/projected/b613f5a4-9369-45ae-8c2c-10e16e639999-kube-api-access-mbmfh\") pod \"nova-cell1-novncproxy-0\" (UID: \"b613f5a4-9369-45ae-8c2c-10e16e639999\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 09 10:23:39 crc kubenswrapper[5002]: I1209 10:23:39.218584 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/845e4d85-e58c-4ceb-a26f-5c918422c6a6-combined-ca-bundle\") pod \"845e4d85-e58c-4ceb-a26f-5c918422c6a6\" (UID: \"845e4d85-e58c-4ceb-a26f-5c918422c6a6\") "
Dec 09 10:23:39 crc kubenswrapper[5002]: I1209 10:23:39.218714 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/845e4d85-e58c-4ceb-a26f-5c918422c6a6-config-data\") pod \"845e4d85-e58c-4ceb-a26f-5c918422c6a6\" (UID: \"845e4d85-e58c-4ceb-a26f-5c918422c6a6\") "
Dec 09 10:23:39 crc kubenswrapper[5002]: I1209 10:23:39.218882 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/845e4d85-e58c-4ceb-a26f-5c918422c6a6-logs\") pod \"845e4d85-e58c-4ceb-a26f-5c918422c6a6\" (UID: \"845e4d85-e58c-4ceb-a26f-5c918422c6a6\") "
Dec 09 10:23:39 crc kubenswrapper[5002]: I1209 10:23:39.218993 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdmgh\" (UniqueName: \"kubernetes.io/projected/845e4d85-e58c-4ceb-a26f-5c918422c6a6-kube-api-access-wdmgh\") pod \"845e4d85-e58c-4ceb-a26f-5c918422c6a6\" (UID: \"845e4d85-e58c-4ceb-a26f-5c918422c6a6\") "
Dec 09 10:23:39 crc kubenswrapper[5002]: I1209 10:23:39.222223 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/845e4d85-e58c-4ceb-a26f-5c918422c6a6-logs" (OuterVolumeSpecName: "logs") pod "845e4d85-e58c-4ceb-a26f-5c918422c6a6" (UID: "845e4d85-e58c-4ceb-a26f-5c918422c6a6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 10:23:39 crc kubenswrapper[5002]: I1209 10:23:39.222742 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/845e4d85-e58c-4ceb-a26f-5c918422c6a6-kube-api-access-wdmgh" (OuterVolumeSpecName: "kube-api-access-wdmgh") pod "845e4d85-e58c-4ceb-a26f-5c918422c6a6" (UID: "845e4d85-e58c-4ceb-a26f-5c918422c6a6"). InnerVolumeSpecName "kube-api-access-wdmgh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 10:23:39 crc kubenswrapper[5002]: I1209 10:23:39.256663 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/845e4d85-e58c-4ceb-a26f-5c918422c6a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "845e4d85-e58c-4ceb-a26f-5c918422c6a6" (UID: "845e4d85-e58c-4ceb-a26f-5c918422c6a6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 10:23:39 crc kubenswrapper[5002]: I1209 10:23:39.278172 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/845e4d85-e58c-4ceb-a26f-5c918422c6a6-config-data" (OuterVolumeSpecName: "config-data") pod "845e4d85-e58c-4ceb-a26f-5c918422c6a6" (UID: "845e4d85-e58c-4ceb-a26f-5c918422c6a6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 10:23:39 crc kubenswrapper[5002]: I1209 10:23:39.320910 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/845e4d85-e58c-4ceb-a26f-5c918422c6a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 09 10:23:39 crc kubenswrapper[5002]: I1209 10:23:39.321464 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/845e4d85-e58c-4ceb-a26f-5c918422c6a6-config-data\") on node \"crc\" DevicePath \"\""
Dec 09 10:23:39 crc kubenswrapper[5002]: I1209 10:23:39.321561 5002 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/845e4d85-e58c-4ceb-a26f-5c918422c6a6-logs\") on node \"crc\" DevicePath \"\""
Dec 09 10:23:39 crc kubenswrapper[5002]: I1209 10:23:39.321657 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdmgh\" (UniqueName: \"kubernetes.io/projected/845e4d85-e58c-4ceb-a26f-5c918422c6a6-kube-api-access-wdmgh\") on node \"crc\" DevicePath \"\""
Dec 09 10:23:39 crc kubenswrapper[5002]: I1209 10:23:39.379498 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Dec 09 10:23:39 crc kubenswrapper[5002]: I1209 10:23:39.683104 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 09 10:23:39 crc kubenswrapper[5002]: I1209 10:23:39.888654 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"845e4d85-e58c-4ceb-a26f-5c918422c6a6","Type":"ContainerDied","Data":"46ceec4db445b3a77e0becf039547aefbb678dcd96a1714e8b29224c66ff5ef7"}
Dec 09 10:23:39 crc kubenswrapper[5002]: I1209 10:23:39.888683 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 09 10:23:39 crc kubenswrapper[5002]: I1209 10:23:39.888713 5002 scope.go:117] "RemoveContainer" containerID="8a4518abf93de8bdd2719c1c8f32badfc0d9d51af6d4f4eb804414d47cabacfa"
Dec 09 10:23:39 crc kubenswrapper[5002]: I1209 10:23:39.890613 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b613f5a4-9369-45ae-8c2c-10e16e639999","Type":"ContainerStarted","Data":"95902ae3f27c573bbc6865a40cc0ada0249ce299ad6554c44da8c9b964e9464b"}
Dec 09 10:23:40 crc kubenswrapper[5002]: I1209 10:23:40.160126 5002 scope.go:117] "RemoveContainer" containerID="79384b122aa44c18780aee437bdeb1a0621649de1b77bc5c23b4cd94eb84ae33"
Dec 09 10:23:40 crc kubenswrapper[5002]: I1209 10:23:40.178408 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b839aa4-a8e5-4b00-a9d6-3aea41a707f4" path="/var/lib/kubelet/pods/8b839aa4-a8e5-4b00-a9d6-3aea41a707f4/volumes"
Dec 09 10:23:40 crc kubenswrapper[5002]: I1209 10:23:40.192965 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Dec 09 10:23:40 crc kubenswrapper[5002]: I1209 10:23:40.225191 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Dec 09 10:23:40 crc kubenswrapper[5002]: I1209 10:23:40.236700 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Dec 09 10:23:40 crc kubenswrapper[5002]: E1209 10:23:40.237147 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="845e4d85-e58c-4ceb-a26f-5c918422c6a6" containerName="nova-metadata-log"
Dec 09 10:23:40 crc kubenswrapper[5002]: I1209 10:23:40.237181 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="845e4d85-e58c-4ceb-a26f-5c918422c6a6" containerName="nova-metadata-log"
Dec 09 10:23:40 crc kubenswrapper[5002]: E1209 10:23:40.237213 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="845e4d85-e58c-4ceb-a26f-5c918422c6a6" containerName="nova-metadata-metadata"
Dec 09 10:23:40 crc kubenswrapper[5002]: I1209 10:23:40.237219 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="845e4d85-e58c-4ceb-a26f-5c918422c6a6" containerName="nova-metadata-metadata"
Dec 09 10:23:40 crc kubenswrapper[5002]: I1209 10:23:40.237386 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="845e4d85-e58c-4ceb-a26f-5c918422c6a6" containerName="nova-metadata-metadata"
Dec 09 10:23:40 crc kubenswrapper[5002]: I1209 10:23:40.237412 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="845e4d85-e58c-4ceb-a26f-5c918422c6a6" containerName="nova-metadata-log"
Dec 09 10:23:40 crc kubenswrapper[5002]: I1209 10:23:40.238415 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 09 10:23:40 crc kubenswrapper[5002]: I1209 10:23:40.242786 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Dec 09 10:23:40 crc kubenswrapper[5002]: I1209 10:23:40.243080 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Dec 09 10:23:40 crc kubenswrapper[5002]: I1209 10:23:40.245099 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 09 10:23:40 crc kubenswrapper[5002]: I1209 10:23:40.253978 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d110d9d-9cd6-4e9c-8b97-4c835ddfe226-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8d110d9d-9cd6-4e9c-8b97-4c835ddfe226\") " pod="openstack/nova-metadata-0"
Dec 09 10:23:40 crc kubenswrapper[5002]: I1209 10:23:40.254062 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6wdb\" (UniqueName: \"kubernetes.io/projected/8d110d9d-9cd6-4e9c-8b97-4c835ddfe226-kube-api-access-k6wdb\") pod \"nova-metadata-0\" (UID: \"8d110d9d-9cd6-4e9c-8b97-4c835ddfe226\") " pod="openstack/nova-metadata-0"
Dec 09 10:23:40 crc kubenswrapper[5002]: I1209 10:23:40.254093 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d110d9d-9cd6-4e9c-8b97-4c835ddfe226-config-data\") pod \"nova-metadata-0\" (UID: \"8d110d9d-9cd6-4e9c-8b97-4c835ddfe226\") " pod="openstack/nova-metadata-0"
Dec 09 10:23:40 crc kubenswrapper[5002]: I1209 10:23:40.254120 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d110d9d-9cd6-4e9c-8b97-4c835ddfe226-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8d110d9d-9cd6-4e9c-8b97-4c835ddfe226\") " pod="openstack/nova-metadata-0"
Dec 09 10:23:40 crc kubenswrapper[5002]: I1209 10:23:40.254144 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d110d9d-9cd6-4e9c-8b97-4c835ddfe226-logs\") pod \"nova-metadata-0\" (UID: \"8d110d9d-9cd6-4e9c-8b97-4c835ddfe226\") " pod="openstack/nova-metadata-0"
Dec 09 10:23:40 crc kubenswrapper[5002]: I1209 10:23:40.355550 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d110d9d-9cd6-4e9c-8b97-4c835ddfe226-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8d110d9d-9cd6-4e9c-8b97-4c835ddfe226\") " pod="openstack/nova-metadata-0"
Dec 09 10:23:40 crc kubenswrapper[5002]: I1209 10:23:40.355646 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6wdb\" (UniqueName: \"kubernetes.io/projected/8d110d9d-9cd6-4e9c-8b97-4c835ddfe226-kube-api-access-k6wdb\") pod \"nova-metadata-0\" (UID: \"8d110d9d-9cd6-4e9c-8b97-4c835ddfe226\") " pod="openstack/nova-metadata-0"
Dec 09 10:23:40 crc kubenswrapper[5002]: I1209 10:23:40.355678 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d110d9d-9cd6-4e9c-8b97-4c835ddfe226-config-data\") pod \"nova-metadata-0\" (UID: \"8d110d9d-9cd6-4e9c-8b97-4c835ddfe226\") " pod="openstack/nova-metadata-0"
pod="openstack/nova-metadata-0" Dec 09 10:23:40 crc kubenswrapper[5002]: I1209 10:23:40.355709 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d110d9d-9cd6-4e9c-8b97-4c835ddfe226-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8d110d9d-9cd6-4e9c-8b97-4c835ddfe226\") " pod="openstack/nova-metadata-0" Dec 09 10:23:40 crc kubenswrapper[5002]: I1209 10:23:40.355731 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d110d9d-9cd6-4e9c-8b97-4c835ddfe226-logs\") pod \"nova-metadata-0\" (UID: \"8d110d9d-9cd6-4e9c-8b97-4c835ddfe226\") " pod="openstack/nova-metadata-0" Dec 09 10:23:40 crc kubenswrapper[5002]: I1209 10:23:40.356237 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d110d9d-9cd6-4e9c-8b97-4c835ddfe226-logs\") pod \"nova-metadata-0\" (UID: \"8d110d9d-9cd6-4e9c-8b97-4c835ddfe226\") " pod="openstack/nova-metadata-0" Dec 09 10:23:40 crc kubenswrapper[5002]: I1209 10:23:40.360554 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d110d9d-9cd6-4e9c-8b97-4c835ddfe226-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8d110d9d-9cd6-4e9c-8b97-4c835ddfe226\") " pod="openstack/nova-metadata-0" Dec 09 10:23:40 crc kubenswrapper[5002]: I1209 10:23:40.360554 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d110d9d-9cd6-4e9c-8b97-4c835ddfe226-config-data\") pod \"nova-metadata-0\" (UID: \"8d110d9d-9cd6-4e9c-8b97-4c835ddfe226\") " pod="openstack/nova-metadata-0" Dec 09 10:23:40 crc kubenswrapper[5002]: I1209 10:23:40.361021 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d110d9d-9cd6-4e9c-8b97-4c835ddfe226-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8d110d9d-9cd6-4e9c-8b97-4c835ddfe226\") " pod="openstack/nova-metadata-0" Dec 09 10:23:40 crc kubenswrapper[5002]: I1209 10:23:40.378525 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6wdb\" (UniqueName: \"kubernetes.io/projected/8d110d9d-9cd6-4e9c-8b97-4c835ddfe226-kube-api-access-k6wdb\") pod \"nova-metadata-0\" (UID: \"8d110d9d-9cd6-4e9c-8b97-4c835ddfe226\") " pod="openstack/nova-metadata-0" Dec 09 10:23:40 crc kubenswrapper[5002]: I1209 10:23:40.566806 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 10:23:40 crc kubenswrapper[5002]: I1209 10:23:40.902685 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b613f5a4-9369-45ae-8c2c-10e16e639999","Type":"ContainerStarted","Data":"58ecfbecbf171e347e375d6249ce709b457a4fd2276c833f84808e224ef3e5ed"} Dec 09 10:23:40 crc kubenswrapper[5002]: I1209 10:23:40.905697 5002 generic.go:334] "Generic (PLEG): container finished" podID="759658bf-fbc8-40b6-96a5-b691f2ecec64" containerID="53b30c4b17869586d3e315fd81ac0d1c658ddca2fa36d24ba53232a45f2431eb" exitCode=0 Dec 09 10:23:40 crc kubenswrapper[5002]: I1209 10:23:40.905784 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-s99q2" event={"ID":"759658bf-fbc8-40b6-96a5-b691f2ecec64","Type":"ContainerDied","Data":"53b30c4b17869586d3e315fd81ac0d1c658ddca2fa36d24ba53232a45f2431eb"} Dec 09 10:23:40 crc kubenswrapper[5002]: I1209 10:23:40.926505 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.926487738 podStartE2EDuration="2.926487738s" podCreationTimestamp="2025-12-09 10:23:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:23:40.924687301 +0000 UTC m=+1353.316738382" watchObservedRunningTime="2025-12-09 10:23:40.926487738 +0000 UTC m=+1353.318538819" Dec 09 10:23:41 crc kubenswrapper[5002]: I1209 10:23:41.044558 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 10:23:41 crc kubenswrapper[5002]: I1209 10:23:41.057982 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757b4f8459-7dbbg" Dec 09 10:23:41 crc kubenswrapper[5002]: I1209 10:23:41.083907 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 09 10:23:41 crc kubenswrapper[5002]: I1209 10:23:41.083952 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 09 10:23:41 crc kubenswrapper[5002]: I1209 10:23:41.144231 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-7xtz7"] Dec 09 10:23:41 crc kubenswrapper[5002]: I1209 10:23:41.144460 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-7xtz7" podUID="7906530a-fb62-45dc-8b2c-f4c52045fa70" containerName="dnsmasq-dns" containerID="cri-o://2af731cf29a9b1447cebdde90c9d283671a3fa56e1feac1b8bda802337a372d2" gracePeriod=10 Dec 09 10:23:41 crc kubenswrapper[5002]: I1209 10:23:41.148191 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 09 10:23:41 crc kubenswrapper[5002]: I1209 10:23:41.148237 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 09 10:23:41 crc kubenswrapper[5002]: I1209 10:23:41.155594 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 09 10:23:42 crc kubenswrapper[5002]: I1209 10:23:42.043486 5002 generic.go:334] "Generic (PLEG): container finished" podID="7906530a-fb62-45dc-8b2c-f4c52045fa70" containerID="2af731cf29a9b1447cebdde90c9d283671a3fa56e1feac1b8bda802337a372d2" exitCode=0 Dec 09 10:23:42 crc kubenswrapper[5002]: I1209 10:23:42.044112 5002 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-7xtz7" event={"ID":"7906530a-fb62-45dc-8b2c-f4c52045fa70","Type":"ContainerDied","Data":"2af731cf29a9b1447cebdde90c9d283671a3fa56e1feac1b8bda802337a372d2"} Dec 09 10:23:42 crc kubenswrapper[5002]: I1209 10:23:42.064690 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8d110d9d-9cd6-4e9c-8b97-4c835ddfe226","Type":"ContainerStarted","Data":"10253f70171793efb67965614b3d2e26379b42893cc3a79baf848f97d46bae7f"} Dec 09 10:23:42 crc kubenswrapper[5002]: I1209 10:23:42.064731 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8d110d9d-9cd6-4e9c-8b97-4c835ddfe226","Type":"ContainerStarted","Data":"15896ca85688a43ee13e64bfa4484c68c0a84a760157fafcb157e56e5855aa2e"} Dec 09 10:23:42 crc kubenswrapper[5002]: I1209 10:23:42.064741 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8d110d9d-9cd6-4e9c-8b97-4c835ddfe226","Type":"ContainerStarted","Data":"e5e3802e24f38c3058e11a60e4e6919f83932822e74eb9206e8b0c0cb30aca9f"} Dec 09 10:23:42 crc kubenswrapper[5002]: I1209 10:23:42.088925 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="845e4d85-e58c-4ceb-a26f-5c918422c6a6" path="/var/lib/kubelet/pods/845e4d85-e58c-4ceb-a26f-5c918422c6a6/volumes" Dec 09 10:23:42 crc kubenswrapper[5002]: I1209 10:23:42.107606 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.107587365 podStartE2EDuration="2.107587365s" podCreationTimestamp="2025-12-09 10:23:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:23:42.098730737 +0000 UTC m=+1354.490781838" watchObservedRunningTime="2025-12-09 10:23:42.107587365 +0000 UTC m=+1354.499638436" Dec 09 10:23:42 crc kubenswrapper[5002]: I1209 10:23:42.113046 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 09 10:23:42 crc kubenswrapper[5002]: I1209 10:23:42.241995 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a0172af5-b021-469d-ac3a-fb73a0651f27" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.180:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 09 10:23:42 crc kubenswrapper[5002]: I1209 10:23:42.241995 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a0172af5-b021-469d-ac3a-fb73a0651f27" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.180:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 09 10:23:42 crc kubenswrapper[5002]: I1209 10:23:42.580642 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-7xtz7" Dec 09 10:23:42 crc kubenswrapper[5002]: I1209 10:23:42.598626 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-s99q2" Dec 09 10:23:42 crc kubenswrapper[5002]: I1209 10:23:42.627250 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/759658bf-fbc8-40b6-96a5-b691f2ecec64-combined-ca-bundle\") pod \"759658bf-fbc8-40b6-96a5-b691f2ecec64\" (UID: \"759658bf-fbc8-40b6-96a5-b691f2ecec64\") " Dec 09 10:23:42 crc kubenswrapper[5002]: I1209 10:23:42.627392 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4dkq\" (UniqueName: \"kubernetes.io/projected/759658bf-fbc8-40b6-96a5-b691f2ecec64-kube-api-access-k4dkq\") pod \"759658bf-fbc8-40b6-96a5-b691f2ecec64\" (UID: \"759658bf-fbc8-40b6-96a5-b691f2ecec64\") " Dec 09 10:23:42 crc kubenswrapper[5002]: I1209 10:23:42.627432 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7906530a-fb62-45dc-8b2c-f4c52045fa70-ovsdbserver-sb\") pod \"7906530a-fb62-45dc-8b2c-f4c52045fa70\" (UID: \"7906530a-fb62-45dc-8b2c-f4c52045fa70\") " Dec 09 10:23:42 crc kubenswrapper[5002]: I1209 10:23:42.627492 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7906530a-fb62-45dc-8b2c-f4c52045fa70-dns-svc\") pod \"7906530a-fb62-45dc-8b2c-f4c52045fa70\" (UID: \"7906530a-fb62-45dc-8b2c-f4c52045fa70\") " Dec 09 10:23:42 crc kubenswrapper[5002]: I1209 10:23:42.627537 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7906530a-fb62-45dc-8b2c-f4c52045fa70-ovsdbserver-nb\") pod \"7906530a-fb62-45dc-8b2c-f4c52045fa70\" (UID: \"7906530a-fb62-45dc-8b2c-f4c52045fa70\") " Dec 09 10:23:42 crc kubenswrapper[5002]: I1209 10:23:42.627585 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7906530a-fb62-45dc-8b2c-f4c52045fa70-config\") pod \"7906530a-fb62-45dc-8b2c-f4c52045fa70\" (UID: \"7906530a-fb62-45dc-8b2c-f4c52045fa70\") " Dec 09 10:23:42 crc kubenswrapper[5002]: I1209 10:23:42.627605 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/759658bf-fbc8-40b6-96a5-b691f2ecec64-scripts\") pod \"759658bf-fbc8-40b6-96a5-b691f2ecec64\" (UID: \"759658bf-fbc8-40b6-96a5-b691f2ecec64\") " Dec 09 10:23:42 crc kubenswrapper[5002]: I1209 10:23:42.627624 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/759658bf-fbc8-40b6-96a5-b691f2ecec64-config-data\") pod \"759658bf-fbc8-40b6-96a5-b691f2ecec64\" (UID: \"759658bf-fbc8-40b6-96a5-b691f2ecec64\") " Dec 09 10:23:42 crc kubenswrapper[5002]: I1209 10:23:42.627657 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7906530a-fb62-45dc-8b2c-f4c52045fa70-dns-swift-storage-0\") pod \"7906530a-fb62-45dc-8b2c-f4c52045fa70\" (UID: \"7906530a-fb62-45dc-8b2c-f4c52045fa70\") " Dec 09 10:23:42 crc kubenswrapper[5002]: I1209 10:23:42.627687 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpp4w\" (UniqueName: \"kubernetes.io/projected/7906530a-fb62-45dc-8b2c-f4c52045fa70-kube-api-access-dpp4w\") pod \"7906530a-fb62-45dc-8b2c-f4c52045fa70\" (UID: 
\"7906530a-fb62-45dc-8b2c-f4c52045fa70\") " Dec 09 10:23:42 crc kubenswrapper[5002]: I1209 10:23:42.644026 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7906530a-fb62-45dc-8b2c-f4c52045fa70-kube-api-access-dpp4w" (OuterVolumeSpecName: "kube-api-access-dpp4w") pod "7906530a-fb62-45dc-8b2c-f4c52045fa70" (UID: "7906530a-fb62-45dc-8b2c-f4c52045fa70"). InnerVolumeSpecName "kube-api-access-dpp4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:23:42 crc kubenswrapper[5002]: I1209 10:23:42.651639 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/759658bf-fbc8-40b6-96a5-b691f2ecec64-kube-api-access-k4dkq" (OuterVolumeSpecName: "kube-api-access-k4dkq") pod "759658bf-fbc8-40b6-96a5-b691f2ecec64" (UID: "759658bf-fbc8-40b6-96a5-b691f2ecec64"). InnerVolumeSpecName "kube-api-access-k4dkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:23:42 crc kubenswrapper[5002]: I1209 10:23:42.656258 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/759658bf-fbc8-40b6-96a5-b691f2ecec64-scripts" (OuterVolumeSpecName: "scripts") pod "759658bf-fbc8-40b6-96a5-b691f2ecec64" (UID: "759658bf-fbc8-40b6-96a5-b691f2ecec64"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:23:42 crc kubenswrapper[5002]: I1209 10:23:42.695657 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/759658bf-fbc8-40b6-96a5-b691f2ecec64-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "759658bf-fbc8-40b6-96a5-b691f2ecec64" (UID: "759658bf-fbc8-40b6-96a5-b691f2ecec64"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:23:42 crc kubenswrapper[5002]: I1209 10:23:42.707448 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7906530a-fb62-45dc-8b2c-f4c52045fa70-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7906530a-fb62-45dc-8b2c-f4c52045fa70" (UID: "7906530a-fb62-45dc-8b2c-f4c52045fa70"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:23:42 crc kubenswrapper[5002]: I1209 10:23:42.712528 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7906530a-fb62-45dc-8b2c-f4c52045fa70-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7906530a-fb62-45dc-8b2c-f4c52045fa70" (UID: "7906530a-fb62-45dc-8b2c-f4c52045fa70"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:23:42 crc kubenswrapper[5002]: I1209 10:23:42.721020 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/759658bf-fbc8-40b6-96a5-b691f2ecec64-config-data" (OuterVolumeSpecName: "config-data") pod "759658bf-fbc8-40b6-96a5-b691f2ecec64" (UID: "759658bf-fbc8-40b6-96a5-b691f2ecec64"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:23:42 crc kubenswrapper[5002]: I1209 10:23:42.729100 5002 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7906530a-fb62-45dc-8b2c-f4c52045fa70-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:42 crc kubenswrapper[5002]: I1209 10:23:42.729134 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/759658bf-fbc8-40b6-96a5-b691f2ecec64-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:42 crc kubenswrapper[5002]: I1209 10:23:42.729143 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/759658bf-fbc8-40b6-96a5-b691f2ecec64-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:42 crc kubenswrapper[5002]: I1209 10:23:42.729154 5002 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7906530a-fb62-45dc-8b2c-f4c52045fa70-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:42 crc kubenswrapper[5002]: I1209 10:23:42.729164 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpp4w\" (UniqueName: \"kubernetes.io/projected/7906530a-fb62-45dc-8b2c-f4c52045fa70-kube-api-access-dpp4w\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:42 crc kubenswrapper[5002]: I1209 10:23:42.729174 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/759658bf-fbc8-40b6-96a5-b691f2ecec64-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:42 crc kubenswrapper[5002]: I1209 10:23:42.729181 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4dkq\" (UniqueName: \"kubernetes.io/projected/759658bf-fbc8-40b6-96a5-b691f2ecec64-kube-api-access-k4dkq\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:42 crc kubenswrapper[5002]: I1209 10:23:42.736940 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7906530a-fb62-45dc-8b2c-f4c52045fa70-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7906530a-fb62-45dc-8b2c-f4c52045fa70" (UID: "7906530a-fb62-45dc-8b2c-f4c52045fa70"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:23:42 crc kubenswrapper[5002]: I1209 10:23:42.749940 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7906530a-fb62-45dc-8b2c-f4c52045fa70-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7906530a-fb62-45dc-8b2c-f4c52045fa70" (UID: "7906530a-fb62-45dc-8b2c-f4c52045fa70"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:23:42 crc kubenswrapper[5002]: I1209 10:23:42.775332 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7906530a-fb62-45dc-8b2c-f4c52045fa70-config" (OuterVolumeSpecName: "config") pod "7906530a-fb62-45dc-8b2c-f4c52045fa70" (UID: "7906530a-fb62-45dc-8b2c-f4c52045fa70"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:23:42 crc kubenswrapper[5002]: I1209 10:23:42.830908 5002 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7906530a-fb62-45dc-8b2c-f4c52045fa70-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:42 crc kubenswrapper[5002]: I1209 10:23:42.830948 5002 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7906530a-fb62-45dc-8b2c-f4c52045fa70-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:42 crc kubenswrapper[5002]: I1209 10:23:42.830960 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7906530a-fb62-45dc-8b2c-f4c52045fa70-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:43 crc kubenswrapper[5002]: I1209 10:23:43.075770 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-s99q2" event={"ID":"759658bf-fbc8-40b6-96a5-b691f2ecec64","Type":"ContainerDied","Data":"0e33b126f382ea5ba98ff2305db64139c035df475787582194e452a5c649f2c5"} Dec 09 10:23:43 crc kubenswrapper[5002]: I1209 10:23:43.075840 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e33b126f382ea5ba98ff2305db64139c035df475787582194e452a5c649f2c5" Dec 09 10:23:43 crc kubenswrapper[5002]: I1209 10:23:43.075855 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-s99q2" Dec 09 10:23:43 crc kubenswrapper[5002]: I1209 10:23:43.077893 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-7xtz7" event={"ID":"7906530a-fb62-45dc-8b2c-f4c52045fa70","Type":"ContainerDied","Data":"0df69a12aef296c6624d0f0bf0a7d1ab732d06e3ccb07659252efca566e9a74a"} Dec 09 10:23:43 crc kubenswrapper[5002]: I1209 10:23:43.077936 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-7xtz7" Dec 09 10:23:43 crc kubenswrapper[5002]: I1209 10:23:43.077943 5002 scope.go:117] "RemoveContainer" containerID="2af731cf29a9b1447cebdde90c9d283671a3fa56e1feac1b8bda802337a372d2" Dec 09 10:23:43 crc kubenswrapper[5002]: I1209 10:23:43.100433 5002 scope.go:117] "RemoveContainer" containerID="c82c4f8c5fb3dcb52f79b0e7b2fda9f4362e5b057a2b7bf1f7117129a50248b8" Dec 09 10:23:43 crc kubenswrapper[5002]: I1209 10:23:43.124558 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-7xtz7"] Dec 09 10:23:43 crc kubenswrapper[5002]: I1209 10:23:43.175905 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-7xtz7"] Dec 09 10:23:43 crc kubenswrapper[5002]: I1209 10:23:43.192916 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 09 10:23:43 crc kubenswrapper[5002]: I1209 10:23:43.193205 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a0172af5-b021-469d-ac3a-fb73a0651f27" containerName="nova-api-log" containerID="cri-o://5b8a86d6555d11247d955b123911f432c8bf22e6801542802949c7fe12ad5569" gracePeriod=30 Dec 09 10:23:43 crc kubenswrapper[5002]: I1209 10:23:43.193518 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a0172af5-b021-469d-ac3a-fb73a0651f27" containerName="nova-api-api" containerID="cri-o://665f9d1bc6efa80141e1dd38fd740fc3ef39099c939bb5863973c24f95545acf" gracePeriod=30 Dec 09 10:23:43 crc kubenswrapper[5002]: I1209 10:23:43.207916 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 10:23:43 crc kubenswrapper[5002]: I1209 10:23:43.217090 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 10:23:44 crc kubenswrapper[5002]: I1209 10:23:44.071079 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7906530a-fb62-45dc-8b2c-f4c52045fa70" path="/var/lib/kubelet/pods/7906530a-fb62-45dc-8b2c-f4c52045fa70/volumes" Dec 09 10:23:44 crc kubenswrapper[5002]: I1209 10:23:44.088522 5002 generic.go:334] "Generic (PLEG): container finished" podID="a0172af5-b021-469d-ac3a-fb73a0651f27" containerID="5b8a86d6555d11247d955b123911f432c8bf22e6801542802949c7fe12ad5569" exitCode=143 Dec 09 10:23:44 crc kubenswrapper[5002]: I1209 10:23:44.088584 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a0172af5-b021-469d-ac3a-fb73a0651f27","Type":"ContainerDied","Data":"5b8a86d6555d11247d955b123911f432c8bf22e6801542802949c7fe12ad5569"} Dec 09 10:23:44 crc kubenswrapper[5002]: I1209 10:23:44.088688 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="3d1c1529-5c04-4bea-b054-5a73781b0e56" containerName="nova-scheduler-scheduler" containerID="cri-o://08d2ebe98f25abdf69cdfccc2c50835449a00d2a32e249926fa0e11b23d79582" gracePeriod=30 Dec 09 10:23:44 crc kubenswrapper[5002]: I1209 10:23:44.089151 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8d110d9d-9cd6-4e9c-8b97-4c835ddfe226" containerName="nova-metadata-log" containerID="cri-o://15896ca85688a43ee13e64bfa4484c68c0a84a760157fafcb157e56e5855aa2e" gracePeriod=30 Dec 09 10:23:44 crc kubenswrapper[5002]: I1209 10:23:44.089330 5002 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-metadata-0" podUID="8d110d9d-9cd6-4e9c-8b97-4c835ddfe226" containerName="nova-metadata-metadata" containerID="cri-o://10253f70171793efb67965614b3d2e26379b42893cc3a79baf848f97d46bae7f" gracePeriod=30 Dec 09 10:23:44 crc kubenswrapper[5002]: I1209 10:23:44.380993 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 09 10:23:44 crc kubenswrapper[5002]: I1209 10:23:44.673903 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 10:23:44 crc kubenswrapper[5002]: I1209 10:23:44.780768 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6wdb\" (UniqueName: \"kubernetes.io/projected/8d110d9d-9cd6-4e9c-8b97-4c835ddfe226-kube-api-access-k6wdb\") pod \"8d110d9d-9cd6-4e9c-8b97-4c835ddfe226\" (UID: \"8d110d9d-9cd6-4e9c-8b97-4c835ddfe226\") " Dec 09 10:23:44 crc kubenswrapper[5002]: I1209 10:23:44.780961 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d110d9d-9cd6-4e9c-8b97-4c835ddfe226-combined-ca-bundle\") pod \"8d110d9d-9cd6-4e9c-8b97-4c835ddfe226\" (UID: \"8d110d9d-9cd6-4e9c-8b97-4c835ddfe226\") " Dec 09 10:23:44 crc kubenswrapper[5002]: I1209 10:23:44.781078 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d110d9d-9cd6-4e9c-8b97-4c835ddfe226-logs\") pod \"8d110d9d-9cd6-4e9c-8b97-4c835ddfe226\" (UID: \"8d110d9d-9cd6-4e9c-8b97-4c835ddfe226\") " Dec 09 10:23:44 crc kubenswrapper[5002]: I1209 10:23:44.781163 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d110d9d-9cd6-4e9c-8b97-4c835ddfe226-config-data\") pod \"8d110d9d-9cd6-4e9c-8b97-4c835ddfe226\" (UID: \"8d110d9d-9cd6-4e9c-8b97-4c835ddfe226\") " Dec 09 10:23:44 crc kubenswrapper[5002]: I1209 10:23:44.781226 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d110d9d-9cd6-4e9c-8b97-4c835ddfe226-nova-metadata-tls-certs\") pod \"8d110d9d-9cd6-4e9c-8b97-4c835ddfe226\" (UID: \"8d110d9d-9cd6-4e9c-8b97-4c835ddfe226\") " Dec 09 10:23:44 crc kubenswrapper[5002]: I1209 10:23:44.781382 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d110d9d-9cd6-4e9c-8b97-4c835ddfe226-logs" (OuterVolumeSpecName: "logs") pod "8d110d9d-9cd6-4e9c-8b97-4c835ddfe226" (UID: "8d110d9d-9cd6-4e9c-8b97-4c835ddfe226"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:23:44 crc kubenswrapper[5002]: I1209 10:23:44.781684 5002 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d110d9d-9cd6-4e9c-8b97-4c835ddfe226-logs\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:44 crc kubenswrapper[5002]: I1209 10:23:44.789800 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d110d9d-9cd6-4e9c-8b97-4c835ddfe226-kube-api-access-k6wdb" (OuterVolumeSpecName: "kube-api-access-k6wdb") pod "8d110d9d-9cd6-4e9c-8b97-4c835ddfe226" (UID: "8d110d9d-9cd6-4e9c-8b97-4c835ddfe226"). InnerVolumeSpecName "kube-api-access-k6wdb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:23:44 crc kubenswrapper[5002]: I1209 10:23:44.812496 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d110d9d-9cd6-4e9c-8b97-4c835ddfe226-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8d110d9d-9cd6-4e9c-8b97-4c835ddfe226" (UID: "8d110d9d-9cd6-4e9c-8b97-4c835ddfe226"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:23:44 crc kubenswrapper[5002]: I1209 10:23:44.813255 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d110d9d-9cd6-4e9c-8b97-4c835ddfe226-config-data" (OuterVolumeSpecName: "config-data") pod "8d110d9d-9cd6-4e9c-8b97-4c835ddfe226" (UID: "8d110d9d-9cd6-4e9c-8b97-4c835ddfe226"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:23:44 crc kubenswrapper[5002]: I1209 10:23:44.838926 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d110d9d-9cd6-4e9c-8b97-4c835ddfe226-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "8d110d9d-9cd6-4e9c-8b97-4c835ddfe226" (UID: "8d110d9d-9cd6-4e9c-8b97-4c835ddfe226"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:23:44 crc kubenswrapper[5002]: I1209 10:23:44.883702 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d110d9d-9cd6-4e9c-8b97-4c835ddfe226-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:44 crc kubenswrapper[5002]: I1209 10:23:44.884018 5002 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d110d9d-9cd6-4e9c-8b97-4c835ddfe226-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:44 crc kubenswrapper[5002]: I1209 10:23:44.884146 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6wdb\" (UniqueName: \"kubernetes.io/projected/8d110d9d-9cd6-4e9c-8b97-4c835ddfe226-kube-api-access-k6wdb\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:44 crc kubenswrapper[5002]: I1209 10:23:44.884308 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d110d9d-9cd6-4e9c-8b97-4c835ddfe226-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.102305 5002 generic.go:334] "Generic (PLEG): container finished" podID="8d110d9d-9cd6-4e9c-8b97-4c835ddfe226" containerID="10253f70171793efb67965614b3d2e26379b42893cc3a79baf848f97d46bae7f" exitCode=0 Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.103146 5002 generic.go:334] "Generic (PLEG): container finished" podID="8d110d9d-9cd6-4e9c-8b97-4c835ddfe226" containerID="15896ca85688a43ee13e64bfa4484c68c0a84a760157fafcb157e56e5855aa2e" exitCode=143 Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.103222 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8d110d9d-9cd6-4e9c-8b97-4c835ddfe226","Type":"ContainerDied","Data":"10253f70171793efb67965614b3d2e26379b42893cc3a79baf848f97d46bae7f"} Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.103326 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"8d110d9d-9cd6-4e9c-8b97-4c835ddfe226","Type":"ContainerDied","Data":"15896ca85688a43ee13e64bfa4484c68c0a84a760157fafcb157e56e5855aa2e"} Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.103451 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8d110d9d-9cd6-4e9c-8b97-4c835ddfe226","Type":"ContainerDied","Data":"e5e3802e24f38c3058e11a60e4e6919f83932822e74eb9206e8b0c0cb30aca9f"} Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.103540 5002 scope.go:117] "RemoveContainer" containerID="10253f70171793efb67965614b3d2e26379b42893cc3a79baf848f97d46bae7f" Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.103744 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.178045 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.184446 5002 scope.go:117] "RemoveContainer" containerID="15896ca85688a43ee13e64bfa4484c68c0a84a760157fafcb157e56e5855aa2e" Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.187395 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.211312 5002 scope.go:117] "RemoveContainer" containerID="10253f70171793efb67965614b3d2e26379b42893cc3a79baf848f97d46bae7f" Dec 09 10:23:45 crc kubenswrapper[5002]: E1209 10:23:45.212754 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10253f70171793efb67965614b3d2e26379b42893cc3a79baf848f97d46bae7f\": container with ID starting with 10253f70171793efb67965614b3d2e26379b42893cc3a79baf848f97d46bae7f not found: ID does not exist" containerID="10253f70171793efb67965614b3d2e26379b42893cc3a79baf848f97d46bae7f" Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.212795 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10253f70171793efb67965614b3d2e26379b42893cc3a79baf848f97d46bae7f"} err="failed to get container status \"10253f70171793efb67965614b3d2e26379b42893cc3a79baf848f97d46bae7f\": rpc error: code = NotFound desc = could not find container \"10253f70171793efb67965614b3d2e26379b42893cc3a79baf848f97d46bae7f\": container with ID starting with 10253f70171793efb67965614b3d2e26379b42893cc3a79baf848f97d46bae7f not found: ID does not exist" Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.212841 5002 scope.go:117] "RemoveContainer" containerID="15896ca85688a43ee13e64bfa4484c68c0a84a760157fafcb157e56e5855aa2e" Dec 09 10:23:45 crc kubenswrapper[5002]: E1209 10:23:45.213053 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15896ca85688a43ee13e64bfa4484c68c0a84a760157fafcb157e56e5855aa2e\": container with ID starting with 15896ca85688a43ee13e64bfa4484c68c0a84a760157fafcb157e56e5855aa2e not found: ID does not exist" containerID="15896ca85688a43ee13e64bfa4484c68c0a84a760157fafcb157e56e5855aa2e" Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.213073 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15896ca85688a43ee13e64bfa4484c68c0a84a760157fafcb157e56e5855aa2e"} err="failed to get container status \"15896ca85688a43ee13e64bfa4484c68c0a84a760157fafcb157e56e5855aa2e\": rpc error: code = NotFound desc = could not find 
container \"15896ca85688a43ee13e64bfa4484c68c0a84a760157fafcb157e56e5855aa2e\": container with ID starting with 15896ca85688a43ee13e64bfa4484c68c0a84a760157fafcb157e56e5855aa2e not found: ID does not exist" Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.213089 5002 scope.go:117] "RemoveContainer" containerID="10253f70171793efb67965614b3d2e26379b42893cc3a79baf848f97d46bae7f" Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.213293 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10253f70171793efb67965614b3d2e26379b42893cc3a79baf848f97d46bae7f"} err="failed to get container status \"10253f70171793efb67965614b3d2e26379b42893cc3a79baf848f97d46bae7f\": rpc error: code = NotFound desc = could not find container \"10253f70171793efb67965614b3d2e26379b42893cc3a79baf848f97d46bae7f\": container with ID starting with 10253f70171793efb67965614b3d2e26379b42893cc3a79baf848f97d46bae7f not found: ID does not exist" Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.213338 5002 scope.go:117] "RemoveContainer" containerID="15896ca85688a43ee13e64bfa4484c68c0a84a760157fafcb157e56e5855aa2e" Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.213558 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15896ca85688a43ee13e64bfa4484c68c0a84a760157fafcb157e56e5855aa2e"} err="failed to get container status \"15896ca85688a43ee13e64bfa4484c68c0a84a760157fafcb157e56e5855aa2e\": rpc error: code = NotFound desc = could not find container \"15896ca85688a43ee13e64bfa4484c68c0a84a760157fafcb157e56e5855aa2e\": container with ID starting with 15896ca85688a43ee13e64bfa4484c68c0a84a760157fafcb157e56e5855aa2e not found: ID does not exist" Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.231299 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 09 10:23:45 crc kubenswrapper[5002]: E1209 10:23:45.231802 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7906530a-fb62-45dc-8b2c-f4c52045fa70" containerName="dnsmasq-dns" Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.231837 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="7906530a-fb62-45dc-8b2c-f4c52045fa70" containerName="dnsmasq-dns" Dec 09 10:23:45 crc kubenswrapper[5002]: E1209 10:23:45.231857 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7906530a-fb62-45dc-8b2c-f4c52045fa70" containerName="init" Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.231903 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="7906530a-fb62-45dc-8b2c-f4c52045fa70" containerName="init" Dec 09 10:23:45 crc kubenswrapper[5002]: E1209 10:23:45.231923 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="759658bf-fbc8-40b6-96a5-b691f2ecec64" containerName="nova-manage" Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.231930 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="759658bf-fbc8-40b6-96a5-b691f2ecec64" containerName="nova-manage" Dec 09 10:23:45 crc kubenswrapper[5002]: E1209 10:23:45.232076 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d110d9d-9cd6-4e9c-8b97-4c835ddfe226" containerName="nova-metadata-metadata" Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.232104 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d110d9d-9cd6-4e9c-8b97-4c835ddfe226" containerName="nova-metadata-metadata" Dec 09 10:23:45 crc kubenswrapper[5002]: E1209 10:23:45.232121 5002 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="8d110d9d-9cd6-4e9c-8b97-4c835ddfe226" containerName="nova-metadata-log" Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.232128 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d110d9d-9cd6-4e9c-8b97-4c835ddfe226" containerName="nova-metadata-log" Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.232334 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d110d9d-9cd6-4e9c-8b97-4c835ddfe226" containerName="nova-metadata-metadata" Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.232350 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="7906530a-fb62-45dc-8b2c-f4c52045fa70" containerName="dnsmasq-dns" Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.232364 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d110d9d-9cd6-4e9c-8b97-4c835ddfe226" containerName="nova-metadata-log" Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.232387 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="759658bf-fbc8-40b6-96a5-b691f2ecec64" containerName="nova-manage" Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.233585 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.241217 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.248110 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.250654 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.290630 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b-logs\") pod \"nova-metadata-0\" (UID: \"f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b\") " pod="openstack/nova-metadata-0" Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.290720 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b\") " pod="openstack/nova-metadata-0" Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.290753 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws6zs\" (UniqueName: \"kubernetes.io/projected/f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b-kube-api-access-ws6zs\") pod \"nova-metadata-0\" (UID: \"f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b\") " pod="openstack/nova-metadata-0" Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.290836 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b-config-data\") pod \"nova-metadata-0\" (UID: \"f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b\") " pod="openstack/nova-metadata-0" Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.290911 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b\") " pod="openstack/nova-metadata-0" Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.392691 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b-logs\") pod \"nova-metadata-0\" (UID: \"f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b\") " pod="openstack/nova-metadata-0" Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.393087 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b\") " pod="openstack/nova-metadata-0" Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.393113 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws6zs\" (UniqueName: \"kubernetes.io/projected/f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b-kube-api-access-ws6zs\") pod \"nova-metadata-0\" (UID: \"f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b\") " pod="openstack/nova-metadata-0" Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.393155 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b-config-data\") pod \"nova-metadata-0\" (UID: \"f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b\") " pod="openstack/nova-metadata-0" Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.393211 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b\") " pod="openstack/nova-metadata-0" Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.393259 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b-logs\") pod \"nova-metadata-0\" (UID: \"f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b\") " pod="openstack/nova-metadata-0" Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.397141 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b\") " pod="openstack/nova-metadata-0" Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.397203 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b-config-data\") pod \"nova-metadata-0\" (UID: \"f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b\") " pod="openstack/nova-metadata-0" Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.398471 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b\") " pod="openstack/nova-metadata-0" Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.403627 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.409858 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws6zs\" (UniqueName: \"kubernetes.io/projected/f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b-kube-api-access-ws6zs\") pod \"nova-metadata-0\" (UID: \"f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b\") " pod="openstack/nova-metadata-0" Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.494786 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqrdk\" (UniqueName: \"kubernetes.io/projected/3d1c1529-5c04-4bea-b054-5a73781b0e56-kube-api-access-xqrdk\") pod \"3d1c1529-5c04-4bea-b054-5a73781b0e56\" (UID: \"3d1c1529-5c04-4bea-b054-5a73781b0e56\") " Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.494863 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1c1529-5c04-4bea-b054-5a73781b0e56-combined-ca-bundle\") pod \"3d1c1529-5c04-4bea-b054-5a73781b0e56\" (UID: \"3d1c1529-5c04-4bea-b054-5a73781b0e56\") " Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.495017 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d1c1529-5c04-4bea-b054-5a73781b0e56-config-data\") pod \"3d1c1529-5c04-4bea-b054-5a73781b0e56\" (UID: \"3d1c1529-5c04-4bea-b054-5a73781b0e56\") " Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.497539 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d1c1529-5c04-4bea-b054-5a73781b0e56-kube-api-access-xqrdk" (OuterVolumeSpecName: "kube-api-access-xqrdk") pod "3d1c1529-5c04-4bea-b054-5a73781b0e56" (UID: "3d1c1529-5c04-4bea-b054-5a73781b0e56"). InnerVolumeSpecName "kube-api-access-xqrdk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.519609 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d1c1529-5c04-4bea-b054-5a73781b0e56-config-data" (OuterVolumeSpecName: "config-data") pod "3d1c1529-5c04-4bea-b054-5a73781b0e56" (UID: "3d1c1529-5c04-4bea-b054-5a73781b0e56"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.520843 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d1c1529-5c04-4bea-b054-5a73781b0e56-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d1c1529-5c04-4bea-b054-5a73781b0e56" (UID: "3d1c1529-5c04-4bea-b054-5a73781b0e56"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.555317 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.599220 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d1c1529-5c04-4bea-b054-5a73781b0e56-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.599253 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqrdk\" (UniqueName: \"kubernetes.io/projected/3d1c1529-5c04-4bea-b054-5a73781b0e56-kube-api-access-xqrdk\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:45 crc kubenswrapper[5002]: I1209 10:23:45.599266 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1c1529-5c04-4bea-b054-5a73781b0e56-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:46 crc kubenswrapper[5002]: I1209 10:23:46.030968 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 10:23:46 crc kubenswrapper[5002]: W1209 10:23:46.034633 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf25c15ad_ecf7_44b0_b68c_06a00ad0ab9b.slice/crio-7dde305264ab5f31fca8ca2a9dc8095f30989349501c79d8a49ac78269d8cfa9 WatchSource:0}: Error finding container 7dde305264ab5f31fca8ca2a9dc8095f30989349501c79d8a49ac78269d8cfa9: Status 404 returned error can't find the container with id 7dde305264ab5f31fca8ca2a9dc8095f30989349501c79d8a49ac78269d8cfa9 Dec 09 10:23:46 crc kubenswrapper[5002]: I1209 10:23:46.074481 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d110d9d-9cd6-4e9c-8b97-4c835ddfe226" path="/var/lib/kubelet/pods/8d110d9d-9cd6-4e9c-8b97-4c835ddfe226/volumes" Dec 09 10:23:46 crc kubenswrapper[5002]: I1209 10:23:46.115748 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b","Type":"ContainerStarted","Data":"7dde305264ab5f31fca8ca2a9dc8095f30989349501c79d8a49ac78269d8cfa9"} Dec 09 10:23:46 crc kubenswrapper[5002]: I1209 10:23:46.117572 5002 generic.go:334] "Generic (PLEG): container finished" podID="3d1c1529-5c04-4bea-b054-5a73781b0e56" containerID="08d2ebe98f25abdf69cdfccc2c50835449a00d2a32e249926fa0e11b23d79582" exitCode=0 Dec 09 10:23:46 crc kubenswrapper[5002]: I1209 10:23:46.117650 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3d1c1529-5c04-4bea-b054-5a73781b0e56","Type":"ContainerDied","Data":"08d2ebe98f25abdf69cdfccc2c50835449a00d2a32e249926fa0e11b23d79582"} Dec 09 10:23:46 crc kubenswrapper[5002]: I1209 10:23:46.117673 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3d1c1529-5c04-4bea-b054-5a73781b0e56","Type":"ContainerDied","Data":"e9de674912b9182c1dbf5ebddaa6e247523e04c4a231e5de133da3524affc0e4"} Dec 09 10:23:46 crc kubenswrapper[5002]: I1209 10:23:46.117689 5002 scope.go:117] "RemoveContainer" containerID="08d2ebe98f25abdf69cdfccc2c50835449a00d2a32e249926fa0e11b23d79582" Dec 09 10:23:46 crc kubenswrapper[5002]: I1209 10:23:46.117776 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 10:23:46 crc kubenswrapper[5002]: I1209 10:23:46.152511 5002 scope.go:117] "RemoveContainer" containerID="08d2ebe98f25abdf69cdfccc2c50835449a00d2a32e249926fa0e11b23d79582" Dec 09 10:23:46 crc kubenswrapper[5002]: E1209 10:23:46.153134 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08d2ebe98f25abdf69cdfccc2c50835449a00d2a32e249926fa0e11b23d79582\": container with ID starting with 08d2ebe98f25abdf69cdfccc2c50835449a00d2a32e249926fa0e11b23d79582 not found: ID does not exist" containerID="08d2ebe98f25abdf69cdfccc2c50835449a00d2a32e249926fa0e11b23d79582" Dec 09 10:23:46 crc kubenswrapper[5002]: I1209 10:23:46.153185 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08d2ebe98f25abdf69cdfccc2c50835449a00d2a32e249926fa0e11b23d79582"} err="failed to get container status \"08d2ebe98f25abdf69cdfccc2c50835449a00d2a32e249926fa0e11b23d79582\": rpc error: code = NotFound desc = could not find container \"08d2ebe98f25abdf69cdfccc2c50835449a00d2a32e249926fa0e11b23d79582\": container with ID starting with 08d2ebe98f25abdf69cdfccc2c50835449a00d2a32e249926fa0e11b23d79582 not found: ID does not exist" Dec 09 10:23:46 crc kubenswrapper[5002]: I1209 10:23:46.158898 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 10:23:46 crc kubenswrapper[5002]: I1209 10:23:46.210130 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 10:23:46 crc kubenswrapper[5002]: I1209 10:23:46.219301 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 10:23:46 crc kubenswrapper[5002]: E1209 10:23:46.219745 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d1c1529-5c04-4bea-b054-5a73781b0e56" containerName="nova-scheduler-scheduler" Dec 09 10:23:46 crc kubenswrapper[5002]: I1209 10:23:46.219760 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d1c1529-5c04-4bea-b054-5a73781b0e56" containerName="nova-scheduler-scheduler" Dec 09 10:23:46 crc kubenswrapper[5002]: I1209 10:23:46.219946 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d1c1529-5c04-4bea-b054-5a73781b0e56" containerName="nova-scheduler-scheduler" Dec 09 10:23:46 crc kubenswrapper[5002]: I1209 10:23:46.220590 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 10:23:46 crc kubenswrapper[5002]: I1209 10:23:46.222521 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 09 10:23:46 crc kubenswrapper[5002]: I1209 10:23:46.231119 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 10:23:46 crc kubenswrapper[5002]: I1209 10:23:46.313273 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmld9\" (UniqueName: \"kubernetes.io/projected/f12d2757-e55a-4fd6-a910-857f03cb8f64-kube-api-access-fmld9\") pod \"nova-scheduler-0\" (UID: \"f12d2757-e55a-4fd6-a910-857f03cb8f64\") " pod="openstack/nova-scheduler-0" Dec 09 10:23:46 crc kubenswrapper[5002]: I1209 10:23:46.313510 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f12d2757-e55a-4fd6-a910-857f03cb8f64-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f12d2757-e55a-4fd6-a910-857f03cb8f64\") " pod="openstack/nova-scheduler-0" Dec 09 10:23:46 crc kubenswrapper[5002]: I1209 10:23:46.313732 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f12d2757-e55a-4fd6-a910-857f03cb8f64-config-data\") pod \"nova-scheduler-0\" (UID: \"f12d2757-e55a-4fd6-a910-857f03cb8f64\") " pod="openstack/nova-scheduler-0" Dec 09 10:23:46 crc kubenswrapper[5002]: I1209 10:23:46.415985 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f12d2757-e55a-4fd6-a910-857f03cb8f64-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f12d2757-e55a-4fd6-a910-857f03cb8f64\") " pod="openstack/nova-scheduler-0" Dec 09 10:23:46 crc kubenswrapper[5002]: I1209 10:23:46.416116 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f12d2757-e55a-4fd6-a910-857f03cb8f64-config-data\") pod \"nova-scheduler-0\" (UID: \"f12d2757-e55a-4fd6-a910-857f03cb8f64\") " pod="openstack/nova-scheduler-0" Dec 09 10:23:46 crc kubenswrapper[5002]: I1209 10:23:46.416234 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmld9\" (UniqueName: \"kubernetes.io/projected/f12d2757-e55a-4fd6-a910-857f03cb8f64-kube-api-access-fmld9\") pod \"nova-scheduler-0\" (UID: \"f12d2757-e55a-4fd6-a910-857f03cb8f64\") " pod="openstack/nova-scheduler-0" Dec 09 10:23:46 crc kubenswrapper[5002]: I1209 10:23:46.421781 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f12d2757-e55a-4fd6-a910-857f03cb8f64-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f12d2757-e55a-4fd6-a910-857f03cb8f64\") " pod="openstack/nova-scheduler-0" Dec 09 10:23:46 crc kubenswrapper[5002]: I1209 10:23:46.422852 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f12d2757-e55a-4fd6-a910-857f03cb8f64-config-data\") pod \"nova-scheduler-0\" (UID: \"f12d2757-e55a-4fd6-a910-857f03cb8f64\") " pod="openstack/nova-scheduler-0" Dec 09 10:23:46 crc kubenswrapper[5002]: I1209 10:23:46.433209 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmld9\" (UniqueName: 
\"kubernetes.io/projected/f12d2757-e55a-4fd6-a910-857f03cb8f64-kube-api-access-fmld9\") pod \"nova-scheduler-0\" (UID: \"f12d2757-e55a-4fd6-a910-857f03cb8f64\") " pod="openstack/nova-scheduler-0" Dec 09 10:23:46 crc kubenswrapper[5002]: I1209 10:23:46.621427 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 10:23:47 crc kubenswrapper[5002]: I1209 10:23:47.139073 5002 generic.go:334] "Generic (PLEG): container finished" podID="15a833c4-f8ea-4259-a659-a11ea55a8f88" containerID="a481319e648432bf60e9db982024715b443dfc82b14f612845eaada599692612" exitCode=0 Dec 09 10:23:47 crc kubenswrapper[5002]: I1209 10:23:47.139243 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-6ck56" event={"ID":"15a833c4-f8ea-4259-a659-a11ea55a8f88","Type":"ContainerDied","Data":"a481319e648432bf60e9db982024715b443dfc82b14f612845eaada599692612"} Dec 09 10:23:47 crc kubenswrapper[5002]: I1209 10:23:47.144222 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b","Type":"ContainerStarted","Data":"02343a85391413db414177ad41f87801d36aa19d3a0fc605a5eedfd1aa3dcf76"} Dec 09 10:23:47 crc kubenswrapper[5002]: I1209 10:23:47.144279 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b","Type":"ContainerStarted","Data":"de0c51c01e1bbeadedfa7668fc8d59bbd0722e47a08a6a6ee9a64bf53dcad53a"} Dec 09 10:23:47 crc kubenswrapper[5002]: I1209 10:23:47.189532 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.189502852 podStartE2EDuration="2.189502852s" podCreationTimestamp="2025-12-09 10:23:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:23:47.180939051 +0000 UTC m=+1359.572990172" watchObservedRunningTime="2025-12-09 10:23:47.189502852 +0000 UTC m=+1359.581553933" Dec 09 10:23:47 crc kubenswrapper[5002]: I1209 10:23:47.208827 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 10:23:47 crc kubenswrapper[5002]: I1209 10:23:47.968119 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.047234 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0172af5-b021-469d-ac3a-fb73a0651f27-config-data\") pod \"a0172af5-b021-469d-ac3a-fb73a0651f27\" (UID: \"a0172af5-b021-469d-ac3a-fb73a0651f27\") " Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.047271 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0172af5-b021-469d-ac3a-fb73a0651f27-combined-ca-bundle\") pod \"a0172af5-b021-469d-ac3a-fb73a0651f27\" (UID: \"a0172af5-b021-469d-ac3a-fb73a0651f27\") " Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.047308 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0172af5-b021-469d-ac3a-fb73a0651f27-logs\") pod \"a0172af5-b021-469d-ac3a-fb73a0651f27\" (UID: \"a0172af5-b021-469d-ac3a-fb73a0651f27\") " Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.047331 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhd9f\" (UniqueName: \"kubernetes.io/projected/a0172af5-b021-469d-ac3a-fb73a0651f27-kube-api-access-rhd9f\") pod \"a0172af5-b021-469d-ac3a-fb73a0651f27\" (UID: \"a0172af5-b021-469d-ac3a-fb73a0651f27\") " Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.047966 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0172af5-b021-469d-ac3a-fb73a0651f27-logs" (OuterVolumeSpecName: "logs") pod "a0172af5-b021-469d-ac3a-fb73a0651f27" (UID: "a0172af5-b021-469d-ac3a-fb73a0651f27"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.051257 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0172af5-b021-469d-ac3a-fb73a0651f27-kube-api-access-rhd9f" (OuterVolumeSpecName: "kube-api-access-rhd9f") pod "a0172af5-b021-469d-ac3a-fb73a0651f27" (UID: "a0172af5-b021-469d-ac3a-fb73a0651f27"). InnerVolumeSpecName "kube-api-access-rhd9f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.077724 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d1c1529-5c04-4bea-b054-5a73781b0e56" path="/var/lib/kubelet/pods/3d1c1529-5c04-4bea-b054-5a73781b0e56/volumes" Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.078678 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0172af5-b021-469d-ac3a-fb73a0651f27-config-data" (OuterVolumeSpecName: "config-data") pod "a0172af5-b021-469d-ac3a-fb73a0651f27" (UID: "a0172af5-b021-469d-ac3a-fb73a0651f27"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.087146 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0172af5-b021-469d-ac3a-fb73a0651f27-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0172af5-b021-469d-ac3a-fb73a0651f27" (UID: "a0172af5-b021-469d-ac3a-fb73a0651f27"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.149930 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0172af5-b021-469d-ac3a-fb73a0651f27-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.149958 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0172af5-b021-469d-ac3a-fb73a0651f27-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.149969 5002 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0172af5-b021-469d-ac3a-fb73a0651f27-logs\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.149978 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhd9f\" (UniqueName: \"kubernetes.io/projected/a0172af5-b021-469d-ac3a-fb73a0651f27-kube-api-access-rhd9f\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.155540 5002 generic.go:334] "Generic (PLEG): container finished" podID="a0172af5-b021-469d-ac3a-fb73a0651f27" containerID="665f9d1bc6efa80141e1dd38fd740fc3ef39099c939bb5863973c24f95545acf" exitCode=0 Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.155607 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a0172af5-b021-469d-ac3a-fb73a0651f27","Type":"ContainerDied","Data":"665f9d1bc6efa80141e1dd38fd740fc3ef39099c939bb5863973c24f95545acf"} Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.155648 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a0172af5-b021-469d-ac3a-fb73a0651f27","Type":"ContainerDied","Data":"5dbf1c652d02050e856e7d5ddc49e119822c77aa7b7f7c7d56120fa1bcb45fbe"} Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.155668 5002 scope.go:117] "RemoveContainer" containerID="665f9d1bc6efa80141e1dd38fd740fc3ef39099c939bb5863973c24f95545acf" Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.155893 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.157463 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f12d2757-e55a-4fd6-a910-857f03cb8f64","Type":"ContainerStarted","Data":"e86f743bac5750837d76b9e1ac24b67b07615ac1c447105837da31f3f7fe02f1"} Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.157526 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f12d2757-e55a-4fd6-a910-857f03cb8f64","Type":"ContainerStarted","Data":"d0144d5cf6b715a94b6a4f5a2430c03665c10aadc7c0a091fa9f192c2304f812"} Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.185179 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.185154322 podStartE2EDuration="2.185154322s" podCreationTimestamp="2025-12-09 10:23:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:23:48.176753265 +0000 UTC m=+1360.568804356" watchObservedRunningTime="2025-12-09 10:23:48.185154322 +0000 UTC m=+1360.577205423" Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.189058 5002 scope.go:117] "RemoveContainer" containerID="5b8a86d6555d11247d955b123911f432c8bf22e6801542802949c7fe12ad5569" Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.213106 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.223837 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.224008 5002 scope.go:117] "RemoveContainer" containerID="665f9d1bc6efa80141e1dd38fd740fc3ef39099c939bb5863973c24f95545acf" Dec 09 10:23:48 crc kubenswrapper[5002]: E1209 10:23:48.224930 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"665f9d1bc6efa80141e1dd38fd740fc3ef39099c939bb5863973c24f95545acf\": container with ID starting with 665f9d1bc6efa80141e1dd38fd740fc3ef39099c939bb5863973c24f95545acf not found: ID does not exist" containerID="665f9d1bc6efa80141e1dd38fd740fc3ef39099c939bb5863973c24f95545acf" Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.224962 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"665f9d1bc6efa80141e1dd38fd740fc3ef39099c939bb5863973c24f95545acf"} err="failed to get container status \"665f9d1bc6efa80141e1dd38fd740fc3ef39099c939bb5863973c24f95545acf\": rpc error: code = NotFound desc = could not find container \"665f9d1bc6efa80141e1dd38fd740fc3ef39099c939bb5863973c24f95545acf\": container with ID starting with 665f9d1bc6efa80141e1dd38fd740fc3ef39099c939bb5863973c24f95545acf not found: ID does not exist" Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.224986 5002 scope.go:117] "RemoveContainer" containerID="5b8a86d6555d11247d955b123911f432c8bf22e6801542802949c7fe12ad5569" Dec 09 10:23:48 crc kubenswrapper[5002]: E1209 10:23:48.227667 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b8a86d6555d11247d955b123911f432c8bf22e6801542802949c7fe12ad5569\": container with ID starting with 5b8a86d6555d11247d955b123911f432c8bf22e6801542802949c7fe12ad5569 not found: ID does not exist" 
containerID="5b8a86d6555d11247d955b123911f432c8bf22e6801542802949c7fe12ad5569" Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.227701 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b8a86d6555d11247d955b123911f432c8bf22e6801542802949c7fe12ad5569"} err="failed to get container status \"5b8a86d6555d11247d955b123911f432c8bf22e6801542802949c7fe12ad5569\": rpc error: code = NotFound desc = could not find container \"5b8a86d6555d11247d955b123911f432c8bf22e6801542802949c7fe12ad5569\": container with ID starting with 5b8a86d6555d11247d955b123911f432c8bf22e6801542802949c7fe12ad5569 not found: ID does not exist" Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.229183 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 09 10:23:48 crc kubenswrapper[5002]: E1209 10:23:48.229587 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0172af5-b021-469d-ac3a-fb73a0651f27" containerName="nova-api-log" Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.229600 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0172af5-b021-469d-ac3a-fb73a0651f27" containerName="nova-api-log" Dec 09 10:23:48 crc kubenswrapper[5002]: E1209 10:23:48.229616 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0172af5-b021-469d-ac3a-fb73a0651f27" containerName="nova-api-api" Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.229622 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0172af5-b021-469d-ac3a-fb73a0651f27" containerName="nova-api-api" Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.229841 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0172af5-b021-469d-ac3a-fb73a0651f27" containerName="nova-api-api" Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.229875 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0172af5-b021-469d-ac3a-fb73a0651f27" containerName="nova-api-log" Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.230930 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.233629 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.245641 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.251724 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4fd6b926-1857-4c25-87f2-9477dc815651-logs\") pod \"nova-api-0\" (UID: \"4fd6b926-1857-4c25-87f2-9477dc815651\") " pod="openstack/nova-api-0" Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.251775 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5f2r\" (UniqueName: \"kubernetes.io/projected/4fd6b926-1857-4c25-87f2-9477dc815651-kube-api-access-q5f2r\") pod \"nova-api-0\" (UID: \"4fd6b926-1857-4c25-87f2-9477dc815651\") " pod="openstack/nova-api-0" Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.251846 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fd6b926-1857-4c25-87f2-9477dc815651-config-data\") pod \"nova-api-0\" (UID: \"4fd6b926-1857-4c25-87f2-9477dc815651\") " pod="openstack/nova-api-0" Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.252010 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fd6b926-1857-4c25-87f2-9477dc815651-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4fd6b926-1857-4c25-87f2-9477dc815651\") " pod="openstack/nova-api-0" Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.353541 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fd6b926-1857-4c25-87f2-9477dc815651-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4fd6b926-1857-4c25-87f2-9477dc815651\") " pod="openstack/nova-api-0" Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.354070 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4fd6b926-1857-4c25-87f2-9477dc815651-logs\") pod \"nova-api-0\" (UID: \"4fd6b926-1857-4c25-87f2-9477dc815651\") " pod="openstack/nova-api-0" Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.354136 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5f2r\" (UniqueName: \"kubernetes.io/projected/4fd6b926-1857-4c25-87f2-9477dc815651-kube-api-access-q5f2r\") pod \"nova-api-0\" (UID: \"4fd6b926-1857-4c25-87f2-9477dc815651\") " pod="openstack/nova-api-0" Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.354237 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fd6b926-1857-4c25-87f2-9477dc815651-config-data\") pod \"nova-api-0\" (UID: \"4fd6b926-1857-4c25-87f2-9477dc815651\") " pod="openstack/nova-api-0" Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.356022 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4fd6b926-1857-4c25-87f2-9477dc815651-logs\") pod \"nova-api-0\" (UID: \"4fd6b926-1857-4c25-87f2-9477dc815651\") " 
pod="openstack/nova-api-0" Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.360357 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fd6b926-1857-4c25-87f2-9477dc815651-config-data\") pod \"nova-api-0\" (UID: \"4fd6b926-1857-4c25-87f2-9477dc815651\") " pod="openstack/nova-api-0" Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.361075 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fd6b926-1857-4c25-87f2-9477dc815651-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4fd6b926-1857-4c25-87f2-9477dc815651\") " pod="openstack/nova-api-0" Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.384596 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5f2r\" (UniqueName: \"kubernetes.io/projected/4fd6b926-1857-4c25-87f2-9477dc815651-kube-api-access-q5f2r\") pod \"nova-api-0\" (UID: \"4fd6b926-1857-4c25-87f2-9477dc815651\") " pod="openstack/nova-api-0" Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.492847 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-6ck56" Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.548448 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.556915 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15a833c4-f8ea-4259-a659-a11ea55a8f88-scripts\") pod \"15a833c4-f8ea-4259-a659-a11ea55a8f88\" (UID: \"15a833c4-f8ea-4259-a659-a11ea55a8f88\") " Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.556995 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qftg\" (UniqueName: \"kubernetes.io/projected/15a833c4-f8ea-4259-a659-a11ea55a8f88-kube-api-access-4qftg\") pod \"15a833c4-f8ea-4259-a659-a11ea55a8f88\" (UID: \"15a833c4-f8ea-4259-a659-a11ea55a8f88\") " Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.557407 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15a833c4-f8ea-4259-a659-a11ea55a8f88-combined-ca-bundle\") pod \"15a833c4-f8ea-4259-a659-a11ea55a8f88\" (UID: \"15a833c4-f8ea-4259-a659-a11ea55a8f88\") " Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.557436 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15a833c4-f8ea-4259-a659-a11ea55a8f88-config-data\") pod \"15a833c4-f8ea-4259-a659-a11ea55a8f88\" (UID: \"15a833c4-f8ea-4259-a659-a11ea55a8f88\") " Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.560090 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15a833c4-f8ea-4259-a659-a11ea55a8f88-kube-api-access-4qftg" (OuterVolumeSpecName: "kube-api-access-4qftg") pod "15a833c4-f8ea-4259-a659-a11ea55a8f88" (UID: "15a833c4-f8ea-4259-a659-a11ea55a8f88"). InnerVolumeSpecName "kube-api-access-4qftg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.561304 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15a833c4-f8ea-4259-a659-a11ea55a8f88-scripts" (OuterVolumeSpecName: "scripts") pod "15a833c4-f8ea-4259-a659-a11ea55a8f88" (UID: "15a833c4-f8ea-4259-a659-a11ea55a8f88"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.585938 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15a833c4-f8ea-4259-a659-a11ea55a8f88-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "15a833c4-f8ea-4259-a659-a11ea55a8f88" (UID: "15a833c4-f8ea-4259-a659-a11ea55a8f88"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.596999 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15a833c4-f8ea-4259-a659-a11ea55a8f88-config-data" (OuterVolumeSpecName: "config-data") pod "15a833c4-f8ea-4259-a659-a11ea55a8f88" (UID: "15a833c4-f8ea-4259-a659-a11ea55a8f88"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.661196 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15a833c4-f8ea-4259-a659-a11ea55a8f88-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.661539 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qftg\" (UniqueName: \"kubernetes.io/projected/15a833c4-f8ea-4259-a659-a11ea55a8f88-kube-api-access-4qftg\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.661552 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15a833c4-f8ea-4259-a659-a11ea55a8f88-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:48 crc kubenswrapper[5002]: I1209 10:23:48.661561 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15a833c4-f8ea-4259-a659-a11ea55a8f88-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:49 crc kubenswrapper[5002]: I1209 10:23:49.009089 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 09 10:23:49 crc kubenswrapper[5002]: W1209 10:23:49.018179 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4fd6b926_1857_4c25_87f2_9477dc815651.slice/crio-4723f39dd909c80a7b45d3ee3ab6e46f1d61edf6f58d2b5829b89c5316f27a0b WatchSource:0}: Error finding container 4723f39dd909c80a7b45d3ee3ab6e46f1d61edf6f58d2b5829b89c5316f27a0b: Status 404 returned error can't find the container with id 4723f39dd909c80a7b45d3ee3ab6e46f1d61edf6f58d2b5829b89c5316f27a0b Dec 09 10:23:49 crc kubenswrapper[5002]: I1209 10:23:49.173807 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-6ck56" event={"ID":"15a833c4-f8ea-4259-a659-a11ea55a8f88","Type":"ContainerDied","Data":"85ced4ea064aa41717fd1b683ad7794bc97a1c10fe6e2455e84789028720a02a"} Dec 09 10:23:49 crc kubenswrapper[5002]: I1209 10:23:49.173858 5002 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="85ced4ea064aa41717fd1b683ad7794bc97a1c10fe6e2455e84789028720a02a" Dec 09 10:23:49 crc kubenswrapper[5002]: I1209 10:23:49.173909 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-6ck56" Dec 09 10:23:49 crc kubenswrapper[5002]: I1209 10:23:49.192500 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4fd6b926-1857-4c25-87f2-9477dc815651","Type":"ContainerStarted","Data":"4723f39dd909c80a7b45d3ee3ab6e46f1d61edf6f58d2b5829b89c5316f27a0b"} Dec 09 10:23:49 crc kubenswrapper[5002]: I1209 10:23:49.276702 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 09 10:23:49 crc kubenswrapper[5002]: E1209 10:23:49.277172 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15a833c4-f8ea-4259-a659-a11ea55a8f88" containerName="nova-cell1-conductor-db-sync" Dec 09 10:23:49 crc kubenswrapper[5002]: I1209 10:23:49.277197 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="15a833c4-f8ea-4259-a659-a11ea55a8f88" containerName="nova-cell1-conductor-db-sync" Dec 09 10:23:49 crc kubenswrapper[5002]: I1209 10:23:49.277418 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="15a833c4-f8ea-4259-a659-a11ea55a8f88" containerName="nova-cell1-conductor-db-sync" Dec 09 10:23:49 crc kubenswrapper[5002]: I1209 10:23:49.278067 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 09 10:23:49 crc kubenswrapper[5002]: I1209 10:23:49.280795 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 09 10:23:49 crc kubenswrapper[5002]: I1209 10:23:49.332790 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 09 10:23:49 crc kubenswrapper[5002]: I1209 10:23:49.372780 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5a9794a-b66f-40d4-9e70-efc6a0a72d83-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e5a9794a-b66f-40d4-9e70-efc6a0a72d83\") " pod="openstack/nova-cell1-conductor-0" Dec 09 10:23:49 crc kubenswrapper[5002]: I1209 10:23:49.372934 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5a9794a-b66f-40d4-9e70-efc6a0a72d83-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e5a9794a-b66f-40d4-9e70-efc6a0a72d83\") " pod="openstack/nova-cell1-conductor-0" Dec 09 10:23:49 crc kubenswrapper[5002]: I1209 10:23:49.372967 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnp7t\" (UniqueName: \"kubernetes.io/projected/e5a9794a-b66f-40d4-9e70-efc6a0a72d83-kube-api-access-lnp7t\") pod \"nova-cell1-conductor-0\" (UID: \"e5a9794a-b66f-40d4-9e70-efc6a0a72d83\") " pod="openstack/nova-cell1-conductor-0" Dec 09 10:23:49 crc kubenswrapper[5002]: I1209 10:23:49.380496 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 09 10:23:49 crc kubenswrapper[5002]: I1209 10:23:49.426374 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 09 10:23:49 crc kubenswrapper[5002]: I1209 10:23:49.474485 5002 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5a9794a-b66f-40d4-9e70-efc6a0a72d83-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e5a9794a-b66f-40d4-9e70-efc6a0a72d83\") " pod="openstack/nova-cell1-conductor-0" Dec 09 10:23:49 crc kubenswrapper[5002]: I1209 10:23:49.475016 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5a9794a-b66f-40d4-9e70-efc6a0a72d83-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e5a9794a-b66f-40d4-9e70-efc6a0a72d83\") " pod="openstack/nova-cell1-conductor-0" Dec 09 10:23:49 crc kubenswrapper[5002]: I1209 10:23:49.475060 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnp7t\" (UniqueName: \"kubernetes.io/projected/e5a9794a-b66f-40d4-9e70-efc6a0a72d83-kube-api-access-lnp7t\") pod \"nova-cell1-conductor-0\" (UID: \"e5a9794a-b66f-40d4-9e70-efc6a0a72d83\") " pod="openstack/nova-cell1-conductor-0" Dec 09 10:23:49 crc kubenswrapper[5002]: I1209 10:23:49.479599 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5a9794a-b66f-40d4-9e70-efc6a0a72d83-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e5a9794a-b66f-40d4-9e70-efc6a0a72d83\") " pod="openstack/nova-cell1-conductor-0" Dec 09 10:23:49 crc kubenswrapper[5002]: I1209 10:23:49.480648 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5a9794a-b66f-40d4-9e70-efc6a0a72d83-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e5a9794a-b66f-40d4-9e70-efc6a0a72d83\") " pod="openstack/nova-cell1-conductor-0" Dec 09 10:23:49 crc kubenswrapper[5002]: I1209 10:23:49.497261 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnp7t\" (UniqueName: \"kubernetes.io/projected/e5a9794a-b66f-40d4-9e70-efc6a0a72d83-kube-api-access-lnp7t\") pod \"nova-cell1-conductor-0\" (UID: \"e5a9794a-b66f-40d4-9e70-efc6a0a72d83\") " pod="openstack/nova-cell1-conductor-0" Dec 09 10:23:49 crc kubenswrapper[5002]: I1209 10:23:49.648269 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 09 10:23:50 crc kubenswrapper[5002]: I1209 10:23:50.070655 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0172af5-b021-469d-ac3a-fb73a0651f27" path="/var/lib/kubelet/pods/a0172af5-b021-469d-ac3a-fb73a0651f27/volumes" Dec 09 10:23:50 crc kubenswrapper[5002]: I1209 10:23:50.114175 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 09 10:23:50 crc kubenswrapper[5002]: I1209 10:23:50.204248 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e5a9794a-b66f-40d4-9e70-efc6a0a72d83","Type":"ContainerStarted","Data":"bb62e1e09e69cdfa540a5529b1263c7fc9cae85b97824273bbfb93fd66ecca52"} Dec 09 10:23:50 crc kubenswrapper[5002]: I1209 10:23:50.210536 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4fd6b926-1857-4c25-87f2-9477dc815651","Type":"ContainerStarted","Data":"a11adff3f141188c9874f24498bda40a3808357613274458ca3d6932b9c1c32e"} Dec 09 10:23:50 crc kubenswrapper[5002]: I1209 10:23:50.210581 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4fd6b926-1857-4c25-87f2-9477dc815651","Type":"ContainerStarted","Data":"3b7279a5fe451090bf124375a3dc865310dc7e7141c18544dd6e31e5222517d0"} Dec 09 10:23:50 crc kubenswrapper[5002]: I1209 10:23:50.246524 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 09 10:23:50 crc kubenswrapper[5002]: I1209 10:23:50.247280 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.24726249 podStartE2EDuration="2.24726249s" podCreationTimestamp="2025-12-09 10:23:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:23:50.230880768 +0000 UTC m=+1362.622931859" watchObservedRunningTime="2025-12-09 10:23:50.24726249 +0000 UTC m=+1362.639313571" Dec 09 10:23:50 crc kubenswrapper[5002]: I1209 10:23:50.555786 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 09 10:23:50 crc kubenswrapper[5002]: I1209 10:23:50.556174 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 09 10:23:51 crc kubenswrapper[5002]: I1209 10:23:51.220729 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e5a9794a-b66f-40d4-9e70-efc6a0a72d83","Type":"ContainerStarted","Data":"ab87dad784b7bbb18f78fbb2775ca2f2d80ebe172b8bdf2a5111b3315647afd9"} Dec 09 10:23:51 crc kubenswrapper[5002]: I1209 10:23:51.261457 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.261436268 podStartE2EDuration="2.261436268s" podCreationTimestamp="2025-12-09 10:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:23:51.24526784 +0000 UTC m=+1363.637318921" watchObservedRunningTime="2025-12-09 10:23:51.261436268 +0000 UTC m=+1363.653487359" Dec 09 10:23:51 crc kubenswrapper[5002]: I1209 10:23:51.292654 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 09 10:23:51 crc kubenswrapper[5002]: I1209 10:23:51.622312 5002 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 09 10:23:52 crc kubenswrapper[5002]: I1209 10:23:52.230544 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 09 10:23:55 crc kubenswrapper[5002]: I1209 10:23:55.165637 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 09 10:23:55 crc kubenswrapper[5002]: I1209 10:23:55.166657 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="3fdef05d-e63e-48b6-8d88-6a374294940e" containerName="kube-state-metrics" containerID="cri-o://12da1110ed3b978094ec24674980dfa5543d04422512d3f4ef3ff7e753242848" gracePeriod=30 Dec 09 10:23:55 crc kubenswrapper[5002]: I1209 10:23:55.556177 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 09 10:23:55 crc kubenswrapper[5002]: I1209 10:23:55.556585 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 09 10:23:55 crc kubenswrapper[5002]: I1209 10:23:55.810385 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 09 10:23:55 crc kubenswrapper[5002]: I1209 10:23:55.904248 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66r8s\" (UniqueName: \"kubernetes.io/projected/3fdef05d-e63e-48b6-8d88-6a374294940e-kube-api-access-66r8s\") pod \"3fdef05d-e63e-48b6-8d88-6a374294940e\" (UID: \"3fdef05d-e63e-48b6-8d88-6a374294940e\") " Dec 09 10:23:55 crc kubenswrapper[5002]: I1209 10:23:55.910640 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fdef05d-e63e-48b6-8d88-6a374294940e-kube-api-access-66r8s" (OuterVolumeSpecName: "kube-api-access-66r8s") pod "3fdef05d-e63e-48b6-8d88-6a374294940e" (UID: "3fdef05d-e63e-48b6-8d88-6a374294940e"). InnerVolumeSpecName "kube-api-access-66r8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:23:56 crc kubenswrapper[5002]: I1209 10:23:56.007645 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66r8s\" (UniqueName: \"kubernetes.io/projected/3fdef05d-e63e-48b6-8d88-6a374294940e-kube-api-access-66r8s\") on node \"crc\" DevicePath \"\"" Dec 09 10:23:56 crc kubenswrapper[5002]: I1209 10:23:56.274892 5002 generic.go:334] "Generic (PLEG): container finished" podID="3fdef05d-e63e-48b6-8d88-6a374294940e" containerID="12da1110ed3b978094ec24674980dfa5543d04422512d3f4ef3ff7e753242848" exitCode=2 Dec 09 10:23:56 crc kubenswrapper[5002]: I1209 10:23:56.274977 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 09 10:23:56 crc kubenswrapper[5002]: I1209 10:23:56.274995 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3fdef05d-e63e-48b6-8d88-6a374294940e","Type":"ContainerDied","Data":"12da1110ed3b978094ec24674980dfa5543d04422512d3f4ef3ff7e753242848"} Dec 09 10:23:56 crc kubenswrapper[5002]: I1209 10:23:56.276231 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3fdef05d-e63e-48b6-8d88-6a374294940e","Type":"ContainerDied","Data":"1f37c7d6fddd41f5cfe2b0a979617fdb72360dec2fb341bccfe819aadcc5b37a"} Dec 09 10:23:56 crc kubenswrapper[5002]: I1209 10:23:56.276257 5002 scope.go:117] "RemoveContainer" containerID="12da1110ed3b978094ec24674980dfa5543d04422512d3f4ef3ff7e753242848" Dec 09 10:23:56 crc kubenswrapper[5002]: I1209 10:23:56.307511 5002 scope.go:117] "RemoveContainer" containerID="12da1110ed3b978094ec24674980dfa5543d04422512d3f4ef3ff7e753242848" Dec 09 10:23:56 crc kubenswrapper[5002]: E1209 10:23:56.307971 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12da1110ed3b978094ec24674980dfa5543d04422512d3f4ef3ff7e753242848\": container with ID starting with 12da1110ed3b978094ec24674980dfa5543d04422512d3f4ef3ff7e753242848 not found: ID does not exist" containerID="12da1110ed3b978094ec24674980dfa5543d04422512d3f4ef3ff7e753242848" Dec 09 10:23:56 crc kubenswrapper[5002]: I1209 10:23:56.308010 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12da1110ed3b978094ec24674980dfa5543d04422512d3f4ef3ff7e753242848"} err="failed to get container status \"12da1110ed3b978094ec24674980dfa5543d04422512d3f4ef3ff7e753242848\": rpc error: code = NotFound desc = could not find container \"12da1110ed3b978094ec24674980dfa5543d04422512d3f4ef3ff7e753242848\": container with ID starting with 12da1110ed3b978094ec24674980dfa5543d04422512d3f4ef3ff7e753242848 not found: ID does not exist" Dec 09 10:23:56 crc kubenswrapper[5002]: I1209 10:23:56.315988 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 09 10:23:56 crc kubenswrapper[5002]: I1209 10:23:56.331768 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 09 10:23:56 crc kubenswrapper[5002]: I1209 10:23:56.352993 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 09 10:23:56 crc kubenswrapper[5002]: E1209 10:23:56.353387 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fdef05d-e63e-48b6-8d88-6a374294940e" containerName="kube-state-metrics" Dec 09 10:23:56 crc kubenswrapper[5002]: I1209 10:23:56.353402 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fdef05d-e63e-48b6-8d88-6a374294940e" containerName="kube-state-metrics" Dec 09 10:23:56 crc kubenswrapper[5002]: I1209 10:23:56.353630 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fdef05d-e63e-48b6-8d88-6a374294940e" containerName="kube-state-metrics" Dec 09 10:23:56 crc kubenswrapper[5002]: I1209 10:23:56.354248 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 09 10:23:56 crc kubenswrapper[5002]: I1209 10:23:56.357313 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 09 10:23:56 crc kubenswrapper[5002]: I1209 10:23:56.357496 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 09 10:23:56 crc kubenswrapper[5002]: I1209 10:23:56.364425 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 09 10:23:56 crc kubenswrapper[5002]: I1209 10:23:56.415700 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sjlt\" (UniqueName: \"kubernetes.io/projected/2adbbd67-ccdf-4444-b667-2b549bc200b5-kube-api-access-9sjlt\") pod \"kube-state-metrics-0\" (UID: \"2adbbd67-ccdf-4444-b667-2b549bc200b5\") " pod="openstack/kube-state-metrics-0" Dec 09 10:23:56 crc kubenswrapper[5002]: I1209 10:23:56.415749 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2adbbd67-ccdf-4444-b667-2b549bc200b5-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"2adbbd67-ccdf-4444-b667-2b549bc200b5\") " pod="openstack/kube-state-metrics-0" Dec 09 10:23:56 crc kubenswrapper[5002]: I1209 10:23:56.415868 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/2adbbd67-ccdf-4444-b667-2b549bc200b5-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"2adbbd67-ccdf-4444-b667-2b549bc200b5\") " pod="openstack/kube-state-metrics-0" Dec 09 10:23:56 crc kubenswrapper[5002]: I1209 10:23:56.415992 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/2adbbd67-ccdf-4444-b667-2b549bc200b5-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"2adbbd67-ccdf-4444-b667-2b549bc200b5\") " pod="openstack/kube-state-metrics-0" Dec 09 10:23:56 crc kubenswrapper[5002]: I1209 10:23:56.517692 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/2adbbd67-ccdf-4444-b667-2b549bc200b5-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"2adbbd67-ccdf-4444-b667-2b549bc200b5\") " pod="openstack/kube-state-metrics-0" Dec 09 10:23:56 crc kubenswrapper[5002]: I1209 10:23:56.517795 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/2adbbd67-ccdf-4444-b667-2b549bc200b5-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"2adbbd67-ccdf-4444-b667-2b549bc200b5\") " pod="openstack/kube-state-metrics-0" Dec 09 10:23:56 crc kubenswrapper[5002]: I1209 10:23:56.517895 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sjlt\" (UniqueName: \"kubernetes.io/projected/2adbbd67-ccdf-4444-b667-2b549bc200b5-kube-api-access-9sjlt\") pod \"kube-state-metrics-0\" (UID: \"2adbbd67-ccdf-4444-b667-2b549bc200b5\") " pod="openstack/kube-state-metrics-0" Dec 09 10:23:56 crc kubenswrapper[5002]: I1209 10:23:56.517918 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/2adbbd67-ccdf-4444-b667-2b549bc200b5-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"2adbbd67-ccdf-4444-b667-2b549bc200b5\") " pod="openstack/kube-state-metrics-0" Dec 09 10:23:56 crc kubenswrapper[5002]: I1209 10:23:56.523616 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/2adbbd67-ccdf-4444-b667-2b549bc200b5-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"2adbbd67-ccdf-4444-b667-2b549bc200b5\") " pod="openstack/kube-state-metrics-0" Dec 09 10:23:56 crc kubenswrapper[5002]: I1209 10:23:56.525195 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2adbbd67-ccdf-4444-b667-2b549bc200b5-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"2adbbd67-ccdf-4444-b667-2b549bc200b5\") " pod="openstack/kube-state-metrics-0" Dec 09 10:23:56 crc kubenswrapper[5002]: I1209 10:23:56.526135 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/2adbbd67-ccdf-4444-b667-2b549bc200b5-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"2adbbd67-ccdf-4444-b667-2b549bc200b5\") " pod="openstack/kube-state-metrics-0" Dec 09 10:23:56 crc kubenswrapper[5002]: I1209 10:23:56.538724 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sjlt\" (UniqueName: \"kubernetes.io/projected/2adbbd67-ccdf-4444-b667-2b549bc200b5-kube-api-access-9sjlt\") pod \"kube-state-metrics-0\" (UID: \"2adbbd67-ccdf-4444-b667-2b549bc200b5\") " pod="openstack/kube-state-metrics-0" Dec 09 10:23:56 crc kubenswrapper[5002]: I1209 10:23:56.581167 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.188:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 09 10:23:56 crc kubenswrapper[5002]: I1209 10:23:56.581549 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.188:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 09 10:23:56 crc kubenswrapper[5002]: I1209 10:23:56.622046 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 09 10:23:56 crc kubenswrapper[5002]: I1209 10:23:56.648330 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 09 10:23:56 crc kubenswrapper[5002]: I1209 10:23:56.674663 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 09 10:23:57 crc kubenswrapper[5002]: I1209 10:23:57.127565 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 09 10:23:57 crc kubenswrapper[5002]: I1209 10:23:57.137414 5002 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 10:23:57 crc kubenswrapper[5002]: I1209 10:23:57.238743 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 09 10:23:57 crc kubenswrapper[5002]: I1209 10:23:57.239293 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="863f7cf2-76b4-4780-8c7e-797bd1a52ea0" containerName="ceilometer-central-agent" containerID="cri-o://29a3a70981ffc1c78ea042ff208a86bb4d0c41cf67ee7a5b6de71def9da9f55b" gracePeriod=30 Dec 09 10:23:57 crc kubenswrapper[5002]: I1209 10:23:57.239408 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="863f7cf2-76b4-4780-8c7e-797bd1a52ea0" containerName="ceilometer-notification-agent" containerID="cri-o://667fe254c514762feed8f00659b9972ab6e5bc6dad1e2e2312a29b3309eb39b0" gracePeriod=30 Dec 09 10:23:57 crc kubenswrapper[5002]: I1209 10:23:57.239380 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="863f7cf2-76b4-4780-8c7e-797bd1a52ea0" containerName="proxy-httpd" containerID="cri-o://a7d92c4732c10b0253989840b364e50476dfcce6b63ceef50d6e3a8462cc0004" gracePeriod=30 Dec 09 10:23:57 crc kubenswrapper[5002]: I1209 10:23:57.239514 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="863f7cf2-76b4-4780-8c7e-797bd1a52ea0" containerName="sg-core" containerID="cri-o://906207b6ee5d87fa339a261555ebad22b0b552623b98629bf7b1a5247286b94a" gracePeriod=30 Dec 09 10:23:57 crc kubenswrapper[5002]: I1209 10:23:57.285198 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2adbbd67-ccdf-4444-b667-2b549bc200b5","Type":"ContainerStarted","Data":"6003a40fb265d020620cab6aa92b8f309caa3b915b079cf0782a54294791a405"} Dec 09 10:23:57 crc kubenswrapper[5002]: I1209 10:23:57.313865 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 09 10:23:58 crc kubenswrapper[5002]: I1209 10:23:58.077192 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fdef05d-e63e-48b6-8d88-6a374294940e" path="/var/lib/kubelet/pods/3fdef05d-e63e-48b6-8d88-6a374294940e/volumes" Dec 09 10:23:58 crc kubenswrapper[5002]: I1209 10:23:58.298054 5002 generic.go:334] "Generic (PLEG): container finished" podID="863f7cf2-76b4-4780-8c7e-797bd1a52ea0" containerID="a7d92c4732c10b0253989840b364e50476dfcce6b63ceef50d6e3a8462cc0004" exitCode=0 Dec 09 10:23:58 crc kubenswrapper[5002]: I1209 10:23:58.298108 5002 generic.go:334] "Generic (PLEG): container finished" podID="863f7cf2-76b4-4780-8c7e-797bd1a52ea0" containerID="906207b6ee5d87fa339a261555ebad22b0b552623b98629bf7b1a5247286b94a" exitCode=2 Dec 09 10:23:58 crc kubenswrapper[5002]: I1209 10:23:58.298128 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"863f7cf2-76b4-4780-8c7e-797bd1a52ea0","Type":"ContainerDied","Data":"a7d92c4732c10b0253989840b364e50476dfcce6b63ceef50d6e3a8462cc0004"} Dec 09 10:23:58 crc kubenswrapper[5002]: I1209 10:23:58.298171 5002 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"863f7cf2-76b4-4780-8c7e-797bd1a52ea0","Type":"ContainerDied","Data":"906207b6ee5d87fa339a261555ebad22b0b552623b98629bf7b1a5247286b94a"} Dec 09 10:23:58 crc kubenswrapper[5002]: I1209 10:23:58.549658 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 09 10:23:58 crc kubenswrapper[5002]: I1209 10:23:58.549977 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 09 10:23:59 crc kubenswrapper[5002]: I1209 10:23:59.311207 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2adbbd67-ccdf-4444-b667-2b549bc200b5","Type":"ContainerStarted","Data":"7599cb61e23fe24a1fe7539f2fe49839b0013514838b373c23975501d3f53ed4"} Dec 09 10:23:59 crc kubenswrapper[5002]: I1209 10:23:59.315916 5002 generic.go:334] "Generic (PLEG): container finished" podID="863f7cf2-76b4-4780-8c7e-797bd1a52ea0" containerID="29a3a70981ffc1c78ea042ff208a86bb4d0c41cf67ee7a5b6de71def9da9f55b" exitCode=0 Dec 09 10:23:59 crc kubenswrapper[5002]: I1209 10:23:59.315966 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"863f7cf2-76b4-4780-8c7e-797bd1a52ea0","Type":"ContainerDied","Data":"29a3a70981ffc1c78ea042ff208a86bb4d0c41cf67ee7a5b6de71def9da9f55b"} Dec 09 10:23:59 crc kubenswrapper[5002]: I1209 10:23:59.340084 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.743363453 podStartE2EDuration="3.340070877s" podCreationTimestamp="2025-12-09 10:23:56 +0000 UTC" firstStartedPulling="2025-12-09 10:23:57.137011081 +0000 UTC m=+1369.529062162" lastFinishedPulling="2025-12-09 10:23:58.733718505 +0000 UTC m=+1371.125769586" observedRunningTime="2025-12-09 10:23:59.329877774 +0000 UTC m=+1371.721928855" watchObservedRunningTime="2025-12-09 10:23:59.340070877 +0000 UTC m=+1371.732121958" Dec 09 10:23:59 crc kubenswrapper[5002]: I1209 10:23:59.632996 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4fd6b926-1857-4c25-87f2-9477dc815651" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.190:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 09 10:23:59 crc kubenswrapper[5002]: I1209 10:23:59.632996 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4fd6b926-1857-4c25-87f2-9477dc815651" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.190:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 09 10:23:59 crc kubenswrapper[5002]: I1209 10:23:59.678789 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 09 10:24:00 crc kubenswrapper[5002]: I1209 10:24:00.158638 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-sblvw"] Dec 09 10:24:00 crc kubenswrapper[5002]: I1209 10:24:00.160500 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-sblvw" Dec 09 10:24:00 crc kubenswrapper[5002]: I1209 10:24:00.167129 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 09 10:24:00 crc kubenswrapper[5002]: I1209 10:24:00.167176 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 09 10:24:00 crc kubenswrapper[5002]: I1209 10:24:00.175495 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-sblvw"] Dec 09 10:24:00 crc kubenswrapper[5002]: I1209 10:24:00.185741 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7-scripts\") pod \"nova-cell1-cell-mapping-sblvw\" (UID: \"83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7\") " pod="openstack/nova-cell1-cell-mapping-sblvw" Dec 09 10:24:00 crc kubenswrapper[5002]: I1209 10:24:00.185937 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcxgm\" (UniqueName: \"kubernetes.io/projected/83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7-kube-api-access-qcxgm\") pod \"nova-cell1-cell-mapping-sblvw\" (UID: \"83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7\") " pod="openstack/nova-cell1-cell-mapping-sblvw" Dec 09 10:24:00 crc kubenswrapper[5002]: I1209 10:24:00.186075 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7-config-data\") pod \"nova-cell1-cell-mapping-sblvw\" (UID: \"83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7\") " pod="openstack/nova-cell1-cell-mapping-sblvw" Dec 09 10:24:00 crc kubenswrapper[5002]: I1209 10:24:00.186430 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-sblvw\" (UID: \"83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7\") " pod="openstack/nova-cell1-cell-mapping-sblvw" Dec 09 10:24:00 crc kubenswrapper[5002]: I1209 10:24:00.287702 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7-config-data\") pod \"nova-cell1-cell-mapping-sblvw\" (UID: \"83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7\") " pod="openstack/nova-cell1-cell-mapping-sblvw" Dec 09 10:24:00 crc kubenswrapper[5002]: I1209 10:24:00.287861 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-sblvw\" (UID: \"83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7\") " pod="openstack/nova-cell1-cell-mapping-sblvw" Dec 09 10:24:00 crc kubenswrapper[5002]: I1209 10:24:00.287900 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7-scripts\") pod \"nova-cell1-cell-mapping-sblvw\" (UID: \"83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7\") " pod="openstack/nova-cell1-cell-mapping-sblvw" Dec 09 10:24:00 crc kubenswrapper[5002]: I1209 10:24:00.287932 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcxgm\" (UniqueName: 
\"kubernetes.io/projected/83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7-kube-api-access-qcxgm\") pod \"nova-cell1-cell-mapping-sblvw\" (UID: \"83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7\") " pod="openstack/nova-cell1-cell-mapping-sblvw" Dec 09 10:24:00 crc kubenswrapper[5002]: I1209 10:24:00.294218 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-sblvw\" (UID: \"83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7\") " pod="openstack/nova-cell1-cell-mapping-sblvw" Dec 09 10:24:00 crc kubenswrapper[5002]: I1209 10:24:00.298157 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7-scripts\") pod \"nova-cell1-cell-mapping-sblvw\" (UID: \"83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7\") " pod="openstack/nova-cell1-cell-mapping-sblvw" Dec 09 10:24:00 crc kubenswrapper[5002]: I1209 10:24:00.309716 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7-config-data\") pod \"nova-cell1-cell-mapping-sblvw\" (UID: \"83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7\") " pod="openstack/nova-cell1-cell-mapping-sblvw" Dec 09 10:24:00 crc kubenswrapper[5002]: I1209 10:24:00.311466 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcxgm\" (UniqueName: \"kubernetes.io/projected/83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7-kube-api-access-qcxgm\") pod \"nova-cell1-cell-mapping-sblvw\" (UID: \"83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7\") " pod="openstack/nova-cell1-cell-mapping-sblvw" Dec 09 10:24:00 crc kubenswrapper[5002]: I1209 10:24:00.360264 5002 generic.go:334] "Generic (PLEG): container finished" podID="863f7cf2-76b4-4780-8c7e-797bd1a52ea0" containerID="667fe254c514762feed8f00659b9972ab6e5bc6dad1e2e2312a29b3309eb39b0" exitCode=0 Dec 09 10:24:00 crc kubenswrapper[5002]: I1209 10:24:00.363642 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"863f7cf2-76b4-4780-8c7e-797bd1a52ea0","Type":"ContainerDied","Data":"667fe254c514762feed8f00659b9972ab6e5bc6dad1e2e2312a29b3309eb39b0"} Dec 09 10:24:00 crc kubenswrapper[5002]: I1209 10:24:00.365293 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 09 10:24:00 crc kubenswrapper[5002]: I1209 10:24:00.484293 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-sblvw" Dec 09 10:24:00 crc kubenswrapper[5002]: I1209 10:24:00.621537 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 09 10:24:00 crc kubenswrapper[5002]: I1209 10:24:00.700575 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/863f7cf2-76b4-4780-8c7e-797bd1a52ea0-sg-core-conf-yaml\") pod \"863f7cf2-76b4-4780-8c7e-797bd1a52ea0\" (UID: \"863f7cf2-76b4-4780-8c7e-797bd1a52ea0\") " Dec 09 10:24:00 crc kubenswrapper[5002]: I1209 10:24:00.700686 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/863f7cf2-76b4-4780-8c7e-797bd1a52ea0-run-httpd\") pod \"863f7cf2-76b4-4780-8c7e-797bd1a52ea0\" (UID: \"863f7cf2-76b4-4780-8c7e-797bd1a52ea0\") " Dec 09 10:24:00 crc kubenswrapper[5002]: I1209 10:24:00.700719 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/863f7cf2-76b4-4780-8c7e-797bd1a52ea0-scripts\") pod \"863f7cf2-76b4-4780-8c7e-797bd1a52ea0\" (UID: \"863f7cf2-76b4-4780-8c7e-797bd1a52ea0\") " Dec 09 10:24:00 crc kubenswrapper[5002]: I1209 10:24:00.700755 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cs8gb\" (UniqueName: \"kubernetes.io/projected/863f7cf2-76b4-4780-8c7e-797bd1a52ea0-kube-api-access-cs8gb\") pod \"863f7cf2-76b4-4780-8c7e-797bd1a52ea0\" (UID: \"863f7cf2-76b4-4780-8c7e-797bd1a52ea0\") " Dec 09 10:24:00 crc kubenswrapper[5002]: I1209 10:24:00.700849 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/863f7cf2-76b4-4780-8c7e-797bd1a52ea0-config-data\") pod \"863f7cf2-76b4-4780-8c7e-797bd1a52ea0\" (UID: \"863f7cf2-76b4-4780-8c7e-797bd1a52ea0\") " Dec 09 10:24:00 crc kubenswrapper[5002]: I1209 10:24:00.700878 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/863f7cf2-76b4-4780-8c7e-797bd1a52ea0-combined-ca-bundle\") pod \"863f7cf2-76b4-4780-8c7e-797bd1a52ea0\" (UID: \"863f7cf2-76b4-4780-8c7e-797bd1a52ea0\") " Dec 09 10:24:00 crc kubenswrapper[5002]: I1209 10:24:00.700947 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/863f7cf2-76b4-4780-8c7e-797bd1a52ea0-log-httpd\") pod \"863f7cf2-76b4-4780-8c7e-797bd1a52ea0\" (UID: \"863f7cf2-76b4-4780-8c7e-797bd1a52ea0\") " Dec 09 10:24:00 crc kubenswrapper[5002]: I1209 10:24:00.701788 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/863f7cf2-76b4-4780-8c7e-797bd1a52ea0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "863f7cf2-76b4-4780-8c7e-797bd1a52ea0" (UID: "863f7cf2-76b4-4780-8c7e-797bd1a52ea0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:24:00 crc kubenswrapper[5002]: I1209 10:24:00.702036 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/863f7cf2-76b4-4780-8c7e-797bd1a52ea0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "863f7cf2-76b4-4780-8c7e-797bd1a52ea0" (UID: "863f7cf2-76b4-4780-8c7e-797bd1a52ea0"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:24:00 crc kubenswrapper[5002]: I1209 10:24:00.707943 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/863f7cf2-76b4-4780-8c7e-797bd1a52ea0-kube-api-access-cs8gb" (OuterVolumeSpecName: "kube-api-access-cs8gb") pod "863f7cf2-76b4-4780-8c7e-797bd1a52ea0" (UID: "863f7cf2-76b4-4780-8c7e-797bd1a52ea0"). InnerVolumeSpecName "kube-api-access-cs8gb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:24:00 crc kubenswrapper[5002]: I1209 10:24:00.723477 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/863f7cf2-76b4-4780-8c7e-797bd1a52ea0-scripts" (OuterVolumeSpecName: "scripts") pod "863f7cf2-76b4-4780-8c7e-797bd1a52ea0" (UID: "863f7cf2-76b4-4780-8c7e-797bd1a52ea0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:24:00 crc kubenswrapper[5002]: I1209 10:24:00.732243 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/863f7cf2-76b4-4780-8c7e-797bd1a52ea0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "863f7cf2-76b4-4780-8c7e-797bd1a52ea0" (UID: "863f7cf2-76b4-4780-8c7e-797bd1a52ea0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:24:00 crc kubenswrapper[5002]: I1209 10:24:00.793291 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/863f7cf2-76b4-4780-8c7e-797bd1a52ea0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "863f7cf2-76b4-4780-8c7e-797bd1a52ea0" (UID: "863f7cf2-76b4-4780-8c7e-797bd1a52ea0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:24:00 crc kubenswrapper[5002]: I1209 10:24:00.803678 5002 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/863f7cf2-76b4-4780-8c7e-797bd1a52ea0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 09 10:24:00 crc kubenswrapper[5002]: I1209 10:24:00.803716 5002 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/863f7cf2-76b4-4780-8c7e-797bd1a52ea0-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 10:24:00 crc kubenswrapper[5002]: I1209 10:24:00.803731 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/863f7cf2-76b4-4780-8c7e-797bd1a52ea0-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:24:00 crc kubenswrapper[5002]: I1209 10:24:00.803744 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cs8gb\" (UniqueName: \"kubernetes.io/projected/863f7cf2-76b4-4780-8c7e-797bd1a52ea0-kube-api-access-cs8gb\") on node \"crc\" DevicePath \"\"" Dec 09 10:24:00 crc kubenswrapper[5002]: I1209 10:24:00.803756 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/863f7cf2-76b4-4780-8c7e-797bd1a52ea0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:24:00 crc kubenswrapper[5002]: I1209 10:24:00.803769 5002 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/863f7cf2-76b4-4780-8c7e-797bd1a52ea0-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 10:24:00 crc kubenswrapper[5002]: I1209 10:24:00.815211 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/863f7cf2-76b4-4780-8c7e-797bd1a52ea0-config-data" (OuterVolumeSpecName: "config-data") pod "863f7cf2-76b4-4780-8c7e-797bd1a52ea0" (UID: "863f7cf2-76b4-4780-8c7e-797bd1a52ea0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:24:00 crc kubenswrapper[5002]: I1209 10:24:00.906034 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/863f7cf2-76b4-4780-8c7e-797bd1a52ea0-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:24:00 crc kubenswrapper[5002]: I1209 10:24:00.960215 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-sblvw"] Dec 09 10:24:00 crc kubenswrapper[5002]: W1209 10:24:00.961430 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83eb6a58_6bec_43eb_bbef_1fc7d4ec76e7.slice/crio-ff91901a3a57c88120246562d6b03d7d85cc6ec279559b59aef4adc9ca9d65f4 WatchSource:0}: Error finding container ff91901a3a57c88120246562d6b03d7d85cc6ec279559b59aef4adc9ca9d65f4: Status 404 returned error can't find the container with id ff91901a3a57c88120246562d6b03d7d85cc6ec279559b59aef4adc9ca9d65f4 Dec 09 10:24:01 crc kubenswrapper[5002]: I1209 10:24:01.372177 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"863f7cf2-76b4-4780-8c7e-797bd1a52ea0","Type":"ContainerDied","Data":"9b1f687954ff90dd411d3f05c2f108ff8221a863b8138b70b3515ca274005b86"} Dec 09 10:24:01 crc kubenswrapper[5002]: I1209 10:24:01.372253 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 09 10:24:01 crc kubenswrapper[5002]: I1209 10:24:01.372582 5002 scope.go:117] "RemoveContainer" containerID="a7d92c4732c10b0253989840b364e50476dfcce6b63ceef50d6e3a8462cc0004" Dec 09 10:24:01 crc kubenswrapper[5002]: I1209 10:24:01.374537 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-sblvw" event={"ID":"83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7","Type":"ContainerStarted","Data":"ff91901a3a57c88120246562d6b03d7d85cc6ec279559b59aef4adc9ca9d65f4"} Dec 09 10:24:01 crc kubenswrapper[5002]: I1209 10:24:01.417976 5002 scope.go:117] "RemoveContainer" containerID="906207b6ee5d87fa339a261555ebad22b0b552623b98629bf7b1a5247286b94a" Dec 09 10:24:01 crc kubenswrapper[5002]: I1209 10:24:01.430241 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 09 10:24:01 crc kubenswrapper[5002]: I1209 10:24:01.442599 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 09 10:24:01 crc kubenswrapper[5002]: I1209 10:24:01.450389 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 09 10:24:01 crc kubenswrapper[5002]: E1209 10:24:01.450904 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="863f7cf2-76b4-4780-8c7e-797bd1a52ea0" containerName="sg-core" Dec 09 10:24:01 crc kubenswrapper[5002]: I1209 10:24:01.450921 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="863f7cf2-76b4-4780-8c7e-797bd1a52ea0" containerName="sg-core" Dec 09 10:24:01 crc kubenswrapper[5002]: E1209 10:24:01.450953 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="863f7cf2-76b4-4780-8c7e-797bd1a52ea0" containerName="ceilometer-notification-agent" Dec 09 10:24:01 crc kubenswrapper[5002]: I1209 10:24:01.450962 5002 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="863f7cf2-76b4-4780-8c7e-797bd1a52ea0" containerName="ceilometer-notification-agent" Dec 09 10:24:01 crc kubenswrapper[5002]: E1209 10:24:01.450986 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="863f7cf2-76b4-4780-8c7e-797bd1a52ea0" containerName="proxy-httpd" Dec 09 10:24:01 crc kubenswrapper[5002]: I1209 10:24:01.450993 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="863f7cf2-76b4-4780-8c7e-797bd1a52ea0" containerName="proxy-httpd" Dec 09 10:24:01 crc kubenswrapper[5002]: E1209 10:24:01.451020 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="863f7cf2-76b4-4780-8c7e-797bd1a52ea0" containerName="ceilometer-central-agent" Dec 09 10:24:01 crc kubenswrapper[5002]: I1209 10:24:01.451028 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="863f7cf2-76b4-4780-8c7e-797bd1a52ea0" containerName="ceilometer-central-agent" Dec 09 10:24:01 crc kubenswrapper[5002]: I1209 10:24:01.451239 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="863f7cf2-76b4-4780-8c7e-797bd1a52ea0" containerName="sg-core" Dec 09 10:24:01 crc kubenswrapper[5002]: I1209 10:24:01.451259 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="863f7cf2-76b4-4780-8c7e-797bd1a52ea0" containerName="proxy-httpd" Dec 09 10:24:01 crc kubenswrapper[5002]: I1209 10:24:01.451266 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="863f7cf2-76b4-4780-8c7e-797bd1a52ea0" containerName="ceilometer-central-agent" Dec 09 10:24:01 crc kubenswrapper[5002]: I1209 10:24:01.451285 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="863f7cf2-76b4-4780-8c7e-797bd1a52ea0" containerName="ceilometer-notification-agent" Dec 09 10:24:01 crc kubenswrapper[5002]: I1209 10:24:01.452981 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 09 10:24:01 crc kubenswrapper[5002]: I1209 10:24:01.455572 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 09 10:24:01 crc kubenswrapper[5002]: I1209 10:24:01.455730 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 09 10:24:01 crc kubenswrapper[5002]: I1209 10:24:01.455874 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 09 10:24:01 crc kubenswrapper[5002]: I1209 10:24:01.456850 5002 scope.go:117] "RemoveContainer" containerID="667fe254c514762feed8f00659b9972ab6e5bc6dad1e2e2312a29b3309eb39b0" Dec 09 10:24:01 crc kubenswrapper[5002]: I1209 10:24:01.458347 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 09 10:24:01 crc kubenswrapper[5002]: I1209 10:24:01.516387 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b2a4f32-d84d-4f4c-8085-8df23c1a1fda-scripts\") pod \"ceilometer-0\" (UID: \"1b2a4f32-d84d-4f4c-8085-8df23c1a1fda\") " pod="openstack/ceilometer-0" Dec 09 10:24:01 crc kubenswrapper[5002]: I1209 10:24:01.516458 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b2a4f32-d84d-4f4c-8085-8df23c1a1fda-config-data\") pod \"ceilometer-0\" (UID: \"1b2a4f32-d84d-4f4c-8085-8df23c1a1fda\") " pod="openstack/ceilometer-0" Dec 09 10:24:01 crc kubenswrapper[5002]: I1209 10:24:01.516555 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1b2a4f32-d84d-4f4c-8085-8df23c1a1fda-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1b2a4f32-d84d-4f4c-8085-8df23c1a1fda\") " pod="openstack/ceilometer-0" Dec 09 10:24:01 crc kubenswrapper[5002]: I1209 10:24:01.516619 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b2a4f32-d84d-4f4c-8085-8df23c1a1fda-run-httpd\") pod \"ceilometer-0\" (UID: \"1b2a4f32-d84d-4f4c-8085-8df23c1a1fda\") " pod="openstack/ceilometer-0" Dec 09 10:24:01 crc kubenswrapper[5002]: I1209 10:24:01.516854 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b2a4f32-d84d-4f4c-8085-8df23c1a1fda-log-httpd\") pod \"ceilometer-0\" (UID: \"1b2a4f32-d84d-4f4c-8085-8df23c1a1fda\") " pod="openstack/ceilometer-0" Dec 09 10:24:01 crc kubenswrapper[5002]: I1209 10:24:01.516899 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b2a4f32-d84d-4f4c-8085-8df23c1a1fda-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1b2a4f32-d84d-4f4c-8085-8df23c1a1fda\") " pod="openstack/ceilometer-0" Dec 09 10:24:01 crc kubenswrapper[5002]: I1209 10:24:01.516974 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b2a4f32-d84d-4f4c-8085-8df23c1a1fda-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1b2a4f32-d84d-4f4c-8085-8df23c1a1fda\") " pod="openstack/ceilometer-0" Dec 09 10:24:01 crc kubenswrapper[5002]: I1209 10:24:01.517038 5002 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7s5l\" (UniqueName: \"kubernetes.io/projected/1b2a4f32-d84d-4f4c-8085-8df23c1a1fda-kube-api-access-r7s5l\") pod \"ceilometer-0\" (UID: \"1b2a4f32-d84d-4f4c-8085-8df23c1a1fda\") " pod="openstack/ceilometer-0" Dec 09 10:24:01 crc kubenswrapper[5002]: I1209 10:24:01.522244 5002 scope.go:117] "RemoveContainer" containerID="29a3a70981ffc1c78ea042ff208a86bb4d0c41cf67ee7a5b6de71def9da9f55b" Dec 09 10:24:01 crc kubenswrapper[5002]: I1209 10:24:01.619048 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7s5l\" (UniqueName: \"kubernetes.io/projected/1b2a4f32-d84d-4f4c-8085-8df23c1a1fda-kube-api-access-r7s5l\") pod \"ceilometer-0\" (UID: \"1b2a4f32-d84d-4f4c-8085-8df23c1a1fda\") " pod="openstack/ceilometer-0" Dec 09 10:24:01 crc kubenswrapper[5002]: I1209 10:24:01.619454 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b2a4f32-d84d-4f4c-8085-8df23c1a1fda-scripts\") pod \"ceilometer-0\" (UID: \"1b2a4f32-d84d-4f4c-8085-8df23c1a1fda\") " pod="openstack/ceilometer-0" Dec 09 10:24:01 crc kubenswrapper[5002]: I1209 10:24:01.619499 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b2a4f32-d84d-4f4c-8085-8df23c1a1fda-config-data\") pod \"ceilometer-0\" (UID: \"1b2a4f32-d84d-4f4c-8085-8df23c1a1fda\") " pod="openstack/ceilometer-0" Dec 09 10:24:01 crc kubenswrapper[5002]: I1209 10:24:01.619540 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1b2a4f32-d84d-4f4c-8085-8df23c1a1fda-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1b2a4f32-d84d-4f4c-8085-8df23c1a1fda\") " pod="openstack/ceilometer-0" Dec 09 10:24:01 crc kubenswrapper[5002]: I1209 10:24:01.619565 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b2a4f32-d84d-4f4c-8085-8df23c1a1fda-run-httpd\") pod \"ceilometer-0\" (UID: \"1b2a4f32-d84d-4f4c-8085-8df23c1a1fda\") " pod="openstack/ceilometer-0" Dec 09 10:24:01 crc kubenswrapper[5002]: I1209 10:24:01.619731 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b2a4f32-d84d-4f4c-8085-8df23c1a1fda-log-httpd\") pod \"ceilometer-0\" (UID: \"1b2a4f32-d84d-4f4c-8085-8df23c1a1fda\") " pod="openstack/ceilometer-0" Dec 09 10:24:01 crc kubenswrapper[5002]: I1209 10:24:01.619764 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b2a4f32-d84d-4f4c-8085-8df23c1a1fda-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1b2a4f32-d84d-4f4c-8085-8df23c1a1fda\") " pod="openstack/ceilometer-0" Dec 09 10:24:01 crc kubenswrapper[5002]: I1209 10:24:01.619857 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b2a4f32-d84d-4f4c-8085-8df23c1a1fda-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1b2a4f32-d84d-4f4c-8085-8df23c1a1fda\") " pod="openstack/ceilometer-0" Dec 09 10:24:01 crc kubenswrapper[5002]: I1209 10:24:01.620280 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/1b2a4f32-d84d-4f4c-8085-8df23c1a1fda-run-httpd\") pod \"ceilometer-0\" (UID: \"1b2a4f32-d84d-4f4c-8085-8df23c1a1fda\") " pod="openstack/ceilometer-0" Dec 09 10:24:01 crc kubenswrapper[5002]: I1209 10:24:01.620299 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b2a4f32-d84d-4f4c-8085-8df23c1a1fda-log-httpd\") pod \"ceilometer-0\" (UID: \"1b2a4f32-d84d-4f4c-8085-8df23c1a1fda\") " pod="openstack/ceilometer-0" Dec 09 10:24:01 crc kubenswrapper[5002]: I1209 10:24:01.625445 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1b2a4f32-d84d-4f4c-8085-8df23c1a1fda-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1b2a4f32-d84d-4f4c-8085-8df23c1a1fda\") " pod="openstack/ceilometer-0" Dec 09 10:24:01 crc kubenswrapper[5002]: I1209 10:24:01.625496 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b2a4f32-d84d-4f4c-8085-8df23c1a1fda-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1b2a4f32-d84d-4f4c-8085-8df23c1a1fda\") " pod="openstack/ceilometer-0" Dec 09 10:24:01 crc kubenswrapper[5002]: I1209 10:24:01.625607 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b2a4f32-d84d-4f4c-8085-8df23c1a1fda-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1b2a4f32-d84d-4f4c-8085-8df23c1a1fda\") " pod="openstack/ceilometer-0" Dec 09 10:24:01 crc kubenswrapper[5002]: I1209 10:24:01.626010 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b2a4f32-d84d-4f4c-8085-8df23c1a1fda-scripts\") pod \"ceilometer-0\" (UID: \"1b2a4f32-d84d-4f4c-8085-8df23c1a1fda\") " pod="openstack/ceilometer-0" Dec 09 10:24:01 crc kubenswrapper[5002]: I1209 10:24:01.636365 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7s5l\" (UniqueName: \"kubernetes.io/projected/1b2a4f32-d84d-4f4c-8085-8df23c1a1fda-kube-api-access-r7s5l\") pod \"ceilometer-0\" (UID: \"1b2a4f32-d84d-4f4c-8085-8df23c1a1fda\") " pod="openstack/ceilometer-0" Dec 09 10:24:01 crc kubenswrapper[5002]: I1209 10:24:01.645637 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b2a4f32-d84d-4f4c-8085-8df23c1a1fda-config-data\") pod \"ceilometer-0\" (UID: \"1b2a4f32-d84d-4f4c-8085-8df23c1a1fda\") " pod="openstack/ceilometer-0" Dec 09 10:24:01 crc kubenswrapper[5002]: I1209 10:24:01.816922 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 09 10:24:02 crc kubenswrapper[5002]: I1209 10:24:02.073087 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="863f7cf2-76b4-4780-8c7e-797bd1a52ea0" path="/var/lib/kubelet/pods/863f7cf2-76b4-4780-8c7e-797bd1a52ea0/volumes" Dec 09 10:24:02 crc kubenswrapper[5002]: I1209 10:24:02.297089 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ltfqf"] Dec 09 10:24:02 crc kubenswrapper[5002]: I1209 10:24:02.298951 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ltfqf" Dec 09 10:24:02 crc kubenswrapper[5002]: I1209 10:24:02.334639 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e661e6fc-55a2-4371-80a2-7d403ca48469-utilities\") pod \"redhat-operators-ltfqf\" (UID: \"e661e6fc-55a2-4371-80a2-7d403ca48469\") " pod="openshift-marketplace/redhat-operators-ltfqf" Dec 09 10:24:02 crc kubenswrapper[5002]: I1209 10:24:02.334674 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e661e6fc-55a2-4371-80a2-7d403ca48469-catalog-content\") pod \"redhat-operators-ltfqf\" (UID: \"e661e6fc-55a2-4371-80a2-7d403ca48469\") " pod="openshift-marketplace/redhat-operators-ltfqf" Dec 09 10:24:02 crc kubenswrapper[5002]: I1209 10:24:02.334755 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgj78\" (UniqueName: \"kubernetes.io/projected/e661e6fc-55a2-4371-80a2-7d403ca48469-kube-api-access-tgj78\") pod \"redhat-operators-ltfqf\" (UID: \"e661e6fc-55a2-4371-80a2-7d403ca48469\") " pod="openshift-marketplace/redhat-operators-ltfqf" Dec 09 10:24:02 crc kubenswrapper[5002]: I1209 10:24:02.336160 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ltfqf"] Dec 09 10:24:02 crc kubenswrapper[5002]: I1209 10:24:02.384431 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 09 10:24:02 crc kubenswrapper[5002]: I1209 10:24:02.394220 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-sblvw" event={"ID":"83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7","Type":"ContainerStarted","Data":"cec7c3fa420c540edced66346b2b76b161b34be979d648e08ae7cb019c22e6e1"} Dec 09 10:24:02 crc kubenswrapper[5002]: I1209 10:24:02.436993 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e661e6fc-55a2-4371-80a2-7d403ca48469-utilities\") pod \"redhat-operators-ltfqf\" (UID: \"e661e6fc-55a2-4371-80a2-7d403ca48469\") " pod="openshift-marketplace/redhat-operators-ltfqf" Dec 09 10:24:02 crc kubenswrapper[5002]: I1209 10:24:02.437044 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e661e6fc-55a2-4371-80a2-7d403ca48469-catalog-content\") pod \"redhat-operators-ltfqf\" (UID: \"e661e6fc-55a2-4371-80a2-7d403ca48469\") " pod="openshift-marketplace/redhat-operators-ltfqf" Dec 09 10:24:02 crc kubenswrapper[5002]: I1209 10:24:02.437165 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgj78\" (UniqueName: \"kubernetes.io/projected/e661e6fc-55a2-4371-80a2-7d403ca48469-kube-api-access-tgj78\") pod \"redhat-operators-ltfqf\" (UID: \"e661e6fc-55a2-4371-80a2-7d403ca48469\") " pod="openshift-marketplace/redhat-operators-ltfqf" Dec 09 10:24:02 crc kubenswrapper[5002]: I1209 10:24:02.438055 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e661e6fc-55a2-4371-80a2-7d403ca48469-utilities\") pod \"redhat-operators-ltfqf\" (UID: \"e661e6fc-55a2-4371-80a2-7d403ca48469\") " pod="openshift-marketplace/redhat-operators-ltfqf" Dec 09 10:24:02 crc kubenswrapper[5002]: I1209 10:24:02.438349 5002 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e661e6fc-55a2-4371-80a2-7d403ca48469-catalog-content\") pod \"redhat-operators-ltfqf\" (UID: \"e661e6fc-55a2-4371-80a2-7d403ca48469\") " pod="openshift-marketplace/redhat-operators-ltfqf" Dec 09 10:24:02 crc kubenswrapper[5002]: I1209 10:24:02.457395 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgj78\" (UniqueName: \"kubernetes.io/projected/e661e6fc-55a2-4371-80a2-7d403ca48469-kube-api-access-tgj78\") pod \"redhat-operators-ltfqf\" (UID: \"e661e6fc-55a2-4371-80a2-7d403ca48469\") " pod="openshift-marketplace/redhat-operators-ltfqf" Dec 09 10:24:02 crc kubenswrapper[5002]: I1209 10:24:02.617966 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ltfqf" Dec 09 10:24:03 crc kubenswrapper[5002]: I1209 10:24:03.129614 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-sblvw" podStartSLOduration=3.129592154 podStartE2EDuration="3.129592154s" podCreationTimestamp="2025-12-09 10:24:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:24:02.418646402 +0000 UTC m=+1374.810697473" watchObservedRunningTime="2025-12-09 10:24:03.129592154 +0000 UTC m=+1375.521643235" Dec 09 10:24:03 crc kubenswrapper[5002]: I1209 10:24:03.136689 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ltfqf"] Dec 09 10:24:03 crc kubenswrapper[5002]: I1209 10:24:03.432358 5002 generic.go:334] "Generic (PLEG): container finished" podID="e661e6fc-55a2-4371-80a2-7d403ca48469" containerID="3050ad96585d359616c6b9a3489cb5b3592bc428a9ce1eeb21c9a74fc2dc3dfd" exitCode=0 Dec 09 10:24:03 crc kubenswrapper[5002]: I1209 10:24:03.432626 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ltfqf" event={"ID":"e661e6fc-55a2-4371-80a2-7d403ca48469","Type":"ContainerDied","Data":"3050ad96585d359616c6b9a3489cb5b3592bc428a9ce1eeb21c9a74fc2dc3dfd"} Dec 09 10:24:03 crc kubenswrapper[5002]: I1209 10:24:03.432670 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ltfqf" event={"ID":"e661e6fc-55a2-4371-80a2-7d403ca48469","Type":"ContainerStarted","Data":"863b52096323ce38596157a8f2c5d0fccd5a2782fea3d0678b951a6bbeaff570"} Dec 09 10:24:03 crc kubenswrapper[5002]: I1209 10:24:03.442230 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b2a4f32-d84d-4f4c-8085-8df23c1a1fda","Type":"ContainerStarted","Data":"76149de6ce66b8cedf4a6dfeeb296a676c4aad79e373b6768b5a2d2a97a4c0e7"} Dec 09 10:24:03 crc kubenswrapper[5002]: I1209 10:24:03.442277 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b2a4f32-d84d-4f4c-8085-8df23c1a1fda","Type":"ContainerStarted","Data":"b3aa33a9a4fc42b4224380fad05e9109c6d4c984a5497f6f554c60d94aefe67f"} Dec 09 10:24:04 crc kubenswrapper[5002]: I1209 10:24:04.451877 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b2a4f32-d84d-4f4c-8085-8df23c1a1fda","Type":"ContainerStarted","Data":"eeaf8951f3287f5784fa9f9c2771e4bbb41389c597a580bb67b5f503705904e8"} Dec 09 10:24:04 crc kubenswrapper[5002]: I1209 10:24:04.453901 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-ltfqf" event={"ID":"e661e6fc-55a2-4371-80a2-7d403ca48469","Type":"ContainerStarted","Data":"e62f8d1ee27c30ca6628ceb12392b4dfa27e36909a6a17de7b40d41d4999a001"} Dec 09 10:24:05 crc kubenswrapper[5002]: I1209 10:24:05.463642 5002 generic.go:334] "Generic (PLEG): container finished" podID="e661e6fc-55a2-4371-80a2-7d403ca48469" containerID="e62f8d1ee27c30ca6628ceb12392b4dfa27e36909a6a17de7b40d41d4999a001" exitCode=0 Dec 09 10:24:05 crc kubenswrapper[5002]: I1209 10:24:05.463749 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ltfqf" event={"ID":"e661e6fc-55a2-4371-80a2-7d403ca48469","Type":"ContainerDied","Data":"e62f8d1ee27c30ca6628ceb12392b4dfa27e36909a6a17de7b40d41d4999a001"} Dec 09 10:24:05 crc kubenswrapper[5002]: I1209 10:24:05.561688 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 09 10:24:05 crc kubenswrapper[5002]: I1209 10:24:05.562127 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 09 10:24:05 crc kubenswrapper[5002]: I1209 10:24:05.568033 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 09 10:24:06 crc kubenswrapper[5002]: I1209 10:24:06.501029 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 09 10:24:07 crc kubenswrapper[5002]: I1209 10:24:07.441204 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 09 10:24:08 crc kubenswrapper[5002]: I1209 10:24:08.553291 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 09 10:24:08 crc kubenswrapper[5002]: I1209 10:24:08.553716 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 09 10:24:08 crc kubenswrapper[5002]: I1209 10:24:08.554355 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 09 10:24:08 crc kubenswrapper[5002]: I1209 10:24:08.554523 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 09 10:24:08 crc kubenswrapper[5002]: I1209 10:24:08.558648 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 09 10:24:08 crc kubenswrapper[5002]: I1209 10:24:08.561202 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 09 10:24:08 crc kubenswrapper[5002]: I1209 10:24:08.763196 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-cbdk5"] Dec 09 10:24:08 crc kubenswrapper[5002]: I1209 10:24:08.766549 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-cbdk5" Dec 09 10:24:08 crc kubenswrapper[5002]: I1209 10:24:08.771563 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-cbdk5"] Dec 09 10:24:08 crc kubenswrapper[5002]: I1209 10:24:08.803337 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0a5beb3-4401-42b8-b8e3-4d2af995a4d0-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-cbdk5\" (UID: \"e0a5beb3-4401-42b8-b8e3-4d2af995a4d0\") " pod="openstack/dnsmasq-dns-89c5cd4d5-cbdk5" Dec 09 10:24:08 crc kubenswrapper[5002]: I1209 10:24:08.803474 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxtm6\" (UniqueName: \"kubernetes.io/projected/e0a5beb3-4401-42b8-b8e3-4d2af995a4d0-kube-api-access-gxtm6\") pod \"dnsmasq-dns-89c5cd4d5-cbdk5\" (UID: \"e0a5beb3-4401-42b8-b8e3-4d2af995a4d0\") " pod="openstack/dnsmasq-dns-89c5cd4d5-cbdk5" Dec 09 10:24:08 crc kubenswrapper[5002]: I1209 10:24:08.803540 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0a5beb3-4401-42b8-b8e3-4d2af995a4d0-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-cbdk5\" (UID: \"e0a5beb3-4401-42b8-b8e3-4d2af995a4d0\") " pod="openstack/dnsmasq-dns-89c5cd4d5-cbdk5" Dec 09 10:24:08 crc kubenswrapper[5002]: I1209 10:24:08.803585 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0a5beb3-4401-42b8-b8e3-4d2af995a4d0-config\") pod \"dnsmasq-dns-89c5cd4d5-cbdk5\" (UID: \"e0a5beb3-4401-42b8-b8e3-4d2af995a4d0\") " pod="openstack/dnsmasq-dns-89c5cd4d5-cbdk5" Dec 09 10:24:08 crc kubenswrapper[5002]: I1209 10:24:08.803621 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e0a5beb3-4401-42b8-b8e3-4d2af995a4d0-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-cbdk5\" (UID: \"e0a5beb3-4401-42b8-b8e3-4d2af995a4d0\") " pod="openstack/dnsmasq-dns-89c5cd4d5-cbdk5" Dec 09 10:24:08 crc kubenswrapper[5002]: I1209 10:24:08.803646 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e0a5beb3-4401-42b8-b8e3-4d2af995a4d0-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-cbdk5\" (UID: \"e0a5beb3-4401-42b8-b8e3-4d2af995a4d0\") " pod="openstack/dnsmasq-dns-89c5cd4d5-cbdk5" Dec 09 10:24:08 crc kubenswrapper[5002]: I1209 10:24:08.905452 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0a5beb3-4401-42b8-b8e3-4d2af995a4d0-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-cbdk5\" (UID: \"e0a5beb3-4401-42b8-b8e3-4d2af995a4d0\") " pod="openstack/dnsmasq-dns-89c5cd4d5-cbdk5" Dec 09 10:24:08 crc kubenswrapper[5002]: I1209 10:24:08.905514 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0a5beb3-4401-42b8-b8e3-4d2af995a4d0-config\") pod \"dnsmasq-dns-89c5cd4d5-cbdk5\" (UID: \"e0a5beb3-4401-42b8-b8e3-4d2af995a4d0\") " pod="openstack/dnsmasq-dns-89c5cd4d5-cbdk5" Dec 09 10:24:08 crc kubenswrapper[5002]: I1209 10:24:08.905547 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e0a5beb3-4401-42b8-b8e3-4d2af995a4d0-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-cbdk5\" (UID: \"e0a5beb3-4401-42b8-b8e3-4d2af995a4d0\") " pod="openstack/dnsmasq-dns-89c5cd4d5-cbdk5" Dec 09 10:24:08 crc kubenswrapper[5002]: I1209 10:24:08.905568 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e0a5beb3-4401-42b8-b8e3-4d2af995a4d0-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-cbdk5\" (UID: \"e0a5beb3-4401-42b8-b8e3-4d2af995a4d0\") " pod="openstack/dnsmasq-dns-89c5cd4d5-cbdk5" Dec 09 10:24:08 crc kubenswrapper[5002]: I1209 10:24:08.905619 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0a5beb3-4401-42b8-b8e3-4d2af995a4d0-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-cbdk5\" (UID: \"e0a5beb3-4401-42b8-b8e3-4d2af995a4d0\") " pod="openstack/dnsmasq-dns-89c5cd4d5-cbdk5" Dec 09 10:24:08 crc kubenswrapper[5002]: I1209 10:24:08.905683 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxtm6\" (UniqueName: \"kubernetes.io/projected/e0a5beb3-4401-42b8-b8e3-4d2af995a4d0-kube-api-access-gxtm6\") pod \"dnsmasq-dns-89c5cd4d5-cbdk5\" (UID: \"e0a5beb3-4401-42b8-b8e3-4d2af995a4d0\") " pod="openstack/dnsmasq-dns-89c5cd4d5-cbdk5" Dec 09 10:24:08 crc kubenswrapper[5002]: I1209 10:24:08.906380 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0a5beb3-4401-42b8-b8e3-4d2af995a4d0-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-cbdk5\" (UID: \"e0a5beb3-4401-42b8-b8e3-4d2af995a4d0\") " pod="openstack/dnsmasq-dns-89c5cd4d5-cbdk5" Dec 09 10:24:08 crc kubenswrapper[5002]: I1209 10:24:08.907684 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0a5beb3-4401-42b8-b8e3-4d2af995a4d0-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-cbdk5\" (UID: \"e0a5beb3-4401-42b8-b8e3-4d2af995a4d0\") " pod="openstack/dnsmasq-dns-89c5cd4d5-cbdk5" Dec 09 10:24:08 crc kubenswrapper[5002]: I1209 10:24:08.907976 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e0a5beb3-4401-42b8-b8e3-4d2af995a4d0-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-cbdk5\" (UID: \"e0a5beb3-4401-42b8-b8e3-4d2af995a4d0\") " pod="openstack/dnsmasq-dns-89c5cd4d5-cbdk5" Dec 09 10:24:08 crc kubenswrapper[5002]: I1209 10:24:08.909176 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0a5beb3-4401-42b8-b8e3-4d2af995a4d0-config\") pod \"dnsmasq-dns-89c5cd4d5-cbdk5\" (UID: \"e0a5beb3-4401-42b8-b8e3-4d2af995a4d0\") " pod="openstack/dnsmasq-dns-89c5cd4d5-cbdk5" Dec 09 10:24:08 crc kubenswrapper[5002]: I1209 10:24:08.909177 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e0a5beb3-4401-42b8-b8e3-4d2af995a4d0-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-cbdk5\" (UID: \"e0a5beb3-4401-42b8-b8e3-4d2af995a4d0\") " pod="openstack/dnsmasq-dns-89c5cd4d5-cbdk5" Dec 09 10:24:08 crc kubenswrapper[5002]: I1209 10:24:08.944890 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxtm6\" (UniqueName: 
\"kubernetes.io/projected/e0a5beb3-4401-42b8-b8e3-4d2af995a4d0-kube-api-access-gxtm6\") pod \"dnsmasq-dns-89c5cd4d5-cbdk5\" (UID: \"e0a5beb3-4401-42b8-b8e3-4d2af995a4d0\") " pod="openstack/dnsmasq-dns-89c5cd4d5-cbdk5" Dec 09 10:24:09 crc kubenswrapper[5002]: I1209 10:24:09.095344 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-cbdk5" Dec 09 10:24:09 crc kubenswrapper[5002]: I1209 10:24:09.544129 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b2a4f32-d84d-4f4c-8085-8df23c1a1fda","Type":"ContainerStarted","Data":"26497c5fbd77add1281dbc1c31b1c516391bb5c3c7fa69e4d5ed2284d381bb0e"} Dec 09 10:24:09 crc kubenswrapper[5002]: I1209 10:24:09.562331 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-cbdk5"] Dec 09 10:24:10 crc kubenswrapper[5002]: I1209 10:24:10.553020 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-cbdk5" event={"ID":"e0a5beb3-4401-42b8-b8e3-4d2af995a4d0","Type":"ContainerStarted","Data":"9d8bc8ae158180c360cb14e1855a6831c104753e3830b538656e18fbb138197d"} Dec 09 10:24:12 crc kubenswrapper[5002]: I1209 10:24:12.263172 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 09 10:24:12 crc kubenswrapper[5002]: I1209 10:24:12.263800 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4fd6b926-1857-4c25-87f2-9477dc815651" containerName="nova-api-log" containerID="cri-o://3b7279a5fe451090bf124375a3dc865310dc7e7141c18544dd6e31e5222517d0" gracePeriod=30 Dec 09 10:24:12 crc kubenswrapper[5002]: I1209 10:24:12.263899 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4fd6b926-1857-4c25-87f2-9477dc815651" containerName="nova-api-api" containerID="cri-o://a11adff3f141188c9874f24498bda40a3808357613274458ca3d6932b9c1c32e" gracePeriod=30 Dec 09 10:24:12 crc kubenswrapper[5002]: I1209 10:24:12.890001 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="f702a539-ec25-44d4-8629-97b3c5499b96" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.0.175:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 09 10:24:14 crc kubenswrapper[5002]: I1209 10:24:14.597077 5002 generic.go:334] "Generic (PLEG): container finished" podID="83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7" containerID="cec7c3fa420c540edced66346b2b76b161b34be979d648e08ae7cb019c22e6e1" exitCode=0 Dec 09 10:24:14 crc kubenswrapper[5002]: I1209 10:24:14.597184 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-sblvw" event={"ID":"83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7","Type":"ContainerDied","Data":"cec7c3fa420c540edced66346b2b76b161b34be979d648e08ae7cb019c22e6e1"} Dec 09 10:24:14 crc kubenswrapper[5002]: I1209 10:24:14.603739 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ltfqf" event={"ID":"e661e6fc-55a2-4371-80a2-7d403ca48469","Type":"ContainerStarted","Data":"50fce9de11d2b86e1039e2a27217dc0032e893204ec09a49666963bc089e42fb"} Dec 09 10:24:14 crc kubenswrapper[5002]: I1209 10:24:14.608133 5002 generic.go:334] "Generic (PLEG): container finished" podID="e0a5beb3-4401-42b8-b8e3-4d2af995a4d0" containerID="adb75aabb4e985441e87b5cb83073e97ca8eadf6b2f9d37e10d79579ee3295dd" exitCode=0 Dec 09 10:24:14 crc 
kubenswrapper[5002]: I1209 10:24:14.608208 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-cbdk5" event={"ID":"e0a5beb3-4401-42b8-b8e3-4d2af995a4d0","Type":"ContainerDied","Data":"adb75aabb4e985441e87b5cb83073e97ca8eadf6b2f9d37e10d79579ee3295dd"} Dec 09 10:24:14 crc kubenswrapper[5002]: I1209 10:24:14.617610 5002 generic.go:334] "Generic (PLEG): container finished" podID="4fd6b926-1857-4c25-87f2-9477dc815651" containerID="3b7279a5fe451090bf124375a3dc865310dc7e7141c18544dd6e31e5222517d0" exitCode=143 Dec 09 10:24:14 crc kubenswrapper[5002]: I1209 10:24:14.617963 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4fd6b926-1857-4c25-87f2-9477dc815651","Type":"ContainerDied","Data":"3b7279a5fe451090bf124375a3dc865310dc7e7141c18544dd6e31e5222517d0"} Dec 09 10:24:14 crc kubenswrapper[5002]: I1209 10:24:14.744162 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ltfqf" podStartSLOduration=1.922903751 podStartE2EDuration="12.744137783s" podCreationTimestamp="2025-12-09 10:24:02 +0000 UTC" firstStartedPulling="2025-12-09 10:24:03.435176542 +0000 UTC m=+1375.827227643" lastFinishedPulling="2025-12-09 10:24:14.256410594 +0000 UTC m=+1386.648461675" observedRunningTime="2025-12-09 10:24:14.658411031 +0000 UTC m=+1387.050462112" watchObservedRunningTime="2025-12-09 10:24:14.744137783 +0000 UTC m=+1387.136188864" Dec 09 10:24:15 crc kubenswrapper[5002]: I1209 10:24:15.633518 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-cbdk5" event={"ID":"e0a5beb3-4401-42b8-b8e3-4d2af995a4d0","Type":"ContainerStarted","Data":"3ddbe12bd810be7f1ea041b90c527d9d3fc61e0ea9bf9d295213946e6d8f92d8"} Dec 09 10:24:15 crc kubenswrapper[5002]: I1209 10:24:15.634108 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-89c5cd4d5-cbdk5" Dec 09 10:24:15 crc kubenswrapper[5002]: I1209 10:24:15.636688 5002 generic.go:334] "Generic (PLEG): container finished" podID="4fd6b926-1857-4c25-87f2-9477dc815651" containerID="a11adff3f141188c9874f24498bda40a3808357613274458ca3d6932b9c1c32e" exitCode=0 Dec 09 10:24:15 crc kubenswrapper[5002]: I1209 10:24:15.636917 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4fd6b926-1857-4c25-87f2-9477dc815651","Type":"ContainerDied","Data":"a11adff3f141188c9874f24498bda40a3808357613274458ca3d6932b9c1c32e"} Dec 09 10:24:15 crc kubenswrapper[5002]: I1209 10:24:15.672771 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-89c5cd4d5-cbdk5" podStartSLOduration=7.672751723 podStartE2EDuration="7.672751723s" podCreationTimestamp="2025-12-09 10:24:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:24:15.656486993 +0000 UTC m=+1388.048538074" watchObservedRunningTime="2025-12-09 10:24:15.672751723 +0000 UTC m=+1388.064802804" Dec 09 10:24:16 crc kubenswrapper[5002]: I1209 10:24:16.045239 5002 util.go:48] "No ready sandbox for pod can be found. 
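
The pod_startup_latency_tracker entry above for openshift-marketplace/redhat-operators-ltfqf can be checked by hand: podStartSLOduration is the end-to-end startup time minus the time spent pulling images, measured on the monotonic clock (the m=+... readings). A minimal sketch of that arithmetic, with every number copied from the entry:

package main

import "fmt"

func main() {
	// Numbers copied from the pod_startup_latency_tracker entry above.
	e2e := 12.744137783                     // podStartE2EDuration, seconds
	pull := 1386.648461675 - 1375.827227643 // lastFinishedPulling - firstStartedPulling (m=+... values)
	fmt.Printf("pull window = %.9fs, SLO duration = %.9fs\n", pull, e2e-pull)
	// Prints SLO duration = 1.922903751s (up to float rounding),
	// matching podStartSLOduration in the log.
}

The 10.82 s pull window explains the gap between the two durations: the SLO metric deliberately excludes image download time.
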
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-sblvw" Dec 09 10:24:16 crc kubenswrapper[5002]: I1209 10:24:16.142003 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7-config-data\") pod \"83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7\" (UID: \"83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7\") " Dec 09 10:24:16 crc kubenswrapper[5002]: I1209 10:24:16.142066 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7-scripts\") pod \"83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7\" (UID: \"83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7\") " Dec 09 10:24:16 crc kubenswrapper[5002]: I1209 10:24:16.142138 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7-combined-ca-bundle\") pod \"83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7\" (UID: \"83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7\") " Dec 09 10:24:16 crc kubenswrapper[5002]: I1209 10:24:16.142216 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcxgm\" (UniqueName: \"kubernetes.io/projected/83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7-kube-api-access-qcxgm\") pod \"83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7\" (UID: \"83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7\") " Dec 09 10:24:16 crc kubenswrapper[5002]: I1209 10:24:16.146762 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7-kube-api-access-qcxgm" (OuterVolumeSpecName: "kube-api-access-qcxgm") pod "83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7" (UID: "83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7"). InnerVolumeSpecName "kube-api-access-qcxgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:24:16 crc kubenswrapper[5002]: I1209 10:24:16.147081 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7-scripts" (OuterVolumeSpecName: "scripts") pod "83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7" (UID: "83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:24:16 crc kubenswrapper[5002]: I1209 10:24:16.172412 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7" (UID: "83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:24:16 crc kubenswrapper[5002]: I1209 10:24:16.176030 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7-config-data" (OuterVolumeSpecName: "config-data") pod "83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7" (UID: "83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:24:16 crc kubenswrapper[5002]: I1209 10:24:16.243664 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:24:16 crc kubenswrapper[5002]: I1209 10:24:16.243703 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:24:16 crc kubenswrapper[5002]: I1209 10:24:16.243713 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:24:16 crc kubenswrapper[5002]: I1209 10:24:16.243724 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcxgm\" (UniqueName: \"kubernetes.io/projected/83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7-kube-api-access-qcxgm\") on node \"crc\" DevicePath \"\"" Dec 09 10:24:16 crc kubenswrapper[5002]: I1209 10:24:16.461928 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 09 10:24:16 crc kubenswrapper[5002]: I1209 10:24:16.658254 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fd6b926-1857-4c25-87f2-9477dc815651-config-data\") pod \"4fd6b926-1857-4c25-87f2-9477dc815651\" (UID: \"4fd6b926-1857-4c25-87f2-9477dc815651\") " Dec 09 10:24:16 crc kubenswrapper[5002]: I1209 10:24:16.661249 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fd6b926-1857-4c25-87f2-9477dc815651-combined-ca-bundle\") pod \"4fd6b926-1857-4c25-87f2-9477dc815651\" (UID: \"4fd6b926-1857-4c25-87f2-9477dc815651\") " Dec 09 10:24:16 crc kubenswrapper[5002]: I1209 10:24:16.661288 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5f2r\" (UniqueName: \"kubernetes.io/projected/4fd6b926-1857-4c25-87f2-9477dc815651-kube-api-access-q5f2r\") pod \"4fd6b926-1857-4c25-87f2-9477dc815651\" (UID: \"4fd6b926-1857-4c25-87f2-9477dc815651\") " Dec 09 10:24:16 crc kubenswrapper[5002]: I1209 10:24:16.661326 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4fd6b926-1857-4c25-87f2-9477dc815651-logs\") pod \"4fd6b926-1857-4c25-87f2-9477dc815651\" (UID: \"4fd6b926-1857-4c25-87f2-9477dc815651\") " Dec 09 10:24:16 crc kubenswrapper[5002]: I1209 10:24:16.662385 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fd6b926-1857-4c25-87f2-9477dc815651-logs" (OuterVolumeSpecName: "logs") pod "4fd6b926-1857-4c25-87f2-9477dc815651" (UID: "4fd6b926-1857-4c25-87f2-9477dc815651"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:24:16 crc kubenswrapper[5002]: I1209 10:24:16.682286 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fd6b926-1857-4c25-87f2-9477dc815651-kube-api-access-q5f2r" (OuterVolumeSpecName: "kube-api-access-q5f2r") pod "4fd6b926-1857-4c25-87f2-9477dc815651" (UID: "4fd6b926-1857-4c25-87f2-9477dc815651"). InnerVolumeSpecName "kube-api-access-q5f2r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:24:16 crc kubenswrapper[5002]: I1209 10:24:16.694540 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b2a4f32-d84d-4f4c-8085-8df23c1a1fda","Type":"ContainerStarted","Data":"429f78f8bb9188800330f3a2b6f3d36e71b4c9578367e496ad00775874a81bc2"} Dec 09 10:24:16 crc kubenswrapper[5002]: I1209 10:24:16.696070 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 09 10:24:16 crc kubenswrapper[5002]: I1209 10:24:16.707370 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fd6b926-1857-4c25-87f2-9477dc815651-config-data" (OuterVolumeSpecName: "config-data") pod "4fd6b926-1857-4c25-87f2-9477dc815651" (UID: "4fd6b926-1857-4c25-87f2-9477dc815651"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:24:16 crc kubenswrapper[5002]: I1209 10:24:16.707648 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 09 10:24:16 crc kubenswrapper[5002]: I1209 10:24:16.708157 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4fd6b926-1857-4c25-87f2-9477dc815651","Type":"ContainerDied","Data":"4723f39dd909c80a7b45d3ee3ab6e46f1d61edf6f58d2b5829b89c5316f27a0b"} Dec 09 10:24:16 crc kubenswrapper[5002]: I1209 10:24:16.708194 5002 scope.go:117] "RemoveContainer" containerID="a11adff3f141188c9874f24498bda40a3808357613274458ca3d6932b9c1c32e" Dec 09 10:24:16 crc kubenswrapper[5002]: I1209 10:24:16.716531 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-sblvw" Dec 09 10:24:16 crc kubenswrapper[5002]: I1209 10:24:16.717648 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-sblvw" event={"ID":"83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7","Type":"ContainerDied","Data":"ff91901a3a57c88120246562d6b03d7d85cc6ec279559b59aef4adc9ca9d65f4"} Dec 09 10:24:16 crc kubenswrapper[5002]: I1209 10:24:16.717680 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff91901a3a57c88120246562d6b03d7d85cc6ec279559b59aef4adc9ca9d65f4" Dec 09 10:24:16 crc kubenswrapper[5002]: I1209 10:24:16.736348 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fd6b926-1857-4c25-87f2-9477dc815651-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4fd6b926-1857-4c25-87f2-9477dc815651" (UID: "4fd6b926-1857-4c25-87f2-9477dc815651"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:24:16 crc kubenswrapper[5002]: I1209 10:24:16.738072 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.580881413 podStartE2EDuration="15.738048901s" podCreationTimestamp="2025-12-09 10:24:01 +0000 UTC" firstStartedPulling="2025-12-09 10:24:02.418733365 +0000 UTC m=+1374.810784446" lastFinishedPulling="2025-12-09 10:24:15.575900853 +0000 UTC m=+1387.967951934" observedRunningTime="2025-12-09 10:24:16.714287137 +0000 UTC m=+1389.106338238" watchObservedRunningTime="2025-12-09 10:24:16.738048901 +0000 UTC m=+1389.130099982" Dec 09 10:24:16 crc kubenswrapper[5002]: I1209 10:24:16.763332 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fd6b926-1857-4c25-87f2-9477dc815651-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:24:16 crc kubenswrapper[5002]: I1209 10:24:16.763398 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fd6b926-1857-4c25-87f2-9477dc815651-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:24:16 crc kubenswrapper[5002]: I1209 10:24:16.763421 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5f2r\" (UniqueName: \"kubernetes.io/projected/4fd6b926-1857-4c25-87f2-9477dc815651-kube-api-access-q5f2r\") on node \"crc\" DevicePath \"\"" Dec 09 10:24:16 crc kubenswrapper[5002]: I1209 10:24:16.763442 5002 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4fd6b926-1857-4c25-87f2-9477dc815651-logs\") on node \"crc\" DevicePath \"\"" Dec 09 10:24:16 crc kubenswrapper[5002]: I1209 10:24:16.825572 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 10:24:16 crc kubenswrapper[5002]: I1209 10:24:16.825837 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="f12d2757-e55a-4fd6-a910-857f03cb8f64" containerName="nova-scheduler-scheduler" containerID="cri-o://e86f743bac5750837d76b9e1ac24b67b07615ac1c447105837da31f3f7fe02f1" gracePeriod=30 Dec 09 10:24:16 crc kubenswrapper[5002]: I1209 10:24:16.858675 5002 scope.go:117] "RemoveContainer" containerID="3b7279a5fe451090bf124375a3dc865310dc7e7141c18544dd6e31e5222517d0" Dec 09 10:24:16 crc kubenswrapper[5002]: I1209 10:24:16.886319 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 10:24:16 crc kubenswrapper[5002]: I1209 10:24:16.886901 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b" containerName="nova-metadata-metadata" containerID="cri-o://02343a85391413db414177ad41f87801d36aa19d3a0fc605a5eedfd1aa3dcf76" gracePeriod=30 Dec 09 10:24:16 crc kubenswrapper[5002]: I1209 10:24:16.886604 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b" containerName="nova-metadata-log" containerID="cri-o://de0c51c01e1bbeadedfa7668fc8d59bbd0722e47a08a6a6ee9a64bf53dcad53a" gracePeriod=30 Dec 09 10:24:17 crc kubenswrapper[5002]: I1209 10:24:17.057202 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 09 10:24:17 crc kubenswrapper[5002]: I1209 10:24:17.065327 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-api-0"] Dec 09 10:24:17 crc kubenswrapper[5002]: I1209 10:24:17.085525 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 09 10:24:17 crc kubenswrapper[5002]: E1209 10:24:17.085910 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7" containerName="nova-manage" Dec 09 10:24:17 crc kubenswrapper[5002]: I1209 10:24:17.085926 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7" containerName="nova-manage" Dec 09 10:24:17 crc kubenswrapper[5002]: E1209 10:24:17.085945 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fd6b926-1857-4c25-87f2-9477dc815651" containerName="nova-api-log" Dec 09 10:24:17 crc kubenswrapper[5002]: I1209 10:24:17.085951 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fd6b926-1857-4c25-87f2-9477dc815651" containerName="nova-api-log" Dec 09 10:24:17 crc kubenswrapper[5002]: E1209 10:24:17.085967 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fd6b926-1857-4c25-87f2-9477dc815651" containerName="nova-api-api" Dec 09 10:24:17 crc kubenswrapper[5002]: I1209 10:24:17.085974 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fd6b926-1857-4c25-87f2-9477dc815651" containerName="nova-api-api" Dec 09 10:24:17 crc kubenswrapper[5002]: I1209 10:24:17.086137 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7" containerName="nova-manage" Dec 09 10:24:17 crc kubenswrapper[5002]: I1209 10:24:17.086156 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fd6b926-1857-4c25-87f2-9477dc815651" containerName="nova-api-api" Dec 09 10:24:17 crc kubenswrapper[5002]: I1209 10:24:17.086175 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fd6b926-1857-4c25-87f2-9477dc815651" containerName="nova-api-log" Dec 09 10:24:17 crc kubenswrapper[5002]: I1209 10:24:17.087089 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 09 10:24:17 crc kubenswrapper[5002]: I1209 10:24:17.089521 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 09 10:24:17 crc kubenswrapper[5002]: I1209 10:24:17.090406 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 09 10:24:17 crc kubenswrapper[5002]: I1209 10:24:17.092054 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 09 10:24:17 crc kubenswrapper[5002]: I1209 10:24:17.104674 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 09 10:24:17 crc kubenswrapper[5002]: I1209 10:24:17.171379 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b5836b7-7b16-477f-9a20-f30032362374-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6b5836b7-7b16-477f-9a20-f30032362374\") " pod="openstack/nova-api-0" Dec 09 10:24:17 crc kubenswrapper[5002]: I1209 10:24:17.171843 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nsvr\" (UniqueName: \"kubernetes.io/projected/6b5836b7-7b16-477f-9a20-f30032362374-kube-api-access-5nsvr\") pod \"nova-api-0\" (UID: \"6b5836b7-7b16-477f-9a20-f30032362374\") " pod="openstack/nova-api-0" Dec 09 10:24:17 crc kubenswrapper[5002]: I1209 10:24:17.171950 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b5836b7-7b16-477f-9a20-f30032362374-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6b5836b7-7b16-477f-9a20-f30032362374\") " pod="openstack/nova-api-0" Dec 09 10:24:17 crc kubenswrapper[5002]: I1209 10:24:17.172111 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b5836b7-7b16-477f-9a20-f30032362374-config-data\") pod \"nova-api-0\" (UID: \"6b5836b7-7b16-477f-9a20-f30032362374\") " pod="openstack/nova-api-0" Dec 09 10:24:17 crc kubenswrapper[5002]: I1209 10:24:17.172332 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b5836b7-7b16-477f-9a20-f30032362374-logs\") pod \"nova-api-0\" (UID: \"6b5836b7-7b16-477f-9a20-f30032362374\") " pod="openstack/nova-api-0" Dec 09 10:24:17 crc kubenswrapper[5002]: I1209 10:24:17.172476 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b5836b7-7b16-477f-9a20-f30032362374-public-tls-certs\") pod \"nova-api-0\" (UID: \"6b5836b7-7b16-477f-9a20-f30032362374\") " pod="openstack/nova-api-0" Dec 09 10:24:17 crc kubenswrapper[5002]: I1209 10:24:17.274778 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b5836b7-7b16-477f-9a20-f30032362374-logs\") pod \"nova-api-0\" (UID: \"6b5836b7-7b16-477f-9a20-f30032362374\") " pod="openstack/nova-api-0" Dec 09 10:24:17 crc kubenswrapper[5002]: I1209 10:24:17.275128 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b5836b7-7b16-477f-9a20-f30032362374-public-tls-certs\") pod \"nova-api-0\" (UID: 
\"6b5836b7-7b16-477f-9a20-f30032362374\") " pod="openstack/nova-api-0" Dec 09 10:24:17 crc kubenswrapper[5002]: I1209 10:24:17.275214 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b5836b7-7b16-477f-9a20-f30032362374-logs\") pod \"nova-api-0\" (UID: \"6b5836b7-7b16-477f-9a20-f30032362374\") " pod="openstack/nova-api-0" Dec 09 10:24:17 crc kubenswrapper[5002]: I1209 10:24:17.275339 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b5836b7-7b16-477f-9a20-f30032362374-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6b5836b7-7b16-477f-9a20-f30032362374\") " pod="openstack/nova-api-0" Dec 09 10:24:17 crc kubenswrapper[5002]: I1209 10:24:17.275434 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nsvr\" (UniqueName: \"kubernetes.io/projected/6b5836b7-7b16-477f-9a20-f30032362374-kube-api-access-5nsvr\") pod \"nova-api-0\" (UID: \"6b5836b7-7b16-477f-9a20-f30032362374\") " pod="openstack/nova-api-0" Dec 09 10:24:17 crc kubenswrapper[5002]: I1209 10:24:17.275520 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b5836b7-7b16-477f-9a20-f30032362374-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6b5836b7-7b16-477f-9a20-f30032362374\") " pod="openstack/nova-api-0" Dec 09 10:24:17 crc kubenswrapper[5002]: I1209 10:24:17.275616 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b5836b7-7b16-477f-9a20-f30032362374-config-data\") pod \"nova-api-0\" (UID: \"6b5836b7-7b16-477f-9a20-f30032362374\") " pod="openstack/nova-api-0" Dec 09 10:24:17 crc kubenswrapper[5002]: I1209 10:24:17.278910 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b5836b7-7b16-477f-9a20-f30032362374-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6b5836b7-7b16-477f-9a20-f30032362374\") " pod="openstack/nova-api-0" Dec 09 10:24:17 crc kubenswrapper[5002]: I1209 10:24:17.279000 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b5836b7-7b16-477f-9a20-f30032362374-public-tls-certs\") pod \"nova-api-0\" (UID: \"6b5836b7-7b16-477f-9a20-f30032362374\") " pod="openstack/nova-api-0" Dec 09 10:24:17 crc kubenswrapper[5002]: I1209 10:24:17.280325 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b5836b7-7b16-477f-9a20-f30032362374-config-data\") pod \"nova-api-0\" (UID: \"6b5836b7-7b16-477f-9a20-f30032362374\") " pod="openstack/nova-api-0" Dec 09 10:24:17 crc kubenswrapper[5002]: I1209 10:24:17.280366 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b5836b7-7b16-477f-9a20-f30032362374-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6b5836b7-7b16-477f-9a20-f30032362374\") " pod="openstack/nova-api-0" Dec 09 10:24:17 crc kubenswrapper[5002]: I1209 10:24:17.298029 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nsvr\" (UniqueName: \"kubernetes.io/projected/6b5836b7-7b16-477f-9a20-f30032362374-kube-api-access-5nsvr\") pod \"nova-api-0\" (UID: \"6b5836b7-7b16-477f-9a20-f30032362374\") " 
pod="openstack/nova-api-0" Dec 09 10:24:17 crc kubenswrapper[5002]: I1209 10:24:17.407658 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 09 10:24:17 crc kubenswrapper[5002]: I1209 10:24:17.731693 5002 generic.go:334] "Generic (PLEG): container finished" podID="f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b" containerID="de0c51c01e1bbeadedfa7668fc8d59bbd0722e47a08a6a6ee9a64bf53dcad53a" exitCode=143 Dec 09 10:24:17 crc kubenswrapper[5002]: I1209 10:24:17.731753 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b","Type":"ContainerDied","Data":"de0c51c01e1bbeadedfa7668fc8d59bbd0722e47a08a6a6ee9a64bf53dcad53a"} Dec 09 10:24:18 crc kubenswrapper[5002]: I1209 10:24:18.052945 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 09 10:24:18 crc kubenswrapper[5002]: I1209 10:24:18.071260 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fd6b926-1857-4c25-87f2-9477dc815651" path="/var/lib/kubelet/pods/4fd6b926-1857-4c25-87f2-9477dc815651/volumes" Dec 09 10:24:18 crc kubenswrapper[5002]: I1209 10:24:18.332612 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 09 10:24:18 crc kubenswrapper[5002]: I1209 10:24:18.748205 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6b5836b7-7b16-477f-9a20-f30032362374","Type":"ContainerStarted","Data":"290faba9bb523cd08a95a906cba830cdb0fb097cbabfe914375d5a0fcdb253cd"} Dec 09 10:24:18 crc kubenswrapper[5002]: I1209 10:24:18.748527 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6b5836b7-7b16-477f-9a20-f30032362374","Type":"ContainerStarted","Data":"b4ae8614bcce2d361a6cf0745ddf8e2e52c50ea1762b1b442ae058f343acb92f"} Dec 09 10:24:19 crc kubenswrapper[5002]: I1209 10:24:19.097020 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-89c5cd4d5-cbdk5" Dec 09 10:24:19 crc kubenswrapper[5002]: I1209 10:24:19.186478 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-7dbbg"] Dec 09 10:24:19 crc kubenswrapper[5002]: I1209 10:24:19.186720 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757b4f8459-7dbbg" podUID="ea94dfeb-8659-48ad-9f5a-da8202588f0f" containerName="dnsmasq-dns" containerID="cri-o://31c0b2dcf0f9a9403dc1e061b2386dbfb708c1ba3620f0a47335b55cf6589902" gracePeriod=10 Dec 09 10:24:19 crc kubenswrapper[5002]: E1209 10:24:19.527957 5002 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea94dfeb_8659_48ad_9f5a_da8202588f0f.slice/crio-conmon-31c0b2dcf0f9a9403dc1e061b2386dbfb708c1ba3620f0a47335b55cf6589902.scope\": RecentStats: unable to find data in memory cache]" Dec 09 10:24:19 crc kubenswrapper[5002]: I1209 10:24:19.755305 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-7dbbg" Dec 09 10:24:19 crc kubenswrapper[5002]: I1209 10:24:19.761075 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6b5836b7-7b16-477f-9a20-f30032362374","Type":"ContainerStarted","Data":"aef926ebfeded32c3d77daf8bf94adfe524f72c37fbafa444980929d4131d304"} Dec 09 10:24:19 crc kubenswrapper[5002]: I1209 10:24:19.765617 5002 generic.go:334] "Generic (PLEG): container finished" podID="ea94dfeb-8659-48ad-9f5a-da8202588f0f" containerID="31c0b2dcf0f9a9403dc1e061b2386dbfb708c1ba3620f0a47335b55cf6589902" exitCode=0 Dec 09 10:24:19 crc kubenswrapper[5002]: I1209 10:24:19.765907 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1b2a4f32-d84d-4f4c-8085-8df23c1a1fda" containerName="ceilometer-central-agent" containerID="cri-o://76149de6ce66b8cedf4a6dfeeb296a676c4aad79e373b6768b5a2d2a97a4c0e7" gracePeriod=30 Dec 09 10:24:19 crc kubenswrapper[5002]: I1209 10:24:19.766036 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-7dbbg" event={"ID":"ea94dfeb-8659-48ad-9f5a-da8202588f0f","Type":"ContainerDied","Data":"31c0b2dcf0f9a9403dc1e061b2386dbfb708c1ba3620f0a47335b55cf6589902"} Dec 09 10:24:19 crc kubenswrapper[5002]: I1209 10:24:19.766110 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-7dbbg" event={"ID":"ea94dfeb-8659-48ad-9f5a-da8202588f0f","Type":"ContainerDied","Data":"805aa91bb48d32efc4b8adc7fc3487b23a26eb3df9bfdfdab943bd052a42f4e4"} Dec 09 10:24:19 crc kubenswrapper[5002]: I1209 10:24:19.766105 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1b2a4f32-d84d-4f4c-8085-8df23c1a1fda" containerName="sg-core" containerID="cri-o://26497c5fbd77add1281dbc1c31b1c516391bb5c3c7fa69e4d5ed2284d381bb0e" gracePeriod=30 Dec 09 10:24:19 crc kubenswrapper[5002]: I1209 10:24:19.766163 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1b2a4f32-d84d-4f4c-8085-8df23c1a1fda" containerName="ceilometer-notification-agent" containerID="cri-o://eeaf8951f3287f5784fa9f9c2771e4bbb41389c597a580bb67b5f503705904e8" gracePeriod=30 Dec 09 10:24:19 crc kubenswrapper[5002]: I1209 10:24:19.766168 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-7dbbg" Dec 09 10:24:19 crc kubenswrapper[5002]: I1209 10:24:19.766636 5002 scope.go:117] "RemoveContainer" containerID="31c0b2dcf0f9a9403dc1e061b2386dbfb708c1ba3620f0a47335b55cf6589902" Dec 09 10:24:19 crc kubenswrapper[5002]: I1209 10:24:19.766719 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1b2a4f32-d84d-4f4c-8085-8df23c1a1fda" containerName="proxy-httpd" containerID="cri-o://429f78f8bb9188800330f3a2b6f3d36e71b4c9578367e496ad00775874a81bc2" gracePeriod=30 Dec 09 10:24:19 crc kubenswrapper[5002]: I1209 10:24:19.834757 5002 scope.go:117] "RemoveContainer" containerID="f0293e5b5f7f5a7d5ec9c6b4dfce8e4f7f1b1faa0700b8fb01f00a08e3aa8546" Dec 09 10:24:19 crc kubenswrapper[5002]: I1209 10:24:19.838544 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea94dfeb-8659-48ad-9f5a-da8202588f0f-dns-svc\") pod \"ea94dfeb-8659-48ad-9f5a-da8202588f0f\" (UID: \"ea94dfeb-8659-48ad-9f5a-da8202588f0f\") " Dec 09 10:24:19 crc kubenswrapper[5002]: I1209 10:24:19.838709 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ea94dfeb-8659-48ad-9f5a-da8202588f0f-dns-swift-storage-0\") pod \"ea94dfeb-8659-48ad-9f5a-da8202588f0f\" (UID: \"ea94dfeb-8659-48ad-9f5a-da8202588f0f\") " Dec 09 10:24:19 crc kubenswrapper[5002]: I1209 10:24:19.838790 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ea94dfeb-8659-48ad-9f5a-da8202588f0f-ovsdbserver-sb\") pod \"ea94dfeb-8659-48ad-9f5a-da8202588f0f\" (UID: \"ea94dfeb-8659-48ad-9f5a-da8202588f0f\") " Dec 09 10:24:19 crc kubenswrapper[5002]: I1209 10:24:19.838904 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqhxf\" (UniqueName: \"kubernetes.io/projected/ea94dfeb-8659-48ad-9f5a-da8202588f0f-kube-api-access-fqhxf\") pod \"ea94dfeb-8659-48ad-9f5a-da8202588f0f\" (UID: \"ea94dfeb-8659-48ad-9f5a-da8202588f0f\") " Dec 09 10:24:19 crc kubenswrapper[5002]: I1209 10:24:19.838939 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea94dfeb-8659-48ad-9f5a-da8202588f0f-config\") pod \"ea94dfeb-8659-48ad-9f5a-da8202588f0f\" (UID: \"ea94dfeb-8659-48ad-9f5a-da8202588f0f\") " Dec 09 10:24:19 crc kubenswrapper[5002]: I1209 10:24:19.839024 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ea94dfeb-8659-48ad-9f5a-da8202588f0f-ovsdbserver-nb\") pod \"ea94dfeb-8659-48ad-9f5a-da8202588f0f\" (UID: \"ea94dfeb-8659-48ad-9f5a-da8202588f0f\") " Dec 09 10:24:19 crc kubenswrapper[5002]: I1209 10:24:19.853057 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea94dfeb-8659-48ad-9f5a-da8202588f0f-kube-api-access-fqhxf" (OuterVolumeSpecName: "kube-api-access-fqhxf") pod "ea94dfeb-8659-48ad-9f5a-da8202588f0f" (UID: "ea94dfeb-8659-48ad-9f5a-da8202588f0f"). InnerVolumeSpecName "kube-api-access-fqhxf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:24:19 crc kubenswrapper[5002]: I1209 10:24:19.871595 5002 scope.go:117] "RemoveContainer" containerID="31c0b2dcf0f9a9403dc1e061b2386dbfb708c1ba3620f0a47335b55cf6589902" Dec 09 10:24:19 crc kubenswrapper[5002]: E1209 10:24:19.877690 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31c0b2dcf0f9a9403dc1e061b2386dbfb708c1ba3620f0a47335b55cf6589902\": container with ID starting with 31c0b2dcf0f9a9403dc1e061b2386dbfb708c1ba3620f0a47335b55cf6589902 not found: ID does not exist" containerID="31c0b2dcf0f9a9403dc1e061b2386dbfb708c1ba3620f0a47335b55cf6589902" Dec 09 10:24:19 crc kubenswrapper[5002]: I1209 10:24:19.877772 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31c0b2dcf0f9a9403dc1e061b2386dbfb708c1ba3620f0a47335b55cf6589902"} err="failed to get container status \"31c0b2dcf0f9a9403dc1e061b2386dbfb708c1ba3620f0a47335b55cf6589902\": rpc error: code = NotFound desc = could not find container \"31c0b2dcf0f9a9403dc1e061b2386dbfb708c1ba3620f0a47335b55cf6589902\": container with ID starting with 31c0b2dcf0f9a9403dc1e061b2386dbfb708c1ba3620f0a47335b55cf6589902 not found: ID does not exist" Dec 09 10:24:19 crc kubenswrapper[5002]: I1209 10:24:19.877801 5002 scope.go:117] "RemoveContainer" containerID="f0293e5b5f7f5a7d5ec9c6b4dfce8e4f7f1b1faa0700b8fb01f00a08e3aa8546" Dec 09 10:24:19 crc kubenswrapper[5002]: E1209 10:24:19.879187 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0293e5b5f7f5a7d5ec9c6b4dfce8e4f7f1b1faa0700b8fb01f00a08e3aa8546\": container with ID starting with f0293e5b5f7f5a7d5ec9c6b4dfce8e4f7f1b1faa0700b8fb01f00a08e3aa8546 not found: ID does not exist" containerID="f0293e5b5f7f5a7d5ec9c6b4dfce8e4f7f1b1faa0700b8fb01f00a08e3aa8546" Dec 09 10:24:19 crc kubenswrapper[5002]: I1209 10:24:19.879213 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0293e5b5f7f5a7d5ec9c6b4dfce8e4f7f1b1faa0700b8fb01f00a08e3aa8546"} err="failed to get container status \"f0293e5b5f7f5a7d5ec9c6b4dfce8e4f7f1b1faa0700b8fb01f00a08e3aa8546\": rpc error: code = NotFound desc = could not find container \"f0293e5b5f7f5a7d5ec9c6b4dfce8e4f7f1b1faa0700b8fb01f00a08e3aa8546\": container with ID starting with f0293e5b5f7f5a7d5ec9c6b4dfce8e4f7f1b1faa0700b8fb01f00a08e3aa8546 not found: ID does not exist" Dec 09 10:24:19 crc kubenswrapper[5002]: I1209 10:24:19.933377 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea94dfeb-8659-48ad-9f5a-da8202588f0f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ea94dfeb-8659-48ad-9f5a-da8202588f0f" (UID: "ea94dfeb-8659-48ad-9f5a-da8202588f0f"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:24:19 crc kubenswrapper[5002]: I1209 10:24:19.942494 5002 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ea94dfeb-8659-48ad-9f5a-da8202588f0f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 09 10:24:19 crc kubenswrapper[5002]: I1209 10:24:19.942530 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqhxf\" (UniqueName: \"kubernetes.io/projected/ea94dfeb-8659-48ad-9f5a-da8202588f0f-kube-api-access-fqhxf\") on node \"crc\" DevicePath \"\"" Dec 09 10:24:19 crc kubenswrapper[5002]: I1209 10:24:19.952702 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea94dfeb-8659-48ad-9f5a-da8202588f0f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ea94dfeb-8659-48ad-9f5a-da8202588f0f" (UID: "ea94dfeb-8659-48ad-9f5a-da8202588f0f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:24:19 crc kubenswrapper[5002]: I1209 10:24:19.957546 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea94dfeb-8659-48ad-9f5a-da8202588f0f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ea94dfeb-8659-48ad-9f5a-da8202588f0f" (UID: "ea94dfeb-8659-48ad-9f5a-da8202588f0f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:24:19 crc kubenswrapper[5002]: I1209 10:24:19.965494 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea94dfeb-8659-48ad-9f5a-da8202588f0f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ea94dfeb-8659-48ad-9f5a-da8202588f0f" (UID: "ea94dfeb-8659-48ad-9f5a-da8202588f0f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:24:19 crc kubenswrapper[5002]: I1209 10:24:19.978273 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea94dfeb-8659-48ad-9f5a-da8202588f0f-config" (OuterVolumeSpecName: "config") pod "ea94dfeb-8659-48ad-9f5a-da8202588f0f" (UID: "ea94dfeb-8659-48ad-9f5a-da8202588f0f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:24:20 crc kubenswrapper[5002]: I1209 10:24:20.044715 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea94dfeb-8659-48ad-9f5a-da8202588f0f-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:24:20 crc kubenswrapper[5002]: I1209 10:24:20.044765 5002 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ea94dfeb-8659-48ad-9f5a-da8202588f0f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 10:24:20 crc kubenswrapper[5002]: I1209 10:24:20.044783 5002 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea94dfeb-8659-48ad-9f5a-da8202588f0f-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 10:24:20 crc kubenswrapper[5002]: I1209 10:24:20.044795 5002 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ea94dfeb-8659-48ad-9f5a-da8202588f0f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 09 10:24:20 crc kubenswrapper[5002]: I1209 10:24:20.159462 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.159440345 podStartE2EDuration="3.159440345s" podCreationTimestamp="2025-12-09 10:24:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:24:19.823475713 +0000 UTC m=+1392.215526794" watchObservedRunningTime="2025-12-09 10:24:20.159440345 +0000 UTC m=+1392.551491426" Dec 09 10:24:20 crc kubenswrapper[5002]: I1209 10:24:20.168339 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-7dbbg"] Dec 09 10:24:20 crc kubenswrapper[5002]: I1209 10:24:20.177564 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-7dbbg"] Dec 09 10:24:20 crc kubenswrapper[5002]: I1209 10:24:20.557068 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.188:8775/\": dial tcp 10.217.0.188:8775: connect: connection refused" Dec 09 10:24:20 crc kubenswrapper[5002]: I1209 10:24:20.557108 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.188:8775/\": dial tcp 10.217.0.188:8775: connect: connection refused" Dec 09 10:24:20 crc kubenswrapper[5002]: I1209 10:24:20.795497 5002 generic.go:334] "Generic (PLEG): container finished" podID="f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b" containerID="02343a85391413db414177ad41f87801d36aa19d3a0fc605a5eedfd1aa3dcf76" exitCode=0 Dec 09 10:24:20 crc kubenswrapper[5002]: I1209 10:24:20.795573 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b","Type":"ContainerDied","Data":"02343a85391413db414177ad41f87801d36aa19d3a0fc605a5eedfd1aa3dcf76"} Dec 09 10:24:20 crc kubenswrapper[5002]: I1209 10:24:20.809704 5002 generic.go:334] "Generic (PLEG): container finished" podID="1b2a4f32-d84d-4f4c-8085-8df23c1a1fda" containerID="429f78f8bb9188800330f3a2b6f3d36e71b4c9578367e496ad00775874a81bc2" exitCode=0 Dec 09 10:24:20 crc kubenswrapper[5002]: I1209 
10:24:20.809736 5002 generic.go:334] "Generic (PLEG): container finished" podID="1b2a4f32-d84d-4f4c-8085-8df23c1a1fda" containerID="26497c5fbd77add1281dbc1c31b1c516391bb5c3c7fa69e4d5ed2284d381bb0e" exitCode=2 Dec 09 10:24:20 crc kubenswrapper[5002]: I1209 10:24:20.809743 5002 generic.go:334] "Generic (PLEG): container finished" podID="1b2a4f32-d84d-4f4c-8085-8df23c1a1fda" containerID="eeaf8951f3287f5784fa9f9c2771e4bbb41389c597a580bb67b5f503705904e8" exitCode=0 Dec 09 10:24:20 crc kubenswrapper[5002]: I1209 10:24:20.809751 5002 generic.go:334] "Generic (PLEG): container finished" podID="1b2a4f32-d84d-4f4c-8085-8df23c1a1fda" containerID="76149de6ce66b8cedf4a6dfeeb296a676c4aad79e373b6768b5a2d2a97a4c0e7" exitCode=0 Dec 09 10:24:20 crc kubenswrapper[5002]: I1209 10:24:20.809829 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b2a4f32-d84d-4f4c-8085-8df23c1a1fda","Type":"ContainerDied","Data":"429f78f8bb9188800330f3a2b6f3d36e71b4c9578367e496ad00775874a81bc2"} Dec 09 10:24:20 crc kubenswrapper[5002]: I1209 10:24:20.809904 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b2a4f32-d84d-4f4c-8085-8df23c1a1fda","Type":"ContainerDied","Data":"26497c5fbd77add1281dbc1c31b1c516391bb5c3c7fa69e4d5ed2284d381bb0e"} Dec 09 10:24:20 crc kubenswrapper[5002]: I1209 10:24:20.809918 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b2a4f32-d84d-4f4c-8085-8df23c1a1fda","Type":"ContainerDied","Data":"eeaf8951f3287f5784fa9f9c2771e4bbb41389c597a580bb67b5f503705904e8"} Dec 09 10:24:20 crc kubenswrapper[5002]: I1209 10:24:20.809930 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b2a4f32-d84d-4f4c-8085-8df23c1a1fda","Type":"ContainerDied","Data":"76149de6ce66b8cedf4a6dfeeb296a676c4aad79e373b6768b5a2d2a97a4c0e7"} Dec 09 10:24:20 crc kubenswrapper[5002]: I1209 10:24:20.812318 5002 generic.go:334] "Generic (PLEG): container finished" podID="f12d2757-e55a-4fd6-a910-857f03cb8f64" containerID="e86f743bac5750837d76b9e1ac24b67b07615ac1c447105837da31f3f7fe02f1" exitCode=0 Dec 09 10:24:20 crc kubenswrapper[5002]: I1209 10:24:20.813075 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f12d2757-e55a-4fd6-a910-857f03cb8f64","Type":"ContainerDied","Data":"e86f743bac5750837d76b9e1ac24b67b07615ac1c447105837da31f3f7fe02f1"} Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.121239 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.268973 5002 util.go:48] "No ready sandbox for pod can be found. 
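
The exit codes in these PLEG events follow the usual POSIX convention (assumed here to apply to CRI-O's reported codes as well): 0 is a clean exit (nova-scheduler, the ceilometer agents), small positive values are the process's own error status (sg-core's exitCode=2), and values above 128 mean 128 plus the fatal signal number, so the earlier exitCode=143 for nova-api-log is 128 + 15, SIGTERM. A tiny decoder under that assumption:

package main

import "fmt"

// describe decodes a container exit code under the 128+signal convention.
func describe(code int) string {
	switch {
	case code == 0:
		return "clean exit"
	case code > 128:
		return fmt.Sprintf("killed by signal %d (128+%d)", code-128, code-128)
	default:
		return fmt.Sprintf("the process itself exited with status %d", code)
	}
}

func main() {
	for _, c := range []int{0, 2, 143} { // the three codes seen in this log
		fmt.Printf("exitCode=%d: %s\n", c, describe(c))
	}
}
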
Need to start a new one" pod="openstack/ceilometer-0" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.269696 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ws6zs\" (UniqueName: \"kubernetes.io/projected/f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b-kube-api-access-ws6zs\") pod \"f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b\" (UID: \"f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b\") " Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.269741 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b-nova-metadata-tls-certs\") pod \"f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b\" (UID: \"f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b\") " Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.269996 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b-config-data\") pod \"f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b\" (UID: \"f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b\") " Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.270026 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b-logs\") pod \"f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b\" (UID: \"f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b\") " Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.270045 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b-combined-ca-bundle\") pod \"f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b\" (UID: \"f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b\") " Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.278097 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.284466 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b-logs" (OuterVolumeSpecName: "logs") pod "f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b" (UID: "f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.322055 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b-kube-api-access-ws6zs" (OuterVolumeSpecName: "kube-api-access-ws6zs") pod "f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b" (UID: "f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b"). InnerVolumeSpecName "kube-api-access-ws6zs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.386326 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b2a4f32-d84d-4f4c-8085-8df23c1a1fda-combined-ca-bundle\") pod \"1b2a4f32-d84d-4f4c-8085-8df23c1a1fda\" (UID: \"1b2a4f32-d84d-4f4c-8085-8df23c1a1fda\") " Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.386379 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b2a4f32-d84d-4f4c-8085-8df23c1a1fda-ceilometer-tls-certs\") pod \"1b2a4f32-d84d-4f4c-8085-8df23c1a1fda\" (UID: \"1b2a4f32-d84d-4f4c-8085-8df23c1a1fda\") " Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.386406 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f12d2757-e55a-4fd6-a910-857f03cb8f64-combined-ca-bundle\") pod \"f12d2757-e55a-4fd6-a910-857f03cb8f64\" (UID: \"f12d2757-e55a-4fd6-a910-857f03cb8f64\") " Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.386429 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmld9\" (UniqueName: \"kubernetes.io/projected/f12d2757-e55a-4fd6-a910-857f03cb8f64-kube-api-access-fmld9\") pod \"f12d2757-e55a-4fd6-a910-857f03cb8f64\" (UID: \"f12d2757-e55a-4fd6-a910-857f03cb8f64\") " Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.386547 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b2a4f32-d84d-4f4c-8085-8df23c1a1fda-log-httpd\") pod \"1b2a4f32-d84d-4f4c-8085-8df23c1a1fda\" (UID: \"1b2a4f32-d84d-4f4c-8085-8df23c1a1fda\") " Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.386628 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1b2a4f32-d84d-4f4c-8085-8df23c1a1fda-sg-core-conf-yaml\") pod \"1b2a4f32-d84d-4f4c-8085-8df23c1a1fda\" (UID: \"1b2a4f32-d84d-4f4c-8085-8df23c1a1fda\") " Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.386643 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b2a4f32-d84d-4f4c-8085-8df23c1a1fda-run-httpd\") pod \"1b2a4f32-d84d-4f4c-8085-8df23c1a1fda\" (UID: \"1b2a4f32-d84d-4f4c-8085-8df23c1a1fda\") " Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.386694 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b2a4f32-d84d-4f4c-8085-8df23c1a1fda-config-data\") pod \"1b2a4f32-d84d-4f4c-8085-8df23c1a1fda\" (UID: \"1b2a4f32-d84d-4f4c-8085-8df23c1a1fda\") " Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.386720 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f12d2757-e55a-4fd6-a910-857f03cb8f64-config-data\") pod \"f12d2757-e55a-4fd6-a910-857f03cb8f64\" (UID: \"f12d2757-e55a-4fd6-a910-857f03cb8f64\") " Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.386735 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7s5l\" (UniqueName: \"kubernetes.io/projected/1b2a4f32-d84d-4f4c-8085-8df23c1a1fda-kube-api-access-r7s5l\") pod 
\"1b2a4f32-d84d-4f4c-8085-8df23c1a1fda\" (UID: \"1b2a4f32-d84d-4f4c-8085-8df23c1a1fda\") " Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.386772 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b2a4f32-d84d-4f4c-8085-8df23c1a1fda-scripts\") pod \"1b2a4f32-d84d-4f4c-8085-8df23c1a1fda\" (UID: \"1b2a4f32-d84d-4f4c-8085-8df23c1a1fda\") " Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.387131 5002 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b-logs\") on node \"crc\" DevicePath \"\"" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.387142 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ws6zs\" (UniqueName: \"kubernetes.io/projected/f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b-kube-api-access-ws6zs\") on node \"crc\" DevicePath \"\"" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.389983 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b2a4f32-d84d-4f4c-8085-8df23c1a1fda-scripts" (OuterVolumeSpecName: "scripts") pod "1b2a4f32-d84d-4f4c-8085-8df23c1a1fda" (UID: "1b2a4f32-d84d-4f4c-8085-8df23c1a1fda"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.393697 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b2a4f32-d84d-4f4c-8085-8df23c1a1fda-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1b2a4f32-d84d-4f4c-8085-8df23c1a1fda" (UID: "1b2a4f32-d84d-4f4c-8085-8df23c1a1fda"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.402162 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b" (UID: "f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.409056 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b2a4f32-d84d-4f4c-8085-8df23c1a1fda-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1b2a4f32-d84d-4f4c-8085-8df23c1a1fda" (UID: "1b2a4f32-d84d-4f4c-8085-8df23c1a1fda"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.409664 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b-config-data" (OuterVolumeSpecName: "config-data") pod "f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b" (UID: "f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.434028 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b2a4f32-d84d-4f4c-8085-8df23c1a1fda-kube-api-access-r7s5l" (OuterVolumeSpecName: "kube-api-access-r7s5l") pod "1b2a4f32-d84d-4f4c-8085-8df23c1a1fda" (UID: "1b2a4f32-d84d-4f4c-8085-8df23c1a1fda"). InnerVolumeSpecName "kube-api-access-r7s5l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.436472 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f12d2757-e55a-4fd6-a910-857f03cb8f64-kube-api-access-fmld9" (OuterVolumeSpecName: "kube-api-access-fmld9") pod "f12d2757-e55a-4fd6-a910-857f03cb8f64" (UID: "f12d2757-e55a-4fd6-a910-857f03cb8f64"). InnerVolumeSpecName "kube-api-access-fmld9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.466694 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f12d2757-e55a-4fd6-a910-857f03cb8f64-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f12d2757-e55a-4fd6-a910-857f03cb8f64" (UID: "f12d2757-e55a-4fd6-a910-857f03cb8f64"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.483047 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f12d2757-e55a-4fd6-a910-857f03cb8f64-config-data" (OuterVolumeSpecName: "config-data") pod "f12d2757-e55a-4fd6-a910-857f03cb8f64" (UID: "f12d2757-e55a-4fd6-a910-857f03cb8f64"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.488933 5002 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b2a4f32-d84d-4f4c-8085-8df23c1a1fda-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.488959 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f12d2757-e55a-4fd6-a910-857f03cb8f64-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.488968 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7s5l\" (UniqueName: \"kubernetes.io/projected/1b2a4f32-d84d-4f4c-8085-8df23c1a1fda-kube-api-access-r7s5l\") on node \"crc\" DevicePath \"\"" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.488980 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b2a4f32-d84d-4f4c-8085-8df23c1a1fda-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.488990 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f12d2757-e55a-4fd6-a910-857f03cb8f64-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.488999 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmld9\" (UniqueName: \"kubernetes.io/projected/f12d2757-e55a-4fd6-a910-857f03cb8f64-kube-api-access-fmld9\") on node \"crc\" DevicePath \"\"" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.489008 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.489016 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:24:21 crc 
Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.502351 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b" (UID: "f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.505037 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b2a4f32-d84d-4f4c-8085-8df23c1a1fda-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1b2a4f32-d84d-4f4c-8085-8df23c1a1fda" (UID: "1b2a4f32-d84d-4f4c-8085-8df23c1a1fda"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.507403 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b2a4f32-d84d-4f4c-8085-8df23c1a1fda-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "1b2a4f32-d84d-4f4c-8085-8df23c1a1fda" (UID: "1b2a4f32-d84d-4f4c-8085-8df23c1a1fda"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.573708 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b2a4f32-d84d-4f4c-8085-8df23c1a1fda-config-data" (OuterVolumeSpecName: "config-data") pod "1b2a4f32-d84d-4f4c-8085-8df23c1a1fda" (UID: "1b2a4f32-d84d-4f4c-8085-8df23c1a1fda"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.578068 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b2a4f32-d84d-4f4c-8085-8df23c1a1fda-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b2a4f32-d84d-4f4c-8085-8df23c1a1fda" (UID: "1b2a4f32-d84d-4f4c-8085-8df23c1a1fda"). InnerVolumeSpecName "combined-ca-bundle".
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.590293 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b2a4f32-d84d-4f4c-8085-8df23c1a1fda-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.590321 5002 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b2a4f32-d84d-4f4c-8085-8df23c1a1fda-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.590332 5002 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1b2a4f32-d84d-4f4c-8085-8df23c1a1fda-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.590341 5002 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.590349 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b2a4f32-d84d-4f4c-8085-8df23c1a1fda-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.822591 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b","Type":"ContainerDied","Data":"7dde305264ab5f31fca8ca2a9dc8095f30989349501c79d8a49ac78269d8cfa9"} Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.822653 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.823201 5002 scope.go:117] "RemoveContainer" containerID="02343a85391413db414177ad41f87801d36aa19d3a0fc605a5eedfd1aa3dcf76" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.826413 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b2a4f32-d84d-4f4c-8085-8df23c1a1fda","Type":"ContainerDied","Data":"b3aa33a9a4fc42b4224380fad05e9109c6d4c984a5497f6f554c60d94aefe67f"} Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.826443 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.831331 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f12d2757-e55a-4fd6-a910-857f03cb8f64","Type":"ContainerDied","Data":"d0144d5cf6b715a94b6a4f5a2430c03665c10aadc7c0a091fa9f192c2304f812"} Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.831432 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.861188 5002 scope.go:117] "RemoveContainer" containerID="de0c51c01e1bbeadedfa7668fc8d59bbd0722e47a08a6a6ee9a64bf53dcad53a" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.871557 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.895372 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.899191 5002 scope.go:117] "RemoveContainer" containerID="429f78f8bb9188800330f3a2b6f3d36e71b4c9578367e496ad00775874a81bc2" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.909954 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.933002 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.959502 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 09 10:24:21 crc kubenswrapper[5002]: E1209 10:24:21.961014 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea94dfeb-8659-48ad-9f5a-da8202588f0f" containerName="init" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.961040 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea94dfeb-8659-48ad-9f5a-da8202588f0f" containerName="init" Dec 09 10:24:21 crc kubenswrapper[5002]: E1209 10:24:21.961070 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b2a4f32-d84d-4f4c-8085-8df23c1a1fda" containerName="ceilometer-notification-agent" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.961078 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b2a4f32-d84d-4f4c-8085-8df23c1a1fda" containerName="ceilometer-notification-agent" Dec 09 10:24:21 crc kubenswrapper[5002]: E1209 10:24:21.961089 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b" containerName="nova-metadata-log" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.961100 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b" containerName="nova-metadata-log" Dec 09 10:24:21 crc kubenswrapper[5002]: E1209 10:24:21.961113 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea94dfeb-8659-48ad-9f5a-da8202588f0f" containerName="dnsmasq-dns" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.961121 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea94dfeb-8659-48ad-9f5a-da8202588f0f" containerName="dnsmasq-dns" Dec 09 10:24:21 crc kubenswrapper[5002]: E1209 10:24:21.961138 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f12d2757-e55a-4fd6-a910-857f03cb8f64" containerName="nova-scheduler-scheduler" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.961146 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="f12d2757-e55a-4fd6-a910-857f03cb8f64" containerName="nova-scheduler-scheduler" Dec 09 10:24:21 crc kubenswrapper[5002]: E1209 10:24:21.961161 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b2a4f32-d84d-4f4c-8085-8df23c1a1fda" containerName="proxy-httpd" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.961169 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b2a4f32-d84d-4f4c-8085-8df23c1a1fda" containerName="proxy-httpd" 
Dec 09 10:24:21 crc kubenswrapper[5002]: E1209 10:24:21.961189 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b2a4f32-d84d-4f4c-8085-8df23c1a1fda" containerName="sg-core" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.961197 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b2a4f32-d84d-4f4c-8085-8df23c1a1fda" containerName="sg-core" Dec 09 10:24:21 crc kubenswrapper[5002]: E1209 10:24:21.961209 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b" containerName="nova-metadata-metadata" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.961218 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b" containerName="nova-metadata-metadata" Dec 09 10:24:21 crc kubenswrapper[5002]: E1209 10:24:21.961236 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b2a4f32-d84d-4f4c-8085-8df23c1a1fda" containerName="ceilometer-central-agent" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.961244 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b2a4f32-d84d-4f4c-8085-8df23c1a1fda" containerName="ceilometer-central-agent" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.961480 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b2a4f32-d84d-4f4c-8085-8df23c1a1fda" containerName="sg-core" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.961518 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b2a4f32-d84d-4f4c-8085-8df23c1a1fda" containerName="ceilometer-notification-agent" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.961535 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b2a4f32-d84d-4f4c-8085-8df23c1a1fda" containerName="proxy-httpd" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.961551 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="f12d2757-e55a-4fd6-a910-857f03cb8f64" containerName="nova-scheduler-scheduler" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.961564 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b2a4f32-d84d-4f4c-8085-8df23c1a1fda" containerName="ceilometer-central-agent" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.961580 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b" containerName="nova-metadata-metadata" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.961595 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b" containerName="nova-metadata-log" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.961614 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea94dfeb-8659-48ad-9f5a-da8202588f0f" containerName="dnsmasq-dns" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.964385 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.968195 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.972099 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.972248 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.978751 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.979735 5002 scope.go:117] "RemoveContainer" containerID="26497c5fbd77add1281dbc1c31b1c516391bb5c3c7fa69e4d5ed2284d381bb0e" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.988143 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.989950 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.992880 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.996234 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 09 10:24:21 crc kubenswrapper[5002]: I1209 10:24:21.996842 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.005610 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5893e6fa-5b64-47e0-b8e1-f68baf27a65c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5893e6fa-5b64-47e0-b8e1-f68baf27a65c\") " pod="openstack/ceilometer-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.005692 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd8a7609-928f-4a68-9903-fa846e4baeda-logs\") pod \"nova-metadata-0\" (UID: \"cd8a7609-928f-4a68-9903-fa846e4baeda\") " pod="openstack/nova-metadata-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.005716 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5893e6fa-5b64-47e0-b8e1-f68baf27a65c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5893e6fa-5b64-47e0-b8e1-f68baf27a65c\") " pod="openstack/ceilometer-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.005742 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd8a7609-928f-4a68-9903-fa846e4baeda-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cd8a7609-928f-4a68-9903-fa846e4baeda\") " pod="openstack/nova-metadata-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.005869 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sgqj\" (UniqueName: \"kubernetes.io/projected/cd8a7609-928f-4a68-9903-fa846e4baeda-kube-api-access-6sgqj\") pod \"nova-metadata-0\" (UID: 
\"cd8a7609-928f-4a68-9903-fa846e4baeda\") " pod="openstack/nova-metadata-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.005922 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5893e6fa-5b64-47e0-b8e1-f68baf27a65c-config-data\") pod \"ceilometer-0\" (UID: \"5893e6fa-5b64-47e0-b8e1-f68baf27a65c\") " pod="openstack/ceilometer-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.005971 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd8a7609-928f-4a68-9903-fa846e4baeda-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cd8a7609-928f-4a68-9903-fa846e4baeda\") " pod="openstack/nova-metadata-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.006030 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd4lq\" (UniqueName: \"kubernetes.io/projected/5893e6fa-5b64-47e0-b8e1-f68baf27a65c-kube-api-access-cd4lq\") pod \"ceilometer-0\" (UID: \"5893e6fa-5b64-47e0-b8e1-f68baf27a65c\") " pod="openstack/ceilometer-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.006119 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd8a7609-928f-4a68-9903-fa846e4baeda-config-data\") pod \"nova-metadata-0\" (UID: \"cd8a7609-928f-4a68-9903-fa846e4baeda\") " pod="openstack/nova-metadata-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.006173 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5893e6fa-5b64-47e0-b8e1-f68baf27a65c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5893e6fa-5b64-47e0-b8e1-f68baf27a65c\") " pod="openstack/ceilometer-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.006258 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5893e6fa-5b64-47e0-b8e1-f68baf27a65c-log-httpd\") pod \"ceilometer-0\" (UID: \"5893e6fa-5b64-47e0-b8e1-f68baf27a65c\") " pod="openstack/ceilometer-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.006276 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5893e6fa-5b64-47e0-b8e1-f68baf27a65c-run-httpd\") pod \"ceilometer-0\" (UID: \"5893e6fa-5b64-47e0-b8e1-f68baf27a65c\") " pod="openstack/ceilometer-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.006301 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5893e6fa-5b64-47e0-b8e1-f68baf27a65c-scripts\") pod \"ceilometer-0\" (UID: \"5893e6fa-5b64-47e0-b8e1-f68baf27a65c\") " pod="openstack/ceilometer-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.011430 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.021462 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.026163 5002 scope.go:117] "RemoveContainer" 
containerID="eeaf8951f3287f5784fa9f9c2771e4bbb41389c597a580bb67b5f503705904e8" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.035226 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.037370 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.040222 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.045676 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.061257 5002 scope.go:117] "RemoveContainer" containerID="76149de6ce66b8cedf4a6dfeeb296a676c4aad79e373b6768b5a2d2a97a4c0e7" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.071564 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b2a4f32-d84d-4f4c-8085-8df23c1a1fda" path="/var/lib/kubelet/pods/1b2a4f32-d84d-4f4c-8085-8df23c1a1fda/volumes" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.072519 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea94dfeb-8659-48ad-9f5a-da8202588f0f" path="/var/lib/kubelet/pods/ea94dfeb-8659-48ad-9f5a-da8202588f0f/volumes" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.073151 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f12d2757-e55a-4fd6-a910-857f03cb8f64" path="/var/lib/kubelet/pods/f12d2757-e55a-4fd6-a910-857f03cb8f64/volumes" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.074277 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b" path="/var/lib/kubelet/pods/f25c15ad-ecf7-44b0-b68c-06a00ad0ab9b/volumes" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.084713 5002 scope.go:117] "RemoveContainer" containerID="e86f743bac5750837d76b9e1ac24b67b07615ac1c447105837da31f3f7fe02f1" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.107675 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd8a7609-928f-4a68-9903-fa846e4baeda-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cd8a7609-928f-4a68-9903-fa846e4baeda\") " pod="openstack/nova-metadata-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.107743 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cd4lq\" (UniqueName: \"kubernetes.io/projected/5893e6fa-5b64-47e0-b8e1-f68baf27a65c-kube-api-access-cd4lq\") pod \"ceilometer-0\" (UID: \"5893e6fa-5b64-47e0-b8e1-f68baf27a65c\") " pod="openstack/ceilometer-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.107794 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd8a7609-928f-4a68-9903-fa846e4baeda-config-data\") pod \"nova-metadata-0\" (UID: \"cd8a7609-928f-4a68-9903-fa846e4baeda\") " pod="openstack/nova-metadata-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.107860 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5893e6fa-5b64-47e0-b8e1-f68baf27a65c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5893e6fa-5b64-47e0-b8e1-f68baf27a65c\") " pod="openstack/ceilometer-0" Dec 
09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.107887 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43512a9c-be3a-4c0e-a178-82c5a065acf4-config-data\") pod \"nova-scheduler-0\" (UID: \"43512a9c-be3a-4c0e-a178-82c5a065acf4\") " pod="openstack/nova-scheduler-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.107916 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpcg9\" (UniqueName: \"kubernetes.io/projected/43512a9c-be3a-4c0e-a178-82c5a065acf4-kube-api-access-rpcg9\") pod \"nova-scheduler-0\" (UID: \"43512a9c-be3a-4c0e-a178-82c5a065acf4\") " pod="openstack/nova-scheduler-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.107950 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43512a9c-be3a-4c0e-a178-82c5a065acf4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"43512a9c-be3a-4c0e-a178-82c5a065acf4\") " pod="openstack/nova-scheduler-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.107993 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5893e6fa-5b64-47e0-b8e1-f68baf27a65c-log-httpd\") pod \"ceilometer-0\" (UID: \"5893e6fa-5b64-47e0-b8e1-f68baf27a65c\") " pod="openstack/ceilometer-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.108016 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5893e6fa-5b64-47e0-b8e1-f68baf27a65c-run-httpd\") pod \"ceilometer-0\" (UID: \"5893e6fa-5b64-47e0-b8e1-f68baf27a65c\") " pod="openstack/ceilometer-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.108039 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5893e6fa-5b64-47e0-b8e1-f68baf27a65c-scripts\") pod \"ceilometer-0\" (UID: \"5893e6fa-5b64-47e0-b8e1-f68baf27a65c\") " pod="openstack/ceilometer-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.108097 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5893e6fa-5b64-47e0-b8e1-f68baf27a65c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5893e6fa-5b64-47e0-b8e1-f68baf27a65c\") " pod="openstack/ceilometer-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.108165 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5893e6fa-5b64-47e0-b8e1-f68baf27a65c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5893e6fa-5b64-47e0-b8e1-f68baf27a65c\") " pod="openstack/ceilometer-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.108185 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd8a7609-928f-4a68-9903-fa846e4baeda-logs\") pod \"nova-metadata-0\" (UID: \"cd8a7609-928f-4a68-9903-fa846e4baeda\") " pod="openstack/nova-metadata-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.108222 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd8a7609-928f-4a68-9903-fa846e4baeda-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"cd8a7609-928f-4a68-9903-fa846e4baeda\") " pod="openstack/nova-metadata-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.108243 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sgqj\" (UniqueName: \"kubernetes.io/projected/cd8a7609-928f-4a68-9903-fa846e4baeda-kube-api-access-6sgqj\") pod \"nova-metadata-0\" (UID: \"cd8a7609-928f-4a68-9903-fa846e4baeda\") " pod="openstack/nova-metadata-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.108266 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5893e6fa-5b64-47e0-b8e1-f68baf27a65c-config-data\") pod \"ceilometer-0\" (UID: \"5893e6fa-5b64-47e0-b8e1-f68baf27a65c\") " pod="openstack/ceilometer-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.109413 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5893e6fa-5b64-47e0-b8e1-f68baf27a65c-run-httpd\") pod \"ceilometer-0\" (UID: \"5893e6fa-5b64-47e0-b8e1-f68baf27a65c\") " pod="openstack/ceilometer-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.109801 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd8a7609-928f-4a68-9903-fa846e4baeda-logs\") pod \"nova-metadata-0\" (UID: \"cd8a7609-928f-4a68-9903-fa846e4baeda\") " pod="openstack/nova-metadata-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.111888 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5893e6fa-5b64-47e0-b8e1-f68baf27a65c-log-httpd\") pod \"ceilometer-0\" (UID: \"5893e6fa-5b64-47e0-b8e1-f68baf27a65c\") " pod="openstack/ceilometer-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.113009 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd8a7609-928f-4a68-9903-fa846e4baeda-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cd8a7609-928f-4a68-9903-fa846e4baeda\") " pod="openstack/nova-metadata-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.113369 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5893e6fa-5b64-47e0-b8e1-f68baf27a65c-config-data\") pod \"ceilometer-0\" (UID: \"5893e6fa-5b64-47e0-b8e1-f68baf27a65c\") " pod="openstack/ceilometer-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.113881 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5893e6fa-5b64-47e0-b8e1-f68baf27a65c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5893e6fa-5b64-47e0-b8e1-f68baf27a65c\") " pod="openstack/ceilometer-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.113028 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5893e6fa-5b64-47e0-b8e1-f68baf27a65c-scripts\") pod \"ceilometer-0\" (UID: \"5893e6fa-5b64-47e0-b8e1-f68baf27a65c\") " pod="openstack/ceilometer-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.115436 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5893e6fa-5b64-47e0-b8e1-f68baf27a65c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5893e6fa-5b64-47e0-b8e1-f68baf27a65c\") " 
pod="openstack/ceilometer-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.116015 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd8a7609-928f-4a68-9903-fa846e4baeda-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cd8a7609-928f-4a68-9903-fa846e4baeda\") " pod="openstack/nova-metadata-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.116130 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5893e6fa-5b64-47e0-b8e1-f68baf27a65c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5893e6fa-5b64-47e0-b8e1-f68baf27a65c\") " pod="openstack/ceilometer-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.117645 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd8a7609-928f-4a68-9903-fa846e4baeda-config-data\") pod \"nova-metadata-0\" (UID: \"cd8a7609-928f-4a68-9903-fa846e4baeda\") " pod="openstack/nova-metadata-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.131398 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cd4lq\" (UniqueName: \"kubernetes.io/projected/5893e6fa-5b64-47e0-b8e1-f68baf27a65c-kube-api-access-cd4lq\") pod \"ceilometer-0\" (UID: \"5893e6fa-5b64-47e0-b8e1-f68baf27a65c\") " pod="openstack/ceilometer-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.135457 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sgqj\" (UniqueName: \"kubernetes.io/projected/cd8a7609-928f-4a68-9903-fa846e4baeda-kube-api-access-6sgqj\") pod \"nova-metadata-0\" (UID: \"cd8a7609-928f-4a68-9903-fa846e4baeda\") " pod="openstack/nova-metadata-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.210505 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43512a9c-be3a-4c0e-a178-82c5a065acf4-config-data\") pod \"nova-scheduler-0\" (UID: \"43512a9c-be3a-4c0e-a178-82c5a065acf4\") " pod="openstack/nova-scheduler-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.210567 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpcg9\" (UniqueName: \"kubernetes.io/projected/43512a9c-be3a-4c0e-a178-82c5a065acf4-kube-api-access-rpcg9\") pod \"nova-scheduler-0\" (UID: \"43512a9c-be3a-4c0e-a178-82c5a065acf4\") " pod="openstack/nova-scheduler-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.210602 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43512a9c-be3a-4c0e-a178-82c5a065acf4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"43512a9c-be3a-4c0e-a178-82c5a065acf4\") " pod="openstack/nova-scheduler-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.214290 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43512a9c-be3a-4c0e-a178-82c5a065acf4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"43512a9c-be3a-4c0e-a178-82c5a065acf4\") " pod="openstack/nova-scheduler-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.215480 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43512a9c-be3a-4c0e-a178-82c5a065acf4-config-data\") pod \"nova-scheduler-0\" 
(UID: \"43512a9c-be3a-4c0e-a178-82c5a065acf4\") " pod="openstack/nova-scheduler-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.229380 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpcg9\" (UniqueName: \"kubernetes.io/projected/43512a9c-be3a-4c0e-a178-82c5a065acf4-kube-api-access-rpcg9\") pod \"nova-scheduler-0\" (UID: \"43512a9c-be3a-4c0e-a178-82c5a065acf4\") " pod="openstack/nova-scheduler-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.294396 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.325941 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.357141 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.618164 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ltfqf" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.618478 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ltfqf" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.668174 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ltfqf" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.835014 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 10:24:22 crc kubenswrapper[5002]: W1209 10:24:22.847035 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd8a7609_928f_4a68_9903_fa846e4baeda.slice/crio-0d74668f0d505c6a8c3194b12cdbeebf089a0943b4250117059eacc60705a897 WatchSource:0}: Error finding container 0d74668f0d505c6a8c3194b12cdbeebf089a0943b4250117059eacc60705a897: Status 404 returned error can't find the container with id 0d74668f0d505c6a8c3194b12cdbeebf089a0943b4250117059eacc60705a897 Dec 09 10:24:22 crc kubenswrapper[5002]: W1209 10:24:22.855937 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5893e6fa_5b64_47e0_b8e1_f68baf27a65c.slice/crio-f8771d85afadccb5dd59c206f45b23474d80ca51d4c15a745a5ccc0e50b4d6c6 WatchSource:0}: Error finding container f8771d85afadccb5dd59c206f45b23474d80ca51d4c15a745a5ccc0e50b4d6c6: Status 404 returned error can't find the container with id f8771d85afadccb5dd59c206f45b23474d80ca51d4c15a745a5ccc0e50b4d6c6 Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.858900 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.907012 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ltfqf" Dec 09 10:24:22 crc kubenswrapper[5002]: I1209 10:24:22.973120 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ltfqf"] Dec 09 10:24:23 crc kubenswrapper[5002]: I1209 10:24:23.008273 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 10:24:23 crc kubenswrapper[5002]: W1209 10:24:23.011009 5002 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43512a9c_be3a_4c0e_a178_82c5a065acf4.slice/crio-0bb1d4253cd67a6308db800435b668431a4ad39102bd21c8bdb2de49d5a470ee WatchSource:0}: Error finding container 0bb1d4253cd67a6308db800435b668431a4ad39102bd21c8bdb2de49d5a470ee: Status 404 returned error can't find the container with id 0bb1d4253cd67a6308db800435b668431a4ad39102bd21c8bdb2de49d5a470ee
Dec 09 10:24:23 crc kubenswrapper[5002]: I1209 10:24:23.879017 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5893e6fa-5b64-47e0-b8e1-f68baf27a65c","Type":"ContainerStarted","Data":"7a033c871bfeb20ac9488a0b5376331a6b527d79d3198b2f244e621f3b53eb12"}
Dec 09 10:24:23 crc kubenswrapper[5002]: I1209 10:24:23.879775 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5893e6fa-5b64-47e0-b8e1-f68baf27a65c","Type":"ContainerStarted","Data":"f8771d85afadccb5dd59c206f45b23474d80ca51d4c15a745a5ccc0e50b4d6c6"}
Dec 09 10:24:23 crc kubenswrapper[5002]: I1209 10:24:23.881571 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"43512a9c-be3a-4c0e-a178-82c5a065acf4","Type":"ContainerStarted","Data":"36bf5a63f64b1da8bf0d3200a657077d8683342ea2307df72c904532c9648a0a"}
Dec 09 10:24:23 crc kubenswrapper[5002]: I1209 10:24:23.881616 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"43512a9c-be3a-4c0e-a178-82c5a065acf4","Type":"ContainerStarted","Data":"0bb1d4253cd67a6308db800435b668431a4ad39102bd21c8bdb2de49d5a470ee"}
Dec 09 10:24:23 crc kubenswrapper[5002]: I1209 10:24:23.885445 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cd8a7609-928f-4a68-9903-fa846e4baeda","Type":"ContainerStarted","Data":"177ca2f00057b9f494561460d93507dad5143107f0da44739f7456ccfec82780"}
Dec 09 10:24:23 crc kubenswrapper[5002]: I1209 10:24:23.885476 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cd8a7609-928f-4a68-9903-fa846e4baeda","Type":"ContainerStarted","Data":"47df4f8a2eceea5148332f48a2f1938fcdc34779680d4051e6a0db02acbf62a9"}
Dec 09 10:24:23 crc kubenswrapper[5002]: I1209 10:24:23.885486 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cd8a7609-928f-4a68-9903-fa846e4baeda","Type":"ContainerStarted","Data":"0d74668f0d505c6a8c3194b12cdbeebf089a0943b4250117059eacc60705a897"}
Dec 09 10:24:23 crc kubenswrapper[5002]: I1209 10:24:23.918721 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.918699661 podStartE2EDuration="2.918699661s" podCreationTimestamp="2025-12-09 10:24:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:24:23.901693832 +0000 UTC m=+1396.293744933" watchObservedRunningTime="2025-12-09 10:24:23.918699661 +0000 UTC m=+1396.310750762"
Dec 09 10:24:23 crc kubenswrapper[5002]: I1209 10:24:23.923500 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.923492455 podStartE2EDuration="2.923492455s" podCreationTimestamp="2025-12-09 10:24:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 10:24:23.919336597 +0000 UTC m=+1396.311387688" watchObservedRunningTime="2025-12-09 10:24:23.923492455 +0000 UTC m=+1396.315543546"
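Note: in the "Observed pod startup duration" records above, podStartSLOduration is reported in plain seconds and podStartE2EDuration as a Go duration string; they coincide here, which is consistent with the SLO figure excluding image-pull time (firstStartedPulling and lastFinishedPulling are the zero time, so nothing was pulled). A small illustrative check of those fields with stdlib parsing; the "m=+..." monotonic-clock suffixes are dropped, and this is not kubelet code:

// slo_fields.go: illustrative parsing (not kubelet source) of the
// startup-latency fields above, with the "m=+..." monotonic suffixes
// already trimmed from the timestamps.
package main

import (
	"fmt"
	"strconv"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

	slo, _ := strconv.ParseFloat("2.918699661", 64)  // podStartSLOduration, seconds
	e2e, _ := time.ParseDuration("2.918699661s")     // podStartE2EDuration
	fmt.Println("SLO == E2E:", slo == e2e.Seconds()) // true: zero pull time to exclude

	created, _ := time.Parse(layout, "2025-12-09 10:24:21 +0000 UTC")
	observed, _ := time.Parse(layout, "2025-12-09 10:24:23.901693832 +0000 UTC")
	fmt.Println("observed running after:", observed.Sub(created)) // ~2.9s
}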
Dec 09 10:24:24 crc kubenswrapper[5002]: I1209 10:24:24.908237 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5893e6fa-5b64-47e0-b8e1-f68baf27a65c","Type":"ContainerStarted","Data":"2d1c701bf68c79d11c50d424397a24223cbdbf5471946d3e7f2ac92b66b2c778"}
Dec 09 10:24:24 crc kubenswrapper[5002]: I1209 10:24:24.908577 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ltfqf" podUID="e661e6fc-55a2-4371-80a2-7d403ca48469" containerName="registry-server" containerID="cri-o://50fce9de11d2b86e1039e2a27217dc0032e893204ec09a49666963bc089e42fb" gracePeriod=2
Dec 09 10:24:25 crc kubenswrapper[5002]: I1209 10:24:25.517797 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ltfqf"
Dec 09 10:24:25 crc kubenswrapper[5002]: I1209 10:24:25.606319 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e661e6fc-55a2-4371-80a2-7d403ca48469-catalog-content\") pod \"e661e6fc-55a2-4371-80a2-7d403ca48469\" (UID: \"e661e6fc-55a2-4371-80a2-7d403ca48469\") "
Dec 09 10:24:25 crc kubenswrapper[5002]: I1209 10:24:25.606478 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e661e6fc-55a2-4371-80a2-7d403ca48469-utilities\") pod \"e661e6fc-55a2-4371-80a2-7d403ca48469\" (UID: \"e661e6fc-55a2-4371-80a2-7d403ca48469\") "
Dec 09 10:24:25 crc kubenswrapper[5002]: I1209 10:24:25.606525 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgj78\" (UniqueName: \"kubernetes.io/projected/e661e6fc-55a2-4371-80a2-7d403ca48469-kube-api-access-tgj78\") pod \"e661e6fc-55a2-4371-80a2-7d403ca48469\" (UID: \"e661e6fc-55a2-4371-80a2-7d403ca48469\") "
Dec 09 10:24:25 crc kubenswrapper[5002]: I1209 10:24:25.607941 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e661e6fc-55a2-4371-80a2-7d403ca48469-utilities" (OuterVolumeSpecName: "utilities") pod "e661e6fc-55a2-4371-80a2-7d403ca48469" (UID: "e661e6fc-55a2-4371-80a2-7d403ca48469"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 10:24:25 crc kubenswrapper[5002]: I1209 10:24:25.611748 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e661e6fc-55a2-4371-80a2-7d403ca48469-kube-api-access-tgj78" (OuterVolumeSpecName: "kube-api-access-tgj78") pod "e661e6fc-55a2-4371-80a2-7d403ca48469" (UID: "e661e6fc-55a2-4371-80a2-7d403ca48469"). InnerVolumeSpecName "kube-api-access-tgj78".
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:24:25 crc kubenswrapper[5002]: I1209 10:24:25.708670 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e661e6fc-55a2-4371-80a2-7d403ca48469-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 10:24:25 crc kubenswrapper[5002]: I1209 10:24:25.708708 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgj78\" (UniqueName: \"kubernetes.io/projected/e661e6fc-55a2-4371-80a2-7d403ca48469-kube-api-access-tgj78\") on node \"crc\" DevicePath \"\"" Dec 09 10:24:25 crc kubenswrapper[5002]: I1209 10:24:25.727632 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e661e6fc-55a2-4371-80a2-7d403ca48469-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e661e6fc-55a2-4371-80a2-7d403ca48469" (UID: "e661e6fc-55a2-4371-80a2-7d403ca48469"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:24:25 crc kubenswrapper[5002]: I1209 10:24:25.810552 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e661e6fc-55a2-4371-80a2-7d403ca48469-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 10:24:25 crc kubenswrapper[5002]: I1209 10:24:25.919576 5002 generic.go:334] "Generic (PLEG): container finished" podID="e661e6fc-55a2-4371-80a2-7d403ca48469" containerID="50fce9de11d2b86e1039e2a27217dc0032e893204ec09a49666963bc089e42fb" exitCode=0 Dec 09 10:24:25 crc kubenswrapper[5002]: I1209 10:24:25.919644 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ltfqf" Dec 09 10:24:25 crc kubenswrapper[5002]: I1209 10:24:25.919657 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ltfqf" event={"ID":"e661e6fc-55a2-4371-80a2-7d403ca48469","Type":"ContainerDied","Data":"50fce9de11d2b86e1039e2a27217dc0032e893204ec09a49666963bc089e42fb"} Dec 09 10:24:25 crc kubenswrapper[5002]: I1209 10:24:25.919694 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ltfqf" event={"ID":"e661e6fc-55a2-4371-80a2-7d403ca48469","Type":"ContainerDied","Data":"863b52096323ce38596157a8f2c5d0fccd5a2782fea3d0678b951a6bbeaff570"} Dec 09 10:24:25 crc kubenswrapper[5002]: I1209 10:24:25.919714 5002 scope.go:117] "RemoveContainer" containerID="50fce9de11d2b86e1039e2a27217dc0032e893204ec09a49666963bc089e42fb" Dec 09 10:24:25 crc kubenswrapper[5002]: I1209 10:24:25.925122 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5893e6fa-5b64-47e0-b8e1-f68baf27a65c","Type":"ContainerStarted","Data":"6b859d8bb2242febe0b904062dc96fd2d5ce25ae8ec8c45d40b3a0d97cab32ab"} Dec 09 10:24:25 crc kubenswrapper[5002]: I1209 10:24:25.953769 5002 scope.go:117] "RemoveContainer" containerID="e62f8d1ee27c30ca6628ceb12392b4dfa27e36909a6a17de7b40d41d4999a001" Dec 09 10:24:25 crc kubenswrapper[5002]: I1209 10:24:25.961648 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ltfqf"] Dec 09 10:24:25 crc kubenswrapper[5002]: I1209 10:24:25.974286 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ltfqf"] Dec 09 10:24:25 crc kubenswrapper[5002]: I1209 10:24:25.980166 5002 scope.go:117] "RemoveContainer" 
containerID="3050ad96585d359616c6b9a3489cb5b3592bc428a9ce1eeb21c9a74fc2dc3dfd" Dec 09 10:24:26 crc kubenswrapper[5002]: I1209 10:24:26.026215 5002 scope.go:117] "RemoveContainer" containerID="50fce9de11d2b86e1039e2a27217dc0032e893204ec09a49666963bc089e42fb" Dec 09 10:24:26 crc kubenswrapper[5002]: E1209 10:24:26.026708 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50fce9de11d2b86e1039e2a27217dc0032e893204ec09a49666963bc089e42fb\": container with ID starting with 50fce9de11d2b86e1039e2a27217dc0032e893204ec09a49666963bc089e42fb not found: ID does not exist" containerID="50fce9de11d2b86e1039e2a27217dc0032e893204ec09a49666963bc089e42fb" Dec 09 10:24:26 crc kubenswrapper[5002]: I1209 10:24:26.026749 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50fce9de11d2b86e1039e2a27217dc0032e893204ec09a49666963bc089e42fb"} err="failed to get container status \"50fce9de11d2b86e1039e2a27217dc0032e893204ec09a49666963bc089e42fb\": rpc error: code = NotFound desc = could not find container \"50fce9de11d2b86e1039e2a27217dc0032e893204ec09a49666963bc089e42fb\": container with ID starting with 50fce9de11d2b86e1039e2a27217dc0032e893204ec09a49666963bc089e42fb not found: ID does not exist" Dec 09 10:24:26 crc kubenswrapper[5002]: I1209 10:24:26.026778 5002 scope.go:117] "RemoveContainer" containerID="e62f8d1ee27c30ca6628ceb12392b4dfa27e36909a6a17de7b40d41d4999a001" Dec 09 10:24:26 crc kubenswrapper[5002]: E1209 10:24:26.027169 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e62f8d1ee27c30ca6628ceb12392b4dfa27e36909a6a17de7b40d41d4999a001\": container with ID starting with e62f8d1ee27c30ca6628ceb12392b4dfa27e36909a6a17de7b40d41d4999a001 not found: ID does not exist" containerID="e62f8d1ee27c30ca6628ceb12392b4dfa27e36909a6a17de7b40d41d4999a001" Dec 09 10:24:26 crc kubenswrapper[5002]: I1209 10:24:26.027221 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e62f8d1ee27c30ca6628ceb12392b4dfa27e36909a6a17de7b40d41d4999a001"} err="failed to get container status \"e62f8d1ee27c30ca6628ceb12392b4dfa27e36909a6a17de7b40d41d4999a001\": rpc error: code = NotFound desc = could not find container \"e62f8d1ee27c30ca6628ceb12392b4dfa27e36909a6a17de7b40d41d4999a001\": container with ID starting with e62f8d1ee27c30ca6628ceb12392b4dfa27e36909a6a17de7b40d41d4999a001 not found: ID does not exist" Dec 09 10:24:26 crc kubenswrapper[5002]: I1209 10:24:26.027256 5002 scope.go:117] "RemoveContainer" containerID="3050ad96585d359616c6b9a3489cb5b3592bc428a9ce1eeb21c9a74fc2dc3dfd" Dec 09 10:24:26 crc kubenswrapper[5002]: E1209 10:24:26.027581 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3050ad96585d359616c6b9a3489cb5b3592bc428a9ce1eeb21c9a74fc2dc3dfd\": container with ID starting with 3050ad96585d359616c6b9a3489cb5b3592bc428a9ce1eeb21c9a74fc2dc3dfd not found: ID does not exist" containerID="3050ad96585d359616c6b9a3489cb5b3592bc428a9ce1eeb21c9a74fc2dc3dfd" Dec 09 10:24:26 crc kubenswrapper[5002]: I1209 10:24:26.027617 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3050ad96585d359616c6b9a3489cb5b3592bc428a9ce1eeb21c9a74fc2dc3dfd"} err="failed to get container status \"3050ad96585d359616c6b9a3489cb5b3592bc428a9ce1eeb21c9a74fc2dc3dfd\": rpc error: code = 
Dec 09 10:24:26 crc kubenswrapper[5002]: I1209 10:24:26.082779 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e661e6fc-55a2-4371-80a2-7d403ca48469" path="/var/lib/kubelet/pods/e661e6fc-55a2-4371-80a2-7d403ca48469/volumes"
Dec 09 10:24:26 crc kubenswrapper[5002]: I1209 10:24:26.965413 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5893e6fa-5b64-47e0-b8e1-f68baf27a65c","Type":"ContainerStarted","Data":"7bf0075652dce88cf4c715938171cbd87da9f63720497ca4a0f0e4c414c5e29f"}
Dec 09 10:24:26 crc kubenswrapper[5002]: I1209 10:24:26.965652 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 09 10:24:26 crc kubenswrapper[5002]: I1209 10:24:26.997410 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.190204306 podStartE2EDuration="5.997390939s" podCreationTimestamp="2025-12-09 10:24:21 +0000 UTC" firstStartedPulling="2025-12-09 10:24:22.86838657 +0000 UTC m=+1395.260437651" lastFinishedPulling="2025-12-09 10:24:26.675573153 +0000 UTC m=+1399.067624284" observedRunningTime="2025-12-09 10:24:26.983567282 +0000 UTC m=+1399.375618373" watchObservedRunningTime="2025-12-09 10:24:26.997390939 +0000 UTC m=+1399.389442040"
Dec 09 10:24:27 crc kubenswrapper[5002]: I1209 10:24:27.334381 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Dec 09 10:24:27 crc kubenswrapper[5002]: I1209 10:24:27.334488 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Dec 09 10:24:27 crc kubenswrapper[5002]: I1209 10:24:27.357618 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Dec 09 10:24:27 crc kubenswrapper[5002]: I1209 10:24:27.408221 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Dec 09 10:24:27 crc kubenswrapper[5002]: I1209 10:24:27.408314 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Dec 09 10:24:28 crc kubenswrapper[5002]: I1209 10:24:28.420046 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6b5836b7-7b16-477f-9a20-f30032362374" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.197:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 09 10:24:28 crc kubenswrapper[5002]: I1209 10:24:28.420060 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6b5836b7-7b16-477f-9a20-f30032362374" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.197:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 09 10:24:32 crc kubenswrapper[5002]: I1209 10:24:32.334082 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Dec 09 10:24:32 crc kubenswrapper[5002]: I1209 10:24:32.334904 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Dec 09 10:24:32 crc kubenswrapper[5002]: I1209 10:24:32.358134 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 09 10:24:32 crc kubenswrapper[5002]: I1209 10:24:32.391584 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 09 10:24:33 crc kubenswrapper[5002]: I1209 10:24:33.059772 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 09 10:24:33 crc kubenswrapper[5002]: I1209 10:24:33.351114 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="cd8a7609-928f-4a68-9903-fa846e4baeda" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 09 10:24:33 crc kubenswrapper[5002]: I1209 10:24:33.351358 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="cd8a7609-928f-4a68-9903-fa846e4baeda" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 09 10:24:37 crc kubenswrapper[5002]: I1209 10:24:37.417415 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 09 10:24:37 crc kubenswrapper[5002]: I1209 10:24:37.419611 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 09 10:24:37 crc kubenswrapper[5002]: I1209 10:24:37.419685 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 09 10:24:37 crc kubenswrapper[5002]: I1209 10:24:37.428212 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 09 10:24:37 crc kubenswrapper[5002]: I1209 10:24:37.965225 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 10:24:37 crc kubenswrapper[5002]: I1209 10:24:37.965322 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 10:24:38 crc kubenswrapper[5002]: I1209 10:24:38.077491 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 09 10:24:38 crc kubenswrapper[5002]: I1209 10:24:38.086334 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 09 10:24:42 crc kubenswrapper[5002]: I1209 10:24:42.340734 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 09 10:24:42 crc kubenswrapper[5002]: I1209 10:24:42.341317 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 09 10:24:42 crc kubenswrapper[5002]: I1209 10:24:42.347498 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 09 10:24:42 crc kubenswrapper[5002]: I1209 10:24:42.348592 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-metadata-0" Dec 09 10:24:52 crc kubenswrapper[5002]: I1209 10:24:52.303879 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 09 10:25:07 crc kubenswrapper[5002]: I1209 10:25:07.965139 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 10:25:07 crc kubenswrapper[5002]: I1209 10:25:07.965806 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 10:25:15 crc kubenswrapper[5002]: I1209 10:25:15.467146 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 09 10:25:15 crc kubenswrapper[5002]: I1209 10:25:15.467914 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="c9e681d8-0720-4f5e-8893-ec4f1cf43edf" containerName="openstackclient" containerID="cri-o://bdb3e6450b2a8a07c44a8a4e234bdae3f558edb01f57c75bf896f882589094c5" gracePeriod=2 Dec 09 10:25:15 crc kubenswrapper[5002]: I1209 10:25:15.524133 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 09 10:25:15 crc kubenswrapper[5002]: I1209 10:25:15.788685 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 09 10:25:15 crc kubenswrapper[5002]: I1209 10:25:15.834700 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican73fb-account-delete-6zw8z"] Dec 09 10:25:15 crc kubenswrapper[5002]: E1209 10:25:15.835121 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e661e6fc-55a2-4371-80a2-7d403ca48469" containerName="registry-server" Dec 09 10:25:15 crc kubenswrapper[5002]: I1209 10:25:15.835138 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="e661e6fc-55a2-4371-80a2-7d403ca48469" containerName="registry-server" Dec 09 10:25:15 crc kubenswrapper[5002]: E1209 10:25:15.835152 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e661e6fc-55a2-4371-80a2-7d403ca48469" containerName="extract-content" Dec 09 10:25:15 crc kubenswrapper[5002]: I1209 10:25:15.835158 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="e661e6fc-55a2-4371-80a2-7d403ca48469" containerName="extract-content" Dec 09 10:25:15 crc kubenswrapper[5002]: E1209 10:25:15.835173 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9e681d8-0720-4f5e-8893-ec4f1cf43edf" containerName="openstackclient" Dec 09 10:25:15 crc kubenswrapper[5002]: I1209 10:25:15.835178 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9e681d8-0720-4f5e-8893-ec4f1cf43edf" containerName="openstackclient" Dec 09 10:25:15 crc kubenswrapper[5002]: E1209 10:25:15.835212 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e661e6fc-55a2-4371-80a2-7d403ca48469" containerName="extract-utilities" Dec 09 10:25:15 crc kubenswrapper[5002]: I1209 10:25:15.835218 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="e661e6fc-55a2-4371-80a2-7d403ca48469" containerName="extract-utilities" Dec 09 10:25:15 crc kubenswrapper[5002]: 
I1209 10:25:15.835379 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9e681d8-0720-4f5e-8893-ec4f1cf43edf" containerName="openstackclient" Dec 09 10:25:15 crc kubenswrapper[5002]: I1209 10:25:15.835399 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="e661e6fc-55a2-4371-80a2-7d403ca48469" containerName="registry-server" Dec 09 10:25:15 crc kubenswrapper[5002]: I1209 10:25:15.836057 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican73fb-account-delete-6zw8z" Dec 09 10:25:15 crc kubenswrapper[5002]: I1209 10:25:15.874695 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican73fb-account-delete-6zw8z"] Dec 09 10:25:15 crc kubenswrapper[5002]: E1209 10:25:15.937506 5002 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 09 10:25:15 crc kubenswrapper[5002]: E1209 10:25:15.937579 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9278e14e-2524-4e42-b870-f493ea02ede8-config-data podName:9278e14e-2524-4e42-b870-f493ea02ede8 nodeName:}" failed. No retries permitted until 2025-12-09 10:25:16.437555434 +0000 UTC m=+1448.829606515 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/9278e14e-2524-4e42-b870-f493ea02ede8-config-data") pod "rabbitmq-cell1-server-0" (UID: "9278e14e-2524-4e42-b870-f493ea02ede8") : configmap "rabbitmq-cell1-config-data" not found Dec 09 10:25:15 crc kubenswrapper[5002]: I1209 10:25:15.943951 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cindere0a9-account-delete-b5zfk"] Dec 09 10:25:15 crc kubenswrapper[5002]: I1209 10:25:15.946169 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cindere0a9-account-delete-b5zfk" Dec 09 10:25:15 crc kubenswrapper[5002]: E1209 10:25:15.988188 5002 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.132:58580->38.102.83.132:39941: write tcp 38.102.83.132:58580->38.102.83.132:39941: write: broken pipe Dec 09 10:25:15 crc kubenswrapper[5002]: I1209 10:25:15.989644 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cindere0a9-account-delete-b5zfk"] Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.037607 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe95257d-a02e-4f04-a543-a2db08231043-operator-scripts\") pod \"barbican73fb-account-delete-6zw8z\" (UID: \"fe95257d-a02e-4f04-a543-a2db08231043\") " pod="openstack/barbican73fb-account-delete-6zw8z" Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.037650 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4wkz\" (UniqueName: \"kubernetes.io/projected/fe95257d-a02e-4f04-a543-a2db08231043-kube-api-access-h4wkz\") pod \"barbican73fb-account-delete-6zw8z\" (UID: \"fe95257d-a02e-4f04-a543-a2db08231043\") " pod="openstack/barbican73fb-account-delete-6zw8z" Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.037710 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae6c00ce-3152-42ae-890f-bb76aac103c5-operator-scripts\") pod \"cindere0a9-account-delete-b5zfk\" (UID: \"ae6c00ce-3152-42ae-890f-bb76aac103c5\") " pod="openstack/cindere0a9-account-delete-b5zfk" Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.037755 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7ptd\" (UniqueName: \"kubernetes.io/projected/ae6c00ce-3152-42ae-890f-bb76aac103c5-kube-api-access-d7ptd\") pod \"cindere0a9-account-delete-b5zfk\" (UID: \"ae6c00ce-3152-42ae-890f-bb76aac103c5\") " pod="openstack/cindere0a9-account-delete-b5zfk" Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.053883 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.077318 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glancea775-account-delete-zmp4n"] Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.078429 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glancea775-account-delete-zmp4n" Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.095196 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glancea775-account-delete-zmp4n"] Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.140705 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe95257d-a02e-4f04-a543-a2db08231043-operator-scripts\") pod \"barbican73fb-account-delete-6zw8z\" (UID: \"fe95257d-a02e-4f04-a543-a2db08231043\") " pod="openstack/barbican73fb-account-delete-6zw8z" Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.140749 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4wkz\" (UniqueName: \"kubernetes.io/projected/fe95257d-a02e-4f04-a543-a2db08231043-kube-api-access-h4wkz\") pod \"barbican73fb-account-delete-6zw8z\" (UID: \"fe95257d-a02e-4f04-a543-a2db08231043\") " pod="openstack/barbican73fb-account-delete-6zw8z" Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.140824 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae6c00ce-3152-42ae-890f-bb76aac103c5-operator-scripts\") pod \"cindere0a9-account-delete-b5zfk\" (UID: \"ae6c00ce-3152-42ae-890f-bb76aac103c5\") " pod="openstack/cindere0a9-account-delete-b5zfk" Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.140875 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7ptd\" (UniqueName: \"kubernetes.io/projected/ae6c00ce-3152-42ae-890f-bb76aac103c5-kube-api-access-d7ptd\") pod \"cindere0a9-account-delete-b5zfk\" (UID: \"ae6c00ce-3152-42ae-890f-bb76aac103c5\") " pod="openstack/cindere0a9-account-delete-b5zfk" Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.141491 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe95257d-a02e-4f04-a543-a2db08231043-operator-scripts\") pod \"barbican73fb-account-delete-6zw8z\" (UID: \"fe95257d-a02e-4f04-a543-a2db08231043\") " pod="openstack/barbican73fb-account-delete-6zw8z" Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.142324 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae6c00ce-3152-42ae-890f-bb76aac103c5-operator-scripts\") pod \"cindere0a9-account-delete-b5zfk\" (UID: \"ae6c00ce-3152-42ae-890f-bb76aac103c5\") " pod="openstack/cindere0a9-account-delete-b5zfk" Dec 09 10:25:16 crc kubenswrapper[5002]: E1209 10:25:16.142499 5002 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 09 10:25:16 crc kubenswrapper[5002]: E1209 10:25:16.142535 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/58c08274-46ea-48be-a135-0c1174cd6135-config-data podName:58c08274-46ea-48be-a135-0c1174cd6135 nodeName:}" failed. No retries permitted until 2025-12-09 10:25:16.642521337 +0000 UTC m=+1449.034572418 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/58c08274-46ea-48be-a135-0c1174cd6135-config-data") pod "rabbitmq-server-0" (UID: "58c08274-46ea-48be-a135-0c1174cd6135") : configmap "rabbitmq-config-data" not found Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.171330 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4wkz\" (UniqueName: \"kubernetes.io/projected/fe95257d-a02e-4f04-a543-a2db08231043-kube-api-access-h4wkz\") pod \"barbican73fb-account-delete-6zw8z\" (UID: \"fe95257d-a02e-4f04-a543-a2db08231043\") " pod="openstack/barbican73fb-account-delete-6zw8z" Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.187949 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.188337 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="28cb84ad-b399-4fe4-9631-e481dfa75aed" containerName="openstack-network-exporter" containerID="cri-o://faa9bd08445de88d08d39878352fad3e34339b91a452f594ffc3a3e18843d443" gracePeriod=300 Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.205200 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7ptd\" (UniqueName: \"kubernetes.io/projected/ae6c00ce-3152-42ae-890f-bb76aac103c5-kube-api-access-d7ptd\") pod \"cindere0a9-account-delete-b5zfk\" (UID: \"ae6c00ce-3152-42ae-890f-bb76aac103c5\") " pod="openstack/cindere0a9-account-delete-b5zfk" Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.205880 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.206238 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="322c0304-1696-43fb-9225-a709e7e2ea89" containerName="openstack-network-exporter" containerID="cri-o://61bf667d89aa459332a0bf66073b7adb7457f78a04b9772762f9e64fcf4753ad" gracePeriod=300 Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.246731 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6b9775f-22d1-413b-8d2f-1dbe890b582c-operator-scripts\") pod \"glancea775-account-delete-zmp4n\" (UID: \"c6b9775f-22d1-413b-8d2f-1dbe890b582c\") " pod="openstack/glancea775-account-delete-zmp4n" Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.246767 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mp2kq\" (UniqueName: \"kubernetes.io/projected/c6b9775f-22d1-413b-8d2f-1dbe890b582c-kube-api-access-mp2kq\") pod \"glancea775-account-delete-zmp4n\" (UID: \"c6b9775f-22d1-413b-8d2f-1dbe890b582c\") " pod="openstack/glancea775-account-delete-zmp4n" Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.252455 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-t7vtz"] Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.275845 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-t7vtz"] Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.301096 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cindere0a9-account-delete-b5zfk" Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.314594 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement60c5-account-delete-729k9"] Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.315966 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement60c5-account-delete-729k9" Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.335455 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-4kmzk"] Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.357359 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6b9775f-22d1-413b-8d2f-1dbe890b582c-operator-scripts\") pod \"glancea775-account-delete-zmp4n\" (UID: \"c6b9775f-22d1-413b-8d2f-1dbe890b582c\") " pod="openstack/glancea775-account-delete-zmp4n" Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.357498 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mp2kq\" (UniqueName: \"kubernetes.io/projected/c6b9775f-22d1-413b-8d2f-1dbe890b582c-kube-api-access-mp2kq\") pod \"glancea775-account-delete-zmp4n\" (UID: \"c6b9775f-22d1-413b-8d2f-1dbe890b582c\") " pod="openstack/glancea775-account-delete-zmp4n" Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.358615 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6b9775f-22d1-413b-8d2f-1dbe890b582c-operator-scripts\") pod \"glancea775-account-delete-zmp4n\" (UID: \"c6b9775f-22d1-413b-8d2f-1dbe890b582c\") " pod="openstack/glancea775-account-delete-zmp4n" Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.371907 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement60c5-account-delete-729k9"] Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.375104 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="322c0304-1696-43fb-9225-a709e7e2ea89" containerName="ovsdbserver-nb" containerID="cri-o://48d09c2ebf2544131b6474f56670b0b8781e9f927fc4903fb00c40fed41a9050" gracePeriod=300 Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.382956 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="28cb84ad-b399-4fe4-9631-e481dfa75aed" containerName="ovsdbserver-sb" containerID="cri-o://2215494876baf67d40bfc6391dc6cc221f9e14b2fc38cc62efc7ad13c22f507b" gracePeriod=300 Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.411594 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mp2kq\" (UniqueName: \"kubernetes.io/projected/c6b9775f-22d1-413b-8d2f-1dbe890b582c-kube-api-access-mp2kq\") pod \"glancea775-account-delete-zmp4n\" (UID: \"c6b9775f-22d1-413b-8d2f-1dbe890b582c\") " pod="openstack/glancea775-account-delete-zmp4n" Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.411690 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-4kmzk"] Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.432264 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glancea775-account-delete-zmp4n" Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.456682 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican73fb-account-delete-6zw8z" Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.457946 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.458204 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="36fbd6d1-d87d-45a2-9bca-0f25f3daca0c" containerName="ovn-northd" containerID="cri-o://077beda74ae3e5c25e7ec8cca4e2084bfba25475c005f8a85d1fd2f854c613be" gracePeriod=30 Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.458625 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="36fbd6d1-d87d-45a2-9bca-0f25f3daca0c" containerName="openstack-network-exporter" containerID="cri-o://e7405eccb60d4c551738f72265103db45ab534fc28ebd77e569e7e80a729397d" gracePeriod=30 Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.459726 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a67e154b-1de7-4e2b-9b87-049ea273fa01-operator-scripts\") pod \"placement60c5-account-delete-729k9\" (UID: \"a67e154b-1de7-4e2b-9b87-049ea273fa01\") " pod="openstack/placement60c5-account-delete-729k9" Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.459937 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg4r8\" (UniqueName: \"kubernetes.io/projected/a67e154b-1de7-4e2b-9b87-049ea273fa01-kube-api-access-vg4r8\") pod \"placement60c5-account-delete-729k9\" (UID: \"a67e154b-1de7-4e2b-9b87-049ea273fa01\") " pod="openstack/placement60c5-account-delete-729k9" Dec 09 10:25:16 crc kubenswrapper[5002]: E1209 10:25:16.460033 5002 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 09 10:25:16 crc kubenswrapper[5002]: E1209 10:25:16.460072 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9278e14e-2524-4e42-b870-f493ea02ede8-config-data podName:9278e14e-2524-4e42-b870-f493ea02ede8 nodeName:}" failed. No retries permitted until 2025-12-09 10:25:17.460057647 +0000 UTC m=+1449.852108728 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/9278e14e-2524-4e42-b870-f493ea02ede8-config-data") pod "rabbitmq-cell1-server-0" (UID: "9278e14e-2524-4e42-b870-f493ea02ede8") : configmap "rabbitmq-cell1-config-data" not found Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.536758 5002 generic.go:334] "Generic (PLEG): container finished" podID="28cb84ad-b399-4fe4-9631-e481dfa75aed" containerID="faa9bd08445de88d08d39878352fad3e34339b91a452f594ffc3a3e18843d443" exitCode=2 Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.536798 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"28cb84ad-b399-4fe4-9631-e481dfa75aed","Type":"ContainerDied","Data":"faa9bd08445de88d08d39878352fad3e34339b91a452f594ffc3a3e18843d443"} Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.543811 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron1d25-account-delete-f87kn"] Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.545220 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron1d25-account-delete-f87kn" Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.556263 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_322c0304-1696-43fb-9225-a709e7e2ea89/ovsdbserver-nb/0.log" Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.556578 5002 generic.go:334] "Generic (PLEG): container finished" podID="322c0304-1696-43fb-9225-a709e7e2ea89" containerID="61bf667d89aa459332a0bf66073b7adb7457f78a04b9772762f9e64fcf4753ad" exitCode=2 Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.556681 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"322c0304-1696-43fb-9225-a709e7e2ea89","Type":"ContainerDied","Data":"61bf667d89aa459332a0bf66073b7adb7457f78a04b9772762f9e64fcf4753ad"} Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.563590 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vg4r8\" (UniqueName: \"kubernetes.io/projected/a67e154b-1de7-4e2b-9b87-049ea273fa01-kube-api-access-vg4r8\") pod \"placement60c5-account-delete-729k9\" (UID: \"a67e154b-1de7-4e2b-9b87-049ea273fa01\") " pod="openstack/placement60c5-account-delete-729k9" Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.563855 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a67e154b-1de7-4e2b-9b87-049ea273fa01-operator-scripts\") pod \"placement60c5-account-delete-729k9\" (UID: \"a67e154b-1de7-4e2b-9b87-049ea273fa01\") " pod="openstack/placement60c5-account-delete-729k9" Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.564767 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a67e154b-1de7-4e2b-9b87-049ea273fa01-operator-scripts\") pod \"placement60c5-account-delete-729k9\" (UID: \"a67e154b-1de7-4e2b-9b87-049ea273fa01\") " pod="openstack/placement60c5-account-delete-729k9" Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.573378 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron1d25-account-delete-f87kn"] Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.585181 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg4r8\" (UniqueName: \"kubernetes.io/projected/a67e154b-1de7-4e2b-9b87-049ea273fa01-kube-api-access-vg4r8\") pod \"placement60c5-account-delete-729k9\" (UID: \"a67e154b-1de7-4e2b-9b87-049ea273fa01\") " pod="openstack/placement60c5-account-delete-729k9" Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.603208 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novacell0539b-account-delete-t9blx"] Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.604772 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell0539b-account-delete-t9blx" Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.669094 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z82r\" (UniqueName: \"kubernetes.io/projected/c44aced5-6d19-429a-8917-cd4229341433-kube-api-access-4z82r\") pod \"neutron1d25-account-delete-f87kn\" (UID: \"c44aced5-6d19-429a-8917-cd4229341433\") " pod="openstack/neutron1d25-account-delete-f87kn" Dec 09 10:25:16 crc kubenswrapper[5002]: E1209 10:25:16.669149 5002 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 09 10:25:16 crc kubenswrapper[5002]: E1209 10:25:16.669206 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/58c08274-46ea-48be-a135-0c1174cd6135-config-data podName:58c08274-46ea-48be-a135-0c1174cd6135 nodeName:}" failed. No retries permitted until 2025-12-09 10:25:17.669189622 +0000 UTC m=+1450.061240703 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/58c08274-46ea-48be-a135-0c1174cd6135-config-data") pod "rabbitmq-server-0" (UID: "58c08274-46ea-48be-a135-0c1174cd6135") : configmap "rabbitmq-config-data" not found Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.669340 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c44aced5-6d19-429a-8917-cd4229341433-operator-scripts\") pod \"neutron1d25-account-delete-f87kn\" (UID: \"c44aced5-6d19-429a-8917-cd4229341433\") " pod="openstack/neutron1d25-account-delete-f87kn" Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.680370 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell0539b-account-delete-t9blx"] Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.736956 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-h2xt6"] Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.758881 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-h2xt6"] Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.759312 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement60c5-account-delete-729k9" Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.775559 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c44aced5-6d19-429a-8917-cd4229341433-operator-scripts\") pod \"neutron1d25-account-delete-f87kn\" (UID: \"c44aced5-6d19-429a-8917-cd4229341433\") " pod="openstack/neutron1d25-account-delete-f87kn" Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.776042 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4z82r\" (UniqueName: \"kubernetes.io/projected/c44aced5-6d19-429a-8917-cd4229341433-kube-api-access-4z82r\") pod \"neutron1d25-account-delete-f87kn\" (UID: \"c44aced5-6d19-429a-8917-cd4229341433\") " pod="openstack/neutron1d25-account-delete-f87kn" Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.776139 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58de676b-7b73-4c04-b5d5-5de38a88072c-operator-scripts\") pod \"novacell0539b-account-delete-t9blx\" (UID: \"58de676b-7b73-4c04-b5d5-5de38a88072c\") " pod="openstack/novacell0539b-account-delete-t9blx" Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.776180 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfspk\" (UniqueName: \"kubernetes.io/projected/58de676b-7b73-4c04-b5d5-5de38a88072c-kube-api-access-sfspk\") pod \"novacell0539b-account-delete-t9blx\" (UID: \"58de676b-7b73-4c04-b5d5-5de38a88072c\") " pod="openstack/novacell0539b-account-delete-t9blx" Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.776715 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c44aced5-6d19-429a-8917-cd4229341433-operator-scripts\") pod \"neutron1d25-account-delete-f87kn\" (UID: \"c44aced5-6d19-429a-8917-cd4229341433\") " pod="openstack/neutron1d25-account-delete-f87kn" Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.778064 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novaapi5809-account-delete-b9cvt"] Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.779610 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novaapi5809-account-delete-b9cvt" Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.800025 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-h7bpf"] Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.822549 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapi5809-account-delete-b9cvt"] Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.844274 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-h7bpf"] Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.852529 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z82r\" (UniqueName: \"kubernetes.io/projected/c44aced5-6d19-429a-8917-cd4229341433-kube-api-access-4z82r\") pod \"neutron1d25-account-delete-f87kn\" (UID: \"c44aced5-6d19-429a-8917-cd4229341433\") " pod="openstack/neutron1d25-account-delete-f87kn" Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.875090 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-cbdk5"] Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.875319 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-89c5cd4d5-cbdk5" podUID="e0a5beb3-4401-42b8-b8e3-4d2af995a4d0" containerName="dnsmasq-dns" containerID="cri-o://3ddbe12bd810be7f1ea041b90c527d9d3fc61e0ea9bf9d295213946e6d8f92d8" gracePeriod=10 Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.878805 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58de676b-7b73-4c04-b5d5-5de38a88072c-operator-scripts\") pod \"novacell0539b-account-delete-t9blx\" (UID: \"58de676b-7b73-4c04-b5d5-5de38a88072c\") " pod="openstack/novacell0539b-account-delete-t9blx" Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.878863 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfspk\" (UniqueName: \"kubernetes.io/projected/58de676b-7b73-4c04-b5d5-5de38a88072c-kube-api-access-sfspk\") pod \"novacell0539b-account-delete-t9blx\" (UID: \"58de676b-7b73-4c04-b5d5-5de38a88072c\") " pod="openstack/novacell0539b-account-delete-t9blx" Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.878961 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f41619d4-24a3-46e4-9cb9-2e388f7cd36b-operator-scripts\") pod \"novaapi5809-account-delete-b9cvt\" (UID: \"f41619d4-24a3-46e4-9cb9-2e388f7cd36b\") " pod="openstack/novaapi5809-account-delete-b9cvt" Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.879015 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjkq2\" (UniqueName: \"kubernetes.io/projected/f41619d4-24a3-46e4-9cb9-2e388f7cd36b-kube-api-access-wjkq2\") pod \"novaapi5809-account-delete-b9cvt\" (UID: \"f41619d4-24a3-46e4-9cb9-2e388f7cd36b\") " pod="openstack/novaapi5809-account-delete-b9cvt" Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.879652 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58de676b-7b73-4c04-b5d5-5de38a88072c-operator-scripts\") pod \"novacell0539b-account-delete-t9blx\" (UID: \"58de676b-7b73-4c04-b5d5-5de38a88072c\") " pod="openstack/novacell0539b-account-delete-t9blx" Dec 
09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.901258 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron1d25-account-delete-f87kn" Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.901901 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-ghft5"] Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.902160 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-ghft5" podUID="e0f7675b-6614-4e41-86e6-364b7f04664e" containerName="openstack-network-exporter" containerID="cri-o://9c44cdbba87ffe7a2f1b4ebc4e3392a164aba9181ae3fb15f68ca8ab66302c4d" gracePeriod=30 Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.907140 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-47b4k"] Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.918572 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfspk\" (UniqueName: \"kubernetes.io/projected/58de676b-7b73-4c04-b5d5-5de38a88072c-kube-api-access-sfspk\") pod \"novacell0539b-account-delete-t9blx\" (UID: \"58de676b-7b73-4c04-b5d5-5de38a88072c\") " pod="openstack/novacell0539b-account-delete-t9blx" Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.930640 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-g4kc8"] Dec 09 10:25:16 crc kubenswrapper[5002]: E1209 10:25:16.931134 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 077beda74ae3e5c25e7ec8cca4e2084bfba25475c005f8a85d1fd2f854c613be is running failed: container process not found" containerID="077beda74ae3e5c25e7ec8cca4e2084bfba25475c005f8a85d1fd2f854c613be" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.933231 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell0539b-account-delete-t9blx" Dec 09 10:25:16 crc kubenswrapper[5002]: E1209 10:25:16.941087 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 077beda74ae3e5c25e7ec8cca4e2084bfba25475c005f8a85d1fd2f854c613be is running failed: container process not found" containerID="077beda74ae3e5c25e7ec8cca4e2084bfba25475c005f8a85d1fd2f854c613be" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 09 10:25:16 crc kubenswrapper[5002]: E1209 10:25:16.952622 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 077beda74ae3e5c25e7ec8cca4e2084bfba25475c005f8a85d1fd2f854c613be is running failed: container process not found" containerID="077beda74ae3e5c25e7ec8cca4e2084bfba25475c005f8a85d1fd2f854c613be" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 09 10:25:16 crc kubenswrapper[5002]: E1209 10:25:16.952693 5002 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 077beda74ae3e5c25e7ec8cca4e2084bfba25475c005f8a85d1fd2f854c613be is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="36fbd6d1-d87d-45a2-9bca-0f25f3daca0c" containerName="ovn-northd" Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.981631 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f41619d4-24a3-46e4-9cb9-2e388f7cd36b-operator-scripts\") pod \"novaapi5809-account-delete-b9cvt\" (UID: \"f41619d4-24a3-46e4-9cb9-2e388f7cd36b\") " pod="openstack/novaapi5809-account-delete-b9cvt" Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.981730 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjkq2\" (UniqueName: \"kubernetes.io/projected/f41619d4-24a3-46e4-9cb9-2e388f7cd36b-kube-api-access-wjkq2\") pod \"novaapi5809-account-delete-b9cvt\" (UID: \"f41619d4-24a3-46e4-9cb9-2e388f7cd36b\") " pod="openstack/novaapi5809-account-delete-b9cvt" Dec 09 10:25:16 crc kubenswrapper[5002]: I1209 10:25:16.982916 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f41619d4-24a3-46e4-9cb9-2e388f7cd36b-operator-scripts\") pod \"novaapi5809-account-delete-b9cvt\" (UID: \"f41619d4-24a3-46e4-9cb9-2e388f7cd36b\") " pod="openstack/novaapi5809-account-delete-b9cvt" Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:16.995931 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-sdzqj"] Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.040521 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjkq2\" (UniqueName: \"kubernetes.io/projected/f41619d4-24a3-46e4-9cb9-2e388f7cd36b-kube-api-access-wjkq2\") pod \"novaapi5809-account-delete-b9cvt\" (UID: \"f41619d4-24a3-46e4-9cb9-2e388f7cd36b\") " pod="openstack/novaapi5809-account-delete-b9cvt" Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.109286 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-sdzqj"] Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.124269 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novaapi5809-account-delete-b9cvt" Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.163338 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-52w6p"] Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.183888 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-52w6p"] Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.202865 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.203110 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="f702a539-ec25-44d4-8629-97b3c5499b96" containerName="cinder-scheduler" containerID="cri-o://87bebcf10614da44af2b08b3844e8a098da235879ef6a0ce2fdbe6d780cb77c8" gracePeriod=30 Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.203510 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="f702a539-ec25-44d4-8629-97b3c5499b96" containerName="probe" containerID="cri-o://346c243539a09bb0cf2ecabd1fa68b92b5e3b4d887823c3ba0eff1a45067f934" gracePeriod=30 Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.225358 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.225611 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="02c94bee-a522-4ea6-85af-1ba68e174203" containerName="cinder-api-log" containerID="cri-o://63586c424dc327bbd0545721f624a8d5562a690863d2d8565ad85890deded297" gracePeriod=30 Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.226004 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="02c94bee-a522-4ea6-85af-1ba68e174203" containerName="cinder-api" containerID="cri-o://d1c65c897a0495448b0bba138435b4e6ce0da36de283bd456d1269f9d3226c84" gracePeriod=30 Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.254602 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.255376 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="account-server" containerID="cri-o://572968954f44aa3432e15bf47ef3b7d45a9cac349101fcd97fb7b56fd86110b3" gracePeriod=30 Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.255744 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="swift-recon-cron" containerID="cri-o://76763b7766e9025115e42c4aebcef8bd5282beaf9f41d7c400aa31601a1ff35f" gracePeriod=30 Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.255793 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="rsync" containerID="cri-o://7f97157518e27503872febf7ddda3d551dbd7a2803115b464389b1571f0d9e20" gracePeriod=30 Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.255858 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="container-replicator" 
containerID="cri-o://6475b21d3689ac93f09493f21c1ff2efd22fc48ca484d69e1663bc5217a41423" gracePeriod=30 Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.255900 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="container-server" containerID="cri-o://8ef48e223ac981dd68fad5f23cdc81f9ff45600e033082f26b51348da7a3e364" gracePeriod=30 Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.255897 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="container-auditor" containerID="cri-o://9c6faf241f54027209b7cce0e0fc9faf46499ba3ee26bb97a888eb1ca008dd04" gracePeriod=30 Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.255939 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="account-auditor" containerID="cri-o://2a8bccd609bab5986054b408416dade80cc7b8cf1c6ce51df003879ca5e0a92d" gracePeriod=30 Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.255970 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="account-replicator" containerID="cri-o://3f70ac59be273071ab01746bf90703513be9b442ef9be9f9e0ab6ecb0f2e0e47" gracePeriod=30 Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.256029 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="object-auditor" containerID="cri-o://e4190f8e2ba7bf2e5668737db294b27ff0a3fe18ec67358a90971c6b810a30cb" gracePeriod=30 Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.256043 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="object-replicator" containerID="cri-o://4a883aac8893d883c07d84b09c58fe2a39a0f93405bc123e2e6b73703862158e" gracePeriod=30 Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.256062 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="object-expirer" containerID="cri-o://4430f2180524a0ee4235580c2b9d8df48e33a7e227458714f054c2d1a1d7f033" gracePeriod=30 Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.256084 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="object-server" containerID="cri-o://64c1de5a8db5505ebb9ab646a19408256aae76bc990663c6b9d95baed247ddda" gracePeriod=30 Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.256100 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="object-updater" containerID="cri-o://1d1d17700b9060a9639585ef5fdeb4b80245971f6249c58f20c9f2f5a5cba149" gracePeriod=30 Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.256134 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="container-updater" 
containerID="cri-o://8c060664face312f2e3371362872261fcd8f50e7bd540c37d281ee014c188bec" gracePeriod=30 Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.255930 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="account-reaper" containerID="cri-o://b75afaa2bb61d58afc5343f8437dbcd68fa039ca4733d2850879069289c78567" gracePeriod=30 Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.293249 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-sblvw"] Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.324237 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-sblvw"] Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.334527 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-s99q2"] Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.341088 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-s99q2"] Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.366644 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-56f74754d8-5pd9q"] Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.366937 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-56f74754d8-5pd9q" podUID="0172d8ed-9ef1-4aac-b246-1b1ed0df87fc" containerName="placement-log" containerID="cri-o://e31fd05bd7e517de1ef420971d431d6a2f3089efe18a744cd769acc8bbc26a81" gracePeriod=30 Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.367069 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-56f74754d8-5pd9q" podUID="0172d8ed-9ef1-4aac-b246-1b1ed0df87fc" containerName="placement-api" containerID="cri-o://1767450d54e834d07c9f4e17540dd48734de123d1bb880696d3cd80a533970df" gracePeriod=30 Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.370381 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.370602 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="54351653-7ebd-40ba-8181-bb1023f18190" containerName="glance-log" containerID="cri-o://1df4bb922098dea87efcb2c4b87131e9b7b7222c65f32b527e20b6017dad1370" gracePeriod=30 Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.370788 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="54351653-7ebd-40ba-8181-bb1023f18190" containerName="glance-httpd" containerID="cri-o://1e6f6d93e3bd5dcd93496f1389572c1e7e7d5c754f5b03c5cf8bcd4e3532e5b0" gracePeriod=30 Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.384937 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.385205 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="65df60b6-4049-47b6-9907-ebf76c151213" containerName="glance-log" containerID="cri-o://1dd5020ee445c45b526aafb1a2e0ba3b4c5c0fb89e017224140c385ac48d4a20" gracePeriod=30 Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.385566 5002 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/glance-default-external-api-0" podUID="65df60b6-4049-47b6-9907-ebf76c151213" containerName="glance-httpd" containerID="cri-o://165a1620604d830dded2bfca6d82a903dd8294bcf0db3c611e329f3f40e88ace" gracePeriod=30 Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.454601 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.469071 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-776dccf8bb-k9gt4"] Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.469342 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-776dccf8bb-k9gt4" podUID="c4ddce94-6333-4233-951d-571a761b708f" containerName="barbican-keystone-listener-log" containerID="cri-o://138deb878d9eae8db13bac5892a85b4950d565f4c2cf09cbe03061fbf965d6a2" gracePeriod=30 Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.469727 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-776dccf8bb-k9gt4" podUID="c4ddce94-6333-4233-951d-571a761b708f" containerName="barbican-keystone-listener" containerID="cri-o://1685138d02316a9a12cf58ddcc259dbabe79ffe5bb459f0ae7a6a2cbefac195b" gracePeriod=30 Dec 09 10:25:17 crc kubenswrapper[5002]: E1209 10:25:17.495319 5002 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 09 10:25:17 crc kubenswrapper[5002]: E1209 10:25:17.495392 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9278e14e-2524-4e42-b870-f493ea02ede8-config-data podName:9278e14e-2524-4e42-b870-f493ea02ede8 nodeName:}" failed. No retries permitted until 2025-12-09 10:25:19.495371871 +0000 UTC m=+1451.887422962 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/9278e14e-2524-4e42-b870-f493ea02ede8-config-data") pod "rabbitmq-cell1-server-0" (UID: "9278e14e-2524-4e42-b870-f493ea02ede8") : configmap "rabbitmq-cell1-config-data" not found
Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.508697 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-857f77df5c-skx8f"]
Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.508934 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-857f77df5c-skx8f" podUID="41f46a2d-f158-497f-b61b-60f39c64149b" containerName="neutron-api" containerID="cri-o://a034c8a435eb9196808c0d7f0f523d44f798925554bc10f46ac57bff50643ec3" gracePeriod=30
Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.509225 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-857f77df5c-skx8f" podUID="41f46a2d-f158-497f-b61b-60f39c64149b" containerName="neutron-httpd" containerID="cri-o://a9f9119ac359c6cfc08e9e4ec057f2b160949a68cbe68fb1fa38fc53c004e69a" gracePeriod=30
Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.532698 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.549387 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-5855d5f975-nmr2s"]
Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.549961 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-5855d5f975-nmr2s" podUID="c9198258-4919-4ade-88ba-4a0773b32012" containerName="barbican-worker-log" containerID="cri-o://3cfc9050975f650d9997515f3f47032beccc8afe01479ddcb1d077ee3deff954" gracePeriod=30
Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.550336 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-5855d5f975-nmr2s" podUID="c9198258-4919-4ade-88ba-4a0773b32012" containerName="barbican-worker" containerID="cri-o://9ce8868f1af38995fb075822c7445d85a0cf6501e86e92fde84885aaa86d36ad" gracePeriod=30
Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.566398 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5c454948fd-lwcxn"]
Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.569306 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.569536 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6b5836b7-7b16-477f-9a20-f30032362374" containerName="nova-api-log" containerID="cri-o://290faba9bb523cd08a95a906cba830cdb0fb097cbabfe914375d5a0fcdb253cd" gracePeriod=30
Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.569675 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6b5836b7-7b16-477f-9a20-f30032362374" containerName="nova-api-api" containerID="cri-o://aef926ebfeded32c3d77daf8bf94adfe524f72c37fbafa444980929d4131d304" gracePeriod=30
Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.570687 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5c454948fd-lwcxn" podUID="a4061af7-7669-4bd4-a36c-6ec982e86753" containerName="barbican-api" containerID="cri-o://2944a25a7c0f087015f80b3d4d12a8b2ffabefdcb9d6f6b684e88e4e6b57e2db" gracePeriod=30
Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.570638 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5c454948fd-lwcxn" podUID="a4061af7-7669-4bd4-a36c-6ec982e86753" containerName="barbican-api-log" containerID="cri-o://6cb449e2adfcabb9641ca2b98611d189bd76e94105b2edc6a7b7b41e8dbf68a0" gracePeriod=30
Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.576546 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="9278e14e-2524-4e42-b870-f493ea02ede8" containerName="rabbitmq" containerID="cri-o://1faa363b9769f751a8c09fade1d2f2f3b3905666130dc1d039543eef99f84775" gracePeriod=604800
Dec 09 10:25:17 crc kubenswrapper[5002]: E1209 10:25:17.585602 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 48d09c2ebf2544131b6474f56670b0b8781e9f927fc4903fb00c40fed41a9050 is running failed: container process not found" containerID="48d09c2ebf2544131b6474f56670b0b8781e9f927fc4903fb00c40fed41a9050" cmd=["/usr/bin/pidof","ovsdb-server"]
Dec 09 10:25:17 crc kubenswrapper[5002]: E1209 10:25:17.586676 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 48d09c2ebf2544131b6474f56670b0b8781e9f927fc4903fb00c40fed41a9050 is running failed: container process not found" containerID="48d09c2ebf2544131b6474f56670b0b8781e9f927fc4903fb00c40fed41a9050" cmd=["/usr/bin/pidof","ovsdb-server"]
Dec 09 10:25:17 crc kubenswrapper[5002]: E1209 10:25:17.587369 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 48d09c2ebf2544131b6474f56670b0b8781e9f927fc4903fb00c40fed41a9050 is running failed: container process not found" containerID="48d09c2ebf2544131b6474f56670b0b8781e9f927fc4903fb00c40fed41a9050" cmd=["/usr/bin/pidof","ovsdb-server"]
Dec 09 10:25:17 crc kubenswrapper[5002]: E1209 10:25:17.587397 5002 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 48d09c2ebf2544131b6474f56670b0b8781e9f927fc4903fb00c40fed41a9050 is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-nb-0" podUID="322c0304-1696-43fb-9225-a709e7e2ea89" containerName="ovsdbserver-nb"
Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.621800 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.632984 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_322c0304-1696-43fb-9225-a709e7e2ea89/ovsdbserver-nb/0.log"
Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.633040 5002 generic.go:334] "Generic (PLEG): container finished" podID="322c0304-1696-43fb-9225-a709e7e2ea89" containerID="48d09c2ebf2544131b6474f56670b0b8781e9f927fc4903fb00c40fed41a9050" exitCode=143
Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.633113 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"322c0304-1696-43fb-9225-a709e7e2ea89","Type":"ContainerDied","Data":"48d09c2ebf2544131b6474f56670b0b8781e9f927fc4903fb00c40fed41a9050"}
Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.635923 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.636182 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="cd8a7609-928f-4a68-9903-fa846e4baeda" containerName="nova-metadata-log" containerID="cri-o://47df4f8a2eceea5148332f48a2f1938fcdc34779680d4051e6a0db02acbf62a9" gracePeriod=30
Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.636589 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="cd8a7609-928f-4a68-9903-fa846e4baeda" containerName="nova-metadata-metadata" containerID="cri-o://177ca2f00057b9f494561460d93507dad5143107f0da44739f7456ccfec82780" gracePeriod=30
Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.655093 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="58c08274-46ea-48be-a135-0c1174cd6135" containerName="rabbitmq" containerID="cri-o://b05714ada64dee7eaed39017f863e151b219f928c230aa2f336910df9726668b" gracePeriod=604800
Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.663587 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-mvvcv"]
Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.687640 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_36fbd6d1-d87d-45a2-9bca-0f25f3daca0c/ovn-northd/0.log"
Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.687692 5002 generic.go:334] "Generic (PLEG): container finished" podID="36fbd6d1-d87d-45a2-9bca-0f25f3daca0c" containerID="e7405eccb60d4c551738f72265103db45ab534fc28ebd77e569e7e80a729397d" exitCode=2
Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.687759 5002 generic.go:334] "Generic (PLEG): container finished" podID="36fbd6d1-d87d-45a2-9bca-0f25f3daca0c" containerID="077beda74ae3e5c25e7ec8cca4e2084bfba25475c005f8a85d1fd2f854c613be" exitCode=143
Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.687929 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"36fbd6d1-d87d-45a2-9bca-0f25f3daca0c","Type":"ContainerDied","Data":"e7405eccb60d4c551738f72265103db45ab534fc28ebd77e569e7e80a729397d"}
Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.687960 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"36fbd6d1-d87d-45a2-9bca-0f25f3daca0c","Type":"ContainerDied","Data":"077beda74ae3e5c25e7ec8cca4e2084bfba25475c005f8a85d1fd2f854c613be"}
Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.700537 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_28cb84ad-b399-4fe4-9631-e481dfa75aed/ovsdbserver-sb/0.log"
Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.700580 5002 generic.go:334] "Generic (PLEG): container finished" podID="28cb84ad-b399-4fe4-9631-e481dfa75aed" containerID="2215494876baf67d40bfc6391dc6cc221f9e14b2fc38cc62efc7ad13c22f507b" exitCode=143
Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.700636 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"28cb84ad-b399-4fe4-9631-e481dfa75aed","Type":"ContainerDied","Data":"2215494876baf67d40bfc6391dc6cc221f9e14b2fc38cc62efc7ad13c22f507b"}
Dec 09 10:25:17 crc kubenswrapper[5002]: E1209 10:25:17.701829 5002 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found
Dec 09 10:25:17 crc kubenswrapper[5002]: E1209 10:25:17.701871 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/58c08274-46ea-48be-a135-0c1174cd6135-config-data podName:58c08274-46ea-48be-a135-0c1174cd6135 nodeName:}" failed. No retries permitted until 2025-12-09 10:25:19.701857695 +0000 UTC m=+1452.093908766 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/58c08274-46ea-48be-a135-0c1174cd6135-config-data") pod "rabbitmq-server-0" (UID: "58c08274-46ea-48be-a135-0c1174cd6135") : configmap "rabbitmq-config-data" not found
Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.716383 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-mvvcv"]
Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.752048 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-ghft5_e0f7675b-6614-4e41-86e6-364b7f04664e/openstack-network-exporter/0.log"
Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.752089 5002 generic.go:334] "Generic (PLEG): container finished" podID="e0f7675b-6614-4e41-86e6-364b7f04664e" containerID="9c44cdbba87ffe7a2f1b4ebc4e3392a164aba9181ae3fb15f68ca8ab66302c4d" exitCode=2
Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.752118 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-ghft5" event={"ID":"e0f7675b-6614-4e41-86e6-364b7f04664e","Type":"ContainerDied","Data":"9c44cdbba87ffe7a2f1b4ebc4e3392a164aba9181ae3fb15f68ca8ab66302c4d"}
Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.759950 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-05d2-account-create-update-2lrlj"]
Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.797190 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-05d2-account-create-update-2lrlj"]
Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.823866 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 09 10:25:17 crc kubenswrapper[5002]: I1209 10:25:17.824380 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="b613f5a4-9369-45ae-8c2c-10e16e639999" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://58ecfbecbf171e347e375d6249ce709b457a4fd2276c833f84808e224ef3e5ed" gracePeriod=30
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.033773 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_322c0304-1696-43fb-9225-a709e7e2ea89/ovsdbserver-nb/0.log"
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.033855 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.043132 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_36fbd6d1-d87d-45a2-9bca-0f25f3daca0c/ovn-northd/0.log"
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.043206 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
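The nestedpendingoperations entry above shows how the kubelet throttles failed volume mounts: each MountVolume.SetUp failure against the missing "rabbitmq-config-data" ConfigMap schedules the next attempt only after a growing delay (here durationBeforeRetry 2s). A minimal Go sketch of that retry shape, with invented names (mountConfigMapVolume is a stand-in, not the kubelet function):

package main

import (
	"fmt"
	"time"
)

// mountConfigMapVolume stands in for MountVolume.SetUp: it keeps failing
// while the backing ConfigMap (here "rabbitmq-config-data") does not exist.
func mountConfigMapVolume(name string) error {
	return fmt.Errorf("configmap %q not found", name)
}

func main() {
	backoff := 500 * time.Millisecond // grows toward the 2s seen in the log
	const maxBackoff = 2 * time.Minute
	for attempt := 1; attempt <= 5; attempt++ {
		err := mountConfigMapVolume("rabbitmq-config-data")
		if err == nil {
			return // mounted once the ConfigMap reappears
		}
		fmt.Printf("attempt %d failed: %v; no retries permitted for %s\n", attempt, err, backoff)
		time.Sleep(backoff)
		if backoff *= 2; backoff > maxBackoff {
			backoff = maxBackoff
		}
	}
}

The doubling cap keeps a permanently missing ConfigMap (as here, where the pod is being deleted anyway) from hot-looping the reconciler.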
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.087758 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20ebb6ea-f36b-440a-a437-ff39f9766fca" path="/var/lib/kubelet/pods/20ebb6ea-f36b-440a-a437-ff39f9766fca/volumes"
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.088433 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="354d641d-dc9c-4aa4-b821-90ce72ef6d5c" path="/var/lib/kubelet/pods/354d641d-dc9c-4aa4-b821-90ce72ef6d5c/volumes"
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.096128 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d382de4-ddc6-4781-9815-76b74cbccadc" path="/var/lib/kubelet/pods/3d382de4-ddc6-4781-9815-76b74cbccadc/volumes"
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.105844 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c575708-ef27-4116-8eb1-9eae1aae903f" path="/var/lib/kubelet/pods/4c575708-ef27-4116-8eb1-9eae1aae903f/volumes"
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.116715 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjxfk\" (UniqueName: \"kubernetes.io/projected/322c0304-1696-43fb-9225-a709e7e2ea89-kube-api-access-qjxfk\") pod \"322c0304-1696-43fb-9225-a709e7e2ea89\" (UID: \"322c0304-1696-43fb-9225-a709e7e2ea89\") "
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.116798 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/322c0304-1696-43fb-9225-a709e7e2ea89-ovsdbserver-nb-tls-certs\") pod \"322c0304-1696-43fb-9225-a709e7e2ea89\" (UID: \"322c0304-1696-43fb-9225-a709e7e2ea89\") "
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.116848 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/36fbd6d1-d87d-45a2-9bca-0f25f3daca0c-metrics-certs-tls-certs\") pod \"36fbd6d1-d87d-45a2-9bca-0f25f3daca0c\" (UID: \"36fbd6d1-d87d-45a2-9bca-0f25f3daca0c\") "
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.116881 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36fbd6d1-d87d-45a2-9bca-0f25f3daca0c-config\") pod \"36fbd6d1-d87d-45a2-9bca-0f25f3daca0c\" (UID: \"36fbd6d1-d87d-45a2-9bca-0f25f3daca0c\") "
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.116925 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/36fbd6d1-d87d-45a2-9bca-0f25f3daca0c-ovn-northd-tls-certs\") pod \"36fbd6d1-d87d-45a2-9bca-0f25f3daca0c\" (UID: \"36fbd6d1-d87d-45a2-9bca-0f25f3daca0c\") "
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.116949 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/322c0304-1696-43fb-9225-a709e7e2ea89-ovsdb-rundir\") pod \"322c0304-1696-43fb-9225-a709e7e2ea89\" (UID: \"322c0304-1696-43fb-9225-a709e7e2ea89\") "
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.116970 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/322c0304-1696-43fb-9225-a709e7e2ea89-metrics-certs-tls-certs\") pod \"322c0304-1696-43fb-9225-a709e7e2ea89\" (UID: \"322c0304-1696-43fb-9225-a709e7e2ea89\") "
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.117003 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"322c0304-1696-43fb-9225-a709e7e2ea89\" (UID: \"322c0304-1696-43fb-9225-a709e7e2ea89\") "
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.117054 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/322c0304-1696-43fb-9225-a709e7e2ea89-config\") pod \"322c0304-1696-43fb-9225-a709e7e2ea89\" (UID: \"322c0304-1696-43fb-9225-a709e7e2ea89\") "
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.117069 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swmdj\" (UniqueName: \"kubernetes.io/projected/36fbd6d1-d87d-45a2-9bca-0f25f3daca0c-kube-api-access-swmdj\") pod \"36fbd6d1-d87d-45a2-9bca-0f25f3daca0c\" (UID: \"36fbd6d1-d87d-45a2-9bca-0f25f3daca0c\") "
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.117128 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/322c0304-1696-43fb-9225-a709e7e2ea89-combined-ca-bundle\") pod \"322c0304-1696-43fb-9225-a709e7e2ea89\" (UID: \"322c0304-1696-43fb-9225-a709e7e2ea89\") "
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.117152 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/322c0304-1696-43fb-9225-a709e7e2ea89-scripts\") pod \"322c0304-1696-43fb-9225-a709e7e2ea89\" (UID: \"322c0304-1696-43fb-9225-a709e7e2ea89\") "
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.117201 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/36fbd6d1-d87d-45a2-9bca-0f25f3daca0c-scripts\") pod \"36fbd6d1-d87d-45a2-9bca-0f25f3daca0c\" (UID: \"36fbd6d1-d87d-45a2-9bca-0f25f3daca0c\") "
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.117239 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36fbd6d1-d87d-45a2-9bca-0f25f3daca0c-combined-ca-bundle\") pod \"36fbd6d1-d87d-45a2-9bca-0f25f3daca0c\" (UID: \"36fbd6d1-d87d-45a2-9bca-0f25f3daca0c\") "
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.117256 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/36fbd6d1-d87d-45a2-9bca-0f25f3daca0c-ovn-rundir\") pod \"36fbd6d1-d87d-45a2-9bca-0f25f3daca0c\" (UID: \"36fbd6d1-d87d-45a2-9bca-0f25f3daca0c\") "
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.118707 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36fbd6d1-d87d-45a2-9bca-0f25f3daca0c-config" (OuterVolumeSpecName: "config") pod "36fbd6d1-d87d-45a2-9bca-0f25f3daca0c" (UID: "36fbd6d1-d87d-45a2-9bca-0f25f3daca0c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.119211 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/322c0304-1696-43fb-9225-a709e7e2ea89-scripts" (OuterVolumeSpecName: "scripts") pod "322c0304-1696-43fb-9225-a709e7e2ea89" (UID: "322c0304-1696-43fb-9225-a709e7e2ea89"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.119196 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36fbd6d1-d87d-45a2-9bca-0f25f3daca0c-scripts" (OuterVolumeSpecName: "scripts") pod "36fbd6d1-d87d-45a2-9bca-0f25f3daca0c" (UID: "36fbd6d1-d87d-45a2-9bca-0f25f3daca0c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.119672 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/322c0304-1696-43fb-9225-a709e7e2ea89-config" (OuterVolumeSpecName: "config") pod "322c0304-1696-43fb-9225-a709e7e2ea89" (UID: "322c0304-1696-43fb-9225-a709e7e2ea89"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.120008 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="759658bf-fbc8-40b6-96a5-b691f2ecec64" path="/var/lib/kubelet/pods/759658bf-fbc8-40b6-96a5-b691f2ecec64/volumes"
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.120572 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7" path="/var/lib/kubelet/pods/83eb6a58-6bec-43eb-bbef-1fc7d4ec76e7/volumes"
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.121108 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d71308aa-e2b5-4775-917e-1b100ff8969c" path="/var/lib/kubelet/pods/d71308aa-e2b5-4775-917e-1b100ff8969c/volumes"
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.121622 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d846942e-6d0d-4e42-a584-a910a56d9718" path="/var/lib/kubelet/pods/d846942e-6d0d-4e42-a584-a910a56d9718/volumes"
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.124062 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="d100f321-6fe6-4eb3-a00c-50b9ff5e2861" containerName="galera" containerID="cri-o://c364964ff05fc5d33bca8efbcb8e29d176a3c1b08131de3be926d3ea34e48ec9" gracePeriod=30
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.124096 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "322c0304-1696-43fb-9225-a709e7e2ea89" (UID: "322c0304-1696-43fb-9225-a709e7e2ea89"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.124937 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36fbd6d1-d87d-45a2-9bca-0f25f3daca0c-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "36fbd6d1-d87d-45a2-9bca-0f25f3daca0c" (UID: "36fbd6d1-d87d-45a2-9bca-0f25f3daca0c"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.125233 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/322c0304-1696-43fb-9225-a709e7e2ea89-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "322c0304-1696-43fb-9225-a709e7e2ea89" (UID: "322c0304-1696-43fb-9225-a709e7e2ea89"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.125270 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/322c0304-1696-43fb-9225-a709e7e2ea89-kube-api-access-qjxfk" (OuterVolumeSpecName: "kube-api-access-qjxfk") pod "322c0304-1696-43fb-9225-a709e7e2ea89" (UID: "322c0304-1696-43fb-9225-a709e7e2ea89"). InnerVolumeSpecName "kube-api-access-qjxfk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.126134 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36fbd6d1-d87d-45a2-9bca-0f25f3daca0c-kube-api-access-swmdj" (OuterVolumeSpecName: "kube-api-access-swmdj") pod "36fbd6d1-d87d-45a2-9bca-0f25f3daca0c" (UID: "36fbd6d1-d87d-45a2-9bca-0f25f3daca0c"). InnerVolumeSpecName "kube-api-access-swmdj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.132347 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8f27813-7f40-4967-9e37-34e4ae205cb7" path="/var/lib/kubelet/pods/d8f27813-7f40-4967-9e37-34e4ae205cb7/volumes"
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.133127 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6f5f4d8-2b06-4eb4-92d5-313b5fdfeab6" path="/var/lib/kubelet/pods/f6f5f4d8-2b06-4eb4-92d5-313b5fdfeab6/volumes"
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.208053 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36fbd6d1-d87d-45a2-9bca-0f25f3daca0c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "36fbd6d1-d87d-45a2-9bca-0f25f3daca0c" (UID: "36fbd6d1-d87d-45a2-9bca-0f25f3daca0c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.213499 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="58c08274-46ea-48be-a135-0c1174cd6135" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused"
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.219058 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36fbd6d1-d87d-45a2-9bca-0f25f3daca0c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.219090 5002 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/36fbd6d1-d87d-45a2-9bca-0f25f3daca0c-ovn-rundir\") on node \"crc\" DevicePath \"\""
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.219099 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjxfk\" (UniqueName: \"kubernetes.io/projected/322c0304-1696-43fb-9225-a709e7e2ea89-kube-api-access-qjxfk\") on node \"crc\" DevicePath \"\""
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.219108 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36fbd6d1-d87d-45a2-9bca-0f25f3daca0c-config\") on node \"crc\" DevicePath \"\""
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.219119 5002 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/322c0304-1696-43fb-9225-a709e7e2ea89-ovsdb-rundir\") on node \"crc\" DevicePath \"\""
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.219140 5002 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" "
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.219150 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/322c0304-1696-43fb-9225-a709e7e2ea89-config\") on node \"crc\" DevicePath \"\""
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.219159 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swmdj\" (UniqueName: \"kubernetes.io/projected/36fbd6d1-d87d-45a2-9bca-0f25f3daca0c-kube-api-access-swmdj\") on node \"crc\" DevicePath \"\""
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.219167 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/322c0304-1696-43fb-9225-a709e7e2ea89-scripts\") on node \"crc\" DevicePath \"\""
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.219175 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/36fbd6d1-d87d-45a2-9bca-0f25f3daca0c-scripts\") on node \"crc\" DevicePath \"\""
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.258226 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-g4kc8" podUID="26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6" containerName="ovs-vswitchd" containerID="cri-o://3f18346c6d45cdce8933113ee6ff0f64d79183a978ac856ba561f2eb32009782" gracePeriod=29
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.280220 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/322c0304-1696-43fb-9225-a709e7e2ea89-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "322c0304-1696-43fb-9225-a709e7e2ea89" (UID: "322c0304-1696-43fb-9225-a709e7e2ea89"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
"kubernetes.io/secret/322c0304-1696-43fb-9225-a709e7e2ea89-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "322c0304-1696-43fb-9225-a709e7e2ea89" (UID: "322c0304-1696-43fb-9225-a709e7e2ea89"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.329872 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/322c0304-1696-43fb-9225-a709e7e2ea89-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.335536 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.335745 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="e5a9794a-b66f-40d4-9e70-efc6a0a72d83" containerName="nova-cell1-conductor-conductor" containerID="cri-o://ab87dad784b7bbb18f78fbb2775ca2f2d80ebe172b8bdf2a5111b3315647afd9" gracePeriod=30 Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.338473 5002 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.339279 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/322c0304-1696-43fb-9225-a709e7e2ea89-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "322c0304-1696-43fb-9225-a709e7e2ea89" (UID: "322c0304-1696-43fb-9225-a709e7e2ea89"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.370079 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-6ck56"] Dec 09 10:25:18 crc kubenswrapper[5002]: E1209 10:25:18.381457 5002 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Dec 09 10:25:18 crc kubenswrapper[5002]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Dec 09 10:25:18 crc kubenswrapper[5002]: + source /usr/local/bin/container-scripts/functions Dec 09 10:25:18 crc kubenswrapper[5002]: ++ OVNBridge=br-int Dec 09 10:25:18 crc kubenswrapper[5002]: ++ OVNRemote=tcp:localhost:6642 Dec 09 10:25:18 crc kubenswrapper[5002]: ++ OVNEncapType=geneve Dec 09 10:25:18 crc kubenswrapper[5002]: ++ OVNAvailabilityZones= Dec 09 10:25:18 crc kubenswrapper[5002]: ++ EnableChassisAsGateway=true Dec 09 10:25:18 crc kubenswrapper[5002]: ++ PhysicalNetworks= Dec 09 10:25:18 crc kubenswrapper[5002]: ++ OVNHostName= Dec 09 10:25:18 crc kubenswrapper[5002]: ++ DB_FILE=/etc/openvswitch/conf.db Dec 09 10:25:18 crc kubenswrapper[5002]: ++ ovs_dir=/var/lib/openvswitch Dec 09 10:25:18 crc kubenswrapper[5002]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Dec 09 10:25:18 crc kubenswrapper[5002]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Dec 09 10:25:18 crc kubenswrapper[5002]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 09 10:25:18 crc kubenswrapper[5002]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 09 10:25:18 crc kubenswrapper[5002]: + sleep 0.5 Dec 09 10:25:18 crc kubenswrapper[5002]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 09 10:25:18 crc kubenswrapper[5002]: + sleep 0.5 Dec 09 10:25:18 crc kubenswrapper[5002]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 09 10:25:18 crc kubenswrapper[5002]: + cleanup_ovsdb_server_semaphore Dec 09 10:25:18 crc kubenswrapper[5002]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 09 10:25:18 crc kubenswrapper[5002]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Dec 09 10:25:18 crc kubenswrapper[5002]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-g4kc8" message=< Dec 09 10:25:18 crc kubenswrapper[5002]: Exiting ovsdb-server (5) ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Dec 09 10:25:18 crc kubenswrapper[5002]: + source /usr/local/bin/container-scripts/functions Dec 09 10:25:18 crc kubenswrapper[5002]: ++ OVNBridge=br-int Dec 09 10:25:18 crc kubenswrapper[5002]: ++ OVNRemote=tcp:localhost:6642 Dec 09 10:25:18 crc kubenswrapper[5002]: ++ OVNEncapType=geneve Dec 09 10:25:18 crc kubenswrapper[5002]: ++ OVNAvailabilityZones= Dec 09 10:25:18 crc kubenswrapper[5002]: ++ EnableChassisAsGateway=true Dec 09 10:25:18 crc kubenswrapper[5002]: ++ PhysicalNetworks= Dec 09 10:25:18 crc kubenswrapper[5002]: ++ OVNHostName= Dec 09 10:25:18 crc kubenswrapper[5002]: ++ DB_FILE=/etc/openvswitch/conf.db Dec 09 10:25:18 crc kubenswrapper[5002]: ++ ovs_dir=/var/lib/openvswitch Dec 09 10:25:18 crc kubenswrapper[5002]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Dec 09 10:25:18 crc kubenswrapper[5002]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Dec 09 10:25:18 crc kubenswrapper[5002]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 09 10:25:18 crc kubenswrapper[5002]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 09 10:25:18 crc kubenswrapper[5002]: + sleep 0.5 Dec 09 10:25:18 crc kubenswrapper[5002]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 09 10:25:18 crc kubenswrapper[5002]: + sleep 0.5 Dec 09 10:25:18 crc kubenswrapper[5002]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 09 10:25:18 crc kubenswrapper[5002]: + cleanup_ovsdb_server_semaphore Dec 09 10:25:18 crc kubenswrapper[5002]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 09 10:25:18 crc kubenswrapper[5002]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Dec 09 10:25:18 crc kubenswrapper[5002]: > Dec 09 10:25:18 crc kubenswrapper[5002]: E1209 10:25:18.383218 5002 kuberuntime_container.go:691] "PreStop hook failed" err=< Dec 09 10:25:18 crc kubenswrapper[5002]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Dec 09 10:25:18 crc kubenswrapper[5002]: + source /usr/local/bin/container-scripts/functions Dec 09 10:25:18 crc kubenswrapper[5002]: ++ OVNBridge=br-int Dec 09 10:25:18 crc kubenswrapper[5002]: ++ OVNRemote=tcp:localhost:6642 Dec 09 10:25:18 crc kubenswrapper[5002]: ++ OVNEncapType=geneve Dec 09 10:25:18 crc kubenswrapper[5002]: ++ OVNAvailabilityZones= Dec 09 10:25:18 crc kubenswrapper[5002]: ++ EnableChassisAsGateway=true Dec 09 10:25:18 crc kubenswrapper[5002]: ++ PhysicalNetworks= Dec 09 10:25:18 crc kubenswrapper[5002]: ++ OVNHostName= Dec 09 10:25:18 crc kubenswrapper[5002]: ++ DB_FILE=/etc/openvswitch/conf.db Dec 09 10:25:18 crc kubenswrapper[5002]: ++ ovs_dir=/var/lib/openvswitch Dec 09 10:25:18 crc kubenswrapper[5002]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Dec 09 10:25:18 crc kubenswrapper[5002]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Dec 09 10:25:18 crc kubenswrapper[5002]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 09 10:25:18 crc kubenswrapper[5002]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 09 10:25:18 crc kubenswrapper[5002]: + sleep 0.5 Dec 09 10:25:18 crc kubenswrapper[5002]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 09 10:25:18 crc kubenswrapper[5002]: + sleep 0.5 Dec 09 10:25:18 crc kubenswrapper[5002]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 09 10:25:18 crc kubenswrapper[5002]: + cleanup_ovsdb_server_semaphore Dec 09 10:25:18 crc kubenswrapper[5002]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 09 10:25:18 crc kubenswrapper[5002]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Dec 09 10:25:18 crc kubenswrapper[5002]: > pod="openstack/ovn-controller-ovs-g4kc8" podUID="26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6" containerName="ovsdb-server" containerID="cri-o://5c2d2c6137f09a0249a92e685f777a88211e7a367963f717a3fe59c6940a69a5" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.383945 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-g4kc8" podUID="26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6" containerName="ovsdb-server" containerID="cri-o://5c2d2c6137f09a0249a92e685f777a88211e7a367963f717a3fe59c6940a69a5" gracePeriod=29 Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.386870 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-6ck56"] Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.398470 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.399333 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="7ae47b25-e6fd-451f-9827-72ee4e12e526" containerName="nova-cell0-conductor-conductor" containerID="cri-o://b248460156a9f8eb7f40491f548be924840ee184fd17bcff0402ca92134b5847" gracePeriod=30 Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.414548 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-k7b9g"] Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.419384 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36fbd6d1-d87d-45a2-9bca-0f25f3daca0c-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "36fbd6d1-d87d-45a2-9bca-0f25f3daca0c" (UID: "36fbd6d1-d87d-45a2-9bca-0f25f3daca0c"). InnerVolumeSpecName "metrics-certs-tls-certs". 
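The trace above is the ovsdb-server PreStop hook (stop-ovsdb-server.sh) polling for a semaphore file before calling ovs-ctl. Exit status 137 is 128 + 9, i.e. the hook was still running when it was SIGKILLed, and the second it consumed is why the subsequent "Killing container" lines show gracePeriod=29 instead of 30. The same wait-then-stop logic, rendered as an illustrative Go sketch (the real hook is the bash script shown in the trace):

package main

import (
	"os"
	"os/exec"
	"time"
)

const semaphore = "/var/lib/openvswitch/is_safe_to_stop_ovsdb_server"

func main() {
	// Poll every 0.5s until another container marks it safe to stop,
	// mirroring the `[ ! -f $SEMAPHORE ]` / `sleep 0.5` loop in the trace.
	for {
		if _, err := os.Stat(semaphore); err == nil {
			break
		}
		time.Sleep(500 * time.Millisecond)
	}
	os.Remove(semaphore) // cleanup_ovsdb_server_semaphore
	// Stop ovsdb-server only; ovs-vswitchd is shut down separately.
	exec.Command("/usr/share/openvswitch/scripts/ovs-ctl", "stop", "--no-ovs-vswitchd").Run()
}
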
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.431265 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-k7b9g"] Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.434087 5002 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/36fbd6d1-d87d-45a2-9bca-0f25f3daca0c-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.434117 5002 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/322c0304-1696-43fb-9225-a709e7e2ea89-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.434126 5002 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.440351 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/322c0304-1696-43fb-9225-a709e7e2ea89-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "322c0304-1696-43fb-9225-a709e7e2ea89" (UID: "322c0304-1696-43fb-9225-a709e7e2ea89"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.466651 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36fbd6d1-d87d-45a2-9bca-0f25f3daca0c-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "36fbd6d1-d87d-45a2-9bca-0f25f3daca0c" (UID: "36fbd6d1-d87d-45a2-9bca-0f25f3daca0c"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.515865 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="9278e14e-2524-4e42-b870-f493ea02ede8" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.523991 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_28cb84ad-b399-4fe4-9631-e481dfa75aed/ovsdbserver-sb/0.log" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.524074 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.539288 5002 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/322c0304-1696-43fb-9225-a709e7e2ea89-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.539314 5002 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/36fbd6d1-d87d-45a2-9bca-0f25f3daca0c-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.541961 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-ghft5_e0f7675b-6614-4e41-86e6-364b7f04664e/openstack-network-exporter/0.log" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.542114 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-ghft5" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.543870 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cindere0a9-account-delete-b5zfk"] Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.644706 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxmdr\" (UniqueName: \"kubernetes.io/projected/e0f7675b-6614-4e41-86e6-364b7f04664e-kube-api-access-mxmdr\") pod \"e0f7675b-6614-4e41-86e6-364b7f04664e\" (UID: \"e0f7675b-6614-4e41-86e6-364b7f04664e\") " Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.645116 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0f7675b-6614-4e41-86e6-364b7f04664e-config\") pod \"e0f7675b-6614-4e41-86e6-364b7f04664e\" (UID: \"e0f7675b-6614-4e41-86e6-364b7f04664e\") " Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.645166 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e0f7675b-6614-4e41-86e6-364b7f04664e-ovn-rundir\") pod \"e0f7675b-6614-4e41-86e6-364b7f04664e\" (UID: \"e0f7675b-6614-4e41-86e6-364b7f04664e\") " Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.645193 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e0f7675b-6614-4e41-86e6-364b7f04664e-ovs-rundir\") pod \"e0f7675b-6614-4e41-86e6-364b7f04664e\" (UID: \"e0f7675b-6614-4e41-86e6-364b7f04664e\") " Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.645223 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/28cb84ad-b399-4fe4-9631-e481dfa75aed-ovsdb-rundir\") pod \"28cb84ad-b399-4fe4-9631-e481dfa75aed\" (UID: \"28cb84ad-b399-4fe4-9631-e481dfa75aed\") " Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.645252 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hn4c4\" (UniqueName: \"kubernetes.io/projected/28cb84ad-b399-4fe4-9631-e481dfa75aed-kube-api-access-hn4c4\") pod \"28cb84ad-b399-4fe4-9631-e481dfa75aed\" (UID: \"28cb84ad-b399-4fe4-9631-e481dfa75aed\") " Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.645278 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/28cb84ad-b399-4fe4-9631-e481dfa75aed-scripts\") pod \"28cb84ad-b399-4fe4-9631-e481dfa75aed\" (UID: \"28cb84ad-b399-4fe4-9631-e481dfa75aed\") " Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.645326 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0f7675b-6614-4e41-86e6-364b7f04664e-metrics-certs-tls-certs\") pod \"e0f7675b-6614-4e41-86e6-364b7f04664e\" (UID: \"e0f7675b-6614-4e41-86e6-364b7f04664e\") " Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.645582 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/28cb84ad-b399-4fe4-9631-e481dfa75aed-ovsdbserver-sb-tls-certs\") pod \"28cb84ad-b399-4fe4-9631-e481dfa75aed\" (UID: \"28cb84ad-b399-4fe4-9631-e481dfa75aed\") " Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.645649 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"28cb84ad-b399-4fe4-9631-e481dfa75aed\" (UID: \"28cb84ad-b399-4fe4-9631-e481dfa75aed\") " Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.645669 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0f7675b-6614-4e41-86e6-364b7f04664e-combined-ca-bundle\") pod \"e0f7675b-6614-4e41-86e6-364b7f04664e\" (UID: \"e0f7675b-6614-4e41-86e6-364b7f04664e\") " Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.645723 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28cb84ad-b399-4fe4-9631-e481dfa75aed-config\") pod \"28cb84ad-b399-4fe4-9631-e481dfa75aed\" (UID: \"28cb84ad-b399-4fe4-9631-e481dfa75aed\") " Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.645743 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/28cb84ad-b399-4fe4-9631-e481dfa75aed-metrics-certs-tls-certs\") pod \"28cb84ad-b399-4fe4-9631-e481dfa75aed\" (UID: \"28cb84ad-b399-4fe4-9631-e481dfa75aed\") " Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.645927 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28cb84ad-b399-4fe4-9631-e481dfa75aed-combined-ca-bundle\") pod \"28cb84ad-b399-4fe4-9631-e481dfa75aed\" (UID: \"28cb84ad-b399-4fe4-9631-e481dfa75aed\") " Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.651533 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28cb84ad-b399-4fe4-9631-e481dfa75aed-scripts" (OuterVolumeSpecName: "scripts") pod "28cb84ad-b399-4fe4-9631-e481dfa75aed" (UID: "28cb84ad-b399-4fe4-9631-e481dfa75aed"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.651604 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0f7675b-6614-4e41-86e6-364b7f04664e-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "e0f7675b-6614-4e41-86e6-364b7f04664e" (UID: "e0f7675b-6614-4e41-86e6-364b7f04664e"). InnerVolumeSpecName "ovn-rundir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.652915 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0f7675b-6614-4e41-86e6-364b7f04664e-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "e0f7675b-6614-4e41-86e6-364b7f04664e" (UID: "e0f7675b-6614-4e41-86e6-364b7f04664e"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.653185 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28cb84ad-b399-4fe4-9631-e481dfa75aed-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "28cb84ad-b399-4fe4-9631-e481dfa75aed" (UID: "28cb84ad-b399-4fe4-9631-e481dfa75aed"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.654178 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0f7675b-6614-4e41-86e6-364b7f04664e-config" (OuterVolumeSpecName: "config") pod "e0f7675b-6614-4e41-86e6-364b7f04664e" (UID: "e0f7675b-6614-4e41-86e6-364b7f04664e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.655059 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28cb84ad-b399-4fe4-9631-e481dfa75aed-config" (OuterVolumeSpecName: "config") pod "28cb84ad-b399-4fe4-9631-e481dfa75aed" (UID: "28cb84ad-b399-4fe4-9631-e481dfa75aed"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.659576 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0f7675b-6614-4e41-86e6-364b7f04664e-kube-api-access-mxmdr" (OuterVolumeSpecName: "kube-api-access-mxmdr") pod "e0f7675b-6614-4e41-86e6-364b7f04664e" (UID: "e0f7675b-6614-4e41-86e6-364b7f04664e"). InnerVolumeSpecName "kube-api-access-mxmdr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.659611 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28cb84ad-b399-4fe4-9631-e481dfa75aed-kube-api-access-hn4c4" (OuterVolumeSpecName: "kube-api-access-hn4c4") pod "28cb84ad-b399-4fe4-9631-e481dfa75aed" (UID: "28cb84ad-b399-4fe4-9631-e481dfa75aed"). InnerVolumeSpecName "kube-api-access-hn4c4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.664274 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "28cb84ad-b399-4fe4-9631-e481dfa75aed" (UID: "28cb84ad-b399-4fe4-9631-e481dfa75aed"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.723956 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0f7675b-6614-4e41-86e6-364b7f04664e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e0f7675b-6614-4e41-86e6-364b7f04664e" (UID: "e0f7675b-6614-4e41-86e6-364b7f04664e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.730479 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.741234 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28cb84ad-b399-4fe4-9631-e481dfa75aed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "28cb84ad-b399-4fe4-9631-e481dfa75aed" (UID: "28cb84ad-b399-4fe4-9631-e481dfa75aed"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.753165 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.753400 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="43512a9c-be3a-4c0e-a178-82c5a065acf4" containerName="nova-scheduler-scheduler" containerID="cri-o://36bf5a63f64b1da8bf0d3200a657077d8683342ea2307df72c904532c9648a0a" gracePeriod=30 Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.753836 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxmdr\" (UniqueName: \"kubernetes.io/projected/e0f7675b-6614-4e41-86e6-364b7f04664e-kube-api-access-mxmdr\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.753869 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0f7675b-6614-4e41-86e6-364b7f04664e-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.753879 5002 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e0f7675b-6614-4e41-86e6-364b7f04664e-ovn-rundir\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.753888 5002 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e0f7675b-6614-4e41-86e6-364b7f04664e-ovs-rundir\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.753897 5002 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/28cb84ad-b399-4fe4-9631-e481dfa75aed-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.753906 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hn4c4\" (UniqueName: \"kubernetes.io/projected/28cb84ad-b399-4fe4-9631-e481dfa75aed-kube-api-access-hn4c4\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.753914 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28cb84ad-b399-4fe4-9631-e481dfa75aed-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.754521 5002 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.754541 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0f7675b-6614-4e41-86e6-364b7f04664e-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.754554 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28cb84ad-b399-4fe4-9631-e481dfa75aed-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.754628 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28cb84ad-b399-4fe4-9631-e481dfa75aed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.779916 5002 generic.go:334] "Generic (PLEG): container finished" podID="54351653-7ebd-40ba-8181-bb1023f18190" containerID="1df4bb922098dea87efcb2c4b87131e9b7b7222c65f32b527e20b6017dad1370" exitCode=143 Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.787660 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"54351653-7ebd-40ba-8181-bb1023f18190","Type":"ContainerDied","Data":"1df4bb922098dea87efcb2c4b87131e9b7b7222c65f32b527e20b6017dad1370"} Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.794801 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-ghft5_e0f7675b-6614-4e41-86e6-364b7f04664e/openstack-network-exporter/0.log" Dec 09 10:25:18 crc kubenswrapper[5002]: E1209 10:25:18.794851 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c2d2c6137f09a0249a92e685f777a88211e7a367963f717a3fe59c6940a69a5 is running failed: container process not found" containerID="5c2d2c6137f09a0249a92e685f777a88211e7a367963f717a3fe59c6940a69a5" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.794985 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-ghft5" event={"ID":"e0f7675b-6614-4e41-86e6-364b7f04664e","Type":"ContainerDied","Data":"ca41310247f17d665549977b5992bdf6d31b44886b6d9cc390d984e83a71300c"} Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.795058 5002 scope.go:117] "RemoveContainer" containerID="9c44cdbba87ffe7a2f1b4ebc4e3392a164aba9181ae3fb15f68ca8ab66302c4d" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.795207 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-ghft5" Dec 09 10:25:18 crc kubenswrapper[5002]: E1209 10:25:18.797834 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c2d2c6137f09a0249a92e685f777a88211e7a367963f717a3fe59c6940a69a5 is running failed: container process not found" containerID="5c2d2c6137f09a0249a92e685f777a88211e7a367963f717a3fe59c6940a69a5" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 10:25:18 crc kubenswrapper[5002]: E1209 10:25:18.798283 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3f18346c6d45cdce8933113ee6ff0f64d79183a978ac856ba561f2eb32009782" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 10:25:18 crc kubenswrapper[5002]: E1209 10:25:18.801441 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c2d2c6137f09a0249a92e685f777a88211e7a367963f717a3fe59c6940a69a5 is running failed: container process not found" containerID="5c2d2c6137f09a0249a92e685f777a88211e7a367963f717a3fe59c6940a69a5" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 10:25:18 crc kubenswrapper[5002]: E1209 10:25:18.801477 5002 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c2d2c6137f09a0249a92e685f777a88211e7a367963f717a3fe59c6940a69a5 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-g4kc8" podUID="26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6" containerName="ovsdb-server" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.803535 5002 generic.go:334] "Generic (PLEG): container finished" podID="cd8a7609-928f-4a68-9903-fa846e4baeda" containerID="47df4f8a2eceea5148332f48a2f1938fcdc34779680d4051e6a0db02acbf62a9" exitCode=143 Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.803606 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cd8a7609-928f-4a68-9903-fa846e4baeda","Type":"ContainerDied","Data":"47df4f8a2eceea5148332f48a2f1938fcdc34779680d4051e6a0db02acbf62a9"} Dec 09 10:25:18 crc kubenswrapper[5002]: E1209 10:25:18.803639 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3f18346c6d45cdce8933113ee6ff0f64d79183a978ac856ba561f2eb32009782" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 10:25:18 crc kubenswrapper[5002]: E1209 10:25:18.809510 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3f18346c6d45cdce8933113ee6ff0f64d79183a978ac856ba561f2eb32009782" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 10:25:18 crc kubenswrapper[5002]: E1209 10:25:18.809551 5002 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-g4kc8" 
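The exitCode values in the ContainerDied events follow the usual 128+signal convention: 143 = 128 + SIGTERM(15) for containers that honored the graceful stop, 137 = 128 + SIGKILL(9) for ones killed after the grace period, 0 for clean exits, and small plain codes (like 2 above) are the process's own error status. A small illustrative decoder:

package main

import "fmt"

// describeExit decodes a container exit status: by POSIX convention,
// 128+N means "terminated by signal N".
func describeExit(code int) string {
	switch {
	case code == 0:
		return "clean exit"
	case code > 128:
		return fmt.Sprintf("killed by signal %d", code-128) // 143=SIGTERM, 137=SIGKILL
	default:
		return fmt.Sprintf("process error status %d", code)
	}
}

func main() {
	for _, c := range []int{0, 2, 143, 137} {
		fmt.Printf("exitCode=%d: %s\n", c, describeExit(c))
	}
}
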
podUID="26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6" containerName="ovs-vswitchd" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.821429 5002 generic.go:334] "Generic (PLEG): container finished" podID="65df60b6-4049-47b6-9907-ebf76c151213" containerID="1dd5020ee445c45b526aafb1a2e0ba3b4c5c0fb89e017224140c385ac48d4a20" exitCode=143 Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.821780 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"65df60b6-4049-47b6-9907-ebf76c151213","Type":"ContainerDied","Data":"1dd5020ee445c45b526aafb1a2e0ba3b4c5c0fb89e017224140c385ac48d4a20"} Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.824594 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_36fbd6d1-d87d-45a2-9bca-0f25f3daca0c/ovn-northd/0.log" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.824652 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"36fbd6d1-d87d-45a2-9bca-0f25f3daca0c","Type":"ContainerDied","Data":"08b0af9597b2797bac52a9423398ee7af507abb50d93d997131e6ed75aa92eba"} Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.824684 5002 scope.go:117] "RemoveContainer" containerID="e7405eccb60d4c551738f72265103db45ab534fc28ebd77e569e7e80a729397d" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.824778 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.832739 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cindere0a9-account-delete-b5zfk" event={"ID":"ae6c00ce-3152-42ae-890f-bb76aac103c5","Type":"ContainerStarted","Data":"0b53d8e361dde882e6f1fc7c25ad43d552fd155607d3b0f8ef6d766e9eb8fb9d"} Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.849017 5002 generic.go:334] "Generic (PLEG): container finished" podID="0172d8ed-9ef1-4aac-b246-1b1ed0df87fc" containerID="e31fd05bd7e517de1ef420971d431d6a2f3089efe18a744cd769acc8bbc26a81" exitCode=143 Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.849093 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-56f74754d8-5pd9q" event={"ID":"0172d8ed-9ef1-4aac-b246-1b1ed0df87fc","Type":"ContainerDied","Data":"e31fd05bd7e517de1ef420971d431d6a2f3089efe18a744cd769acc8bbc26a81"} Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.851751 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_28cb84ad-b399-4fe4-9631-e481dfa75aed/ovsdbserver-sb/0.log" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.851806 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"28cb84ad-b399-4fe4-9631-e481dfa75aed","Type":"ContainerDied","Data":"96faafc2f3c60b72f11fea10579fb5eb0859efde3ad8721d48d167fd9c26d95d"} Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.851902 5002 util.go:48] "No ready sandbox for pod can be found. 
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.852519 5002 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc"
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.852974 5002 scope.go:117] "RemoveContainer" containerID="077beda74ae3e5c25e7ec8cca4e2084bfba25475c005f8a85d1fd2f854c613be"
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.855773 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c9e681d8-0720-4f5e-8893-ec4f1cf43edf-openstack-config-secret\") pod \"c9e681d8-0720-4f5e-8893-ec4f1cf43edf\" (UID: \"c9e681d8-0720-4f5e-8893-ec4f1cf43edf\") "
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.855803 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c9e681d8-0720-4f5e-8893-ec4f1cf43edf-openstack-config\") pod \"c9e681d8-0720-4f5e-8893-ec4f1cf43edf\" (UID: \"c9e681d8-0720-4f5e-8893-ec4f1cf43edf\") "
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.855895 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e681d8-0720-4f5e-8893-ec4f1cf43edf-combined-ca-bundle\") pod \"c9e681d8-0720-4f5e-8893-ec4f1cf43edf\" (UID: \"c9e681d8-0720-4f5e-8893-ec4f1cf43edf\") "
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.855953 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6x44\" (UniqueName: \"kubernetes.io/projected/c9e681d8-0720-4f5e-8893-ec4f1cf43edf-kube-api-access-f6x44\") pod \"c9e681d8-0720-4f5e-8893-ec4f1cf43edf\" (UID: \"c9e681d8-0720-4f5e-8893-ec4f1cf43edf\") "
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.856717 5002 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\""
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.884409 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"]
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.889645 5002 scope.go:117] "RemoveContainer" containerID="faa9bd08445de88d08d39878352fad3e34339b91a452f594ffc3a3e18843d443"
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.894379 5002 generic.go:334] "Generic (PLEG): container finished" podID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerID="7f97157518e27503872febf7ddda3d551dbd7a2803115b464389b1571f0d9e20" exitCode=0
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.894413 5002 generic.go:334] "Generic (PLEG): container finished" podID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerID="4430f2180524a0ee4235580c2b9d8df48e33a7e227458714f054c2d1a1d7f033" exitCode=0
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.894423 5002 generic.go:334] "Generic (PLEG): container finished" podID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerID="1d1d17700b9060a9639585ef5fdeb4b80245971f6249c58f20c9f2f5a5cba149" exitCode=0
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.894433 5002 generic.go:334] "Generic (PLEG): container finished" podID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerID="e4190f8e2ba7bf2e5668737db294b27ff0a3fe18ec67358a90971c6b810a30cb" exitCode=0
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.894440 5002 generic.go:334] "Generic (PLEG): container finished" podID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerID="4a883aac8893d883c07d84b09c58fe2a39a0f93405bc123e2e6b73703862158e" exitCode=0
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.894466 5002 generic.go:334] "Generic (PLEG): container finished" podID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerID="64c1de5a8db5505ebb9ab646a19408256aae76bc990663c6b9d95baed247ddda" exitCode=0
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.894474 5002 generic.go:334] "Generic (PLEG): container finished" podID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerID="8c060664face312f2e3371362872261fcd8f50e7bd540c37d281ee014c188bec" exitCode=0
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.894480 5002 generic.go:334] "Generic (PLEG): container finished" podID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerID="9c6faf241f54027209b7cce0e0fc9faf46499ba3ee26bb97a888eb1ca008dd04" exitCode=0
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.894486 5002 generic.go:334] "Generic (PLEG): container finished" podID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerID="6475b21d3689ac93f09493f21c1ff2efd22fc48ca484d69e1663bc5217a41423" exitCode=0
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.894502 5002 generic.go:334] "Generic (PLEG): container finished" podID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerID="8ef48e223ac981dd68fad5f23cdc81f9ff45600e033082f26b51348da7a3e364" exitCode=0
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.894542 5002 generic.go:334] "Generic (PLEG): container finished" podID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerID="b75afaa2bb61d58afc5343f8437dbcd68fa039ca4733d2850879069289c78567" exitCode=0
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.894551 5002 generic.go:334] "Generic (PLEG): container finished" podID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerID="2a8bccd609bab5986054b408416dade80cc7b8cf1c6ce51df003879ca5e0a92d" exitCode=0
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.894557 5002 generic.go:334] "Generic (PLEG): container finished" podID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerID="3f70ac59be273071ab01746bf90703513be9b442ef9be9f9e0ab6ecb0f2e0e47" exitCode=0
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.894564 5002 generic.go:334] "Generic (PLEG): container finished" podID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerID="572968954f44aa3432e15bf47ef3b7d45a9cac349101fcd97fb7b56fd86110b3" exitCode=0
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.894649 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dfa166a7-dec2-453d-9cd9-f77d30f1636a","Type":"ContainerDied","Data":"7f97157518e27503872febf7ddda3d551dbd7a2803115b464389b1571f0d9e20"}
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.894675 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dfa166a7-dec2-453d-9cd9-f77d30f1636a","Type":"ContainerDied","Data":"4430f2180524a0ee4235580c2b9d8df48e33a7e227458714f054c2d1a1d7f033"}
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.894686 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dfa166a7-dec2-453d-9cd9-f77d30f1636a","Type":"ContainerDied","Data":"1d1d17700b9060a9639585ef5fdeb4b80245971f6249c58f20c9f2f5a5cba149"}
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.894696 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dfa166a7-dec2-453d-9cd9-f77d30f1636a","Type":"ContainerDied","Data":"e4190f8e2ba7bf2e5668737db294b27ff0a3fe18ec67358a90971c6b810a30cb"}
pod="openstack/swift-storage-0" event={"ID":"dfa166a7-dec2-453d-9cd9-f77d30f1636a","Type":"ContainerDied","Data":"e4190f8e2ba7bf2e5668737db294b27ff0a3fe18ec67358a90971c6b810a30cb"} Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.894705 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dfa166a7-dec2-453d-9cd9-f77d30f1636a","Type":"ContainerDied","Data":"4a883aac8893d883c07d84b09c58fe2a39a0f93405bc123e2e6b73703862158e"} Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.894715 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dfa166a7-dec2-453d-9cd9-f77d30f1636a","Type":"ContainerDied","Data":"64c1de5a8db5505ebb9ab646a19408256aae76bc990663c6b9d95baed247ddda"} Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.894724 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dfa166a7-dec2-453d-9cd9-f77d30f1636a","Type":"ContainerDied","Data":"8c060664face312f2e3371362872261fcd8f50e7bd540c37d281ee014c188bec"} Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.894733 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dfa166a7-dec2-453d-9cd9-f77d30f1636a","Type":"ContainerDied","Data":"9c6faf241f54027209b7cce0e0fc9faf46499ba3ee26bb97a888eb1ca008dd04"} Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.894742 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dfa166a7-dec2-453d-9cd9-f77d30f1636a","Type":"ContainerDied","Data":"6475b21d3689ac93f09493f21c1ff2efd22fc48ca484d69e1663bc5217a41423"} Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.894750 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dfa166a7-dec2-453d-9cd9-f77d30f1636a","Type":"ContainerDied","Data":"8ef48e223ac981dd68fad5f23cdc81f9ff45600e033082f26b51348da7a3e364"} Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.894758 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dfa166a7-dec2-453d-9cd9-f77d30f1636a","Type":"ContainerDied","Data":"b75afaa2bb61d58afc5343f8437dbcd68fa039ca4733d2850879069289c78567"} Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.894766 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dfa166a7-dec2-453d-9cd9-f77d30f1636a","Type":"ContainerDied","Data":"2a8bccd609bab5986054b408416dade80cc7b8cf1c6ce51df003879ca5e0a92d"} Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.894774 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dfa166a7-dec2-453d-9cd9-f77d30f1636a","Type":"ContainerDied","Data":"3f70ac59be273071ab01746bf90703513be9b442ef9be9f9e0ab6ecb0f2e0e47"} Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.894781 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dfa166a7-dec2-453d-9cd9-f77d30f1636a","Type":"ContainerDied","Data":"572968954f44aa3432e15bf47ef3b7d45a9cac349101fcd97fb7b56fd86110b3"} Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.899912 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.910526 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glancea775-account-delete-zmp4n"] Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.911748 5002 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9e681d8-0720-4f5e-8893-ec4f1cf43edf-kube-api-access-f6x44" (OuterVolumeSpecName: "kube-api-access-f6x44") pod "c9e681d8-0720-4f5e-8893-ec4f1cf43edf" (UID: "c9e681d8-0720-4f5e-8893-ec4f1cf43edf"). InnerVolumeSpecName "kube-api-access-f6x44". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.915952 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28cb84ad-b399-4fe4-9631-e481dfa75aed-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "28cb84ad-b399-4fe4-9631-e481dfa75aed" (UID: "28cb84ad-b399-4fe4-9631-e481dfa75aed"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.917517 5002 generic.go:334] "Generic (PLEG): container finished" podID="26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6" containerID="5c2d2c6137f09a0249a92e685f777a88211e7a367963f717a3fe59c6940a69a5" exitCode=0 Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.917555 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-g4kc8" event={"ID":"26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6","Type":"ContainerDied","Data":"5c2d2c6137f09a0249a92e685f777a88211e7a367963f717a3fe59c6940a69a5"} Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.918096 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9e681d8-0720-4f5e-8893-ec4f1cf43edf-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "c9e681d8-0720-4f5e-8893-ec4f1cf43edf" (UID: "c9e681d8-0720-4f5e-8893-ec4f1cf43edf"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.924038 5002 generic.go:334] "Generic (PLEG): container finished" podID="6b5836b7-7b16-477f-9a20-f30032362374" containerID="290faba9bb523cd08a95a906cba830cdb0fb097cbabfe914375d5a0fcdb253cd" exitCode=143 Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.924103 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6b5836b7-7b16-477f-9a20-f30032362374","Type":"ContainerDied","Data":"290faba9bb523cd08a95a906cba830cdb0fb097cbabfe914375d5a0fcdb253cd"} Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.927087 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28cb84ad-b399-4fe4-9631-e481dfa75aed-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "28cb84ad-b399-4fe4-9631-e481dfa75aed" (UID: "28cb84ad-b399-4fe4-9631-e481dfa75aed"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.934339 5002 scope.go:117] "RemoveContainer" containerID="2215494876baf67d40bfc6391dc6cc221f9e14b2fc38cc62efc7ad13c22f507b" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.938826 5002 generic.go:334] "Generic (PLEG): container finished" podID="02c94bee-a522-4ea6-85af-1ba68e174203" containerID="63586c424dc327bbd0545721f624a8d5562a690863d2d8565ad85890deded297" exitCode=143 Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.939517 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"02c94bee-a522-4ea6-85af-1ba68e174203","Type":"ContainerDied","Data":"63586c424dc327bbd0545721f624a8d5562a690863d2d8565ad85890deded297"} Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.942373 5002 generic.go:334] "Generic (PLEG): container finished" podID="c9198258-4919-4ade-88ba-4a0773b32012" containerID="9ce8868f1af38995fb075822c7445d85a0cf6501e86e92fde84885aaa86d36ad" exitCode=0 Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.942391 5002 generic.go:334] "Generic (PLEG): container finished" podID="c9198258-4919-4ade-88ba-4a0773b32012" containerID="3cfc9050975f650d9997515f3f47032beccc8afe01479ddcb1d077ee3deff954" exitCode=143 Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.942424 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5855d5f975-nmr2s" event={"ID":"c9198258-4919-4ade-88ba-4a0773b32012","Type":"ContainerDied","Data":"9ce8868f1af38995fb075822c7445d85a0cf6501e86e92fde84885aaa86d36ad"} Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.942444 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5855d5f975-nmr2s" event={"ID":"c9198258-4919-4ade-88ba-4a0773b32012","Type":"ContainerDied","Data":"3cfc9050975f650d9997515f3f47032beccc8afe01479ddcb1d077ee3deff954"} Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.943913 5002 generic.go:334] "Generic (PLEG): container finished" podID="c4ddce94-6333-4233-951d-571a761b708f" containerID="1685138d02316a9a12cf58ddcc259dbabe79ffe5bb459f0ae7a6a2cbefac195b" exitCode=0 Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.943930 5002 generic.go:334] "Generic (PLEG): container finished" podID="c4ddce94-6333-4233-951d-571a761b708f" containerID="138deb878d9eae8db13bac5892a85b4950d565f4c2cf09cbe03061fbf965d6a2" exitCode=143 Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.943959 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-776dccf8bb-k9gt4" event={"ID":"c4ddce94-6333-4233-951d-571a761b708f","Type":"ContainerDied","Data":"1685138d02316a9a12cf58ddcc259dbabe79ffe5bb459f0ae7a6a2cbefac195b"} Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.943977 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-776dccf8bb-k9gt4" event={"ID":"c4ddce94-6333-4233-951d-571a761b708f","Type":"ContainerDied","Data":"138deb878d9eae8db13bac5892a85b4950d565f4c2cf09cbe03061fbf965d6a2"} Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.945597 5002 generic.go:334] "Generic (PLEG): container finished" podID="b613f5a4-9369-45ae-8c2c-10e16e639999" containerID="58ecfbecbf171e347e375d6249ce709b457a4fd2276c833f84808e224ef3e5ed" exitCode=0 Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.945636 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"b613f5a4-9369-45ae-8c2c-10e16e639999","Type":"ContainerDied","Data":"58ecfbecbf171e347e375d6249ce709b457a4fd2276c833f84808e224ef3e5ed"} Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.951475 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9e681d8-0720-4f5e-8893-ec4f1cf43edf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9e681d8-0720-4f5e-8893-ec4f1cf43edf" (UID: "c9e681d8-0720-4f5e-8893-ec4f1cf43edf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.958042 5002 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/28cb84ad-b399-4fe4-9631-e481dfa75aed-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.958071 5002 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c9e681d8-0720-4f5e-8893-ec4f1cf43edf-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.958080 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e681d8-0720-4f5e-8893-ec4f1cf43edf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.958091 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6x44\" (UniqueName: \"kubernetes.io/projected/c9e681d8-0720-4f5e-8893-ec4f1cf43edf-kube-api-access-f6x44\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.958099 5002 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/28cb84ad-b399-4fe4-9631-e481dfa75aed-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:18 crc kubenswrapper[5002]: W1209 10:25:18.964027 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6b9775f_22d1_413b_8d2f_1dbe890b582c.slice/crio-52a71cddee3baa6357f7934a437e21cdbc14fece93531ef0157841bd9610069e WatchSource:0}: Error finding container 52a71cddee3baa6357f7934a437e21cdbc14fece93531ef0157841bd9610069e: Status 404 returned error can't find the container with id 52a71cddee3baa6357f7934a437e21cdbc14fece93531ef0157841bd9610069e Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.964189 5002 generic.go:334] "Generic (PLEG): container finished" podID="41f46a2d-f158-497f-b61b-60f39c64149b" containerID="a9f9119ac359c6cfc08e9e4ec057f2b160949a68cbe68fb1fa38fc53c004e69a" exitCode=0 Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.964301 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-857f77df5c-skx8f" event={"ID":"41f46a2d-f158-497f-b61b-60f39c64149b","Type":"ContainerDied","Data":"a9f9119ac359c6cfc08e9e4ec057f2b160949a68cbe68fb1fa38fc53c004e69a"} Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.971188 5002 generic.go:334] "Generic (PLEG): container finished" podID="c9e681d8-0720-4f5e-8893-ec4f1cf43edf" containerID="bdb3e6450b2a8a07c44a8a4e234bdae3f558edb01f57c75bf896f882589094c5" exitCode=137 Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.971308 5002 util.go:48] "No ready sandbox for pod can be found. 
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.985215 5002 generic.go:334] "Generic (PLEG): container finished" podID="e0a5beb3-4401-42b8-b8e3-4d2af995a4d0" containerID="3ddbe12bd810be7f1ea041b90c527d9d3fc61e0ea9bf9d295213946e6d8f92d8" exitCode=0
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.985302 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-cbdk5" event={"ID":"e0a5beb3-4401-42b8-b8e3-4d2af995a4d0","Type":"ContainerDied","Data":"3ddbe12bd810be7f1ea041b90c527d9d3fc61e0ea9bf9d295213946e6d8f92d8"}
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.989432 5002 generic.go:334] "Generic (PLEG): container finished" podID="a4061af7-7669-4bd4-a36c-6ec982e86753" containerID="6cb449e2adfcabb9641ca2b98611d189bd76e94105b2edc6a7b7b41e8dbf68a0" exitCode=143
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.989474 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c454948fd-lwcxn" event={"ID":"a4061af7-7669-4bd4-a36c-6ec982e86753","Type":"ContainerDied","Data":"6cb449e2adfcabb9641ca2b98611d189bd76e94105b2edc6a7b7b41e8dbf68a0"}
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.990978 5002 generic.go:334] "Generic (PLEG): container finished" podID="f702a539-ec25-44d4-8629-97b3c5499b96" containerID="346c243539a09bb0cf2ecabd1fa68b92b5e3b4d887823c3ba0eff1a45067f934" exitCode=0
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.991011 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f702a539-ec25-44d4-8629-97b3c5499b96","Type":"ContainerDied","Data":"346c243539a09bb0cf2ecabd1fa68b92b5e3b4d887823c3ba0eff1a45067f934"}
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.992601 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_322c0304-1696-43fb-9225-a709e7e2ea89/ovsdbserver-nb/0.log"
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.992628 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"322c0304-1696-43fb-9225-a709e7e2ea89","Type":"ContainerDied","Data":"493ef9e1293cf81052f3942296fa2500b1a949f6744a038a176f166c2ff4311c"}
Dec 09 10:25:18 crc kubenswrapper[5002]: I1209 10:25:18.992694 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:18.999163 5002 scope.go:117] "RemoveContainer" containerID="bdb3e6450b2a8a07c44a8a4e234bdae3f558edb01f57c75bf896f882589094c5"
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.016797 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican73fb-account-delete-6zw8z"]
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.037649 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0f7675b-6614-4e41-86e6-364b7f04664e-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "e0f7675b-6614-4e41-86e6-364b7f04664e" (UID: "e0f7675b-6614-4e41-86e6-364b7f04664e"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.039659 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9e681d8-0720-4f5e-8893-ec4f1cf43edf-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "c9e681d8-0720-4f5e-8893-ec4f1cf43edf" (UID: "c9e681d8-0720-4f5e-8893-ec4f1cf43edf"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.052174 5002 scope.go:117] "RemoveContainer" containerID="bdb3e6450b2a8a07c44a8a4e234bdae3f558edb01f57c75bf896f882589094c5"
Dec 09 10:25:19 crc kubenswrapper[5002]: E1209 10:25:19.053250 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdb3e6450b2a8a07c44a8a4e234bdae3f558edb01f57c75bf896f882589094c5\": container with ID starting with bdb3e6450b2a8a07c44a8a4e234bdae3f558edb01f57c75bf896f882589094c5 not found: ID does not exist" containerID="bdb3e6450b2a8a07c44a8a4e234bdae3f558edb01f57c75bf896f882589094c5"
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.053289 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdb3e6450b2a8a07c44a8a4e234bdae3f558edb01f57c75bf896f882589094c5"} err="failed to get container status \"bdb3e6450b2a8a07c44a8a4e234bdae3f558edb01f57c75bf896f882589094c5\": rpc error: code = NotFound desc = could not find container \"bdb3e6450b2a8a07c44a8a4e234bdae3f558edb01f57c75bf896f882589094c5\": container with ID starting with bdb3e6450b2a8a07c44a8a4e234bdae3f558edb01f57c75bf896f882589094c5 not found: ID does not exist"
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.053485 5002 scope.go:117] "RemoveContainer" containerID="61bf667d89aa459332a0bf66073b7adb7457f78a04b9772762f9e64fcf4753ad"
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.065429 5002 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0f7675b-6614-4e41-86e6-364b7f04664e-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.065546 5002 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c9e681d8-0720-4f5e-8893-ec4f1cf43edf-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.098880 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.106481 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.108461 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5855d5f975-nmr2s"
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.108926 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-cbdk5"
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.112761 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-776dccf8bb-k9gt4"
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.126092 5002 scope.go:117] "RemoveContainer" containerID="48d09c2ebf2544131b6474f56670b0b8781e9f927fc4903fb00c40fed41a9050"
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.169371 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0a5beb3-4401-42b8-b8e3-4d2af995a4d0-dns-svc\") pod \"e0a5beb3-4401-42b8-b8e3-4d2af995a4d0\" (UID: \"e0a5beb3-4401-42b8-b8e3-4d2af995a4d0\") "
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.169415 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whgq5\" (UniqueName: \"kubernetes.io/projected/c9198258-4919-4ade-88ba-4a0773b32012-kube-api-access-whgq5\") pod \"c9198258-4919-4ade-88ba-4a0773b32012\" (UID: \"c9198258-4919-4ade-88ba-4a0773b32012\") "
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.169439 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9198258-4919-4ade-88ba-4a0773b32012-config-data\") pod \"c9198258-4919-4ade-88ba-4a0773b32012\" (UID: \"c9198258-4919-4ade-88ba-4a0773b32012\") "
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.169461 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c4ddce94-6333-4233-951d-571a761b708f-config-data-custom\") pod \"c4ddce94-6333-4233-951d-571a761b708f\" (UID: \"c4ddce94-6333-4233-951d-571a761b708f\") "
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.169491 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9198258-4919-4ade-88ba-4a0773b32012-combined-ca-bundle\") pod \"c9198258-4919-4ade-88ba-4a0773b32012\" (UID: \"c9198258-4919-4ade-88ba-4a0773b32012\") "
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.169512 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e0a5beb3-4401-42b8-b8e3-4d2af995a4d0-ovsdbserver-sb\") pod \"e0a5beb3-4401-42b8-b8e3-4d2af995a4d0\" (UID: \"e0a5beb3-4401-42b8-b8e3-4d2af995a4d0\") "
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.169569 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4ddce94-6333-4233-951d-571a761b708f-config-data\") pod \"c4ddce94-6333-4233-951d-571a761b708f\" (UID: \"c4ddce94-6333-4233-951d-571a761b708f\") "
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.169615 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4ddce94-6333-4233-951d-571a761b708f-combined-ca-bundle\") pod \"c4ddce94-6333-4233-951d-571a761b708f\" (UID: \"c4ddce94-6333-4233-951d-571a761b708f\") "
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.169644 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxtm6\" (UniqueName: \"kubernetes.io/projected/e0a5beb3-4401-42b8-b8e3-4d2af995a4d0-kube-api-access-gxtm6\") pod \"e0a5beb3-4401-42b8-b8e3-4d2af995a4d0\" (UID: \"e0a5beb3-4401-42b8-b8e3-4d2af995a4d0\") "
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.169673 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4ddce94-6333-4233-951d-571a761b708f-logs\") pod \"c4ddce94-6333-4233-951d-571a761b708f\" (UID: \"c4ddce94-6333-4233-951d-571a761b708f\") "
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.169750 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c9198258-4919-4ade-88ba-4a0773b32012-config-data-custom\") pod \"c9198258-4919-4ade-88ba-4a0773b32012\" (UID: \"c9198258-4919-4ade-88ba-4a0773b32012\") "
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.169793 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e0a5beb3-4401-42b8-b8e3-4d2af995a4d0-dns-swift-storage-0\") pod \"e0a5beb3-4401-42b8-b8e3-4d2af995a4d0\" (UID: \"e0a5beb3-4401-42b8-b8e3-4d2af995a4d0\") "
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.169834 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vv9wj\" (UniqueName: \"kubernetes.io/projected/c4ddce94-6333-4233-951d-571a761b708f-kube-api-access-vv9wj\") pod \"c4ddce94-6333-4233-951d-571a761b708f\" (UID: \"c4ddce94-6333-4233-951d-571a761b708f\") "
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.169851 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9198258-4919-4ade-88ba-4a0773b32012-logs\") pod \"c9198258-4919-4ade-88ba-4a0773b32012\" (UID: \"c9198258-4919-4ade-88ba-4a0773b32012\") "
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.169871 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0a5beb3-4401-42b8-b8e3-4d2af995a4d0-config\") pod \"e0a5beb3-4401-42b8-b8e3-4d2af995a4d0\" (UID: \"e0a5beb3-4401-42b8-b8e3-4d2af995a4d0\") "
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.169886 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0a5beb3-4401-42b8-b8e3-4d2af995a4d0-ovsdbserver-nb\") pod \"e0a5beb3-4401-42b8-b8e3-4d2af995a4d0\" (UID: \"e0a5beb3-4401-42b8-b8e3-4d2af995a4d0\") "
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.171236 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9198258-4919-4ade-88ba-4a0773b32012-logs" (OuterVolumeSpecName: "logs") pod "c9198258-4919-4ade-88ba-4a0773b32012" (UID: "c9198258-4919-4ade-88ba-4a0773b32012"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.171533 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4ddce94-6333-4233-951d-571a761b708f-logs" (OuterVolumeSpecName: "logs") pod "c4ddce94-6333-4233-951d-571a761b708f" (UID: "c4ddce94-6333-4233-951d-571a761b708f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.173259 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4ddce94-6333-4233-951d-571a761b708f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c4ddce94-6333-4233-951d-571a761b708f" (UID: "c4ddce94-6333-4233-951d-571a761b708f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.201224 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4ddce94-6333-4233-951d-571a761b708f-kube-api-access-vv9wj" (OuterVolumeSpecName: "kube-api-access-vv9wj") pod "c4ddce94-6333-4233-951d-571a761b708f" (UID: "c4ddce94-6333-4233-951d-571a761b708f"). InnerVolumeSpecName "kube-api-access-vv9wj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.215366 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-ghft5"] Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.217986 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9198258-4919-4ade-88ba-4a0773b32012-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c9198258-4919-4ade-88ba-4a0773b32012" (UID: "c9198258-4919-4ade-88ba-4a0773b32012"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.218106 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0a5beb3-4401-42b8-b8e3-4d2af995a4d0-kube-api-access-gxtm6" (OuterVolumeSpecName: "kube-api-access-gxtm6") pod "e0a5beb3-4401-42b8-b8e3-4d2af995a4d0" (UID: "e0a5beb3-4401-42b8-b8e3-4d2af995a4d0"). InnerVolumeSpecName "kube-api-access-gxtm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.224658 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-ghft5"] Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.229201 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9198258-4919-4ade-88ba-4a0773b32012-kube-api-access-whgq5" (OuterVolumeSpecName: "kube-api-access-whgq5") pod "c9198258-4919-4ade-88ba-4a0773b32012" (UID: "c9198258-4919-4ade-88ba-4a0773b32012"). InnerVolumeSpecName "kube-api-access-whgq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.243047 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4ddce94-6333-4233-951d-571a761b708f-config-data" (OuterVolumeSpecName: "config-data") pod "c4ddce94-6333-4233-951d-571a761b708f" (UID: "c4ddce94-6333-4233-951d-571a761b708f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.257284 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9198258-4919-4ade-88ba-4a0773b32012-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9198258-4919-4ade-88ba-4a0773b32012" (UID: "c9198258-4919-4ade-88ba-4a0773b32012"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.271689 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whgq5\" (UniqueName: \"kubernetes.io/projected/c9198258-4919-4ade-88ba-4a0773b32012-kube-api-access-whgq5\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.271718 5002 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c4ddce94-6333-4233-951d-571a761b708f-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.271727 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9198258-4919-4ade-88ba-4a0773b32012-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.271735 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4ddce94-6333-4233-951d-571a761b708f-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.271744 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxtm6\" (UniqueName: \"kubernetes.io/projected/e0a5beb3-4401-42b8-b8e3-4d2af995a4d0-kube-api-access-gxtm6\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.271751 5002 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4ddce94-6333-4233-951d-571a761b708f-logs\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.271759 5002 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c9198258-4919-4ade-88ba-4a0773b32012-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.271768 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vv9wj\" (UniqueName: \"kubernetes.io/projected/c4ddce94-6333-4233-951d-571a761b708f-kube-api-access-vv9wj\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.271776 5002 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9198258-4919-4ade-88ba-4a0773b32012-logs\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.294790 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0a5beb3-4401-42b8-b8e3-4d2af995a4d0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e0a5beb3-4401-42b8-b8e3-4d2af995a4d0" (UID: "e0a5beb3-4401-42b8-b8e3-4d2af995a4d0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.309656 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4ddce94-6333-4233-951d-571a761b708f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c4ddce94-6333-4233-951d-571a761b708f" (UID: "c4ddce94-6333-4233-951d-571a761b708f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.310241 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0a5beb3-4401-42b8-b8e3-4d2af995a4d0-config" (OuterVolumeSpecName: "config") pod "e0a5beb3-4401-42b8-b8e3-4d2af995a4d0" (UID: "e0a5beb3-4401-42b8-b8e3-4d2af995a4d0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.334859 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0a5beb3-4401-42b8-b8e3-4d2af995a4d0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e0a5beb3-4401-42b8-b8e3-4d2af995a4d0" (UID: "e0a5beb3-4401-42b8-b8e3-4d2af995a4d0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.355279 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-5c99967b8c-vjq4g"] Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.355509 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-5c99967b8c-vjq4g" podUID="1bf056c0-a496-4499-92c7-3b1300b4a29d" containerName="proxy-httpd" containerID="cri-o://5f516788aa2a399cee7b3aa95c438e477b373a2ef4a7033783be06cdfa843ec6" gracePeriod=30 Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.357498 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-5c99967b8c-vjq4g" podUID="1bf056c0-a496-4499-92c7-3b1300b4a29d" containerName="proxy-server" containerID="cri-o://3edf9b4007e80e9e88d05e62a1daa140acdad44a7fdd6235ed0a8bb73242f7e0" gracePeriod=30 Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.376217 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0a5beb3-4401-42b8-b8e3-4d2af995a4d0-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.376248 5002 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0a5beb3-4401-42b8-b8e3-4d2af995a4d0-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.376257 5002 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e0a5beb3-4401-42b8-b8e3-4d2af995a4d0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.376269 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4ddce94-6333-4233-951d-571a761b708f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.386401 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0a5beb3-4401-42b8-b8e3-4d2af995a4d0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e0a5beb3-4401-42b8-b8e3-4d2af995a4d0" (UID: "e0a5beb3-4401-42b8-b8e3-4d2af995a4d0"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.389847 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9198258-4919-4ade-88ba-4a0773b32012-config-data" (OuterVolumeSpecName: "config-data") pod "c9198258-4919-4ade-88ba-4a0773b32012" (UID: "c9198258-4919-4ade-88ba-4a0773b32012"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.397698 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0a5beb3-4401-42b8-b8e3-4d2af995a4d0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e0a5beb3-4401-42b8-b8e3-4d2af995a4d0" (UID: "e0a5beb3-4401-42b8-b8e3-4d2af995a4d0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.478347 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9198258-4919-4ade-88ba-4a0773b32012-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.478373 5002 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e0a5beb3-4401-42b8-b8e3-4d2af995a4d0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.478382 5002 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0a5beb3-4401-42b8-b8e3-4d2af995a4d0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:19 crc kubenswrapper[5002]: E1209 10:25:19.580430 5002 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 09 10:25:19 crc kubenswrapper[5002]: E1209 10:25:19.580782 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9278e14e-2524-4e42-b870-f493ea02ede8-config-data podName:9278e14e-2524-4e42-b870-f493ea02ede8 nodeName:}" failed. No retries permitted until 2025-12-09 10:25:23.580766188 +0000 UTC m=+1455.972817269 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/9278e14e-2524-4e42-b870-f493ea02ede8-config-data") pod "rabbitmq-cell1-server-0" (UID: "9278e14e-2524-4e42-b870-f493ea02ede8") : configmap "rabbitmq-cell1-config-data" not found Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.615485 5002 util.go:48] "No ready sandbox for pod can be found. 
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.653126 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Dec 09 10:25:19 crc kubenswrapper[5002]: E1209 10:25:19.670013 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ab87dad784b7bbb18f78fbb2775ca2f2d80ebe172b8bdf2a5111b3315647afd9 is running failed: container process not found" containerID="ab87dad784b7bbb18f78fbb2775ca2f2d80ebe172b8bdf2a5111b3315647afd9" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Dec 09 10:25:19 crc kubenswrapper[5002]: E1209 10:25:19.674910 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ab87dad784b7bbb18f78fbb2775ca2f2d80ebe172b8bdf2a5111b3315647afd9 is running failed: container process not found" containerID="ab87dad784b7bbb18f78fbb2775ca2f2d80ebe172b8bdf2a5111b3315647afd9" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.675046 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Dec 09 10:25:19 crc kubenswrapper[5002]: E1209 10:25:19.678117 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ab87dad784b7bbb18f78fbb2775ca2f2d80ebe172b8bdf2a5111b3315647afd9 is running failed: container process not found" containerID="ab87dad784b7bbb18f78fbb2775ca2f2d80ebe172b8bdf2a5111b3315647afd9" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Dec 09 10:25:19 crc kubenswrapper[5002]: E1209 10:25:19.678151 5002 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ab87dad784b7bbb18f78fbb2775ca2f2d80ebe172b8bdf2a5111b3315647afd9 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="e5a9794a-b66f-40d4-9e70-efc6a0a72d83" containerName="nova-cell1-conductor-conductor"
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.681968 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbmfh\" (UniqueName: \"kubernetes.io/projected/b613f5a4-9369-45ae-8c2c-10e16e639999-kube-api-access-mbmfh\") pod \"b613f5a4-9369-45ae-8c2c-10e16e639999\" (UID: \"b613f5a4-9369-45ae-8c2c-10e16e639999\") "
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.682071 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b613f5a4-9369-45ae-8c2c-10e16e639999-combined-ca-bundle\") pod \"b613f5a4-9369-45ae-8c2c-10e16e639999\" (UID: \"b613f5a4-9369-45ae-8c2c-10e16e639999\") "
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.682104 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b613f5a4-9369-45ae-8c2c-10e16e639999-nova-novncproxy-tls-certs\") pod \"b613f5a4-9369-45ae-8c2c-10e16e639999\" (UID: \"b613f5a4-9369-45ae-8c2c-10e16e639999\") "
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.682182 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b613f5a4-9369-45ae-8c2c-10e16e639999-config-data\") pod \"b613f5a4-9369-45ae-8c2c-10e16e639999\" (UID: \"b613f5a4-9369-45ae-8c2c-10e16e639999\") "
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.682255 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b613f5a4-9369-45ae-8c2c-10e16e639999-vencrypt-tls-certs\") pod \"b613f5a4-9369-45ae-8c2c-10e16e639999\" (UID: \"b613f5a4-9369-45ae-8c2c-10e16e639999\") "
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.688671 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b613f5a4-9369-45ae-8c2c-10e16e639999-kube-api-access-mbmfh" (OuterVolumeSpecName: "kube-api-access-mbmfh") pod "b613f5a4-9369-45ae-8c2c-10e16e639999" (UID: "b613f5a4-9369-45ae-8c2c-10e16e639999"). InnerVolumeSpecName "kube-api-access-mbmfh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.730060 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b613f5a4-9369-45ae-8c2c-10e16e639999-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b613f5a4-9369-45ae-8c2c-10e16e639999" (UID: "b613f5a4-9369-45ae-8c2c-10e16e639999"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.750971 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b613f5a4-9369-45ae-8c2c-10e16e639999-config-data" (OuterVolumeSpecName: "config-data") pod "b613f5a4-9369-45ae-8c2c-10e16e639999" (UID: "b613f5a4-9369-45ae-8c2c-10e16e639999"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.753049 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.785011 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b613f5a4-9369-45ae-8c2c-10e16e639999-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.785040 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b613f5a4-9369-45ae-8c2c-10e16e639999-config-data\") on node \"crc\" DevicePath \"\""
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.785050 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbmfh\" (UniqueName: \"kubernetes.io/projected/b613f5a4-9369-45ae-8c2c-10e16e639999-kube-api-access-mbmfh\") on node \"crc\" DevicePath \"\""
Dec 09 10:25:19 crc kubenswrapper[5002]: E1209 10:25:19.785107 5002 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found
Dec 09 10:25:19 crc kubenswrapper[5002]: E1209 10:25:19.785189 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/58c08274-46ea-48be-a135-0c1174cd6135-config-data podName:58c08274-46ea-48be-a135-0c1174cd6135 nodeName:}" failed. No retries permitted until 2025-12-09 10:25:23.785139735 +0000 UTC m=+1456.177190816 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/58c08274-46ea-48be-a135-0c1174cd6135-config-data") pod "rabbitmq-server-0" (UID: "58c08274-46ea-48be-a135-0c1174cd6135") : configmap "rabbitmq-config-data" not found
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.794529 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b613f5a4-9369-45ae-8c2c-10e16e639999-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "b613f5a4-9369-45ae-8c2c-10e16e639999" (UID: "b613f5a4-9369-45ae-8c2c-10e16e639999"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.825222 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b613f5a4-9369-45ae-8c2c-10e16e639999-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "b613f5a4-9369-45ae-8c2c-10e16e639999" (UID: "b613f5a4-9369-45ae-8c2c-10e16e639999"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.830073 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron1d25-account-delete-f87kn"]
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.852946 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement60c5-account-delete-729k9"]
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.882083 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell0539b-account-delete-t9blx"]
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.894964 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d100f321-6fe6-4eb3-a00c-50b9ff5e2861-operator-scripts\") pod \"d100f321-6fe6-4eb3-a00c-50b9ff5e2861\" (UID: \"d100f321-6fe6-4eb3-a00c-50b9ff5e2861\") "
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.895056 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d100f321-6fe6-4eb3-a00c-50b9ff5e2861-config-data-default\") pod \"d100f321-6fe6-4eb3-a00c-50b9ff5e2861\" (UID: \"d100f321-6fe6-4eb3-a00c-50b9ff5e2861\") "
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.895095 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d100f321-6fe6-4eb3-a00c-50b9ff5e2861-galera-tls-certs\") pod \"d100f321-6fe6-4eb3-a00c-50b9ff5e2861\" (UID: \"d100f321-6fe6-4eb3-a00c-50b9ff5e2861\") "
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.895134 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjgz9\" (UniqueName: \"kubernetes.io/projected/d100f321-6fe6-4eb3-a00c-50b9ff5e2861-kube-api-access-wjgz9\") pod \"d100f321-6fe6-4eb3-a00c-50b9ff5e2861\" (UID: \"d100f321-6fe6-4eb3-a00c-50b9ff5e2861\") "
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.895171 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d100f321-6fe6-4eb3-a00c-50b9ff5e2861-combined-ca-bundle\") pod \"d100f321-6fe6-4eb3-a00c-50b9ff5e2861\" (UID: \"d100f321-6fe6-4eb3-a00c-50b9ff5e2861\") "
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.895291 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"d100f321-6fe6-4eb3-a00c-50b9ff5e2861\" (UID: \"d100f321-6fe6-4eb3-a00c-50b9ff5e2861\") "
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.895368 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d100f321-6fe6-4eb3-a00c-50b9ff5e2861-kolla-config\") pod \"d100f321-6fe6-4eb3-a00c-50b9ff5e2861\" (UID: \"d100f321-6fe6-4eb3-a00c-50b9ff5e2861\") "
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.895400 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d100f321-6fe6-4eb3-a00c-50b9ff5e2861-config-data-generated\") pod \"d100f321-6fe6-4eb3-a00c-50b9ff5e2861\" (UID: \"d100f321-6fe6-4eb3-a00c-50b9ff5e2861\") "
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.896068 5002 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b613f5a4-9369-45ae-8c2c-10e16e639999-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.896090 5002 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b613f5a4-9369-45ae-8c2c-10e16e639999-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.910649 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d100f321-6fe6-4eb3-a00c-50b9ff5e2861-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "d100f321-6fe6-4eb3-a00c-50b9ff5e2861" (UID: "d100f321-6fe6-4eb3-a00c-50b9ff5e2861"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.913774 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d100f321-6fe6-4eb3-a00c-50b9ff5e2861-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "d100f321-6fe6-4eb3-a00c-50b9ff5e2861" (UID: "d100f321-6fe6-4eb3-a00c-50b9ff5e2861"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 10:25:19 crc kubenswrapper[5002]: E1209 10:25:19.923137 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b248460156a9f8eb7f40491f548be924840ee184fd17bcff0402ca92134b5847 is running failed: container process not found" containerID="b248460156a9f8eb7f40491f548be924840ee184fd17bcff0402ca92134b5847" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.923304 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d100f321-6fe6-4eb3-a00c-50b9ff5e2861-kube-api-access-wjgz9" (OuterVolumeSpecName: "kube-api-access-wjgz9") pod "d100f321-6fe6-4eb3-a00c-50b9ff5e2861" (UID: "d100f321-6fe6-4eb3-a00c-50b9ff5e2861"). InnerVolumeSpecName "kube-api-access-wjgz9". PluginName "kubernetes.io/projected", VolumeGidValue ""
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:25:19 crc kubenswrapper[5002]: E1209 10:25:19.924588 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b248460156a9f8eb7f40491f548be924840ee184fd17bcff0402ca92134b5847 is running failed: container process not found" containerID="b248460156a9f8eb7f40491f548be924840ee184fd17bcff0402ca92134b5847" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 09 10:25:19 crc kubenswrapper[5002]: E1209 10:25:19.925529 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b248460156a9f8eb7f40491f548be924840ee184fd17bcff0402ca92134b5847 is running failed: container process not found" containerID="b248460156a9f8eb7f40491f548be924840ee184fd17bcff0402ca92134b5847" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 09 10:25:19 crc kubenswrapper[5002]: E1209 10:25:19.925644 5002 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b248460156a9f8eb7f40491f548be924840ee184fd17bcff0402ca92134b5847 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="7ae47b25-e6fd-451f-9827-72ee4e12e526" containerName="nova-cell0-conductor-conductor" Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.927923 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapi5809-account-delete-b9cvt"] Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.931657 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d100f321-6fe6-4eb3-a00c-50b9ff5e2861-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "d100f321-6fe6-4eb3-a00c-50b9ff5e2861" (UID: "d100f321-6fe6-4eb3-a00c-50b9ff5e2861"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.942352 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d100f321-6fe6-4eb3-a00c-50b9ff5e2861-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d100f321-6fe6-4eb3-a00c-50b9ff5e2861" (UID: "d100f321-6fe6-4eb3-a00c-50b9ff5e2861"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.953300 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "mysql-db") pod "d100f321-6fe6-4eb3-a00c-50b9ff5e2861" (UID: "d100f321-6fe6-4eb3-a00c-50b9ff5e2861"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.966095 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d100f321-6fe6-4eb3-a00c-50b9ff5e2861-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d100f321-6fe6-4eb3-a00c-50b9ff5e2861" (UID: "d100f321-6fe6-4eb3-a00c-50b9ff5e2861"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.967515 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.997936 5002 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d100f321-6fe6-4eb3-a00c-50b9ff5e2861-config-data-default\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.997964 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjgz9\" (UniqueName: \"kubernetes.io/projected/d100f321-6fe6-4eb3-a00c-50b9ff5e2861-kube-api-access-wjgz9\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.997974 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d100f321-6fe6-4eb3-a00c-50b9ff5e2861-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.997997 5002 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.998006 5002 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d100f321-6fe6-4eb3-a00c-50b9ff5e2861-kolla-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.998015 5002 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d100f321-6fe6-4eb3-a00c-50b9ff5e2861-config-data-generated\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:19 crc kubenswrapper[5002]: I1209 10:25:19.998023 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d100f321-6fe6-4eb3-a00c-50b9ff5e2861-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.008567 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d100f321-6fe6-4eb3-a00c-50b9ff5e2861-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "d100f321-6fe6-4eb3-a00c-50b9ff5e2861" (UID: "d100f321-6fe6-4eb3-a00c-50b9ff5e2861"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.012785 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron1d25-account-delete-f87kn" event={"ID":"c44aced5-6d19-429a-8917-cd4229341433","Type":"ContainerStarted","Data":"815ca33aded12bd65e92a6114007217ffb908a39e3774f83c6f60f6185d4f244"} Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.016261 5002 generic.go:334] "Generic (PLEG): container finished" podID="ae6c00ce-3152-42ae-890f-bb76aac103c5" containerID="1f38de5ee96eed384a1278e773754b632caf13ae00809e5b9356f6454c15cea2" exitCode=0 Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.016552 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cindere0a9-account-delete-b5zfk" event={"ID":"ae6c00ce-3152-42ae-890f-bb76aac103c5","Type":"ContainerDied","Data":"1f38de5ee96eed384a1278e773754b632caf13ae00809e5b9356f6454c15cea2"} Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.019697 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-776dccf8bb-k9gt4" event={"ID":"c4ddce94-6333-4233-951d-571a761b708f","Type":"ContainerDied","Data":"4b3b23b6fc197a312b7003fe33b10441bdd1ab7558d8a61ed4717d43a14f0af2"} Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.019738 5002 scope.go:117] "RemoveContainer" containerID="1685138d02316a9a12cf58ddcc259dbabe79ffe5bb459f0ae7a6a2cbefac195b" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.019868 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-776dccf8bb-k9gt4" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.022893 5002 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.028453 5002 generic.go:334] "Generic (PLEG): container finished" podID="1bf056c0-a496-4499-92c7-3b1300b4a29d" containerID="3edf9b4007e80e9e88d05e62a1daa140acdad44a7fdd6235ed0a8bb73242f7e0" exitCode=0 Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.028484 5002 generic.go:334] "Generic (PLEG): container finished" podID="1bf056c0-a496-4499-92c7-3b1300b4a29d" containerID="5f516788aa2a399cee7b3aa95c438e477b373a2ef4a7033783be06cdfa843ec6" exitCode=0 Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.028564 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5c99967b8c-vjq4g" event={"ID":"1bf056c0-a496-4499-92c7-3b1300b4a29d","Type":"ContainerDied","Data":"3edf9b4007e80e9e88d05e62a1daa140acdad44a7fdd6235ed0a8bb73242f7e0"} Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.028591 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5c99967b8c-vjq4g" event={"ID":"1bf056c0-a496-4499-92c7-3b1300b4a29d","Type":"ContainerDied","Data":"5f516788aa2a399cee7b3aa95c438e477b373a2ef4a7033783be06cdfa843ec6"} Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.033680 5002 generic.go:334] "Generic (PLEG): container finished" podID="7ae47b25-e6fd-451f-9827-72ee4e12e526" containerID="b248460156a9f8eb7f40491f548be924840ee184fd17bcff0402ca92134b5847" exitCode=0 Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.033773 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" 
event={"ID":"7ae47b25-e6fd-451f-9827-72ee4e12e526","Type":"ContainerDied","Data":"b248460156a9f8eb7f40491f548be924840ee184fd17bcff0402ca92134b5847"} Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.040899 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glancea775-account-delete-zmp4n" event={"ID":"c6b9775f-22d1-413b-8d2f-1dbe890b582c","Type":"ContainerStarted","Data":"00a44392d7da93c53644215f968e2a020ff72b6f8259eac37e70a2e51e250090"} Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.040932 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glancea775-account-delete-zmp4n" event={"ID":"c6b9775f-22d1-413b-8d2f-1dbe890b582c","Type":"ContainerStarted","Data":"52a71cddee3baa6357f7934a437e21cdbc14fece93531ef0157841bd9610069e"} Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.046988 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b613f5a4-9369-45ae-8c2c-10e16e639999","Type":"ContainerDied","Data":"95902ae3f27c573bbc6865a40cc0ada0249ce299ad6554c44da8c9b964e9464b"} Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.047060 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.066864 5002 generic.go:334] "Generic (PLEG): container finished" podID="d100f321-6fe6-4eb3-a00c-50b9ff5e2861" containerID="c364964ff05fc5d33bca8efbcb8e29d176a3c1b08131de3be926d3ea34e48ec9" exitCode=0 Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.066974 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.083900 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-5855d5f975-nmr2s" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.099209 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5a9794a-b66f-40d4-9e70-efc6a0a72d83-combined-ca-bundle\") pod \"e5a9794a-b66f-40d4-9e70-efc6a0a72d83\" (UID: \"e5a9794a-b66f-40d4-9e70-efc6a0a72d83\") " Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.099939 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5a9794a-b66f-40d4-9e70-efc6a0a72d83-config-data\") pod \"e5a9794a-b66f-40d4-9e70-efc6a0a72d83\" (UID: \"e5a9794a-b66f-40d4-9e70-efc6a0a72d83\") " Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.100055 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnp7t\" (UniqueName: \"kubernetes.io/projected/e5a9794a-b66f-40d4-9e70-efc6a0a72d83-kube-api-access-lnp7t\") pod \"e5a9794a-b66f-40d4-9e70-efc6a0a72d83\" (UID: \"e5a9794a-b66f-40d4-9e70-efc6a0a72d83\") " Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.100905 5002 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d100f321-6fe6-4eb3-a00c-50b9ff5e2861-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.101196 5002 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.112154 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5a9794a-b66f-40d4-9e70-efc6a0a72d83-kube-api-access-lnp7t" (OuterVolumeSpecName: "kube-api-access-lnp7t") pod "e5a9794a-b66f-40d4-9e70-efc6a0a72d83" (UID: "e5a9794a-b66f-40d4-9e70-efc6a0a72d83"). InnerVolumeSpecName "kube-api-access-lnp7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.115399 5002 generic.go:334] "Generic (PLEG): container finished" podID="e5a9794a-b66f-40d4-9e70-efc6a0a72d83" containerID="ab87dad784b7bbb18f78fbb2775ca2f2d80ebe172b8bdf2a5111b3315647afd9" exitCode=0 Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.115560 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.119045 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15a833c4-f8ea-4259-a659-a11ea55a8f88" path="/var/lib/kubelet/pods/15a833c4-f8ea-4259-a659-a11ea55a8f88/volumes" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.119541 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26e2d58a-f6d2-4e30-a327-042f181b7ba0" path="/var/lib/kubelet/pods/26e2d58a-f6d2-4e30-a327-042f181b7ba0/volumes" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.123415 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28cb84ad-b399-4fe4-9631-e481dfa75aed" path="/var/lib/kubelet/pods/28cb84ad-b399-4fe4-9631-e481dfa75aed/volumes" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.124116 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="322c0304-1696-43fb-9225-a709e7e2ea89" path="/var/lib/kubelet/pods/322c0304-1696-43fb-9225-a709e7e2ea89/volumes" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.128855 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36fbd6d1-d87d-45a2-9bca-0f25f3daca0c" path="/var/lib/kubelet/pods/36fbd6d1-d87d-45a2-9bca-0f25f3daca0c/volumes" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.129494 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9e681d8-0720-4f5e-8893-ec4f1cf43edf" path="/var/lib/kubelet/pods/c9e681d8-0720-4f5e-8893-ec4f1cf43edf/volumes" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.130021 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0f7675b-6614-4e41-86e6-364b7f04664e" path="/var/lib/kubelet/pods/e0f7675b-6614-4e41-86e6-364b7f04664e/volumes" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.144058 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-cbdk5" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.146907 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5a9794a-b66f-40d4-9e70-efc6a0a72d83-config-data" (OuterVolumeSpecName: "config-data") pod "e5a9794a-b66f-40d4-9e70-efc6a0a72d83" (UID: "e5a9794a-b66f-40d4-9e70-efc6a0a72d83"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.151987 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5a9794a-b66f-40d4-9e70-efc6a0a72d83-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5a9794a-b66f-40d4-9e70-efc6a0a72d83" (UID: "e5a9794a-b66f-40d4-9e70-efc6a0a72d83"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.159004 5002 scope.go:117] "RemoveContainer" containerID="138deb878d9eae8db13bac5892a85b4950d565f4c2cf09cbe03061fbf965d6a2" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.162212 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"d100f321-6fe6-4eb3-a00c-50b9ff5e2861","Type":"ContainerDied","Data":"c364964ff05fc5d33bca8efbcb8e29d176a3c1b08131de3be926d3ea34e48ec9"} Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.162250 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"d100f321-6fe6-4eb3-a00c-50b9ff5e2861","Type":"ContainerDied","Data":"40804e59a7145457f96a2a586d17f80f2f2f358bad75b7d0296bbf50d61359d2"} Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.162275 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi5809-account-delete-b9cvt" event={"ID":"f41619d4-24a3-46e4-9cb9-2e388f7cd36b","Type":"ContainerStarted","Data":"179df0062e8de5005d386674c0c65154f75bb2efd0bb9beba8dd1f9d2c20a0fa"} Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.162289 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0539b-account-delete-t9blx" event={"ID":"58de676b-7b73-4c04-b5d5-5de38a88072c","Type":"ContainerStarted","Data":"ea887b59a65e1a0c46201433d9c42e1919e4d2130b12ab041d94bf4cb968f9a2"} Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.162299 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican73fb-account-delete-6zw8z" event={"ID":"fe95257d-a02e-4f04-a543-a2db08231043","Type":"ContainerStarted","Data":"095377d58ea1601d42a04fa4489e5820cf9a568c32d93720234b6c5bcf8a9454"} Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.162308 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican73fb-account-delete-6zw8z" event={"ID":"fe95257d-a02e-4f04-a543-a2db08231043","Type":"ContainerStarted","Data":"bf76248eab81c01b78ef18341d0bd4ee580bf6798431d646712839a831d76c9f"} Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.162317 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5855d5f975-nmr2s" event={"ID":"c9198258-4919-4ade-88ba-4a0773b32012","Type":"ContainerDied","Data":"6010a8767b6a0095cf0b09e1064ec844493933934ec0fb3b6282edd13fde39be"} Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.162328 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e5a9794a-b66f-40d4-9e70-efc6a0a72d83","Type":"ContainerDied","Data":"ab87dad784b7bbb18f78fbb2775ca2f2d80ebe172b8bdf2a5111b3315647afd9"} Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.162338 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e5a9794a-b66f-40d4-9e70-efc6a0a72d83","Type":"ContainerDied","Data":"bb62e1e09e69cdfa540a5529b1263c7fc9cae85b97824273bbfb93fd66ecca52"} Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.162347 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement60c5-account-delete-729k9" event={"ID":"a67e154b-1de7-4e2b-9b87-049ea273fa01","Type":"ContainerStarted","Data":"99b62954faee487ae496d2100f99fd237cf97a90af6ebc5a45a93493d3275990"} Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.162358 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-cbdk5" 
event={"ID":"e0a5beb3-4401-42b8-b8e3-4d2af995a4d0","Type":"ContainerDied","Data":"9d8bc8ae158180c360cb14e1855a6831c104753e3830b538656e18fbb138197d"} Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.203542 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5a9794a-b66f-40d4-9e70-efc6a0a72d83-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.203570 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5a9794a-b66f-40d4-9e70-efc6a0a72d83-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.203583 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnp7t\" (UniqueName: \"kubernetes.io/projected/e5a9794a-b66f-40d4-9e70-efc6a0a72d83-kube-api-access-lnp7t\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.213499 5002 scope.go:117] "RemoveContainer" containerID="58ecfbecbf171e347e375d6249ce709b457a4fd2276c833f84808e224ef3e5ed" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.347287 5002 scope.go:117] "RemoveContainer" containerID="c364964ff05fc5d33bca8efbcb8e29d176a3c1b08131de3be926d3ea34e48ec9" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.352571 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.359711 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.368570 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.380215 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.395969 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.404647 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-776dccf8bb-k9gt4"] Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.419903 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-776dccf8bb-k9gt4"] Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.472196 5002 scope.go:117] "RemoveContainer" containerID="c4c4beba225ea5afddb1b6621102042b3f4df2ad80569272c3662253f7a03703" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.499517 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5c99967b8c-vjq4g" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.508278 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ae47b25-e6fd-451f-9827-72ee4e12e526-combined-ca-bundle\") pod \"7ae47b25-e6fd-451f-9827-72ee4e12e526\" (UID: \"7ae47b25-e6fd-451f-9827-72ee4e12e526\") " Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.508373 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p799z\" (UniqueName: \"kubernetes.io/projected/7ae47b25-e6fd-451f-9827-72ee4e12e526-kube-api-access-p799z\") pod \"7ae47b25-e6fd-451f-9827-72ee4e12e526\" (UID: \"7ae47b25-e6fd-451f-9827-72ee4e12e526\") " Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.508538 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ae47b25-e6fd-451f-9827-72ee4e12e526-config-data\") pod \"7ae47b25-e6fd-451f-9827-72ee4e12e526\" (UID: \"7ae47b25-e6fd-451f-9827-72ee4e12e526\") " Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.515324 5002 scope.go:117] "RemoveContainer" containerID="c364964ff05fc5d33bca8efbcb8e29d176a3c1b08131de3be926d3ea34e48ec9" Dec 09 10:25:20 crc kubenswrapper[5002]: E1209 10:25:20.518290 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c364964ff05fc5d33bca8efbcb8e29d176a3c1b08131de3be926d3ea34e48ec9\": container with ID starting with c364964ff05fc5d33bca8efbcb8e29d176a3c1b08131de3be926d3ea34e48ec9 not found: ID does not exist" containerID="c364964ff05fc5d33bca8efbcb8e29d176a3c1b08131de3be926d3ea34e48ec9" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.518335 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c364964ff05fc5d33bca8efbcb8e29d176a3c1b08131de3be926d3ea34e48ec9"} err="failed to get container status \"c364964ff05fc5d33bca8efbcb8e29d176a3c1b08131de3be926d3ea34e48ec9\": rpc error: code = NotFound desc = could not find container \"c364964ff05fc5d33bca8efbcb8e29d176a3c1b08131de3be926d3ea34e48ec9\": container with ID starting with c364964ff05fc5d33bca8efbcb8e29d176a3c1b08131de3be926d3ea34e48ec9 not found: ID does not exist" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.518366 5002 scope.go:117] "RemoveContainer" containerID="c4c4beba225ea5afddb1b6621102042b3f4df2ad80569272c3662253f7a03703" Dec 09 10:25:20 crc kubenswrapper[5002]: E1209 10:25:20.521383 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4c4beba225ea5afddb1b6621102042b3f4df2ad80569272c3662253f7a03703\": container with ID starting with c4c4beba225ea5afddb1b6621102042b3f4df2ad80569272c3662253f7a03703 not found: ID does not exist" containerID="c4c4beba225ea5afddb1b6621102042b3f4df2ad80569272c3662253f7a03703" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.521442 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4c4beba225ea5afddb1b6621102042b3f4df2ad80569272c3662253f7a03703"} err="failed to get container status \"c4c4beba225ea5afddb1b6621102042b3f4df2ad80569272c3662253f7a03703\": rpc error: code = NotFound desc = could not find container \"c4c4beba225ea5afddb1b6621102042b3f4df2ad80569272c3662253f7a03703\": container with ID starting with 
c4c4beba225ea5afddb1b6621102042b3f4df2ad80569272c3662253f7a03703 not found: ID does not exist" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.521465 5002 scope.go:117] "RemoveContainer" containerID="9ce8868f1af38995fb075822c7445d85a0cf6501e86e92fde84885aaa86d36ad" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.536070 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ae47b25-e6fd-451f-9827-72ee4e12e526-kube-api-access-p799z" (OuterVolumeSpecName: "kube-api-access-p799z") pod "7ae47b25-e6fd-451f-9827-72ee4e12e526" (UID: "7ae47b25-e6fd-451f-9827-72ee4e12e526"). InnerVolumeSpecName "kube-api-access-p799z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.602306 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ae47b25-e6fd-451f-9827-72ee4e12e526-config-data" (OuterVolumeSpecName: "config-data") pod "7ae47b25-e6fd-451f-9827-72ee4e12e526" (UID: "7ae47b25-e6fd-451f-9827-72ee4e12e526"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.610476 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bf056c0-a496-4499-92c7-3b1300b4a29d-internal-tls-certs\") pod \"1bf056c0-a496-4499-92c7-3b1300b4a29d\" (UID: \"1bf056c0-a496-4499-92c7-3b1300b4a29d\") " Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.610900 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1bf056c0-a496-4499-92c7-3b1300b4a29d-run-httpd\") pod \"1bf056c0-a496-4499-92c7-3b1300b4a29d\" (UID: \"1bf056c0-a496-4499-92c7-3b1300b4a29d\") " Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.611045 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bf056c0-a496-4499-92c7-3b1300b4a29d-combined-ca-bundle\") pod \"1bf056c0-a496-4499-92c7-3b1300b4a29d\" (UID: \"1bf056c0-a496-4499-92c7-3b1300b4a29d\") " Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.611226 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bf056c0-a496-4499-92c7-3b1300b4a29d-config-data\") pod \"1bf056c0-a496-4499-92c7-3b1300b4a29d\" (UID: \"1bf056c0-a496-4499-92c7-3b1300b4a29d\") " Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.611292 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqgc2\" (UniqueName: \"kubernetes.io/projected/1bf056c0-a496-4499-92c7-3b1300b4a29d-kube-api-access-tqgc2\") pod \"1bf056c0-a496-4499-92c7-3b1300b4a29d\" (UID: \"1bf056c0-a496-4499-92c7-3b1300b4a29d\") " Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.611361 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bf056c0-a496-4499-92c7-3b1300b4a29d-public-tls-certs\") pod \"1bf056c0-a496-4499-92c7-3b1300b4a29d\" (UID: \"1bf056c0-a496-4499-92c7-3b1300b4a29d\") " Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.611393 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1bf056c0-a496-4499-92c7-3b1300b4a29d-log-httpd\") pod 
\"1bf056c0-a496-4499-92c7-3b1300b4a29d\" (UID: \"1bf056c0-a496-4499-92c7-3b1300b4a29d\") " Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.611411 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1bf056c0-a496-4499-92c7-3b1300b4a29d-etc-swift\") pod \"1bf056c0-a496-4499-92c7-3b1300b4a29d\" (UID: \"1bf056c0-a496-4499-92c7-3b1300b4a29d\") " Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.611926 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ae47b25-e6fd-451f-9827-72ee4e12e526-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.611953 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p799z\" (UniqueName: \"kubernetes.io/projected/7ae47b25-e6fd-451f-9827-72ee4e12e526-kube-api-access-p799z\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.612488 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bf056c0-a496-4499-92c7-3b1300b4a29d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1bf056c0-a496-4499-92c7-3b1300b4a29d" (UID: "1bf056c0-a496-4499-92c7-3b1300b4a29d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.614304 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bf056c0-a496-4499-92c7-3b1300b4a29d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1bf056c0-a496-4499-92c7-3b1300b4a29d" (UID: "1bf056c0-a496-4499-92c7-3b1300b4a29d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.618984 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf056c0-a496-4499-92c7-3b1300b4a29d-kube-api-access-tqgc2" (OuterVolumeSpecName: "kube-api-access-tqgc2") pod "1bf056c0-a496-4499-92c7-3b1300b4a29d" (UID: "1bf056c0-a496-4499-92c7-3b1300b4a29d"). InnerVolumeSpecName "kube-api-access-tqgc2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.620424 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf056c0-a496-4499-92c7-3b1300b4a29d-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "1bf056c0-a496-4499-92c7-3b1300b4a29d" (UID: "1bf056c0-a496-4499-92c7-3b1300b4a29d"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.663346 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ae47b25-e6fd-451f-9827-72ee4e12e526-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ae47b25-e6fd-451f-9827-72ee4e12e526" (UID: "7ae47b25-e6fd-451f-9827-72ee4e12e526"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.708988 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf056c0-a496-4499-92c7-3b1300b4a29d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1bf056c0-a496-4499-92c7-3b1300b4a29d" (UID: "1bf056c0-a496-4499-92c7-3b1300b4a29d"). 
InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.715130 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqgc2\" (UniqueName: \"kubernetes.io/projected/1bf056c0-a496-4499-92c7-3b1300b4a29d-kube-api-access-tqgc2\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.715172 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ae47b25-e6fd-451f-9827-72ee4e12e526-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.715188 5002 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bf056c0-a496-4499-92c7-3b1300b4a29d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.715199 5002 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1bf056c0-a496-4499-92c7-3b1300b4a29d-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.715210 5002 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1bf056c0-a496-4499-92c7-3b1300b4a29d-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.715221 5002 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1bf056c0-a496-4499-92c7-3b1300b4a29d-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.731501 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf056c0-a496-4499-92c7-3b1300b4a29d-config-data" (OuterVolumeSpecName: "config-data") pod "1bf056c0-a496-4499-92c7-3b1300b4a29d" (UID: "1bf056c0-a496-4499-92c7-3b1300b4a29d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.737521 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf056c0-a496-4499-92c7-3b1300b4a29d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1bf056c0-a496-4499-92c7-3b1300b4a29d" (UID: "1bf056c0-a496-4499-92c7-3b1300b4a29d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.746007 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf056c0-a496-4499-92c7-3b1300b4a29d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1bf056c0-a496-4499-92c7-3b1300b4a29d" (UID: "1bf056c0-a496-4499-92c7-3b1300b4a29d"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.818788 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bf056c0-a496-4499-92c7-3b1300b4a29d-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.818838 5002 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bf056c0-a496-4499-92c7-3b1300b4a29d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.818853 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bf056c0-a496-4499-92c7-3b1300b4a29d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.851847 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.852115 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5893e6fa-5b64-47e0-b8e1-f68baf27a65c" containerName="ceilometer-central-agent" containerID="cri-o://7a033c871bfeb20ac9488a0b5376331a6b527d79d3198b2f244e621f3b53eb12" gracePeriod=30 Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.852482 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5893e6fa-5b64-47e0-b8e1-f68baf27a65c" containerName="proxy-httpd" containerID="cri-o://7bf0075652dce88cf4c715938171cbd87da9f63720497ca4a0f0e4c414c5e29f" gracePeriod=30 Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.852525 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5893e6fa-5b64-47e0-b8e1-f68baf27a65c" containerName="sg-core" containerID="cri-o://6b859d8bb2242febe0b904062dc96fd2d5ce25ae8ec8c45d40b3a0d97cab32ab" gracePeriod=30 Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.852560 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5893e6fa-5b64-47e0-b8e1-f68baf27a65c" containerName="ceilometer-notification-agent" containerID="cri-o://2d1c701bf68c79d11c50d424397a24223cbdbf5471946d3e7f2ac92b66b2c778" gracePeriod=30 Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.889609 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 09 10:25:20 crc kubenswrapper[5002]: I1209 10:25:20.890245 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="2adbbd67-ccdf-4444-b667-2b549bc200b5" containerName="kube-state-metrics" containerID="cri-o://7599cb61e23fe24a1fe7539f2fe49839b0013514838b373c23975501d3f53ed4" gracePeriod=30 Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.035093 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.035307 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="1e36e954-d9c1-41e3-8542-e8f300db90cb" containerName="memcached" containerID="cri-o://b1619efb43e58f8e6a06eb0439aba518ccc393a1be761ea167d429fe1d32c4c8" gracePeriod=30 Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.089822 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/keystone-db-sync-7zbg6"] Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.101874 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-jrbkq"] Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.109870 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-7zbg6"] Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.126727 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-jrbkq"] Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.140117 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-6d565f9c5b-d7trd"] Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.140601 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-6d565f9c5b-d7trd" podUID="f514395b-6067-4e42-98e6-f3c5ac427982" containerName="keystone-api" containerID="cri-o://6037ffe3713fc44574c5f932602a50f9c94b830e5d71df9a677e79d52c3571ba" gracePeriod=30 Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.198008 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.211123 5002 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/openstack-galera-0" secret="" err="secret \"galera-openstack-dockercfg-r5phn\" not found" Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.240149 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"7ae47b25-e6fd-451f-9827-72ee4e12e526","Type":"ContainerDied","Data":"da23d51b564af3329adb4eb9eebdd7e2460c0512a6b09a65189ec881705e17cf"} Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.240294 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.281686 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-d7hdr"] Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.325862 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-d7hdr"] Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.339269 5002 generic.go:334] "Generic (PLEG): container finished" podID="54351653-7ebd-40ba-8181-bb1023f18190" containerID="1e6f6d93e3bd5dcd93496f1389572c1e7e7d5c754f5b03c5cf8bcd4e3532e5b0" exitCode=0 Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.339543 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"54351653-7ebd-40ba-8181-bb1023f18190","Type":"ContainerDied","Data":"1e6f6d93e3bd5dcd93496f1389572c1e7e7d5c754f5b03c5cf8bcd4e3532e5b0"} Dec 09 10:25:21 crc kubenswrapper[5002]: E1209 10:25:21.344312 5002 configmap.go:193] Couldn't get configMap openstack/openstack-config-data: configmap "openstack-config-data" not found Dec 09 10:25:21 crc kubenswrapper[5002]: E1209 10:25:21.344331 5002 configmap.go:193] Couldn't get configMap openstack/openstack-config-data: configmap "openstack-config-data" not found Dec 09 10:25:21 crc kubenswrapper[5002]: E1209 10:25:21.344361 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7faabd78-c9ab-4397-aa4d-b8aaff302251-kolla-config podName:7faabd78-c9ab-4397-aa4d-b8aaff302251 nodeName:}" failed. 
No retries permitted until 2025-12-09 10:25:21.844347929 +0000 UTC m=+1454.236399010 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kolla-config" (UniqueName: "kubernetes.io/configmap/7faabd78-c9ab-4397-aa4d-b8aaff302251-kolla-config") pod "openstack-galera-0" (UID: "7faabd78-c9ab-4397-aa4d-b8aaff302251") : configmap "openstack-config-data" not found Dec 09 10:25:21 crc kubenswrapper[5002]: E1209 10:25:21.344368 5002 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 09 10:25:21 crc kubenswrapper[5002]: E1209 10:25:21.344374 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7faabd78-c9ab-4397-aa4d-b8aaff302251-config-data-default podName:7faabd78-c9ab-4397-aa4d-b8aaff302251 nodeName:}" failed. No retries permitted until 2025-12-09 10:25:21.8443692 +0000 UTC m=+1454.236420281 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data-default" (UniqueName: "kubernetes.io/configmap/7faabd78-c9ab-4397-aa4d-b8aaff302251-config-data-default") pod "openstack-galera-0" (UID: "7faabd78-c9ab-4397-aa4d-b8aaff302251") : configmap "openstack-config-data" not found Dec 09 10:25:21 crc kubenswrapper[5002]: E1209 10:25:21.344388 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7faabd78-c9ab-4397-aa4d-b8aaff302251-operator-scripts podName:7faabd78-c9ab-4397-aa4d-b8aaff302251 nodeName:}" failed. No retries permitted until 2025-12-09 10:25:21.8443794 +0000 UTC m=+1454.236430481 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/7faabd78-c9ab-4397-aa4d-b8aaff302251-operator-scripts") pod "openstack-galera-0" (UID: "7faabd78-c9ab-4397-aa4d-b8aaff302251") : configmap "openstack-scripts" not found Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.345115 5002 generic.go:334] "Generic (PLEG): container finished" podID="c6b9775f-22d1-413b-8d2f-1dbe890b582c" containerID="00a44392d7da93c53644215f968e2a020ff72b6f8259eac37e70a2e51e250090" exitCode=0 Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.345170 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glancea775-account-delete-zmp4n" event={"ID":"c6b9775f-22d1-413b-8d2f-1dbe890b582c","Type":"ContainerDied","Data":"00a44392d7da93c53644215f968e2a020ff72b6f8259eac37e70a2e51e250090"} Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.401124 5002 generic.go:334] "Generic (PLEG): container finished" podID="f41619d4-24a3-46e4-9cb9-2e388f7cd36b" containerID="57ce805df00a89c4c24d7fe42e68c89ab49ef78b1eebf20d9fece98c54c67e2f" exitCode=0 Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.401263 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi5809-account-delete-b9cvt" event={"ID":"f41619d4-24a3-46e4-9cb9-2e388f7cd36b","Type":"ContainerDied","Data":"57ce805df00a89c4c24d7fe42e68c89ab49ef78b1eebf20d9fece98c54c67e2f"} Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.424866 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glancea775-account-delete-zmp4n"] Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.441210 5002 generic.go:334] "Generic (PLEG): container finished" podID="02c94bee-a522-4ea6-85af-1ba68e174203" containerID="d1c65c897a0495448b0bba138435b4e6ce0da36de283bd456d1269f9d3226c84" exitCode=0 Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.441359 5002 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/cinder-api-0" event={"ID":"02c94bee-a522-4ea6-85af-1ba68e174203","Type":"ContainerDied","Data":"d1c65c897a0495448b0bba138435b4e6ce0da36de283bd456d1269f9d3226c84"} Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.490420 5002 generic.go:334] "Generic (PLEG): container finished" podID="fe95257d-a02e-4f04-a543-a2db08231043" containerID="095377d58ea1601d42a04fa4489e5820cf9a568c32d93720234b6c5bcf8a9454" exitCode=0 Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.490757 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican73fb-account-delete-6zw8z" event={"ID":"fe95257d-a02e-4f04-a543-a2db08231043","Type":"ContainerDied","Data":"095377d58ea1601d42a04fa4489e5820cf9a568c32d93720234b6c5bcf8a9454"} Dec 09 10:25:21 crc kubenswrapper[5002]: E1209 10:25:21.491553 5002 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0172d8ed_9ef1_4aac_b246_1b1ed0df87fc.slice/crio-1767450d54e834d07c9f4e17540dd48734de123d1bb880696d3cd80a533970df.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5893e6fa_5b64_47e0_b8e1_f68baf27a65c.slice/crio-6b859d8bb2242febe0b904062dc96fd2d5ce25ae8ec8c45d40b3a0d97cab32ab.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0172d8ed_9ef1_4aac_b246_1b1ed0df87fc.slice/crio-conmon-1767450d54e834d07c9f4e17540dd48734de123d1bb880696d3cd80a533970df.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda67e154b_1de7_4e2b_9b87_049ea273fa01.slice/crio-conmon-3ee0c33be841aa47fb9e6a001724fc9d6d043c260bf489595a85fb2dccaf7988.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b5836b7_7b16_477f_9a20_f30032362374.slice/crio-aef926ebfeded32c3d77daf8bf94adfe524f72c37fbafa444980929d4131d304.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd8a7609_928f_4a68_9903_fa846e4baeda.slice/crio-177ca2f00057b9f494561460d93507dad5143107f0da44739f7456ccfec82780.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5893e6fa_5b64_47e0_b8e1_f68baf27a65c.slice/crio-conmon-7bf0075652dce88cf4c715938171cbd87da9f63720497ca4a0f0e4c414c5e29f.scope\": RecentStats: unable to find data in memory cache]" Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.518271 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-a775-account-create-update-tb8ck"] Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.547369 5002 generic.go:334] "Generic (PLEG): container finished" podID="0172d8ed-9ef1-4aac-b246-1b1ed0df87fc" containerID="1767450d54e834d07c9f4e17540dd48734de123d1bb880696d3cd80a533970df" exitCode=0 Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.549182 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-56f74754d8-5pd9q" event={"ID":"0172d8ed-9ef1-4aac-b246-1b1ed0df87fc","Type":"ContainerDied","Data":"1767450d54e834d07c9f4e17540dd48734de123d1bb880696d3cd80a533970df"} Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.554776 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-a775-account-create-update-tb8ck"] 
Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.559490 5002 generic.go:334] "Generic (PLEG): container finished" podID="2adbbd67-ccdf-4444-b667-2b549bc200b5" containerID="7599cb61e23fe24a1fe7539f2fe49839b0013514838b373c23975501d3f53ed4" exitCode=2
Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.559547 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2adbbd67-ccdf-4444-b667-2b549bc200b5","Type":"ContainerDied","Data":"7599cb61e23fe24a1fe7539f2fe49839b0013514838b373c23975501d3f53ed4"}
Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.560448 5002 generic.go:334] "Generic (PLEG): container finished" podID="c44aced5-6d19-429a-8917-cd4229341433" containerID="3b5492a7c894b209b3a8a3190a110940978e60b5672a1763ad5a6607cc93171f" exitCode=0
Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.560487 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron1d25-account-delete-f87kn" event={"ID":"c44aced5-6d19-429a-8917-cd4229341433","Type":"ContainerDied","Data":"3b5492a7c894b209b3a8a3190a110940978e60b5672a1763ad5a6607cc93171f"}
Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.561869 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0539b-account-delete-t9blx" event={"ID":"58de676b-7b73-4c04-b5d5-5de38a88072c","Type":"ContainerStarted","Data":"2cb6f2c5af24125d366be1b3d0454fead9863b253da1642f513cc98b1fc0571b"}
Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.564670 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-9kc52"]
Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.598014 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-9kc52"]
Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.629655 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-10b7-account-create-update-9bwfh"]
Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.657005 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-10b7-account-create-update-9bwfh"]
Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.688451 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-qxbxl"]
Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.705151 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-qxbxl"]
Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.741436 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5c99967b8c-vjq4g" event={"ID":"1bf056c0-a496-4499-92c7-3b1300b4a29d","Type":"ContainerDied","Data":"4727074ce1552a7dbd1cb8de2c29fcabc3222bf51f6752e91c19c7c391aab0f5"}
Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.741552 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5c99967b8c-vjq4g"
Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.755882 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-60c5-account-create-update-wp67w"]
Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.765376 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement60c5-account-delete-729k9"]
Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.766488 5002 generic.go:334] "Generic (PLEG): container finished" podID="65df60b6-4049-47b6-9907-ebf76c151213" containerID="165a1620604d830dded2bfca6d82a903dd8294bcf0db3c611e329f3f40e88ace" exitCode=0
Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.766572 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"65df60b6-4049-47b6-9907-ebf76c151213","Type":"ContainerDied","Data":"165a1620604d830dded2bfca6d82a903dd8294bcf0db3c611e329f3f40e88ace"}
Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.778532 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5c454948fd-lwcxn" podUID="a4061af7-7669-4bd4-a36c-6ec982e86753" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.157:9311/healthcheck\": read tcp 10.217.0.2:58916->10.217.0.157:9311: read: connection reset by peer"
Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.778406 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5c454948fd-lwcxn" podUID="a4061af7-7669-4bd4-a36c-6ec982e86753" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.157:9311/healthcheck\": read tcp 10.217.0.2:58922->10.217.0.157:9311: read: connection reset by peer"
Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.780761 5002 generic.go:334] "Generic (PLEG): container finished" podID="5893e6fa-5b64-47e0-b8e1-f68baf27a65c" containerID="6b859d8bb2242febe0b904062dc96fd2d5ce25ae8ec8c45d40b3a0d97cab32ab" exitCode=2
Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.780939 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5893e6fa-5b64-47e0-b8e1-f68baf27a65c","Type":"ContainerDied","Data":"6b859d8bb2242febe0b904062dc96fd2d5ce25ae8ec8c45d40b3a0d97cab32ab"}
Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.785205 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement60c5-account-delete-729k9" event={"ID":"a67e154b-1de7-4e2b-9b87-049ea273fa01","Type":"ContainerStarted","Data":"3ee0c33be841aa47fb9e6a001724fc9d6d043c260bf489595a85fb2dccaf7988"}
Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.791467 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-60c5-account-create-update-wp67w"]
Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.822330 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-rztff"]
Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.832164 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-rztff"]
Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.856048 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-1d25-account-create-update-2kb69"]
Dec 09 10:25:21 crc kubenswrapper[5002]: E1209 10:25:21.858384 5002 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Dec 09 10:25:21 crc kubenswrapper[5002]: E1209 10:25:21.858539 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7faabd78-c9ab-4397-aa4d-b8aaff302251-operator-scripts podName:7faabd78-c9ab-4397-aa4d-b8aaff302251 nodeName:}" failed. No retries permitted until 2025-12-09 10:25:22.858519788 +0000 UTC m=+1455.250570879 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/7faabd78-c9ab-4397-aa4d-b8aaff302251-operator-scripts") pod "openstack-galera-0" (UID: "7faabd78-c9ab-4397-aa4d-b8aaff302251") : configmap "openstack-scripts" not found
Dec 09 10:25:21 crc kubenswrapper[5002]: E1209 10:25:21.858911 5002 configmap.go:193] Couldn't get configMap openstack/openstack-config-data: configmap "openstack-config-data" not found
Dec 09 10:25:21 crc kubenswrapper[5002]: E1209 10:25:21.859046 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7faabd78-c9ab-4397-aa4d-b8aaff302251-kolla-config podName:7faabd78-c9ab-4397-aa4d-b8aaff302251 nodeName:}" failed. No retries permitted until 2025-12-09 10:25:22.859034732 +0000 UTC m=+1455.251085813 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kolla-config" (UniqueName: "kubernetes.io/configmap/7faabd78-c9ab-4397-aa4d-b8aaff302251-kolla-config") pod "openstack-galera-0" (UID: "7faabd78-c9ab-4397-aa4d-b8aaff302251") : configmap "openstack-config-data" not found
Dec 09 10:25:21 crc kubenswrapper[5002]: E1209 10:25:21.859169 5002 configmap.go:193] Couldn't get configMap openstack/openstack-config-data: configmap "openstack-config-data" not found
Dec 09 10:25:21 crc kubenswrapper[5002]: E1209 10:25:21.859272 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7faabd78-c9ab-4397-aa4d-b8aaff302251-config-data-default podName:7faabd78-c9ab-4397-aa4d-b8aaff302251 nodeName:}" failed. No retries permitted until 2025-12-09 10:25:22.859261088 +0000 UTC m=+1455.251312169 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data-default" (UniqueName: "kubernetes.io/configmap/7faabd78-c9ab-4397-aa4d-b8aaff302251-config-data-default") pod "openstack-galera-0" (UID: "7faabd78-c9ab-4397-aa4d-b8aaff302251") : configmap "openstack-config-data" not found
Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.872949 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-1d25-account-create-update-2kb69"]
Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.889842 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron1d25-account-delete-f87kn"]
Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.915873 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-kth45"]
Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.918518 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-kth45"]
Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.933940 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-539b-account-create-update-d4qnv"]
Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.937212 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-539b-account-create-update-d4qnv"]
Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.946950 5002 scope.go:117] "RemoveContainer" containerID="3cfc9050975f650d9997515f3f47032beccc8afe01479ddcb1d077ee3deff954"
Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.947076 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell0539b-account-delete-t9blx"]
Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.952240 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-fb7r8"]
Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.959148 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-fb7r8"]
Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.967299 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-5809-account-create-update-2sfhw"]
Dec 09 10:25:21 crc kubenswrapper[5002]: I1209 10:25:21.983773 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-5809-account-create-update-2sfhw"]
Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.001952 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapi5809-account-delete-b9cvt"]
Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.002937 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.005185 5002 scope.go:117] "RemoveContainer" containerID="ab87dad784b7bbb18f78fbb2775ca2f2d80ebe172b8bdf2a5111b3315647afd9"
Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.010031 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.011556 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.011762 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-cbdk5"]
Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.019632 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-cbdk5"]
Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.026989 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.031292 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-56f74754d8-5pd9q"
Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.032875 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.043749 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-5855d5f975-nmr2s"]
Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.049469 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.055763 5002 scope.go:117] "RemoveContainer" containerID="ab87dad784b7bbb18f78fbb2775ca2f2d80ebe172b8bdf2a5111b3315647afd9"
Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.055997 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-5855d5f975-nmr2s"]
Dec 09 10:25:22 crc kubenswrapper[5002]: E1209 10:25:22.057042 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab87dad784b7bbb18f78fbb2775ca2f2d80ebe172b8bdf2a5111b3315647afd9\": container with ID starting with ab87dad784b7bbb18f78fbb2775ca2f2d80ebe172b8bdf2a5111b3315647afd9 not found: ID does not exist" containerID="ab87dad784b7bbb18f78fbb2775ca2f2d80ebe172b8bdf2a5111b3315647afd9"
Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.057088 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab87dad784b7bbb18f78fbb2775ca2f2d80ebe172b8bdf2a5111b3315647afd9"} err="failed to get container status \"ab87dad784b7bbb18f78fbb2775ca2f2d80ebe172b8bdf2a5111b3315647afd9\": rpc error: code = NotFound desc = could not find container \"ab87dad784b7bbb18f78fbb2775ca2f2d80ebe172b8bdf2a5111b3315647afd9\": container with ID starting with ab87dad784b7bbb18f78fbb2775ca2f2d80ebe172b8bdf2a5111b3315647afd9 not found: ID does not exist"
Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.057114 5002 scope.go:117] "RemoveContainer" containerID="3ddbe12bd810be7f1ea041b90c527d9d3fc61e0ea9bf9d295213946e6d8f92d8"
Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.093060 5002 scope.go:117] "RemoveContainer" containerID="adb75aabb4e985441e87b5cb83073e97ca8eadf6b2f9d37e10d79579ee3295dd"
Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.117059 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e4c9601-2d18-4b05-9187-f668fb760808" path="/var/lib/kubelet/pods/1e4c9601-2d18-4b05-9187-f668fb760808/volumes"
Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.117693 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31e0f58f-0655-466b-90b0-5b0e1887fe75" path="/var/lib/kubelet/pods/31e0f58f-0655-466b-90b0-5b0e1887fe75/volumes"
Dec 09 10:25:22 crc kubenswrapper[5002]:
I1209 10:25:22.118202 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cb87512-ad2e-4510-ab87-ac4a0a8d09ae" path="/var/lib/kubelet/pods/5cb87512-ad2e-4510-ab87-ac4a0a8d09ae/volumes" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.118645 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73bcef93-39f3-4f68-b3f2-cc78b4698e3a" path="/var/lib/kubelet/pods/73bcef93-39f3-4f68-b3f2-cc78b4698e3a/volumes" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.119758 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fb9392c-5a4f-4bc2-89b0-2c4b59853cf3" path="/var/lib/kubelet/pods/8fb9392c-5a4f-4bc2-89b0-2c4b59853cf3/volumes" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.120295 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="955f29ee-6405-41b4-b905-3438ed1344fd" path="/var/lib/kubelet/pods/955f29ee-6405-41b4-b905-3438ed1344fd/volumes" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.120760 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af9df436-38ae-4001-bce6-e97e5e8d9cd2" path="/var/lib/kubelet/pods/af9df436-38ae-4001-bce6-e97e5e8d9cd2/volumes" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.122203 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b613f5a4-9369-45ae-8c2c-10e16e639999" path="/var/lib/kubelet/pods/b613f5a4-9369-45ae-8c2c-10e16e639999/volumes" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.123188 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc666b29-bbbf-4206-ae4d-7d7e52542577" path="/var/lib/kubelet/pods/bc666b29-bbbf-4206-ae4d-7d7e52542577/volumes" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.123805 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0d368da-0627-4c5e-ad8e-821bbc205874" path="/var/lib/kubelet/pods/c0d368da-0627-4c5e-ad8e-821bbc205874/volumes" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.139549 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c14e584f-7d75-42a6-b6a0-1c3931b022ec" path="/var/lib/kubelet/pods/c14e584f-7d75-42a6-b6a0-1c3931b022ec/volumes" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.141140 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4ddce94-6333-4233-951d-571a761b708f" path="/var/lib/kubelet/pods/c4ddce94-6333-4233-951d-571a761b708f/volumes" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.142075 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5afe93e-c94d-4e57-987b-956d67b03621" path="/var/lib/kubelet/pods/c5afe93e-c94d-4e57-987b-956d67b03621/volumes" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.169103 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9198258-4919-4ade-88ba-4a0773b32012" path="/var/lib/kubelet/pods/c9198258-4919-4ade-88ba-4a0773b32012/volumes" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.169599 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/65df60b6-4049-47b6-9907-ebf76c151213-public-tls-certs\") pod \"65df60b6-4049-47b6-9907-ebf76c151213\" (UID: \"65df60b6-4049-47b6-9907-ebf76c151213\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.169647 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0172d8ed-9ef1-4aac-b246-1b1ed0df87fc-public-tls-certs\") pod \"0172d8ed-9ef1-4aac-b246-1b1ed0df87fc\" (UID: \"0172d8ed-9ef1-4aac-b246-1b1ed0df87fc\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.169676 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0172d8ed-9ef1-4aac-b246-1b1ed0df87fc-config-data\") pod \"0172d8ed-9ef1-4aac-b246-1b1ed0df87fc\" (UID: \"0172d8ed-9ef1-4aac-b246-1b1ed0df87fc\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.169705 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lln2n\" (UniqueName: \"kubernetes.io/projected/0172d8ed-9ef1-4aac-b246-1b1ed0df87fc-kube-api-access-lln2n\") pod \"0172d8ed-9ef1-4aac-b246-1b1ed0df87fc\" (UID: \"0172d8ed-9ef1-4aac-b246-1b1ed0df87fc\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.169747 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02c94bee-a522-4ea6-85af-1ba68e174203-config-data\") pod \"02c94bee-a522-4ea6-85af-1ba68e174203\" (UID: \"02c94bee-a522-4ea6-85af-1ba68e174203\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.169799 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54351653-7ebd-40ba-8181-bb1023f18190-config-data\") pod \"54351653-7ebd-40ba-8181-bb1023f18190\" (UID: \"54351653-7ebd-40ba-8181-bb1023f18190\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.169857 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65df60b6-4049-47b6-9907-ebf76c151213-combined-ca-bundle\") pod \"65df60b6-4049-47b6-9907-ebf76c151213\" (UID: \"65df60b6-4049-47b6-9907-ebf76c151213\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.169881 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2adbbd67-ccdf-4444-b667-2b549bc200b5-combined-ca-bundle\") pod \"2adbbd67-ccdf-4444-b667-2b549bc200b5\" (UID: \"2adbbd67-ccdf-4444-b667-2b549bc200b5\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.169902 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65df60b6-4049-47b6-9907-ebf76c151213-scripts\") pod \"65df60b6-4049-47b6-9907-ebf76c151213\" (UID: \"65df60b6-4049-47b6-9907-ebf76c151213\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.169921 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvm5x\" (UniqueName: \"kubernetes.io/projected/65df60b6-4049-47b6-9907-ebf76c151213-kube-api-access-lvm5x\") pod \"65df60b6-4049-47b6-9907-ebf76c151213\" (UID: \"65df60b6-4049-47b6-9907-ebf76c151213\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.169941 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7tg9\" (UniqueName: \"kubernetes.io/projected/02c94bee-a522-4ea6-85af-1ba68e174203-kube-api-access-q7tg9\") pod \"02c94bee-a522-4ea6-85af-1ba68e174203\" (UID: \"02c94bee-a522-4ea6-85af-1ba68e174203\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.169970 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/02c94bee-a522-4ea6-85af-1ba68e174203-logs\") pod \"02c94bee-a522-4ea6-85af-1ba68e174203\" (UID: \"02c94bee-a522-4ea6-85af-1ba68e174203\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.170008 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/2adbbd67-ccdf-4444-b667-2b549bc200b5-kube-state-metrics-tls-certs\") pod \"2adbbd67-ccdf-4444-b667-2b549bc200b5\" (UID: \"2adbbd67-ccdf-4444-b667-2b549bc200b5\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.170031 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/65df60b6-4049-47b6-9907-ebf76c151213-httpd-run\") pod \"65df60b6-4049-47b6-9907-ebf76c151213\" (UID: \"65df60b6-4049-47b6-9907-ebf76c151213\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.170064 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54351653-7ebd-40ba-8181-bb1023f18190-combined-ca-bundle\") pod \"54351653-7ebd-40ba-8181-bb1023f18190\" (UID: \"54351653-7ebd-40ba-8181-bb1023f18190\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.170091 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/02c94bee-a522-4ea6-85af-1ba68e174203-config-data-custom\") pod \"02c94bee-a522-4ea6-85af-1ba68e174203\" (UID: \"02c94bee-a522-4ea6-85af-1ba68e174203\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.170117 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"65df60b6-4049-47b6-9907-ebf76c151213\" (UID: \"65df60b6-4049-47b6-9907-ebf76c151213\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.170241 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ww85p\" (UniqueName: \"kubernetes.io/projected/54351653-7ebd-40ba-8181-bb1023f18190-kube-api-access-ww85p\") pod \"54351653-7ebd-40ba-8181-bb1023f18190\" (UID: \"54351653-7ebd-40ba-8181-bb1023f18190\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.170270 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/02c94bee-a522-4ea6-85af-1ba68e174203-internal-tls-certs\") pod \"02c94bee-a522-4ea6-85af-1ba68e174203\" (UID: \"02c94bee-a522-4ea6-85af-1ba68e174203\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.170295 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65df60b6-4049-47b6-9907-ebf76c151213-config-data\") pod \"65df60b6-4049-47b6-9907-ebf76c151213\" (UID: \"65df60b6-4049-47b6-9907-ebf76c151213\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.170327 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/02c94bee-a522-4ea6-85af-1ba68e174203-etc-machine-id\") pod \"02c94bee-a522-4ea6-85af-1ba68e174203\" (UID: \"02c94bee-a522-4ea6-85af-1ba68e174203\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.170351 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/54351653-7ebd-40ba-8181-bb1023f18190-logs\") pod \"54351653-7ebd-40ba-8181-bb1023f18190\" (UID: \"54351653-7ebd-40ba-8181-bb1023f18190\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.170374 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02c94bee-a522-4ea6-85af-1ba68e174203-scripts\") pod \"02c94bee-a522-4ea6-85af-1ba68e174203\" (UID: \"02c94bee-a522-4ea6-85af-1ba68e174203\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.170397 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/02c94bee-a522-4ea6-85af-1ba68e174203-public-tls-certs\") pod \"02c94bee-a522-4ea6-85af-1ba68e174203\" (UID: \"02c94bee-a522-4ea6-85af-1ba68e174203\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.170426 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/2adbbd67-ccdf-4444-b667-2b549bc200b5-kube-state-metrics-tls-config\") pod \"2adbbd67-ccdf-4444-b667-2b549bc200b5\" (UID: \"2adbbd67-ccdf-4444-b667-2b549bc200b5\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.170454 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sjlt\" (UniqueName: \"kubernetes.io/projected/2adbbd67-ccdf-4444-b667-2b549bc200b5-kube-api-access-9sjlt\") pod \"2adbbd67-ccdf-4444-b667-2b549bc200b5\" (UID: \"2adbbd67-ccdf-4444-b667-2b549bc200b5\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.170494 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65df60b6-4049-47b6-9907-ebf76c151213-logs\") pod \"65df60b6-4049-47b6-9907-ebf76c151213\" (UID: \"65df60b6-4049-47b6-9907-ebf76c151213\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.170516 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02c94bee-a522-4ea6-85af-1ba68e174203-combined-ca-bundle\") pod \"02c94bee-a522-4ea6-85af-1ba68e174203\" (UID: \"02c94bee-a522-4ea6-85af-1ba68e174203\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.170540 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0172d8ed-9ef1-4aac-b246-1b1ed0df87fc-logs\") pod \"0172d8ed-9ef1-4aac-b246-1b1ed0df87fc\" (UID: \"0172d8ed-9ef1-4aac-b246-1b1ed0df87fc\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.170567 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0172d8ed-9ef1-4aac-b246-1b1ed0df87fc-internal-tls-certs\") pod \"0172d8ed-9ef1-4aac-b246-1b1ed0df87fc\" (UID: \"0172d8ed-9ef1-4aac-b246-1b1ed0df87fc\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.170590 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/54351653-7ebd-40ba-8181-bb1023f18190-internal-tls-certs\") pod \"54351653-7ebd-40ba-8181-bb1023f18190\" (UID: \"54351653-7ebd-40ba-8181-bb1023f18190\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.170612 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/54351653-7ebd-40ba-8181-bb1023f18190-scripts\") pod \"54351653-7ebd-40ba-8181-bb1023f18190\" (UID: \"54351653-7ebd-40ba-8181-bb1023f18190\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.170636 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0172d8ed-9ef1-4aac-b246-1b1ed0df87fc-scripts\") pod \"0172d8ed-9ef1-4aac-b246-1b1ed0df87fc\" (UID: \"0172d8ed-9ef1-4aac-b246-1b1ed0df87fc\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.170661 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"54351653-7ebd-40ba-8181-bb1023f18190\" (UID: \"54351653-7ebd-40ba-8181-bb1023f18190\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.170687 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/54351653-7ebd-40ba-8181-bb1023f18190-httpd-run\") pod \"54351653-7ebd-40ba-8181-bb1023f18190\" (UID: \"54351653-7ebd-40ba-8181-bb1023f18190\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.170742 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0172d8ed-9ef1-4aac-b246-1b1ed0df87fc-combined-ca-bundle\") pod \"0172d8ed-9ef1-4aac-b246-1b1ed0df87fc\" (UID: \"0172d8ed-9ef1-4aac-b246-1b1ed0df87fc\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.171568 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cec1e643-1f06-471e-8215-c690f691bb3c" path="/var/lib/kubelet/pods/cec1e643-1f06-471e-8215-c690f691bb3c/volumes" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.172350 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d100f321-6fe6-4eb3-a00c-50b9ff5e2861" path="/var/lib/kubelet/pods/d100f321-6fe6-4eb3-a00c-50b9ff5e2861/volumes" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.173433 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0a5beb3-4401-42b8-b8e3-4d2af995a4d0" path="/var/lib/kubelet/pods/e0a5beb3-4401-42b8-b8e3-4d2af995a4d0/volumes" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.174508 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5a9794a-b66f-40d4-9e70-efc6a0a72d83" path="/var/lib/kubelet/pods/e5a9794a-b66f-40d4-9e70-efc6a0a72d83/volumes" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.175554 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eea79378-54f4-4bc9-9673-04bdf650eb92" path="/var/lib/kubelet/pods/eea79378-54f4-4bc9-9673-04bdf650eb92/volumes" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.176221 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f12e361b-e5e4-4c7c-8c4f-fe266937ffda" path="/var/lib/kubelet/pods/f12e361b-e5e4-4c7c-8c4f-fe266937ffda/volumes" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.188056 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54351653-7ebd-40ba-8181-bb1023f18190-logs" (OuterVolumeSpecName: "logs") pod "54351653-7ebd-40ba-8181-bb1023f18190" (UID: "54351653-7ebd-40ba-8181-bb1023f18190"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.190755 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02c94bee-a522-4ea6-85af-1ba68e174203-logs" (OuterVolumeSpecName: "logs") pod "02c94bee-a522-4ea6-85af-1ba68e174203" (UID: "02c94bee-a522-4ea6-85af-1ba68e174203"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.195134 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54351653-7ebd-40ba-8181-bb1023f18190-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "54351653-7ebd-40ba-8181-bb1023f18190" (UID: "54351653-7ebd-40ba-8181-bb1023f18190"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.206523 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02c94bee-a522-4ea6-85af-1ba68e174203-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "02c94bee-a522-4ea6-85af-1ba68e174203" (UID: "02c94bee-a522-4ea6-85af-1ba68e174203"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.207244 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65df60b6-4049-47b6-9907-ebf76c151213-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "65df60b6-4049-47b6-9907-ebf76c151213" (UID: "65df60b6-4049-47b6-9907-ebf76c151213"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.207559 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65df60b6-4049-47b6-9907-ebf76c151213-logs" (OuterVolumeSpecName: "logs") pod "65df60b6-4049-47b6-9907-ebf76c151213" (UID: "65df60b6-4049-47b6-9907-ebf76c151213"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.214070 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0172d8ed-9ef1-4aac-b246-1b1ed0df87fc-logs" (OuterVolumeSpecName: "logs") pod "0172d8ed-9ef1-4aac-b246-1b1ed0df87fc" (UID: "0172d8ed-9ef1-4aac-b246-1b1ed0df87fc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.216213 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54351653-7ebd-40ba-8181-bb1023f18190-scripts" (OuterVolumeSpecName: "scripts") pod "54351653-7ebd-40ba-8181-bb1023f18190" (UID: "54351653-7ebd-40ba-8181-bb1023f18190"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.217002 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0172d8ed-9ef1-4aac-b246-1b1ed0df87fc-kube-api-access-lln2n" (OuterVolumeSpecName: "kube-api-access-lln2n") pod "0172d8ed-9ef1-4aac-b246-1b1ed0df87fc" (UID: "0172d8ed-9ef1-4aac-b246-1b1ed0df87fc"). InnerVolumeSpecName "kube-api-access-lln2n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.231274 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.231320 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.231338 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-5c99967b8c-vjq4g"] Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.231353 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-5c99967b8c-vjq4g"] Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.253150 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0172d8ed-9ef1-4aac-b246-1b1ed0df87fc-scripts" (OuterVolumeSpecName: "scripts") pod "0172d8ed-9ef1-4aac-b246-1b1ed0df87fc" (UID: "0172d8ed-9ef1-4aac-b246-1b1ed0df87fc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.262587 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65df60b6-4049-47b6-9907-ebf76c151213-kube-api-access-lvm5x" (OuterVolumeSpecName: "kube-api-access-lvm5x") pod "65df60b6-4049-47b6-9907-ebf76c151213" (UID: "65df60b6-4049-47b6-9907-ebf76c151213"). InnerVolumeSpecName "kube-api-access-lvm5x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.262785 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02c94bee-a522-4ea6-85af-1ba68e174203-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "02c94bee-a522-4ea6-85af-1ba68e174203" (UID: "02c94bee-a522-4ea6-85af-1ba68e174203"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.268190 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02c94bee-a522-4ea6-85af-1ba68e174203-scripts" (OuterVolumeSpecName: "scripts") pod "02c94bee-a522-4ea6-85af-1ba68e174203" (UID: "02c94bee-a522-4ea6-85af-1ba68e174203"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.274641 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvm5x\" (UniqueName: \"kubernetes.io/projected/65df60b6-4049-47b6-9907-ebf76c151213-kube-api-access-lvm5x\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.274663 5002 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02c94bee-a522-4ea6-85af-1ba68e174203-logs\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.274672 5002 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/65df60b6-4049-47b6-9907-ebf76c151213-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.274680 5002 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/02c94bee-a522-4ea6-85af-1ba68e174203-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.274689 5002 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/02c94bee-a522-4ea6-85af-1ba68e174203-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.274698 5002 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54351653-7ebd-40ba-8181-bb1023f18190-logs\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.274707 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02c94bee-a522-4ea6-85af-1ba68e174203-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.274716 5002 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65df60b6-4049-47b6-9907-ebf76c151213-logs\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.274724 5002 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0172d8ed-9ef1-4aac-b246-1b1ed0df87fc-logs\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.274732 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54351653-7ebd-40ba-8181-bb1023f18190-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.274741 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0172d8ed-9ef1-4aac-b246-1b1ed0df87fc-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.274749 5002 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/54351653-7ebd-40ba-8181-bb1023f18190-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.274757 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lln2n\" (UniqueName: \"kubernetes.io/projected/0172d8ed-9ef1-4aac-b246-1b1ed0df87fc-kube-api-access-lln2n\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.311322 5002 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/ceilometer-0" podUID="5893e6fa-5b64-47e0-b8e1-f68baf27a65c" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.198:3000/\": dial tcp 10.217.0.198:3000: connect: connection refused" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.347020 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54351653-7ebd-40ba-8181-bb1023f18190-kube-api-access-ww85p" (OuterVolumeSpecName: "kube-api-access-ww85p") pod "54351653-7ebd-40ba-8181-bb1023f18190" (UID: "54351653-7ebd-40ba-8181-bb1023f18190"). InnerVolumeSpecName "kube-api-access-ww85p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.353042 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "54351653-7ebd-40ba-8181-bb1023f18190" (UID: "54351653-7ebd-40ba-8181-bb1023f18190"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.354731 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "65df60b6-4049-47b6-9907-ebf76c151213" (UID: "65df60b6-4049-47b6-9907-ebf76c151213"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.354832 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65df60b6-4049-47b6-9907-ebf76c151213-scripts" (OuterVolumeSpecName: "scripts") pod "65df60b6-4049-47b6-9907-ebf76c151213" (UID: "65df60b6-4049-47b6-9907-ebf76c151213"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.355163 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02c94bee-a522-4ea6-85af-1ba68e174203-kube-api-access-q7tg9" (OuterVolumeSpecName: "kube-api-access-q7tg9") pod "02c94bee-a522-4ea6-85af-1ba68e174203" (UID: "02c94bee-a522-4ea6-85af-1ba68e174203"). InnerVolumeSpecName "kube-api-access-q7tg9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.355238 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2adbbd67-ccdf-4444-b667-2b549bc200b5-kube-api-access-9sjlt" (OuterVolumeSpecName: "kube-api-access-9sjlt") pod "2adbbd67-ccdf-4444-b667-2b549bc200b5" (UID: "2adbbd67-ccdf-4444-b667-2b549bc200b5"). InnerVolumeSpecName "kube-api-access-9sjlt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:25:22 crc kubenswrapper[5002]: E1209 10:25:22.364073 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 36bf5a63f64b1da8bf0d3200a657077d8683342ea2307df72c904532c9648a0a is running failed: container process not found" containerID="36bf5a63f64b1da8bf0d3200a657077d8683342ea2307df72c904532c9648a0a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 09 10:25:22 crc kubenswrapper[5002]: E1209 10:25:22.364833 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 36bf5a63f64b1da8bf0d3200a657077d8683342ea2307df72c904532c9648a0a is running failed: container process not found" containerID="36bf5a63f64b1da8bf0d3200a657077d8683342ea2307df72c904532c9648a0a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 09 10:25:22 crc kubenswrapper[5002]: E1209 10:25:22.365188 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 36bf5a63f64b1da8bf0d3200a657077d8683342ea2307df72c904532c9648a0a is running failed: container process not found" containerID="36bf5a63f64b1da8bf0d3200a657077d8683342ea2307df72c904532c9648a0a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 09 10:25:22 crc kubenswrapper[5002]: E1209 10:25:22.365217 5002 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 36bf5a63f64b1da8bf0d3200a657077d8683342ea2307df72c904532c9648a0a is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="43512a9c-be3a-4c0e-a178-82c5a065acf4" containerName="nova-scheduler-scheduler" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.378769 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ww85p\" (UniqueName: \"kubernetes.io/projected/54351653-7ebd-40ba-8181-bb1023f18190-kube-api-access-ww85p\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.378830 5002 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.378846 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sjlt\" (UniqueName: \"kubernetes.io/projected/2adbbd67-ccdf-4444-b667-2b549bc200b5-kube-api-access-9sjlt\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.378865 5002 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.378879 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7tg9\" (UniqueName: \"kubernetes.io/projected/02c94bee-a522-4ea6-85af-1ba68e174203-kube-api-access-q7tg9\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.378891 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65df60b6-4049-47b6-9907-ebf76c151213-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.476536 
5002 scope.go:117] "RemoveContainer" containerID="b248460156a9f8eb7f40491f548be924840ee184fd17bcff0402ca92134b5847" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.483919 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0172d8ed-9ef1-4aac-b246-1b1ed0df87fc-config-data" (OuterVolumeSpecName: "config-data") pod "0172d8ed-9ef1-4aac-b246-1b1ed0df87fc" (UID: "0172d8ed-9ef1-4aac-b246-1b1ed0df87fc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.483958 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2adbbd67-ccdf-4444-b667-2b549bc200b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2adbbd67-ccdf-4444-b667-2b549bc200b5" (UID: "2adbbd67-ccdf-4444-b667-2b549bc200b5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.484103 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54351653-7ebd-40ba-8181-bb1023f18190-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "54351653-7ebd-40ba-8181-bb1023f18190" (UID: "54351653-7ebd-40ba-8181-bb1023f18190"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.505088 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02c94bee-a522-4ea6-85af-1ba68e174203-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "02c94bee-a522-4ea6-85af-1ba68e174203" (UID: "02c94bee-a522-4ea6-85af-1ba68e174203"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.515551 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02c94bee-a522-4ea6-85af-1ba68e174203-config-data" (OuterVolumeSpecName: "config-data") pod "02c94bee-a522-4ea6-85af-1ba68e174203" (UID: "02c94bee-a522-4ea6-85af-1ba68e174203"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.572970 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65df60b6-4049-47b6-9907-ebf76c151213-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65df60b6-4049-47b6-9907-ebf76c151213" (UID: "65df60b6-4049-47b6-9907-ebf76c151213"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.579453 5002 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.581844 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65df60b6-4049-47b6-9907-ebf76c151213-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.581881 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2adbbd67-ccdf-4444-b667-2b549bc200b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.581897 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54351653-7ebd-40ba-8181-bb1023f18190-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.581913 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02c94bee-a522-4ea6-85af-1ba68e174203-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.581925 5002 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.581937 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0172d8ed-9ef1-4aac-b246-1b1ed0df87fc-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.581949 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02c94bee-a522-4ea6-85af-1ba68e174203-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.591988 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0172d8ed-9ef1-4aac-b246-1b1ed0df87fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0172d8ed-9ef1-4aac-b246-1b1ed0df87fc" (UID: "0172d8ed-9ef1-4aac-b246-1b1ed0df87fc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.599322 5002 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.601421 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2adbbd67-ccdf-4444-b667-2b549bc200b5-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "2adbbd67-ccdf-4444-b667-2b549bc200b5" (UID: "2adbbd67-ccdf-4444-b667-2b549bc200b5"). InnerVolumeSpecName "kube-state-metrics-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.633078 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65df60b6-4049-47b6-9907-ebf76c151213-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "65df60b6-4049-47b6-9907-ebf76c151213" (UID: "65df60b6-4049-47b6-9907-ebf76c151213"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.636460 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54351653-7ebd-40ba-8181-bb1023f18190-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "54351653-7ebd-40ba-8181-bb1023f18190" (UID: "54351653-7ebd-40ba-8181-bb1023f18190"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.637303 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2adbbd67-ccdf-4444-b667-2b549bc200b5-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "2adbbd67-ccdf-4444-b667-2b549bc200b5" (UID: "2adbbd67-ccdf-4444-b667-2b549bc200b5"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.662041 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65df60b6-4049-47b6-9907-ebf76c151213-config-data" (OuterVolumeSpecName: "config-data") pod "65df60b6-4049-47b6-9907-ebf76c151213" (UID: "65df60b6-4049-47b6-9907-ebf76c151213"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.672116 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02c94bee-a522-4ea6-85af-1ba68e174203-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "02c94bee-a522-4ea6-85af-1ba68e174203" (UID: "02c94bee-a522-4ea6-85af-1ba68e174203"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.672736 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican73fb-account-delete-6zw8z" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.683513 5002 scope.go:117] "RemoveContainer" containerID="3edf9b4007e80e9e88d05e62a1daa140acdad44a7fdd6235ed0a8bb73242f7e0" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.689925 5002 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/02c94bee-a522-4ea6-85af-1ba68e174203-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.689956 5002 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/2adbbd67-ccdf-4444-b667-2b549bc200b5-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.689968 5002 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/54351653-7ebd-40ba-8181-bb1023f18190-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.689980 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0172d8ed-9ef1-4aac-b246-1b1ed0df87fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.689992 5002 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/65df60b6-4049-47b6-9907-ebf76c151213-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.690003 5002 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/2adbbd67-ccdf-4444-b667-2b549bc200b5-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.690015 5002 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.690024 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65df60b6-4049-47b6-9907-ebf76c151213-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.702568 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glancea775-account-delete-zmp4n" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.704417 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.715868 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54351653-7ebd-40ba-8181-bb1023f18190-config-data" (OuterVolumeSpecName: "config-data") pod "54351653-7ebd-40ba-8181-bb1023f18190" (UID: "54351653-7ebd-40ba-8181-bb1023f18190"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.728015 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.728309 5002 scope.go:117] "RemoveContainer" containerID="5f516788aa2a399cee7b3aa95c438e477b373a2ef4a7033783be06cdfa843ec6" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.728744 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02c94bee-a522-4ea6-85af-1ba68e174203-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "02c94bee-a522-4ea6-85af-1ba68e174203" (UID: "02c94bee-a522-4ea6-85af-1ba68e174203"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.750836 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0172d8ed-9ef1-4aac-b246-1b1ed0df87fc-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0172d8ed-9ef1-4aac-b246-1b1ed0df87fc" (UID: "0172d8ed-9ef1-4aac-b246-1b1ed0df87fc"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.758335 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0172d8ed-9ef1-4aac-b246-1b1ed0df87fc-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0172d8ed-9ef1-4aac-b246-1b1ed0df87fc" (UID: "0172d8ed-9ef1-4aac-b246-1b1ed0df87fc"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.776995 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cindere0a9-account-delete-b5zfk" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.778211 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.795934 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe95257d-a02e-4f04-a543-a2db08231043-operator-scripts\") pod \"fe95257d-a02e-4f04-a543-a2db08231043\" (UID: \"fe95257d-a02e-4f04-a543-a2db08231043\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.796116 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4wkz\" (UniqueName: \"kubernetes.io/projected/fe95257d-a02e-4f04-a543-a2db08231043-kube-api-access-h4wkz\") pod \"fe95257d-a02e-4f04-a543-a2db08231043\" (UID: \"fe95257d-a02e-4f04-a543-a2db08231043\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.796627 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe95257d-a02e-4f04-a543-a2db08231043-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fe95257d-a02e-4f04-a543-a2db08231043" (UID: "fe95257d-a02e-4f04-a543-a2db08231043"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.798514 5002 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0172d8ed-9ef1-4aac-b246-1b1ed0df87fc-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.798535 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe95257d-a02e-4f04-a543-a2db08231043-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.798545 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54351653-7ebd-40ba-8181-bb1023f18190-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.798553 5002 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/02c94bee-a522-4ea6-85af-1ba68e174203-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.798562 5002 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0172d8ed-9ef1-4aac-b246-1b1ed0df87fc-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.803050 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe95257d-a02e-4f04-a543-a2db08231043-kube-api-access-h4wkz" (OuterVolumeSpecName: "kube-api-access-h4wkz") pod "fe95257d-a02e-4f04-a543-a2db08231043" (UID: "fe95257d-a02e-4f04-a543-a2db08231043"). InnerVolumeSpecName "kube-api-access-h4wkz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.807548 5002 generic.go:334] "Generic (PLEG): container finished" podID="43512a9c-be3a-4c0e-a178-82c5a065acf4" containerID="36bf5a63f64b1da8bf0d3200a657077d8683342ea2307df72c904532c9648a0a" exitCode=0 Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.807673 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"43512a9c-be3a-4c0e-a178-82c5a065acf4","Type":"ContainerDied","Data":"36bf5a63f64b1da8bf0d3200a657077d8683342ea2307df72c904532c9648a0a"} Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.816136 5002 generic.go:334] "Generic (PLEG): container finished" podID="a4061af7-7669-4bd4-a36c-6ec982e86753" containerID="2944a25a7c0f087015f80b3d4d12a8b2ffabefdcb9d6f6b684e88e4e6b57e2db" exitCode=0 Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.816203 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c454948fd-lwcxn" event={"ID":"a4061af7-7669-4bd4-a36c-6ec982e86753","Type":"ContainerDied","Data":"2944a25a7c0f087015f80b3d4d12a8b2ffabefdcb9d6f6b684e88e4e6b57e2db"} Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.821226 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"02c94bee-a522-4ea6-85af-1ba68e174203","Type":"ContainerDied","Data":"9e5966a72eb7d56f519b1957f1928609fb0ddbbdba8bfc272c163fdf3153bcf9"} Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.821262 5002 scope.go:117] "RemoveContainer" containerID="d1c65c897a0495448b0bba138435b4e6ce0da36de283bd456d1269f9d3226c84" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.821369 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.825796 5002 generic.go:334] "Generic (PLEG): container finished" podID="5893e6fa-5b64-47e0-b8e1-f68baf27a65c" containerID="7bf0075652dce88cf4c715938171cbd87da9f63720497ca4a0f0e4c414c5e29f" exitCode=0 Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.825927 5002 generic.go:334] "Generic (PLEG): container finished" podID="5893e6fa-5b64-47e0-b8e1-f68baf27a65c" containerID="7a033c871bfeb20ac9488a0b5376331a6b527d79d3198b2f244e621f3b53eb12" exitCode=0 Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.826020 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5893e6fa-5b64-47e0-b8e1-f68baf27a65c","Type":"ContainerDied","Data":"7bf0075652dce88cf4c715938171cbd87da9f63720497ca4a0f0e4c414c5e29f"} Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.826099 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5893e6fa-5b64-47e0-b8e1-f68baf27a65c","Type":"ContainerDied","Data":"7a033c871bfeb20ac9488a0b5376331a6b527d79d3198b2f244e621f3b53eb12"} Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.863967 5002 scope.go:117] "RemoveContainer" containerID="63586c424dc327bbd0545721f624a8d5562a690863d2d8565ad85890deded297" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.864426 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-857f77df5c-skx8f" podUID="41f46a2d-f158-497f-b61b-60f39c64149b" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.174:9696/\": dial tcp 10.217.0.174:9696: connect: connection refused" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.867920 5002 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2adbbd67-ccdf-4444-b667-2b549bc200b5","Type":"ContainerDied","Data":"6003a40fb265d020620cab6aa92b8f309caa3b915b079cf0782a54294791a405"} Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.868047 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.893264 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.899058 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae6c00ce-3152-42ae-890f-bb76aac103c5-operator-scripts\") pod \"ae6c00ce-3152-42ae-890f-bb76aac103c5\" (UID: \"ae6c00ce-3152-42ae-890f-bb76aac103c5\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.899097 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd8a7609-928f-4a68-9903-fa846e4baeda-config-data\") pod \"cd8a7609-928f-4a68-9903-fa846e4baeda\" (UID: \"cd8a7609-928f-4a68-9903-fa846e4baeda\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.899125 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6sgqj\" (UniqueName: \"kubernetes.io/projected/cd8a7609-928f-4a68-9903-fa846e4baeda-kube-api-access-6sgqj\") pod \"cd8a7609-928f-4a68-9903-fa846e4baeda\" (UID: \"cd8a7609-928f-4a68-9903-fa846e4baeda\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.899158 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b5836b7-7b16-477f-9a20-f30032362374-public-tls-certs\") pod \"6b5836b7-7b16-477f-9a20-f30032362374\" (UID: \"6b5836b7-7b16-477f-9a20-f30032362374\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.899207 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e36e954-d9c1-41e3-8542-e8f300db90cb-memcached-tls-certs\") pod \"1e36e954-d9c1-41e3-8542-e8f300db90cb\" (UID: \"1e36e954-d9c1-41e3-8542-e8f300db90cb\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.899238 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6b9775f-22d1-413b-8d2f-1dbe890b582c-operator-scripts\") pod \"c6b9775f-22d1-413b-8d2f-1dbe890b582c\" (UID: \"c6b9775f-22d1-413b-8d2f-1dbe890b582c\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.899263 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7ptd\" (UniqueName: \"kubernetes.io/projected/ae6c00ce-3152-42ae-890f-bb76aac103c5-kube-api-access-d7ptd\") pod \"ae6c00ce-3152-42ae-890f-bb76aac103c5\" (UID: \"ae6c00ce-3152-42ae-890f-bb76aac103c5\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.899294 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b5836b7-7b16-477f-9a20-f30032362374-combined-ca-bundle\") pod \"6b5836b7-7b16-477f-9a20-f30032362374\" (UID: \"6b5836b7-7b16-477f-9a20-f30032362374\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.899320 5002 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd8a7609-928f-4a68-9903-fa846e4baeda-combined-ca-bundle\") pod \"cd8a7609-928f-4a68-9903-fa846e4baeda\" (UID: \"cd8a7609-928f-4a68-9903-fa846e4baeda\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.899502 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd8a7609-928f-4a68-9903-fa846e4baeda-nova-metadata-tls-certs\") pod \"cd8a7609-928f-4a68-9903-fa846e4baeda\" (UID: \"cd8a7609-928f-4a68-9903-fa846e4baeda\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.899561 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b5836b7-7b16-477f-9a20-f30032362374-config-data\") pod \"6b5836b7-7b16-477f-9a20-f30032362374\" (UID: \"6b5836b7-7b16-477f-9a20-f30032362374\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.899598 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b5836b7-7b16-477f-9a20-f30032362374-logs\") pod \"6b5836b7-7b16-477f-9a20-f30032362374\" (UID: \"6b5836b7-7b16-477f-9a20-f30032362374\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.899659 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vs79l\" (UniqueName: \"kubernetes.io/projected/1e36e954-d9c1-41e3-8542-e8f300db90cb-kube-api-access-vs79l\") pod \"1e36e954-d9c1-41e3-8542-e8f300db90cb\" (UID: \"1e36e954-d9c1-41e3-8542-e8f300db90cb\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.899687 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b5836b7-7b16-477f-9a20-f30032362374-internal-tls-certs\") pod \"6b5836b7-7b16-477f-9a20-f30032362374\" (UID: \"6b5836b7-7b16-477f-9a20-f30032362374\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.899755 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nsvr\" (UniqueName: \"kubernetes.io/projected/6b5836b7-7b16-477f-9a20-f30032362374-kube-api-access-5nsvr\") pod \"6b5836b7-7b16-477f-9a20-f30032362374\" (UID: \"6b5836b7-7b16-477f-9a20-f30032362374\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.899769 5002 generic.go:334] "Generic (PLEG): container finished" podID="58de676b-7b73-4c04-b5d5-5de38a88072c" containerID="2cb6f2c5af24125d366be1b3d0454fead9863b253da1642f513cc98b1fc0571b" exitCode=0 Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.899806 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd8a7609-928f-4a68-9903-fa846e4baeda-logs\") pod \"cd8a7609-928f-4a68-9903-fa846e4baeda\" (UID: \"cd8a7609-928f-4a68-9903-fa846e4baeda\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.899869 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1e36e954-d9c1-41e3-8542-e8f300db90cb-config-data\") pod \"1e36e954-d9c1-41e3-8542-e8f300db90cb\" (UID: \"1e36e954-d9c1-41e3-8542-e8f300db90cb\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.899880 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0539b-account-delete-t9blx" 
event={"ID":"58de676b-7b73-4c04-b5d5-5de38a88072c","Type":"ContainerDied","Data":"2cb6f2c5af24125d366be1b3d0454fead9863b253da1642f513cc98b1fc0571b"} Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.899923 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1e36e954-d9c1-41e3-8542-e8f300db90cb-kolla-config\") pod \"1e36e954-d9c1-41e3-8542-e8f300db90cb\" (UID: \"1e36e954-d9c1-41e3-8542-e8f300db90cb\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.899970 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e36e954-d9c1-41e3-8542-e8f300db90cb-combined-ca-bundle\") pod \"1e36e954-d9c1-41e3-8542-e8f300db90cb\" (UID: \"1e36e954-d9c1-41e3-8542-e8f300db90cb\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.900032 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mp2kq\" (UniqueName: \"kubernetes.io/projected/c6b9775f-22d1-413b-8d2f-1dbe890b582c-kube-api-access-mp2kq\") pod \"c6b9775f-22d1-413b-8d2f-1dbe890b582c\" (UID: \"c6b9775f-22d1-413b-8d2f-1dbe890b582c\") " Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.900761 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4wkz\" (UniqueName: \"kubernetes.io/projected/fe95257d-a02e-4f04-a543-a2db08231043-kube-api-access-h4wkz\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:22 crc kubenswrapper[5002]: E1209 10:25:22.902447 5002 configmap.go:193] Couldn't get configMap openstack/openstack-config-data: configmap "openstack-config-data" not found Dec 09 10:25:22 crc kubenswrapper[5002]: E1209 10:25:22.902525 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7faabd78-c9ab-4397-aa4d-b8aaff302251-kolla-config podName:7faabd78-c9ab-4397-aa4d-b8aaff302251 nodeName:}" failed. No retries permitted until 2025-12-09 10:25:24.902505306 +0000 UTC m=+1457.294556397 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kolla-config" (UniqueName: "kubernetes.io/configmap/7faabd78-c9ab-4397-aa4d-b8aaff302251-kolla-config") pod "openstack-galera-0" (UID: "7faabd78-c9ab-4397-aa4d-b8aaff302251") : configmap "openstack-config-data" not found Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.905688 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae6c00ce-3152-42ae-890f-bb76aac103c5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ae6c00ce-3152-42ae-890f-bb76aac103c5" (UID: "ae6c00ce-3152-42ae-890f-bb76aac103c5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.906314 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6b9775f-22d1-413b-8d2f-1dbe890b582c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c6b9775f-22d1-413b-8d2f-1dbe890b582c" (UID: "c6b9775f-22d1-413b-8d2f-1dbe890b582c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.906310 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b5836b7-7b16-477f-9a20-f30032362374-logs" (OuterVolumeSpecName: "logs") pod "6b5836b7-7b16-477f-9a20-f30032362374" (UID: "6b5836b7-7b16-477f-9a20-f30032362374"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.906373 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e36e954-d9c1-41e3-8542-e8f300db90cb-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "1e36e954-d9c1-41e3-8542-e8f300db90cb" (UID: "1e36e954-d9c1-41e3-8542-e8f300db90cb"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:25:22 crc kubenswrapper[5002]: E1209 10:25:22.906414 5002 configmap.go:193] Couldn't get configMap openstack/openstack-config-data: configmap "openstack-config-data" not found Dec 09 10:25:22 crc kubenswrapper[5002]: E1209 10:25:22.906474 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7faabd78-c9ab-4397-aa4d-b8aaff302251-config-data-default podName:7faabd78-c9ab-4397-aa4d-b8aaff302251 nodeName:}" failed. No retries permitted until 2025-12-09 10:25:24.906453042 +0000 UTC m=+1457.298504183 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data-default" (UniqueName: "kubernetes.io/configmap/7faabd78-c9ab-4397-aa4d-b8aaff302251-config-data-default") pod "openstack-galera-0" (UID: "7faabd78-c9ab-4397-aa4d-b8aaff302251") : configmap "openstack-config-data" not found Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.907879 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd8a7609-928f-4a68-9903-fa846e4baeda-kube-api-access-6sgqj" (OuterVolumeSpecName: "kube-api-access-6sgqj") pod "cd8a7609-928f-4a68-9903-fa846e4baeda" (UID: "cd8a7609-928f-4a68-9903-fa846e4baeda"). InnerVolumeSpecName "kube-api-access-6sgqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.908449 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd8a7609-928f-4a68-9903-fa846e4baeda-logs" (OuterVolumeSpecName: "logs") pod "cd8a7609-928f-4a68-9903-fa846e4baeda" (UID: "cd8a7609-928f-4a68-9903-fa846e4baeda"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:25:22 crc kubenswrapper[5002]: E1209 10:25:22.909541 5002 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 09 10:25:22 crc kubenswrapper[5002]: E1209 10:25:22.909596 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7faabd78-c9ab-4397-aa4d-b8aaff302251-operator-scripts podName:7faabd78-c9ab-4397-aa4d-b8aaff302251 nodeName:}" failed. No retries permitted until 2025-12-09 10:25:24.909576886 +0000 UTC m=+1457.301628037 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/7faabd78-c9ab-4397-aa4d-b8aaff302251-operator-scripts") pod "openstack-galera-0" (UID: "7faabd78-c9ab-4397-aa4d-b8aaff302251") : configmap "openstack-scripts" not found Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.910004 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e36e954-d9c1-41e3-8542-e8f300db90cb-config-data" (OuterVolumeSpecName: "config-data") pod "1e36e954-d9c1-41e3-8542-e8f300db90cb" (UID: "1e36e954-d9c1-41e3-8542-e8f300db90cb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.913102 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.916636 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glancea775-account-delete-zmp4n" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.918924 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glancea775-account-delete-zmp4n" event={"ID":"c6b9775f-22d1-413b-8d2f-1dbe890b582c","Type":"ContainerDied","Data":"52a71cddee3baa6357f7934a437e21cdbc14fece93531ef0157841bd9610069e"} Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.918989 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52a71cddee3baa6357f7934a437e21cdbc14fece93531ef0157841bd9610069e" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.930099 5002 scope.go:117] "RemoveContainer" containerID="7599cb61e23fe24a1fe7539f2fe49839b0013514838b373c23975501d3f53ed4" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.939123 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.941178 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e36e954-d9c1-41e3-8542-e8f300db90cb-kube-api-access-vs79l" (OuterVolumeSpecName: "kube-api-access-vs79l") pod "1e36e954-d9c1-41e3-8542-e8f300db90cb" (UID: "1e36e954-d9c1-41e3-8542-e8f300db90cb"). InnerVolumeSpecName "kube-api-access-vs79l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.941366 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b5836b7-7b16-477f-9a20-f30032362374-kube-api-access-5nsvr" (OuterVolumeSpecName: "kube-api-access-5nsvr") pod "6b5836b7-7b16-477f-9a20-f30032362374" (UID: "6b5836b7-7b16-477f-9a20-f30032362374"). InnerVolumeSpecName "kube-api-access-5nsvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.941472 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6b9775f-22d1-413b-8d2f-1dbe890b582c-kube-api-access-mp2kq" (OuterVolumeSpecName: "kube-api-access-mp2kq") pod "c6b9775f-22d1-413b-8d2f-1dbe890b582c" (UID: "c6b9775f-22d1-413b-8d2f-1dbe890b582c"). InnerVolumeSpecName "kube-api-access-mp2kq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.943902 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae6c00ce-3152-42ae-890f-bb76aac103c5-kube-api-access-d7ptd" (OuterVolumeSpecName: "kube-api-access-d7ptd") pod "ae6c00ce-3152-42ae-890f-bb76aac103c5" (UID: "ae6c00ce-3152-42ae-890f-bb76aac103c5"). InnerVolumeSpecName "kube-api-access-d7ptd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.949560 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.950291 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"54351653-7ebd-40ba-8181-bb1023f18190","Type":"ContainerDied","Data":"6ac47bbc9469d9515e9d41c57c9542c548a6ab9b8c0c065b01deb61a1d008418"} Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.950369 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.961002 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"65df60b6-4049-47b6-9907-ebf76c151213","Type":"ContainerDied","Data":"dfb3a50b874247de5cc4a9c4ef2098e5c6f64cce51b380cdfee9c6796737e449"} Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.961342 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.965734 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.968490 5002 generic.go:334] "Generic (PLEG): container finished" podID="6b5836b7-7b16-477f-9a20-f30032362374" containerID="aef926ebfeded32c3d77daf8bf94adfe524f72c37fbafa444980929d4131d304" exitCode=0 Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.968566 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.968777 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6b5836b7-7b16-477f-9a20-f30032362374","Type":"ContainerDied","Data":"aef926ebfeded32c3d77daf8bf94adfe524f72c37fbafa444980929d4131d304"} Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.968833 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6b5836b7-7b16-477f-9a20-f30032362374","Type":"ContainerDied","Data":"b4ae8614bcce2d361a6cf0745ddf8e2e52c50ea1762b1b442ae058f343acb92f"} Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.970546 5002 scope.go:117] "RemoveContainer" containerID="1e6f6d93e3bd5dcd93496f1389572c1e7e7d5c754f5b03c5cf8bcd4e3532e5b0" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.982899 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.985821 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5c454948fd-lwcxn" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.987524 5002 generic.go:334] "Generic (PLEG): container finished" podID="f702a539-ec25-44d4-8629-97b3c5499b96" containerID="87bebcf10614da44af2b08b3844e8a098da235879ef6a0ce2fdbe6d780cb77c8" exitCode=0 Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.989214 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.989229 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd8a7609-928f-4a68-9903-fa846e4baeda-config-data" (OuterVolumeSpecName: "config-data") pod "cd8a7609-928f-4a68-9903-fa846e4baeda" (UID: "cd8a7609-928f-4a68-9903-fa846e4baeda"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.989314 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f702a539-ec25-44d4-8629-97b3c5499b96","Type":"ContainerDied","Data":"87bebcf10614da44af2b08b3844e8a098da235879ef6a0ce2fdbe6d780cb77c8"} Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.998395 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 10:25:22 crc kubenswrapper[5002]: I1209 10:25:22.999284 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b5836b7-7b16-477f-9a20-f30032362374-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6b5836b7-7b16-477f-9a20-f30032362374" (UID: "6b5836b7-7b16-477f-9a20-f30032362374"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.003243 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.004395 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6b9775f-22d1-413b-8d2f-1dbe890b582c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.004417 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7ptd\" (UniqueName: \"kubernetes.io/projected/ae6c00ce-3152-42ae-890f-bb76aac103c5-kube-api-access-d7ptd\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.004426 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b5836b7-7b16-477f-9a20-f30032362374-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.004434 5002 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b5836b7-7b16-477f-9a20-f30032362374-logs\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.004442 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vs79l\" (UniqueName: \"kubernetes.io/projected/1e36e954-d9c1-41e3-8542-e8f300db90cb-kube-api-access-vs79l\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.004450 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nsvr\" (UniqueName: \"kubernetes.io/projected/6b5836b7-7b16-477f-9a20-f30032362374-kube-api-access-5nsvr\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.004458 5002 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd8a7609-928f-4a68-9903-fa846e4baeda-logs\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.004466 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1e36e954-d9c1-41e3-8542-e8f300db90cb-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.004474 5002 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1e36e954-d9c1-41e3-8542-e8f300db90cb-kolla-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.004482 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mp2kq\" (UniqueName: \"kubernetes.io/projected/c6b9775f-22d1-413b-8d2f-1dbe890b582c-kube-api-access-mp2kq\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.004490 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae6c00ce-3152-42ae-890f-bb76aac103c5-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.004499 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd8a7609-928f-4a68-9903-fa846e4baeda-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.004507 5002 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-6sgqj\" (UniqueName: \"kubernetes.io/projected/cd8a7609-928f-4a68-9903-fa846e4baeda-kube-api-access-6sgqj\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.007932 5002 scope.go:117] "RemoveContainer" containerID="1df4bb922098dea87efcb2c4b87131e9b7b7222c65f32b527e20b6017dad1370" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.011110 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd8a7609-928f-4a68-9903-fa846e4baeda-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cd8a7609-928f-4a68-9903-fa846e4baeda" (UID: "cd8a7609-928f-4a68-9903-fa846e4baeda"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.011620 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican73fb-account-delete-6zw8z" event={"ID":"fe95257d-a02e-4f04-a543-a2db08231043","Type":"ContainerDied","Data":"bf76248eab81c01b78ef18341d0bd4ee580bf6798431d646712839a831d76c9f"} Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.011689 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf76248eab81c01b78ef18341d0bd4ee580bf6798431d646712839a831d76c9f" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.011753 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican73fb-account-delete-6zw8z" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.018642 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-56f74754d8-5pd9q" event={"ID":"0172d8ed-9ef1-4aac-b246-1b1ed0df87fc","Type":"ContainerDied","Data":"aea051762e75525f31b492676d261ace96e48ef4c3f11cdd6c08c119ff71d01a"} Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.018744 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-56f74754d8-5pd9q" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.023375 5002 generic.go:334] "Generic (PLEG): container finished" podID="1e36e954-d9c1-41e3-8542-e8f300db90cb" containerID="b1619efb43e58f8e6a06eb0439aba518ccc393a1be761ea167d429fe1d32c4c8" exitCode=0 Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.023438 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"1e36e954-d9c1-41e3-8542-e8f300db90cb","Type":"ContainerDied","Data":"b1619efb43e58f8e6a06eb0439aba518ccc393a1be761ea167d429fe1d32c4c8"} Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.023463 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"1e36e954-d9c1-41e3-8542-e8f300db90cb","Type":"ContainerDied","Data":"2f2fe9d583ff861743289c5cbd4368ed28b66a703168bd606e18025b9335fab7"} Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.023513 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.025670 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b5836b7-7b16-477f-9a20-f30032362374-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6b5836b7-7b16-477f-9a20-f30032362374" (UID: "6b5836b7-7b16-477f-9a20-f30032362374"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.026661 5002 generic.go:334] "Generic (PLEG): container finished" podID="cd8a7609-928f-4a68-9903-fa846e4baeda" containerID="177ca2f00057b9f494561460d93507dad5143107f0da44739f7456ccfec82780" exitCode=0 Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.026706 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cd8a7609-928f-4a68-9903-fa846e4baeda","Type":"ContainerDied","Data":"177ca2f00057b9f494561460d93507dad5143107f0da44739f7456ccfec82780"} Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.026723 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cd8a7609-928f-4a68-9903-fa846e4baeda","Type":"ContainerDied","Data":"0d74668f0d505c6a8c3194b12cdbeebf089a0943b4250117059eacc60705a897"} Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.026771 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.030181 5002 generic.go:334] "Generic (PLEG): container finished" podID="a67e154b-1de7-4e2b-9b87-049ea273fa01" containerID="3ee0c33be841aa47fb9e6a001724fc9d6d043c260bf489595a85fb2dccaf7988" exitCode=0 Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.030250 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement60c5-account-delete-729k9" event={"ID":"a67e154b-1de7-4e2b-9b87-049ea273fa01","Type":"ContainerDied","Data":"3ee0c33be841aa47fb9e6a001724fc9d6d043c260bf489595a85fb2dccaf7988"} Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.031048 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e36e954-d9c1-41e3-8542-e8f300db90cb-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "1e36e954-d9c1-41e3-8542-e8f300db90cb" (UID: "1e36e954-d9c1-41e3-8542-e8f300db90cb"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.034709 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cindere0a9-account-delete-b5zfk" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.035292 5002 scope.go:117] "RemoveContainer" containerID="165a1620604d830dded2bfca6d82a903dd8294bcf0db3c611e329f3f40e88ace" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.035365 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cindere0a9-account-delete-b5zfk" event={"ID":"ae6c00ce-3152-42ae-890f-bb76aac103c5","Type":"ContainerDied","Data":"0b53d8e361dde882e6f1fc7c25ad43d552fd155607d3b0f8ef6d766e9eb8fb9d"} Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.035391 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b53d8e361dde882e6f1fc7c25ad43d552fd155607d3b0f8ef6d766e9eb8fb9d" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.072110 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd8a7609-928f-4a68-9903-fa846e4baeda-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "cd8a7609-928f-4a68-9903-fa846e4baeda" (UID: "cd8a7609-928f-4a68-9903-fa846e4baeda"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.078751 5002 scope.go:117] "RemoveContainer" containerID="1dd5020ee445c45b526aafb1a2e0ba3b4c5c0fb89e017224140c385ac48d4a20" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.084602 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e36e954-d9c1-41e3-8542-e8f300db90cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1e36e954-d9c1-41e3-8542-e8f300db90cb" (UID: "1e36e954-d9c1-41e3-8542-e8f300db90cb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.098844 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b5836b7-7b16-477f-9a20-f30032362374-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6b5836b7-7b16-477f-9a20-f30032362374" (UID: "6b5836b7-7b16-477f-9a20-f30032362374"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.100644 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b5836b7-7b16-477f-9a20-f30032362374-config-data" (OuterVolumeSpecName: "config-data") pod "6b5836b7-7b16-477f-9a20-f30032362374" (UID: "6b5836b7-7b16-477f-9a20-f30032362374"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.105163 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f702a539-ec25-44d4-8629-97b3c5499b96-combined-ca-bundle\") pod \"f702a539-ec25-44d4-8629-97b3c5499b96\" (UID: \"f702a539-ec25-44d4-8629-97b3c5499b96\") " Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.105191 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f702a539-ec25-44d4-8629-97b3c5499b96-config-data-custom\") pod \"f702a539-ec25-44d4-8629-97b3c5499b96\" (UID: \"f702a539-ec25-44d4-8629-97b3c5499b96\") " Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.105283 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f702a539-ec25-44d4-8629-97b3c5499b96-etc-machine-id\") pod \"f702a539-ec25-44d4-8629-97b3c5499b96\" (UID: \"f702a539-ec25-44d4-8629-97b3c5499b96\") " Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.105300 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4061af7-7669-4bd4-a36c-6ec982e86753-combined-ca-bundle\") pod \"a4061af7-7669-4bd4-a36c-6ec982e86753\" (UID: \"a4061af7-7669-4bd4-a36c-6ec982e86753\") " Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.105739 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f702a539-ec25-44d4-8629-97b3c5499b96-config-data\") pod \"f702a539-ec25-44d4-8629-97b3c5499b96\" (UID: \"f702a539-ec25-44d4-8629-97b3c5499b96\") " Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.105768 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a4061af7-7669-4bd4-a36c-6ec982e86753-config-data\") pod \"a4061af7-7669-4bd4-a36c-6ec982e86753\" (UID: \"a4061af7-7669-4bd4-a36c-6ec982e86753\") " Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.105793 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlfpl\" (UniqueName: \"kubernetes.io/projected/f702a539-ec25-44d4-8629-97b3c5499b96-kube-api-access-xlfpl\") pod \"f702a539-ec25-44d4-8629-97b3c5499b96\" (UID: \"f702a539-ec25-44d4-8629-97b3c5499b96\") " Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.105862 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43512a9c-be3a-4c0e-a178-82c5a065acf4-config-data\") pod \"43512a9c-be3a-4c0e-a178-82c5a065acf4\" (UID: \"43512a9c-be3a-4c0e-a178-82c5a065acf4\") " Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.105901 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43512a9c-be3a-4c0e-a178-82c5a065acf4-combined-ca-bundle\") pod \"43512a9c-be3a-4c0e-a178-82c5a065acf4\" (UID: \"43512a9c-be3a-4c0e-a178-82c5a065acf4\") " Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.105936 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpcg9\" (UniqueName: \"kubernetes.io/projected/43512a9c-be3a-4c0e-a178-82c5a065acf4-kube-api-access-rpcg9\") pod \"43512a9c-be3a-4c0e-a178-82c5a065acf4\" (UID: \"43512a9c-be3a-4c0e-a178-82c5a065acf4\") " Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.105973 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f702a539-ec25-44d4-8629-97b3c5499b96-scripts\") pod \"f702a539-ec25-44d4-8629-97b3c5499b96\" (UID: \"f702a539-ec25-44d4-8629-97b3c5499b96\") " Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.105992 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4061af7-7669-4bd4-a36c-6ec982e86753-internal-tls-certs\") pod \"a4061af7-7669-4bd4-a36c-6ec982e86753\" (UID: \"a4061af7-7669-4bd4-a36c-6ec982e86753\") " Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.106241 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4061af7-7669-4bd4-a36c-6ec982e86753-logs\") pod \"a4061af7-7669-4bd4-a36c-6ec982e86753\" (UID: \"a4061af7-7669-4bd4-a36c-6ec982e86753\") " Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.106270 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcfbp\" (UniqueName: \"kubernetes.io/projected/a4061af7-7669-4bd4-a36c-6ec982e86753-kube-api-access-tcfbp\") pod \"a4061af7-7669-4bd4-a36c-6ec982e86753\" (UID: \"a4061af7-7669-4bd4-a36c-6ec982e86753\") " Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.106300 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4061af7-7669-4bd4-a36c-6ec982e86753-public-tls-certs\") pod \"a4061af7-7669-4bd4-a36c-6ec982e86753\" (UID: \"a4061af7-7669-4bd4-a36c-6ec982e86753\") " Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.106320 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/a4061af7-7669-4bd4-a36c-6ec982e86753-config-data-custom\") pod \"a4061af7-7669-4bd4-a36c-6ec982e86753\" (UID: \"a4061af7-7669-4bd4-a36c-6ec982e86753\") " Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.107659 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd8a7609-928f-4a68-9903-fa846e4baeda-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.107676 5002 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd8a7609-928f-4a68-9903-fa846e4baeda-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.107686 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b5836b7-7b16-477f-9a20-f30032362374-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.107695 5002 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b5836b7-7b16-477f-9a20-f30032362374-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.117536 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e36e954-d9c1-41e3-8542-e8f300db90cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.121588 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4061af7-7669-4bd4-a36c-6ec982e86753-logs" (OuterVolumeSpecName: "logs") pod "a4061af7-7669-4bd4-a36c-6ec982e86753" (UID: "a4061af7-7669-4bd4-a36c-6ec982e86753"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.117462 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f702a539-ec25-44d4-8629-97b3c5499b96-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "f702a539-ec25-44d4-8629-97b3c5499b96" (UID: "f702a539-ec25-44d4-8629-97b3c5499b96"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.128016 5002 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b5836b7-7b16-477f-9a20-f30032362374-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.128075 5002 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e36e954-d9c1-41e3-8542-e8f300db90cb-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.132323 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f702a539-ec25-44d4-8629-97b3c5499b96-kube-api-access-xlfpl" (OuterVolumeSpecName: "kube-api-access-xlfpl") pod "f702a539-ec25-44d4-8629-97b3c5499b96" (UID: "f702a539-ec25-44d4-8629-97b3c5499b96"). InnerVolumeSpecName "kube-api-access-xlfpl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.145473 5002 scope.go:117] "RemoveContainer" containerID="aef926ebfeded32c3d77daf8bf94adfe524f72c37fbafa444980929d4131d304" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.148440 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f702a539-ec25-44d4-8629-97b3c5499b96-scripts" (OuterVolumeSpecName: "scripts") pod "f702a539-ec25-44d4-8629-97b3c5499b96" (UID: "f702a539-ec25-44d4-8629-97b3c5499b96"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.149136 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f702a539-ec25-44d4-8629-97b3c5499b96-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f702a539-ec25-44d4-8629-97b3c5499b96" (UID: "f702a539-ec25-44d4-8629-97b3c5499b96"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.149264 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4061af7-7669-4bd4-a36c-6ec982e86753-kube-api-access-tcfbp" (OuterVolumeSpecName: "kube-api-access-tcfbp") pod "a4061af7-7669-4bd4-a36c-6ec982e86753" (UID: "a4061af7-7669-4bd4-a36c-6ec982e86753"). InnerVolumeSpecName "kube-api-access-tcfbp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.152938 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43512a9c-be3a-4c0e-a178-82c5a065acf4-kube-api-access-rpcg9" (OuterVolumeSpecName: "kube-api-access-rpcg9") pod "43512a9c-be3a-4c0e-a178-82c5a065acf4" (UID: "43512a9c-be3a-4c0e-a178-82c5a065acf4"). InnerVolumeSpecName "kube-api-access-rpcg9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.160183 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4061af7-7669-4bd4-a36c-6ec982e86753-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a4061af7-7669-4bd4-a36c-6ec982e86753" (UID: "a4061af7-7669-4bd4-a36c-6ec982e86753"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.183633 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43512a9c-be3a-4c0e-a178-82c5a065acf4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "43512a9c-be3a-4c0e-a178-82c5a065acf4" (UID: "43512a9c-be3a-4c0e-a178-82c5a065acf4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.206051 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43512a9c-be3a-4c0e-a178-82c5a065acf4-config-data" (OuterVolumeSpecName: "config-data") pod "43512a9c-be3a-4c0e-a178-82c5a065acf4" (UID: "43512a9c-be3a-4c0e-a178-82c5a065acf4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.206913 5002 scope.go:117] "RemoveContainer" containerID="290faba9bb523cd08a95a906cba830cdb0fb097cbabfe914375d5a0fcdb253cd" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.215731 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4061af7-7669-4bd4-a36c-6ec982e86753-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4061af7-7669-4bd4-a36c-6ec982e86753" (UID: "a4061af7-7669-4bd4-a36c-6ec982e86753"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.227871 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.229347 5002 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f702a539-ec25-44d4-8629-97b3c5499b96-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.229369 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4061af7-7669-4bd4-a36c-6ec982e86753-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.229404 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlfpl\" (UniqueName: \"kubernetes.io/projected/f702a539-ec25-44d4-8629-97b3c5499b96-kube-api-access-xlfpl\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.229419 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43512a9c-be3a-4c0e-a178-82c5a065acf4-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.229431 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43512a9c-be3a-4c0e-a178-82c5a065acf4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.229441 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpcg9\" (UniqueName: \"kubernetes.io/projected/43512a9c-be3a-4c0e-a178-82c5a065acf4-kube-api-access-rpcg9\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.229450 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f702a539-ec25-44d4-8629-97b3c5499b96-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.229481 5002 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4061af7-7669-4bd4-a36c-6ec982e86753-logs\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.229492 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcfbp\" (UniqueName: \"kubernetes.io/projected/a4061af7-7669-4bd4-a36c-6ec982e86753-kube-api-access-tcfbp\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.229503 5002 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4061af7-7669-4bd4-a36c-6ec982e86753-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:23 
crc kubenswrapper[5002]: I1209 10:25:23.229572 5002 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f702a539-ec25-44d4-8629-97b3c5499b96-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.236406 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4061af7-7669-4bd4-a36c-6ec982e86753-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a4061af7-7669-4bd4-a36c-6ec982e86753" (UID: "a4061af7-7669-4bd4-a36c-6ec982e86753"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.241914 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.242195 5002 scope.go:117] "RemoveContainer" containerID="aef926ebfeded32c3d77daf8bf94adfe524f72c37fbafa444980929d4131d304" Dec 09 10:25:23 crc kubenswrapper[5002]: E1209 10:25:23.242879 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aef926ebfeded32c3d77daf8bf94adfe524f72c37fbafa444980929d4131d304\": container with ID starting with aef926ebfeded32c3d77daf8bf94adfe524f72c37fbafa444980929d4131d304 not found: ID does not exist" containerID="aef926ebfeded32c3d77daf8bf94adfe524f72c37fbafa444980929d4131d304" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.243412 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aef926ebfeded32c3d77daf8bf94adfe524f72c37fbafa444980929d4131d304"} err="failed to get container status \"aef926ebfeded32c3d77daf8bf94adfe524f72c37fbafa444980929d4131d304\": rpc error: code = NotFound desc = could not find container \"aef926ebfeded32c3d77daf8bf94adfe524f72c37fbafa444980929d4131d304\": container with ID starting with aef926ebfeded32c3d77daf8bf94adfe524f72c37fbafa444980929d4131d304 not found: ID does not exist" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.243546 5002 scope.go:117] "RemoveContainer" containerID="290faba9bb523cd08a95a906cba830cdb0fb097cbabfe914375d5a0fcdb253cd" Dec 09 10:25:23 crc kubenswrapper[5002]: E1209 10:25:23.244579 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"290faba9bb523cd08a95a906cba830cdb0fb097cbabfe914375d5a0fcdb253cd\": container with ID starting with 290faba9bb523cd08a95a906cba830cdb0fb097cbabfe914375d5a0fcdb253cd not found: ID does not exist" containerID="290faba9bb523cd08a95a906cba830cdb0fb097cbabfe914375d5a0fcdb253cd" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.244927 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"290faba9bb523cd08a95a906cba830cdb0fb097cbabfe914375d5a0fcdb253cd"} err="failed to get container status \"290faba9bb523cd08a95a906cba830cdb0fb097cbabfe914375d5a0fcdb253cd\": rpc error: code = NotFound desc = could not find container \"290faba9bb523cd08a95a906cba830cdb0fb097cbabfe914375d5a0fcdb253cd\": container with ID starting with 290faba9bb523cd08a95a906cba830cdb0fb097cbabfe914375d5a0fcdb253cd not found: ID does not exist" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.245687 5002 scope.go:117] "RemoveContainer" containerID="346c243539a09bb0cf2ecabd1fa68b92b5e3b4d887823c3ba0eff1a45067f934" Dec 09 10:25:23 crc 
kubenswrapper[5002]: I1209 10:25:23.251803 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-56f74754d8-5pd9q"] Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.256926 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f702a539-ec25-44d4-8629-97b3c5499b96-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f702a539-ec25-44d4-8629-97b3c5499b96" (UID: "f702a539-ec25-44d4-8629-97b3c5499b96"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.258190 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-56f74754d8-5pd9q"] Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.267404 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4061af7-7669-4bd4-a36c-6ec982e86753-config-data" (OuterVolumeSpecName: "config-data") pod "a4061af7-7669-4bd4-a36c-6ec982e86753" (UID: "a4061af7-7669-4bd4-a36c-6ec982e86753"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.274109 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4061af7-7669-4bd4-a36c-6ec982e86753-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a4061af7-7669-4bd4-a36c-6ec982e86753" (UID: "a4061af7-7669-4bd4-a36c-6ec982e86753"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.277015 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glancea775-account-delete-zmp4n"] Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.281751 5002 scope.go:117] "RemoveContainer" containerID="87bebcf10614da44af2b08b3844e8a098da235879ef6a0ce2fdbe6d780cb77c8" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.287650 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glancea775-account-delete-zmp4n"] Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.292189 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f702a539-ec25-44d4-8629-97b3c5499b96-config-data" (OuterVolumeSpecName: "config-data") pod "f702a539-ec25-44d4-8629-97b3c5499b96" (UID: "f702a539-ec25-44d4-8629-97b3c5499b96"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.330996 5002 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4061af7-7669-4bd4-a36c-6ec982e86753-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.331026 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f702a539-ec25-44d4-8629-97b3c5499b96-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.331036 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f702a539-ec25-44d4-8629-97b3c5499b96-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.331044 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4061af7-7669-4bd4-a36c-6ec982e86753-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.331053 5002 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4061af7-7669-4bd4-a36c-6ec982e86753-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.331080 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="7faabd78-c9ab-4397-aa4d-b8aaff302251" containerName="galera" containerID="cri-o://6bf1ad080014bf0300fd0f26c245a94c77d984fb37e7f168c414b8d47ddbdbd5" gracePeriod=30 Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.494021 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron1d25-account-delete-f87kn" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.509754 5002 scope.go:117] "RemoveContainer" containerID="1767450d54e834d07c9f4e17540dd48734de123d1bb880696d3cd80a533970df" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.529546 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.538734 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c44aced5-6d19-429a-8917-cd4229341433-operator-scripts\") pod \"c44aced5-6d19-429a-8917-cd4229341433\" (UID: \"c44aced5-6d19-429a-8917-cd4229341433\") " Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.538882 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4z82r\" (UniqueName: \"kubernetes.io/projected/c44aced5-6d19-429a-8917-cd4229341433-kube-api-access-4z82r\") pod \"c44aced5-6d19-429a-8917-cd4229341433\" (UID: \"c44aced5-6d19-429a-8917-cd4229341433\") " Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.543509 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c44aced5-6d19-429a-8917-cd4229341433-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c44aced5-6d19-429a-8917-cd4229341433" (UID: "c44aced5-6d19-429a-8917-cd4229341433"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.550025 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c44aced5-6d19-429a-8917-cd4229341433-kube-api-access-4z82r" (OuterVolumeSpecName: "kube-api-access-4z82r") pod "c44aced5-6d19-429a-8917-cd4229341433" (UID: "c44aced5-6d19-429a-8917-cd4229341433"). InnerVolumeSpecName "kube-api-access-4z82r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.551563 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.570540 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.575430 5002 scope.go:117] "RemoveContainer" containerID="e31fd05bd7e517de1ef420971d431d6a2f3089efe18a744cd769acc8bbc26a81" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.578956 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.599241 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.614904 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.620882 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell0539b-account-delete-t9blx" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.622763 5002 scope.go:117] "RemoveContainer" containerID="b1619efb43e58f8e6a06eb0439aba518ccc393a1be761ea167d429fe1d32c4c8" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.640411 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfspk\" (UniqueName: \"kubernetes.io/projected/58de676b-7b73-4c04-b5d5-5de38a88072c-kube-api-access-sfspk\") pod \"58de676b-7b73-4c04-b5d5-5de38a88072c\" (UID: \"58de676b-7b73-4c04-b5d5-5de38a88072c\") " Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.640511 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58de676b-7b73-4c04-b5d5-5de38a88072c-operator-scripts\") pod \"58de676b-7b73-4c04-b5d5-5de38a88072c\" (UID: \"58de676b-7b73-4c04-b5d5-5de38a88072c\") " Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.640801 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c44aced5-6d19-429a-8917-cd4229341433-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.640836 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4z82r\" (UniqueName: \"kubernetes.io/projected/c44aced5-6d19-429a-8917-cd4229341433-kube-api-access-4z82r\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:23 crc kubenswrapper[5002]: E1209 10:25:23.640890 5002 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 09 10:25:23 crc kubenswrapper[5002]: E1209 10:25:23.640929 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9278e14e-2524-4e42-b870-f493ea02ede8-config-data podName:9278e14e-2524-4e42-b870-f493ea02ede8 nodeName:}" failed. 
No retries permitted until 2025-12-09 10:25:31.640915475 +0000 UTC m=+1464.032966556 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/9278e14e-2524-4e42-b870-f493ea02ede8-config-data") pod "rabbitmq-cell1-server-0" (UID: "9278e14e-2524-4e42-b870-f493ea02ede8") : configmap "rabbitmq-cell1-config-data" not found Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.643494 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58de676b-7b73-4c04-b5d5-5de38a88072c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "58de676b-7b73-4c04-b5d5-5de38a88072c" (UID: "58de676b-7b73-4c04-b5d5-5de38a88072c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.655639 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58de676b-7b73-4c04-b5d5-5de38a88072c-kube-api-access-sfspk" (OuterVolumeSpecName: "kube-api-access-sfspk") pod "58de676b-7b73-4c04-b5d5-5de38a88072c" (UID: "58de676b-7b73-4c04-b5d5-5de38a88072c"). InnerVolumeSpecName "kube-api-access-sfspk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.657087 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi5809-account-delete-b9cvt" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.658176 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement60c5-account-delete-729k9" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.667855 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.678128 5002 scope.go:117] "RemoveContainer" containerID="b1619efb43e58f8e6a06eb0439aba518ccc393a1be761ea167d429fe1d32c4c8" Dec 09 10:25:23 crc kubenswrapper[5002]: E1209 10:25:23.678546 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1619efb43e58f8e6a06eb0439aba518ccc393a1be761ea167d429fe1d32c4c8\": container with ID starting with b1619efb43e58f8e6a06eb0439aba518ccc393a1be761ea167d429fe1d32c4c8 not found: ID does not exist" containerID="b1619efb43e58f8e6a06eb0439aba518ccc393a1be761ea167d429fe1d32c4c8" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.678575 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1619efb43e58f8e6a06eb0439aba518ccc393a1be761ea167d429fe1d32c4c8"} err="failed to get container status \"b1619efb43e58f8e6a06eb0439aba518ccc393a1be761ea167d429fe1d32c4c8\": rpc error: code = NotFound desc = could not find container \"b1619efb43e58f8e6a06eb0439aba518ccc393a1be761ea167d429fe1d32c4c8\": container with ID starting with b1619efb43e58f8e6a06eb0439aba518ccc393a1be761ea167d429fe1d32c4c8 not found: ID does not exist" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.678593 5002 scope.go:117] "RemoveContainer" containerID="177ca2f00057b9f494561460d93507dad5143107f0da44739f7456ccfec82780" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.685284 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.721561 5002 scope.go:117] "RemoveContainer" 
containerID="47df4f8a2eceea5148332f48a2f1938fcdc34779680d4051e6a0db02acbf62a9" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.739330 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-47b4k" podUID="fdaeef31-a8f8-478a-86b0-4d0126eb7f3a" containerName="ovn-controller" probeResult="failure" output="command timed out" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.742108 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58de676b-7b73-4c04-b5d5-5de38a88072c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.742132 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfspk\" (UniqueName: \"kubernetes.io/projected/58de676b-7b73-4c04-b5d5-5de38a88072c-kube-api-access-sfspk\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.747406 5002 scope.go:117] "RemoveContainer" containerID="177ca2f00057b9f494561460d93507dad5143107f0da44739f7456ccfec82780" Dec 09 10:25:23 crc kubenswrapper[5002]: E1209 10:25:23.747827 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"177ca2f00057b9f494561460d93507dad5143107f0da44739f7456ccfec82780\": container with ID starting with 177ca2f00057b9f494561460d93507dad5143107f0da44739f7456ccfec82780 not found: ID does not exist" containerID="177ca2f00057b9f494561460d93507dad5143107f0da44739f7456ccfec82780" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.747871 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"177ca2f00057b9f494561460d93507dad5143107f0da44739f7456ccfec82780"} err="failed to get container status \"177ca2f00057b9f494561460d93507dad5143107f0da44739f7456ccfec82780\": rpc error: code = NotFound desc = could not find container \"177ca2f00057b9f494561460d93507dad5143107f0da44739f7456ccfec82780\": container with ID starting with 177ca2f00057b9f494561460d93507dad5143107f0da44739f7456ccfec82780 not found: ID does not exist" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.747899 5002 scope.go:117] "RemoveContainer" containerID="47df4f8a2eceea5148332f48a2f1938fcdc34779680d4051e6a0db02acbf62a9" Dec 09 10:25:23 crc kubenswrapper[5002]: E1209 10:25:23.748135 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47df4f8a2eceea5148332f48a2f1938fcdc34779680d4051e6a0db02acbf62a9\": container with ID starting with 47df4f8a2eceea5148332f48a2f1938fcdc34779680d4051e6a0db02acbf62a9 not found: ID does not exist" containerID="47df4f8a2eceea5148332f48a2f1938fcdc34779680d4051e6a0db02acbf62a9" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.748161 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47df4f8a2eceea5148332f48a2f1938fcdc34779680d4051e6a0db02acbf62a9"} err="failed to get container status \"47df4f8a2eceea5148332f48a2f1938fcdc34779680d4051e6a0db02acbf62a9\": rpc error: code = NotFound desc = could not find container \"47df4f8a2eceea5148332f48a2f1938fcdc34779680d4051e6a0db02acbf62a9\": container with ID starting with 47df4f8a2eceea5148332f48a2f1938fcdc34779680d4051e6a0db02acbf62a9 not found: ID does not exist" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.775689 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-47b4k" 
podUID="fdaeef31-a8f8-478a-86b0-4d0126eb7f3a" containerName="ovn-controller" probeResult="failure" output=< Dec 09 10:25:23 crc kubenswrapper[5002]: ERROR - Failed to get connection status from ovn-controller, ovn-appctl exit status: 0 Dec 09 10:25:23 crc kubenswrapper[5002]: > Dec 09 10:25:23 crc kubenswrapper[5002]: E1209 10:25:23.791419 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c2d2c6137f09a0249a92e685f777a88211e7a367963f717a3fe59c6940a69a5 is running failed: container process not found" containerID="5c2d2c6137f09a0249a92e685f777a88211e7a367963f717a3fe59c6940a69a5" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 10:25:23 crc kubenswrapper[5002]: E1209 10:25:23.791687 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c2d2c6137f09a0249a92e685f777a88211e7a367963f717a3fe59c6940a69a5 is running failed: container process not found" containerID="5c2d2c6137f09a0249a92e685f777a88211e7a367963f717a3fe59c6940a69a5" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 10:25:23 crc kubenswrapper[5002]: E1209 10:25:23.791925 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c2d2c6137f09a0249a92e685f777a88211e7a367963f717a3fe59c6940a69a5 is running failed: container process not found" containerID="5c2d2c6137f09a0249a92e685f777a88211e7a367963f717a3fe59c6940a69a5" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 10:25:23 crc kubenswrapper[5002]: E1209 10:25:23.791957 5002 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c2d2c6137f09a0249a92e685f777a88211e7a367963f717a3fe59c6940a69a5 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-g4kc8" podUID="26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6" containerName="ovsdb-server" Dec 09 10:25:23 crc kubenswrapper[5002]: E1209 10:25:23.792955 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3f18346c6d45cdce8933113ee6ff0f64d79183a978ac856ba561f2eb32009782" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 10:25:23 crc kubenswrapper[5002]: E1209 10:25:23.796796 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3f18346c6d45cdce8933113ee6ff0f64d79183a978ac856ba561f2eb32009782" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 10:25:23 crc kubenswrapper[5002]: E1209 10:25:23.797962 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3f18346c6d45cdce8933113ee6ff0f64d79183a978ac856ba561f2eb32009782" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 10:25:23 crc kubenswrapper[5002]: E1209 10:25:23.798010 5002 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is 
stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-g4kc8" podUID="26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6" containerName="ovs-vswitchd" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.842796 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a67e154b-1de7-4e2b-9b87-049ea273fa01-operator-scripts\") pod \"a67e154b-1de7-4e2b-9b87-049ea273fa01\" (UID: \"a67e154b-1de7-4e2b-9b87-049ea273fa01\") " Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.842866 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f41619d4-24a3-46e4-9cb9-2e388f7cd36b-operator-scripts\") pod \"f41619d4-24a3-46e4-9cb9-2e388f7cd36b\" (UID: \"f41619d4-24a3-46e4-9cb9-2e388f7cd36b\") " Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.842976 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vg4r8\" (UniqueName: \"kubernetes.io/projected/a67e154b-1de7-4e2b-9b87-049ea273fa01-kube-api-access-vg4r8\") pod \"a67e154b-1de7-4e2b-9b87-049ea273fa01\" (UID: \"a67e154b-1de7-4e2b-9b87-049ea273fa01\") " Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.843005 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjkq2\" (UniqueName: \"kubernetes.io/projected/f41619d4-24a3-46e4-9cb9-2e388f7cd36b-kube-api-access-wjkq2\") pod \"f41619d4-24a3-46e4-9cb9-2e388f7cd36b\" (UID: \"f41619d4-24a3-46e4-9cb9-2e388f7cd36b\") " Dec 09 10:25:23 crc kubenswrapper[5002]: E1209 10:25:23.843412 5002 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 09 10:25:23 crc kubenswrapper[5002]: E1209 10:25:23.843461 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/58c08274-46ea-48be-a135-0c1174cd6135-config-data podName:58c08274-46ea-48be-a135-0c1174cd6135 nodeName:}" failed. No retries permitted until 2025-12-09 10:25:31.843448652 +0000 UTC m=+1464.235499723 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/58c08274-46ea-48be-a135-0c1174cd6135-config-data") pod "rabbitmq-server-0" (UID: "58c08274-46ea-48be-a135-0c1174cd6135") : configmap "rabbitmq-config-data" not found Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.844172 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a67e154b-1de7-4e2b-9b87-049ea273fa01-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a67e154b-1de7-4e2b-9b87-049ea273fa01" (UID: "a67e154b-1de7-4e2b-9b87-049ea273fa01"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.844545 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f41619d4-24a3-46e4-9cb9-2e388f7cd36b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f41619d4-24a3-46e4-9cb9-2e388f7cd36b" (UID: "f41619d4-24a3-46e4-9cb9-2e388f7cd36b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.848158 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a67e154b-1de7-4e2b-9b87-049ea273fa01-kube-api-access-vg4r8" (OuterVolumeSpecName: "kube-api-access-vg4r8") pod "a67e154b-1de7-4e2b-9b87-049ea273fa01" (UID: "a67e154b-1de7-4e2b-9b87-049ea273fa01"). InnerVolumeSpecName "kube-api-access-vg4r8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.851057 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f41619d4-24a3-46e4-9cb9-2e388f7cd36b-kube-api-access-wjkq2" (OuterVolumeSpecName: "kube-api-access-wjkq2") pod "f41619d4-24a3-46e4-9cb9-2e388f7cd36b" (UID: "f41619d4-24a3-46e4-9cb9-2e388f7cd36b"). InnerVolumeSpecName "kube-api-access-wjkq2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.945543 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a67e154b-1de7-4e2b-9b87-049ea273fa01-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.945788 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f41619d4-24a3-46e4-9cb9-2e388f7cd36b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.945798 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vg4r8\" (UniqueName: \"kubernetes.io/projected/a67e154b-1de7-4e2b-9b87-049ea273fa01-kube-api-access-vg4r8\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:23 crc kubenswrapper[5002]: I1209 10:25:23.945806 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjkq2\" (UniqueName: \"kubernetes.io/projected/f41619d4-24a3-46e4-9cb9-2e388f7cd36b-kube-api-access-wjkq2\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.047317 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi5809-account-delete-b9cvt" event={"ID":"f41619d4-24a3-46e4-9cb9-2e388f7cd36b","Type":"ContainerDied","Data":"179df0062e8de5005d386674c0c65154f75bb2efd0bb9beba8dd1f9d2c20a0fa"} Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.047366 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="179df0062e8de5005d386674c0c65154f75bb2efd0bb9beba8dd1f9d2c20a0fa" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.047430 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi5809-account-delete-b9cvt" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.059862 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.068644 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell0539b-account-delete-t9blx" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.078093 5002 generic.go:334] "Generic (PLEG): container finished" podID="9278e14e-2524-4e42-b870-f493ea02ede8" containerID="1faa363b9769f751a8c09fade1d2f2f3b3905666130dc1d039543eef99f84775" exitCode=0 Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.106626 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement60c5-account-delete-729k9" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.111232 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0172d8ed-9ef1-4aac-b246-1b1ed0df87fc" path="/var/lib/kubelet/pods/0172d8ed-9ef1-4aac-b246-1b1ed0df87fc/volumes" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.112384 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02c94bee-a522-4ea6-85af-1ba68e174203" path="/var/lib/kubelet/pods/02c94bee-a522-4ea6-85af-1ba68e174203/volumes" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.113626 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf056c0-a496-4499-92c7-3b1300b4a29d" path="/var/lib/kubelet/pods/1bf056c0-a496-4499-92c7-3b1300b4a29d/volumes" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.115969 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e36e954-d9c1-41e3-8542-e8f300db90cb" path="/var/lib/kubelet/pods/1e36e954-d9c1-41e3-8542-e8f300db90cb/volumes" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.117041 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5c454948fd-lwcxn" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.117242 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2adbbd67-ccdf-4444-b667-2b549bc200b5" path="/var/lib/kubelet/pods/2adbbd67-ccdf-4444-b667-2b549bc200b5/volumes" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.133042 5002 generic.go:334] "Generic (PLEG): container finished" podID="58c08274-46ea-48be-a135-0c1174cd6135" containerID="b05714ada64dee7eaed39017f863e151b219f928c230aa2f336910df9726668b" exitCode=0 Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.135219 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54351653-7ebd-40ba-8181-bb1023f18190" path="/var/lib/kubelet/pods/54351653-7ebd-40ba-8181-bb1023f18190/volumes" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.136738 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron1d25-account-delete-f87kn" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.140941 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-89c5cd4d5-cbdk5" podUID="e0a5beb3-4401-42b8-b8e3-4d2af995a4d0" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.196:5353: i/o timeout" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.155057 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65df60b6-4049-47b6-9907-ebf76c151213" path="/var/lib/kubelet/pods/65df60b6-4049-47b6-9907-ebf76c151213/volumes" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.156147 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b5836b7-7b16-477f-9a20-f30032362374" path="/var/lib/kubelet/pods/6b5836b7-7b16-477f-9a20-f30032362374/volumes" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.157335 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ae47b25-e6fd-451f-9827-72ee4e12e526" path="/var/lib/kubelet/pods/7ae47b25-e6fd-451f-9827-72ee4e12e526/volumes" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.158032 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6b9775f-22d1-413b-8d2f-1dbe890b582c" path="/var/lib/kubelet/pods/c6b9775f-22d1-413b-8d2f-1dbe890b582c/volumes" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.158723 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd8a7609-928f-4a68-9903-fa846e4baeda" path="/var/lib/kubelet/pods/cd8a7609-928f-4a68-9903-fa846e4baeda/volumes" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.160184 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f702a539-ec25-44d4-8629-97b3c5499b96" path="/var/lib/kubelet/pods/f702a539-ec25-44d4-8629-97b3c5499b96/volumes" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.164732 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapi5809-account-delete-b9cvt"] Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.164767 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"43512a9c-be3a-4c0e-a178-82c5a065acf4","Type":"ContainerDied","Data":"0bb1d4253cd67a6308db800435b668431a4ad39102bd21c8bdb2de49d5a470ee"} Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.164842 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novaapi5809-account-delete-b9cvt"] Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.164895 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.164930 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0539b-account-delete-t9blx" event={"ID":"58de676b-7b73-4c04-b5d5-5de38a88072c","Type":"ContainerDied","Data":"ea887b59a65e1a0c46201433d9c42e1919e4d2130b12ab041d94bf4cb968f9a2"} Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.164968 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea887b59a65e1a0c46201433d9c42e1919e4d2130b12ab041d94bf4cb968f9a2" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.164996 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9278e14e-2524-4e42-b870-f493ea02ede8","Type":"ContainerDied","Data":"1faa363b9769f751a8c09fade1d2f2f3b3905666130dc1d039543eef99f84775"} Dec 09 10:25:24 crc kubenswrapper[5002]: 
I1209 10:25:24.165015 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement60c5-account-delete-729k9" event={"ID":"a67e154b-1de7-4e2b-9b87-049ea273fa01","Type":"ContainerDied","Data":"99b62954faee487ae496d2100f99fd237cf97a90af6ebc5a45a93493d3275990"} Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.165056 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c454948fd-lwcxn" event={"ID":"a4061af7-7669-4bd4-a36c-6ec982e86753","Type":"ContainerDied","Data":"31693255ef31e683e54359ba3df04abd53715c4072183470e51814121ba2ebfc"} Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.165080 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"58c08274-46ea-48be-a135-0c1174cd6135","Type":"ContainerDied","Data":"b05714ada64dee7eaed39017f863e151b219f928c230aa2f336910df9726668b"} Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.165108 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron1d25-account-delete-f87kn" event={"ID":"c44aced5-6d19-429a-8917-cd4229341433","Type":"ContainerDied","Data":"815ca33aded12bd65e92a6114007217ffb908a39e3774f83c6f60f6185d4f244"} Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.165121 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="815ca33aded12bd65e92a6114007217ffb908a39e3774f83c6f60f6185d4f244" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.165145 5002 scope.go:117] "RemoveContainer" containerID="36bf5a63f64b1da8bf0d3200a657077d8683342ea2307df72c904532c9648a0a" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.168501 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.383907 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-cell1-novncproxy-0" podUID="b613f5a4-9369-45ae-8c2c-10e16e639999" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"https://10.217.0.186:6080/vnc_lite.html\": context deadline exceeded" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.417042 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.421549 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.423962 5002 scope.go:117] "RemoveContainer" containerID="3ee0c33be841aa47fb9e6a001724fc9d6d043c260bf489595a85fb2dccaf7988" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.432359 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell0539b-account-delete-t9blx"] Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.474882 5002 scope.go:117] "RemoveContainer" containerID="2944a25a7c0f087015f80b3d4d12a8b2ffabefdcb9d6f6b684e88e4e6b57e2db" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.494413 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novacell0539b-account-delete-t9blx"] Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.512892 5002 scope.go:117] "RemoveContainer" containerID="6cb449e2adfcabb9641ca2b98611d189bd76e94105b2edc6a7b7b41e8dbf68a0" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.523507 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5c454948fd-lwcxn"] Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.529049 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5c454948fd-lwcxn"] Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.534961 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement60c5-account-delete-729k9"] Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.542682 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement60c5-account-delete-729k9"] Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.557681 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron1d25-account-delete-f87kn"] Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.562970 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron1d25-account-delete-f87kn"] Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.572567 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9278e14e-2524-4e42-b870-f493ea02ede8-pod-info\") pod \"9278e14e-2524-4e42-b870-f493ea02ede8\" (UID: \"9278e14e-2524-4e42-b870-f493ea02ede8\") " Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.572633 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tck5z\" (UniqueName: \"kubernetes.io/projected/9278e14e-2524-4e42-b870-f493ea02ede8-kube-api-access-tck5z\") pod \"9278e14e-2524-4e42-b870-f493ea02ede8\" (UID: \"9278e14e-2524-4e42-b870-f493ea02ede8\") " Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.572663 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"58c08274-46ea-48be-a135-0c1174cd6135\" (UID: \"58c08274-46ea-48be-a135-0c1174cd6135\") " Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.572689 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/58c08274-46ea-48be-a135-0c1174cd6135-pod-info\") pod \"58c08274-46ea-48be-a135-0c1174cd6135\" (UID: \"58c08274-46ea-48be-a135-0c1174cd6135\") " Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.572737 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w627k\" (UniqueName: 
\"kubernetes.io/projected/58c08274-46ea-48be-a135-0c1174cd6135-kube-api-access-w627k\") pod \"58c08274-46ea-48be-a135-0c1174cd6135\" (UID: \"58c08274-46ea-48be-a135-0c1174cd6135\") " Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.572760 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/58c08274-46ea-48be-a135-0c1174cd6135-rabbitmq-tls\") pod \"58c08274-46ea-48be-a135-0c1174cd6135\" (UID: \"58c08274-46ea-48be-a135-0c1174cd6135\") " Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.572787 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9278e14e-2524-4e42-b870-f493ea02ede8-plugins-conf\") pod \"9278e14e-2524-4e42-b870-f493ea02ede8\" (UID: \"9278e14e-2524-4e42-b870-f493ea02ede8\") " Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.572843 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9278e14e-2524-4e42-b870-f493ea02ede8-rabbitmq-plugins\") pod \"9278e14e-2524-4e42-b870-f493ea02ede8\" (UID: \"9278e14e-2524-4e42-b870-f493ea02ede8\") " Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.572881 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/58c08274-46ea-48be-a135-0c1174cd6135-rabbitmq-plugins\") pod \"58c08274-46ea-48be-a135-0c1174cd6135\" (UID: \"58c08274-46ea-48be-a135-0c1174cd6135\") " Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.572906 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9278e14e-2524-4e42-b870-f493ea02ede8-rabbitmq-tls\") pod \"9278e14e-2524-4e42-b870-f493ea02ede8\" (UID: \"9278e14e-2524-4e42-b870-f493ea02ede8\") " Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.572927 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/58c08274-46ea-48be-a135-0c1174cd6135-plugins-conf\") pod \"58c08274-46ea-48be-a135-0c1174cd6135\" (UID: \"58c08274-46ea-48be-a135-0c1174cd6135\") " Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.572947 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9278e14e-2524-4e42-b870-f493ea02ede8-rabbitmq-confd\") pod \"9278e14e-2524-4e42-b870-f493ea02ede8\" (UID: \"9278e14e-2524-4e42-b870-f493ea02ede8\") " Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.572966 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9278e14e-2524-4e42-b870-f493ea02ede8-server-conf\") pod \"9278e14e-2524-4e42-b870-f493ea02ede8\" (UID: \"9278e14e-2524-4e42-b870-f493ea02ede8\") " Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.573004 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9278e14e-2524-4e42-b870-f493ea02ede8-erlang-cookie-secret\") pod \"9278e14e-2524-4e42-b870-f493ea02ede8\" (UID: \"9278e14e-2524-4e42-b870-f493ea02ede8\") " Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.573065 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/9278e14e-2524-4e42-b870-f493ea02ede8-config-data\") pod \"9278e14e-2524-4e42-b870-f493ea02ede8\" (UID: \"9278e14e-2524-4e42-b870-f493ea02ede8\") " Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.573112 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"9278e14e-2524-4e42-b870-f493ea02ede8\" (UID: \"9278e14e-2524-4e42-b870-f493ea02ede8\") " Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.573148 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/58c08274-46ea-48be-a135-0c1174cd6135-erlang-cookie-secret\") pod \"58c08274-46ea-48be-a135-0c1174cd6135\" (UID: \"58c08274-46ea-48be-a135-0c1174cd6135\") " Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.573169 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/58c08274-46ea-48be-a135-0c1174cd6135-server-conf\") pod \"58c08274-46ea-48be-a135-0c1174cd6135\" (UID: \"58c08274-46ea-48be-a135-0c1174cd6135\") " Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.573200 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/58c08274-46ea-48be-a135-0c1174cd6135-rabbitmq-erlang-cookie\") pod \"58c08274-46ea-48be-a135-0c1174cd6135\" (UID: \"58c08274-46ea-48be-a135-0c1174cd6135\") " Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.573225 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/58c08274-46ea-48be-a135-0c1174cd6135-config-data\") pod \"58c08274-46ea-48be-a135-0c1174cd6135\" (UID: \"58c08274-46ea-48be-a135-0c1174cd6135\") " Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.573245 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9278e14e-2524-4e42-b870-f493ea02ede8-rabbitmq-erlang-cookie\") pod \"9278e14e-2524-4e42-b870-f493ea02ede8\" (UID: \"9278e14e-2524-4e42-b870-f493ea02ede8\") " Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.573260 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/58c08274-46ea-48be-a135-0c1174cd6135-rabbitmq-confd\") pod \"58c08274-46ea-48be-a135-0c1174cd6135\" (UID: \"58c08274-46ea-48be-a135-0c1174cd6135\") " Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.574294 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58c08274-46ea-48be-a135-0c1174cd6135-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "58c08274-46ea-48be-a135-0c1174cd6135" (UID: "58c08274-46ea-48be-a135-0c1174cd6135"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.576862 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58c08274-46ea-48be-a135-0c1174cd6135-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "58c08274-46ea-48be-a135-0c1174cd6135" (UID: "58c08274-46ea-48be-a135-0c1174cd6135"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.577149 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9278e14e-2524-4e42-b870-f493ea02ede8-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "9278e14e-2524-4e42-b870-f493ea02ede8" (UID: "9278e14e-2524-4e42-b870-f493ea02ede8"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.577478 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9278e14e-2524-4e42-b870-f493ea02ede8-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "9278e14e-2524-4e42-b870-f493ea02ede8" (UID: "9278e14e-2524-4e42-b870-f493ea02ede8"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.577674 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58c08274-46ea-48be-a135-0c1174cd6135-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "58c08274-46ea-48be-a135-0c1174cd6135" (UID: "58c08274-46ea-48be-a135-0c1174cd6135"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.578394 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9278e14e-2524-4e42-b870-f493ea02ede8-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "9278e14e-2524-4e42-b870-f493ea02ede8" (UID: "9278e14e-2524-4e42-b870-f493ea02ede8"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.579213 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "persistence") pod "9278e14e-2524-4e42-b870-f493ea02ede8" (UID: "9278e14e-2524-4e42-b870-f493ea02ede8"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.580083 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "58c08274-46ea-48be-a135-0c1174cd6135" (UID: "58c08274-46ea-48be-a135-0c1174cd6135"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.582786 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9278e14e-2524-4e42-b870-f493ea02ede8-kube-api-access-tck5z" (OuterVolumeSpecName: "kube-api-access-tck5z") pod "9278e14e-2524-4e42-b870-f493ea02ede8" (UID: "9278e14e-2524-4e42-b870-f493ea02ede8"). InnerVolumeSpecName "kube-api-access-tck5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.583678 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/9278e14e-2524-4e42-b870-f493ea02ede8-pod-info" (OuterVolumeSpecName: "pod-info") pod "9278e14e-2524-4e42-b870-f493ea02ede8" (UID: "9278e14e-2524-4e42-b870-f493ea02ede8"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.584075 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9278e14e-2524-4e42-b870-f493ea02ede8-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "9278e14e-2524-4e42-b870-f493ea02ede8" (UID: "9278e14e-2524-4e42-b870-f493ea02ede8"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.584125 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58c08274-46ea-48be-a135-0c1174cd6135-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "58c08274-46ea-48be-a135-0c1174cd6135" (UID: "58c08274-46ea-48be-a135-0c1174cd6135"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.586959 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58c08274-46ea-48be-a135-0c1174cd6135-kube-api-access-w627k" (OuterVolumeSpecName: "kube-api-access-w627k") pod "58c08274-46ea-48be-a135-0c1174cd6135" (UID: "58c08274-46ea-48be-a135-0c1174cd6135"). InnerVolumeSpecName "kube-api-access-w627k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.587096 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9278e14e-2524-4e42-b870-f493ea02ede8-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "9278e14e-2524-4e42-b870-f493ea02ede8" (UID: "9278e14e-2524-4e42-b870-f493ea02ede8"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.595562 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58c08274-46ea-48be-a135-0c1174cd6135-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "58c08274-46ea-48be-a135-0c1174cd6135" (UID: "58c08274-46ea-48be-a135-0c1174cd6135"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.596244 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/58c08274-46ea-48be-a135-0c1174cd6135-pod-info" (OuterVolumeSpecName: "pod-info") pod "58c08274-46ea-48be-a135-0c1174cd6135" (UID: "58c08274-46ea-48be-a135-0c1174cd6135"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.609105 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58c08274-46ea-48be-a135-0c1174cd6135-config-data" (OuterVolumeSpecName: "config-data") pod "58c08274-46ea-48be-a135-0c1174cd6135" (UID: "58c08274-46ea-48be-a135-0c1174cd6135"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.618901 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58c08274-46ea-48be-a135-0c1174cd6135-server-conf" (OuterVolumeSpecName: "server-conf") pod "58c08274-46ea-48be-a135-0c1174cd6135" (UID: "58c08274-46ea-48be-a135-0c1174cd6135"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.618923 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9278e14e-2524-4e42-b870-f493ea02ede8-config-data" (OuterVolumeSpecName: "config-data") pod "9278e14e-2524-4e42-b870-f493ea02ede8" (UID: "9278e14e-2524-4e42-b870-f493ea02ede8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.626983 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9278e14e-2524-4e42-b870-f493ea02ede8-server-conf" (OuterVolumeSpecName: "server-conf") pod "9278e14e-2524-4e42-b870-f493ea02ede8" (UID: "9278e14e-2524-4e42-b870-f493ea02ede8"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.672457 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9278e14e-2524-4e42-b870-f493ea02ede8-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "9278e14e-2524-4e42-b870-f493ea02ede8" (UID: "9278e14e-2524-4e42-b870-f493ea02ede8"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.674974 5002 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9278e14e-2524-4e42-b870-f493ea02ede8-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.675011 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9278e14e-2524-4e42-b870-f493ea02ede8-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.675044 5002 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.675057 5002 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/58c08274-46ea-48be-a135-0c1174cd6135-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.675068 5002 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/58c08274-46ea-48be-a135-0c1174cd6135-server-conf\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.675079 5002 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/58c08274-46ea-48be-a135-0c1174cd6135-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.675090 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/58c08274-46ea-48be-a135-0c1174cd6135-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.675102 5002 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9278e14e-2524-4e42-b870-f493ea02ede8-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:24 crc 
kubenswrapper[5002]: I1209 10:25:24.675111 5002 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9278e14e-2524-4e42-b870-f493ea02ede8-pod-info\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.675121 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tck5z\" (UniqueName: \"kubernetes.io/projected/9278e14e-2524-4e42-b870-f493ea02ede8-kube-api-access-tck5z\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.675137 5002 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.675148 5002 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/58c08274-46ea-48be-a135-0c1174cd6135-pod-info\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.675159 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w627k\" (UniqueName: \"kubernetes.io/projected/58c08274-46ea-48be-a135-0c1174cd6135-kube-api-access-w627k\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.675169 5002 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/58c08274-46ea-48be-a135-0c1174cd6135-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.675179 5002 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9278e14e-2524-4e42-b870-f493ea02ede8-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.675191 5002 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9278e14e-2524-4e42-b870-f493ea02ede8-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.675200 5002 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/58c08274-46ea-48be-a135-0c1174cd6135-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.675210 5002 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9278e14e-2524-4e42-b870-f493ea02ede8-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.675219 5002 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/58c08274-46ea-48be-a135-0c1174cd6135-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.675228 5002 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9278e14e-2524-4e42-b870-f493ea02ede8-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.675238 5002 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9278e14e-2524-4e42-b870-f493ea02ede8-server-conf\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.689784 5002 operation_generator.go:917] UnmountDevice 
succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.698785 5002 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.705024 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58c08274-46ea-48be-a135-0c1174cd6135-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "58c08274-46ea-48be-a135-0c1174cd6135" (UID: "58c08274-46ea-48be-a135-0c1174cd6135"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.777150 5002 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.777188 5002 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/58c08274-46ea-48be-a135-0c1174cd6135-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:24 crc kubenswrapper[5002]: I1209 10:25:24.777202 5002 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:24 crc kubenswrapper[5002]: E1209 10:25:24.980735 5002 configmap.go:193] Couldn't get configMap openstack/openstack-config-data: configmap "openstack-config-data" not found Dec 09 10:25:24 crc kubenswrapper[5002]: E1209 10:25:24.982588 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7faabd78-c9ab-4397-aa4d-b8aaff302251-kolla-config podName:7faabd78-c9ab-4397-aa4d-b8aaff302251 nodeName:}" failed. No retries permitted until 2025-12-09 10:25:28.982570389 +0000 UTC m=+1461.374621470 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kolla-config" (UniqueName: "kubernetes.io/configmap/7faabd78-c9ab-4397-aa4d-b8aaff302251-kolla-config") pod "openstack-galera-0" (UID: "7faabd78-c9ab-4397-aa4d-b8aaff302251") : configmap "openstack-config-data" not found Dec 09 10:25:24 crc kubenswrapper[5002]: E1209 10:25:24.980958 5002 configmap.go:193] Couldn't get configMap openstack/openstack-config-data: configmap "openstack-config-data" not found Dec 09 10:25:24 crc kubenswrapper[5002]: E1209 10:25:24.982684 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7faabd78-c9ab-4397-aa4d-b8aaff302251-config-data-default podName:7faabd78-c9ab-4397-aa4d-b8aaff302251 nodeName:}" failed. No retries permitted until 2025-12-09 10:25:28.982664401 +0000 UTC m=+1461.374715482 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data-default" (UniqueName: "kubernetes.io/configmap/7faabd78-c9ab-4397-aa4d-b8aaff302251-config-data-default") pod "openstack-galera-0" (UID: "7faabd78-c9ab-4397-aa4d-b8aaff302251") : configmap "openstack-config-data" not found Dec 09 10:25:24 crc kubenswrapper[5002]: E1209 10:25:24.983054 5002 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 09 10:25:24 crc kubenswrapper[5002]: E1209 10:25:24.983085 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7faabd78-c9ab-4397-aa4d-b8aaff302251-operator-scripts podName:7faabd78-c9ab-4397-aa4d-b8aaff302251 nodeName:}" failed. No retries permitted until 2025-12-09 10:25:28.983075662 +0000 UTC m=+1461.375126743 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/7faabd78-c9ab-4397-aa4d-b8aaff302251-operator-scripts") pod "openstack-galera-0" (UID: "7faabd78-c9ab-4397-aa4d-b8aaff302251") : configmap "openstack-scripts" not found Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.154863 5002 generic.go:334] "Generic (PLEG): container finished" podID="5893e6fa-5b64-47e0-b8e1-f68baf27a65c" containerID="2d1c701bf68c79d11c50d424397a24223cbdbf5471946d3e7f2ac92b66b2c778" exitCode=0 Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.154921 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5893e6fa-5b64-47e0-b8e1-f68baf27a65c","Type":"ContainerDied","Data":"2d1c701bf68c79d11c50d424397a24223cbdbf5471946d3e7f2ac92b66b2c778"} Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.157186 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9278e14e-2524-4e42-b870-f493ea02ede8","Type":"ContainerDied","Data":"4b5da4e1754b31552c6dc2cd68bb6dbdef670bc97ab8fcbe670e88f5f8083613"} Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.157232 5002 scope.go:117] "RemoveContainer" containerID="1faa363b9769f751a8c09fade1d2f2f3b3905666130dc1d039543eef99f84775" Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.157340 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.166167 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"58c08274-46ea-48be-a135-0c1174cd6135","Type":"ContainerDied","Data":"faecc1405337e2a042e11321b6db4645aa07d1e217ad5911586ebd7bfc699e93"} Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.166175 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.166221 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6d565f9c5b-d7trd" Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.177974 5002 generic.go:334] "Generic (PLEG): container finished" podID="f514395b-6067-4e42-98e6-f3c5ac427982" containerID="6037ffe3713fc44574c5f932602a50f9c94b830e5d71df9a677e79d52c3571ba" exitCode=0 Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.178015 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6d565f9c5b-d7trd" event={"ID":"f514395b-6067-4e42-98e6-f3c5ac427982","Type":"ContainerDied","Data":"6037ffe3713fc44574c5f932602a50f9c94b830e5d71df9a677e79d52c3571ba"} Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.216136 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.223018 5002 scope.go:117] "RemoveContainer" containerID="2354f84dc26ea366678ca4f5adfbc4ea21ccc99533838a486cd28b5710f9ea1c" Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.233416 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.249649 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.257091 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.267493 5002 scope.go:117] "RemoveContainer" containerID="b05714ada64dee7eaed39017f863e151b219f928c230aa2f336910df9726668b" Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.286514 5002 scope.go:117] "RemoveContainer" containerID="700088876c2e92d617598571a2ba75be5d5b0ca2bdb88d2688fcecc1a5db9a68" Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.286847 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f514395b-6067-4e42-98e6-f3c5ac427982-public-tls-certs\") pod \"f514395b-6067-4e42-98e6-f3c5ac427982\" (UID: \"f514395b-6067-4e42-98e6-f3c5ac427982\") " Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.286935 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f514395b-6067-4e42-98e6-f3c5ac427982-internal-tls-certs\") pod \"f514395b-6067-4e42-98e6-f3c5ac427982\" (UID: \"f514395b-6067-4e42-98e6-f3c5ac427982\") " Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.286975 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f514395b-6067-4e42-98e6-f3c5ac427982-fernet-keys\") pod \"f514395b-6067-4e42-98e6-f3c5ac427982\" (UID: \"f514395b-6067-4e42-98e6-f3c5ac427982\") " Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.287011 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f514395b-6067-4e42-98e6-f3c5ac427982-credential-keys\") pod \"f514395b-6067-4e42-98e6-f3c5ac427982\" (UID: \"f514395b-6067-4e42-98e6-f3c5ac427982\") " Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.287051 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f514395b-6067-4e42-98e6-f3c5ac427982-combined-ca-bundle\") pod \"f514395b-6067-4e42-98e6-f3c5ac427982\" (UID: 
\"f514395b-6067-4e42-98e6-f3c5ac427982\") " Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.287099 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f514395b-6067-4e42-98e6-f3c5ac427982-scripts\") pod \"f514395b-6067-4e42-98e6-f3c5ac427982\" (UID: \"f514395b-6067-4e42-98e6-f3c5ac427982\") " Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.287120 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7v2fw\" (UniqueName: \"kubernetes.io/projected/f514395b-6067-4e42-98e6-f3c5ac427982-kube-api-access-7v2fw\") pod \"f514395b-6067-4e42-98e6-f3c5ac427982\" (UID: \"f514395b-6067-4e42-98e6-f3c5ac427982\") " Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.287220 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f514395b-6067-4e42-98e6-f3c5ac427982-config-data\") pod \"f514395b-6067-4e42-98e6-f3c5ac427982\" (UID: \"f514395b-6067-4e42-98e6-f3c5ac427982\") " Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.292998 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f514395b-6067-4e42-98e6-f3c5ac427982-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "f514395b-6067-4e42-98e6-f3c5ac427982" (UID: "f514395b-6067-4e42-98e6-f3c5ac427982"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.293299 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f514395b-6067-4e42-98e6-f3c5ac427982-kube-api-access-7v2fw" (OuterVolumeSpecName: "kube-api-access-7v2fw") pod "f514395b-6067-4e42-98e6-f3c5ac427982" (UID: "f514395b-6067-4e42-98e6-f3c5ac427982"). InnerVolumeSpecName "kube-api-access-7v2fw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.294472 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f514395b-6067-4e42-98e6-f3c5ac427982-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f514395b-6067-4e42-98e6-f3c5ac427982" (UID: "f514395b-6067-4e42-98e6-f3c5ac427982"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.295725 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f514395b-6067-4e42-98e6-f3c5ac427982-scripts" (OuterVolumeSpecName: "scripts") pod "f514395b-6067-4e42-98e6-f3c5ac427982" (UID: "f514395b-6067-4e42-98e6-f3c5ac427982"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.313026 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f514395b-6067-4e42-98e6-f3c5ac427982-config-data" (OuterVolumeSpecName: "config-data") pod "f514395b-6067-4e42-98e6-f3c5ac427982" (UID: "f514395b-6067-4e42-98e6-f3c5ac427982"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.315849 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f514395b-6067-4e42-98e6-f3c5ac427982-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f514395b-6067-4e42-98e6-f3c5ac427982" (UID: "f514395b-6067-4e42-98e6-f3c5ac427982"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.329151 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f514395b-6067-4e42-98e6-f3c5ac427982-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f514395b-6067-4e42-98e6-f3c5ac427982" (UID: "f514395b-6067-4e42-98e6-f3c5ac427982"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.337497 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f514395b-6067-4e42-98e6-f3c5ac427982-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f514395b-6067-4e42-98e6-f3c5ac427982" (UID: "f514395b-6067-4e42-98e6-f3c5ac427982"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.389898 5002 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f514395b-6067-4e42-98e6-f3c5ac427982-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.389931 5002 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f514395b-6067-4e42-98e6-f3c5ac427982-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.389945 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f514395b-6067-4e42-98e6-f3c5ac427982-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.389953 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f514395b-6067-4e42-98e6-f3c5ac427982-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.389964 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7v2fw\" (UniqueName: \"kubernetes.io/projected/f514395b-6067-4e42-98e6-f3c5ac427982-kube-api-access-7v2fw\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.389971 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f514395b-6067-4e42-98e6-f3c5ac427982-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.389979 5002 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f514395b-6067-4e42-98e6-f3c5ac427982-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.389989 5002 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f514395b-6067-4e42-98e6-f3c5ac427982-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:25 crc 
kubenswrapper[5002]: I1209 10:25:25.392143 5002 scope.go:117] "RemoveContainer" containerID="6037ffe3713fc44574c5f932602a50f9c94b830e5d71df9a677e79d52c3571ba" Dec 09 10:25:25 crc kubenswrapper[5002]: E1209 10:25:25.698140 5002 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Dec 09 10:25:25 crc kubenswrapper[5002]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2025-12-09T10:25:18Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Dec 09 10:25:25 crc kubenswrapper[5002]: /etc/init.d/functions: line 589: 435 Alarm clock "$@" Dec 09 10:25:25 crc kubenswrapper[5002]: > execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-47b4k" message=< Dec 09 10:25:25 crc kubenswrapper[5002]: Exiting ovn-controller (1) [FAILED] Dec 09 10:25:25 crc kubenswrapper[5002]: Killing ovn-controller (1) [ OK ] Dec 09 10:25:25 crc kubenswrapper[5002]: Killing ovn-controller (1) with SIGKILL [ OK ] Dec 09 10:25:25 crc kubenswrapper[5002]: 2025-12-09T10:25:18Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Dec 09 10:25:25 crc kubenswrapper[5002]: /etc/init.d/functions: line 589: 435 Alarm clock "$@" Dec 09 10:25:25 crc kubenswrapper[5002]: > Dec 09 10:25:25 crc kubenswrapper[5002]: E1209 10:25:25.698186 5002 kuberuntime_container.go:691] "PreStop hook failed" err=< Dec 09 10:25:25 crc kubenswrapper[5002]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2025-12-09T10:25:18Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Dec 09 10:25:25 crc kubenswrapper[5002]: /etc/init.d/functions: line 589: 435 Alarm clock "$@" Dec 09 10:25:25 crc kubenswrapper[5002]: > pod="openstack/ovn-controller-47b4k" podUID="fdaeef31-a8f8-478a-86b0-4d0126eb7f3a" containerName="ovn-controller" containerID="cri-o://ee551edb8c3c440c83f6a20492db39ad0c2a16f4443309c6c4e6687cc8b138cf" Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.698229 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-47b4k" podUID="fdaeef31-a8f8-478a-86b0-4d0126eb7f3a" containerName="ovn-controller" containerID="cri-o://ee551edb8c3c440c83f6a20492db39ad0c2a16f4443309c6c4e6687cc8b138cf" gracePeriod=22 Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.845769 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-8796r"] Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.870963 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-8796r"] Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.891051 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican73fb-account-delete-6zw8z"] Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.904241 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican73fb-account-delete-6zw8z"] Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.936246 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-73fb-account-create-update-49bs8"] Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.955466 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-73fb-account-create-update-49bs8"] Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.978651 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-kl927"] Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.991585 5002 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.995050 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-kl927"] Dec 09 10:25:25 crc kubenswrapper[5002]: I1209 10:25:25.997534 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.002473 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cindere0a9-account-delete-b5zfk"] Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.006686 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7faabd78-c9ab-4397-aa4d-b8aaff302251-config-data-default\") pod \"7faabd78-c9ab-4397-aa4d-b8aaff302251\" (UID: \"7faabd78-c9ab-4397-aa4d-b8aaff302251\") " Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.006745 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cd4lq\" (UniqueName: \"kubernetes.io/projected/5893e6fa-5b64-47e0-b8e1-f68baf27a65c-kube-api-access-cd4lq\") pod \"5893e6fa-5b64-47e0-b8e1-f68baf27a65c\" (UID: \"5893e6fa-5b64-47e0-b8e1-f68baf27a65c\") " Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.006956 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7faabd78-c9ab-4397-aa4d-b8aaff302251-galera-tls-certs\") pod \"7faabd78-c9ab-4397-aa4d-b8aaff302251\" (UID: \"7faabd78-c9ab-4397-aa4d-b8aaff302251\") " Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.006986 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5893e6fa-5b64-47e0-b8e1-f68baf27a65c-sg-core-conf-yaml\") pod \"5893e6fa-5b64-47e0-b8e1-f68baf27a65c\" (UID: \"5893e6fa-5b64-47e0-b8e1-f68baf27a65c\") " Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.007011 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5893e6fa-5b64-47e0-b8e1-f68baf27a65c-scripts\") pod \"5893e6fa-5b64-47e0-b8e1-f68baf27a65c\" (UID: \"5893e6fa-5b64-47e0-b8e1-f68baf27a65c\") " Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.007032 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7faabd78-c9ab-4397-aa4d-b8aaff302251-operator-scripts\") pod \"7faabd78-c9ab-4397-aa4d-b8aaff302251\" (UID: \"7faabd78-c9ab-4397-aa4d-b8aaff302251\") " Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.007054 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7faabd78-c9ab-4397-aa4d-b8aaff302251-config-data-generated\") pod \"7faabd78-c9ab-4397-aa4d-b8aaff302251\" (UID: \"7faabd78-c9ab-4397-aa4d-b8aaff302251\") " Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.007074 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"7faabd78-c9ab-4397-aa4d-b8aaff302251\" (UID: \"7faabd78-c9ab-4397-aa4d-b8aaff302251\") " Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.007120 5002 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-wz4zx\" (UniqueName: \"kubernetes.io/projected/7faabd78-c9ab-4397-aa4d-b8aaff302251-kube-api-access-wz4zx\") pod \"7faabd78-c9ab-4397-aa4d-b8aaff302251\" (UID: \"7faabd78-c9ab-4397-aa4d-b8aaff302251\") " Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.007211 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5893e6fa-5b64-47e0-b8e1-f68baf27a65c-combined-ca-bundle\") pod \"5893e6fa-5b64-47e0-b8e1-f68baf27a65c\" (UID: \"5893e6fa-5b64-47e0-b8e1-f68baf27a65c\") " Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.007240 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7faabd78-c9ab-4397-aa4d-b8aaff302251-kolla-config\") pod \"7faabd78-c9ab-4397-aa4d-b8aaff302251\" (UID: \"7faabd78-c9ab-4397-aa4d-b8aaff302251\") " Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.007253 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5893e6fa-5b64-47e0-b8e1-f68baf27a65c-run-httpd\") pod \"5893e6fa-5b64-47e0-b8e1-f68baf27a65c\" (UID: \"5893e6fa-5b64-47e0-b8e1-f68baf27a65c\") " Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.007305 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5893e6fa-5b64-47e0-b8e1-f68baf27a65c-log-httpd\") pod \"5893e6fa-5b64-47e0-b8e1-f68baf27a65c\" (UID: \"5893e6fa-5b64-47e0-b8e1-f68baf27a65c\") " Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.007323 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7faabd78-c9ab-4397-aa4d-b8aaff302251-combined-ca-bundle\") pod \"7faabd78-c9ab-4397-aa4d-b8aaff302251\" (UID: \"7faabd78-c9ab-4397-aa4d-b8aaff302251\") " Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.007359 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5893e6fa-5b64-47e0-b8e1-f68baf27a65c-config-data\") pod \"5893e6fa-5b64-47e0-b8e1-f68baf27a65c\" (UID: \"5893e6fa-5b64-47e0-b8e1-f68baf27a65c\") " Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.007422 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5893e6fa-5b64-47e0-b8e1-f68baf27a65c-ceilometer-tls-certs\") pod \"5893e6fa-5b64-47e0-b8e1-f68baf27a65c\" (UID: \"5893e6fa-5b64-47e0-b8e1-f68baf27a65c\") " Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.011063 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5893e6fa-5b64-47e0-b8e1-f68baf27a65c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5893e6fa-5b64-47e0-b8e1-f68baf27a65c" (UID: "5893e6fa-5b64-47e0-b8e1-f68baf27a65c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.013782 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5893e6fa-5b64-47e0-b8e1-f68baf27a65c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5893e6fa-5b64-47e0-b8e1-f68baf27a65c" (UID: "5893e6fa-5b64-47e0-b8e1-f68baf27a65c"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.013950 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7faabd78-c9ab-4397-aa4d-b8aaff302251-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "7faabd78-c9ab-4397-aa4d-b8aaff302251" (UID: "7faabd78-c9ab-4397-aa4d-b8aaff302251"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.014538 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7faabd78-c9ab-4397-aa4d-b8aaff302251-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "7faabd78-c9ab-4397-aa4d-b8aaff302251" (UID: "7faabd78-c9ab-4397-aa4d-b8aaff302251"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.015237 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7faabd78-c9ab-4397-aa4d-b8aaff302251-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7faabd78-c9ab-4397-aa4d-b8aaff302251" (UID: "7faabd78-c9ab-4397-aa4d-b8aaff302251"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.015289 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cindere0a9-account-delete-b5zfk"] Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.015870 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7faabd78-c9ab-4397-aa4d-b8aaff302251-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "7faabd78-c9ab-4397-aa4d-b8aaff302251" (UID: "7faabd78-c9ab-4397-aa4d-b8aaff302251"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.023137 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-e0a9-account-create-update-8w48l"] Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.026673 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7faabd78-c9ab-4397-aa4d-b8aaff302251-kube-api-access-wz4zx" (OuterVolumeSpecName: "kube-api-access-wz4zx") pod "7faabd78-c9ab-4397-aa4d-b8aaff302251" (UID: "7faabd78-c9ab-4397-aa4d-b8aaff302251"). InnerVolumeSpecName "kube-api-access-wz4zx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.042684 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5893e6fa-5b64-47e0-b8e1-f68baf27a65c-scripts" (OuterVolumeSpecName: "scripts") pod "5893e6fa-5b64-47e0-b8e1-f68baf27a65c" (UID: "5893e6fa-5b64-47e0-b8e1-f68baf27a65c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.043027 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5893e6fa-5b64-47e0-b8e1-f68baf27a65c-kube-api-access-cd4lq" (OuterVolumeSpecName: "kube-api-access-cd4lq") pod "5893e6fa-5b64-47e0-b8e1-f68baf27a65c" (UID: "5893e6fa-5b64-47e0-b8e1-f68baf27a65c"). InnerVolumeSpecName "kube-api-access-cd4lq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.045955 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "mysql-db") pod "7faabd78-c9ab-4397-aa4d-b8aaff302251" (UID: "7faabd78-c9ab-4397-aa4d-b8aaff302251"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.046272 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-e0a9-account-create-update-8w48l"] Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.047361 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5893e6fa-5b64-47e0-b8e1-f68baf27a65c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5893e6fa-5b64-47e0-b8e1-f68baf27a65c" (UID: "5893e6fa-5b64-47e0-b8e1-f68baf27a65c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.054288 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7faabd78-c9ab-4397-aa4d-b8aaff302251-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7faabd78-c9ab-4397-aa4d-b8aaff302251" (UID: "7faabd78-c9ab-4397-aa4d-b8aaff302251"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.073418 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37720060-6c72-494c-b89f-9525e48f9f8d" path="/var/lib/kubelet/pods/37720060-6c72-494c-b89f-9525e48f9f8d/volumes" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.073993 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43512a9c-be3a-4c0e-a178-82c5a065acf4" path="/var/lib/kubelet/pods/43512a9c-be3a-4c0e-a178-82c5a065acf4/volumes" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.074754 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c94da26-e330-42bc-b73c-3c0134b7924d" path="/var/lib/kubelet/pods/4c94da26-e330-42bc-b73c-3c0134b7924d/volumes" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.075657 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="518e1d88-71f4-4fe3-9ad6-f938249f1ae3" path="/var/lib/kubelet/pods/518e1d88-71f4-4fe3-9ad6-f938249f1ae3/volumes" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.076321 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58c08274-46ea-48be-a135-0c1174cd6135" path="/var/lib/kubelet/pods/58c08274-46ea-48be-a135-0c1174cd6135/volumes" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.076897 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58de676b-7b73-4c04-b5d5-5de38a88072c" path="/var/lib/kubelet/pods/58de676b-7b73-4c04-b5d5-5de38a88072c/volumes" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.077912 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9278e14e-2524-4e42-b870-f493ea02ede8" path="/var/lib/kubelet/pods/9278e14e-2524-4e42-b870-f493ea02ede8/volumes" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.078445 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4061af7-7669-4bd4-a36c-6ec982e86753" path="/var/lib/kubelet/pods/a4061af7-7669-4bd4-a36c-6ec982e86753/volumes" Dec 09 10:25:26 
crc kubenswrapper[5002]: I1209 10:25:26.078949 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a67e154b-1de7-4e2b-9b87-049ea273fa01" path="/var/lib/kubelet/pods/a67e154b-1de7-4e2b-9b87-049ea273fa01/volumes" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.081086 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae6c00ce-3152-42ae-890f-bb76aac103c5" path="/var/lib/kubelet/pods/ae6c00ce-3152-42ae-890f-bb76aac103c5/volumes" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.082002 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c44aced5-6d19-429a-8917-cd4229341433" path="/var/lib/kubelet/pods/c44aced5-6d19-429a-8917-cd4229341433/volumes" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.082446 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f41619d4-24a3-46e4-9cb9-2e388f7cd36b" path="/var/lib/kubelet/pods/f41619d4-24a3-46e4-9cb9-2e388f7cd36b/volumes" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.083373 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe95257d-a02e-4f04-a543-a2db08231043" path="/var/lib/kubelet/pods/fe95257d-a02e-4f04-a543-a2db08231043/volumes" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.084392 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff9b20e8-85f5-4a54-8505-50fc885caa71" path="/var/lib/kubelet/pods/ff9b20e8-85f5-4a54-8505-50fc885caa71/volumes" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.095567 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5893e6fa-5b64-47e0-b8e1-f68baf27a65c-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "5893e6fa-5b64-47e0-b8e1-f68baf27a65c" (UID: "5893e6fa-5b64-47e0-b8e1-f68baf27a65c"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.096948 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-47b4k_fdaeef31-a8f8-478a-86b0-4d0126eb7f3a/ovn-controller/0.log" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.097011 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-47b4k" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.108423 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdaeef31-a8f8-478a-86b0-4d0126eb7f3a-combined-ca-bundle\") pod \"fdaeef31-a8f8-478a-86b0-4d0126eb7f3a\" (UID: \"fdaeef31-a8f8-478a-86b0-4d0126eb7f3a\") " Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.108753 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fdaeef31-a8f8-478a-86b0-4d0126eb7f3a-var-log-ovn\") pod \"fdaeef31-a8f8-478a-86b0-4d0126eb7f3a\" (UID: \"fdaeef31-a8f8-478a-86b0-4d0126eb7f3a\") " Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.108846 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fdaeef31-a8f8-478a-86b0-4d0126eb7f3a-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "fdaeef31-a8f8-478a-86b0-4d0126eb7f3a" (UID: "fdaeef31-a8f8-478a-86b0-4d0126eb7f3a"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.108930 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdaeef31-a8f8-478a-86b0-4d0126eb7f3a-ovn-controller-tls-certs\") pod \"fdaeef31-a8f8-478a-86b0-4d0126eb7f3a\" (UID: \"fdaeef31-a8f8-478a-86b0-4d0126eb7f3a\") " Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.108951 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fdaeef31-a8f8-478a-86b0-4d0126eb7f3a-scripts\") pod \"fdaeef31-a8f8-478a-86b0-4d0126eb7f3a\" (UID: \"fdaeef31-a8f8-478a-86b0-4d0126eb7f3a\") " Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.109069 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7faabd78-c9ab-4397-aa4d-b8aaff302251-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "7faabd78-c9ab-4397-aa4d-b8aaff302251" (UID: "7faabd78-c9ab-4397-aa4d-b8aaff302251"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.110323 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdaeef31-a8f8-478a-86b0-4d0126eb7f3a-scripts" (OuterVolumeSpecName: "scripts") pod "fdaeef31-a8f8-478a-86b0-4d0126eb7f3a" (UID: "fdaeef31-a8f8-478a-86b0-4d0126eb7f3a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.114196 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tf86\" (UniqueName: \"kubernetes.io/projected/fdaeef31-a8f8-478a-86b0-4d0126eb7f3a-kube-api-access-2tf86\") pod \"fdaeef31-a8f8-478a-86b0-4d0126eb7f3a\" (UID: \"fdaeef31-a8f8-478a-86b0-4d0126eb7f3a\") " Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.114235 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fdaeef31-a8f8-478a-86b0-4d0126eb7f3a-var-run-ovn\") pod \"fdaeef31-a8f8-478a-86b0-4d0126eb7f3a\" (UID: \"fdaeef31-a8f8-478a-86b0-4d0126eb7f3a\") " Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.114251 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fdaeef31-a8f8-478a-86b0-4d0126eb7f3a-var-run\") pod \"fdaeef31-a8f8-478a-86b0-4d0126eb7f3a\" (UID: \"fdaeef31-a8f8-478a-86b0-4d0126eb7f3a\") " Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.114617 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fdaeef31-a8f8-478a-86b0-4d0126eb7f3a-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "fdaeef31-a8f8-478a-86b0-4d0126eb7f3a" (UID: "fdaeef31-a8f8-478a-86b0-4d0126eb7f3a"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.114680 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fdaeef31-a8f8-478a-86b0-4d0126eb7f3a-var-run" (OuterVolumeSpecName: "var-run") pod "fdaeef31-a8f8-478a-86b0-4d0126eb7f3a" (UID: "fdaeef31-a8f8-478a-86b0-4d0126eb7f3a"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.117413 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdaeef31-a8f8-478a-86b0-4d0126eb7f3a-kube-api-access-2tf86" (OuterVolumeSpecName: "kube-api-access-2tf86") pod "fdaeef31-a8f8-478a-86b0-4d0126eb7f3a" (UID: "fdaeef31-a8f8-478a-86b0-4d0126eb7f3a"). InnerVolumeSpecName "kube-api-access-2tf86". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.118156 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5893e6fa-5b64-47e0-b8e1-f68baf27a65c-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.118216 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7faabd78-c9ab-4397-aa4d-b8aaff302251-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.118228 5002 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7faabd78-c9ab-4397-aa4d-b8aaff302251-config-data-generated\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.118237 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fdaeef31-a8f8-478a-86b0-4d0126eb7f3a-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.118258 5002 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.118299 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wz4zx\" (UniqueName: \"kubernetes.io/projected/7faabd78-c9ab-4397-aa4d-b8aaff302251-kube-api-access-wz4zx\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.118309 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tf86\" (UniqueName: \"kubernetes.io/projected/fdaeef31-a8f8-478a-86b0-4d0126eb7f3a-kube-api-access-2tf86\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.118317 5002 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fdaeef31-a8f8-478a-86b0-4d0126eb7f3a-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.118326 5002 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fdaeef31-a8f8-478a-86b0-4d0126eb7f3a-var-run\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.118335 5002 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7faabd78-c9ab-4397-aa4d-b8aaff302251-kolla-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.118371 5002 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5893e6fa-5b64-47e0-b8e1-f68baf27a65c-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.118381 5002 reconciler_common.go:293] "Volume detached for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5893e6fa-5b64-47e0-b8e1-f68baf27a65c-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.118389 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7faabd78-c9ab-4397-aa4d-b8aaff302251-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.118398 5002 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5893e6fa-5b64-47e0-b8e1-f68baf27a65c-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.118406 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cd4lq\" (UniqueName: \"kubernetes.io/projected/5893e6fa-5b64-47e0-b8e1-f68baf27a65c-kube-api-access-cd4lq\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.118414 5002 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7faabd78-c9ab-4397-aa4d-b8aaff302251-config-data-default\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.118451 5002 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fdaeef31-a8f8-478a-86b0-4d0126eb7f3a-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.118461 5002 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7faabd78-c9ab-4397-aa4d-b8aaff302251-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.118470 5002 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5893e6fa-5b64-47e0-b8e1-f68baf27a65c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.133103 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5893e6fa-5b64-47e0-b8e1-f68baf27a65c-config-data" (OuterVolumeSpecName: "config-data") pod "5893e6fa-5b64-47e0-b8e1-f68baf27a65c" (UID: "5893e6fa-5b64-47e0-b8e1-f68baf27a65c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.134666 5002 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.143623 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5893e6fa-5b64-47e0-b8e1-f68baf27a65c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5893e6fa-5b64-47e0-b8e1-f68baf27a65c" (UID: "5893e6fa-5b64-47e0-b8e1-f68baf27a65c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.144084 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdaeef31-a8f8-478a-86b0-4d0126eb7f3a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fdaeef31-a8f8-478a-86b0-4d0126eb7f3a" (UID: "fdaeef31-a8f8-478a-86b0-4d0126eb7f3a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.180653 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdaeef31-a8f8-478a-86b0-4d0126eb7f3a-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "fdaeef31-a8f8-478a-86b0-4d0126eb7f3a" (UID: "fdaeef31-a8f8-478a-86b0-4d0126eb7f3a"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.189001 5002 generic.go:334] "Generic (PLEG): container finished" podID="7faabd78-c9ab-4397-aa4d-b8aaff302251" containerID="6bf1ad080014bf0300fd0f26c245a94c77d984fb37e7f168c414b8d47ddbdbd5" exitCode=0 Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.189052 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"7faabd78-c9ab-4397-aa4d-b8aaff302251","Type":"ContainerDied","Data":"6bf1ad080014bf0300fd0f26c245a94c77d984fb37e7f168c414b8d47ddbdbd5"} Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.189076 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"7faabd78-c9ab-4397-aa4d-b8aaff302251","Type":"ContainerDied","Data":"38111b2f8e861bf57c5f0e322034fbc75eba84e214cb51fbd1f99f4f884df28a"} Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.189092 5002 scope.go:117] "RemoveContainer" containerID="6bf1ad080014bf0300fd0f26c245a94c77d984fb37e7f168c414b8d47ddbdbd5" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.189199 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.199154 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5893e6fa-5b64-47e0-b8e1-f68baf27a65c","Type":"ContainerDied","Data":"f8771d85afadccb5dd59c206f45b23474d80ca51d4c15a745a5ccc0e50b4d6c6"} Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.199163 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.203799 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-47b4k_fdaeef31-a8f8-478a-86b0-4d0126eb7f3a/ovn-controller/0.log" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.203886 5002 generic.go:334] "Generic (PLEG): container finished" podID="fdaeef31-a8f8-478a-86b0-4d0126eb7f3a" containerID="ee551edb8c3c440c83f6a20492db39ad0c2a16f4443309c6c4e6687cc8b138cf" exitCode=137 Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.203937 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-47b4k" event={"ID":"fdaeef31-a8f8-478a-86b0-4d0126eb7f3a","Type":"ContainerDied","Data":"ee551edb8c3c440c83f6a20492db39ad0c2a16f4443309c6c4e6687cc8b138cf"} Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.203956 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-47b4k" event={"ID":"fdaeef31-a8f8-478a-86b0-4d0126eb7f3a","Type":"ContainerDied","Data":"6c3e11365d63634272c07c286c98d3e312d97e5f9e6503b754df302cc4b9bb4c"} Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.203997 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-47b4k" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.219670 5002 scope.go:117] "RemoveContainer" containerID="a8f939cbf0cbdd995c3c995a86e97bc03e580217c614bac537583c6bbc3bbf65" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.220059 5002 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdaeef31-a8f8-478a-86b0-4d0126eb7f3a-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.220090 5002 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.220103 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5893e6fa-5b64-47e0-b8e1-f68baf27a65c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.220115 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5893e6fa-5b64-47e0-b8e1-f68baf27a65c-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.220126 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdaeef31-a8f8-478a-86b0-4d0126eb7f3a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.221578 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6d565f9c5b-d7trd" event={"ID":"f514395b-6067-4e42-98e6-f3c5ac427982","Type":"ContainerDied","Data":"4db14adbffe643510f2f241ba1a2554a6f2f31368c7be043902223832098dcb1"} Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.221675 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6d565f9c5b-d7trd" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.255222 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.275299 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.279706 5002 scope.go:117] "RemoveContainer" containerID="6bf1ad080014bf0300fd0f26c245a94c77d984fb37e7f168c414b8d47ddbdbd5" Dec 09 10:25:26 crc kubenswrapper[5002]: E1209 10:25:26.280369 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bf1ad080014bf0300fd0f26c245a94c77d984fb37e7f168c414b8d47ddbdbd5\": container with ID starting with 6bf1ad080014bf0300fd0f26c245a94c77d984fb37e7f168c414b8d47ddbdbd5 not found: ID does not exist" containerID="6bf1ad080014bf0300fd0f26c245a94c77d984fb37e7f168c414b8d47ddbdbd5" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.280522 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bf1ad080014bf0300fd0f26c245a94c77d984fb37e7f168c414b8d47ddbdbd5"} err="failed to get container status \"6bf1ad080014bf0300fd0f26c245a94c77d984fb37e7f168c414b8d47ddbdbd5\": rpc error: code = NotFound desc = could not find container \"6bf1ad080014bf0300fd0f26c245a94c77d984fb37e7f168c414b8d47ddbdbd5\": container with ID starting with 6bf1ad080014bf0300fd0f26c245a94c77d984fb37e7f168c414b8d47ddbdbd5 not found: ID does not exist" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.280548 5002 scope.go:117] "RemoveContainer" containerID="a8f939cbf0cbdd995c3c995a86e97bc03e580217c614bac537583c6bbc3bbf65" Dec 09 10:25:26 crc kubenswrapper[5002]: E1209 10:25:26.281168 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8f939cbf0cbdd995c3c995a86e97bc03e580217c614bac537583c6bbc3bbf65\": container with ID starting with a8f939cbf0cbdd995c3c995a86e97bc03e580217c614bac537583c6bbc3bbf65 not found: ID does not exist" containerID="a8f939cbf0cbdd995c3c995a86e97bc03e580217c614bac537583c6bbc3bbf65" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.281230 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8f939cbf0cbdd995c3c995a86e97bc03e580217c614bac537583c6bbc3bbf65"} err="failed to get container status \"a8f939cbf0cbdd995c3c995a86e97bc03e580217c614bac537583c6bbc3bbf65\": rpc error: code = NotFound desc = could not find container \"a8f939cbf0cbdd995c3c995a86e97bc03e580217c614bac537583c6bbc3bbf65\": container with ID starting with a8f939cbf0cbdd995c3c995a86e97bc03e580217c614bac537583c6bbc3bbf65 not found: ID does not exist" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.281246 5002 scope.go:117] "RemoveContainer" containerID="7bf0075652dce88cf4c715938171cbd87da9f63720497ca4a0f0e4c414c5e29f" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.283516 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.289777 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.310352 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-6d565f9c5b-d7trd"] Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.317444 5002 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-6d565f9c5b-d7trd"] Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.324163 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-47b4k"] Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.328761 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-47b4k"] Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.358334 5002 scope.go:117] "RemoveContainer" containerID="6b859d8bb2242febe0b904062dc96fd2d5ce25ae8ec8c45d40b3a0d97cab32ab" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.373342 5002 scope.go:117] "RemoveContainer" containerID="2d1c701bf68c79d11c50d424397a24223cbdbf5471946d3e7f2ac92b66b2c778" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.390453 5002 scope.go:117] "RemoveContainer" containerID="7a033c871bfeb20ac9488a0b5376331a6b527d79d3198b2f244e621f3b53eb12" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.406468 5002 scope.go:117] "RemoveContainer" containerID="ee551edb8c3c440c83f6a20492db39ad0c2a16f4443309c6c4e6687cc8b138cf" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.424385 5002 scope.go:117] "RemoveContainer" containerID="ee551edb8c3c440c83f6a20492db39ad0c2a16f4443309c6c4e6687cc8b138cf" Dec 09 10:25:26 crc kubenswrapper[5002]: E1209 10:25:26.424921 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee551edb8c3c440c83f6a20492db39ad0c2a16f4443309c6c4e6687cc8b138cf\": container with ID starting with ee551edb8c3c440c83f6a20492db39ad0c2a16f4443309c6c4e6687cc8b138cf not found: ID does not exist" containerID="ee551edb8c3c440c83f6a20492db39ad0c2a16f4443309c6c4e6687cc8b138cf" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.424952 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee551edb8c3c440c83f6a20492db39ad0c2a16f4443309c6c4e6687cc8b138cf"} err="failed to get container status \"ee551edb8c3c440c83f6a20492db39ad0c2a16f4443309c6c4e6687cc8b138cf\": rpc error: code = NotFound desc = could not find container \"ee551edb8c3c440c83f6a20492db39ad0c2a16f4443309c6c4e6687cc8b138cf\": container with ID starting with ee551edb8c3c440c83f6a20492db39ad0c2a16f4443309c6c4e6687cc8b138cf not found: ID does not exist" Dec 09 10:25:26 crc kubenswrapper[5002]: I1209 10:25:26.511401 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="02c94bee-a522-4ea6-85af-1ba68e174203" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.170:8776/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 09 10:25:27 crc kubenswrapper[5002]: I1209 10:25:27.334575 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="cd8a7609-928f-4a68-9903-fa846e4baeda" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 09 10:25:27 crc kubenswrapper[5002]: I1209 10:25:27.334645 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="cd8a7609-928f-4a68-9903-fa846e4baeda" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded 
while awaiting headers)" Dec 09 10:25:28 crc kubenswrapper[5002]: I1209 10:25:28.071332 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5893e6fa-5b64-47e0-b8e1-f68baf27a65c" path="/var/lib/kubelet/pods/5893e6fa-5b64-47e0-b8e1-f68baf27a65c/volumes" Dec 09 10:25:28 crc kubenswrapper[5002]: I1209 10:25:28.072834 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7faabd78-c9ab-4397-aa4d-b8aaff302251" path="/var/lib/kubelet/pods/7faabd78-c9ab-4397-aa4d-b8aaff302251/volumes" Dec 09 10:25:28 crc kubenswrapper[5002]: I1209 10:25:28.074698 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f514395b-6067-4e42-98e6-f3c5ac427982" path="/var/lib/kubelet/pods/f514395b-6067-4e42-98e6-f3c5ac427982/volumes" Dec 09 10:25:28 crc kubenswrapper[5002]: I1209 10:25:28.075605 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdaeef31-a8f8-478a-86b0-4d0126eb7f3a" path="/var/lib/kubelet/pods/fdaeef31-a8f8-478a-86b0-4d0126eb7f3a/volumes" Dec 09 10:25:28 crc kubenswrapper[5002]: E1209 10:25:28.792596 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c2d2c6137f09a0249a92e685f777a88211e7a367963f717a3fe59c6940a69a5 is running failed: container process not found" containerID="5c2d2c6137f09a0249a92e685f777a88211e7a367963f717a3fe59c6940a69a5" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 10:25:28 crc kubenswrapper[5002]: E1209 10:25:28.793664 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c2d2c6137f09a0249a92e685f777a88211e7a367963f717a3fe59c6940a69a5 is running failed: container process not found" containerID="5c2d2c6137f09a0249a92e685f777a88211e7a367963f717a3fe59c6940a69a5" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 10:25:28 crc kubenswrapper[5002]: E1209 10:25:28.793842 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3f18346c6d45cdce8933113ee6ff0f64d79183a978ac856ba561f2eb32009782" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 10:25:28 crc kubenswrapper[5002]: E1209 10:25:28.794497 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c2d2c6137f09a0249a92e685f777a88211e7a367963f717a3fe59c6940a69a5 is running failed: container process not found" containerID="5c2d2c6137f09a0249a92e685f777a88211e7a367963f717a3fe59c6940a69a5" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 10:25:28 crc kubenswrapper[5002]: E1209 10:25:28.794558 5002 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c2d2c6137f09a0249a92e685f777a88211e7a367963f717a3fe59c6940a69a5 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-g4kc8" podUID="26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6" containerName="ovsdb-server" Dec 09 10:25:28 crc kubenswrapper[5002]: E1209 10:25:28.795797 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit 
code -1" containerID="3f18346c6d45cdce8933113ee6ff0f64d79183a978ac856ba561f2eb32009782" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 10:25:28 crc kubenswrapper[5002]: E1209 10:25:28.797931 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3f18346c6d45cdce8933113ee6ff0f64d79183a978ac856ba561f2eb32009782" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 10:25:28 crc kubenswrapper[5002]: E1209 10:25:28.797992 5002 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-g4kc8" podUID="26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6" containerName="ovs-vswitchd" Dec 09 10:25:33 crc kubenswrapper[5002]: E1209 10:25:33.791353 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c2d2c6137f09a0249a92e685f777a88211e7a367963f717a3fe59c6940a69a5 is running failed: container process not found" containerID="5c2d2c6137f09a0249a92e685f777a88211e7a367963f717a3fe59c6940a69a5" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 10:25:33 crc kubenswrapper[5002]: E1209 10:25:33.792687 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c2d2c6137f09a0249a92e685f777a88211e7a367963f717a3fe59c6940a69a5 is running failed: container process not found" containerID="5c2d2c6137f09a0249a92e685f777a88211e7a367963f717a3fe59c6940a69a5" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 10:25:33 crc kubenswrapper[5002]: E1209 10:25:33.793008 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3f18346c6d45cdce8933113ee6ff0f64d79183a978ac856ba561f2eb32009782" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 10:25:33 crc kubenswrapper[5002]: E1209 10:25:33.793155 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c2d2c6137f09a0249a92e685f777a88211e7a367963f717a3fe59c6940a69a5 is running failed: container process not found" containerID="5c2d2c6137f09a0249a92e685f777a88211e7a367963f717a3fe59c6940a69a5" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 10:25:33 crc kubenswrapper[5002]: E1209 10:25:33.793220 5002 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c2d2c6137f09a0249a92e685f777a88211e7a367963f717a3fe59c6940a69a5 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-g4kc8" podUID="26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6" containerName="ovsdb-server" Dec 09 10:25:33 crc kubenswrapper[5002]: E1209 10:25:33.794644 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3f18346c6d45cdce8933113ee6ff0f64d79183a978ac856ba561f2eb32009782" 
cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 10:25:33 crc kubenswrapper[5002]: E1209 10:25:33.796351 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3f18346c6d45cdce8933113ee6ff0f64d79183a978ac856ba561f2eb32009782" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 10:25:33 crc kubenswrapper[5002]: E1209 10:25:33.796384 5002 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-g4kc8" podUID="26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6" containerName="ovs-vswitchd" Dec 09 10:25:37 crc kubenswrapper[5002]: I1209 10:25:37.966237 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 10:25:37 crc kubenswrapper[5002]: I1209 10:25:37.966312 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 10:25:37 crc kubenswrapper[5002]: I1209 10:25:37.966367 5002 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" Dec 09 10:25:37 crc kubenswrapper[5002]: I1209 10:25:37.967088 5002 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8882b3e4cc037c99de652a814b6e830546393f19945b2204e6e01c0052e460f5"} pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 10:25:37 crc kubenswrapper[5002]: I1209 10:25:37.967152 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" containerID="cri-o://8882b3e4cc037c99de652a814b6e830546393f19945b2204e6e01c0052e460f5" gracePeriod=600 Dec 09 10:25:38 crc kubenswrapper[5002]: I1209 10:25:38.352857 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerDied","Data":"8882b3e4cc037c99de652a814b6e830546393f19945b2204e6e01c0052e460f5"} Dec 09 10:25:38 crc kubenswrapper[5002]: I1209 10:25:38.352872 5002 generic.go:334] "Generic (PLEG): container finished" podID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerID="8882b3e4cc037c99de652a814b6e830546393f19945b2204e6e01c0052e460f5" exitCode=0 Dec 09 10:25:38 crc kubenswrapper[5002]: I1209 10:25:38.353707 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerStarted","Data":"b603401504c120d86f4291f9eecacafd874b783a1acad3cc5e6c3c01f22fd43e"} 
Dec 09 10:25:38 crc kubenswrapper[5002]: I1209 10:25:38.353653 5002 scope.go:117] "RemoveContainer" containerID="3884e46cf25151268d65649fde8e75f33e599a76a13b5c73816d374f2399025a" Dec 09 10:25:38 crc kubenswrapper[5002]: E1209 10:25:38.792001 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c2d2c6137f09a0249a92e685f777a88211e7a367963f717a3fe59c6940a69a5 is running failed: container process not found" containerID="5c2d2c6137f09a0249a92e685f777a88211e7a367963f717a3fe59c6940a69a5" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 10:25:38 crc kubenswrapper[5002]: E1209 10:25:38.792739 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c2d2c6137f09a0249a92e685f777a88211e7a367963f717a3fe59c6940a69a5 is running failed: container process not found" containerID="5c2d2c6137f09a0249a92e685f777a88211e7a367963f717a3fe59c6940a69a5" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 10:25:38 crc kubenswrapper[5002]: E1209 10:25:38.793532 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3f18346c6d45cdce8933113ee6ff0f64d79183a978ac856ba561f2eb32009782" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 10:25:38 crc kubenswrapper[5002]: E1209 10:25:38.793548 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c2d2c6137f09a0249a92e685f777a88211e7a367963f717a3fe59c6940a69a5 is running failed: container process not found" containerID="5c2d2c6137f09a0249a92e685f777a88211e7a367963f717a3fe59c6940a69a5" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 10:25:38 crc kubenswrapper[5002]: E1209 10:25:38.793609 5002 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c2d2c6137f09a0249a92e685f777a88211e7a367963f717a3fe59c6940a69a5 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-g4kc8" podUID="26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6" containerName="ovsdb-server" Dec 09 10:25:38 crc kubenswrapper[5002]: E1209 10:25:38.796419 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3f18346c6d45cdce8933113ee6ff0f64d79183a978ac856ba561f2eb32009782" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 10:25:38 crc kubenswrapper[5002]: E1209 10:25:38.798061 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3f18346c6d45cdce8933113ee6ff0f64d79183a978ac856ba561f2eb32009782" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 10:25:38 crc kubenswrapper[5002]: E1209 10:25:38.798225 5002 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" 
pod="openstack/ovn-controller-ovs-g4kc8" podUID="26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6" containerName="ovs-vswitchd" Dec 09 10:25:39 crc kubenswrapper[5002]: I1209 10:25:39.897267 5002 generic.go:334] "Generic (PLEG): container finished" podID="41f46a2d-f158-497f-b61b-60f39c64149b" containerID="a034c8a435eb9196808c0d7f0f523d44f798925554bc10f46ac57bff50643ec3" exitCode=0 Dec 09 10:25:39 crc kubenswrapper[5002]: I1209 10:25:39.897753 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-857f77df5c-skx8f" event={"ID":"41f46a2d-f158-497f-b61b-60f39c64149b","Type":"ContainerDied","Data":"a034c8a435eb9196808c0d7f0f523d44f798925554bc10f46ac57bff50643ec3"} Dec 09 10:25:40 crc kubenswrapper[5002]: I1209 10:25:40.130765 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-857f77df5c-skx8f" Dec 09 10:25:40 crc kubenswrapper[5002]: I1209 10:25:40.175282 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/41f46a2d-f158-497f-b61b-60f39c64149b-ovndb-tls-certs\") pod \"41f46a2d-f158-497f-b61b-60f39c64149b\" (UID: \"41f46a2d-f158-497f-b61b-60f39c64149b\") " Dec 09 10:25:40 crc kubenswrapper[5002]: I1209 10:25:40.175324 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/41f46a2d-f158-497f-b61b-60f39c64149b-internal-tls-certs\") pod \"41f46a2d-f158-497f-b61b-60f39c64149b\" (UID: \"41f46a2d-f158-497f-b61b-60f39c64149b\") " Dec 09 10:25:40 crc kubenswrapper[5002]: I1209 10:25:40.175360 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/41f46a2d-f158-497f-b61b-60f39c64149b-config\") pod \"41f46a2d-f158-497f-b61b-60f39c64149b\" (UID: \"41f46a2d-f158-497f-b61b-60f39c64149b\") " Dec 09 10:25:40 crc kubenswrapper[5002]: I1209 10:25:40.175404 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/41f46a2d-f158-497f-b61b-60f39c64149b-httpd-config\") pod \"41f46a2d-f158-497f-b61b-60f39c64149b\" (UID: \"41f46a2d-f158-497f-b61b-60f39c64149b\") " Dec 09 10:25:40 crc kubenswrapper[5002]: I1209 10:25:40.175461 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41f46a2d-f158-497f-b61b-60f39c64149b-combined-ca-bundle\") pod \"41f46a2d-f158-497f-b61b-60f39c64149b\" (UID: \"41f46a2d-f158-497f-b61b-60f39c64149b\") " Dec 09 10:25:40 crc kubenswrapper[5002]: I1209 10:25:40.175506 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/41f46a2d-f158-497f-b61b-60f39c64149b-public-tls-certs\") pod \"41f46a2d-f158-497f-b61b-60f39c64149b\" (UID: \"41f46a2d-f158-497f-b61b-60f39c64149b\") " Dec 09 10:25:40 crc kubenswrapper[5002]: I1209 10:25:40.175526 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpvgz\" (UniqueName: \"kubernetes.io/projected/41f46a2d-f158-497f-b61b-60f39c64149b-kube-api-access-dpvgz\") pod \"41f46a2d-f158-497f-b61b-60f39c64149b\" (UID: \"41f46a2d-f158-497f-b61b-60f39c64149b\") " Dec 09 10:25:40 crc kubenswrapper[5002]: I1209 10:25:40.181424 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/41f46a2d-f158-497f-b61b-60f39c64149b-kube-api-access-dpvgz" (OuterVolumeSpecName: "kube-api-access-dpvgz") pod "41f46a2d-f158-497f-b61b-60f39c64149b" (UID: "41f46a2d-f158-497f-b61b-60f39c64149b"). InnerVolumeSpecName "kube-api-access-dpvgz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:25:40 crc kubenswrapper[5002]: I1209 10:25:40.197508 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41f46a2d-f158-497f-b61b-60f39c64149b-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "41f46a2d-f158-497f-b61b-60f39c64149b" (UID: "41f46a2d-f158-497f-b61b-60f39c64149b"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:40 crc kubenswrapper[5002]: I1209 10:25:40.222221 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41f46a2d-f158-497f-b61b-60f39c64149b-config" (OuterVolumeSpecName: "config") pod "41f46a2d-f158-497f-b61b-60f39c64149b" (UID: "41f46a2d-f158-497f-b61b-60f39c64149b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:40 crc kubenswrapper[5002]: I1209 10:25:40.224110 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41f46a2d-f158-497f-b61b-60f39c64149b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "41f46a2d-f158-497f-b61b-60f39c64149b" (UID: "41f46a2d-f158-497f-b61b-60f39c64149b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:40 crc kubenswrapper[5002]: I1209 10:25:40.225681 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41f46a2d-f158-497f-b61b-60f39c64149b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "41f46a2d-f158-497f-b61b-60f39c64149b" (UID: "41f46a2d-f158-497f-b61b-60f39c64149b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:40 crc kubenswrapper[5002]: I1209 10:25:40.231761 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41f46a2d-f158-497f-b61b-60f39c64149b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "41f46a2d-f158-497f-b61b-60f39c64149b" (UID: "41f46a2d-f158-497f-b61b-60f39c64149b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:40 crc kubenswrapper[5002]: I1209 10:25:40.247331 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41f46a2d-f158-497f-b61b-60f39c64149b-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "41f46a2d-f158-497f-b61b-60f39c64149b" (UID: "41f46a2d-f158-497f-b61b-60f39c64149b"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:25:40 crc kubenswrapper[5002]: I1209 10:25:40.277991 5002 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/41f46a2d-f158-497f-b61b-60f39c64149b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:40 crc kubenswrapper[5002]: I1209 10:25:40.278638 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpvgz\" (UniqueName: \"kubernetes.io/projected/41f46a2d-f158-497f-b61b-60f39c64149b-kube-api-access-dpvgz\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:40 crc kubenswrapper[5002]: I1209 10:25:40.278666 5002 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/41f46a2d-f158-497f-b61b-60f39c64149b-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:40 crc kubenswrapper[5002]: I1209 10:25:40.278684 5002 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/41f46a2d-f158-497f-b61b-60f39c64149b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:40 crc kubenswrapper[5002]: I1209 10:25:40.278702 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/41f46a2d-f158-497f-b61b-60f39c64149b-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:40 crc kubenswrapper[5002]: I1209 10:25:40.278718 5002 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/41f46a2d-f158-497f-b61b-60f39c64149b-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:40 crc kubenswrapper[5002]: I1209 10:25:40.278737 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41f46a2d-f158-497f-b61b-60f39c64149b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:40 crc kubenswrapper[5002]: I1209 10:25:40.912606 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-857f77df5c-skx8f" event={"ID":"41f46a2d-f158-497f-b61b-60f39c64149b","Type":"ContainerDied","Data":"f0ade6f71ccd72abf043b302815419aaf0c2592c32a3cad155f97678f1d96565"} Dec 09 10:25:40 crc kubenswrapper[5002]: I1209 10:25:40.912673 5002 scope.go:117] "RemoveContainer" containerID="a9f9119ac359c6cfc08e9e4ec057f2b160949a68cbe68fb1fa38fc53c004e69a" Dec 09 10:25:40 crc kubenswrapper[5002]: I1209 10:25:40.912709 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-857f77df5c-skx8f" Dec 09 10:25:40 crc kubenswrapper[5002]: I1209 10:25:40.946410 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-857f77df5c-skx8f"] Dec 09 10:25:40 crc kubenswrapper[5002]: I1209 10:25:40.953293 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-857f77df5c-skx8f"] Dec 09 10:25:40 crc kubenswrapper[5002]: I1209 10:25:40.955077 5002 scope.go:117] "RemoveContainer" containerID="a034c8a435eb9196808c0d7f0f523d44f798925554bc10f46ac57bff50643ec3" Dec 09 10:25:42 crc kubenswrapper[5002]: I1209 10:25:42.069281 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41f46a2d-f158-497f-b61b-60f39c64149b" path="/var/lib/kubelet/pods/41f46a2d-f158-497f-b61b-60f39c64149b/volumes" Dec 09 10:25:43 crc kubenswrapper[5002]: E1209 10:25:43.791774 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c2d2c6137f09a0249a92e685f777a88211e7a367963f717a3fe59c6940a69a5 is running failed: container process not found" containerID="5c2d2c6137f09a0249a92e685f777a88211e7a367963f717a3fe59c6940a69a5" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 10:25:43 crc kubenswrapper[5002]: E1209 10:25:43.792803 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c2d2c6137f09a0249a92e685f777a88211e7a367963f717a3fe59c6940a69a5 is running failed: container process not found" containerID="5c2d2c6137f09a0249a92e685f777a88211e7a367963f717a3fe59c6940a69a5" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 10:25:43 crc kubenswrapper[5002]: E1209 10:25:43.793389 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3f18346c6d45cdce8933113ee6ff0f64d79183a978ac856ba561f2eb32009782" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 10:25:43 crc kubenswrapper[5002]: E1209 10:25:43.793442 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c2d2c6137f09a0249a92e685f777a88211e7a367963f717a3fe59c6940a69a5 is running failed: container process not found" containerID="5c2d2c6137f09a0249a92e685f777a88211e7a367963f717a3fe59c6940a69a5" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 10:25:43 crc kubenswrapper[5002]: E1209 10:25:43.793483 5002 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c2d2c6137f09a0249a92e685f777a88211e7a367963f717a3fe59c6940a69a5 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-g4kc8" podUID="26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6" containerName="ovsdb-server" Dec 09 10:25:43 crc kubenswrapper[5002]: E1209 10:25:43.795085 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3f18346c6d45cdce8933113ee6ff0f64d79183a978ac856ba561f2eb32009782" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 10:25:43 crc kubenswrapper[5002]: E1209 
10:25:43.797160 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3f18346c6d45cdce8933113ee6ff0f64d79183a978ac856ba561f2eb32009782" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 10:25:43 crc kubenswrapper[5002]: E1209 10:25:43.797199 5002 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-g4kc8" podUID="26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6" containerName="ovs-vswitchd" Dec 09 10:25:47 crc kubenswrapper[5002]: I1209 10:25:47.734717 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 09 10:25:47 crc kubenswrapper[5002]: I1209 10:25:47.788766 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/dfa166a7-dec2-453d-9cd9-f77d30f1636a-cache\") pod \"dfa166a7-dec2-453d-9cd9-f77d30f1636a\" (UID: \"dfa166a7-dec2-453d-9cd9-f77d30f1636a\") " Dec 09 10:25:47 crc kubenswrapper[5002]: I1209 10:25:47.788903 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/dfa166a7-dec2-453d-9cd9-f77d30f1636a-lock\") pod \"dfa166a7-dec2-453d-9cd9-f77d30f1636a\" (UID: \"dfa166a7-dec2-453d-9cd9-f77d30f1636a\") " Dec 09 10:25:47 crc kubenswrapper[5002]: I1209 10:25:47.789011 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"dfa166a7-dec2-453d-9cd9-f77d30f1636a\" (UID: \"dfa166a7-dec2-453d-9cd9-f77d30f1636a\") " Dec 09 10:25:47 crc kubenswrapper[5002]: I1209 10:25:47.789048 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwnbd\" (UniqueName: \"kubernetes.io/projected/dfa166a7-dec2-453d-9cd9-f77d30f1636a-kube-api-access-wwnbd\") pod \"dfa166a7-dec2-453d-9cd9-f77d30f1636a\" (UID: \"dfa166a7-dec2-453d-9cd9-f77d30f1636a\") " Dec 09 10:25:47 crc kubenswrapper[5002]: I1209 10:25:47.789080 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/dfa166a7-dec2-453d-9cd9-f77d30f1636a-etc-swift\") pod \"dfa166a7-dec2-453d-9cd9-f77d30f1636a\" (UID: \"dfa166a7-dec2-453d-9cd9-f77d30f1636a\") " Dec 09 10:25:47 crc kubenswrapper[5002]: I1209 10:25:47.789516 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfa166a7-dec2-453d-9cd9-f77d30f1636a-lock" (OuterVolumeSpecName: "lock") pod "dfa166a7-dec2-453d-9cd9-f77d30f1636a" (UID: "dfa166a7-dec2-453d-9cd9-f77d30f1636a"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:25:47 crc kubenswrapper[5002]: I1209 10:25:47.789676 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfa166a7-dec2-453d-9cd9-f77d30f1636a-cache" (OuterVolumeSpecName: "cache") pod "dfa166a7-dec2-453d-9cd9-f77d30f1636a" (UID: "dfa166a7-dec2-453d-9cd9-f77d30f1636a"). InnerVolumeSpecName "cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:25:47 crc kubenswrapper[5002]: I1209 10:25:47.795798 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfa166a7-dec2-453d-9cd9-f77d30f1636a-kube-api-access-wwnbd" (OuterVolumeSpecName: "kube-api-access-wwnbd") pod "dfa166a7-dec2-453d-9cd9-f77d30f1636a" (UID: "dfa166a7-dec2-453d-9cd9-f77d30f1636a"). InnerVolumeSpecName "kube-api-access-wwnbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:25:47 crc kubenswrapper[5002]: I1209 10:25:47.796928 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfa166a7-dec2-453d-9cd9-f77d30f1636a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "dfa166a7-dec2-453d-9cd9-f77d30f1636a" (UID: "dfa166a7-dec2-453d-9cd9-f77d30f1636a"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:25:47 crc kubenswrapper[5002]: I1209 10:25:47.796954 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "swift") pod "dfa166a7-dec2-453d-9cd9-f77d30f1636a" (UID: "dfa166a7-dec2-453d-9cd9-f77d30f1636a"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 09 10:25:47 crc kubenswrapper[5002]: I1209 10:25:47.891190 5002 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 09 10:25:47 crc kubenswrapper[5002]: I1209 10:25:47.891490 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwnbd\" (UniqueName: \"kubernetes.io/projected/dfa166a7-dec2-453d-9cd9-f77d30f1636a-kube-api-access-wwnbd\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:47 crc kubenswrapper[5002]: I1209 10:25:47.891505 5002 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/dfa166a7-dec2-453d-9cd9-f77d30f1636a-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:47 crc kubenswrapper[5002]: I1209 10:25:47.891514 5002 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/dfa166a7-dec2-453d-9cd9-f77d30f1636a-cache\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:47 crc kubenswrapper[5002]: I1209 10:25:47.891524 5002 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/dfa166a7-dec2-453d-9cd9-f77d30f1636a-lock\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:47 crc kubenswrapper[5002]: I1209 10:25:47.904473 5002 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 09 10:25:47 crc kubenswrapper[5002]: I1209 10:25:47.980187 5002 generic.go:334] "Generic (PLEG): container finished" podID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerID="76763b7766e9025115e42c4aebcef8bd5282beaf9f41d7c400aa31601a1ff35f" exitCode=137 Dec 09 10:25:47 crc kubenswrapper[5002]: I1209 10:25:47.980309 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dfa166a7-dec2-453d-9cd9-f77d30f1636a","Type":"ContainerDied","Data":"76763b7766e9025115e42c4aebcef8bd5282beaf9f41d7c400aa31601a1ff35f"} Dec 09 10:25:47 crc kubenswrapper[5002]: I1209 10:25:47.980359 5002 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/swift-storage-0" Dec 09 10:25:47 crc kubenswrapper[5002]: I1209 10:25:47.980408 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dfa166a7-dec2-453d-9cd9-f77d30f1636a","Type":"ContainerDied","Data":"2e0b06393713cb143da6941041054636068a7b1e4845e8f8db452660be757378"} Dec 09 10:25:47 crc kubenswrapper[5002]: I1209 10:25:47.980456 5002 scope.go:117] "RemoveContainer" containerID="76763b7766e9025115e42c4aebcef8bd5282beaf9f41d7c400aa31601a1ff35f" Dec 09 10:25:47 crc kubenswrapper[5002]: I1209 10:25:47.982879 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-g4kc8_26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6/ovs-vswitchd/0.log" Dec 09 10:25:47 crc kubenswrapper[5002]: I1209 10:25:47.986099 5002 generic.go:334] "Generic (PLEG): container finished" podID="26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6" containerID="3f18346c6d45cdce8933113ee6ff0f64d79183a978ac856ba561f2eb32009782" exitCode=137 Dec 09 10:25:47 crc kubenswrapper[5002]: I1209 10:25:47.986142 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-g4kc8" event={"ID":"26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6","Type":"ContainerDied","Data":"3f18346c6d45cdce8933113ee6ff0f64d79183a978ac856ba561f2eb32009782"} Dec 09 10:25:47 crc kubenswrapper[5002]: I1209 10:25:47.992801 5002 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.005587 5002 scope.go:117] "RemoveContainer" containerID="7f97157518e27503872febf7ddda3d551dbd7a2803115b464389b1571f0d9e20" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.019746 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.027583 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.040065 5002 scope.go:117] "RemoveContainer" containerID="4430f2180524a0ee4235580c2b9d8df48e33a7e227458714f054c2d1a1d7f033" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.060784 5002 scope.go:117] "RemoveContainer" containerID="1d1d17700b9060a9639585ef5fdeb4b80245971f6249c58f20c9f2f5a5cba149" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.071209 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" path="/var/lib/kubelet/pods/dfa166a7-dec2-453d-9cd9-f77d30f1636a/volumes" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.084081 5002 scope.go:117] "RemoveContainer" containerID="e4190f8e2ba7bf2e5668737db294b27ff0a3fe18ec67358a90971c6b810a30cb" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.106732 5002 scope.go:117] "RemoveContainer" containerID="4a883aac8893d883c07d84b09c58fe2a39a0f93405bc123e2e6b73703862158e" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.127898 5002 scope.go:117] "RemoveContainer" containerID="64c1de5a8db5505ebb9ab646a19408256aae76bc990663c6b9d95baed247ddda" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.152055 5002 scope.go:117] "RemoveContainer" containerID="8c060664face312f2e3371362872261fcd8f50e7bd540c37d281ee014c188bec" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.242688 5002 scope.go:117] "RemoveContainer" containerID="9c6faf241f54027209b7cce0e0fc9faf46499ba3ee26bb97a888eb1ca008dd04" 
Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.264000 5002 scope.go:117] "RemoveContainer" containerID="6475b21d3689ac93f09493f21c1ff2efd22fc48ca484d69e1663bc5217a41423" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.283710 5002 scope.go:117] "RemoveContainer" containerID="8ef48e223ac981dd68fad5f23cdc81f9ff45600e033082f26b51348da7a3e364" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.303102 5002 scope.go:117] "RemoveContainer" containerID="b75afaa2bb61d58afc5343f8437dbcd68fa039ca4733d2850879069289c78567" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.330030 5002 scope.go:117] "RemoveContainer" containerID="2a8bccd609bab5986054b408416dade80cc7b8cf1c6ce51df003879ca5e0a92d" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.362765 5002 scope.go:117] "RemoveContainer" containerID="3f70ac59be273071ab01746bf90703513be9b442ef9be9f9e0ab6ecb0f2e0e47" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.380375 5002 scope.go:117] "RemoveContainer" containerID="572968954f44aa3432e15bf47ef3b7d45a9cac349101fcd97fb7b56fd86110b3" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.396899 5002 scope.go:117] "RemoveContainer" containerID="76763b7766e9025115e42c4aebcef8bd5282beaf9f41d7c400aa31601a1ff35f" Dec 09 10:25:48 crc kubenswrapper[5002]: E1209 10:25:48.397451 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76763b7766e9025115e42c4aebcef8bd5282beaf9f41d7c400aa31601a1ff35f\": container with ID starting with 76763b7766e9025115e42c4aebcef8bd5282beaf9f41d7c400aa31601a1ff35f not found: ID does not exist" containerID="76763b7766e9025115e42c4aebcef8bd5282beaf9f41d7c400aa31601a1ff35f" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.397478 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76763b7766e9025115e42c4aebcef8bd5282beaf9f41d7c400aa31601a1ff35f"} err="failed to get container status \"76763b7766e9025115e42c4aebcef8bd5282beaf9f41d7c400aa31601a1ff35f\": rpc error: code = NotFound desc = could not find container \"76763b7766e9025115e42c4aebcef8bd5282beaf9f41d7c400aa31601a1ff35f\": container with ID starting with 76763b7766e9025115e42c4aebcef8bd5282beaf9f41d7c400aa31601a1ff35f not found: ID does not exist" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.397499 5002 scope.go:117] "RemoveContainer" containerID="7f97157518e27503872febf7ddda3d551dbd7a2803115b464389b1571f0d9e20" Dec 09 10:25:48 crc kubenswrapper[5002]: E1209 10:25:48.397900 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f97157518e27503872febf7ddda3d551dbd7a2803115b464389b1571f0d9e20\": container with ID starting with 7f97157518e27503872febf7ddda3d551dbd7a2803115b464389b1571f0d9e20 not found: ID does not exist" containerID="7f97157518e27503872febf7ddda3d551dbd7a2803115b464389b1571f0d9e20" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.397927 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f97157518e27503872febf7ddda3d551dbd7a2803115b464389b1571f0d9e20"} err="failed to get container status \"7f97157518e27503872febf7ddda3d551dbd7a2803115b464389b1571f0d9e20\": rpc error: code = NotFound desc = could not find container \"7f97157518e27503872febf7ddda3d551dbd7a2803115b464389b1571f0d9e20\": container with ID starting with 7f97157518e27503872febf7ddda3d551dbd7a2803115b464389b1571f0d9e20 not found: ID does not exist" Dec 09 
10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.397942 5002 scope.go:117] "RemoveContainer" containerID="4430f2180524a0ee4235580c2b9d8df48e33a7e227458714f054c2d1a1d7f033" Dec 09 10:25:48 crc kubenswrapper[5002]: E1209 10:25:48.398198 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4430f2180524a0ee4235580c2b9d8df48e33a7e227458714f054c2d1a1d7f033\": container with ID starting with 4430f2180524a0ee4235580c2b9d8df48e33a7e227458714f054c2d1a1d7f033 not found: ID does not exist" containerID="4430f2180524a0ee4235580c2b9d8df48e33a7e227458714f054c2d1a1d7f033" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.398254 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4430f2180524a0ee4235580c2b9d8df48e33a7e227458714f054c2d1a1d7f033"} err="failed to get container status \"4430f2180524a0ee4235580c2b9d8df48e33a7e227458714f054c2d1a1d7f033\": rpc error: code = NotFound desc = could not find container \"4430f2180524a0ee4235580c2b9d8df48e33a7e227458714f054c2d1a1d7f033\": container with ID starting with 4430f2180524a0ee4235580c2b9d8df48e33a7e227458714f054c2d1a1d7f033 not found: ID does not exist" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.398274 5002 scope.go:117] "RemoveContainer" containerID="1d1d17700b9060a9639585ef5fdeb4b80245971f6249c58f20c9f2f5a5cba149" Dec 09 10:25:48 crc kubenswrapper[5002]: E1209 10:25:48.398548 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d1d17700b9060a9639585ef5fdeb4b80245971f6249c58f20c9f2f5a5cba149\": container with ID starting with 1d1d17700b9060a9639585ef5fdeb4b80245971f6249c58f20c9f2f5a5cba149 not found: ID does not exist" containerID="1d1d17700b9060a9639585ef5fdeb4b80245971f6249c58f20c9f2f5a5cba149" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.398566 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d1d17700b9060a9639585ef5fdeb4b80245971f6249c58f20c9f2f5a5cba149"} err="failed to get container status \"1d1d17700b9060a9639585ef5fdeb4b80245971f6249c58f20c9f2f5a5cba149\": rpc error: code = NotFound desc = could not find container \"1d1d17700b9060a9639585ef5fdeb4b80245971f6249c58f20c9f2f5a5cba149\": container with ID starting with 1d1d17700b9060a9639585ef5fdeb4b80245971f6249c58f20c9f2f5a5cba149 not found: ID does not exist" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.398579 5002 scope.go:117] "RemoveContainer" containerID="e4190f8e2ba7bf2e5668737db294b27ff0a3fe18ec67358a90971c6b810a30cb" Dec 09 10:25:48 crc kubenswrapper[5002]: E1209 10:25:48.398958 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4190f8e2ba7bf2e5668737db294b27ff0a3fe18ec67358a90971c6b810a30cb\": container with ID starting with e4190f8e2ba7bf2e5668737db294b27ff0a3fe18ec67358a90971c6b810a30cb not found: ID does not exist" containerID="e4190f8e2ba7bf2e5668737db294b27ff0a3fe18ec67358a90971c6b810a30cb" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.399006 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4190f8e2ba7bf2e5668737db294b27ff0a3fe18ec67358a90971c6b810a30cb"} err="failed to get container status \"e4190f8e2ba7bf2e5668737db294b27ff0a3fe18ec67358a90971c6b810a30cb\": rpc error: code = NotFound desc = could not find container 
\"e4190f8e2ba7bf2e5668737db294b27ff0a3fe18ec67358a90971c6b810a30cb\": container with ID starting with e4190f8e2ba7bf2e5668737db294b27ff0a3fe18ec67358a90971c6b810a30cb not found: ID does not exist" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.399036 5002 scope.go:117] "RemoveContainer" containerID="4a883aac8893d883c07d84b09c58fe2a39a0f93405bc123e2e6b73703862158e" Dec 09 10:25:48 crc kubenswrapper[5002]: E1209 10:25:48.399325 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a883aac8893d883c07d84b09c58fe2a39a0f93405bc123e2e6b73703862158e\": container with ID starting with 4a883aac8893d883c07d84b09c58fe2a39a0f93405bc123e2e6b73703862158e not found: ID does not exist" containerID="4a883aac8893d883c07d84b09c58fe2a39a0f93405bc123e2e6b73703862158e" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.399346 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a883aac8893d883c07d84b09c58fe2a39a0f93405bc123e2e6b73703862158e"} err="failed to get container status \"4a883aac8893d883c07d84b09c58fe2a39a0f93405bc123e2e6b73703862158e\": rpc error: code = NotFound desc = could not find container \"4a883aac8893d883c07d84b09c58fe2a39a0f93405bc123e2e6b73703862158e\": container with ID starting with 4a883aac8893d883c07d84b09c58fe2a39a0f93405bc123e2e6b73703862158e not found: ID does not exist" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.399361 5002 scope.go:117] "RemoveContainer" containerID="64c1de5a8db5505ebb9ab646a19408256aae76bc990663c6b9d95baed247ddda" Dec 09 10:25:48 crc kubenswrapper[5002]: E1209 10:25:48.399590 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64c1de5a8db5505ebb9ab646a19408256aae76bc990663c6b9d95baed247ddda\": container with ID starting with 64c1de5a8db5505ebb9ab646a19408256aae76bc990663c6b9d95baed247ddda not found: ID does not exist" containerID="64c1de5a8db5505ebb9ab646a19408256aae76bc990663c6b9d95baed247ddda" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.399613 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64c1de5a8db5505ebb9ab646a19408256aae76bc990663c6b9d95baed247ddda"} err="failed to get container status \"64c1de5a8db5505ebb9ab646a19408256aae76bc990663c6b9d95baed247ddda\": rpc error: code = NotFound desc = could not find container \"64c1de5a8db5505ebb9ab646a19408256aae76bc990663c6b9d95baed247ddda\": container with ID starting with 64c1de5a8db5505ebb9ab646a19408256aae76bc990663c6b9d95baed247ddda not found: ID does not exist" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.399627 5002 scope.go:117] "RemoveContainer" containerID="8c060664face312f2e3371362872261fcd8f50e7bd540c37d281ee014c188bec" Dec 09 10:25:48 crc kubenswrapper[5002]: E1209 10:25:48.399870 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c060664face312f2e3371362872261fcd8f50e7bd540c37d281ee014c188bec\": container with ID starting with 8c060664face312f2e3371362872261fcd8f50e7bd540c37d281ee014c188bec not found: ID does not exist" containerID="8c060664face312f2e3371362872261fcd8f50e7bd540c37d281ee014c188bec" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.399886 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c060664face312f2e3371362872261fcd8f50e7bd540c37d281ee014c188bec"} 
err="failed to get container status \"8c060664face312f2e3371362872261fcd8f50e7bd540c37d281ee014c188bec\": rpc error: code = NotFound desc = could not find container \"8c060664face312f2e3371362872261fcd8f50e7bd540c37d281ee014c188bec\": container with ID starting with 8c060664face312f2e3371362872261fcd8f50e7bd540c37d281ee014c188bec not found: ID does not exist" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.399905 5002 scope.go:117] "RemoveContainer" containerID="9c6faf241f54027209b7cce0e0fc9faf46499ba3ee26bb97a888eb1ca008dd04" Dec 09 10:25:48 crc kubenswrapper[5002]: E1209 10:25:48.400137 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c6faf241f54027209b7cce0e0fc9faf46499ba3ee26bb97a888eb1ca008dd04\": container with ID starting with 9c6faf241f54027209b7cce0e0fc9faf46499ba3ee26bb97a888eb1ca008dd04 not found: ID does not exist" containerID="9c6faf241f54027209b7cce0e0fc9faf46499ba3ee26bb97a888eb1ca008dd04" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.400154 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c6faf241f54027209b7cce0e0fc9faf46499ba3ee26bb97a888eb1ca008dd04"} err="failed to get container status \"9c6faf241f54027209b7cce0e0fc9faf46499ba3ee26bb97a888eb1ca008dd04\": rpc error: code = NotFound desc = could not find container \"9c6faf241f54027209b7cce0e0fc9faf46499ba3ee26bb97a888eb1ca008dd04\": container with ID starting with 9c6faf241f54027209b7cce0e0fc9faf46499ba3ee26bb97a888eb1ca008dd04 not found: ID does not exist" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.400165 5002 scope.go:117] "RemoveContainer" containerID="6475b21d3689ac93f09493f21c1ff2efd22fc48ca484d69e1663bc5217a41423" Dec 09 10:25:48 crc kubenswrapper[5002]: E1209 10:25:48.400435 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6475b21d3689ac93f09493f21c1ff2efd22fc48ca484d69e1663bc5217a41423\": container with ID starting with 6475b21d3689ac93f09493f21c1ff2efd22fc48ca484d69e1663bc5217a41423 not found: ID does not exist" containerID="6475b21d3689ac93f09493f21c1ff2efd22fc48ca484d69e1663bc5217a41423" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.400466 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6475b21d3689ac93f09493f21c1ff2efd22fc48ca484d69e1663bc5217a41423"} err="failed to get container status \"6475b21d3689ac93f09493f21c1ff2efd22fc48ca484d69e1663bc5217a41423\": rpc error: code = NotFound desc = could not find container \"6475b21d3689ac93f09493f21c1ff2efd22fc48ca484d69e1663bc5217a41423\": container with ID starting with 6475b21d3689ac93f09493f21c1ff2efd22fc48ca484d69e1663bc5217a41423 not found: ID does not exist" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.400485 5002 scope.go:117] "RemoveContainer" containerID="8ef48e223ac981dd68fad5f23cdc81f9ff45600e033082f26b51348da7a3e364" Dec 09 10:25:48 crc kubenswrapper[5002]: E1209 10:25:48.401069 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ef48e223ac981dd68fad5f23cdc81f9ff45600e033082f26b51348da7a3e364\": container with ID starting with 8ef48e223ac981dd68fad5f23cdc81f9ff45600e033082f26b51348da7a3e364 not found: ID does not exist" containerID="8ef48e223ac981dd68fad5f23cdc81f9ff45600e033082f26b51348da7a3e364" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.401097 5002 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ef48e223ac981dd68fad5f23cdc81f9ff45600e033082f26b51348da7a3e364"} err="failed to get container status \"8ef48e223ac981dd68fad5f23cdc81f9ff45600e033082f26b51348da7a3e364\": rpc error: code = NotFound desc = could not find container \"8ef48e223ac981dd68fad5f23cdc81f9ff45600e033082f26b51348da7a3e364\": container with ID starting with 8ef48e223ac981dd68fad5f23cdc81f9ff45600e033082f26b51348da7a3e364 not found: ID does not exist" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.401111 5002 scope.go:117] "RemoveContainer" containerID="b75afaa2bb61d58afc5343f8437dbcd68fa039ca4733d2850879069289c78567" Dec 09 10:25:48 crc kubenswrapper[5002]: E1209 10:25:48.401382 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b75afaa2bb61d58afc5343f8437dbcd68fa039ca4733d2850879069289c78567\": container with ID starting with b75afaa2bb61d58afc5343f8437dbcd68fa039ca4733d2850879069289c78567 not found: ID does not exist" containerID="b75afaa2bb61d58afc5343f8437dbcd68fa039ca4733d2850879069289c78567" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.401406 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b75afaa2bb61d58afc5343f8437dbcd68fa039ca4733d2850879069289c78567"} err="failed to get container status \"b75afaa2bb61d58afc5343f8437dbcd68fa039ca4733d2850879069289c78567\": rpc error: code = NotFound desc = could not find container \"b75afaa2bb61d58afc5343f8437dbcd68fa039ca4733d2850879069289c78567\": container with ID starting with b75afaa2bb61d58afc5343f8437dbcd68fa039ca4733d2850879069289c78567 not found: ID does not exist" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.401420 5002 scope.go:117] "RemoveContainer" containerID="2a8bccd609bab5986054b408416dade80cc7b8cf1c6ce51df003879ca5e0a92d" Dec 09 10:25:48 crc kubenswrapper[5002]: E1209 10:25:48.401614 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a8bccd609bab5986054b408416dade80cc7b8cf1c6ce51df003879ca5e0a92d\": container with ID starting with 2a8bccd609bab5986054b408416dade80cc7b8cf1c6ce51df003879ca5e0a92d not found: ID does not exist" containerID="2a8bccd609bab5986054b408416dade80cc7b8cf1c6ce51df003879ca5e0a92d" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.401634 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a8bccd609bab5986054b408416dade80cc7b8cf1c6ce51df003879ca5e0a92d"} err="failed to get container status \"2a8bccd609bab5986054b408416dade80cc7b8cf1c6ce51df003879ca5e0a92d\": rpc error: code = NotFound desc = could not find container \"2a8bccd609bab5986054b408416dade80cc7b8cf1c6ce51df003879ca5e0a92d\": container with ID starting with 2a8bccd609bab5986054b408416dade80cc7b8cf1c6ce51df003879ca5e0a92d not found: ID does not exist" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.401646 5002 scope.go:117] "RemoveContainer" containerID="3f70ac59be273071ab01746bf90703513be9b442ef9be9f9e0ab6ecb0f2e0e47" Dec 09 10:25:48 crc kubenswrapper[5002]: E1209 10:25:48.401854 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f70ac59be273071ab01746bf90703513be9b442ef9be9f9e0ab6ecb0f2e0e47\": container with ID starting with 3f70ac59be273071ab01746bf90703513be9b442ef9be9f9e0ab6ecb0f2e0e47 not found: ID does 
not exist" containerID="3f70ac59be273071ab01746bf90703513be9b442ef9be9f9e0ab6ecb0f2e0e47" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.401877 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f70ac59be273071ab01746bf90703513be9b442ef9be9f9e0ab6ecb0f2e0e47"} err="failed to get container status \"3f70ac59be273071ab01746bf90703513be9b442ef9be9f9e0ab6ecb0f2e0e47\": rpc error: code = NotFound desc = could not find container \"3f70ac59be273071ab01746bf90703513be9b442ef9be9f9e0ab6ecb0f2e0e47\": container with ID starting with 3f70ac59be273071ab01746bf90703513be9b442ef9be9f9e0ab6ecb0f2e0e47 not found: ID does not exist" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.401896 5002 scope.go:117] "RemoveContainer" containerID="572968954f44aa3432e15bf47ef3b7d45a9cac349101fcd97fb7b56fd86110b3" Dec 09 10:25:48 crc kubenswrapper[5002]: E1209 10:25:48.402135 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"572968954f44aa3432e15bf47ef3b7d45a9cac349101fcd97fb7b56fd86110b3\": container with ID starting with 572968954f44aa3432e15bf47ef3b7d45a9cac349101fcd97fb7b56fd86110b3 not found: ID does not exist" containerID="572968954f44aa3432e15bf47ef3b7d45a9cac349101fcd97fb7b56fd86110b3" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.402165 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"572968954f44aa3432e15bf47ef3b7d45a9cac349101fcd97fb7b56fd86110b3"} err="failed to get container status \"572968954f44aa3432e15bf47ef3b7d45a9cac349101fcd97fb7b56fd86110b3\": rpc error: code = NotFound desc = could not find container \"572968954f44aa3432e15bf47ef3b7d45a9cac349101fcd97fb7b56fd86110b3\": container with ID starting with 572968954f44aa3432e15bf47ef3b7d45a9cac349101fcd97fb7b56fd86110b3 not found: ID does not exist" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.491449 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-g4kc8_26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6/ovs-vswitchd/0.log" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.492156 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-g4kc8" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.644353 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2xpx\" (UniqueName: \"kubernetes.io/projected/26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6-kube-api-access-x2xpx\") pod \"26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6\" (UID: \"26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6\") " Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.644479 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6-var-run\") pod \"26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6\" (UID: \"26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6\") " Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.644566 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6-var-run" (OuterVolumeSpecName: "var-run") pod "26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6" (UID: "26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.644515 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6-etc-ovs\") pod \"26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6\" (UID: \"26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6\") " Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.644659 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6" (UID: "26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.644691 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6-scripts\") pod \"26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6\" (UID: \"26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6\") " Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.645177 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6-var-lib" (OuterVolumeSpecName: "var-lib") pod "26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6" (UID: "26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.645702 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6-scripts" (OuterVolumeSpecName: "scripts") pod "26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6" (UID: "26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.645880 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6-var-lib\") pod \"26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6\" (UID: \"26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6\") " Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.645923 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6-var-log\") pod \"26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6\" (UID: \"26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6\") " Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.646013 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6-var-log" (OuterVolumeSpecName: "var-log") pod "26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6" (UID: "26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.646435 5002 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6-var-lib\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.646456 5002 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6-var-log\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.646465 5002 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6-var-run\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.646473 5002 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6-etc-ovs\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.646481 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.648035 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6-kube-api-access-x2xpx" (OuterVolumeSpecName: "kube-api-access-x2xpx") pod "26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6" (UID: "26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6"). InnerVolumeSpecName "kube-api-access-x2xpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:25:48 crc kubenswrapper[5002]: I1209 10:25:48.748042 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2xpx\" (UniqueName: \"kubernetes.io/projected/26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6-kube-api-access-x2xpx\") on node \"crc\" DevicePath \"\"" Dec 09 10:25:49 crc kubenswrapper[5002]: I1209 10:25:49.002183 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-g4kc8_26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6/ovs-vswitchd/0.log" Dec 09 10:25:49 crc kubenswrapper[5002]: I1209 10:25:49.003661 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-g4kc8" event={"ID":"26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6","Type":"ContainerDied","Data":"a56e9525efca8beac81f7fa45f9a7a566642078a4686d2d31f413c7f8c689519"} Dec 09 10:25:49 crc kubenswrapper[5002]: I1209 10:25:49.003683 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-g4kc8" Dec 09 10:25:49 crc kubenswrapper[5002]: I1209 10:25:49.003764 5002 scope.go:117] "RemoveContainer" containerID="3f18346c6d45cdce8933113ee6ff0f64d79183a978ac856ba561f2eb32009782" Dec 09 10:25:49 crc kubenswrapper[5002]: I1209 10:25:49.036441 5002 scope.go:117] "RemoveContainer" containerID="5c2d2c6137f09a0249a92e685f777a88211e7a367963f717a3fe59c6940a69a5" Dec 09 10:25:49 crc kubenswrapper[5002]: I1209 10:25:49.057074 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-g4kc8"] Dec 09 10:25:49 crc kubenswrapper[5002]: I1209 10:25:49.063498 5002 scope.go:117] "RemoveContainer" containerID="23d2c226434058f19c40b285671296768eb145dc5740c1444ad73c806ee85ca4" Dec 09 10:25:49 crc kubenswrapper[5002]: I1209 10:25:49.063940 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-g4kc8"] Dec 09 10:25:50 crc kubenswrapper[5002]: I1209 10:25:50.076502 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6" path="/var/lib/kubelet/pods/26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6/volumes" Dec 09 10:25:50 crc kubenswrapper[5002]: I1209 10:25:50.332219 5002 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pode0a5beb3-4401-42b8-b8e3-4d2af995a4d0"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pode0a5beb3-4401-42b8-b8e3-4d2af995a4d0] : Timed out while waiting for systemd to remove kubepods-besteffort-pode0a5beb3_4401_42b8_b8e3_4d2af995a4d0.slice" Dec 09 10:25:54 crc kubenswrapper[5002]: I1209 10:25:54.414739 5002 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","poda67e154b-1de7-4e2b-9b87-049ea273fa01"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort poda67e154b-1de7-4e2b-9b87-049ea273fa01] : Timed out while waiting for systemd to remove kubepods-besteffort-poda67e154b_1de7_4e2b_9b87_049ea273fa01.slice" Dec 09 10:26:42 crc kubenswrapper[5002]: I1209 10:26:42.299383 5002 scope.go:117] "RemoveContainer" containerID="476934525d50946ec2cb341c1e2b13e82d6ab250a0f2cde25a67c44f24103f24" Dec 09 10:26:42 crc kubenswrapper[5002]: I1209 10:26:42.339458 5002 scope.go:117] "RemoveContainer" containerID="94d514a6ebe6c1eff84f9f88c7ad35fe0027d8ec7ab6c85385089b506517f141" Dec 09 10:26:42 crc kubenswrapper[5002]: I1209 10:26:42.377696 5002 scope.go:117] "RemoveContainer" containerID="a20c1f002f7d8bcb4ece970ae57f34515b9903a80f22bc0993c57aaa705415a2" Dec 09 10:26:42 crc kubenswrapper[5002]: I1209 10:26:42.414571 5002 scope.go:117] "RemoveContainer" containerID="bf71ec6fe28fd4abe441e77ddd99cb0e8c7339fa9a0888289ab5a90b48904a26" Dec 09 10:26:42 crc kubenswrapper[5002]: I1209 10:26:42.472193 5002 scope.go:117] "RemoveContainer" containerID="2b16c2489d05cd8e096c967062632f622a1d7e67f83b1054e468488b01f8969f" Dec 09 10:26:42 crc kubenswrapper[5002]: I1209 10:26:42.491536 5002 scope.go:117] "RemoveContainer" containerID="468110e57eae6321d8b757c466c1d5d8849d2adc61dbc25d3bbb142325cb38a6" Dec 09 10:26:42 crc kubenswrapper[5002]: I1209 10:26:42.527013 5002 scope.go:117] "RemoveContainer" containerID="fa7bfbca45b21b5d83224de4e38c3dfca159e2b30e9c84bbb5739ee8d9f10e3d" Dec 09 10:26:42 crc kubenswrapper[5002]: I1209 10:26:42.548049 5002 scope.go:117] "RemoveContainer" containerID="13f99fc09c85191228c9afb1481413c4572d18c67027baca455d6c30a57f6d07" Dec 09 10:26:42 crc kubenswrapper[5002]: I1209 10:26:42.572388 5002 
scope.go:117] "RemoveContainer" containerID="2bc5e6430f12b450876b8742c6de7a12ca3be054e352f48252fd07dd06cf20d1" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.773550 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9sldb"] Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.774296 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7faabd78-c9ab-4397-aa4d-b8aaff302251" containerName="mysql-bootstrap" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.774315 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="7faabd78-c9ab-4397-aa4d-b8aaff302251" containerName="mysql-bootstrap" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.774338 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="object-auditor" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.774347 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="object-auditor" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.774366 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6" containerName="ovs-vswitchd" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.774376 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6" containerName="ovs-vswitchd" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.774391 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4ddce94-6333-4233-951d-571a761b708f" containerName="barbican-keystone-listener-log" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.774400 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4ddce94-6333-4233-951d-571a761b708f" containerName="barbican-keystone-listener-log" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.774415 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f702a539-ec25-44d4-8629-97b3c5499b96" containerName="cinder-scheduler" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.774423 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="f702a539-ec25-44d4-8629-97b3c5499b96" containerName="cinder-scheduler" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.774445 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f41619d4-24a3-46e4-9cb9-2e388f7cd36b" containerName="mariadb-account-delete" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.774456 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="f41619d4-24a3-46e4-9cb9-2e388f7cd36b" containerName="mariadb-account-delete" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.774475 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9278e14e-2524-4e42-b870-f493ea02ede8" containerName="setup-container" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.774484 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="9278e14e-2524-4e42-b870-f493ea02ede8" containerName="setup-container" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.774495 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0f7675b-6614-4e41-86e6-364b7f04664e" containerName="openstack-network-exporter" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.774504 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0f7675b-6614-4e41-86e6-364b7f04664e" containerName="openstack-network-exporter" Dec 09 10:26:47 crc 
kubenswrapper[5002]: E1209 10:26:47.774519 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65df60b6-4049-47b6-9907-ebf76c151213" containerName="glance-log" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.774528 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="65df60b6-4049-47b6-9907-ebf76c151213" containerName="glance-log" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.774540 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41f46a2d-f158-497f-b61b-60f39c64149b" containerName="neutron-httpd" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.774548 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="41f46a2d-f158-497f-b61b-60f39c64149b" containerName="neutron-httpd" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.774559 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="swift-recon-cron" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.774567 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="swift-recon-cron" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.774577 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="322c0304-1696-43fb-9225-a709e7e2ea89" containerName="ovsdbserver-nb" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.774585 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="322c0304-1696-43fb-9225-a709e7e2ea89" containerName="ovsdbserver-nb" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.774597 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bf056c0-a496-4499-92c7-3b1300b4a29d" containerName="proxy-server" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.774606 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bf056c0-a496-4499-92c7-3b1300b4a29d" containerName="proxy-server" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.774617 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5a9794a-b66f-40d4-9e70-efc6a0a72d83" containerName="nova-cell1-conductor-conductor" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.774625 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5a9794a-b66f-40d4-9e70-efc6a0a72d83" containerName="nova-cell1-conductor-conductor" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.774639 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ae47b25-e6fd-451f-9827-72ee4e12e526" containerName="nova-cell0-conductor-conductor" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.774647 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ae47b25-e6fd-451f-9827-72ee4e12e526" containerName="nova-cell0-conductor-conductor" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.774663 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="container-updater" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.774671 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="container-updater" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.774683 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="object-server" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.774693 5002 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="object-server" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.774711 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43512a9c-be3a-4c0e-a178-82c5a065acf4" containerName="nova-scheduler-scheduler" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.774722 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="43512a9c-be3a-4c0e-a178-82c5a065acf4" containerName="nova-scheduler-scheduler" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.774736 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36fbd6d1-d87d-45a2-9bca-0f25f3daca0c" containerName="openstack-network-exporter" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.774744 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="36fbd6d1-d87d-45a2-9bca-0f25f3daca0c" containerName="openstack-network-exporter" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.774759 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="object-updater" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.774768 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="object-updater" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.774776 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdaeef31-a8f8-478a-86b0-4d0126eb7f3a" containerName="ovn-controller" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.774784 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdaeef31-a8f8-478a-86b0-4d0126eb7f3a" containerName="ovn-controller" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.774796 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b5836b7-7b16-477f-9a20-f30032362374" containerName="nova-api-api" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.774804 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b5836b7-7b16-477f-9a20-f30032362374" containerName="nova-api-api" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.774845 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54351653-7ebd-40ba-8181-bb1023f18190" containerName="glance-httpd" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.774858 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="54351653-7ebd-40ba-8181-bb1023f18190" containerName="glance-httpd" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.774870 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c44aced5-6d19-429a-8917-cd4229341433" containerName="mariadb-account-delete" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.774881 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="c44aced5-6d19-429a-8917-cd4229341433" containerName="mariadb-account-delete" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.774893 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7faabd78-c9ab-4397-aa4d-b8aaff302251" containerName="galera" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.774901 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="7faabd78-c9ab-4397-aa4d-b8aaff302251" containerName="galera" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.774913 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6" containerName="ovsdb-server" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.774920 5002 
state_mem.go:107] "Deleted CPUSet assignment" podUID="26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6" containerName="ovsdb-server" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.774935 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="object-replicator" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.774943 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="object-replicator" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.774958 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5893e6fa-5b64-47e0-b8e1-f68baf27a65c" containerName="sg-core" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.774966 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="5893e6fa-5b64-47e0-b8e1-f68baf27a65c" containerName="sg-core" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.774980 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28cb84ad-b399-4fe4-9631-e481dfa75aed" containerName="ovsdbserver-sb" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.774988 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="28cb84ad-b399-4fe4-9631-e481dfa75aed" containerName="ovsdbserver-sb" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.775003 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f514395b-6067-4e42-98e6-f3c5ac427982" containerName="keystone-api" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.775010 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="f514395b-6067-4e42-98e6-f3c5ac427982" containerName="keystone-api" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.775026 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0172d8ed-9ef1-4aac-b246-1b1ed0df87fc" containerName="placement-api" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.775034 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="0172d8ed-9ef1-4aac-b246-1b1ed0df87fc" containerName="placement-api" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.775051 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e36e954-d9c1-41e3-8542-e8f300db90cb" containerName="memcached" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.775059 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e36e954-d9c1-41e3-8542-e8f300db90cb" containerName="memcached" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.775071 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58de676b-7b73-4c04-b5d5-5de38a88072c" containerName="mariadb-account-delete" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.775079 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="58de676b-7b73-4c04-b5d5-5de38a88072c" containerName="mariadb-account-delete" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.775089 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="object-expirer" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.775096 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="object-expirer" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.775111 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5893e6fa-5b64-47e0-b8e1-f68baf27a65c" containerName="ceilometer-central-agent" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.775119 5002 
state_mem.go:107] "Deleted CPUSet assignment" podUID="5893e6fa-5b64-47e0-b8e1-f68baf27a65c" containerName="ceilometer-central-agent" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.775134 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02c94bee-a522-4ea6-85af-1ba68e174203" containerName="cinder-api" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.775142 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="02c94bee-a522-4ea6-85af-1ba68e174203" containerName="cinder-api" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.775152 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="account-reaper" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.775160 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="account-reaper" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.775170 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5893e6fa-5b64-47e0-b8e1-f68baf27a65c" containerName="ceilometer-notification-agent" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.775178 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="5893e6fa-5b64-47e0-b8e1-f68baf27a65c" containerName="ceilometer-notification-agent" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.775189 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe95257d-a02e-4f04-a543-a2db08231043" containerName="mariadb-account-delete" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.775198 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe95257d-a02e-4f04-a543-a2db08231043" containerName="mariadb-account-delete" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.775242 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="container-server" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.775253 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="container-server" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.775267 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd8a7609-928f-4a68-9903-fa846e4baeda" containerName="nova-metadata-metadata" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.775276 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd8a7609-928f-4a68-9903-fa846e4baeda" containerName="nova-metadata-metadata" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.775286 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae6c00ce-3152-42ae-890f-bb76aac103c5" containerName="mariadb-account-delete" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.775296 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae6c00ce-3152-42ae-890f-bb76aac103c5" containerName="mariadb-account-delete" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.775308 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9278e14e-2524-4e42-b870-f493ea02ede8" containerName="rabbitmq" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.775317 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="9278e14e-2524-4e42-b870-f493ea02ede8" containerName="rabbitmq" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.775334 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" 
containerName="account-server" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.775343 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="account-server" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.775361 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="322c0304-1696-43fb-9225-a709e7e2ea89" containerName="openstack-network-exporter" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.775371 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="322c0304-1696-43fb-9225-a709e7e2ea89" containerName="openstack-network-exporter" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.775380 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4061af7-7669-4bd4-a36c-6ec982e86753" containerName="barbican-api-log" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.775389 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4061af7-7669-4bd4-a36c-6ec982e86753" containerName="barbican-api-log" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.775406 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d100f321-6fe6-4eb3-a00c-50b9ff5e2861" containerName="galera" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.775416 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="d100f321-6fe6-4eb3-a00c-50b9ff5e2861" containerName="galera" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.775425 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f702a539-ec25-44d4-8629-97b3c5499b96" containerName="probe" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.775433 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="f702a539-ec25-44d4-8629-97b3c5499b96" containerName="probe" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.775443 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6b9775f-22d1-413b-8d2f-1dbe890b582c" containerName="mariadb-account-delete" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.775452 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6b9775f-22d1-413b-8d2f-1dbe890b582c" containerName="mariadb-account-delete" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.775467 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b613f5a4-9369-45ae-8c2c-10e16e639999" containerName="nova-cell1-novncproxy-novncproxy" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.775478 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="b613f5a4-9369-45ae-8c2c-10e16e639999" containerName="nova-cell1-novncproxy-novncproxy" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.775498 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6" containerName="ovsdb-server-init" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.775508 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6" containerName="ovsdb-server-init" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.775518 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58c08274-46ea-48be-a135-0c1174cd6135" containerName="setup-container" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.775527 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="58c08274-46ea-48be-a135-0c1174cd6135" containerName="setup-container" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.775542 5002 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="28cb84ad-b399-4fe4-9631-e481dfa75aed" containerName="openstack-network-exporter" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.775550 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="28cb84ad-b399-4fe4-9631-e481dfa75aed" containerName="openstack-network-exporter" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.775558 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54351653-7ebd-40ba-8181-bb1023f18190" containerName="glance-log" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.775566 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="54351653-7ebd-40ba-8181-bb1023f18190" containerName="glance-log" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.775578 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d100f321-6fe6-4eb3-a00c-50b9ff5e2861" containerName="mysql-bootstrap" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.775587 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="d100f321-6fe6-4eb3-a00c-50b9ff5e2861" containerName="mysql-bootstrap" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.775599 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a67e154b-1de7-4e2b-9b87-049ea273fa01" containerName="mariadb-account-delete" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.775607 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="a67e154b-1de7-4e2b-9b87-049ea273fa01" containerName="mariadb-account-delete" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.775621 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0172d8ed-9ef1-4aac-b246-1b1ed0df87fc" containerName="placement-log" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.775629 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="0172d8ed-9ef1-4aac-b246-1b1ed0df87fc" containerName="placement-log" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.775642 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4ddce94-6333-4233-951d-571a761b708f" containerName="barbican-keystone-listener" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.775650 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4ddce94-6333-4233-951d-571a761b708f" containerName="barbican-keystone-listener" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.775662 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5893e6fa-5b64-47e0-b8e1-f68baf27a65c" containerName="proxy-httpd" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.775672 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="5893e6fa-5b64-47e0-b8e1-f68baf27a65c" containerName="proxy-httpd" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.775689 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="container-replicator" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.775699 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="container-replicator" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.775717 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0a5beb3-4401-42b8-b8e3-4d2af995a4d0" containerName="dnsmasq-dns" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.775728 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0a5beb3-4401-42b8-b8e3-4d2af995a4d0" containerName="dnsmasq-dns" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 
10:26:47.775746 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9198258-4919-4ade-88ba-4a0773b32012" containerName="barbican-worker-log" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.775756 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9198258-4919-4ade-88ba-4a0773b32012" containerName="barbican-worker-log" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.775767 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65df60b6-4049-47b6-9907-ebf76c151213" containerName="glance-httpd" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.775775 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="65df60b6-4049-47b6-9907-ebf76c151213" containerName="glance-httpd" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.775786 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4061af7-7669-4bd4-a36c-6ec982e86753" containerName="barbican-api" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.775794 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4061af7-7669-4bd4-a36c-6ec982e86753" containerName="barbican-api" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.775808 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b5836b7-7b16-477f-9a20-f30032362374" containerName="nova-api-log" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.775842 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b5836b7-7b16-477f-9a20-f30032362374" containerName="nova-api-log" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.775858 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="container-auditor" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.775868 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="container-auditor" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.775882 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="rsync" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.775890 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="rsync" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.775903 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9198258-4919-4ade-88ba-4a0773b32012" containerName="barbican-worker" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.775911 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9198258-4919-4ade-88ba-4a0773b32012" containerName="barbican-worker" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.775925 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02c94bee-a522-4ea6-85af-1ba68e174203" containerName="cinder-api-log" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.775933 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="02c94bee-a522-4ea6-85af-1ba68e174203" containerName="cinder-api-log" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.775943 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2adbbd67-ccdf-4444-b667-2b549bc200b5" containerName="kube-state-metrics" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.775952 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="2adbbd67-ccdf-4444-b667-2b549bc200b5" containerName="kube-state-metrics" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 
10:26:47.775969 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0a5beb3-4401-42b8-b8e3-4d2af995a4d0" containerName="init" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.775979 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0a5beb3-4401-42b8-b8e3-4d2af995a4d0" containerName="init" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.775994 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="account-replicator" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776006 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="account-replicator" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.776023 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36fbd6d1-d87d-45a2-9bca-0f25f3daca0c" containerName="ovn-northd" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776033 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="36fbd6d1-d87d-45a2-9bca-0f25f3daca0c" containerName="ovn-northd" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.776045 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41f46a2d-f158-497f-b61b-60f39c64149b" containerName="neutron-api" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776053 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="41f46a2d-f158-497f-b61b-60f39c64149b" containerName="neutron-api" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.776067 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="account-auditor" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776075 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="account-auditor" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.776089 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd8a7609-928f-4a68-9903-fa846e4baeda" containerName="nova-metadata-log" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776097 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd8a7609-928f-4a68-9903-fa846e4baeda" containerName="nova-metadata-log" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.776109 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bf056c0-a496-4499-92c7-3b1300b4a29d" containerName="proxy-httpd" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776117 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bf056c0-a496-4499-92c7-3b1300b4a29d" containerName="proxy-httpd" Dec 09 10:26:47 crc kubenswrapper[5002]: E1209 10:26:47.776128 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58c08274-46ea-48be-a135-0c1174cd6135" containerName="rabbitmq" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776138 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="58c08274-46ea-48be-a135-0c1174cd6135" containerName="rabbitmq" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776338 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="container-updater" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776355 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4061af7-7669-4bd4-a36c-6ec982e86753" containerName="barbican-api" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776371 5002 
memory_manager.go:354] "RemoveStaleState removing state" podUID="36fbd6d1-d87d-45a2-9bca-0f25f3daca0c" containerName="openstack-network-exporter" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776383 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="5893e6fa-5b64-47e0-b8e1-f68baf27a65c" containerName="ceilometer-central-agent" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776399 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4061af7-7669-4bd4-a36c-6ec982e86753" containerName="barbican-api-log" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776418 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="28cb84ad-b399-4fe4-9631-e481dfa75aed" containerName="openstack-network-exporter" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776437 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="65df60b6-4049-47b6-9907-ebf76c151213" containerName="glance-log" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776446 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="container-auditor" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776460 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="02c94bee-a522-4ea6-85af-1ba68e174203" containerName="cinder-api-log" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776474 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="container-replicator" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776488 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="object-updater" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776497 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="f514395b-6067-4e42-98e6-f3c5ac427982" containerName="keystone-api" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776509 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="322c0304-1696-43fb-9225-a709e7e2ea89" containerName="openstack-network-exporter" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776518 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="object-expirer" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776528 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="container-server" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776538 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="object-server" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776553 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae6c00ce-3152-42ae-890f-bb76aac103c5" containerName="mariadb-account-delete" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776566 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="58de676b-7b73-4c04-b5d5-5de38a88072c" containerName="mariadb-account-delete" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776576 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="account-auditor" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776587 5002 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6" containerName="ovsdb-server" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776599 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="account-replicator" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776611 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="rsync" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776623 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6b9775f-22d1-413b-8d2f-1dbe890b582c" containerName="mariadb-account-delete" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776635 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9198258-4919-4ade-88ba-4a0773b32012" containerName="barbican-worker-log" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776643 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bf056c0-a496-4499-92c7-3b1300b4a29d" containerName="proxy-server" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776651 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="account-server" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776665 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="5893e6fa-5b64-47e0-b8e1-f68baf27a65c" containerName="proxy-httpd" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776674 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="54351653-7ebd-40ba-8181-bb1023f18190" containerName="glance-log" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776681 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="02c94bee-a522-4ea6-85af-1ba68e174203" containerName="cinder-api" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776693 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="41f46a2d-f158-497f-b61b-60f39c64149b" containerName="neutron-httpd" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776702 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="f41619d4-24a3-46e4-9cb9-2e388f7cd36b" containerName="mariadb-account-delete" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776717 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="7faabd78-c9ab-4397-aa4d-b8aaff302251" containerName="galera" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776731 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe95257d-a02e-4f04-a543-a2db08231043" containerName="mariadb-account-delete" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776746 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9198258-4919-4ade-88ba-4a0773b32012" containerName="barbican-worker" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776761 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdaeef31-a8f8-478a-86b0-4d0126eb7f3a" containerName="ovn-controller" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776772 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="d100f321-6fe6-4eb3-a00c-50b9ff5e2861" containerName="galera" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776786 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4ddce94-6333-4233-951d-571a761b708f" containerName="barbican-keystone-listener-log" Dec 09 10:26:47 crc 
kubenswrapper[5002]: I1209 10:26:47.776800 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5a9794a-b66f-40d4-9e70-efc6a0a72d83" containerName="nova-cell1-conductor-conductor" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776837 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0a5beb3-4401-42b8-b8e3-4d2af995a4d0" containerName="dnsmasq-dns" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776851 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bf056c0-a496-4499-92c7-3b1300b4a29d" containerName="proxy-httpd" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776867 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="0172d8ed-9ef1-4aac-b246-1b1ed0df87fc" containerName="placement-log" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776877 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="object-replicator" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776887 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="5893e6fa-5b64-47e0-b8e1-f68baf27a65c" containerName="ceilometer-notification-agent" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776897 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="b613f5a4-9369-45ae-8c2c-10e16e639999" containerName="nova-cell1-novncproxy-novncproxy" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776911 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="9278e14e-2524-4e42-b870-f493ea02ede8" containerName="rabbitmq" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776920 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="65df60b6-4049-47b6-9907-ebf76c151213" containerName="glance-httpd" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776933 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e36e954-d9c1-41e3-8542-e8f300db90cb" containerName="memcached" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776947 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="26d7a6fc-ab80-43cf-8ee4-bf9fd88089e6" containerName="ovs-vswitchd" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776957 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="322c0304-1696-43fb-9225-a709e7e2ea89" containerName="ovsdbserver-nb" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776968 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="swift-recon-cron" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776982 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd8a7609-928f-4a68-9903-fa846e4baeda" containerName="nova-metadata-log" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.776996 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="28cb84ad-b399-4fe4-9631-e481dfa75aed" containerName="ovsdbserver-sb" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.777014 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd8a7609-928f-4a68-9903-fa846e4baeda" containerName="nova-metadata-metadata" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.777030 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="58c08274-46ea-48be-a135-0c1174cd6135" containerName="rabbitmq" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.777043 5002 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="object-auditor" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.777058 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="f702a539-ec25-44d4-8629-97b3c5499b96" containerName="cinder-scheduler" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.777076 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="54351653-7ebd-40ba-8181-bb1023f18190" containerName="glance-httpd" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.777091 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="5893e6fa-5b64-47e0-b8e1-f68baf27a65c" containerName="sg-core" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.777105 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b5836b7-7b16-477f-9a20-f30032362374" containerName="nova-api-api" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.777121 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="c44aced5-6d19-429a-8917-cd4229341433" containerName="mariadb-account-delete" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.777135 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="36fbd6d1-d87d-45a2-9bca-0f25f3daca0c" containerName="ovn-northd" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.777150 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="41f46a2d-f158-497f-b61b-60f39c64149b" containerName="neutron-api" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.777164 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ae47b25-e6fd-451f-9827-72ee4e12e526" containerName="nova-cell0-conductor-conductor" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.777187 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="0172d8ed-9ef1-4aac-b246-1b1ed0df87fc" containerName="placement-api" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.777202 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="f702a539-ec25-44d4-8629-97b3c5499b96" containerName="probe" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.777215 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="2adbbd67-ccdf-4444-b667-2b549bc200b5" containerName="kube-state-metrics" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.777228 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="43512a9c-be3a-4c0e-a178-82c5a065acf4" containerName="nova-scheduler-scheduler" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.777243 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="a67e154b-1de7-4e2b-9b87-049ea273fa01" containerName="mariadb-account-delete" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.777254 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b5836b7-7b16-477f-9a20-f30032362374" containerName="nova-api-log" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.777270 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfa166a7-dec2-453d-9cd9-f77d30f1636a" containerName="account-reaper" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.777287 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0f7675b-6614-4e41-86e6-364b7f04664e" containerName="openstack-network-exporter" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.777299 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4ddce94-6333-4233-951d-571a761b708f" 
containerName="barbican-keystone-listener" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.778707 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9sldb" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.788589 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9sldb"] Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.884973 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ab2081e-bd5a-4e2e-89a2-e8849ff66d10-catalog-content\") pod \"certified-operators-9sldb\" (UID: \"8ab2081e-bd5a-4e2e-89a2-e8849ff66d10\") " pod="openshift-marketplace/certified-operators-9sldb" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.885343 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ab2081e-bd5a-4e2e-89a2-e8849ff66d10-utilities\") pod \"certified-operators-9sldb\" (UID: \"8ab2081e-bd5a-4e2e-89a2-e8849ff66d10\") " pod="openshift-marketplace/certified-operators-9sldb" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.885374 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks7fd\" (UniqueName: \"kubernetes.io/projected/8ab2081e-bd5a-4e2e-89a2-e8849ff66d10-kube-api-access-ks7fd\") pod \"certified-operators-9sldb\" (UID: \"8ab2081e-bd5a-4e2e-89a2-e8849ff66d10\") " pod="openshift-marketplace/certified-operators-9sldb" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.986212 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ab2081e-bd5a-4e2e-89a2-e8849ff66d10-catalog-content\") pod \"certified-operators-9sldb\" (UID: \"8ab2081e-bd5a-4e2e-89a2-e8849ff66d10\") " pod="openshift-marketplace/certified-operators-9sldb" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.986273 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ab2081e-bd5a-4e2e-89a2-e8849ff66d10-utilities\") pod \"certified-operators-9sldb\" (UID: \"8ab2081e-bd5a-4e2e-89a2-e8849ff66d10\") " pod="openshift-marketplace/certified-operators-9sldb" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.986297 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks7fd\" (UniqueName: \"kubernetes.io/projected/8ab2081e-bd5a-4e2e-89a2-e8849ff66d10-kube-api-access-ks7fd\") pod \"certified-operators-9sldb\" (UID: \"8ab2081e-bd5a-4e2e-89a2-e8849ff66d10\") " pod="openshift-marketplace/certified-operators-9sldb" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.986793 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ab2081e-bd5a-4e2e-89a2-e8849ff66d10-catalog-content\") pod \"certified-operators-9sldb\" (UID: \"8ab2081e-bd5a-4e2e-89a2-e8849ff66d10\") " pod="openshift-marketplace/certified-operators-9sldb" Dec 09 10:26:47 crc kubenswrapper[5002]: I1209 10:26:47.987129 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ab2081e-bd5a-4e2e-89a2-e8849ff66d10-utilities\") pod \"certified-operators-9sldb\" (UID: 
\"8ab2081e-bd5a-4e2e-89a2-e8849ff66d10\") " pod="openshift-marketplace/certified-operators-9sldb" Dec 09 10:26:48 crc kubenswrapper[5002]: I1209 10:26:48.007729 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks7fd\" (UniqueName: \"kubernetes.io/projected/8ab2081e-bd5a-4e2e-89a2-e8849ff66d10-kube-api-access-ks7fd\") pod \"certified-operators-9sldb\" (UID: \"8ab2081e-bd5a-4e2e-89a2-e8849ff66d10\") " pod="openshift-marketplace/certified-operators-9sldb" Dec 09 10:26:48 crc kubenswrapper[5002]: I1209 10:26:48.107647 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9sldb" Dec 09 10:26:48 crc kubenswrapper[5002]: I1209 10:26:48.607509 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9sldb"] Dec 09 10:26:48 crc kubenswrapper[5002]: W1209 10:26:48.612027 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ab2081e_bd5a_4e2e_89a2_e8849ff66d10.slice/crio-c2dae35913422d469f72bfb60c161cf41b4c3bb8ba4908d57372b786c27cde0e WatchSource:0}: Error finding container c2dae35913422d469f72bfb60c161cf41b4c3bb8ba4908d57372b786c27cde0e: Status 404 returned error can't find the container with id c2dae35913422d469f72bfb60c161cf41b4c3bb8ba4908d57372b786c27cde0e Dec 09 10:26:49 crc kubenswrapper[5002]: I1209 10:26:49.608916 5002 generic.go:334] "Generic (PLEG): container finished" podID="8ab2081e-bd5a-4e2e-89a2-e8849ff66d10" containerID="9923c31ab3259577cb1c924fa67a8f150be5ead9e0b0ecc89dba116c26f22ad2" exitCode=0 Dec 09 10:26:49 crc kubenswrapper[5002]: I1209 10:26:49.609010 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9sldb" event={"ID":"8ab2081e-bd5a-4e2e-89a2-e8849ff66d10","Type":"ContainerDied","Data":"9923c31ab3259577cb1c924fa67a8f150be5ead9e0b0ecc89dba116c26f22ad2"} Dec 09 10:26:49 crc kubenswrapper[5002]: I1209 10:26:49.609321 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9sldb" event={"ID":"8ab2081e-bd5a-4e2e-89a2-e8849ff66d10","Type":"ContainerStarted","Data":"c2dae35913422d469f72bfb60c161cf41b4c3bb8ba4908d57372b786c27cde0e"} Dec 09 10:26:50 crc kubenswrapper[5002]: I1209 10:26:50.623237 5002 generic.go:334] "Generic (PLEG): container finished" podID="8ab2081e-bd5a-4e2e-89a2-e8849ff66d10" containerID="68e14db49a4a1aae02e06574ac7c585a1039a44bc19c9765aea58c70d249afff" exitCode=0 Dec 09 10:26:50 crc kubenswrapper[5002]: I1209 10:26:50.623341 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9sldb" event={"ID":"8ab2081e-bd5a-4e2e-89a2-e8849ff66d10","Type":"ContainerDied","Data":"68e14db49a4a1aae02e06574ac7c585a1039a44bc19c9765aea58c70d249afff"} Dec 09 10:26:51 crc kubenswrapper[5002]: I1209 10:26:51.637297 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9sldb" event={"ID":"8ab2081e-bd5a-4e2e-89a2-e8849ff66d10","Type":"ContainerStarted","Data":"de64f3b7ba3d00ed67697babc4826ee8a99bee041e8d38342d414aafc793a0de"} Dec 09 10:26:51 crc kubenswrapper[5002]: I1209 10:26:51.664910 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9sldb" podStartSLOduration=2.935852359 podStartE2EDuration="4.664886854s" podCreationTimestamp="2025-12-09 10:26:47 +0000 UTC" firstStartedPulling="2025-12-09 
10:26:49.61052272 +0000 UTC m=+1542.002573801" lastFinishedPulling="2025-12-09 10:26:51.339557215 +0000 UTC m=+1543.731608296" observedRunningTime="2025-12-09 10:26:51.658981385 +0000 UTC m=+1544.051032476" watchObservedRunningTime="2025-12-09 10:26:51.664886854 +0000 UTC m=+1544.056937935" Dec 09 10:26:57 crc kubenswrapper[5002]: I1209 10:26:57.555497 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kdsdd"] Dec 09 10:26:57 crc kubenswrapper[5002]: I1209 10:26:57.558202 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kdsdd" Dec 09 10:26:57 crc kubenswrapper[5002]: I1209 10:26:57.570487 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kdsdd"] Dec 09 10:26:57 crc kubenswrapper[5002]: I1209 10:26:57.627322 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89nlb\" (UniqueName: \"kubernetes.io/projected/36e69eb4-a603-4a81-b85d-d5e76ec9eaa8-kube-api-access-89nlb\") pod \"community-operators-kdsdd\" (UID: \"36e69eb4-a603-4a81-b85d-d5e76ec9eaa8\") " pod="openshift-marketplace/community-operators-kdsdd" Dec 09 10:26:57 crc kubenswrapper[5002]: I1209 10:26:57.627439 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36e69eb4-a603-4a81-b85d-d5e76ec9eaa8-catalog-content\") pod \"community-operators-kdsdd\" (UID: \"36e69eb4-a603-4a81-b85d-d5e76ec9eaa8\") " pod="openshift-marketplace/community-operators-kdsdd" Dec 09 10:26:57 crc kubenswrapper[5002]: I1209 10:26:57.627510 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36e69eb4-a603-4a81-b85d-d5e76ec9eaa8-utilities\") pod \"community-operators-kdsdd\" (UID: \"36e69eb4-a603-4a81-b85d-d5e76ec9eaa8\") " pod="openshift-marketplace/community-operators-kdsdd" Dec 09 10:26:57 crc kubenswrapper[5002]: I1209 10:26:57.729206 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89nlb\" (UniqueName: \"kubernetes.io/projected/36e69eb4-a603-4a81-b85d-d5e76ec9eaa8-kube-api-access-89nlb\") pod \"community-operators-kdsdd\" (UID: \"36e69eb4-a603-4a81-b85d-d5e76ec9eaa8\") " pod="openshift-marketplace/community-operators-kdsdd" Dec 09 10:26:57 crc kubenswrapper[5002]: I1209 10:26:57.729311 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36e69eb4-a603-4a81-b85d-d5e76ec9eaa8-catalog-content\") pod \"community-operators-kdsdd\" (UID: \"36e69eb4-a603-4a81-b85d-d5e76ec9eaa8\") " pod="openshift-marketplace/community-operators-kdsdd" Dec 09 10:26:57 crc kubenswrapper[5002]: I1209 10:26:57.729369 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36e69eb4-a603-4a81-b85d-d5e76ec9eaa8-utilities\") pod \"community-operators-kdsdd\" (UID: \"36e69eb4-a603-4a81-b85d-d5e76ec9eaa8\") " pod="openshift-marketplace/community-operators-kdsdd" Dec 09 10:26:57 crc kubenswrapper[5002]: I1209 10:26:57.729961 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36e69eb4-a603-4a81-b85d-d5e76ec9eaa8-utilities\") pod \"community-operators-kdsdd\" (UID: 
\"36e69eb4-a603-4a81-b85d-d5e76ec9eaa8\") " pod="openshift-marketplace/community-operators-kdsdd" Dec 09 10:26:57 crc kubenswrapper[5002]: I1209 10:26:57.730019 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36e69eb4-a603-4a81-b85d-d5e76ec9eaa8-catalog-content\") pod \"community-operators-kdsdd\" (UID: \"36e69eb4-a603-4a81-b85d-d5e76ec9eaa8\") " pod="openshift-marketplace/community-operators-kdsdd" Dec 09 10:26:57 crc kubenswrapper[5002]: I1209 10:26:57.758891 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89nlb\" (UniqueName: \"kubernetes.io/projected/36e69eb4-a603-4a81-b85d-d5e76ec9eaa8-kube-api-access-89nlb\") pod \"community-operators-kdsdd\" (UID: \"36e69eb4-a603-4a81-b85d-d5e76ec9eaa8\") " pod="openshift-marketplace/community-operators-kdsdd" Dec 09 10:26:57 crc kubenswrapper[5002]: I1209 10:26:57.887328 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kdsdd" Dec 09 10:26:58 crc kubenswrapper[5002]: I1209 10:26:58.112062 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9sldb" Dec 09 10:26:58 crc kubenswrapper[5002]: I1209 10:26:58.112122 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9sldb" Dec 09 10:26:58 crc kubenswrapper[5002]: I1209 10:26:58.168747 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9sldb" Dec 09 10:26:58 crc kubenswrapper[5002]: I1209 10:26:58.429355 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kdsdd"] Dec 09 10:26:58 crc kubenswrapper[5002]: I1209 10:26:58.694844 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kdsdd" event={"ID":"36e69eb4-a603-4a81-b85d-d5e76ec9eaa8","Type":"ContainerStarted","Data":"42afe7439160e3292923f5d67d098f2d7d92d01bd9e2ac0c06d4dd6ca0789fb4"} Dec 09 10:26:58 crc kubenswrapper[5002]: I1209 10:26:58.750389 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9sldb" Dec 09 10:26:59 crc kubenswrapper[5002]: I1209 10:26:59.721008 5002 generic.go:334] "Generic (PLEG): container finished" podID="36e69eb4-a603-4a81-b85d-d5e76ec9eaa8" containerID="f9feb7981e4a132e516a00f68864af15408cab190345076cd66e4d538c09514c" exitCode=0 Dec 09 10:26:59 crc kubenswrapper[5002]: I1209 10:26:59.721144 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kdsdd" event={"ID":"36e69eb4-a603-4a81-b85d-d5e76ec9eaa8","Type":"ContainerDied","Data":"f9feb7981e4a132e516a00f68864af15408cab190345076cd66e4d538c09514c"} Dec 09 10:27:00 crc kubenswrapper[5002]: I1209 10:27:00.534634 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9sldb"] Dec 09 10:27:00 crc kubenswrapper[5002]: I1209 10:27:00.731363 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kdsdd" event={"ID":"36e69eb4-a603-4a81-b85d-d5e76ec9eaa8","Type":"ContainerStarted","Data":"600185575ee1f640ca6600893ee02966a8235e4a9fe743a0e95d8af31547be8b"} Dec 09 10:27:00 crc kubenswrapper[5002]: I1209 10:27:00.731549 5002 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-9sldb" podUID="8ab2081e-bd5a-4e2e-89a2-e8849ff66d10" containerName="registry-server" containerID="cri-o://de64f3b7ba3d00ed67697babc4826ee8a99bee041e8d38342d414aafc793a0de" gracePeriod=2 Dec 09 10:27:01 crc kubenswrapper[5002]: I1209 10:27:01.644742 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9sldb" Dec 09 10:27:01 crc kubenswrapper[5002]: I1209 10:27:01.684376 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ab2081e-bd5a-4e2e-89a2-e8849ff66d10-catalog-content\") pod \"8ab2081e-bd5a-4e2e-89a2-e8849ff66d10\" (UID: \"8ab2081e-bd5a-4e2e-89a2-e8849ff66d10\") " Dec 09 10:27:01 crc kubenswrapper[5002]: I1209 10:27:01.684454 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ab2081e-bd5a-4e2e-89a2-e8849ff66d10-utilities\") pod \"8ab2081e-bd5a-4e2e-89a2-e8849ff66d10\" (UID: \"8ab2081e-bd5a-4e2e-89a2-e8849ff66d10\") " Dec 09 10:27:01 crc kubenswrapper[5002]: I1209 10:27:01.684569 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ks7fd\" (UniqueName: \"kubernetes.io/projected/8ab2081e-bd5a-4e2e-89a2-e8849ff66d10-kube-api-access-ks7fd\") pod \"8ab2081e-bd5a-4e2e-89a2-e8849ff66d10\" (UID: \"8ab2081e-bd5a-4e2e-89a2-e8849ff66d10\") " Dec 09 10:27:01 crc kubenswrapper[5002]: I1209 10:27:01.685530 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ab2081e-bd5a-4e2e-89a2-e8849ff66d10-utilities" (OuterVolumeSpecName: "utilities") pod "8ab2081e-bd5a-4e2e-89a2-e8849ff66d10" (UID: "8ab2081e-bd5a-4e2e-89a2-e8849ff66d10"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:27:01 crc kubenswrapper[5002]: I1209 10:27:01.689960 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ab2081e-bd5a-4e2e-89a2-e8849ff66d10-kube-api-access-ks7fd" (OuterVolumeSpecName: "kube-api-access-ks7fd") pod "8ab2081e-bd5a-4e2e-89a2-e8849ff66d10" (UID: "8ab2081e-bd5a-4e2e-89a2-e8849ff66d10"). InnerVolumeSpecName "kube-api-access-ks7fd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:27:01 crc kubenswrapper[5002]: I1209 10:27:01.735509 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ab2081e-bd5a-4e2e-89a2-e8849ff66d10-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8ab2081e-bd5a-4e2e-89a2-e8849ff66d10" (UID: "8ab2081e-bd5a-4e2e-89a2-e8849ff66d10"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:27:01 crc kubenswrapper[5002]: I1209 10:27:01.741160 5002 generic.go:334] "Generic (PLEG): container finished" podID="8ab2081e-bd5a-4e2e-89a2-e8849ff66d10" containerID="de64f3b7ba3d00ed67697babc4826ee8a99bee041e8d38342d414aafc793a0de" exitCode=0 Dec 09 10:27:01 crc kubenswrapper[5002]: I1209 10:27:01.741212 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9sldb" Dec 09 10:27:01 crc kubenswrapper[5002]: I1209 10:27:01.741239 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9sldb" event={"ID":"8ab2081e-bd5a-4e2e-89a2-e8849ff66d10","Type":"ContainerDied","Data":"de64f3b7ba3d00ed67697babc4826ee8a99bee041e8d38342d414aafc793a0de"} Dec 09 10:27:01 crc kubenswrapper[5002]: I1209 10:27:01.741278 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9sldb" event={"ID":"8ab2081e-bd5a-4e2e-89a2-e8849ff66d10","Type":"ContainerDied","Data":"c2dae35913422d469f72bfb60c161cf41b4c3bb8ba4908d57372b786c27cde0e"} Dec 09 10:27:01 crc kubenswrapper[5002]: I1209 10:27:01.741299 5002 scope.go:117] "RemoveContainer" containerID="de64f3b7ba3d00ed67697babc4826ee8a99bee041e8d38342d414aafc793a0de" Dec 09 10:27:01 crc kubenswrapper[5002]: I1209 10:27:01.746434 5002 generic.go:334] "Generic (PLEG): container finished" podID="36e69eb4-a603-4a81-b85d-d5e76ec9eaa8" containerID="600185575ee1f640ca6600893ee02966a8235e4a9fe743a0e95d8af31547be8b" exitCode=0 Dec 09 10:27:01 crc kubenswrapper[5002]: I1209 10:27:01.746470 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kdsdd" event={"ID":"36e69eb4-a603-4a81-b85d-d5e76ec9eaa8","Type":"ContainerDied","Data":"600185575ee1f640ca6600893ee02966a8235e4a9fe743a0e95d8af31547be8b"} Dec 09 10:27:01 crc kubenswrapper[5002]: I1209 10:27:01.771968 5002 scope.go:117] "RemoveContainer" containerID="68e14db49a4a1aae02e06574ac7c585a1039a44bc19c9765aea58c70d249afff" Dec 09 10:27:01 crc kubenswrapper[5002]: I1209 10:27:01.786099 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ab2081e-bd5a-4e2e-89a2-e8849ff66d10-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 10:27:01 crc kubenswrapper[5002]: I1209 10:27:01.786301 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ab2081e-bd5a-4e2e-89a2-e8849ff66d10-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 10:27:01 crc kubenswrapper[5002]: I1209 10:27:01.786362 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ks7fd\" (UniqueName: \"kubernetes.io/projected/8ab2081e-bd5a-4e2e-89a2-e8849ff66d10-kube-api-access-ks7fd\") on node \"crc\" DevicePath \"\"" Dec 09 10:27:01 crc kubenswrapper[5002]: I1209 10:27:01.788526 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9sldb"] Dec 09 10:27:01 crc kubenswrapper[5002]: I1209 10:27:01.793331 5002 scope.go:117] "RemoveContainer" containerID="9923c31ab3259577cb1c924fa67a8f150be5ead9e0b0ecc89dba116c26f22ad2" Dec 09 10:27:01 crc kubenswrapper[5002]: I1209 10:27:01.793865 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9sldb"] Dec 09 10:27:01 crc kubenswrapper[5002]: I1209 10:27:01.837566 5002 scope.go:117] "RemoveContainer" containerID="de64f3b7ba3d00ed67697babc4826ee8a99bee041e8d38342d414aafc793a0de" Dec 09 10:27:01 crc kubenswrapper[5002]: E1209 10:27:01.838106 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de64f3b7ba3d00ed67697babc4826ee8a99bee041e8d38342d414aafc793a0de\": container with ID starting with de64f3b7ba3d00ed67697babc4826ee8a99bee041e8d38342d414aafc793a0de not found: ID does not 
exist" containerID="de64f3b7ba3d00ed67697babc4826ee8a99bee041e8d38342d414aafc793a0de" Dec 09 10:27:01 crc kubenswrapper[5002]: I1209 10:27:01.838135 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de64f3b7ba3d00ed67697babc4826ee8a99bee041e8d38342d414aafc793a0de"} err="failed to get container status \"de64f3b7ba3d00ed67697babc4826ee8a99bee041e8d38342d414aafc793a0de\": rpc error: code = NotFound desc = could not find container \"de64f3b7ba3d00ed67697babc4826ee8a99bee041e8d38342d414aafc793a0de\": container with ID starting with de64f3b7ba3d00ed67697babc4826ee8a99bee041e8d38342d414aafc793a0de not found: ID does not exist" Dec 09 10:27:01 crc kubenswrapper[5002]: I1209 10:27:01.838154 5002 scope.go:117] "RemoveContainer" containerID="68e14db49a4a1aae02e06574ac7c585a1039a44bc19c9765aea58c70d249afff" Dec 09 10:27:01 crc kubenswrapper[5002]: E1209 10:27:01.838655 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68e14db49a4a1aae02e06574ac7c585a1039a44bc19c9765aea58c70d249afff\": container with ID starting with 68e14db49a4a1aae02e06574ac7c585a1039a44bc19c9765aea58c70d249afff not found: ID does not exist" containerID="68e14db49a4a1aae02e06574ac7c585a1039a44bc19c9765aea58c70d249afff" Dec 09 10:27:01 crc kubenswrapper[5002]: I1209 10:27:01.838704 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68e14db49a4a1aae02e06574ac7c585a1039a44bc19c9765aea58c70d249afff"} err="failed to get container status \"68e14db49a4a1aae02e06574ac7c585a1039a44bc19c9765aea58c70d249afff\": rpc error: code = NotFound desc = could not find container \"68e14db49a4a1aae02e06574ac7c585a1039a44bc19c9765aea58c70d249afff\": container with ID starting with 68e14db49a4a1aae02e06574ac7c585a1039a44bc19c9765aea58c70d249afff not found: ID does not exist" Dec 09 10:27:01 crc kubenswrapper[5002]: I1209 10:27:01.838736 5002 scope.go:117] "RemoveContainer" containerID="9923c31ab3259577cb1c924fa67a8f150be5ead9e0b0ecc89dba116c26f22ad2" Dec 09 10:27:01 crc kubenswrapper[5002]: E1209 10:27:01.839101 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9923c31ab3259577cb1c924fa67a8f150be5ead9e0b0ecc89dba116c26f22ad2\": container with ID starting with 9923c31ab3259577cb1c924fa67a8f150be5ead9e0b0ecc89dba116c26f22ad2 not found: ID does not exist" containerID="9923c31ab3259577cb1c924fa67a8f150be5ead9e0b0ecc89dba116c26f22ad2" Dec 09 10:27:01 crc kubenswrapper[5002]: I1209 10:27:01.839131 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9923c31ab3259577cb1c924fa67a8f150be5ead9e0b0ecc89dba116c26f22ad2"} err="failed to get container status \"9923c31ab3259577cb1c924fa67a8f150be5ead9e0b0ecc89dba116c26f22ad2\": rpc error: code = NotFound desc = could not find container \"9923c31ab3259577cb1c924fa67a8f150be5ead9e0b0ecc89dba116c26f22ad2\": container with ID starting with 9923c31ab3259577cb1c924fa67a8f150be5ead9e0b0ecc89dba116c26f22ad2 not found: ID does not exist" Dec 09 10:27:02 crc kubenswrapper[5002]: I1209 10:27:02.077176 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ab2081e-bd5a-4e2e-89a2-e8849ff66d10" path="/var/lib/kubelet/pods/8ab2081e-bd5a-4e2e-89a2-e8849ff66d10/volumes" Dec 09 10:27:03 crc kubenswrapper[5002]: I1209 10:27:03.778181 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-kdsdd" event={"ID":"36e69eb4-a603-4a81-b85d-d5e76ec9eaa8","Type":"ContainerStarted","Data":"77f18a4f56b562baf3622161f022982c41db1c07226bf11595a819db66b832b2"} Dec 09 10:27:03 crc kubenswrapper[5002]: I1209 10:27:03.802547 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kdsdd" podStartSLOduration=3.639097511 podStartE2EDuration="6.802529406s" podCreationTimestamp="2025-12-09 10:26:57 +0000 UTC" firstStartedPulling="2025-12-09 10:26:59.723342299 +0000 UTC m=+1552.115393380" lastFinishedPulling="2025-12-09 10:27:02.886774154 +0000 UTC m=+1555.278825275" observedRunningTime="2025-12-09 10:27:03.800757268 +0000 UTC m=+1556.192808369" watchObservedRunningTime="2025-12-09 10:27:03.802529406 +0000 UTC m=+1556.194580487" Dec 09 10:27:07 crc kubenswrapper[5002]: I1209 10:27:07.888463 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kdsdd" Dec 09 10:27:07 crc kubenswrapper[5002]: I1209 10:27:07.888849 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kdsdd" Dec 09 10:27:07 crc kubenswrapper[5002]: I1209 10:27:07.940617 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kdsdd" Dec 09 10:27:08 crc kubenswrapper[5002]: I1209 10:27:08.876540 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kdsdd" Dec 09 10:27:08 crc kubenswrapper[5002]: I1209 10:27:08.919650 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kdsdd"] Dec 09 10:27:10 crc kubenswrapper[5002]: I1209 10:27:10.835414 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kdsdd" podUID="36e69eb4-a603-4a81-b85d-d5e76ec9eaa8" containerName="registry-server" containerID="cri-o://77f18a4f56b562baf3622161f022982c41db1c07226bf11595a819db66b832b2" gracePeriod=2 Dec 09 10:27:11 crc kubenswrapper[5002]: I1209 10:27:11.848151 5002 generic.go:334] "Generic (PLEG): container finished" podID="36e69eb4-a603-4a81-b85d-d5e76ec9eaa8" containerID="77f18a4f56b562baf3622161f022982c41db1c07226bf11595a819db66b832b2" exitCode=0 Dec 09 10:27:11 crc kubenswrapper[5002]: I1209 10:27:11.848220 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kdsdd" event={"ID":"36e69eb4-a603-4a81-b85d-d5e76ec9eaa8","Type":"ContainerDied","Data":"77f18a4f56b562baf3622161f022982c41db1c07226bf11595a819db66b832b2"} Dec 09 10:27:12 crc kubenswrapper[5002]: I1209 10:27:12.396431 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kdsdd" Dec 09 10:27:12 crc kubenswrapper[5002]: I1209 10:27:12.433287 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36e69eb4-a603-4a81-b85d-d5e76ec9eaa8-catalog-content\") pod \"36e69eb4-a603-4a81-b85d-d5e76ec9eaa8\" (UID: \"36e69eb4-a603-4a81-b85d-d5e76ec9eaa8\") " Dec 09 10:27:12 crc kubenswrapper[5002]: I1209 10:27:12.433432 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89nlb\" (UniqueName: \"kubernetes.io/projected/36e69eb4-a603-4a81-b85d-d5e76ec9eaa8-kube-api-access-89nlb\") pod \"36e69eb4-a603-4a81-b85d-d5e76ec9eaa8\" (UID: \"36e69eb4-a603-4a81-b85d-d5e76ec9eaa8\") " Dec 09 10:27:12 crc kubenswrapper[5002]: I1209 10:27:12.434509 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36e69eb4-a603-4a81-b85d-d5e76ec9eaa8-utilities\") pod \"36e69eb4-a603-4a81-b85d-d5e76ec9eaa8\" (UID: \"36e69eb4-a603-4a81-b85d-d5e76ec9eaa8\") " Dec 09 10:27:12 crc kubenswrapper[5002]: I1209 10:27:12.435569 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36e69eb4-a603-4a81-b85d-d5e76ec9eaa8-utilities" (OuterVolumeSpecName: "utilities") pod "36e69eb4-a603-4a81-b85d-d5e76ec9eaa8" (UID: "36e69eb4-a603-4a81-b85d-d5e76ec9eaa8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:27:12 crc kubenswrapper[5002]: I1209 10:27:12.439291 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36e69eb4-a603-4a81-b85d-d5e76ec9eaa8-kube-api-access-89nlb" (OuterVolumeSpecName: "kube-api-access-89nlb") pod "36e69eb4-a603-4a81-b85d-d5e76ec9eaa8" (UID: "36e69eb4-a603-4a81-b85d-d5e76ec9eaa8"). InnerVolumeSpecName "kube-api-access-89nlb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:27:12 crc kubenswrapper[5002]: I1209 10:27:12.481437 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36e69eb4-a603-4a81-b85d-d5e76ec9eaa8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "36e69eb4-a603-4a81-b85d-d5e76ec9eaa8" (UID: "36e69eb4-a603-4a81-b85d-d5e76ec9eaa8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:27:12 crc kubenswrapper[5002]: I1209 10:27:12.536581 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36e69eb4-a603-4a81-b85d-d5e76ec9eaa8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 10:27:12 crc kubenswrapper[5002]: I1209 10:27:12.536623 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89nlb\" (UniqueName: \"kubernetes.io/projected/36e69eb4-a603-4a81-b85d-d5e76ec9eaa8-kube-api-access-89nlb\") on node \"crc\" DevicePath \"\"" Dec 09 10:27:12 crc kubenswrapper[5002]: I1209 10:27:12.536636 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36e69eb4-a603-4a81-b85d-d5e76ec9eaa8-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 10:27:12 crc kubenswrapper[5002]: I1209 10:27:12.859758 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kdsdd" event={"ID":"36e69eb4-a603-4a81-b85d-d5e76ec9eaa8","Type":"ContainerDied","Data":"42afe7439160e3292923f5d67d098f2d7d92d01bd9e2ac0c06d4dd6ca0789fb4"} Dec 09 10:27:12 crc kubenswrapper[5002]: I1209 10:27:12.859835 5002 scope.go:117] "RemoveContainer" containerID="77f18a4f56b562baf3622161f022982c41db1c07226bf11595a819db66b832b2" Dec 09 10:27:12 crc kubenswrapper[5002]: I1209 10:27:12.859878 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kdsdd" Dec 09 10:27:12 crc kubenswrapper[5002]: I1209 10:27:12.902301 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kdsdd"] Dec 09 10:27:12 crc kubenswrapper[5002]: I1209 10:27:12.904488 5002 scope.go:117] "RemoveContainer" containerID="600185575ee1f640ca6600893ee02966a8235e4a9fe743a0e95d8af31547be8b" Dec 09 10:27:12 crc kubenswrapper[5002]: I1209 10:27:12.908582 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kdsdd"] Dec 09 10:27:12 crc kubenswrapper[5002]: I1209 10:27:12.932372 5002 scope.go:117] "RemoveContainer" containerID="f9feb7981e4a132e516a00f68864af15408cab190345076cd66e4d538c09514c" Dec 09 10:27:14 crc kubenswrapper[5002]: I1209 10:27:14.069406 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36e69eb4-a603-4a81-b85d-d5e76ec9eaa8" path="/var/lib/kubelet/pods/36e69eb4-a603-4a81-b85d-d5e76ec9eaa8/volumes" Dec 09 10:27:42 crc kubenswrapper[5002]: I1209 10:27:42.747412 5002 scope.go:117] "RemoveContainer" containerID="b04ded4e2cd792d948ee1a8ccbcfbfc4efad912a0818fcb71735aedb71fee7dc" Dec 09 10:27:42 crc kubenswrapper[5002]: I1209 10:27:42.941720 5002 scope.go:117] "RemoveContainer" containerID="3e5dbfb4b382b7b6d2a3eddf1c13961cc5734c5034d5bf7a485b13ed5c7407e1" Dec 09 10:27:42 crc kubenswrapper[5002]: I1209 10:27:42.982732 5002 scope.go:117] "RemoveContainer" containerID="b11f0e175cf62d7332d70370bdf186828c934bc8e83952200f4524367cd0c303" Dec 09 10:27:43 crc kubenswrapper[5002]: I1209 10:27:43.020520 5002 scope.go:117] "RemoveContainer" containerID="c57d634a0d27647736fc38f55ce26634bc1bc214a5344d82a1f49282d5bfad44" Dec 09 10:27:43 crc kubenswrapper[5002]: I1209 10:27:43.053760 5002 scope.go:117] "RemoveContainer" containerID="75a1a6ebfe287494a07162022a2dce2dfa8561b0452360830b376abd08ad6489" Dec 09 10:27:43 crc kubenswrapper[5002]: I1209 10:27:43.090527 5002 scope.go:117] "RemoveContainer" 
containerID="51397172bc6109b173eaa3e4fefc729d3fc17a5efc791a6027086db757378d60" Dec 09 10:27:43 crc kubenswrapper[5002]: I1209 10:27:43.125966 5002 scope.go:117] "RemoveContainer" containerID="27d444bfbd27fdffdca54947d3c047948faf97cbca7f3940b0b42fe2b1c91d86" Dec 09 10:27:43 crc kubenswrapper[5002]: I1209 10:27:43.152744 5002 scope.go:117] "RemoveContainer" containerID="62986b9fc31c4835230b3d39f29d40e56e1c5bc39d5bdb95ff2621f76b756f46" Dec 09 10:27:43 crc kubenswrapper[5002]: I1209 10:27:43.178473 5002 scope.go:117] "RemoveContainer" containerID="e7cb3a713e4627a3bf966f071a06f11e0cf188dfd353e6767a883425d62ba995" Dec 09 10:27:43 crc kubenswrapper[5002]: I1209 10:27:43.209340 5002 scope.go:117] "RemoveContainer" containerID="ab9110ac35a8f003734d1fb5eccd9029fa547e73a8c6c4310717cc3e133cd2b9" Dec 09 10:27:43 crc kubenswrapper[5002]: I1209 10:27:43.225493 5002 scope.go:117] "RemoveContainer" containerID="91af6e127a8ca7009fb6fa302ad55f99230b8bcca01665a6d992e98fca89cf49" Dec 09 10:27:43 crc kubenswrapper[5002]: I1209 10:27:43.242352 5002 scope.go:117] "RemoveContainer" containerID="b33f5e5fc465757e0c888b2e806d84dfbc4b581048d50595ef724d4fadc97425" Dec 09 10:27:43 crc kubenswrapper[5002]: I1209 10:27:43.262023 5002 scope.go:117] "RemoveContainer" containerID="20ae5f344909f1778bd970bb441b4e8f054eb7da8d7d4f81a65ab39091b3eb8e" Dec 09 10:28:07 crc kubenswrapper[5002]: I1209 10:28:07.965211 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 10:28:07 crc kubenswrapper[5002]: I1209 10:28:07.966021 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 10:28:37 crc kubenswrapper[5002]: I1209 10:28:37.964368 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 10:28:37 crc kubenswrapper[5002]: I1209 10:28:37.965032 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 10:28:43 crc kubenswrapper[5002]: I1209 10:28:43.477294 5002 scope.go:117] "RemoveContainer" containerID="80fb2d8343721af50993af7f62d309f89b78d2c32763dd86d1a7cc7005ef0630" Dec 09 10:28:43 crc kubenswrapper[5002]: I1209 10:28:43.503485 5002 scope.go:117] "RemoveContainer" containerID="fe070c5525f3715e03b76a0a5d8f0d0b37ad9652906b4ae316a2dffee62d2026" Dec 09 10:28:43 crc kubenswrapper[5002]: I1209 10:28:43.555085 5002 scope.go:117] "RemoveContainer" containerID="0dd06cfd3c38da8c60bac35260c461aa9a32defee6ab2c78a1bf7739889b67d1" Dec 09 10:28:43 crc kubenswrapper[5002]: I1209 10:28:43.608826 5002 scope.go:117] "RemoveContainer" containerID="35dbaf5da172afd498a3a3aa82dd037ea9d3b17dd4326cebfa8d0dd4cd5c6087" Dec 09 
10:28:43 crc kubenswrapper[5002]: I1209 10:28:43.649973 5002 scope.go:117] "RemoveContainer" containerID="d7e7a1030d81d816b82f4b7af798ec8f997a497e96c6a80bd2f695c81a769f47" Dec 09 10:28:43 crc kubenswrapper[5002]: I1209 10:28:43.677415 5002 scope.go:117] "RemoveContainer" containerID="132aa42ef57343a9c295da67543e7ab3de22debb6d40a328a3c23a77cc245928" Dec 09 10:28:43 crc kubenswrapper[5002]: I1209 10:28:43.707467 5002 scope.go:117] "RemoveContainer" containerID="b258185414bf070e956ceb655c2f9a8dbe7006c691135bd2c0d10ae21cf772c6" Dec 09 10:29:07 crc kubenswrapper[5002]: I1209 10:29:07.964550 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 10:29:07 crc kubenswrapper[5002]: I1209 10:29:07.965459 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 10:29:07 crc kubenswrapper[5002]: I1209 10:29:07.965552 5002 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" Dec 09 10:29:07 crc kubenswrapper[5002]: I1209 10:29:07.966742 5002 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b603401504c120d86f4291f9eecacafd874b783a1acad3cc5e6c3c01f22fd43e"} pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 10:29:07 crc kubenswrapper[5002]: I1209 10:29:07.966900 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" containerID="cri-o://b603401504c120d86f4291f9eecacafd874b783a1acad3cc5e6c3c01f22fd43e" gracePeriod=600 Dec 09 10:29:08 crc kubenswrapper[5002]: I1209 10:29:08.897006 5002 generic.go:334] "Generic (PLEG): container finished" podID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerID="b603401504c120d86f4291f9eecacafd874b783a1acad3cc5e6c3c01f22fd43e" exitCode=0 Dec 09 10:29:08 crc kubenswrapper[5002]: I1209 10:29:08.897098 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerDied","Data":"b603401504c120d86f4291f9eecacafd874b783a1acad3cc5e6c3c01f22fd43e"} Dec 09 10:29:08 crc kubenswrapper[5002]: I1209 10:29:08.897523 5002 scope.go:117] "RemoveContainer" containerID="8882b3e4cc037c99de652a814b6e830546393f19945b2204e6e01c0052e460f5" Dec 09 10:29:09 crc kubenswrapper[5002]: E1209 10:29:09.242931 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" 
podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 10:29:09 crc kubenswrapper[5002]: I1209 10:29:09.910615 5002 scope.go:117] "RemoveContainer" containerID="b603401504c120d86f4291f9eecacafd874b783a1acad3cc5e6c3c01f22fd43e" Dec 09 10:29:09 crc kubenswrapper[5002]: E1209 10:29:09.910971 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 10:29:23 crc kubenswrapper[5002]: I1209 10:29:23.059985 5002 scope.go:117] "RemoveContainer" containerID="b603401504c120d86f4291f9eecacafd874b783a1acad3cc5e6c3c01f22fd43e" Dec 09 10:29:23 crc kubenswrapper[5002]: E1209 10:29:23.060688 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 10:29:38 crc kubenswrapper[5002]: I1209 10:29:38.064376 5002 scope.go:117] "RemoveContainer" containerID="b603401504c120d86f4291f9eecacafd874b783a1acad3cc5e6c3c01f22fd43e" Dec 09 10:29:38 crc kubenswrapper[5002]: E1209 10:29:38.066801 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 10:29:43 crc kubenswrapper[5002]: I1209 10:29:43.856573 5002 scope.go:117] "RemoveContainer" containerID="a481319e648432bf60e9db982024715b443dfc82b14f612845eaada599692612" Dec 09 10:29:43 crc kubenswrapper[5002]: I1209 10:29:43.909223 5002 scope.go:117] "RemoveContainer" containerID="53b30c4b17869586d3e315fd81ac0d1c658ddca2fa36d24ba53232a45f2431eb" Dec 09 10:29:43 crc kubenswrapper[5002]: I1209 10:29:43.948916 5002 scope.go:117] "RemoveContainer" containerID="1c44f0683b18f3fea9dd649810fa726f3f7e8a9a0fc2da1309f38beff1d26e5b" Dec 09 10:29:53 crc kubenswrapper[5002]: I1209 10:29:53.060171 5002 scope.go:117] "RemoveContainer" containerID="b603401504c120d86f4291f9eecacafd874b783a1acad3cc5e6c3c01f22fd43e" Dec 09 10:29:53 crc kubenswrapper[5002]: E1209 10:29:53.061076 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 10:30:00 crc kubenswrapper[5002]: I1209 10:30:00.162732 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421270-m9c9f"] Dec 09 10:30:00 crc kubenswrapper[5002]: E1209 10:30:00.164052 5002 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="8ab2081e-bd5a-4e2e-89a2-e8849ff66d10" containerName="registry-server" Dec 09 10:30:00 crc kubenswrapper[5002]: I1209 10:30:00.164087 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ab2081e-bd5a-4e2e-89a2-e8849ff66d10" containerName="registry-server" Dec 09 10:30:00 crc kubenswrapper[5002]: E1209 10:30:00.164110 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ab2081e-bd5a-4e2e-89a2-e8849ff66d10" containerName="extract-utilities" Dec 09 10:30:00 crc kubenswrapper[5002]: I1209 10:30:00.164128 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ab2081e-bd5a-4e2e-89a2-e8849ff66d10" containerName="extract-utilities" Dec 09 10:30:00 crc kubenswrapper[5002]: E1209 10:30:00.164187 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36e69eb4-a603-4a81-b85d-d5e76ec9eaa8" containerName="extract-content" Dec 09 10:30:00 crc kubenswrapper[5002]: I1209 10:30:00.164205 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="36e69eb4-a603-4a81-b85d-d5e76ec9eaa8" containerName="extract-content" Dec 09 10:30:00 crc kubenswrapper[5002]: E1209 10:30:00.164235 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36e69eb4-a603-4a81-b85d-d5e76ec9eaa8" containerName="registry-server" Dec 09 10:30:00 crc kubenswrapper[5002]: I1209 10:30:00.164251 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="36e69eb4-a603-4a81-b85d-d5e76ec9eaa8" containerName="registry-server" Dec 09 10:30:00 crc kubenswrapper[5002]: E1209 10:30:00.164304 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ab2081e-bd5a-4e2e-89a2-e8849ff66d10" containerName="extract-content" Dec 09 10:30:00 crc kubenswrapper[5002]: I1209 10:30:00.164320 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ab2081e-bd5a-4e2e-89a2-e8849ff66d10" containerName="extract-content" Dec 09 10:30:00 crc kubenswrapper[5002]: E1209 10:30:00.164351 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36e69eb4-a603-4a81-b85d-d5e76ec9eaa8" containerName="extract-utilities" Dec 09 10:30:00 crc kubenswrapper[5002]: I1209 10:30:00.164369 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="36e69eb4-a603-4a81-b85d-d5e76ec9eaa8" containerName="extract-utilities" Dec 09 10:30:00 crc kubenswrapper[5002]: I1209 10:30:00.164639 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ab2081e-bd5a-4e2e-89a2-e8849ff66d10" containerName="registry-server" Dec 09 10:30:00 crc kubenswrapper[5002]: I1209 10:30:00.164668 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="36e69eb4-a603-4a81-b85d-d5e76ec9eaa8" containerName="registry-server" Dec 09 10:30:00 crc kubenswrapper[5002]: I1209 10:30:00.165580 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421270-m9c9f" Dec 09 10:30:00 crc kubenswrapper[5002]: I1209 10:30:00.169119 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 09 10:30:00 crc kubenswrapper[5002]: I1209 10:30:00.170153 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 09 10:30:00 crc kubenswrapper[5002]: I1209 10:30:00.174770 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421270-m9c9f"] Dec 09 10:30:00 crc kubenswrapper[5002]: I1209 10:30:00.254081 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/34a2e581-9142-4731-ba35-b89fc3efa4fa-config-volume\") pod \"collect-profiles-29421270-m9c9f\" (UID: \"34a2e581-9142-4731-ba35-b89fc3efa4fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421270-m9c9f" Dec 09 10:30:00 crc kubenswrapper[5002]: I1209 10:30:00.254200 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7zpt\" (UniqueName: \"kubernetes.io/projected/34a2e581-9142-4731-ba35-b89fc3efa4fa-kube-api-access-b7zpt\") pod \"collect-profiles-29421270-m9c9f\" (UID: \"34a2e581-9142-4731-ba35-b89fc3efa4fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421270-m9c9f" Dec 09 10:30:00 crc kubenswrapper[5002]: I1209 10:30:00.254242 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/34a2e581-9142-4731-ba35-b89fc3efa4fa-secret-volume\") pod \"collect-profiles-29421270-m9c9f\" (UID: \"34a2e581-9142-4731-ba35-b89fc3efa4fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421270-m9c9f" Dec 09 10:30:00 crc kubenswrapper[5002]: I1209 10:30:00.355569 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/34a2e581-9142-4731-ba35-b89fc3efa4fa-config-volume\") pod \"collect-profiles-29421270-m9c9f\" (UID: \"34a2e581-9142-4731-ba35-b89fc3efa4fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421270-m9c9f" Dec 09 10:30:00 crc kubenswrapper[5002]: I1209 10:30:00.355659 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7zpt\" (UniqueName: \"kubernetes.io/projected/34a2e581-9142-4731-ba35-b89fc3efa4fa-kube-api-access-b7zpt\") pod \"collect-profiles-29421270-m9c9f\" (UID: \"34a2e581-9142-4731-ba35-b89fc3efa4fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421270-m9c9f" Dec 09 10:30:00 crc kubenswrapper[5002]: I1209 10:30:00.355712 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/34a2e581-9142-4731-ba35-b89fc3efa4fa-secret-volume\") pod \"collect-profiles-29421270-m9c9f\" (UID: \"34a2e581-9142-4731-ba35-b89fc3efa4fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421270-m9c9f" Dec 09 10:30:00 crc kubenswrapper[5002]: I1209 10:30:00.357042 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/34a2e581-9142-4731-ba35-b89fc3efa4fa-config-volume\") pod 
\"collect-profiles-29421270-m9c9f\" (UID: \"34a2e581-9142-4731-ba35-b89fc3efa4fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421270-m9c9f" Dec 09 10:30:00 crc kubenswrapper[5002]: I1209 10:30:00.366193 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/34a2e581-9142-4731-ba35-b89fc3efa4fa-secret-volume\") pod \"collect-profiles-29421270-m9c9f\" (UID: \"34a2e581-9142-4731-ba35-b89fc3efa4fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421270-m9c9f" Dec 09 10:30:00 crc kubenswrapper[5002]: I1209 10:30:00.374614 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7zpt\" (UniqueName: \"kubernetes.io/projected/34a2e581-9142-4731-ba35-b89fc3efa4fa-kube-api-access-b7zpt\") pod \"collect-profiles-29421270-m9c9f\" (UID: \"34a2e581-9142-4731-ba35-b89fc3efa4fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421270-m9c9f" Dec 09 10:30:00 crc kubenswrapper[5002]: I1209 10:30:00.504343 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421270-m9c9f" Dec 09 10:30:00 crc kubenswrapper[5002]: I1209 10:30:00.953164 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421270-m9c9f"] Dec 09 10:30:01 crc kubenswrapper[5002]: I1209 10:30:01.329250 5002 generic.go:334] "Generic (PLEG): container finished" podID="34a2e581-9142-4731-ba35-b89fc3efa4fa" containerID="aa6fcbf12e04623fa63ea94c7e083da4f095f361fe544fe064077ad55174d680" exitCode=0 Dec 09 10:30:01 crc kubenswrapper[5002]: I1209 10:30:01.329367 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421270-m9c9f" event={"ID":"34a2e581-9142-4731-ba35-b89fc3efa4fa","Type":"ContainerDied","Data":"aa6fcbf12e04623fa63ea94c7e083da4f095f361fe544fe064077ad55174d680"} Dec 09 10:30:01 crc kubenswrapper[5002]: I1209 10:30:01.329570 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421270-m9c9f" event={"ID":"34a2e581-9142-4731-ba35-b89fc3efa4fa","Type":"ContainerStarted","Data":"dcb53569c7f14b98275d119a5783e73f7e78837c74086c98f03455154f3baaf0"} Dec 09 10:30:02 crc kubenswrapper[5002]: I1209 10:30:02.589264 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421270-m9c9f" Dec 09 10:30:02 crc kubenswrapper[5002]: I1209 10:30:02.687919 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/34a2e581-9142-4731-ba35-b89fc3efa4fa-secret-volume\") pod \"34a2e581-9142-4731-ba35-b89fc3efa4fa\" (UID: \"34a2e581-9142-4731-ba35-b89fc3efa4fa\") " Dec 09 10:30:02 crc kubenswrapper[5002]: I1209 10:30:02.688049 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7zpt\" (UniqueName: \"kubernetes.io/projected/34a2e581-9142-4731-ba35-b89fc3efa4fa-kube-api-access-b7zpt\") pod \"34a2e581-9142-4731-ba35-b89fc3efa4fa\" (UID: \"34a2e581-9142-4731-ba35-b89fc3efa4fa\") " Dec 09 10:30:02 crc kubenswrapper[5002]: I1209 10:30:02.688091 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/34a2e581-9142-4731-ba35-b89fc3efa4fa-config-volume\") pod \"34a2e581-9142-4731-ba35-b89fc3efa4fa\" (UID: \"34a2e581-9142-4731-ba35-b89fc3efa4fa\") " Dec 09 10:30:02 crc kubenswrapper[5002]: I1209 10:30:02.689020 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34a2e581-9142-4731-ba35-b89fc3efa4fa-config-volume" (OuterVolumeSpecName: "config-volume") pod "34a2e581-9142-4731-ba35-b89fc3efa4fa" (UID: "34a2e581-9142-4731-ba35-b89fc3efa4fa"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:30:02 crc kubenswrapper[5002]: I1209 10:30:02.692630 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34a2e581-9142-4731-ba35-b89fc3efa4fa-kube-api-access-b7zpt" (OuterVolumeSpecName: "kube-api-access-b7zpt") pod "34a2e581-9142-4731-ba35-b89fc3efa4fa" (UID: "34a2e581-9142-4731-ba35-b89fc3efa4fa"). InnerVolumeSpecName "kube-api-access-b7zpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:30:02 crc kubenswrapper[5002]: I1209 10:30:02.693785 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34a2e581-9142-4731-ba35-b89fc3efa4fa-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "34a2e581-9142-4731-ba35-b89fc3efa4fa" (UID: "34a2e581-9142-4731-ba35-b89fc3efa4fa"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:30:02 crc kubenswrapper[5002]: I1209 10:30:02.790275 5002 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/34a2e581-9142-4731-ba35-b89fc3efa4fa-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 09 10:30:02 crc kubenswrapper[5002]: I1209 10:30:02.790329 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7zpt\" (UniqueName: \"kubernetes.io/projected/34a2e581-9142-4731-ba35-b89fc3efa4fa-kube-api-access-b7zpt\") on node \"crc\" DevicePath \"\"" Dec 09 10:30:02 crc kubenswrapper[5002]: I1209 10:30:02.790349 5002 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/34a2e581-9142-4731-ba35-b89fc3efa4fa-config-volume\") on node \"crc\" DevicePath \"\"" Dec 09 10:30:03 crc kubenswrapper[5002]: I1209 10:30:03.350454 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421270-m9c9f" event={"ID":"34a2e581-9142-4731-ba35-b89fc3efa4fa","Type":"ContainerDied","Data":"dcb53569c7f14b98275d119a5783e73f7e78837c74086c98f03455154f3baaf0"} Dec 09 10:30:03 crc kubenswrapper[5002]: I1209 10:30:03.350522 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dcb53569c7f14b98275d119a5783e73f7e78837c74086c98f03455154f3baaf0" Dec 09 10:30:03 crc kubenswrapper[5002]: I1209 10:30:03.350540 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421270-m9c9f" Dec 09 10:30:07 crc kubenswrapper[5002]: I1209 10:30:07.059486 5002 scope.go:117] "RemoveContainer" containerID="b603401504c120d86f4291f9eecacafd874b783a1acad3cc5e6c3c01f22fd43e" Dec 09 10:30:07 crc kubenswrapper[5002]: E1209 10:30:07.059925 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 10:30:21 crc kubenswrapper[5002]: I1209 10:30:21.059758 5002 scope.go:117] "RemoveContainer" containerID="b603401504c120d86f4291f9eecacafd874b783a1acad3cc5e6c3c01f22fd43e" Dec 09 10:30:21 crc kubenswrapper[5002]: E1209 10:30:21.060596 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 10:30:35 crc kubenswrapper[5002]: I1209 10:30:35.060627 5002 scope.go:117] "RemoveContainer" containerID="b603401504c120d86f4291f9eecacafd874b783a1acad3cc5e6c3c01f22fd43e" Dec 09 10:30:35 crc kubenswrapper[5002]: E1209 10:30:35.061997 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 10:30:44 crc kubenswrapper[5002]: I1209 10:30:44.038707 5002 scope.go:117] "RemoveContainer" containerID="cec7c3fa420c540edced66346b2b76b161b34be979d648e08ae7cb019c22e6e1" Dec 09 10:30:48 crc kubenswrapper[5002]: I1209 10:30:48.064320 5002 scope.go:117] "RemoveContainer" containerID="b603401504c120d86f4291f9eecacafd874b783a1acad3cc5e6c3c01f22fd43e" Dec 09 10:30:48 crc kubenswrapper[5002]: E1209 10:30:48.065095 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 10:31:03 crc kubenswrapper[5002]: I1209 10:31:03.060111 5002 scope.go:117] "RemoveContainer" containerID="b603401504c120d86f4291f9eecacafd874b783a1acad3cc5e6c3c01f22fd43e" Dec 09 10:31:03 crc kubenswrapper[5002]: E1209 10:31:03.060930 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 10:31:17 crc kubenswrapper[5002]: I1209 10:31:17.060755 5002 scope.go:117] "RemoveContainer" containerID="b603401504c120d86f4291f9eecacafd874b783a1acad3cc5e6c3c01f22fd43e" Dec 09 10:31:17 crc kubenswrapper[5002]: E1209 10:31:17.061849 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 10:31:29 crc kubenswrapper[5002]: I1209 10:31:29.060523 5002 scope.go:117] "RemoveContainer" containerID="b603401504c120d86f4291f9eecacafd874b783a1acad3cc5e6c3c01f22fd43e" Dec 09 10:31:29 crc kubenswrapper[5002]: E1209 10:31:29.061351 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 10:31:42 crc kubenswrapper[5002]: I1209 10:31:42.060493 5002 scope.go:117] "RemoveContainer" containerID="b603401504c120d86f4291f9eecacafd874b783a1acad3cc5e6c3c01f22fd43e" Dec 09 10:31:42 crc kubenswrapper[5002]: E1209 10:31:42.061290 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 10:31:44 crc kubenswrapper[5002]: I1209 10:31:44.145527 5002 scope.go:117] "RemoveContainer" containerID="2cb6f2c5af24125d366be1b3d0454fead9863b253da1642f513cc98b1fc0571b" Dec 09 10:31:44 crc kubenswrapper[5002]: I1209 10:31:44.170069 5002 scope.go:117] "RemoveContainer" containerID="095377d58ea1601d42a04fa4489e5820cf9a568c32d93720234b6c5bcf8a9454" Dec 09 10:31:44 crc kubenswrapper[5002]: I1209 10:31:44.194285 5002 scope.go:117] "RemoveContainer" containerID="00a44392d7da93c53644215f968e2a020ff72b6f8259eac37e70a2e51e250090" Dec 09 10:31:44 crc kubenswrapper[5002]: I1209 10:31:44.223170 5002 scope.go:117] "RemoveContainer" containerID="1f38de5ee96eed384a1278e773754b632caf13ae00809e5b9356f6454c15cea2" Dec 09 10:31:44 crc kubenswrapper[5002]: I1209 10:31:44.242687 5002 scope.go:117] "RemoveContainer" containerID="57ce805df00a89c4c24d7fe42e68c89ab49ef78b1eebf20d9fece98c54c67e2f" Dec 09 10:31:44 crc kubenswrapper[5002]: I1209 10:31:44.269694 5002 scope.go:117] "RemoveContainer" containerID="3b5492a7c894b209b3a8a3190a110940978e60b5672a1763ad5a6607cc93171f" Dec 09 10:31:55 crc kubenswrapper[5002]: I1209 10:31:55.061203 5002 scope.go:117] "RemoveContainer" containerID="b603401504c120d86f4291f9eecacafd874b783a1acad3cc5e6c3c01f22fd43e" Dec 09 10:31:55 crc kubenswrapper[5002]: E1209 10:31:55.061968 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 10:32:07 crc kubenswrapper[5002]: I1209 10:32:07.060945 5002 scope.go:117] "RemoveContainer" containerID="b603401504c120d86f4291f9eecacafd874b783a1acad3cc5e6c3c01f22fd43e" Dec 09 10:32:07 crc kubenswrapper[5002]: E1209 10:32:07.061998 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 10:32:18 crc kubenswrapper[5002]: I1209 10:32:18.064664 5002 scope.go:117] "RemoveContainer" containerID="b603401504c120d86f4291f9eecacafd874b783a1acad3cc5e6c3c01f22fd43e" Dec 09 10:32:18 crc kubenswrapper[5002]: E1209 10:32:18.066555 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 10:32:30 crc kubenswrapper[5002]: I1209 10:32:30.060620 5002 scope.go:117] "RemoveContainer" containerID="b603401504c120d86f4291f9eecacafd874b783a1acad3cc5e6c3c01f22fd43e" Dec 09 10:32:30 crc kubenswrapper[5002]: E1209 10:32:30.061088 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 10:32:43 crc kubenswrapper[5002]: I1209 10:32:43.060727 5002 scope.go:117] "RemoveContainer" containerID="b603401504c120d86f4291f9eecacafd874b783a1acad3cc5e6c3c01f22fd43e" Dec 09 10:32:43 crc kubenswrapper[5002]: E1209 10:32:43.061614 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 10:32:58 crc kubenswrapper[5002]: I1209 10:32:58.064505 5002 scope.go:117] "RemoveContainer" containerID="b603401504c120d86f4291f9eecacafd874b783a1acad3cc5e6c3c01f22fd43e" Dec 09 10:32:58 crc kubenswrapper[5002]: E1209 10:32:58.065273 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 10:33:10 crc kubenswrapper[5002]: I1209 10:33:10.060775 5002 scope.go:117] "RemoveContainer" containerID="b603401504c120d86f4291f9eecacafd874b783a1acad3cc5e6c3c01f22fd43e" Dec 09 10:33:10 crc kubenswrapper[5002]: E1209 10:33:10.061640 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 10:33:23 crc kubenswrapper[5002]: I1209 10:33:23.060806 5002 scope.go:117] "RemoveContainer" containerID="b603401504c120d86f4291f9eecacafd874b783a1acad3cc5e6c3c01f22fd43e" Dec 09 10:33:23 crc kubenswrapper[5002]: E1209 10:33:23.061686 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 10:33:37 crc kubenswrapper[5002]: I1209 10:33:37.060182 5002 scope.go:117] "RemoveContainer" containerID="b603401504c120d86f4291f9eecacafd874b783a1acad3cc5e6c3c01f22fd43e" Dec 09 10:33:37 crc kubenswrapper[5002]: E1209 10:33:37.061073 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 10:33:52 crc kubenswrapper[5002]: I1209 10:33:52.060256 5002 scope.go:117] "RemoveContainer" containerID="b603401504c120d86f4291f9eecacafd874b783a1acad3cc5e6c3c01f22fd43e" Dec 09 10:33:52 crc kubenswrapper[5002]: E1209 10:33:52.061028 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 10:34:07 crc kubenswrapper[5002]: I1209 10:34:07.060646 5002 scope.go:117] "RemoveContainer" containerID="b603401504c120d86f4291f9eecacafd874b783a1acad3cc5e6c3c01f22fd43e" Dec 09 10:34:07 crc kubenswrapper[5002]: E1209 10:34:07.061547 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 10:34:20 crc kubenswrapper[5002]: I1209 10:34:20.060139 5002 scope.go:117] "RemoveContainer" containerID="b603401504c120d86f4291f9eecacafd874b783a1acad3cc5e6c3c01f22fd43e" Dec 09 10:34:21 crc kubenswrapper[5002]: I1209 10:34:21.582848 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerStarted","Data":"3790327e397a17602262effb7ad54351ff4d450f68ad25cca18f8b3766e0d75d"} Dec 09 10:34:28 crc kubenswrapper[5002]: I1209 10:34:28.191510 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-69rlr"] Dec 09 10:34:28 crc kubenswrapper[5002]: E1209 10:34:28.196098 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34a2e581-9142-4731-ba35-b89fc3efa4fa" containerName="collect-profiles" Dec 09 10:34:28 crc kubenswrapper[5002]: I1209 10:34:28.196354 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="34a2e581-9142-4731-ba35-b89fc3efa4fa" containerName="collect-profiles" Dec 09 10:34:28 crc kubenswrapper[5002]: I1209 10:34:28.196739 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="34a2e581-9142-4731-ba35-b89fc3efa4fa" containerName="collect-profiles" Dec 09 10:34:28 crc kubenswrapper[5002]: I1209 10:34:28.198209 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-69rlr" Dec 09 10:34:28 crc kubenswrapper[5002]: I1209 10:34:28.214783 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-69rlr"] Dec 09 10:34:28 crc kubenswrapper[5002]: I1209 10:34:28.321094 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cvwq\" (UniqueName: \"kubernetes.io/projected/181215c3-1125-494e-a3ae-ac2bf8379d26-kube-api-access-9cvwq\") pod \"redhat-marketplace-69rlr\" (UID: \"181215c3-1125-494e-a3ae-ac2bf8379d26\") " pod="openshift-marketplace/redhat-marketplace-69rlr" Dec 09 10:34:28 crc kubenswrapper[5002]: I1209 10:34:28.321214 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/181215c3-1125-494e-a3ae-ac2bf8379d26-catalog-content\") pod \"redhat-marketplace-69rlr\" (UID: \"181215c3-1125-494e-a3ae-ac2bf8379d26\") " pod="openshift-marketplace/redhat-marketplace-69rlr" Dec 09 10:34:28 crc kubenswrapper[5002]: I1209 10:34:28.321282 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/181215c3-1125-494e-a3ae-ac2bf8379d26-utilities\") pod \"redhat-marketplace-69rlr\" (UID: \"181215c3-1125-494e-a3ae-ac2bf8379d26\") " pod="openshift-marketplace/redhat-marketplace-69rlr" Dec 09 10:34:28 crc kubenswrapper[5002]: I1209 10:34:28.386623 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9jkb7"] Dec 09 10:34:28 crc kubenswrapper[5002]: I1209 10:34:28.388150 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9jkb7" Dec 09 10:34:28 crc kubenswrapper[5002]: I1209 10:34:28.410242 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9jkb7"] Dec 09 10:34:28 crc kubenswrapper[5002]: I1209 10:34:28.422602 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/181215c3-1125-494e-a3ae-ac2bf8379d26-catalog-content\") pod \"redhat-marketplace-69rlr\" (UID: \"181215c3-1125-494e-a3ae-ac2bf8379d26\") " pod="openshift-marketplace/redhat-marketplace-69rlr" Dec 09 10:34:28 crc kubenswrapper[5002]: I1209 10:34:28.422685 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/181215c3-1125-494e-a3ae-ac2bf8379d26-utilities\") pod \"redhat-marketplace-69rlr\" (UID: \"181215c3-1125-494e-a3ae-ac2bf8379d26\") " pod="openshift-marketplace/redhat-marketplace-69rlr" Dec 09 10:34:28 crc kubenswrapper[5002]: I1209 10:34:28.422756 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cvwq\" (UniqueName: \"kubernetes.io/projected/181215c3-1125-494e-a3ae-ac2bf8379d26-kube-api-access-9cvwq\") pod \"redhat-marketplace-69rlr\" (UID: \"181215c3-1125-494e-a3ae-ac2bf8379d26\") " pod="openshift-marketplace/redhat-marketplace-69rlr" Dec 09 10:34:28 crc kubenswrapper[5002]: I1209 10:34:28.423564 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/181215c3-1125-494e-a3ae-ac2bf8379d26-catalog-content\") pod \"redhat-marketplace-69rlr\" (UID: \"181215c3-1125-494e-a3ae-ac2bf8379d26\") " 
pod="openshift-marketplace/redhat-marketplace-69rlr" Dec 09 10:34:28 crc kubenswrapper[5002]: I1209 10:34:28.423990 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/181215c3-1125-494e-a3ae-ac2bf8379d26-utilities\") pod \"redhat-marketplace-69rlr\" (UID: \"181215c3-1125-494e-a3ae-ac2bf8379d26\") " pod="openshift-marketplace/redhat-marketplace-69rlr" Dec 09 10:34:28 crc kubenswrapper[5002]: I1209 10:34:28.450719 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cvwq\" (UniqueName: \"kubernetes.io/projected/181215c3-1125-494e-a3ae-ac2bf8379d26-kube-api-access-9cvwq\") pod \"redhat-marketplace-69rlr\" (UID: \"181215c3-1125-494e-a3ae-ac2bf8379d26\") " pod="openshift-marketplace/redhat-marketplace-69rlr" Dec 09 10:34:28 crc kubenswrapper[5002]: I1209 10:34:28.516031 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-69rlr" Dec 09 10:34:28 crc kubenswrapper[5002]: I1209 10:34:28.523739 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7d72680-c522-44c0-af9f-7da83af73e04-utilities\") pod \"redhat-operators-9jkb7\" (UID: \"a7d72680-c522-44c0-af9f-7da83af73e04\") " pod="openshift-marketplace/redhat-operators-9jkb7" Dec 09 10:34:28 crc kubenswrapper[5002]: I1209 10:34:28.523789 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxmzh\" (UniqueName: \"kubernetes.io/projected/a7d72680-c522-44c0-af9f-7da83af73e04-kube-api-access-gxmzh\") pod \"redhat-operators-9jkb7\" (UID: \"a7d72680-c522-44c0-af9f-7da83af73e04\") " pod="openshift-marketplace/redhat-operators-9jkb7" Dec 09 10:34:28 crc kubenswrapper[5002]: I1209 10:34:28.523876 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7d72680-c522-44c0-af9f-7da83af73e04-catalog-content\") pod \"redhat-operators-9jkb7\" (UID: \"a7d72680-c522-44c0-af9f-7da83af73e04\") " pod="openshift-marketplace/redhat-operators-9jkb7" Dec 09 10:34:28 crc kubenswrapper[5002]: I1209 10:34:28.625301 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7d72680-c522-44c0-af9f-7da83af73e04-utilities\") pod \"redhat-operators-9jkb7\" (UID: \"a7d72680-c522-44c0-af9f-7da83af73e04\") " pod="openshift-marketplace/redhat-operators-9jkb7" Dec 09 10:34:28 crc kubenswrapper[5002]: I1209 10:34:28.625588 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxmzh\" (UniqueName: \"kubernetes.io/projected/a7d72680-c522-44c0-af9f-7da83af73e04-kube-api-access-gxmzh\") pod \"redhat-operators-9jkb7\" (UID: \"a7d72680-c522-44c0-af9f-7da83af73e04\") " pod="openshift-marketplace/redhat-operators-9jkb7" Dec 09 10:34:28 crc kubenswrapper[5002]: I1209 10:34:28.625650 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7d72680-c522-44c0-af9f-7da83af73e04-catalog-content\") pod \"redhat-operators-9jkb7\" (UID: \"a7d72680-c522-44c0-af9f-7da83af73e04\") " pod="openshift-marketplace/redhat-operators-9jkb7" Dec 09 10:34:28 crc kubenswrapper[5002]: I1209 10:34:28.626204 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7d72680-c522-44c0-af9f-7da83af73e04-catalog-content\") pod \"redhat-operators-9jkb7\" (UID: \"a7d72680-c522-44c0-af9f-7da83af73e04\") " pod="openshift-marketplace/redhat-operators-9jkb7" Dec 09 10:34:28 crc kubenswrapper[5002]: I1209 10:34:28.626422 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7d72680-c522-44c0-af9f-7da83af73e04-utilities\") pod \"redhat-operators-9jkb7\" (UID: \"a7d72680-c522-44c0-af9f-7da83af73e04\") " pod="openshift-marketplace/redhat-operators-9jkb7" Dec 09 10:34:28 crc kubenswrapper[5002]: I1209 10:34:28.659619 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxmzh\" (UniqueName: \"kubernetes.io/projected/a7d72680-c522-44c0-af9f-7da83af73e04-kube-api-access-gxmzh\") pod \"redhat-operators-9jkb7\" (UID: \"a7d72680-c522-44c0-af9f-7da83af73e04\") " pod="openshift-marketplace/redhat-operators-9jkb7" Dec 09 10:34:28 crc kubenswrapper[5002]: I1209 10:34:28.702310 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9jkb7" Dec 09 10:34:28 crc kubenswrapper[5002]: I1209 10:34:28.988308 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9jkb7"] Dec 09 10:34:28 crc kubenswrapper[5002]: W1209 10:34:28.993217 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod181215c3_1125_494e_a3ae_ac2bf8379d26.slice/crio-e677fe570300988ce4f89a556afbb54be88ba1225f5fbf1e019513be5bb1c0d9 WatchSource:0}: Error finding container e677fe570300988ce4f89a556afbb54be88ba1225f5fbf1e019513be5bb1c0d9: Status 404 returned error can't find the container with id e677fe570300988ce4f89a556afbb54be88ba1225f5fbf1e019513be5bb1c0d9 Dec 09 10:34:28 crc kubenswrapper[5002]: I1209 10:34:28.993573 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-69rlr"] Dec 09 10:34:29 crc kubenswrapper[5002]: I1209 10:34:29.651019 5002 generic.go:334] "Generic (PLEG): container finished" podID="a7d72680-c522-44c0-af9f-7da83af73e04" containerID="d3d460e173ee4071c998a8aac21c80191413a4b5e20d8d3f0fbb4fad2c14ca11" exitCode=0 Dec 09 10:34:29 crc kubenswrapper[5002]: I1209 10:34:29.651082 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9jkb7" event={"ID":"a7d72680-c522-44c0-af9f-7da83af73e04","Type":"ContainerDied","Data":"d3d460e173ee4071c998a8aac21c80191413a4b5e20d8d3f0fbb4fad2c14ca11"} Dec 09 10:34:29 crc kubenswrapper[5002]: I1209 10:34:29.651308 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9jkb7" event={"ID":"a7d72680-c522-44c0-af9f-7da83af73e04","Type":"ContainerStarted","Data":"49fa92a56618bd415567844910d714c3b613853460bff8cb3582faa5ae5b7843"} Dec 09 10:34:29 crc kubenswrapper[5002]: I1209 10:34:29.652799 5002 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 10:34:29 crc kubenswrapper[5002]: I1209 10:34:29.654872 5002 generic.go:334] "Generic (PLEG): container finished" podID="181215c3-1125-494e-a3ae-ac2bf8379d26" containerID="d2db945b0b6c5b16a9651aaed143d9ec78ac0c361102375a6ebd9add142ae21e" exitCode=0 Dec 09 10:34:29 crc kubenswrapper[5002]: I1209 10:34:29.654911 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-69rlr" event={"ID":"181215c3-1125-494e-a3ae-ac2bf8379d26","Type":"ContainerDied","Data":"d2db945b0b6c5b16a9651aaed143d9ec78ac0c361102375a6ebd9add142ae21e"} Dec 09 10:34:29 crc kubenswrapper[5002]: I1209 10:34:29.654936 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-69rlr" event={"ID":"181215c3-1125-494e-a3ae-ac2bf8379d26","Type":"ContainerStarted","Data":"e677fe570300988ce4f89a556afbb54be88ba1225f5fbf1e019513be5bb1c0d9"} Dec 09 10:34:30 crc kubenswrapper[5002]: I1209 10:34:30.664879 5002 generic.go:334] "Generic (PLEG): container finished" podID="181215c3-1125-494e-a3ae-ac2bf8379d26" containerID="12edf9164061ec7cecddc134fa82ebf069486589dbf2f144b4385d02d083df12" exitCode=0 Dec 09 10:34:30 crc kubenswrapper[5002]: I1209 10:34:30.664948 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-69rlr" event={"ID":"181215c3-1125-494e-a3ae-ac2bf8379d26","Type":"ContainerDied","Data":"12edf9164061ec7cecddc134fa82ebf069486589dbf2f144b4385d02d083df12"} Dec 09 10:34:31 crc kubenswrapper[5002]: I1209 10:34:31.683798 5002 generic.go:334] "Generic (PLEG): container finished" podID="a7d72680-c522-44c0-af9f-7da83af73e04" containerID="5870b5a6fee2b93bea92af57b0185ce6df5df50d3ecd7a786b261f9f2a279529" exitCode=0 Dec 09 10:34:31 crc kubenswrapper[5002]: I1209 10:34:31.683876 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9jkb7" event={"ID":"a7d72680-c522-44c0-af9f-7da83af73e04","Type":"ContainerDied","Data":"5870b5a6fee2b93bea92af57b0185ce6df5df50d3ecd7a786b261f9f2a279529"} Dec 09 10:34:34 crc kubenswrapper[5002]: I1209 10:34:34.706668 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9jkb7" event={"ID":"a7d72680-c522-44c0-af9f-7da83af73e04","Type":"ContainerStarted","Data":"516f77e7463df6e6c98ec54ed0787d7fb3542be3e22af8140c508ea6e8542cff"} Dec 09 10:34:34 crc kubenswrapper[5002]: I1209 10:34:34.709029 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-69rlr" event={"ID":"181215c3-1125-494e-a3ae-ac2bf8379d26","Type":"ContainerStarted","Data":"d3922aaa46d52a79b7acbb12c7dad9eadec798f8907ccb8495c6dcb756447168"} Dec 09 10:34:34 crc kubenswrapper[5002]: I1209 10:34:34.739641 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9jkb7" podStartSLOduration=3.169434816 podStartE2EDuration="6.739615642s" podCreationTimestamp="2025-12-09 10:34:28 +0000 UTC" firstStartedPulling="2025-12-09 10:34:29.652509941 +0000 UTC m=+2002.044561022" lastFinishedPulling="2025-12-09 10:34:33.222690757 +0000 UTC m=+2005.614741848" observedRunningTime="2025-12-09 10:34:34.733426706 +0000 UTC m=+2007.125477797" watchObservedRunningTime="2025-12-09 10:34:34.739615642 +0000 UTC m=+2007.131666723" Dec 09 10:34:34 crc kubenswrapper[5002]: I1209 10:34:34.768486 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-69rlr" podStartSLOduration=3.230421454 podStartE2EDuration="6.768463767s" podCreationTimestamp="2025-12-09 10:34:28 +0000 UTC" firstStartedPulling="2025-12-09 10:34:29.656510968 +0000 UTC m=+2002.048562049" lastFinishedPulling="2025-12-09 10:34:33.194553261 +0000 UTC m=+2005.586604362" observedRunningTime="2025-12-09 10:34:34.763436832 +0000 UTC m=+2007.155487923" watchObservedRunningTime="2025-12-09 
10:34:34.768463767 +0000 UTC m=+2007.160514848" Dec 09 10:34:38 crc kubenswrapper[5002]: I1209 10:34:38.516880 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-69rlr" Dec 09 10:34:38 crc kubenswrapper[5002]: I1209 10:34:38.517383 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-69rlr" Dec 09 10:34:38 crc kubenswrapper[5002]: I1209 10:34:38.571346 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-69rlr" Dec 09 10:34:38 crc kubenswrapper[5002]: I1209 10:34:38.702868 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9jkb7" Dec 09 10:34:38 crc kubenswrapper[5002]: I1209 10:34:38.703015 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9jkb7" Dec 09 10:34:38 crc kubenswrapper[5002]: I1209 10:34:38.789120 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-69rlr" Dec 09 10:34:39 crc kubenswrapper[5002]: I1209 10:34:39.753463 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9jkb7" podUID="a7d72680-c522-44c0-af9f-7da83af73e04" containerName="registry-server" probeResult="failure" output=< Dec 09 10:34:39 crc kubenswrapper[5002]: timeout: failed to connect service ":50051" within 1s Dec 09 10:34:39 crc kubenswrapper[5002]: > Dec 09 10:34:40 crc kubenswrapper[5002]: I1209 10:34:40.777349 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-69rlr"] Dec 09 10:34:41 crc kubenswrapper[5002]: I1209 10:34:41.762349 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-69rlr" podUID="181215c3-1125-494e-a3ae-ac2bf8379d26" containerName="registry-server" containerID="cri-o://d3922aaa46d52a79b7acbb12c7dad9eadec798f8907ccb8495c6dcb756447168" gracePeriod=2 Dec 09 10:34:42 crc kubenswrapper[5002]: I1209 10:34:42.680168 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-69rlr" Dec 09 10:34:42 crc kubenswrapper[5002]: I1209 10:34:42.773830 5002 generic.go:334] "Generic (PLEG): container finished" podID="181215c3-1125-494e-a3ae-ac2bf8379d26" containerID="d3922aaa46d52a79b7acbb12c7dad9eadec798f8907ccb8495c6dcb756447168" exitCode=0 Dec 09 10:34:42 crc kubenswrapper[5002]: I1209 10:34:42.773875 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-69rlr" Dec 09 10:34:42 crc kubenswrapper[5002]: I1209 10:34:42.773873 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-69rlr" event={"ID":"181215c3-1125-494e-a3ae-ac2bf8379d26","Type":"ContainerDied","Data":"d3922aaa46d52a79b7acbb12c7dad9eadec798f8907ccb8495c6dcb756447168"} Dec 09 10:34:42 crc kubenswrapper[5002]: I1209 10:34:42.773970 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-69rlr" event={"ID":"181215c3-1125-494e-a3ae-ac2bf8379d26","Type":"ContainerDied","Data":"e677fe570300988ce4f89a556afbb54be88ba1225f5fbf1e019513be5bb1c0d9"} Dec 09 10:34:42 crc kubenswrapper[5002]: I1209 10:34:42.774018 5002 scope.go:117] "RemoveContainer" containerID="d3922aaa46d52a79b7acbb12c7dad9eadec798f8907ccb8495c6dcb756447168" Dec 09 10:34:42 crc kubenswrapper[5002]: I1209 10:34:42.794005 5002 scope.go:117] "RemoveContainer" containerID="12edf9164061ec7cecddc134fa82ebf069486589dbf2f144b4385d02d083df12" Dec 09 10:34:42 crc kubenswrapper[5002]: I1209 10:34:42.814681 5002 scope.go:117] "RemoveContainer" containerID="d2db945b0b6c5b16a9651aaed143d9ec78ac0c361102375a6ebd9add142ae21e" Dec 09 10:34:42 crc kubenswrapper[5002]: I1209 10:34:42.852299 5002 scope.go:117] "RemoveContainer" containerID="d3922aaa46d52a79b7acbb12c7dad9eadec798f8907ccb8495c6dcb756447168" Dec 09 10:34:42 crc kubenswrapper[5002]: E1209 10:34:42.854389 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3922aaa46d52a79b7acbb12c7dad9eadec798f8907ccb8495c6dcb756447168\": container with ID starting with d3922aaa46d52a79b7acbb12c7dad9eadec798f8907ccb8495c6dcb756447168 not found: ID does not exist" containerID="d3922aaa46d52a79b7acbb12c7dad9eadec798f8907ccb8495c6dcb756447168" Dec 09 10:34:42 crc kubenswrapper[5002]: I1209 10:34:42.854453 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3922aaa46d52a79b7acbb12c7dad9eadec798f8907ccb8495c6dcb756447168"} err="failed to get container status \"d3922aaa46d52a79b7acbb12c7dad9eadec798f8907ccb8495c6dcb756447168\": rpc error: code = NotFound desc = could not find container \"d3922aaa46d52a79b7acbb12c7dad9eadec798f8907ccb8495c6dcb756447168\": container with ID starting with d3922aaa46d52a79b7acbb12c7dad9eadec798f8907ccb8495c6dcb756447168 not found: ID does not exist" Dec 09 10:34:42 crc kubenswrapper[5002]: I1209 10:34:42.854488 5002 scope.go:117] "RemoveContainer" containerID="12edf9164061ec7cecddc134fa82ebf069486589dbf2f144b4385d02d083df12" Dec 09 10:34:42 crc kubenswrapper[5002]: E1209 10:34:42.855021 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12edf9164061ec7cecddc134fa82ebf069486589dbf2f144b4385d02d083df12\": container with ID starting with 12edf9164061ec7cecddc134fa82ebf069486589dbf2f144b4385d02d083df12 not found: ID does not exist" containerID="12edf9164061ec7cecddc134fa82ebf069486589dbf2f144b4385d02d083df12" Dec 09 10:34:42 crc kubenswrapper[5002]: I1209 10:34:42.855059 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12edf9164061ec7cecddc134fa82ebf069486589dbf2f144b4385d02d083df12"} err="failed to get container status \"12edf9164061ec7cecddc134fa82ebf069486589dbf2f144b4385d02d083df12\": rpc error: code = NotFound desc = could not find container 
\"12edf9164061ec7cecddc134fa82ebf069486589dbf2f144b4385d02d083df12\": container with ID starting with 12edf9164061ec7cecddc134fa82ebf069486589dbf2f144b4385d02d083df12 not found: ID does not exist" Dec 09 10:34:42 crc kubenswrapper[5002]: I1209 10:34:42.855101 5002 scope.go:117] "RemoveContainer" containerID="d2db945b0b6c5b16a9651aaed143d9ec78ac0c361102375a6ebd9add142ae21e" Dec 09 10:34:42 crc kubenswrapper[5002]: E1209 10:34:42.855415 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2db945b0b6c5b16a9651aaed143d9ec78ac0c361102375a6ebd9add142ae21e\": container with ID starting with d2db945b0b6c5b16a9651aaed143d9ec78ac0c361102375a6ebd9add142ae21e not found: ID does not exist" containerID="d2db945b0b6c5b16a9651aaed143d9ec78ac0c361102375a6ebd9add142ae21e" Dec 09 10:34:42 crc kubenswrapper[5002]: I1209 10:34:42.855445 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2db945b0b6c5b16a9651aaed143d9ec78ac0c361102375a6ebd9add142ae21e"} err="failed to get container status \"d2db945b0b6c5b16a9651aaed143d9ec78ac0c361102375a6ebd9add142ae21e\": rpc error: code = NotFound desc = could not find container \"d2db945b0b6c5b16a9651aaed143d9ec78ac0c361102375a6ebd9add142ae21e\": container with ID starting with d2db945b0b6c5b16a9651aaed143d9ec78ac0c361102375a6ebd9add142ae21e not found: ID does not exist" Dec 09 10:34:42 crc kubenswrapper[5002]: I1209 10:34:42.862921 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/181215c3-1125-494e-a3ae-ac2bf8379d26-catalog-content\") pod \"181215c3-1125-494e-a3ae-ac2bf8379d26\" (UID: \"181215c3-1125-494e-a3ae-ac2bf8379d26\") " Dec 09 10:34:42 crc kubenswrapper[5002]: I1209 10:34:42.862997 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cvwq\" (UniqueName: \"kubernetes.io/projected/181215c3-1125-494e-a3ae-ac2bf8379d26-kube-api-access-9cvwq\") pod \"181215c3-1125-494e-a3ae-ac2bf8379d26\" (UID: \"181215c3-1125-494e-a3ae-ac2bf8379d26\") " Dec 09 10:34:42 crc kubenswrapper[5002]: I1209 10:34:42.863071 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/181215c3-1125-494e-a3ae-ac2bf8379d26-utilities\") pod \"181215c3-1125-494e-a3ae-ac2bf8379d26\" (UID: \"181215c3-1125-494e-a3ae-ac2bf8379d26\") " Dec 09 10:34:42 crc kubenswrapper[5002]: I1209 10:34:42.864244 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/181215c3-1125-494e-a3ae-ac2bf8379d26-utilities" (OuterVolumeSpecName: "utilities") pod "181215c3-1125-494e-a3ae-ac2bf8379d26" (UID: "181215c3-1125-494e-a3ae-ac2bf8379d26"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:34:42 crc kubenswrapper[5002]: I1209 10:34:42.868463 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/181215c3-1125-494e-a3ae-ac2bf8379d26-kube-api-access-9cvwq" (OuterVolumeSpecName: "kube-api-access-9cvwq") pod "181215c3-1125-494e-a3ae-ac2bf8379d26" (UID: "181215c3-1125-494e-a3ae-ac2bf8379d26"). InnerVolumeSpecName "kube-api-access-9cvwq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:34:42 crc kubenswrapper[5002]: I1209 10:34:42.886648 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/181215c3-1125-494e-a3ae-ac2bf8379d26-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "181215c3-1125-494e-a3ae-ac2bf8379d26" (UID: "181215c3-1125-494e-a3ae-ac2bf8379d26"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:34:42 crc kubenswrapper[5002]: I1209 10:34:42.964251 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/181215c3-1125-494e-a3ae-ac2bf8379d26-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 10:34:42 crc kubenswrapper[5002]: I1209 10:34:42.964288 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cvwq\" (UniqueName: \"kubernetes.io/projected/181215c3-1125-494e-a3ae-ac2bf8379d26-kube-api-access-9cvwq\") on node \"crc\" DevicePath \"\"" Dec 09 10:34:42 crc kubenswrapper[5002]: I1209 10:34:42.964302 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/181215c3-1125-494e-a3ae-ac2bf8379d26-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 10:34:43 crc kubenswrapper[5002]: I1209 10:34:43.107295 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-69rlr"] Dec 09 10:34:43 crc kubenswrapper[5002]: I1209 10:34:43.113509 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-69rlr"] Dec 09 10:34:44 crc kubenswrapper[5002]: I1209 10:34:44.072223 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="181215c3-1125-494e-a3ae-ac2bf8379d26" path="/var/lib/kubelet/pods/181215c3-1125-494e-a3ae-ac2bf8379d26/volumes" Dec 09 10:34:48 crc kubenswrapper[5002]: I1209 10:34:48.748960 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9jkb7" Dec 09 10:34:48 crc kubenswrapper[5002]: I1209 10:34:48.792178 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9jkb7" Dec 09 10:34:51 crc kubenswrapper[5002]: I1209 10:34:51.576128 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9jkb7"] Dec 09 10:34:51 crc kubenswrapper[5002]: I1209 10:34:51.576679 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9jkb7" podUID="a7d72680-c522-44c0-af9f-7da83af73e04" containerName="registry-server" containerID="cri-o://516f77e7463df6e6c98ec54ed0787d7fb3542be3e22af8140c508ea6e8542cff" gracePeriod=2 Dec 09 10:34:52 crc kubenswrapper[5002]: I1209 10:34:52.879034 5002 generic.go:334] "Generic (PLEG): container finished" podID="a7d72680-c522-44c0-af9f-7da83af73e04" containerID="516f77e7463df6e6c98ec54ed0787d7fb3542be3e22af8140c508ea6e8542cff" exitCode=0 Dec 09 10:34:52 crc kubenswrapper[5002]: I1209 10:34:52.879078 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9jkb7" event={"ID":"a7d72680-c522-44c0-af9f-7da83af73e04","Type":"ContainerDied","Data":"516f77e7463df6e6c98ec54ed0787d7fb3542be3e22af8140c508ea6e8542cff"} Dec 09 10:34:53 crc kubenswrapper[5002]: I1209 10:34:53.197619 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9jkb7" Dec 09 10:34:53 crc kubenswrapper[5002]: I1209 10:34:53.325033 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxmzh\" (UniqueName: \"kubernetes.io/projected/a7d72680-c522-44c0-af9f-7da83af73e04-kube-api-access-gxmzh\") pod \"a7d72680-c522-44c0-af9f-7da83af73e04\" (UID: \"a7d72680-c522-44c0-af9f-7da83af73e04\") " Dec 09 10:34:53 crc kubenswrapper[5002]: I1209 10:34:53.325193 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7d72680-c522-44c0-af9f-7da83af73e04-catalog-content\") pod \"a7d72680-c522-44c0-af9f-7da83af73e04\" (UID: \"a7d72680-c522-44c0-af9f-7da83af73e04\") " Dec 09 10:34:53 crc kubenswrapper[5002]: I1209 10:34:53.325221 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7d72680-c522-44c0-af9f-7da83af73e04-utilities\") pod \"a7d72680-c522-44c0-af9f-7da83af73e04\" (UID: \"a7d72680-c522-44c0-af9f-7da83af73e04\") " Dec 09 10:34:53 crc kubenswrapper[5002]: I1209 10:34:53.327436 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7d72680-c522-44c0-af9f-7da83af73e04-utilities" (OuterVolumeSpecName: "utilities") pod "a7d72680-c522-44c0-af9f-7da83af73e04" (UID: "a7d72680-c522-44c0-af9f-7da83af73e04"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:34:53 crc kubenswrapper[5002]: I1209 10:34:53.329506 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7d72680-c522-44c0-af9f-7da83af73e04-kube-api-access-gxmzh" (OuterVolumeSpecName: "kube-api-access-gxmzh") pod "a7d72680-c522-44c0-af9f-7da83af73e04" (UID: "a7d72680-c522-44c0-af9f-7da83af73e04"). InnerVolumeSpecName "kube-api-access-gxmzh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:34:53 crc kubenswrapper[5002]: I1209 10:34:53.426295 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7d72680-c522-44c0-af9f-7da83af73e04-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 10:34:53 crc kubenswrapper[5002]: I1209 10:34:53.426335 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxmzh\" (UniqueName: \"kubernetes.io/projected/a7d72680-c522-44c0-af9f-7da83af73e04-kube-api-access-gxmzh\") on node \"crc\" DevicePath \"\"" Dec 09 10:34:53 crc kubenswrapper[5002]: I1209 10:34:53.474895 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7d72680-c522-44c0-af9f-7da83af73e04-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a7d72680-c522-44c0-af9f-7da83af73e04" (UID: "a7d72680-c522-44c0-af9f-7da83af73e04"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:34:53 crc kubenswrapper[5002]: I1209 10:34:53.528739 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7d72680-c522-44c0-af9f-7da83af73e04-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 10:34:53 crc kubenswrapper[5002]: I1209 10:34:53.889292 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9jkb7" event={"ID":"a7d72680-c522-44c0-af9f-7da83af73e04","Type":"ContainerDied","Data":"49fa92a56618bd415567844910d714c3b613853460bff8cb3582faa5ae5b7843"} Dec 09 10:34:53 crc kubenswrapper[5002]: I1209 10:34:53.889329 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9jkb7" Dec 09 10:34:53 crc kubenswrapper[5002]: I1209 10:34:53.889347 5002 scope.go:117] "RemoveContainer" containerID="516f77e7463df6e6c98ec54ed0787d7fb3542be3e22af8140c508ea6e8542cff" Dec 09 10:34:53 crc kubenswrapper[5002]: I1209 10:34:53.907793 5002 scope.go:117] "RemoveContainer" containerID="5870b5a6fee2b93bea92af57b0185ce6df5df50d3ecd7a786b261f9f2a279529" Dec 09 10:34:53 crc kubenswrapper[5002]: I1209 10:34:53.918209 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9jkb7"] Dec 09 10:34:53 crc kubenswrapper[5002]: I1209 10:34:53.925180 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9jkb7"] Dec 09 10:34:53 crc kubenswrapper[5002]: I1209 10:34:53.948221 5002 scope.go:117] "RemoveContainer" containerID="d3d460e173ee4071c998a8aac21c80191413a4b5e20d8d3f0fbb4fad2c14ca11" Dec 09 10:34:54 crc kubenswrapper[5002]: I1209 10:34:54.070197 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7d72680-c522-44c0-af9f-7da83af73e04" path="/var/lib/kubelet/pods/a7d72680-c522-44c0-af9f-7da83af73e04/volumes" Dec 09 10:36:37 crc kubenswrapper[5002]: I1209 10:36:37.964294 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 10:36:37 crc kubenswrapper[5002]: I1209 10:36:37.964991 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 10:37:07 crc kubenswrapper[5002]: I1209 10:37:07.965296 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 10:37:07 crc kubenswrapper[5002]: I1209 10:37:07.965938 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 10:37:37 crc kubenswrapper[5002]: I1209 10:37:37.965126 5002 patch_prober.go:28] 
interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 10:37:37 crc kubenswrapper[5002]: I1209 10:37:37.965553 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 10:37:37 crc kubenswrapper[5002]: I1209 10:37:37.965595 5002 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" Dec 09 10:37:37 crc kubenswrapper[5002]: I1209 10:37:37.966186 5002 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3790327e397a17602262effb7ad54351ff4d450f68ad25cca18f8b3766e0d75d"} pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 10:37:37 crc kubenswrapper[5002]: I1209 10:37:37.966253 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" containerID="cri-o://3790327e397a17602262effb7ad54351ff4d450f68ad25cca18f8b3766e0d75d" gracePeriod=600 Dec 09 10:37:38 crc kubenswrapper[5002]: I1209 10:37:38.293512 5002 generic.go:334] "Generic (PLEG): container finished" podID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerID="3790327e397a17602262effb7ad54351ff4d450f68ad25cca18f8b3766e0d75d" exitCode=0 Dec 09 10:37:38 crc kubenswrapper[5002]: I1209 10:37:38.293563 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerDied","Data":"3790327e397a17602262effb7ad54351ff4d450f68ad25cca18f8b3766e0d75d"} Dec 09 10:37:38 crc kubenswrapper[5002]: I1209 10:37:38.293931 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerStarted","Data":"3e68990294ce88d90ac0cfc0896233c3dbff845f2071adedd38304920291ca53"} Dec 09 10:37:38 crc kubenswrapper[5002]: I1209 10:37:38.293955 5002 scope.go:117] "RemoveContainer" containerID="b603401504c120d86f4291f9eecacafd874b783a1acad3cc5e6c3c01f22fd43e" Dec 09 10:38:11 crc kubenswrapper[5002]: I1209 10:38:11.101963 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-svr4h"] Dec 09 10:38:11 crc kubenswrapper[5002]: E1209 10:38:11.103409 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7d72680-c522-44c0-af9f-7da83af73e04" containerName="registry-server" Dec 09 10:38:11 crc kubenswrapper[5002]: I1209 10:38:11.103429 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7d72680-c522-44c0-af9f-7da83af73e04" containerName="registry-server" Dec 09 10:38:11 crc kubenswrapper[5002]: E1209 10:38:11.103448 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="181215c3-1125-494e-a3ae-ac2bf8379d26" 
containerName="extract-utilities" Dec 09 10:38:11 crc kubenswrapper[5002]: I1209 10:38:11.103456 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="181215c3-1125-494e-a3ae-ac2bf8379d26" containerName="extract-utilities" Dec 09 10:38:11 crc kubenswrapper[5002]: E1209 10:38:11.103468 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7d72680-c522-44c0-af9f-7da83af73e04" containerName="extract-utilities" Dec 09 10:38:11 crc kubenswrapper[5002]: I1209 10:38:11.103475 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7d72680-c522-44c0-af9f-7da83af73e04" containerName="extract-utilities" Dec 09 10:38:11 crc kubenswrapper[5002]: E1209 10:38:11.103495 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7d72680-c522-44c0-af9f-7da83af73e04" containerName="extract-content" Dec 09 10:38:11 crc kubenswrapper[5002]: I1209 10:38:11.103503 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7d72680-c522-44c0-af9f-7da83af73e04" containerName="extract-content" Dec 09 10:38:11 crc kubenswrapper[5002]: E1209 10:38:11.103518 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="181215c3-1125-494e-a3ae-ac2bf8379d26" containerName="registry-server" Dec 09 10:38:11 crc kubenswrapper[5002]: I1209 10:38:11.103526 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="181215c3-1125-494e-a3ae-ac2bf8379d26" containerName="registry-server" Dec 09 10:38:11 crc kubenswrapper[5002]: E1209 10:38:11.103549 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="181215c3-1125-494e-a3ae-ac2bf8379d26" containerName="extract-content" Dec 09 10:38:11 crc kubenswrapper[5002]: I1209 10:38:11.103557 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="181215c3-1125-494e-a3ae-ac2bf8379d26" containerName="extract-content" Dec 09 10:38:11 crc kubenswrapper[5002]: I1209 10:38:11.103722 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7d72680-c522-44c0-af9f-7da83af73e04" containerName="registry-server" Dec 09 10:38:11 crc kubenswrapper[5002]: I1209 10:38:11.103739 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="181215c3-1125-494e-a3ae-ac2bf8379d26" containerName="registry-server" Dec 09 10:38:11 crc kubenswrapper[5002]: I1209 10:38:11.105112 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-svr4h" Dec 09 10:38:11 crc kubenswrapper[5002]: I1209 10:38:11.127164 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-svr4h"] Dec 09 10:38:11 crc kubenswrapper[5002]: I1209 10:38:11.164222 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9q2q\" (UniqueName: \"kubernetes.io/projected/df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8-kube-api-access-l9q2q\") pod \"certified-operators-svr4h\" (UID: \"df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8\") " pod="openshift-marketplace/certified-operators-svr4h" Dec 09 10:38:11 crc kubenswrapper[5002]: I1209 10:38:11.164292 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8-catalog-content\") pod \"certified-operators-svr4h\" (UID: \"df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8\") " pod="openshift-marketplace/certified-operators-svr4h" Dec 09 10:38:11 crc kubenswrapper[5002]: I1209 10:38:11.164318 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8-utilities\") pod \"certified-operators-svr4h\" (UID: \"df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8\") " pod="openshift-marketplace/certified-operators-svr4h" Dec 09 10:38:11 crc kubenswrapper[5002]: I1209 10:38:11.264930 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9q2q\" (UniqueName: \"kubernetes.io/projected/df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8-kube-api-access-l9q2q\") pod \"certified-operators-svr4h\" (UID: \"df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8\") " pod="openshift-marketplace/certified-operators-svr4h" Dec 09 10:38:11 crc kubenswrapper[5002]: I1209 10:38:11.264979 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8-catalog-content\") pod \"certified-operators-svr4h\" (UID: \"df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8\") " pod="openshift-marketplace/certified-operators-svr4h" Dec 09 10:38:11 crc kubenswrapper[5002]: I1209 10:38:11.265000 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8-utilities\") pod \"certified-operators-svr4h\" (UID: \"df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8\") " pod="openshift-marketplace/certified-operators-svr4h" Dec 09 10:38:11 crc kubenswrapper[5002]: I1209 10:38:11.265512 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8-utilities\") pod \"certified-operators-svr4h\" (UID: \"df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8\") " pod="openshift-marketplace/certified-operators-svr4h" Dec 09 10:38:11 crc kubenswrapper[5002]: I1209 10:38:11.265578 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8-catalog-content\") pod \"certified-operators-svr4h\" (UID: \"df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8\") " pod="openshift-marketplace/certified-operators-svr4h" Dec 09 10:38:11 crc kubenswrapper[5002]: I1209 10:38:11.285403 5002 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-l9q2q\" (UniqueName: \"kubernetes.io/projected/df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8-kube-api-access-l9q2q\") pod \"certified-operators-svr4h\" (UID: \"df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8\") " pod="openshift-marketplace/certified-operators-svr4h" Dec 09 10:38:11 crc kubenswrapper[5002]: I1209 10:38:11.433802 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-svr4h" Dec 09 10:38:11 crc kubenswrapper[5002]: I1209 10:38:11.714735 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-svr4h"] Dec 09 10:38:12 crc kubenswrapper[5002]: I1209 10:38:12.552160 5002 generic.go:334] "Generic (PLEG): container finished" podID="df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8" containerID="33e749c41ba224ee9e7f22810cb853f393b93442890a6466b83e4d291286904c" exitCode=0 Dec 09 10:38:12 crc kubenswrapper[5002]: I1209 10:38:12.552224 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-svr4h" event={"ID":"df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8","Type":"ContainerDied","Data":"33e749c41ba224ee9e7f22810cb853f393b93442890a6466b83e4d291286904c"} Dec 09 10:38:12 crc kubenswrapper[5002]: I1209 10:38:12.552464 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-svr4h" event={"ID":"df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8","Type":"ContainerStarted","Data":"89b0dfe4617c00e5aa8037902f0893887fdeb0ebe7dfb10a0a022874129b86b3"} Dec 09 10:38:13 crc kubenswrapper[5002]: I1209 10:38:13.564429 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-svr4h" event={"ID":"df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8","Type":"ContainerStarted","Data":"f68ad120dd00ba68c7861f57c8214beda8d703f15d9d68fd761336ed3a8fcf35"} Dec 09 10:38:14 crc kubenswrapper[5002]: I1209 10:38:14.578462 5002 generic.go:334] "Generic (PLEG): container finished" podID="df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8" containerID="f68ad120dd00ba68c7861f57c8214beda8d703f15d9d68fd761336ed3a8fcf35" exitCode=0 Dec 09 10:38:14 crc kubenswrapper[5002]: I1209 10:38:14.578542 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-svr4h" event={"ID":"df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8","Type":"ContainerDied","Data":"f68ad120dd00ba68c7861f57c8214beda8d703f15d9d68fd761336ed3a8fcf35"} Dec 09 10:38:16 crc kubenswrapper[5002]: I1209 10:38:16.593834 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-svr4h" event={"ID":"df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8","Type":"ContainerStarted","Data":"ca730e6e571d38cd66e396e0515a7dd8d15f398afcccc68a775d297893921518"} Dec 09 10:38:16 crc kubenswrapper[5002]: I1209 10:38:16.621102 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-svr4h" podStartSLOduration=2.6862169639999998 podStartE2EDuration="5.621083457s" podCreationTimestamp="2025-12-09 10:38:11 +0000 UTC" firstStartedPulling="2025-12-09 10:38:12.553842113 +0000 UTC m=+2224.945893194" lastFinishedPulling="2025-12-09 10:38:15.488708586 +0000 UTC m=+2227.880759687" observedRunningTime="2025-12-09 10:38:16.615290832 +0000 UTC m=+2229.007341943" watchObservedRunningTime="2025-12-09 10:38:16.621083457 +0000 UTC m=+2229.013134538" Dec 09 10:38:17 crc kubenswrapper[5002]: I1209 10:38:17.678282 5002 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-kjhjt"] Dec 09 10:38:17 crc kubenswrapper[5002]: I1209 10:38:17.682277 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kjhjt" Dec 09 10:38:17 crc kubenswrapper[5002]: I1209 10:38:17.690270 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kjhjt"] Dec 09 10:38:17 crc kubenswrapper[5002]: I1209 10:38:17.856094 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/937441cb-575f-4ab9-a63b-9eb51731c4a9-utilities\") pod \"community-operators-kjhjt\" (UID: \"937441cb-575f-4ab9-a63b-9eb51731c4a9\") " pod="openshift-marketplace/community-operators-kjhjt" Dec 09 10:38:17 crc kubenswrapper[5002]: I1209 10:38:17.856178 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/937441cb-575f-4ab9-a63b-9eb51731c4a9-catalog-content\") pod \"community-operators-kjhjt\" (UID: \"937441cb-575f-4ab9-a63b-9eb51731c4a9\") " pod="openshift-marketplace/community-operators-kjhjt" Dec 09 10:38:17 crc kubenswrapper[5002]: I1209 10:38:17.856242 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s55sl\" (UniqueName: \"kubernetes.io/projected/937441cb-575f-4ab9-a63b-9eb51731c4a9-kube-api-access-s55sl\") pod \"community-operators-kjhjt\" (UID: \"937441cb-575f-4ab9-a63b-9eb51731c4a9\") " pod="openshift-marketplace/community-operators-kjhjt" Dec 09 10:38:17 crc kubenswrapper[5002]: I1209 10:38:17.957557 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/937441cb-575f-4ab9-a63b-9eb51731c4a9-utilities\") pod \"community-operators-kjhjt\" (UID: \"937441cb-575f-4ab9-a63b-9eb51731c4a9\") " pod="openshift-marketplace/community-operators-kjhjt" Dec 09 10:38:17 crc kubenswrapper[5002]: I1209 10:38:17.957619 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/937441cb-575f-4ab9-a63b-9eb51731c4a9-catalog-content\") pod \"community-operators-kjhjt\" (UID: \"937441cb-575f-4ab9-a63b-9eb51731c4a9\") " pod="openshift-marketplace/community-operators-kjhjt" Dec 09 10:38:17 crc kubenswrapper[5002]: I1209 10:38:17.957649 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s55sl\" (UniqueName: \"kubernetes.io/projected/937441cb-575f-4ab9-a63b-9eb51731c4a9-kube-api-access-s55sl\") pod \"community-operators-kjhjt\" (UID: \"937441cb-575f-4ab9-a63b-9eb51731c4a9\") " pod="openshift-marketplace/community-operators-kjhjt" Dec 09 10:38:17 crc kubenswrapper[5002]: I1209 10:38:17.958448 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/937441cb-575f-4ab9-a63b-9eb51731c4a9-utilities\") pod \"community-operators-kjhjt\" (UID: \"937441cb-575f-4ab9-a63b-9eb51731c4a9\") " pod="openshift-marketplace/community-operators-kjhjt" Dec 09 10:38:17 crc kubenswrapper[5002]: I1209 10:38:17.958563 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/937441cb-575f-4ab9-a63b-9eb51731c4a9-catalog-content\") pod \"community-operators-kjhjt\" (UID: 
\"937441cb-575f-4ab9-a63b-9eb51731c4a9\") " pod="openshift-marketplace/community-operators-kjhjt" Dec 09 10:38:17 crc kubenswrapper[5002]: I1209 10:38:17.988803 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s55sl\" (UniqueName: \"kubernetes.io/projected/937441cb-575f-4ab9-a63b-9eb51731c4a9-kube-api-access-s55sl\") pod \"community-operators-kjhjt\" (UID: \"937441cb-575f-4ab9-a63b-9eb51731c4a9\") " pod="openshift-marketplace/community-operators-kjhjt" Dec 09 10:38:18 crc kubenswrapper[5002]: I1209 10:38:18.001579 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kjhjt" Dec 09 10:38:18 crc kubenswrapper[5002]: I1209 10:38:18.497463 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kjhjt"] Dec 09 10:38:18 crc kubenswrapper[5002]: W1209 10:38:18.503365 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod937441cb_575f_4ab9_a63b_9eb51731c4a9.slice/crio-cf5ca29cfcafcdfa9e5858e26d7303a90a60b4965ddc143cc4c76ad088b4e2d9 WatchSource:0}: Error finding container cf5ca29cfcafcdfa9e5858e26d7303a90a60b4965ddc143cc4c76ad088b4e2d9: Status 404 returned error can't find the container with id cf5ca29cfcafcdfa9e5858e26d7303a90a60b4965ddc143cc4c76ad088b4e2d9 Dec 09 10:38:18 crc kubenswrapper[5002]: I1209 10:38:18.607032 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kjhjt" event={"ID":"937441cb-575f-4ab9-a63b-9eb51731c4a9","Type":"ContainerStarted","Data":"cf5ca29cfcafcdfa9e5858e26d7303a90a60b4965ddc143cc4c76ad088b4e2d9"} Dec 09 10:38:20 crc kubenswrapper[5002]: I1209 10:38:20.621939 5002 generic.go:334] "Generic (PLEG): container finished" podID="937441cb-575f-4ab9-a63b-9eb51731c4a9" containerID="394fbf0414e22369cb866ca6f41f29126f596f6cb7d409d860771cb6b3728194" exitCode=0 Dec 09 10:38:20 crc kubenswrapper[5002]: I1209 10:38:20.622012 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kjhjt" event={"ID":"937441cb-575f-4ab9-a63b-9eb51731c4a9","Type":"ContainerDied","Data":"394fbf0414e22369cb866ca6f41f29126f596f6cb7d409d860771cb6b3728194"} Dec 09 10:38:21 crc kubenswrapper[5002]: I1209 10:38:21.434123 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-svr4h" Dec 09 10:38:21 crc kubenswrapper[5002]: I1209 10:38:21.434501 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-svr4h" Dec 09 10:38:21 crc kubenswrapper[5002]: I1209 10:38:21.489351 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-svr4h" Dec 09 10:38:21 crc kubenswrapper[5002]: I1209 10:38:21.647537 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kjhjt" event={"ID":"937441cb-575f-4ab9-a63b-9eb51731c4a9","Type":"ContainerStarted","Data":"da3b09e9ac9ce7d241fe62b1d9f5993b0f55fa2a6b4c8d747e8255084b67507c"} Dec 09 10:38:21 crc kubenswrapper[5002]: I1209 10:38:21.689944 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-svr4h" Dec 09 10:38:22 crc kubenswrapper[5002]: I1209 10:38:22.656507 5002 generic.go:334] "Generic (PLEG): container finished" 
podID="937441cb-575f-4ab9-a63b-9eb51731c4a9" containerID="da3b09e9ac9ce7d241fe62b1d9f5993b0f55fa2a6b4c8d747e8255084b67507c" exitCode=0 Dec 09 10:38:22 crc kubenswrapper[5002]: I1209 10:38:22.656554 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kjhjt" event={"ID":"937441cb-575f-4ab9-a63b-9eb51731c4a9","Type":"ContainerDied","Data":"da3b09e9ac9ce7d241fe62b1d9f5993b0f55fa2a6b4c8d747e8255084b67507c"} Dec 09 10:38:23 crc kubenswrapper[5002]: I1209 10:38:23.475792 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-svr4h"] Dec 09 10:38:23 crc kubenswrapper[5002]: I1209 10:38:23.664779 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-svr4h" podUID="df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8" containerName="registry-server" containerID="cri-o://ca730e6e571d38cd66e396e0515a7dd8d15f398afcccc68a775d297893921518" gracePeriod=2 Dec 09 10:38:26 crc kubenswrapper[5002]: I1209 10:38:26.700956 5002 generic.go:334] "Generic (PLEG): container finished" podID="df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8" containerID="ca730e6e571d38cd66e396e0515a7dd8d15f398afcccc68a775d297893921518" exitCode=0 Dec 09 10:38:26 crc kubenswrapper[5002]: I1209 10:38:26.701533 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-svr4h" event={"ID":"df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8","Type":"ContainerDied","Data":"ca730e6e571d38cd66e396e0515a7dd8d15f398afcccc68a775d297893921518"} Dec 09 10:38:26 crc kubenswrapper[5002]: I1209 10:38:26.704908 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kjhjt" event={"ID":"937441cb-575f-4ab9-a63b-9eb51731c4a9","Type":"ContainerStarted","Data":"f008173f676d6ee54139e060e721d1959ced485d9f51128c135b1d6a3123331b"} Dec 09 10:38:26 crc kubenswrapper[5002]: I1209 10:38:26.750282 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kjhjt" podStartSLOduration=4.377805914 podStartE2EDuration="9.750260062s" podCreationTimestamp="2025-12-09 10:38:17 +0000 UTC" firstStartedPulling="2025-12-09 10:38:20.62424113 +0000 UTC m=+2233.016292211" lastFinishedPulling="2025-12-09 10:38:25.996695278 +0000 UTC m=+2238.388746359" observedRunningTime="2025-12-09 10:38:26.729573716 +0000 UTC m=+2239.121624807" watchObservedRunningTime="2025-12-09 10:38:26.750260062 +0000 UTC m=+2239.142311143" Dec 09 10:38:26 crc kubenswrapper[5002]: I1209 10:38:26.770299 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-svr4h" Dec 09 10:38:26 crc kubenswrapper[5002]: I1209 10:38:26.886923 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8-catalog-content\") pod \"df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8\" (UID: \"df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8\") " Dec 09 10:38:26 crc kubenswrapper[5002]: I1209 10:38:26.887116 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8-utilities\") pod \"df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8\" (UID: \"df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8\") " Dec 09 10:38:26 crc kubenswrapper[5002]: I1209 10:38:26.887171 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9q2q\" (UniqueName: \"kubernetes.io/projected/df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8-kube-api-access-l9q2q\") pod \"df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8\" (UID: \"df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8\") " Dec 09 10:38:26 crc kubenswrapper[5002]: I1209 10:38:26.888651 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8-utilities" (OuterVolumeSpecName: "utilities") pod "df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8" (UID: "df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:38:26 crc kubenswrapper[5002]: I1209 10:38:26.895338 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8-kube-api-access-l9q2q" (OuterVolumeSpecName: "kube-api-access-l9q2q") pod "df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8" (UID: "df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8"). InnerVolumeSpecName "kube-api-access-l9q2q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:38:26 crc kubenswrapper[5002]: I1209 10:38:26.943678 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8" (UID: "df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:38:26 crc kubenswrapper[5002]: I1209 10:38:26.989167 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 10:38:26 crc kubenswrapper[5002]: I1209 10:38:26.989216 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9q2q\" (UniqueName: \"kubernetes.io/projected/df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8-kube-api-access-l9q2q\") on node \"crc\" DevicePath \"\"" Dec 09 10:38:26 crc kubenswrapper[5002]: I1209 10:38:26.989231 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 10:38:27 crc kubenswrapper[5002]: I1209 10:38:27.716645 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-svr4h" event={"ID":"df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8","Type":"ContainerDied","Data":"89b0dfe4617c00e5aa8037902f0893887fdeb0ebe7dfb10a0a022874129b86b3"} Dec 09 10:38:27 crc kubenswrapper[5002]: I1209 10:38:27.716698 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-svr4h" Dec 09 10:38:27 crc kubenswrapper[5002]: I1209 10:38:27.716712 5002 scope.go:117] "RemoveContainer" containerID="ca730e6e571d38cd66e396e0515a7dd8d15f398afcccc68a775d297893921518" Dec 09 10:38:27 crc kubenswrapper[5002]: I1209 10:38:27.737474 5002 scope.go:117] "RemoveContainer" containerID="f68ad120dd00ba68c7861f57c8214beda8d703f15d9d68fd761336ed3a8fcf35" Dec 09 10:38:27 crc kubenswrapper[5002]: I1209 10:38:27.752386 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-svr4h"] Dec 09 10:38:27 crc kubenswrapper[5002]: I1209 10:38:27.757421 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-svr4h"] Dec 09 10:38:27 crc kubenswrapper[5002]: I1209 10:38:27.768757 5002 scope.go:117] "RemoveContainer" containerID="33e749c41ba224ee9e7f22810cb853f393b93442890a6466b83e4d291286904c" Dec 09 10:38:28 crc kubenswrapper[5002]: I1209 10:38:28.002650 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kjhjt" Dec 09 10:38:28 crc kubenswrapper[5002]: I1209 10:38:28.002717 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kjhjt" Dec 09 10:38:28 crc kubenswrapper[5002]: I1209 10:38:28.071506 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8" path="/var/lib/kubelet/pods/df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8/volumes" Dec 09 10:38:28 crc kubenswrapper[5002]: I1209 10:38:28.073254 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kjhjt" Dec 09 10:38:38 crc kubenswrapper[5002]: I1209 10:38:38.074081 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kjhjt" Dec 09 10:38:38 crc kubenswrapper[5002]: I1209 10:38:38.125141 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kjhjt"] Dec 09 10:38:38 crc kubenswrapper[5002]: I1209 10:38:38.849466 5002 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-marketplace/community-operators-kjhjt" podUID="937441cb-575f-4ab9-a63b-9eb51731c4a9" containerName="registry-server" containerID="cri-o://f008173f676d6ee54139e060e721d1959ced485d9f51128c135b1d6a3123331b" gracePeriod=2 Dec 09 10:38:39 crc kubenswrapper[5002]: I1209 10:38:39.863883 5002 generic.go:334] "Generic (PLEG): container finished" podID="937441cb-575f-4ab9-a63b-9eb51731c4a9" containerID="f008173f676d6ee54139e060e721d1959ced485d9f51128c135b1d6a3123331b" exitCode=0 Dec 09 10:38:39 crc kubenswrapper[5002]: I1209 10:38:39.863972 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kjhjt" event={"ID":"937441cb-575f-4ab9-a63b-9eb51731c4a9","Type":"ContainerDied","Data":"f008173f676d6ee54139e060e721d1959ced485d9f51128c135b1d6a3123331b"} Dec 09 10:38:40 crc kubenswrapper[5002]: I1209 10:38:40.425231 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kjhjt" Dec 09 10:38:40 crc kubenswrapper[5002]: I1209 10:38:40.592163 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s55sl\" (UniqueName: \"kubernetes.io/projected/937441cb-575f-4ab9-a63b-9eb51731c4a9-kube-api-access-s55sl\") pod \"937441cb-575f-4ab9-a63b-9eb51731c4a9\" (UID: \"937441cb-575f-4ab9-a63b-9eb51731c4a9\") " Dec 09 10:38:40 crc kubenswrapper[5002]: I1209 10:38:40.592251 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/937441cb-575f-4ab9-a63b-9eb51731c4a9-catalog-content\") pod \"937441cb-575f-4ab9-a63b-9eb51731c4a9\" (UID: \"937441cb-575f-4ab9-a63b-9eb51731c4a9\") " Dec 09 10:38:40 crc kubenswrapper[5002]: I1209 10:38:40.592315 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/937441cb-575f-4ab9-a63b-9eb51731c4a9-utilities\") pod \"937441cb-575f-4ab9-a63b-9eb51731c4a9\" (UID: \"937441cb-575f-4ab9-a63b-9eb51731c4a9\") " Dec 09 10:38:40 crc kubenswrapper[5002]: I1209 10:38:40.593397 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/937441cb-575f-4ab9-a63b-9eb51731c4a9-utilities" (OuterVolumeSpecName: "utilities") pod "937441cb-575f-4ab9-a63b-9eb51731c4a9" (UID: "937441cb-575f-4ab9-a63b-9eb51731c4a9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:38:40 crc kubenswrapper[5002]: I1209 10:38:40.597741 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/937441cb-575f-4ab9-a63b-9eb51731c4a9-kube-api-access-s55sl" (OuterVolumeSpecName: "kube-api-access-s55sl") pod "937441cb-575f-4ab9-a63b-9eb51731c4a9" (UID: "937441cb-575f-4ab9-a63b-9eb51731c4a9"). InnerVolumeSpecName "kube-api-access-s55sl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:38:40 crc kubenswrapper[5002]: I1209 10:38:40.640103 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/937441cb-575f-4ab9-a63b-9eb51731c4a9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "937441cb-575f-4ab9-a63b-9eb51731c4a9" (UID: "937441cb-575f-4ab9-a63b-9eb51731c4a9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:38:40 crc kubenswrapper[5002]: I1209 10:38:40.693767 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/937441cb-575f-4ab9-a63b-9eb51731c4a9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 10:38:40 crc kubenswrapper[5002]: I1209 10:38:40.693804 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/937441cb-575f-4ab9-a63b-9eb51731c4a9-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 10:38:40 crc kubenswrapper[5002]: I1209 10:38:40.693833 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s55sl\" (UniqueName: \"kubernetes.io/projected/937441cb-575f-4ab9-a63b-9eb51731c4a9-kube-api-access-s55sl\") on node \"crc\" DevicePath \"\"" Dec 09 10:38:40 crc kubenswrapper[5002]: I1209 10:38:40.876315 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kjhjt" event={"ID":"937441cb-575f-4ab9-a63b-9eb51731c4a9","Type":"ContainerDied","Data":"cf5ca29cfcafcdfa9e5858e26d7303a90a60b4965ddc143cc4c76ad088b4e2d9"} Dec 09 10:38:40 crc kubenswrapper[5002]: I1209 10:38:40.876367 5002 scope.go:117] "RemoveContainer" containerID="f008173f676d6ee54139e060e721d1959ced485d9f51128c135b1d6a3123331b" Dec 09 10:38:40 crc kubenswrapper[5002]: I1209 10:38:40.876382 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kjhjt" Dec 09 10:38:40 crc kubenswrapper[5002]: I1209 10:38:40.905472 5002 scope.go:117] "RemoveContainer" containerID="da3b09e9ac9ce7d241fe62b1d9f5993b0f55fa2a6b4c8d747e8255084b67507c" Dec 09 10:38:40 crc kubenswrapper[5002]: I1209 10:38:40.917720 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kjhjt"] Dec 09 10:38:40 crc kubenswrapper[5002]: I1209 10:38:40.927173 5002 scope.go:117] "RemoveContainer" containerID="394fbf0414e22369cb866ca6f41f29126f596f6cb7d409d860771cb6b3728194" Dec 09 10:38:40 crc kubenswrapper[5002]: I1209 10:38:40.927894 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kjhjt"] Dec 09 10:38:42 crc kubenswrapper[5002]: I1209 10:38:42.074541 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="937441cb-575f-4ab9-a63b-9eb51731c4a9" path="/var/lib/kubelet/pods/937441cb-575f-4ab9-a63b-9eb51731c4a9/volumes" Dec 09 10:40:07 crc kubenswrapper[5002]: I1209 10:40:07.965068 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 10:40:07 crc kubenswrapper[5002]: I1209 10:40:07.965966 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 10:40:37 crc kubenswrapper[5002]: I1209 10:40:37.965287 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": 
Dec 09 10:40:37 crc kubenswrapper[5002]: I1209 10:40:37.966032 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 09 10:41:07 crc kubenswrapper[5002]: I1209 10:41:07.964987 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 09 10:41:07 crc kubenswrapper[5002]: I1209 10:41:07.965644 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 09 10:41:07 crc kubenswrapper[5002]: I1209 10:41:07.965796 5002 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6"
Dec 09 10:41:07 crc kubenswrapper[5002]: I1209 10:41:07.966459 5002 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3e68990294ce88d90ac0cfc0896233c3dbff845f2071adedd38304920291ca53"} pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 09 10:41:07 crc kubenswrapper[5002]: I1209 10:41:07.966511 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" containerID="cri-o://3e68990294ce88d90ac0cfc0896233c3dbff845f2071adedd38304920291ca53" gracePeriod=600
Dec 09 10:41:08 crc kubenswrapper[5002]: E1209 10:41:08.108085 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 10:41:09 crc kubenswrapper[5002]: I1209 10:41:09.015556 5002 generic.go:334] "Generic (PLEG): container finished" podID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerID="3e68990294ce88d90ac0cfc0896233c3dbff845f2071adedd38304920291ca53" exitCode=0
Dec 09 10:41:09 crc kubenswrapper[5002]: I1209 10:41:09.015611 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerDied","Data":"3e68990294ce88d90ac0cfc0896233c3dbff845f2071adedd38304920291ca53"}
Dec 09 10:41:09 crc kubenswrapper[5002]: I1209 10:41:09.015658 5002 scope.go:117] "RemoveContainer" containerID="3790327e397a17602262effb7ad54351ff4d450f68ad25cca18f8b3766e0d75d"
Dec 09 10:41:09 crc kubenswrapper[5002]: I1209 10:41:09.016236 5002 scope.go:117] "RemoveContainer" containerID="3e68990294ce88d90ac0cfc0896233c3dbff845f2071adedd38304920291ca53"
Dec 09 10:41:09 crc kubenswrapper[5002]: E1209 10:41:09.016505 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 10:41:23 crc kubenswrapper[5002]: I1209 10:41:23.060703 5002 scope.go:117] "RemoveContainer" containerID="3e68990294ce88d90ac0cfc0896233c3dbff845f2071adedd38304920291ca53"
Dec 09 10:41:23 crc kubenswrapper[5002]: E1209 10:41:23.061438 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 10:41:37 crc kubenswrapper[5002]: I1209 10:41:37.061041 5002 scope.go:117] "RemoveContainer" containerID="3e68990294ce88d90ac0cfc0896233c3dbff845f2071adedd38304920291ca53"
Dec 09 10:41:37 crc kubenswrapper[5002]: E1209 10:41:37.061935 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 10:41:52 crc kubenswrapper[5002]: I1209 10:41:52.060533 5002 scope.go:117] "RemoveContainer" containerID="3e68990294ce88d90ac0cfc0896233c3dbff845f2071adedd38304920291ca53"
Dec 09 10:41:52 crc kubenswrapper[5002]: E1209 10:41:52.061410 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 10:42:04 crc kubenswrapper[5002]: I1209 10:42:04.060640 5002 scope.go:117] "RemoveContainer" containerID="3e68990294ce88d90ac0cfc0896233c3dbff845f2071adedd38304920291ca53"
Dec 09 10:42:04 crc kubenswrapper[5002]: E1209 10:42:04.061489 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 10:42:18 crc kubenswrapper[5002]: I1209 10:42:18.063945 5002 scope.go:117] "RemoveContainer" containerID="3e68990294ce88d90ac0cfc0896233c3dbff845f2071adedd38304920291ca53"
Dec 09 10:42:18 crc kubenswrapper[5002]: E1209 10:42:18.064786 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
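The recurring "back-off 5m0s" errors from here on are kubelet's CrashLoopBackOff: each failed restart roughly doubles the wait until it saturates. A sketch of that schedule; only the 5m cap is quoted in the log itself, the 10s base and doubling factor are assumptions about kubelet's defaults:

```go
package main

import (
	"fmt"
	"time"
)

// backoff returns the assumed CrashLoopBackOff delay after a given number
// of failed restarts: exponential doubling from 10s, capped at 5m0s.
func backoff(restarts int) time.Duration {
	d := 10 * time.Second
	for i := 0; i < restarts; i++ {
		d *= 2
		if d >= 5*time.Minute {
			return 5 * time.Minute
		}
	}
	return d
}

func main() {
	for r := 0; r <= 6; r++ {
		fmt.Printf("restart %d -> wait %v\n", r, backoff(r)) // 10s, 20s, ... 5m0s
	}
}
```

Note that the RemoveContainer / "Error syncing pod" pairs below repeat every ten to fifteen seconds regardless: that is the pod worker re-syncing and being refused while the backoff timer runs, not the backoff interval itself.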
Dec 09 10:42:29 crc kubenswrapper[5002]: I1209 10:42:29.060316 5002 scope.go:117] "RemoveContainer" containerID="3e68990294ce88d90ac0cfc0896233c3dbff845f2071adedd38304920291ca53"
Dec 09 10:42:29 crc kubenswrapper[5002]: E1209 10:42:29.061161 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 10:42:41 crc kubenswrapper[5002]: I1209 10:42:41.060872 5002 scope.go:117] "RemoveContainer" containerID="3e68990294ce88d90ac0cfc0896233c3dbff845f2071adedd38304920291ca53"
Dec 09 10:42:41 crc kubenswrapper[5002]: E1209 10:42:41.061761 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 10:42:53 crc kubenswrapper[5002]: I1209 10:42:53.061243 5002 scope.go:117] "RemoveContainer" containerID="3e68990294ce88d90ac0cfc0896233c3dbff845f2071adedd38304920291ca53"
Dec 09 10:42:53 crc kubenswrapper[5002]: E1209 10:42:53.064328 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 10:43:04 crc kubenswrapper[5002]: I1209 10:43:04.060292 5002 scope.go:117] "RemoveContainer" containerID="3e68990294ce88d90ac0cfc0896233c3dbff845f2071adedd38304920291ca53"
Dec 09 10:43:04 crc kubenswrapper[5002]: E1209 10:43:04.061099 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 10:43:16 crc kubenswrapper[5002]: I1209 10:43:16.059969 5002 scope.go:117] "RemoveContainer" containerID="3e68990294ce88d90ac0cfc0896233c3dbff845f2071adedd38304920291ca53"
Dec 09 10:43:16 crc kubenswrapper[5002]: E1209 10:43:16.060782 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 10:43:28 crc kubenswrapper[5002]: I1209 10:43:28.066389 5002 scope.go:117] "RemoveContainer" containerID="3e68990294ce88d90ac0cfc0896233c3dbff845f2071adedd38304920291ca53"
Dec 09 10:43:28 crc kubenswrapper[5002]: E1209 10:43:28.067349 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 10:43:40 crc kubenswrapper[5002]: I1209 10:43:40.061491 5002 scope.go:117] "RemoveContainer" containerID="3e68990294ce88d90ac0cfc0896233c3dbff845f2071adedd38304920291ca53"
Dec 09 10:43:40 crc kubenswrapper[5002]: E1209 10:43:40.062541 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 10:43:53 crc kubenswrapper[5002]: I1209 10:43:53.059932 5002 scope.go:117] "RemoveContainer" containerID="3e68990294ce88d90ac0cfc0896233c3dbff845f2071adedd38304920291ca53"
Dec 09 10:43:53 crc kubenswrapper[5002]: E1209 10:43:53.060927 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 10:44:04 crc kubenswrapper[5002]: I1209 10:44:04.060423 5002 scope.go:117] "RemoveContainer" containerID="3e68990294ce88d90ac0cfc0896233c3dbff845f2071adedd38304920291ca53"
Dec 09 10:44:04 crc kubenswrapper[5002]: E1209 10:44:04.061247 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 10:44:17 crc kubenswrapper[5002]: I1209 10:44:17.061292 5002 scope.go:117] "RemoveContainer" containerID="3e68990294ce88d90ac0cfc0896233c3dbff845f2071adedd38304920291ca53"
Dec 09 10:44:17 crc kubenswrapper[5002]: E1209 10:44:17.062462 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 10:44:32 crc kubenswrapper[5002]: I1209 10:44:32.060245 5002 scope.go:117] "RemoveContainer" containerID="3e68990294ce88d90ac0cfc0896233c3dbff845f2071adedd38304920291ca53" Dec 09 10:44:32 crc kubenswrapper[5002]: E1209 10:44:32.061081 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 10:44:38 crc kubenswrapper[5002]: I1209 10:44:38.252862 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-85tbw"] Dec 09 10:44:38 crc kubenswrapper[5002]: E1209 10:44:38.253906 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8" containerName="extract-utilities" Dec 09 10:44:38 crc kubenswrapper[5002]: I1209 10:44:38.253929 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8" containerName="extract-utilities" Dec 09 10:44:38 crc kubenswrapper[5002]: E1209 10:44:38.253976 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="937441cb-575f-4ab9-a63b-9eb51731c4a9" containerName="extract-utilities" Dec 09 10:44:38 crc kubenswrapper[5002]: I1209 10:44:38.253989 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="937441cb-575f-4ab9-a63b-9eb51731c4a9" containerName="extract-utilities" Dec 09 10:44:38 crc kubenswrapper[5002]: E1209 10:44:38.254017 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8" containerName="registry-server" Dec 09 10:44:38 crc kubenswrapper[5002]: I1209 10:44:38.254029 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8" containerName="registry-server" Dec 09 10:44:38 crc kubenswrapper[5002]: E1209 10:44:38.254042 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="937441cb-575f-4ab9-a63b-9eb51731c4a9" containerName="registry-server" Dec 09 10:44:38 crc kubenswrapper[5002]: I1209 10:44:38.254053 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="937441cb-575f-4ab9-a63b-9eb51731c4a9" containerName="registry-server" Dec 09 10:44:38 crc kubenswrapper[5002]: E1209 10:44:38.254070 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="937441cb-575f-4ab9-a63b-9eb51731c4a9" containerName="extract-content" Dec 09 10:44:38 crc kubenswrapper[5002]: I1209 10:44:38.254081 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="937441cb-575f-4ab9-a63b-9eb51731c4a9" containerName="extract-content" Dec 09 10:44:38 crc kubenswrapper[5002]: E1209 10:44:38.254097 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8" containerName="extract-content" Dec 09 10:44:38 crc kubenswrapper[5002]: I1209 10:44:38.254108 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8" containerName="extract-content" Dec 09 10:44:38 crc kubenswrapper[5002]: I1209 10:44:38.254335 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="df7fa3b0-7a6e-43d0-93ad-ddec2176d1a8" containerName="registry-server" Dec 09 
10:44:38 crc kubenswrapper[5002]: I1209 10:44:38.254378 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="937441cb-575f-4ab9-a63b-9eb51731c4a9" containerName="registry-server" Dec 09 10:44:38 crc kubenswrapper[5002]: I1209 10:44:38.256147 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-85tbw" Dec 09 10:44:38 crc kubenswrapper[5002]: I1209 10:44:38.265184 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-85tbw"] Dec 09 10:44:38 crc kubenswrapper[5002]: I1209 10:44:38.385765 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfs92\" (UniqueName: \"kubernetes.io/projected/025ca558-ee3a-47bb-ae57-f9658d29a256-kube-api-access-wfs92\") pod \"redhat-marketplace-85tbw\" (UID: \"025ca558-ee3a-47bb-ae57-f9658d29a256\") " pod="openshift-marketplace/redhat-marketplace-85tbw" Dec 09 10:44:38 crc kubenswrapper[5002]: I1209 10:44:38.386160 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/025ca558-ee3a-47bb-ae57-f9658d29a256-utilities\") pod \"redhat-marketplace-85tbw\" (UID: \"025ca558-ee3a-47bb-ae57-f9658d29a256\") " pod="openshift-marketplace/redhat-marketplace-85tbw" Dec 09 10:44:38 crc kubenswrapper[5002]: I1209 10:44:38.386188 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/025ca558-ee3a-47bb-ae57-f9658d29a256-catalog-content\") pod \"redhat-marketplace-85tbw\" (UID: \"025ca558-ee3a-47bb-ae57-f9658d29a256\") " pod="openshift-marketplace/redhat-marketplace-85tbw" Dec 09 10:44:38 crc kubenswrapper[5002]: I1209 10:44:38.487785 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/025ca558-ee3a-47bb-ae57-f9658d29a256-utilities\") pod \"redhat-marketplace-85tbw\" (UID: \"025ca558-ee3a-47bb-ae57-f9658d29a256\") " pod="openshift-marketplace/redhat-marketplace-85tbw" Dec 09 10:44:38 crc kubenswrapper[5002]: I1209 10:44:38.487870 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/025ca558-ee3a-47bb-ae57-f9658d29a256-catalog-content\") pod \"redhat-marketplace-85tbw\" (UID: \"025ca558-ee3a-47bb-ae57-f9658d29a256\") " pod="openshift-marketplace/redhat-marketplace-85tbw" Dec 09 10:44:38 crc kubenswrapper[5002]: I1209 10:44:38.487957 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfs92\" (UniqueName: \"kubernetes.io/projected/025ca558-ee3a-47bb-ae57-f9658d29a256-kube-api-access-wfs92\") pod \"redhat-marketplace-85tbw\" (UID: \"025ca558-ee3a-47bb-ae57-f9658d29a256\") " pod="openshift-marketplace/redhat-marketplace-85tbw" Dec 09 10:44:38 crc kubenswrapper[5002]: I1209 10:44:38.488293 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/025ca558-ee3a-47bb-ae57-f9658d29a256-utilities\") pod \"redhat-marketplace-85tbw\" (UID: \"025ca558-ee3a-47bb-ae57-f9658d29a256\") " pod="openshift-marketplace/redhat-marketplace-85tbw" Dec 09 10:44:38 crc kubenswrapper[5002]: I1209 10:44:38.488343 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/025ca558-ee3a-47bb-ae57-f9658d29a256-catalog-content\") pod \"redhat-marketplace-85tbw\" (UID: \"025ca558-ee3a-47bb-ae57-f9658d29a256\") " pod="openshift-marketplace/redhat-marketplace-85tbw" Dec 09 10:44:38 crc kubenswrapper[5002]: I1209 10:44:38.507398 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfs92\" (UniqueName: \"kubernetes.io/projected/025ca558-ee3a-47bb-ae57-f9658d29a256-kube-api-access-wfs92\") pod \"redhat-marketplace-85tbw\" (UID: \"025ca558-ee3a-47bb-ae57-f9658d29a256\") " pod="openshift-marketplace/redhat-marketplace-85tbw" Dec 09 10:44:38 crc kubenswrapper[5002]: I1209 10:44:38.576614 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-85tbw" Dec 09 10:44:39 crc kubenswrapper[5002]: I1209 10:44:39.041001 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-85tbw"] Dec 09 10:44:39 crc kubenswrapper[5002]: I1209 10:44:39.564022 5002 generic.go:334] "Generic (PLEG): container finished" podID="025ca558-ee3a-47bb-ae57-f9658d29a256" containerID="6c289f9dae3efecadfbdda25f2188c3e99e8b2b7cd5c2848e0333ba25cbd213e" exitCode=0 Dec 09 10:44:39 crc kubenswrapper[5002]: I1209 10:44:39.564240 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-85tbw" event={"ID":"025ca558-ee3a-47bb-ae57-f9658d29a256","Type":"ContainerDied","Data":"6c289f9dae3efecadfbdda25f2188c3e99e8b2b7cd5c2848e0333ba25cbd213e"} Dec 09 10:44:39 crc kubenswrapper[5002]: I1209 10:44:39.564361 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-85tbw" event={"ID":"025ca558-ee3a-47bb-ae57-f9658d29a256","Type":"ContainerStarted","Data":"0d588bf8e881a0667e8463a96e9167bb8b14d57625adc441e8fa4ace895cede5"} Dec 09 10:44:39 crc kubenswrapper[5002]: I1209 10:44:39.567480 5002 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 10:44:41 crc kubenswrapper[5002]: I1209 10:44:41.584895 5002 generic.go:334] "Generic (PLEG): container finished" podID="025ca558-ee3a-47bb-ae57-f9658d29a256" containerID="406be8215917dd194d7da6621c29ac3bbd04bbad657ecb1c7f98cf66ebf95eac" exitCode=0 Dec 09 10:44:41 crc kubenswrapper[5002]: I1209 10:44:41.585003 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-85tbw" event={"ID":"025ca558-ee3a-47bb-ae57-f9658d29a256","Type":"ContainerDied","Data":"406be8215917dd194d7da6621c29ac3bbd04bbad657ecb1c7f98cf66ebf95eac"} Dec 09 10:44:42 crc kubenswrapper[5002]: I1209 10:44:42.596159 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-85tbw" event={"ID":"025ca558-ee3a-47bb-ae57-f9658d29a256","Type":"ContainerStarted","Data":"34a5c82ebf69e570e28e1a71a7f61c9cbc6a40d5e2856983db6f90fb86c1e9ec"} Dec 09 10:44:42 crc kubenswrapper[5002]: I1209 10:44:42.616680 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-85tbw" podStartSLOduration=2.212850226 podStartE2EDuration="4.61665289s" podCreationTimestamp="2025-12-09 10:44:38 +0000 UTC" firstStartedPulling="2025-12-09 10:44:39.56717732 +0000 UTC m=+2611.959228401" lastFinishedPulling="2025-12-09 10:44:41.970979984 +0000 UTC m=+2614.363031065" observedRunningTime="2025-12-09 10:44:42.61291785 +0000 UTC m=+2615.004968951" watchObservedRunningTime="2025-12-09 10:44:42.61665289 +0000 UTC 
m=+2615.008703981" Dec 09 10:44:43 crc kubenswrapper[5002]: I1209 10:44:43.060773 5002 scope.go:117] "RemoveContainer" containerID="3e68990294ce88d90ac0cfc0896233c3dbff845f2071adedd38304920291ca53" Dec 09 10:44:43 crc kubenswrapper[5002]: E1209 10:44:43.061087 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 10:44:48 crc kubenswrapper[5002]: I1209 10:44:48.577505 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-85tbw" Dec 09 10:44:48 crc kubenswrapper[5002]: I1209 10:44:48.578303 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-85tbw" Dec 09 10:44:48 crc kubenswrapper[5002]: I1209 10:44:48.625384 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-85tbw" Dec 09 10:44:48 crc kubenswrapper[5002]: I1209 10:44:48.687792 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-85tbw" Dec 09 10:44:48 crc kubenswrapper[5002]: I1209 10:44:48.875858 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-85tbw"] Dec 09 10:44:50 crc kubenswrapper[5002]: I1209 10:44:50.659165 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-85tbw" podUID="025ca558-ee3a-47bb-ae57-f9658d29a256" containerName="registry-server" containerID="cri-o://34a5c82ebf69e570e28e1a71a7f61c9cbc6a40d5e2856983db6f90fb86c1e9ec" gracePeriod=2 Dec 09 10:44:51 crc kubenswrapper[5002]: I1209 10:44:51.581241 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-85tbw" Dec 09 10:44:51 crc kubenswrapper[5002]: I1209 10:44:51.667282 5002 generic.go:334] "Generic (PLEG): container finished" podID="025ca558-ee3a-47bb-ae57-f9658d29a256" containerID="34a5c82ebf69e570e28e1a71a7f61c9cbc6a40d5e2856983db6f90fb86c1e9ec" exitCode=0 Dec 09 10:44:51 crc kubenswrapper[5002]: I1209 10:44:51.667329 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-85tbw" Dec 09 10:44:51 crc kubenswrapper[5002]: I1209 10:44:51.667331 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-85tbw" event={"ID":"025ca558-ee3a-47bb-ae57-f9658d29a256","Type":"ContainerDied","Data":"34a5c82ebf69e570e28e1a71a7f61c9cbc6a40d5e2856983db6f90fb86c1e9ec"} Dec 09 10:44:51 crc kubenswrapper[5002]: I1209 10:44:51.667450 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-85tbw" event={"ID":"025ca558-ee3a-47bb-ae57-f9658d29a256","Type":"ContainerDied","Data":"0d588bf8e881a0667e8463a96e9167bb8b14d57625adc441e8fa4ace895cede5"} Dec 09 10:44:51 crc kubenswrapper[5002]: I1209 10:44:51.667504 5002 scope.go:117] "RemoveContainer" containerID="34a5c82ebf69e570e28e1a71a7f61c9cbc6a40d5e2856983db6f90fb86c1e9ec" Dec 09 10:44:51 crc kubenswrapper[5002]: I1209 10:44:51.678808 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfs92\" (UniqueName: \"kubernetes.io/projected/025ca558-ee3a-47bb-ae57-f9658d29a256-kube-api-access-wfs92\") pod \"025ca558-ee3a-47bb-ae57-f9658d29a256\" (UID: \"025ca558-ee3a-47bb-ae57-f9658d29a256\") " Dec 09 10:44:51 crc kubenswrapper[5002]: I1209 10:44:51.679051 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/025ca558-ee3a-47bb-ae57-f9658d29a256-utilities\") pod \"025ca558-ee3a-47bb-ae57-f9658d29a256\" (UID: \"025ca558-ee3a-47bb-ae57-f9658d29a256\") " Dec 09 10:44:51 crc kubenswrapper[5002]: I1209 10:44:51.679086 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/025ca558-ee3a-47bb-ae57-f9658d29a256-catalog-content\") pod \"025ca558-ee3a-47bb-ae57-f9658d29a256\" (UID: \"025ca558-ee3a-47bb-ae57-f9658d29a256\") " Dec 09 10:44:51 crc kubenswrapper[5002]: I1209 10:44:51.680113 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/025ca558-ee3a-47bb-ae57-f9658d29a256-utilities" (OuterVolumeSpecName: "utilities") pod "025ca558-ee3a-47bb-ae57-f9658d29a256" (UID: "025ca558-ee3a-47bb-ae57-f9658d29a256"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:44:51 crc kubenswrapper[5002]: I1209 10:44:51.685295 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/025ca558-ee3a-47bb-ae57-f9658d29a256-kube-api-access-wfs92" (OuterVolumeSpecName: "kube-api-access-wfs92") pod "025ca558-ee3a-47bb-ae57-f9658d29a256" (UID: "025ca558-ee3a-47bb-ae57-f9658d29a256"). InnerVolumeSpecName "kube-api-access-wfs92". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:44:51 crc kubenswrapper[5002]: I1209 10:44:51.691546 5002 scope.go:117] "RemoveContainer" containerID="406be8215917dd194d7da6621c29ac3bbd04bbad657ecb1c7f98cf66ebf95eac" Dec 09 10:44:51 crc kubenswrapper[5002]: I1209 10:44:51.709837 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/025ca558-ee3a-47bb-ae57-f9658d29a256-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "025ca558-ee3a-47bb-ae57-f9658d29a256" (UID: "025ca558-ee3a-47bb-ae57-f9658d29a256"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:44:51 crc kubenswrapper[5002]: I1209 10:44:51.733793 5002 scope.go:117] "RemoveContainer" containerID="6c289f9dae3efecadfbdda25f2188c3e99e8b2b7cd5c2848e0333ba25cbd213e" Dec 09 10:44:51 crc kubenswrapper[5002]: I1209 10:44:51.766482 5002 scope.go:117] "RemoveContainer" containerID="34a5c82ebf69e570e28e1a71a7f61c9cbc6a40d5e2856983db6f90fb86c1e9ec" Dec 09 10:44:51 crc kubenswrapper[5002]: E1209 10:44:51.767292 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34a5c82ebf69e570e28e1a71a7f61c9cbc6a40d5e2856983db6f90fb86c1e9ec\": container with ID starting with 34a5c82ebf69e570e28e1a71a7f61c9cbc6a40d5e2856983db6f90fb86c1e9ec not found: ID does not exist" containerID="34a5c82ebf69e570e28e1a71a7f61c9cbc6a40d5e2856983db6f90fb86c1e9ec" Dec 09 10:44:51 crc kubenswrapper[5002]: I1209 10:44:51.767400 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34a5c82ebf69e570e28e1a71a7f61c9cbc6a40d5e2856983db6f90fb86c1e9ec"} err="failed to get container status \"34a5c82ebf69e570e28e1a71a7f61c9cbc6a40d5e2856983db6f90fb86c1e9ec\": rpc error: code = NotFound desc = could not find container \"34a5c82ebf69e570e28e1a71a7f61c9cbc6a40d5e2856983db6f90fb86c1e9ec\": container with ID starting with 34a5c82ebf69e570e28e1a71a7f61c9cbc6a40d5e2856983db6f90fb86c1e9ec not found: ID does not exist" Dec 09 10:44:51 crc kubenswrapper[5002]: I1209 10:44:51.767446 5002 scope.go:117] "RemoveContainer" containerID="406be8215917dd194d7da6621c29ac3bbd04bbad657ecb1c7f98cf66ebf95eac" Dec 09 10:44:51 crc kubenswrapper[5002]: E1209 10:44:51.768266 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"406be8215917dd194d7da6621c29ac3bbd04bbad657ecb1c7f98cf66ebf95eac\": container with ID starting with 406be8215917dd194d7da6621c29ac3bbd04bbad657ecb1c7f98cf66ebf95eac not found: ID does not exist" containerID="406be8215917dd194d7da6621c29ac3bbd04bbad657ecb1c7f98cf66ebf95eac" Dec 09 10:44:51 crc kubenswrapper[5002]: I1209 10:44:51.768310 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"406be8215917dd194d7da6621c29ac3bbd04bbad657ecb1c7f98cf66ebf95eac"} err="failed to get container status \"406be8215917dd194d7da6621c29ac3bbd04bbad657ecb1c7f98cf66ebf95eac\": rpc error: code = NotFound desc = could not find container \"406be8215917dd194d7da6621c29ac3bbd04bbad657ecb1c7f98cf66ebf95eac\": container with ID starting with 406be8215917dd194d7da6621c29ac3bbd04bbad657ecb1c7f98cf66ebf95eac not found: ID does not exist" Dec 09 10:44:51 crc kubenswrapper[5002]: I1209 10:44:51.768342 5002 scope.go:117] "RemoveContainer" containerID="6c289f9dae3efecadfbdda25f2188c3e99e8b2b7cd5c2848e0333ba25cbd213e" Dec 09 10:44:51 crc kubenswrapper[5002]: E1209 10:44:51.768779 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c289f9dae3efecadfbdda25f2188c3e99e8b2b7cd5c2848e0333ba25cbd213e\": container with ID starting with 6c289f9dae3efecadfbdda25f2188c3e99e8b2b7cd5c2848e0333ba25cbd213e not found: ID does not exist" containerID="6c289f9dae3efecadfbdda25f2188c3e99e8b2b7cd5c2848e0333ba25cbd213e" Dec 09 10:44:51 crc kubenswrapper[5002]: I1209 10:44:51.768844 5002 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6c289f9dae3efecadfbdda25f2188c3e99e8b2b7cd5c2848e0333ba25cbd213e"} err="failed to get container status \"6c289f9dae3efecadfbdda25f2188c3e99e8b2b7cd5c2848e0333ba25cbd213e\": rpc error: code = NotFound desc = could not find container \"6c289f9dae3efecadfbdda25f2188c3e99e8b2b7cd5c2848e0333ba25cbd213e\": container with ID starting with 6c289f9dae3efecadfbdda25f2188c3e99e8b2b7cd5c2848e0333ba25cbd213e not found: ID does not exist" Dec 09 10:44:51 crc kubenswrapper[5002]: I1209 10:44:51.780942 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfs92\" (UniqueName: \"kubernetes.io/projected/025ca558-ee3a-47bb-ae57-f9658d29a256-kube-api-access-wfs92\") on node \"crc\" DevicePath \"\"" Dec 09 10:44:51 crc kubenswrapper[5002]: I1209 10:44:51.780979 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/025ca558-ee3a-47bb-ae57-f9658d29a256-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 10:44:51 crc kubenswrapper[5002]: I1209 10:44:51.780993 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/025ca558-ee3a-47bb-ae57-f9658d29a256-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 10:44:52 crc kubenswrapper[5002]: I1209 10:44:52.012540 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-85tbw"] Dec 09 10:44:52 crc kubenswrapper[5002]: I1209 10:44:52.021423 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-85tbw"] Dec 09 10:44:52 crc kubenswrapper[5002]: I1209 10:44:52.072754 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="025ca558-ee3a-47bb-ae57-f9658d29a256" path="/var/lib/kubelet/pods/025ca558-ee3a-47bb-ae57-f9658d29a256/volumes" Dec 09 10:44:55 crc kubenswrapper[5002]: I1209 10:44:55.061149 5002 scope.go:117] "RemoveContainer" containerID="3e68990294ce88d90ac0cfc0896233c3dbff845f2071adedd38304920291ca53" Dec 09 10:44:55 crc kubenswrapper[5002]: E1209 10:44:55.062010 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 10:45:00 crc kubenswrapper[5002]: I1209 10:45:00.149187 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421285-gnmlt"] Dec 09 10:45:00 crc kubenswrapper[5002]: E1209 10:45:00.150875 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="025ca558-ee3a-47bb-ae57-f9658d29a256" containerName="extract-utilities" Dec 09 10:45:00 crc kubenswrapper[5002]: I1209 10:45:00.150955 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="025ca558-ee3a-47bb-ae57-f9658d29a256" containerName="extract-utilities" Dec 09 10:45:00 crc kubenswrapper[5002]: E1209 10:45:00.151061 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="025ca558-ee3a-47bb-ae57-f9658d29a256" containerName="registry-server" Dec 09 10:45:00 crc kubenswrapper[5002]: I1209 10:45:00.151128 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="025ca558-ee3a-47bb-ae57-f9658d29a256" containerName="registry-server" Dec 09 10:45:00 crc 
kubenswrapper[5002]: E1209 10:45:00.151510 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="025ca558-ee3a-47bb-ae57-f9658d29a256" containerName="extract-content" Dec 09 10:45:00 crc kubenswrapper[5002]: I1209 10:45:00.151577 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="025ca558-ee3a-47bb-ae57-f9658d29a256" containerName="extract-content" Dec 09 10:45:00 crc kubenswrapper[5002]: I1209 10:45:00.151868 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="025ca558-ee3a-47bb-ae57-f9658d29a256" containerName="registry-server" Dec 09 10:45:00 crc kubenswrapper[5002]: I1209 10:45:00.152491 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421285-gnmlt" Dec 09 10:45:00 crc kubenswrapper[5002]: I1209 10:45:00.156538 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 09 10:45:00 crc kubenswrapper[5002]: I1209 10:45:00.156575 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 09 10:45:00 crc kubenswrapper[5002]: I1209 10:45:00.161380 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421285-gnmlt"] Dec 09 10:45:00 crc kubenswrapper[5002]: I1209 10:45:00.202800 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bf8ce2eb-0991-40ef-8aa1-2bb668739592-secret-volume\") pod \"collect-profiles-29421285-gnmlt\" (UID: \"bf8ce2eb-0991-40ef-8aa1-2bb668739592\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421285-gnmlt" Dec 09 10:45:00 crc kubenswrapper[5002]: I1209 10:45:00.203027 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94wqs\" (UniqueName: \"kubernetes.io/projected/bf8ce2eb-0991-40ef-8aa1-2bb668739592-kube-api-access-94wqs\") pod \"collect-profiles-29421285-gnmlt\" (UID: \"bf8ce2eb-0991-40ef-8aa1-2bb668739592\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421285-gnmlt" Dec 09 10:45:00 crc kubenswrapper[5002]: I1209 10:45:00.203250 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bf8ce2eb-0991-40ef-8aa1-2bb668739592-config-volume\") pod \"collect-profiles-29421285-gnmlt\" (UID: \"bf8ce2eb-0991-40ef-8aa1-2bb668739592\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421285-gnmlt" Dec 09 10:45:00 crc kubenswrapper[5002]: I1209 10:45:00.304040 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bf8ce2eb-0991-40ef-8aa1-2bb668739592-config-volume\") pod \"collect-profiles-29421285-gnmlt\" (UID: \"bf8ce2eb-0991-40ef-8aa1-2bb668739592\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421285-gnmlt" Dec 09 10:45:00 crc kubenswrapper[5002]: I1209 10:45:00.304098 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bf8ce2eb-0991-40ef-8aa1-2bb668739592-secret-volume\") pod \"collect-profiles-29421285-gnmlt\" (UID: \"bf8ce2eb-0991-40ef-8aa1-2bb668739592\") " 
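collect-profiles-29421285-gnmlt is a CronJob-created Job pod: the numeric suffix in the Job name is the scheduled time expressed in minutes since the Unix epoch (the trailing -gnmlt is the pod's random suffix). Decoding it recovers exactly the 10:45:00 run seen here:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	const suffix = 29421285 // minutes since epoch, from "collect-profiles-29421285"
	t := time.Unix(suffix*60, 0).UTC()
	fmt.Println(t) // 2025-12-09 10:45:00 +0000 UTC
}
```

The previous run's Job, collect-profiles-29421240-5jmzz (deleted below), decodes the same way to a scheduled time 45 minutes earlier, 10:00:00 UTC, consistent with a */15- or similar periodic schedule plus history pruning.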
pod="openshift-operator-lifecycle-manager/collect-profiles-29421285-gnmlt" Dec 09 10:45:00 crc kubenswrapper[5002]: I1209 10:45:00.304146 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94wqs\" (UniqueName: \"kubernetes.io/projected/bf8ce2eb-0991-40ef-8aa1-2bb668739592-kube-api-access-94wqs\") pod \"collect-profiles-29421285-gnmlt\" (UID: \"bf8ce2eb-0991-40ef-8aa1-2bb668739592\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421285-gnmlt" Dec 09 10:45:00 crc kubenswrapper[5002]: I1209 10:45:00.305657 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bf8ce2eb-0991-40ef-8aa1-2bb668739592-config-volume\") pod \"collect-profiles-29421285-gnmlt\" (UID: \"bf8ce2eb-0991-40ef-8aa1-2bb668739592\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421285-gnmlt" Dec 09 10:45:00 crc kubenswrapper[5002]: I1209 10:45:00.313393 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bf8ce2eb-0991-40ef-8aa1-2bb668739592-secret-volume\") pod \"collect-profiles-29421285-gnmlt\" (UID: \"bf8ce2eb-0991-40ef-8aa1-2bb668739592\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421285-gnmlt" Dec 09 10:45:00 crc kubenswrapper[5002]: I1209 10:45:00.326566 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94wqs\" (UniqueName: \"kubernetes.io/projected/bf8ce2eb-0991-40ef-8aa1-2bb668739592-kube-api-access-94wqs\") pod \"collect-profiles-29421285-gnmlt\" (UID: \"bf8ce2eb-0991-40ef-8aa1-2bb668739592\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421285-gnmlt" Dec 09 10:45:00 crc kubenswrapper[5002]: I1209 10:45:00.474879 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421285-gnmlt" Dec 09 10:45:00 crc kubenswrapper[5002]: I1209 10:45:00.934696 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421285-gnmlt"] Dec 09 10:45:01 crc kubenswrapper[5002]: I1209 10:45:01.739645 5002 generic.go:334] "Generic (PLEG): container finished" podID="bf8ce2eb-0991-40ef-8aa1-2bb668739592" containerID="07303139730ec5d256bff4a041d4b5957d21f0bad9f3bfd59e40a44f3190736a" exitCode=0 Dec 09 10:45:01 crc kubenswrapper[5002]: I1209 10:45:01.739716 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421285-gnmlt" event={"ID":"bf8ce2eb-0991-40ef-8aa1-2bb668739592","Type":"ContainerDied","Data":"07303139730ec5d256bff4a041d4b5957d21f0bad9f3bfd59e40a44f3190736a"} Dec 09 10:45:01 crc kubenswrapper[5002]: I1209 10:45:01.739910 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421285-gnmlt" event={"ID":"bf8ce2eb-0991-40ef-8aa1-2bb668739592","Type":"ContainerStarted","Data":"7d672e597ba379c930635c70f61826e8c3e6efc8b465c2352039919b026b0358"} Dec 09 10:45:02 crc kubenswrapper[5002]: I1209 10:45:02.990629 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421285-gnmlt" Dec 09 10:45:03 crc kubenswrapper[5002]: I1209 10:45:03.054390 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bf8ce2eb-0991-40ef-8aa1-2bb668739592-config-volume\") pod \"bf8ce2eb-0991-40ef-8aa1-2bb668739592\" (UID: \"bf8ce2eb-0991-40ef-8aa1-2bb668739592\") " Dec 09 10:45:03 crc kubenswrapper[5002]: I1209 10:45:03.054479 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94wqs\" (UniqueName: \"kubernetes.io/projected/bf8ce2eb-0991-40ef-8aa1-2bb668739592-kube-api-access-94wqs\") pod \"bf8ce2eb-0991-40ef-8aa1-2bb668739592\" (UID: \"bf8ce2eb-0991-40ef-8aa1-2bb668739592\") " Dec 09 10:45:03 crc kubenswrapper[5002]: I1209 10:45:03.054545 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bf8ce2eb-0991-40ef-8aa1-2bb668739592-secret-volume\") pod \"bf8ce2eb-0991-40ef-8aa1-2bb668739592\" (UID: \"bf8ce2eb-0991-40ef-8aa1-2bb668739592\") " Dec 09 10:45:03 crc kubenswrapper[5002]: I1209 10:45:03.055061 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf8ce2eb-0991-40ef-8aa1-2bb668739592-config-volume" (OuterVolumeSpecName: "config-volume") pod "bf8ce2eb-0991-40ef-8aa1-2bb668739592" (UID: "bf8ce2eb-0991-40ef-8aa1-2bb668739592"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 10:45:03 crc kubenswrapper[5002]: I1209 10:45:03.059720 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf8ce2eb-0991-40ef-8aa1-2bb668739592-kube-api-access-94wqs" (OuterVolumeSpecName: "kube-api-access-94wqs") pod "bf8ce2eb-0991-40ef-8aa1-2bb668739592" (UID: "bf8ce2eb-0991-40ef-8aa1-2bb668739592"). InnerVolumeSpecName "kube-api-access-94wqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:45:03 crc kubenswrapper[5002]: I1209 10:45:03.061250 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf8ce2eb-0991-40ef-8aa1-2bb668739592-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "bf8ce2eb-0991-40ef-8aa1-2bb668739592" (UID: "bf8ce2eb-0991-40ef-8aa1-2bb668739592"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 10:45:03 crc kubenswrapper[5002]: I1209 10:45:03.155884 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94wqs\" (UniqueName: \"kubernetes.io/projected/bf8ce2eb-0991-40ef-8aa1-2bb668739592-kube-api-access-94wqs\") on node \"crc\" DevicePath \"\"" Dec 09 10:45:03 crc kubenswrapper[5002]: I1209 10:45:03.155920 5002 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bf8ce2eb-0991-40ef-8aa1-2bb668739592-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 09 10:45:03 crc kubenswrapper[5002]: I1209 10:45:03.155933 5002 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bf8ce2eb-0991-40ef-8aa1-2bb668739592-config-volume\") on node \"crc\" DevicePath \"\"" Dec 09 10:45:03 crc kubenswrapper[5002]: I1209 10:45:03.753553 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421285-gnmlt" event={"ID":"bf8ce2eb-0991-40ef-8aa1-2bb668739592","Type":"ContainerDied","Data":"7d672e597ba379c930635c70f61826e8c3e6efc8b465c2352039919b026b0358"} Dec 09 10:45:03 crc kubenswrapper[5002]: I1209 10:45:03.754117 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d672e597ba379c930635c70f61826e8c3e6efc8b465c2352039919b026b0358" Dec 09 10:45:03 crc kubenswrapper[5002]: I1209 10:45:03.753603 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421285-gnmlt" Dec 09 10:45:04 crc kubenswrapper[5002]: I1209 10:45:04.059077 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421240-5jmzz"] Dec 09 10:45:04 crc kubenswrapper[5002]: I1209 10:45:04.068398 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421240-5jmzz"] Dec 09 10:45:06 crc kubenswrapper[5002]: I1209 10:45:06.070114 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e408545-84a5-4b62-ab02-e213a58d1c53" path="/var/lib/kubelet/pods/6e408545-84a5-4b62-ab02-e213a58d1c53/volumes" Dec 09 10:45:10 crc kubenswrapper[5002]: I1209 10:45:10.060482 5002 scope.go:117] "RemoveContainer" containerID="3e68990294ce88d90ac0cfc0896233c3dbff845f2071adedd38304920291ca53" Dec 09 10:45:10 crc kubenswrapper[5002]: E1209 10:45:10.061192 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 10:45:25 crc kubenswrapper[5002]: I1209 10:45:25.060735 5002 scope.go:117] "RemoveContainer" containerID="3e68990294ce88d90ac0cfc0896233c3dbff845f2071adedd38304920291ca53" Dec 09 10:45:25 crc kubenswrapper[5002]: E1209 10:45:25.061553 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 10:45:40 crc kubenswrapper[5002]: I1209 10:45:40.062018 5002 scope.go:117] "RemoveContainer" containerID="3e68990294ce88d90ac0cfc0896233c3dbff845f2071adedd38304920291ca53" Dec 09 10:45:40 crc kubenswrapper[5002]: E1209 10:45:40.063615 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 10:45:44 crc kubenswrapper[5002]: I1209 10:45:44.594300 5002 scope.go:117] "RemoveContainer" containerID="07d5e13d633c647f14baf9f5c3a61ce86e7e8f34a3d20dcdfa800af8231d6639" Dec 09 10:45:51 crc kubenswrapper[5002]: I1209 10:45:51.060871 5002 scope.go:117] "RemoveContainer" containerID="3e68990294ce88d90ac0cfc0896233c3dbff845f2071adedd38304920291ca53" Dec 09 10:45:51 crc kubenswrapper[5002]: E1209 10:45:51.062480 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 10:46:04 crc kubenswrapper[5002]: I1209 10:46:04.060343 5002 scope.go:117] "RemoveContainer" containerID="3e68990294ce88d90ac0cfc0896233c3dbff845f2071adedd38304920291ca53" Dec 09 10:46:04 crc kubenswrapper[5002]: E1209 10:46:04.061133 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 10:46:16 crc kubenswrapper[5002]: I1209 10:46:16.060518 5002 scope.go:117] "RemoveContainer" containerID="3e68990294ce88d90ac0cfc0896233c3dbff845f2071adedd38304920291ca53" Dec 09 10:46:16 crc kubenswrapper[5002]: I1209 10:46:16.413435 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerStarted","Data":"093aa43a31cfe27a0c2f247ef2dbf753766dc1b0af20d4afbacdd52b8dadfe16"} Dec 09 10:48:26 crc kubenswrapper[5002]: I1209 10:48:26.887158 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-54h9m"] Dec 09 10:48:26 crc kubenswrapper[5002]: E1209 10:48:26.888022 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf8ce2eb-0991-40ef-8aa1-2bb668739592" containerName="collect-profiles" Dec 09 10:48:26 crc kubenswrapper[5002]: I1209 10:48:26.888039 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf8ce2eb-0991-40ef-8aa1-2bb668739592" containerName="collect-profiles" Dec 09 10:48:26 crc kubenswrapper[5002]: I1209 10:48:26.888227 5002 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="bf8ce2eb-0991-40ef-8aa1-2bb668739592" containerName="collect-profiles" Dec 09 10:48:26 crc kubenswrapper[5002]: I1209 10:48:26.889471 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-54h9m" Dec 09 10:48:26 crc kubenswrapper[5002]: I1209 10:48:26.904632 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-54h9m"] Dec 09 10:48:26 crc kubenswrapper[5002]: I1209 10:48:26.996627 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc316257-fecc-4af4-9c2f-f352ed45a81a-catalog-content\") pod \"certified-operators-54h9m\" (UID: \"bc316257-fecc-4af4-9c2f-f352ed45a81a\") " pod="openshift-marketplace/certified-operators-54h9m" Dec 09 10:48:26 crc kubenswrapper[5002]: I1209 10:48:26.996724 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmt5q\" (UniqueName: \"kubernetes.io/projected/bc316257-fecc-4af4-9c2f-f352ed45a81a-kube-api-access-xmt5q\") pod \"certified-operators-54h9m\" (UID: \"bc316257-fecc-4af4-9c2f-f352ed45a81a\") " pod="openshift-marketplace/certified-operators-54h9m" Dec 09 10:48:26 crc kubenswrapper[5002]: I1209 10:48:26.996762 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc316257-fecc-4af4-9c2f-f352ed45a81a-utilities\") pod \"certified-operators-54h9m\" (UID: \"bc316257-fecc-4af4-9c2f-f352ed45a81a\") " pod="openshift-marketplace/certified-operators-54h9m" Dec 09 10:48:27 crc kubenswrapper[5002]: I1209 10:48:27.085860 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-x9btx"] Dec 09 10:48:27 crc kubenswrapper[5002]: I1209 10:48:27.087380 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-x9btx" Dec 09 10:48:27 crc kubenswrapper[5002]: I1209 10:48:27.098201 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc316257-fecc-4af4-9c2f-f352ed45a81a-catalog-content\") pod \"certified-operators-54h9m\" (UID: \"bc316257-fecc-4af4-9c2f-f352ed45a81a\") " pod="openshift-marketplace/certified-operators-54h9m" Dec 09 10:48:27 crc kubenswrapper[5002]: I1209 10:48:27.098246 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x9btx"] Dec 09 10:48:27 crc kubenswrapper[5002]: I1209 10:48:27.098268 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmt5q\" (UniqueName: \"kubernetes.io/projected/bc316257-fecc-4af4-9c2f-f352ed45a81a-kube-api-access-xmt5q\") pod \"certified-operators-54h9m\" (UID: \"bc316257-fecc-4af4-9c2f-f352ed45a81a\") " pod="openshift-marketplace/certified-operators-54h9m" Dec 09 10:48:27 crc kubenswrapper[5002]: I1209 10:48:27.098306 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc316257-fecc-4af4-9c2f-f352ed45a81a-utilities\") pod \"certified-operators-54h9m\" (UID: \"bc316257-fecc-4af4-9c2f-f352ed45a81a\") " pod="openshift-marketplace/certified-operators-54h9m" Dec 09 10:48:27 crc kubenswrapper[5002]: I1209 10:48:27.098778 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc316257-fecc-4af4-9c2f-f352ed45a81a-utilities\") pod \"certified-operators-54h9m\" (UID: \"bc316257-fecc-4af4-9c2f-f352ed45a81a\") " pod="openshift-marketplace/certified-operators-54h9m" Dec 09 10:48:27 crc kubenswrapper[5002]: I1209 10:48:27.099037 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc316257-fecc-4af4-9c2f-f352ed45a81a-catalog-content\") pod \"certified-operators-54h9m\" (UID: \"bc316257-fecc-4af4-9c2f-f352ed45a81a\") " pod="openshift-marketplace/certified-operators-54h9m" Dec 09 10:48:27 crc kubenswrapper[5002]: I1209 10:48:27.124286 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmt5q\" (UniqueName: \"kubernetes.io/projected/bc316257-fecc-4af4-9c2f-f352ed45a81a-kube-api-access-xmt5q\") pod \"certified-operators-54h9m\" (UID: \"bc316257-fecc-4af4-9c2f-f352ed45a81a\") " pod="openshift-marketplace/certified-operators-54h9m" Dec 09 10:48:27 crc kubenswrapper[5002]: I1209 10:48:27.200150 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57b8ac63-b387-4ee4-bae7-16c1d70e8821-utilities\") pod \"community-operators-x9btx\" (UID: \"57b8ac63-b387-4ee4-bae7-16c1d70e8821\") " pod="openshift-marketplace/community-operators-x9btx" Dec 09 10:48:27 crc kubenswrapper[5002]: I1209 10:48:27.200228 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d57x4\" (UniqueName: \"kubernetes.io/projected/57b8ac63-b387-4ee4-bae7-16c1d70e8821-kube-api-access-d57x4\") pod \"community-operators-x9btx\" (UID: \"57b8ac63-b387-4ee4-bae7-16c1d70e8821\") " pod="openshift-marketplace/community-operators-x9btx" Dec 09 10:48:27 crc kubenswrapper[5002]: I1209 10:48:27.200782 5002 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57b8ac63-b387-4ee4-bae7-16c1d70e8821-catalog-content\") pod \"community-operators-x9btx\" (UID: \"57b8ac63-b387-4ee4-bae7-16c1d70e8821\") " pod="openshift-marketplace/community-operators-x9btx" Dec 09 10:48:27 crc kubenswrapper[5002]: I1209 10:48:27.208252 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-54h9m" Dec 09 10:48:27 crc kubenswrapper[5002]: I1209 10:48:27.302255 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57b8ac63-b387-4ee4-bae7-16c1d70e8821-utilities\") pod \"community-operators-x9btx\" (UID: \"57b8ac63-b387-4ee4-bae7-16c1d70e8821\") " pod="openshift-marketplace/community-operators-x9btx" Dec 09 10:48:27 crc kubenswrapper[5002]: I1209 10:48:27.302332 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d57x4\" (UniqueName: \"kubernetes.io/projected/57b8ac63-b387-4ee4-bae7-16c1d70e8821-kube-api-access-d57x4\") pod \"community-operators-x9btx\" (UID: \"57b8ac63-b387-4ee4-bae7-16c1d70e8821\") " pod="openshift-marketplace/community-operators-x9btx" Dec 09 10:48:27 crc kubenswrapper[5002]: I1209 10:48:27.302357 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57b8ac63-b387-4ee4-bae7-16c1d70e8821-catalog-content\") pod \"community-operators-x9btx\" (UID: \"57b8ac63-b387-4ee4-bae7-16c1d70e8821\") " pod="openshift-marketplace/community-operators-x9btx" Dec 09 10:48:27 crc kubenswrapper[5002]: I1209 10:48:27.302978 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57b8ac63-b387-4ee4-bae7-16c1d70e8821-catalog-content\") pod \"community-operators-x9btx\" (UID: \"57b8ac63-b387-4ee4-bae7-16c1d70e8821\") " pod="openshift-marketplace/community-operators-x9btx" Dec 09 10:48:27 crc kubenswrapper[5002]: I1209 10:48:27.303249 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57b8ac63-b387-4ee4-bae7-16c1d70e8821-utilities\") pod \"community-operators-x9btx\" (UID: \"57b8ac63-b387-4ee4-bae7-16c1d70e8821\") " pod="openshift-marketplace/community-operators-x9btx" Dec 09 10:48:27 crc kubenswrapper[5002]: I1209 10:48:27.322377 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d57x4\" (UniqueName: \"kubernetes.io/projected/57b8ac63-b387-4ee4-bae7-16c1d70e8821-kube-api-access-d57x4\") pod \"community-operators-x9btx\" (UID: \"57b8ac63-b387-4ee4-bae7-16c1d70e8821\") " pod="openshift-marketplace/community-operators-x9btx" Dec 09 10:48:27 crc kubenswrapper[5002]: I1209 10:48:27.483800 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-x9btx" Dec 09 10:48:27 crc kubenswrapper[5002]: I1209 10:48:27.747714 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-54h9m"] Dec 09 10:48:27 crc kubenswrapper[5002]: I1209 10:48:27.812274 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x9btx"] Dec 09 10:48:28 crc kubenswrapper[5002]: I1209 10:48:28.687096 5002 generic.go:334] "Generic (PLEG): container finished" podID="57b8ac63-b387-4ee4-bae7-16c1d70e8821" containerID="4c237f141fb07a1f7942e4ecf8ba790b46a21f1406e2a5ad1f5766cba8abf005" exitCode=0 Dec 09 10:48:28 crc kubenswrapper[5002]: I1209 10:48:28.687151 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x9btx" event={"ID":"57b8ac63-b387-4ee4-bae7-16c1d70e8821","Type":"ContainerDied","Data":"4c237f141fb07a1f7942e4ecf8ba790b46a21f1406e2a5ad1f5766cba8abf005"} Dec 09 10:48:28 crc kubenswrapper[5002]: I1209 10:48:28.692996 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x9btx" event={"ID":"57b8ac63-b387-4ee4-bae7-16c1d70e8821","Type":"ContainerStarted","Data":"b623a79b195a90efc352f9055ed098087739aa4cd305f2dd9139f375cd5b5b9b"} Dec 09 10:48:28 crc kubenswrapper[5002]: I1209 10:48:28.694961 5002 generic.go:334] "Generic (PLEG): container finished" podID="bc316257-fecc-4af4-9c2f-f352ed45a81a" containerID="189dcafa2cf2ecde1847aaba166a3caf494f8b8ddfc8692fc7ae35914a2b2700" exitCode=0 Dec 09 10:48:28 crc kubenswrapper[5002]: I1209 10:48:28.694981 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-54h9m" event={"ID":"bc316257-fecc-4af4-9c2f-f352ed45a81a","Type":"ContainerDied","Data":"189dcafa2cf2ecde1847aaba166a3caf494f8b8ddfc8692fc7ae35914a2b2700"} Dec 09 10:48:28 crc kubenswrapper[5002]: I1209 10:48:28.695133 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-54h9m" event={"ID":"bc316257-fecc-4af4-9c2f-f352ed45a81a","Type":"ContainerStarted","Data":"5633faea311d48ed54bfcdabf06daaa6d1797d80cdf2f9f1d52f4cb587ac730a"} Dec 09 10:48:30 crc kubenswrapper[5002]: I1209 10:48:30.081772 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6bhqr"] Dec 09 10:48:30 crc kubenswrapper[5002]: I1209 10:48:30.083120 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6bhqr" Dec 09 10:48:30 crc kubenswrapper[5002]: I1209 10:48:30.100948 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6bhqr"] Dec 09 10:48:30 crc kubenswrapper[5002]: I1209 10:48:30.258639 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t5k5\" (UniqueName: \"kubernetes.io/projected/7c57ba50-6bca-4b38-84bb-173daf430393-kube-api-access-7t5k5\") pod \"redhat-operators-6bhqr\" (UID: \"7c57ba50-6bca-4b38-84bb-173daf430393\") " pod="openshift-marketplace/redhat-operators-6bhqr" Dec 09 10:48:30 crc kubenswrapper[5002]: I1209 10:48:30.258709 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c57ba50-6bca-4b38-84bb-173daf430393-utilities\") pod \"redhat-operators-6bhqr\" (UID: \"7c57ba50-6bca-4b38-84bb-173daf430393\") " pod="openshift-marketplace/redhat-operators-6bhqr" Dec 09 10:48:30 crc kubenswrapper[5002]: I1209 10:48:30.258795 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c57ba50-6bca-4b38-84bb-173daf430393-catalog-content\") pod \"redhat-operators-6bhqr\" (UID: \"7c57ba50-6bca-4b38-84bb-173daf430393\") " pod="openshift-marketplace/redhat-operators-6bhqr" Dec 09 10:48:30 crc kubenswrapper[5002]: I1209 10:48:30.360358 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t5k5\" (UniqueName: \"kubernetes.io/projected/7c57ba50-6bca-4b38-84bb-173daf430393-kube-api-access-7t5k5\") pod \"redhat-operators-6bhqr\" (UID: \"7c57ba50-6bca-4b38-84bb-173daf430393\") " pod="openshift-marketplace/redhat-operators-6bhqr" Dec 09 10:48:30 crc kubenswrapper[5002]: I1209 10:48:30.360429 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c57ba50-6bca-4b38-84bb-173daf430393-utilities\") pod \"redhat-operators-6bhqr\" (UID: \"7c57ba50-6bca-4b38-84bb-173daf430393\") " pod="openshift-marketplace/redhat-operators-6bhqr" Dec 09 10:48:30 crc kubenswrapper[5002]: I1209 10:48:30.360482 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c57ba50-6bca-4b38-84bb-173daf430393-catalog-content\") pod \"redhat-operators-6bhqr\" (UID: \"7c57ba50-6bca-4b38-84bb-173daf430393\") " pod="openshift-marketplace/redhat-operators-6bhqr" Dec 09 10:48:30 crc kubenswrapper[5002]: I1209 10:48:30.361074 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c57ba50-6bca-4b38-84bb-173daf430393-catalog-content\") pod \"redhat-operators-6bhqr\" (UID: \"7c57ba50-6bca-4b38-84bb-173daf430393\") " pod="openshift-marketplace/redhat-operators-6bhqr" Dec 09 10:48:30 crc kubenswrapper[5002]: I1209 10:48:30.361119 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c57ba50-6bca-4b38-84bb-173daf430393-utilities\") pod \"redhat-operators-6bhqr\" (UID: \"7c57ba50-6bca-4b38-84bb-173daf430393\") " pod="openshift-marketplace/redhat-operators-6bhqr" Dec 09 10:48:30 crc kubenswrapper[5002]: I1209 10:48:30.385771 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-7t5k5\" (UniqueName: \"kubernetes.io/projected/7c57ba50-6bca-4b38-84bb-173daf430393-kube-api-access-7t5k5\") pod \"redhat-operators-6bhqr\" (UID: \"7c57ba50-6bca-4b38-84bb-173daf430393\") " pod="openshift-marketplace/redhat-operators-6bhqr" Dec 09 10:48:30 crc kubenswrapper[5002]: I1209 10:48:30.405421 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6bhqr" Dec 09 10:48:31 crc kubenswrapper[5002]: I1209 10:48:31.012241 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6bhqr"] Dec 09 10:48:31 crc kubenswrapper[5002]: W1209 10:48:31.058495 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c57ba50_6bca_4b38_84bb_173daf430393.slice/crio-3aa7559eff4ec9061fa129da572f875e635910ef6f354fd5f90fe6b9a8b4478f WatchSource:0}: Error finding container 3aa7559eff4ec9061fa129da572f875e635910ef6f354fd5f90fe6b9a8b4478f: Status 404 returned error can't find the container with id 3aa7559eff4ec9061fa129da572f875e635910ef6f354fd5f90fe6b9a8b4478f Dec 09 10:48:31 crc kubenswrapper[5002]: I1209 10:48:31.740701 5002 generic.go:334] "Generic (PLEG): container finished" podID="57b8ac63-b387-4ee4-bae7-16c1d70e8821" containerID="4d3450d9195823da4f384ae3194cc1e71a44ee15cf328b47efc39eac07e76a0c" exitCode=0 Dec 09 10:48:31 crc kubenswrapper[5002]: I1209 10:48:31.740798 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x9btx" event={"ID":"57b8ac63-b387-4ee4-bae7-16c1d70e8821","Type":"ContainerDied","Data":"4d3450d9195823da4f384ae3194cc1e71a44ee15cf328b47efc39eac07e76a0c"} Dec 09 10:48:31 crc kubenswrapper[5002]: I1209 10:48:31.745719 5002 generic.go:334] "Generic (PLEG): container finished" podID="bc316257-fecc-4af4-9c2f-f352ed45a81a" containerID="be0f96a64b5b0a6b328b1f85bf5175c1c7e684abe38b2f0b91cfc18a8e203560" exitCode=0 Dec 09 10:48:31 crc kubenswrapper[5002]: I1209 10:48:31.745758 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-54h9m" event={"ID":"bc316257-fecc-4af4-9c2f-f352ed45a81a","Type":"ContainerDied","Data":"be0f96a64b5b0a6b328b1f85bf5175c1c7e684abe38b2f0b91cfc18a8e203560"} Dec 09 10:48:31 crc kubenswrapper[5002]: I1209 10:48:31.750513 5002 generic.go:334] "Generic (PLEG): container finished" podID="7c57ba50-6bca-4b38-84bb-173daf430393" containerID="d4cddba500a2ee3f369c0682ab5f8ca68af0db5dade2473dbc66417138adb74f" exitCode=0 Dec 09 10:48:31 crc kubenswrapper[5002]: I1209 10:48:31.750575 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6bhqr" event={"ID":"7c57ba50-6bca-4b38-84bb-173daf430393","Type":"ContainerDied","Data":"d4cddba500a2ee3f369c0682ab5f8ca68af0db5dade2473dbc66417138adb74f"} Dec 09 10:48:31 crc kubenswrapper[5002]: I1209 10:48:31.750613 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6bhqr" event={"ID":"7c57ba50-6bca-4b38-84bb-173daf430393","Type":"ContainerStarted","Data":"3aa7559eff4ec9061fa129da572f875e635910ef6f354fd5f90fe6b9a8b4478f"} Dec 09 10:48:32 crc kubenswrapper[5002]: I1209 10:48:32.760407 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-54h9m" event={"ID":"bc316257-fecc-4af4-9c2f-f352ed45a81a","Type":"ContainerStarted","Data":"045124a606f08d6d369e16ac0661fb131764736a67fd18b795d9c454ea00d09f"} Dec 09 10:48:32 crc 
kubenswrapper[5002]: I1209 10:48:32.762457 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6bhqr" event={"ID":"7c57ba50-6bca-4b38-84bb-173daf430393","Type":"ContainerStarted","Data":"dd4b3e7be3a0e6b96cfcce556a8bb05481ad85b11ed2ddd6e402ebee942974c9"} Dec 09 10:48:32 crc kubenswrapper[5002]: I1209 10:48:32.764646 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x9btx" event={"ID":"57b8ac63-b387-4ee4-bae7-16c1d70e8821","Type":"ContainerStarted","Data":"7ed7c548c8ac81463e7b1e470dda5c580d571861d2313a05f3f4ce4a12c152d9"} Dec 09 10:48:32 crc kubenswrapper[5002]: I1209 10:48:32.785233 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-54h9m" podStartSLOduration=3.299011242 podStartE2EDuration="6.785213023s" podCreationTimestamp="2025-12-09 10:48:26 +0000 UTC" firstStartedPulling="2025-12-09 10:48:28.696343509 +0000 UTC m=+2841.088394590" lastFinishedPulling="2025-12-09 10:48:32.18254529 +0000 UTC m=+2844.574596371" observedRunningTime="2025-12-09 10:48:32.777925217 +0000 UTC m=+2845.169976338" watchObservedRunningTime="2025-12-09 10:48:32.785213023 +0000 UTC m=+2845.177264104" Dec 09 10:48:32 crc kubenswrapper[5002]: I1209 10:48:32.797468 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-x9btx" podStartSLOduration=2.337976186 podStartE2EDuration="5.79741768s" podCreationTimestamp="2025-12-09 10:48:27 +0000 UTC" firstStartedPulling="2025-12-09 10:48:28.688970651 +0000 UTC m=+2841.081021732" lastFinishedPulling="2025-12-09 10:48:32.148412145 +0000 UTC m=+2844.540463226" observedRunningTime="2025-12-09 10:48:32.794489081 +0000 UTC m=+2845.186540182" watchObservedRunningTime="2025-12-09 10:48:32.79741768 +0000 UTC m=+2845.189468781" Dec 09 10:48:33 crc kubenswrapper[5002]: I1209 10:48:33.773695 5002 generic.go:334] "Generic (PLEG): container finished" podID="7c57ba50-6bca-4b38-84bb-173daf430393" containerID="dd4b3e7be3a0e6b96cfcce556a8bb05481ad85b11ed2ddd6e402ebee942974c9" exitCode=0 Dec 09 10:48:33 crc kubenswrapper[5002]: I1209 10:48:33.773738 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6bhqr" event={"ID":"7c57ba50-6bca-4b38-84bb-173daf430393","Type":"ContainerDied","Data":"dd4b3e7be3a0e6b96cfcce556a8bb05481ad85b11ed2ddd6e402ebee942974c9"} Dec 09 10:48:34 crc kubenswrapper[5002]: I1209 10:48:34.782307 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6bhqr" event={"ID":"7c57ba50-6bca-4b38-84bb-173daf430393","Type":"ContainerStarted","Data":"5a982d36a79a2708ad72c07f2aa1f2f8d77810ae572c876483bdf334138583cf"} Dec 09 10:48:34 crc kubenswrapper[5002]: I1209 10:48:34.801118 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6bhqr" podStartSLOduration=2.3839902029999998 podStartE2EDuration="4.801101204s" podCreationTimestamp="2025-12-09 10:48:30 +0000 UTC" firstStartedPulling="2025-12-09 10:48:31.753234707 +0000 UTC m=+2844.145285818" lastFinishedPulling="2025-12-09 10:48:34.170345718 +0000 UTC m=+2846.562396819" observedRunningTime="2025-12-09 10:48:34.798392261 +0000 UTC m=+2847.190443342" watchObservedRunningTime="2025-12-09 10:48:34.801101204 +0000 UTC m=+2847.193152285" Dec 09 10:48:37 crc kubenswrapper[5002]: I1209 10:48:37.208394 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-54h9m" Dec 09 10:48:37 crc kubenswrapper[5002]: I1209 10:48:37.208472 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-54h9m" Dec 09 10:48:37 crc kubenswrapper[5002]: I1209 10:48:37.251468 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-54h9m" Dec 09 10:48:37 crc kubenswrapper[5002]: I1209 10:48:37.485509 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-x9btx" Dec 09 10:48:37 crc kubenswrapper[5002]: I1209 10:48:37.485551 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-x9btx" Dec 09 10:48:37 crc kubenswrapper[5002]: I1209 10:48:37.547127 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-x9btx" Dec 09 10:48:37 crc kubenswrapper[5002]: I1209 10:48:37.843711 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-x9btx" Dec 09 10:48:37 crc kubenswrapper[5002]: I1209 10:48:37.844691 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-54h9m" Dec 09 10:48:37 crc kubenswrapper[5002]: I1209 10:48:37.965212 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 10:48:37 crc kubenswrapper[5002]: I1209 10:48:37.965288 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 10:48:38 crc kubenswrapper[5002]: I1209 10:48:38.876171 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x9btx"] Dec 09 10:48:39 crc kubenswrapper[5002]: I1209 10:48:39.817332 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-x9btx" podUID="57b8ac63-b387-4ee4-bae7-16c1d70e8821" containerName="registry-server" containerID="cri-o://7ed7c548c8ac81463e7b1e470dda5c580d571861d2313a05f3f4ce4a12c152d9" gracePeriod=2 Dec 09 10:48:40 crc kubenswrapper[5002]: I1209 10:48:40.405730 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6bhqr" Dec 09 10:48:40 crc kubenswrapper[5002]: I1209 10:48:40.405992 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6bhqr" Dec 09 10:48:40 crc kubenswrapper[5002]: I1209 10:48:40.443402 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6bhqr" Dec 09 10:48:40 crc kubenswrapper[5002]: I1209 10:48:40.871237 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6bhqr" Dec 09 10:48:43 crc kubenswrapper[5002]: I1209 10:48:43.083969 5002 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/certified-operators-54h9m"] Dec 09 10:48:43 crc kubenswrapper[5002]: I1209 10:48:43.084662 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-54h9m" podUID="bc316257-fecc-4af4-9c2f-f352ed45a81a" containerName="registry-server" containerID="cri-o://045124a606f08d6d369e16ac0661fb131764736a67fd18b795d9c454ea00d09f" gracePeriod=2 Dec 09 10:48:43 crc kubenswrapper[5002]: I1209 10:48:43.677689 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6bhqr"] Dec 09 10:48:46 crc kubenswrapper[5002]: I1209 10:48:46.187563 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x9btx" Dec 09 10:48:46 crc kubenswrapper[5002]: I1209 10:48:46.214859 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57b8ac63-b387-4ee4-bae7-16c1d70e8821-catalog-content\") pod \"57b8ac63-b387-4ee4-bae7-16c1d70e8821\" (UID: \"57b8ac63-b387-4ee4-bae7-16c1d70e8821\") " Dec 09 10:48:46 crc kubenswrapper[5002]: I1209 10:48:46.214913 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57b8ac63-b387-4ee4-bae7-16c1d70e8821-utilities\") pod \"57b8ac63-b387-4ee4-bae7-16c1d70e8821\" (UID: \"57b8ac63-b387-4ee4-bae7-16c1d70e8821\") " Dec 09 10:48:46 crc kubenswrapper[5002]: I1209 10:48:46.214985 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d57x4\" (UniqueName: \"kubernetes.io/projected/57b8ac63-b387-4ee4-bae7-16c1d70e8821-kube-api-access-d57x4\") pod \"57b8ac63-b387-4ee4-bae7-16c1d70e8821\" (UID: \"57b8ac63-b387-4ee4-bae7-16c1d70e8821\") " Dec 09 10:48:46 crc kubenswrapper[5002]: I1209 10:48:46.217835 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57b8ac63-b387-4ee4-bae7-16c1d70e8821-utilities" (OuterVolumeSpecName: "utilities") pod "57b8ac63-b387-4ee4-bae7-16c1d70e8821" (UID: "57b8ac63-b387-4ee4-bae7-16c1d70e8821"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:48:46 crc kubenswrapper[5002]: I1209 10:48:46.222305 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57b8ac63-b387-4ee4-bae7-16c1d70e8821-kube-api-access-d57x4" (OuterVolumeSpecName: "kube-api-access-d57x4") pod "57b8ac63-b387-4ee4-bae7-16c1d70e8821" (UID: "57b8ac63-b387-4ee4-bae7-16c1d70e8821"). InnerVolumeSpecName "kube-api-access-d57x4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:48:46 crc kubenswrapper[5002]: I1209 10:48:46.286315 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57b8ac63-b387-4ee4-bae7-16c1d70e8821-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57b8ac63-b387-4ee4-bae7-16c1d70e8821" (UID: "57b8ac63-b387-4ee4-bae7-16c1d70e8821"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:48:46 crc kubenswrapper[5002]: I1209 10:48:46.316335 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57b8ac63-b387-4ee4-bae7-16c1d70e8821-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 10:48:46 crc kubenswrapper[5002]: I1209 10:48:46.316514 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57b8ac63-b387-4ee4-bae7-16c1d70e8821-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 10:48:46 crc kubenswrapper[5002]: I1209 10:48:46.316645 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d57x4\" (UniqueName: \"kubernetes.io/projected/57b8ac63-b387-4ee4-bae7-16c1d70e8821-kube-api-access-d57x4\") on node \"crc\" DevicePath \"\"" Dec 09 10:48:46 crc kubenswrapper[5002]: I1209 10:48:46.505703 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-54h9m_bc316257-fecc-4af4-9c2f-f352ed45a81a/registry-server/0.log" Dec 09 10:48:46 crc kubenswrapper[5002]: I1209 10:48:46.506421 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-54h9m" Dec 09 10:48:46 crc kubenswrapper[5002]: I1209 10:48:46.519576 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc316257-fecc-4af4-9c2f-f352ed45a81a-utilities\") pod \"bc316257-fecc-4af4-9c2f-f352ed45a81a\" (UID: \"bc316257-fecc-4af4-9c2f-f352ed45a81a\") " Dec 09 10:48:46 crc kubenswrapper[5002]: I1209 10:48:46.519672 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc316257-fecc-4af4-9c2f-f352ed45a81a-catalog-content\") pod \"bc316257-fecc-4af4-9c2f-f352ed45a81a\" (UID: \"bc316257-fecc-4af4-9c2f-f352ed45a81a\") " Dec 09 10:48:46 crc kubenswrapper[5002]: I1209 10:48:46.519723 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmt5q\" (UniqueName: \"kubernetes.io/projected/bc316257-fecc-4af4-9c2f-f352ed45a81a-kube-api-access-xmt5q\") pod \"bc316257-fecc-4af4-9c2f-f352ed45a81a\" (UID: \"bc316257-fecc-4af4-9c2f-f352ed45a81a\") " Dec 09 10:48:46 crc kubenswrapper[5002]: I1209 10:48:46.521427 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc316257-fecc-4af4-9c2f-f352ed45a81a-utilities" (OuterVolumeSpecName: "utilities") pod "bc316257-fecc-4af4-9c2f-f352ed45a81a" (UID: "bc316257-fecc-4af4-9c2f-f352ed45a81a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:48:46 crc kubenswrapper[5002]: I1209 10:48:46.528210 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc316257-fecc-4af4-9c2f-f352ed45a81a-kube-api-access-xmt5q" (OuterVolumeSpecName: "kube-api-access-xmt5q") pod "bc316257-fecc-4af4-9c2f-f352ed45a81a" (UID: "bc316257-fecc-4af4-9c2f-f352ed45a81a"). InnerVolumeSpecName "kube-api-access-xmt5q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:48:46 crc kubenswrapper[5002]: I1209 10:48:46.544026 5002 generic.go:334] "Generic (PLEG): container finished" podID="57b8ac63-b387-4ee4-bae7-16c1d70e8821" containerID="7ed7c548c8ac81463e7b1e470dda5c580d571861d2313a05f3f4ce4a12c152d9" exitCode=0 Dec 09 10:48:46 crc kubenswrapper[5002]: I1209 10:48:46.545985 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x9btx" event={"ID":"57b8ac63-b387-4ee4-bae7-16c1d70e8821","Type":"ContainerDied","Data":"7ed7c548c8ac81463e7b1e470dda5c580d571861d2313a05f3f4ce4a12c152d9"} Dec 09 10:48:46 crc kubenswrapper[5002]: I1209 10:48:46.546060 5002 scope.go:117] "RemoveContainer" containerID="7ed7c548c8ac81463e7b1e470dda5c580d571861d2313a05f3f4ce4a12c152d9" Dec 09 10:48:46 crc kubenswrapper[5002]: I1209 10:48:46.546487 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6bhqr" podUID="7c57ba50-6bca-4b38-84bb-173daf430393" containerName="registry-server" containerID="cri-o://5a982d36a79a2708ad72c07f2aa1f2f8d77810ae572c876483bdf334138583cf" gracePeriod=2 Dec 09 10:48:46 crc kubenswrapper[5002]: I1209 10:48:46.586876 5002 scope.go:117] "RemoveContainer" containerID="4d3450d9195823da4f384ae3194cc1e71a44ee15cf328b47efc39eac07e76a0c" Dec 09 10:48:46 crc kubenswrapper[5002]: I1209 10:48:46.599667 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc316257-fecc-4af4-9c2f-f352ed45a81a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bc316257-fecc-4af4-9c2f-f352ed45a81a" (UID: "bc316257-fecc-4af4-9c2f-f352ed45a81a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:48:46 crc kubenswrapper[5002]: I1209 10:48:46.622192 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc316257-fecc-4af4-9c2f-f352ed45a81a-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 10:48:46 crc kubenswrapper[5002]: I1209 10:48:46.622231 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc316257-fecc-4af4-9c2f-f352ed45a81a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 10:48:46 crc kubenswrapper[5002]: I1209 10:48:46.622272 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmt5q\" (UniqueName: \"kubernetes.io/projected/bc316257-fecc-4af4-9c2f-f352ed45a81a-kube-api-access-xmt5q\") on node \"crc\" DevicePath \"\"" Dec 09 10:48:46 crc kubenswrapper[5002]: I1209 10:48:46.688776 5002 scope.go:117] "RemoveContainer" containerID="4c237f141fb07a1f7942e4ecf8ba790b46a21f1406e2a5ad1f5766cba8abf005" Dec 09 10:48:46 crc kubenswrapper[5002]: I1209 10:48:46.871359 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6bhqr" Dec 09 10:48:46 crc kubenswrapper[5002]: I1209 10:48:46.927043 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c57ba50-6bca-4b38-84bb-173daf430393-utilities\") pod \"7c57ba50-6bca-4b38-84bb-173daf430393\" (UID: \"7c57ba50-6bca-4b38-84bb-173daf430393\") " Dec 09 10:48:46 crc kubenswrapper[5002]: I1209 10:48:46.927185 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c57ba50-6bca-4b38-84bb-173daf430393-catalog-content\") pod \"7c57ba50-6bca-4b38-84bb-173daf430393\" (UID: \"7c57ba50-6bca-4b38-84bb-173daf430393\") " Dec 09 10:48:46 crc kubenswrapper[5002]: I1209 10:48:46.927215 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7t5k5\" (UniqueName: \"kubernetes.io/projected/7c57ba50-6bca-4b38-84bb-173daf430393-kube-api-access-7t5k5\") pod \"7c57ba50-6bca-4b38-84bb-173daf430393\" (UID: \"7c57ba50-6bca-4b38-84bb-173daf430393\") " Dec 09 10:48:46 crc kubenswrapper[5002]: I1209 10:48:46.928582 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c57ba50-6bca-4b38-84bb-173daf430393-utilities" (OuterVolumeSpecName: "utilities") pod "7c57ba50-6bca-4b38-84bb-173daf430393" (UID: "7c57ba50-6bca-4b38-84bb-173daf430393"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:48:46 crc kubenswrapper[5002]: I1209 10:48:46.930794 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c57ba50-6bca-4b38-84bb-173daf430393-kube-api-access-7t5k5" (OuterVolumeSpecName: "kube-api-access-7t5k5") pod "7c57ba50-6bca-4b38-84bb-173daf430393" (UID: "7c57ba50-6bca-4b38-84bb-173daf430393"). InnerVolumeSpecName "kube-api-access-7t5k5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:48:47 crc kubenswrapper[5002]: I1209 10:48:47.028403 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7t5k5\" (UniqueName: \"kubernetes.io/projected/7c57ba50-6bca-4b38-84bb-173daf430393-kube-api-access-7t5k5\") on node \"crc\" DevicePath \"\"" Dec 09 10:48:47 crc kubenswrapper[5002]: I1209 10:48:47.028435 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c57ba50-6bca-4b38-84bb-173daf430393-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 10:48:47 crc kubenswrapper[5002]: I1209 10:48:47.045644 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c57ba50-6bca-4b38-84bb-173daf430393-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7c57ba50-6bca-4b38-84bb-173daf430393" (UID: "7c57ba50-6bca-4b38-84bb-173daf430393"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:48:47 crc kubenswrapper[5002]: I1209 10:48:47.129963 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c57ba50-6bca-4b38-84bb-173daf430393-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 10:48:47 crc kubenswrapper[5002]: I1209 10:48:47.556025 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-54h9m_bc316257-fecc-4af4-9c2f-f352ed45a81a/registry-server/0.log" Dec 09 10:48:47 crc kubenswrapper[5002]: I1209 10:48:47.557168 5002 generic.go:334] "Generic (PLEG): container finished" podID="bc316257-fecc-4af4-9c2f-f352ed45a81a" containerID="045124a606f08d6d369e16ac0661fb131764736a67fd18b795d9c454ea00d09f" exitCode=137 Dec 09 10:48:47 crc kubenswrapper[5002]: I1209 10:48:47.557233 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-54h9m" event={"ID":"bc316257-fecc-4af4-9c2f-f352ed45a81a","Type":"ContainerDied","Data":"045124a606f08d6d369e16ac0661fb131764736a67fd18b795d9c454ea00d09f"} Dec 09 10:48:47 crc kubenswrapper[5002]: I1209 10:48:47.557270 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-54h9m" event={"ID":"bc316257-fecc-4af4-9c2f-f352ed45a81a","Type":"ContainerDied","Data":"5633faea311d48ed54bfcdabf06daaa6d1797d80cdf2f9f1d52f4cb587ac730a"} Dec 09 10:48:47 crc kubenswrapper[5002]: I1209 10:48:47.557296 5002 scope.go:117] "RemoveContainer" containerID="045124a606f08d6d369e16ac0661fb131764736a67fd18b795d9c454ea00d09f" Dec 09 10:48:47 crc kubenswrapper[5002]: I1209 10:48:47.557441 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-54h9m" Dec 09 10:48:47 crc kubenswrapper[5002]: I1209 10:48:47.562487 5002 generic.go:334] "Generic (PLEG): container finished" podID="7c57ba50-6bca-4b38-84bb-173daf430393" containerID="5a982d36a79a2708ad72c07f2aa1f2f8d77810ae572c876483bdf334138583cf" exitCode=0 Dec 09 10:48:47 crc kubenswrapper[5002]: I1209 10:48:47.562546 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6bhqr" event={"ID":"7c57ba50-6bca-4b38-84bb-173daf430393","Type":"ContainerDied","Data":"5a982d36a79a2708ad72c07f2aa1f2f8d77810ae572c876483bdf334138583cf"} Dec 09 10:48:47 crc kubenswrapper[5002]: I1209 10:48:47.562578 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6bhqr" event={"ID":"7c57ba50-6bca-4b38-84bb-173daf430393","Type":"ContainerDied","Data":"3aa7559eff4ec9061fa129da572f875e635910ef6f354fd5f90fe6b9a8b4478f"} Dec 09 10:48:47 crc kubenswrapper[5002]: I1209 10:48:47.562655 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6bhqr" Dec 09 10:48:47 crc kubenswrapper[5002]: I1209 10:48:47.564690 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x9btx" event={"ID":"57b8ac63-b387-4ee4-bae7-16c1d70e8821","Type":"ContainerDied","Data":"b623a79b195a90efc352f9055ed098087739aa4cd305f2dd9139f375cd5b5b9b"} Dec 09 10:48:47 crc kubenswrapper[5002]: I1209 10:48:47.565014 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-x9btx"
Dec 09 10:48:47 crc kubenswrapper[5002]: I1209 10:48:47.600701 5002 scope.go:117] "RemoveContainer" containerID="be0f96a64b5b0a6b328b1f85bf5175c1c7e684abe38b2f0b91cfc18a8e203560"
Dec 09 10:48:47 crc kubenswrapper[5002]: I1209 10:48:47.610443 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-54h9m"]
Dec 09 10:48:47 crc kubenswrapper[5002]: I1209 10:48:47.616642 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-54h9m"]
Dec 09 10:48:47 crc kubenswrapper[5002]: I1209 10:48:47.623210 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6bhqr"]
Dec 09 10:48:47 crc kubenswrapper[5002]: I1209 10:48:47.628355 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6bhqr"]
Dec 09 10:48:47 crc kubenswrapper[5002]: I1209 10:48:47.635235 5002 scope.go:117] "RemoveContainer" containerID="189dcafa2cf2ecde1847aaba166a3caf494f8b8ddfc8692fc7ae35914a2b2700"
Dec 09 10:48:47 crc kubenswrapper[5002]: I1209 10:48:47.641032 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x9btx"]
Dec 09 10:48:47 crc kubenswrapper[5002]: I1209 10:48:47.648720 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-x9btx"]
Dec 09 10:48:47 crc kubenswrapper[5002]: I1209 10:48:47.671016 5002 scope.go:117] "RemoveContainer" containerID="045124a606f08d6d369e16ac0661fb131764736a67fd18b795d9c454ea00d09f"
Dec 09 10:48:47 crc kubenswrapper[5002]: E1209 10:48:47.671333 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"045124a606f08d6d369e16ac0661fb131764736a67fd18b795d9c454ea00d09f\": container with ID starting with 045124a606f08d6d369e16ac0661fb131764736a67fd18b795d9c454ea00d09f not found: ID does not exist" containerID="045124a606f08d6d369e16ac0661fb131764736a67fd18b795d9c454ea00d09f"
Dec 09 10:48:47 crc kubenswrapper[5002]: I1209 10:48:47.671357 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"045124a606f08d6d369e16ac0661fb131764736a67fd18b795d9c454ea00d09f"} err="failed to get container status \"045124a606f08d6d369e16ac0661fb131764736a67fd18b795d9c454ea00d09f\": rpc error: code = NotFound desc = could not find container \"045124a606f08d6d369e16ac0661fb131764736a67fd18b795d9c454ea00d09f\": container with ID starting with 045124a606f08d6d369e16ac0661fb131764736a67fd18b795d9c454ea00d09f not found: ID does not exist"
Dec 09 10:48:47 crc kubenswrapper[5002]: I1209 10:48:47.671378 5002 scope.go:117] "RemoveContainer" containerID="be0f96a64b5b0a6b328b1f85bf5175c1c7e684abe38b2f0b91cfc18a8e203560"
Dec 09 10:48:47 crc kubenswrapper[5002]: E1209 10:48:47.671539 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be0f96a64b5b0a6b328b1f85bf5175c1c7e684abe38b2f0b91cfc18a8e203560\": container with ID starting with be0f96a64b5b0a6b328b1f85bf5175c1c7e684abe38b2f0b91cfc18a8e203560 not found: ID does not exist" containerID="be0f96a64b5b0a6b328b1f85bf5175c1c7e684abe38b2f0b91cfc18a8e203560"
Dec 09 10:48:47 crc kubenswrapper[5002]: I1209 10:48:47.671562 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be0f96a64b5b0a6b328b1f85bf5175c1c7e684abe38b2f0b91cfc18a8e203560"} err="failed to get container status \"be0f96a64b5b0a6b328b1f85bf5175c1c7e684abe38b2f0b91cfc18a8e203560\": rpc error: code = NotFound desc = could not find container \"be0f96a64b5b0a6b328b1f85bf5175c1c7e684abe38b2f0b91cfc18a8e203560\": container with ID starting with be0f96a64b5b0a6b328b1f85bf5175c1c7e684abe38b2f0b91cfc18a8e203560 not found: ID does not exist"
Dec 09 10:48:47 crc kubenswrapper[5002]: I1209 10:48:47.671574 5002 scope.go:117] "RemoveContainer" containerID="189dcafa2cf2ecde1847aaba166a3caf494f8b8ddfc8692fc7ae35914a2b2700"
Dec 09 10:48:47 crc kubenswrapper[5002]: E1209 10:48:47.671737 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"189dcafa2cf2ecde1847aaba166a3caf494f8b8ddfc8692fc7ae35914a2b2700\": container with ID starting with 189dcafa2cf2ecde1847aaba166a3caf494f8b8ddfc8692fc7ae35914a2b2700 not found: ID does not exist" containerID="189dcafa2cf2ecde1847aaba166a3caf494f8b8ddfc8692fc7ae35914a2b2700"
Dec 09 10:48:47 crc kubenswrapper[5002]: I1209 10:48:47.671753 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"189dcafa2cf2ecde1847aaba166a3caf494f8b8ddfc8692fc7ae35914a2b2700"} err="failed to get container status \"189dcafa2cf2ecde1847aaba166a3caf494f8b8ddfc8692fc7ae35914a2b2700\": rpc error: code = NotFound desc = could not find container \"189dcafa2cf2ecde1847aaba166a3caf494f8b8ddfc8692fc7ae35914a2b2700\": container with ID starting with 189dcafa2cf2ecde1847aaba166a3caf494f8b8ddfc8692fc7ae35914a2b2700 not found: ID does not exist"
Dec 09 10:48:47 crc kubenswrapper[5002]: I1209 10:48:47.671765 5002 scope.go:117] "RemoveContainer" containerID="5a982d36a79a2708ad72c07f2aa1f2f8d77810ae572c876483bdf334138583cf"
Dec 09 10:48:47 crc kubenswrapper[5002]: I1209 10:48:47.686016 5002 scope.go:117] "RemoveContainer" containerID="dd4b3e7be3a0e6b96cfcce556a8bb05481ad85b11ed2ddd6e402ebee942974c9"
Dec 09 10:48:47 crc kubenswrapper[5002]: I1209 10:48:47.703362 5002 scope.go:117] "RemoveContainer" containerID="d4cddba500a2ee3f369c0682ab5f8ca68af0db5dade2473dbc66417138adb74f"
Dec 09 10:48:47 crc kubenswrapper[5002]: I1209 10:48:47.732046 5002 scope.go:117] "RemoveContainer" containerID="5a982d36a79a2708ad72c07f2aa1f2f8d77810ae572c876483bdf334138583cf"
Dec 09 10:48:47 crc kubenswrapper[5002]: E1209 10:48:47.732457 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a982d36a79a2708ad72c07f2aa1f2f8d77810ae572c876483bdf334138583cf\": container with ID starting with 5a982d36a79a2708ad72c07f2aa1f2f8d77810ae572c876483bdf334138583cf not found: ID does not exist" containerID="5a982d36a79a2708ad72c07f2aa1f2f8d77810ae572c876483bdf334138583cf"
Dec 09 10:48:47 crc kubenswrapper[5002]: I1209 10:48:47.732547 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a982d36a79a2708ad72c07f2aa1f2f8d77810ae572c876483bdf334138583cf"} err="failed to get container status \"5a982d36a79a2708ad72c07f2aa1f2f8d77810ae572c876483bdf334138583cf\": rpc error: code = NotFound desc = could not find container \"5a982d36a79a2708ad72c07f2aa1f2f8d77810ae572c876483bdf334138583cf\": container with ID starting with 5a982d36a79a2708ad72c07f2aa1f2f8d77810ae572c876483bdf334138583cf not found: ID does not exist"
Dec 09 10:48:47 crc kubenswrapper[5002]: I1209 10:48:47.732580 5002 scope.go:117] "RemoveContainer" containerID="dd4b3e7be3a0e6b96cfcce556a8bb05481ad85b11ed2ddd6e402ebee942974c9"
Dec 09 10:48:47 crc kubenswrapper[5002]: E1209 10:48:47.732902 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd4b3e7be3a0e6b96cfcce556a8bb05481ad85b11ed2ddd6e402ebee942974c9\": container with ID starting with dd4b3e7be3a0e6b96cfcce556a8bb05481ad85b11ed2ddd6e402ebee942974c9 not found: ID does not exist" containerID="dd4b3e7be3a0e6b96cfcce556a8bb05481ad85b11ed2ddd6e402ebee942974c9"
Dec 09 10:48:47 crc kubenswrapper[5002]: I1209 10:48:47.732938 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd4b3e7be3a0e6b96cfcce556a8bb05481ad85b11ed2ddd6e402ebee942974c9"} err="failed to get container status \"dd4b3e7be3a0e6b96cfcce556a8bb05481ad85b11ed2ddd6e402ebee942974c9\": rpc error: code = NotFound desc = could not find container \"dd4b3e7be3a0e6b96cfcce556a8bb05481ad85b11ed2ddd6e402ebee942974c9\": container with ID starting with dd4b3e7be3a0e6b96cfcce556a8bb05481ad85b11ed2ddd6e402ebee942974c9 not found: ID does not exist"
Dec 09 10:48:47 crc kubenswrapper[5002]: I1209 10:48:47.732962 5002 scope.go:117] "RemoveContainer" containerID="d4cddba500a2ee3f369c0682ab5f8ca68af0db5dade2473dbc66417138adb74f"
Dec 09 10:48:47 crc kubenswrapper[5002]: E1209 10:48:47.733166 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4cddba500a2ee3f369c0682ab5f8ca68af0db5dade2473dbc66417138adb74f\": container with ID starting with d4cddba500a2ee3f369c0682ab5f8ca68af0db5dade2473dbc66417138adb74f not found: ID does not exist" containerID="d4cddba500a2ee3f369c0682ab5f8ca68af0db5dade2473dbc66417138adb74f"
Dec 09 10:48:47 crc kubenswrapper[5002]: I1209 10:48:47.733199 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4cddba500a2ee3f369c0682ab5f8ca68af0db5dade2473dbc66417138adb74f"} err="failed to get container status \"d4cddba500a2ee3f369c0682ab5f8ca68af0db5dade2473dbc66417138adb74f\": rpc error: code = NotFound desc = could not find container \"d4cddba500a2ee3f369c0682ab5f8ca68af0db5dade2473dbc66417138adb74f\": container with ID starting with d4cddba500a2ee3f369c0682ab5f8ca68af0db5dade2473dbc66417138adb74f not found: ID does not exist"
Dec 09 10:48:48 crc kubenswrapper[5002]: I1209 10:48:48.074498 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57b8ac63-b387-4ee4-bae7-16c1d70e8821" path="/var/lib/kubelet/pods/57b8ac63-b387-4ee4-bae7-16c1d70e8821/volumes"
Dec 09 10:48:48 crc kubenswrapper[5002]: I1209 10:48:48.075952 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c57ba50-6bca-4b38-84bb-173daf430393" path="/var/lib/kubelet/pods/7c57ba50-6bca-4b38-84bb-173daf430393/volumes"
Dec 09 10:48:48 crc kubenswrapper[5002]: I1209 10:48:48.077299 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc316257-fecc-4af4-9c2f-f352ed45a81a" path="/var/lib/kubelet/pods/bc316257-fecc-4af4-9c2f-f352ed45a81a/volumes"
Dec 09 10:49:07 crc kubenswrapper[5002]: I1209 10:49:07.964343 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 09 10:49:07 crc kubenswrapper[5002]: I1209 10:49:07.964968 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 09 10:49:37 crc kubenswrapper[5002]: I1209 10:49:37.965401 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 09 10:49:37 crc kubenswrapper[5002]: I1209 10:49:37.966096 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 09 10:49:37 crc kubenswrapper[5002]: I1209 10:49:37.966159 5002 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6"
Dec 09 10:49:37 crc kubenswrapper[5002]: I1209 10:49:37.966924 5002 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"093aa43a31cfe27a0c2f247ef2dbf753766dc1b0af20d4afbacdd52b8dadfe16"} pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 09 10:49:37 crc kubenswrapper[5002]: I1209 10:49:37.966988 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" containerID="cri-o://093aa43a31cfe27a0c2f247ef2dbf753766dc1b0af20d4afbacdd52b8dadfe16" gracePeriod=600
Dec 09 10:49:39 crc kubenswrapper[5002]: I1209 10:49:39.021124 5002 generic.go:334] "Generic (PLEG): container finished" podID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerID="093aa43a31cfe27a0c2f247ef2dbf753766dc1b0af20d4afbacdd52b8dadfe16" exitCode=0
Dec 09 10:49:39 crc kubenswrapper[5002]: I1209 10:49:39.021163 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerDied","Data":"093aa43a31cfe27a0c2f247ef2dbf753766dc1b0af20d4afbacdd52b8dadfe16"}
Dec 09 10:49:39 crc kubenswrapper[5002]: I1209 10:49:39.021748 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerStarted","Data":"738da2b9f567c691aaa8d2ad1324e9d744e19bb9b23d1692848e356bf8282445"}
Dec 09 10:49:39 crc kubenswrapper[5002]: I1209 10:49:39.021772 5002 scope.go:117] "RemoveContainer" containerID="3e68990294ce88d90ac0cfc0896233c3dbff845f2071adedd38304920291ca53"
Dec 09 10:52:07 crc kubenswrapper[5002]: I1209 10:52:07.964619 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
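
Editor's note: the probe mechanics here are simple. prober.go GETs http://127.0.0.1:8798/health each period; "connect: connection refused" means nothing was listening on the port at probe time, and after the failure threshold the kubelet kills the container (with its 600s grace period) and starts a replacement. A liveness endpoint that would satisfy this probe is nothing more than a 200 on /health; a minimal illustrative sketch, not the machine-config-daemon's actual handler:

```go
package main

import (
	"log"
	"net/http"
)

func main() {
	// The kubelet probe in this log does GET http://127.0.0.1:8798/health;
	// "connect: connection refused" means no process was bound to the port.
	http.HandleFunc("/health", func(w http.ResponseWriter, r *http.Request) {
		w.WriteHeader(http.StatusOK) // any 2xx/3xx status counts as a pass
	})
	log.Fatal(http.ListenAndServe("127.0.0.1:8798", nil))
}
```
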
connect: connection refused" start-of-body= Dec 09 10:52:07 crc kubenswrapper[5002]: I1209 10:52:07.965476 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 10:52:37 crc kubenswrapper[5002]: I1209 10:52:37.964999 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 10:52:37 crc kubenswrapper[5002]: I1209 10:52:37.965660 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 10:53:07 crc kubenswrapper[5002]: I1209 10:53:07.965185 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 10:53:07 crc kubenswrapper[5002]: I1209 10:53:07.965929 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 10:53:07 crc kubenswrapper[5002]: I1209 10:53:07.965972 5002 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" Dec 09 10:53:07 crc kubenswrapper[5002]: I1209 10:53:07.967159 5002 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"738da2b9f567c691aaa8d2ad1324e9d744e19bb9b23d1692848e356bf8282445"} pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 10:53:07 crc kubenswrapper[5002]: I1209 10:53:07.967221 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" containerID="cri-o://738da2b9f567c691aaa8d2ad1324e9d744e19bb9b23d1692848e356bf8282445" gracePeriod=600 Dec 09 10:53:08 crc kubenswrapper[5002]: E1209 10:53:08.118087 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 10:53:09 crc kubenswrapper[5002]: I1209 10:53:09.022551 5002 generic.go:334] "Generic 
(PLEG): container finished" podID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerID="738da2b9f567c691aaa8d2ad1324e9d744e19bb9b23d1692848e356bf8282445" exitCode=0 Dec 09 10:53:09 crc kubenswrapper[5002]: I1209 10:53:09.022607 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerDied","Data":"738da2b9f567c691aaa8d2ad1324e9d744e19bb9b23d1692848e356bf8282445"} Dec 09 10:53:09 crc kubenswrapper[5002]: I1209 10:53:09.022644 5002 scope.go:117] "RemoveContainer" containerID="093aa43a31cfe27a0c2f247ef2dbf753766dc1b0af20d4afbacdd52b8dadfe16" Dec 09 10:53:09 crc kubenswrapper[5002]: I1209 10:53:09.022986 5002 scope.go:117] "RemoveContainer" containerID="738da2b9f567c691aaa8d2ad1324e9d744e19bb9b23d1692848e356bf8282445" Dec 09 10:53:09 crc kubenswrapper[5002]: E1209 10:53:09.023225 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 10:53:21 crc kubenswrapper[5002]: I1209 10:53:21.060283 5002 scope.go:117] "RemoveContainer" containerID="738da2b9f567c691aaa8d2ad1324e9d744e19bb9b23d1692848e356bf8282445" Dec 09 10:53:21 crc kubenswrapper[5002]: E1209 10:53:21.061068 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 10:53:36 crc kubenswrapper[5002]: I1209 10:53:36.059732 5002 scope.go:117] "RemoveContainer" containerID="738da2b9f567c691aaa8d2ad1324e9d744e19bb9b23d1692848e356bf8282445" Dec 09 10:53:36 crc kubenswrapper[5002]: E1209 10:53:36.060444 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 10:53:48 crc kubenswrapper[5002]: I1209 10:53:48.072287 5002 scope.go:117] "RemoveContainer" containerID="738da2b9f567c691aaa8d2ad1324e9d744e19bb9b23d1692848e356bf8282445" Dec 09 10:53:48 crc kubenswrapper[5002]: E1209 10:53:48.073151 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 10:53:59 crc kubenswrapper[5002]: I1209 10:53:59.060987 5002 scope.go:117] "RemoveContainer" containerID="738da2b9f567c691aaa8d2ad1324e9d744e19bb9b23d1692848e356bf8282445" Dec 09 
10:53:59 crc kubenswrapper[5002]: E1209 10:53:59.062004 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 10:54:14 crc kubenswrapper[5002]: I1209 10:54:14.062936 5002 scope.go:117] "RemoveContainer" containerID="738da2b9f567c691aaa8d2ad1324e9d744e19bb9b23d1692848e356bf8282445" Dec 09 10:54:14 crc kubenswrapper[5002]: E1209 10:54:14.064198 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 10:54:27 crc kubenswrapper[5002]: I1209 10:54:27.060071 5002 scope.go:117] "RemoveContainer" containerID="738da2b9f567c691aaa8d2ad1324e9d744e19bb9b23d1692848e356bf8282445" Dec 09 10:54:27 crc kubenswrapper[5002]: E1209 10:54:27.060978 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 10:54:40 crc kubenswrapper[5002]: I1209 10:54:40.060676 5002 scope.go:117] "RemoveContainer" containerID="738da2b9f567c691aaa8d2ad1324e9d744e19bb9b23d1692848e356bf8282445" Dec 09 10:54:40 crc kubenswrapper[5002]: E1209 10:54:40.061406 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 10:54:55 crc kubenswrapper[5002]: I1209 10:54:55.060397 5002 scope.go:117] "RemoveContainer" containerID="738da2b9f567c691aaa8d2ad1324e9d744e19bb9b23d1692848e356bf8282445" Dec 09 10:54:55 crc kubenswrapper[5002]: E1209 10:54:55.062395 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 10:55:06 crc kubenswrapper[5002]: I1209 10:55:06.061120 5002 scope.go:117] "RemoveContainer" containerID="738da2b9f567c691aaa8d2ad1324e9d744e19bb9b23d1692848e356bf8282445" Dec 09 10:55:06 crc kubenswrapper[5002]: E1209 10:55:06.063460 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
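
Editor's note: each "RemoveContainer"/"Error syncing pod" pair above is a sync-loop attempt that the restart backoff rejects; the quoted "back-off 5m0s" is the kubelet's cap, and it is visible in the timestamps: after the kill at 10:53:07, refusals continue until 10:58:08, almost exactly five minutes later, when the restart finally goes through (visible further down). A sketch of the doubling-with-cap schedule; the 10s base and 5m cap match upstream kubelet defaults, but the helper is hypothetical, not kubelet source:

```go
package main

import (
	"fmt"
	"time"
)

// crashLoopDelay approximates the kubelet restart backoff: 10s after the
// first failure, doubling per subsequent failure, capped at 5m ("back-off 5m0s").
func crashLoopDelay(failures int) time.Duration {
	const base, limit = 10 * time.Second, 5 * time.Minute
	d := base
	for i := 1; i < failures; i++ {
		d *= 2
		if d >= limit {
			return limit
		}
	}
	return d
}

func main() {
	for n := 1; n <= 6; n++ {
		fmt.Printf("failure %d -> next restart in %v\n", n, crashLoopDelay(n))
	}
	// failure 1 -> 10s, 2 -> 20s, 3 -> 40s, 4 -> 1m20s, 5 -> 2m40s, 6 -> 5m0s
}
```
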
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 10:55:17 crc kubenswrapper[5002]: I1209 10:55:17.060698 5002 scope.go:117] "RemoveContainer" containerID="738da2b9f567c691aaa8d2ad1324e9d744e19bb9b23d1692848e356bf8282445" Dec 09 10:55:17 crc kubenswrapper[5002]: E1209 10:55:17.061827 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 10:55:32 crc kubenswrapper[5002]: I1209 10:55:32.060789 5002 scope.go:117] "RemoveContainer" containerID="738da2b9f567c691aaa8d2ad1324e9d744e19bb9b23d1692848e356bf8282445" Dec 09 10:55:32 crc kubenswrapper[5002]: E1209 10:55:32.061774 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 10:55:34 crc kubenswrapper[5002]: I1209 10:55:34.832875 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mtwlg"] Dec 09 10:55:34 crc kubenswrapper[5002]: E1209 10:55:34.834308 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57b8ac63-b387-4ee4-bae7-16c1d70e8821" containerName="registry-server" Dec 09 10:55:34 crc kubenswrapper[5002]: I1209 10:55:34.834353 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="57b8ac63-b387-4ee4-bae7-16c1d70e8821" containerName="registry-server" Dec 09 10:55:34 crc kubenswrapper[5002]: E1209 10:55:34.834375 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc316257-fecc-4af4-9c2f-f352ed45a81a" containerName="extract-utilities" Dec 09 10:55:34 crc kubenswrapper[5002]: I1209 10:55:34.834383 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc316257-fecc-4af4-9c2f-f352ed45a81a" containerName="extract-utilities" Dec 09 10:55:34 crc kubenswrapper[5002]: E1209 10:55:34.834394 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57b8ac63-b387-4ee4-bae7-16c1d70e8821" containerName="extract-content" Dec 09 10:55:34 crc kubenswrapper[5002]: I1209 10:55:34.834401 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="57b8ac63-b387-4ee4-bae7-16c1d70e8821" containerName="extract-content" Dec 09 10:55:34 crc kubenswrapper[5002]: E1209 10:55:34.834413 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc316257-fecc-4af4-9c2f-f352ed45a81a" containerName="registry-server" Dec 09 10:55:34 crc kubenswrapper[5002]: I1209 10:55:34.834419 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc316257-fecc-4af4-9c2f-f352ed45a81a" containerName="registry-server" Dec 09 10:55:34 crc kubenswrapper[5002]: E1209 10:55:34.834427 5002 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7c57ba50-6bca-4b38-84bb-173daf430393" containerName="registry-server" Dec 09 10:55:34 crc kubenswrapper[5002]: I1209 10:55:34.834433 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c57ba50-6bca-4b38-84bb-173daf430393" containerName="registry-server" Dec 09 10:55:34 crc kubenswrapper[5002]: E1209 10:55:34.834462 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c57ba50-6bca-4b38-84bb-173daf430393" containerName="extract-content" Dec 09 10:55:34 crc kubenswrapper[5002]: I1209 10:55:34.834471 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c57ba50-6bca-4b38-84bb-173daf430393" containerName="extract-content" Dec 09 10:55:34 crc kubenswrapper[5002]: E1209 10:55:34.834482 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c57ba50-6bca-4b38-84bb-173daf430393" containerName="extract-utilities" Dec 09 10:55:34 crc kubenswrapper[5002]: I1209 10:55:34.834488 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c57ba50-6bca-4b38-84bb-173daf430393" containerName="extract-utilities" Dec 09 10:55:34 crc kubenswrapper[5002]: E1209 10:55:34.834502 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57b8ac63-b387-4ee4-bae7-16c1d70e8821" containerName="extract-utilities" Dec 09 10:55:34 crc kubenswrapper[5002]: I1209 10:55:34.834509 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="57b8ac63-b387-4ee4-bae7-16c1d70e8821" containerName="extract-utilities" Dec 09 10:55:34 crc kubenswrapper[5002]: E1209 10:55:34.834516 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc316257-fecc-4af4-9c2f-f352ed45a81a" containerName="extract-content" Dec 09 10:55:34 crc kubenswrapper[5002]: I1209 10:55:34.834522 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc316257-fecc-4af4-9c2f-f352ed45a81a" containerName="extract-content" Dec 09 10:55:34 crc kubenswrapper[5002]: I1209 10:55:34.834677 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc316257-fecc-4af4-9c2f-f352ed45a81a" containerName="registry-server" Dec 09 10:55:34 crc kubenswrapper[5002]: I1209 10:55:34.834701 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c57ba50-6bca-4b38-84bb-173daf430393" containerName="registry-server" Dec 09 10:55:34 crc kubenswrapper[5002]: I1209 10:55:34.834713 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="57b8ac63-b387-4ee4-bae7-16c1d70e8821" containerName="registry-server" Dec 09 10:55:34 crc kubenswrapper[5002]: I1209 10:55:34.835886 5002 util.go:30] "No sandbox for pod can be found. 
Dec 09 10:55:34 crc kubenswrapper[5002]: I1209 10:55:34.859839 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mtwlg"]
Dec 09 10:55:35 crc kubenswrapper[5002]: I1209 10:55:35.007039 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8910aed0-a90b-4e55-b361-4564c8c0c5c3-utilities\") pod \"redhat-marketplace-mtwlg\" (UID: \"8910aed0-a90b-4e55-b361-4564c8c0c5c3\") " pod="openshift-marketplace/redhat-marketplace-mtwlg"
Dec 09 10:55:35 crc kubenswrapper[5002]: I1209 10:55:35.007104 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ql4g\" (UniqueName: \"kubernetes.io/projected/8910aed0-a90b-4e55-b361-4564c8c0c5c3-kube-api-access-2ql4g\") pod \"redhat-marketplace-mtwlg\" (UID: \"8910aed0-a90b-4e55-b361-4564c8c0c5c3\") " pod="openshift-marketplace/redhat-marketplace-mtwlg"
Dec 09 10:55:35 crc kubenswrapper[5002]: I1209 10:55:35.007140 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8910aed0-a90b-4e55-b361-4564c8c0c5c3-catalog-content\") pod \"redhat-marketplace-mtwlg\" (UID: \"8910aed0-a90b-4e55-b361-4564c8c0c5c3\") " pod="openshift-marketplace/redhat-marketplace-mtwlg"
Dec 09 10:55:35 crc kubenswrapper[5002]: I1209 10:55:35.108539 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8910aed0-a90b-4e55-b361-4564c8c0c5c3-utilities\") pod \"redhat-marketplace-mtwlg\" (UID: \"8910aed0-a90b-4e55-b361-4564c8c0c5c3\") " pod="openshift-marketplace/redhat-marketplace-mtwlg"
Dec 09 10:55:35 crc kubenswrapper[5002]: I1209 10:55:35.108601 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ql4g\" (UniqueName: \"kubernetes.io/projected/8910aed0-a90b-4e55-b361-4564c8c0c5c3-kube-api-access-2ql4g\") pod \"redhat-marketplace-mtwlg\" (UID: \"8910aed0-a90b-4e55-b361-4564c8c0c5c3\") " pod="openshift-marketplace/redhat-marketplace-mtwlg"
Dec 09 10:55:35 crc kubenswrapper[5002]: I1209 10:55:35.108638 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8910aed0-a90b-4e55-b361-4564c8c0c5c3-catalog-content\") pod \"redhat-marketplace-mtwlg\" (UID: \"8910aed0-a90b-4e55-b361-4564c8c0c5c3\") " pod="openshift-marketplace/redhat-marketplace-mtwlg"
Dec 09 10:55:35 crc kubenswrapper[5002]: I1209 10:55:35.109197 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8910aed0-a90b-4e55-b361-4564c8c0c5c3-utilities\") pod \"redhat-marketplace-mtwlg\" (UID: \"8910aed0-a90b-4e55-b361-4564c8c0c5c3\") " pod="openshift-marketplace/redhat-marketplace-mtwlg"
Dec 09 10:55:35 crc kubenswrapper[5002]: I1209 10:55:35.109221 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8910aed0-a90b-4e55-b361-4564c8c0c5c3-catalog-content\") pod \"redhat-marketplace-mtwlg\" (UID: \"8910aed0-a90b-4e55-b361-4564c8c0c5c3\") " pod="openshift-marketplace/redhat-marketplace-mtwlg"
Dec 09 10:55:35 crc kubenswrapper[5002]: I1209 10:55:35.129059 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ql4g\" (UniqueName: \"kubernetes.io/projected/8910aed0-a90b-4e55-b361-4564c8c0c5c3-kube-api-access-2ql4g\") pod \"redhat-marketplace-mtwlg\" (UID: \"8910aed0-a90b-4e55-b361-4564c8c0c5c3\") " pod="openshift-marketplace/redhat-marketplace-mtwlg"
Dec 09 10:55:35 crc kubenswrapper[5002]: I1209 10:55:35.159919 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mtwlg"
Dec 09 10:55:35 crc kubenswrapper[5002]: I1209 10:55:35.661731 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mtwlg"]
Dec 09 10:55:36 crc kubenswrapper[5002]: I1209 10:55:36.189593 5002 generic.go:334] "Generic (PLEG): container finished" podID="8910aed0-a90b-4e55-b361-4564c8c0c5c3" containerID="f7d3901d28d10d84e6bd43824e27d863c0b65a06c4d50f38571ec9e7a6106617" exitCode=0
Dec 09 10:55:36 crc kubenswrapper[5002]: I1209 10:55:36.189704 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mtwlg" event={"ID":"8910aed0-a90b-4e55-b361-4564c8c0c5c3","Type":"ContainerDied","Data":"f7d3901d28d10d84e6bd43824e27d863c0b65a06c4d50f38571ec9e7a6106617"}
Dec 09 10:55:36 crc kubenswrapper[5002]: I1209 10:55:36.191161 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mtwlg" event={"ID":"8910aed0-a90b-4e55-b361-4564c8c0c5c3","Type":"ContainerStarted","Data":"ae2ecfe33bbdc7e5ea6bfd546c4ebe7be39e8aa2474c8825f7b7952dfc336d01"}
Dec 09 10:55:36 crc kubenswrapper[5002]: I1209 10:55:36.192300 5002 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 09 10:55:40 crc kubenswrapper[5002]: I1209 10:55:40.224514 5002 generic.go:334] "Generic (PLEG): container finished" podID="8910aed0-a90b-4e55-b361-4564c8c0c5c3" containerID="89b1defbd67c8ad06f090dcc495e264f378710ec75d203d77796e4810387f927" exitCode=0
Dec 09 10:55:40 crc kubenswrapper[5002]: I1209 10:55:40.224573 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mtwlg" event={"ID":"8910aed0-a90b-4e55-b361-4564c8c0c5c3","Type":"ContainerDied","Data":"89b1defbd67c8ad06f090dcc495e264f378710ec75d203d77796e4810387f927"}
Dec 09 10:55:42 crc kubenswrapper[5002]: I1209 10:55:42.241317 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mtwlg" event={"ID":"8910aed0-a90b-4e55-b361-4564c8c0c5c3","Type":"ContainerStarted","Data":"bcf21911354fbf39d44fa8ae00fd242380b789410e129b9ececfb3059c8f8079"}
Dec 09 10:55:42 crc kubenswrapper[5002]: I1209 10:55:42.262931 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mtwlg" podStartSLOduration=3.799362956 podStartE2EDuration="8.262906489s" podCreationTimestamp="2025-12-09 10:55:34 +0000 UTC" firstStartedPulling="2025-12-09 10:55:36.192065294 +0000 UTC m=+3268.584116375" lastFinishedPulling="2025-12-09 10:55:40.655608827 +0000 UTC m=+3273.047659908" observedRunningTime="2025-12-09 10:55:42.258307645 +0000 UTC m=+3274.650358726" watchObservedRunningTime="2025-12-09 10:55:42.262906489 +0000 UTC m=+3274.654957570"
Dec 09 10:55:45 crc kubenswrapper[5002]: I1209 10:55:45.060298 5002 scope.go:117] "RemoveContainer" containerID="738da2b9f567c691aaa8d2ad1324e9d744e19bb9b23d1692848e356bf8282445"
Dec 09 10:55:45 crc kubenswrapper[5002]: E1209 10:55:45.060873 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
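
Editor's note: the pod_startup_latency_tracker line a few entries up encodes a small calculation. podStartE2EDuration is observedRunningTime minus podCreationTimestamp, while podStartSLOduration excludes image-pull time (lastFinishedPulling minus firstStartedPulling), which the SLO does not count. The figures for redhat-marketplace-mtwlg check out; a few lines of arithmetic, with the monotonic m=+… offsets copied from the log, confirm it:

```go
package main

import "fmt"

func main() {
	// Monotonic offsets (the m=+... values) copied from the log line above.
	firstStartedPulling := 3268.584116375
	lastFinishedPulling := 3273.047659908
	e2e := 8.262906489 // podStartE2EDuration: observedRunningTime - podCreationTimestamp

	pull := lastFinishedPulling - firstStartedPulling // 4.463543533s spent pulling images
	fmt.Printf("SLO duration = %.9f s\n", e2e-pull)   // 3.799362956, matching podStartSLOduration
}
```
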
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 10:55:45 crc kubenswrapper[5002]: I1209 10:55:45.160992 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mtwlg" Dec 09 10:55:45 crc kubenswrapper[5002]: I1209 10:55:45.161224 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mtwlg" Dec 09 10:55:45 crc kubenswrapper[5002]: I1209 10:55:45.212073 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mtwlg" Dec 09 10:55:46 crc kubenswrapper[5002]: I1209 10:55:46.309645 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mtwlg" Dec 09 10:55:46 crc kubenswrapper[5002]: I1209 10:55:46.354426 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mtwlg"] Dec 09 10:55:48 crc kubenswrapper[5002]: I1209 10:55:48.284056 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mtwlg" podUID="8910aed0-a90b-4e55-b361-4564c8c0c5c3" containerName="registry-server" containerID="cri-o://bcf21911354fbf39d44fa8ae00fd242380b789410e129b9ececfb3059c8f8079" gracePeriod=2 Dec 09 10:55:49 crc kubenswrapper[5002]: I1209 10:55:49.297181 5002 generic.go:334] "Generic (PLEG): container finished" podID="8910aed0-a90b-4e55-b361-4564c8c0c5c3" containerID="bcf21911354fbf39d44fa8ae00fd242380b789410e129b9ececfb3059c8f8079" exitCode=0 Dec 09 10:55:49 crc kubenswrapper[5002]: I1209 10:55:49.297260 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mtwlg" event={"ID":"8910aed0-a90b-4e55-b361-4564c8c0c5c3","Type":"ContainerDied","Data":"bcf21911354fbf39d44fa8ae00fd242380b789410e129b9ececfb3059c8f8079"} Dec 09 10:55:50 crc kubenswrapper[5002]: I1209 10:55:50.059726 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mtwlg" Dec 09 10:55:50 crc kubenswrapper[5002]: I1209 10:55:50.229852 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8910aed0-a90b-4e55-b361-4564c8c0c5c3-utilities\") pod \"8910aed0-a90b-4e55-b361-4564c8c0c5c3\" (UID: \"8910aed0-a90b-4e55-b361-4564c8c0c5c3\") " Dec 09 10:55:50 crc kubenswrapper[5002]: I1209 10:55:50.230025 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8910aed0-a90b-4e55-b361-4564c8c0c5c3-catalog-content\") pod \"8910aed0-a90b-4e55-b361-4564c8c0c5c3\" (UID: \"8910aed0-a90b-4e55-b361-4564c8c0c5c3\") " Dec 09 10:55:50 crc kubenswrapper[5002]: I1209 10:55:50.230108 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ql4g\" (UniqueName: \"kubernetes.io/projected/8910aed0-a90b-4e55-b361-4564c8c0c5c3-kube-api-access-2ql4g\") pod \"8910aed0-a90b-4e55-b361-4564c8c0c5c3\" (UID: \"8910aed0-a90b-4e55-b361-4564c8c0c5c3\") " Dec 09 10:55:50 crc kubenswrapper[5002]: I1209 10:55:50.233480 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8910aed0-a90b-4e55-b361-4564c8c0c5c3-utilities" (OuterVolumeSpecName: "utilities") pod "8910aed0-a90b-4e55-b361-4564c8c0c5c3" (UID: "8910aed0-a90b-4e55-b361-4564c8c0c5c3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:55:50 crc kubenswrapper[5002]: I1209 10:55:50.242249 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8910aed0-a90b-4e55-b361-4564c8c0c5c3-kube-api-access-2ql4g" (OuterVolumeSpecName: "kube-api-access-2ql4g") pod "8910aed0-a90b-4e55-b361-4564c8c0c5c3" (UID: "8910aed0-a90b-4e55-b361-4564c8c0c5c3"). InnerVolumeSpecName "kube-api-access-2ql4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:55:50 crc kubenswrapper[5002]: I1209 10:55:50.264540 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8910aed0-a90b-4e55-b361-4564c8c0c5c3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8910aed0-a90b-4e55-b361-4564c8c0c5c3" (UID: "8910aed0-a90b-4e55-b361-4564c8c0c5c3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:55:50 crc kubenswrapper[5002]: I1209 10:55:50.306666 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mtwlg" Dec 09 10:55:50 crc kubenswrapper[5002]: I1209 10:55:50.306709 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mtwlg" event={"ID":"8910aed0-a90b-4e55-b361-4564c8c0c5c3","Type":"ContainerDied","Data":"ae2ecfe33bbdc7e5ea6bfd546c4ebe7be39e8aa2474c8825f7b7952dfc336d01"} Dec 09 10:55:50 crc kubenswrapper[5002]: I1209 10:55:50.306846 5002 scope.go:117] "RemoveContainer" containerID="bcf21911354fbf39d44fa8ae00fd242380b789410e129b9ececfb3059c8f8079" Dec 09 10:55:50 crc kubenswrapper[5002]: I1209 10:55:50.334080 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8910aed0-a90b-4e55-b361-4564c8c0c5c3-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 10:55:50 crc kubenswrapper[5002]: I1209 10:55:50.334117 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8910aed0-a90b-4e55-b361-4564c8c0c5c3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 10:55:50 crc kubenswrapper[5002]: I1209 10:55:50.334128 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ql4g\" (UniqueName: \"kubernetes.io/projected/8910aed0-a90b-4e55-b361-4564c8c0c5c3-kube-api-access-2ql4g\") on node \"crc\" DevicePath \"\"" Dec 09 10:55:50 crc kubenswrapper[5002]: I1209 10:55:50.344358 5002 scope.go:117] "RemoveContainer" containerID="89b1defbd67c8ad06f090dcc495e264f378710ec75d203d77796e4810387f927" Dec 09 10:55:50 crc kubenswrapper[5002]: I1209 10:55:50.347024 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mtwlg"] Dec 09 10:55:50 crc kubenswrapper[5002]: I1209 10:55:50.351778 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mtwlg"] Dec 09 10:55:50 crc kubenswrapper[5002]: I1209 10:55:50.377341 5002 scope.go:117] "RemoveContainer" containerID="f7d3901d28d10d84e6bd43824e27d863c0b65a06c4d50f38571ec9e7a6106617" Dec 09 10:55:52 crc kubenswrapper[5002]: I1209 10:55:52.074115 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8910aed0-a90b-4e55-b361-4564c8c0c5c3" path="/var/lib/kubelet/pods/8910aed0-a90b-4e55-b361-4564c8c0c5c3/volumes" Dec 09 10:55:58 crc kubenswrapper[5002]: I1209 10:55:58.066875 5002 scope.go:117] "RemoveContainer" containerID="738da2b9f567c691aaa8d2ad1324e9d744e19bb9b23d1692848e356bf8282445" Dec 09 10:55:58 crc kubenswrapper[5002]: E1209 10:55:58.067853 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 10:56:12 crc kubenswrapper[5002]: I1209 10:56:12.060313 5002 scope.go:117] "RemoveContainer" containerID="738da2b9f567c691aaa8d2ad1324e9d744e19bb9b23d1692848e356bf8282445" Dec 09 10:56:12 crc kubenswrapper[5002]: E1209 10:56:12.061031 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
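
Editor's note: two details in this teardown are easy to miss. First, the grace periods differ by two orders of magnitude: the disposable registry-server is killed with gracePeriod=2, while the machine-config-daemon earlier got gracePeriod=600. Second, the exitCode=0 a second after the kill suggests the container exited on SIGTERM before the window expired, so no escalation was needed. A simplified process-level sketch of the TERM-then-KILL escalation behind "Killing container with a grace period" (standing in for the CRI StopContainer call, not the kubelet's code):

```go
package main

import (
	"log"
	"os/exec"
	"syscall"
	"time"
)

// killWithGrace sends SIGTERM, waits up to grace for a clean exit, then
// escalates to SIGKILL, mirroring the kubelet's graceful-kill behavior.
func killWithGrace(cmd *exec.Cmd, grace time.Duration) {
	done := make(chan error, 1)
	go func() { done <- cmd.Wait() }()
	cmd.Process.Signal(syscall.SIGTERM)
	select {
	case <-done: // exited in time, like the exitCode=0 "container finished" above
	case <-time.After(grace):
		cmd.Process.Kill() // grace expired: SIGKILL
		<-done
	}
}

func main() {
	cmd := exec.Command("sleep", "30")
	if err := cmd.Start(); err != nil {
		log.Fatal(err)
	}
	killWithGrace(cmd, 2*time.Second) // gracePeriod=2, as for registry-server
}
```
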
Dec 09 10:56:27 crc kubenswrapper[5002]: I1209 10:56:27.060772 5002 scope.go:117] "RemoveContainer" containerID="738da2b9f567c691aaa8d2ad1324e9d744e19bb9b23d1692848e356bf8282445"
Dec 09 10:56:27 crc kubenswrapper[5002]: E1209 10:56:27.061553 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 10:56:39 crc kubenswrapper[5002]: I1209 10:56:39.061643 5002 scope.go:117] "RemoveContainer" containerID="738da2b9f567c691aaa8d2ad1324e9d744e19bb9b23d1692848e356bf8282445"
Dec 09 10:56:39 crc kubenswrapper[5002]: E1209 10:56:39.062690 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 10:56:54 crc kubenswrapper[5002]: I1209 10:56:54.061156 5002 scope.go:117] "RemoveContainer" containerID="738da2b9f567c691aaa8d2ad1324e9d744e19bb9b23d1692848e356bf8282445"
Dec 09 10:56:54 crc kubenswrapper[5002]: E1209 10:56:54.062542 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 10:57:06 crc kubenswrapper[5002]: I1209 10:57:06.060085 5002 scope.go:117] "RemoveContainer" containerID="738da2b9f567c691aaa8d2ad1324e9d744e19bb9b23d1692848e356bf8282445"
Dec 09 10:57:06 crc kubenswrapper[5002]: E1209 10:57:06.060841 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 10:57:19 crc kubenswrapper[5002]: I1209 10:57:19.060718 5002 scope.go:117] "RemoveContainer" containerID="738da2b9f567c691aaa8d2ad1324e9d744e19bb9b23d1692848e356bf8282445"
Dec 09 10:57:19 crc kubenswrapper[5002]: E1209 10:57:19.063006 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 10:57:30 crc kubenswrapper[5002]: I1209 10:57:30.061094 5002 scope.go:117] "RemoveContainer" containerID="738da2b9f567c691aaa8d2ad1324e9d744e19bb9b23d1692848e356bf8282445"
Dec 09 10:57:30 crc kubenswrapper[5002]: E1209 10:57:30.062330 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 10:57:41 crc kubenswrapper[5002]: I1209 10:57:41.060456 5002 scope.go:117] "RemoveContainer" containerID="738da2b9f567c691aaa8d2ad1324e9d744e19bb9b23d1692848e356bf8282445"
Dec 09 10:57:41 crc kubenswrapper[5002]: E1209 10:57:41.061342 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 10:57:54 crc kubenswrapper[5002]: I1209 10:57:54.060743 5002 scope.go:117] "RemoveContainer" containerID="738da2b9f567c691aaa8d2ad1324e9d744e19bb9b23d1692848e356bf8282445"
Dec 09 10:57:54 crc kubenswrapper[5002]: E1209 10:57:54.062010 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 10:58:08 crc kubenswrapper[5002]: I1209 10:58:08.064214 5002 scope.go:117] "RemoveContainer" containerID="738da2b9f567c691aaa8d2ad1324e9d744e19bb9b23d1692848e356bf8282445"
Dec 09 10:58:08 crc kubenswrapper[5002]: I1209 10:58:08.421422 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerStarted","Data":"a6724d37eb3b52330ed87c33a37154162c1547d16b97b3dcf1ce00d13e350de1"}
Dec 09 10:58:39 crc kubenswrapper[5002]: I1209 10:58:39.688163 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hfhkj"]
Dec 09 10:58:39 crc kubenswrapper[5002]: E1209 10:58:39.689057 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8910aed0-a90b-4e55-b361-4564c8c0c5c3" containerName="extract-content"
Dec 09 10:58:39 crc kubenswrapper[5002]: I1209 10:58:39.689072 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="8910aed0-a90b-4e55-b361-4564c8c0c5c3" containerName="extract-content"
Dec 09 10:58:39 crc kubenswrapper[5002]: E1209 10:58:39.689092 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8910aed0-a90b-4e55-b361-4564c8c0c5c3" containerName="registry-server"
Dec 09 10:58:39 crc kubenswrapper[5002]: I1209 10:58:39.689102 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="8910aed0-a90b-4e55-b361-4564c8c0c5c3" containerName="registry-server"
Dec 09 10:58:39 crc kubenswrapper[5002]: E1209 10:58:39.689121 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8910aed0-a90b-4e55-b361-4564c8c0c5c3" containerName="extract-utilities"
Dec 09 10:58:39 crc kubenswrapper[5002]: I1209 10:58:39.689130 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="8910aed0-a90b-4e55-b361-4564c8c0c5c3" containerName="extract-utilities"
Dec 09 10:58:39 crc kubenswrapper[5002]: I1209 10:58:39.689322 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="8910aed0-a90b-4e55-b361-4564c8c0c5c3" containerName="registry-server"
Dec 09 10:58:39 crc kubenswrapper[5002]: I1209 10:58:39.690530 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hfhkj"
Dec 09 10:58:39 crc kubenswrapper[5002]: I1209 10:58:39.699647 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hfhkj"]
Dec 09 10:58:39 crc kubenswrapper[5002]: I1209 10:58:39.879788 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/376c46e1-d3fd-45c0-949b-65c9b9d4a5ef-catalog-content\") pod \"redhat-operators-hfhkj\" (UID: \"376c46e1-d3fd-45c0-949b-65c9b9d4a5ef\") " pod="openshift-marketplace/redhat-operators-hfhkj"
Dec 09 10:58:39 crc kubenswrapper[5002]: I1209 10:58:39.879899 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9d2v\" (UniqueName: \"kubernetes.io/projected/376c46e1-d3fd-45c0-949b-65c9b9d4a5ef-kube-api-access-w9d2v\") pod \"redhat-operators-hfhkj\" (UID: \"376c46e1-d3fd-45c0-949b-65c9b9d4a5ef\") " pod="openshift-marketplace/redhat-operators-hfhkj"
Dec 09 10:58:39 crc kubenswrapper[5002]: I1209 10:58:39.880087 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/376c46e1-d3fd-45c0-949b-65c9b9d4a5ef-utilities\") pod \"redhat-operators-hfhkj\" (UID: \"376c46e1-d3fd-45c0-949b-65c9b9d4a5ef\") " pod="openshift-marketplace/redhat-operators-hfhkj"
Dec 09 10:58:40 crc kubenswrapper[5002]: I1209 10:58:40.272718 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9d2v\" (UniqueName: \"kubernetes.io/projected/376c46e1-d3fd-45c0-949b-65c9b9d4a5ef-kube-api-access-w9d2v\") pod \"redhat-operators-hfhkj\" (UID: \"376c46e1-d3fd-45c0-949b-65c9b9d4a5ef\") " pod="openshift-marketplace/redhat-operators-hfhkj"
Dec 09 10:58:40 crc kubenswrapper[5002]: I1209 10:58:40.272853 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/376c46e1-d3fd-45c0-949b-65c9b9d4a5ef-utilities\") pod \"redhat-operators-hfhkj\" (UID: \"376c46e1-d3fd-45c0-949b-65c9b9d4a5ef\") " pod="openshift-marketplace/redhat-operators-hfhkj"
Dec 09 10:58:40 crc kubenswrapper[5002]: I1209 10:58:40.272886 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/376c46e1-d3fd-45c0-949b-65c9b9d4a5ef-catalog-content\") pod \"redhat-operators-hfhkj\" (UID: \"376c46e1-d3fd-45c0-949b-65c9b9d4a5ef\") " pod="openshift-marketplace/redhat-operators-hfhkj"
Dec 09 10:58:40 crc kubenswrapper[5002]: I1209 10:58:40.273327 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/376c46e1-d3fd-45c0-949b-65c9b9d4a5ef-utilities\") pod \"redhat-operators-hfhkj\" (UID: \"376c46e1-d3fd-45c0-949b-65c9b9d4a5ef\") " pod="openshift-marketplace/redhat-operators-hfhkj"
Dec 09 10:58:40 crc kubenswrapper[5002]: I1209 10:58:40.273393 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/376c46e1-d3fd-45c0-949b-65c9b9d4a5ef-catalog-content\") pod \"redhat-operators-hfhkj\" (UID: \"376c46e1-d3fd-45c0-949b-65c9b9d4a5ef\") " pod="openshift-marketplace/redhat-operators-hfhkj"
Dec 09 10:58:40 crc kubenswrapper[5002]: I1209 10:58:40.303318 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9d2v\" (UniqueName: \"kubernetes.io/projected/376c46e1-d3fd-45c0-949b-65c9b9d4a5ef-kube-api-access-w9d2v\") pod \"redhat-operators-hfhkj\" (UID: \"376c46e1-d3fd-45c0-949b-65c9b9d4a5ef\") " pod="openshift-marketplace/redhat-operators-hfhkj"
Dec 09 10:58:40 crc kubenswrapper[5002]: I1209 10:58:40.336456 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hfhkj"
Dec 09 10:58:40 crc kubenswrapper[5002]: I1209 10:58:40.609743 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hfhkj"]
Dec 09 10:58:40 crc kubenswrapper[5002]: I1209 10:58:40.703443 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hfhkj" event={"ID":"376c46e1-d3fd-45c0-949b-65c9b9d4a5ef","Type":"ContainerStarted","Data":"cb977d72c2661c2539bc27dc2eda0850e3bbaac3ea010b740aa97f707d8bb992"}
Dec 09 10:58:41 crc kubenswrapper[5002]: I1209 10:58:41.711351 5002 generic.go:334] "Generic (PLEG): container finished" podID="376c46e1-d3fd-45c0-949b-65c9b9d4a5ef" containerID="f6bf9b8155cd422e776e261e5985616fa26762472ce0b08ec12c6a35a854fc0b" exitCode=0
Dec 09 10:58:41 crc kubenswrapper[5002]: I1209 10:58:41.711399 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hfhkj" event={"ID":"376c46e1-d3fd-45c0-949b-65c9b9d4a5ef","Type":"ContainerDied","Data":"f6bf9b8155cd422e776e261e5985616fa26762472ce0b08ec12c6a35a854fc0b"}
Dec 09 10:58:42 crc kubenswrapper[5002]: I1209 10:58:42.720907 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hfhkj" event={"ID":"376c46e1-d3fd-45c0-949b-65c9b9d4a5ef","Type":"ContainerStarted","Data":"efb9d5e26730eae5e5540f57e1043dd900e318d0eaeb73a0e3f0cba8d2ea01f4"}
Dec 09 10:58:43 crc kubenswrapper[5002]: I1209 10:58:43.729314 5002 generic.go:334] "Generic (PLEG): container finished" podID="376c46e1-d3fd-45c0-949b-65c9b9d4a5ef" containerID="efb9d5e26730eae5e5540f57e1043dd900e318d0eaeb73a0e3f0cba8d2ea01f4" exitCode=0
Dec 09 10:58:43 crc kubenswrapper[5002]: I1209 10:58:43.729364 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hfhkj" event={"ID":"376c46e1-d3fd-45c0-949b-65c9b9d4a5ef","Type":"ContainerDied","Data":"efb9d5e26730eae5e5540f57e1043dd900e318d0eaeb73a0e3f0cba8d2ea01f4"}
Dec 09 10:58:44 crc kubenswrapper[5002]: I1209 10:58:44.739486 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hfhkj" event={"ID":"376c46e1-d3fd-45c0-949b-65c9b9d4a5ef","Type":"ContainerStarted","Data":"5766a4acac9865baf74c475d64031b33a81011e33811ddd5bb65195d2539db1d"}
Dec 09 10:58:44 crc kubenswrapper[5002]: I1209 10:58:44.764995 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hfhkj" podStartSLOduration=3.351734809 podStartE2EDuration="5.764969383s" podCreationTimestamp="2025-12-09 10:58:39 +0000 UTC" firstStartedPulling="2025-12-09 10:58:41.713298875 +0000 UTC m=+3454.105349966" lastFinishedPulling="2025-12-09 10:58:44.126533439 +0000 UTC m=+3456.518584540" observedRunningTime="2025-12-09 10:58:44.76111759 +0000 UTC m=+3457.153168681" watchObservedRunningTime="2025-12-09 10:58:44.764969383 +0000 UTC m=+3457.157020464"
Dec 09 10:58:50 crc kubenswrapper[5002]: I1209 10:58:50.337313 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hfhkj"
Dec 09 10:58:50 crc kubenswrapper[5002]: I1209 10:58:50.337723 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hfhkj"
Dec 09 10:58:50 crc kubenswrapper[5002]: I1209 10:58:50.383421 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hfhkj"
Dec 09 10:58:50 crc kubenswrapper[5002]: I1209 10:58:50.827277 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hfhkj"
Dec 09 10:58:50 crc kubenswrapper[5002]: I1209 10:58:50.875081 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hfhkj"]
Dec 09 10:58:52 crc kubenswrapper[5002]: I1209 10:58:52.795722 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hfhkj" podUID="376c46e1-d3fd-45c0-949b-65c9b9d4a5ef" containerName="registry-server" containerID="cri-o://5766a4acac9865baf74c475d64031b33a81011e33811ddd5bb65195d2539db1d" gracePeriod=2
Dec 09 10:58:55 crc kubenswrapper[5002]: I1209 10:58:55.823463 5002 generic.go:334] "Generic (PLEG): container finished" podID="376c46e1-d3fd-45c0-949b-65c9b9d4a5ef" containerID="5766a4acac9865baf74c475d64031b33a81011e33811ddd5bb65195d2539db1d" exitCode=0
Dec 09 10:58:55 crc kubenswrapper[5002]: I1209 10:58:55.823547 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hfhkj" event={"ID":"376c46e1-d3fd-45c0-949b-65c9b9d4a5ef","Type":"ContainerDied","Data":"5766a4acac9865baf74c475d64031b33a81011e33811ddd5bb65195d2539db1d"}
Dec 09 10:58:57 crc kubenswrapper[5002]: I1209 10:58:57.028340 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hfhkj"
Dec 09 10:58:57 crc kubenswrapper[5002]: I1209 10:58:57.127044 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/376c46e1-d3fd-45c0-949b-65c9b9d4a5ef-utilities\") pod \"376c46e1-d3fd-45c0-949b-65c9b9d4a5ef\" (UID: \"376c46e1-d3fd-45c0-949b-65c9b9d4a5ef\") "
Dec 09 10:58:57 crc kubenswrapper[5002]: I1209 10:58:57.127112 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9d2v\" (UniqueName: \"kubernetes.io/projected/376c46e1-d3fd-45c0-949b-65c9b9d4a5ef-kube-api-access-w9d2v\") pod \"376c46e1-d3fd-45c0-949b-65c9b9d4a5ef\" (UID: \"376c46e1-d3fd-45c0-949b-65c9b9d4a5ef\") "
Dec 09 10:58:57 crc kubenswrapper[5002]: I1209 10:58:57.127166 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/376c46e1-d3fd-45c0-949b-65c9b9d4a5ef-catalog-content\") pod \"376c46e1-d3fd-45c0-949b-65c9b9d4a5ef\" (UID: \"376c46e1-d3fd-45c0-949b-65c9b9d4a5ef\") "
Dec 09 10:58:57 crc kubenswrapper[5002]: I1209 10:58:57.127974 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/376c46e1-d3fd-45c0-949b-65c9b9d4a5ef-utilities" (OuterVolumeSpecName: "utilities") pod "376c46e1-d3fd-45c0-949b-65c9b9d4a5ef" (UID: "376c46e1-d3fd-45c0-949b-65c9b9d4a5ef"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 10:58:57 crc kubenswrapper[5002]: I1209 10:58:57.132971 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/376c46e1-d3fd-45c0-949b-65c9b9d4a5ef-kube-api-access-w9d2v" (OuterVolumeSpecName: "kube-api-access-w9d2v") pod "376c46e1-d3fd-45c0-949b-65c9b9d4a5ef" (UID: "376c46e1-d3fd-45c0-949b-65c9b9d4a5ef"). InnerVolumeSpecName "kube-api-access-w9d2v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 10:58:57 crc kubenswrapper[5002]: I1209 10:58:57.228821 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9d2v\" (UniqueName: \"kubernetes.io/projected/376c46e1-d3fd-45c0-949b-65c9b9d4a5ef-kube-api-access-w9d2v\") on node \"crc\" DevicePath \"\""
Dec 09 10:58:57 crc kubenswrapper[5002]: I1209 10:58:57.228903 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/376c46e1-d3fd-45c0-949b-65c9b9d4a5ef-utilities\") on node \"crc\" DevicePath \"\""
Dec 09 10:58:57 crc kubenswrapper[5002]: I1209 10:58:57.245593 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/376c46e1-d3fd-45c0-949b-65c9b9d4a5ef-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "376c46e1-d3fd-45c0-949b-65c9b9d4a5ef" (UID: "376c46e1-d3fd-45c0-949b-65c9b9d4a5ef"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
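
Editor's note: when working with a dump like this one, it helps to split each record into its journald prefix (timestamp, host, unit) and its klog header: severity letter plus MMDD, wall-clock time, thread id, and source file:line before the closing bracket all sit in a fixed layout. A hedged sketch of such a parser, with the regex written against the lines above rather than a general journald/klog grammar:

```go
package main

import (
	"fmt"
	"regexp"
)

// klogHeader matches e.g. `I1209 10:58:57.245593 5002 operation_generator.go:803]`.
var klogHeader = regexp.MustCompile(`([IWEF])(\d{4}) (\d{2}:\d{2}:\d{2}\.\d+)\s+(\d+)\s+([\w.]+:\d+)\]`)

func main() {
	line := `Dec 09 10:58:57 crc kubenswrapper[5002]: I1209 10:58:57.245593 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded`
	if m := klogHeader.FindStringSubmatch(line); m != nil {
		fmt.Printf("severity=%s date=%s time=%s tid=%s source=%s\n", m[1], m[2], m[3], m[4], m[5])
		// severity=I date=1209 time=10:58:57.245593 tid=5002 source=operation_generator.go:803
	}
}
```
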
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:58:57 crc kubenswrapper[5002]: I1209 10:58:57.330334 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/376c46e1-d3fd-45c0-949b-65c9b9d4a5ef-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 10:58:57 crc kubenswrapper[5002]: I1209 10:58:57.842382 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hfhkj" event={"ID":"376c46e1-d3fd-45c0-949b-65c9b9d4a5ef","Type":"ContainerDied","Data":"cb977d72c2661c2539bc27dc2eda0850e3bbaac3ea010b740aa97f707d8bb992"} Dec 09 10:58:57 crc kubenswrapper[5002]: I1209 10:58:57.842434 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hfhkj" Dec 09 10:58:57 crc kubenswrapper[5002]: I1209 10:58:57.842449 5002 scope.go:117] "RemoveContainer" containerID="5766a4acac9865baf74c475d64031b33a81011e33811ddd5bb65195d2539db1d" Dec 09 10:58:57 crc kubenswrapper[5002]: I1209 10:58:57.863213 5002 scope.go:117] "RemoveContainer" containerID="efb9d5e26730eae5e5540f57e1043dd900e318d0eaeb73a0e3f0cba8d2ea01f4" Dec 09 10:58:57 crc kubenswrapper[5002]: I1209 10:58:57.878301 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hfhkj"] Dec 09 10:58:57 crc kubenswrapper[5002]: I1209 10:58:57.890916 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hfhkj"] Dec 09 10:58:57 crc kubenswrapper[5002]: I1209 10:58:57.898246 5002 scope.go:117] "RemoveContainer" containerID="f6bf9b8155cd422e776e261e5985616fa26762472ce0b08ec12c6a35a854fc0b" Dec 09 10:58:58 crc kubenswrapper[5002]: I1209 10:58:58.069164 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="376c46e1-d3fd-45c0-949b-65c9b9d4a5ef" path="/var/lib/kubelet/pods/376c46e1-d3fd-45c0-949b-65c9b9d4a5ef/volumes" Dec 09 10:59:05 crc kubenswrapper[5002]: I1209 10:59:05.293218 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dlnmc"] Dec 09 10:59:05 crc kubenswrapper[5002]: E1209 10:59:05.295910 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="376c46e1-d3fd-45c0-949b-65c9b9d4a5ef" containerName="extract-utilities" Dec 09 10:59:05 crc kubenswrapper[5002]: I1209 10:59:05.295967 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="376c46e1-d3fd-45c0-949b-65c9b9d4a5ef" containerName="extract-utilities" Dec 09 10:59:05 crc kubenswrapper[5002]: E1209 10:59:05.295992 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="376c46e1-d3fd-45c0-949b-65c9b9d4a5ef" containerName="extract-content" Dec 09 10:59:05 crc kubenswrapper[5002]: I1209 10:59:05.296005 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="376c46e1-d3fd-45c0-949b-65c9b9d4a5ef" containerName="extract-content" Dec 09 10:59:05 crc kubenswrapper[5002]: E1209 10:59:05.296049 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="376c46e1-d3fd-45c0-949b-65c9b9d4a5ef" containerName="registry-server" Dec 09 10:59:05 crc kubenswrapper[5002]: I1209 10:59:05.296059 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="376c46e1-d3fd-45c0-949b-65c9b9d4a5ef" containerName="registry-server" Dec 09 10:59:05 crc kubenswrapper[5002]: I1209 10:59:05.296222 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="376c46e1-d3fd-45c0-949b-65c9b9d4a5ef" containerName="registry-server" Dec 09 10:59:05 crc 
kubenswrapper[5002]: I1209 10:59:05.297484 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dlnmc" Dec 09 10:59:05 crc kubenswrapper[5002]: I1209 10:59:05.306207 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dlnmc"] Dec 09 10:59:05 crc kubenswrapper[5002]: I1209 10:59:05.345250 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2a1416b-c2dc-4d68-8d76-56f2a76771b7-utilities\") pod \"community-operators-dlnmc\" (UID: \"e2a1416b-c2dc-4d68-8d76-56f2a76771b7\") " pod="openshift-marketplace/community-operators-dlnmc" Dec 09 10:59:05 crc kubenswrapper[5002]: I1209 10:59:05.345327 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2a1416b-c2dc-4d68-8d76-56f2a76771b7-catalog-content\") pod \"community-operators-dlnmc\" (UID: \"e2a1416b-c2dc-4d68-8d76-56f2a76771b7\") " pod="openshift-marketplace/community-operators-dlnmc" Dec 09 10:59:05 crc kubenswrapper[5002]: I1209 10:59:05.345388 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9pnk\" (UniqueName: \"kubernetes.io/projected/e2a1416b-c2dc-4d68-8d76-56f2a76771b7-kube-api-access-g9pnk\") pod \"community-operators-dlnmc\" (UID: \"e2a1416b-c2dc-4d68-8d76-56f2a76771b7\") " pod="openshift-marketplace/community-operators-dlnmc" Dec 09 10:59:05 crc kubenswrapper[5002]: I1209 10:59:05.446865 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2a1416b-c2dc-4d68-8d76-56f2a76771b7-catalog-content\") pod \"community-operators-dlnmc\" (UID: \"e2a1416b-c2dc-4d68-8d76-56f2a76771b7\") " pod="openshift-marketplace/community-operators-dlnmc" Dec 09 10:59:05 crc kubenswrapper[5002]: I1209 10:59:05.447136 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9pnk\" (UniqueName: \"kubernetes.io/projected/e2a1416b-c2dc-4d68-8d76-56f2a76771b7-kube-api-access-g9pnk\") pod \"community-operators-dlnmc\" (UID: \"e2a1416b-c2dc-4d68-8d76-56f2a76771b7\") " pod="openshift-marketplace/community-operators-dlnmc" Dec 09 10:59:05 crc kubenswrapper[5002]: I1209 10:59:05.447325 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2a1416b-c2dc-4d68-8d76-56f2a76771b7-utilities\") pod \"community-operators-dlnmc\" (UID: \"e2a1416b-c2dc-4d68-8d76-56f2a76771b7\") " pod="openshift-marketplace/community-operators-dlnmc" Dec 09 10:59:05 crc kubenswrapper[5002]: I1209 10:59:05.447410 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2a1416b-c2dc-4d68-8d76-56f2a76771b7-catalog-content\") pod \"community-operators-dlnmc\" (UID: \"e2a1416b-c2dc-4d68-8d76-56f2a76771b7\") " pod="openshift-marketplace/community-operators-dlnmc" Dec 09 10:59:05 crc kubenswrapper[5002]: I1209 10:59:05.447687 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2a1416b-c2dc-4d68-8d76-56f2a76771b7-utilities\") pod \"community-operators-dlnmc\" (UID: \"e2a1416b-c2dc-4d68-8d76-56f2a76771b7\") " pod="openshift-marketplace/community-operators-dlnmc" Dec 
09 10:59:05 crc kubenswrapper[5002]: I1209 10:59:05.473017 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9pnk\" (UniqueName: \"kubernetes.io/projected/e2a1416b-c2dc-4d68-8d76-56f2a76771b7-kube-api-access-g9pnk\") pod \"community-operators-dlnmc\" (UID: \"e2a1416b-c2dc-4d68-8d76-56f2a76771b7\") " pod="openshift-marketplace/community-operators-dlnmc" Dec 09 10:59:05 crc kubenswrapper[5002]: I1209 10:59:05.619855 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dlnmc" Dec 09 10:59:05 crc kubenswrapper[5002]: I1209 10:59:05.889490 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rrgmt"] Dec 09 10:59:05 crc kubenswrapper[5002]: I1209 10:59:05.891703 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rrgmt" Dec 09 10:59:05 crc kubenswrapper[5002]: I1209 10:59:05.908208 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rrgmt"] Dec 09 10:59:05 crc kubenswrapper[5002]: I1209 10:59:05.958265 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/527e891a-1888-4ca3-9fc3-ba7571713fe2-utilities\") pod \"certified-operators-rrgmt\" (UID: \"527e891a-1888-4ca3-9fc3-ba7571713fe2\") " pod="openshift-marketplace/certified-operators-rrgmt" Dec 09 10:59:05 crc kubenswrapper[5002]: I1209 10:59:05.958356 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/527e891a-1888-4ca3-9fc3-ba7571713fe2-catalog-content\") pod \"certified-operators-rrgmt\" (UID: \"527e891a-1888-4ca3-9fc3-ba7571713fe2\") " pod="openshift-marketplace/certified-operators-rrgmt" Dec 09 10:59:05 crc kubenswrapper[5002]: I1209 10:59:05.958414 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmst9\" (UniqueName: \"kubernetes.io/projected/527e891a-1888-4ca3-9fc3-ba7571713fe2-kube-api-access-mmst9\") pod \"certified-operators-rrgmt\" (UID: \"527e891a-1888-4ca3-9fc3-ba7571713fe2\") " pod="openshift-marketplace/certified-operators-rrgmt" Dec 09 10:59:06 crc kubenswrapper[5002]: I1209 10:59:06.060955 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/527e891a-1888-4ca3-9fc3-ba7571713fe2-catalog-content\") pod \"certified-operators-rrgmt\" (UID: \"527e891a-1888-4ca3-9fc3-ba7571713fe2\") " pod="openshift-marketplace/certified-operators-rrgmt" Dec 09 10:59:06 crc kubenswrapper[5002]: I1209 10:59:06.061134 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmst9\" (UniqueName: \"kubernetes.io/projected/527e891a-1888-4ca3-9fc3-ba7571713fe2-kube-api-access-mmst9\") pod \"certified-operators-rrgmt\" (UID: \"527e891a-1888-4ca3-9fc3-ba7571713fe2\") " pod="openshift-marketplace/certified-operators-rrgmt" Dec 09 10:59:06 crc kubenswrapper[5002]: I1209 10:59:06.061567 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/527e891a-1888-4ca3-9fc3-ba7571713fe2-catalog-content\") pod \"certified-operators-rrgmt\" (UID: \"527e891a-1888-4ca3-9fc3-ba7571713fe2\") " 
pod="openshift-marketplace/certified-operators-rrgmt" Dec 09 10:59:06 crc kubenswrapper[5002]: I1209 10:59:06.061922 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/527e891a-1888-4ca3-9fc3-ba7571713fe2-utilities\") pod \"certified-operators-rrgmt\" (UID: \"527e891a-1888-4ca3-9fc3-ba7571713fe2\") " pod="openshift-marketplace/certified-operators-rrgmt" Dec 09 10:59:06 crc kubenswrapper[5002]: I1209 10:59:06.062508 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/527e891a-1888-4ca3-9fc3-ba7571713fe2-utilities\") pod \"certified-operators-rrgmt\" (UID: \"527e891a-1888-4ca3-9fc3-ba7571713fe2\") " pod="openshift-marketplace/certified-operators-rrgmt" Dec 09 10:59:06 crc kubenswrapper[5002]: I1209 10:59:06.078853 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmst9\" (UniqueName: \"kubernetes.io/projected/527e891a-1888-4ca3-9fc3-ba7571713fe2-kube-api-access-mmst9\") pod \"certified-operators-rrgmt\" (UID: \"527e891a-1888-4ca3-9fc3-ba7571713fe2\") " pod="openshift-marketplace/certified-operators-rrgmt" Dec 09 10:59:06 crc kubenswrapper[5002]: I1209 10:59:06.123164 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dlnmc"] Dec 09 10:59:06 crc kubenswrapper[5002]: I1209 10:59:06.213148 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rrgmt" Dec 09 10:59:06 crc kubenswrapper[5002]: I1209 10:59:06.655283 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rrgmt"] Dec 09 10:59:06 crc kubenswrapper[5002]: W1209 10:59:06.666471 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod527e891a_1888_4ca3_9fc3_ba7571713fe2.slice/crio-8bf8c93322cb1fa259e20f181dd0ba3e7d4da0ca1b572947643b6c0c8be798f3 WatchSource:0}: Error finding container 8bf8c93322cb1fa259e20f181dd0ba3e7d4da0ca1b572947643b6c0c8be798f3: Status 404 returned error can't find the container with id 8bf8c93322cb1fa259e20f181dd0ba3e7d4da0ca1b572947643b6c0c8be798f3 Dec 09 10:59:06 crc kubenswrapper[5002]: I1209 10:59:06.936236 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dlnmc" event={"ID":"e2a1416b-c2dc-4d68-8d76-56f2a76771b7","Type":"ContainerStarted","Data":"e979cadd34523f55b3769af82d4c0b9054b983baf8e39caf69777909dce6f661"} Dec 09 10:59:06 crc kubenswrapper[5002]: I1209 10:59:06.937672 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rrgmt" event={"ID":"527e891a-1888-4ca3-9fc3-ba7571713fe2","Type":"ContainerStarted","Data":"8bf8c93322cb1fa259e20f181dd0ba3e7d4da0ca1b572947643b6c0c8be798f3"} Dec 09 10:59:07 crc kubenswrapper[5002]: I1209 10:59:07.947185 5002 generic.go:334] "Generic (PLEG): container finished" podID="527e891a-1888-4ca3-9fc3-ba7571713fe2" containerID="e9a46d7539eb73f4900a00cf817221d1dba89a5c2543e1564e8ba94530444fec" exitCode=0 Dec 09 10:59:07 crc kubenswrapper[5002]: I1209 10:59:07.947280 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rrgmt" event={"ID":"527e891a-1888-4ca3-9fc3-ba7571713fe2","Type":"ContainerDied","Data":"e9a46d7539eb73f4900a00cf817221d1dba89a5c2543e1564e8ba94530444fec"} Dec 09 10:59:07 crc 
kubenswrapper[5002]: I1209 10:59:07.950453 5002 generic.go:334] "Generic (PLEG): container finished" podID="e2a1416b-c2dc-4d68-8d76-56f2a76771b7" containerID="8251082640ed7c8e7de1b9a33e81d33d800462ba091fee8a7d8b130fc211d08e" exitCode=0 Dec 09 10:59:07 crc kubenswrapper[5002]: I1209 10:59:07.950515 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dlnmc" event={"ID":"e2a1416b-c2dc-4d68-8d76-56f2a76771b7","Type":"ContainerDied","Data":"8251082640ed7c8e7de1b9a33e81d33d800462ba091fee8a7d8b130fc211d08e"} Dec 09 10:59:09 crc kubenswrapper[5002]: I1209 10:59:09.970899 5002 generic.go:334] "Generic (PLEG): container finished" podID="527e891a-1888-4ca3-9fc3-ba7571713fe2" containerID="a784972951eb448ac405692ef70616589bda94f8156634b8bd48063f150012ec" exitCode=0 Dec 09 10:59:09 crc kubenswrapper[5002]: I1209 10:59:09.970973 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rrgmt" event={"ID":"527e891a-1888-4ca3-9fc3-ba7571713fe2","Type":"ContainerDied","Data":"a784972951eb448ac405692ef70616589bda94f8156634b8bd48063f150012ec"} Dec 09 10:59:09 crc kubenswrapper[5002]: I1209 10:59:09.975064 5002 generic.go:334] "Generic (PLEG): container finished" podID="e2a1416b-c2dc-4d68-8d76-56f2a76771b7" containerID="ba5b8edd2f93a52da6e4d587311c040dfc5f2b0a911a3a6f332fa936e9837d1d" exitCode=0 Dec 09 10:59:09 crc kubenswrapper[5002]: I1209 10:59:09.975145 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dlnmc" event={"ID":"e2a1416b-c2dc-4d68-8d76-56f2a76771b7","Type":"ContainerDied","Data":"ba5b8edd2f93a52da6e4d587311c040dfc5f2b0a911a3a6f332fa936e9837d1d"} Dec 09 10:59:10 crc kubenswrapper[5002]: I1209 10:59:10.985015 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rrgmt" event={"ID":"527e891a-1888-4ca3-9fc3-ba7571713fe2","Type":"ContainerStarted","Data":"6f145e38ace5c0f304a128e55e4d654d01e39061f6fcc2dc723be250ec1c5fb8"} Dec 09 10:59:10 crc kubenswrapper[5002]: I1209 10:59:10.986891 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dlnmc" event={"ID":"e2a1416b-c2dc-4d68-8d76-56f2a76771b7","Type":"ContainerStarted","Data":"193df1513ea94b0ab6d20adb86dca3ed2cf02974032d0ef96c367821facc09c0"} Dec 09 10:59:11 crc kubenswrapper[5002]: I1209 10:59:11.009167 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rrgmt" podStartSLOduration=3.336773943 podStartE2EDuration="6.009150508s" podCreationTimestamp="2025-12-09 10:59:05 +0000 UTC" firstStartedPulling="2025-12-09 10:59:07.949483497 +0000 UTC m=+3480.341534588" lastFinishedPulling="2025-12-09 10:59:10.621860052 +0000 UTC m=+3483.013911153" observedRunningTime="2025-12-09 10:59:11.007041632 +0000 UTC m=+3483.399092723" watchObservedRunningTime="2025-12-09 10:59:11.009150508 +0000 UTC m=+3483.401201599" Dec 09 10:59:11 crc kubenswrapper[5002]: I1209 10:59:11.027777 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dlnmc" podStartSLOduration=3.5022103060000003 podStartE2EDuration="6.027757457s" podCreationTimestamp="2025-12-09 10:59:05 +0000 UTC" firstStartedPulling="2025-12-09 10:59:07.951787519 +0000 UTC m=+3480.343838620" lastFinishedPulling="2025-12-09 10:59:10.47733467 +0000 UTC m=+3482.869385771" observedRunningTime="2025-12-09 10:59:11.022581828 +0000 UTC m=+3483.414632919" 
watchObservedRunningTime="2025-12-09 10:59:11.027757457 +0000 UTC m=+3483.419808538" Dec 09 10:59:15 crc kubenswrapper[5002]: I1209 10:59:15.620645 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dlnmc" Dec 09 10:59:15 crc kubenswrapper[5002]: I1209 10:59:15.621073 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dlnmc" Dec 09 10:59:15 crc kubenswrapper[5002]: I1209 10:59:15.662859 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dlnmc" Dec 09 10:59:16 crc kubenswrapper[5002]: I1209 10:59:16.069205 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dlnmc" Dec 09 10:59:16 crc kubenswrapper[5002]: I1209 10:59:16.113220 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dlnmc"] Dec 09 10:59:16 crc kubenswrapper[5002]: I1209 10:59:16.213418 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rrgmt" Dec 09 10:59:16 crc kubenswrapper[5002]: I1209 10:59:16.213520 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rrgmt" Dec 09 10:59:16 crc kubenswrapper[5002]: I1209 10:59:16.273695 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rrgmt" Dec 09 10:59:17 crc kubenswrapper[5002]: I1209 10:59:17.073234 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rrgmt" Dec 09 10:59:18 crc kubenswrapper[5002]: I1209 10:59:18.039549 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dlnmc" podUID="e2a1416b-c2dc-4d68-8d76-56f2a76771b7" containerName="registry-server" containerID="cri-o://193df1513ea94b0ab6d20adb86dca3ed2cf02974032d0ef96c367821facc09c0" gracePeriod=2 Dec 09 10:59:18 crc kubenswrapper[5002]: I1209 10:59:18.443731 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dlnmc" Dec 09 10:59:18 crc kubenswrapper[5002]: I1209 10:59:18.640410 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9pnk\" (UniqueName: \"kubernetes.io/projected/e2a1416b-c2dc-4d68-8d76-56f2a76771b7-kube-api-access-g9pnk\") pod \"e2a1416b-c2dc-4d68-8d76-56f2a76771b7\" (UID: \"e2a1416b-c2dc-4d68-8d76-56f2a76771b7\") " Dec 09 10:59:18 crc kubenswrapper[5002]: I1209 10:59:18.640508 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2a1416b-c2dc-4d68-8d76-56f2a76771b7-utilities\") pod \"e2a1416b-c2dc-4d68-8d76-56f2a76771b7\" (UID: \"e2a1416b-c2dc-4d68-8d76-56f2a76771b7\") " Dec 09 10:59:18 crc kubenswrapper[5002]: I1209 10:59:18.640543 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2a1416b-c2dc-4d68-8d76-56f2a76771b7-catalog-content\") pod \"e2a1416b-c2dc-4d68-8d76-56f2a76771b7\" (UID: \"e2a1416b-c2dc-4d68-8d76-56f2a76771b7\") " Dec 09 10:59:18 crc kubenswrapper[5002]: I1209 10:59:18.642051 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2a1416b-c2dc-4d68-8d76-56f2a76771b7-utilities" (OuterVolumeSpecName: "utilities") pod "e2a1416b-c2dc-4d68-8d76-56f2a76771b7" (UID: "e2a1416b-c2dc-4d68-8d76-56f2a76771b7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:59:18 crc kubenswrapper[5002]: I1209 10:59:18.648088 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2a1416b-c2dc-4d68-8d76-56f2a76771b7-kube-api-access-g9pnk" (OuterVolumeSpecName: "kube-api-access-g9pnk") pod "e2a1416b-c2dc-4d68-8d76-56f2a76771b7" (UID: "e2a1416b-c2dc-4d68-8d76-56f2a76771b7"). InnerVolumeSpecName "kube-api-access-g9pnk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:59:18 crc kubenswrapper[5002]: I1209 10:59:18.698605 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rrgmt"] Dec 09 10:59:18 crc kubenswrapper[5002]: I1209 10:59:18.701305 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2a1416b-c2dc-4d68-8d76-56f2a76771b7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e2a1416b-c2dc-4d68-8d76-56f2a76771b7" (UID: "e2a1416b-c2dc-4d68-8d76-56f2a76771b7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:59:18 crc kubenswrapper[5002]: I1209 10:59:18.741727 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2a1416b-c2dc-4d68-8d76-56f2a76771b7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 10:59:18 crc kubenswrapper[5002]: I1209 10:59:18.741768 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9pnk\" (UniqueName: \"kubernetes.io/projected/e2a1416b-c2dc-4d68-8d76-56f2a76771b7-kube-api-access-g9pnk\") on node \"crc\" DevicePath \"\"" Dec 09 10:59:18 crc kubenswrapper[5002]: I1209 10:59:18.741784 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2a1416b-c2dc-4d68-8d76-56f2a76771b7-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 10:59:19 crc kubenswrapper[5002]: I1209 10:59:19.050518 5002 generic.go:334] "Generic (PLEG): container finished" podID="e2a1416b-c2dc-4d68-8d76-56f2a76771b7" containerID="193df1513ea94b0ab6d20adb86dca3ed2cf02974032d0ef96c367821facc09c0" exitCode=0 Dec 09 10:59:19 crc kubenswrapper[5002]: I1209 10:59:19.050602 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dlnmc" event={"ID":"e2a1416b-c2dc-4d68-8d76-56f2a76771b7","Type":"ContainerDied","Data":"193df1513ea94b0ab6d20adb86dca3ed2cf02974032d0ef96c367821facc09c0"} Dec 09 10:59:19 crc kubenswrapper[5002]: I1209 10:59:19.050677 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dlnmc" event={"ID":"e2a1416b-c2dc-4d68-8d76-56f2a76771b7","Type":"ContainerDied","Data":"e979cadd34523f55b3769af82d4c0b9054b983baf8e39caf69777909dce6f661"} Dec 09 10:59:19 crc kubenswrapper[5002]: I1209 10:59:19.050895 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rrgmt" podUID="527e891a-1888-4ca3-9fc3-ba7571713fe2" containerName="registry-server" containerID="cri-o://6f145e38ace5c0f304a128e55e4d654d01e39061f6fcc2dc723be250ec1c5fb8" gracePeriod=2 Dec 09 10:59:19 crc kubenswrapper[5002]: I1209 10:59:19.050915 5002 scope.go:117] "RemoveContainer" containerID="193df1513ea94b0ab6d20adb86dca3ed2cf02974032d0ef96c367821facc09c0" Dec 09 10:59:19 crc kubenswrapper[5002]: I1209 10:59:19.050627 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dlnmc" Dec 09 10:59:19 crc kubenswrapper[5002]: I1209 10:59:19.083471 5002 scope.go:117] "RemoveContainer" containerID="ba5b8edd2f93a52da6e4d587311c040dfc5f2b0a911a3a6f332fa936e9837d1d" Dec 09 10:59:19 crc kubenswrapper[5002]: I1209 10:59:19.086133 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dlnmc"] Dec 09 10:59:19 crc kubenswrapper[5002]: I1209 10:59:19.092775 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dlnmc"] Dec 09 10:59:19 crc kubenswrapper[5002]: I1209 10:59:19.104224 5002 scope.go:117] "RemoveContainer" containerID="8251082640ed7c8e7de1b9a33e81d33d800462ba091fee8a7d8b130fc211d08e" Dec 09 10:59:19 crc kubenswrapper[5002]: I1209 10:59:19.220481 5002 scope.go:117] "RemoveContainer" containerID="193df1513ea94b0ab6d20adb86dca3ed2cf02974032d0ef96c367821facc09c0" Dec 09 10:59:19 crc kubenswrapper[5002]: E1209 10:59:19.221803 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"193df1513ea94b0ab6d20adb86dca3ed2cf02974032d0ef96c367821facc09c0\": container with ID starting with 193df1513ea94b0ab6d20adb86dca3ed2cf02974032d0ef96c367821facc09c0 not found: ID does not exist" containerID="193df1513ea94b0ab6d20adb86dca3ed2cf02974032d0ef96c367821facc09c0" Dec 09 10:59:19 crc kubenswrapper[5002]: I1209 10:59:19.221943 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"193df1513ea94b0ab6d20adb86dca3ed2cf02974032d0ef96c367821facc09c0"} err="failed to get container status \"193df1513ea94b0ab6d20adb86dca3ed2cf02974032d0ef96c367821facc09c0\": rpc error: code = NotFound desc = could not find container \"193df1513ea94b0ab6d20adb86dca3ed2cf02974032d0ef96c367821facc09c0\": container with ID starting with 193df1513ea94b0ab6d20adb86dca3ed2cf02974032d0ef96c367821facc09c0 not found: ID does not exist" Dec 09 10:59:19 crc kubenswrapper[5002]: I1209 10:59:19.222009 5002 scope.go:117] "RemoveContainer" containerID="ba5b8edd2f93a52da6e4d587311c040dfc5f2b0a911a3a6f332fa936e9837d1d" Dec 09 10:59:19 crc kubenswrapper[5002]: E1209 10:59:19.222538 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba5b8edd2f93a52da6e4d587311c040dfc5f2b0a911a3a6f332fa936e9837d1d\": container with ID starting with ba5b8edd2f93a52da6e4d587311c040dfc5f2b0a911a3a6f332fa936e9837d1d not found: ID does not exist" containerID="ba5b8edd2f93a52da6e4d587311c040dfc5f2b0a911a3a6f332fa936e9837d1d" Dec 09 10:59:19 crc kubenswrapper[5002]: I1209 10:59:19.222577 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba5b8edd2f93a52da6e4d587311c040dfc5f2b0a911a3a6f332fa936e9837d1d"} err="failed to get container status \"ba5b8edd2f93a52da6e4d587311c040dfc5f2b0a911a3a6f332fa936e9837d1d\": rpc error: code = NotFound desc = could not find container \"ba5b8edd2f93a52da6e4d587311c040dfc5f2b0a911a3a6f332fa936e9837d1d\": container with ID starting with ba5b8edd2f93a52da6e4d587311c040dfc5f2b0a911a3a6f332fa936e9837d1d not found: ID does not exist" Dec 09 10:59:19 crc kubenswrapper[5002]: I1209 10:59:19.222608 5002 scope.go:117] "RemoveContainer" containerID="8251082640ed7c8e7de1b9a33e81d33d800462ba091fee8a7d8b130fc211d08e" Dec 09 10:59:19 crc kubenswrapper[5002]: E1209 10:59:19.223200 5002 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"8251082640ed7c8e7de1b9a33e81d33d800462ba091fee8a7d8b130fc211d08e\": container with ID starting with 8251082640ed7c8e7de1b9a33e81d33d800462ba091fee8a7d8b130fc211d08e not found: ID does not exist" containerID="8251082640ed7c8e7de1b9a33e81d33d800462ba091fee8a7d8b130fc211d08e" Dec 09 10:59:19 crc kubenswrapper[5002]: I1209 10:59:19.223297 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8251082640ed7c8e7de1b9a33e81d33d800462ba091fee8a7d8b130fc211d08e"} err="failed to get container status \"8251082640ed7c8e7de1b9a33e81d33d800462ba091fee8a7d8b130fc211d08e\": rpc error: code = NotFound desc = could not find container \"8251082640ed7c8e7de1b9a33e81d33d800462ba091fee8a7d8b130fc211d08e\": container with ID starting with 8251082640ed7c8e7de1b9a33e81d33d800462ba091fee8a7d8b130fc211d08e not found: ID does not exist" Dec 09 10:59:19 crc kubenswrapper[5002]: I1209 10:59:19.466585 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rrgmt" Dec 09 10:59:19 crc kubenswrapper[5002]: I1209 10:59:19.552208 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/527e891a-1888-4ca3-9fc3-ba7571713fe2-utilities\") pod \"527e891a-1888-4ca3-9fc3-ba7571713fe2\" (UID: \"527e891a-1888-4ca3-9fc3-ba7571713fe2\") " Dec 09 10:59:19 crc kubenswrapper[5002]: I1209 10:59:19.552294 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmst9\" (UniqueName: \"kubernetes.io/projected/527e891a-1888-4ca3-9fc3-ba7571713fe2-kube-api-access-mmst9\") pod \"527e891a-1888-4ca3-9fc3-ba7571713fe2\" (UID: \"527e891a-1888-4ca3-9fc3-ba7571713fe2\") " Dec 09 10:59:19 crc kubenswrapper[5002]: I1209 10:59:19.552320 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/527e891a-1888-4ca3-9fc3-ba7571713fe2-catalog-content\") pod \"527e891a-1888-4ca3-9fc3-ba7571713fe2\" (UID: \"527e891a-1888-4ca3-9fc3-ba7571713fe2\") " Dec 09 10:59:19 crc kubenswrapper[5002]: I1209 10:59:19.554011 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/527e891a-1888-4ca3-9fc3-ba7571713fe2-utilities" (OuterVolumeSpecName: "utilities") pod "527e891a-1888-4ca3-9fc3-ba7571713fe2" (UID: "527e891a-1888-4ca3-9fc3-ba7571713fe2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:59:19 crc kubenswrapper[5002]: I1209 10:59:19.559026 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/527e891a-1888-4ca3-9fc3-ba7571713fe2-kube-api-access-mmst9" (OuterVolumeSpecName: "kube-api-access-mmst9") pod "527e891a-1888-4ca3-9fc3-ba7571713fe2" (UID: "527e891a-1888-4ca3-9fc3-ba7571713fe2"). InnerVolumeSpecName "kube-api-access-mmst9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 10:59:19 crc kubenswrapper[5002]: I1209 10:59:19.653903 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/527e891a-1888-4ca3-9fc3-ba7571713fe2-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 10:59:19 crc kubenswrapper[5002]: I1209 10:59:19.653958 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmst9\" (UniqueName: \"kubernetes.io/projected/527e891a-1888-4ca3-9fc3-ba7571713fe2-kube-api-access-mmst9\") on node \"crc\" DevicePath \"\"" Dec 09 10:59:19 crc kubenswrapper[5002]: I1209 10:59:19.915123 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/527e891a-1888-4ca3-9fc3-ba7571713fe2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "527e891a-1888-4ca3-9fc3-ba7571713fe2" (UID: "527e891a-1888-4ca3-9fc3-ba7571713fe2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 10:59:19 crc kubenswrapper[5002]: I1209 10:59:19.957780 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/527e891a-1888-4ca3-9fc3-ba7571713fe2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 10:59:20 crc kubenswrapper[5002]: I1209 10:59:20.060276 5002 generic.go:334] "Generic (PLEG): container finished" podID="527e891a-1888-4ca3-9fc3-ba7571713fe2" containerID="6f145e38ace5c0f304a128e55e4d654d01e39061f6fcc2dc723be250ec1c5fb8" exitCode=0 Dec 09 10:59:20 crc kubenswrapper[5002]: I1209 10:59:20.060350 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rrgmt" Dec 09 10:59:20 crc kubenswrapper[5002]: I1209 10:59:20.073416 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2a1416b-c2dc-4d68-8d76-56f2a76771b7" path="/var/lib/kubelet/pods/e2a1416b-c2dc-4d68-8d76-56f2a76771b7/volumes" Dec 09 10:59:20 crc kubenswrapper[5002]: I1209 10:59:20.075542 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rrgmt" event={"ID":"527e891a-1888-4ca3-9fc3-ba7571713fe2","Type":"ContainerDied","Data":"6f145e38ace5c0f304a128e55e4d654d01e39061f6fcc2dc723be250ec1c5fb8"} Dec 09 10:59:20 crc kubenswrapper[5002]: I1209 10:59:20.075589 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rrgmt" event={"ID":"527e891a-1888-4ca3-9fc3-ba7571713fe2","Type":"ContainerDied","Data":"8bf8c93322cb1fa259e20f181dd0ba3e7d4da0ca1b572947643b6c0c8be798f3"} Dec 09 10:59:20 crc kubenswrapper[5002]: I1209 10:59:20.075619 5002 scope.go:117] "RemoveContainer" containerID="6f145e38ace5c0f304a128e55e4d654d01e39061f6fcc2dc723be250ec1c5fb8" Dec 09 10:59:20 crc kubenswrapper[5002]: I1209 10:59:20.098479 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rrgmt"] Dec 09 10:59:20 crc kubenswrapper[5002]: I1209 10:59:20.108102 5002 scope.go:117] "RemoveContainer" containerID="a784972951eb448ac405692ef70616589bda94f8156634b8bd48063f150012ec" Dec 09 10:59:20 crc kubenswrapper[5002]: I1209 10:59:20.110376 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rrgmt"] Dec 09 10:59:20 crc kubenswrapper[5002]: I1209 10:59:20.128523 5002 scope.go:117] "RemoveContainer" containerID="e9a46d7539eb73f4900a00cf817221d1dba89a5c2543e1564e8ba94530444fec" Dec 09 
10:59:20 crc kubenswrapper[5002]: I1209 10:59:20.143527 5002 scope.go:117] "RemoveContainer" containerID="6f145e38ace5c0f304a128e55e4d654d01e39061f6fcc2dc723be250ec1c5fb8" Dec 09 10:59:20 crc kubenswrapper[5002]: E1209 10:59:20.144041 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f145e38ace5c0f304a128e55e4d654d01e39061f6fcc2dc723be250ec1c5fb8\": container with ID starting with 6f145e38ace5c0f304a128e55e4d654d01e39061f6fcc2dc723be250ec1c5fb8 not found: ID does not exist" containerID="6f145e38ace5c0f304a128e55e4d654d01e39061f6fcc2dc723be250ec1c5fb8" Dec 09 10:59:20 crc kubenswrapper[5002]: I1209 10:59:20.144226 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f145e38ace5c0f304a128e55e4d654d01e39061f6fcc2dc723be250ec1c5fb8"} err="failed to get container status \"6f145e38ace5c0f304a128e55e4d654d01e39061f6fcc2dc723be250ec1c5fb8\": rpc error: code = NotFound desc = could not find container \"6f145e38ace5c0f304a128e55e4d654d01e39061f6fcc2dc723be250ec1c5fb8\": container with ID starting with 6f145e38ace5c0f304a128e55e4d654d01e39061f6fcc2dc723be250ec1c5fb8 not found: ID does not exist" Dec 09 10:59:20 crc kubenswrapper[5002]: I1209 10:59:20.144344 5002 scope.go:117] "RemoveContainer" containerID="a784972951eb448ac405692ef70616589bda94f8156634b8bd48063f150012ec" Dec 09 10:59:20 crc kubenswrapper[5002]: E1209 10:59:20.144777 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a784972951eb448ac405692ef70616589bda94f8156634b8bd48063f150012ec\": container with ID starting with a784972951eb448ac405692ef70616589bda94f8156634b8bd48063f150012ec not found: ID does not exist" containerID="a784972951eb448ac405692ef70616589bda94f8156634b8bd48063f150012ec" Dec 09 10:59:20 crc kubenswrapper[5002]: I1209 10:59:20.144821 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a784972951eb448ac405692ef70616589bda94f8156634b8bd48063f150012ec"} err="failed to get container status \"a784972951eb448ac405692ef70616589bda94f8156634b8bd48063f150012ec\": rpc error: code = NotFound desc = could not find container \"a784972951eb448ac405692ef70616589bda94f8156634b8bd48063f150012ec\": container with ID starting with a784972951eb448ac405692ef70616589bda94f8156634b8bd48063f150012ec not found: ID does not exist" Dec 09 10:59:20 crc kubenswrapper[5002]: I1209 10:59:20.144849 5002 scope.go:117] "RemoveContainer" containerID="e9a46d7539eb73f4900a00cf817221d1dba89a5c2543e1564e8ba94530444fec" Dec 09 10:59:20 crc kubenswrapper[5002]: E1209 10:59:20.145178 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9a46d7539eb73f4900a00cf817221d1dba89a5c2543e1564e8ba94530444fec\": container with ID starting with e9a46d7539eb73f4900a00cf817221d1dba89a5c2543e1564e8ba94530444fec not found: ID does not exist" containerID="e9a46d7539eb73f4900a00cf817221d1dba89a5c2543e1564e8ba94530444fec" Dec 09 10:59:20 crc kubenswrapper[5002]: I1209 10:59:20.145206 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9a46d7539eb73f4900a00cf817221d1dba89a5c2543e1564e8ba94530444fec"} err="failed to get container status \"e9a46d7539eb73f4900a00cf817221d1dba89a5c2543e1564e8ba94530444fec\": rpc error: code = NotFound desc = could not find container 
\"e9a46d7539eb73f4900a00cf817221d1dba89a5c2543e1564e8ba94530444fec\": container with ID starting with e9a46d7539eb73f4900a00cf817221d1dba89a5c2543e1564e8ba94530444fec not found: ID does not exist" Dec 09 10:59:22 crc kubenswrapper[5002]: I1209 10:59:22.070305 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="527e891a-1888-4ca3-9fc3-ba7571713fe2" path="/var/lib/kubelet/pods/527e891a-1888-4ca3-9fc3-ba7571713fe2/volumes" Dec 09 11:00:00 crc kubenswrapper[5002]: I1209 11:00:00.145658 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421300-mvt9r"] Dec 09 11:00:00 crc kubenswrapper[5002]: E1209 11:00:00.146799 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="527e891a-1888-4ca3-9fc3-ba7571713fe2" containerName="registry-server" Dec 09 11:00:00 crc kubenswrapper[5002]: I1209 11:00:00.146853 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="527e891a-1888-4ca3-9fc3-ba7571713fe2" containerName="registry-server" Dec 09 11:00:00 crc kubenswrapper[5002]: E1209 11:00:00.146869 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2a1416b-c2dc-4d68-8d76-56f2a76771b7" containerName="extract-utilities" Dec 09 11:00:00 crc kubenswrapper[5002]: I1209 11:00:00.146877 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2a1416b-c2dc-4d68-8d76-56f2a76771b7" containerName="extract-utilities" Dec 09 11:00:00 crc kubenswrapper[5002]: E1209 11:00:00.146907 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="527e891a-1888-4ca3-9fc3-ba7571713fe2" containerName="extract-content" Dec 09 11:00:00 crc kubenswrapper[5002]: I1209 11:00:00.146916 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="527e891a-1888-4ca3-9fc3-ba7571713fe2" containerName="extract-content" Dec 09 11:00:00 crc kubenswrapper[5002]: E1209 11:00:00.146925 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2a1416b-c2dc-4d68-8d76-56f2a76771b7" containerName="registry-server" Dec 09 11:00:00 crc kubenswrapper[5002]: I1209 11:00:00.146932 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2a1416b-c2dc-4d68-8d76-56f2a76771b7" containerName="registry-server" Dec 09 11:00:00 crc kubenswrapper[5002]: E1209 11:00:00.146947 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2a1416b-c2dc-4d68-8d76-56f2a76771b7" containerName="extract-content" Dec 09 11:00:00 crc kubenswrapper[5002]: I1209 11:00:00.146954 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2a1416b-c2dc-4d68-8d76-56f2a76771b7" containerName="extract-content" Dec 09 11:00:00 crc kubenswrapper[5002]: E1209 11:00:00.146969 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="527e891a-1888-4ca3-9fc3-ba7571713fe2" containerName="extract-utilities" Dec 09 11:00:00 crc kubenswrapper[5002]: I1209 11:00:00.146976 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="527e891a-1888-4ca3-9fc3-ba7571713fe2" containerName="extract-utilities" Dec 09 11:00:00 crc kubenswrapper[5002]: I1209 11:00:00.147160 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="527e891a-1888-4ca3-9fc3-ba7571713fe2" containerName="registry-server" Dec 09 11:00:00 crc kubenswrapper[5002]: I1209 11:00:00.147176 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2a1416b-c2dc-4d68-8d76-56f2a76771b7" containerName="registry-server" Dec 09 11:00:00 crc kubenswrapper[5002]: I1209 11:00:00.147780 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421300-mvt9r" Dec 09 11:00:00 crc kubenswrapper[5002]: I1209 11:00:00.152102 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 09 11:00:00 crc kubenswrapper[5002]: I1209 11:00:00.155406 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 09 11:00:00 crc kubenswrapper[5002]: I1209 11:00:00.175051 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421300-mvt9r"] Dec 09 11:00:00 crc kubenswrapper[5002]: I1209 11:00:00.209279 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz9rt\" (UniqueName: \"kubernetes.io/projected/cf7f5a20-7e58-4d15-ad78-e9b07d787bd8-kube-api-access-hz9rt\") pod \"collect-profiles-29421300-mvt9r\" (UID: \"cf7f5a20-7e58-4d15-ad78-e9b07d787bd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421300-mvt9r" Dec 09 11:00:00 crc kubenswrapper[5002]: I1209 11:00:00.209349 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf7f5a20-7e58-4d15-ad78-e9b07d787bd8-config-volume\") pod \"collect-profiles-29421300-mvt9r\" (UID: \"cf7f5a20-7e58-4d15-ad78-e9b07d787bd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421300-mvt9r" Dec 09 11:00:00 crc kubenswrapper[5002]: I1209 11:00:00.209365 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cf7f5a20-7e58-4d15-ad78-e9b07d787bd8-secret-volume\") pod \"collect-profiles-29421300-mvt9r\" (UID: \"cf7f5a20-7e58-4d15-ad78-e9b07d787bd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421300-mvt9r" Dec 09 11:00:00 crc kubenswrapper[5002]: I1209 11:00:00.310568 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz9rt\" (UniqueName: \"kubernetes.io/projected/cf7f5a20-7e58-4d15-ad78-e9b07d787bd8-kube-api-access-hz9rt\") pod \"collect-profiles-29421300-mvt9r\" (UID: \"cf7f5a20-7e58-4d15-ad78-e9b07d787bd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421300-mvt9r" Dec 09 11:00:00 crc kubenswrapper[5002]: I1209 11:00:00.310699 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf7f5a20-7e58-4d15-ad78-e9b07d787bd8-config-volume\") pod \"collect-profiles-29421300-mvt9r\" (UID: \"cf7f5a20-7e58-4d15-ad78-e9b07d787bd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421300-mvt9r" Dec 09 11:00:00 crc kubenswrapper[5002]: I1209 11:00:00.310735 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cf7f5a20-7e58-4d15-ad78-e9b07d787bd8-secret-volume\") pod \"collect-profiles-29421300-mvt9r\" (UID: \"cf7f5a20-7e58-4d15-ad78-e9b07d787bd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421300-mvt9r" Dec 09 11:00:00 crc kubenswrapper[5002]: I1209 11:00:00.311687 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf7f5a20-7e58-4d15-ad78-e9b07d787bd8-config-volume\") pod 
\"collect-profiles-29421300-mvt9r\" (UID: \"cf7f5a20-7e58-4d15-ad78-e9b07d787bd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421300-mvt9r" Dec 09 11:00:00 crc kubenswrapper[5002]: I1209 11:00:00.317922 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cf7f5a20-7e58-4d15-ad78-e9b07d787bd8-secret-volume\") pod \"collect-profiles-29421300-mvt9r\" (UID: \"cf7f5a20-7e58-4d15-ad78-e9b07d787bd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421300-mvt9r" Dec 09 11:00:00 crc kubenswrapper[5002]: I1209 11:00:00.334023 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz9rt\" (UniqueName: \"kubernetes.io/projected/cf7f5a20-7e58-4d15-ad78-e9b07d787bd8-kube-api-access-hz9rt\") pod \"collect-profiles-29421300-mvt9r\" (UID: \"cf7f5a20-7e58-4d15-ad78-e9b07d787bd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421300-mvt9r" Dec 09 11:00:00 crc kubenswrapper[5002]: I1209 11:00:00.475717 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421300-mvt9r" Dec 09 11:00:00 crc kubenswrapper[5002]: I1209 11:00:00.901695 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421300-mvt9r"] Dec 09 11:00:01 crc kubenswrapper[5002]: I1209 11:00:01.400608 5002 generic.go:334] "Generic (PLEG): container finished" podID="cf7f5a20-7e58-4d15-ad78-e9b07d787bd8" containerID="5da5993d7769a91dbc972d90a4db88c2c61901167fc3a3ff4fca524165e1363f" exitCode=0 Dec 09 11:00:01 crc kubenswrapper[5002]: I1209 11:00:01.400673 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421300-mvt9r" event={"ID":"cf7f5a20-7e58-4d15-ad78-e9b07d787bd8","Type":"ContainerDied","Data":"5da5993d7769a91dbc972d90a4db88c2c61901167fc3a3ff4fca524165e1363f"} Dec 09 11:00:01 crc kubenswrapper[5002]: I1209 11:00:01.402068 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421300-mvt9r" event={"ID":"cf7f5a20-7e58-4d15-ad78-e9b07d787bd8","Type":"ContainerStarted","Data":"cb08b907c04eacb05ac368654f791c3c8eaa34459253d9c8035c38fec019edcf"} Dec 09 11:00:02 crc kubenswrapper[5002]: I1209 11:00:02.774183 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421300-mvt9r" Dec 09 11:00:02 crc kubenswrapper[5002]: I1209 11:00:02.947555 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf7f5a20-7e58-4d15-ad78-e9b07d787bd8-config-volume\") pod \"cf7f5a20-7e58-4d15-ad78-e9b07d787bd8\" (UID: \"cf7f5a20-7e58-4d15-ad78-e9b07d787bd8\") " Dec 09 11:00:02 crc kubenswrapper[5002]: I1209 11:00:02.947633 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cf7f5a20-7e58-4d15-ad78-e9b07d787bd8-secret-volume\") pod \"cf7f5a20-7e58-4d15-ad78-e9b07d787bd8\" (UID: \"cf7f5a20-7e58-4d15-ad78-e9b07d787bd8\") " Dec 09 11:00:02 crc kubenswrapper[5002]: I1209 11:00:02.947673 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hz9rt\" (UniqueName: \"kubernetes.io/projected/cf7f5a20-7e58-4d15-ad78-e9b07d787bd8-kube-api-access-hz9rt\") pod \"cf7f5a20-7e58-4d15-ad78-e9b07d787bd8\" (UID: \"cf7f5a20-7e58-4d15-ad78-e9b07d787bd8\") " Dec 09 11:00:02 crc kubenswrapper[5002]: I1209 11:00:02.948786 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf7f5a20-7e58-4d15-ad78-e9b07d787bd8-config-volume" (OuterVolumeSpecName: "config-volume") pod "cf7f5a20-7e58-4d15-ad78-e9b07d787bd8" (UID: "cf7f5a20-7e58-4d15-ad78-e9b07d787bd8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:00:02 crc kubenswrapper[5002]: I1209 11:00:02.954037 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf7f5a20-7e58-4d15-ad78-e9b07d787bd8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "cf7f5a20-7e58-4d15-ad78-e9b07d787bd8" (UID: "cf7f5a20-7e58-4d15-ad78-e9b07d787bd8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:00:02 crc kubenswrapper[5002]: I1209 11:00:02.954562 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf7f5a20-7e58-4d15-ad78-e9b07d787bd8-kube-api-access-hz9rt" (OuterVolumeSpecName: "kube-api-access-hz9rt") pod "cf7f5a20-7e58-4d15-ad78-e9b07d787bd8" (UID: "cf7f5a20-7e58-4d15-ad78-e9b07d787bd8"). InnerVolumeSpecName "kube-api-access-hz9rt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:00:03 crc kubenswrapper[5002]: I1209 11:00:03.049575 5002 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cf7f5a20-7e58-4d15-ad78-e9b07d787bd8-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 09 11:00:03 crc kubenswrapper[5002]: I1209 11:00:03.049631 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hz9rt\" (UniqueName: \"kubernetes.io/projected/cf7f5a20-7e58-4d15-ad78-e9b07d787bd8-kube-api-access-hz9rt\") on node \"crc\" DevicePath \"\"" Dec 09 11:00:03 crc kubenswrapper[5002]: I1209 11:00:03.049644 5002 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf7f5a20-7e58-4d15-ad78-e9b07d787bd8-config-volume\") on node \"crc\" DevicePath \"\"" Dec 09 11:00:03 crc kubenswrapper[5002]: I1209 11:00:03.417780 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421300-mvt9r" event={"ID":"cf7f5a20-7e58-4d15-ad78-e9b07d787bd8","Type":"ContainerDied","Data":"cb08b907c04eacb05ac368654f791c3c8eaa34459253d9c8035c38fec019edcf"} Dec 09 11:00:03 crc kubenswrapper[5002]: I1209 11:00:03.417858 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb08b907c04eacb05ac368654f791c3c8eaa34459253d9c8035c38fec019edcf" Dec 09 11:00:03 crc kubenswrapper[5002]: I1209 11:00:03.417865 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421300-mvt9r" Dec 09 11:00:03 crc kubenswrapper[5002]: I1209 11:00:03.851794 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421255-9bs7v"] Dec 09 11:00:03 crc kubenswrapper[5002]: I1209 11:00:03.856761 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421255-9bs7v"] Dec 09 11:00:04 crc kubenswrapper[5002]: I1209 11:00:04.071270 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93168b2c-7da4-41aa-911a-3501ac4931e6" path="/var/lib/kubelet/pods/93168b2c-7da4-41aa-911a-3501ac4931e6/volumes" Dec 09 11:00:37 crc kubenswrapper[5002]: I1209 11:00:37.965277 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 11:00:37 crc kubenswrapper[5002]: I1209 11:00:37.966104 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 11:00:44 crc kubenswrapper[5002]: I1209 11:00:44.964487 5002 scope.go:117] "RemoveContainer" containerID="9c557a866dd6f766e68014badca9ed55533ce9360c966a957fdbab95e6fe7edc" Dec 09 11:01:07 crc kubenswrapper[5002]: I1209 11:01:07.964613 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Dec 09 11:01:07 crc kubenswrapper[5002]: I1209 11:01:07.965231 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 11:01:37 crc kubenswrapper[5002]: I1209 11:01:37.964462 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 11:01:37 crc kubenswrapper[5002]: I1209 11:01:37.965207 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 11:01:37 crc kubenswrapper[5002]: I1209 11:01:37.965266 5002 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" Dec 09 11:01:37 crc kubenswrapper[5002]: I1209 11:01:37.966185 5002 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a6724d37eb3b52330ed87c33a37154162c1547d16b97b3dcf1ce00d13e350de1"} pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 11:01:37 crc kubenswrapper[5002]: I1209 11:01:37.966237 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" containerID="cri-o://a6724d37eb3b52330ed87c33a37154162c1547d16b97b3dcf1ce00d13e350de1" gracePeriod=600 Dec 09 11:01:38 crc kubenswrapper[5002]: I1209 11:01:38.144709 5002 generic.go:334] "Generic (PLEG): container finished" podID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerID="a6724d37eb3b52330ed87c33a37154162c1547d16b97b3dcf1ce00d13e350de1" exitCode=0 Dec 09 11:01:38 crc kubenswrapper[5002]: I1209 11:01:38.144769 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerDied","Data":"a6724d37eb3b52330ed87c33a37154162c1547d16b97b3dcf1ce00d13e350de1"} Dec 09 11:01:38 crc kubenswrapper[5002]: I1209 11:01:38.145095 5002 scope.go:117] "RemoveContainer" containerID="738da2b9f567c691aaa8d2ad1324e9d744e19bb9b23d1692848e356bf8282445" Dec 09 11:01:39 crc kubenswrapper[5002]: I1209 11:01:39.156616 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerStarted","Data":"56e68643067c5f91b3de8ae49956bcf26e265baae08c30c7adce94eacdf30d2a"} Dec 09 11:04:07 crc kubenswrapper[5002]: I1209 11:04:07.965032 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 11:04:07 crc kubenswrapper[5002]: I1209 11:04:07.965548 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 11:04:37 crc kubenswrapper[5002]: I1209 11:04:37.965246 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 11:04:37 crc kubenswrapper[5002]: I1209 11:04:37.965796 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 11:05:07 crc kubenswrapper[5002]: I1209 11:05:07.965387 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 11:05:07 crc kubenswrapper[5002]: I1209 11:05:07.966232 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 11:05:07 crc kubenswrapper[5002]: I1209 11:05:07.966299 5002 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" Dec 09 11:05:07 crc kubenswrapper[5002]: I1209 11:05:07.967271 5002 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"56e68643067c5f91b3de8ae49956bcf26e265baae08c30c7adce94eacdf30d2a"} pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 11:05:07 crc kubenswrapper[5002]: I1209 11:05:07.967381 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" containerID="cri-o://56e68643067c5f91b3de8ae49956bcf26e265baae08c30c7adce94eacdf30d2a" gracePeriod=600 Dec 09 11:05:08 crc kubenswrapper[5002]: E1209 11:05:08.099020 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:05:08 crc 
Dec 09 11:05:08 crc kubenswrapper[5002]: I1209 11:05:08.924898 5002 generic.go:334] "Generic (PLEG): container finished" podID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerID="56e68643067c5f91b3de8ae49956bcf26e265baae08c30c7adce94eacdf30d2a" exitCode=0
Dec 09 11:05:08 crc kubenswrapper[5002]: I1209 11:05:08.924950 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerDied","Data":"56e68643067c5f91b3de8ae49956bcf26e265baae08c30c7adce94eacdf30d2a"}
Dec 09 11:05:08 crc kubenswrapper[5002]: I1209 11:05:08.924989 5002 scope.go:117] "RemoveContainer" containerID="a6724d37eb3b52330ed87c33a37154162c1547d16b97b3dcf1ce00d13e350de1"
Dec 09 11:05:08 crc kubenswrapper[5002]: I1209 11:05:08.925501 5002 scope.go:117] "RemoveContainer" containerID="56e68643067c5f91b3de8ae49956bcf26e265baae08c30c7adce94eacdf30d2a"
Dec 09 11:05:08 crc kubenswrapper[5002]: E1209 11:05:08.925735 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 11:05:23 crc kubenswrapper[5002]: I1209 11:05:23.060316 5002 scope.go:117] "RemoveContainer" containerID="56e68643067c5f91b3de8ae49956bcf26e265baae08c30c7adce94eacdf30d2a"
Dec 09 11:05:23 crc kubenswrapper[5002]: E1209 11:05:23.061440 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 11:05:38 crc kubenswrapper[5002]: I1209 11:05:38.069964 5002 scope.go:117] "RemoveContainer" containerID="56e68643067c5f91b3de8ae49956bcf26e265baae08c30c7adce94eacdf30d2a"
Dec 09 11:05:38 crc kubenswrapper[5002]: E1209 11:05:38.070600 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 11:05:49 crc kubenswrapper[5002]: I1209 11:05:49.060875 5002 scope.go:117] "RemoveContainer" containerID="56e68643067c5f91b3de8ae49956bcf26e265baae08c30c7adce94eacdf30d2a"
Dec 09 11:05:49 crc kubenswrapper[5002]: E1209 11:05:49.061946 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 11:06:02 crc kubenswrapper[5002]: I1209 11:06:02.060133 5002 scope.go:117] "RemoveContainer" containerID="56e68643067c5f91b3de8ae49956bcf26e265baae08c30c7adce94eacdf30d2a"
Dec 09 11:06:02 crc kubenswrapper[5002]: E1209 11:06:02.061188 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 11:06:14 crc kubenswrapper[5002]: I1209 11:06:14.061171 5002 scope.go:117] "RemoveContainer" containerID="56e68643067c5f91b3de8ae49956bcf26e265baae08c30c7adce94eacdf30d2a"
Dec 09 11:06:14 crc kubenswrapper[5002]: E1209 11:06:14.062439 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 11:06:29 crc kubenswrapper[5002]: I1209 11:06:29.061077 5002 scope.go:117] "RemoveContainer" containerID="56e68643067c5f91b3de8ae49956bcf26e265baae08c30c7adce94eacdf30d2a"
Dec 09 11:06:29 crc kubenswrapper[5002]: E1209 11:06:29.063072 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 11:06:41 crc kubenswrapper[5002]: I1209 11:06:41.061234 5002 scope.go:117] "RemoveContainer" containerID="56e68643067c5f91b3de8ae49956bcf26e265baae08c30c7adce94eacdf30d2a"
Dec 09 11:06:41 crc kubenswrapper[5002]: E1209 11:06:41.062217 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 11:06:55 crc kubenswrapper[5002]: I1209 11:06:55.060796 5002 scope.go:117] "RemoveContainer" containerID="56e68643067c5f91b3de8ae49956bcf26e265baae08c30c7adce94eacdf30d2a"
Dec 09 11:06:55 crc kubenswrapper[5002]: E1209 11:06:55.061628 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 11:07:10 crc kubenswrapper[5002]: I1209 11:07:10.061059 5002 scope.go:117] "RemoveContainer" containerID="56e68643067c5f91b3de8ae49956bcf26e265baae08c30c7adce94eacdf30d2a"
Dec 09 11:07:10 crc kubenswrapper[5002]: E1209 11:07:10.062227 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:07:21 crc kubenswrapper[5002]: I1209 11:07:21.060740 5002 scope.go:117] "RemoveContainer" containerID="56e68643067c5f91b3de8ae49956bcf26e265baae08c30c7adce94eacdf30d2a" Dec 09 11:07:21 crc kubenswrapper[5002]: E1209 11:07:21.061448 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:07:34 crc kubenswrapper[5002]: I1209 11:07:34.065318 5002 scope.go:117] "RemoveContainer" containerID="56e68643067c5f91b3de8ae49956bcf26e265baae08c30c7adce94eacdf30d2a" Dec 09 11:07:34 crc kubenswrapper[5002]: E1209 11:07:34.066490 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:07:47 crc kubenswrapper[5002]: I1209 11:07:47.060713 5002 scope.go:117] "RemoveContainer" containerID="56e68643067c5f91b3de8ae49956bcf26e265baae08c30c7adce94eacdf30d2a" Dec 09 11:07:47 crc kubenswrapper[5002]: E1209 11:07:47.062137 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:08:01 crc kubenswrapper[5002]: I1209 11:08:01.060340 5002 scope.go:117] "RemoveContainer" containerID="56e68643067c5f91b3de8ae49956bcf26e265baae08c30c7adce94eacdf30d2a" Dec 09 11:08:01 crc kubenswrapper[5002]: E1209 11:08:01.061520 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:08:12 crc kubenswrapper[5002]: I1209 11:08:12.064785 5002 scope.go:117] "RemoveContainer" containerID="56e68643067c5f91b3de8ae49956bcf26e265baae08c30c7adce94eacdf30d2a" Dec 09 11:08:12 crc kubenswrapper[5002]: E1209 11:08:12.065661 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:08:25 crc kubenswrapper[5002]: I1209 11:08:25.060720 5002 scope.go:117] "RemoveContainer" containerID="56e68643067c5f91b3de8ae49956bcf26e265baae08c30c7adce94eacdf30d2a" Dec 09 11:08:25 crc kubenswrapper[5002]: E1209 11:08:25.061504 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:08:38 crc kubenswrapper[5002]: I1209 11:08:38.068930 5002 scope.go:117] "RemoveContainer" containerID="56e68643067c5f91b3de8ae49956bcf26e265baae08c30c7adce94eacdf30d2a" Dec 09 11:08:38 crc kubenswrapper[5002]: E1209 11:08:38.069919 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:08:51 crc kubenswrapper[5002]: I1209 11:08:51.060050 5002 scope.go:117] "RemoveContainer" containerID="56e68643067c5f91b3de8ae49956bcf26e265baae08c30c7adce94eacdf30d2a" Dec 09 11:08:51 crc kubenswrapper[5002]: E1209 11:08:51.062437 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:09:02 crc kubenswrapper[5002]: I1209 11:09:02.060175 5002 scope.go:117] "RemoveContainer" containerID="56e68643067c5f91b3de8ae49956bcf26e265baae08c30c7adce94eacdf30d2a" Dec 09 11:09:02 crc kubenswrapper[5002]: E1209 11:09:02.061507 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:09:14 crc kubenswrapper[5002]: I1209 11:09:14.060719 5002 scope.go:117] "RemoveContainer" containerID="56e68643067c5f91b3de8ae49956bcf26e265baae08c30c7adce94eacdf30d2a" Dec 09 11:09:14 crc kubenswrapper[5002]: E1209 11:09:14.061654 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" 
podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:09:26 crc kubenswrapper[5002]: I1209 11:09:26.060852 5002 scope.go:117] "RemoveContainer" containerID="56e68643067c5f91b3de8ae49956bcf26e265baae08c30c7adce94eacdf30d2a" Dec 09 11:09:26 crc kubenswrapper[5002]: E1209 11:09:26.061661 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:09:33 crc kubenswrapper[5002]: I1209 11:09:33.913243 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xcl8w"] Dec 09 11:09:33 crc kubenswrapper[5002]: E1209 11:09:33.920564 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf7f5a20-7e58-4d15-ad78-e9b07d787bd8" containerName="collect-profiles" Dec 09 11:09:33 crc kubenswrapper[5002]: I1209 11:09:33.920618 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf7f5a20-7e58-4d15-ad78-e9b07d787bd8" containerName="collect-profiles" Dec 09 11:09:33 crc kubenswrapper[5002]: I1209 11:09:33.920904 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf7f5a20-7e58-4d15-ad78-e9b07d787bd8" containerName="collect-profiles" Dec 09 11:09:33 crc kubenswrapper[5002]: I1209 11:09:33.923390 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xcl8w" Dec 09 11:09:33 crc kubenswrapper[5002]: I1209 11:09:33.934041 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xcl8w"] Dec 09 11:09:34 crc kubenswrapper[5002]: I1209 11:09:34.017130 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4157d94-17ca-41da-a36a-cc3ff3b7b9ec-catalog-content\") pod \"community-operators-xcl8w\" (UID: \"e4157d94-17ca-41da-a36a-cc3ff3b7b9ec\") " pod="openshift-marketplace/community-operators-xcl8w" Dec 09 11:09:34 crc kubenswrapper[5002]: I1209 11:09:34.017342 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tmtp\" (UniqueName: \"kubernetes.io/projected/e4157d94-17ca-41da-a36a-cc3ff3b7b9ec-kube-api-access-5tmtp\") pod \"community-operators-xcl8w\" (UID: \"e4157d94-17ca-41da-a36a-cc3ff3b7b9ec\") " pod="openshift-marketplace/community-operators-xcl8w" Dec 09 11:09:34 crc kubenswrapper[5002]: I1209 11:09:34.017454 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4157d94-17ca-41da-a36a-cc3ff3b7b9ec-utilities\") pod \"community-operators-xcl8w\" (UID: \"e4157d94-17ca-41da-a36a-cc3ff3b7b9ec\") " pod="openshift-marketplace/community-operators-xcl8w" Dec 09 11:09:34 crc kubenswrapper[5002]: I1209 11:09:34.119031 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4157d94-17ca-41da-a36a-cc3ff3b7b9ec-catalog-content\") pod \"community-operators-xcl8w\" (UID: \"e4157d94-17ca-41da-a36a-cc3ff3b7b9ec\") " pod="openshift-marketplace/community-operators-xcl8w" Dec 09 11:09:34 crc kubenswrapper[5002]: I1209 
Dec 09 11:09:34 crc kubenswrapper[5002]: I1209 11:09:34.119439 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tmtp\" (UniqueName: \"kubernetes.io/projected/e4157d94-17ca-41da-a36a-cc3ff3b7b9ec-kube-api-access-5tmtp\") pod \"community-operators-xcl8w\" (UID: \"e4157d94-17ca-41da-a36a-cc3ff3b7b9ec\") " pod="openshift-marketplace/community-operators-xcl8w"
Dec 09 11:09:34 crc kubenswrapper[5002]: I1209 11:09:34.119501 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4157d94-17ca-41da-a36a-cc3ff3b7b9ec-utilities\") pod \"community-operators-xcl8w\" (UID: \"e4157d94-17ca-41da-a36a-cc3ff3b7b9ec\") " pod="openshift-marketplace/community-operators-xcl8w"
Dec 09 11:09:34 crc kubenswrapper[5002]: I1209 11:09:34.119595 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4157d94-17ca-41da-a36a-cc3ff3b7b9ec-catalog-content\") pod \"community-operators-xcl8w\" (UID: \"e4157d94-17ca-41da-a36a-cc3ff3b7b9ec\") " pod="openshift-marketplace/community-operators-xcl8w"
Dec 09 11:09:34 crc kubenswrapper[5002]: I1209 11:09:34.119897 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4157d94-17ca-41da-a36a-cc3ff3b7b9ec-utilities\") pod \"community-operators-xcl8w\" (UID: \"e4157d94-17ca-41da-a36a-cc3ff3b7b9ec\") " pod="openshift-marketplace/community-operators-xcl8w"
Dec 09 11:09:34 crc kubenswrapper[5002]: I1209 11:09:34.147439 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tmtp\" (UniqueName: \"kubernetes.io/projected/e4157d94-17ca-41da-a36a-cc3ff3b7b9ec-kube-api-access-5tmtp\") pod \"community-operators-xcl8w\" (UID: \"e4157d94-17ca-41da-a36a-cc3ff3b7b9ec\") " pod="openshift-marketplace/community-operators-xcl8w"
Dec 09 11:09:34 crc kubenswrapper[5002]: I1209 11:09:34.250239 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xcl8w"
Dec 09 11:09:34 crc kubenswrapper[5002]: I1209 11:09:34.737363 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xcl8w"]
Dec 09 11:09:34 crc kubenswrapper[5002]: W1209 11:09:34.758052 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4157d94_17ca_41da_a36a_cc3ff3b7b9ec.slice/crio-d8bb8acfa4baf9639a64ff70a549b08abd8f29089ff60c9641fc112f45b90872 WatchSource:0}: Error finding container d8bb8acfa4baf9639a64ff70a549b08abd8f29089ff60c9641fc112f45b90872: Status 404 returned error can't find the container with id d8bb8acfa4baf9639a64ff70a549b08abd8f29089ff60c9641fc112f45b90872
Dec 09 11:09:35 crc kubenswrapper[5002]: I1209 11:09:35.281564 5002 generic.go:334] "Generic (PLEG): container finished" podID="e4157d94-17ca-41da-a36a-cc3ff3b7b9ec" containerID="514a9ec8d484a459cf6d5d3ece2b6121232317350a64cdb1996eaca5afcfe947" exitCode=0
Dec 09 11:09:35 crc kubenswrapper[5002]: I1209 11:09:35.281608 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xcl8w" event={"ID":"e4157d94-17ca-41da-a36a-cc3ff3b7b9ec","Type":"ContainerDied","Data":"514a9ec8d484a459cf6d5d3ece2b6121232317350a64cdb1996eaca5afcfe947"}
Dec 09 11:09:35 crc kubenswrapper[5002]: I1209 11:09:35.281636 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xcl8w" event={"ID":"e4157d94-17ca-41da-a36a-cc3ff3b7b9ec","Type":"ContainerStarted","Data":"d8bb8acfa4baf9639a64ff70a549b08abd8f29089ff60c9641fc112f45b90872"}
Dec 09 11:09:35 crc kubenswrapper[5002]: I1209 11:09:35.283919 5002 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 09 11:09:36 crc kubenswrapper[5002]: I1209 11:09:36.292049 5002 generic.go:334] "Generic (PLEG): container finished" podID="e4157d94-17ca-41da-a36a-cc3ff3b7b9ec" containerID="4a8022dd02d1471109c5f5d2dba02d2c4c9b5b538a816ee3f01f6b435c7ba3ce" exitCode=0
Dec 09 11:09:36 crc kubenswrapper[5002]: I1209 11:09:36.292244 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xcl8w" event={"ID":"e4157d94-17ca-41da-a36a-cc3ff3b7b9ec","Type":"ContainerDied","Data":"4a8022dd02d1471109c5f5d2dba02d2c4c9b5b538a816ee3f01f6b435c7ba3ce"}
Dec 09 11:09:37 crc kubenswrapper[5002]: I1209 11:09:37.061382 5002 scope.go:117] "RemoveContainer" containerID="56e68643067c5f91b3de8ae49956bcf26e265baae08c30c7adce94eacdf30d2a"
Dec 09 11:09:37 crc kubenswrapper[5002]: E1209 11:09:37.061975 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 11:09:37 crc kubenswrapper[5002]: I1209 11:09:37.306028 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xcl8w" event={"ID":"e4157d94-17ca-41da-a36a-cc3ff3b7b9ec","Type":"ContainerStarted","Data":"cd7fb43f80e31247a88437eb2122de037349a2aa74258e303c5054575a82e424"}
Dec 09 11:09:37 crc kubenswrapper[5002]: I1209 11:09:37.343087 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xcl8w" podStartSLOduration=2.942770642 podStartE2EDuration="4.343062186s" podCreationTimestamp="2025-12-09 11:09:33 +0000 UTC" firstStartedPulling="2025-12-09 11:09:35.283592098 +0000 UTC m=+4107.675643189" lastFinishedPulling="2025-12-09 11:09:36.683883612 +0000 UTC m=+4109.075934733" observedRunningTime="2025-12-09 11:09:37.337097396 +0000 UTC m=+4109.729148507" watchObservedRunningTime="2025-12-09 11:09:37.343062186 +0000 UTC m=+4109.735113307"
duration" pod="openshift-marketplace/community-operators-xcl8w" podStartSLOduration=2.942770642 podStartE2EDuration="4.343062186s" podCreationTimestamp="2025-12-09 11:09:33 +0000 UTC" firstStartedPulling="2025-12-09 11:09:35.283592098 +0000 UTC m=+4107.675643189" lastFinishedPulling="2025-12-09 11:09:36.683883612 +0000 UTC m=+4109.075934733" observedRunningTime="2025-12-09 11:09:37.337097396 +0000 UTC m=+4109.729148507" watchObservedRunningTime="2025-12-09 11:09:37.343062186 +0000 UTC m=+4109.735113307" Dec 09 11:09:44 crc kubenswrapper[5002]: I1209 11:09:44.250979 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xcl8w" Dec 09 11:09:44 crc kubenswrapper[5002]: I1209 11:09:44.251650 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xcl8w" Dec 09 11:09:44 crc kubenswrapper[5002]: I1209 11:09:44.315014 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xcl8w" Dec 09 11:09:44 crc kubenswrapper[5002]: I1209 11:09:44.429489 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xcl8w" Dec 09 11:09:44 crc kubenswrapper[5002]: I1209 11:09:44.566664 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xcl8w"] Dec 09 11:09:46 crc kubenswrapper[5002]: I1209 11:09:46.386427 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xcl8w" podUID="e4157d94-17ca-41da-a36a-cc3ff3b7b9ec" containerName="registry-server" containerID="cri-o://cd7fb43f80e31247a88437eb2122de037349a2aa74258e303c5054575a82e424" gracePeriod=2 Dec 09 11:09:47 crc kubenswrapper[5002]: I1209 11:09:47.382476 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xcl8w" Dec 09 11:09:47 crc kubenswrapper[5002]: I1209 11:09:47.400481 5002 generic.go:334] "Generic (PLEG): container finished" podID="e4157d94-17ca-41da-a36a-cc3ff3b7b9ec" containerID="cd7fb43f80e31247a88437eb2122de037349a2aa74258e303c5054575a82e424" exitCode=0 Dec 09 11:09:47 crc kubenswrapper[5002]: I1209 11:09:47.400557 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xcl8w" event={"ID":"e4157d94-17ca-41da-a36a-cc3ff3b7b9ec","Type":"ContainerDied","Data":"cd7fb43f80e31247a88437eb2122de037349a2aa74258e303c5054575a82e424"} Dec 09 11:09:47 crc kubenswrapper[5002]: I1209 11:09:47.400663 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xcl8w" event={"ID":"e4157d94-17ca-41da-a36a-cc3ff3b7b9ec","Type":"ContainerDied","Data":"d8bb8acfa4baf9639a64ff70a549b08abd8f29089ff60c9641fc112f45b90872"} Dec 09 11:09:47 crc kubenswrapper[5002]: I1209 11:09:47.400699 5002 scope.go:117] "RemoveContainer" containerID="cd7fb43f80e31247a88437eb2122de037349a2aa74258e303c5054575a82e424" Dec 09 11:09:47 crc kubenswrapper[5002]: I1209 11:09:47.400576 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xcl8w" Dec 09 11:09:47 crc kubenswrapper[5002]: I1209 11:09:47.428482 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tmtp\" (UniqueName: \"kubernetes.io/projected/e4157d94-17ca-41da-a36a-cc3ff3b7b9ec-kube-api-access-5tmtp\") pod \"e4157d94-17ca-41da-a36a-cc3ff3b7b9ec\" (UID: \"e4157d94-17ca-41da-a36a-cc3ff3b7b9ec\") " Dec 09 11:09:47 crc kubenswrapper[5002]: I1209 11:09:47.428582 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4157d94-17ca-41da-a36a-cc3ff3b7b9ec-utilities\") pod \"e4157d94-17ca-41da-a36a-cc3ff3b7b9ec\" (UID: \"e4157d94-17ca-41da-a36a-cc3ff3b7b9ec\") " Dec 09 11:09:47 crc kubenswrapper[5002]: I1209 11:09:47.428635 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4157d94-17ca-41da-a36a-cc3ff3b7b9ec-catalog-content\") pod \"e4157d94-17ca-41da-a36a-cc3ff3b7b9ec\" (UID: \"e4157d94-17ca-41da-a36a-cc3ff3b7b9ec\") " Dec 09 11:09:47 crc kubenswrapper[5002]: I1209 11:09:47.429566 5002 scope.go:117] "RemoveContainer" containerID="4a8022dd02d1471109c5f5d2dba02d2c4c9b5b538a816ee3f01f6b435c7ba3ce" Dec 09 11:09:47 crc kubenswrapper[5002]: I1209 11:09:47.430758 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4157d94-17ca-41da-a36a-cc3ff3b7b9ec-utilities" (OuterVolumeSpecName: "utilities") pod "e4157d94-17ca-41da-a36a-cc3ff3b7b9ec" (UID: "e4157d94-17ca-41da-a36a-cc3ff3b7b9ec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:09:47 crc kubenswrapper[5002]: I1209 11:09:47.435302 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4157d94-17ca-41da-a36a-cc3ff3b7b9ec-kube-api-access-5tmtp" (OuterVolumeSpecName: "kube-api-access-5tmtp") pod "e4157d94-17ca-41da-a36a-cc3ff3b7b9ec" (UID: "e4157d94-17ca-41da-a36a-cc3ff3b7b9ec"). InnerVolumeSpecName "kube-api-access-5tmtp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:09:47 crc kubenswrapper[5002]: I1209 11:09:47.486336 5002 scope.go:117] "RemoveContainer" containerID="514a9ec8d484a459cf6d5d3ece2b6121232317350a64cdb1996eaca5afcfe947" Dec 09 11:09:47 crc kubenswrapper[5002]: I1209 11:09:47.510866 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4157d94-17ca-41da-a36a-cc3ff3b7b9ec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e4157d94-17ca-41da-a36a-cc3ff3b7b9ec" (UID: "e4157d94-17ca-41da-a36a-cc3ff3b7b9ec"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:09:47 crc kubenswrapper[5002]: I1209 11:09:47.517328 5002 scope.go:117] "RemoveContainer" containerID="cd7fb43f80e31247a88437eb2122de037349a2aa74258e303c5054575a82e424" Dec 09 11:09:47 crc kubenswrapper[5002]: E1209 11:09:47.517873 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd7fb43f80e31247a88437eb2122de037349a2aa74258e303c5054575a82e424\": container with ID starting with cd7fb43f80e31247a88437eb2122de037349a2aa74258e303c5054575a82e424 not found: ID does not exist" containerID="cd7fb43f80e31247a88437eb2122de037349a2aa74258e303c5054575a82e424" Dec 09 11:09:47 crc kubenswrapper[5002]: I1209 11:09:47.517927 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd7fb43f80e31247a88437eb2122de037349a2aa74258e303c5054575a82e424"} err="failed to get container status \"cd7fb43f80e31247a88437eb2122de037349a2aa74258e303c5054575a82e424\": rpc error: code = NotFound desc = could not find container \"cd7fb43f80e31247a88437eb2122de037349a2aa74258e303c5054575a82e424\": container with ID starting with cd7fb43f80e31247a88437eb2122de037349a2aa74258e303c5054575a82e424 not found: ID does not exist" Dec 09 11:09:47 crc kubenswrapper[5002]: I1209 11:09:47.517967 5002 scope.go:117] "RemoveContainer" containerID="4a8022dd02d1471109c5f5d2dba02d2c4c9b5b538a816ee3f01f6b435c7ba3ce" Dec 09 11:09:47 crc kubenswrapper[5002]: E1209 11:09:47.518448 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a8022dd02d1471109c5f5d2dba02d2c4c9b5b538a816ee3f01f6b435c7ba3ce\": container with ID starting with 4a8022dd02d1471109c5f5d2dba02d2c4c9b5b538a816ee3f01f6b435c7ba3ce not found: ID does not exist" containerID="4a8022dd02d1471109c5f5d2dba02d2c4c9b5b538a816ee3f01f6b435c7ba3ce" Dec 09 11:09:47 crc kubenswrapper[5002]: I1209 11:09:47.518508 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a8022dd02d1471109c5f5d2dba02d2c4c9b5b538a816ee3f01f6b435c7ba3ce"} err="failed to get container status \"4a8022dd02d1471109c5f5d2dba02d2c4c9b5b538a816ee3f01f6b435c7ba3ce\": rpc error: code = NotFound desc = could not find container \"4a8022dd02d1471109c5f5d2dba02d2c4c9b5b538a816ee3f01f6b435c7ba3ce\": container with ID starting with 4a8022dd02d1471109c5f5d2dba02d2c4c9b5b538a816ee3f01f6b435c7ba3ce not found: ID does not exist" Dec 09 11:09:47 crc kubenswrapper[5002]: I1209 11:09:47.518554 5002 scope.go:117] "RemoveContainer" containerID="514a9ec8d484a459cf6d5d3ece2b6121232317350a64cdb1996eaca5afcfe947" Dec 09 11:09:47 crc kubenswrapper[5002]: E1209 11:09:47.518994 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"514a9ec8d484a459cf6d5d3ece2b6121232317350a64cdb1996eaca5afcfe947\": container with ID starting with 514a9ec8d484a459cf6d5d3ece2b6121232317350a64cdb1996eaca5afcfe947 not found: ID does not exist" containerID="514a9ec8d484a459cf6d5d3ece2b6121232317350a64cdb1996eaca5afcfe947" Dec 09 11:09:47 crc kubenswrapper[5002]: I1209 11:09:47.519038 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"514a9ec8d484a459cf6d5d3ece2b6121232317350a64cdb1996eaca5afcfe947"} err="failed to get container status \"514a9ec8d484a459cf6d5d3ece2b6121232317350a64cdb1996eaca5afcfe947\": rpc error: code = NotFound desc = could not 
find container \"514a9ec8d484a459cf6d5d3ece2b6121232317350a64cdb1996eaca5afcfe947\": container with ID starting with 514a9ec8d484a459cf6d5d3ece2b6121232317350a64cdb1996eaca5afcfe947 not found: ID does not exist" Dec 09 11:09:47 crc kubenswrapper[5002]: I1209 11:09:47.530072 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tmtp\" (UniqueName: \"kubernetes.io/projected/e4157d94-17ca-41da-a36a-cc3ff3b7b9ec-kube-api-access-5tmtp\") on node \"crc\" DevicePath \"\"" Dec 09 11:09:47 crc kubenswrapper[5002]: I1209 11:09:47.530109 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4157d94-17ca-41da-a36a-cc3ff3b7b9ec-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 11:09:47 crc kubenswrapper[5002]: I1209 11:09:47.530125 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4157d94-17ca-41da-a36a-cc3ff3b7b9ec-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 11:09:47 crc kubenswrapper[5002]: I1209 11:09:47.763484 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xcl8w"] Dec 09 11:09:47 crc kubenswrapper[5002]: I1209 11:09:47.776049 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xcl8w"] Dec 09 11:09:48 crc kubenswrapper[5002]: I1209 11:09:48.072340 5002 scope.go:117] "RemoveContainer" containerID="56e68643067c5f91b3de8ae49956bcf26e265baae08c30c7adce94eacdf30d2a" Dec 09 11:09:48 crc kubenswrapper[5002]: E1209 11:09:48.072683 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:09:48 crc kubenswrapper[5002]: I1209 11:09:48.075946 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4157d94-17ca-41da-a36a-cc3ff3b7b9ec" path="/var/lib/kubelet/pods/e4157d94-17ca-41da-a36a-cc3ff3b7b9ec/volumes" Dec 09 11:09:54 crc kubenswrapper[5002]: I1209 11:09:54.835925 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-75cwr"] Dec 09 11:09:54 crc kubenswrapper[5002]: E1209 11:09:54.836664 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4157d94-17ca-41da-a36a-cc3ff3b7b9ec" containerName="registry-server" Dec 09 11:09:54 crc kubenswrapper[5002]: I1209 11:09:54.836676 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4157d94-17ca-41da-a36a-cc3ff3b7b9ec" containerName="registry-server" Dec 09 11:09:54 crc kubenswrapper[5002]: E1209 11:09:54.836697 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4157d94-17ca-41da-a36a-cc3ff3b7b9ec" containerName="extract-utilities" Dec 09 11:09:54 crc kubenswrapper[5002]: I1209 11:09:54.836702 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4157d94-17ca-41da-a36a-cc3ff3b7b9ec" containerName="extract-utilities" Dec 09 11:09:54 crc kubenswrapper[5002]: E1209 11:09:54.836720 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4157d94-17ca-41da-a36a-cc3ff3b7b9ec" containerName="extract-content" Dec 09 11:09:54 crc kubenswrapper[5002]: I1209 11:09:54.836728 5002 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="e4157d94-17ca-41da-a36a-cc3ff3b7b9ec" containerName="extract-content" Dec 09 11:09:54 crc kubenswrapper[5002]: I1209 11:09:54.836859 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4157d94-17ca-41da-a36a-cc3ff3b7b9ec" containerName="registry-server" Dec 09 11:09:54 crc kubenswrapper[5002]: I1209 11:09:54.837882 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-75cwr" Dec 09 11:09:54 crc kubenswrapper[5002]: I1209 11:09:54.866522 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-75cwr"] Dec 09 11:09:55 crc kubenswrapper[5002]: I1209 11:09:55.033232 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91bbce66-9660-4e2f-b8a2-242beb73c6aa-catalog-content\") pod \"redhat-operators-75cwr\" (UID: \"91bbce66-9660-4e2f-b8a2-242beb73c6aa\") " pod="openshift-marketplace/redhat-operators-75cwr" Dec 09 11:09:55 crc kubenswrapper[5002]: I1209 11:09:55.033549 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91bbce66-9660-4e2f-b8a2-242beb73c6aa-utilities\") pod \"redhat-operators-75cwr\" (UID: \"91bbce66-9660-4e2f-b8a2-242beb73c6aa\") " pod="openshift-marketplace/redhat-operators-75cwr" Dec 09 11:09:55 crc kubenswrapper[5002]: I1209 11:09:55.033609 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbb8c\" (UniqueName: \"kubernetes.io/projected/91bbce66-9660-4e2f-b8a2-242beb73c6aa-kube-api-access-xbb8c\") pod \"redhat-operators-75cwr\" (UID: \"91bbce66-9660-4e2f-b8a2-242beb73c6aa\") " pod="openshift-marketplace/redhat-operators-75cwr" Dec 09 11:09:55 crc kubenswrapper[5002]: I1209 11:09:55.135091 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91bbce66-9660-4e2f-b8a2-242beb73c6aa-utilities\") pod \"redhat-operators-75cwr\" (UID: \"91bbce66-9660-4e2f-b8a2-242beb73c6aa\") " pod="openshift-marketplace/redhat-operators-75cwr" Dec 09 11:09:55 crc kubenswrapper[5002]: I1209 11:09:55.135148 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbb8c\" (UniqueName: \"kubernetes.io/projected/91bbce66-9660-4e2f-b8a2-242beb73c6aa-kube-api-access-xbb8c\") pod \"redhat-operators-75cwr\" (UID: \"91bbce66-9660-4e2f-b8a2-242beb73c6aa\") " pod="openshift-marketplace/redhat-operators-75cwr" Dec 09 11:09:55 crc kubenswrapper[5002]: I1209 11:09:55.135188 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91bbce66-9660-4e2f-b8a2-242beb73c6aa-catalog-content\") pod \"redhat-operators-75cwr\" (UID: \"91bbce66-9660-4e2f-b8a2-242beb73c6aa\") " pod="openshift-marketplace/redhat-operators-75cwr" Dec 09 11:09:55 crc kubenswrapper[5002]: I1209 11:09:55.135976 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91bbce66-9660-4e2f-b8a2-242beb73c6aa-utilities\") pod \"redhat-operators-75cwr\" (UID: \"91bbce66-9660-4e2f-b8a2-242beb73c6aa\") " pod="openshift-marketplace/redhat-operators-75cwr" Dec 09 11:09:55 crc kubenswrapper[5002]: I1209 11:09:55.136019 5002 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91bbce66-9660-4e2f-b8a2-242beb73c6aa-catalog-content\") pod \"redhat-operators-75cwr\" (UID: \"91bbce66-9660-4e2f-b8a2-242beb73c6aa\") " pod="openshift-marketplace/redhat-operators-75cwr" Dec 09 11:09:55 crc kubenswrapper[5002]: I1209 11:09:55.156491 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbb8c\" (UniqueName: \"kubernetes.io/projected/91bbce66-9660-4e2f-b8a2-242beb73c6aa-kube-api-access-xbb8c\") pod \"redhat-operators-75cwr\" (UID: \"91bbce66-9660-4e2f-b8a2-242beb73c6aa\") " pod="openshift-marketplace/redhat-operators-75cwr" Dec 09 11:09:55 crc kubenswrapper[5002]: I1209 11:09:55.160966 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-75cwr" Dec 09 11:09:55 crc kubenswrapper[5002]: I1209 11:09:55.691713 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-75cwr"] Dec 09 11:09:56 crc kubenswrapper[5002]: I1209 11:09:56.504124 5002 generic.go:334] "Generic (PLEG): container finished" podID="91bbce66-9660-4e2f-b8a2-242beb73c6aa" containerID="068b4636d93564ecf6ae32d6e2b297fea06ec947e1ee236c144c882595a363ca" exitCode=0 Dec 09 11:09:56 crc kubenswrapper[5002]: I1209 11:09:56.504198 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-75cwr" event={"ID":"91bbce66-9660-4e2f-b8a2-242beb73c6aa","Type":"ContainerDied","Data":"068b4636d93564ecf6ae32d6e2b297fea06ec947e1ee236c144c882595a363ca"} Dec 09 11:09:56 crc kubenswrapper[5002]: I1209 11:09:56.504471 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-75cwr" event={"ID":"91bbce66-9660-4e2f-b8a2-242beb73c6aa","Type":"ContainerStarted","Data":"9d7669223f87b048f77e240bd3c73598467aa71e2ed10ceb3a150222ae08ef56"} Dec 09 11:09:57 crc kubenswrapper[5002]: I1209 11:09:57.512185 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-75cwr" event={"ID":"91bbce66-9660-4e2f-b8a2-242beb73c6aa","Type":"ContainerStarted","Data":"7e66e1cc984f2fe3c36bdf34a310dc62190309c2bfc4240ccbcbeded812ab46f"} Dec 09 11:09:58 crc kubenswrapper[5002]: I1209 11:09:58.523413 5002 generic.go:334] "Generic (PLEG): container finished" podID="91bbce66-9660-4e2f-b8a2-242beb73c6aa" containerID="7e66e1cc984f2fe3c36bdf34a310dc62190309c2bfc4240ccbcbeded812ab46f" exitCode=0 Dec 09 11:09:58 crc kubenswrapper[5002]: I1209 11:09:58.523497 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-75cwr" event={"ID":"91bbce66-9660-4e2f-b8a2-242beb73c6aa","Type":"ContainerDied","Data":"7e66e1cc984f2fe3c36bdf34a310dc62190309c2bfc4240ccbcbeded812ab46f"} Dec 09 11:09:59 crc kubenswrapper[5002]: I1209 11:09:59.533296 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-75cwr" event={"ID":"91bbce66-9660-4e2f-b8a2-242beb73c6aa","Type":"ContainerStarted","Data":"6ffaccd7962e752b1b9b22aef288b72e9f474a8a1677c7c687ab1c7481d71751"} Dec 09 11:09:59 crc kubenswrapper[5002]: I1209 11:09:59.550583 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-75cwr" podStartSLOduration=3.085070046 podStartE2EDuration="5.55056365s" podCreationTimestamp="2025-12-09 11:09:54 +0000 UTC" firstStartedPulling="2025-12-09 11:09:56.505736133 +0000 UTC m=+4128.897787214" lastFinishedPulling="2025-12-09 
11:09:58.971229727 +0000 UTC m=+4131.363280818" observedRunningTime="2025-12-09 11:09:59.547622052 +0000 UTC m=+4131.939673153" watchObservedRunningTime="2025-12-09 11:09:59.55056365 +0000 UTC m=+4131.942614731" Dec 09 11:10:01 crc kubenswrapper[5002]: I1209 11:10:01.061403 5002 scope.go:117] "RemoveContainer" containerID="56e68643067c5f91b3de8ae49956bcf26e265baae08c30c7adce94eacdf30d2a" Dec 09 11:10:01 crc kubenswrapper[5002]: E1209 11:10:01.061633 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:10:05 crc kubenswrapper[5002]: I1209 11:10:05.162282 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-75cwr" Dec 09 11:10:05 crc kubenswrapper[5002]: I1209 11:10:05.163282 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-75cwr" Dec 09 11:10:05 crc kubenswrapper[5002]: I1209 11:10:05.216359 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-75cwr" Dec 09 11:10:05 crc kubenswrapper[5002]: I1209 11:10:05.651248 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-75cwr" Dec 09 11:10:05 crc kubenswrapper[5002]: I1209 11:10:05.715609 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-75cwr"] Dec 09 11:10:07 crc kubenswrapper[5002]: I1209 11:10:07.599172 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-75cwr" podUID="91bbce66-9660-4e2f-b8a2-242beb73c6aa" containerName="registry-server" containerID="cri-o://6ffaccd7962e752b1b9b22aef288b72e9f474a8a1677c7c687ab1c7481d71751" gracePeriod=2 Dec 09 11:10:10 crc kubenswrapper[5002]: I1209 11:10:10.630396 5002 generic.go:334] "Generic (PLEG): container finished" podID="91bbce66-9660-4e2f-b8a2-242beb73c6aa" containerID="6ffaccd7962e752b1b9b22aef288b72e9f474a8a1677c7c687ab1c7481d71751" exitCode=0 Dec 09 11:10:10 crc kubenswrapper[5002]: I1209 11:10:10.630447 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-75cwr" event={"ID":"91bbce66-9660-4e2f-b8a2-242beb73c6aa","Type":"ContainerDied","Data":"6ffaccd7962e752b1b9b22aef288b72e9f474a8a1677c7c687ab1c7481d71751"} Dec 09 11:10:10 crc kubenswrapper[5002]: I1209 11:10:10.689224 5002 util.go:48] "No ready sandbox for pod can be found. 
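Both marketplace pods (community-operators-xcl8w earlier, redhat-operators-75cwr here) run the same three-container sequence: extract-utilities and extract-content finish with exit code 0, then registry-server serves until the pod is deleted. When tracing one pod through a dump like this, filtering on its PLEG events is usually enough; a throwaway sketch (the file path is a placeholder):

```go
// Sketch: pull one pod's PLEG events out of a kubelet journal dump.
package main

import (
	"bufio"
	"fmt"
	"os"
	"strings"
)

func main() {
	f, err := os.Open("kubelet.log") // placeholder path
	if err != nil {
		panic(err)
	}
	defer f.Close()

	sc := bufio.NewScanner(f)
	sc.Buffer(make([]byte, 1024*1024), 1024*1024) // journal lines can be long
	for sc.Scan() {
		line := sc.Text()
		if strings.Contains(line, "SyncLoop (PLEG)") &&
			strings.Contains(line, "redhat-operators-75cwr") {
			fmt.Println(line)
		}
	}
}
```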
Need to start a new one" pod="openshift-marketplace/redhat-operators-75cwr" Dec 09 11:10:10 crc kubenswrapper[5002]: I1209 11:10:10.865358 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbb8c\" (UniqueName: \"kubernetes.io/projected/91bbce66-9660-4e2f-b8a2-242beb73c6aa-kube-api-access-xbb8c\") pod \"91bbce66-9660-4e2f-b8a2-242beb73c6aa\" (UID: \"91bbce66-9660-4e2f-b8a2-242beb73c6aa\") " Dec 09 11:10:10 crc kubenswrapper[5002]: I1209 11:10:10.865423 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91bbce66-9660-4e2f-b8a2-242beb73c6aa-catalog-content\") pod \"91bbce66-9660-4e2f-b8a2-242beb73c6aa\" (UID: \"91bbce66-9660-4e2f-b8a2-242beb73c6aa\") " Dec 09 11:10:10 crc kubenswrapper[5002]: I1209 11:10:10.865526 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91bbce66-9660-4e2f-b8a2-242beb73c6aa-utilities\") pod \"91bbce66-9660-4e2f-b8a2-242beb73c6aa\" (UID: \"91bbce66-9660-4e2f-b8a2-242beb73c6aa\") " Dec 09 11:10:10 crc kubenswrapper[5002]: I1209 11:10:10.866548 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91bbce66-9660-4e2f-b8a2-242beb73c6aa-utilities" (OuterVolumeSpecName: "utilities") pod "91bbce66-9660-4e2f-b8a2-242beb73c6aa" (UID: "91bbce66-9660-4e2f-b8a2-242beb73c6aa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:10:10 crc kubenswrapper[5002]: I1209 11:10:10.870484 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91bbce66-9660-4e2f-b8a2-242beb73c6aa-kube-api-access-xbb8c" (OuterVolumeSpecName: "kube-api-access-xbb8c") pod "91bbce66-9660-4e2f-b8a2-242beb73c6aa" (UID: "91bbce66-9660-4e2f-b8a2-242beb73c6aa"). InnerVolumeSpecName "kube-api-access-xbb8c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:10:10 crc kubenswrapper[5002]: I1209 11:10:10.967757 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbb8c\" (UniqueName: \"kubernetes.io/projected/91bbce66-9660-4e2f-b8a2-242beb73c6aa-kube-api-access-xbb8c\") on node \"crc\" DevicePath \"\"" Dec 09 11:10:10 crc kubenswrapper[5002]: I1209 11:10:10.967797 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91bbce66-9660-4e2f-b8a2-242beb73c6aa-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 11:10:10 crc kubenswrapper[5002]: I1209 11:10:10.992353 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91bbce66-9660-4e2f-b8a2-242beb73c6aa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "91bbce66-9660-4e2f-b8a2-242beb73c6aa" (UID: "91bbce66-9660-4e2f-b8a2-242beb73c6aa"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:10:11 crc kubenswrapper[5002]: I1209 11:10:11.069134 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91bbce66-9660-4e2f-b8a2-242beb73c6aa-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 11:10:11 crc kubenswrapper[5002]: I1209 11:10:11.640658 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-75cwr" event={"ID":"91bbce66-9660-4e2f-b8a2-242beb73c6aa","Type":"ContainerDied","Data":"9d7669223f87b048f77e240bd3c73598467aa71e2ed10ceb3a150222ae08ef56"} Dec 09 11:10:11 crc kubenswrapper[5002]: I1209 11:10:11.641060 5002 scope.go:117] "RemoveContainer" containerID="6ffaccd7962e752b1b9b22aef288b72e9f474a8a1677c7c687ab1c7481d71751" Dec 09 11:10:11 crc kubenswrapper[5002]: I1209 11:10:11.640859 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-75cwr" Dec 09 11:10:11 crc kubenswrapper[5002]: I1209 11:10:11.661598 5002 scope.go:117] "RemoveContainer" containerID="7e66e1cc984f2fe3c36bdf34a310dc62190309c2bfc4240ccbcbeded812ab46f" Dec 09 11:10:11 crc kubenswrapper[5002]: I1209 11:10:11.704240 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-75cwr"] Dec 09 11:10:11 crc kubenswrapper[5002]: I1209 11:10:11.708438 5002 scope.go:117] "RemoveContainer" containerID="068b4636d93564ecf6ae32d6e2b297fea06ec947e1ee236c144c882595a363ca" Dec 09 11:10:11 crc kubenswrapper[5002]: I1209 11:10:11.712062 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-75cwr"] Dec 09 11:10:12 crc kubenswrapper[5002]: I1209 11:10:12.070420 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91bbce66-9660-4e2f-b8a2-242beb73c6aa" path="/var/lib/kubelet/pods/91bbce66-9660-4e2f-b8a2-242beb73c6aa/volumes" Dec 09 11:10:13 crc kubenswrapper[5002]: I1209 11:10:13.059691 5002 scope.go:117] "RemoveContainer" containerID="56e68643067c5f91b3de8ae49956bcf26e265baae08c30c7adce94eacdf30d2a" Dec 09 11:10:13 crc kubenswrapper[5002]: I1209 11:10:13.659738 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerStarted","Data":"1afc9ebf9ed2fc9e4d452f4f472fa50ac1aa3c9ea389d4cf549db9d1f5d70925"} Dec 09 11:12:37 crc kubenswrapper[5002]: I1209 11:12:37.964693 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 11:12:37 crc kubenswrapper[5002]: I1209 11:12:37.965291 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 11:13:07 crc kubenswrapper[5002]: I1209 11:13:07.965130 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Dec 09 11:13:07 crc kubenswrapper[5002]: I1209 11:13:07.966029 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 11:13:37 crc kubenswrapper[5002]: I1209 11:13:37.965283 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 11:13:37 crc kubenswrapper[5002]: I1209 11:13:37.965992 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 11:13:37 crc kubenswrapper[5002]: I1209 11:13:37.966059 5002 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" Dec 09 11:13:37 crc kubenswrapper[5002]: I1209 11:13:37.966968 5002 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1afc9ebf9ed2fc9e4d452f4f472fa50ac1aa3c9ea389d4cf549db9d1f5d70925"} pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 11:13:37 crc kubenswrapper[5002]: I1209 11:13:37.967083 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" containerID="cri-o://1afc9ebf9ed2fc9e4d452f4f472fa50ac1aa3c9ea389d4cf549db9d1f5d70925" gracePeriod=600 Dec 09 11:13:38 crc kubenswrapper[5002]: I1209 11:13:38.357545 5002 generic.go:334] "Generic (PLEG): container finished" podID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerID="1afc9ebf9ed2fc9e4d452f4f472fa50ac1aa3c9ea389d4cf549db9d1f5d70925" exitCode=0 Dec 09 11:13:38 crc kubenswrapper[5002]: I1209 11:13:38.357606 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerDied","Data":"1afc9ebf9ed2fc9e4d452f4f472fa50ac1aa3c9ea389d4cf549db9d1f5d70925"} Dec 09 11:13:38 crc kubenswrapper[5002]: I1209 11:13:38.357942 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerStarted","Data":"77659b1c47a86560c49c87566b1a9998228a70363d6028c0f54d97d529fda9b3"} Dec 09 11:13:38 crc kubenswrapper[5002]: I1209 11:13:38.357971 5002 scope.go:117] "RemoveContainer" containerID="56e68643067c5f91b3de8ae49956bcf26e265baae08c30c7adce94eacdf30d2a" Dec 09 11:15:00 crc kubenswrapper[5002]: I1209 11:15:00.188561 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421315-pxbj9"] Dec 09 11:15:00 crc kubenswrapper[5002]: E1209 
Dec 09 11:15:00 crc kubenswrapper[5002]: I1209 11:15:00.188561 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421315-pxbj9"]
Dec 09 11:15:00 crc kubenswrapper[5002]: E1209 11:15:00.189486 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91bbce66-9660-4e2f-b8a2-242beb73c6aa" containerName="registry-server"
Dec 09 11:15:00 crc kubenswrapper[5002]: I1209 11:15:00.189502 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="91bbce66-9660-4e2f-b8a2-242beb73c6aa" containerName="registry-server"
Dec 09 11:15:00 crc kubenswrapper[5002]: E1209 11:15:00.189526 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91bbce66-9660-4e2f-b8a2-242beb73c6aa" containerName="extract-utilities"
Dec 09 11:15:00 crc kubenswrapper[5002]: I1209 11:15:00.189536 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="91bbce66-9660-4e2f-b8a2-242beb73c6aa" containerName="extract-utilities"
Dec 09 11:15:00 crc kubenswrapper[5002]: E1209 11:15:00.189564 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91bbce66-9660-4e2f-b8a2-242beb73c6aa" containerName="extract-content"
Dec 09 11:15:00 crc kubenswrapper[5002]: I1209 11:15:00.189573 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="91bbce66-9660-4e2f-b8a2-242beb73c6aa" containerName="extract-content"
Dec 09 11:15:00 crc kubenswrapper[5002]: I1209 11:15:00.189795 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="91bbce66-9660-4e2f-b8a2-242beb73c6aa" containerName="registry-server"
Dec 09 11:15:00 crc kubenswrapper[5002]: I1209 11:15:00.190587 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421315-pxbj9"
Dec 09 11:15:00 crc kubenswrapper[5002]: I1209 11:15:00.192595 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 09 11:15:00 crc kubenswrapper[5002]: I1209 11:15:00.193038 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 09 11:15:00 crc kubenswrapper[5002]: I1209 11:15:00.195462 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421315-pxbj9"]
Dec 09 11:15:00 crc kubenswrapper[5002]: I1209 11:15:00.218389 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/60ef1b3d-7380-4ad4-83c5-783d87789929-secret-volume\") pod \"collect-profiles-29421315-pxbj9\" (UID: \"60ef1b3d-7380-4ad4-83c5-783d87789929\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421315-pxbj9"
Dec 09 11:15:00 crc kubenswrapper[5002]: I1209 11:15:00.218435 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/60ef1b3d-7380-4ad4-83c5-783d87789929-config-volume\") pod \"collect-profiles-29421315-pxbj9\" (UID: \"60ef1b3d-7380-4ad4-83c5-783d87789929\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421315-pxbj9"
Dec 09 11:15:00 crc kubenswrapper[5002]: I1209 11:15:00.218530 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz8h5\" (UniqueName: \"kubernetes.io/projected/60ef1b3d-7380-4ad4-83c5-783d87789929-kube-api-access-jz8h5\") pod \"collect-profiles-29421315-pxbj9\" (UID: \"60ef1b3d-7380-4ad4-83c5-783d87789929\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421315-pxbj9"
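
The paired `RemoveStaleState: removing container` / `Deleted CPUSet assignment` messages above are admission-time bookkeeping: before the new collect-profiles pod is admitted, the CPU and memory managers drop resource assignments still recorded for containers of pods that are no longer active (here the leftover registry-server/extract-utilities/extract-content entries of pod 91bbce66…). A toy sketch of that pruning pattern, with invented types, not the kubelet's actual state machinery:

```go
// stale_state.go - illustrative pruning of per-pod resource assignments.
package main

import "fmt"

// removeStaleState drops every assignment whose pod UID is not active,
// mirroring the RemoveStaleState / "Deleted CPUSet assignment" pairs above.
func removeStaleState(assignments map[string][]string, active map[string]bool) {
	for podUID, containers := range assignments {
		if active[podUID] {
			continue
		}
		for _, name := range containers {
			fmt.Printf("removing stale assignment pod=%s container=%s\n", podUID, name)
		}
		delete(assignments, podUID)
	}
}

func main() {
	assignments := map[string][]string{
		"91bbce66-9660-4e2f-b8a2-242beb73c6aa": {"registry-server", "extract-utilities", "extract-content"},
	}
	removeStaleState(assignments, map[string]bool{}) // the old pod is gone, so its entry is stale
	fmt.Println("remaining assignments:", len(assignments))
}
```
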
Dec 09 11:15:00 crc kubenswrapper[5002]: I1209 11:15:00.319013 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/60ef1b3d-7380-4ad4-83c5-783d87789929-secret-volume\") pod \"collect-profiles-29421315-pxbj9\" (UID: \"60ef1b3d-7380-4ad4-83c5-783d87789929\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421315-pxbj9"
Dec 09 11:15:00 crc kubenswrapper[5002]: I1209 11:15:00.319048 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/60ef1b3d-7380-4ad4-83c5-783d87789929-config-volume\") pod \"collect-profiles-29421315-pxbj9\" (UID: \"60ef1b3d-7380-4ad4-83c5-783d87789929\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421315-pxbj9"
Dec 09 11:15:00 crc kubenswrapper[5002]: I1209 11:15:00.319108 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jz8h5\" (UniqueName: \"kubernetes.io/projected/60ef1b3d-7380-4ad4-83c5-783d87789929-kube-api-access-jz8h5\") pod \"collect-profiles-29421315-pxbj9\" (UID: \"60ef1b3d-7380-4ad4-83c5-783d87789929\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421315-pxbj9"
Dec 09 11:15:00 crc kubenswrapper[5002]: I1209 11:15:00.320321 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/60ef1b3d-7380-4ad4-83c5-783d87789929-config-volume\") pod \"collect-profiles-29421315-pxbj9\" (UID: \"60ef1b3d-7380-4ad4-83c5-783d87789929\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421315-pxbj9"
Dec 09 11:15:00 crc kubenswrapper[5002]: I1209 11:15:00.331088 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/60ef1b3d-7380-4ad4-83c5-783d87789929-secret-volume\") pod \"collect-profiles-29421315-pxbj9\" (UID: \"60ef1b3d-7380-4ad4-83c5-783d87789929\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421315-pxbj9"
Dec 09 11:15:00 crc kubenswrapper[5002]: I1209 11:15:00.350300 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jz8h5\" (UniqueName: \"kubernetes.io/projected/60ef1b3d-7380-4ad4-83c5-783d87789929-kube-api-access-jz8h5\") pod \"collect-profiles-29421315-pxbj9\" (UID: \"60ef1b3d-7380-4ad4-83c5-783d87789929\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421315-pxbj9"
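
Three volume types are verified and mounted for this pod: a Secret (`secret-volume`), a ConfigMap (`config-volume`), and the projected `kube-api-access-jz8h5` volume, which is the automatically injected bundle of service-account token, CA certificate, and namespace. Projected service-account volumes always surface at the same well-known path inside the container; a small in-cluster check (runnable in any pod, nothing here is specific to this log):

```go
// sa_volume.go - inspect the projected service-account mount inside a pod.
package main

import (
	"fmt"
	"os"
)

func main() {
	// Standard mount point of the kube-api-access-* projected volume.
	const dir = "/var/run/secrets/kubernetes.io/serviceaccount"
	for _, name := range []string{"token", "ca.crt", "namespace"} {
		b, err := os.ReadFile(dir + "/" + name)
		if err != nil {
			fmt.Fprintf(os.Stderr, "%s: %v\n", name, err)
			continue
		}
		fmt.Printf("%s: %d bytes\n", name, len(b))
	}
}
```
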
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421315-pxbj9" Dec 09 11:15:00 crc kubenswrapper[5002]: I1209 11:15:00.945197 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421315-pxbj9"] Dec 09 11:15:01 crc kubenswrapper[5002]: I1209 11:15:01.024650 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421315-pxbj9" event={"ID":"60ef1b3d-7380-4ad4-83c5-783d87789929","Type":"ContainerStarted","Data":"ea179dcc114cd346e705064020a802db788422ef5a636f136fe099769c8cc719"} Dec 09 11:15:02 crc kubenswrapper[5002]: I1209 11:15:02.031794 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421315-pxbj9" event={"ID":"60ef1b3d-7380-4ad4-83c5-783d87789929","Type":"ContainerStarted","Data":"b74609cd71d8288e561ff40f177b861af5d648570db5eb2c8e3bcde13287198d"} Dec 09 11:15:02 crc kubenswrapper[5002]: I1209 11:15:02.054446 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29421315-pxbj9" podStartSLOduration=2.054429514 podStartE2EDuration="2.054429514s" podCreationTimestamp="2025-12-09 11:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:15:02.047974471 +0000 UTC m=+4434.440025552" watchObservedRunningTime="2025-12-09 11:15:02.054429514 +0000 UTC m=+4434.446480595" Dec 09 11:15:03 crc kubenswrapper[5002]: I1209 11:15:03.040958 5002 generic.go:334] "Generic (PLEG): container finished" podID="60ef1b3d-7380-4ad4-83c5-783d87789929" containerID="b74609cd71d8288e561ff40f177b861af5d648570db5eb2c8e3bcde13287198d" exitCode=0 Dec 09 11:15:03 crc kubenswrapper[5002]: I1209 11:15:03.041002 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421315-pxbj9" event={"ID":"60ef1b3d-7380-4ad4-83c5-783d87789929","Type":"ContainerDied","Data":"b74609cd71d8288e561ff40f177b861af5d648570db5eb2c8e3bcde13287198d"} Dec 09 11:15:04 crc kubenswrapper[5002]: I1209 11:15:04.584477 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421315-pxbj9" Dec 09 11:15:04 crc kubenswrapper[5002]: I1209 11:15:04.676766 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/60ef1b3d-7380-4ad4-83c5-783d87789929-config-volume\") pod \"60ef1b3d-7380-4ad4-83c5-783d87789929\" (UID: \"60ef1b3d-7380-4ad4-83c5-783d87789929\") " Dec 09 11:15:04 crc kubenswrapper[5002]: I1209 11:15:04.676846 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jz8h5\" (UniqueName: \"kubernetes.io/projected/60ef1b3d-7380-4ad4-83c5-783d87789929-kube-api-access-jz8h5\") pod \"60ef1b3d-7380-4ad4-83c5-783d87789929\" (UID: \"60ef1b3d-7380-4ad4-83c5-783d87789929\") " Dec 09 11:15:04 crc kubenswrapper[5002]: I1209 11:15:04.676960 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/60ef1b3d-7380-4ad4-83c5-783d87789929-secret-volume\") pod \"60ef1b3d-7380-4ad4-83c5-783d87789929\" (UID: \"60ef1b3d-7380-4ad4-83c5-783d87789929\") " Dec 09 11:15:04 crc kubenswrapper[5002]: I1209 11:15:04.677706 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60ef1b3d-7380-4ad4-83c5-783d87789929-config-volume" (OuterVolumeSpecName: "config-volume") pod "60ef1b3d-7380-4ad4-83c5-783d87789929" (UID: "60ef1b3d-7380-4ad4-83c5-783d87789929"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:15:04 crc kubenswrapper[5002]: I1209 11:15:04.682195 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60ef1b3d-7380-4ad4-83c5-783d87789929-kube-api-access-jz8h5" (OuterVolumeSpecName: "kube-api-access-jz8h5") pod "60ef1b3d-7380-4ad4-83c5-783d87789929" (UID: "60ef1b3d-7380-4ad4-83c5-783d87789929"). InnerVolumeSpecName "kube-api-access-jz8h5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:15:04 crc kubenswrapper[5002]: I1209 11:15:04.682687 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60ef1b3d-7380-4ad4-83c5-783d87789929-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "60ef1b3d-7380-4ad4-83c5-783d87789929" (UID: "60ef1b3d-7380-4ad4-83c5-783d87789929"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:15:04 crc kubenswrapper[5002]: I1209 11:15:04.777779 5002 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/60ef1b3d-7380-4ad4-83c5-783d87789929-config-volume\") on node \"crc\" DevicePath \"\"" Dec 09 11:15:04 crc kubenswrapper[5002]: I1209 11:15:04.777831 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jz8h5\" (UniqueName: \"kubernetes.io/projected/60ef1b3d-7380-4ad4-83c5-783d87789929-kube-api-access-jz8h5\") on node \"crc\" DevicePath \"\"" Dec 09 11:15:04 crc kubenswrapper[5002]: I1209 11:15:04.777846 5002 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/60ef1b3d-7380-4ad4-83c5-783d87789929-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 09 11:15:05 crc kubenswrapper[5002]: I1209 11:15:05.056085 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421315-pxbj9" event={"ID":"60ef1b3d-7380-4ad4-83c5-783d87789929","Type":"ContainerDied","Data":"ea179dcc114cd346e705064020a802db788422ef5a636f136fe099769c8cc719"} Dec 09 11:15:05 crc kubenswrapper[5002]: I1209 11:15:05.056129 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421315-pxbj9" Dec 09 11:15:05 crc kubenswrapper[5002]: I1209 11:15:05.056136 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea179dcc114cd346e705064020a802db788422ef5a636f136fe099769c8cc719" Dec 09 11:15:05 crc kubenswrapper[5002]: I1209 11:15:05.126384 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421270-m9c9f"] Dec 09 11:15:05 crc kubenswrapper[5002]: I1209 11:15:05.133916 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421270-m9c9f"] Dec 09 11:15:06 crc kubenswrapper[5002]: I1209 11:15:06.070564 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34a2e581-9142-4731-ba35-b89fc3efa4fa" path="/var/lib/kubelet/pods/34a2e581-9142-4731-ba35-b89fc3efa4fa/volumes" Dec 09 11:15:11 crc kubenswrapper[5002]: I1209 11:15:11.641760 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4j5mq"] Dec 09 11:15:11 crc kubenswrapper[5002]: E1209 11:15:11.643173 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60ef1b3d-7380-4ad4-83c5-783d87789929" containerName="collect-profiles" Dec 09 11:15:11 crc kubenswrapper[5002]: I1209 11:15:11.643195 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="60ef1b3d-7380-4ad4-83c5-783d87789929" containerName="collect-profiles" Dec 09 11:15:11 crc kubenswrapper[5002]: I1209 11:15:11.643610 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="60ef1b3d-7380-4ad4-83c5-783d87789929" containerName="collect-profiles" Dec 09 11:15:11 crc kubenswrapper[5002]: I1209 11:15:11.645935 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4j5mq" Dec 09 11:15:11 crc kubenswrapper[5002]: I1209 11:15:11.677177 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4j5mq"] Dec 09 11:15:11 crc kubenswrapper[5002]: I1209 11:15:11.681888 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxlql\" (UniqueName: \"kubernetes.io/projected/a58c7888-f969-41a8-9d04-0402135ce5b3-kube-api-access-pxlql\") pod \"redhat-marketplace-4j5mq\" (UID: \"a58c7888-f969-41a8-9d04-0402135ce5b3\") " pod="openshift-marketplace/redhat-marketplace-4j5mq" Dec 09 11:15:11 crc kubenswrapper[5002]: I1209 11:15:11.681953 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a58c7888-f969-41a8-9d04-0402135ce5b3-utilities\") pod \"redhat-marketplace-4j5mq\" (UID: \"a58c7888-f969-41a8-9d04-0402135ce5b3\") " pod="openshift-marketplace/redhat-marketplace-4j5mq" Dec 09 11:15:11 crc kubenswrapper[5002]: I1209 11:15:11.682059 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a58c7888-f969-41a8-9d04-0402135ce5b3-catalog-content\") pod \"redhat-marketplace-4j5mq\" (UID: \"a58c7888-f969-41a8-9d04-0402135ce5b3\") " pod="openshift-marketplace/redhat-marketplace-4j5mq" Dec 09 11:15:11 crc kubenswrapper[5002]: I1209 11:15:11.783296 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxlql\" (UniqueName: \"kubernetes.io/projected/a58c7888-f969-41a8-9d04-0402135ce5b3-kube-api-access-pxlql\") pod \"redhat-marketplace-4j5mq\" (UID: \"a58c7888-f969-41a8-9d04-0402135ce5b3\") " pod="openshift-marketplace/redhat-marketplace-4j5mq" Dec 09 11:15:11 crc kubenswrapper[5002]: I1209 11:15:11.783349 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a58c7888-f969-41a8-9d04-0402135ce5b3-utilities\") pod \"redhat-marketplace-4j5mq\" (UID: \"a58c7888-f969-41a8-9d04-0402135ce5b3\") " pod="openshift-marketplace/redhat-marketplace-4j5mq" Dec 09 11:15:11 crc kubenswrapper[5002]: I1209 11:15:11.783427 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a58c7888-f969-41a8-9d04-0402135ce5b3-catalog-content\") pod \"redhat-marketplace-4j5mq\" (UID: \"a58c7888-f969-41a8-9d04-0402135ce5b3\") " pod="openshift-marketplace/redhat-marketplace-4j5mq" Dec 09 11:15:11 crc kubenswrapper[5002]: I1209 11:15:11.783859 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a58c7888-f969-41a8-9d04-0402135ce5b3-catalog-content\") pod \"redhat-marketplace-4j5mq\" (UID: \"a58c7888-f969-41a8-9d04-0402135ce5b3\") " pod="openshift-marketplace/redhat-marketplace-4j5mq" Dec 09 11:15:11 crc kubenswrapper[5002]: I1209 11:15:11.784091 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a58c7888-f969-41a8-9d04-0402135ce5b3-utilities\") pod \"redhat-marketplace-4j5mq\" (UID: \"a58c7888-f969-41a8-9d04-0402135ce5b3\") " pod="openshift-marketplace/redhat-marketplace-4j5mq" Dec 09 11:15:11 crc kubenswrapper[5002]: I1209 11:15:11.808957 5002 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-pxlql\" (UniqueName: \"kubernetes.io/projected/a58c7888-f969-41a8-9d04-0402135ce5b3-kube-api-access-pxlql\") pod \"redhat-marketplace-4j5mq\" (UID: \"a58c7888-f969-41a8-9d04-0402135ce5b3\") " pod="openshift-marketplace/redhat-marketplace-4j5mq" Dec 09 11:15:11 crc kubenswrapper[5002]: I1209 11:15:11.974456 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4j5mq" Dec 09 11:15:12 crc kubenswrapper[5002]: I1209 11:15:12.445385 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4j5mq"] Dec 09 11:15:12 crc kubenswrapper[5002]: W1209 11:15:12.452428 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda58c7888_f969_41a8_9d04_0402135ce5b3.slice/crio-cde8a18aca52c3b2a6f862bea8765b21509639ae310b5d147caf298ebe2c3399 WatchSource:0}: Error finding container cde8a18aca52c3b2a6f862bea8765b21509639ae310b5d147caf298ebe2c3399: Status 404 returned error can't find the container with id cde8a18aca52c3b2a6f862bea8765b21509639ae310b5d147caf298ebe2c3399 Dec 09 11:15:13 crc kubenswrapper[5002]: I1209 11:15:13.123799 5002 generic.go:334] "Generic (PLEG): container finished" podID="a58c7888-f969-41a8-9d04-0402135ce5b3" containerID="22fd7bc2d8ea4be6640a05ac85c292c008620110da480f7a2832b60d9eb1f2ea" exitCode=0 Dec 09 11:15:13 crc kubenswrapper[5002]: I1209 11:15:13.123872 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4j5mq" event={"ID":"a58c7888-f969-41a8-9d04-0402135ce5b3","Type":"ContainerDied","Data":"22fd7bc2d8ea4be6640a05ac85c292c008620110da480f7a2832b60d9eb1f2ea"} Dec 09 11:15:13 crc kubenswrapper[5002]: I1209 11:15:13.123912 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4j5mq" event={"ID":"a58c7888-f969-41a8-9d04-0402135ce5b3","Type":"ContainerStarted","Data":"cde8a18aca52c3b2a6f862bea8765b21509639ae310b5d147caf298ebe2c3399"} Dec 09 11:15:13 crc kubenswrapper[5002]: I1209 11:15:13.126355 5002 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 11:15:14 crc kubenswrapper[5002]: I1209 11:15:14.132568 5002 generic.go:334] "Generic (PLEG): container finished" podID="a58c7888-f969-41a8-9d04-0402135ce5b3" containerID="b7095e8ac5e6882906359dcd52cbdf34119f5b331ed76e080642ca81a776bb3f" exitCode=0 Dec 09 11:15:14 crc kubenswrapper[5002]: I1209 11:15:14.132623 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4j5mq" event={"ID":"a58c7888-f969-41a8-9d04-0402135ce5b3","Type":"ContainerDied","Data":"b7095e8ac5e6882906359dcd52cbdf34119f5b331ed76e080642ca81a776bb3f"} Dec 09 11:15:15 crc kubenswrapper[5002]: I1209 11:15:15.144895 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4j5mq" event={"ID":"a58c7888-f969-41a8-9d04-0402135ce5b3","Type":"ContainerStarted","Data":"1116df1fb64c9865094b43156c62ff1581325663af1fca2e15e0fef0bde4986c"} Dec 09 11:15:15 crc kubenswrapper[5002]: I1209 11:15:15.165269 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4j5mq" podStartSLOduration=2.785511518 podStartE2EDuration="4.165249981s" podCreationTimestamp="2025-12-09 11:15:11 +0000 UTC" firstStartedPulling="2025-12-09 11:15:13.126033639 +0000 UTC m=+4445.518084730" 
lastFinishedPulling="2025-12-09 11:15:14.505772112 +0000 UTC m=+4446.897823193" observedRunningTime="2025-12-09 11:15:15.161012248 +0000 UTC m=+4447.553063329" watchObservedRunningTime="2025-12-09 11:15:15.165249981 +0000 UTC m=+4447.557301062" Dec 09 11:15:19 crc kubenswrapper[5002]: I1209 11:15:19.016226 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pwdgl"] Dec 09 11:15:19 crc kubenswrapper[5002]: I1209 11:15:19.018183 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pwdgl" Dec 09 11:15:19 crc kubenswrapper[5002]: I1209 11:15:19.028963 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pwdgl"] Dec 09 11:15:19 crc kubenswrapper[5002]: I1209 11:15:19.094024 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/070fb24b-b5df-425b-83e6-069e6efd6dd2-catalog-content\") pod \"certified-operators-pwdgl\" (UID: \"070fb24b-b5df-425b-83e6-069e6efd6dd2\") " pod="openshift-marketplace/certified-operators-pwdgl" Dec 09 11:15:19 crc kubenswrapper[5002]: I1209 11:15:19.094126 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49p95\" (UniqueName: \"kubernetes.io/projected/070fb24b-b5df-425b-83e6-069e6efd6dd2-kube-api-access-49p95\") pod \"certified-operators-pwdgl\" (UID: \"070fb24b-b5df-425b-83e6-069e6efd6dd2\") " pod="openshift-marketplace/certified-operators-pwdgl" Dec 09 11:15:19 crc kubenswrapper[5002]: I1209 11:15:19.094231 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/070fb24b-b5df-425b-83e6-069e6efd6dd2-utilities\") pod \"certified-operators-pwdgl\" (UID: \"070fb24b-b5df-425b-83e6-069e6efd6dd2\") " pod="openshift-marketplace/certified-operators-pwdgl" Dec 09 11:15:19 crc kubenswrapper[5002]: I1209 11:15:19.194888 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49p95\" (UniqueName: \"kubernetes.io/projected/070fb24b-b5df-425b-83e6-069e6efd6dd2-kube-api-access-49p95\") pod \"certified-operators-pwdgl\" (UID: \"070fb24b-b5df-425b-83e6-069e6efd6dd2\") " pod="openshift-marketplace/certified-operators-pwdgl" Dec 09 11:15:19 crc kubenswrapper[5002]: I1209 11:15:19.194993 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/070fb24b-b5df-425b-83e6-069e6efd6dd2-utilities\") pod \"certified-operators-pwdgl\" (UID: \"070fb24b-b5df-425b-83e6-069e6efd6dd2\") " pod="openshift-marketplace/certified-operators-pwdgl" Dec 09 11:15:19 crc kubenswrapper[5002]: I1209 11:15:19.195059 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/070fb24b-b5df-425b-83e6-069e6efd6dd2-catalog-content\") pod \"certified-operators-pwdgl\" (UID: \"070fb24b-b5df-425b-83e6-069e6efd6dd2\") " pod="openshift-marketplace/certified-operators-pwdgl" Dec 09 11:15:19 crc kubenswrapper[5002]: I1209 11:15:19.195546 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/070fb24b-b5df-425b-83e6-069e6efd6dd2-catalog-content\") pod \"certified-operators-pwdgl\" (UID: \"070fb24b-b5df-425b-83e6-069e6efd6dd2\") " 
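
The startup-latency record above makes the bookkeeping explicit: `podStartE2EDuration` spans pod creation (11:15:11) to the observed running time (11:15:15.165249981), while `podStartSLOduration` additionally excludes the image-pull window between `firstStartedPulling` and `lastFinishedPulling` (for the earlier collect-profiles pod both pull timestamps were the zero time, so the two durations were identical). Using the monotonic offsets (`m=+…`), the redhat-marketplace-4j5mq numbers reproduce exactly:

```go
// slo_math.go - re-derive podStartSLOduration from the logged offsets.
package main

import "fmt"

func main() {
	const (
		firstStartedPulling = 4445.518084730 // m=+ offset, seconds
		lastFinishedPulling = 4446.897823193 // m=+ offset, seconds
		podStartE2E         = 4.165249981    // podStartE2EDuration, seconds
	)
	pullWindow := lastFinishedPulling - firstStartedPulling
	fmt.Printf("image-pull window:   %.9fs\n", pullWindow)             // 1.379738463s
	fmt.Printf("podStartSLOduration: %.9fs\n", podStartE2E-pullWindow) // 2.785511518s, as logged
}
```
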
pod="openshift-marketplace/certified-operators-pwdgl" Dec 09 11:15:19 crc kubenswrapper[5002]: I1209 11:15:19.195625 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/070fb24b-b5df-425b-83e6-069e6efd6dd2-utilities\") pod \"certified-operators-pwdgl\" (UID: \"070fb24b-b5df-425b-83e6-069e6efd6dd2\") " pod="openshift-marketplace/certified-operators-pwdgl" Dec 09 11:15:19 crc kubenswrapper[5002]: I1209 11:15:19.214839 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49p95\" (UniqueName: \"kubernetes.io/projected/070fb24b-b5df-425b-83e6-069e6efd6dd2-kube-api-access-49p95\") pod \"certified-operators-pwdgl\" (UID: \"070fb24b-b5df-425b-83e6-069e6efd6dd2\") " pod="openshift-marketplace/certified-operators-pwdgl" Dec 09 11:15:19 crc kubenswrapper[5002]: I1209 11:15:19.339873 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pwdgl" Dec 09 11:15:19 crc kubenswrapper[5002]: I1209 11:15:19.866355 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pwdgl"] Dec 09 11:15:20 crc kubenswrapper[5002]: I1209 11:15:20.187735 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pwdgl" event={"ID":"070fb24b-b5df-425b-83e6-069e6efd6dd2","Type":"ContainerStarted","Data":"d55355b8b6c73dea36d414a09fd04bdcf5f1f4edb2fcd4402ebe17bb4a91467d"} Dec 09 11:15:21 crc kubenswrapper[5002]: I1209 11:15:21.208716 5002 generic.go:334] "Generic (PLEG): container finished" podID="070fb24b-b5df-425b-83e6-069e6efd6dd2" containerID="4e8cea53356439d39faaca27ff8cb247f8c5a53ad2f2d103b0f7169d0de36b92" exitCode=0 Dec 09 11:15:21 crc kubenswrapper[5002]: I1209 11:15:21.208993 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pwdgl" event={"ID":"070fb24b-b5df-425b-83e6-069e6efd6dd2","Type":"ContainerDied","Data":"4e8cea53356439d39faaca27ff8cb247f8c5a53ad2f2d103b0f7169d0de36b92"} Dec 09 11:15:21 crc kubenswrapper[5002]: I1209 11:15:21.974587 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4j5mq" Dec 09 11:15:21 crc kubenswrapper[5002]: I1209 11:15:21.974655 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4j5mq" Dec 09 11:15:22 crc kubenswrapper[5002]: I1209 11:15:22.025178 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4j5mq" Dec 09 11:15:22 crc kubenswrapper[5002]: I1209 11:15:22.250332 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pwdgl" event={"ID":"070fb24b-b5df-425b-83e6-069e6efd6dd2","Type":"ContainerStarted","Data":"18c60aeb8e75296e71c74a9f1ddbc334edb4bb13578db2139cea99bc3b2c44a8"} Dec 09 11:15:22 crc kubenswrapper[5002]: I1209 11:15:22.460870 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4j5mq" Dec 09 11:15:23 crc kubenswrapper[5002]: I1209 11:15:23.259730 5002 generic.go:334] "Generic (PLEG): container finished" podID="070fb24b-b5df-425b-83e6-069e6efd6dd2" containerID="18c60aeb8e75296e71c74a9f1ddbc334edb4bb13578db2139cea99bc3b2c44a8" exitCode=0 Dec 09 11:15:23 crc kubenswrapper[5002]: I1209 11:15:23.259840 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-pwdgl" event={"ID":"070fb24b-b5df-425b-83e6-069e6efd6dd2","Type":"ContainerDied","Data":"18c60aeb8e75296e71c74a9f1ddbc334edb4bb13578db2139cea99bc3b2c44a8"} Dec 09 11:15:24 crc kubenswrapper[5002]: I1209 11:15:24.270053 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pwdgl" event={"ID":"070fb24b-b5df-425b-83e6-069e6efd6dd2","Type":"ContainerStarted","Data":"2dc5a6fdc0a6d4b574f5e22b903476e4b4a67e49856db16c52d08e07a51be307"} Dec 09 11:15:24 crc kubenswrapper[5002]: I1209 11:15:24.288663 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pwdgl" podStartSLOduration=3.74061769 podStartE2EDuration="6.288643381s" podCreationTimestamp="2025-12-09 11:15:18 +0000 UTC" firstStartedPulling="2025-12-09 11:15:21.21046585 +0000 UTC m=+4453.602516931" lastFinishedPulling="2025-12-09 11:15:23.758491541 +0000 UTC m=+4456.150542622" observedRunningTime="2025-12-09 11:15:24.287751307 +0000 UTC m=+4456.679802408" watchObservedRunningTime="2025-12-09 11:15:24.288643381 +0000 UTC m=+4456.680694472" Dec 09 11:15:24 crc kubenswrapper[5002]: I1209 11:15:24.410108 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4j5mq"] Dec 09 11:15:24 crc kubenswrapper[5002]: I1209 11:15:24.410378 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4j5mq" podUID="a58c7888-f969-41a8-9d04-0402135ce5b3" containerName="registry-server" containerID="cri-o://1116df1fb64c9865094b43156c62ff1581325663af1fca2e15e0fef0bde4986c" gracePeriod=2 Dec 09 11:15:24 crc kubenswrapper[5002]: I1209 11:15:24.796366 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4j5mq" Dec 09 11:15:24 crc kubenswrapper[5002]: I1209 11:15:24.975201 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxlql\" (UniqueName: \"kubernetes.io/projected/a58c7888-f969-41a8-9d04-0402135ce5b3-kube-api-access-pxlql\") pod \"a58c7888-f969-41a8-9d04-0402135ce5b3\" (UID: \"a58c7888-f969-41a8-9d04-0402135ce5b3\") " Dec 09 11:15:24 crc kubenswrapper[5002]: I1209 11:15:24.975352 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a58c7888-f969-41a8-9d04-0402135ce5b3-utilities\") pod \"a58c7888-f969-41a8-9d04-0402135ce5b3\" (UID: \"a58c7888-f969-41a8-9d04-0402135ce5b3\") " Dec 09 11:15:24 crc kubenswrapper[5002]: I1209 11:15:24.976246 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a58c7888-f969-41a8-9d04-0402135ce5b3-utilities" (OuterVolumeSpecName: "utilities") pod "a58c7888-f969-41a8-9d04-0402135ce5b3" (UID: "a58c7888-f969-41a8-9d04-0402135ce5b3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:15:24 crc kubenswrapper[5002]: I1209 11:15:24.976352 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a58c7888-f969-41a8-9d04-0402135ce5b3-catalog-content\") pod \"a58c7888-f969-41a8-9d04-0402135ce5b3\" (UID: \"a58c7888-f969-41a8-9d04-0402135ce5b3\") " Dec 09 11:15:24 crc kubenswrapper[5002]: I1209 11:15:24.977659 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a58c7888-f969-41a8-9d04-0402135ce5b3-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 11:15:24 crc kubenswrapper[5002]: I1209 11:15:24.980524 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a58c7888-f969-41a8-9d04-0402135ce5b3-kube-api-access-pxlql" (OuterVolumeSpecName: "kube-api-access-pxlql") pod "a58c7888-f969-41a8-9d04-0402135ce5b3" (UID: "a58c7888-f969-41a8-9d04-0402135ce5b3"). InnerVolumeSpecName "kube-api-access-pxlql". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:15:25 crc kubenswrapper[5002]: I1209 11:15:25.002383 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a58c7888-f969-41a8-9d04-0402135ce5b3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a58c7888-f969-41a8-9d04-0402135ce5b3" (UID: "a58c7888-f969-41a8-9d04-0402135ce5b3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:15:25 crc kubenswrapper[5002]: I1209 11:15:25.078976 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxlql\" (UniqueName: \"kubernetes.io/projected/a58c7888-f969-41a8-9d04-0402135ce5b3-kube-api-access-pxlql\") on node \"crc\" DevicePath \"\"" Dec 09 11:15:25 crc kubenswrapper[5002]: I1209 11:15:25.079019 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a58c7888-f969-41a8-9d04-0402135ce5b3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 11:15:25 crc kubenswrapper[5002]: I1209 11:15:25.278099 5002 generic.go:334] "Generic (PLEG): container finished" podID="a58c7888-f969-41a8-9d04-0402135ce5b3" containerID="1116df1fb64c9865094b43156c62ff1581325663af1fca2e15e0fef0bde4986c" exitCode=0 Dec 09 11:15:25 crc kubenswrapper[5002]: I1209 11:15:25.278187 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4j5mq" Dec 09 11:15:25 crc kubenswrapper[5002]: I1209 11:15:25.278214 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4j5mq" event={"ID":"a58c7888-f969-41a8-9d04-0402135ce5b3","Type":"ContainerDied","Data":"1116df1fb64c9865094b43156c62ff1581325663af1fca2e15e0fef0bde4986c"} Dec 09 11:15:25 crc kubenswrapper[5002]: I1209 11:15:25.278307 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4j5mq" event={"ID":"a58c7888-f969-41a8-9d04-0402135ce5b3","Type":"ContainerDied","Data":"cde8a18aca52c3b2a6f862bea8765b21509639ae310b5d147caf298ebe2c3399"} Dec 09 11:15:25 crc kubenswrapper[5002]: I1209 11:15:25.278337 5002 scope.go:117] "RemoveContainer" containerID="1116df1fb64c9865094b43156c62ff1581325663af1fca2e15e0fef0bde4986c" Dec 09 11:15:25 crc kubenswrapper[5002]: I1209 11:15:25.297777 5002 scope.go:117] "RemoveContainer" containerID="b7095e8ac5e6882906359dcd52cbdf34119f5b331ed76e080642ca81a776bb3f" Dec 09 11:15:25 crc kubenswrapper[5002]: I1209 11:15:25.308129 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4j5mq"] Dec 09 11:15:25 crc kubenswrapper[5002]: I1209 11:15:25.314037 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4j5mq"] Dec 09 11:15:25 crc kubenswrapper[5002]: I1209 11:15:25.337377 5002 scope.go:117] "RemoveContainer" containerID="22fd7bc2d8ea4be6640a05ac85c292c008620110da480f7a2832b60d9eb1f2ea" Dec 09 11:15:25 crc kubenswrapper[5002]: I1209 11:15:25.352247 5002 scope.go:117] "RemoveContainer" containerID="1116df1fb64c9865094b43156c62ff1581325663af1fca2e15e0fef0bde4986c" Dec 09 11:15:25 crc kubenswrapper[5002]: E1209 11:15:25.352619 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1116df1fb64c9865094b43156c62ff1581325663af1fca2e15e0fef0bde4986c\": container with ID starting with 1116df1fb64c9865094b43156c62ff1581325663af1fca2e15e0fef0bde4986c not found: ID does not exist" containerID="1116df1fb64c9865094b43156c62ff1581325663af1fca2e15e0fef0bde4986c" Dec 09 11:15:25 crc kubenswrapper[5002]: I1209 11:15:25.352671 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1116df1fb64c9865094b43156c62ff1581325663af1fca2e15e0fef0bde4986c"} err="failed to get container status \"1116df1fb64c9865094b43156c62ff1581325663af1fca2e15e0fef0bde4986c\": rpc error: code = NotFound desc = could not find container \"1116df1fb64c9865094b43156c62ff1581325663af1fca2e15e0fef0bde4986c\": container with ID starting with 1116df1fb64c9865094b43156c62ff1581325663af1fca2e15e0fef0bde4986c not found: ID does not exist" Dec 09 11:15:25 crc kubenswrapper[5002]: I1209 11:15:25.352697 5002 scope.go:117] "RemoveContainer" containerID="b7095e8ac5e6882906359dcd52cbdf34119f5b331ed76e080642ca81a776bb3f" Dec 09 11:15:25 crc kubenswrapper[5002]: E1209 11:15:25.353085 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7095e8ac5e6882906359dcd52cbdf34119f5b331ed76e080642ca81a776bb3f\": container with ID starting with b7095e8ac5e6882906359dcd52cbdf34119f5b331ed76e080642ca81a776bb3f not found: ID does not exist" containerID="b7095e8ac5e6882906359dcd52cbdf34119f5b331ed76e080642ca81a776bb3f" Dec 09 11:15:25 crc kubenswrapper[5002]: I1209 11:15:25.353115 5002 
Dec 09 11:15:25 crc kubenswrapper[5002]: I1209 11:15:25.353115 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7095e8ac5e6882906359dcd52cbdf34119f5b331ed76e080642ca81a776bb3f"} err="failed to get container status \"b7095e8ac5e6882906359dcd52cbdf34119f5b331ed76e080642ca81a776bb3f\": rpc error: code = NotFound desc = could not find container \"b7095e8ac5e6882906359dcd52cbdf34119f5b331ed76e080642ca81a776bb3f\": container with ID starting with b7095e8ac5e6882906359dcd52cbdf34119f5b331ed76e080642ca81a776bb3f not found: ID does not exist"
Dec 09 11:15:25 crc kubenswrapper[5002]: I1209 11:15:25.353137 5002 scope.go:117] "RemoveContainer" containerID="22fd7bc2d8ea4be6640a05ac85c292c008620110da480f7a2832b60d9eb1f2ea"
Dec 09 11:15:25 crc kubenswrapper[5002]: E1209 11:15:25.353384 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22fd7bc2d8ea4be6640a05ac85c292c008620110da480f7a2832b60d9eb1f2ea\": container with ID starting with 22fd7bc2d8ea4be6640a05ac85c292c008620110da480f7a2832b60d9eb1f2ea not found: ID does not exist" containerID="22fd7bc2d8ea4be6640a05ac85c292c008620110da480f7a2832b60d9eb1f2ea"
Dec 09 11:15:25 crc kubenswrapper[5002]: I1209 11:15:25.353419 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22fd7bc2d8ea4be6640a05ac85c292c008620110da480f7a2832b60d9eb1f2ea"} err="failed to get container status \"22fd7bc2d8ea4be6640a05ac85c292c008620110da480f7a2832b60d9eb1f2ea\": rpc error: code = NotFound desc = could not find container \"22fd7bc2d8ea4be6640a05ac85c292c008620110da480f7a2832b60d9eb1f2ea\": container with ID starting with 22fd7bc2d8ea4be6640a05ac85c292c008620110da480f7a2832b60d9eb1f2ea not found: ID does not exist"
Dec 09 11:15:26 crc kubenswrapper[5002]: I1209 11:15:26.069724 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a58c7888-f969-41a8-9d04-0402135ce5b3" path="/var/lib/kubelet/pods/a58c7888-f969-41a8-9d04-0402135ce5b3/volumes"
Dec 09 11:15:29 crc kubenswrapper[5002]: I1209 11:15:29.341051 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pwdgl"
Dec 09 11:15:29 crc kubenswrapper[5002]: I1209 11:15:29.341442 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pwdgl"
Dec 09 11:15:29 crc kubenswrapper[5002]: I1209 11:15:29.385049 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pwdgl"
Dec 09 11:15:30 crc kubenswrapper[5002]: I1209 11:15:30.369032 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pwdgl"
Dec 09 11:15:31 crc kubenswrapper[5002]: I1209 11:15:31.821245 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pwdgl"]
Dec 09 11:15:32 crc kubenswrapper[5002]: I1209 11:15:32.333198 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pwdgl" podUID="070fb24b-b5df-425b-83e6-069e6efd6dd2" containerName="registry-server" containerID="cri-o://2dc5a6fdc0a6d4b574f5e22b903476e4b4a67e49856db16c52d08e07a51be307" gracePeriod=2
Dec 09 11:15:34 crc kubenswrapper[5002]: I1209 11:15:34.349209 5002 generic.go:334] "Generic (PLEG): container finished" podID="070fb24b-b5df-425b-83e6-069e6efd6dd2"
containerID="2dc5a6fdc0a6d4b574f5e22b903476e4b4a67e49856db16c52d08e07a51be307" exitCode=0 Dec 09 11:15:34 crc kubenswrapper[5002]: I1209 11:15:34.349256 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pwdgl" event={"ID":"070fb24b-b5df-425b-83e6-069e6efd6dd2","Type":"ContainerDied","Data":"2dc5a6fdc0a6d4b574f5e22b903476e4b4a67e49856db16c52d08e07a51be307"} Dec 09 11:15:34 crc kubenswrapper[5002]: I1209 11:15:34.740741 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pwdgl" Dec 09 11:15:34 crc kubenswrapper[5002]: I1209 11:15:34.816753 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/070fb24b-b5df-425b-83e6-069e6efd6dd2-catalog-content\") pod \"070fb24b-b5df-425b-83e6-069e6efd6dd2\" (UID: \"070fb24b-b5df-425b-83e6-069e6efd6dd2\") " Dec 09 11:15:34 crc kubenswrapper[5002]: I1209 11:15:34.817971 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/070fb24b-b5df-425b-83e6-069e6efd6dd2-utilities\") pod \"070fb24b-b5df-425b-83e6-069e6efd6dd2\" (UID: \"070fb24b-b5df-425b-83e6-069e6efd6dd2\") " Dec 09 11:15:34 crc kubenswrapper[5002]: I1209 11:15:34.818017 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49p95\" (UniqueName: \"kubernetes.io/projected/070fb24b-b5df-425b-83e6-069e6efd6dd2-kube-api-access-49p95\") pod \"070fb24b-b5df-425b-83e6-069e6efd6dd2\" (UID: \"070fb24b-b5df-425b-83e6-069e6efd6dd2\") " Dec 09 11:15:34 crc kubenswrapper[5002]: I1209 11:15:34.818910 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/070fb24b-b5df-425b-83e6-069e6efd6dd2-utilities" (OuterVolumeSpecName: "utilities") pod "070fb24b-b5df-425b-83e6-069e6efd6dd2" (UID: "070fb24b-b5df-425b-83e6-069e6efd6dd2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:15:34 crc kubenswrapper[5002]: I1209 11:15:34.823370 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/070fb24b-b5df-425b-83e6-069e6efd6dd2-kube-api-access-49p95" (OuterVolumeSpecName: "kube-api-access-49p95") pod "070fb24b-b5df-425b-83e6-069e6efd6dd2" (UID: "070fb24b-b5df-425b-83e6-069e6efd6dd2"). InnerVolumeSpecName "kube-api-access-49p95". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:15:34 crc kubenswrapper[5002]: I1209 11:15:34.863606 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/070fb24b-b5df-425b-83e6-069e6efd6dd2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "070fb24b-b5df-425b-83e6-069e6efd6dd2" (UID: "070fb24b-b5df-425b-83e6-069e6efd6dd2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:15:34 crc kubenswrapper[5002]: I1209 11:15:34.918828 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/070fb24b-b5df-425b-83e6-069e6efd6dd2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 11:15:34 crc kubenswrapper[5002]: I1209 11:15:34.918863 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/070fb24b-b5df-425b-83e6-069e6efd6dd2-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 11:15:34 crc kubenswrapper[5002]: I1209 11:15:34.918872 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49p95\" (UniqueName: \"kubernetes.io/projected/070fb24b-b5df-425b-83e6-069e6efd6dd2-kube-api-access-49p95\") on node \"crc\" DevicePath \"\"" Dec 09 11:15:35 crc kubenswrapper[5002]: I1209 11:15:35.358634 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pwdgl" event={"ID":"070fb24b-b5df-425b-83e6-069e6efd6dd2","Type":"ContainerDied","Data":"d55355b8b6c73dea36d414a09fd04bdcf5f1f4edb2fcd4402ebe17bb4a91467d"} Dec 09 11:15:35 crc kubenswrapper[5002]: I1209 11:15:35.358687 5002 scope.go:117] "RemoveContainer" containerID="2dc5a6fdc0a6d4b574f5e22b903476e4b4a67e49856db16c52d08e07a51be307" Dec 09 11:15:35 crc kubenswrapper[5002]: I1209 11:15:35.359411 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pwdgl" Dec 09 11:15:35 crc kubenswrapper[5002]: I1209 11:15:35.377890 5002 scope.go:117] "RemoveContainer" containerID="18c60aeb8e75296e71c74a9f1ddbc334edb4bb13578db2139cea99bc3b2c44a8" Dec 09 11:15:35 crc kubenswrapper[5002]: I1209 11:15:35.393655 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pwdgl"] Dec 09 11:15:35 crc kubenswrapper[5002]: I1209 11:15:35.399430 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pwdgl"] Dec 09 11:15:35 crc kubenswrapper[5002]: I1209 11:15:35.414647 5002 scope.go:117] "RemoveContainer" containerID="4e8cea53356439d39faaca27ff8cb247f8c5a53ad2f2d103b0f7169d0de36b92" Dec 09 11:15:36 crc kubenswrapper[5002]: I1209 11:15:36.068885 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="070fb24b-b5df-425b-83e6-069e6efd6dd2" path="/var/lib/kubelet/pods/070fb24b-b5df-425b-83e6-069e6efd6dd2/volumes" Dec 09 11:15:45 crc kubenswrapper[5002]: I1209 11:15:45.288160 5002 scope.go:117] "RemoveContainer" containerID="aa6fcbf12e04623fa63ea94c7e083da4f095f361fe544fe064077ad55174d680" Dec 09 11:16:07 crc kubenswrapper[5002]: I1209 11:16:07.964714 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 11:16:07 crc kubenswrapper[5002]: I1209 11:16:07.965393 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 11:16:14 crc kubenswrapper[5002]: I1209 11:16:14.705675 5002 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["crc-storage/crc-storage-crc-4qffn"] Dec 09 11:16:14 crc kubenswrapper[5002]: I1209 11:16:14.712569 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-4qffn"] Dec 09 11:16:14 crc kubenswrapper[5002]: I1209 11:16:14.829268 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-xhlfh"] Dec 09 11:16:14 crc kubenswrapper[5002]: E1209 11:16:14.829593 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="070fb24b-b5df-425b-83e6-069e6efd6dd2" containerName="extract-content" Dec 09 11:16:14 crc kubenswrapper[5002]: I1209 11:16:14.829615 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="070fb24b-b5df-425b-83e6-069e6efd6dd2" containerName="extract-content" Dec 09 11:16:14 crc kubenswrapper[5002]: E1209 11:16:14.829627 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="070fb24b-b5df-425b-83e6-069e6efd6dd2" containerName="extract-utilities" Dec 09 11:16:14 crc kubenswrapper[5002]: I1209 11:16:14.829634 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="070fb24b-b5df-425b-83e6-069e6efd6dd2" containerName="extract-utilities" Dec 09 11:16:14 crc kubenswrapper[5002]: E1209 11:16:14.829643 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="070fb24b-b5df-425b-83e6-069e6efd6dd2" containerName="registry-server" Dec 09 11:16:14 crc kubenswrapper[5002]: I1209 11:16:14.829650 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="070fb24b-b5df-425b-83e6-069e6efd6dd2" containerName="registry-server" Dec 09 11:16:14 crc kubenswrapper[5002]: E1209 11:16:14.829668 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a58c7888-f969-41a8-9d04-0402135ce5b3" containerName="extract-content" Dec 09 11:16:14 crc kubenswrapper[5002]: I1209 11:16:14.829674 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="a58c7888-f969-41a8-9d04-0402135ce5b3" containerName="extract-content" Dec 09 11:16:14 crc kubenswrapper[5002]: E1209 11:16:14.829688 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a58c7888-f969-41a8-9d04-0402135ce5b3" containerName="extract-utilities" Dec 09 11:16:14 crc kubenswrapper[5002]: I1209 11:16:14.829698 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="a58c7888-f969-41a8-9d04-0402135ce5b3" containerName="extract-utilities" Dec 09 11:16:14 crc kubenswrapper[5002]: E1209 11:16:14.829718 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a58c7888-f969-41a8-9d04-0402135ce5b3" containerName="registry-server" Dec 09 11:16:14 crc kubenswrapper[5002]: I1209 11:16:14.829727 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="a58c7888-f969-41a8-9d04-0402135ce5b3" containerName="registry-server" Dec 09 11:16:14 crc kubenswrapper[5002]: I1209 11:16:14.829877 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="070fb24b-b5df-425b-83e6-069e6efd6dd2" containerName="registry-server" Dec 09 11:16:14 crc kubenswrapper[5002]: I1209 11:16:14.829896 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="a58c7888-f969-41a8-9d04-0402135ce5b3" containerName="registry-server" Dec 09 11:16:14 crc kubenswrapper[5002]: I1209 11:16:14.830396 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-xhlfh" Dec 09 11:16:14 crc kubenswrapper[5002]: I1209 11:16:14.833041 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Dec 09 11:16:14 crc kubenswrapper[5002]: I1209 11:16:14.833116 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Dec 09 11:16:14 crc kubenswrapper[5002]: I1209 11:16:14.834493 5002 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-vhsld" Dec 09 11:16:14 crc kubenswrapper[5002]: I1209 11:16:14.843695 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Dec 09 11:16:14 crc kubenswrapper[5002]: I1209 11:16:14.865312 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-xhlfh"] Dec 09 11:16:14 crc kubenswrapper[5002]: I1209 11:16:14.976656 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fac2eae5-7bfc-4fa8-9fd0-329274deead7-node-mnt\") pod \"crc-storage-crc-xhlfh\" (UID: \"fac2eae5-7bfc-4fa8-9fd0-329274deead7\") " pod="crc-storage/crc-storage-crc-xhlfh" Dec 09 11:16:14 crc kubenswrapper[5002]: I1209 11:16:14.976724 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzx6b\" (UniqueName: \"kubernetes.io/projected/fac2eae5-7bfc-4fa8-9fd0-329274deead7-kube-api-access-wzx6b\") pod \"crc-storage-crc-xhlfh\" (UID: \"fac2eae5-7bfc-4fa8-9fd0-329274deead7\") " pod="crc-storage/crc-storage-crc-xhlfh" Dec 09 11:16:14 crc kubenswrapper[5002]: I1209 11:16:14.976866 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fac2eae5-7bfc-4fa8-9fd0-329274deead7-crc-storage\") pod \"crc-storage-crc-xhlfh\" (UID: \"fac2eae5-7bfc-4fa8-9fd0-329274deead7\") " pod="crc-storage/crc-storage-crc-xhlfh" Dec 09 11:16:15 crc kubenswrapper[5002]: I1209 11:16:15.078457 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fac2eae5-7bfc-4fa8-9fd0-329274deead7-crc-storage\") pod \"crc-storage-crc-xhlfh\" (UID: \"fac2eae5-7bfc-4fa8-9fd0-329274deead7\") " pod="crc-storage/crc-storage-crc-xhlfh" Dec 09 11:16:15 crc kubenswrapper[5002]: I1209 11:16:15.078530 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fac2eae5-7bfc-4fa8-9fd0-329274deead7-node-mnt\") pod \"crc-storage-crc-xhlfh\" (UID: \"fac2eae5-7bfc-4fa8-9fd0-329274deead7\") " pod="crc-storage/crc-storage-crc-xhlfh" Dec 09 11:16:15 crc kubenswrapper[5002]: I1209 11:16:15.078560 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzx6b\" (UniqueName: \"kubernetes.io/projected/fac2eae5-7bfc-4fa8-9fd0-329274deead7-kube-api-access-wzx6b\") pod \"crc-storage-crc-xhlfh\" (UID: \"fac2eae5-7bfc-4fa8-9fd0-329274deead7\") " pod="crc-storage/crc-storage-crc-xhlfh" Dec 09 11:16:15 crc kubenswrapper[5002]: I1209 11:16:15.079077 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fac2eae5-7bfc-4fa8-9fd0-329274deead7-node-mnt\") pod \"crc-storage-crc-xhlfh\" (UID: \"fac2eae5-7bfc-4fa8-9fd0-329274deead7\") " 
pod="crc-storage/crc-storage-crc-xhlfh" Dec 09 11:16:15 crc kubenswrapper[5002]: I1209 11:16:15.079623 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fac2eae5-7bfc-4fa8-9fd0-329274deead7-crc-storage\") pod \"crc-storage-crc-xhlfh\" (UID: \"fac2eae5-7bfc-4fa8-9fd0-329274deead7\") " pod="crc-storage/crc-storage-crc-xhlfh" Dec 09 11:16:15 crc kubenswrapper[5002]: I1209 11:16:15.097805 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzx6b\" (UniqueName: \"kubernetes.io/projected/fac2eae5-7bfc-4fa8-9fd0-329274deead7-kube-api-access-wzx6b\") pod \"crc-storage-crc-xhlfh\" (UID: \"fac2eae5-7bfc-4fa8-9fd0-329274deead7\") " pod="crc-storage/crc-storage-crc-xhlfh" Dec 09 11:16:15 crc kubenswrapper[5002]: I1209 11:16:15.158292 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-xhlfh" Dec 09 11:16:15 crc kubenswrapper[5002]: I1209 11:16:15.380097 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-xhlfh"] Dec 09 11:16:15 crc kubenswrapper[5002]: I1209 11:16:15.655455 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-xhlfh" event={"ID":"fac2eae5-7bfc-4fa8-9fd0-329274deead7","Type":"ContainerStarted","Data":"2d1ea26dc208d02e328e98666d6f1e76b486aa8d795b540e69d69cdb8c3ed360"} Dec 09 11:16:16 crc kubenswrapper[5002]: I1209 11:16:16.097213 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b0ffb30-1518-4d4c-ac8b-b619bd493a5b" path="/var/lib/kubelet/pods/7b0ffb30-1518-4d4c-ac8b-b619bd493a5b/volumes" Dec 09 11:16:17 crc kubenswrapper[5002]: I1209 11:16:17.669189 5002 generic.go:334] "Generic (PLEG): container finished" podID="fac2eae5-7bfc-4fa8-9fd0-329274deead7" containerID="914e993c4badace72ab6cb15e7954efe7851adf6447e9d9fdde84d6738f9a6dc" exitCode=0 Dec 09 11:16:17 crc kubenswrapper[5002]: I1209 11:16:17.669223 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-xhlfh" event={"ID":"fac2eae5-7bfc-4fa8-9fd0-329274deead7","Type":"ContainerDied","Data":"914e993c4badace72ab6cb15e7954efe7851adf6447e9d9fdde84d6738f9a6dc"} Dec 09 11:16:18 crc kubenswrapper[5002]: I1209 11:16:18.966311 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-xhlfh" Dec 09 11:16:19 crc kubenswrapper[5002]: I1209 11:16:19.133680 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fac2eae5-7bfc-4fa8-9fd0-329274deead7-node-mnt\") pod \"fac2eae5-7bfc-4fa8-9fd0-329274deead7\" (UID: \"fac2eae5-7bfc-4fa8-9fd0-329274deead7\") " Dec 09 11:16:19 crc kubenswrapper[5002]: I1209 11:16:19.133829 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fac2eae5-7bfc-4fa8-9fd0-329274deead7-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "fac2eae5-7bfc-4fa8-9fd0-329274deead7" (UID: "fac2eae5-7bfc-4fa8-9fd0-329274deead7"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:16:19 crc kubenswrapper[5002]: I1209 11:16:19.134085 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fac2eae5-7bfc-4fa8-9fd0-329274deead7-crc-storage\") pod \"fac2eae5-7bfc-4fa8-9fd0-329274deead7\" (UID: \"fac2eae5-7bfc-4fa8-9fd0-329274deead7\") " Dec 09 11:16:19 crc kubenswrapper[5002]: I1209 11:16:19.134163 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzx6b\" (UniqueName: \"kubernetes.io/projected/fac2eae5-7bfc-4fa8-9fd0-329274deead7-kube-api-access-wzx6b\") pod \"fac2eae5-7bfc-4fa8-9fd0-329274deead7\" (UID: \"fac2eae5-7bfc-4fa8-9fd0-329274deead7\") " Dec 09 11:16:19 crc kubenswrapper[5002]: I1209 11:16:19.134485 5002 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fac2eae5-7bfc-4fa8-9fd0-329274deead7-node-mnt\") on node \"crc\" DevicePath \"\"" Dec 09 11:16:19 crc kubenswrapper[5002]: I1209 11:16:19.143744 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fac2eae5-7bfc-4fa8-9fd0-329274deead7-kube-api-access-wzx6b" (OuterVolumeSpecName: "kube-api-access-wzx6b") pod "fac2eae5-7bfc-4fa8-9fd0-329274deead7" (UID: "fac2eae5-7bfc-4fa8-9fd0-329274deead7"). InnerVolumeSpecName "kube-api-access-wzx6b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:16:19 crc kubenswrapper[5002]: I1209 11:16:19.156353 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fac2eae5-7bfc-4fa8-9fd0-329274deead7-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "fac2eae5-7bfc-4fa8-9fd0-329274deead7" (UID: "fac2eae5-7bfc-4fa8-9fd0-329274deead7"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:16:19 crc kubenswrapper[5002]: I1209 11:16:19.235572 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzx6b\" (UniqueName: \"kubernetes.io/projected/fac2eae5-7bfc-4fa8-9fd0-329274deead7-kube-api-access-wzx6b\") on node \"crc\" DevicePath \"\"" Dec 09 11:16:19 crc kubenswrapper[5002]: I1209 11:16:19.235612 5002 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fac2eae5-7bfc-4fa8-9fd0-329274deead7-crc-storage\") on node \"crc\" DevicePath \"\"" Dec 09 11:16:19 crc kubenswrapper[5002]: I1209 11:16:19.683907 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-xhlfh" event={"ID":"fac2eae5-7bfc-4fa8-9fd0-329274deead7","Type":"ContainerDied","Data":"2d1ea26dc208d02e328e98666d6f1e76b486aa8d795b540e69d69cdb8c3ed360"} Dec 09 11:16:19 crc kubenswrapper[5002]: I1209 11:16:19.683951 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d1ea26dc208d02e328e98666d6f1e76b486aa8d795b540e69d69cdb8c3ed360" Dec 09 11:16:19 crc kubenswrapper[5002]: I1209 11:16:19.684007 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-xhlfh" Dec 09 11:16:21 crc kubenswrapper[5002]: I1209 11:16:21.199209 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-xhlfh"] Dec 09 11:16:21 crc kubenswrapper[5002]: I1209 11:16:21.204426 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-xhlfh"] Dec 09 11:16:21 crc kubenswrapper[5002]: I1209 11:16:21.314146 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-ld7mr"] Dec 09 11:16:21 crc kubenswrapper[5002]: E1209 11:16:21.314566 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fac2eae5-7bfc-4fa8-9fd0-329274deead7" containerName="storage" Dec 09 11:16:21 crc kubenswrapper[5002]: I1209 11:16:21.314590 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="fac2eae5-7bfc-4fa8-9fd0-329274deead7" containerName="storage" Dec 09 11:16:21 crc kubenswrapper[5002]: I1209 11:16:21.314785 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="fac2eae5-7bfc-4fa8-9fd0-329274deead7" containerName="storage" Dec 09 11:16:21 crc kubenswrapper[5002]: I1209 11:16:21.315429 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ld7mr" Dec 09 11:16:21 crc kubenswrapper[5002]: I1209 11:16:21.318314 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Dec 09 11:16:21 crc kubenswrapper[5002]: I1209 11:16:21.318317 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Dec 09 11:16:21 crc kubenswrapper[5002]: I1209 11:16:21.318684 5002 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-vhsld" Dec 09 11:16:21 crc kubenswrapper[5002]: I1209 11:16:21.319724 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Dec 09 11:16:21 crc kubenswrapper[5002]: I1209 11:16:21.334035 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-ld7mr"] Dec 09 11:16:21 crc kubenswrapper[5002]: I1209 11:16:21.466441 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/18ae0272-32f6-4a7c-9240-251cf1f79aaf-node-mnt\") pod \"crc-storage-crc-ld7mr\" (UID: \"18ae0272-32f6-4a7c-9240-251cf1f79aaf\") " pod="crc-storage/crc-storage-crc-ld7mr" Dec 09 11:16:21 crc kubenswrapper[5002]: I1209 11:16:21.466510 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/18ae0272-32f6-4a7c-9240-251cf1f79aaf-crc-storage\") pod \"crc-storage-crc-ld7mr\" (UID: \"18ae0272-32f6-4a7c-9240-251cf1f79aaf\") " pod="crc-storage/crc-storage-crc-ld7mr" Dec 09 11:16:21 crc kubenswrapper[5002]: I1209 11:16:21.466551 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfhlg\" (UniqueName: \"kubernetes.io/projected/18ae0272-32f6-4a7c-9240-251cf1f79aaf-kube-api-access-dfhlg\") pod \"crc-storage-crc-ld7mr\" (UID: \"18ae0272-32f6-4a7c-9240-251cf1f79aaf\") " pod="crc-storage/crc-storage-crc-ld7mr" Dec 09 11:16:21 crc kubenswrapper[5002]: I1209 11:16:21.567979 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: 
\"kubernetes.io/host-path/18ae0272-32f6-4a7c-9240-251cf1f79aaf-node-mnt\") pod \"crc-storage-crc-ld7mr\" (UID: \"18ae0272-32f6-4a7c-9240-251cf1f79aaf\") " pod="crc-storage/crc-storage-crc-ld7mr" Dec 09 11:16:21 crc kubenswrapper[5002]: I1209 11:16:21.568050 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/18ae0272-32f6-4a7c-9240-251cf1f79aaf-crc-storage\") pod \"crc-storage-crc-ld7mr\" (UID: \"18ae0272-32f6-4a7c-9240-251cf1f79aaf\") " pod="crc-storage/crc-storage-crc-ld7mr" Dec 09 11:16:21 crc kubenswrapper[5002]: I1209 11:16:21.568080 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfhlg\" (UniqueName: \"kubernetes.io/projected/18ae0272-32f6-4a7c-9240-251cf1f79aaf-kube-api-access-dfhlg\") pod \"crc-storage-crc-ld7mr\" (UID: \"18ae0272-32f6-4a7c-9240-251cf1f79aaf\") " pod="crc-storage/crc-storage-crc-ld7mr" Dec 09 11:16:21 crc kubenswrapper[5002]: I1209 11:16:21.568368 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/18ae0272-32f6-4a7c-9240-251cf1f79aaf-node-mnt\") pod \"crc-storage-crc-ld7mr\" (UID: \"18ae0272-32f6-4a7c-9240-251cf1f79aaf\") " pod="crc-storage/crc-storage-crc-ld7mr" Dec 09 11:16:21 crc kubenswrapper[5002]: I1209 11:16:21.568879 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/18ae0272-32f6-4a7c-9240-251cf1f79aaf-crc-storage\") pod \"crc-storage-crc-ld7mr\" (UID: \"18ae0272-32f6-4a7c-9240-251cf1f79aaf\") " pod="crc-storage/crc-storage-crc-ld7mr" Dec 09 11:16:21 crc kubenswrapper[5002]: I1209 11:16:21.731506 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfhlg\" (UniqueName: \"kubernetes.io/projected/18ae0272-32f6-4a7c-9240-251cf1f79aaf-kube-api-access-dfhlg\") pod \"crc-storage-crc-ld7mr\" (UID: \"18ae0272-32f6-4a7c-9240-251cf1f79aaf\") " pod="crc-storage/crc-storage-crc-ld7mr" Dec 09 11:16:21 crc kubenswrapper[5002]: I1209 11:16:21.931948 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-ld7mr" Dec 09 11:16:22 crc kubenswrapper[5002]: I1209 11:16:22.077047 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fac2eae5-7bfc-4fa8-9fd0-329274deead7" path="/var/lib/kubelet/pods/fac2eae5-7bfc-4fa8-9fd0-329274deead7/volumes" Dec 09 11:16:22 crc kubenswrapper[5002]: I1209 11:16:22.367707 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-ld7mr"] Dec 09 11:16:22 crc kubenswrapper[5002]: I1209 11:16:22.704722 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ld7mr" event={"ID":"18ae0272-32f6-4a7c-9240-251cf1f79aaf","Type":"ContainerStarted","Data":"ee767c539cb83fe9b340c0e15ac35eafbfd702d0ddcb70da6be0da48094ce641"} Dec 09 11:16:23 crc kubenswrapper[5002]: I1209 11:16:23.712066 5002 generic.go:334] "Generic (PLEG): container finished" podID="18ae0272-32f6-4a7c-9240-251cf1f79aaf" containerID="969636260e7a63e145716cc8abbe36c93b40734b6984b071f0591af29c01a81e" exitCode=0 Dec 09 11:16:23 crc kubenswrapper[5002]: I1209 11:16:23.712141 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ld7mr" event={"ID":"18ae0272-32f6-4a7c-9240-251cf1f79aaf","Type":"ContainerDied","Data":"969636260e7a63e145716cc8abbe36c93b40734b6984b071f0591af29c01a81e"} Dec 09 11:16:24 crc kubenswrapper[5002]: I1209 11:16:24.968148 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ld7mr" Dec 09 11:16:25 crc kubenswrapper[5002]: I1209 11:16:25.126444 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/18ae0272-32f6-4a7c-9240-251cf1f79aaf-crc-storage\") pod \"18ae0272-32f6-4a7c-9240-251cf1f79aaf\" (UID: \"18ae0272-32f6-4a7c-9240-251cf1f79aaf\") " Dec 09 11:16:25 crc kubenswrapper[5002]: I1209 11:16:25.126746 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfhlg\" (UniqueName: \"kubernetes.io/projected/18ae0272-32f6-4a7c-9240-251cf1f79aaf-kube-api-access-dfhlg\") pod \"18ae0272-32f6-4a7c-9240-251cf1f79aaf\" (UID: \"18ae0272-32f6-4a7c-9240-251cf1f79aaf\") " Dec 09 11:16:25 crc kubenswrapper[5002]: I1209 11:16:25.129102 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/18ae0272-32f6-4a7c-9240-251cf1f79aaf-node-mnt\") pod \"18ae0272-32f6-4a7c-9240-251cf1f79aaf\" (UID: \"18ae0272-32f6-4a7c-9240-251cf1f79aaf\") " Dec 09 11:16:25 crc kubenswrapper[5002]: I1209 11:16:25.129201 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/18ae0272-32f6-4a7c-9240-251cf1f79aaf-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "18ae0272-32f6-4a7c-9240-251cf1f79aaf" (UID: "18ae0272-32f6-4a7c-9240-251cf1f79aaf"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:16:25 crc kubenswrapper[5002]: I1209 11:16:25.129876 5002 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/18ae0272-32f6-4a7c-9240-251cf1f79aaf-node-mnt\") on node \"crc\" DevicePath \"\"" Dec 09 11:16:25 crc kubenswrapper[5002]: I1209 11:16:25.134062 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18ae0272-32f6-4a7c-9240-251cf1f79aaf-kube-api-access-dfhlg" (OuterVolumeSpecName: "kube-api-access-dfhlg") pod "18ae0272-32f6-4a7c-9240-251cf1f79aaf" (UID: "18ae0272-32f6-4a7c-9240-251cf1f79aaf"). InnerVolumeSpecName "kube-api-access-dfhlg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:16:25 crc kubenswrapper[5002]: I1209 11:16:25.142963 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18ae0272-32f6-4a7c-9240-251cf1f79aaf-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "18ae0272-32f6-4a7c-9240-251cf1f79aaf" (UID: "18ae0272-32f6-4a7c-9240-251cf1f79aaf"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:16:25 crc kubenswrapper[5002]: I1209 11:16:25.230863 5002 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/18ae0272-32f6-4a7c-9240-251cf1f79aaf-crc-storage\") on node \"crc\" DevicePath \"\"" Dec 09 11:16:25 crc kubenswrapper[5002]: I1209 11:16:25.230893 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfhlg\" (UniqueName: \"kubernetes.io/projected/18ae0272-32f6-4a7c-9240-251cf1f79aaf-kube-api-access-dfhlg\") on node \"crc\" DevicePath \"\"" Dec 09 11:16:25 crc kubenswrapper[5002]: I1209 11:16:25.731379 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ld7mr" event={"ID":"18ae0272-32f6-4a7c-9240-251cf1f79aaf","Type":"ContainerDied","Data":"ee767c539cb83fe9b340c0e15ac35eafbfd702d0ddcb70da6be0da48094ce641"} Dec 09 11:16:25 crc kubenswrapper[5002]: I1209 11:16:25.731471 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee767c539cb83fe9b340c0e15ac35eafbfd702d0ddcb70da6be0da48094ce641" Dec 09 11:16:25 crc kubenswrapper[5002]: I1209 11:16:25.731423 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-ld7mr" Dec 09 11:16:37 crc kubenswrapper[5002]: I1209 11:16:37.965312 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 11:16:37 crc kubenswrapper[5002]: I1209 11:16:37.965765 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 11:16:45 crc kubenswrapper[5002]: I1209 11:16:45.362354 5002 scope.go:117] "RemoveContainer" containerID="9cd0cf638903e9cf4ac27f38a9d808ccdcf8521186458e924d0f193ef5e4b6f6" Dec 09 11:17:07 crc kubenswrapper[5002]: I1209 11:17:07.964739 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 11:17:07 crc kubenswrapper[5002]: I1209 11:17:07.965294 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 11:17:07 crc kubenswrapper[5002]: I1209 11:17:07.965337 5002 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" Dec 09 11:17:07 crc kubenswrapper[5002]: I1209 11:17:07.965963 5002 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"77659b1c47a86560c49c87566b1a9998228a70363d6028c0f54d97d529fda9b3"} pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 11:17:07 crc kubenswrapper[5002]: I1209 11:17:07.966045 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" containerID="cri-o://77659b1c47a86560c49c87566b1a9998228a70363d6028c0f54d97d529fda9b3" gracePeriod=600 Dec 09 11:17:08 crc kubenswrapper[5002]: E1209 11:17:08.595685 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:17:09 crc kubenswrapper[5002]: I1209 11:17:09.035723 5002 generic.go:334] "Generic (PLEG): container finished" podID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerID="77659b1c47a86560c49c87566b1a9998228a70363d6028c0f54d97d529fda9b3" exitCode=0 Dec 09 11:17:09 crc 
kubenswrapper[5002]: I1209 11:17:09.035769 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerDied","Data":"77659b1c47a86560c49c87566b1a9998228a70363d6028c0f54d97d529fda9b3"} Dec 09 11:17:09 crc kubenswrapper[5002]: I1209 11:17:09.035811 5002 scope.go:117] "RemoveContainer" containerID="1afc9ebf9ed2fc9e4d452f4f472fa50ac1aa3c9ea389d4cf549db9d1f5d70925" Dec 09 11:17:09 crc kubenswrapper[5002]: I1209 11:17:09.036225 5002 scope.go:117] "RemoveContainer" containerID="77659b1c47a86560c49c87566b1a9998228a70363d6028c0f54d97d529fda9b3" Dec 09 11:17:09 crc kubenswrapper[5002]: E1209 11:17:09.036501 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:17:20 crc kubenswrapper[5002]: I1209 11:17:20.060928 5002 scope.go:117] "RemoveContainer" containerID="77659b1c47a86560c49c87566b1a9998228a70363d6028c0f54d97d529fda9b3" Dec 09 11:17:20 crc kubenswrapper[5002]: E1209 11:17:20.061995 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:17:34 crc kubenswrapper[5002]: I1209 11:17:34.061520 5002 scope.go:117] "RemoveContainer" containerID="77659b1c47a86560c49c87566b1a9998228a70363d6028c0f54d97d529fda9b3" Dec 09 11:17:34 crc kubenswrapper[5002]: E1209 11:17:34.062938 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:17:47 crc kubenswrapper[5002]: I1209 11:17:47.060459 5002 scope.go:117] "RemoveContainer" containerID="77659b1c47a86560c49c87566b1a9998228a70363d6028c0f54d97d529fda9b3" Dec 09 11:17:47 crc kubenswrapper[5002]: E1209 11:17:47.061175 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:18:00 crc kubenswrapper[5002]: I1209 11:18:00.060691 5002 scope.go:117] "RemoveContainer" containerID="77659b1c47a86560c49c87566b1a9998228a70363d6028c0f54d97d529fda9b3" Dec 09 11:18:00 crc kubenswrapper[5002]: E1209 11:18:00.062461 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:18:15 crc kubenswrapper[5002]: I1209 11:18:15.059955 5002 scope.go:117] "RemoveContainer" containerID="77659b1c47a86560c49c87566b1a9998228a70363d6028c0f54d97d529fda9b3" Dec 09 11:18:15 crc kubenswrapper[5002]: E1209 11:18:15.060704 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:18:29 crc kubenswrapper[5002]: I1209 11:18:29.059867 5002 scope.go:117] "RemoveContainer" containerID="77659b1c47a86560c49c87566b1a9998228a70363d6028c0f54d97d529fda9b3" Dec 09 11:18:29 crc kubenswrapper[5002]: E1209 11:18:29.061607 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:18:43 crc kubenswrapper[5002]: I1209 11:18:43.061091 5002 scope.go:117] "RemoveContainer" containerID="77659b1c47a86560c49c87566b1a9998228a70363d6028c0f54d97d529fda9b3" Dec 09 11:18:43 crc kubenswrapper[5002]: E1209 11:18:43.062159 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:18:58 crc kubenswrapper[5002]: I1209 11:18:58.067137 5002 scope.go:117] "RemoveContainer" containerID="77659b1c47a86560c49c87566b1a9998228a70363d6028c0f54d97d529fda9b3" Dec 09 11:18:58 crc kubenswrapper[5002]: E1209 11:18:58.067770 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:19:13 crc kubenswrapper[5002]: I1209 11:19:13.060033 5002 scope.go:117] "RemoveContainer" containerID="77659b1c47a86560c49c87566b1a9998228a70363d6028c0f54d97d529fda9b3" Dec 09 11:19:13 crc kubenswrapper[5002]: E1209 11:19:13.060959 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:19:24 crc kubenswrapper[5002]: I1209 11:19:24.059697 5002 scope.go:117] "RemoveContainer" containerID="77659b1c47a86560c49c87566b1a9998228a70363d6028c0f54d97d529fda9b3" Dec 09 11:19:24 crc kubenswrapper[5002]: E1209 11:19:24.060573 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:19:31 crc kubenswrapper[5002]: I1209 11:19:31.263713 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-6z772"] Dec 09 11:19:31 crc kubenswrapper[5002]: E1209 11:19:31.264512 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18ae0272-32f6-4a7c-9240-251cf1f79aaf" containerName="storage" Dec 09 11:19:31 crc kubenswrapper[5002]: I1209 11:19:31.264593 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="18ae0272-32f6-4a7c-9240-251cf1f79aaf" containerName="storage" Dec 09 11:19:31 crc kubenswrapper[5002]: I1209 11:19:31.264754 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="18ae0272-32f6-4a7c-9240-251cf1f79aaf" containerName="storage" Dec 09 11:19:31 crc kubenswrapper[5002]: I1209 11:19:31.265518 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-6z772" Dec 09 11:19:31 crc kubenswrapper[5002]: I1209 11:19:31.270250 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 09 11:19:31 crc kubenswrapper[5002]: I1209 11:19:31.270282 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 09 11:19:31 crc kubenswrapper[5002]: I1209 11:19:31.270383 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-4jf92" Dec 09 11:19:31 crc kubenswrapper[5002]: I1209 11:19:31.270295 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 09 11:19:31 crc kubenswrapper[5002]: I1209 11:19:31.274367 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 09 11:19:31 crc kubenswrapper[5002]: I1209 11:19:31.276506 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-6z772"] Dec 09 11:19:31 crc kubenswrapper[5002]: I1209 11:19:31.385374 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37910bb8-e88f-4c7f-ae2a-868ac75ff9d1-config\") pod \"dnsmasq-dns-5d7b5456f5-6z772\" (UID: \"37910bb8-e88f-4c7f-ae2a-868ac75ff9d1\") " pod="openstack/dnsmasq-dns-5d7b5456f5-6z772" Dec 09 11:19:31 crc kubenswrapper[5002]: I1209 11:19:31.385446 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krcjh\" (UniqueName: \"kubernetes.io/projected/37910bb8-e88f-4c7f-ae2a-868ac75ff9d1-kube-api-access-krcjh\") pod \"dnsmasq-dns-5d7b5456f5-6z772\" (UID: \"37910bb8-e88f-4c7f-ae2a-868ac75ff9d1\") " pod="openstack/dnsmasq-dns-5d7b5456f5-6z772" Dec 09 11:19:31 crc kubenswrapper[5002]: I1209 
11:19:31.385484 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37910bb8-e88f-4c7f-ae2a-868ac75ff9d1-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-6z772\" (UID: \"37910bb8-e88f-4c7f-ae2a-868ac75ff9d1\") " pod="openstack/dnsmasq-dns-5d7b5456f5-6z772" Dec 09 11:19:31 crc kubenswrapper[5002]: I1209 11:19:31.487155 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krcjh\" (UniqueName: \"kubernetes.io/projected/37910bb8-e88f-4c7f-ae2a-868ac75ff9d1-kube-api-access-krcjh\") pod \"dnsmasq-dns-5d7b5456f5-6z772\" (UID: \"37910bb8-e88f-4c7f-ae2a-868ac75ff9d1\") " pod="openstack/dnsmasq-dns-5d7b5456f5-6z772" Dec 09 11:19:31 crc kubenswrapper[5002]: I1209 11:19:31.487214 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37910bb8-e88f-4c7f-ae2a-868ac75ff9d1-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-6z772\" (UID: \"37910bb8-e88f-4c7f-ae2a-868ac75ff9d1\") " pod="openstack/dnsmasq-dns-5d7b5456f5-6z772" Dec 09 11:19:31 crc kubenswrapper[5002]: I1209 11:19:31.487338 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37910bb8-e88f-4c7f-ae2a-868ac75ff9d1-config\") pod \"dnsmasq-dns-5d7b5456f5-6z772\" (UID: \"37910bb8-e88f-4c7f-ae2a-868ac75ff9d1\") " pod="openstack/dnsmasq-dns-5d7b5456f5-6z772" Dec 09 11:19:31 crc kubenswrapper[5002]: I1209 11:19:31.488424 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37910bb8-e88f-4c7f-ae2a-868ac75ff9d1-config\") pod \"dnsmasq-dns-5d7b5456f5-6z772\" (UID: \"37910bb8-e88f-4c7f-ae2a-868ac75ff9d1\") " pod="openstack/dnsmasq-dns-5d7b5456f5-6z772" Dec 09 11:19:31 crc kubenswrapper[5002]: I1209 11:19:31.489240 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37910bb8-e88f-4c7f-ae2a-868ac75ff9d1-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-6z772\" (UID: \"37910bb8-e88f-4c7f-ae2a-868ac75ff9d1\") " pod="openstack/dnsmasq-dns-5d7b5456f5-6z772" Dec 09 11:19:31 crc kubenswrapper[5002]: I1209 11:19:31.519468 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krcjh\" (UniqueName: \"kubernetes.io/projected/37910bb8-e88f-4c7f-ae2a-868ac75ff9d1-kube-api-access-krcjh\") pod \"dnsmasq-dns-5d7b5456f5-6z772\" (UID: \"37910bb8-e88f-4c7f-ae2a-868ac75ff9d1\") " pod="openstack/dnsmasq-dns-5d7b5456f5-6z772" Dec 09 11:19:31 crc kubenswrapper[5002]: I1209 11:19:31.546323 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-rw2wd"] Dec 09 11:19:31 crc kubenswrapper[5002]: I1209 11:19:31.547910 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-rw2wd" Dec 09 11:19:31 crc kubenswrapper[5002]: I1209 11:19:31.562762 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-rw2wd"] Dec 09 11:19:31 crc kubenswrapper[5002]: I1209 11:19:31.583333 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-6z772" Dec 09 11:19:31 crc kubenswrapper[5002]: I1209 11:19:31.689662 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zdmf\" (UniqueName: \"kubernetes.io/projected/7dc8d716-07de-4018-91e5-a94ccc37a692-kube-api-access-2zdmf\") pod \"dnsmasq-dns-98ddfc8f-rw2wd\" (UID: \"7dc8d716-07de-4018-91e5-a94ccc37a692\") " pod="openstack/dnsmasq-dns-98ddfc8f-rw2wd" Dec 09 11:19:31 crc kubenswrapper[5002]: I1209 11:19:31.690013 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7dc8d716-07de-4018-91e5-a94ccc37a692-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-rw2wd\" (UID: \"7dc8d716-07de-4018-91e5-a94ccc37a692\") " pod="openstack/dnsmasq-dns-98ddfc8f-rw2wd" Dec 09 11:19:31 crc kubenswrapper[5002]: I1209 11:19:31.690078 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7dc8d716-07de-4018-91e5-a94ccc37a692-config\") pod \"dnsmasq-dns-98ddfc8f-rw2wd\" (UID: \"7dc8d716-07de-4018-91e5-a94ccc37a692\") " pod="openstack/dnsmasq-dns-98ddfc8f-rw2wd" Dec 09 11:19:31 crc kubenswrapper[5002]: I1209 11:19:31.791728 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7dc8d716-07de-4018-91e5-a94ccc37a692-config\") pod \"dnsmasq-dns-98ddfc8f-rw2wd\" (UID: \"7dc8d716-07de-4018-91e5-a94ccc37a692\") " pod="openstack/dnsmasq-dns-98ddfc8f-rw2wd" Dec 09 11:19:31 crc kubenswrapper[5002]: I1209 11:19:31.791836 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zdmf\" (UniqueName: \"kubernetes.io/projected/7dc8d716-07de-4018-91e5-a94ccc37a692-kube-api-access-2zdmf\") pod \"dnsmasq-dns-98ddfc8f-rw2wd\" (UID: \"7dc8d716-07de-4018-91e5-a94ccc37a692\") " pod="openstack/dnsmasq-dns-98ddfc8f-rw2wd" Dec 09 11:19:31 crc kubenswrapper[5002]: I1209 11:19:31.791923 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7dc8d716-07de-4018-91e5-a94ccc37a692-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-rw2wd\" (UID: \"7dc8d716-07de-4018-91e5-a94ccc37a692\") " pod="openstack/dnsmasq-dns-98ddfc8f-rw2wd" Dec 09 11:19:31 crc kubenswrapper[5002]: I1209 11:19:31.792984 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7dc8d716-07de-4018-91e5-a94ccc37a692-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-rw2wd\" (UID: \"7dc8d716-07de-4018-91e5-a94ccc37a692\") " pod="openstack/dnsmasq-dns-98ddfc8f-rw2wd" Dec 09 11:19:31 crc kubenswrapper[5002]: I1209 11:19:31.793584 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7dc8d716-07de-4018-91e5-a94ccc37a692-config\") pod \"dnsmasq-dns-98ddfc8f-rw2wd\" (UID: \"7dc8d716-07de-4018-91e5-a94ccc37a692\") " pod="openstack/dnsmasq-dns-98ddfc8f-rw2wd" Dec 09 11:19:31 crc kubenswrapper[5002]: I1209 11:19:31.822191 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zdmf\" (UniqueName: \"kubernetes.io/projected/7dc8d716-07de-4018-91e5-a94ccc37a692-kube-api-access-2zdmf\") pod \"dnsmasq-dns-98ddfc8f-rw2wd\" (UID: \"7dc8d716-07de-4018-91e5-a94ccc37a692\") " pod="openstack/dnsmasq-dns-98ddfc8f-rw2wd" Dec 09 11:19:31 crc 
kubenswrapper[5002]: I1209 11:19:31.865319 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-rw2wd" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.041726 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-6z772"] Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.120664 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-6z772" event={"ID":"37910bb8-e88f-4c7f-ae2a-868ac75ff9d1","Type":"ContainerStarted","Data":"40e9c52a8d729862f26d31facb508cd1607add675a662862f1d7310df5e0a8b5"} Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.277123 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-rw2wd"] Dec 09 11:19:32 crc kubenswrapper[5002]: W1209 11:19:32.283741 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7dc8d716_07de_4018_91e5_a94ccc37a692.slice/crio-ec200b5989512d802ef03ecdb70d3003bfe972599d5fdeee63241699a2a463d5 WatchSource:0}: Error finding container ec200b5989512d802ef03ecdb70d3003bfe972599d5fdeee63241699a2a463d5: Status 404 returned error can't find the container with id ec200b5989512d802ef03ecdb70d3003bfe972599d5fdeee63241699a2a463d5 Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.412757 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.414236 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.417334 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.417500 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.417630 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.417788 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.417904 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-228rx" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.431697 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.501579 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d\") " pod="openstack/rabbitmq-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.501631 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d\") " pod="openstack/rabbitmq-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.501779 5002 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d\") " pod="openstack/rabbitmq-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.501912 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d\") " pod="openstack/rabbitmq-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.501967 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d\") " pod="openstack/rabbitmq-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.501994 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsckf\" (UniqueName: \"kubernetes.io/projected/c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d-kube-api-access-gsckf\") pod \"rabbitmq-server-0\" (UID: \"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d\") " pod="openstack/rabbitmq-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.502010 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d\") " pod="openstack/rabbitmq-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.502044 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4ae7c983-54a3-49f5-bb37-4eec70f603f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4ae7c983-54a3-49f5-bb37-4eec70f603f3\") pod \"rabbitmq-server-0\" (UID: \"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d\") " pod="openstack/rabbitmq-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.502066 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d\") " pod="openstack/rabbitmq-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.603163 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d\") " pod="openstack/rabbitmq-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.603229 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d\") " pod="openstack/rabbitmq-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.603278 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d\") " pod="openstack/rabbitmq-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.603301 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d\") " pod="openstack/rabbitmq-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.603341 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d\") " pod="openstack/rabbitmq-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.603372 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsckf\" (UniqueName: \"kubernetes.io/projected/c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d-kube-api-access-gsckf\") pod \"rabbitmq-server-0\" (UID: \"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d\") " pod="openstack/rabbitmq-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.603395 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d\") " pod="openstack/rabbitmq-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.603423 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4ae7c983-54a3-49f5-bb37-4eec70f603f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4ae7c983-54a3-49f5-bb37-4eec70f603f3\") pod \"rabbitmq-server-0\" (UID: \"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d\") " pod="openstack/rabbitmq-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.603450 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d\") " pod="openstack/rabbitmq-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.604478 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d\") " pod="openstack/rabbitmq-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.604795 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d\") " pod="openstack/rabbitmq-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.606035 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d\") " 
pod="openstack/rabbitmq-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.606386 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d\") " pod="openstack/rabbitmq-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.608938 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d\") " pod="openstack/rabbitmq-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.609292 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d\") " pod="openstack/rabbitmq-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.615597 5002 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.615646 5002 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4ae7c983-54a3-49f5-bb37-4eec70f603f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4ae7c983-54a3-49f5-bb37-4eec70f603f3\") pod \"rabbitmq-server-0\" (UID: \"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b272446c890aff24493bdcc404f373c7542819e469da99fc019e55263aae6767/globalmount\"" pod="openstack/rabbitmq-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.617795 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d\") " pod="openstack/rabbitmq-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.632908 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsckf\" (UniqueName: \"kubernetes.io/projected/c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d-kube-api-access-gsckf\") pod \"rabbitmq-server-0\" (UID: \"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d\") " pod="openstack/rabbitmq-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.655149 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4ae7c983-54a3-49f5-bb37-4eec70f603f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4ae7c983-54a3-49f5-bb37-4eec70f603f3\") pod \"rabbitmq-server-0\" (UID: \"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d\") " pod="openstack/rabbitmq-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.702599 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.703803 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.705465 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-87bn9" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.706009 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.706122 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.706442 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.707702 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.720674 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.755915 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.806472 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb8vk\" (UniqueName: \"kubernetes.io/projected/62678300-fdda-43eb-88cc-9104f3950c14-kube-api-access-jb8vk\") pod \"rabbitmq-cell1-server-0\" (UID: \"62678300-fdda-43eb-88cc-9104f3950c14\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.806585 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/62678300-fdda-43eb-88cc-9104f3950c14-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"62678300-fdda-43eb-88cc-9104f3950c14\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.806615 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/62678300-fdda-43eb-88cc-9104f3950c14-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"62678300-fdda-43eb-88cc-9104f3950c14\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.806643 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/62678300-fdda-43eb-88cc-9104f3950c14-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"62678300-fdda-43eb-88cc-9104f3950c14\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.806737 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/62678300-fdda-43eb-88cc-9104f3950c14-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"62678300-fdda-43eb-88cc-9104f3950c14\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.806847 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2a840a07-2967-4f13-8754-4816b3607e7b\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a840a07-2967-4f13-8754-4816b3607e7b\") pod \"rabbitmq-cell1-server-0\" (UID: \"62678300-fdda-43eb-88cc-9104f3950c14\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.806940 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/62678300-fdda-43eb-88cc-9104f3950c14-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"62678300-fdda-43eb-88cc-9104f3950c14\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.806984 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/62678300-fdda-43eb-88cc-9104f3950c14-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"62678300-fdda-43eb-88cc-9104f3950c14\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.807075 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/62678300-fdda-43eb-88cc-9104f3950c14-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"62678300-fdda-43eb-88cc-9104f3950c14\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.908168 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2a840a07-2967-4f13-8754-4816b3607e7b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a840a07-2967-4f13-8754-4816b3607e7b\") pod \"rabbitmq-cell1-server-0\" (UID: \"62678300-fdda-43eb-88cc-9104f3950c14\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.908581 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/62678300-fdda-43eb-88cc-9104f3950c14-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"62678300-fdda-43eb-88cc-9104f3950c14\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.908610 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/62678300-fdda-43eb-88cc-9104f3950c14-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"62678300-fdda-43eb-88cc-9104f3950c14\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.908646 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/62678300-fdda-43eb-88cc-9104f3950c14-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"62678300-fdda-43eb-88cc-9104f3950c14\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.908674 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb8vk\" (UniqueName: \"kubernetes.io/projected/62678300-fdda-43eb-88cc-9104f3950c14-kube-api-access-jb8vk\") pod \"rabbitmq-cell1-server-0\" (UID: \"62678300-fdda-43eb-88cc-9104f3950c14\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.908728 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/62678300-fdda-43eb-88cc-9104f3950c14-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"62678300-fdda-43eb-88cc-9104f3950c14\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.908746 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/62678300-fdda-43eb-88cc-9104f3950c14-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"62678300-fdda-43eb-88cc-9104f3950c14\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.908764 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/62678300-fdda-43eb-88cc-9104f3950c14-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"62678300-fdda-43eb-88cc-9104f3950c14\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.908783 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/62678300-fdda-43eb-88cc-9104f3950c14-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"62678300-fdda-43eb-88cc-9104f3950c14\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.909437 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/62678300-fdda-43eb-88cc-9104f3950c14-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"62678300-fdda-43eb-88cc-9104f3950c14\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.909568 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/62678300-fdda-43eb-88cc-9104f3950c14-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"62678300-fdda-43eb-88cc-9104f3950c14\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.909725 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/62678300-fdda-43eb-88cc-9104f3950c14-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"62678300-fdda-43eb-88cc-9104f3950c14\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.910109 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/62678300-fdda-43eb-88cc-9104f3950c14-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"62678300-fdda-43eb-88cc-9104f3950c14\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.912456 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/62678300-fdda-43eb-88cc-9104f3950c14-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"62678300-fdda-43eb-88cc-9104f3950c14\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.913387 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/62678300-fdda-43eb-88cc-9104f3950c14-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"62678300-fdda-43eb-88cc-9104f3950c14\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.917310 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/62678300-fdda-43eb-88cc-9104f3950c14-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"62678300-fdda-43eb-88cc-9104f3950c14\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.917435 5002 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.917504 5002 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2a840a07-2967-4f13-8754-4816b3607e7b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a840a07-2967-4f13-8754-4816b3607e7b\") pod \"rabbitmq-cell1-server-0\" (UID: \"62678300-fdda-43eb-88cc-9104f3950c14\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/aee1bc9c0e18517cf5839fabf07a7ec45fd5a691a4dd335abaa37517e9b64dea/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.928844 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb8vk\" (UniqueName: \"kubernetes.io/projected/62678300-fdda-43eb-88cc-9104f3950c14-kube-api-access-jb8vk\") pod \"rabbitmq-cell1-server-0\" (UID: \"62678300-fdda-43eb-88cc-9104f3950c14\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:19:32 crc kubenswrapper[5002]: I1209 11:19:32.946945 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2a840a07-2967-4f13-8754-4816b3607e7b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a840a07-2967-4f13-8754-4816b3607e7b\") pod \"rabbitmq-cell1-server-0\" (UID: \"62678300-fdda-43eb-88cc-9104f3950c14\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:19:33 crc kubenswrapper[5002]: I1209 11:19:33.027198 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:19:33 crc kubenswrapper[5002]: I1209 11:19:33.141100 5002 generic.go:334] "Generic (PLEG): container finished" podID="7dc8d716-07de-4018-91e5-a94ccc37a692" containerID="26b46a3c7f05cfbce9786c0631708109ca6b81691b4547f176d4e4ce34b3e481" exitCode=0 Dec 09 11:19:33 crc kubenswrapper[5002]: I1209 11:19:33.141203 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-rw2wd" event={"ID":"7dc8d716-07de-4018-91e5-a94ccc37a692","Type":"ContainerDied","Data":"26b46a3c7f05cfbce9786c0631708109ca6b81691b4547f176d4e4ce34b3e481"} Dec 09 11:19:33 crc kubenswrapper[5002]: I1209 11:19:33.141236 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-rw2wd" event={"ID":"7dc8d716-07de-4018-91e5-a94ccc37a692","Type":"ContainerStarted","Data":"ec200b5989512d802ef03ecdb70d3003bfe972599d5fdeee63241699a2a463d5"} Dec 09 11:19:33 crc kubenswrapper[5002]: I1209 11:19:33.146149 5002 generic.go:334] "Generic (PLEG): container finished" podID="37910bb8-e88f-4c7f-ae2a-868ac75ff9d1" containerID="234ce99c6f653ee082b5ae28f9e01005519c705ebf17171f65b6b979227b0c03" exitCode=0 Dec 09 11:19:33 crc kubenswrapper[5002]: I1209 11:19:33.146185 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-6z772" event={"ID":"37910bb8-e88f-4c7f-ae2a-868ac75ff9d1","Type":"ContainerDied","Data":"234ce99c6f653ee082b5ae28f9e01005519c705ebf17171f65b6b979227b0c03"} Dec 09 11:19:33 crc kubenswrapper[5002]: I1209 11:19:33.188491 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 09 11:19:33 crc kubenswrapper[5002]: W1209 11:19:33.193001 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9fcf8b7_a837_42a0_84a7_d6b3ed961f6d.slice/crio-bf98ee07d9efb543e0f62abd19ef06dfb58a9bd98f98024880f698c783b85d55 WatchSource:0}: Error finding container bf98ee07d9efb543e0f62abd19ef06dfb58a9bd98f98024880f698c783b85d55: Status 404 returned error can't find the container with id bf98ee07d9efb543e0f62abd19ef06dfb58a9bd98f98024880f698c783b85d55 Dec 09 11:19:33 crc kubenswrapper[5002]: I1209 11:19:33.551671 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 09 11:19:33 crc kubenswrapper[5002]: W1209 11:19:33.564036 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62678300_fdda_43eb_88cc_9104f3950c14.slice/crio-594f1cf755d2099a61c5d9ba22973ad642f297cbf769eac1ce6c6fc9993ba2fb WatchSource:0}: Error finding container 594f1cf755d2099a61c5d9ba22973ad642f297cbf769eac1ce6c6fc9993ba2fb: Status 404 returned error can't find the container with id 594f1cf755d2099a61c5d9ba22973ad642f297cbf769eac1ce6c6fc9993ba2fb Dec 09 11:19:33 crc kubenswrapper[5002]: I1209 11:19:33.648965 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 09 11:19:33 crc kubenswrapper[5002]: I1209 11:19:33.657189 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 09 11:19:33 crc kubenswrapper[5002]: I1209 11:19:33.663393 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-m5sld" Dec 09 11:19:33 crc kubenswrapper[5002]: I1209 11:19:33.663685 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 09 11:19:33 crc kubenswrapper[5002]: I1209 11:19:33.663919 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 09 11:19:33 crc kubenswrapper[5002]: I1209 11:19:33.665116 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 09 11:19:33 crc kubenswrapper[5002]: I1209 11:19:33.679040 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 09 11:19:33 crc kubenswrapper[5002]: I1209 11:19:33.686959 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 09 11:19:33 crc kubenswrapper[5002]: I1209 11:19:33.739506 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/68ec5be1-063f-481b-a00c-cf882c596d5f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"68ec5be1-063f-481b-a00c-cf882c596d5f\") " pod="openstack/openstack-galera-0" Dec 09 11:19:33 crc kubenswrapper[5002]: I1209 11:19:33.739561 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/68ec5be1-063f-481b-a00c-cf882c596d5f-kolla-config\") pod \"openstack-galera-0\" (UID: \"68ec5be1-063f-481b-a00c-cf882c596d5f\") " pod="openstack/openstack-galera-0" Dec 09 11:19:33 crc kubenswrapper[5002]: I1209 11:19:33.739599 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68ec5be1-063f-481b-a00c-cf882c596d5f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"68ec5be1-063f-481b-a00c-cf882c596d5f\") " pod="openstack/openstack-galera-0" Dec 09 11:19:33 crc kubenswrapper[5002]: I1209 11:19:33.739643 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/68ec5be1-063f-481b-a00c-cf882c596d5f-config-data-default\") pod \"openstack-galera-0\" (UID: \"68ec5be1-063f-481b-a00c-cf882c596d5f\") " pod="openstack/openstack-galera-0" Dec 09 11:19:33 crc kubenswrapper[5002]: I1209 11:19:33.739663 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjc94\" (UniqueName: \"kubernetes.io/projected/68ec5be1-063f-481b-a00c-cf882c596d5f-kube-api-access-kjc94\") pod \"openstack-galera-0\" (UID: \"68ec5be1-063f-481b-a00c-cf882c596d5f\") " pod="openstack/openstack-galera-0" Dec 09 11:19:33 crc kubenswrapper[5002]: I1209 11:19:33.739687 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-260e0be2-5533-4807-aa00-72c998a8320e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-260e0be2-5533-4807-aa00-72c998a8320e\") pod \"openstack-galera-0\" (UID: \"68ec5be1-063f-481b-a00c-cf882c596d5f\") " pod="openstack/openstack-galera-0" Dec 09 11:19:33 crc kubenswrapper[5002]: I1209 11:19:33.739711 5002 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/68ec5be1-063f-481b-a00c-cf882c596d5f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"68ec5be1-063f-481b-a00c-cf882c596d5f\") " pod="openstack/openstack-galera-0" Dec 09 11:19:33 crc kubenswrapper[5002]: I1209 11:19:33.739737 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68ec5be1-063f-481b-a00c-cf882c596d5f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"68ec5be1-063f-481b-a00c-cf882c596d5f\") " pod="openstack/openstack-galera-0" Dec 09 11:19:33 crc kubenswrapper[5002]: I1209 11:19:33.840499 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/68ec5be1-063f-481b-a00c-cf882c596d5f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"68ec5be1-063f-481b-a00c-cf882c596d5f\") " pod="openstack/openstack-galera-0" Dec 09 11:19:33 crc kubenswrapper[5002]: I1209 11:19:33.840943 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68ec5be1-063f-481b-a00c-cf882c596d5f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"68ec5be1-063f-481b-a00c-cf882c596d5f\") " pod="openstack/openstack-galera-0" Dec 09 11:19:33 crc kubenswrapper[5002]: I1209 11:19:33.841058 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/68ec5be1-063f-481b-a00c-cf882c596d5f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"68ec5be1-063f-481b-a00c-cf882c596d5f\") " pod="openstack/openstack-galera-0" Dec 09 11:19:33 crc kubenswrapper[5002]: I1209 11:19:33.841135 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/68ec5be1-063f-481b-a00c-cf882c596d5f-kolla-config\") pod \"openstack-galera-0\" (UID: \"68ec5be1-063f-481b-a00c-cf882c596d5f\") " pod="openstack/openstack-galera-0" Dec 09 11:19:33 crc kubenswrapper[5002]: I1209 11:19:33.841261 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68ec5be1-063f-481b-a00c-cf882c596d5f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"68ec5be1-063f-481b-a00c-cf882c596d5f\") " pod="openstack/openstack-galera-0" Dec 09 11:19:33 crc kubenswrapper[5002]: I1209 11:19:33.841632 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/68ec5be1-063f-481b-a00c-cf882c596d5f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"68ec5be1-063f-481b-a00c-cf882c596d5f\") " pod="openstack/openstack-galera-0" Dec 09 11:19:33 crc kubenswrapper[5002]: I1209 11:19:33.841928 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/68ec5be1-063f-481b-a00c-cf882c596d5f-kolla-config\") pod \"openstack-galera-0\" (UID: \"68ec5be1-063f-481b-a00c-cf882c596d5f\") " pod="openstack/openstack-galera-0" Dec 09 11:19:33 crc kubenswrapper[5002]: I1209 11:19:33.842643 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68ec5be1-063f-481b-a00c-cf882c596d5f-operator-scripts\") pod 
\"openstack-galera-0\" (UID: \"68ec5be1-063f-481b-a00c-cf882c596d5f\") " pod="openstack/openstack-galera-0" Dec 09 11:19:33 crc kubenswrapper[5002]: I1209 11:19:33.842787 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/68ec5be1-063f-481b-a00c-cf882c596d5f-config-data-default\") pod \"openstack-galera-0\" (UID: \"68ec5be1-063f-481b-a00c-cf882c596d5f\") " pod="openstack/openstack-galera-0" Dec 09 11:19:33 crc kubenswrapper[5002]: I1209 11:19:33.843521 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjc94\" (UniqueName: \"kubernetes.io/projected/68ec5be1-063f-481b-a00c-cf882c596d5f-kube-api-access-kjc94\") pod \"openstack-galera-0\" (UID: \"68ec5be1-063f-481b-a00c-cf882c596d5f\") " pod="openstack/openstack-galera-0" Dec 09 11:19:33 crc kubenswrapper[5002]: I1209 11:19:33.843692 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-260e0be2-5533-4807-aa00-72c998a8320e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-260e0be2-5533-4807-aa00-72c998a8320e\") pod \"openstack-galera-0\" (UID: \"68ec5be1-063f-481b-a00c-cf882c596d5f\") " pod="openstack/openstack-galera-0" Dec 09 11:19:33 crc kubenswrapper[5002]: I1209 11:19:33.843377 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/68ec5be1-063f-481b-a00c-cf882c596d5f-config-data-default\") pod \"openstack-galera-0\" (UID: \"68ec5be1-063f-481b-a00c-cf882c596d5f\") " pod="openstack/openstack-galera-0" Dec 09 11:19:33 crc kubenswrapper[5002]: I1209 11:19:33.844170 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/68ec5be1-063f-481b-a00c-cf882c596d5f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"68ec5be1-063f-481b-a00c-cf882c596d5f\") " pod="openstack/openstack-galera-0" Dec 09 11:19:33 crc kubenswrapper[5002]: I1209 11:19:33.844637 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68ec5be1-063f-481b-a00c-cf882c596d5f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"68ec5be1-063f-481b-a00c-cf882c596d5f\") " pod="openstack/openstack-galera-0" Dec 09 11:19:33 crc kubenswrapper[5002]: I1209 11:19:33.846413 5002 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 09 11:19:33 crc kubenswrapper[5002]: I1209 11:19:33.846522 5002 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-260e0be2-5533-4807-aa00-72c998a8320e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-260e0be2-5533-4807-aa00-72c998a8320e\") pod \"openstack-galera-0\" (UID: \"68ec5be1-063f-481b-a00c-cf882c596d5f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/14db0e99ecca92017d5625f3386408c2dd717cf1adfbea1400ae67b6739aab70/globalmount\"" pod="openstack/openstack-galera-0" Dec 09 11:19:33 crc kubenswrapper[5002]: I1209 11:19:33.932269 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjc94\" (UniqueName: \"kubernetes.io/projected/68ec5be1-063f-481b-a00c-cf882c596d5f-kube-api-access-kjc94\") pod \"openstack-galera-0\" (UID: \"68ec5be1-063f-481b-a00c-cf882c596d5f\") " pod="openstack/openstack-galera-0" Dec 09 11:19:33 crc kubenswrapper[5002]: I1209 11:19:33.976855 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-260e0be2-5533-4807-aa00-72c998a8320e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-260e0be2-5533-4807-aa00-72c998a8320e\") pod \"openstack-galera-0\" (UID: \"68ec5be1-063f-481b-a00c-cf882c596d5f\") " pod="openstack/openstack-galera-0" Dec 09 11:19:33 crc kubenswrapper[5002]: I1209 11:19:33.988981 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 09 11:19:34 crc kubenswrapper[5002]: I1209 11:19:34.139952 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 09 11:19:34 crc kubenswrapper[5002]: I1209 11:19:34.141012 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 09 11:19:34 crc kubenswrapper[5002]: I1209 11:19:34.143119 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-xwzwz" Dec 09 11:19:34 crc kubenswrapper[5002]: I1209 11:19:34.143759 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 09 11:19:34 crc kubenswrapper[5002]: I1209 11:19:34.154845 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 09 11:19:34 crc kubenswrapper[5002]: I1209 11:19:34.181663 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-6z772" event={"ID":"37910bb8-e88f-4c7f-ae2a-868ac75ff9d1","Type":"ContainerStarted","Data":"b97fc566f387c3c48a2cf22a9c62dd3ee0ec22f35056658a14f34fd62a333800"} Dec 09 11:19:34 crc kubenswrapper[5002]: I1209 11:19:34.181836 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d7b5456f5-6z772" Dec 09 11:19:34 crc kubenswrapper[5002]: I1209 11:19:34.190568 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d","Type":"ContainerStarted","Data":"bf98ee07d9efb543e0f62abd19ef06dfb58a9bd98f98024880f698c783b85d55"} Dec 09 11:19:34 crc kubenswrapper[5002]: I1209 11:19:34.198352 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-rw2wd" event={"ID":"7dc8d716-07de-4018-91e5-a94ccc37a692","Type":"ContainerStarted","Data":"815e2050399fde0dbeb9ba16718dcdf46cfee01f70d8b7acc23b9b09eb1ffaf5"} Dec 09 11:19:34 crc kubenswrapper[5002]: I1209 11:19:34.208995 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"62678300-fdda-43eb-88cc-9104f3950c14","Type":"ContainerStarted","Data":"594f1cf755d2099a61c5d9ba22973ad642f297cbf769eac1ce6c6fc9993ba2fb"} Dec 09 11:19:34 crc kubenswrapper[5002]: I1209 11:19:34.219718 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d7b5456f5-6z772" podStartSLOduration=3.219702472 podStartE2EDuration="3.219702472s" podCreationTimestamp="2025-12-09 11:19:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:19:34.217960336 +0000 UTC m=+4706.610011417" watchObservedRunningTime="2025-12-09 11:19:34.219702472 +0000 UTC m=+4706.611753553" Dec 09 11:19:34 crc kubenswrapper[5002]: I1209 11:19:34.244015 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-98ddfc8f-rw2wd" podStartSLOduration=3.243994672 podStartE2EDuration="3.243994672s" podCreationTimestamp="2025-12-09 11:19:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:19:34.240157869 +0000 UTC m=+4706.632208960" watchObservedRunningTime="2025-12-09 11:19:34.243994672 +0000 UTC m=+4706.636045753" Dec 09 11:19:34 crc kubenswrapper[5002]: I1209 11:19:34.251003 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7a5d1355-cfbe-4ebb-bb37-a8ce5b29de81-config-data\") pod \"memcached-0\" (UID: \"7a5d1355-cfbe-4ebb-bb37-a8ce5b29de81\") " pod="openstack/memcached-0" Dec 09 11:19:34 crc kubenswrapper[5002]: I1209 11:19:34.251742 5002 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhdv9\" (UniqueName: \"kubernetes.io/projected/7a5d1355-cfbe-4ebb-bb37-a8ce5b29de81-kube-api-access-qhdv9\") pod \"memcached-0\" (UID: \"7a5d1355-cfbe-4ebb-bb37-a8ce5b29de81\") " pod="openstack/memcached-0" Dec 09 11:19:34 crc kubenswrapper[5002]: I1209 11:19:34.252248 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7a5d1355-cfbe-4ebb-bb37-a8ce5b29de81-kolla-config\") pod \"memcached-0\" (UID: \"7a5d1355-cfbe-4ebb-bb37-a8ce5b29de81\") " pod="openstack/memcached-0" Dec 09 11:19:34 crc kubenswrapper[5002]: I1209 11:19:34.281666 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 09 11:19:34 crc kubenswrapper[5002]: I1209 11:19:34.353523 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7a5d1355-cfbe-4ebb-bb37-a8ce5b29de81-kolla-config\") pod \"memcached-0\" (UID: \"7a5d1355-cfbe-4ebb-bb37-a8ce5b29de81\") " pod="openstack/memcached-0" Dec 09 11:19:34 crc kubenswrapper[5002]: I1209 11:19:34.354548 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7a5d1355-cfbe-4ebb-bb37-a8ce5b29de81-config-data\") pod \"memcached-0\" (UID: \"7a5d1355-cfbe-4ebb-bb37-a8ce5b29de81\") " pod="openstack/memcached-0" Dec 09 11:19:34 crc kubenswrapper[5002]: I1209 11:19:34.354661 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhdv9\" (UniqueName: \"kubernetes.io/projected/7a5d1355-cfbe-4ebb-bb37-a8ce5b29de81-kube-api-access-qhdv9\") pod \"memcached-0\" (UID: \"7a5d1355-cfbe-4ebb-bb37-a8ce5b29de81\") " pod="openstack/memcached-0" Dec 09 11:19:34 crc kubenswrapper[5002]: I1209 11:19:34.354444 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7a5d1355-cfbe-4ebb-bb37-a8ce5b29de81-kolla-config\") pod \"memcached-0\" (UID: \"7a5d1355-cfbe-4ebb-bb37-a8ce5b29de81\") " pod="openstack/memcached-0" Dec 09 11:19:34 crc kubenswrapper[5002]: I1209 11:19:34.355488 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7a5d1355-cfbe-4ebb-bb37-a8ce5b29de81-config-data\") pod \"memcached-0\" (UID: \"7a5d1355-cfbe-4ebb-bb37-a8ce5b29de81\") " pod="openstack/memcached-0" Dec 09 11:19:34 crc kubenswrapper[5002]: I1209 11:19:34.530915 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhdv9\" (UniqueName: \"kubernetes.io/projected/7a5d1355-cfbe-4ebb-bb37-a8ce5b29de81-kube-api-access-qhdv9\") pod \"memcached-0\" (UID: \"7a5d1355-cfbe-4ebb-bb37-a8ce5b29de81\") " pod="openstack/memcached-0" Dec 09 11:19:34 crc kubenswrapper[5002]: W1209 11:19:34.535132 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68ec5be1_063f_481b_a00c_cf882c596d5f.slice/crio-9442fcce033257a4ae7ee77437e8c945bf73e22067a7cd6202f6932718ef79c5 WatchSource:0}: Error finding container 9442fcce033257a4ae7ee77437e8c945bf73e22067a7cd6202f6932718ef79c5: Status 404 returned error can't find the container with id 9442fcce033257a4ae7ee77437e8c945bf73e22067a7cd6202f6932718ef79c5 Dec 09 11:19:34 crc kubenswrapper[5002]: I1209 
11:19:34.761209 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 09 11:19:35 crc kubenswrapper[5002]: I1209 11:19:35.207952 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 09 11:19:35 crc kubenswrapper[5002]: I1209 11:19:35.209416 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 09 11:19:35 crc kubenswrapper[5002]: I1209 11:19:35.215143 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 09 11:19:35 crc kubenswrapper[5002]: I1209 11:19:35.215222 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 09 11:19:35 crc kubenswrapper[5002]: I1209 11:19:35.215368 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 09 11:19:35 crc kubenswrapper[5002]: I1209 11:19:35.215405 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-7wk8s" Dec 09 11:19:35 crc kubenswrapper[5002]: I1209 11:19:35.223772 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d","Type":"ContainerStarted","Data":"33e96bd8698fd76a0217e6236bcf79b43a092291e150d26de91036fad7b27a98"} Dec 09 11:19:35 crc kubenswrapper[5002]: I1209 11:19:35.226324 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"62678300-fdda-43eb-88cc-9104f3950c14","Type":"ContainerStarted","Data":"5ec2ff841e47ea0923b05e4ba1dbcafa53790339f2cd876f85e28c70d44dfca1"} Dec 09 11:19:35 crc kubenswrapper[5002]: I1209 11:19:35.227875 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 09 11:19:35 crc kubenswrapper[5002]: I1209 11:19:35.230736 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"68ec5be1-063f-481b-a00c-cf882c596d5f","Type":"ContainerStarted","Data":"a4e0fefc735a0ffdc86e4f1e67e2b03d60df49c6cab722578fca00244790d0aa"} Dec 09 11:19:35 crc kubenswrapper[5002]: I1209 11:19:35.230779 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"68ec5be1-063f-481b-a00c-cf882c596d5f","Type":"ContainerStarted","Data":"9442fcce033257a4ae7ee77437e8c945bf73e22067a7cd6202f6932718ef79c5"} Dec 09 11:19:35 crc kubenswrapper[5002]: I1209 11:19:35.231091 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-98ddfc8f-rw2wd" Dec 09 11:19:35 crc kubenswrapper[5002]: I1209 11:19:35.275571 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 09 11:19:35 crc kubenswrapper[5002]: W1209 11:19:35.283146 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a5d1355_cfbe_4ebb_bb37_a8ce5b29de81.slice/crio-275916fe37e0539cf87d0645d7a8d4856087817997920b6a073670fb53fcdd6e WatchSource:0}: Error finding container 275916fe37e0539cf87d0645d7a8d4856087817997920b6a073670fb53fcdd6e: Status 404 returned error can't find the container with id 275916fe37e0539cf87d0645d7a8d4856087817997920b6a073670fb53fcdd6e Dec 09 11:19:35 crc kubenswrapper[5002]: I1209 11:19:35.370714 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-z2zxv\" (UniqueName: \"kubernetes.io/projected/22c2018d-2934-4552-a848-770614c6ff8c-kube-api-access-z2zxv\") pod \"openstack-cell1-galera-0\" (UID: \"22c2018d-2934-4552-a848-770614c6ff8c\") " pod="openstack/openstack-cell1-galera-0" Dec 09 11:19:35 crc kubenswrapper[5002]: I1209 11:19:35.370772 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/22c2018d-2934-4552-a848-770614c6ff8c-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"22c2018d-2934-4552-a848-770614c6ff8c\") " pod="openstack/openstack-cell1-galera-0" Dec 09 11:19:35 crc kubenswrapper[5002]: I1209 11:19:35.370865 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22c2018d-2934-4552-a848-770614c6ff8c-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"22c2018d-2934-4552-a848-770614c6ff8c\") " pod="openstack/openstack-cell1-galera-0" Dec 09 11:19:35 crc kubenswrapper[5002]: I1209 11:19:35.370911 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a1c88718-4535-4315-8911-ce45d5b3e9e7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a1c88718-4535-4315-8911-ce45d5b3e9e7\") pod \"openstack-cell1-galera-0\" (UID: \"22c2018d-2934-4552-a848-770614c6ff8c\") " pod="openstack/openstack-cell1-galera-0" Dec 09 11:19:35 crc kubenswrapper[5002]: I1209 11:19:35.370959 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22c2018d-2934-4552-a848-770614c6ff8c-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"22c2018d-2934-4552-a848-770614c6ff8c\") " pod="openstack/openstack-cell1-galera-0" Dec 09 11:19:35 crc kubenswrapper[5002]: I1209 11:19:35.370998 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/22c2018d-2934-4552-a848-770614c6ff8c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"22c2018d-2934-4552-a848-770614c6ff8c\") " pod="openstack/openstack-cell1-galera-0" Dec 09 11:19:35 crc kubenswrapper[5002]: I1209 11:19:35.371029 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/22c2018d-2934-4552-a848-770614c6ff8c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"22c2018d-2934-4552-a848-770614c6ff8c\") " pod="openstack/openstack-cell1-galera-0" Dec 09 11:19:35 crc kubenswrapper[5002]: I1209 11:19:35.371076 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/22c2018d-2934-4552-a848-770614c6ff8c-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"22c2018d-2934-4552-a848-770614c6ff8c\") " pod="openstack/openstack-cell1-galera-0" Dec 09 11:19:35 crc kubenswrapper[5002]: I1209 11:19:35.473015 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2zxv\" (UniqueName: \"kubernetes.io/projected/22c2018d-2934-4552-a848-770614c6ff8c-kube-api-access-z2zxv\") pod \"openstack-cell1-galera-0\" (UID: \"22c2018d-2934-4552-a848-770614c6ff8c\") " pod="openstack/openstack-cell1-galera-0" Dec 09 11:19:35 
crc kubenswrapper[5002]: I1209 11:19:35.473070 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/22c2018d-2934-4552-a848-770614c6ff8c-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"22c2018d-2934-4552-a848-770614c6ff8c\") " pod="openstack/openstack-cell1-galera-0" Dec 09 11:19:35 crc kubenswrapper[5002]: I1209 11:19:35.473110 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22c2018d-2934-4552-a848-770614c6ff8c-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"22c2018d-2934-4552-a848-770614c6ff8c\") " pod="openstack/openstack-cell1-galera-0" Dec 09 11:19:35 crc kubenswrapper[5002]: I1209 11:19:35.473143 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a1c88718-4535-4315-8911-ce45d5b3e9e7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a1c88718-4535-4315-8911-ce45d5b3e9e7\") pod \"openstack-cell1-galera-0\" (UID: \"22c2018d-2934-4552-a848-770614c6ff8c\") " pod="openstack/openstack-cell1-galera-0" Dec 09 11:19:35 crc kubenswrapper[5002]: I1209 11:19:35.473175 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22c2018d-2934-4552-a848-770614c6ff8c-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"22c2018d-2934-4552-a848-770614c6ff8c\") " pod="openstack/openstack-cell1-galera-0" Dec 09 11:19:35 crc kubenswrapper[5002]: I1209 11:19:35.473204 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/22c2018d-2934-4552-a848-770614c6ff8c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"22c2018d-2934-4552-a848-770614c6ff8c\") " pod="openstack/openstack-cell1-galera-0" Dec 09 11:19:35 crc kubenswrapper[5002]: I1209 11:19:35.473228 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/22c2018d-2934-4552-a848-770614c6ff8c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"22c2018d-2934-4552-a848-770614c6ff8c\") " pod="openstack/openstack-cell1-galera-0" Dec 09 11:19:35 crc kubenswrapper[5002]: I1209 11:19:35.473251 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/22c2018d-2934-4552-a848-770614c6ff8c-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"22c2018d-2934-4552-a848-770614c6ff8c\") " pod="openstack/openstack-cell1-galera-0" Dec 09 11:19:35 crc kubenswrapper[5002]: I1209 11:19:35.473663 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/22c2018d-2934-4552-a848-770614c6ff8c-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"22c2018d-2934-4552-a848-770614c6ff8c\") " pod="openstack/openstack-cell1-galera-0" Dec 09 11:19:35 crc kubenswrapper[5002]: I1209 11:19:35.474528 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/22c2018d-2934-4552-a848-770614c6ff8c-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"22c2018d-2934-4552-a848-770614c6ff8c\") " pod="openstack/openstack-cell1-galera-0" Dec 09 11:19:35 crc kubenswrapper[5002]: I1209 11:19:35.475092 5002 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22c2018d-2934-4552-a848-770614c6ff8c-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"22c2018d-2934-4552-a848-770614c6ff8c\") " pod="openstack/openstack-cell1-galera-0" Dec 09 11:19:35 crc kubenswrapper[5002]: I1209 11:19:35.476228 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/22c2018d-2934-4552-a848-770614c6ff8c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"22c2018d-2934-4552-a848-770614c6ff8c\") " pod="openstack/openstack-cell1-galera-0" Dec 09 11:19:35 crc kubenswrapper[5002]: I1209 11:19:35.481423 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/22c2018d-2934-4552-a848-770614c6ff8c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"22c2018d-2934-4552-a848-770614c6ff8c\") " pod="openstack/openstack-cell1-galera-0" Dec 09 11:19:35 crc kubenswrapper[5002]: I1209 11:19:35.481650 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22c2018d-2934-4552-a848-770614c6ff8c-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"22c2018d-2934-4552-a848-770614c6ff8c\") " pod="openstack/openstack-cell1-galera-0" Dec 09 11:19:35 crc kubenswrapper[5002]: I1209 11:19:35.482663 5002 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 09 11:19:35 crc kubenswrapper[5002]: I1209 11:19:35.482722 5002 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a1c88718-4535-4315-8911-ce45d5b3e9e7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a1c88718-4535-4315-8911-ce45d5b3e9e7\") pod \"openstack-cell1-galera-0\" (UID: \"22c2018d-2934-4552-a848-770614c6ff8c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/04725cfceaded07bc0bcdeef889119f61c347ada7696da53253a8826ae7afd3b/globalmount\"" pod="openstack/openstack-cell1-galera-0" Dec 09 11:19:35 crc kubenswrapper[5002]: I1209 11:19:35.490274 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2zxv\" (UniqueName: \"kubernetes.io/projected/22c2018d-2934-4552-a848-770614c6ff8c-kube-api-access-z2zxv\") pod \"openstack-cell1-galera-0\" (UID: \"22c2018d-2934-4552-a848-770614c6ff8c\") " pod="openstack/openstack-cell1-galera-0" Dec 09 11:19:35 crc kubenswrapper[5002]: I1209 11:19:35.509220 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a1c88718-4535-4315-8911-ce45d5b3e9e7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a1c88718-4535-4315-8911-ce45d5b3e9e7\") pod \"openstack-cell1-galera-0\" (UID: \"22c2018d-2934-4552-a848-770614c6ff8c\") " pod="openstack/openstack-cell1-galera-0" Dec 09 11:19:35 crc kubenswrapper[5002]: I1209 11:19:35.529999 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 09 11:19:35 crc kubenswrapper[5002]: I1209 11:19:35.954786 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 09 11:19:36 crc kubenswrapper[5002]: I1209 11:19:36.240431 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"7a5d1355-cfbe-4ebb-bb37-a8ce5b29de81","Type":"ContainerStarted","Data":"f501216286ab5eb8414fb99595d339bb5bd5b14363bf0f1477d656879f0e3fab"} Dec 09 11:19:36 crc kubenswrapper[5002]: I1209 11:19:36.240473 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"7a5d1355-cfbe-4ebb-bb37-a8ce5b29de81","Type":"ContainerStarted","Data":"275916fe37e0539cf87d0645d7a8d4856087817997920b6a073670fb53fcdd6e"} Dec 09 11:19:36 crc kubenswrapper[5002]: I1209 11:19:36.269303 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.269274961 podStartE2EDuration="2.269274961s" podCreationTimestamp="2025-12-09 11:19:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:19:36.267205546 +0000 UTC m=+4708.659256657" watchObservedRunningTime="2025-12-09 11:19:36.269274961 +0000 UTC m=+4708.661326042" Dec 09 11:19:37 crc kubenswrapper[5002]: I1209 11:19:37.060065 5002 scope.go:117] "RemoveContainer" containerID="77659b1c47a86560c49c87566b1a9998228a70363d6028c0f54d97d529fda9b3" Dec 09 11:19:37 crc kubenswrapper[5002]: E1209 11:19:37.060751 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:19:37 crc kubenswrapper[5002]: I1209 11:19:37.254250 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"22c2018d-2934-4552-a848-770614c6ff8c","Type":"ContainerStarted","Data":"2e9251ecdd93612a08128be6810d9a6a550c13f36cb264b9f1ca52c58629ab11"} Dec 09 11:19:37 crc kubenswrapper[5002]: I1209 11:19:37.254349 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"22c2018d-2934-4552-a848-770614c6ff8c","Type":"ContainerStarted","Data":"b802afe7e6cd4d3402ab95c98ea2f3e5f057ebbcb421356a5b6d42234b5f6ba6"} Dec 09 11:19:37 crc kubenswrapper[5002]: I1209 11:19:37.254496 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 09 11:19:38 crc kubenswrapper[5002]: I1209 11:19:38.267640 5002 generic.go:334] "Generic (PLEG): container finished" podID="68ec5be1-063f-481b-a00c-cf882c596d5f" containerID="a4e0fefc735a0ffdc86e4f1e67e2b03d60df49c6cab722578fca00244790d0aa" exitCode=0 Dec 09 11:19:38 crc kubenswrapper[5002]: I1209 11:19:38.267705 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"68ec5be1-063f-481b-a00c-cf882c596d5f","Type":"ContainerDied","Data":"a4e0fefc735a0ffdc86e4f1e67e2b03d60df49c6cab722578fca00244790d0aa"} Dec 09 11:19:39 crc kubenswrapper[5002]: I1209 11:19:39.276920 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"68ec5be1-063f-481b-a00c-cf882c596d5f","Type":"ContainerStarted","Data":"d6239ca21443c158566a3ea2f3fc3051b19dac8cbf30575b2d5737647edf3d17"} Dec 09 11:19:39 crc kubenswrapper[5002]: I1209 11:19:39.307030 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=7.307014021 podStartE2EDuration="7.307014021s" podCreationTimestamp="2025-12-09 11:19:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:19:39.303754544 +0000 UTC m=+4711.695805625" watchObservedRunningTime="2025-12-09 11:19:39.307014021 +0000 UTC m=+4711.699065102" Dec 09 11:19:40 crc kubenswrapper[5002]: I1209 11:19:40.285958 5002 generic.go:334] "Generic (PLEG): container finished" podID="22c2018d-2934-4552-a848-770614c6ff8c" containerID="2e9251ecdd93612a08128be6810d9a6a550c13f36cb264b9f1ca52c58629ab11" exitCode=0 Dec 09 11:19:40 crc kubenswrapper[5002]: I1209 11:19:40.286004 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"22c2018d-2934-4552-a848-770614c6ff8c","Type":"ContainerDied","Data":"2e9251ecdd93612a08128be6810d9a6a550c13f36cb264b9f1ca52c58629ab11"} Dec 09 11:19:41 crc kubenswrapper[5002]: I1209 11:19:41.297099 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"22c2018d-2934-4552-a848-770614c6ff8c","Type":"ContainerStarted","Data":"cf794f5f43eeb1b0f8d0eda116c5217aa7b8bf396febc11c55190945a60a9251"} Dec 09 11:19:41 crc kubenswrapper[5002]: I1209 11:19:41.342212 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=7.342192046 podStartE2EDuration="7.342192046s" podCreationTimestamp="2025-12-09 11:19:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:19:41.331951283 +0000 UTC m=+4713.724002394" watchObservedRunningTime="2025-12-09 11:19:41.342192046 +0000 UTC m=+4713.734243137" Dec 09 11:19:41 crc kubenswrapper[5002]: I1209 11:19:41.585129 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d7b5456f5-6z772" Dec 09 11:19:41 crc kubenswrapper[5002]: I1209 11:19:41.867354 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-98ddfc8f-rw2wd" Dec 09 11:19:41 crc kubenswrapper[5002]: I1209 11:19:41.909996 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-6z772"] Dec 09 11:19:42 crc kubenswrapper[5002]: I1209 11:19:42.302939 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d7b5456f5-6z772" podUID="37910bb8-e88f-4c7f-ae2a-868ac75ff9d1" containerName="dnsmasq-dns" containerID="cri-o://b97fc566f387c3c48a2cf22a9c62dd3ee0ec22f35056658a14f34fd62a333800" gracePeriod=10 Dec 09 11:19:43 crc kubenswrapper[5002]: I1209 11:19:43.234777 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-6z772" Dec 09 11:19:43 crc kubenswrapper[5002]: I1209 11:19:43.312116 5002 generic.go:334] "Generic (PLEG): container finished" podID="37910bb8-e88f-4c7f-ae2a-868ac75ff9d1" containerID="b97fc566f387c3c48a2cf22a9c62dd3ee0ec22f35056658a14f34fd62a333800" exitCode=0 Dec 09 11:19:43 crc kubenswrapper[5002]: I1209 11:19:43.312158 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-6z772" event={"ID":"37910bb8-e88f-4c7f-ae2a-868ac75ff9d1","Type":"ContainerDied","Data":"b97fc566f387c3c48a2cf22a9c62dd3ee0ec22f35056658a14f34fd62a333800"} Dec 09 11:19:43 crc kubenswrapper[5002]: I1209 11:19:43.312191 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-6z772" event={"ID":"37910bb8-e88f-4c7f-ae2a-868ac75ff9d1","Type":"ContainerDied","Data":"40e9c52a8d729862f26d31facb508cd1607add675a662862f1d7310df5e0a8b5"} Dec 09 11:19:43 crc kubenswrapper[5002]: I1209 11:19:43.312190 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-6z772" Dec 09 11:19:43 crc kubenswrapper[5002]: I1209 11:19:43.312279 5002 scope.go:117] "RemoveContainer" containerID="b97fc566f387c3c48a2cf22a9c62dd3ee0ec22f35056658a14f34fd62a333800" Dec 09 11:19:43 crc kubenswrapper[5002]: I1209 11:19:43.336753 5002 scope.go:117] "RemoveContainer" containerID="234ce99c6f653ee082b5ae28f9e01005519c705ebf17171f65b6b979227b0c03" Dec 09 11:19:43 crc kubenswrapper[5002]: I1209 11:19:43.358095 5002 scope.go:117] "RemoveContainer" containerID="b97fc566f387c3c48a2cf22a9c62dd3ee0ec22f35056658a14f34fd62a333800" Dec 09 11:19:43 crc kubenswrapper[5002]: E1209 11:19:43.358464 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b97fc566f387c3c48a2cf22a9c62dd3ee0ec22f35056658a14f34fd62a333800\": container with ID starting with b97fc566f387c3c48a2cf22a9c62dd3ee0ec22f35056658a14f34fd62a333800 not found: ID does not exist" containerID="b97fc566f387c3c48a2cf22a9c62dd3ee0ec22f35056658a14f34fd62a333800" Dec 09 11:19:43 crc kubenswrapper[5002]: I1209 11:19:43.358494 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b97fc566f387c3c48a2cf22a9c62dd3ee0ec22f35056658a14f34fd62a333800"} err="failed to get container status \"b97fc566f387c3c48a2cf22a9c62dd3ee0ec22f35056658a14f34fd62a333800\": rpc error: code = NotFound desc = could not find container \"b97fc566f387c3c48a2cf22a9c62dd3ee0ec22f35056658a14f34fd62a333800\": container with ID starting with b97fc566f387c3c48a2cf22a9c62dd3ee0ec22f35056658a14f34fd62a333800 not found: ID does not exist" Dec 09 11:19:43 crc kubenswrapper[5002]: I1209 11:19:43.358515 5002 scope.go:117] "RemoveContainer" containerID="234ce99c6f653ee082b5ae28f9e01005519c705ebf17171f65b6b979227b0c03" Dec 09 11:19:43 crc kubenswrapper[5002]: E1209 11:19:43.358753 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"234ce99c6f653ee082b5ae28f9e01005519c705ebf17171f65b6b979227b0c03\": container with ID starting with 234ce99c6f653ee082b5ae28f9e01005519c705ebf17171f65b6b979227b0c03 not found: ID does not exist" containerID="234ce99c6f653ee082b5ae28f9e01005519c705ebf17171f65b6b979227b0c03" Dec 09 11:19:43 crc kubenswrapper[5002]: I1209 11:19:43.358800 5002 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"234ce99c6f653ee082b5ae28f9e01005519c705ebf17171f65b6b979227b0c03"} err="failed to get container status \"234ce99c6f653ee082b5ae28f9e01005519c705ebf17171f65b6b979227b0c03\": rpc error: code = NotFound desc = could not find container \"234ce99c6f653ee082b5ae28f9e01005519c705ebf17171f65b6b979227b0c03\": container with ID starting with 234ce99c6f653ee082b5ae28f9e01005519c705ebf17171f65b6b979227b0c03 not found: ID does not exist" Dec 09 11:19:43 crc kubenswrapper[5002]: I1209 11:19:43.401760 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krcjh\" (UniqueName: \"kubernetes.io/projected/37910bb8-e88f-4c7f-ae2a-868ac75ff9d1-kube-api-access-krcjh\") pod \"37910bb8-e88f-4c7f-ae2a-868ac75ff9d1\" (UID: \"37910bb8-e88f-4c7f-ae2a-868ac75ff9d1\") " Dec 09 11:19:43 crc kubenswrapper[5002]: I1209 11:19:43.401974 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37910bb8-e88f-4c7f-ae2a-868ac75ff9d1-dns-svc\") pod \"37910bb8-e88f-4c7f-ae2a-868ac75ff9d1\" (UID: \"37910bb8-e88f-4c7f-ae2a-868ac75ff9d1\") " Dec 09 11:19:43 crc kubenswrapper[5002]: I1209 11:19:43.402110 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37910bb8-e88f-4c7f-ae2a-868ac75ff9d1-config\") pod \"37910bb8-e88f-4c7f-ae2a-868ac75ff9d1\" (UID: \"37910bb8-e88f-4c7f-ae2a-868ac75ff9d1\") " Dec 09 11:19:43 crc kubenswrapper[5002]: I1209 11:19:43.406794 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37910bb8-e88f-4c7f-ae2a-868ac75ff9d1-kube-api-access-krcjh" (OuterVolumeSpecName: "kube-api-access-krcjh") pod "37910bb8-e88f-4c7f-ae2a-868ac75ff9d1" (UID: "37910bb8-e88f-4c7f-ae2a-868ac75ff9d1"). InnerVolumeSpecName "kube-api-access-krcjh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:19:43 crc kubenswrapper[5002]: I1209 11:19:43.437307 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37910bb8-e88f-4c7f-ae2a-868ac75ff9d1-config" (OuterVolumeSpecName: "config") pod "37910bb8-e88f-4c7f-ae2a-868ac75ff9d1" (UID: "37910bb8-e88f-4c7f-ae2a-868ac75ff9d1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:19:43 crc kubenswrapper[5002]: I1209 11:19:43.441679 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37910bb8-e88f-4c7f-ae2a-868ac75ff9d1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "37910bb8-e88f-4c7f-ae2a-868ac75ff9d1" (UID: "37910bb8-e88f-4c7f-ae2a-868ac75ff9d1"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:19:43 crc kubenswrapper[5002]: I1209 11:19:43.503609 5002 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37910bb8-e88f-4c7f-ae2a-868ac75ff9d1-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 11:19:43 crc kubenswrapper[5002]: I1209 11:19:43.503647 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37910bb8-e88f-4c7f-ae2a-868ac75ff9d1-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:19:43 crc kubenswrapper[5002]: I1209 11:19:43.503657 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krcjh\" (UniqueName: \"kubernetes.io/projected/37910bb8-e88f-4c7f-ae2a-868ac75ff9d1-kube-api-access-krcjh\") on node \"crc\" DevicePath \"\"" Dec 09 11:19:43 crc kubenswrapper[5002]: I1209 11:19:43.638606 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-6z772"] Dec 09 11:19:43 crc kubenswrapper[5002]: I1209 11:19:43.643623 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-6z772"] Dec 09 11:19:43 crc kubenswrapper[5002]: I1209 11:19:43.989066 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 09 11:19:43 crc kubenswrapper[5002]: I1209 11:19:43.989139 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 09 11:19:44 crc kubenswrapper[5002]: I1209 11:19:44.075862 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37910bb8-e88f-4c7f-ae2a-868ac75ff9d1" path="/var/lib/kubelet/pods/37910bb8-e88f-4c7f-ae2a-868ac75ff9d1/volumes" Dec 09 11:19:44 crc kubenswrapper[5002]: I1209 11:19:44.762535 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 09 11:19:45 crc kubenswrapper[5002]: I1209 11:19:45.530460 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 09 11:19:45 crc kubenswrapper[5002]: I1209 11:19:45.530523 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 09 11:19:45 crc kubenswrapper[5002]: I1209 11:19:45.619189 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 09 11:19:46 crc kubenswrapper[5002]: I1209 11:19:46.215677 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 09 11:19:46 crc kubenswrapper[5002]: I1209 11:19:46.321950 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 09 11:19:46 crc kubenswrapper[5002]: I1209 11:19:46.417250 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 09 11:19:48 crc kubenswrapper[5002]: I1209 11:19:48.069092 5002 scope.go:117] "RemoveContainer" containerID="77659b1c47a86560c49c87566b1a9998228a70363d6028c0f54d97d529fda9b3" Dec 09 11:19:48 crc kubenswrapper[5002]: E1209 11:19:48.069544 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:19:59 crc kubenswrapper[5002]: I1209 11:19:59.060019 5002 scope.go:117] "RemoveContainer" containerID="77659b1c47a86560c49c87566b1a9998228a70363d6028c0f54d97d529fda9b3" Dec 09 11:19:59 crc kubenswrapper[5002]: E1209 11:19:59.061905 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:20:07 crc kubenswrapper[5002]: I1209 11:20:07.531172 5002 generic.go:334] "Generic (PLEG): container finished" podID="c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d" containerID="33e96bd8698fd76a0217e6236bcf79b43a092291e150d26de91036fad7b27a98" exitCode=0 Dec 09 11:20:07 crc kubenswrapper[5002]: I1209 11:20:07.531353 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d","Type":"ContainerDied","Data":"33e96bd8698fd76a0217e6236bcf79b43a092291e150d26de91036fad7b27a98"} Dec 09 11:20:07 crc kubenswrapper[5002]: I1209 11:20:07.533507 5002 generic.go:334] "Generic (PLEG): container finished" podID="62678300-fdda-43eb-88cc-9104f3950c14" containerID="5ec2ff841e47ea0923b05e4ba1dbcafa53790339f2cd876f85e28c70d44dfca1" exitCode=0 Dec 09 11:20:07 crc kubenswrapper[5002]: I1209 11:20:07.533533 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"62678300-fdda-43eb-88cc-9104f3950c14","Type":"ContainerDied","Data":"5ec2ff841e47ea0923b05e4ba1dbcafa53790339f2cd876f85e28c70d44dfca1"} Dec 09 11:20:08 crc kubenswrapper[5002]: I1209 11:20:08.542564 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d","Type":"ContainerStarted","Data":"0433cb4f025b1cb810dc68f09d78b047fc215c4b0d0317e77feb2580f895b808"} Dec 09 11:20:08 crc kubenswrapper[5002]: I1209 11:20:08.543123 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 09 11:20:08 crc kubenswrapper[5002]: I1209 11:20:08.544710 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"62678300-fdda-43eb-88cc-9104f3950c14","Type":"ContainerStarted","Data":"4396c5168179d7a0079b9815670857e903918c3ca0b43a905776f2acd2700f08"} Dec 09 11:20:08 crc kubenswrapper[5002]: I1209 11:20:08.544932 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:20:08 crc kubenswrapper[5002]: I1209 11:20:08.572249 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.572230094 podStartE2EDuration="37.572230094s" podCreationTimestamp="2025-12-09 11:19:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:20:08.563419909 +0000 UTC m=+4740.955471040" watchObservedRunningTime="2025-12-09 11:20:08.572230094 +0000 UTC m=+4740.964281185" Dec 09 11:20:08 crc kubenswrapper[5002]: I1209 11:20:08.590330 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.590303588 podStartE2EDuration="37.590303588s" podCreationTimestamp="2025-12-09 11:19:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:20:08.586960308 +0000 UTC m=+4740.979011409" watchObservedRunningTime="2025-12-09 11:20:08.590303588 +0000 UTC m=+4740.982354669" Dec 09 11:20:14 crc kubenswrapper[5002]: I1209 11:20:14.064344 5002 scope.go:117] "RemoveContainer" containerID="77659b1c47a86560c49c87566b1a9998228a70363d6028c0f54d97d529fda9b3" Dec 09 11:20:14 crc kubenswrapper[5002]: E1209 11:20:14.065189 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:20:22 crc kubenswrapper[5002]: I1209 11:20:22.759037 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 09 11:20:23 crc kubenswrapper[5002]: I1209 11:20:23.029559 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:20:25 crc kubenswrapper[5002]: I1209 11:20:25.059927 5002 scope.go:117] "RemoveContainer" containerID="77659b1c47a86560c49c87566b1a9998228a70363d6028c0f54d97d529fda9b3" Dec 09 11:20:25 crc kubenswrapper[5002]: E1209 11:20:25.061052 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:20:28 crc kubenswrapper[5002]: I1209 11:20:28.495900 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-dpz7k"] Dec 09 11:20:28 crc kubenswrapper[5002]: E1209 11:20:28.496643 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37910bb8-e88f-4c7f-ae2a-868ac75ff9d1" containerName="init" Dec 09 11:20:28 crc kubenswrapper[5002]: I1209 11:20:28.496660 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="37910bb8-e88f-4c7f-ae2a-868ac75ff9d1" containerName="init" Dec 09 11:20:28 crc kubenswrapper[5002]: E1209 11:20:28.496681 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37910bb8-e88f-4c7f-ae2a-868ac75ff9d1" containerName="dnsmasq-dns" Dec 09 11:20:28 crc kubenswrapper[5002]: I1209 11:20:28.496689 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="37910bb8-e88f-4c7f-ae2a-868ac75ff9d1" containerName="dnsmasq-dns" Dec 09 11:20:28 crc kubenswrapper[5002]: I1209 11:20:28.496896 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="37910bb8-e88f-4c7f-ae2a-868ac75ff9d1" containerName="dnsmasq-dns" Dec 09 11:20:28 crc kubenswrapper[5002]: I1209 11:20:28.497861 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-dpz7k" Dec 09 11:20:28 crc kubenswrapper[5002]: I1209 11:20:28.505879 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-dpz7k"] Dec 09 11:20:28 crc kubenswrapper[5002]: I1209 11:20:28.506559 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrznh\" (UniqueName: \"kubernetes.io/projected/eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba-kube-api-access-jrznh\") pod \"dnsmasq-dns-5b7946d7b9-dpz7k\" (UID: \"eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba\") " pod="openstack/dnsmasq-dns-5b7946d7b9-dpz7k" Dec 09 11:20:28 crc kubenswrapper[5002]: I1209 11:20:28.506629 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba-config\") pod \"dnsmasq-dns-5b7946d7b9-dpz7k\" (UID: \"eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba\") " pod="openstack/dnsmasq-dns-5b7946d7b9-dpz7k" Dec 09 11:20:28 crc kubenswrapper[5002]: I1209 11:20:28.506794 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-dpz7k\" (UID: \"eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba\") " pod="openstack/dnsmasq-dns-5b7946d7b9-dpz7k" Dec 09 11:20:28 crc kubenswrapper[5002]: I1209 11:20:28.607932 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrznh\" (UniqueName: \"kubernetes.io/projected/eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba-kube-api-access-jrznh\") pod \"dnsmasq-dns-5b7946d7b9-dpz7k\" (UID: \"eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba\") " pod="openstack/dnsmasq-dns-5b7946d7b9-dpz7k" Dec 09 11:20:28 crc kubenswrapper[5002]: I1209 11:20:28.607993 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba-config\") pod \"dnsmasq-dns-5b7946d7b9-dpz7k\" (UID: \"eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba\") " pod="openstack/dnsmasq-dns-5b7946d7b9-dpz7k" Dec 09 11:20:28 crc kubenswrapper[5002]: I1209 11:20:28.608030 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-dpz7k\" (UID: \"eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba\") " pod="openstack/dnsmasq-dns-5b7946d7b9-dpz7k" Dec 09 11:20:28 crc kubenswrapper[5002]: I1209 11:20:28.608905 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba-config\") pod \"dnsmasq-dns-5b7946d7b9-dpz7k\" (UID: \"eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba\") " pod="openstack/dnsmasq-dns-5b7946d7b9-dpz7k" Dec 09 11:20:28 crc kubenswrapper[5002]: I1209 11:20:28.609026 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-dpz7k\" (UID: \"eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba\") " pod="openstack/dnsmasq-dns-5b7946d7b9-dpz7k" Dec 09 11:20:28 crc kubenswrapper[5002]: I1209 11:20:28.625923 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrznh\" (UniqueName: 
\"kubernetes.io/projected/eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba-kube-api-access-jrznh\") pod \"dnsmasq-dns-5b7946d7b9-dpz7k\" (UID: \"eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba\") " pod="openstack/dnsmasq-dns-5b7946d7b9-dpz7k" Dec 09 11:20:28 crc kubenswrapper[5002]: I1209 11:20:28.818791 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-dpz7k" Dec 09 11:20:29 crc kubenswrapper[5002]: I1209 11:20:29.102025 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 09 11:20:29 crc kubenswrapper[5002]: I1209 11:20:29.300792 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-dpz7k"] Dec 09 11:20:29 crc kubenswrapper[5002]: I1209 11:20:29.676023 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 09 11:20:29 crc kubenswrapper[5002]: I1209 11:20:29.704561 5002 generic.go:334] "Generic (PLEG): container finished" podID="eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba" containerID="a3a5af4e962e93fa331a9e92580bea90dc7636bf8e59e67d2ec4c35ebf7417bf" exitCode=0 Dec 09 11:20:29 crc kubenswrapper[5002]: I1209 11:20:29.704602 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-dpz7k" event={"ID":"eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba","Type":"ContainerDied","Data":"a3a5af4e962e93fa331a9e92580bea90dc7636bf8e59e67d2ec4c35ebf7417bf"} Dec 09 11:20:29 crc kubenswrapper[5002]: I1209 11:20:29.704628 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-dpz7k" event={"ID":"eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba","Type":"ContainerStarted","Data":"3054edb575e045ae587f8e02401808bb624376baf6857a00d791acd5ccaecbb5"} Dec 09 11:20:30 crc kubenswrapper[5002]: I1209 11:20:30.713018 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-dpz7k" event={"ID":"eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba","Type":"ContainerStarted","Data":"0012bca043acab2f367ab8f18399810e3cae278d19cc2c29a78440d3dca75967"} Dec 09 11:20:30 crc kubenswrapper[5002]: I1209 11:20:30.713446 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b7946d7b9-dpz7k" Dec 09 11:20:30 crc kubenswrapper[5002]: I1209 11:20:30.735534 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b7946d7b9-dpz7k" podStartSLOduration=2.735513974 podStartE2EDuration="2.735513974s" podCreationTimestamp="2025-12-09 11:20:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:20:30.729091833 +0000 UTC m=+4763.121142944" watchObservedRunningTime="2025-12-09 11:20:30.735513974 +0000 UTC m=+4763.127565055" Dec 09 11:20:30 crc kubenswrapper[5002]: I1209 11:20:30.885007 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d" containerName="rabbitmq" containerID="cri-o://0433cb4f025b1cb810dc68f09d78b047fc215c4b0d0317e77feb2580f895b808" gracePeriod=604799 Dec 09 11:20:31 crc kubenswrapper[5002]: I1209 11:20:31.383006 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="62678300-fdda-43eb-88cc-9104f3950c14" containerName="rabbitmq" containerID="cri-o://4396c5168179d7a0079b9815670857e903918c3ca0b43a905776f2acd2700f08" gracePeriod=604799 Dec 09 11:20:32 crc 
kubenswrapper[5002]: I1209 11:20:32.756692 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.234:5672: connect: connection refused" Dec 09 11:20:33 crc kubenswrapper[5002]: I1209 11:20:33.028421 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="62678300-fdda-43eb-88cc-9104f3950c14" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.235:5672: connect: connection refused" Dec 09 11:20:36 crc kubenswrapper[5002]: I1209 11:20:36.060795 5002 scope.go:117] "RemoveContainer" containerID="77659b1c47a86560c49c87566b1a9998228a70363d6028c0f54d97d529fda9b3" Dec 09 11:20:36 crc kubenswrapper[5002]: E1209 11:20:36.061795 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:20:37 crc kubenswrapper[5002]: I1209 11:20:37.766390 5002 generic.go:334] "Generic (PLEG): container finished" podID="c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d" containerID="0433cb4f025b1cb810dc68f09d78b047fc215c4b0d0317e77feb2580f895b808" exitCode=0 Dec 09 11:20:37 crc kubenswrapper[5002]: I1209 11:20:37.766585 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d","Type":"ContainerDied","Data":"0433cb4f025b1cb810dc68f09d78b047fc215c4b0d0317e77feb2580f895b808"} Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.013126 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.063397 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d-rabbitmq-confd\") pod \"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d\" (UID: \"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d\") " Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.063438 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsckf\" (UniqueName: \"kubernetes.io/projected/c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d-kube-api-access-gsckf\") pod \"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d\" (UID: \"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d\") " Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.063518 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d-plugins-conf\") pod \"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d\" (UID: \"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d\") " Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.063549 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d-pod-info\") pod \"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d\" (UID: \"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d\") " Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.063597 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d-erlang-cookie-secret\") pod \"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d\" (UID: \"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d\") " Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.063622 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d-rabbitmq-plugins\") pod \"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d\" (UID: \"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d\") " Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.063854 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4ae7c983-54a3-49f5-bb37-4eec70f603f3\") pod \"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d\" (UID: \"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d\") " Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.063946 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d-server-conf\") pod \"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d\" (UID: \"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d\") " Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.063971 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d-rabbitmq-erlang-cookie\") pod \"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d\" (UID: \"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d\") " Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.064831 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d-rabbitmq-erlang-cookie" (OuterVolumeSpecName: 
"rabbitmq-erlang-cookie") pod "c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d" (UID: "c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.065320 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d" (UID: "c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.067185 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d" (UID: "c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.070298 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d" (UID: "c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.071546 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d-pod-info" (OuterVolumeSpecName: "pod-info") pod "c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d" (UID: "c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.071940 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d-kube-api-access-gsckf" (OuterVolumeSpecName: "kube-api-access-gsckf") pod "c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d" (UID: "c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d"). InnerVolumeSpecName "kube-api-access-gsckf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.082539 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4ae7c983-54a3-49f5-bb37-4eec70f603f3" (OuterVolumeSpecName: "persistence") pod "c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d" (UID: "c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d"). InnerVolumeSpecName "pvc-4ae7c983-54a3-49f5-bb37-4eec70f603f3". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.096444 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d-server-conf" (OuterVolumeSpecName: "server-conf") pod "c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d" (UID: "c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.161729 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d" (UID: "c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.166004 5002 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.166038 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsckf\" (UniqueName: \"kubernetes.io/projected/c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d-kube-api-access-gsckf\") on node \"crc\" DevicePath \"\"" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.166070 5002 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.166079 5002 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d-pod-info\") on node \"crc\" DevicePath \"\"" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.166088 5002 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.166096 5002 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.166152 5002 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-4ae7c983-54a3-49f5-bb37-4eec70f603f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4ae7c983-54a3-49f5-bb37-4eec70f603f3\") on node \"crc\" " Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.166165 5002 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d-server-conf\") on node \"crc\" DevicePath \"\"" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.166173 5002 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.188063 5002 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.188262 5002 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-4ae7c983-54a3-49f5-bb37-4eec70f603f3" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4ae7c983-54a3-49f5-bb37-4eec70f603f3") on node "crc" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.267627 5002 reconciler_common.go:293] "Volume detached for volume \"pvc-4ae7c983-54a3-49f5-bb37-4eec70f603f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4ae7c983-54a3-49f5-bb37-4eec70f603f3\") on node \"crc\" DevicePath \"\"" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.321741 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.369362 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/62678300-fdda-43eb-88cc-9104f3950c14-erlang-cookie-secret\") pod \"62678300-fdda-43eb-88cc-9104f3950c14\" (UID: \"62678300-fdda-43eb-88cc-9104f3950c14\") " Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.369429 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/62678300-fdda-43eb-88cc-9104f3950c14-server-conf\") pod \"62678300-fdda-43eb-88cc-9104f3950c14\" (UID: \"62678300-fdda-43eb-88cc-9104f3950c14\") " Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.369474 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/62678300-fdda-43eb-88cc-9104f3950c14-plugins-conf\") pod \"62678300-fdda-43eb-88cc-9104f3950c14\" (UID: \"62678300-fdda-43eb-88cc-9104f3950c14\") " Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.369543 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/62678300-fdda-43eb-88cc-9104f3950c14-pod-info\") pod \"62678300-fdda-43eb-88cc-9104f3950c14\" (UID: \"62678300-fdda-43eb-88cc-9104f3950c14\") " Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.369606 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/62678300-fdda-43eb-88cc-9104f3950c14-rabbitmq-confd\") pod \"62678300-fdda-43eb-88cc-9104f3950c14\" (UID: \"62678300-fdda-43eb-88cc-9104f3950c14\") " Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.369639 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jb8vk\" (UniqueName: \"kubernetes.io/projected/62678300-fdda-43eb-88cc-9104f3950c14-kube-api-access-jb8vk\") pod \"62678300-fdda-43eb-88cc-9104f3950c14\" (UID: \"62678300-fdda-43eb-88cc-9104f3950c14\") " Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.369714 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/62678300-fdda-43eb-88cc-9104f3950c14-rabbitmq-plugins\") pod \"62678300-fdda-43eb-88cc-9104f3950c14\" (UID: \"62678300-fdda-43eb-88cc-9104f3950c14\") " Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.369760 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/62678300-fdda-43eb-88cc-9104f3950c14-rabbitmq-erlang-cookie\") 
pod \"62678300-fdda-43eb-88cc-9104f3950c14\" (UID: \"62678300-fdda-43eb-88cc-9104f3950c14\") " Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.369944 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a840a07-2967-4f13-8754-4816b3607e7b\") pod \"62678300-fdda-43eb-88cc-9104f3950c14\" (UID: \"62678300-fdda-43eb-88cc-9104f3950c14\") " Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.369956 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62678300-fdda-43eb-88cc-9104f3950c14-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "62678300-fdda-43eb-88cc-9104f3950c14" (UID: "62678300-fdda-43eb-88cc-9104f3950c14"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.370251 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62678300-fdda-43eb-88cc-9104f3950c14-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "62678300-fdda-43eb-88cc-9104f3950c14" (UID: "62678300-fdda-43eb-88cc-9104f3950c14"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.370296 5002 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/62678300-fdda-43eb-88cc-9104f3950c14-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.370721 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62678300-fdda-43eb-88cc-9104f3950c14-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "62678300-fdda-43eb-88cc-9104f3950c14" (UID: "62678300-fdda-43eb-88cc-9104f3950c14"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.373516 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/62678300-fdda-43eb-88cc-9104f3950c14-pod-info" (OuterVolumeSpecName: "pod-info") pod "62678300-fdda-43eb-88cc-9104f3950c14" (UID: "62678300-fdda-43eb-88cc-9104f3950c14"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.374659 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62678300-fdda-43eb-88cc-9104f3950c14-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "62678300-fdda-43eb-88cc-9104f3950c14" (UID: "62678300-fdda-43eb-88cc-9104f3950c14"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.375377 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62678300-fdda-43eb-88cc-9104f3950c14-kube-api-access-jb8vk" (OuterVolumeSpecName: "kube-api-access-jb8vk") pod "62678300-fdda-43eb-88cc-9104f3950c14" (UID: "62678300-fdda-43eb-88cc-9104f3950c14"). InnerVolumeSpecName "kube-api-access-jb8vk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.391303 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a840a07-2967-4f13-8754-4816b3607e7b" (OuterVolumeSpecName: "persistence") pod "62678300-fdda-43eb-88cc-9104f3950c14" (UID: "62678300-fdda-43eb-88cc-9104f3950c14"). InnerVolumeSpecName "pvc-2a840a07-2967-4f13-8754-4816b3607e7b". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.401460 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62678300-fdda-43eb-88cc-9104f3950c14-server-conf" (OuterVolumeSpecName: "server-conf") pod "62678300-fdda-43eb-88cc-9104f3950c14" (UID: "62678300-fdda-43eb-88cc-9104f3950c14"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.455430 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62678300-fdda-43eb-88cc-9104f3950c14-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "62678300-fdda-43eb-88cc-9104f3950c14" (UID: "62678300-fdda-43eb-88cc-9104f3950c14"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.471197 5002 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/62678300-fdda-43eb-88cc-9104f3950c14-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.471403 5002 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/62678300-fdda-43eb-88cc-9104f3950c14-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.471486 5002 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-2a840a07-2967-4f13-8754-4816b3607e7b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a840a07-2967-4f13-8754-4816b3607e7b\") on node \"crc\" " Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.471554 5002 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/62678300-fdda-43eb-88cc-9104f3950c14-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.471611 5002 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/62678300-fdda-43eb-88cc-9104f3950c14-server-conf\") on node \"crc\" DevicePath \"\"" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.471663 5002 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/62678300-fdda-43eb-88cc-9104f3950c14-pod-info\") on node \"crc\" DevicePath \"\"" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.471716 5002 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/62678300-fdda-43eb-88cc-9104f3950c14-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.471768 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jb8vk\" (UniqueName: 
\"kubernetes.io/projected/62678300-fdda-43eb-88cc-9104f3950c14-kube-api-access-jb8vk\") on node \"crc\" DevicePath \"\"" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.487626 5002 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.487779 5002 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-2a840a07-2967-4f13-8754-4816b3607e7b" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a840a07-2967-4f13-8754-4816b3607e7b") on node "crc" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.573147 5002 reconciler_common.go:293] "Volume detached for volume \"pvc-2a840a07-2967-4f13-8754-4816b3607e7b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a840a07-2967-4f13-8754-4816b3607e7b\") on node \"crc\" DevicePath \"\"" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.776079 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d","Type":"ContainerDied","Data":"bf98ee07d9efb543e0f62abd19ef06dfb58a9bd98f98024880f698c783b85d55"} Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.776106 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.777483 5002 scope.go:117] "RemoveContainer" containerID="0433cb4f025b1cb810dc68f09d78b047fc215c4b0d0317e77feb2580f895b808" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.779085 5002 generic.go:334] "Generic (PLEG): container finished" podID="62678300-fdda-43eb-88cc-9104f3950c14" containerID="4396c5168179d7a0079b9815670857e903918c3ca0b43a905776f2acd2700f08" exitCode=0 Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.779127 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.779145 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"62678300-fdda-43eb-88cc-9104f3950c14","Type":"ContainerDied","Data":"4396c5168179d7a0079b9815670857e903918c3ca0b43a905776f2acd2700f08"} Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.779438 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"62678300-fdda-43eb-88cc-9104f3950c14","Type":"ContainerDied","Data":"594f1cf755d2099a61c5d9ba22973ad642f297cbf769eac1ce6c6fc9993ba2fb"} Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.804264 5002 scope.go:117] "RemoveContainer" containerID="33e96bd8698fd76a0217e6236bcf79b43a092291e150d26de91036fad7b27a98" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.818735 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.820008 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b7946d7b9-dpz7k" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.828039 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.843541 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 09 11:20:38 crc kubenswrapper[5002]: E1209 11:20:38.844019 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62678300-fdda-43eb-88cc-9104f3950c14" containerName="rabbitmq" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.844046 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="62678300-fdda-43eb-88cc-9104f3950c14" containerName="rabbitmq" Dec 09 11:20:38 crc kubenswrapper[5002]: E1209 11:20:38.844075 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d" containerName="rabbitmq" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.844083 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d" containerName="rabbitmq" Dec 09 11:20:38 crc kubenswrapper[5002]: E1209 11:20:38.844101 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d" containerName="setup-container" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.844110 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d" containerName="setup-container" Dec 09 11:20:38 crc kubenswrapper[5002]: E1209 11:20:38.844125 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62678300-fdda-43eb-88cc-9104f3950c14" containerName="setup-container" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.844134 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="62678300-fdda-43eb-88cc-9104f3950c14" containerName="setup-container" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.844332 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d" containerName="rabbitmq" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.844366 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="62678300-fdda-43eb-88cc-9104f3950c14" containerName="rabbitmq" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.845719 5002 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.850189 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.850449 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-228rx" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.850764 5002 scope.go:117] "RemoveContainer" containerID="4396c5168179d7a0079b9815670857e903918c3ca0b43a905776f2acd2700f08" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.851067 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.851273 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.851481 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.876008 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.880701 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/22ca3dfe-9996-43c3-89b4-ee6624561059-server-conf\") pod \"rabbitmq-server-0\" (UID: \"22ca3dfe-9996-43c3-89b4-ee6624561059\") " pod="openstack/rabbitmq-server-0" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.880777 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/22ca3dfe-9996-43c3-89b4-ee6624561059-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"22ca3dfe-9996-43c3-89b4-ee6624561059\") " pod="openstack/rabbitmq-server-0" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.880810 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/22ca3dfe-9996-43c3-89b4-ee6624561059-pod-info\") pod \"rabbitmq-server-0\" (UID: \"22ca3dfe-9996-43c3-89b4-ee6624561059\") " pod="openstack/rabbitmq-server-0" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.880848 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/22ca3dfe-9996-43c3-89b4-ee6624561059-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"22ca3dfe-9996-43c3-89b4-ee6624561059\") " pod="openstack/rabbitmq-server-0" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.880873 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/22ca3dfe-9996-43c3-89b4-ee6624561059-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"22ca3dfe-9996-43c3-89b4-ee6624561059\") " pod="openstack/rabbitmq-server-0" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.880898 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4294\" (UniqueName: \"kubernetes.io/projected/22ca3dfe-9996-43c3-89b4-ee6624561059-kube-api-access-t4294\") pod \"rabbitmq-server-0\" (UID: \"22ca3dfe-9996-43c3-89b4-ee6624561059\") " 
pod="openstack/rabbitmq-server-0" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.891653 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4ae7c983-54a3-49f5-bb37-4eec70f603f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4ae7c983-54a3-49f5-bb37-4eec70f603f3\") pod \"rabbitmq-server-0\" (UID: \"22ca3dfe-9996-43c3-89b4-ee6624561059\") " pod="openstack/rabbitmq-server-0" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.891746 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/22ca3dfe-9996-43c3-89b4-ee6624561059-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"22ca3dfe-9996-43c3-89b4-ee6624561059\") " pod="openstack/rabbitmq-server-0" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.891856 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/22ca3dfe-9996-43c3-89b4-ee6624561059-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"22ca3dfe-9996-43c3-89b4-ee6624561059\") " pod="openstack/rabbitmq-server-0" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.892138 5002 scope.go:117] "RemoveContainer" containerID="5ec2ff841e47ea0923b05e4ba1dbcafa53790339f2cd876f85e28c70d44dfca1" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.931459 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.945475 5002 scope.go:117] "RemoveContainer" containerID="4396c5168179d7a0079b9815670857e903918c3ca0b43a905776f2acd2700f08" Dec 09 11:20:38 crc kubenswrapper[5002]: E1209 11:20:38.946132 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4396c5168179d7a0079b9815670857e903918c3ca0b43a905776f2acd2700f08\": container with ID starting with 4396c5168179d7a0079b9815670857e903918c3ca0b43a905776f2acd2700f08 not found: ID does not exist" containerID="4396c5168179d7a0079b9815670857e903918c3ca0b43a905776f2acd2700f08" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.946178 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4396c5168179d7a0079b9815670857e903918c3ca0b43a905776f2acd2700f08"} err="failed to get container status \"4396c5168179d7a0079b9815670857e903918c3ca0b43a905776f2acd2700f08\": rpc error: code = NotFound desc = could not find container \"4396c5168179d7a0079b9815670857e903918c3ca0b43a905776f2acd2700f08\": container with ID starting with 4396c5168179d7a0079b9815670857e903918c3ca0b43a905776f2acd2700f08 not found: ID does not exist" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.946205 5002 scope.go:117] "RemoveContainer" containerID="5ec2ff841e47ea0923b05e4ba1dbcafa53790339f2cd876f85e28c70d44dfca1" Dec 09 11:20:38 crc kubenswrapper[5002]: E1209 11:20:38.946466 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ec2ff841e47ea0923b05e4ba1dbcafa53790339f2cd876f85e28c70d44dfca1\": container with ID starting with 5ec2ff841e47ea0923b05e4ba1dbcafa53790339f2cd876f85e28c70d44dfca1 not found: ID does not exist" containerID="5ec2ff841e47ea0923b05e4ba1dbcafa53790339f2cd876f85e28c70d44dfca1" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.946493 5002 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ec2ff841e47ea0923b05e4ba1dbcafa53790339f2cd876f85e28c70d44dfca1"} err="failed to get container status \"5ec2ff841e47ea0923b05e4ba1dbcafa53790339f2cd876f85e28c70d44dfca1\": rpc error: code = NotFound desc = could not find container \"5ec2ff841e47ea0923b05e4ba1dbcafa53790339f2cd876f85e28c70d44dfca1\": container with ID starting with 5ec2ff841e47ea0923b05e4ba1dbcafa53790339f2cd876f85e28c70d44dfca1 not found: ID does not exist" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.946522 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.959045 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.960312 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.963719 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-rw2wd"] Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.963994 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-98ddfc8f-rw2wd" podUID="7dc8d716-07de-4018-91e5-a94ccc37a692" containerName="dnsmasq-dns" containerID="cri-o://815e2050399fde0dbeb9ba16718dcdf46cfee01f70d8b7acc23b9b09eb1ffaf5" gracePeriod=10 Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.969365 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.971977 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.972010 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-87bn9" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.972088 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.972156 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.972692 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.993911 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.993955 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.993989 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.994018 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/22ca3dfe-9996-43c3-89b4-ee6624561059-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"22ca3dfe-9996-43c3-89b4-ee6624561059\") " pod="openstack/rabbitmq-server-0" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.994042 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.994983 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.995075 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2a840a07-2967-4f13-8754-4816b3607e7b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a840a07-2967-4f13-8754-4816b3607e7b\") pod \"rabbitmq-cell1-server-0\" (UID: \"8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.995106 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbpzk\" (UniqueName: \"kubernetes.io/projected/8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c-kube-api-access-fbpzk\") pod \"rabbitmq-cell1-server-0\" (UID: \"8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.995131 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/22ca3dfe-9996-43c3-89b4-ee6624561059-server-conf\") pod \"rabbitmq-server-0\" (UID: \"22ca3dfe-9996-43c3-89b4-ee6624561059\") " pod="openstack/rabbitmq-server-0" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.995148 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.995201 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.995228 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" 
(UniqueName: \"kubernetes.io/projected/22ca3dfe-9996-43c3-89b4-ee6624561059-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"22ca3dfe-9996-43c3-89b4-ee6624561059\") " pod="openstack/rabbitmq-server-0" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.995252 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/22ca3dfe-9996-43c3-89b4-ee6624561059-pod-info\") pod \"rabbitmq-server-0\" (UID: \"22ca3dfe-9996-43c3-89b4-ee6624561059\") " pod="openstack/rabbitmq-server-0" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.995274 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/22ca3dfe-9996-43c3-89b4-ee6624561059-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"22ca3dfe-9996-43c3-89b4-ee6624561059\") " pod="openstack/rabbitmq-server-0" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.995292 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/22ca3dfe-9996-43c3-89b4-ee6624561059-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"22ca3dfe-9996-43c3-89b4-ee6624561059\") " pod="openstack/rabbitmq-server-0" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.995307 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4294\" (UniqueName: \"kubernetes.io/projected/22ca3dfe-9996-43c3-89b4-ee6624561059-kube-api-access-t4294\") pod \"rabbitmq-server-0\" (UID: \"22ca3dfe-9996-43c3-89b4-ee6624561059\") " pod="openstack/rabbitmq-server-0" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.995327 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4ae7c983-54a3-49f5-bb37-4eec70f603f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4ae7c983-54a3-49f5-bb37-4eec70f603f3\") pod \"rabbitmq-server-0\" (UID: \"22ca3dfe-9996-43c3-89b4-ee6624561059\") " pod="openstack/rabbitmq-server-0" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.995375 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/22ca3dfe-9996-43c3-89b4-ee6624561059-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"22ca3dfe-9996-43c3-89b4-ee6624561059\") " pod="openstack/rabbitmq-server-0" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.996499 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/22ca3dfe-9996-43c3-89b4-ee6624561059-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"22ca3dfe-9996-43c3-89b4-ee6624561059\") " pod="openstack/rabbitmq-server-0" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.996660 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/22ca3dfe-9996-43c3-89b4-ee6624561059-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"22ca3dfe-9996-43c3-89b4-ee6624561059\") " pod="openstack/rabbitmq-server-0" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.996935 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/22ca3dfe-9996-43c3-89b4-ee6624561059-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"22ca3dfe-9996-43c3-89b4-ee6624561059\") " 
pod="openstack/rabbitmq-server-0" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.997941 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/22ca3dfe-9996-43c3-89b4-ee6624561059-server-conf\") pod \"rabbitmq-server-0\" (UID: \"22ca3dfe-9996-43c3-89b4-ee6624561059\") " pod="openstack/rabbitmq-server-0" Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.999065 5002 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 09 11:20:38 crc kubenswrapper[5002]: I1209 11:20:38.999172 5002 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4ae7c983-54a3-49f5-bb37-4eec70f603f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4ae7c983-54a3-49f5-bb37-4eec70f603f3\") pod \"rabbitmq-server-0\" (UID: \"22ca3dfe-9996-43c3-89b4-ee6624561059\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b272446c890aff24493bdcc404f373c7542819e469da99fc019e55263aae6767/globalmount\"" pod="openstack/rabbitmq-server-0" Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.000603 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/22ca3dfe-9996-43c3-89b4-ee6624561059-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"22ca3dfe-9996-43c3-89b4-ee6624561059\") " pod="openstack/rabbitmq-server-0" Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.000681 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/22ca3dfe-9996-43c3-89b4-ee6624561059-pod-info\") pod \"rabbitmq-server-0\" (UID: \"22ca3dfe-9996-43c3-89b4-ee6624561059\") " pod="openstack/rabbitmq-server-0" Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.001619 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/22ca3dfe-9996-43c3-89b4-ee6624561059-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"22ca3dfe-9996-43c3-89b4-ee6624561059\") " pod="openstack/rabbitmq-server-0" Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.010729 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4294\" (UniqueName: \"kubernetes.io/projected/22ca3dfe-9996-43c3-89b4-ee6624561059-kube-api-access-t4294\") pod \"rabbitmq-server-0\" (UID: \"22ca3dfe-9996-43c3-89b4-ee6624561059\") " pod="openstack/rabbitmq-server-0" Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.029801 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4ae7c983-54a3-49f5-bb37-4eec70f603f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4ae7c983-54a3-49f5-bb37-4eec70f603f3\") pod \"rabbitmq-server-0\" (UID: \"22ca3dfe-9996-43c3-89b4-ee6624561059\") " pod="openstack/rabbitmq-server-0" Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.096802 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.096868 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.096902 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.096931 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.096961 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.096991 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2a840a07-2967-4f13-8754-4816b3607e7b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a840a07-2967-4f13-8754-4816b3607e7b\") pod \"rabbitmq-cell1-server-0\" (UID: \"8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.097009 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbpzk\" (UniqueName: \"kubernetes.io/projected/8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c-kube-api-access-fbpzk\") pod \"rabbitmq-cell1-server-0\" (UID: \"8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.097024 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.097049 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.097882 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.098693 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.101800 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.102378 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.102374 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.103697 5002 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.103721 5002 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2a840a07-2967-4f13-8754-4816b3607e7b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a840a07-2967-4f13-8754-4816b3607e7b\") pod \"rabbitmq-cell1-server-0\" (UID: \"8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/aee1bc9c0e18517cf5839fabf07a7ec45fd5a691a4dd335abaa37517e9b64dea/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.103921 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.104095 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.118381 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbpzk\" (UniqueName: \"kubernetes.io/projected/8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c-kube-api-access-fbpzk\") pod \"rabbitmq-cell1-server-0\" (UID: \"8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.136555 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2a840a07-2967-4f13-8754-4816b3607e7b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a840a07-2967-4f13-8754-4816b3607e7b\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.176604 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.294359 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.389584 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-rw2wd" Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.502551 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7dc8d716-07de-4018-91e5-a94ccc37a692-dns-svc\") pod \"7dc8d716-07de-4018-91e5-a94ccc37a692\" (UID: \"7dc8d716-07de-4018-91e5-a94ccc37a692\") " Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.503012 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zdmf\" (UniqueName: \"kubernetes.io/projected/7dc8d716-07de-4018-91e5-a94ccc37a692-kube-api-access-2zdmf\") pod \"7dc8d716-07de-4018-91e5-a94ccc37a692\" (UID: \"7dc8d716-07de-4018-91e5-a94ccc37a692\") " Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.503069 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7dc8d716-07de-4018-91e5-a94ccc37a692-config\") pod \"7dc8d716-07de-4018-91e5-a94ccc37a692\" (UID: \"7dc8d716-07de-4018-91e5-a94ccc37a692\") " Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.507942 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dc8d716-07de-4018-91e5-a94ccc37a692-kube-api-access-2zdmf" (OuterVolumeSpecName: "kube-api-access-2zdmf") pod "7dc8d716-07de-4018-91e5-a94ccc37a692" (UID: "7dc8d716-07de-4018-91e5-a94ccc37a692"). InnerVolumeSpecName "kube-api-access-2zdmf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.547531 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7dc8d716-07de-4018-91e5-a94ccc37a692-config" (OuterVolumeSpecName: "config") pod "7dc8d716-07de-4018-91e5-a94ccc37a692" (UID: "7dc8d716-07de-4018-91e5-a94ccc37a692"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.550775 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7dc8d716-07de-4018-91e5-a94ccc37a692-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7dc8d716-07de-4018-91e5-a94ccc37a692" (UID: "7dc8d716-07de-4018-91e5-a94ccc37a692"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.604827 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zdmf\" (UniqueName: \"kubernetes.io/projected/7dc8d716-07de-4018-91e5-a94ccc37a692-kube-api-access-2zdmf\") on node \"crc\" DevicePath \"\"" Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.604874 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7dc8d716-07de-4018-91e5-a94ccc37a692-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.604882 5002 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7dc8d716-07de-4018-91e5-a94ccc37a692-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.657999 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.762088 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.789056 5002 generic.go:334] "Generic (PLEG): container finished" podID="7dc8d716-07de-4018-91e5-a94ccc37a692" containerID="815e2050399fde0dbeb9ba16718dcdf46cfee01f70d8b7acc23b9b09eb1ffaf5" exitCode=0 Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.789091 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-rw2wd" event={"ID":"7dc8d716-07de-4018-91e5-a94ccc37a692","Type":"ContainerDied","Data":"815e2050399fde0dbeb9ba16718dcdf46cfee01f70d8b7acc23b9b09eb1ffaf5"} Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.789134 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-rw2wd" Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.789149 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-rw2wd" event={"ID":"7dc8d716-07de-4018-91e5-a94ccc37a692","Type":"ContainerDied","Data":"ec200b5989512d802ef03ecdb70d3003bfe972599d5fdeee63241699a2a463d5"} Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.789171 5002 scope.go:117] "RemoveContainer" containerID="815e2050399fde0dbeb9ba16718dcdf46cfee01f70d8b7acc23b9b09eb1ffaf5" Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.789986 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"22ca3dfe-9996-43c3-89b4-ee6624561059","Type":"ContainerStarted","Data":"fa41376981cb1cd96effc0ab862c11598dad3ac38d45f90a5a236cc8a3883a0e"} Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.791456 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c","Type":"ContainerStarted","Data":"c68c52a66beeda870c59bef959eeef341c8ad95428938015ba5a7cd754419ab1"} Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.817610 5002 scope.go:117] "RemoveContainer" containerID="26b46a3c7f05cfbce9786c0631708109ca6b81691b4547f176d4e4ce34b3e481" Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.821716 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-rw2wd"] Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.827032 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-rw2wd"] Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.852466 5002 scope.go:117] "RemoveContainer" containerID="815e2050399fde0dbeb9ba16718dcdf46cfee01f70d8b7acc23b9b09eb1ffaf5" Dec 09 11:20:39 crc kubenswrapper[5002]: E1209 11:20:39.853025 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"815e2050399fde0dbeb9ba16718dcdf46cfee01f70d8b7acc23b9b09eb1ffaf5\": container with ID starting with 815e2050399fde0dbeb9ba16718dcdf46cfee01f70d8b7acc23b9b09eb1ffaf5 not found: ID does not exist" containerID="815e2050399fde0dbeb9ba16718dcdf46cfee01f70d8b7acc23b9b09eb1ffaf5" Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.853068 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"815e2050399fde0dbeb9ba16718dcdf46cfee01f70d8b7acc23b9b09eb1ffaf5"} err="failed to get container status \"815e2050399fde0dbeb9ba16718dcdf46cfee01f70d8b7acc23b9b09eb1ffaf5\": rpc error: code = NotFound desc = could not find container \"815e2050399fde0dbeb9ba16718dcdf46cfee01f70d8b7acc23b9b09eb1ffaf5\": container with ID starting with 815e2050399fde0dbeb9ba16718dcdf46cfee01f70d8b7acc23b9b09eb1ffaf5 not found: ID does not exist" Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.853100 5002 scope.go:117] "RemoveContainer" containerID="26b46a3c7f05cfbce9786c0631708109ca6b81691b4547f176d4e4ce34b3e481" Dec 09 11:20:39 crc kubenswrapper[5002]: E1209 11:20:39.853352 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26b46a3c7f05cfbce9786c0631708109ca6b81691b4547f176d4e4ce34b3e481\": container with ID starting with 26b46a3c7f05cfbce9786c0631708109ca6b81691b4547f176d4e4ce34b3e481 not found: ID does not exist" containerID="26b46a3c7f05cfbce9786c0631708109ca6b81691b4547f176d4e4ce34b3e481" 
Dec 09 11:20:39 crc kubenswrapper[5002]: I1209 11:20:39.853380 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26b46a3c7f05cfbce9786c0631708109ca6b81691b4547f176d4e4ce34b3e481"} err="failed to get container status \"26b46a3c7f05cfbce9786c0631708109ca6b81691b4547f176d4e4ce34b3e481\": rpc error: code = NotFound desc = could not find container \"26b46a3c7f05cfbce9786c0631708109ca6b81691b4547f176d4e4ce34b3e481\": container with ID starting with 26b46a3c7f05cfbce9786c0631708109ca6b81691b4547f176d4e4ce34b3e481 not found: ID does not exist" Dec 09 11:20:40 crc kubenswrapper[5002]: I1209 11:20:40.072981 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62678300-fdda-43eb-88cc-9104f3950c14" path="/var/lib/kubelet/pods/62678300-fdda-43eb-88cc-9104f3950c14/volumes" Dec 09 11:20:40 crc kubenswrapper[5002]: I1209 11:20:40.073547 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dc8d716-07de-4018-91e5-a94ccc37a692" path="/var/lib/kubelet/pods/7dc8d716-07de-4018-91e5-a94ccc37a692/volumes" Dec 09 11:20:40 crc kubenswrapper[5002]: I1209 11:20:40.074735 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d" path="/var/lib/kubelet/pods/c9fcf8b7-a837-42a0-84a7-d6b3ed961f6d/volumes" Dec 09 11:20:41 crc kubenswrapper[5002]: I1209 11:20:41.810367 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"22ca3dfe-9996-43c3-89b4-ee6624561059","Type":"ContainerStarted","Data":"ec4af23e1bacb40dc2db0b8f1f598ad73cefb05da9a69dd09c18126e916f5852"} Dec 09 11:20:41 crc kubenswrapper[5002]: I1209 11:20:41.812279 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c","Type":"ContainerStarted","Data":"c4db3bfd606942795b42f4ed24356e4ba6f6e22f8177f46085f3b992656f44b0"} Dec 09 11:20:44 crc kubenswrapper[5002]: I1209 11:20:44.380901 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-67pzd"] Dec 09 11:20:44 crc kubenswrapper[5002]: E1209 11:20:44.381884 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dc8d716-07de-4018-91e5-a94ccc37a692" containerName="dnsmasq-dns" Dec 09 11:20:44 crc kubenswrapper[5002]: I1209 11:20:44.381899 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dc8d716-07de-4018-91e5-a94ccc37a692" containerName="dnsmasq-dns" Dec 09 11:20:44 crc kubenswrapper[5002]: E1209 11:20:44.381916 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dc8d716-07de-4018-91e5-a94ccc37a692" containerName="init" Dec 09 11:20:44 crc kubenswrapper[5002]: I1209 11:20:44.381923 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dc8d716-07de-4018-91e5-a94ccc37a692" containerName="init" Dec 09 11:20:44 crc kubenswrapper[5002]: I1209 11:20:44.382163 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dc8d716-07de-4018-91e5-a94ccc37a692" containerName="dnsmasq-dns" Dec 09 11:20:44 crc kubenswrapper[5002]: I1209 11:20:44.383446 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-67pzd" Dec 09 11:20:44 crc kubenswrapper[5002]: I1209 11:20:44.405762 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-67pzd"] Dec 09 11:20:44 crc kubenswrapper[5002]: I1209 11:20:44.478573 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlpmr\" (UniqueName: \"kubernetes.io/projected/64b04cab-6967-4943-a455-10892c9972b8-kube-api-access-dlpmr\") pod \"redhat-operators-67pzd\" (UID: \"64b04cab-6967-4943-a455-10892c9972b8\") " pod="openshift-marketplace/redhat-operators-67pzd" Dec 09 11:20:44 crc kubenswrapper[5002]: I1209 11:20:44.478897 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64b04cab-6967-4943-a455-10892c9972b8-utilities\") pod \"redhat-operators-67pzd\" (UID: \"64b04cab-6967-4943-a455-10892c9972b8\") " pod="openshift-marketplace/redhat-operators-67pzd" Dec 09 11:20:44 crc kubenswrapper[5002]: I1209 11:20:44.479124 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64b04cab-6967-4943-a455-10892c9972b8-catalog-content\") pod \"redhat-operators-67pzd\" (UID: \"64b04cab-6967-4943-a455-10892c9972b8\") " pod="openshift-marketplace/redhat-operators-67pzd" Dec 09 11:20:44 crc kubenswrapper[5002]: I1209 11:20:44.580615 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64b04cab-6967-4943-a455-10892c9972b8-utilities\") pod \"redhat-operators-67pzd\" (UID: \"64b04cab-6967-4943-a455-10892c9972b8\") " pod="openshift-marketplace/redhat-operators-67pzd" Dec 09 11:20:44 crc kubenswrapper[5002]: I1209 11:20:44.580685 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64b04cab-6967-4943-a455-10892c9972b8-catalog-content\") pod \"redhat-operators-67pzd\" (UID: \"64b04cab-6967-4943-a455-10892c9972b8\") " pod="openshift-marketplace/redhat-operators-67pzd" Dec 09 11:20:44 crc kubenswrapper[5002]: I1209 11:20:44.580762 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlpmr\" (UniqueName: \"kubernetes.io/projected/64b04cab-6967-4943-a455-10892c9972b8-kube-api-access-dlpmr\") pod \"redhat-operators-67pzd\" (UID: \"64b04cab-6967-4943-a455-10892c9972b8\") " pod="openshift-marketplace/redhat-operators-67pzd" Dec 09 11:20:44 crc kubenswrapper[5002]: I1209 11:20:44.581164 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64b04cab-6967-4943-a455-10892c9972b8-utilities\") pod \"redhat-operators-67pzd\" (UID: \"64b04cab-6967-4943-a455-10892c9972b8\") " pod="openshift-marketplace/redhat-operators-67pzd" Dec 09 11:20:44 crc kubenswrapper[5002]: I1209 11:20:44.581605 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64b04cab-6967-4943-a455-10892c9972b8-catalog-content\") pod \"redhat-operators-67pzd\" (UID: \"64b04cab-6967-4943-a455-10892c9972b8\") " pod="openshift-marketplace/redhat-operators-67pzd" Dec 09 11:20:44 crc kubenswrapper[5002]: I1209 11:20:44.599676 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-dlpmr\" (UniqueName: \"kubernetes.io/projected/64b04cab-6967-4943-a455-10892c9972b8-kube-api-access-dlpmr\") pod \"redhat-operators-67pzd\" (UID: \"64b04cab-6967-4943-a455-10892c9972b8\") " pod="openshift-marketplace/redhat-operators-67pzd" Dec 09 11:20:44 crc kubenswrapper[5002]: I1209 11:20:44.708416 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-67pzd" Dec 09 11:20:45 crc kubenswrapper[5002]: I1209 11:20:45.140940 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-67pzd"] Dec 09 11:20:45 crc kubenswrapper[5002]: I1209 11:20:45.842167 5002 generic.go:334] "Generic (PLEG): container finished" podID="64b04cab-6967-4943-a455-10892c9972b8" containerID="bee6fb02426ce59a951a663c82654c20214f98d28f6069e8cdec8bbe8fac9dba" exitCode=0 Dec 09 11:20:45 crc kubenswrapper[5002]: I1209 11:20:45.842224 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-67pzd" event={"ID":"64b04cab-6967-4943-a455-10892c9972b8","Type":"ContainerDied","Data":"bee6fb02426ce59a951a663c82654c20214f98d28f6069e8cdec8bbe8fac9dba"} Dec 09 11:20:45 crc kubenswrapper[5002]: I1209 11:20:45.842473 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-67pzd" event={"ID":"64b04cab-6967-4943-a455-10892c9972b8","Type":"ContainerStarted","Data":"e8cb126445d6e504f4720a254e00ffbc10a126df2c84f34e0c25484cd5730ed4"} Dec 09 11:20:45 crc kubenswrapper[5002]: I1209 11:20:45.844702 5002 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 11:20:46 crc kubenswrapper[5002]: I1209 11:20:46.852883 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-67pzd" event={"ID":"64b04cab-6967-4943-a455-10892c9972b8","Type":"ContainerStarted","Data":"8351e530696911a009248b68430bebed62a325b7a392081717dc34ab4e671570"} Dec 09 11:20:47 crc kubenswrapper[5002]: I1209 11:20:47.061322 5002 scope.go:117] "RemoveContainer" containerID="77659b1c47a86560c49c87566b1a9998228a70363d6028c0f54d97d529fda9b3" Dec 09 11:20:47 crc kubenswrapper[5002]: E1209 11:20:47.061778 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:20:47 crc kubenswrapper[5002]: I1209 11:20:47.864217 5002 generic.go:334] "Generic (PLEG): container finished" podID="64b04cab-6967-4943-a455-10892c9972b8" containerID="8351e530696911a009248b68430bebed62a325b7a392081717dc34ab4e671570" exitCode=0 Dec 09 11:20:47 crc kubenswrapper[5002]: I1209 11:20:47.864256 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-67pzd" event={"ID":"64b04cab-6967-4943-a455-10892c9972b8","Type":"ContainerDied","Data":"8351e530696911a009248b68430bebed62a325b7a392081717dc34ab4e671570"} Dec 09 11:20:48 crc kubenswrapper[5002]: I1209 11:20:48.876470 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-67pzd" 
event={"ID":"64b04cab-6967-4943-a455-10892c9972b8","Type":"ContainerStarted","Data":"df1000f0688c3dac47feeedfc28f841cce6dfaa01ecaaafdd7458f0f85f75c4c"} Dec 09 11:20:48 crc kubenswrapper[5002]: I1209 11:20:48.901477 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-67pzd" podStartSLOduration=2.491993302 podStartE2EDuration="4.901458087s" podCreationTimestamp="2025-12-09 11:20:44 +0000 UTC" firstStartedPulling="2025-12-09 11:20:45.844452962 +0000 UTC m=+4778.236504043" lastFinishedPulling="2025-12-09 11:20:48.253917737 +0000 UTC m=+4780.645968828" observedRunningTime="2025-12-09 11:20:48.89971295 +0000 UTC m=+4781.291764061" watchObservedRunningTime="2025-12-09 11:20:48.901458087 +0000 UTC m=+4781.293509168" Dec 09 11:20:54 crc kubenswrapper[5002]: I1209 11:20:54.709005 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-67pzd" Dec 09 11:20:54 crc kubenswrapper[5002]: I1209 11:20:54.709587 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-67pzd" Dec 09 11:20:54 crc kubenswrapper[5002]: I1209 11:20:54.757157 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-67pzd" Dec 09 11:20:54 crc kubenswrapper[5002]: I1209 11:20:54.994151 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-67pzd" Dec 09 11:20:55 crc kubenswrapper[5002]: I1209 11:20:55.046640 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-67pzd"] Dec 09 11:20:56 crc kubenswrapper[5002]: I1209 11:20:56.935890 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-67pzd" podUID="64b04cab-6967-4943-a455-10892c9972b8" containerName="registry-server" containerID="cri-o://df1000f0688c3dac47feeedfc28f841cce6dfaa01ecaaafdd7458f0f85f75c4c" gracePeriod=2 Dec 09 11:20:58 crc kubenswrapper[5002]: I1209 11:20:58.957025 5002 generic.go:334] "Generic (PLEG): container finished" podID="64b04cab-6967-4943-a455-10892c9972b8" containerID="df1000f0688c3dac47feeedfc28f841cce6dfaa01ecaaafdd7458f0f85f75c4c" exitCode=0 Dec 09 11:20:58 crc kubenswrapper[5002]: I1209 11:20:58.957120 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-67pzd" event={"ID":"64b04cab-6967-4943-a455-10892c9972b8","Type":"ContainerDied","Data":"df1000f0688c3dac47feeedfc28f841cce6dfaa01ecaaafdd7458f0f85f75c4c"} Dec 09 11:21:00 crc kubenswrapper[5002]: I1209 11:21:00.060560 5002 scope.go:117] "RemoveContainer" containerID="77659b1c47a86560c49c87566b1a9998228a70363d6028c0f54d97d529fda9b3" Dec 09 11:21:00 crc kubenswrapper[5002]: E1209 11:21:00.061291 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:21:00 crc kubenswrapper[5002]: I1209 11:21:00.074719 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-67pzd" Dec 09 11:21:00 crc kubenswrapper[5002]: I1209 11:21:00.229951 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64b04cab-6967-4943-a455-10892c9972b8-utilities\") pod \"64b04cab-6967-4943-a455-10892c9972b8\" (UID: \"64b04cab-6967-4943-a455-10892c9972b8\") " Dec 09 11:21:00 crc kubenswrapper[5002]: I1209 11:21:00.229992 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64b04cab-6967-4943-a455-10892c9972b8-catalog-content\") pod \"64b04cab-6967-4943-a455-10892c9972b8\" (UID: \"64b04cab-6967-4943-a455-10892c9972b8\") " Dec 09 11:21:00 crc kubenswrapper[5002]: I1209 11:21:00.230015 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlpmr\" (UniqueName: \"kubernetes.io/projected/64b04cab-6967-4943-a455-10892c9972b8-kube-api-access-dlpmr\") pod \"64b04cab-6967-4943-a455-10892c9972b8\" (UID: \"64b04cab-6967-4943-a455-10892c9972b8\") " Dec 09 11:21:00 crc kubenswrapper[5002]: I1209 11:21:00.233580 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64b04cab-6967-4943-a455-10892c9972b8-utilities" (OuterVolumeSpecName: "utilities") pod "64b04cab-6967-4943-a455-10892c9972b8" (UID: "64b04cab-6967-4943-a455-10892c9972b8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:21:00 crc kubenswrapper[5002]: I1209 11:21:00.235172 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64b04cab-6967-4943-a455-10892c9972b8-kube-api-access-dlpmr" (OuterVolumeSpecName: "kube-api-access-dlpmr") pod "64b04cab-6967-4943-a455-10892c9972b8" (UID: "64b04cab-6967-4943-a455-10892c9972b8"). InnerVolumeSpecName "kube-api-access-dlpmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:21:00 crc kubenswrapper[5002]: I1209 11:21:00.331790 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64b04cab-6967-4943-a455-10892c9972b8-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 11:21:00 crc kubenswrapper[5002]: I1209 11:21:00.331870 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlpmr\" (UniqueName: \"kubernetes.io/projected/64b04cab-6967-4943-a455-10892c9972b8-kube-api-access-dlpmr\") on node \"crc\" DevicePath \"\"" Dec 09 11:21:00 crc kubenswrapper[5002]: I1209 11:21:00.378347 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64b04cab-6967-4943-a455-10892c9972b8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "64b04cab-6967-4943-a455-10892c9972b8" (UID: "64b04cab-6967-4943-a455-10892c9972b8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:21:00 crc kubenswrapper[5002]: I1209 11:21:00.433093 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64b04cab-6967-4943-a455-10892c9972b8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 11:21:00 crc kubenswrapper[5002]: I1209 11:21:00.979595 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-67pzd" event={"ID":"64b04cab-6967-4943-a455-10892c9972b8","Type":"ContainerDied","Data":"e8cb126445d6e504f4720a254e00ffbc10a126df2c84f34e0c25484cd5730ed4"} Dec 09 11:21:00 crc kubenswrapper[5002]: I1209 11:21:00.979656 5002 scope.go:117] "RemoveContainer" containerID="df1000f0688c3dac47feeedfc28f841cce6dfaa01ecaaafdd7458f0f85f75c4c" Dec 09 11:21:00 crc kubenswrapper[5002]: I1209 11:21:00.979801 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-67pzd" Dec 09 11:21:01 crc kubenswrapper[5002]: I1209 11:21:01.006002 5002 scope.go:117] "RemoveContainer" containerID="8351e530696911a009248b68430bebed62a325b7a392081717dc34ab4e671570" Dec 09 11:21:01 crc kubenswrapper[5002]: I1209 11:21:01.010002 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-67pzd"] Dec 09 11:21:01 crc kubenswrapper[5002]: I1209 11:21:01.023015 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-67pzd"] Dec 09 11:21:01 crc kubenswrapper[5002]: I1209 11:21:01.033993 5002 scope.go:117] "RemoveContainer" containerID="bee6fb02426ce59a951a663c82654c20214f98d28f6069e8cdec8bbe8fac9dba" Dec 09 11:21:02 crc kubenswrapper[5002]: I1209 11:21:02.088805 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64b04cab-6967-4943-a455-10892c9972b8" path="/var/lib/kubelet/pods/64b04cab-6967-4943-a455-10892c9972b8/volumes" Dec 09 11:21:12 crc kubenswrapper[5002]: I1209 11:21:12.060944 5002 scope.go:117] "RemoveContainer" containerID="77659b1c47a86560c49c87566b1a9998228a70363d6028c0f54d97d529fda9b3" Dec 09 11:21:12 crc kubenswrapper[5002]: E1209 11:21:12.061741 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:21:14 crc kubenswrapper[5002]: I1209 11:21:14.093135 5002 generic.go:334] "Generic (PLEG): container finished" podID="22ca3dfe-9996-43c3-89b4-ee6624561059" containerID="ec4af23e1bacb40dc2db0b8f1f598ad73cefb05da9a69dd09c18126e916f5852" exitCode=0 Dec 09 11:21:14 crc kubenswrapper[5002]: I1209 11:21:14.093240 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"22ca3dfe-9996-43c3-89b4-ee6624561059","Type":"ContainerDied","Data":"ec4af23e1bacb40dc2db0b8f1f598ad73cefb05da9a69dd09c18126e916f5852"} Dec 09 11:21:14 crc kubenswrapper[5002]: I1209 11:21:14.095936 5002 generic.go:334] "Generic (PLEG): container finished" podID="8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c" containerID="c4db3bfd606942795b42f4ed24356e4ba6f6e22f8177f46085f3b992656f44b0" exitCode=0 Dec 09 11:21:14 crc kubenswrapper[5002]: I1209 11:21:14.095972 5002 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c","Type":"ContainerDied","Data":"c4db3bfd606942795b42f4ed24356e4ba6f6e22f8177f46085f3b992656f44b0"} Dec 09 11:21:15 crc kubenswrapper[5002]: I1209 11:21:15.104919 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"22ca3dfe-9996-43c3-89b4-ee6624561059","Type":"ContainerStarted","Data":"ab35ed5ddfeb941380311f0b260f4d3b028cef9bc1b01bf980d5711416df54cb"} Dec 09 11:21:15 crc kubenswrapper[5002]: I1209 11:21:15.106692 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 09 11:21:15 crc kubenswrapper[5002]: I1209 11:21:15.109335 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c","Type":"ContainerStarted","Data":"edd8b0072f0ee29fb24296b79a48eb1da4923357c6df293efa9b2687bacab42d"} Dec 09 11:21:15 crc kubenswrapper[5002]: I1209 11:21:15.109629 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:21:15 crc kubenswrapper[5002]: I1209 11:21:15.170181 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.170162188 podStartE2EDuration="37.170162188s" podCreationTimestamp="2025-12-09 11:20:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:21:15.168018621 +0000 UTC m=+4807.560069712" watchObservedRunningTime="2025-12-09 11:21:15.170162188 +0000 UTC m=+4807.562213269" Dec 09 11:21:15 crc kubenswrapper[5002]: I1209 11:21:15.173734 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.173722254 podStartE2EDuration="37.173722254s" podCreationTimestamp="2025-12-09 11:20:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:21:15.147885563 +0000 UTC m=+4807.539936654" watchObservedRunningTime="2025-12-09 11:21:15.173722254 +0000 UTC m=+4807.565773345" Dec 09 11:21:27 crc kubenswrapper[5002]: I1209 11:21:27.060851 5002 scope.go:117] "RemoveContainer" containerID="77659b1c47a86560c49c87566b1a9998228a70363d6028c0f54d97d529fda9b3" Dec 09 11:21:27 crc kubenswrapper[5002]: E1209 11:21:27.062016 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:21:29 crc kubenswrapper[5002]: I1209 11:21:29.180024 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 09 11:21:29 crc kubenswrapper[5002]: I1209 11:21:29.298936 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:21:39 crc kubenswrapper[5002]: I1209 11:21:39.060511 5002 scope.go:117] "RemoveContainer" containerID="77659b1c47a86560c49c87566b1a9998228a70363d6028c0f54d97d529fda9b3" Dec 09 11:21:39 crc kubenswrapper[5002]: E1209 11:21:39.061409 5002 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:21:41 crc kubenswrapper[5002]: I1209 11:21:41.410621 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1-default"] Dec 09 11:21:41 crc kubenswrapper[5002]: E1209 11:21:41.411278 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64b04cab-6967-4943-a455-10892c9972b8" containerName="extract-utilities" Dec 09 11:21:41 crc kubenswrapper[5002]: I1209 11:21:41.411295 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="64b04cab-6967-4943-a455-10892c9972b8" containerName="extract-utilities" Dec 09 11:21:41 crc kubenswrapper[5002]: E1209 11:21:41.411335 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64b04cab-6967-4943-a455-10892c9972b8" containerName="registry-server" Dec 09 11:21:41 crc kubenswrapper[5002]: I1209 11:21:41.411346 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="64b04cab-6967-4943-a455-10892c9972b8" containerName="registry-server" Dec 09 11:21:41 crc kubenswrapper[5002]: E1209 11:21:41.411361 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64b04cab-6967-4943-a455-10892c9972b8" containerName="extract-content" Dec 09 11:21:41 crc kubenswrapper[5002]: I1209 11:21:41.411369 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="64b04cab-6967-4943-a455-10892c9972b8" containerName="extract-content" Dec 09 11:21:41 crc kubenswrapper[5002]: I1209 11:21:41.411561 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="64b04cab-6967-4943-a455-10892c9972b8" containerName="registry-server" Dec 09 11:21:41 crc kubenswrapper[5002]: I1209 11:21:41.412208 5002 util.go:30] "No sandbox for pod can be found. 
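
Annotation: the recurring "back-off 5m0s restarting failed container=machine-config-daemon" errors (11:20:47, 11:21:00, 11:21:12, 11:21:27, 11:21:39 above, and again below) are kubelet declining to restart a crash-looping container until its backoff window expires; each sync attempt re-logs the refusal. A sketch of the delay schedule, assuming the commonly documented kubelet defaults of a 10s initial delay doubling per crash up to a 5m cap (illustrative, not kubelet source):

package main

import (
	"fmt"
	"time"
)

func main() {
	const (
		initialDelay = 10 * time.Second
		maxDelay     = 5 * time.Minute // the "back-off 5m0s" seen in the log
	)
	delay := initialDelay
	for crash := 1; crash <= 7; crash++ {
		fmt.Printf("after crash %d: wait %v before restart\n", crash, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay // from the 6th crash on, every retry logs 5m0s
		}
	}
}
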
Need to start a new one" pod="openstack/mariadb-client-1-default" Dec 09 11:21:41 crc kubenswrapper[5002]: I1209 11:21:41.415131 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-knjzb" Dec 09 11:21:41 crc kubenswrapper[5002]: I1209 11:21:41.417682 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Dec 09 11:21:41 crc kubenswrapper[5002]: I1209 11:21:41.498721 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hjpp\" (UniqueName: \"kubernetes.io/projected/8e338404-c88f-4ea9-9776-da0a2b677eef-kube-api-access-9hjpp\") pod \"mariadb-client-1-default\" (UID: \"8e338404-c88f-4ea9-9776-da0a2b677eef\") " pod="openstack/mariadb-client-1-default" Dec 09 11:21:41 crc kubenswrapper[5002]: I1209 11:21:41.600235 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hjpp\" (UniqueName: \"kubernetes.io/projected/8e338404-c88f-4ea9-9776-da0a2b677eef-kube-api-access-9hjpp\") pod \"mariadb-client-1-default\" (UID: \"8e338404-c88f-4ea9-9776-da0a2b677eef\") " pod="openstack/mariadb-client-1-default" Dec 09 11:21:41 crc kubenswrapper[5002]: I1209 11:21:41.638629 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hjpp\" (UniqueName: \"kubernetes.io/projected/8e338404-c88f-4ea9-9776-da0a2b677eef-kube-api-access-9hjpp\") pod \"mariadb-client-1-default\" (UID: \"8e338404-c88f-4ea9-9776-da0a2b677eef\") " pod="openstack/mariadb-client-1-default" Dec 09 11:21:41 crc kubenswrapper[5002]: I1209 11:21:41.741003 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Dec 09 11:21:42 crc kubenswrapper[5002]: I1209 11:21:42.226831 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Dec 09 11:21:42 crc kubenswrapper[5002]: W1209 11:21:42.231312 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e338404_c88f_4ea9_9776_da0a2b677eef.slice/crio-e6013ecaf767d9a389cd67664de61ee015e1cc0b56451f84d0b0c28a635100a0 WatchSource:0}: Error finding container e6013ecaf767d9a389cd67664de61ee015e1cc0b56451f84d0b0c28a635100a0: Status 404 returned error can't find the container with id e6013ecaf767d9a389cd67664de61ee015e1cc0b56451f84d0b0c28a635100a0 Dec 09 11:21:42 crc kubenswrapper[5002]: I1209 11:21:42.323039 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"8e338404-c88f-4ea9-9776-da0a2b677eef","Type":"ContainerStarted","Data":"e6013ecaf767d9a389cd67664de61ee015e1cc0b56451f84d0b0c28a635100a0"} Dec 09 11:21:43 crc kubenswrapper[5002]: I1209 11:21:43.332937 5002 generic.go:334] "Generic (PLEG): container finished" podID="8e338404-c88f-4ea9-9776-da0a2b677eef" containerID="d76b8f1ca3e41a863ad26f13d30b47efbc2d6bc729867dcc212de7678385d8d3" exitCode=0 Dec 09 11:21:43 crc kubenswrapper[5002]: I1209 11:21:43.333107 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"8e338404-c88f-4ea9-9776-da0a2b677eef","Type":"ContainerDied","Data":"d76b8f1ca3e41a863ad26f13d30b47efbc2d6bc729867dcc212de7678385d8d3"} Dec 09 11:21:44 crc kubenswrapper[5002]: I1209 11:21:44.735774 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1-default" Dec 09 11:21:44 crc kubenswrapper[5002]: I1209 11:21:44.765109 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1-default_8e338404-c88f-4ea9-9776-da0a2b677eef/mariadb-client-1-default/0.log" Dec 09 11:21:44 crc kubenswrapper[5002]: I1209 11:21:44.787727 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hjpp\" (UniqueName: \"kubernetes.io/projected/8e338404-c88f-4ea9-9776-da0a2b677eef-kube-api-access-9hjpp\") pod \"8e338404-c88f-4ea9-9776-da0a2b677eef\" (UID: \"8e338404-c88f-4ea9-9776-da0a2b677eef\") " Dec 09 11:21:44 crc kubenswrapper[5002]: I1209 11:21:44.792134 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1-default"] Dec 09 11:21:44 crc kubenswrapper[5002]: I1209 11:21:44.793694 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e338404-c88f-4ea9-9776-da0a2b677eef-kube-api-access-9hjpp" (OuterVolumeSpecName: "kube-api-access-9hjpp") pod "8e338404-c88f-4ea9-9776-da0a2b677eef" (UID: "8e338404-c88f-4ea9-9776-da0a2b677eef"). InnerVolumeSpecName "kube-api-access-9hjpp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:21:44 crc kubenswrapper[5002]: I1209 11:21:44.797478 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1-default"] Dec 09 11:21:44 crc kubenswrapper[5002]: I1209 11:21:44.889722 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hjpp\" (UniqueName: \"kubernetes.io/projected/8e338404-c88f-4ea9-9776-da0a2b677eef-kube-api-access-9hjpp\") on node \"crc\" DevicePath \"\"" Dec 09 11:21:45 crc kubenswrapper[5002]: I1209 11:21:45.211082 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2-default"] Dec 09 11:21:45 crc kubenswrapper[5002]: E1209 11:21:45.211578 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e338404-c88f-4ea9-9776-da0a2b677eef" containerName="mariadb-client-1-default" Dec 09 11:21:45 crc kubenswrapper[5002]: I1209 11:21:45.211609 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e338404-c88f-4ea9-9776-da0a2b677eef" containerName="mariadb-client-1-default" Dec 09 11:21:45 crc kubenswrapper[5002]: I1209 11:21:45.212022 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e338404-c88f-4ea9-9776-da0a2b677eef" containerName="mariadb-client-1-default" Dec 09 11:21:45 crc kubenswrapper[5002]: I1209 11:21:45.212839 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Dec 09 11:21:45 crc kubenswrapper[5002]: I1209 11:21:45.218571 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Dec 09 11:21:45 crc kubenswrapper[5002]: I1209 11:21:45.295520 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5zm2\" (UniqueName: \"kubernetes.io/projected/3bb931e9-0955-4b71-9bd9-b8c8542c350c-kube-api-access-g5zm2\") pod \"mariadb-client-2-default\" (UID: \"3bb931e9-0955-4b71-9bd9-b8c8542c350c\") " pod="openstack/mariadb-client-2-default" Dec 09 11:21:45 crc kubenswrapper[5002]: I1209 11:21:45.359016 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6013ecaf767d9a389cd67664de61ee015e1cc0b56451f84d0b0c28a635100a0" Dec 09 11:21:45 crc kubenswrapper[5002]: I1209 11:21:45.359071 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Dec 09 11:21:45 crc kubenswrapper[5002]: I1209 11:21:45.398099 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5zm2\" (UniqueName: \"kubernetes.io/projected/3bb931e9-0955-4b71-9bd9-b8c8542c350c-kube-api-access-g5zm2\") pod \"mariadb-client-2-default\" (UID: \"3bb931e9-0955-4b71-9bd9-b8c8542c350c\") " pod="openstack/mariadb-client-2-default" Dec 09 11:21:45 crc kubenswrapper[5002]: I1209 11:21:45.416491 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5zm2\" (UniqueName: \"kubernetes.io/projected/3bb931e9-0955-4b71-9bd9-b8c8542c350c-kube-api-access-g5zm2\") pod \"mariadb-client-2-default\" (UID: \"3bb931e9-0955-4b71-9bd9-b8c8542c350c\") " pod="openstack/mariadb-client-2-default" Dec 09 11:21:45 crc kubenswrapper[5002]: I1209 11:21:45.536466 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Dec 09 11:21:46 crc kubenswrapper[5002]: I1209 11:21:46.048185 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Dec 09 11:21:46 crc kubenswrapper[5002]: W1209 11:21:46.055776 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bb931e9_0955_4b71_9bd9_b8c8542c350c.slice/crio-2f3c26abad4ca814486693d26dcce41a7e8e49114744f4f3ef347c7a52d03400 WatchSource:0}: Error finding container 2f3c26abad4ca814486693d26dcce41a7e8e49114744f4f3ef347c7a52d03400: Status 404 returned error can't find the container with id 2f3c26abad4ca814486693d26dcce41a7e8e49114744f4f3ef347c7a52d03400 Dec 09 11:21:46 crc kubenswrapper[5002]: I1209 11:21:46.072976 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e338404-c88f-4ea9-9776-da0a2b677eef" path="/var/lib/kubelet/pods/8e338404-c88f-4ea9-9776-da0a2b677eef/volumes" Dec 09 11:21:46 crc kubenswrapper[5002]: I1209 11:21:46.372200 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"3bb931e9-0955-4b71-9bd9-b8c8542c350c","Type":"ContainerStarted","Data":"3de0260c7640fc1e04644721698c15389678326747faefc82235ae395beb1afb"} Dec 09 11:21:46 crc kubenswrapper[5002]: I1209 11:21:46.372243 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"3bb931e9-0955-4b71-9bd9-b8c8542c350c","Type":"ContainerStarted","Data":"2f3c26abad4ca814486693d26dcce41a7e8e49114744f4f3ef347c7a52d03400"} Dec 09 11:21:46 crc kubenswrapper[5002]: I1209 11:21:46.394155 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-2-default" podStartSLOduration=1.394132878 podStartE2EDuration="1.394132878s" podCreationTimestamp="2025-12-09 11:21:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:21:46.390698606 +0000 UTC m=+4838.782749687" watchObservedRunningTime="2025-12-09 11:21:46.394132878 +0000 UTC m=+4838.786183969" Dec 09 11:21:47 crc kubenswrapper[5002]: I1209 11:21:47.386410 5002 generic.go:334] "Generic (PLEG): container finished" podID="3bb931e9-0955-4b71-9bd9-b8c8542c350c" containerID="3de0260c7640fc1e04644721698c15389678326747faefc82235ae395beb1afb" exitCode=1 Dec 09 11:21:47 crc kubenswrapper[5002]: I1209 11:21:47.386916 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"3bb931e9-0955-4b71-9bd9-b8c8542c350c","Type":"ContainerDied","Data":"3de0260c7640fc1e04644721698c15389678326747faefc82235ae395beb1afb"} Dec 09 11:21:48 crc kubenswrapper[5002]: I1209 11:21:48.741132 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Dec 09 11:21:48 crc kubenswrapper[5002]: I1209 11:21:48.780632 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2-default"] Dec 09 11:21:48 crc kubenswrapper[5002]: I1209 11:21:48.786374 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2-default"] Dec 09 11:21:48 crc kubenswrapper[5002]: I1209 11:21:48.843195 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5zm2\" (UniqueName: \"kubernetes.io/projected/3bb931e9-0955-4b71-9bd9-b8c8542c350c-kube-api-access-g5zm2\") pod \"3bb931e9-0955-4b71-9bd9-b8c8542c350c\" (UID: \"3bb931e9-0955-4b71-9bd9-b8c8542c350c\") " Dec 09 11:21:48 crc kubenswrapper[5002]: I1209 11:21:48.852310 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bb931e9-0955-4b71-9bd9-b8c8542c350c-kube-api-access-g5zm2" (OuterVolumeSpecName: "kube-api-access-g5zm2") pod "3bb931e9-0955-4b71-9bd9-b8c8542c350c" (UID: "3bb931e9-0955-4b71-9bd9-b8c8542c350c"). InnerVolumeSpecName "kube-api-access-g5zm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:21:48 crc kubenswrapper[5002]: I1209 11:21:48.945011 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5zm2\" (UniqueName: \"kubernetes.io/projected/3bb931e9-0955-4b71-9bd9-b8c8542c350c-kube-api-access-g5zm2\") on node \"crc\" DevicePath \"\"" Dec 09 11:21:49 crc kubenswrapper[5002]: I1209 11:21:49.192702 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1"] Dec 09 11:21:49 crc kubenswrapper[5002]: E1209 11:21:49.193297 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bb931e9-0955-4b71-9bd9-b8c8542c350c" containerName="mariadb-client-2-default" Dec 09 11:21:49 crc kubenswrapper[5002]: I1209 11:21:49.193333 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bb931e9-0955-4b71-9bd9-b8c8542c350c" containerName="mariadb-client-2-default" Dec 09 11:21:49 crc kubenswrapper[5002]: I1209 11:21:49.193585 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bb931e9-0955-4b71-9bd9-b8c8542c350c" containerName="mariadb-client-2-default" Dec 09 11:21:49 crc kubenswrapper[5002]: I1209 11:21:49.194515 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Dec 09 11:21:49 crc kubenswrapper[5002]: I1209 11:21:49.214980 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"] Dec 09 11:21:49 crc kubenswrapper[5002]: I1209 11:21:49.248394 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj5zh\" (UniqueName: \"kubernetes.io/projected/4a16ce52-9f69-48e9-9a8b-8cf94f98c95c-kube-api-access-cj5zh\") pod \"mariadb-client-1\" (UID: \"4a16ce52-9f69-48e9-9a8b-8cf94f98c95c\") " pod="openstack/mariadb-client-1" Dec 09 11:21:49 crc kubenswrapper[5002]: I1209 11:21:49.349712 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cj5zh\" (UniqueName: \"kubernetes.io/projected/4a16ce52-9f69-48e9-9a8b-8cf94f98c95c-kube-api-access-cj5zh\") pod \"mariadb-client-1\" (UID: \"4a16ce52-9f69-48e9-9a8b-8cf94f98c95c\") " pod="openstack/mariadb-client-1" Dec 09 11:21:49 crc kubenswrapper[5002]: I1209 11:21:49.365762 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cj5zh\" (UniqueName: \"kubernetes.io/projected/4a16ce52-9f69-48e9-9a8b-8cf94f98c95c-kube-api-access-cj5zh\") pod \"mariadb-client-1\" (UID: \"4a16ce52-9f69-48e9-9a8b-8cf94f98c95c\") " pod="openstack/mariadb-client-1" Dec 09 11:21:49 crc kubenswrapper[5002]: I1209 11:21:49.406545 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f3c26abad4ca814486693d26dcce41a7e8e49114744f4f3ef347c7a52d03400" Dec 09 11:21:49 crc kubenswrapper[5002]: I1209 11:21:49.406608 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default" Dec 09 11:21:49 crc kubenswrapper[5002]: I1209 11:21:49.515216 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1" Dec 09 11:21:49 crc kubenswrapper[5002]: I1209 11:21:49.987622 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"] Dec 09 11:21:50 crc kubenswrapper[5002]: I1209 11:21:50.077778 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bb931e9-0955-4b71-9bd9-b8c8542c350c" path="/var/lib/kubelet/pods/3bb931e9-0955-4b71-9bd9-b8c8542c350c/volumes" Dec 09 11:21:50 crc kubenswrapper[5002]: I1209 11:21:50.415588 5002 generic.go:334] "Generic (PLEG): container finished" podID="4a16ce52-9f69-48e9-9a8b-8cf94f98c95c" containerID="2ec923818432c5547047282c04d7c0602c82b786facd041bab6e873451fb0191" exitCode=0 Dec 09 11:21:50 crc kubenswrapper[5002]: I1209 11:21:50.415645 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"4a16ce52-9f69-48e9-9a8b-8cf94f98c95c","Type":"ContainerDied","Data":"2ec923818432c5547047282c04d7c0602c82b786facd041bab6e873451fb0191"} Dec 09 11:21:50 crc kubenswrapper[5002]: I1209 11:21:50.415685 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"4a16ce52-9f69-48e9-9a8b-8cf94f98c95c","Type":"ContainerStarted","Data":"51a302709392cc45cef9782285cf20b61a17c554e625c79e08db5c4ca3a0400e"} Dec 09 11:21:51 crc kubenswrapper[5002]: I1209 11:21:51.903226 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Dec 09 11:21:51 crc kubenswrapper[5002]: I1209 11:21:51.925906 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1_4a16ce52-9f69-48e9-9a8b-8cf94f98c95c/mariadb-client-1/0.log" Dec 09 11:21:51 crc kubenswrapper[5002]: I1209 11:21:51.963722 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1"] Dec 09 11:21:51 crc kubenswrapper[5002]: I1209 11:21:51.973111 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1"] Dec 09 11:21:51 crc kubenswrapper[5002]: I1209 11:21:51.987133 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cj5zh\" (UniqueName: \"kubernetes.io/projected/4a16ce52-9f69-48e9-9a8b-8cf94f98c95c-kube-api-access-cj5zh\") pod \"4a16ce52-9f69-48e9-9a8b-8cf94f98c95c\" (UID: \"4a16ce52-9f69-48e9-9a8b-8cf94f98c95c\") " Dec 09 11:21:51 crc kubenswrapper[5002]: I1209 11:21:51.992803 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a16ce52-9f69-48e9-9a8b-8cf94f98c95c-kube-api-access-cj5zh" (OuterVolumeSpecName: "kube-api-access-cj5zh") pod "4a16ce52-9f69-48e9-9a8b-8cf94f98c95c" (UID: "4a16ce52-9f69-48e9-9a8b-8cf94f98c95c"). InnerVolumeSpecName "kube-api-access-cj5zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:21:52 crc kubenswrapper[5002]: I1209 11:21:52.070174 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a16ce52-9f69-48e9-9a8b-8cf94f98c95c" path="/var/lib/kubelet/pods/4a16ce52-9f69-48e9-9a8b-8cf94f98c95c/volumes" Dec 09 11:21:52 crc kubenswrapper[5002]: I1209 11:21:52.089242 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cj5zh\" (UniqueName: \"kubernetes.io/projected/4a16ce52-9f69-48e9-9a8b-8cf94f98c95c-kube-api-access-cj5zh\") on node \"crc\" DevicePath \"\"" Dec 09 11:21:52 crc kubenswrapper[5002]: I1209 11:21:52.423068 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-4-default"] Dec 09 11:21:52 crc kubenswrapper[5002]: E1209 11:21:52.423490 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a16ce52-9f69-48e9-9a8b-8cf94f98c95c" containerName="mariadb-client-1" Dec 09 11:21:52 crc kubenswrapper[5002]: I1209 11:21:52.423503 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a16ce52-9f69-48e9-9a8b-8cf94f98c95c" containerName="mariadb-client-1" Dec 09 11:21:52 crc kubenswrapper[5002]: I1209 11:21:52.423730 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a16ce52-9f69-48e9-9a8b-8cf94f98c95c" containerName="mariadb-client-1" Dec 09 11:21:52 crc kubenswrapper[5002]: I1209 11:21:52.424254 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Dec 09 11:21:52 crc kubenswrapper[5002]: I1209 11:21:52.428730 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Dec 09 11:21:52 crc kubenswrapper[5002]: I1209 11:21:52.432391 5002 scope.go:117] "RemoveContainer" containerID="2ec923818432c5547047282c04d7c0602c82b786facd041bab6e873451fb0191" Dec 09 11:21:52 crc kubenswrapper[5002]: I1209 11:21:52.432501 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Dec 09 11:21:52 crc kubenswrapper[5002]: I1209 11:21:52.495487 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cwlh\" (UniqueName: \"kubernetes.io/projected/60704fe5-bcbb-4089-97a2-4a3226172993-kube-api-access-8cwlh\") pod \"mariadb-client-4-default\" (UID: \"60704fe5-bcbb-4089-97a2-4a3226172993\") " pod="openstack/mariadb-client-4-default" Dec 09 11:21:52 crc kubenswrapper[5002]: I1209 11:21:52.596747 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cwlh\" (UniqueName: \"kubernetes.io/projected/60704fe5-bcbb-4089-97a2-4a3226172993-kube-api-access-8cwlh\") pod \"mariadb-client-4-default\" (UID: \"60704fe5-bcbb-4089-97a2-4a3226172993\") " pod="openstack/mariadb-client-4-default" Dec 09 11:21:52 crc kubenswrapper[5002]: I1209 11:21:52.618896 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cwlh\" (UniqueName: \"kubernetes.io/projected/60704fe5-bcbb-4089-97a2-4a3226172993-kube-api-access-8cwlh\") pod \"mariadb-client-4-default\" (UID: \"60704fe5-bcbb-4089-97a2-4a3226172993\") " pod="openstack/mariadb-client-4-default" Dec 09 11:21:52 crc kubenswrapper[5002]: I1209 11:21:52.804292 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Dec 09 11:21:53 crc kubenswrapper[5002]: I1209 11:21:53.060568 5002 scope.go:117] "RemoveContainer" containerID="77659b1c47a86560c49c87566b1a9998228a70363d6028c0f54d97d529fda9b3" Dec 09 11:21:53 crc kubenswrapper[5002]: E1209 11:21:53.061058 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:21:53 crc kubenswrapper[5002]: I1209 11:21:53.100513 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Dec 09 11:21:53 crc kubenswrapper[5002]: W1209 11:21:53.103650 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60704fe5_bcbb_4089_97a2_4a3226172993.slice/crio-accbd26a3ae9249263ae4071f9fd5bb2aa3d15f82ebf35c0930e9661a67e6c30 WatchSource:0}: Error finding container accbd26a3ae9249263ae4071f9fd5bb2aa3d15f82ebf35c0930e9661a67e6c30: Status 404 returned error can't find the container with id accbd26a3ae9249263ae4071f9fd5bb2aa3d15f82ebf35c0930e9661a67e6c30 Dec 09 11:21:53 crc kubenswrapper[5002]: I1209 11:21:53.439741 5002 generic.go:334] "Generic (PLEG): container finished" podID="60704fe5-bcbb-4089-97a2-4a3226172993" containerID="27aea40106f6f495ed696184538025476c29d6c8f42d9ed863f9baca11bb8c2d" exitCode=0 Dec 09 11:21:53 crc kubenswrapper[5002]: I1209 11:21:53.439854 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"60704fe5-bcbb-4089-97a2-4a3226172993","Type":"ContainerDied","Data":"27aea40106f6f495ed696184538025476c29d6c8f42d9ed863f9baca11bb8c2d"} Dec 09 11:21:53 crc kubenswrapper[5002]: I1209 11:21:53.440107 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" 
event={"ID":"60704fe5-bcbb-4089-97a2-4a3226172993","Type":"ContainerStarted","Data":"accbd26a3ae9249263ae4071f9fd5bb2aa3d15f82ebf35c0930e9661a67e6c30"} Dec 09 11:21:54 crc kubenswrapper[5002]: I1209 11:21:54.827553 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Dec 09 11:21:54 crc kubenswrapper[5002]: I1209 11:21:54.844592 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-4-default_60704fe5-bcbb-4089-97a2-4a3226172993/mariadb-client-4-default/0.log" Dec 09 11:21:54 crc kubenswrapper[5002]: I1209 11:21:54.866614 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-4-default"] Dec 09 11:21:54 crc kubenswrapper[5002]: I1209 11:21:54.872794 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-4-default"] Dec 09 11:21:54 crc kubenswrapper[5002]: I1209 11:21:54.927342 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cwlh\" (UniqueName: \"kubernetes.io/projected/60704fe5-bcbb-4089-97a2-4a3226172993-kube-api-access-8cwlh\") pod \"60704fe5-bcbb-4089-97a2-4a3226172993\" (UID: \"60704fe5-bcbb-4089-97a2-4a3226172993\") " Dec 09 11:21:54 crc kubenswrapper[5002]: I1209 11:21:54.932364 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60704fe5-bcbb-4089-97a2-4a3226172993-kube-api-access-8cwlh" (OuterVolumeSpecName: "kube-api-access-8cwlh") pod "60704fe5-bcbb-4089-97a2-4a3226172993" (UID: "60704fe5-bcbb-4089-97a2-4a3226172993"). InnerVolumeSpecName "kube-api-access-8cwlh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:21:55 crc kubenswrapper[5002]: I1209 11:21:55.029561 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cwlh\" (UniqueName: \"kubernetes.io/projected/60704fe5-bcbb-4089-97a2-4a3226172993-kube-api-access-8cwlh\") on node \"crc\" DevicePath \"\"" Dec 09 11:21:55 crc kubenswrapper[5002]: I1209 11:21:55.461767 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="accbd26a3ae9249263ae4071f9fd5bb2aa3d15f82ebf35c0930e9661a67e6c30" Dec 09 11:21:55 crc kubenswrapper[5002]: I1209 11:21:55.461940 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Dec 09 11:21:56 crc kubenswrapper[5002]: I1209 11:21:56.080230 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60704fe5-bcbb-4089-97a2-4a3226172993" path="/var/lib/kubelet/pods/60704fe5-bcbb-4089-97a2-4a3226172993/volumes" Dec 09 11:21:58 crc kubenswrapper[5002]: I1209 11:21:58.301649 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-5-default"] Dec 09 11:21:58 crc kubenswrapper[5002]: E1209 11:21:58.302542 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60704fe5-bcbb-4089-97a2-4a3226172993" containerName="mariadb-client-4-default" Dec 09 11:21:58 crc kubenswrapper[5002]: I1209 11:21:58.302563 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="60704fe5-bcbb-4089-97a2-4a3226172993" containerName="mariadb-client-4-default" Dec 09 11:21:58 crc kubenswrapper[5002]: I1209 11:21:58.302759 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="60704fe5-bcbb-4089-97a2-4a3226172993" containerName="mariadb-client-4-default" Dec 09 11:21:58 crc kubenswrapper[5002]: I1209 11:21:58.303447 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Dec 09 11:21:58 crc kubenswrapper[5002]: I1209 11:21:58.309014 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-knjzb" Dec 09 11:21:58 crc kubenswrapper[5002]: I1209 11:21:58.315031 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Dec 09 11:21:58 crc kubenswrapper[5002]: I1209 11:21:58.496235 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s6tl\" (UniqueName: \"kubernetes.io/projected/43c377e7-93ef-454d-a2bd-10d67937015a-kube-api-access-7s6tl\") pod \"mariadb-client-5-default\" (UID: \"43c377e7-93ef-454d-a2bd-10d67937015a\") " pod="openstack/mariadb-client-5-default" Dec 09 11:21:58 crc kubenswrapper[5002]: I1209 11:21:58.597488 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s6tl\" (UniqueName: \"kubernetes.io/projected/43c377e7-93ef-454d-a2bd-10d67937015a-kube-api-access-7s6tl\") pod \"mariadb-client-5-default\" (UID: \"43c377e7-93ef-454d-a2bd-10d67937015a\") " pod="openstack/mariadb-client-5-default" Dec 09 11:21:58 crc kubenswrapper[5002]: I1209 11:21:58.619003 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s6tl\" (UniqueName: \"kubernetes.io/projected/43c377e7-93ef-454d-a2bd-10d67937015a-kube-api-access-7s6tl\") pod \"mariadb-client-5-default\" (UID: \"43c377e7-93ef-454d-a2bd-10d67937015a\") " pod="openstack/mariadb-client-5-default" Dec 09 11:21:58 crc kubenswrapper[5002]: I1209 11:21:58.630468 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default" Dec 09 11:21:59 crc kubenswrapper[5002]: I1209 11:21:59.153215 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Dec 09 11:21:59 crc kubenswrapper[5002]: I1209 11:21:59.508887 5002 generic.go:334] "Generic (PLEG): container finished" podID="43c377e7-93ef-454d-a2bd-10d67937015a" containerID="44c387a05f789df1d5f06c593f8965b02fb9324150d9aeb353cf6431adc931d5" exitCode=0 Dec 09 11:21:59 crc kubenswrapper[5002]: I1209 11:21:59.508991 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"43c377e7-93ef-454d-a2bd-10d67937015a","Type":"ContainerDied","Data":"44c387a05f789df1d5f06c593f8965b02fb9324150d9aeb353cf6431adc931d5"} Dec 09 11:21:59 crc kubenswrapper[5002]: I1209 11:21:59.509070 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"43c377e7-93ef-454d-a2bd-10d67937015a","Type":"ContainerStarted","Data":"09f144b73355c8e5bfc6b6cd94253f2e16fb819ae7e8ac235cdd152ab0acf38a"} Dec 09 11:22:00 crc kubenswrapper[5002]: I1209 11:22:00.906440 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Dec 09 11:22:00 crc kubenswrapper[5002]: I1209 11:22:00.924250 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-5-default_43c377e7-93ef-454d-a2bd-10d67937015a/mariadb-client-5-default/0.log" Dec 09 11:22:00 crc kubenswrapper[5002]: I1209 11:22:00.952419 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-5-default"] Dec 09 11:22:00 crc kubenswrapper[5002]: I1209 11:22:00.959031 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-5-default"] Dec 09 11:22:01 crc kubenswrapper[5002]: I1209 11:22:01.036962 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7s6tl\" (UniqueName: \"kubernetes.io/projected/43c377e7-93ef-454d-a2bd-10d67937015a-kube-api-access-7s6tl\") pod \"43c377e7-93ef-454d-a2bd-10d67937015a\" (UID: \"43c377e7-93ef-454d-a2bd-10d67937015a\") " Dec 09 11:22:01 crc kubenswrapper[5002]: I1209 11:22:01.043100 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43c377e7-93ef-454d-a2bd-10d67937015a-kube-api-access-7s6tl" (OuterVolumeSpecName: "kube-api-access-7s6tl") pod "43c377e7-93ef-454d-a2bd-10d67937015a" (UID: "43c377e7-93ef-454d-a2bd-10d67937015a"). InnerVolumeSpecName "kube-api-access-7s6tl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:22:01 crc kubenswrapper[5002]: I1209 11:22:01.096981 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-6-default"] Dec 09 11:22:01 crc kubenswrapper[5002]: E1209 11:22:01.097477 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43c377e7-93ef-454d-a2bd-10d67937015a" containerName="mariadb-client-5-default" Dec 09 11:22:01 crc kubenswrapper[5002]: I1209 11:22:01.097509 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="43c377e7-93ef-454d-a2bd-10d67937015a" containerName="mariadb-client-5-default" Dec 09 11:22:01 crc kubenswrapper[5002]: I1209 11:22:01.097867 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="43c377e7-93ef-454d-a2bd-10d67937015a" containerName="mariadb-client-5-default" Dec 09 11:22:01 crc kubenswrapper[5002]: I1209 11:22:01.098725 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Dec 09 11:22:01 crc kubenswrapper[5002]: I1209 11:22:01.104906 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Dec 09 11:22:01 crc kubenswrapper[5002]: I1209 11:22:01.138318 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7s6tl\" (UniqueName: \"kubernetes.io/projected/43c377e7-93ef-454d-a2bd-10d67937015a-kube-api-access-7s6tl\") on node \"crc\" DevicePath \"\"" Dec 09 11:22:01 crc kubenswrapper[5002]: I1209 11:22:01.241804 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc9ll\" (UniqueName: \"kubernetes.io/projected/40050f31-907b-4038-8adf-9935226bcd9f-kube-api-access-xc9ll\") pod \"mariadb-client-6-default\" (UID: \"40050f31-907b-4038-8adf-9935226bcd9f\") " pod="openstack/mariadb-client-6-default" Dec 09 11:22:01 crc kubenswrapper[5002]: I1209 11:22:01.346697 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc9ll\" (UniqueName: \"kubernetes.io/projected/40050f31-907b-4038-8adf-9935226bcd9f-kube-api-access-xc9ll\") pod \"mariadb-client-6-default\" (UID: \"40050f31-907b-4038-8adf-9935226bcd9f\") " pod="openstack/mariadb-client-6-default" Dec 09 11:22:01 crc kubenswrapper[5002]: I1209 11:22:01.375295 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc9ll\" (UniqueName: \"kubernetes.io/projected/40050f31-907b-4038-8adf-9935226bcd9f-kube-api-access-xc9ll\") pod \"mariadb-client-6-default\" (UID: \"40050f31-907b-4038-8adf-9935226bcd9f\") " pod="openstack/mariadb-client-6-default" Dec 09 11:22:01 crc kubenswrapper[5002]: I1209 11:22:01.426664 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Dec 09 11:22:01 crc kubenswrapper[5002]: I1209 11:22:01.532789 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09f144b73355c8e5bfc6b6cd94253f2e16fb819ae7e8ac235cdd152ab0acf38a" Dec 09 11:22:01 crc kubenswrapper[5002]: I1209 11:22:01.532895 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Dec 09 11:22:01 crc kubenswrapper[5002]: I1209 11:22:01.957991 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Dec 09 11:22:01 crc kubenswrapper[5002]: W1209 11:22:01.962177 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40050f31_907b_4038_8adf_9935226bcd9f.slice/crio-e83d29f0ce7ead58ce53d51b17d882c586ecdaca0c366374ca427096aa1f9b19 WatchSource:0}: Error finding container e83d29f0ce7ead58ce53d51b17d882c586ecdaca0c366374ca427096aa1f9b19: Status 404 returned error can't find the container with id e83d29f0ce7ead58ce53d51b17d882c586ecdaca0c366374ca427096aa1f9b19 Dec 09 11:22:02 crc kubenswrapper[5002]: I1209 11:22:02.071555 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43c377e7-93ef-454d-a2bd-10d67937015a" path="/var/lib/kubelet/pods/43c377e7-93ef-454d-a2bd-10d67937015a/volumes" Dec 09 11:22:02 crc kubenswrapper[5002]: I1209 11:22:02.545191 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"40050f31-907b-4038-8adf-9935226bcd9f","Type":"ContainerStarted","Data":"3d8dfa4b55b6b9e0cf5513ebf04960913f759767090c55b2a1f6dddcc4f7bff0"} Dec 09 11:22:02 crc kubenswrapper[5002]: I1209 11:22:02.545269 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"40050f31-907b-4038-8adf-9935226bcd9f","Type":"ContainerStarted","Data":"e83d29f0ce7ead58ce53d51b17d882c586ecdaca0c366374ca427096aa1f9b19"} Dec 09 11:22:02 crc kubenswrapper[5002]: I1209 11:22:02.574986 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-6-default" podStartSLOduration=1.574954594 podStartE2EDuration="1.574954594s" podCreationTimestamp="2025-12-09 11:22:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:22:02.567338581 +0000 UTC m=+4854.959389722" watchObservedRunningTime="2025-12-09 11:22:02.574954594 +0000 UTC m=+4854.967005735" Dec 09 11:22:03 crc kubenswrapper[5002]: I1209 11:22:03.560721 5002 generic.go:334] "Generic (PLEG): container finished" podID="40050f31-907b-4038-8adf-9935226bcd9f" containerID="3d8dfa4b55b6b9e0cf5513ebf04960913f759767090c55b2a1f6dddcc4f7bff0" exitCode=1 Dec 09 11:22:03 crc kubenswrapper[5002]: I1209 11:22:03.560883 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"40050f31-907b-4038-8adf-9935226bcd9f","Type":"ContainerDied","Data":"3d8dfa4b55b6b9e0cf5513ebf04960913f759767090c55b2a1f6dddcc4f7bff0"} Dec 09 11:22:04 crc kubenswrapper[5002]: I1209 11:22:04.939204 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Dec 09 11:22:04 crc kubenswrapper[5002]: I1209 11:22:04.985044 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-6-default"] Dec 09 11:22:04 crc kubenswrapper[5002]: I1209 11:22:04.993555 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-6-default"] Dec 09 11:22:05 crc kubenswrapper[5002]: I1209 11:22:05.110133 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xc9ll\" (UniqueName: \"kubernetes.io/projected/40050f31-907b-4038-8adf-9935226bcd9f-kube-api-access-xc9ll\") pod \"40050f31-907b-4038-8adf-9935226bcd9f\" (UID: \"40050f31-907b-4038-8adf-9935226bcd9f\") " Dec 09 11:22:05 crc kubenswrapper[5002]: I1209 11:22:05.116182 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40050f31-907b-4038-8adf-9935226bcd9f-kube-api-access-xc9ll" (OuterVolumeSpecName: "kube-api-access-xc9ll") pod "40050f31-907b-4038-8adf-9935226bcd9f" (UID: "40050f31-907b-4038-8adf-9935226bcd9f"). InnerVolumeSpecName "kube-api-access-xc9ll". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:22:05 crc kubenswrapper[5002]: I1209 11:22:05.152969 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-7-default"] Dec 09 11:22:05 crc kubenswrapper[5002]: E1209 11:22:05.153370 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40050f31-907b-4038-8adf-9935226bcd9f" containerName="mariadb-client-6-default" Dec 09 11:22:05 crc kubenswrapper[5002]: I1209 11:22:05.153386 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="40050f31-907b-4038-8adf-9935226bcd9f" containerName="mariadb-client-6-default" Dec 09 11:22:05 crc kubenswrapper[5002]: I1209 11:22:05.153604 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="40050f31-907b-4038-8adf-9935226bcd9f" containerName="mariadb-client-6-default" Dec 09 11:22:05 crc kubenswrapper[5002]: I1209 11:22:05.154245 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Dec 09 11:22:05 crc kubenswrapper[5002]: I1209 11:22:05.159928 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Dec 09 11:22:05 crc kubenswrapper[5002]: I1209 11:22:05.212156 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h72bh\" (UniqueName: \"kubernetes.io/projected/058ece2e-24e4-4bd6-8783-6f935adafddc-kube-api-access-h72bh\") pod \"mariadb-client-7-default\" (UID: \"058ece2e-24e4-4bd6-8783-6f935adafddc\") " pod="openstack/mariadb-client-7-default" Dec 09 11:22:05 crc kubenswrapper[5002]: I1209 11:22:05.212284 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xc9ll\" (UniqueName: \"kubernetes.io/projected/40050f31-907b-4038-8adf-9935226bcd9f-kube-api-access-xc9ll\") on node \"crc\" DevicePath \"\"" Dec 09 11:22:05 crc kubenswrapper[5002]: I1209 11:22:05.313541 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h72bh\" (UniqueName: \"kubernetes.io/projected/058ece2e-24e4-4bd6-8783-6f935adafddc-kube-api-access-h72bh\") pod \"mariadb-client-7-default\" (UID: \"058ece2e-24e4-4bd6-8783-6f935adafddc\") " pod="openstack/mariadb-client-7-default" Dec 09 11:22:05 crc kubenswrapper[5002]: I1209 11:22:05.347980 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h72bh\" (UniqueName: \"kubernetes.io/projected/058ece2e-24e4-4bd6-8783-6f935adafddc-kube-api-access-h72bh\") pod \"mariadb-client-7-default\" (UID: \"058ece2e-24e4-4bd6-8783-6f935adafddc\") " pod="openstack/mariadb-client-7-default" Dec 09 11:22:05 crc kubenswrapper[5002]: I1209 11:22:05.481134 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default" Dec 09 11:22:05 crc kubenswrapper[5002]: I1209 11:22:05.583713 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e83d29f0ce7ead58ce53d51b17d882c586ecdaca0c366374ca427096aa1f9b19" Dec 09 11:22:05 crc kubenswrapper[5002]: I1209 11:22:05.583838 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Dec 09 11:22:05 crc kubenswrapper[5002]: I1209 11:22:05.768506 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Dec 09 11:22:06 crc kubenswrapper[5002]: I1209 11:22:06.072589 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40050f31-907b-4038-8adf-9935226bcd9f" path="/var/lib/kubelet/pods/40050f31-907b-4038-8adf-9935226bcd9f/volumes" Dec 09 11:22:06 crc kubenswrapper[5002]: I1209 11:22:06.595262 5002 generic.go:334] "Generic (PLEG): container finished" podID="058ece2e-24e4-4bd6-8783-6f935adafddc" containerID="57854ddb2a9873061bcbb8a6d226bae003ab7d6a442fade6983a2b5804d5f35a" exitCode=0 Dec 09 11:22:06 crc kubenswrapper[5002]: I1209 11:22:06.595346 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"058ece2e-24e4-4bd6-8783-6f935adafddc","Type":"ContainerDied","Data":"57854ddb2a9873061bcbb8a6d226bae003ab7d6a442fade6983a2b5804d5f35a"} Dec 09 11:22:06 crc kubenswrapper[5002]: I1209 11:22:06.595589 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"058ece2e-24e4-4bd6-8783-6f935adafddc","Type":"ContainerStarted","Data":"860e9070fe67a4aa04d94133f6eca665359e346b686e787ee796459bfa445845"} Dec 09 11:22:07 crc kubenswrapper[5002]: I1209 11:22:07.996548 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default" Dec 09 11:22:08 crc kubenswrapper[5002]: I1209 11:22:08.022867 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-7-default_058ece2e-24e4-4bd6-8783-6f935adafddc/mariadb-client-7-default/0.log" Dec 09 11:22:08 crc kubenswrapper[5002]: I1209 11:22:08.053157 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-7-default"] Dec 09 11:22:08 crc kubenswrapper[5002]: I1209 11:22:08.066388 5002 scope.go:117] "RemoveContainer" containerID="77659b1c47a86560c49c87566b1a9998228a70363d6028c0f54d97d529fda9b3" Dec 09 11:22:08 crc kubenswrapper[5002]: E1209 11:22:08.066666 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:22:08 crc kubenswrapper[5002]: I1209 11:22:08.074998 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-7-default"] Dec 09 11:22:08 crc kubenswrapper[5002]: I1209 11:22:08.153475 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h72bh\" (UniqueName: \"kubernetes.io/projected/058ece2e-24e4-4bd6-8783-6f935adafddc-kube-api-access-h72bh\") pod \"058ece2e-24e4-4bd6-8783-6f935adafddc\" (UID: \"058ece2e-24e4-4bd6-8783-6f935adafddc\") " Dec 09 11:22:08 crc kubenswrapper[5002]: I1209 11:22:08.158185 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/058ece2e-24e4-4bd6-8783-6f935adafddc-kube-api-access-h72bh" (OuterVolumeSpecName: "kube-api-access-h72bh") pod "058ece2e-24e4-4bd6-8783-6f935adafddc" (UID: "058ece2e-24e4-4bd6-8783-6f935adafddc"). InnerVolumeSpecName "kube-api-access-h72bh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:22:08 crc kubenswrapper[5002]: I1209 11:22:08.190272 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2"] Dec 09 11:22:08 crc kubenswrapper[5002]: E1209 11:22:08.190590 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="058ece2e-24e4-4bd6-8783-6f935adafddc" containerName="mariadb-client-7-default" Dec 09 11:22:08 crc kubenswrapper[5002]: I1209 11:22:08.190606 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="058ece2e-24e4-4bd6-8783-6f935adafddc" containerName="mariadb-client-7-default" Dec 09 11:22:08 crc kubenswrapper[5002]: I1209 11:22:08.190753 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="058ece2e-24e4-4bd6-8783-6f935adafddc" containerName="mariadb-client-7-default" Dec 09 11:22:08 crc kubenswrapper[5002]: I1209 11:22:08.191223 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Dec 09 11:22:08 crc kubenswrapper[5002]: I1209 11:22:08.197794 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Dec 09 11:22:08 crc kubenswrapper[5002]: I1209 11:22:08.255825 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgzgn\" (UniqueName: \"kubernetes.io/projected/1212ded3-b6ce-436a-a645-8d178efebe07-kube-api-access-rgzgn\") pod \"mariadb-client-2\" (UID: \"1212ded3-b6ce-436a-a645-8d178efebe07\") " pod="openstack/mariadb-client-2" Dec 09 11:22:08 crc kubenswrapper[5002]: I1209 11:22:08.255930 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h72bh\" (UniqueName: \"kubernetes.io/projected/058ece2e-24e4-4bd6-8783-6f935adafddc-kube-api-access-h72bh\") on node \"crc\" DevicePath \"\"" Dec 09 11:22:08 crc kubenswrapper[5002]: I1209 11:22:08.356646 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgzgn\" (UniqueName: \"kubernetes.io/projected/1212ded3-b6ce-436a-a645-8d178efebe07-kube-api-access-rgzgn\") pod \"mariadb-client-2\" (UID: \"1212ded3-b6ce-436a-a645-8d178efebe07\") " pod="openstack/mariadb-client-2" Dec 09 11:22:08 crc kubenswrapper[5002]: I1209 11:22:08.386752 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgzgn\" (UniqueName: \"kubernetes.io/projected/1212ded3-b6ce-436a-a645-8d178efebe07-kube-api-access-rgzgn\") pod \"mariadb-client-2\" (UID: \"1212ded3-b6ce-436a-a645-8d178efebe07\") " pod="openstack/mariadb-client-2" Dec 09 11:22:08 crc kubenswrapper[5002]: I1209 11:22:08.516715 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Dec 09 11:22:08 crc kubenswrapper[5002]: I1209 11:22:08.622030 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="860e9070fe67a4aa04d94133f6eca665359e346b686e787ee796459bfa445845" Dec 09 11:22:08 crc kubenswrapper[5002]: I1209 11:22:08.622103 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Dec 09 11:22:08 crc kubenswrapper[5002]: I1209 11:22:08.863031 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Dec 09 11:22:08 crc kubenswrapper[5002]: W1209 11:22:08.867241 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1212ded3_b6ce_436a_a645_8d178efebe07.slice/crio-16a4fffddbff992f96e17bafac082a26a4b81eea830601fb6f3e57278515d8a4 WatchSource:0}: Error finding container 16a4fffddbff992f96e17bafac082a26a4b81eea830601fb6f3e57278515d8a4: Status 404 returned error can't find the container with id 16a4fffddbff992f96e17bafac082a26a4b81eea830601fb6f3e57278515d8a4 Dec 09 11:22:09 crc kubenswrapper[5002]: I1209 11:22:09.634035 5002 generic.go:334] "Generic (PLEG): container finished" podID="1212ded3-b6ce-436a-a645-8d178efebe07" containerID="bb51280d69f0d38a25167ca4662f5ad638a23d874ff4dbc668934fa9a1a5cf9b" exitCode=0 Dec 09 11:22:09 crc kubenswrapper[5002]: I1209 11:22:09.634192 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"1212ded3-b6ce-436a-a645-8d178efebe07","Type":"ContainerDied","Data":"bb51280d69f0d38a25167ca4662f5ad638a23d874ff4dbc668934fa9a1a5cf9b"} Dec 09 11:22:09 crc kubenswrapper[5002]: I1209 11:22:09.634543 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"1212ded3-b6ce-436a-a645-8d178efebe07","Type":"ContainerStarted","Data":"16a4fffddbff992f96e17bafac082a26a4b81eea830601fb6f3e57278515d8a4"} Dec 09 11:22:10 crc kubenswrapper[5002]: I1209 11:22:10.072058 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="058ece2e-24e4-4bd6-8783-6f935adafddc" path="/var/lib/kubelet/pods/058ece2e-24e4-4bd6-8783-6f935adafddc/volumes" Dec 09 11:22:11 crc kubenswrapper[5002]: I1209 11:22:11.115224 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Dec 09 11:22:11 crc kubenswrapper[5002]: I1209 11:22:11.144066 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-2_1212ded3-b6ce-436a-a645-8d178efebe07/mariadb-client-2/0.log" Dec 09 11:22:11 crc kubenswrapper[5002]: I1209 11:22:11.166624 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2"] Dec 09 11:22:11 crc kubenswrapper[5002]: I1209 11:22:11.172196 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2"] Dec 09 11:22:11 crc kubenswrapper[5002]: I1209 11:22:11.229555 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgzgn\" (UniqueName: \"kubernetes.io/projected/1212ded3-b6ce-436a-a645-8d178efebe07-kube-api-access-rgzgn\") pod \"1212ded3-b6ce-436a-a645-8d178efebe07\" (UID: \"1212ded3-b6ce-436a-a645-8d178efebe07\") " Dec 09 11:22:11 crc kubenswrapper[5002]: I1209 11:22:11.244919 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1212ded3-b6ce-436a-a645-8d178efebe07-kube-api-access-rgzgn" (OuterVolumeSpecName: "kube-api-access-rgzgn") pod "1212ded3-b6ce-436a-a645-8d178efebe07" (UID: "1212ded3-b6ce-436a-a645-8d178efebe07"). InnerVolumeSpecName "kube-api-access-rgzgn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:22:11 crc kubenswrapper[5002]: I1209 11:22:11.331355 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgzgn\" (UniqueName: \"kubernetes.io/projected/1212ded3-b6ce-436a-a645-8d178efebe07-kube-api-access-rgzgn\") on node \"crc\" DevicePath \"\"" Dec 09 11:22:11 crc kubenswrapper[5002]: I1209 11:22:11.659124 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16a4fffddbff992f96e17bafac082a26a4b81eea830601fb6f3e57278515d8a4" Dec 09 11:22:11 crc kubenswrapper[5002]: I1209 11:22:11.659211 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Dec 09 11:22:12 crc kubenswrapper[5002]: I1209 11:22:12.077272 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1212ded3-b6ce-436a-a645-8d178efebe07" path="/var/lib/kubelet/pods/1212ded3-b6ce-436a-a645-8d178efebe07/volumes" Dec 09 11:22:23 crc kubenswrapper[5002]: I1209 11:22:23.060898 5002 scope.go:117] "RemoveContainer" containerID="77659b1c47a86560c49c87566b1a9998228a70363d6028c0f54d97d529fda9b3" Dec 09 11:22:23 crc kubenswrapper[5002]: I1209 11:22:23.801726 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerStarted","Data":"4dcfb78f9776dceb7a4279696fc5ccc4c505ca97a007b3ec6ade5fcf452969c8"} Dec 09 11:22:45 crc kubenswrapper[5002]: I1209 11:22:45.575229 5002 scope.go:117] "RemoveContainer" containerID="914e993c4badace72ab6cb15e7954efe7851adf6447e9d9fdde84d6738f9a6dc" Dec 09 11:24:37 crc kubenswrapper[5002]: I1209 11:24:37.968868 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 11:24:37 crc kubenswrapper[5002]: I1209 11:24:37.969548 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 11:25:07 crc kubenswrapper[5002]: I1209 11:25:07.965259 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 11:25:07 crc kubenswrapper[5002]: I1209 11:25:07.965984 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 11:25:37 crc kubenswrapper[5002]: I1209 11:25:37.965093 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 11:25:37 crc 
kubenswrapper[5002]: I1209 11:25:37.965752 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 11:25:37 crc kubenswrapper[5002]: I1209 11:25:37.965805 5002 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" Dec 09 11:25:37 crc kubenswrapper[5002]: I1209 11:25:37.966540 5002 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4dcfb78f9776dceb7a4279696fc5ccc4c505ca97a007b3ec6ade5fcf452969c8"} pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 11:25:37 crc kubenswrapper[5002]: I1209 11:25:37.966601 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" containerID="cri-o://4dcfb78f9776dceb7a4279696fc5ccc4c505ca97a007b3ec6ade5fcf452969c8" gracePeriod=600 Dec 09 11:25:38 crc kubenswrapper[5002]: I1209 11:25:38.432400 5002 generic.go:334] "Generic (PLEG): container finished" podID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerID="4dcfb78f9776dceb7a4279696fc5ccc4c505ca97a007b3ec6ade5fcf452969c8" exitCode=0 Dec 09 11:25:38 crc kubenswrapper[5002]: I1209 11:25:38.432469 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerDied","Data":"4dcfb78f9776dceb7a4279696fc5ccc4c505ca97a007b3ec6ade5fcf452969c8"} Dec 09 11:25:38 crc kubenswrapper[5002]: I1209 11:25:38.433025 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerStarted","Data":"24eda190128d46e2bfa806f4839b38f2462cd8acaa8816efdf9934cf2dc46679"} Dec 09 11:25:38 crc kubenswrapper[5002]: I1209 11:25:38.433047 5002 scope.go:117] "RemoveContainer" containerID="77659b1c47a86560c49c87566b1a9998228a70363d6028c0f54d97d529fda9b3" Dec 09 11:26:08 crc kubenswrapper[5002]: I1209 11:26:08.973364 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tg2r8"] Dec 09 11:26:08 crc kubenswrapper[5002]: E1209 11:26:08.974353 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1212ded3-b6ce-436a-a645-8d178efebe07" containerName="mariadb-client-2" Dec 09 11:26:08 crc kubenswrapper[5002]: I1209 11:26:08.974371 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="1212ded3-b6ce-436a-a645-8d178efebe07" containerName="mariadb-client-2" Dec 09 11:26:08 crc kubenswrapper[5002]: I1209 11:26:08.974750 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="1212ded3-b6ce-436a-a645-8d178efebe07" containerName="mariadb-client-2" Dec 09 11:26:08 crc kubenswrapper[5002]: I1209 11:26:08.978134 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tg2r8" Dec 09 11:26:08 crc kubenswrapper[5002]: I1209 11:26:08.981442 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tg2r8"] Dec 09 11:26:09 crc kubenswrapper[5002]: I1209 11:26:09.145588 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/055f8e94-e686-4bfe-afe9-a1cdd55fcae4-catalog-content\") pod \"redhat-marketplace-tg2r8\" (UID: \"055f8e94-e686-4bfe-afe9-a1cdd55fcae4\") " pod="openshift-marketplace/redhat-marketplace-tg2r8" Dec 09 11:26:09 crc kubenswrapper[5002]: I1209 11:26:09.145657 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6f4v\" (UniqueName: \"kubernetes.io/projected/055f8e94-e686-4bfe-afe9-a1cdd55fcae4-kube-api-access-h6f4v\") pod \"redhat-marketplace-tg2r8\" (UID: \"055f8e94-e686-4bfe-afe9-a1cdd55fcae4\") " pod="openshift-marketplace/redhat-marketplace-tg2r8" Dec 09 11:26:09 crc kubenswrapper[5002]: I1209 11:26:09.145679 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/055f8e94-e686-4bfe-afe9-a1cdd55fcae4-utilities\") pod \"redhat-marketplace-tg2r8\" (UID: \"055f8e94-e686-4bfe-afe9-a1cdd55fcae4\") " pod="openshift-marketplace/redhat-marketplace-tg2r8" Dec 09 11:26:09 crc kubenswrapper[5002]: I1209 11:26:09.247228 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/055f8e94-e686-4bfe-afe9-a1cdd55fcae4-catalog-content\") pod \"redhat-marketplace-tg2r8\" (UID: \"055f8e94-e686-4bfe-afe9-a1cdd55fcae4\") " pod="openshift-marketplace/redhat-marketplace-tg2r8" Dec 09 11:26:09 crc kubenswrapper[5002]: I1209 11:26:09.247340 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6f4v\" (UniqueName: \"kubernetes.io/projected/055f8e94-e686-4bfe-afe9-a1cdd55fcae4-kube-api-access-h6f4v\") pod \"redhat-marketplace-tg2r8\" (UID: \"055f8e94-e686-4bfe-afe9-a1cdd55fcae4\") " pod="openshift-marketplace/redhat-marketplace-tg2r8" Dec 09 11:26:09 crc kubenswrapper[5002]: I1209 11:26:09.247368 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/055f8e94-e686-4bfe-afe9-a1cdd55fcae4-utilities\") pod \"redhat-marketplace-tg2r8\" (UID: \"055f8e94-e686-4bfe-afe9-a1cdd55fcae4\") " pod="openshift-marketplace/redhat-marketplace-tg2r8" Dec 09 11:26:09 crc kubenswrapper[5002]: I1209 11:26:09.247765 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/055f8e94-e686-4bfe-afe9-a1cdd55fcae4-catalog-content\") pod \"redhat-marketplace-tg2r8\" (UID: \"055f8e94-e686-4bfe-afe9-a1cdd55fcae4\") " pod="openshift-marketplace/redhat-marketplace-tg2r8" Dec 09 11:26:09 crc kubenswrapper[5002]: I1209 11:26:09.248092 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/055f8e94-e686-4bfe-afe9-a1cdd55fcae4-utilities\") pod \"redhat-marketplace-tg2r8\" (UID: \"055f8e94-e686-4bfe-afe9-a1cdd55fcae4\") " pod="openshift-marketplace/redhat-marketplace-tg2r8" Dec 09 11:26:09 crc kubenswrapper[5002]: I1209 11:26:09.266283 5002 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-h6f4v\" (UniqueName: \"kubernetes.io/projected/055f8e94-e686-4bfe-afe9-a1cdd55fcae4-kube-api-access-h6f4v\") pod \"redhat-marketplace-tg2r8\" (UID: \"055f8e94-e686-4bfe-afe9-a1cdd55fcae4\") " pod="openshift-marketplace/redhat-marketplace-tg2r8" Dec 09 11:26:09 crc kubenswrapper[5002]: I1209 11:26:09.302334 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tg2r8" Dec 09 11:26:09 crc kubenswrapper[5002]: I1209 11:26:09.779820 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tg2r8"] Dec 09 11:26:10 crc kubenswrapper[5002]: I1209 11:26:10.714597 5002 generic.go:334] "Generic (PLEG): container finished" podID="055f8e94-e686-4bfe-afe9-a1cdd55fcae4" containerID="1d95e9b787d15b4f8d1a9d6538ee6c9e99224a99a1e6a264e2d91a41ad3a4595" exitCode=0 Dec 09 11:26:10 crc kubenswrapper[5002]: I1209 11:26:10.714651 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tg2r8" event={"ID":"055f8e94-e686-4bfe-afe9-a1cdd55fcae4","Type":"ContainerDied","Data":"1d95e9b787d15b4f8d1a9d6538ee6c9e99224a99a1e6a264e2d91a41ad3a4595"} Dec 09 11:26:10 crc kubenswrapper[5002]: I1209 11:26:10.714676 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tg2r8" event={"ID":"055f8e94-e686-4bfe-afe9-a1cdd55fcae4","Type":"ContainerStarted","Data":"6b28c7a2e4889e6c93ade32cb4126c5c300c3d6ca470adea38eaaf2fd79f1784"} Dec 09 11:26:10 crc kubenswrapper[5002]: I1209 11:26:10.717286 5002 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 11:26:11 crc kubenswrapper[5002]: I1209 11:26:11.723577 5002 generic.go:334] "Generic (PLEG): container finished" podID="055f8e94-e686-4bfe-afe9-a1cdd55fcae4" containerID="385470063e34d16fc863ceec58ec34e0cc782a03de0eef479abea72d127d3dcb" exitCode=0 Dec 09 11:26:11 crc kubenswrapper[5002]: I1209 11:26:11.723673 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tg2r8" event={"ID":"055f8e94-e686-4bfe-afe9-a1cdd55fcae4","Type":"ContainerDied","Data":"385470063e34d16fc863ceec58ec34e0cc782a03de0eef479abea72d127d3dcb"} Dec 09 11:26:12 crc kubenswrapper[5002]: I1209 11:26:12.733522 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tg2r8" event={"ID":"055f8e94-e686-4bfe-afe9-a1cdd55fcae4","Type":"ContainerStarted","Data":"65eea6c9597a6f99c31d1e83c37974dbdf00b9bc39672e28b07289161405658f"} Dec 09 11:26:13 crc kubenswrapper[5002]: I1209 11:26:13.264033 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tg2r8" podStartSLOduration=3.879575828 podStartE2EDuration="5.264011706s" podCreationTimestamp="2025-12-09 11:26:08 +0000 UTC" firstStartedPulling="2025-12-09 11:26:10.716855299 +0000 UTC m=+5103.108906390" lastFinishedPulling="2025-12-09 11:26:12.101291187 +0000 UTC m=+5104.493342268" observedRunningTime="2025-12-09 11:26:12.758512725 +0000 UTC m=+5105.150563806" watchObservedRunningTime="2025-12-09 11:26:13.264011706 +0000 UTC m=+5105.656062787" Dec 09 11:26:13 crc kubenswrapper[5002]: I1209 11:26:13.266245 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Dec 09 11:26:13 crc kubenswrapper[5002]: I1209 11:26:13.269996 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Dec 09 11:26:13 crc kubenswrapper[5002]: I1209 11:26:13.271831 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-knjzb" Dec 09 11:26:13 crc kubenswrapper[5002]: I1209 11:26:13.282069 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Dec 09 11:26:13 crc kubenswrapper[5002]: I1209 11:26:13.404918 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfm6d\" (UniqueName: \"kubernetes.io/projected/64e3023c-35e4-43fb-b414-e7c2abbcd64d-kube-api-access-bfm6d\") pod \"mariadb-copy-data\" (UID: \"64e3023c-35e4-43fb-b414-e7c2abbcd64d\") " pod="openstack/mariadb-copy-data" Dec 09 11:26:13 crc kubenswrapper[5002]: I1209 11:26:13.404991 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7452df44-56a9-4ffd-999a-668c99b19f56\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7452df44-56a9-4ffd-999a-668c99b19f56\") pod \"mariadb-copy-data\" (UID: \"64e3023c-35e4-43fb-b414-e7c2abbcd64d\") " pod="openstack/mariadb-copy-data" Dec 09 11:26:13 crc kubenswrapper[5002]: I1209 11:26:13.506666 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7452df44-56a9-4ffd-999a-668c99b19f56\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7452df44-56a9-4ffd-999a-668c99b19f56\") pod \"mariadb-copy-data\" (UID: \"64e3023c-35e4-43fb-b414-e7c2abbcd64d\") " pod="openstack/mariadb-copy-data" Dec 09 11:26:13 crc kubenswrapper[5002]: I1209 11:26:13.506848 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfm6d\" (UniqueName: \"kubernetes.io/projected/64e3023c-35e4-43fb-b414-e7c2abbcd64d-kube-api-access-bfm6d\") pod \"mariadb-copy-data\" (UID: \"64e3023c-35e4-43fb-b414-e7c2abbcd64d\") " pod="openstack/mariadb-copy-data" Dec 09 11:26:13 crc kubenswrapper[5002]: I1209 11:26:13.510612 5002 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 09 11:26:13 crc kubenswrapper[5002]: I1209 11:26:13.510645 5002 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7452df44-56a9-4ffd-999a-668c99b19f56\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7452df44-56a9-4ffd-999a-668c99b19f56\") pod \"mariadb-copy-data\" (UID: \"64e3023c-35e4-43fb-b414-e7c2abbcd64d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e9b98b6ef2dbfa8cadd8e26d58bdb41d2a721ac1950e64d4b4125efa38f8f1ab/globalmount\"" pod="openstack/mariadb-copy-data"
Dec 09 11:26:13 crc kubenswrapper[5002]: I1209 11:26:13.526050 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfm6d\" (UniqueName: \"kubernetes.io/projected/64e3023c-35e4-43fb-b414-e7c2abbcd64d-kube-api-access-bfm6d\") pod \"mariadb-copy-data\" (UID: \"64e3023c-35e4-43fb-b414-e7c2abbcd64d\") " pod="openstack/mariadb-copy-data"
Dec 09 11:26:13 crc kubenswrapper[5002]: I1209 11:26:13.534340 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7452df44-56a9-4ffd-999a-668c99b19f56\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7452df44-56a9-4ffd-999a-668c99b19f56\") pod \"mariadb-copy-data\" (UID: \"64e3023c-35e4-43fb-b414-e7c2abbcd64d\") " pod="openstack/mariadb-copy-data"
Dec 09 11:26:13 crc kubenswrapper[5002]: I1209 11:26:13.631391 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data"
Dec 09 11:26:14 crc kubenswrapper[5002]: I1209 11:26:14.239206 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"]
Dec 09 11:26:14 crc kubenswrapper[5002]: W1209 11:26:14.248045 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64e3023c_35e4_43fb_b414_e7c2abbcd64d.slice/crio-00f9c63990c46149b68ea76671d594a55c85b27cfbc8d24b9ec42c07c48e59c0 WatchSource:0}: Error finding container 00f9c63990c46149b68ea76671d594a55c85b27cfbc8d24b9ec42c07c48e59c0: Status 404 returned error can't find the container with id 00f9c63990c46149b68ea76671d594a55c85b27cfbc8d24b9ec42c07c48e59c0
Dec 09 11:26:14 crc kubenswrapper[5002]: I1209 11:26:14.753945 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"64e3023c-35e4-43fb-b414-e7c2abbcd64d","Type":"ContainerStarted","Data":"a3e07ec08f505c0916ba42005e65f044690c20195ecda081e0183617c886894f"}
Dec 09 11:26:14 crc kubenswrapper[5002]: I1209 11:26:14.754937 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"64e3023c-35e4-43fb-b414-e7c2abbcd64d","Type":"ContainerStarted","Data":"00f9c63990c46149b68ea76671d594a55c85b27cfbc8d24b9ec42c07c48e59c0"}
Dec 09 11:26:14 crc kubenswrapper[5002]: I1209 11:26:14.776293 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=2.776268763 podStartE2EDuration="2.776268763s" podCreationTimestamp="2025-12-09 11:26:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:26:14.771501036 +0000 UTC m=+5107.163552137" watchObservedRunningTime="2025-12-09 11:26:14.776268763 +0000 UTC m=+5107.168319884"
Dec 09 11:26:17 crc kubenswrapper[5002]: I1209 11:26:17.455041 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"]
Dec 09 11:26:17 crc kubenswrapper[5002]: I1209 11:26:17.457307 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Dec 09 11:26:17 crc kubenswrapper[5002]: I1209 11:26:17.474293 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Dec 09 11:26:17 crc kubenswrapper[5002]: I1209 11:26:17.576706 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgdgj\" (UniqueName: \"kubernetes.io/projected/c4f5b8db-018b-4951-9aee-c6a851124b14-kube-api-access-lgdgj\") pod \"mariadb-client\" (UID: \"c4f5b8db-018b-4951-9aee-c6a851124b14\") " pod="openstack/mariadb-client"
Dec 09 11:26:17 crc kubenswrapper[5002]: I1209 11:26:17.679216 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgdgj\" (UniqueName: \"kubernetes.io/projected/c4f5b8db-018b-4951-9aee-c6a851124b14-kube-api-access-lgdgj\") pod \"mariadb-client\" (UID: \"c4f5b8db-018b-4951-9aee-c6a851124b14\") " pod="openstack/mariadb-client"
Dec 09 11:26:17 crc kubenswrapper[5002]: I1209 11:26:17.705453 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgdgj\" (UniqueName: \"kubernetes.io/projected/c4f5b8db-018b-4951-9aee-c6a851124b14-kube-api-access-lgdgj\") pod \"mariadb-client\" (UID: \"c4f5b8db-018b-4951-9aee-c6a851124b14\") " pod="openstack/mariadb-client"
Dec 09 11:26:17 crc kubenswrapper[5002]: I1209 11:26:17.782509 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Dec 09 11:26:18 crc kubenswrapper[5002]: I1209 11:26:18.003719 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Dec 09 11:26:18 crc kubenswrapper[5002]: W1209 11:26:18.007640 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4f5b8db_018b_4951_9aee_c6a851124b14.slice/crio-d39f53d85588e2333972f9bdc5d3e9860eada568f6e5bb1e193899c5c924b18c WatchSource:0}: Error finding container d39f53d85588e2333972f9bdc5d3e9860eada568f6e5bb1e193899c5c924b18c: Status 404 returned error can't find the container with id d39f53d85588e2333972f9bdc5d3e9860eada568f6e5bb1e193899c5c924b18c
Dec 09 11:26:18 crc kubenswrapper[5002]: I1209 11:26:18.795567 5002 generic.go:334] "Generic (PLEG): container finished" podID="c4f5b8db-018b-4951-9aee-c6a851124b14" containerID="0f7acec706fd6888f9f7eabc86bf59fe77a708e3db8637ae0bf85e2bcb4c655a" exitCode=0
Dec 09 11:26:18 crc kubenswrapper[5002]: I1209 11:26:18.795659 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"c4f5b8db-018b-4951-9aee-c6a851124b14","Type":"ContainerDied","Data":"0f7acec706fd6888f9f7eabc86bf59fe77a708e3db8637ae0bf85e2bcb4c655a"}
Dec 09 11:26:18 crc kubenswrapper[5002]: I1209 11:26:18.795970 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"c4f5b8db-018b-4951-9aee-c6a851124b14","Type":"ContainerStarted","Data":"d39f53d85588e2333972f9bdc5d3e9860eada568f6e5bb1e193899c5c924b18c"}
Dec 09 11:26:19 crc kubenswrapper[5002]: I1209 11:26:19.302707 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tg2r8"
Dec 09 11:26:19 crc kubenswrapper[5002]: I1209 11:26:19.302772 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tg2r8"
Dec 09 11:26:19 crc kubenswrapper[5002]: I1209 11:26:19.396778 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tg2r8"
Dec 09 11:26:19 crc kubenswrapper[5002]: I1209 11:26:19.855562 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tg2r8"
Dec 09 11:26:19 crc kubenswrapper[5002]: I1209 11:26:19.902685 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tg2r8"]
Dec 09 11:26:21 crc kubenswrapper[5002]: I1209 11:26:21.076532 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Dec 09 11:26:21 crc kubenswrapper[5002]: I1209 11:26:21.112586 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_c4f5b8db-018b-4951-9aee-c6a851124b14/mariadb-client/0.log"
Dec 09 11:26:21 crc kubenswrapper[5002]: I1209 11:26:21.142537 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"]
Dec 09 11:26:21 crc kubenswrapper[5002]: I1209 11:26:21.150240 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"]
Dec 09 11:26:21 crc kubenswrapper[5002]: I1209 11:26:21.152601 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgdgj\" (UniqueName: \"kubernetes.io/projected/c4f5b8db-018b-4951-9aee-c6a851124b14-kube-api-access-lgdgj\") pod \"c4f5b8db-018b-4951-9aee-c6a851124b14\" (UID: \"c4f5b8db-018b-4951-9aee-c6a851124b14\") "
Dec 09 11:26:21 crc kubenswrapper[5002]: I1209 11:26:21.157782 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4f5b8db-018b-4951-9aee-c6a851124b14-kube-api-access-lgdgj" (OuterVolumeSpecName: "kube-api-access-lgdgj") pod "c4f5b8db-018b-4951-9aee-c6a851124b14" (UID: "c4f5b8db-018b-4951-9aee-c6a851124b14"). InnerVolumeSpecName "kube-api-access-lgdgj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 11:26:21 crc kubenswrapper[5002]: I1209 11:26:21.254118 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgdgj\" (UniqueName: \"kubernetes.io/projected/c4f5b8db-018b-4951-9aee-c6a851124b14-kube-api-access-lgdgj\") on node \"crc\" DevicePath \"\""
Dec 09 11:26:21 crc kubenswrapper[5002]: I1209 11:26:21.268430 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"]
Dec 09 11:26:21 crc kubenswrapper[5002]: E1209 11:26:21.269030 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4f5b8db-018b-4951-9aee-c6a851124b14" containerName="mariadb-client"
Dec 09 11:26:21 crc kubenswrapper[5002]: I1209 11:26:21.269058 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4f5b8db-018b-4951-9aee-c6a851124b14" containerName="mariadb-client"
Dec 09 11:26:21 crc kubenswrapper[5002]: I1209 11:26:21.269480 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4f5b8db-018b-4951-9aee-c6a851124b14" containerName="mariadb-client"
Dec 09 11:26:21 crc kubenswrapper[5002]: I1209 11:26:21.270706 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Dec 09 11:26:21 crc kubenswrapper[5002]: I1209 11:26:21.277372 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Dec 09 11:26:21 crc kubenswrapper[5002]: I1209 11:26:21.355582 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dqh5\" (UniqueName: \"kubernetes.io/projected/5a401af7-5a24-4df0-8c1c-d565018a08c3-kube-api-access-8dqh5\") pod \"mariadb-client\" (UID: \"5a401af7-5a24-4df0-8c1c-d565018a08c3\") " pod="openstack/mariadb-client"
Dec 09 11:26:21 crc kubenswrapper[5002]: I1209 11:26:21.461862 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dqh5\" (UniqueName: \"kubernetes.io/projected/5a401af7-5a24-4df0-8c1c-d565018a08c3-kube-api-access-8dqh5\") pod \"mariadb-client\" (UID: \"5a401af7-5a24-4df0-8c1c-d565018a08c3\") " pod="openstack/mariadb-client"
Dec 09 11:26:21 crc kubenswrapper[5002]: I1209 11:26:21.479077 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dqh5\" (UniqueName: \"kubernetes.io/projected/5a401af7-5a24-4df0-8c1c-d565018a08c3-kube-api-access-8dqh5\") pod \"mariadb-client\" (UID: \"5a401af7-5a24-4df0-8c1c-d565018a08c3\") " pod="openstack/mariadb-client"
Dec 09 11:26:21 crc kubenswrapper[5002]: I1209 11:26:21.595993 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Dec 09 11:26:21 crc kubenswrapper[5002]: I1209 11:26:21.818159 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Dec 09 11:26:21 crc kubenswrapper[5002]: I1209 11:26:21.818163 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d39f53d85588e2333972f9bdc5d3e9860eada568f6e5bb1e193899c5c924b18c"
Dec 09 11:26:21 crc kubenswrapper[5002]: I1209 11:26:21.818292 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tg2r8" podUID="055f8e94-e686-4bfe-afe9-a1cdd55fcae4" containerName="registry-server" containerID="cri-o://65eea6c9597a6f99c31d1e83c37974dbdf00b9bc39672e28b07289161405658f" gracePeriod=2
Dec 09 11:26:21 crc kubenswrapper[5002]: I1209 11:26:21.845876 5002 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/mariadb-client" oldPodUID="c4f5b8db-018b-4951-9aee-c6a851124b14" podUID="5a401af7-5a24-4df0-8c1c-d565018a08c3"
Dec 09 11:26:22 crc kubenswrapper[5002]: I1209 11:26:22.079673 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4f5b8db-018b-4951-9aee-c6a851124b14" path="/var/lib/kubelet/pods/c4f5b8db-018b-4951-9aee-c6a851124b14/volumes"
Dec 09 11:26:22 crc kubenswrapper[5002]: I1209 11:26:22.080807 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Dec 09 11:26:22 crc kubenswrapper[5002]: I1209 11:26:22.280288 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tg2r8" Dec 09 11:26:22 crc kubenswrapper[5002]: I1209 11:26:22.379796 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/055f8e94-e686-4bfe-afe9-a1cdd55fcae4-catalog-content\") pod \"055f8e94-e686-4bfe-afe9-a1cdd55fcae4\" (UID: \"055f8e94-e686-4bfe-afe9-a1cdd55fcae4\") " Dec 09 11:26:22 crc kubenswrapper[5002]: I1209 11:26:22.379936 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/055f8e94-e686-4bfe-afe9-a1cdd55fcae4-utilities\") pod \"055f8e94-e686-4bfe-afe9-a1cdd55fcae4\" (UID: \"055f8e94-e686-4bfe-afe9-a1cdd55fcae4\") " Dec 09 11:26:22 crc kubenswrapper[5002]: I1209 11:26:22.380027 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6f4v\" (UniqueName: \"kubernetes.io/projected/055f8e94-e686-4bfe-afe9-a1cdd55fcae4-kube-api-access-h6f4v\") pod \"055f8e94-e686-4bfe-afe9-a1cdd55fcae4\" (UID: \"055f8e94-e686-4bfe-afe9-a1cdd55fcae4\") " Dec 09 11:26:22 crc kubenswrapper[5002]: I1209 11:26:22.380905 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/055f8e94-e686-4bfe-afe9-a1cdd55fcae4-utilities" (OuterVolumeSpecName: "utilities") pod "055f8e94-e686-4bfe-afe9-a1cdd55fcae4" (UID: "055f8e94-e686-4bfe-afe9-a1cdd55fcae4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:26:22 crc kubenswrapper[5002]: I1209 11:26:22.387485 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/055f8e94-e686-4bfe-afe9-a1cdd55fcae4-kube-api-access-h6f4v" (OuterVolumeSpecName: "kube-api-access-h6f4v") pod "055f8e94-e686-4bfe-afe9-a1cdd55fcae4" (UID: "055f8e94-e686-4bfe-afe9-a1cdd55fcae4"). InnerVolumeSpecName "kube-api-access-h6f4v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:26:22 crc kubenswrapper[5002]: I1209 11:26:22.401460 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/055f8e94-e686-4bfe-afe9-a1cdd55fcae4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "055f8e94-e686-4bfe-afe9-a1cdd55fcae4" (UID: "055f8e94-e686-4bfe-afe9-a1cdd55fcae4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:26:22 crc kubenswrapper[5002]: I1209 11:26:22.481113 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/055f8e94-e686-4bfe-afe9-a1cdd55fcae4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 11:26:22 crc kubenswrapper[5002]: I1209 11:26:22.481146 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/055f8e94-e686-4bfe-afe9-a1cdd55fcae4-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 11:26:22 crc kubenswrapper[5002]: I1209 11:26:22.481192 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6f4v\" (UniqueName: \"kubernetes.io/projected/055f8e94-e686-4bfe-afe9-a1cdd55fcae4-kube-api-access-h6f4v\") on node \"crc\" DevicePath \"\"" Dec 09 11:26:22 crc kubenswrapper[5002]: I1209 11:26:22.824912 5002 generic.go:334] "Generic (PLEG): container finished" podID="5a401af7-5a24-4df0-8c1c-d565018a08c3" containerID="29833e54ea38c9e6dd2ba4cc42962c851d100570e218478b0dc10f9edc718553" exitCode=0 Dec 09 11:26:22 crc kubenswrapper[5002]: I1209 11:26:22.825271 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"5a401af7-5a24-4df0-8c1c-d565018a08c3","Type":"ContainerDied","Data":"29833e54ea38c9e6dd2ba4cc42962c851d100570e218478b0dc10f9edc718553"} Dec 09 11:26:22 crc kubenswrapper[5002]: I1209 11:26:22.825304 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"5a401af7-5a24-4df0-8c1c-d565018a08c3","Type":"ContainerStarted","Data":"4599befe403ddca9223f08b483d050a5a7449486fae946e304683992b583bf72"} Dec 09 11:26:22 crc kubenswrapper[5002]: I1209 11:26:22.827672 5002 generic.go:334] "Generic (PLEG): container finished" podID="055f8e94-e686-4bfe-afe9-a1cdd55fcae4" containerID="65eea6c9597a6f99c31d1e83c37974dbdf00b9bc39672e28b07289161405658f" exitCode=0 Dec 09 11:26:22 crc kubenswrapper[5002]: I1209 11:26:22.827723 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tg2r8" event={"ID":"055f8e94-e686-4bfe-afe9-a1cdd55fcae4","Type":"ContainerDied","Data":"65eea6c9597a6f99c31d1e83c37974dbdf00b9bc39672e28b07289161405658f"} Dec 09 11:26:22 crc kubenswrapper[5002]: I1209 11:26:22.827730 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tg2r8" Dec 09 11:26:22 crc kubenswrapper[5002]: I1209 11:26:22.827758 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tg2r8" event={"ID":"055f8e94-e686-4bfe-afe9-a1cdd55fcae4","Type":"ContainerDied","Data":"6b28c7a2e4889e6c93ade32cb4126c5c300c3d6ca470adea38eaaf2fd79f1784"} Dec 09 11:26:22 crc kubenswrapper[5002]: I1209 11:26:22.827783 5002 scope.go:117] "RemoveContainer" containerID="65eea6c9597a6f99c31d1e83c37974dbdf00b9bc39672e28b07289161405658f" Dec 09 11:26:22 crc kubenswrapper[5002]: I1209 11:26:22.857705 5002 scope.go:117] "RemoveContainer" containerID="385470063e34d16fc863ceec58ec34e0cc782a03de0eef479abea72d127d3dcb" Dec 09 11:26:22 crc kubenswrapper[5002]: I1209 11:26:22.878799 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tg2r8"] Dec 09 11:26:22 crc kubenswrapper[5002]: I1209 11:26:22.888087 5002 scope.go:117] "RemoveContainer" containerID="1d95e9b787d15b4f8d1a9d6538ee6c9e99224a99a1e6a264e2d91a41ad3a4595" Dec 09 11:26:22 crc kubenswrapper[5002]: I1209 11:26:22.895780 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tg2r8"] Dec 09 11:26:22 crc kubenswrapper[5002]: I1209 11:26:22.927308 5002 scope.go:117] "RemoveContainer" containerID="65eea6c9597a6f99c31d1e83c37974dbdf00b9bc39672e28b07289161405658f" Dec 09 11:26:22 crc kubenswrapper[5002]: E1209 11:26:22.929144 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65eea6c9597a6f99c31d1e83c37974dbdf00b9bc39672e28b07289161405658f\": container with ID starting with 65eea6c9597a6f99c31d1e83c37974dbdf00b9bc39672e28b07289161405658f not found: ID does not exist" containerID="65eea6c9597a6f99c31d1e83c37974dbdf00b9bc39672e28b07289161405658f" Dec 09 11:26:22 crc kubenswrapper[5002]: I1209 11:26:22.929234 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65eea6c9597a6f99c31d1e83c37974dbdf00b9bc39672e28b07289161405658f"} err="failed to get container status \"65eea6c9597a6f99c31d1e83c37974dbdf00b9bc39672e28b07289161405658f\": rpc error: code = NotFound desc = could not find container \"65eea6c9597a6f99c31d1e83c37974dbdf00b9bc39672e28b07289161405658f\": container with ID starting with 65eea6c9597a6f99c31d1e83c37974dbdf00b9bc39672e28b07289161405658f not found: ID does not exist" Dec 09 11:26:22 crc kubenswrapper[5002]: I1209 11:26:22.929320 5002 scope.go:117] "RemoveContainer" containerID="385470063e34d16fc863ceec58ec34e0cc782a03de0eef479abea72d127d3dcb" Dec 09 11:26:22 crc kubenswrapper[5002]: E1209 11:26:22.929975 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"385470063e34d16fc863ceec58ec34e0cc782a03de0eef479abea72d127d3dcb\": container with ID starting with 385470063e34d16fc863ceec58ec34e0cc782a03de0eef479abea72d127d3dcb not found: ID does not exist" containerID="385470063e34d16fc863ceec58ec34e0cc782a03de0eef479abea72d127d3dcb" Dec 09 11:26:22 crc kubenswrapper[5002]: I1209 11:26:22.930049 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"385470063e34d16fc863ceec58ec34e0cc782a03de0eef479abea72d127d3dcb"} err="failed to get container status \"385470063e34d16fc863ceec58ec34e0cc782a03de0eef479abea72d127d3dcb\": rpc error: code = NotFound desc = could not find 
container \"385470063e34d16fc863ceec58ec34e0cc782a03de0eef479abea72d127d3dcb\": container with ID starting with 385470063e34d16fc863ceec58ec34e0cc782a03de0eef479abea72d127d3dcb not found: ID does not exist" Dec 09 11:26:22 crc kubenswrapper[5002]: I1209 11:26:22.930092 5002 scope.go:117] "RemoveContainer" containerID="1d95e9b787d15b4f8d1a9d6538ee6c9e99224a99a1e6a264e2d91a41ad3a4595" Dec 09 11:26:22 crc kubenswrapper[5002]: E1209 11:26:22.930485 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d95e9b787d15b4f8d1a9d6538ee6c9e99224a99a1e6a264e2d91a41ad3a4595\": container with ID starting with 1d95e9b787d15b4f8d1a9d6538ee6c9e99224a99a1e6a264e2d91a41ad3a4595 not found: ID does not exist" containerID="1d95e9b787d15b4f8d1a9d6538ee6c9e99224a99a1e6a264e2d91a41ad3a4595" Dec 09 11:26:22 crc kubenswrapper[5002]: I1209 11:26:22.930529 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d95e9b787d15b4f8d1a9d6538ee6c9e99224a99a1e6a264e2d91a41ad3a4595"} err="failed to get container status \"1d95e9b787d15b4f8d1a9d6538ee6c9e99224a99a1e6a264e2d91a41ad3a4595\": rpc error: code = NotFound desc = could not find container \"1d95e9b787d15b4f8d1a9d6538ee6c9e99224a99a1e6a264e2d91a41ad3a4595\": container with ID starting with 1d95e9b787d15b4f8d1a9d6538ee6c9e99224a99a1e6a264e2d91a41ad3a4595 not found: ID does not exist" Dec 09 11:26:24 crc kubenswrapper[5002]: I1209 11:26:24.070205 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="055f8e94-e686-4bfe-afe9-a1cdd55fcae4" path="/var/lib/kubelet/pods/055f8e94-e686-4bfe-afe9-a1cdd55fcae4/volumes" Dec 09 11:26:24 crc kubenswrapper[5002]: I1209 11:26:24.187878 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Dec 09 11:26:24 crc kubenswrapper[5002]: I1209 11:26:24.204214 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_5a401af7-5a24-4df0-8c1c-d565018a08c3/mariadb-client/0.log" Dec 09 11:26:24 crc kubenswrapper[5002]: I1209 11:26:24.235090 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Dec 09 11:26:24 crc kubenswrapper[5002]: I1209 11:26:24.241107 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Dec 09 11:26:24 crc kubenswrapper[5002]: I1209 11:26:24.316626 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dqh5\" (UniqueName: \"kubernetes.io/projected/5a401af7-5a24-4df0-8c1c-d565018a08c3-kube-api-access-8dqh5\") pod \"5a401af7-5a24-4df0-8c1c-d565018a08c3\" (UID: \"5a401af7-5a24-4df0-8c1c-d565018a08c3\") " Dec 09 11:26:24 crc kubenswrapper[5002]: I1209 11:26:24.323292 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a401af7-5a24-4df0-8c1c-d565018a08c3-kube-api-access-8dqh5" (OuterVolumeSpecName: "kube-api-access-8dqh5") pod "5a401af7-5a24-4df0-8c1c-d565018a08c3" (UID: "5a401af7-5a24-4df0-8c1c-d565018a08c3"). InnerVolumeSpecName "kube-api-access-8dqh5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:26:24 crc kubenswrapper[5002]: I1209 11:26:24.418559 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dqh5\" (UniqueName: \"kubernetes.io/projected/5a401af7-5a24-4df0-8c1c-d565018a08c3-kube-api-access-8dqh5\") on node \"crc\" DevicePath \"\"" Dec 09 11:26:24 crc kubenswrapper[5002]: I1209 11:26:24.849960 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4599befe403ddca9223f08b483d050a5a7449486fae946e304683992b583bf72" Dec 09 11:26:24 crc kubenswrapper[5002]: I1209 11:26:24.850025 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Dec 09 11:26:26 crc kubenswrapper[5002]: I1209 11:26:26.071496 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a401af7-5a24-4df0-8c1c-d565018a08c3" path="/var/lib/kubelet/pods/5a401af7-5a24-4df0-8c1c-d565018a08c3/volumes" Dec 09 11:26:27 crc kubenswrapper[5002]: I1209 11:26:27.347843 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9nsrf"] Dec 09 11:26:27 crc kubenswrapper[5002]: E1209 11:26:27.348320 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="055f8e94-e686-4bfe-afe9-a1cdd55fcae4" containerName="registry-server" Dec 09 11:26:27 crc kubenswrapper[5002]: I1209 11:26:27.348341 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="055f8e94-e686-4bfe-afe9-a1cdd55fcae4" containerName="registry-server" Dec 09 11:26:27 crc kubenswrapper[5002]: E1209 11:26:27.348386 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a401af7-5a24-4df0-8c1c-d565018a08c3" containerName="mariadb-client" Dec 09 11:26:27 crc kubenswrapper[5002]: I1209 11:26:27.348398 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a401af7-5a24-4df0-8c1c-d565018a08c3" containerName="mariadb-client" Dec 09 11:26:27 crc kubenswrapper[5002]: E1209 11:26:27.348418 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="055f8e94-e686-4bfe-afe9-a1cdd55fcae4" containerName="extract-content" Dec 09 11:26:27 crc kubenswrapper[5002]: I1209 11:26:27.348429 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="055f8e94-e686-4bfe-afe9-a1cdd55fcae4" containerName="extract-content" Dec 09 11:26:27 crc kubenswrapper[5002]: E1209 11:26:27.348451 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="055f8e94-e686-4bfe-afe9-a1cdd55fcae4" containerName="extract-utilities" Dec 09 11:26:27 crc kubenswrapper[5002]: I1209 11:26:27.348462 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="055f8e94-e686-4bfe-afe9-a1cdd55fcae4" containerName="extract-utilities" Dec 09 11:26:27 crc kubenswrapper[5002]: I1209 11:26:27.348765 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a401af7-5a24-4df0-8c1c-d565018a08c3" containerName="mariadb-client" Dec 09 11:26:27 crc kubenswrapper[5002]: I1209 11:26:27.348801 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="055f8e94-e686-4bfe-afe9-a1cdd55fcae4" containerName="registry-server" Dec 09 11:26:27 crc kubenswrapper[5002]: I1209 11:26:27.350573 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9nsrf" Dec 09 11:26:27 crc kubenswrapper[5002]: I1209 11:26:27.355828 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9nsrf"] Dec 09 11:26:27 crc kubenswrapper[5002]: I1209 11:26:27.464955 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67jbt\" (UniqueName: \"kubernetes.io/projected/a80ee9f9-2277-45f3-8ec4-967b8a8b7894-kube-api-access-67jbt\") pod \"certified-operators-9nsrf\" (UID: \"a80ee9f9-2277-45f3-8ec4-967b8a8b7894\") " pod="openshift-marketplace/certified-operators-9nsrf" Dec 09 11:26:27 crc kubenswrapper[5002]: I1209 11:26:27.464996 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a80ee9f9-2277-45f3-8ec4-967b8a8b7894-catalog-content\") pod \"certified-operators-9nsrf\" (UID: \"a80ee9f9-2277-45f3-8ec4-967b8a8b7894\") " pod="openshift-marketplace/certified-operators-9nsrf" Dec 09 11:26:27 crc kubenswrapper[5002]: I1209 11:26:27.465071 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a80ee9f9-2277-45f3-8ec4-967b8a8b7894-utilities\") pod \"certified-operators-9nsrf\" (UID: \"a80ee9f9-2277-45f3-8ec4-967b8a8b7894\") " pod="openshift-marketplace/certified-operators-9nsrf" Dec 09 11:26:27 crc kubenswrapper[5002]: I1209 11:26:27.566272 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67jbt\" (UniqueName: \"kubernetes.io/projected/a80ee9f9-2277-45f3-8ec4-967b8a8b7894-kube-api-access-67jbt\") pod \"certified-operators-9nsrf\" (UID: \"a80ee9f9-2277-45f3-8ec4-967b8a8b7894\") " pod="openshift-marketplace/certified-operators-9nsrf" Dec 09 11:26:27 crc kubenswrapper[5002]: I1209 11:26:27.566488 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a80ee9f9-2277-45f3-8ec4-967b8a8b7894-catalog-content\") pod \"certified-operators-9nsrf\" (UID: \"a80ee9f9-2277-45f3-8ec4-967b8a8b7894\") " pod="openshift-marketplace/certified-operators-9nsrf" Dec 09 11:26:27 crc kubenswrapper[5002]: I1209 11:26:27.566599 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a80ee9f9-2277-45f3-8ec4-967b8a8b7894-utilities\") pod \"certified-operators-9nsrf\" (UID: \"a80ee9f9-2277-45f3-8ec4-967b8a8b7894\") " pod="openshift-marketplace/certified-operators-9nsrf" Dec 09 11:26:27 crc kubenswrapper[5002]: I1209 11:26:27.567088 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a80ee9f9-2277-45f3-8ec4-967b8a8b7894-utilities\") pod \"certified-operators-9nsrf\" (UID: \"a80ee9f9-2277-45f3-8ec4-967b8a8b7894\") " pod="openshift-marketplace/certified-operators-9nsrf" Dec 09 11:26:27 crc kubenswrapper[5002]: I1209 11:26:27.567151 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a80ee9f9-2277-45f3-8ec4-967b8a8b7894-catalog-content\") pod \"certified-operators-9nsrf\" (UID: \"a80ee9f9-2277-45f3-8ec4-967b8a8b7894\") " pod="openshift-marketplace/certified-operators-9nsrf" Dec 09 11:26:27 crc kubenswrapper[5002]: I1209 11:26:27.584893 5002 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-67jbt\" (UniqueName: \"kubernetes.io/projected/a80ee9f9-2277-45f3-8ec4-967b8a8b7894-kube-api-access-67jbt\") pod \"certified-operators-9nsrf\" (UID: \"a80ee9f9-2277-45f3-8ec4-967b8a8b7894\") " pod="openshift-marketplace/certified-operators-9nsrf" Dec 09 11:26:27 crc kubenswrapper[5002]: I1209 11:26:27.675295 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9nsrf" Dec 09 11:26:28 crc kubenswrapper[5002]: I1209 11:26:28.052390 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9nsrf"] Dec 09 11:26:28 crc kubenswrapper[5002]: I1209 11:26:28.895247 5002 generic.go:334] "Generic (PLEG): container finished" podID="a80ee9f9-2277-45f3-8ec4-967b8a8b7894" containerID="39864f87756cbe7d761ecb0769b211b467683be60bf1ee9ddd1edffeefe4a228" exitCode=0 Dec 09 11:26:28 crc kubenswrapper[5002]: I1209 11:26:28.895325 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9nsrf" event={"ID":"a80ee9f9-2277-45f3-8ec4-967b8a8b7894","Type":"ContainerDied","Data":"39864f87756cbe7d761ecb0769b211b467683be60bf1ee9ddd1edffeefe4a228"} Dec 09 11:26:28 crc kubenswrapper[5002]: I1209 11:26:28.895597 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9nsrf" event={"ID":"a80ee9f9-2277-45f3-8ec4-967b8a8b7894","Type":"ContainerStarted","Data":"0ae05ec49609cd5d91feb26f4a43c7307ae7ac62cb605d3fa2b4d0565b3fd1f1"} Dec 09 11:26:29 crc kubenswrapper[5002]: I1209 11:26:29.904755 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9nsrf" event={"ID":"a80ee9f9-2277-45f3-8ec4-967b8a8b7894","Type":"ContainerStarted","Data":"483580d0ddb3c2409dfb448f26dbb378e3a8e7253244e39907a0875ba84eae34"} Dec 09 11:26:30 crc kubenswrapper[5002]: I1209 11:26:30.912871 5002 generic.go:334] "Generic (PLEG): container finished" podID="a80ee9f9-2277-45f3-8ec4-967b8a8b7894" containerID="483580d0ddb3c2409dfb448f26dbb378e3a8e7253244e39907a0875ba84eae34" exitCode=0 Dec 09 11:26:30 crc kubenswrapper[5002]: I1209 11:26:30.912971 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9nsrf" event={"ID":"a80ee9f9-2277-45f3-8ec4-967b8a8b7894","Type":"ContainerDied","Data":"483580d0ddb3c2409dfb448f26dbb378e3a8e7253244e39907a0875ba84eae34"} Dec 09 11:26:31 crc kubenswrapper[5002]: I1209 11:26:31.923891 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9nsrf" event={"ID":"a80ee9f9-2277-45f3-8ec4-967b8a8b7894","Type":"ContainerStarted","Data":"ad265fc9c7550b48bb688a755d8caaf6aa2df24052a188c6043ff14bd2eb1cb3"} Dec 09 11:26:31 crc kubenswrapper[5002]: I1209 11:26:31.952596 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9nsrf" podStartSLOduration=2.535613632 podStartE2EDuration="4.952568278s" podCreationTimestamp="2025-12-09 11:26:27 +0000 UTC" firstStartedPulling="2025-12-09 11:26:28.89735384 +0000 UTC m=+5121.289404921" lastFinishedPulling="2025-12-09 11:26:31.314308486 +0000 UTC m=+5123.706359567" observedRunningTime="2025-12-09 11:26:31.941969965 +0000 UTC m=+5124.334021066" watchObservedRunningTime="2025-12-09 11:26:31.952568278 +0000 UTC m=+5124.344619359" Dec 09 11:26:37 crc kubenswrapper[5002]: I1209 11:26:37.675927 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/certified-operators-9nsrf" Dec 09 11:26:37 crc kubenswrapper[5002]: I1209 11:26:37.676794 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9nsrf" Dec 09 11:26:37 crc kubenswrapper[5002]: I1209 11:26:37.736159 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9nsrf" Dec 09 11:26:38 crc kubenswrapper[5002]: I1209 11:26:38.000629 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9nsrf" Dec 09 11:26:38 crc kubenswrapper[5002]: I1209 11:26:38.052385 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9nsrf"] Dec 09 11:26:39 crc kubenswrapper[5002]: I1209 11:26:39.975437 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9nsrf" podUID="a80ee9f9-2277-45f3-8ec4-967b8a8b7894" containerName="registry-server" containerID="cri-o://ad265fc9c7550b48bb688a755d8caaf6aa2df24052a188c6043ff14bd2eb1cb3" gracePeriod=2 Dec 09 11:26:42 crc kubenswrapper[5002]: I1209 11:26:42.002805 5002 generic.go:334] "Generic (PLEG): container finished" podID="a80ee9f9-2277-45f3-8ec4-967b8a8b7894" containerID="ad265fc9c7550b48bb688a755d8caaf6aa2df24052a188c6043ff14bd2eb1cb3" exitCode=0 Dec 09 11:26:42 crc kubenswrapper[5002]: I1209 11:26:42.002916 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9nsrf" event={"ID":"a80ee9f9-2277-45f3-8ec4-967b8a8b7894","Type":"ContainerDied","Data":"ad265fc9c7550b48bb688a755d8caaf6aa2df24052a188c6043ff14bd2eb1cb3"} Dec 09 11:26:42 crc kubenswrapper[5002]: I1209 11:26:42.277165 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9nsrf" Dec 09 11:26:42 crc kubenswrapper[5002]: I1209 11:26:42.406372 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a80ee9f9-2277-45f3-8ec4-967b8a8b7894-utilities\") pod \"a80ee9f9-2277-45f3-8ec4-967b8a8b7894\" (UID: \"a80ee9f9-2277-45f3-8ec4-967b8a8b7894\") " Dec 09 11:26:42 crc kubenswrapper[5002]: I1209 11:26:42.406666 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67jbt\" (UniqueName: \"kubernetes.io/projected/a80ee9f9-2277-45f3-8ec4-967b8a8b7894-kube-api-access-67jbt\") pod \"a80ee9f9-2277-45f3-8ec4-967b8a8b7894\" (UID: \"a80ee9f9-2277-45f3-8ec4-967b8a8b7894\") " Dec 09 11:26:42 crc kubenswrapper[5002]: I1209 11:26:42.406860 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a80ee9f9-2277-45f3-8ec4-967b8a8b7894-catalog-content\") pod \"a80ee9f9-2277-45f3-8ec4-967b8a8b7894\" (UID: \"a80ee9f9-2277-45f3-8ec4-967b8a8b7894\") " Dec 09 11:26:42 crc kubenswrapper[5002]: I1209 11:26:42.407201 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a80ee9f9-2277-45f3-8ec4-967b8a8b7894-utilities" (OuterVolumeSpecName: "utilities") pod "a80ee9f9-2277-45f3-8ec4-967b8a8b7894" (UID: "a80ee9f9-2277-45f3-8ec4-967b8a8b7894"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:26:42 crc kubenswrapper[5002]: I1209 11:26:42.407388 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a80ee9f9-2277-45f3-8ec4-967b8a8b7894-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 11:26:42 crc kubenswrapper[5002]: I1209 11:26:42.416570 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a80ee9f9-2277-45f3-8ec4-967b8a8b7894-kube-api-access-67jbt" (OuterVolumeSpecName: "kube-api-access-67jbt") pod "a80ee9f9-2277-45f3-8ec4-967b8a8b7894" (UID: "a80ee9f9-2277-45f3-8ec4-967b8a8b7894"). InnerVolumeSpecName "kube-api-access-67jbt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:26:42 crc kubenswrapper[5002]: I1209 11:26:42.458930 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a80ee9f9-2277-45f3-8ec4-967b8a8b7894-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a80ee9f9-2277-45f3-8ec4-967b8a8b7894" (UID: "a80ee9f9-2277-45f3-8ec4-967b8a8b7894"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:26:42 crc kubenswrapper[5002]: I1209 11:26:42.508537 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67jbt\" (UniqueName: \"kubernetes.io/projected/a80ee9f9-2277-45f3-8ec4-967b8a8b7894-kube-api-access-67jbt\") on node \"crc\" DevicePath \"\"" Dec 09 11:26:42 crc kubenswrapper[5002]: I1209 11:26:42.508586 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a80ee9f9-2277-45f3-8ec4-967b8a8b7894-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 11:26:43 crc kubenswrapper[5002]: I1209 11:26:43.015706 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9nsrf" event={"ID":"a80ee9f9-2277-45f3-8ec4-967b8a8b7894","Type":"ContainerDied","Data":"0ae05ec49609cd5d91feb26f4a43c7307ae7ac62cb605d3fa2b4d0565b3fd1f1"} Dec 09 11:26:43 crc kubenswrapper[5002]: I1209 11:26:43.015773 5002 scope.go:117] "RemoveContainer" containerID="ad265fc9c7550b48bb688a755d8caaf6aa2df24052a188c6043ff14bd2eb1cb3" Dec 09 11:26:43 crc kubenswrapper[5002]: I1209 11:26:43.016944 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9nsrf" Dec 09 11:26:43 crc kubenswrapper[5002]: I1209 11:26:43.063524 5002 scope.go:117] "RemoveContainer" containerID="483580d0ddb3c2409dfb448f26dbb378e3a8e7253244e39907a0875ba84eae34" Dec 09 11:26:43 crc kubenswrapper[5002]: I1209 11:26:43.063969 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9nsrf"] Dec 09 11:26:43 crc kubenswrapper[5002]: I1209 11:26:43.069773 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9nsrf"] Dec 09 11:26:43 crc kubenswrapper[5002]: I1209 11:26:43.084881 5002 scope.go:117] "RemoveContainer" containerID="39864f87756cbe7d761ecb0769b211b467683be60bf1ee9ddd1edffeefe4a228" Dec 09 11:26:44 crc kubenswrapper[5002]: I1209 11:26:44.069032 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a80ee9f9-2277-45f3-8ec4-967b8a8b7894" path="/var/lib/kubelet/pods/a80ee9f9-2277-45f3-8ec4-967b8a8b7894/volumes" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.501651 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 09 11:27:23 crc kubenswrapper[5002]: E1209 11:27:23.502475 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a80ee9f9-2277-45f3-8ec4-967b8a8b7894" containerName="extract-utilities" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.502489 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="a80ee9f9-2277-45f3-8ec4-967b8a8b7894" containerName="extract-utilities" Dec 09 11:27:23 crc kubenswrapper[5002]: E1209 11:27:23.502505 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a80ee9f9-2277-45f3-8ec4-967b8a8b7894" containerName="registry-server" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.502511 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="a80ee9f9-2277-45f3-8ec4-967b8a8b7894" containerName="registry-server" Dec 09 11:27:23 crc kubenswrapper[5002]: E1209 11:27:23.502527 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a80ee9f9-2277-45f3-8ec4-967b8a8b7894" containerName="extract-content" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.502536 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="a80ee9f9-2277-45f3-8ec4-967b8a8b7894" containerName="extract-content" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.502677 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="a80ee9f9-2277-45f3-8ec4-967b8a8b7894" containerName="registry-server" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.503447 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.505345 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-wmc2n" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.506118 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.508762 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.515099 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.524527 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.525879 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.531410 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.533020 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.536756 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.551084 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.653023 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmhz9\" (UniqueName: \"kubernetes.io/projected/e01218b6-7004-46d8-91c5-cc3206cc6804-kube-api-access-dmhz9\") pod \"ovsdbserver-nb-1\" (UID: \"e01218b6-7004-46d8-91c5-cc3206cc6804\") " pod="openstack/ovsdbserver-nb-1" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.653172 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2e2b902-40f9-4150-b6c8-e88c10db76fe-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"c2e2b902-40f9-4150-b6c8-e88c10db76fe\") " pod="openstack/ovsdbserver-nb-0" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.653228 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67bf2a52-93b6-4957-97e5-4c41c6b9fcb1-config\") pod \"ovsdbserver-nb-2\" (UID: \"67bf2a52-93b6-4957-97e5-4c41c6b9fcb1\") " pod="openstack/ovsdbserver-nb-2" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.653287 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-aa2e66e7-578c-4aae-88fd-f692a477384f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aa2e66e7-578c-4aae-88fd-f692a477384f\") pod \"ovsdbserver-nb-2\" (UID: \"67bf2a52-93b6-4957-97e5-4c41c6b9fcb1\") " pod="openstack/ovsdbserver-nb-2" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.653416 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67bf2a52-93b6-4957-97e5-4c41c6b9fcb1-scripts\") pod \"ovsdbserver-nb-2\" 
(UID: \"67bf2a52-93b6-4957-97e5-4c41c6b9fcb1\") " pod="openstack/ovsdbserver-nb-2" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.653464 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0657a93b-7b98-4e07-bae8-9a56afa24342\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0657a93b-7b98-4e07-bae8-9a56afa24342\") pod \"ovsdbserver-nb-0\" (UID: \"c2e2b902-40f9-4150-b6c8-e88c10db76fe\") " pod="openstack/ovsdbserver-nb-0" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.653492 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d04518aa-c36e-49fe-84dd-0cb043622f32\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d04518aa-c36e-49fe-84dd-0cb043622f32\") pod \"ovsdbserver-nb-1\" (UID: \"e01218b6-7004-46d8-91c5-cc3206cc6804\") " pod="openstack/ovsdbserver-nb-1" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.653514 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e01218b6-7004-46d8-91c5-cc3206cc6804-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"e01218b6-7004-46d8-91c5-cc3206cc6804\") " pod="openstack/ovsdbserver-nb-1" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.653534 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2e2b902-40f9-4150-b6c8-e88c10db76fe-config\") pod \"ovsdbserver-nb-0\" (UID: \"c2e2b902-40f9-4150-b6c8-e88c10db76fe\") " pod="openstack/ovsdbserver-nb-0" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.653678 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f84jc\" (UniqueName: \"kubernetes.io/projected/67bf2a52-93b6-4957-97e5-4c41c6b9fcb1-kube-api-access-f84jc\") pod \"ovsdbserver-nb-2\" (UID: \"67bf2a52-93b6-4957-97e5-4c41c6b9fcb1\") " pod="openstack/ovsdbserver-nb-2" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.653766 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/67bf2a52-93b6-4957-97e5-4c41c6b9fcb1-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"67bf2a52-93b6-4957-97e5-4c41c6b9fcb1\") " pod="openstack/ovsdbserver-nb-2" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.653892 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c2e2b902-40f9-4150-b6c8-e88c10db76fe-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"c2e2b902-40f9-4150-b6c8-e88c10db76fe\") " pod="openstack/ovsdbserver-nb-0" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.653934 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e01218b6-7004-46d8-91c5-cc3206cc6804-config\") pod \"ovsdbserver-nb-1\" (UID: \"e01218b6-7004-46d8-91c5-cc3206cc6804\") " pod="openstack/ovsdbserver-nb-1" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.653978 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c2e2b902-40f9-4150-b6c8-e88c10db76fe-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"c2e2b902-40f9-4150-b6c8-e88c10db76fe\") " 
pod="openstack/ovsdbserver-nb-0" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.654027 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e01218b6-7004-46d8-91c5-cc3206cc6804-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"e01218b6-7004-46d8-91c5-cc3206cc6804\") " pod="openstack/ovsdbserver-nb-1" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.654134 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j556w\" (UniqueName: \"kubernetes.io/projected/c2e2b902-40f9-4150-b6c8-e88c10db76fe-kube-api-access-j556w\") pod \"ovsdbserver-nb-0\" (UID: \"c2e2b902-40f9-4150-b6c8-e88c10db76fe\") " pod="openstack/ovsdbserver-nb-0" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.654181 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67bf2a52-93b6-4957-97e5-4c41c6b9fcb1-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"67bf2a52-93b6-4957-97e5-4c41c6b9fcb1\") " pod="openstack/ovsdbserver-nb-2" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.654239 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e01218b6-7004-46d8-91c5-cc3206cc6804-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"e01218b6-7004-46d8-91c5-cc3206cc6804\") " pod="openstack/ovsdbserver-nb-1" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.712051 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.718960 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.723167 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.723351 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-8k9t4" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.724040 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.731980 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"] Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.734333 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.739908 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.748993 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"] Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.750220 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-2" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.756805 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e01218b6-7004-46d8-91c5-cc3206cc6804-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"e01218b6-7004-46d8-91c5-cc3206cc6804\") " pod="openstack/ovsdbserver-nb-1" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.756864 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmhz9\" (UniqueName: \"kubernetes.io/projected/e01218b6-7004-46d8-91c5-cc3206cc6804-kube-api-access-dmhz9\") pod \"ovsdbserver-nb-1\" (UID: \"e01218b6-7004-46d8-91c5-cc3206cc6804\") " pod="openstack/ovsdbserver-nb-1" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.756909 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2e2b902-40f9-4150-b6c8-e88c10db76fe-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"c2e2b902-40f9-4150-b6c8-e88c10db76fe\") " pod="openstack/ovsdbserver-nb-0" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.756964 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67bf2a52-93b6-4957-97e5-4c41c6b9fcb1-config\") pod \"ovsdbserver-nb-2\" (UID: \"67bf2a52-93b6-4957-97e5-4c41c6b9fcb1\") " pod="openstack/ovsdbserver-nb-2" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.756994 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-aa2e66e7-578c-4aae-88fd-f692a477384f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aa2e66e7-578c-4aae-88fd-f692a477384f\") pod \"ovsdbserver-nb-2\" (UID: \"67bf2a52-93b6-4957-97e5-4c41c6b9fcb1\") " pod="openstack/ovsdbserver-nb-2" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.757020 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67bf2a52-93b6-4957-97e5-4c41c6b9fcb1-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"67bf2a52-93b6-4957-97e5-4c41c6b9fcb1\") " pod="openstack/ovsdbserver-nb-2" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.757058 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0657a93b-7b98-4e07-bae8-9a56afa24342\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0657a93b-7b98-4e07-bae8-9a56afa24342\") pod \"ovsdbserver-nb-0\" (UID: \"c2e2b902-40f9-4150-b6c8-e88c10db76fe\") " pod="openstack/ovsdbserver-nb-0" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.757092 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d04518aa-c36e-49fe-84dd-0cb043622f32\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d04518aa-c36e-49fe-84dd-0cb043622f32\") pod \"ovsdbserver-nb-1\" (UID: \"e01218b6-7004-46d8-91c5-cc3206cc6804\") " pod="openstack/ovsdbserver-nb-1" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.757120 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e01218b6-7004-46d8-91c5-cc3206cc6804-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"e01218b6-7004-46d8-91c5-cc3206cc6804\") " pod="openstack/ovsdbserver-nb-1" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.757142 5002 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2e2b902-40f9-4150-b6c8-e88c10db76fe-config\") pod \"ovsdbserver-nb-0\" (UID: \"c2e2b902-40f9-4150-b6c8-e88c10db76fe\") " pod="openstack/ovsdbserver-nb-0" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.757169 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f84jc\" (UniqueName: \"kubernetes.io/projected/67bf2a52-93b6-4957-97e5-4c41c6b9fcb1-kube-api-access-f84jc\") pod \"ovsdbserver-nb-2\" (UID: \"67bf2a52-93b6-4957-97e5-4c41c6b9fcb1\") " pod="openstack/ovsdbserver-nb-2" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.757187 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/67bf2a52-93b6-4957-97e5-4c41c6b9fcb1-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"67bf2a52-93b6-4957-97e5-4c41c6b9fcb1\") " pod="openstack/ovsdbserver-nb-2" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.757206 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c2e2b902-40f9-4150-b6c8-e88c10db76fe-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"c2e2b902-40f9-4150-b6c8-e88c10db76fe\") " pod="openstack/ovsdbserver-nb-0" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.757221 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e01218b6-7004-46d8-91c5-cc3206cc6804-config\") pod \"ovsdbserver-nb-1\" (UID: \"e01218b6-7004-46d8-91c5-cc3206cc6804\") " pod="openstack/ovsdbserver-nb-1" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.757241 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c2e2b902-40f9-4150-b6c8-e88c10db76fe-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"c2e2b902-40f9-4150-b6c8-e88c10db76fe\") " pod="openstack/ovsdbserver-nb-0" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.757257 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e01218b6-7004-46d8-91c5-cc3206cc6804-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"e01218b6-7004-46d8-91c5-cc3206cc6804\") " pod="openstack/ovsdbserver-nb-1" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.757286 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j556w\" (UniqueName: \"kubernetes.io/projected/c2e2b902-40f9-4150-b6c8-e88c10db76fe-kube-api-access-j556w\") pod \"ovsdbserver-nb-0\" (UID: \"c2e2b902-40f9-4150-b6c8-e88c10db76fe\") " pod="openstack/ovsdbserver-nb-0" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.757331 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67bf2a52-93b6-4957-97e5-4c41c6b9fcb1-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"67bf2a52-93b6-4957-97e5-4c41c6b9fcb1\") " pod="openstack/ovsdbserver-nb-2" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.761449 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e01218b6-7004-46d8-91c5-cc3206cc6804-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"e01218b6-7004-46d8-91c5-cc3206cc6804\") " pod="openstack/ovsdbserver-nb-1" Dec 09 11:27:23 crc kubenswrapper[5002]: 
I1209 11:27:23.761875 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67bf2a52-93b6-4957-97e5-4c41c6b9fcb1-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"67bf2a52-93b6-4957-97e5-4c41c6b9fcb1\") " pod="openstack/ovsdbserver-nb-2" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.762720 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.763406 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c2e2b902-40f9-4150-b6c8-e88c10db76fe-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"c2e2b902-40f9-4150-b6c8-e88c10db76fe\") " pod="openstack/ovsdbserver-nb-0" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.764271 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c2e2b902-40f9-4150-b6c8-e88c10db76fe-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"c2e2b902-40f9-4150-b6c8-e88c10db76fe\") " pod="openstack/ovsdbserver-nb-0" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.764356 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2e2b902-40f9-4150-b6c8-e88c10db76fe-config\") pod \"ovsdbserver-nb-0\" (UID: \"c2e2b902-40f9-4150-b6c8-e88c10db76fe\") " pod="openstack/ovsdbserver-nb-0" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.765264 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e01218b6-7004-46d8-91c5-cc3206cc6804-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"e01218b6-7004-46d8-91c5-cc3206cc6804\") " pod="openstack/ovsdbserver-nb-1" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.766227 5002 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.766256 5002 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0657a93b-7b98-4e07-bae8-9a56afa24342\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0657a93b-7b98-4e07-bae8-9a56afa24342\") pod \"ovsdbserver-nb-0\" (UID: \"c2e2b902-40f9-4150-b6c8-e88c10db76fe\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ca5d6dd9846a32dae89cae55ab89580abaf987dbf87a5131ad0d8d9b640cd914/globalmount\"" pod="openstack/ovsdbserver-nb-0" Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.766427 5002 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.766453 5002 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d04518aa-c36e-49fe-84dd-0cb043622f32\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d04518aa-c36e-49fe-84dd-0cb043622f32\") pod \"ovsdbserver-nb-1\" (UID: \"e01218b6-7004-46d8-91c5-cc3206cc6804\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d84ade7110ae21521e759a19ff090dbc95f1c9ea109144a6612dd55b923da7f0/globalmount\"" pod="openstack/ovsdbserver-nb-1"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.767244 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67bf2a52-93b6-4957-97e5-4c41c6b9fcb1-config\") pod \"ovsdbserver-nb-2\" (UID: \"67bf2a52-93b6-4957-97e5-4c41c6b9fcb1\") " pod="openstack/ovsdbserver-nb-2"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.767255 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/67bf2a52-93b6-4957-97e5-4c41c6b9fcb1-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"67bf2a52-93b6-4957-97e5-4c41c6b9fcb1\") " pod="openstack/ovsdbserver-nb-2"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.767334 5002 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.767384 5002 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-aa2e66e7-578c-4aae-88fd-f692a477384f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aa2e66e7-578c-4aae-88fd-f692a477384f\") pod \"ovsdbserver-nb-2\" (UID: \"67bf2a52-93b6-4957-97e5-4c41c6b9fcb1\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ad061032ad6aedaa4b708bf692ae2bcab1126cb9dba2ec9cc96ae4870952b0df/globalmount\"" pod="openstack/ovsdbserver-nb-2"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.767969 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e01218b6-7004-46d8-91c5-cc3206cc6804-config\") pod \"ovsdbserver-nb-1\" (UID: \"e01218b6-7004-46d8-91c5-cc3206cc6804\") " pod="openstack/ovsdbserver-nb-1"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.770133 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2e2b902-40f9-4150-b6c8-e88c10db76fe-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"c2e2b902-40f9-4150-b6c8-e88c10db76fe\") " pod="openstack/ovsdbserver-nb-0"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.777008 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"]
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.783407 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e01218b6-7004-46d8-91c5-cc3206cc6804-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"e01218b6-7004-46d8-91c5-cc3206cc6804\") " pod="openstack/ovsdbserver-nb-1"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.787054 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j556w\" (UniqueName: \"kubernetes.io/projected/c2e2b902-40f9-4150-b6c8-e88c10db76fe-kube-api-access-j556w\") pod \"ovsdbserver-nb-0\" (UID: \"c2e2b902-40f9-4150-b6c8-e88c10db76fe\") " pod="openstack/ovsdbserver-nb-0"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.787568 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f84jc\" (UniqueName: \"kubernetes.io/projected/67bf2a52-93b6-4957-97e5-4c41c6b9fcb1-kube-api-access-f84jc\") pod \"ovsdbserver-nb-2\" (UID: \"67bf2a52-93b6-4957-97e5-4c41c6b9fcb1\") " pod="openstack/ovsdbserver-nb-2"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.788264 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmhz9\" (UniqueName: \"kubernetes.io/projected/e01218b6-7004-46d8-91c5-cc3206cc6804-kube-api-access-dmhz9\") pod \"ovsdbserver-nb-1\" (UID: \"e01218b6-7004-46d8-91c5-cc3206cc6804\") " pod="openstack/ovsdbserver-nb-1"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.793607 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67bf2a52-93b6-4957-97e5-4c41c6b9fcb1-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"67bf2a52-93b6-4957-97e5-4c41c6b9fcb1\") " pod="openstack/ovsdbserver-nb-2"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.808625 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-aa2e66e7-578c-4aae-88fd-f692a477384f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aa2e66e7-578c-4aae-88fd-f692a477384f\") pod \"ovsdbserver-nb-2\" (UID: \"67bf2a52-93b6-4957-97e5-4c41c6b9fcb1\") " pod="openstack/ovsdbserver-nb-2"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.809547 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0657a93b-7b98-4e07-bae8-9a56afa24342\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0657a93b-7b98-4e07-bae8-9a56afa24342\") pod \"ovsdbserver-nb-0\" (UID: \"c2e2b902-40f9-4150-b6c8-e88c10db76fe\") " pod="openstack/ovsdbserver-nb-0"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.813246 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d04518aa-c36e-49fe-84dd-0cb043622f32\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d04518aa-c36e-49fe-84dd-0cb043622f32\") pod \"ovsdbserver-nb-1\" (UID: \"e01218b6-7004-46d8-91c5-cc3206cc6804\") " pod="openstack/ovsdbserver-nb-1"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.821853 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.840762 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.850637 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.858091 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b41cb4fe-b7f6-4994-a92c-faa5675b32ce-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"b41cb4fe-b7f6-4994-a92c-faa5675b32ce\") " pod="openstack/ovsdbserver-sb-2"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.858133 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7e5ec030-d723-45ab-b8b2-c5e015b4722c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7e5ec030-d723-45ab-b8b2-c5e015b4722c\") pod \"ovsdbserver-sb-1\" (UID: \"c73129e1-6a6c-485e-8601-f89ab669974d\") " pod="openstack/ovsdbserver-sb-1"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.858190 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zllwq\" (UniqueName: \"kubernetes.io/projected/0d72442d-739b-4835-a59a-f701330dd8d5-kube-api-access-zllwq\") pod \"ovsdbserver-sb-0\" (UID: \"0d72442d-739b-4835-a59a-f701330dd8d5\") " pod="openstack/ovsdbserver-sb-0"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.858215 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c73129e1-6a6c-485e-8601-f89ab669974d-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"c73129e1-6a6c-485e-8601-f89ab669974d\") " pod="openstack/ovsdbserver-sb-1"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.858235 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b41cb4fe-b7f6-4994-a92c-faa5675b32ce-config\") pod \"ovsdbserver-sb-2\" (UID: \"b41cb4fe-b7f6-4994-a92c-faa5675b32ce\") " pod="openstack/ovsdbserver-sb-2"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.858271 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0d72442d-739b-4835-a59a-f701330dd8d5-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"0d72442d-739b-4835-a59a-f701330dd8d5\") " pod="openstack/ovsdbserver-sb-0"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.858296 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-290d6cf4-c797-4aaf-a8d3-f123606544c7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-290d6cf4-c797-4aaf-a8d3-f123606544c7\") pod \"ovsdbserver-sb-0\" (UID: \"0d72442d-739b-4835-a59a-f701330dd8d5\") " pod="openstack/ovsdbserver-sb-0"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.858328 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcf6c\" (UniqueName: \"kubernetes.io/projected/b41cb4fe-b7f6-4994-a92c-faa5675b32ce-kube-api-access-hcf6c\") pod \"ovsdbserver-sb-2\" (UID: \"b41cb4fe-b7f6-4994-a92c-faa5675b32ce\") " pod="openstack/ovsdbserver-sb-2"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.858351 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d72442d-739b-4835-a59a-f701330dd8d5-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"0d72442d-739b-4835-a59a-f701330dd8d5\") " pod="openstack/ovsdbserver-sb-0"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.858369 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0d72442d-739b-4835-a59a-f701330dd8d5-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"0d72442d-739b-4835-a59a-f701330dd8d5\") " pod="openstack/ovsdbserver-sb-0"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.858383 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c73129e1-6a6c-485e-8601-f89ab669974d-config\") pod \"ovsdbserver-sb-1\" (UID: \"c73129e1-6a6c-485e-8601-f89ab669974d\") " pod="openstack/ovsdbserver-sb-1"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.858403 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d72442d-739b-4835-a59a-f701330dd8d5-config\") pod \"ovsdbserver-sb-0\" (UID: \"0d72442d-739b-4835-a59a-f701330dd8d5\") " pod="openstack/ovsdbserver-sb-0"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.858962 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-77a1c0b3-8510-43e3-a896-cf447219d50d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-77a1c0b3-8510-43e3-a896-cf447219d50d\") pod \"ovsdbserver-sb-2\" (UID: \"b41cb4fe-b7f6-4994-a92c-faa5675b32ce\") " pod="openstack/ovsdbserver-sb-2"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.859079 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hxpf\" (UniqueName: \"kubernetes.io/projected/c73129e1-6a6c-485e-8601-f89ab669974d-kube-api-access-9hxpf\") pod \"ovsdbserver-sb-1\" (UID: \"c73129e1-6a6c-485e-8601-f89ab669974d\") " pod="openstack/ovsdbserver-sb-1"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.859114 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c73129e1-6a6c-485e-8601-f89ab669974d-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"c73129e1-6a6c-485e-8601-f89ab669974d\") " pod="openstack/ovsdbserver-sb-1"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.859152 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b41cb4fe-b7f6-4994-a92c-faa5675b32ce-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"b41cb4fe-b7f6-4994-a92c-faa5675b32ce\") " pod="openstack/ovsdbserver-sb-2"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.859182 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c73129e1-6a6c-485e-8601-f89ab669974d-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"c73129e1-6a6c-485e-8601-f89ab669974d\") " pod="openstack/ovsdbserver-sb-1"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.859246 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b41cb4fe-b7f6-4994-a92c-faa5675b32ce-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"b41cb4fe-b7f6-4994-a92c-faa5675b32ce\") " pod="openstack/ovsdbserver-sb-2"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.960928 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d72442d-739b-4835-a59a-f701330dd8d5-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"0d72442d-739b-4835-a59a-f701330dd8d5\") " pod="openstack/ovsdbserver-sb-0"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.960974 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0d72442d-739b-4835-a59a-f701330dd8d5-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"0d72442d-739b-4835-a59a-f701330dd8d5\") " pod="openstack/ovsdbserver-sb-0"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.960994 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c73129e1-6a6c-485e-8601-f89ab669974d-config\") pod \"ovsdbserver-sb-1\" (UID: \"c73129e1-6a6c-485e-8601-f89ab669974d\") " pod="openstack/ovsdbserver-sb-1"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.961017 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d72442d-739b-4835-a59a-f701330dd8d5-config\") pod \"ovsdbserver-sb-0\" (UID: \"0d72442d-739b-4835-a59a-f701330dd8d5\") " pod="openstack/ovsdbserver-sb-0"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.961059 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-77a1c0b3-8510-43e3-a896-cf447219d50d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-77a1c0b3-8510-43e3-a896-cf447219d50d\") pod \"ovsdbserver-sb-2\" (UID: \"b41cb4fe-b7f6-4994-a92c-faa5675b32ce\") " pod="openstack/ovsdbserver-sb-2"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.961088 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hxpf\" (UniqueName: \"kubernetes.io/projected/c73129e1-6a6c-485e-8601-f89ab669974d-kube-api-access-9hxpf\") pod \"ovsdbserver-sb-1\" (UID: \"c73129e1-6a6c-485e-8601-f89ab669974d\") " pod="openstack/ovsdbserver-sb-1"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.961106 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c73129e1-6a6c-485e-8601-f89ab669974d-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"c73129e1-6a6c-485e-8601-f89ab669974d\") " pod="openstack/ovsdbserver-sb-1"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.961123 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b41cb4fe-b7f6-4994-a92c-faa5675b32ce-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"b41cb4fe-b7f6-4994-a92c-faa5675b32ce\") " pod="openstack/ovsdbserver-sb-2"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.961143 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c73129e1-6a6c-485e-8601-f89ab669974d-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"c73129e1-6a6c-485e-8601-f89ab669974d\") " pod="openstack/ovsdbserver-sb-1"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.961166 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b41cb4fe-b7f6-4994-a92c-faa5675b32ce-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"b41cb4fe-b7f6-4994-a92c-faa5675b32ce\") " pod="openstack/ovsdbserver-sb-2"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.961190 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b41cb4fe-b7f6-4994-a92c-faa5675b32ce-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"b41cb4fe-b7f6-4994-a92c-faa5675b32ce\") " pod="openstack/ovsdbserver-sb-2"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.961209 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7e5ec030-d723-45ab-b8b2-c5e015b4722c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7e5ec030-d723-45ab-b8b2-c5e015b4722c\") pod \"ovsdbserver-sb-1\" (UID: \"c73129e1-6a6c-485e-8601-f89ab669974d\") " pod="openstack/ovsdbserver-sb-1"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.961228 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zllwq\" (UniqueName: \"kubernetes.io/projected/0d72442d-739b-4835-a59a-f701330dd8d5-kube-api-access-zllwq\") pod \"ovsdbserver-sb-0\" (UID: \"0d72442d-739b-4835-a59a-f701330dd8d5\") " pod="openstack/ovsdbserver-sb-0"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.961244 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c73129e1-6a6c-485e-8601-f89ab669974d-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"c73129e1-6a6c-485e-8601-f89ab669974d\") " pod="openstack/ovsdbserver-sb-1"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.961259 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b41cb4fe-b7f6-4994-a92c-faa5675b32ce-config\") pod \"ovsdbserver-sb-2\" (UID: \"b41cb4fe-b7f6-4994-a92c-faa5675b32ce\") " pod="openstack/ovsdbserver-sb-2"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.961283 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0d72442d-739b-4835-a59a-f701330dd8d5-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"0d72442d-739b-4835-a59a-f701330dd8d5\") " pod="openstack/ovsdbserver-sb-0"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.961303 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-290d6cf4-c797-4aaf-a8d3-f123606544c7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-290d6cf4-c797-4aaf-a8d3-f123606544c7\") pod \"ovsdbserver-sb-0\" (UID: \"0d72442d-739b-4835-a59a-f701330dd8d5\") " pod="openstack/ovsdbserver-sb-0"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.961331 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcf6c\" (UniqueName: \"kubernetes.io/projected/b41cb4fe-b7f6-4994-a92c-faa5675b32ce-kube-api-access-hcf6c\") pod \"ovsdbserver-sb-2\" (UID: \"b41cb4fe-b7f6-4994-a92c-faa5675b32ce\") " pod="openstack/ovsdbserver-sb-2"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.962195 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d72442d-739b-4835-a59a-f701330dd8d5-config\") pod \"ovsdbserver-sb-0\" (UID: \"0d72442d-739b-4835-a59a-f701330dd8d5\") " pod="openstack/ovsdbserver-sb-0"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.962253 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0d72442d-739b-4835-a59a-f701330dd8d5-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"0d72442d-739b-4835-a59a-f701330dd8d5\") " pod="openstack/ovsdbserver-sb-0"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.963055 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c73129e1-6a6c-485e-8601-f89ab669974d-config\") pod \"ovsdbserver-sb-1\" (UID: \"c73129e1-6a6c-485e-8601-f89ab669974d\") " pod="openstack/ovsdbserver-sb-1"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.967197 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b41cb4fe-b7f6-4994-a92c-faa5675b32ce-config\") pod \"ovsdbserver-sb-2\" (UID: \"b41cb4fe-b7f6-4994-a92c-faa5675b32ce\") " pod="openstack/ovsdbserver-sb-2"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.967481 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0d72442d-739b-4835-a59a-f701330dd8d5-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"0d72442d-739b-4835-a59a-f701330dd8d5\") " pod="openstack/ovsdbserver-sb-0"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.967795 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c73129e1-6a6c-485e-8601-f89ab669974d-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"c73129e1-6a6c-485e-8601-f89ab669974d\") " pod="openstack/ovsdbserver-sb-1"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.967891 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c73129e1-6a6c-485e-8601-f89ab669974d-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"c73129e1-6a6c-485e-8601-f89ab669974d\") " pod="openstack/ovsdbserver-sb-1"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.969303 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b41cb4fe-b7f6-4994-a92c-faa5675b32ce-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"b41cb4fe-b7f6-4994-a92c-faa5675b32ce\") " pod="openstack/ovsdbserver-sb-2"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.969363 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b41cb4fe-b7f6-4994-a92c-faa5675b32ce-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"b41cb4fe-b7f6-4994-a92c-faa5675b32ce\") " pod="openstack/ovsdbserver-sb-2"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.969421 5002 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.969442 5002 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7e5ec030-d723-45ab-b8b2-c5e015b4722c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7e5ec030-d723-45ab-b8b2-c5e015b4722c\") pod \"ovsdbserver-sb-1\" (UID: \"c73129e1-6a6c-485e-8601-f89ab669974d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/52e22221953944820f2f2431df6f564e78888302a4334ae8e23f0cfe67211742/globalmount\"" pod="openstack/ovsdbserver-sb-1"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.970965 5002 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.970987 5002 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-290d6cf4-c797-4aaf-a8d3-f123606544c7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-290d6cf4-c797-4aaf-a8d3-f123606544c7\") pod \"ovsdbserver-sb-0\" (UID: \"0d72442d-739b-4835-a59a-f701330dd8d5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/db5d813e942012e9e189b8bf9d8808a4240033e1312271c5981b33c00e6165bf/globalmount\"" pod="openstack/ovsdbserver-sb-0"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.972320 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b41cb4fe-b7f6-4994-a92c-faa5675b32ce-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"b41cb4fe-b7f6-4994-a92c-faa5675b32ce\") " pod="openstack/ovsdbserver-sb-2"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.973317 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c73129e1-6a6c-485e-8601-f89ab669974d-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"c73129e1-6a6c-485e-8601-f89ab669974d\") " pod="openstack/ovsdbserver-sb-1"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.980221 5002 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.980252 5002 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-77a1c0b3-8510-43e3-a896-cf447219d50d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-77a1c0b3-8510-43e3-a896-cf447219d50d\") pod \"ovsdbserver-sb-2\" (UID: \"b41cb4fe-b7f6-4994-a92c-faa5675b32ce\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/cb853b0d5f8e6fadb7940e9f486d201db6f301d07439e3d57e132d45d06e320f/globalmount\"" pod="openstack/ovsdbserver-sb-2"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.982301 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d72442d-739b-4835-a59a-f701330dd8d5-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"0d72442d-739b-4835-a59a-f701330dd8d5\") " pod="openstack/ovsdbserver-sb-0"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.988829 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zllwq\" (UniqueName: \"kubernetes.io/projected/0d72442d-739b-4835-a59a-f701330dd8d5-kube-api-access-zllwq\") pod \"ovsdbserver-sb-0\" (UID: \"0d72442d-739b-4835-a59a-f701330dd8d5\") " pod="openstack/ovsdbserver-sb-0"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.989493 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hxpf\" (UniqueName: \"kubernetes.io/projected/c73129e1-6a6c-485e-8601-f89ab669974d-kube-api-access-9hxpf\") pod \"ovsdbserver-sb-1\" (UID: \"c73129e1-6a6c-485e-8601-f89ab669974d\") " pod="openstack/ovsdbserver-sb-1"
Dec 09 11:27:23 crc kubenswrapper[5002]: I1209 11:27:23.992170 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcf6c\" (UniqueName: \"kubernetes.io/projected/b41cb4fe-b7f6-4994-a92c-faa5675b32ce-kube-api-access-hcf6c\") pod \"ovsdbserver-sb-2\" (UID: \"b41cb4fe-b7f6-4994-a92c-faa5675b32ce\") " pod="openstack/ovsdbserver-sb-2"
Dec 09 11:27:24 crc kubenswrapper[5002]: I1209 11:27:24.009967 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7e5ec030-d723-45ab-b8b2-c5e015b4722c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7e5ec030-d723-45ab-b8b2-c5e015b4722c\") pod \"ovsdbserver-sb-1\" (UID: \"c73129e1-6a6c-485e-8601-f89ab669974d\") " pod="openstack/ovsdbserver-sb-1"
Dec 09 11:27:24 crc kubenswrapper[5002]: I1209 11:27:24.013457 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-290d6cf4-c797-4aaf-a8d3-f123606544c7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-290d6cf4-c797-4aaf-a8d3-f123606544c7\") pod \"ovsdbserver-sb-0\" (UID: \"0d72442d-739b-4835-a59a-f701330dd8d5\") " pod="openstack/ovsdbserver-sb-0"
Dec 09 11:27:24 crc kubenswrapper[5002]: I1209 11:27:24.023379 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-77a1c0b3-8510-43e3-a896-cf447219d50d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-77a1c0b3-8510-43e3-a896-cf447219d50d\") pod \"ovsdbserver-sb-2\" (UID: \"b41cb4fe-b7f6-4994-a92c-faa5675b32ce\") " pod="openstack/ovsdbserver-sb-2"
Dec 09 11:27:24 crc kubenswrapper[5002]: I1209 11:27:24.053626 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Dec 09 11:27:24 crc kubenswrapper[5002]: I1209 11:27:24.070022 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1"
Dec 09 11:27:24 crc kubenswrapper[5002]: I1209 11:27:24.240617 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2"
Dec 09 11:27:24 crc kubenswrapper[5002]: I1209 11:27:24.501873 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Dec 09 11:27:24 crc kubenswrapper[5002]: I1209 11:27:24.589410 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"]
Dec 09 11:27:24 crc kubenswrapper[5002]: I1209 11:27:24.686886 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"]
Dec 09 11:27:24 crc kubenswrapper[5002]: W1209 11:27:24.704612 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc73129e1_6a6c_485e_8601_f89ab669974d.slice/crio-da42cfed6ae37b40c4284baa870a03ff3a5624b5413592abb624f7b5a0da43d5 WatchSource:0}: Error finding container da42cfed6ae37b40c4284baa870a03ff3a5624b5413592abb624f7b5a0da43d5: Status 404 returned error can't find the container with id da42cfed6ae37b40c4284baa870a03ff3a5624b5413592abb624f7b5a0da43d5
Dec 09 11:27:24 crc kubenswrapper[5002]: W1209 11:27:24.885955 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb41cb4fe_b7f6_4994_a92c_faa5675b32ce.slice/crio-414700709b8fc9eb65ecd0a4a887dabadb8375a1a87a51164f56d6efcf4328a6 WatchSource:0}: Error finding container 414700709b8fc9eb65ecd0a4a887dabadb8375a1a87a51164f56d6efcf4328a6: Status 404 returned error can't find the container with id 414700709b8fc9eb65ecd0a4a887dabadb8375a1a87a51164f56d6efcf4328a6
Dec 09 11:27:24 crc kubenswrapper[5002]: I1209 11:27:24.888155 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"]
Dec 09 11:27:25 crc kubenswrapper[5002]: I1209 11:27:25.277050 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Dec 09 11:27:25 crc kubenswrapper[5002]: W1209 11:27:25.282984 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d72442d_739b_4835_a59a_f701330dd8d5.slice/crio-4caac02f10f4274ae40f20abebf78d93ec26c9dd903e7df28b8278a30e92eab4 WatchSource:0}: Error finding container 4caac02f10f4274ae40f20abebf78d93ec26c9dd903e7df28b8278a30e92eab4: Status 404 returned error can't find the container with id 4caac02f10f4274ae40f20abebf78d93ec26c9dd903e7df28b8278a30e92eab4
Dec 09 11:27:25 crc kubenswrapper[5002]: I1209 11:27:25.380608 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"c2e2b902-40f9-4150-b6c8-e88c10db76fe","Type":"ContainerStarted","Data":"2736d4aa24ed0b7702f02e6d7a29e89d625cf90ff7af6f33a7b3df3b6ba98926"}
Dec 09 11:27:25 crc kubenswrapper[5002]: I1209 11:27:25.380950 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"c2e2b902-40f9-4150-b6c8-e88c10db76fe","Type":"ContainerStarted","Data":"a6301b9ab99067541df0e6efe7c83407316883ccf9f9e6b3a7940b6c1fc154bb"}
Dec 09 11:27:25 crc kubenswrapper[5002]: I1209 11:27:25.380998 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"c2e2b902-40f9-4150-b6c8-e88c10db76fe","Type":"ContainerStarted","Data":"0835d33d3dc08f0feba7917b863648950fbe5ef0b61b78cad0c2614bff3ea48e"}
Dec 09 11:27:25 crc kubenswrapper[5002]: I1209 11:27:25.381552 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"0d72442d-739b-4835-a59a-f701330dd8d5","Type":"ContainerStarted","Data":"4caac02f10f4274ae40f20abebf78d93ec26c9dd903e7df28b8278a30e92eab4"}
Dec 09 11:27:25 crc kubenswrapper[5002]: I1209 11:27:25.383188 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"e01218b6-7004-46d8-91c5-cc3206cc6804","Type":"ContainerStarted","Data":"1918980af768119a31de10cc30f2f1af5d23b8d7dc32561461c321176ee79bf8"}
Dec 09 11:27:25 crc kubenswrapper[5002]: I1209 11:27:25.383226 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"e01218b6-7004-46d8-91c5-cc3206cc6804","Type":"ContainerStarted","Data":"cec6ab1569fe823362db803c1bc86db2a95a84237756ad4c4e8f149630ae282e"}
Dec 09 11:27:25 crc kubenswrapper[5002]: I1209 11:27:25.383239 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"e01218b6-7004-46d8-91c5-cc3206cc6804","Type":"ContainerStarted","Data":"bf93d6ec416fdb9efb58916cf622726f8708a6c070915c5faa3b4bd963147380"}
Dec 09 11:27:25 crc kubenswrapper[5002]: I1209 11:27:25.384937 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"c73129e1-6a6c-485e-8601-f89ab669974d","Type":"ContainerStarted","Data":"503ecf9b62f8ea2d77d2d51042d1b746a1c285e310b09eb46d81d6745f6594fd"}
Dec 09 11:27:25 crc kubenswrapper[5002]: I1209 11:27:25.384961 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"c73129e1-6a6c-485e-8601-f89ab669974d","Type":"ContainerStarted","Data":"c290afea035758678da7054937eb73a1292705c4e65f187b3a5c9c57249b694d"}
Dec 09 11:27:25 crc kubenswrapper[5002]: I1209 11:27:25.384969 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"c73129e1-6a6c-485e-8601-f89ab669974d","Type":"ContainerStarted","Data":"da42cfed6ae37b40c4284baa870a03ff3a5624b5413592abb624f7b5a0da43d5"}
Dec 09 11:27:25 crc kubenswrapper[5002]: I1209 11:27:25.386540 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"b41cb4fe-b7f6-4994-a92c-faa5675b32ce","Type":"ContainerStarted","Data":"c644743ae93aa9f293c2a80a2786a0a20b5273ac46da56b7692d97bc76d55468"}
Dec 09 11:27:25 crc kubenswrapper[5002]: I1209 11:27:25.386559 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"b41cb4fe-b7f6-4994-a92c-faa5675b32ce","Type":"ContainerStarted","Data":"ae8a5191d7b7e2b326908bee6222c5fc1fb68f54f2e2085737e9938b191b809b"}
Dec 09 11:27:25 crc kubenswrapper[5002]: I1209 11:27:25.386568 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"b41cb4fe-b7f6-4994-a92c-faa5675b32ce","Type":"ContainerStarted","Data":"414700709b8fc9eb65ecd0a4a887dabadb8375a1a87a51164f56d6efcf4328a6"}
Dec 09 11:27:25 crc kubenswrapper[5002]: I1209 11:27:25.398825 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=3.39878692 podStartE2EDuration="3.39878692s" podCreationTimestamp="2025-12-09 11:27:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:27:25.397299171 +0000 UTC m=+5177.789350252" watchObservedRunningTime="2025-12-09 11:27:25.39878692 +0000 UTC m=+5177.790838001"
Dec 09 11:27:25 crc kubenswrapper[5002]: I1209 11:27:25.426059 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=3.426038279 podStartE2EDuration="3.426038279s" podCreationTimestamp="2025-12-09 11:27:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:27:25.420238774 +0000 UTC m=+5177.812289875" watchObservedRunningTime="2025-12-09 11:27:25.426038279 +0000 UTC m=+5177.818089370"
Dec 09 11:27:25 crc kubenswrapper[5002]: I1209 11:27:25.450990 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=3.450972536 podStartE2EDuration="3.450972536s" podCreationTimestamp="2025-12-09 11:27:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:27:25.440703411 +0000 UTC m=+5177.832754492" watchObservedRunningTime="2025-12-09 11:27:25.450972536 +0000 UTC m=+5177.843023617"
Dec 09 11:27:25 crc kubenswrapper[5002]: I1209 11:27:25.473465 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=3.473442927 podStartE2EDuration="3.473442927s" podCreationTimestamp="2025-12-09 11:27:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:27:25.460239264 +0000 UTC m=+5177.852290355" watchObservedRunningTime="2025-12-09 11:27:25.473442927 +0000 UTC m=+5177.865494008"
Dec 09 11:27:26 crc kubenswrapper[5002]: I1209 11:27:26.310925 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"]
Dec 09 11:27:26 crc kubenswrapper[5002]: I1209 11:27:26.397950 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"0d72442d-739b-4835-a59a-f701330dd8d5","Type":"ContainerStarted","Data":"6d42b6721f484b3b5797729727ccc2a8decac58a8671e8451a3e7d23f263276d"}
Dec 09 11:27:26 crc kubenswrapper[5002]: I1209 11:27:26.398015 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"0d72442d-739b-4835-a59a-f701330dd8d5","Type":"ContainerStarted","Data":"dc3c8e6404c3c6f679de57efe786632de72718908ec238447cd0d9ba7100f8d0"}
Dec 09 11:27:26 crc kubenswrapper[5002]: I1209 11:27:26.402028 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"67bf2a52-93b6-4957-97e5-4c41c6b9fcb1","Type":"ContainerStarted","Data":"46bf23bfa3952085cb4e5087bd83c8ea73066443e9fd356a6899c3b2d505f906"}
Dec 09 11:27:26 crc kubenswrapper[5002]: I1209 11:27:26.425733 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.425683446 podStartE2EDuration="4.425683446s" podCreationTimestamp="2025-12-09 11:27:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:27:26.42022141 +0000 UTC m=+5178.812272491" watchObservedRunningTime="2025-12-09 11:27:26.425683446 +0000 UTC m=+5178.817734527"
Dec 09 11:27:26 crc kubenswrapper[5002]: I1209 11:27:26.822754 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Dec 09 11:27:26 crc kubenswrapper[5002]: I1209 11:27:26.841912 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1"
Dec 09 11:27:27 crc kubenswrapper[5002]: I1209 11:27:27.054706 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Dec 09 11:27:27 crc kubenswrapper[5002]: I1209 11:27:27.070319 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1"
Dec 09 11:27:27 crc kubenswrapper[5002]: I1209 11:27:27.241918 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2"
Dec 09 11:27:27 crc kubenswrapper[5002]: I1209 11:27:27.419589 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"67bf2a52-93b6-4957-97e5-4c41c6b9fcb1","Type":"ContainerStarted","Data":"0e050e8154c7cc43d637829ff57220bcce8baf3d54952d032c0f6032dca08c98"}
Dec 09 11:27:27 crc kubenswrapper[5002]: I1209 11:27:27.419676 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"67bf2a52-93b6-4957-97e5-4c41c6b9fcb1","Type":"ContainerStarted","Data":"e6757133260890d18c7831366ca8a47f3edcd10acdf768b6f049d0d5056c3ac9"}
Dec 09 11:27:27 crc kubenswrapper[5002]: I1209 11:27:27.446871 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=5.446847858 podStartE2EDuration="5.446847858s" podCreationTimestamp="2025-12-09 11:27:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:27:27.44541452 +0000 UTC m=+5179.837465621" watchObservedRunningTime="2025-12-09 11:27:27.446847858 +0000 UTC m=+5179.838898949"
Dec 09 11:27:28 crc kubenswrapper[5002]: I1209 11:27:28.822013 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Dec 09 11:27:28 crc kubenswrapper[5002]: I1209 11:27:28.841433 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1"
Dec 09 11:27:28 crc kubenswrapper[5002]: I1209 11:27:28.851115 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2"
Dec 09 11:27:29 crc kubenswrapper[5002]: I1209 11:27:29.054593 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Dec 09 11:27:29 crc kubenswrapper[5002]: I1209 11:27:29.070997 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1"
Dec 09 11:27:29 crc kubenswrapper[5002]: I1209 11:27:29.241757 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2"
Dec 09 11:27:29 crc kubenswrapper[5002]: I1209 11:27:29.851669 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2"
Dec 09 11:27:29 crc kubenswrapper[5002]: I1209 11:27:29.885594 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Dec 09 11:27:29 crc kubenswrapper[5002]: I1209 11:27:29.895394 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2"
Dec 09 11:27:29 crc kubenswrapper[5002]: I1209 11:27:29.915026 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1"
Dec 09 11:27:29 crc kubenswrapper[5002]: I1209 11:27:29.960931 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Dec 09 11:27:29 crc kubenswrapper[5002]: I1209 11:27:29.981164 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1"
Dec 09 11:27:30 crc kubenswrapper[5002]: I1209 11:27:30.114959 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Dec 09 11:27:30 crc kubenswrapper[5002]: I1209 11:27:30.134314 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1"
Dec 09 11:27:30 crc kubenswrapper[5002]: I1209 11:27:30.158419 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-64848558ff-db5fm"]
Dec 09 11:27:30 crc kubenswrapper[5002]: I1209 11:27:30.161457 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64848558ff-db5fm"
Dec 09 11:27:30 crc kubenswrapper[5002]: I1209 11:27:30.167561 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Dec 09 11:27:30 crc kubenswrapper[5002]: I1209 11:27:30.169713 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64848558ff-db5fm"]
Dec 09 11:27:30 crc kubenswrapper[5002]: I1209 11:27:30.196074 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1"
Dec 09 11:27:30 crc kubenswrapper[5002]: I1209 11:27:30.272473 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldvnr\" (UniqueName: \"kubernetes.io/projected/eaddbcf0-f8ae-4e68-88e9-802f82d13438-kube-api-access-ldvnr\") pod \"dnsmasq-dns-64848558ff-db5fm\" (UID: \"eaddbcf0-f8ae-4e68-88e9-802f82d13438\") " pod="openstack/dnsmasq-dns-64848558ff-db5fm"
Dec 09 11:27:30 crc kubenswrapper[5002]: I1209 11:27:30.272519 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eaddbcf0-f8ae-4e68-88e9-802f82d13438-config\") pod \"dnsmasq-dns-64848558ff-db5fm\" (UID: \"eaddbcf0-f8ae-4e68-88e9-802f82d13438\") " pod="openstack/dnsmasq-dns-64848558ff-db5fm"
Dec 09 11:27:30 crc kubenswrapper[5002]: I1209 11:27:30.272595 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eaddbcf0-f8ae-4e68-88e9-802f82d13438-dns-svc\") pod \"dnsmasq-dns-64848558ff-db5fm\" (UID: \"eaddbcf0-f8ae-4e68-88e9-802f82d13438\") " pod="openstack/dnsmasq-dns-64848558ff-db5fm"
Dec 09 11:27:30 crc kubenswrapper[5002]: I1209 11:27:30.272653 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eaddbcf0-f8ae-4e68-88e9-802f82d13438-ovsdbserver-nb\") pod \"dnsmasq-dns-64848558ff-db5fm\" (UID: \"eaddbcf0-f8ae-4e68-88e9-802f82d13438\") " pod="openstack/dnsmasq-dns-64848558ff-db5fm"
Dec 09 11:27:30 crc kubenswrapper[5002]: I1209 11:27:30.274761 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2"
Dec 09 11:27:30 crc kubenswrapper[5002]: I1209 11:27:30.309570 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2"
Dec 09 11:27:30 crc kubenswrapper[5002]: I1209 11:27:30.374201 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eaddbcf0-f8ae-4e68-88e9-802f82d13438-dns-svc\") pod \"dnsmasq-dns-64848558ff-db5fm\" (UID: \"eaddbcf0-f8ae-4e68-88e9-802f82d13438\") " pod="openstack/dnsmasq-dns-64848558ff-db5fm"
Dec 09 11:27:30 crc kubenswrapper[5002]: I1209 11:27:30.374320 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eaddbcf0-f8ae-4e68-88e9-802f82d13438-ovsdbserver-nb\") pod \"dnsmasq-dns-64848558ff-db5fm\" (UID: \"eaddbcf0-f8ae-4e68-88e9-802f82d13438\") " pod="openstack/dnsmasq-dns-64848558ff-db5fm"
Dec 09 11:27:30 crc kubenswrapper[5002]: I1209 11:27:30.374716 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldvnr\" (UniqueName: \"kubernetes.io/projected/eaddbcf0-f8ae-4e68-88e9-802f82d13438-kube-api-access-ldvnr\") pod \"dnsmasq-dns-64848558ff-db5fm\" (UID: \"eaddbcf0-f8ae-4e68-88e9-802f82d13438\") " pod="openstack/dnsmasq-dns-64848558ff-db5fm"
Dec 09 11:27:30 crc kubenswrapper[5002]: I1209 11:27:30.374740 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eaddbcf0-f8ae-4e68-88e9-802f82d13438-config\") pod \"dnsmasq-dns-64848558ff-db5fm\" (UID: \"eaddbcf0-f8ae-4e68-88e9-802f82d13438\") " pod="openstack/dnsmasq-dns-64848558ff-db5fm"
Dec 09 11:27:30 crc kubenswrapper[5002]: I1209 11:27:30.375078 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eaddbcf0-f8ae-4e68-88e9-802f82d13438-dns-svc\") pod \"dnsmasq-dns-64848558ff-db5fm\" (UID: \"eaddbcf0-f8ae-4e68-88e9-802f82d13438\") " pod="openstack/dnsmasq-dns-64848558ff-db5fm"
Dec 09 11:27:30 crc kubenswrapper[5002]: I1209 11:27:30.375428 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eaddbcf0-f8ae-4e68-88e9-802f82d13438-ovsdbserver-nb\") pod \"dnsmasq-dns-64848558ff-db5fm\" (UID: \"eaddbcf0-f8ae-4e68-88e9-802f82d13438\") " pod="openstack/dnsmasq-dns-64848558ff-db5fm"
Dec 09 11:27:30 crc kubenswrapper[5002]: I1209 11:27:30.375719 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eaddbcf0-f8ae-4e68-88e9-802f82d13438-config\") pod \"dnsmasq-dns-64848558ff-db5fm\" (UID: \"eaddbcf0-f8ae-4e68-88e9-802f82d13438\") " pod="openstack/dnsmasq-dns-64848558ff-db5fm"
Dec 09 11:27:30 crc kubenswrapper[5002]: I1209 11:27:30.396731 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldvnr\" (UniqueName: \"kubernetes.io/projected/eaddbcf0-f8ae-4e68-88e9-802f82d13438-kube-api-access-ldvnr\") pod \"dnsmasq-dns-64848558ff-db5fm\" (UID: \"eaddbcf0-f8ae-4e68-88e9-802f82d13438\") " pod="openstack/dnsmasq-dns-64848558ff-db5fm"
Dec 09 11:27:30 crc kubenswrapper[5002]: I1209 11:27:30.466294 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64848558ff-db5fm"]
Dec 09 11:27:30 crc kubenswrapper[5002]: I1209 11:27:30.466859 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64848558ff-db5fm"
Dec 09 11:27:30 crc kubenswrapper[5002]: I1209 11:27:30.510555 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-864bc46885-4w56l"]
Dec 09 11:27:30 crc kubenswrapper[5002]: I1209 11:27:30.512041 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-864bc46885-4w56l"
Dec 09 11:27:30 crc kubenswrapper[5002]: I1209 11:27:30.514535 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Dec 09 11:27:30 crc kubenswrapper[5002]: I1209 11:27:30.521589 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-864bc46885-4w56l"]
Dec 09 11:27:30 crc kubenswrapper[5002]: I1209 11:27:30.530186 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Dec 09 11:27:30 crc kubenswrapper[5002]: I1209 11:27:30.696244 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7215792-0269-4b4d-8733-5e8808c0efc8-dns-svc\") pod \"dnsmasq-dns-864bc46885-4w56l\" (UID: \"b7215792-0269-4b4d-8733-5e8808c0efc8\") " pod="openstack/dnsmasq-dns-864bc46885-4w56l"
Dec 09 11:27:30 crc kubenswrapper[5002]: I1209 11:27:30.696680 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b7215792-0269-4b4d-8733-5e8808c0efc8-ovsdbserver-sb\") pod \"dnsmasq-dns-864bc46885-4w56l\" (UID: \"b7215792-0269-4b4d-8733-5e8808c0efc8\") " pod="openstack/dnsmasq-dns-864bc46885-4w56l"
Dec 09 11:27:30 crc kubenswrapper[5002]: I1209 11:27:30.696799 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b7215792-0269-4b4d-8733-5e8808c0efc8-ovsdbserver-nb\") pod \"dnsmasq-dns-864bc46885-4w56l\" (UID: \"b7215792-0269-4b4d-8733-5e8808c0efc8\") " pod="openstack/dnsmasq-dns-864bc46885-4w56l"
Dec 09 11:27:30 crc kubenswrapper[5002]: I1209 11:27:30.696880 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6nh8\" (UniqueName: \"kubernetes.io/projected/b7215792-0269-4b4d-8733-5e8808c0efc8-kube-api-access-h6nh8\") pod \"dnsmasq-dns-864bc46885-4w56l\" (UID: \"b7215792-0269-4b4d-8733-5e8808c0efc8\") " pod="openstack/dnsmasq-dns-864bc46885-4w56l"
Dec 09 11:27:30 crc kubenswrapper[5002]: I1209 11:27:30.696915 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7215792-0269-4b4d-8733-5e8808c0efc8-config\") pod \"dnsmasq-dns-864bc46885-4w56l\" (UID: \"b7215792-0269-4b4d-8733-5e8808c0efc8\") " pod="openstack/dnsmasq-dns-864bc46885-4w56l"
Dec 09 11:27:30 crc kubenswrapper[5002]: I1209 11:27:30.797801 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b7215792-0269-4b4d-8733-5e8808c0efc8-ovsdbserver-nb\") pod \"dnsmasq-dns-864bc46885-4w56l\" (UID: \"b7215792-0269-4b4d-8733-5e8808c0efc8\") " pod="openstack/dnsmasq-dns-864bc46885-4w56l"
Dec 09 11:27:30 crc kubenswrapper[5002]: I1209 11:27:30.797876 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6nh8\" (UniqueName: \"kubernetes.io/projected/b7215792-0269-4b4d-8733-5e8808c0efc8-kube-api-access-h6nh8\") pod \"dnsmasq-dns-864bc46885-4w56l\" (UID: \"b7215792-0269-4b4d-8733-5e8808c0efc8\") " pod="openstack/dnsmasq-dns-864bc46885-4w56l"
Dec 09 11:27:30 crc kubenswrapper[5002]: I1209 11:27:30.797905 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7215792-0269-4b4d-8733-5e8808c0efc8-config\") pod \"dnsmasq-dns-864bc46885-4w56l\" (UID: \"b7215792-0269-4b4d-8733-5e8808c0efc8\") " pod="openstack/dnsmasq-dns-864bc46885-4w56l"
Dec 09 11:27:30 crc kubenswrapper[5002]: I1209 11:27:30.797933 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7215792-0269-4b4d-8733-5e8808c0efc8-dns-svc\") pod \"dnsmasq-dns-864bc46885-4w56l\" (UID: \"b7215792-0269-4b4d-8733-5e8808c0efc8\") " pod="openstack/dnsmasq-dns-864bc46885-4w56l"
Dec 09 11:27:30 crc kubenswrapper[5002]: I1209 11:27:30.797971 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b7215792-0269-4b4d-8733-5e8808c0efc8-ovsdbserver-sb\") pod \"dnsmasq-dns-864bc46885-4w56l\" (UID: \"b7215792-0269-4b4d-8733-5e8808c0efc8\") " pod="openstack/dnsmasq-dns-864bc46885-4w56l"
Dec 09 11:27:30 crc kubenswrapper[5002]: I1209 11:27:30.798706 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b7215792-0269-4b4d-8733-5e8808c0efc8-ovsdbserver-nb\") pod \"dnsmasq-dns-864bc46885-4w56l\" (UID: \"b7215792-0269-4b4d-8733-5e8808c0efc8\") " pod="openstack/dnsmasq-dns-864bc46885-4w56l"
Dec 09 11:27:30 crc kubenswrapper[5002]: I1209 11:27:30.798755 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b7215792-0269-4b4d-8733-5e8808c0efc8-ovsdbserver-sb\") pod \"dnsmasq-dns-864bc46885-4w56l\" (UID: \"b7215792-0269-4b4d-8733-5e8808c0efc8\") " pod="openstack/dnsmasq-dns-864bc46885-4w56l"
Dec 09 11:27:30 crc kubenswrapper[5002]: I1209 11:27:30.799375 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7215792-0269-4b4d-8733-5e8808c0efc8-config\") pod \"dnsmasq-dns-864bc46885-4w56l\" (UID: \"b7215792-0269-4b4d-8733-5e8808c0efc8\") " pod="openstack/dnsmasq-dns-864bc46885-4w56l"
Dec 09 11:27:30 crc kubenswrapper[5002]: I1209 11:27:30.799542 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7215792-0269-4b4d-8733-5e8808c0efc8-dns-svc\") pod \"dnsmasq-dns-864bc46885-4w56l\" (UID: \"b7215792-0269-4b4d-8733-5e8808c0efc8\") " pod="openstack/dnsmasq-dns-864bc46885-4w56l"
Dec 09 11:27:30 crc kubenswrapper[5002]: I1209 11:27:30.819885 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6nh8\" (UniqueName: \"kubernetes.io/projected/b7215792-0269-4b4d-8733-5e8808c0efc8-kube-api-access-h6nh8\") pod \"dnsmasq-dns-864bc46885-4w56l\" (UID: \"b7215792-0269-4b4d-8733-5e8808c0efc8\") " pod="openstack/dnsmasq-dns-864bc46885-4w56l"
Dec 09 11:27:30 crc kubenswrapper[5002]: I1209 11:27:30.867844 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-864bc46885-4w56l"
Dec 09 11:27:31 crc kubenswrapper[5002]: I1209 11:27:31.003223 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64848558ff-db5fm"]
Dec 09 11:27:31 crc kubenswrapper[5002]: W1209 11:27:31.011303 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeaddbcf0_f8ae_4e68_88e9_802f82d13438.slice/crio-8a5d3218628b392158884b031f79757e178c4329b075efe0600b192e9a3cb9ae WatchSource:0}: Error finding container 8a5d3218628b392158884b031f79757e178c4329b075efe0600b192e9a3cb9ae: Status 404 returned error can't find the container with id 8a5d3218628b392158884b031f79757e178c4329b075efe0600b192e9a3cb9ae
Dec 09 11:27:31 crc kubenswrapper[5002]: I1209 11:27:31.311157 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-864bc46885-4w56l"]
Dec 09 11:27:31 crc kubenswrapper[5002]: W1209 11:27:31.333199 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7215792_0269_4b4d_8733_5e8808c0efc8.slice/crio-f8b7a347903cc2f243cb9f3d85ab8d4f254906df9cbfb4227e12a0d5df923783 WatchSource:0}: Error finding container f8b7a347903cc2f243cb9f3d85ab8d4f254906df9cbfb4227e12a0d5df923783: Status 404 returned error can't find the container with id f8b7a347903cc2f243cb9f3d85ab8d4f254906df9cbfb4227e12a0d5df923783
Dec 09 11:27:31 crc kubenswrapper[5002]: I1209 11:27:31.454674 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864bc46885-4w56l" event={"ID":"b7215792-0269-4b4d-8733-5e8808c0efc8","Type":"ContainerStarted","Data":"f8b7a347903cc2f243cb9f3d85ab8d4f254906df9cbfb4227e12a0d5df923783"}
Dec 09 11:27:31 crc kubenswrapper[5002]: I1209 11:27:31.456853 5002 generic.go:334] "Generic (PLEG): container finished" podID="eaddbcf0-f8ae-4e68-88e9-802f82d13438" containerID="2ce1115862878390000489b5e4ac00861ba57b9d1e240bed94aa9ddb3a75dc21" exitCode=0
Dec 09 11:27:31 crc kubenswrapper[5002]: I1209 11:27:31.456953 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64848558ff-db5fm" event={"ID":"eaddbcf0-f8ae-4e68-88e9-802f82d13438","Type":"ContainerDied","Data":"2ce1115862878390000489b5e4ac00861ba57b9d1e240bed94aa9ddb3a75dc21"}
Dec 09 11:27:31 crc kubenswrapper[5002]: I1209 11:27:31.458687 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64848558ff-db5fm" event={"ID":"eaddbcf0-f8ae-4e68-88e9-802f82d13438","Type":"ContainerStarted","Data":"8a5d3218628b392158884b031f79757e178c4329b075efe0600b192e9a3cb9ae"}
Dec 09 11:27:31 crc kubenswrapper[5002]: I1209 11:27:31.520001 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2"
Dec 09 11:27:31 crc kubenswrapper[5002]: I1209 11:27:31.812835 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64848558ff-db5fm"
Dec 09 11:27:31 crc kubenswrapper[5002]: I1209 11:27:31.922724 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eaddbcf0-f8ae-4e68-88e9-802f82d13438-config\") pod \"eaddbcf0-f8ae-4e68-88e9-802f82d13438\" (UID: \"eaddbcf0-f8ae-4e68-88e9-802f82d13438\") "
Dec 09 11:27:31 crc kubenswrapper[5002]: I1209 11:27:31.922766 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eaddbcf0-f8ae-4e68-88e9-802f82d13438-dns-svc\") pod \"eaddbcf0-f8ae-4e68-88e9-802f82d13438\" (UID: \"eaddbcf0-f8ae-4e68-88e9-802f82d13438\") "
Dec 09 11:27:31 crc kubenswrapper[5002]: I1209 11:27:31.922951 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eaddbcf0-f8ae-4e68-88e9-802f82d13438-ovsdbserver-nb\") pod \"eaddbcf0-f8ae-4e68-88e9-802f82d13438\" (UID: \"eaddbcf0-f8ae-4e68-88e9-802f82d13438\") "
Dec 09 11:27:31 crc kubenswrapper[5002]: I1209 11:27:31.922979 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldvnr\" (UniqueName: \"kubernetes.io/projected/eaddbcf0-f8ae-4e68-88e9-802f82d13438-kube-api-access-ldvnr\") pod \"eaddbcf0-f8ae-4e68-88e9-802f82d13438\" (UID: \"eaddbcf0-f8ae-4e68-88e9-802f82d13438\") "
Dec 09 11:27:31 crc kubenswrapper[5002]: I1209 11:27:31.928072 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaddbcf0-f8ae-4e68-88e9-802f82d13438-kube-api-access-ldvnr" (OuterVolumeSpecName: "kube-api-access-ldvnr") pod "eaddbcf0-f8ae-4e68-88e9-802f82d13438" (UID: "eaddbcf0-f8ae-4e68-88e9-802f82d13438"). InnerVolumeSpecName "kube-api-access-ldvnr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 11:27:31 crc kubenswrapper[5002]: I1209 11:27:31.943140 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eaddbcf0-f8ae-4e68-88e9-802f82d13438-config" (OuterVolumeSpecName: "config") pod "eaddbcf0-f8ae-4e68-88e9-802f82d13438" (UID: "eaddbcf0-f8ae-4e68-88e9-802f82d13438"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 11:27:31 crc kubenswrapper[5002]: I1209 11:27:31.946983 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eaddbcf0-f8ae-4e68-88e9-802f82d13438-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "eaddbcf0-f8ae-4e68-88e9-802f82d13438" (UID: "eaddbcf0-f8ae-4e68-88e9-802f82d13438"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 11:27:31 crc kubenswrapper[5002]: I1209 11:27:31.949664 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eaddbcf0-f8ae-4e68-88e9-802f82d13438-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "eaddbcf0-f8ae-4e68-88e9-802f82d13438" (UID: "eaddbcf0-f8ae-4e68-88e9-802f82d13438"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 11:27:32 crc kubenswrapper[5002]: I1209 11:27:32.024907 5002 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eaddbcf0-f8ae-4e68-88e9-802f82d13438-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 09 11:27:32 crc kubenswrapper[5002]: I1209 11:27:32.024945 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldvnr\" (UniqueName: \"kubernetes.io/projected/eaddbcf0-f8ae-4e68-88e9-802f82d13438-kube-api-access-ldvnr\") on node \"crc\" DevicePath \"\""
Dec 09 11:27:32 crc kubenswrapper[5002]: I1209 11:27:32.024956 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eaddbcf0-f8ae-4e68-88e9-802f82d13438-config\") on node \"crc\" DevicePath \"\""
Dec 09 11:27:32 crc kubenswrapper[5002]: I1209 11:27:32.024965 5002 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eaddbcf0-f8ae-4e68-88e9-802f82d13438-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 09 11:27:32 crc kubenswrapper[5002]: I1209 11:27:32.466985 5002 generic.go:334] "Generic (PLEG): container finished" podID="b7215792-0269-4b4d-8733-5e8808c0efc8" containerID="d7d1634435eb72c2793ba351e988fc62498d6363d7cf86db28e0f92c09733f57" exitCode=0
Dec 09 11:27:32 crc kubenswrapper[5002]: I1209 11:27:32.467060 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864bc46885-4w56l" event={"ID":"b7215792-0269-4b4d-8733-5e8808c0efc8","Type":"ContainerDied","Data":"d7d1634435eb72c2793ba351e988fc62498d6363d7cf86db28e0f92c09733f57"}
Dec 09 11:27:32 crc kubenswrapper[5002]: I1209 11:27:32.469423 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64848558ff-db5fm" event={"ID":"eaddbcf0-f8ae-4e68-88e9-802f82d13438","Type":"ContainerDied","Data":"8a5d3218628b392158884b031f79757e178c4329b075efe0600b192e9a3cb9ae"}
Dec 09 11:27:32 crc kubenswrapper[5002]: I1209 11:27:32.469466 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64848558ff-db5fm"
Dec 09 11:27:32 crc kubenswrapper[5002]: I1209 11:27:32.469478 5002 scope.go:117] "RemoveContainer" containerID="2ce1115862878390000489b5e4ac00861ba57b9d1e240bed94aa9ddb3a75dc21"
Dec 09 11:27:32 crc kubenswrapper[5002]: I1209 11:27:32.684550 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64848558ff-db5fm"]
Dec 09 11:27:32 crc kubenswrapper[5002]: I1209 11:27:32.691343 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-64848558ff-db5fm"]
Dec 09 11:27:33 crc kubenswrapper[5002]: I1209 11:27:33.478443 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864bc46885-4w56l" event={"ID":"b7215792-0269-4b4d-8733-5e8808c0efc8","Type":"ContainerStarted","Data":"7cdb50b4a5d5906521ec696869814b394b76d874a14290aa3f6a461853c5546f"}
Dec 09 11:27:33 crc kubenswrapper[5002]: I1209 11:27:33.478745 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-864bc46885-4w56l"
Dec 09 11:27:33 crc kubenswrapper[5002]: I1209 11:27:33.503062 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-864bc46885-4w56l" podStartSLOduration=3.503042415 podStartE2EDuration="3.503042415s" podCreationTimestamp="2025-12-09 11:27:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:27:33.495468212 +0000 UTC m=+5185.887519293" watchObservedRunningTime="2025-12-09 11:27:33.503042415 +0000 UTC m=+5185.895093496"
Dec 09 11:27:34 crc kubenswrapper[5002]: I1209 11:27:34.074059 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eaddbcf0-f8ae-4e68-88e9-802f82d13438" path="/var/lib/kubelet/pods/eaddbcf0-f8ae-4e68-88e9-802f82d13438/volumes"
Dec 09 11:27:34 crc kubenswrapper[5002]: I1209 11:27:34.303680 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"]
Dec 09 11:27:34 crc kubenswrapper[5002]: E1209 11:27:34.304118 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaddbcf0-f8ae-4e68-88e9-802f82d13438" containerName="init"
Dec 09 11:27:34 crc kubenswrapper[5002]: I1209 11:27:34.304140 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaddbcf0-f8ae-4e68-88e9-802f82d13438" containerName="init"
Dec 09 11:27:34 crc kubenswrapper[5002]: I1209 11:27:34.304345 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaddbcf0-f8ae-4e68-88e9-802f82d13438" containerName="init"
Dec 09 11:27:34 crc kubenswrapper[5002]: I1209 11:27:34.304970 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Dec 09 11:27:34 crc kubenswrapper[5002]: I1209 11:27:34.307343 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Dec 09 11:27:34 crc kubenswrapper[5002]: I1209 11:27:34.313419 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Dec 09 11:27:34 crc kubenswrapper[5002]: I1209 11:27:34.360331 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/e89a37d0-8021-46be-bf1e-9e9f9b6b0869-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"e89a37d0-8021-46be-bf1e-9e9f9b6b0869\") " pod="openstack/ovn-copy-data" Dec 09 11:27:34 crc kubenswrapper[5002]: I1209 11:27:34.360648 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-99e3e2fb-eae6-4ac9-bc44-cb4e8f659d26\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99e3e2fb-eae6-4ac9-bc44-cb4e8f659d26\") pod \"ovn-copy-data\" (UID: \"e89a37d0-8021-46be-bf1e-9e9f9b6b0869\") " pod="openstack/ovn-copy-data" Dec 09 11:27:34 crc kubenswrapper[5002]: I1209 11:27:34.360689 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npwzn\" (UniqueName: \"kubernetes.io/projected/e89a37d0-8021-46be-bf1e-9e9f9b6b0869-kube-api-access-npwzn\") pod \"ovn-copy-data\" (UID: \"e89a37d0-8021-46be-bf1e-9e9f9b6b0869\") " pod="openstack/ovn-copy-data" Dec 09 11:27:34 crc kubenswrapper[5002]: I1209 11:27:34.462589 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/e89a37d0-8021-46be-bf1e-9e9f9b6b0869-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"e89a37d0-8021-46be-bf1e-9e9f9b6b0869\") " pod="openstack/ovn-copy-data" Dec 09 11:27:34 crc kubenswrapper[5002]: I1209 11:27:34.462661 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-99e3e2fb-eae6-4ac9-bc44-cb4e8f659d26\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99e3e2fb-eae6-4ac9-bc44-cb4e8f659d26\") pod \"ovn-copy-data\" (UID: \"e89a37d0-8021-46be-bf1e-9e9f9b6b0869\") " pod="openstack/ovn-copy-data" Dec 09 11:27:34 crc kubenswrapper[5002]: I1209 11:27:34.462713 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npwzn\" (UniqueName: \"kubernetes.io/projected/e89a37d0-8021-46be-bf1e-9e9f9b6b0869-kube-api-access-npwzn\") pod \"ovn-copy-data\" (UID: \"e89a37d0-8021-46be-bf1e-9e9f9b6b0869\") " pod="openstack/ovn-copy-data" Dec 09 11:27:34 crc kubenswrapper[5002]: I1209 11:27:34.465453 5002 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 09 11:27:34 crc kubenswrapper[5002]: I1209 11:27:34.465590 5002 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-99e3e2fb-eae6-4ac9-bc44-cb4e8f659d26\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99e3e2fb-eae6-4ac9-bc44-cb4e8f659d26\") pod \"ovn-copy-data\" (UID: \"e89a37d0-8021-46be-bf1e-9e9f9b6b0869\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/efe4e27df6bf7557c4329f1a41f45385b344023d7fbdb046ebb1712e738ab58a/globalmount\"" pod="openstack/ovn-copy-data"
Dec 09 11:27:34 crc kubenswrapper[5002]: I1209 11:27:34.477629 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/e89a37d0-8021-46be-bf1e-9e9f9b6b0869-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"e89a37d0-8021-46be-bf1e-9e9f9b6b0869\") " pod="openstack/ovn-copy-data"
Dec 09 11:27:34 crc kubenswrapper[5002]: I1209 11:27:34.478595 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npwzn\" (UniqueName: \"kubernetes.io/projected/e89a37d0-8021-46be-bf1e-9e9f9b6b0869-kube-api-access-npwzn\") pod \"ovn-copy-data\" (UID: \"e89a37d0-8021-46be-bf1e-9e9f9b6b0869\") " pod="openstack/ovn-copy-data"
Dec 09 11:27:34 crc kubenswrapper[5002]: I1209 11:27:34.493549 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-99e3e2fb-eae6-4ac9-bc44-cb4e8f659d26\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99e3e2fb-eae6-4ac9-bc44-cb4e8f659d26\") pod \"ovn-copy-data\" (UID: \"e89a37d0-8021-46be-bf1e-9e9f9b6b0869\") " pod="openstack/ovn-copy-data"
Dec 09 11:27:34 crc kubenswrapper[5002]: I1209 11:27:34.640189 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data"
Dec 09 11:27:35 crc kubenswrapper[5002]: I1209 11:27:35.118411 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"]
Dec 09 11:27:35 crc kubenswrapper[5002]: W1209 11:27:35.119356 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode89a37d0_8021_46be_bf1e_9e9f9b6b0869.slice/crio-cf4c850ad41285eae556a4359ec47ff505da1290b49801d6f9cf9faa6e70a80c WatchSource:0}: Error finding container cf4c850ad41285eae556a4359ec47ff505da1290b49801d6f9cf9faa6e70a80c: Status 404 returned error can't find the container with id cf4c850ad41285eae556a4359ec47ff505da1290b49801d6f9cf9faa6e70a80c
Dec 09 11:27:35 crc kubenswrapper[5002]: I1209 11:27:35.503445 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"e89a37d0-8021-46be-bf1e-9e9f9b6b0869","Type":"ContainerStarted","Data":"cf4c850ad41285eae556a4359ec47ff505da1290b49801d6f9cf9faa6e70a80c"}
Dec 09 11:27:36 crc kubenswrapper[5002]: I1209 11:27:36.513285 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"e89a37d0-8021-46be-bf1e-9e9f9b6b0869","Type":"ContainerStarted","Data":"5755b257f1b347ee74e0fd5292a0d84b6a6e41938757236bbbba486d2e64c698"}
Dec 09 11:27:40 crc kubenswrapper[5002]: I1209 11:27:40.950608 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-864bc46885-4w56l"
Dec 09 11:27:40 crc kubenswrapper[5002]: I1209 11:27:40.980859 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=7.980801377 podStartE2EDuration="7.980801377s" podCreationTimestamp="2025-12-09 11:27:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:27:36.530078035 +0000 UTC m=+5188.922129136" watchObservedRunningTime="2025-12-09 11:27:40.980801377 +0000 UTC m=+5193.372852478"
Dec 09 11:27:41 crc kubenswrapper[5002]: I1209 11:27:41.010881 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-dpz7k"]
Dec 09 11:27:41 crc kubenswrapper[5002]: I1209 11:27:41.011097 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b7946d7b9-dpz7k" podUID="eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba" containerName="dnsmasq-dns" containerID="cri-o://0012bca043acab2f367ab8f18399810e3cae278d19cc2c29a78440d3dca75967" gracePeriod=10
Dec 09 11:27:41 crc kubenswrapper[5002]: I1209 11:27:41.461121 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-dpz7k"
Dec 09 11:27:41 crc kubenswrapper[5002]: I1209 11:27:41.555697 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba-config\") pod \"eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba\" (UID: \"eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba\") "
Dec 09 11:27:41 crc kubenswrapper[5002]: I1209 11:27:41.574970 5002 generic.go:334] "Generic (PLEG): container finished" podID="eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba" containerID="0012bca043acab2f367ab8f18399810e3cae278d19cc2c29a78440d3dca75967" exitCode=0
Dec 09 11:27:41 crc kubenswrapper[5002]: I1209 11:27:41.575038 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-dpz7k"
Dec 09 11:27:41 crc kubenswrapper[5002]: I1209 11:27:41.575028 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-dpz7k" event={"ID":"eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba","Type":"ContainerDied","Data":"0012bca043acab2f367ab8f18399810e3cae278d19cc2c29a78440d3dca75967"}
Dec 09 11:27:41 crc kubenswrapper[5002]: I1209 11:27:41.575102 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-dpz7k" event={"ID":"eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba","Type":"ContainerDied","Data":"3054edb575e045ae587f8e02401808bb624376baf6857a00d791acd5ccaecbb5"}
Dec 09 11:27:41 crc kubenswrapper[5002]: I1209 11:27:41.575128 5002 scope.go:117] "RemoveContainer" containerID="0012bca043acab2f367ab8f18399810e3cae278d19cc2c29a78440d3dca75967"
Dec 09 11:27:41 crc kubenswrapper[5002]: I1209 11:27:41.596706 5002 scope.go:117] "RemoveContainer" containerID="a3a5af4e962e93fa331a9e92580bea90dc7636bf8e59e67d2ec4c35ebf7417bf"
Dec 09 11:27:41 crc kubenswrapper[5002]: I1209 11:27:41.611515 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba-config" (OuterVolumeSpecName: "config") pod "eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba" (UID: "eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 11:27:41 crc kubenswrapper[5002]: I1209 11:27:41.657231 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba-dns-svc\") pod \"eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba\" (UID: \"eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba\") "
Dec 09 11:27:41 crc kubenswrapper[5002]: I1209 11:27:41.657495 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrznh\" (UniqueName: \"kubernetes.io/projected/eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba-kube-api-access-jrznh\") pod \"eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba\" (UID: \"eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba\") "
Dec 09 11:27:41 crc kubenswrapper[5002]: I1209 11:27:41.657893 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba-config\") on node \"crc\" DevicePath \"\""
Dec 09 11:27:41 crc kubenswrapper[5002]: I1209 11:27:41.660792 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba-kube-api-access-jrznh" (OuterVolumeSpecName: "kube-api-access-jrznh") pod "eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba" (UID: "eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba"). InnerVolumeSpecName "kube-api-access-jrznh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 11:27:41 crc kubenswrapper[5002]: I1209 11:27:41.667093 5002 scope.go:117] "RemoveContainer" containerID="0012bca043acab2f367ab8f18399810e3cae278d19cc2c29a78440d3dca75967"
Dec 09 11:27:41 crc kubenswrapper[5002]: E1209 11:27:41.667628 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0012bca043acab2f367ab8f18399810e3cae278d19cc2c29a78440d3dca75967\": container with ID starting with 0012bca043acab2f367ab8f18399810e3cae278d19cc2c29a78440d3dca75967 not found: ID does not exist" containerID="0012bca043acab2f367ab8f18399810e3cae278d19cc2c29a78440d3dca75967"
Dec 09 11:27:41 crc kubenswrapper[5002]: I1209 11:27:41.667667 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0012bca043acab2f367ab8f18399810e3cae278d19cc2c29a78440d3dca75967"} err="failed to get container status \"0012bca043acab2f367ab8f18399810e3cae278d19cc2c29a78440d3dca75967\": rpc error: code = NotFound desc = could not find container \"0012bca043acab2f367ab8f18399810e3cae278d19cc2c29a78440d3dca75967\": container with ID starting with 0012bca043acab2f367ab8f18399810e3cae278d19cc2c29a78440d3dca75967 not found: ID does not exist"
Dec 09 11:27:41 crc kubenswrapper[5002]: I1209 11:27:41.667713 5002 scope.go:117] "RemoveContainer" containerID="a3a5af4e962e93fa331a9e92580bea90dc7636bf8e59e67d2ec4c35ebf7417bf"
Dec 09 11:27:41 crc kubenswrapper[5002]: E1209 11:27:41.668175 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3a5af4e962e93fa331a9e92580bea90dc7636bf8e59e67d2ec4c35ebf7417bf\": container with ID starting with a3a5af4e962e93fa331a9e92580bea90dc7636bf8e59e67d2ec4c35ebf7417bf not found: ID does not exist" containerID="a3a5af4e962e93fa331a9e92580bea90dc7636bf8e59e67d2ec4c35ebf7417bf"
Dec 09 11:27:41 crc kubenswrapper[5002]: I1209 11:27:41.668224 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3a5af4e962e93fa331a9e92580bea90dc7636bf8e59e67d2ec4c35ebf7417bf"} err="failed to get container status \"a3a5af4e962e93fa331a9e92580bea90dc7636bf8e59e67d2ec4c35ebf7417bf\": rpc error: code = NotFound desc = could not find container \"a3a5af4e962e93fa331a9e92580bea90dc7636bf8e59e67d2ec4c35ebf7417bf\": container with ID starting with a3a5af4e962e93fa331a9e92580bea90dc7636bf8e59e67d2ec4c35ebf7417bf not found: ID does not exist"
Dec 09 11:27:41 crc kubenswrapper[5002]: I1209 11:27:41.696259 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba" (UID: "eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 11:27:41 crc kubenswrapper[5002]: I1209 11:27:41.759280 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrznh\" (UniqueName: \"kubernetes.io/projected/eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba-kube-api-access-jrznh\") on node \"crc\" DevicePath \"\""
Dec 09 11:27:41 crc kubenswrapper[5002]: I1209 11:27:41.759354 5002 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 09 11:27:41 crc kubenswrapper[5002]: I1209 11:27:41.912018 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-dpz7k"]
Dec 09 11:27:41 crc kubenswrapper[5002]: I1209 11:27:41.925569 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-dpz7k"]
Dec 09 11:27:42 crc kubenswrapper[5002]: I1209 11:27:42.072382 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba" path="/var/lib/kubelet/pods/eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba/volumes"
Dec 09 11:27:42 crc kubenswrapper[5002]: I1209 11:27:42.249678 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Dec 09 11:27:42 crc kubenswrapper[5002]: E1209 11:27:42.250474 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba" containerName="init"
Dec 09 11:27:42 crc kubenswrapper[5002]: I1209 11:27:42.250493 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba" containerName="init"
Dec 09 11:27:42 crc kubenswrapper[5002]: E1209 11:27:42.250693 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba" containerName="dnsmasq-dns"
Dec 09 11:27:42 crc kubenswrapper[5002]: I1209 11:27:42.250710 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba" containerName="dnsmasq-dns"
Dec 09 11:27:42 crc kubenswrapper[5002]: I1209 11:27:42.252143 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb7a8f34-c2c4-4b8b-b7d1-c65e2cc14eba" containerName="dnsmasq-dns"
Dec 09 11:27:42 crc kubenswrapper[5002]: I1209 11:27:42.253261 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Dec 09 11:27:42 crc kubenswrapper[5002]: I1209 11:27:42.259317 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Dec 09 11:27:42 crc kubenswrapper[5002]: I1209 11:27:42.259665 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Dec 09 11:27:42 crc kubenswrapper[5002]: I1209 11:27:42.259833 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-pvcwd"
Dec 09 11:27:42 crc kubenswrapper[5002]: I1209 11:27:42.261369 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Dec 09 11:27:42 crc kubenswrapper[5002]: I1209 11:27:42.270526 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/603f27c8-09c3-4c86-8774-3a5937dfaf8e-scripts\") pod \"ovn-northd-0\" (UID: \"603f27c8-09c3-4c86-8774-3a5937dfaf8e\") " pod="openstack/ovn-northd-0"
Dec 09 11:27:42 crc kubenswrapper[5002]: I1209 11:27:42.270588 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/603f27c8-09c3-4c86-8774-3a5937dfaf8e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"603f27c8-09c3-4c86-8774-3a5937dfaf8e\") " pod="openstack/ovn-northd-0"
Dec 09 11:27:42 crc kubenswrapper[5002]: I1209 11:27:42.270614 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/603f27c8-09c3-4c86-8774-3a5937dfaf8e-config\") pod \"ovn-northd-0\" (UID: \"603f27c8-09c3-4c86-8774-3a5937dfaf8e\") " pod="openstack/ovn-northd-0"
Dec 09 11:27:42 crc kubenswrapper[5002]: I1209 11:27:42.270651 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdwps\" (UniqueName: \"kubernetes.io/projected/603f27c8-09c3-4c86-8774-3a5937dfaf8e-kube-api-access-jdwps\") pod \"ovn-northd-0\" (UID: \"603f27c8-09c3-4c86-8774-3a5937dfaf8e\") " pod="openstack/ovn-northd-0"
Dec 09 11:27:42 crc kubenswrapper[5002]: I1209 11:27:42.270756 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/603f27c8-09c3-4c86-8774-3a5937dfaf8e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"603f27c8-09c3-4c86-8774-3a5937dfaf8e\") " pod="openstack/ovn-northd-0"
Dec 09 11:27:42 crc kubenswrapper[5002]: I1209 11:27:42.372182 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/603f27c8-09c3-4c86-8774-3a5937dfaf8e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"603f27c8-09c3-4c86-8774-3a5937dfaf8e\") " pod="openstack/ovn-northd-0"
Dec 09 11:27:42 crc kubenswrapper[5002]: I1209 11:27:42.372478 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/603f27c8-09c3-4c86-8774-3a5937dfaf8e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"603f27c8-09c3-4c86-8774-3a5937dfaf8e\") " pod="openstack/ovn-northd-0"
Dec 09 11:27:42 crc kubenswrapper[5002]: I1209 11:27:42.372659 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/603f27c8-09c3-4c86-8774-3a5937dfaf8e-scripts\") pod \"ovn-northd-0\" (UID: \"603f27c8-09c3-4c86-8774-3a5937dfaf8e\") " pod="openstack/ovn-northd-0"
Dec 09 11:27:42 crc kubenswrapper[5002]: I1209 11:27:42.372731 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/603f27c8-09c3-4c86-8774-3a5937dfaf8e-config\") pod \"ovn-northd-0\" (UID: \"603f27c8-09c3-4c86-8774-3a5937dfaf8e\") " pod="openstack/ovn-northd-0"
Dec 09 11:27:42 crc kubenswrapper[5002]: I1209 11:27:42.372802 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdwps\" (UniqueName: \"kubernetes.io/projected/603f27c8-09c3-4c86-8774-3a5937dfaf8e-kube-api-access-jdwps\") pod \"ovn-northd-0\" (UID: \"603f27c8-09c3-4c86-8774-3a5937dfaf8e\") " pod="openstack/ovn-northd-0"
Dec 09 11:27:42 crc kubenswrapper[5002]: I1209 11:27:42.373435 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/603f27c8-09c3-4c86-8774-3a5937dfaf8e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"603f27c8-09c3-4c86-8774-3a5937dfaf8e\") " pod="openstack/ovn-northd-0"
Dec 09 11:27:42 crc kubenswrapper[5002]: I1209 11:27:42.373977 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/603f27c8-09c3-4c86-8774-3a5937dfaf8e-scripts\") pod \"ovn-northd-0\" (UID: \"603f27c8-09c3-4c86-8774-3a5937dfaf8e\") " pod="openstack/ovn-northd-0"
Dec 09 11:27:42 crc kubenswrapper[5002]: I1209 11:27:42.374657 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/603f27c8-09c3-4c86-8774-3a5937dfaf8e-config\") pod \"ovn-northd-0\" (UID: \"603f27c8-09c3-4c86-8774-3a5937dfaf8e\") " pod="openstack/ovn-northd-0"
Dec 09 11:27:42 crc kubenswrapper[5002]: I1209 11:27:42.379091 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/603f27c8-09c3-4c86-8774-3a5937dfaf8e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"603f27c8-09c3-4c86-8774-3a5937dfaf8e\") " pod="openstack/ovn-northd-0"
Dec 09 11:27:42 crc kubenswrapper[5002]: I1209 11:27:42.390225 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdwps\" (UniqueName: \"kubernetes.io/projected/603f27c8-09c3-4c86-8774-3a5937dfaf8e-kube-api-access-jdwps\") pod \"ovn-northd-0\" (UID: \"603f27c8-09c3-4c86-8774-3a5937dfaf8e\") " pod="openstack/ovn-northd-0"
Dec 09 11:27:42 crc kubenswrapper[5002]: I1209 11:27:42.580674 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Dec 09 11:27:43 crc kubenswrapper[5002]: I1209 11:27:43.011973 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Dec 09 11:27:43 crc kubenswrapper[5002]: W1209 11:27:43.012783 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod603f27c8_09c3_4c86_8774_3a5937dfaf8e.slice/crio-e383a95ae9a4ab01be3a8a781e88d0048789411599bcb9c31766a13583027c44 WatchSource:0}: Error finding container e383a95ae9a4ab01be3a8a781e88d0048789411599bcb9c31766a13583027c44: Status 404 returned error can't find the container with id e383a95ae9a4ab01be3a8a781e88d0048789411599bcb9c31766a13583027c44
Dec 09 11:27:43 crc kubenswrapper[5002]: I1209 11:27:43.596274 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"603f27c8-09c3-4c86-8774-3a5937dfaf8e","Type":"ContainerStarted","Data":"efaec2ad1effab87ab7e5cc56dbdfb325629f7b8e8cf7c8eb218de2ca24ef805"}
Dec 09 11:27:43 crc kubenswrapper[5002]: I1209 11:27:43.597688 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"603f27c8-09c3-4c86-8774-3a5937dfaf8e","Type":"ContainerStarted","Data":"97f0a48f33ba5a1a9dd3900150866821aba9f7131d7dc0f16ef50951580512bc"}
Dec 09 11:27:43 crc kubenswrapper[5002]: I1209 11:27:43.597808 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"603f27c8-09c3-4c86-8774-3a5937dfaf8e","Type":"ContainerStarted","Data":"e383a95ae9a4ab01be3a8a781e88d0048789411599bcb9c31766a13583027c44"}
Dec 09 11:27:43 crc kubenswrapper[5002]: I1209 11:27:43.597913 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Dec 09 11:27:43 crc kubenswrapper[5002]: I1209 11:27:43.628168 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.6281331209999998 podStartE2EDuration="1.628133121s" podCreationTimestamp="2025-12-09 11:27:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:27:43.614929228 +0000 UTC m=+5196.006980369" watchObservedRunningTime="2025-12-09 11:27:43.628133121 +0000 UTC m=+5196.020184232"
Dec 09 11:27:45 crc kubenswrapper[5002]: I1209 11:27:45.734469 5002 scope.go:117] "RemoveContainer" containerID="d76b8f1ca3e41a863ad26f13d30b47efbc2d6bc729867dcc212de7678385d8d3"
Dec 09 11:27:47 crc kubenswrapper[5002]: I1209 11:27:47.613391 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-bjz5w"]
Dec 09 11:27:47 crc kubenswrapper[5002]: I1209 11:27:47.614641 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-bjz5w"
Dec 09 11:27:47 crc kubenswrapper[5002]: I1209 11:27:47.634773 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-bjz5w"]
Dec 09 11:27:47 crc kubenswrapper[5002]: I1209 11:27:47.776028 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62049528-7763-4cd2-bafd-4511f883ec84-operator-scripts\") pod \"keystone-db-create-bjz5w\" (UID: \"62049528-7763-4cd2-bafd-4511f883ec84\") " pod="openstack/keystone-db-create-bjz5w"
Dec 09 11:27:47 crc kubenswrapper[5002]: I1209 11:27:47.776399 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zzfd\" (UniqueName: \"kubernetes.io/projected/62049528-7763-4cd2-bafd-4511f883ec84-kube-api-access-9zzfd\") pod \"keystone-db-create-bjz5w\" (UID: \"62049528-7763-4cd2-bafd-4511f883ec84\") " pod="openstack/keystone-db-create-bjz5w"
Dec 09 11:27:47 crc kubenswrapper[5002]: I1209 11:27:47.818613 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-3513-account-create-update-5vpps"]
Dec 09 11:27:47 crc kubenswrapper[5002]: I1209 11:27:47.819986 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3513-account-create-update-5vpps"
Dec 09 11:27:47 crc kubenswrapper[5002]: I1209 11:27:47.821797 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Dec 09 11:27:47 crc kubenswrapper[5002]: I1209 11:27:47.828283 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-3513-account-create-update-5vpps"]
Dec 09 11:27:47 crc kubenswrapper[5002]: I1209 11:27:47.877925 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62049528-7763-4cd2-bafd-4511f883ec84-operator-scripts\") pod \"keystone-db-create-bjz5w\" (UID: \"62049528-7763-4cd2-bafd-4511f883ec84\") " pod="openstack/keystone-db-create-bjz5w"
Dec 09 11:27:47 crc kubenswrapper[5002]: I1209 11:27:47.877987 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zzfd\" (UniqueName: \"kubernetes.io/projected/62049528-7763-4cd2-bafd-4511f883ec84-kube-api-access-9zzfd\") pod \"keystone-db-create-bjz5w\" (UID: \"62049528-7763-4cd2-bafd-4511f883ec84\") " pod="openstack/keystone-db-create-bjz5w"
Dec 09 11:27:47 crc kubenswrapper[5002]: I1209 11:27:47.879072 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62049528-7763-4cd2-bafd-4511f883ec84-operator-scripts\") pod \"keystone-db-create-bjz5w\" (UID: \"62049528-7763-4cd2-bafd-4511f883ec84\") " pod="openstack/keystone-db-create-bjz5w"
Dec 09 11:27:47 crc kubenswrapper[5002]: I1209 11:27:47.900559 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zzfd\" (UniqueName: \"kubernetes.io/projected/62049528-7763-4cd2-bafd-4511f883ec84-kube-api-access-9zzfd\") pod \"keystone-db-create-bjz5w\" (UID: \"62049528-7763-4cd2-bafd-4511f883ec84\") " pod="openstack/keystone-db-create-bjz5w"
Dec 09 11:27:47 crc kubenswrapper[5002]: I1209 11:27:47.938167 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-bjz5w"
Dec 09 11:27:47 crc kubenswrapper[5002]: I1209 11:27:47.979880 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nksb\" (UniqueName: \"kubernetes.io/projected/ba634598-bd87-4c8f-a7c1-9db3e3324383-kube-api-access-4nksb\") pod \"keystone-3513-account-create-update-5vpps\" (UID: \"ba634598-bd87-4c8f-a7c1-9db3e3324383\") " pod="openstack/keystone-3513-account-create-update-5vpps"
Dec 09 11:27:47 crc kubenswrapper[5002]: I1209 11:27:47.980012 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba634598-bd87-4c8f-a7c1-9db3e3324383-operator-scripts\") pod \"keystone-3513-account-create-update-5vpps\" (UID: \"ba634598-bd87-4c8f-a7c1-9db3e3324383\") " pod="openstack/keystone-3513-account-create-update-5vpps"
Dec 09 11:27:48 crc kubenswrapper[5002]: I1209 11:27:48.081628 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nksb\" (UniqueName: \"kubernetes.io/projected/ba634598-bd87-4c8f-a7c1-9db3e3324383-kube-api-access-4nksb\") pod \"keystone-3513-account-create-update-5vpps\" (UID: \"ba634598-bd87-4c8f-a7c1-9db3e3324383\") " pod="openstack/keystone-3513-account-create-update-5vpps"
Dec 09 11:27:48 crc kubenswrapper[5002]: I1209 11:27:48.082109 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba634598-bd87-4c8f-a7c1-9db3e3324383-operator-scripts\") pod \"keystone-3513-account-create-update-5vpps\" (UID: \"ba634598-bd87-4c8f-a7c1-9db3e3324383\") " pod="openstack/keystone-3513-account-create-update-5vpps"
Dec 09 11:27:48 crc kubenswrapper[5002]: I1209 11:27:48.083225 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba634598-bd87-4c8f-a7c1-9db3e3324383-operator-scripts\") pod \"keystone-3513-account-create-update-5vpps\" (UID: \"ba634598-bd87-4c8f-a7c1-9db3e3324383\") " pod="openstack/keystone-3513-account-create-update-5vpps"
Dec 09 11:27:48 crc kubenswrapper[5002]: I1209 11:27:48.105357 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nksb\" (UniqueName: \"kubernetes.io/projected/ba634598-bd87-4c8f-a7c1-9db3e3324383-kube-api-access-4nksb\") pod \"keystone-3513-account-create-update-5vpps\" (UID: \"ba634598-bd87-4c8f-a7c1-9db3e3324383\") " pod="openstack/keystone-3513-account-create-update-5vpps"
Dec 09 11:27:48 crc kubenswrapper[5002]: I1209 11:27:48.147757 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3513-account-create-update-5vpps"
Dec 09 11:27:48 crc kubenswrapper[5002]: I1209 11:27:48.397169 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-bjz5w"]
Dec 09 11:27:48 crc kubenswrapper[5002]: W1209 11:27:48.398209 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62049528_7763_4cd2_bafd_4511f883ec84.slice/crio-d801a4608a48e5d785aa64d1bc2a1bc3dc202a13a393733fe348de30b3279926 WatchSource:0}: Error finding container d801a4608a48e5d785aa64d1bc2a1bc3dc202a13a393733fe348de30b3279926: Status 404 returned error can't find the container with id d801a4608a48e5d785aa64d1bc2a1bc3dc202a13a393733fe348de30b3279926
Dec 09 11:27:48 crc kubenswrapper[5002]: I1209 11:27:48.573485 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-3513-account-create-update-5vpps"]
Dec 09 11:27:48 crc kubenswrapper[5002]: W1209 11:27:48.579121 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba634598_bd87_4c8f_a7c1_9db3e3324383.slice/crio-91ba34447d93b5a5db9784ad58f4ba5073a0af080d70c5516ae7a655ba7ea7e2 WatchSource:0}: Error finding container 91ba34447d93b5a5db9784ad58f4ba5073a0af080d70c5516ae7a655ba7ea7e2: Status 404 returned error can't find the container with id 91ba34447d93b5a5db9784ad58f4ba5073a0af080d70c5516ae7a655ba7ea7e2
Dec 09 11:27:48 crc kubenswrapper[5002]: I1209 11:27:48.637487 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bjz5w" event={"ID":"62049528-7763-4cd2-bafd-4511f883ec84","Type":"ContainerStarted","Data":"102d6b1e6d4be24cf22ae770d8c2354b76c69c07e396b1c5b329a14dfce8c125"}
Dec 09 11:27:48 crc kubenswrapper[5002]: I1209 11:27:48.637542 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bjz5w" event={"ID":"62049528-7763-4cd2-bafd-4511f883ec84","Type":"ContainerStarted","Data":"d801a4608a48e5d785aa64d1bc2a1bc3dc202a13a393733fe348de30b3279926"}
Dec 09 11:27:48 crc kubenswrapper[5002]: I1209 11:27:48.638713 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3513-account-create-update-5vpps" event={"ID":"ba634598-bd87-4c8f-a7c1-9db3e3324383","Type":"ContainerStarted","Data":"91ba34447d93b5a5db9784ad58f4ba5073a0af080d70c5516ae7a655ba7ea7e2"}
Dec 09 11:27:48 crc kubenswrapper[5002]: I1209 11:27:48.655358 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-bjz5w" podStartSLOduration=1.655335834 podStartE2EDuration="1.655335834s" podCreationTimestamp="2025-12-09 11:27:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:27:48.652050276 +0000 UTC m=+5201.044101367" watchObservedRunningTime="2025-12-09 11:27:48.655335834 +0000 UTC m=+5201.047386925"
Dec 09 11:27:49 crc kubenswrapper[5002]: I1209 11:27:49.651358 5002 generic.go:334] "Generic (PLEG): container finished" podID="62049528-7763-4cd2-bafd-4511f883ec84" containerID="102d6b1e6d4be24cf22ae770d8c2354b76c69c07e396b1c5b329a14dfce8c125" exitCode=0
Dec 09 11:27:49 crc kubenswrapper[5002]: I1209 11:27:49.651474 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bjz5w" event={"ID":"62049528-7763-4cd2-bafd-4511f883ec84","Type":"ContainerDied","Data":"102d6b1e6d4be24cf22ae770d8c2354b76c69c07e396b1c5b329a14dfce8c125"}
Dec 09 11:27:49 crc kubenswrapper[5002]: I1209 11:27:49.653962 5002 generic.go:334] "Generic (PLEG): container finished" podID="ba634598-bd87-4c8f-a7c1-9db3e3324383" containerID="7d8fd369e838ef25f4aa137e4efdca0a3c0c18df473a76e109e6b92a716d2fb6" exitCode=0
Dec 09 11:27:49 crc kubenswrapper[5002]: I1209 11:27:49.653997 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3513-account-create-update-5vpps" event={"ID":"ba634598-bd87-4c8f-a7c1-9db3e3324383","Type":"ContainerDied","Data":"7d8fd369e838ef25f4aa137e4efdca0a3c0c18df473a76e109e6b92a716d2fb6"}
Dec 09 11:27:51 crc kubenswrapper[5002]: I1209 11:27:51.093114 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3513-account-create-update-5vpps"
Dec 09 11:27:51 crc kubenswrapper[5002]: I1209 11:27:51.100206 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-bjz5w"
Dec 09 11:27:51 crc kubenswrapper[5002]: I1209 11:27:51.235888 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba634598-bd87-4c8f-a7c1-9db3e3324383-operator-scripts\") pod \"ba634598-bd87-4c8f-a7c1-9db3e3324383\" (UID: \"ba634598-bd87-4c8f-a7c1-9db3e3324383\") "
Dec 09 11:27:51 crc kubenswrapper[5002]: I1209 11:27:51.236011 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nksb\" (UniqueName: \"kubernetes.io/projected/ba634598-bd87-4c8f-a7c1-9db3e3324383-kube-api-access-4nksb\") pod \"ba634598-bd87-4c8f-a7c1-9db3e3324383\" (UID: \"ba634598-bd87-4c8f-a7c1-9db3e3324383\") "
Dec 09 11:27:51 crc kubenswrapper[5002]: I1209 11:27:51.236079 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62049528-7763-4cd2-bafd-4511f883ec84-operator-scripts\") pod \"62049528-7763-4cd2-bafd-4511f883ec84\" (UID: \"62049528-7763-4cd2-bafd-4511f883ec84\") "
Dec 09 11:27:51 crc kubenswrapper[5002]: I1209 11:27:51.236104 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zzfd\" (UniqueName: \"kubernetes.io/projected/62049528-7763-4cd2-bafd-4511f883ec84-kube-api-access-9zzfd\") pod \"62049528-7763-4cd2-bafd-4511f883ec84\" (UID: \"62049528-7763-4cd2-bafd-4511f883ec84\") "
Dec 09 11:27:51 crc kubenswrapper[5002]: I1209 11:27:51.237177 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62049528-7763-4cd2-bafd-4511f883ec84-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "62049528-7763-4cd2-bafd-4511f883ec84" (UID: "62049528-7763-4cd2-bafd-4511f883ec84"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 11:27:51 crc kubenswrapper[5002]: I1209 11:27:51.237468 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba634598-bd87-4c8f-a7c1-9db3e3324383-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ba634598-bd87-4c8f-a7c1-9db3e3324383" (UID: "ba634598-bd87-4c8f-a7c1-9db3e3324383"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 11:27:51 crc kubenswrapper[5002]: I1209 11:27:51.250148 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62049528-7763-4cd2-bafd-4511f883ec84-kube-api-access-9zzfd" (OuterVolumeSpecName: "kube-api-access-9zzfd") pod "62049528-7763-4cd2-bafd-4511f883ec84" (UID: "62049528-7763-4cd2-bafd-4511f883ec84"). InnerVolumeSpecName "kube-api-access-9zzfd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 11:27:51 crc kubenswrapper[5002]: I1209 11:27:51.251013 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba634598-bd87-4c8f-a7c1-9db3e3324383-kube-api-access-4nksb" (OuterVolumeSpecName: "kube-api-access-4nksb") pod "ba634598-bd87-4c8f-a7c1-9db3e3324383" (UID: "ba634598-bd87-4c8f-a7c1-9db3e3324383"). InnerVolumeSpecName "kube-api-access-4nksb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 11:27:51 crc kubenswrapper[5002]: I1209 11:27:51.339209 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nksb\" (UniqueName: \"kubernetes.io/projected/ba634598-bd87-4c8f-a7c1-9db3e3324383-kube-api-access-4nksb\") on node \"crc\" DevicePath \"\""
Dec 09 11:27:51 crc kubenswrapper[5002]: I1209 11:27:51.339282 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62049528-7763-4cd2-bafd-4511f883ec84-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 09 11:27:51 crc kubenswrapper[5002]: I1209 11:27:51.339308 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zzfd\" (UniqueName: \"kubernetes.io/projected/62049528-7763-4cd2-bafd-4511f883ec84-kube-api-access-9zzfd\") on node \"crc\" DevicePath \"\""
Dec 09 11:27:51 crc kubenswrapper[5002]: I1209 11:27:51.339331 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba634598-bd87-4c8f-a7c1-9db3e3324383-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 09 11:27:51 crc kubenswrapper[5002]: I1209 11:27:51.686114 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bjz5w" event={"ID":"62049528-7763-4cd2-bafd-4511f883ec84","Type":"ContainerDied","Data":"d801a4608a48e5d785aa64d1bc2a1bc3dc202a13a393733fe348de30b3279926"}
Dec 09 11:27:51 crc kubenswrapper[5002]: I1209 11:27:51.686184 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d801a4608a48e5d785aa64d1bc2a1bc3dc202a13a393733fe348de30b3279926"
Dec 09 11:27:51 crc kubenswrapper[5002]: I1209 11:27:51.686280 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-bjz5w"
Dec 09 11:27:51 crc kubenswrapper[5002]: I1209 11:27:51.700331 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3513-account-create-update-5vpps" event={"ID":"ba634598-bd87-4c8f-a7c1-9db3e3324383","Type":"ContainerDied","Data":"91ba34447d93b5a5db9784ad58f4ba5073a0af080d70c5516ae7a655ba7ea7e2"}
Dec 09 11:27:51 crc kubenswrapper[5002]: I1209 11:27:51.700400 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91ba34447d93b5a5db9784ad58f4ba5073a0af080d70c5516ae7a655ba7ea7e2"
Dec 09 11:27:51 crc kubenswrapper[5002]: I1209 11:27:51.700519 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3513-account-create-update-5vpps"
Dec 09 11:27:53 crc kubenswrapper[5002]: I1209 11:27:53.297214 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-qw2g9"]
Dec 09 11:27:53 crc kubenswrapper[5002]: E1209 11:27:53.297853 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba634598-bd87-4c8f-a7c1-9db3e3324383" containerName="mariadb-account-create-update"
Dec 09 11:27:53 crc kubenswrapper[5002]: I1209 11:27:53.297865 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba634598-bd87-4c8f-a7c1-9db3e3324383" containerName="mariadb-account-create-update"
Dec 09 11:27:53 crc kubenswrapper[5002]: E1209 11:27:53.297890 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62049528-7763-4cd2-bafd-4511f883ec84" containerName="mariadb-database-create"
Dec 09 11:27:53 crc kubenswrapper[5002]: I1209 11:27:53.297896 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="62049528-7763-4cd2-bafd-4511f883ec84" containerName="mariadb-database-create"
Dec 09 11:27:53 crc kubenswrapper[5002]: I1209 11:27:53.298043 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="62049528-7763-4cd2-bafd-4511f883ec84" containerName="mariadb-database-create"
Dec 09 11:27:53 crc kubenswrapper[5002]: I1209 11:27:53.298064 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba634598-bd87-4c8f-a7c1-9db3e3324383" containerName="mariadb-account-create-update"
Dec 09 11:27:53 crc kubenswrapper[5002]: I1209 11:27:53.298619 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-qw2g9"
Dec 09 11:27:53 crc kubenswrapper[5002]: I1209 11:27:53.300700 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-srzzf"
Dec 09 11:27:53 crc kubenswrapper[5002]: I1209 11:27:53.301146 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Dec 09 11:27:53 crc kubenswrapper[5002]: I1209 11:27:53.301863 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Dec 09 11:27:53 crc kubenswrapper[5002]: I1209 11:27:53.301916 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Dec 09 11:27:53 crc kubenswrapper[5002]: I1209 11:27:53.312387 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-qw2g9"]
Dec 09 11:27:53 crc kubenswrapper[5002]: I1209 11:27:53.473744 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13cc15be-2b29-4477-bc55-dde55b36019d-combined-ca-bundle\") pod \"keystone-db-sync-qw2g9\" (UID: \"13cc15be-2b29-4477-bc55-dde55b36019d\") " pod="openstack/keystone-db-sync-qw2g9"
Dec 09 11:27:53 crc kubenswrapper[5002]: I1209 11:27:53.473882 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lthsn\" (UniqueName: \"kubernetes.io/projected/13cc15be-2b29-4477-bc55-dde55b36019d-kube-api-access-lthsn\") pod \"keystone-db-sync-qw2g9\" (UID: \"13cc15be-2b29-4477-bc55-dde55b36019d\") " pod="openstack/keystone-db-sync-qw2g9"
Dec 09 11:27:53 crc kubenswrapper[5002]: I1209 11:27:53.473964 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13cc15be-2b29-4477-bc55-dde55b36019d-config-data\") pod \"keystone-db-sync-qw2g9\" (UID: \"13cc15be-2b29-4477-bc55-dde55b36019d\") " pod="openstack/keystone-db-sync-qw2g9"
Dec 09 11:27:53 crc kubenswrapper[5002]: I1209 11:27:53.575427 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lthsn\" (UniqueName: \"kubernetes.io/projected/13cc15be-2b29-4477-bc55-dde55b36019d-kube-api-access-lthsn\") pod \"keystone-db-sync-qw2g9\" (UID: \"13cc15be-2b29-4477-bc55-dde55b36019d\") " pod="openstack/keystone-db-sync-qw2g9"
Dec 09 11:27:53 crc kubenswrapper[5002]: I1209 11:27:53.575512 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13cc15be-2b29-4477-bc55-dde55b36019d-config-data\") pod \"keystone-db-sync-qw2g9\" (UID: \"13cc15be-2b29-4477-bc55-dde55b36019d\") " pod="openstack/keystone-db-sync-qw2g9"
Dec 09 11:27:53 crc kubenswrapper[5002]: I1209 11:27:53.575549 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13cc15be-2b29-4477-bc55-dde55b36019d-combined-ca-bundle\") pod \"keystone-db-sync-qw2g9\" (UID: \"13cc15be-2b29-4477-bc55-dde55b36019d\") " pod="openstack/keystone-db-sync-qw2g9"
Dec 09 11:27:53 crc kubenswrapper[5002]: I1209 11:27:53.580991 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13cc15be-2b29-4477-bc55-dde55b36019d-config-data\") pod \"keystone-db-sync-qw2g9\" (UID: \"13cc15be-2b29-4477-bc55-dde55b36019d\") " pod="openstack/keystone-db-sync-qw2g9"
Dec 09 11:27:53 crc kubenswrapper[5002]: I1209 11:27:53.583303 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13cc15be-2b29-4477-bc55-dde55b36019d-combined-ca-bundle\") pod \"keystone-db-sync-qw2g9\" (UID: \"13cc15be-2b29-4477-bc55-dde55b36019d\") " pod="openstack/keystone-db-sync-qw2g9"
Dec 09 11:27:53 crc kubenswrapper[5002]: I1209 11:27:53.598076 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lthsn\" (UniqueName: \"kubernetes.io/projected/13cc15be-2b29-4477-bc55-dde55b36019d-kube-api-access-lthsn\") pod \"keystone-db-sync-qw2g9\" (UID: \"13cc15be-2b29-4477-bc55-dde55b36019d\") " pod="openstack/keystone-db-sync-qw2g9"
Dec 09 11:27:53 crc kubenswrapper[5002]: I1209 11:27:53.624969 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-qw2g9"
Dec 09 11:27:54 crc kubenswrapper[5002]: I1209 11:27:54.136294 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-qw2g9"]
Dec 09 11:27:54 crc kubenswrapper[5002]: W1209 11:27:54.149893 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13cc15be_2b29_4477_bc55_dde55b36019d.slice/crio-e19fee0386976d101a883376d8ab8e8b72d0260c2363f5185793ea64d3f8eaf0 WatchSource:0}: Error finding container e19fee0386976d101a883376d8ab8e8b72d0260c2363f5185793ea64d3f8eaf0: Status 404 returned error can't find the container with id e19fee0386976d101a883376d8ab8e8b72d0260c2363f5185793ea64d3f8eaf0
Dec 09 11:27:54 crc kubenswrapper[5002]: I1209 11:27:54.727126 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-qw2g9" event={"ID":"13cc15be-2b29-4477-bc55-dde55b36019d","Type":"ContainerStarted","Data":"f407febb120c9e6659c4785dd15bf5b765a05ed089d1b2040ed420e5dd438441"}
Dec 09 11:27:54 crc kubenswrapper[5002]: I1209 11:27:54.727393 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-qw2g9" event={"ID":"13cc15be-2b29-4477-bc55-dde55b36019d","Type":"ContainerStarted","Data":"e19fee0386976d101a883376d8ab8e8b72d0260c2363f5185793ea64d3f8eaf0"}
Dec 09 11:27:54 crc kubenswrapper[5002]: I1209 11:27:54.746196 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-qw2g9" podStartSLOduration=1.74618023 podStartE2EDuration="1.74618023s" podCreationTimestamp="2025-12-09 11:27:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:27:54.742778519 +0000 UTC m=+5207.134829630" watchObservedRunningTime="2025-12-09 11:27:54.74618023 +0000 UTC m=+5207.138231301"
Dec 09 11:27:56 crc kubenswrapper[5002]: I1209 11:27:56.759709 5002 generic.go:334] "Generic (PLEG): container finished" podID="13cc15be-2b29-4477-bc55-dde55b36019d" containerID="f407febb120c9e6659c4785dd15bf5b765a05ed089d1b2040ed420e5dd438441" exitCode=0
Dec 09 11:27:56 crc kubenswrapper[5002]: I1209 11:27:56.759915 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-qw2g9" event={"ID":"13cc15be-2b29-4477-bc55-dde55b36019d","Type":"ContainerDied","Data":"f407febb120c9e6659c4785dd15bf5b765a05ed089d1b2040ed420e5dd438441"}
Dec 09 11:27:57 crc kubenswrapper[5002]: I1209 11:27:57.661942 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Dec 09 11:27:58 crc kubenswrapper[5002]: I1209 11:27:58.091793 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-qw2g9"
Dec 09 11:27:58 crc kubenswrapper[5002]: I1209 11:27:58.264852 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13cc15be-2b29-4477-bc55-dde55b36019d-combined-ca-bundle\") pod \"13cc15be-2b29-4477-bc55-dde55b36019d\" (UID: \"13cc15be-2b29-4477-bc55-dde55b36019d\") "
Dec 09 11:27:58 crc kubenswrapper[5002]: I1209 11:27:58.265026 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lthsn\" (UniqueName: \"kubernetes.io/projected/13cc15be-2b29-4477-bc55-dde55b36019d-kube-api-access-lthsn\") pod \"13cc15be-2b29-4477-bc55-dde55b36019d\" (UID: \"13cc15be-2b29-4477-bc55-dde55b36019d\") "
Dec 09 11:27:58 crc kubenswrapper[5002]: I1209 11:27:58.265059 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13cc15be-2b29-4477-bc55-dde55b36019d-config-data\") pod \"13cc15be-2b29-4477-bc55-dde55b36019d\" (UID: \"13cc15be-2b29-4477-bc55-dde55b36019d\") "
Dec 09 11:27:58 crc kubenswrapper[5002]: I1209 11:27:58.269982 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13cc15be-2b29-4477-bc55-dde55b36019d-kube-api-access-lthsn" (OuterVolumeSpecName: "kube-api-access-lthsn") pod "13cc15be-2b29-4477-bc55-dde55b36019d" (UID: "13cc15be-2b29-4477-bc55-dde55b36019d"). InnerVolumeSpecName "kube-api-access-lthsn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 11:27:58 crc kubenswrapper[5002]: I1209 11:27:58.293983 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13cc15be-2b29-4477-bc55-dde55b36019d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "13cc15be-2b29-4477-bc55-dde55b36019d" (UID: "13cc15be-2b29-4477-bc55-dde55b36019d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:27:58 crc kubenswrapper[5002]: I1209 11:27:58.318579 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13cc15be-2b29-4477-bc55-dde55b36019d-config-data" (OuterVolumeSpecName: "config-data") pod "13cc15be-2b29-4477-bc55-dde55b36019d" (UID: "13cc15be-2b29-4477-bc55-dde55b36019d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:27:58 crc kubenswrapper[5002]: I1209 11:27:58.367328 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lthsn\" (UniqueName: \"kubernetes.io/projected/13cc15be-2b29-4477-bc55-dde55b36019d-kube-api-access-lthsn\") on node \"crc\" DevicePath \"\""
Dec 09 11:27:58 crc kubenswrapper[5002]: I1209 11:27:58.367361 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13cc15be-2b29-4477-bc55-dde55b36019d-config-data\") on node \"crc\" DevicePath \"\""
Dec 09 11:27:58 crc kubenswrapper[5002]: I1209 11:27:58.367371 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13cc15be-2b29-4477-bc55-dde55b36019d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 09 11:27:58 crc kubenswrapper[5002]: I1209 11:27:58.775207 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-qw2g9" event={"ID":"13cc15be-2b29-4477-bc55-dde55b36019d","Type":"ContainerDied","Data":"e19fee0386976d101a883376d8ab8e8b72d0260c2363f5185793ea64d3f8eaf0"}
Dec 09 11:27:58 crc kubenswrapper[5002]: I1209 11:27:58.775253 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e19fee0386976d101a883376d8ab8e8b72d0260c2363f5185793ea64d3f8eaf0"
Dec 09 11:27:58 crc kubenswrapper[5002]: I1209 11:27:58.775309 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-qw2g9"
Dec 09 11:27:59 crc kubenswrapper[5002]: I1209 11:27:59.024890 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58865cd75-7vg59"]
Dec 09 11:27:59 crc kubenswrapper[5002]: E1209 11:27:59.026294 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13cc15be-2b29-4477-bc55-dde55b36019d" containerName="keystone-db-sync"
Dec 09 11:27:59 crc kubenswrapper[5002]: I1209 11:27:59.026319 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="13cc15be-2b29-4477-bc55-dde55b36019d" containerName="keystone-db-sync"
Dec 09 11:27:59 crc kubenswrapper[5002]: I1209 11:27:59.026511 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="13cc15be-2b29-4477-bc55-dde55b36019d" containerName="keystone-db-sync"
Dec 09 11:27:59 crc kubenswrapper[5002]: I1209 11:27:59.027981 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58865cd75-7vg59"
Dec 09 11:27:59 crc kubenswrapper[5002]: I1209 11:27:59.036178 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58865cd75-7vg59"]
Dec 09 11:27:59 crc kubenswrapper[5002]: I1209 11:27:59.064398 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-tb74h"]
Dec 09 11:27:59 crc kubenswrapper[5002]: I1209 11:27:59.066925 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-tb74h"
Dec 09 11:27:59 crc kubenswrapper[5002]: I1209 11:27:59.069268 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-srzzf"
Dec 09 11:27:59 crc kubenswrapper[5002]: I1209 11:27:59.069469 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Dec 09 11:27:59 crc kubenswrapper[5002]: I1209 11:27:59.069682 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Dec 09 11:27:59 crc kubenswrapper[5002]: I1209 11:27:59.069838 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Dec 09 11:27:59 crc kubenswrapper[5002]: I1209 11:27:59.070284 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Dec 09 11:27:59 crc kubenswrapper[5002]: I1209 11:27:59.094199 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-tb74h"]
Dec 09 11:27:59 crc kubenswrapper[5002]: I1209 11:27:59.186023 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/670d7503-5cc2-4674-82bc-ae0ea0ecdf36-config-data\") pod \"keystone-bootstrap-tb74h\" (UID: \"670d7503-5cc2-4674-82bc-ae0ea0ecdf36\") " pod="openstack/keystone-bootstrap-tb74h"
Dec 09 11:27:59 crc kubenswrapper[5002]: I1209 11:27:59.186079 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e-config\") pod \"dnsmasq-dns-58865cd75-7vg59\" (UID: \"bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e\") " pod="openstack/dnsmasq-dns-58865cd75-7vg59"
Dec 09 11:27:59 crc kubenswrapper[5002]: I1209 11:27:59.186226 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e-ovsdbserver-nb\") pod \"dnsmasq-dns-58865cd75-7vg59\" (UID: \"bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e\") " pod="openstack/dnsmasq-dns-58865cd75-7vg59"
Dec 09 11:27:59 crc kubenswrapper[5002]: I1209 11:27:59.186335 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/670d7503-5cc2-4674-82bc-ae0ea0ecdf36-fernet-keys\") pod \"keystone-bootstrap-tb74h\" (UID: \"670d7503-5cc2-4674-82bc-ae0ea0ecdf36\") " pod="openstack/keystone-bootstrap-tb74h"
Dec 09 11:27:59 crc kubenswrapper[5002]: I1209 11:27:59.186360 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6xrz\" (UniqueName: \"kubernetes.io/projected/bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e-kube-api-access-t6xrz\") pod \"dnsmasq-dns-58865cd75-7vg59\" (UID: \"bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e\") " pod="openstack/dnsmasq-dns-58865cd75-7vg59"
Dec 09 11:27:59 crc kubenswrapper[5002]: I1209 11:27:59.186464 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e-dns-svc\") pod \"dnsmasq-dns-58865cd75-7vg59\" (UID: \"bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e\") " pod="openstack/dnsmasq-dns-58865cd75-7vg59"
Dec 09 11:27:59 crc kubenswrapper[5002]: I1209 11:27:59.186523 5002 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/670d7503-5cc2-4674-82bc-ae0ea0ecdf36-combined-ca-bundle\") pod \"keystone-bootstrap-tb74h\" (UID: \"670d7503-5cc2-4674-82bc-ae0ea0ecdf36\") " pod="openstack/keystone-bootstrap-tb74h" Dec 09 11:27:59 crc kubenswrapper[5002]: I1209 11:27:59.186548 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b74n\" (UniqueName: \"kubernetes.io/projected/670d7503-5cc2-4674-82bc-ae0ea0ecdf36-kube-api-access-9b74n\") pod \"keystone-bootstrap-tb74h\" (UID: \"670d7503-5cc2-4674-82bc-ae0ea0ecdf36\") " pod="openstack/keystone-bootstrap-tb74h" Dec 09 11:27:59 crc kubenswrapper[5002]: I1209 11:27:59.186571 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/670d7503-5cc2-4674-82bc-ae0ea0ecdf36-scripts\") pod \"keystone-bootstrap-tb74h\" (UID: \"670d7503-5cc2-4674-82bc-ae0ea0ecdf36\") " pod="openstack/keystone-bootstrap-tb74h" Dec 09 11:27:59 crc kubenswrapper[5002]: I1209 11:27:59.186590 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/670d7503-5cc2-4674-82bc-ae0ea0ecdf36-credential-keys\") pod \"keystone-bootstrap-tb74h\" (UID: \"670d7503-5cc2-4674-82bc-ae0ea0ecdf36\") " pod="openstack/keystone-bootstrap-tb74h" Dec 09 11:27:59 crc kubenswrapper[5002]: I1209 11:27:59.186649 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e-ovsdbserver-sb\") pod \"dnsmasq-dns-58865cd75-7vg59\" (UID: \"bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e\") " pod="openstack/dnsmasq-dns-58865cd75-7vg59" Dec 09 11:27:59 crc kubenswrapper[5002]: I1209 11:27:59.287509 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e-ovsdbserver-nb\") pod \"dnsmasq-dns-58865cd75-7vg59\" (UID: \"bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e\") " pod="openstack/dnsmasq-dns-58865cd75-7vg59" Dec 09 11:27:59 crc kubenswrapper[5002]: I1209 11:27:59.287575 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/670d7503-5cc2-4674-82bc-ae0ea0ecdf36-fernet-keys\") pod \"keystone-bootstrap-tb74h\" (UID: \"670d7503-5cc2-4674-82bc-ae0ea0ecdf36\") " pod="openstack/keystone-bootstrap-tb74h" Dec 09 11:27:59 crc kubenswrapper[5002]: I1209 11:27:59.287593 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6xrz\" (UniqueName: \"kubernetes.io/projected/bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e-kube-api-access-t6xrz\") pod \"dnsmasq-dns-58865cd75-7vg59\" (UID: \"bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e\") " pod="openstack/dnsmasq-dns-58865cd75-7vg59" Dec 09 11:27:59 crc kubenswrapper[5002]: I1209 11:27:59.287637 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e-dns-svc\") pod \"dnsmasq-dns-58865cd75-7vg59\" (UID: \"bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e\") " pod="openstack/dnsmasq-dns-58865cd75-7vg59" Dec 09 11:27:59 crc kubenswrapper[5002]: I1209 11:27:59.287661 5002 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/670d7503-5cc2-4674-82bc-ae0ea0ecdf36-combined-ca-bundle\") pod \"keystone-bootstrap-tb74h\" (UID: \"670d7503-5cc2-4674-82bc-ae0ea0ecdf36\") " pod="openstack/keystone-bootstrap-tb74h" Dec 09 11:27:59 crc kubenswrapper[5002]: I1209 11:27:59.287679 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9b74n\" (UniqueName: \"kubernetes.io/projected/670d7503-5cc2-4674-82bc-ae0ea0ecdf36-kube-api-access-9b74n\") pod \"keystone-bootstrap-tb74h\" (UID: \"670d7503-5cc2-4674-82bc-ae0ea0ecdf36\") " pod="openstack/keystone-bootstrap-tb74h" Dec 09 11:27:59 crc kubenswrapper[5002]: I1209 11:27:59.287696 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/670d7503-5cc2-4674-82bc-ae0ea0ecdf36-scripts\") pod \"keystone-bootstrap-tb74h\" (UID: \"670d7503-5cc2-4674-82bc-ae0ea0ecdf36\") " pod="openstack/keystone-bootstrap-tb74h" Dec 09 11:27:59 crc kubenswrapper[5002]: I1209 11:27:59.287713 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/670d7503-5cc2-4674-82bc-ae0ea0ecdf36-credential-keys\") pod \"keystone-bootstrap-tb74h\" (UID: \"670d7503-5cc2-4674-82bc-ae0ea0ecdf36\") " pod="openstack/keystone-bootstrap-tb74h" Dec 09 11:27:59 crc kubenswrapper[5002]: I1209 11:27:59.287735 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e-ovsdbserver-sb\") pod \"dnsmasq-dns-58865cd75-7vg59\" (UID: \"bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e\") " pod="openstack/dnsmasq-dns-58865cd75-7vg59" Dec 09 11:27:59 crc kubenswrapper[5002]: I1209 11:27:59.287759 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/670d7503-5cc2-4674-82bc-ae0ea0ecdf36-config-data\") pod \"keystone-bootstrap-tb74h\" (UID: \"670d7503-5cc2-4674-82bc-ae0ea0ecdf36\") " pod="openstack/keystone-bootstrap-tb74h" Dec 09 11:27:59 crc kubenswrapper[5002]: I1209 11:27:59.287775 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e-config\") pod \"dnsmasq-dns-58865cd75-7vg59\" (UID: \"bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e\") " pod="openstack/dnsmasq-dns-58865cd75-7vg59" Dec 09 11:27:59 crc kubenswrapper[5002]: I1209 11:27:59.288542 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e-config\") pod \"dnsmasq-dns-58865cd75-7vg59\" (UID: \"bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e\") " pod="openstack/dnsmasq-dns-58865cd75-7vg59" Dec 09 11:27:59 crc kubenswrapper[5002]: I1209 11:27:59.289067 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e-ovsdbserver-nb\") pod \"dnsmasq-dns-58865cd75-7vg59\" (UID: \"bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e\") " pod="openstack/dnsmasq-dns-58865cd75-7vg59" Dec 09 11:27:59 crc kubenswrapper[5002]: I1209 11:27:59.289807 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e-ovsdbserver-sb\") 
pod \"dnsmasq-dns-58865cd75-7vg59\" (UID: \"bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e\") " pod="openstack/dnsmasq-dns-58865cd75-7vg59" Dec 09 11:27:59 crc kubenswrapper[5002]: I1209 11:27:59.289868 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e-dns-svc\") pod \"dnsmasq-dns-58865cd75-7vg59\" (UID: \"bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e\") " pod="openstack/dnsmasq-dns-58865cd75-7vg59" Dec 09 11:27:59 crc kubenswrapper[5002]: I1209 11:27:59.294594 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/670d7503-5cc2-4674-82bc-ae0ea0ecdf36-config-data\") pod \"keystone-bootstrap-tb74h\" (UID: \"670d7503-5cc2-4674-82bc-ae0ea0ecdf36\") " pod="openstack/keystone-bootstrap-tb74h" Dec 09 11:27:59 crc kubenswrapper[5002]: I1209 11:27:59.295330 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/670d7503-5cc2-4674-82bc-ae0ea0ecdf36-credential-keys\") pod \"keystone-bootstrap-tb74h\" (UID: \"670d7503-5cc2-4674-82bc-ae0ea0ecdf36\") " pod="openstack/keystone-bootstrap-tb74h" Dec 09 11:27:59 crc kubenswrapper[5002]: I1209 11:27:59.296948 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/670d7503-5cc2-4674-82bc-ae0ea0ecdf36-fernet-keys\") pod \"keystone-bootstrap-tb74h\" (UID: \"670d7503-5cc2-4674-82bc-ae0ea0ecdf36\") " pod="openstack/keystone-bootstrap-tb74h" Dec 09 11:27:59 crc kubenswrapper[5002]: I1209 11:27:59.299217 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/670d7503-5cc2-4674-82bc-ae0ea0ecdf36-scripts\") pod \"keystone-bootstrap-tb74h\" (UID: \"670d7503-5cc2-4674-82bc-ae0ea0ecdf36\") " pod="openstack/keystone-bootstrap-tb74h" Dec 09 11:27:59 crc kubenswrapper[5002]: I1209 11:27:59.304345 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/670d7503-5cc2-4674-82bc-ae0ea0ecdf36-combined-ca-bundle\") pod \"keystone-bootstrap-tb74h\" (UID: \"670d7503-5cc2-4674-82bc-ae0ea0ecdf36\") " pod="openstack/keystone-bootstrap-tb74h" Dec 09 11:27:59 crc kubenswrapper[5002]: I1209 11:27:59.307874 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6xrz\" (UniqueName: \"kubernetes.io/projected/bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e-kube-api-access-t6xrz\") pod \"dnsmasq-dns-58865cd75-7vg59\" (UID: \"bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e\") " pod="openstack/dnsmasq-dns-58865cd75-7vg59" Dec 09 11:27:59 crc kubenswrapper[5002]: I1209 11:27:59.308183 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9b74n\" (UniqueName: \"kubernetes.io/projected/670d7503-5cc2-4674-82bc-ae0ea0ecdf36-kube-api-access-9b74n\") pod \"keystone-bootstrap-tb74h\" (UID: \"670d7503-5cc2-4674-82bc-ae0ea0ecdf36\") " pod="openstack/keystone-bootstrap-tb74h" Dec 09 11:27:59 crc kubenswrapper[5002]: I1209 11:27:59.344412 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58865cd75-7vg59" Dec 09 11:27:59 crc kubenswrapper[5002]: I1209 11:27:59.417207 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-tb74h" Dec 09 11:27:59 crc kubenswrapper[5002]: I1209 11:27:59.795239 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58865cd75-7vg59"] Dec 09 11:27:59 crc kubenswrapper[5002]: I1209 11:27:59.961161 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-tb74h"] Dec 09 11:28:00 crc kubenswrapper[5002]: I1209 11:28:00.799619 5002 generic.go:334] "Generic (PLEG): container finished" podID="bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e" containerID="914c82607a7c95b17d4bd573522f17f6b92d983a16fe95e794cfc586583572ae" exitCode=0 Dec 09 11:28:00 crc kubenswrapper[5002]: I1209 11:28:00.799688 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58865cd75-7vg59" event={"ID":"bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e","Type":"ContainerDied","Data":"914c82607a7c95b17d4bd573522f17f6b92d983a16fe95e794cfc586583572ae"} Dec 09 11:28:00 crc kubenswrapper[5002]: I1209 11:28:00.800000 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58865cd75-7vg59" event={"ID":"bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e","Type":"ContainerStarted","Data":"459bd6e37baf706c6c03869a3862b96b92f5235f4c35e4afe3a474fa8cdb0ee5"} Dec 09 11:28:00 crc kubenswrapper[5002]: I1209 11:28:00.806401 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tb74h" event={"ID":"670d7503-5cc2-4674-82bc-ae0ea0ecdf36","Type":"ContainerStarted","Data":"b680270350965904d5be19221fc23cfb017210cc89b8668f43e079f9af6459cb"} Dec 09 11:28:00 crc kubenswrapper[5002]: I1209 11:28:00.806473 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tb74h" event={"ID":"670d7503-5cc2-4674-82bc-ae0ea0ecdf36","Type":"ContainerStarted","Data":"b5d9b4f0900376158c5bf391bf4bb71b8019bc62ec3444d95add0573ed15febc"} Dec 09 11:28:00 crc kubenswrapper[5002]: I1209 11:28:00.864276 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-tb74h" podStartSLOduration=1.864237112 podStartE2EDuration="1.864237112s" podCreationTimestamp="2025-12-09 11:27:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:28:00.851497631 +0000 UTC m=+5213.243548762" watchObservedRunningTime="2025-12-09 11:28:00.864237112 +0000 UTC m=+5213.256288193" Dec 09 11:28:01 crc kubenswrapper[5002]: I1209 11:28:01.821027 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58865cd75-7vg59" event={"ID":"bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e","Type":"ContainerStarted","Data":"9d0624af58cec598367afc30db1261c4a854b1d14cb1a92379f8d3af9fbc6ea1"} Dec 09 11:28:01 crc kubenswrapper[5002]: I1209 11:28:01.863144 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58865cd75-7vg59" podStartSLOduration=3.863111331 podStartE2EDuration="3.863111331s" podCreationTimestamp="2025-12-09 11:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:28:01.854319116 +0000 UTC m=+5214.246370217" watchObservedRunningTime="2025-12-09 11:28:01.863111331 +0000 UTC m=+5214.255162412" Dec 09 11:28:02 crc kubenswrapper[5002]: I1209 11:28:02.832553 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58865cd75-7vg59" Dec 09 11:28:03 crc 
kubenswrapper[5002]: I1209 11:28:03.844156 5002 generic.go:334] "Generic (PLEG): container finished" podID="670d7503-5cc2-4674-82bc-ae0ea0ecdf36" containerID="b680270350965904d5be19221fc23cfb017210cc89b8668f43e079f9af6459cb" exitCode=0 Dec 09 11:28:03 crc kubenswrapper[5002]: I1209 11:28:03.844219 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tb74h" event={"ID":"670d7503-5cc2-4674-82bc-ae0ea0ecdf36","Type":"ContainerDied","Data":"b680270350965904d5be19221fc23cfb017210cc89b8668f43e079f9af6459cb"} Dec 09 11:28:05 crc kubenswrapper[5002]: I1209 11:28:05.194804 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-tb74h" Dec 09 11:28:05 crc kubenswrapper[5002]: I1209 11:28:05.301986 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/670d7503-5cc2-4674-82bc-ae0ea0ecdf36-credential-keys\") pod \"670d7503-5cc2-4674-82bc-ae0ea0ecdf36\" (UID: \"670d7503-5cc2-4674-82bc-ae0ea0ecdf36\") " Dec 09 11:28:05 crc kubenswrapper[5002]: I1209 11:28:05.302066 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/670d7503-5cc2-4674-82bc-ae0ea0ecdf36-scripts\") pod \"670d7503-5cc2-4674-82bc-ae0ea0ecdf36\" (UID: \"670d7503-5cc2-4674-82bc-ae0ea0ecdf36\") " Dec 09 11:28:05 crc kubenswrapper[5002]: I1209 11:28:05.302107 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/670d7503-5cc2-4674-82bc-ae0ea0ecdf36-fernet-keys\") pod \"670d7503-5cc2-4674-82bc-ae0ea0ecdf36\" (UID: \"670d7503-5cc2-4674-82bc-ae0ea0ecdf36\") " Dec 09 11:28:05 crc kubenswrapper[5002]: I1209 11:28:05.302211 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/670d7503-5cc2-4674-82bc-ae0ea0ecdf36-config-data\") pod \"670d7503-5cc2-4674-82bc-ae0ea0ecdf36\" (UID: \"670d7503-5cc2-4674-82bc-ae0ea0ecdf36\") " Dec 09 11:28:05 crc kubenswrapper[5002]: I1209 11:28:05.302290 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9b74n\" (UniqueName: \"kubernetes.io/projected/670d7503-5cc2-4674-82bc-ae0ea0ecdf36-kube-api-access-9b74n\") pod \"670d7503-5cc2-4674-82bc-ae0ea0ecdf36\" (UID: \"670d7503-5cc2-4674-82bc-ae0ea0ecdf36\") " Dec 09 11:28:05 crc kubenswrapper[5002]: I1209 11:28:05.302330 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/670d7503-5cc2-4674-82bc-ae0ea0ecdf36-combined-ca-bundle\") pod \"670d7503-5cc2-4674-82bc-ae0ea0ecdf36\" (UID: \"670d7503-5cc2-4674-82bc-ae0ea0ecdf36\") " Dec 09 11:28:05 crc kubenswrapper[5002]: I1209 11:28:05.308677 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/670d7503-5cc2-4674-82bc-ae0ea0ecdf36-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "670d7503-5cc2-4674-82bc-ae0ea0ecdf36" (UID: "670d7503-5cc2-4674-82bc-ae0ea0ecdf36"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:28:05 crc kubenswrapper[5002]: I1209 11:28:05.309578 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/670d7503-5cc2-4674-82bc-ae0ea0ecdf36-kube-api-access-9b74n" (OuterVolumeSpecName: "kube-api-access-9b74n") pod "670d7503-5cc2-4674-82bc-ae0ea0ecdf36" (UID: "670d7503-5cc2-4674-82bc-ae0ea0ecdf36"). InnerVolumeSpecName "kube-api-access-9b74n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:28:05 crc kubenswrapper[5002]: I1209 11:28:05.310596 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/670d7503-5cc2-4674-82bc-ae0ea0ecdf36-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "670d7503-5cc2-4674-82bc-ae0ea0ecdf36" (UID: "670d7503-5cc2-4674-82bc-ae0ea0ecdf36"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:28:05 crc kubenswrapper[5002]: I1209 11:28:05.311383 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/670d7503-5cc2-4674-82bc-ae0ea0ecdf36-scripts" (OuterVolumeSpecName: "scripts") pod "670d7503-5cc2-4674-82bc-ae0ea0ecdf36" (UID: "670d7503-5cc2-4674-82bc-ae0ea0ecdf36"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:28:05 crc kubenswrapper[5002]: I1209 11:28:05.328157 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/670d7503-5cc2-4674-82bc-ae0ea0ecdf36-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "670d7503-5cc2-4674-82bc-ae0ea0ecdf36" (UID: "670d7503-5cc2-4674-82bc-ae0ea0ecdf36"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:28:05 crc kubenswrapper[5002]: I1209 11:28:05.330808 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/670d7503-5cc2-4674-82bc-ae0ea0ecdf36-config-data" (OuterVolumeSpecName: "config-data") pod "670d7503-5cc2-4674-82bc-ae0ea0ecdf36" (UID: "670d7503-5cc2-4674-82bc-ae0ea0ecdf36"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:28:05 crc kubenswrapper[5002]: I1209 11:28:05.406057 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/670d7503-5cc2-4674-82bc-ae0ea0ecdf36-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:28:05 crc kubenswrapper[5002]: I1209 11:28:05.406113 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9b74n\" (UniqueName: \"kubernetes.io/projected/670d7503-5cc2-4674-82bc-ae0ea0ecdf36-kube-api-access-9b74n\") on node \"crc\" DevicePath \"\"" Dec 09 11:28:05 crc kubenswrapper[5002]: I1209 11:28:05.406132 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/670d7503-5cc2-4674-82bc-ae0ea0ecdf36-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:28:05 crc kubenswrapper[5002]: I1209 11:28:05.406147 5002 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/670d7503-5cc2-4674-82bc-ae0ea0ecdf36-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 09 11:28:05 crc kubenswrapper[5002]: I1209 11:28:05.406161 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/670d7503-5cc2-4674-82bc-ae0ea0ecdf36-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:28:05 crc kubenswrapper[5002]: I1209 11:28:05.406176 5002 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/670d7503-5cc2-4674-82bc-ae0ea0ecdf36-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 09 11:28:05 crc kubenswrapper[5002]: I1209 11:28:05.863360 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tb74h" event={"ID":"670d7503-5cc2-4674-82bc-ae0ea0ecdf36","Type":"ContainerDied","Data":"b5d9b4f0900376158c5bf391bf4bb71b8019bc62ec3444d95add0573ed15febc"} Dec 09 11:28:05 crc kubenswrapper[5002]: I1209 11:28:05.864338 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5d9b4f0900376158c5bf391bf4bb71b8019bc62ec3444d95add0573ed15febc" Dec 09 11:28:05 crc kubenswrapper[5002]: I1209 11:28:05.864208 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-tb74h" Dec 09 11:28:06 crc kubenswrapper[5002]: I1209 11:28:06.035841 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-tb74h"] Dec 09 11:28:06 crc kubenswrapper[5002]: I1209 11:28:06.042775 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-tb74h"] Dec 09 11:28:06 crc kubenswrapper[5002]: I1209 11:28:06.069865 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="670d7503-5cc2-4674-82bc-ae0ea0ecdf36" path="/var/lib/kubelet/pods/670d7503-5cc2-4674-82bc-ae0ea0ecdf36/volumes" Dec 09 11:28:06 crc kubenswrapper[5002]: I1209 11:28:06.139597 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-22c2h"] Dec 09 11:28:06 crc kubenswrapper[5002]: E1209 11:28:06.140119 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="670d7503-5cc2-4674-82bc-ae0ea0ecdf36" containerName="keystone-bootstrap" Dec 09 11:28:06 crc kubenswrapper[5002]: I1209 11:28:06.140151 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="670d7503-5cc2-4674-82bc-ae0ea0ecdf36" containerName="keystone-bootstrap" Dec 09 11:28:06 crc kubenswrapper[5002]: I1209 11:28:06.140423 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="670d7503-5cc2-4674-82bc-ae0ea0ecdf36" containerName="keystone-bootstrap" Dec 09 11:28:06 crc kubenswrapper[5002]: I1209 11:28:06.141468 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-22c2h" Dec 09 11:28:06 crc kubenswrapper[5002]: I1209 11:28:06.143565 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 09 11:28:06 crc kubenswrapper[5002]: I1209 11:28:06.144121 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 09 11:28:06 crc kubenswrapper[5002]: I1209 11:28:06.144185 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-srzzf" Dec 09 11:28:06 crc kubenswrapper[5002]: I1209 11:28:06.144694 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 09 11:28:06 crc kubenswrapper[5002]: I1209 11:28:06.144717 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 09 11:28:06 crc kubenswrapper[5002]: I1209 11:28:06.155189 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-22c2h"] Dec 09 11:28:06 crc kubenswrapper[5002]: I1209 11:28:06.218711 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cc546f3-0611-40c8-96bd-04cf184f0ff2-config-data\") pod \"keystone-bootstrap-22c2h\" (UID: \"7cc546f3-0611-40c8-96bd-04cf184f0ff2\") " pod="openstack/keystone-bootstrap-22c2h" Dec 09 11:28:06 crc kubenswrapper[5002]: I1209 11:28:06.218783 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cc546f3-0611-40c8-96bd-04cf184f0ff2-scripts\") pod \"keystone-bootstrap-22c2h\" (UID: \"7cc546f3-0611-40c8-96bd-04cf184f0ff2\") " pod="openstack/keystone-bootstrap-22c2h" Dec 09 11:28:06 crc kubenswrapper[5002]: I1209 11:28:06.219015 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bxbk\" (UniqueName: 
\"kubernetes.io/projected/7cc546f3-0611-40c8-96bd-04cf184f0ff2-kube-api-access-9bxbk\") pod \"keystone-bootstrap-22c2h\" (UID: \"7cc546f3-0611-40c8-96bd-04cf184f0ff2\") " pod="openstack/keystone-bootstrap-22c2h" Dec 09 11:28:06 crc kubenswrapper[5002]: I1209 11:28:06.219290 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7cc546f3-0611-40c8-96bd-04cf184f0ff2-credential-keys\") pod \"keystone-bootstrap-22c2h\" (UID: \"7cc546f3-0611-40c8-96bd-04cf184f0ff2\") " pod="openstack/keystone-bootstrap-22c2h" Dec 09 11:28:06 crc kubenswrapper[5002]: I1209 11:28:06.219370 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cc546f3-0611-40c8-96bd-04cf184f0ff2-combined-ca-bundle\") pod \"keystone-bootstrap-22c2h\" (UID: \"7cc546f3-0611-40c8-96bd-04cf184f0ff2\") " pod="openstack/keystone-bootstrap-22c2h" Dec 09 11:28:06 crc kubenswrapper[5002]: I1209 11:28:06.219464 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7cc546f3-0611-40c8-96bd-04cf184f0ff2-fernet-keys\") pod \"keystone-bootstrap-22c2h\" (UID: \"7cc546f3-0611-40c8-96bd-04cf184f0ff2\") " pod="openstack/keystone-bootstrap-22c2h" Dec 09 11:28:06 crc kubenswrapper[5002]: I1209 11:28:06.320887 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bxbk\" (UniqueName: \"kubernetes.io/projected/7cc546f3-0611-40c8-96bd-04cf184f0ff2-kube-api-access-9bxbk\") pod \"keystone-bootstrap-22c2h\" (UID: \"7cc546f3-0611-40c8-96bd-04cf184f0ff2\") " pod="openstack/keystone-bootstrap-22c2h" Dec 09 11:28:06 crc kubenswrapper[5002]: I1209 11:28:06.321029 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7cc546f3-0611-40c8-96bd-04cf184f0ff2-credential-keys\") pod \"keystone-bootstrap-22c2h\" (UID: \"7cc546f3-0611-40c8-96bd-04cf184f0ff2\") " pod="openstack/keystone-bootstrap-22c2h" Dec 09 11:28:06 crc kubenswrapper[5002]: I1209 11:28:06.321063 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cc546f3-0611-40c8-96bd-04cf184f0ff2-combined-ca-bundle\") pod \"keystone-bootstrap-22c2h\" (UID: \"7cc546f3-0611-40c8-96bd-04cf184f0ff2\") " pod="openstack/keystone-bootstrap-22c2h" Dec 09 11:28:06 crc kubenswrapper[5002]: I1209 11:28:06.321101 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7cc546f3-0611-40c8-96bd-04cf184f0ff2-fernet-keys\") pod \"keystone-bootstrap-22c2h\" (UID: \"7cc546f3-0611-40c8-96bd-04cf184f0ff2\") " pod="openstack/keystone-bootstrap-22c2h" Dec 09 11:28:06 crc kubenswrapper[5002]: I1209 11:28:06.321136 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cc546f3-0611-40c8-96bd-04cf184f0ff2-config-data\") pod \"keystone-bootstrap-22c2h\" (UID: \"7cc546f3-0611-40c8-96bd-04cf184f0ff2\") " pod="openstack/keystone-bootstrap-22c2h" Dec 09 11:28:06 crc kubenswrapper[5002]: I1209 11:28:06.321158 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cc546f3-0611-40c8-96bd-04cf184f0ff2-scripts\") 
pod \"keystone-bootstrap-22c2h\" (UID: \"7cc546f3-0611-40c8-96bd-04cf184f0ff2\") " pod="openstack/keystone-bootstrap-22c2h" Dec 09 11:28:06 crc kubenswrapper[5002]: I1209 11:28:06.325445 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cc546f3-0611-40c8-96bd-04cf184f0ff2-scripts\") pod \"keystone-bootstrap-22c2h\" (UID: \"7cc546f3-0611-40c8-96bd-04cf184f0ff2\") " pod="openstack/keystone-bootstrap-22c2h" Dec 09 11:28:06 crc kubenswrapper[5002]: I1209 11:28:06.325561 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cc546f3-0611-40c8-96bd-04cf184f0ff2-combined-ca-bundle\") pod \"keystone-bootstrap-22c2h\" (UID: \"7cc546f3-0611-40c8-96bd-04cf184f0ff2\") " pod="openstack/keystone-bootstrap-22c2h" Dec 09 11:28:06 crc kubenswrapper[5002]: I1209 11:28:06.325723 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cc546f3-0611-40c8-96bd-04cf184f0ff2-config-data\") pod \"keystone-bootstrap-22c2h\" (UID: \"7cc546f3-0611-40c8-96bd-04cf184f0ff2\") " pod="openstack/keystone-bootstrap-22c2h" Dec 09 11:28:06 crc kubenswrapper[5002]: I1209 11:28:06.325734 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7cc546f3-0611-40c8-96bd-04cf184f0ff2-credential-keys\") pod \"keystone-bootstrap-22c2h\" (UID: \"7cc546f3-0611-40c8-96bd-04cf184f0ff2\") " pod="openstack/keystone-bootstrap-22c2h" Dec 09 11:28:06 crc kubenswrapper[5002]: I1209 11:28:06.335735 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7cc546f3-0611-40c8-96bd-04cf184f0ff2-fernet-keys\") pod \"keystone-bootstrap-22c2h\" (UID: \"7cc546f3-0611-40c8-96bd-04cf184f0ff2\") " pod="openstack/keystone-bootstrap-22c2h" Dec 09 11:28:06 crc kubenswrapper[5002]: I1209 11:28:06.336218 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bxbk\" (UniqueName: \"kubernetes.io/projected/7cc546f3-0611-40c8-96bd-04cf184f0ff2-kube-api-access-9bxbk\") pod \"keystone-bootstrap-22c2h\" (UID: \"7cc546f3-0611-40c8-96bd-04cf184f0ff2\") " pod="openstack/keystone-bootstrap-22c2h" Dec 09 11:28:06 crc kubenswrapper[5002]: I1209 11:28:06.473132 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-22c2h" Dec 09 11:28:06 crc kubenswrapper[5002]: I1209 11:28:06.900040 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-22c2h"] Dec 09 11:28:06 crc kubenswrapper[5002]: W1209 11:28:06.902152 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7cc546f3_0611_40c8_96bd_04cf184f0ff2.slice/crio-f02ef834c1c8e7fa4ee55c050c98e1724b24912c48b08642e091ad02e236ec68 WatchSource:0}: Error finding container f02ef834c1c8e7fa4ee55c050c98e1724b24912c48b08642e091ad02e236ec68: Status 404 returned error can't find the container with id f02ef834c1c8e7fa4ee55c050c98e1724b24912c48b08642e091ad02e236ec68 Dec 09 11:28:07 crc kubenswrapper[5002]: I1209 11:28:07.885334 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-22c2h" event={"ID":"7cc546f3-0611-40c8-96bd-04cf184f0ff2","Type":"ContainerStarted","Data":"974269e3701da7e1cff1171d1be5ffcfb4a1fc19631b0a3628e4c9713bc1afcd"} Dec 09 11:28:07 crc kubenswrapper[5002]: I1209 11:28:07.885654 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-22c2h" event={"ID":"7cc546f3-0611-40c8-96bd-04cf184f0ff2","Type":"ContainerStarted","Data":"f02ef834c1c8e7fa4ee55c050c98e1724b24912c48b08642e091ad02e236ec68"} Dec 09 11:28:07 crc kubenswrapper[5002]: I1209 11:28:07.908637 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-22c2h" podStartSLOduration=1.9086130319999999 podStartE2EDuration="1.908613032s" podCreationTimestamp="2025-12-09 11:28:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:28:07.904450811 +0000 UTC m=+5220.296501882" watchObservedRunningTime="2025-12-09 11:28:07.908613032 +0000 UTC m=+5220.300664123" Dec 09 11:28:07 crc kubenswrapper[5002]: I1209 11:28:07.964592 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 11:28:07 crc kubenswrapper[5002]: I1209 11:28:07.964655 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 11:28:09 crc kubenswrapper[5002]: I1209 11:28:09.345978 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58865cd75-7vg59" Dec 09 11:28:09 crc kubenswrapper[5002]: I1209 11:28:09.405238 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-864bc46885-4w56l"] Dec 09 11:28:09 crc kubenswrapper[5002]: I1209 11:28:09.406236 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-864bc46885-4w56l" podUID="b7215792-0269-4b4d-8733-5e8808c0efc8" containerName="dnsmasq-dns" containerID="cri-o://7cdb50b4a5d5906521ec696869814b394b76d874a14290aa3f6a461853c5546f" gracePeriod=10 Dec 09 11:28:09 crc kubenswrapper[5002]: I1209 11:28:09.865792 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-864bc46885-4w56l" Dec 09 11:28:09 crc kubenswrapper[5002]: I1209 11:28:09.913487 5002 generic.go:334] "Generic (PLEG): container finished" podID="b7215792-0269-4b4d-8733-5e8808c0efc8" containerID="7cdb50b4a5d5906521ec696869814b394b76d874a14290aa3f6a461853c5546f" exitCode=0 Dec 09 11:28:09 crc kubenswrapper[5002]: I1209 11:28:09.913664 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864bc46885-4w56l" event={"ID":"b7215792-0269-4b4d-8733-5e8808c0efc8","Type":"ContainerDied","Data":"7cdb50b4a5d5906521ec696869814b394b76d874a14290aa3f6a461853c5546f"} Dec 09 11:28:09 crc kubenswrapper[5002]: I1209 11:28:09.913854 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864bc46885-4w56l" event={"ID":"b7215792-0269-4b4d-8733-5e8808c0efc8","Type":"ContainerDied","Data":"f8b7a347903cc2f243cb9f3d85ab8d4f254906df9cbfb4227e12a0d5df923783"} Dec 09 11:28:09 crc kubenswrapper[5002]: I1209 11:28:09.913883 5002 scope.go:117] "RemoveContainer" containerID="7cdb50b4a5d5906521ec696869814b394b76d874a14290aa3f6a461853c5546f" Dec 09 11:28:09 crc kubenswrapper[5002]: I1209 11:28:09.913772 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-864bc46885-4w56l" Dec 09 11:28:09 crc kubenswrapper[5002]: I1209 11:28:09.936839 5002 scope.go:117] "RemoveContainer" containerID="d7d1634435eb72c2793ba351e988fc62498d6363d7cf86db28e0f92c09733f57" Dec 09 11:28:09 crc kubenswrapper[5002]: I1209 11:28:09.960222 5002 scope.go:117] "RemoveContainer" containerID="7cdb50b4a5d5906521ec696869814b394b76d874a14290aa3f6a461853c5546f" Dec 09 11:28:09 crc kubenswrapper[5002]: E1209 11:28:09.960664 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cdb50b4a5d5906521ec696869814b394b76d874a14290aa3f6a461853c5546f\": container with ID starting with 7cdb50b4a5d5906521ec696869814b394b76d874a14290aa3f6a461853c5546f not found: ID does not exist" containerID="7cdb50b4a5d5906521ec696869814b394b76d874a14290aa3f6a461853c5546f" Dec 09 11:28:09 crc kubenswrapper[5002]: I1209 11:28:09.960705 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cdb50b4a5d5906521ec696869814b394b76d874a14290aa3f6a461853c5546f"} err="failed to get container status \"7cdb50b4a5d5906521ec696869814b394b76d874a14290aa3f6a461853c5546f\": rpc error: code = NotFound desc = could not find container \"7cdb50b4a5d5906521ec696869814b394b76d874a14290aa3f6a461853c5546f\": container with ID starting with 7cdb50b4a5d5906521ec696869814b394b76d874a14290aa3f6a461853c5546f not found: ID does not exist" Dec 09 11:28:09 crc kubenswrapper[5002]: I1209 11:28:09.960730 5002 scope.go:117] "RemoveContainer" containerID="d7d1634435eb72c2793ba351e988fc62498d6363d7cf86db28e0f92c09733f57" Dec 09 11:28:09 crc kubenswrapper[5002]: E1209 11:28:09.960983 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7d1634435eb72c2793ba351e988fc62498d6363d7cf86db28e0f92c09733f57\": container with ID starting with d7d1634435eb72c2793ba351e988fc62498d6363d7cf86db28e0f92c09733f57 not found: ID does not exist" containerID="d7d1634435eb72c2793ba351e988fc62498d6363d7cf86db28e0f92c09733f57" Dec 09 11:28:09 crc kubenswrapper[5002]: I1209 11:28:09.960999 5002 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d7d1634435eb72c2793ba351e988fc62498d6363d7cf86db28e0f92c09733f57"} err="failed to get container status \"d7d1634435eb72c2793ba351e988fc62498d6363d7cf86db28e0f92c09733f57\": rpc error: code = NotFound desc = could not find container \"d7d1634435eb72c2793ba351e988fc62498d6363d7cf86db28e0f92c09733f57\": container with ID starting with d7d1634435eb72c2793ba351e988fc62498d6363d7cf86db28e0f92c09733f57 not found: ID does not exist" Dec 09 11:28:09 crc kubenswrapper[5002]: I1209 11:28:09.984209 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7215792-0269-4b4d-8733-5e8808c0efc8-dns-svc\") pod \"b7215792-0269-4b4d-8733-5e8808c0efc8\" (UID: \"b7215792-0269-4b4d-8733-5e8808c0efc8\") " Dec 09 11:28:09 crc kubenswrapper[5002]: I1209 11:28:09.984297 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b7215792-0269-4b4d-8733-5e8808c0efc8-ovsdbserver-sb\") pod \"b7215792-0269-4b4d-8733-5e8808c0efc8\" (UID: \"b7215792-0269-4b4d-8733-5e8808c0efc8\") " Dec 09 11:28:09 crc kubenswrapper[5002]: I1209 11:28:09.984332 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b7215792-0269-4b4d-8733-5e8808c0efc8-ovsdbserver-nb\") pod \"b7215792-0269-4b4d-8733-5e8808c0efc8\" (UID: \"b7215792-0269-4b4d-8733-5e8808c0efc8\") " Dec 09 11:28:09 crc kubenswrapper[5002]: I1209 11:28:09.984360 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6nh8\" (UniqueName: \"kubernetes.io/projected/b7215792-0269-4b4d-8733-5e8808c0efc8-kube-api-access-h6nh8\") pod \"b7215792-0269-4b4d-8733-5e8808c0efc8\" (UID: \"b7215792-0269-4b4d-8733-5e8808c0efc8\") " Dec 09 11:28:09 crc kubenswrapper[5002]: I1209 11:28:09.984450 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7215792-0269-4b4d-8733-5e8808c0efc8-config\") pod \"b7215792-0269-4b4d-8733-5e8808c0efc8\" (UID: \"b7215792-0269-4b4d-8733-5e8808c0efc8\") " Dec 09 11:28:09 crc kubenswrapper[5002]: I1209 11:28:09.990669 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7215792-0269-4b4d-8733-5e8808c0efc8-kube-api-access-h6nh8" (OuterVolumeSpecName: "kube-api-access-h6nh8") pod "b7215792-0269-4b4d-8733-5e8808c0efc8" (UID: "b7215792-0269-4b4d-8733-5e8808c0efc8"). InnerVolumeSpecName "kube-api-access-h6nh8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:28:10 crc kubenswrapper[5002]: I1209 11:28:10.030196 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7215792-0269-4b4d-8733-5e8808c0efc8-config" (OuterVolumeSpecName: "config") pod "b7215792-0269-4b4d-8733-5e8808c0efc8" (UID: "b7215792-0269-4b4d-8733-5e8808c0efc8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:28:10 crc kubenswrapper[5002]: I1209 11:28:10.031324 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7215792-0269-4b4d-8733-5e8808c0efc8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b7215792-0269-4b4d-8733-5e8808c0efc8" (UID: "b7215792-0269-4b4d-8733-5e8808c0efc8"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:28:10 crc kubenswrapper[5002]: I1209 11:28:10.032516 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7215792-0269-4b4d-8733-5e8808c0efc8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b7215792-0269-4b4d-8733-5e8808c0efc8" (UID: "b7215792-0269-4b4d-8733-5e8808c0efc8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:28:10 crc kubenswrapper[5002]: I1209 11:28:10.040110 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7215792-0269-4b4d-8733-5e8808c0efc8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b7215792-0269-4b4d-8733-5e8808c0efc8" (UID: "b7215792-0269-4b4d-8733-5e8808c0efc8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:28:10 crc kubenswrapper[5002]: I1209 11:28:10.085656 5002 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7215792-0269-4b4d-8733-5e8808c0efc8-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 11:28:10 crc kubenswrapper[5002]: I1209 11:28:10.085680 5002 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b7215792-0269-4b4d-8733-5e8808c0efc8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 09 11:28:10 crc kubenswrapper[5002]: I1209 11:28:10.085689 5002 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b7215792-0269-4b4d-8733-5e8808c0efc8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 11:28:10 crc kubenswrapper[5002]: I1209 11:28:10.085699 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6nh8\" (UniqueName: \"kubernetes.io/projected/b7215792-0269-4b4d-8733-5e8808c0efc8-kube-api-access-h6nh8\") on node \"crc\" DevicePath \"\"" Dec 09 11:28:10 crc kubenswrapper[5002]: I1209 11:28:10.085707 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7215792-0269-4b4d-8733-5e8808c0efc8-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:28:10 crc kubenswrapper[5002]: I1209 11:28:10.239244 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-864bc46885-4w56l"] Dec 09 11:28:10 crc kubenswrapper[5002]: I1209 11:28:10.246139 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-864bc46885-4w56l"] Dec 09 11:28:10 crc kubenswrapper[5002]: I1209 11:28:10.925682 5002 generic.go:334] "Generic (PLEG): container finished" podID="7cc546f3-0611-40c8-96bd-04cf184f0ff2" containerID="974269e3701da7e1cff1171d1be5ffcfb4a1fc19631b0a3628e4c9713bc1afcd" exitCode=0 Dec 09 11:28:10 crc kubenswrapper[5002]: I1209 11:28:10.925745 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-22c2h" event={"ID":"7cc546f3-0611-40c8-96bd-04cf184f0ff2","Type":"ContainerDied","Data":"974269e3701da7e1cff1171d1be5ffcfb4a1fc19631b0a3628e4c9713bc1afcd"} Dec 09 11:28:12 crc kubenswrapper[5002]: I1209 11:28:12.073912 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7215792-0269-4b4d-8733-5e8808c0efc8" path="/var/lib/kubelet/pods/b7215792-0269-4b4d-8733-5e8808c0efc8/volumes" Dec 09 11:28:12 crc kubenswrapper[5002]: I1209 11:28:12.289151 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-22c2h" Dec 09 11:28:12 crc kubenswrapper[5002]: I1209 11:28:12.424428 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7cc546f3-0611-40c8-96bd-04cf184f0ff2-credential-keys\") pod \"7cc546f3-0611-40c8-96bd-04cf184f0ff2\" (UID: \"7cc546f3-0611-40c8-96bd-04cf184f0ff2\") " Dec 09 11:28:12 crc kubenswrapper[5002]: I1209 11:28:12.424472 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cc546f3-0611-40c8-96bd-04cf184f0ff2-combined-ca-bundle\") pod \"7cc546f3-0611-40c8-96bd-04cf184f0ff2\" (UID: \"7cc546f3-0611-40c8-96bd-04cf184f0ff2\") " Dec 09 11:28:12 crc kubenswrapper[5002]: I1209 11:28:12.424590 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7cc546f3-0611-40c8-96bd-04cf184f0ff2-fernet-keys\") pod \"7cc546f3-0611-40c8-96bd-04cf184f0ff2\" (UID: \"7cc546f3-0611-40c8-96bd-04cf184f0ff2\") " Dec 09 11:28:12 crc kubenswrapper[5002]: I1209 11:28:12.424664 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cc546f3-0611-40c8-96bd-04cf184f0ff2-scripts\") pod \"7cc546f3-0611-40c8-96bd-04cf184f0ff2\" (UID: \"7cc546f3-0611-40c8-96bd-04cf184f0ff2\") " Dec 09 11:28:12 crc kubenswrapper[5002]: I1209 11:28:12.424683 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bxbk\" (UniqueName: \"kubernetes.io/projected/7cc546f3-0611-40c8-96bd-04cf184f0ff2-kube-api-access-9bxbk\") pod \"7cc546f3-0611-40c8-96bd-04cf184f0ff2\" (UID: \"7cc546f3-0611-40c8-96bd-04cf184f0ff2\") " Dec 09 11:28:12 crc kubenswrapper[5002]: I1209 11:28:12.424723 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cc546f3-0611-40c8-96bd-04cf184f0ff2-config-data\") pod \"7cc546f3-0611-40c8-96bd-04cf184f0ff2\" (UID: \"7cc546f3-0611-40c8-96bd-04cf184f0ff2\") " Dec 09 11:28:12 crc kubenswrapper[5002]: I1209 11:28:12.429695 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cc546f3-0611-40c8-96bd-04cf184f0ff2-scripts" (OuterVolumeSpecName: "scripts") pod "7cc546f3-0611-40c8-96bd-04cf184f0ff2" (UID: "7cc546f3-0611-40c8-96bd-04cf184f0ff2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:28:12 crc kubenswrapper[5002]: I1209 11:28:12.430185 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cc546f3-0611-40c8-96bd-04cf184f0ff2-kube-api-access-9bxbk" (OuterVolumeSpecName: "kube-api-access-9bxbk") pod "7cc546f3-0611-40c8-96bd-04cf184f0ff2" (UID: "7cc546f3-0611-40c8-96bd-04cf184f0ff2"). InnerVolumeSpecName "kube-api-access-9bxbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:28:12 crc kubenswrapper[5002]: I1209 11:28:12.430794 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cc546f3-0611-40c8-96bd-04cf184f0ff2-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "7cc546f3-0611-40c8-96bd-04cf184f0ff2" (UID: "7cc546f3-0611-40c8-96bd-04cf184f0ff2"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:28:12 crc kubenswrapper[5002]: I1209 11:28:12.438802 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cc546f3-0611-40c8-96bd-04cf184f0ff2-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "7cc546f3-0611-40c8-96bd-04cf184f0ff2" (UID: "7cc546f3-0611-40c8-96bd-04cf184f0ff2"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:28:12 crc kubenswrapper[5002]: I1209 11:28:12.446482 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cc546f3-0611-40c8-96bd-04cf184f0ff2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7cc546f3-0611-40c8-96bd-04cf184f0ff2" (UID: "7cc546f3-0611-40c8-96bd-04cf184f0ff2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:28:12 crc kubenswrapper[5002]: I1209 11:28:12.470396 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cc546f3-0611-40c8-96bd-04cf184f0ff2-config-data" (OuterVolumeSpecName: "config-data") pod "7cc546f3-0611-40c8-96bd-04cf184f0ff2" (UID: "7cc546f3-0611-40c8-96bd-04cf184f0ff2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:28:12 crc kubenswrapper[5002]: I1209 11:28:12.527120 5002 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7cc546f3-0611-40c8-96bd-04cf184f0ff2-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 09 11:28:12 crc kubenswrapper[5002]: I1209 11:28:12.527174 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cc546f3-0611-40c8-96bd-04cf184f0ff2-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:28:12 crc kubenswrapper[5002]: I1209 11:28:12.527191 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bxbk\" (UniqueName: \"kubernetes.io/projected/7cc546f3-0611-40c8-96bd-04cf184f0ff2-kube-api-access-9bxbk\") on node \"crc\" DevicePath \"\"" Dec 09 11:28:12 crc kubenswrapper[5002]: I1209 11:28:12.527207 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cc546f3-0611-40c8-96bd-04cf184f0ff2-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:28:12 crc kubenswrapper[5002]: I1209 11:28:12.527223 5002 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7cc546f3-0611-40c8-96bd-04cf184f0ff2-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 09 11:28:12 crc kubenswrapper[5002]: I1209 11:28:12.527239 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cc546f3-0611-40c8-96bd-04cf184f0ff2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:28:12 crc kubenswrapper[5002]: I1209 11:28:12.947032 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-22c2h" event={"ID":"7cc546f3-0611-40c8-96bd-04cf184f0ff2","Type":"ContainerDied","Data":"f02ef834c1c8e7fa4ee55c050c98e1724b24912c48b08642e091ad02e236ec68"} Dec 09 11:28:12 crc kubenswrapper[5002]: I1209 11:28:12.947073 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f02ef834c1c8e7fa4ee55c050c98e1724b24912c48b08642e091ad02e236ec68" Dec 09 11:28:12 crc kubenswrapper[5002]: I1209 11:28:12.947158 5002 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-22c2h" Dec 09 11:28:13 crc kubenswrapper[5002]: I1209 11:28:13.037991 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-55cb897df9-2xg2j"] Dec 09 11:28:13 crc kubenswrapper[5002]: E1209 11:28:13.038688 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7215792-0269-4b4d-8733-5e8808c0efc8" containerName="dnsmasq-dns" Dec 09 11:28:13 crc kubenswrapper[5002]: I1209 11:28:13.038766 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7215792-0269-4b4d-8733-5e8808c0efc8" containerName="dnsmasq-dns" Dec 09 11:28:13 crc kubenswrapper[5002]: E1209 11:28:13.038854 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cc546f3-0611-40c8-96bd-04cf184f0ff2" containerName="keystone-bootstrap" Dec 09 11:28:13 crc kubenswrapper[5002]: I1209 11:28:13.038910 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cc546f3-0611-40c8-96bd-04cf184f0ff2" containerName="keystone-bootstrap" Dec 09 11:28:13 crc kubenswrapper[5002]: E1209 11:28:13.038964 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7215792-0269-4b4d-8733-5e8808c0efc8" containerName="init" Dec 09 11:28:13 crc kubenswrapper[5002]: I1209 11:28:13.039026 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7215792-0269-4b4d-8733-5e8808c0efc8" containerName="init" Dec 09 11:28:13 crc kubenswrapper[5002]: I1209 11:28:13.039241 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cc546f3-0611-40c8-96bd-04cf184f0ff2" containerName="keystone-bootstrap" Dec 09 11:28:13 crc kubenswrapper[5002]: I1209 11:28:13.039319 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7215792-0269-4b4d-8733-5e8808c0efc8" containerName="dnsmasq-dns" Dec 09 11:28:13 crc kubenswrapper[5002]: I1209 11:28:13.039954 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-55cb897df9-2xg2j" Dec 09 11:28:13 crc kubenswrapper[5002]: I1209 11:28:13.042731 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 09 11:28:13 crc kubenswrapper[5002]: I1209 11:28:13.042926 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-srzzf" Dec 09 11:28:13 crc kubenswrapper[5002]: I1209 11:28:13.043178 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 09 11:28:13 crc kubenswrapper[5002]: I1209 11:28:13.046046 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 09 11:28:13 crc kubenswrapper[5002]: I1209 11:28:13.052221 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-55cb897df9-2xg2j"] Dec 09 11:28:13 crc kubenswrapper[5002]: I1209 11:28:13.137893 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1b5c3bc5-7b21-43f4-9172-e5604eb79053-fernet-keys\") pod \"keystone-55cb897df9-2xg2j\" (UID: \"1b5c3bc5-7b21-43f4-9172-e5604eb79053\") " pod="openstack/keystone-55cb897df9-2xg2j" Dec 09 11:28:13 crc kubenswrapper[5002]: I1209 11:28:13.138012 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b5c3bc5-7b21-43f4-9172-e5604eb79053-config-data\") pod \"keystone-55cb897df9-2xg2j\" (UID: \"1b5c3bc5-7b21-43f4-9172-e5604eb79053\") " pod="openstack/keystone-55cb897df9-2xg2j" Dec 09 11:28:13 crc kubenswrapper[5002]: I1209 11:28:13.138040 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8cbp\" (UniqueName: \"kubernetes.io/projected/1b5c3bc5-7b21-43f4-9172-e5604eb79053-kube-api-access-n8cbp\") pod \"keystone-55cb897df9-2xg2j\" (UID: \"1b5c3bc5-7b21-43f4-9172-e5604eb79053\") " pod="openstack/keystone-55cb897df9-2xg2j" Dec 09 11:28:13 crc kubenswrapper[5002]: I1209 11:28:13.138071 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b5c3bc5-7b21-43f4-9172-e5604eb79053-combined-ca-bundle\") pod \"keystone-55cb897df9-2xg2j\" (UID: \"1b5c3bc5-7b21-43f4-9172-e5604eb79053\") " pod="openstack/keystone-55cb897df9-2xg2j" Dec 09 11:28:13 crc kubenswrapper[5002]: I1209 11:28:13.138105 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1b5c3bc5-7b21-43f4-9172-e5604eb79053-credential-keys\") pod \"keystone-55cb897df9-2xg2j\" (UID: \"1b5c3bc5-7b21-43f4-9172-e5604eb79053\") " pod="openstack/keystone-55cb897df9-2xg2j" Dec 09 11:28:13 crc kubenswrapper[5002]: I1209 11:28:13.138137 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b5c3bc5-7b21-43f4-9172-e5604eb79053-scripts\") pod \"keystone-55cb897df9-2xg2j\" (UID: \"1b5c3bc5-7b21-43f4-9172-e5604eb79053\") " pod="openstack/keystone-55cb897df9-2xg2j" Dec 09 11:28:13 crc kubenswrapper[5002]: I1209 11:28:13.239464 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b5c3bc5-7b21-43f4-9172-e5604eb79053-config-data\") pod 
\"keystone-55cb897df9-2xg2j\" (UID: \"1b5c3bc5-7b21-43f4-9172-e5604eb79053\") " pod="openstack/keystone-55cb897df9-2xg2j" Dec 09 11:28:13 crc kubenswrapper[5002]: I1209 11:28:13.239522 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8cbp\" (UniqueName: \"kubernetes.io/projected/1b5c3bc5-7b21-43f4-9172-e5604eb79053-kube-api-access-n8cbp\") pod \"keystone-55cb897df9-2xg2j\" (UID: \"1b5c3bc5-7b21-43f4-9172-e5604eb79053\") " pod="openstack/keystone-55cb897df9-2xg2j" Dec 09 11:28:13 crc kubenswrapper[5002]: I1209 11:28:13.239566 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b5c3bc5-7b21-43f4-9172-e5604eb79053-combined-ca-bundle\") pod \"keystone-55cb897df9-2xg2j\" (UID: \"1b5c3bc5-7b21-43f4-9172-e5604eb79053\") " pod="openstack/keystone-55cb897df9-2xg2j" Dec 09 11:28:13 crc kubenswrapper[5002]: I1209 11:28:13.239606 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1b5c3bc5-7b21-43f4-9172-e5604eb79053-credential-keys\") pod \"keystone-55cb897df9-2xg2j\" (UID: \"1b5c3bc5-7b21-43f4-9172-e5604eb79053\") " pod="openstack/keystone-55cb897df9-2xg2j" Dec 09 11:28:13 crc kubenswrapper[5002]: I1209 11:28:13.239647 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b5c3bc5-7b21-43f4-9172-e5604eb79053-scripts\") pod \"keystone-55cb897df9-2xg2j\" (UID: \"1b5c3bc5-7b21-43f4-9172-e5604eb79053\") " pod="openstack/keystone-55cb897df9-2xg2j" Dec 09 11:28:13 crc kubenswrapper[5002]: I1209 11:28:13.239675 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1b5c3bc5-7b21-43f4-9172-e5604eb79053-fernet-keys\") pod \"keystone-55cb897df9-2xg2j\" (UID: \"1b5c3bc5-7b21-43f4-9172-e5604eb79053\") " pod="openstack/keystone-55cb897df9-2xg2j" Dec 09 11:28:13 crc kubenswrapper[5002]: I1209 11:28:13.243650 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b5c3bc5-7b21-43f4-9172-e5604eb79053-scripts\") pod \"keystone-55cb897df9-2xg2j\" (UID: \"1b5c3bc5-7b21-43f4-9172-e5604eb79053\") " pod="openstack/keystone-55cb897df9-2xg2j" Dec 09 11:28:13 crc kubenswrapper[5002]: I1209 11:28:13.243910 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1b5c3bc5-7b21-43f4-9172-e5604eb79053-credential-keys\") pod \"keystone-55cb897df9-2xg2j\" (UID: \"1b5c3bc5-7b21-43f4-9172-e5604eb79053\") " pod="openstack/keystone-55cb897df9-2xg2j" Dec 09 11:28:13 crc kubenswrapper[5002]: I1209 11:28:13.243973 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b5c3bc5-7b21-43f4-9172-e5604eb79053-combined-ca-bundle\") pod \"keystone-55cb897df9-2xg2j\" (UID: \"1b5c3bc5-7b21-43f4-9172-e5604eb79053\") " pod="openstack/keystone-55cb897df9-2xg2j" Dec 09 11:28:13 crc kubenswrapper[5002]: I1209 11:28:13.244106 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b5c3bc5-7b21-43f4-9172-e5604eb79053-config-data\") pod \"keystone-55cb897df9-2xg2j\" (UID: \"1b5c3bc5-7b21-43f4-9172-e5604eb79053\") " pod="openstack/keystone-55cb897df9-2xg2j" Dec 09 11:28:13 crc kubenswrapper[5002]: I1209 
11:28:13.246142 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1b5c3bc5-7b21-43f4-9172-e5604eb79053-fernet-keys\") pod \"keystone-55cb897df9-2xg2j\" (UID: \"1b5c3bc5-7b21-43f4-9172-e5604eb79053\") " pod="openstack/keystone-55cb897df9-2xg2j" Dec 09 11:28:13 crc kubenswrapper[5002]: I1209 11:28:13.255241 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8cbp\" (UniqueName: \"kubernetes.io/projected/1b5c3bc5-7b21-43f4-9172-e5604eb79053-kube-api-access-n8cbp\") pod \"keystone-55cb897df9-2xg2j\" (UID: \"1b5c3bc5-7b21-43f4-9172-e5604eb79053\") " pod="openstack/keystone-55cb897df9-2xg2j" Dec 09 11:28:13 crc kubenswrapper[5002]: I1209 11:28:13.408704 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-55cb897df9-2xg2j" Dec 09 11:28:13 crc kubenswrapper[5002]: I1209 11:28:13.878292 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-55cb897df9-2xg2j"] Dec 09 11:28:13 crc kubenswrapper[5002]: I1209 11:28:13.960256 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-55cb897df9-2xg2j" event={"ID":"1b5c3bc5-7b21-43f4-9172-e5604eb79053","Type":"ContainerStarted","Data":"c279865026499c31aefa59ed115645ab81e1f4830cbb17c2fa53a141c3c21c73"} Dec 09 11:28:14 crc kubenswrapper[5002]: I1209 11:28:14.968454 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-55cb897df9-2xg2j" event={"ID":"1b5c3bc5-7b21-43f4-9172-e5604eb79053","Type":"ContainerStarted","Data":"6553e4acaa11d95b6b521f9c67537e27968016d30f6443d9dffa9653bec7ef3e"} Dec 09 11:28:14 crc kubenswrapper[5002]: I1209 11:28:14.968767 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-55cb897df9-2xg2j" Dec 09 11:28:15 crc kubenswrapper[5002]: I1209 11:28:15.003709 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-55cb897df9-2xg2j" podStartSLOduration=2.003679078 podStartE2EDuration="2.003679078s" podCreationTimestamp="2025-12-09 11:28:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:28:14.986809937 +0000 UTC m=+5227.378861068" watchObservedRunningTime="2025-12-09 11:28:15.003679078 +0000 UTC m=+5227.395730199" Dec 09 11:28:37 crc kubenswrapper[5002]: I1209 11:28:37.965002 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 11:28:37 crc kubenswrapper[5002]: I1209 11:28:37.966261 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 11:28:44 crc kubenswrapper[5002]: I1209 11:28:44.872714 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-55cb897df9-2xg2j" Dec 09 11:28:45 crc kubenswrapper[5002]: I1209 11:28:45.828464 5002 scope.go:117] "RemoveContainer" containerID="bb51280d69f0d38a25167ca4662f5ad638a23d874ff4dbc668934fa9a1a5cf9b" Dec 09 11:28:45 crc 
kubenswrapper[5002]: I1209 11:28:45.860327 5002 scope.go:117] "RemoveContainer" containerID="44c387a05f789df1d5f06c593f8965b02fb9324150d9aeb353cf6431adc931d5" Dec 09 11:28:45 crc kubenswrapper[5002]: I1209 11:28:45.918845 5002 scope.go:117] "RemoveContainer" containerID="27aea40106f6f495ed696184538025476c29d6c8f42d9ed863f9baca11bb8c2d" Dec 09 11:28:45 crc kubenswrapper[5002]: I1209 11:28:45.970488 5002 scope.go:117] "RemoveContainer" containerID="3d8dfa4b55b6b9e0cf5513ebf04960913f759767090c55b2a1f6dddcc4f7bff0" Dec 09 11:28:46 crc kubenswrapper[5002]: I1209 11:28:46.012963 5002 scope.go:117] "RemoveContainer" containerID="3de0260c7640fc1e04644721698c15389678326747faefc82235ae395beb1afb" Dec 09 11:28:46 crc kubenswrapper[5002]: I1209 11:28:46.042655 5002 scope.go:117] "RemoveContainer" containerID="57854ddb2a9873061bcbb8a6d226bae003ab7d6a442fade6983a2b5804d5f35a" Dec 09 11:28:47 crc kubenswrapper[5002]: I1209 11:28:47.651164 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 09 11:28:47 crc kubenswrapper[5002]: I1209 11:28:47.652785 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 09 11:28:47 crc kubenswrapper[5002]: I1209 11:28:47.656161 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-b6fc9" Dec 09 11:28:47 crc kubenswrapper[5002]: I1209 11:28:47.656215 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 09 11:28:47 crc kubenswrapper[5002]: I1209 11:28:47.656763 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 09 11:28:47 crc kubenswrapper[5002]: I1209 11:28:47.664032 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 09 11:28:47 crc kubenswrapper[5002]: I1209 11:28:47.721129 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7c20bcd3-3ee7-4073-92e9-dd7d79e048b6-openstack-config-secret\") pod \"openstackclient\" (UID: \"7c20bcd3-3ee7-4073-92e9-dd7d79e048b6\") " pod="openstack/openstackclient" Dec 09 11:28:47 crc kubenswrapper[5002]: I1209 11:28:47.721457 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7c20bcd3-3ee7-4073-92e9-dd7d79e048b6-openstack-config\") pod \"openstackclient\" (UID: \"7c20bcd3-3ee7-4073-92e9-dd7d79e048b6\") " pod="openstack/openstackclient" Dec 09 11:28:47 crc kubenswrapper[5002]: I1209 11:28:47.721597 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6wgl\" (UniqueName: \"kubernetes.io/projected/7c20bcd3-3ee7-4073-92e9-dd7d79e048b6-kube-api-access-p6wgl\") pod \"openstackclient\" (UID: \"7c20bcd3-3ee7-4073-92e9-dd7d79e048b6\") " pod="openstack/openstackclient" Dec 09 11:28:47 crc kubenswrapper[5002]: I1209 11:28:47.823382 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7c20bcd3-3ee7-4073-92e9-dd7d79e048b6-openstack-config-secret\") pod \"openstackclient\" (UID: \"7c20bcd3-3ee7-4073-92e9-dd7d79e048b6\") " pod="openstack/openstackclient" Dec 09 11:28:47 crc kubenswrapper[5002]: I1209 11:28:47.823475 5002 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7c20bcd3-3ee7-4073-92e9-dd7d79e048b6-openstack-config\") pod \"openstackclient\" (UID: \"7c20bcd3-3ee7-4073-92e9-dd7d79e048b6\") " pod="openstack/openstackclient" Dec 09 11:28:47 crc kubenswrapper[5002]: I1209 11:28:47.823508 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6wgl\" (UniqueName: \"kubernetes.io/projected/7c20bcd3-3ee7-4073-92e9-dd7d79e048b6-kube-api-access-p6wgl\") pod \"openstackclient\" (UID: \"7c20bcd3-3ee7-4073-92e9-dd7d79e048b6\") " pod="openstack/openstackclient" Dec 09 11:28:47 crc kubenswrapper[5002]: I1209 11:28:47.825206 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7c20bcd3-3ee7-4073-92e9-dd7d79e048b6-openstack-config\") pod \"openstackclient\" (UID: \"7c20bcd3-3ee7-4073-92e9-dd7d79e048b6\") " pod="openstack/openstackclient" Dec 09 11:28:47 crc kubenswrapper[5002]: I1209 11:28:47.831231 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7c20bcd3-3ee7-4073-92e9-dd7d79e048b6-openstack-config-secret\") pod \"openstackclient\" (UID: \"7c20bcd3-3ee7-4073-92e9-dd7d79e048b6\") " pod="openstack/openstackclient" Dec 09 11:28:47 crc kubenswrapper[5002]: I1209 11:28:47.843902 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6wgl\" (UniqueName: \"kubernetes.io/projected/7c20bcd3-3ee7-4073-92e9-dd7d79e048b6-kube-api-access-p6wgl\") pod \"openstackclient\" (UID: \"7c20bcd3-3ee7-4073-92e9-dd7d79e048b6\") " pod="openstack/openstackclient" Dec 09 11:28:48 crc kubenswrapper[5002]: I1209 11:28:48.002699 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 09 11:28:48 crc kubenswrapper[5002]: I1209 11:28:48.566404 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 09 11:28:49 crc kubenswrapper[5002]: I1209 11:28:49.363310 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"7c20bcd3-3ee7-4073-92e9-dd7d79e048b6","Type":"ContainerStarted","Data":"11d3bbafb80743c1c3019622d4bf1eba4fc2afb99b128f8cfabc5b6c598e0ba9"} Dec 09 11:28:49 crc kubenswrapper[5002]: I1209 11:28:49.363909 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"7c20bcd3-3ee7-4073-92e9-dd7d79e048b6","Type":"ContainerStarted","Data":"448af98c7b1f3de443bf7738950f5ddc2c69816f39f88a7f89c2e9338ecabcc0"} Dec 09 11:28:49 crc kubenswrapper[5002]: I1209 11:28:49.387389 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.3873469 podStartE2EDuration="2.3873469s" podCreationTimestamp="2025-12-09 11:28:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:28:49.386529339 +0000 UTC m=+5261.778580430" watchObservedRunningTime="2025-12-09 11:28:49.3873469 +0000 UTC m=+5261.779398021" Dec 09 11:29:07 crc kubenswrapper[5002]: I1209 11:29:07.964615 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 11:29:07 crc kubenswrapper[5002]: I1209 11:29:07.965143 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 11:29:07 crc kubenswrapper[5002]: I1209 11:29:07.965185 5002 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" Dec 09 11:29:07 crc kubenswrapper[5002]: I1209 11:29:07.965832 5002 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"24eda190128d46e2bfa806f4839b38f2462cd8acaa8816efdf9934cf2dc46679"} pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 11:29:07 crc kubenswrapper[5002]: I1209 11:29:07.965878 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" containerID="cri-o://24eda190128d46e2bfa806f4839b38f2462cd8acaa8816efdf9934cf2dc46679" gracePeriod=600 Dec 09 11:29:08 crc kubenswrapper[5002]: E1209 11:29:08.099512 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:29:08 crc kubenswrapper[5002]: I1209 11:29:08.559717 5002 generic.go:334] "Generic (PLEG): container finished" podID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerID="24eda190128d46e2bfa806f4839b38f2462cd8acaa8816efdf9934cf2dc46679" exitCode=0 Dec 09 11:29:08 crc kubenswrapper[5002]: I1209 11:29:08.559810 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerDied","Data":"24eda190128d46e2bfa806f4839b38f2462cd8acaa8816efdf9934cf2dc46679"} Dec 09 11:29:08 crc kubenswrapper[5002]: I1209 11:29:08.559936 5002 scope.go:117] "RemoveContainer" containerID="4dcfb78f9776dceb7a4279696fc5ccc4c505ca97a007b3ec6ade5fcf452969c8" Dec 09 11:29:08 crc kubenswrapper[5002]: I1209 11:29:08.560687 5002 scope.go:117] "RemoveContainer" containerID="24eda190128d46e2bfa806f4839b38f2462cd8acaa8816efdf9934cf2dc46679" Dec 09 11:29:08 crc kubenswrapper[5002]: E1209 11:29:08.561200 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:29:24 crc kubenswrapper[5002]: I1209 11:29:24.061355 5002 scope.go:117] "RemoveContainer" containerID="24eda190128d46e2bfa806f4839b38f2462cd8acaa8816efdf9934cf2dc46679" Dec 09 11:29:24 crc kubenswrapper[5002]: E1209 11:29:24.063095 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:29:38 crc kubenswrapper[5002]: I1209 11:29:38.065078 5002 scope.go:117] "RemoveContainer" containerID="24eda190128d46e2bfa806f4839b38f2462cd8acaa8816efdf9934cf2dc46679" Dec 09 11:29:38 crc kubenswrapper[5002]: E1209 11:29:38.066722 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:29:53 crc kubenswrapper[5002]: I1209 11:29:53.060759 5002 scope.go:117] "RemoveContainer" containerID="24eda190128d46e2bfa806f4839b38f2462cd8acaa8816efdf9934cf2dc46679" Dec 09 11:29:53 crc kubenswrapper[5002]: E1209 11:29:53.061627 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" 
podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:30:00 crc kubenswrapper[5002]: I1209 11:30:00.158403 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421330-gjnvd"] Dec 09 11:30:00 crc kubenswrapper[5002]: I1209 11:30:00.160950 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421330-gjnvd" Dec 09 11:30:00 crc kubenswrapper[5002]: I1209 11:30:00.162863 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 09 11:30:00 crc kubenswrapper[5002]: I1209 11:30:00.163729 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 09 11:30:00 crc kubenswrapper[5002]: I1209 11:30:00.171479 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421330-gjnvd"] Dec 09 11:30:00 crc kubenswrapper[5002]: I1209 11:30:00.283948 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a5df655-0ab1-4b91-8815-ef28939edb6b-secret-volume\") pod \"collect-profiles-29421330-gjnvd\" (UID: \"3a5df655-0ab1-4b91-8815-ef28939edb6b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421330-gjnvd" Dec 09 11:30:00 crc kubenswrapper[5002]: I1209 11:30:00.284015 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bnmg\" (UniqueName: \"kubernetes.io/projected/3a5df655-0ab1-4b91-8815-ef28939edb6b-kube-api-access-8bnmg\") pod \"collect-profiles-29421330-gjnvd\" (UID: \"3a5df655-0ab1-4b91-8815-ef28939edb6b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421330-gjnvd" Dec 09 11:30:00 crc kubenswrapper[5002]: I1209 11:30:00.284036 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a5df655-0ab1-4b91-8815-ef28939edb6b-config-volume\") pod \"collect-profiles-29421330-gjnvd\" (UID: \"3a5df655-0ab1-4b91-8815-ef28939edb6b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421330-gjnvd" Dec 09 11:30:00 crc kubenswrapper[5002]: I1209 11:30:00.386690 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a5df655-0ab1-4b91-8815-ef28939edb6b-secret-volume\") pod \"collect-profiles-29421330-gjnvd\" (UID: \"3a5df655-0ab1-4b91-8815-ef28939edb6b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421330-gjnvd" Dec 09 11:30:00 crc kubenswrapper[5002]: I1209 11:30:00.386884 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bnmg\" (UniqueName: \"kubernetes.io/projected/3a5df655-0ab1-4b91-8815-ef28939edb6b-kube-api-access-8bnmg\") pod \"collect-profiles-29421330-gjnvd\" (UID: \"3a5df655-0ab1-4b91-8815-ef28939edb6b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421330-gjnvd" Dec 09 11:30:00 crc kubenswrapper[5002]: I1209 11:30:00.386942 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a5df655-0ab1-4b91-8815-ef28939edb6b-config-volume\") pod \"collect-profiles-29421330-gjnvd\" (UID: 
\"3a5df655-0ab1-4b91-8815-ef28939edb6b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421330-gjnvd" Dec 09 11:30:00 crc kubenswrapper[5002]: I1209 11:30:00.388994 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a5df655-0ab1-4b91-8815-ef28939edb6b-config-volume\") pod \"collect-profiles-29421330-gjnvd\" (UID: \"3a5df655-0ab1-4b91-8815-ef28939edb6b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421330-gjnvd" Dec 09 11:30:00 crc kubenswrapper[5002]: I1209 11:30:00.400271 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a5df655-0ab1-4b91-8815-ef28939edb6b-secret-volume\") pod \"collect-profiles-29421330-gjnvd\" (UID: \"3a5df655-0ab1-4b91-8815-ef28939edb6b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421330-gjnvd" Dec 09 11:30:00 crc kubenswrapper[5002]: I1209 11:30:00.402385 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bnmg\" (UniqueName: \"kubernetes.io/projected/3a5df655-0ab1-4b91-8815-ef28939edb6b-kube-api-access-8bnmg\") pod \"collect-profiles-29421330-gjnvd\" (UID: \"3a5df655-0ab1-4b91-8815-ef28939edb6b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421330-gjnvd" Dec 09 11:30:00 crc kubenswrapper[5002]: I1209 11:30:00.493882 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421330-gjnvd" Dec 09 11:30:00 crc kubenswrapper[5002]: I1209 11:30:00.952502 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421330-gjnvd"] Dec 09 11:30:01 crc kubenswrapper[5002]: I1209 11:30:01.080625 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421330-gjnvd" event={"ID":"3a5df655-0ab1-4b91-8815-ef28939edb6b","Type":"ContainerStarted","Data":"e9fe6d856fe73da5af53567287aa640409591ebf6bb36bb165f5fc9442e74d5b"} Dec 09 11:30:02 crc kubenswrapper[5002]: I1209 11:30:02.089858 5002 generic.go:334] "Generic (PLEG): container finished" podID="3a5df655-0ab1-4b91-8815-ef28939edb6b" containerID="7540502a9ab8de6a3a0ca4c160bf882c0f2c1d0e46a8d0f7bb9653ac963c8da5" exitCode=0 Dec 09 11:30:02 crc kubenswrapper[5002]: I1209 11:30:02.089904 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421330-gjnvd" event={"ID":"3a5df655-0ab1-4b91-8815-ef28939edb6b","Type":"ContainerDied","Data":"7540502a9ab8de6a3a0ca4c160bf882c0f2c1d0e46a8d0f7bb9653ac963c8da5"} Dec 09 11:30:03 crc kubenswrapper[5002]: I1209 11:30:03.462440 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421330-gjnvd" Dec 09 11:30:03 crc kubenswrapper[5002]: I1209 11:30:03.536361 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bnmg\" (UniqueName: \"kubernetes.io/projected/3a5df655-0ab1-4b91-8815-ef28939edb6b-kube-api-access-8bnmg\") pod \"3a5df655-0ab1-4b91-8815-ef28939edb6b\" (UID: \"3a5df655-0ab1-4b91-8815-ef28939edb6b\") " Dec 09 11:30:03 crc kubenswrapper[5002]: I1209 11:30:03.536524 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a5df655-0ab1-4b91-8815-ef28939edb6b-secret-volume\") pod \"3a5df655-0ab1-4b91-8815-ef28939edb6b\" (UID: \"3a5df655-0ab1-4b91-8815-ef28939edb6b\") " Dec 09 11:30:03 crc kubenswrapper[5002]: I1209 11:30:03.536599 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a5df655-0ab1-4b91-8815-ef28939edb6b-config-volume\") pod \"3a5df655-0ab1-4b91-8815-ef28939edb6b\" (UID: \"3a5df655-0ab1-4b91-8815-ef28939edb6b\") " Dec 09 11:30:03 crc kubenswrapper[5002]: I1209 11:30:03.537901 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a5df655-0ab1-4b91-8815-ef28939edb6b-config-volume" (OuterVolumeSpecName: "config-volume") pod "3a5df655-0ab1-4b91-8815-ef28939edb6b" (UID: "3a5df655-0ab1-4b91-8815-ef28939edb6b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:30:03 crc kubenswrapper[5002]: I1209 11:30:03.546670 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a5df655-0ab1-4b91-8815-ef28939edb6b-kube-api-access-8bnmg" (OuterVolumeSpecName: "kube-api-access-8bnmg") pod "3a5df655-0ab1-4b91-8815-ef28939edb6b" (UID: "3a5df655-0ab1-4b91-8815-ef28939edb6b"). InnerVolumeSpecName "kube-api-access-8bnmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:30:03 crc kubenswrapper[5002]: I1209 11:30:03.546835 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a5df655-0ab1-4b91-8815-ef28939edb6b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3a5df655-0ab1-4b91-8815-ef28939edb6b" (UID: "3a5df655-0ab1-4b91-8815-ef28939edb6b"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:30:03 crc kubenswrapper[5002]: I1209 11:30:03.638113 5002 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a5df655-0ab1-4b91-8815-ef28939edb6b-config-volume\") on node \"crc\" DevicePath \"\"" Dec 09 11:30:03 crc kubenswrapper[5002]: I1209 11:30:03.638147 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bnmg\" (UniqueName: \"kubernetes.io/projected/3a5df655-0ab1-4b91-8815-ef28939edb6b-kube-api-access-8bnmg\") on node \"crc\" DevicePath \"\"" Dec 09 11:30:03 crc kubenswrapper[5002]: I1209 11:30:03.638157 5002 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a5df655-0ab1-4b91-8815-ef28939edb6b-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 09 11:30:04 crc kubenswrapper[5002]: I1209 11:30:04.110124 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421330-gjnvd" event={"ID":"3a5df655-0ab1-4b91-8815-ef28939edb6b","Type":"ContainerDied","Data":"e9fe6d856fe73da5af53567287aa640409591ebf6bb36bb165f5fc9442e74d5b"} Dec 09 11:30:04 crc kubenswrapper[5002]: I1209 11:30:04.110184 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9fe6d856fe73da5af53567287aa640409591ebf6bb36bb165f5fc9442e74d5b" Dec 09 11:30:04 crc kubenswrapper[5002]: I1209 11:30:04.110197 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421330-gjnvd" Dec 09 11:30:04 crc kubenswrapper[5002]: I1209 11:30:04.536907 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421285-gnmlt"] Dec 09 11:30:04 crc kubenswrapper[5002]: I1209 11:30:04.543840 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421285-gnmlt"] Dec 09 11:30:06 crc kubenswrapper[5002]: I1209 11:30:06.075840 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf8ce2eb-0991-40ef-8aa1-2bb668739592" path="/var/lib/kubelet/pods/bf8ce2eb-0991-40ef-8aa1-2bb668739592/volumes" Dec 09 11:30:08 crc kubenswrapper[5002]: I1209 11:30:08.065248 5002 scope.go:117] "RemoveContainer" containerID="24eda190128d46e2bfa806f4839b38f2462cd8acaa8816efdf9934cf2dc46679" Dec 09 11:30:08 crc kubenswrapper[5002]: E1209 11:30:08.065782 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:30:17 crc kubenswrapper[5002]: I1209 11:30:17.222726 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gfn2w"] Dec 09 11:30:17 crc kubenswrapper[5002]: E1209 11:30:17.223868 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a5df655-0ab1-4b91-8815-ef28939edb6b" containerName="collect-profiles" Dec 09 11:30:17 crc kubenswrapper[5002]: I1209 11:30:17.223885 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a5df655-0ab1-4b91-8815-ef28939edb6b" containerName="collect-profiles" Dec 09 11:30:17 crc 
kubenswrapper[5002]: I1209 11:30:17.224092 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a5df655-0ab1-4b91-8815-ef28939edb6b" containerName="collect-profiles" Dec 09 11:30:17 crc kubenswrapper[5002]: I1209 11:30:17.225520 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gfn2w" Dec 09 11:30:17 crc kubenswrapper[5002]: I1209 11:30:17.237787 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gfn2w"] Dec 09 11:30:17 crc kubenswrapper[5002]: I1209 11:30:17.272144 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8666185d-b040-42cd-aa80-e5e38b20109d-catalog-content\") pod \"community-operators-gfn2w\" (UID: \"8666185d-b040-42cd-aa80-e5e38b20109d\") " pod="openshift-marketplace/community-operators-gfn2w" Dec 09 11:30:17 crc kubenswrapper[5002]: I1209 11:30:17.272239 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbkhq\" (UniqueName: \"kubernetes.io/projected/8666185d-b040-42cd-aa80-e5e38b20109d-kube-api-access-xbkhq\") pod \"community-operators-gfn2w\" (UID: \"8666185d-b040-42cd-aa80-e5e38b20109d\") " pod="openshift-marketplace/community-operators-gfn2w" Dec 09 11:30:17 crc kubenswrapper[5002]: I1209 11:30:17.272327 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8666185d-b040-42cd-aa80-e5e38b20109d-utilities\") pod \"community-operators-gfn2w\" (UID: \"8666185d-b040-42cd-aa80-e5e38b20109d\") " pod="openshift-marketplace/community-operators-gfn2w" Dec 09 11:30:17 crc kubenswrapper[5002]: I1209 11:30:17.373283 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8666185d-b040-42cd-aa80-e5e38b20109d-catalog-content\") pod \"community-operators-gfn2w\" (UID: \"8666185d-b040-42cd-aa80-e5e38b20109d\") " pod="openshift-marketplace/community-operators-gfn2w" Dec 09 11:30:17 crc kubenswrapper[5002]: I1209 11:30:17.373359 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbkhq\" (UniqueName: \"kubernetes.io/projected/8666185d-b040-42cd-aa80-e5e38b20109d-kube-api-access-xbkhq\") pod \"community-operators-gfn2w\" (UID: \"8666185d-b040-42cd-aa80-e5e38b20109d\") " pod="openshift-marketplace/community-operators-gfn2w" Dec 09 11:30:17 crc kubenswrapper[5002]: I1209 11:30:17.373403 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8666185d-b040-42cd-aa80-e5e38b20109d-utilities\") pod \"community-operators-gfn2w\" (UID: \"8666185d-b040-42cd-aa80-e5e38b20109d\") " pod="openshift-marketplace/community-operators-gfn2w" Dec 09 11:30:17 crc kubenswrapper[5002]: I1209 11:30:17.373946 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8666185d-b040-42cd-aa80-e5e38b20109d-utilities\") pod \"community-operators-gfn2w\" (UID: \"8666185d-b040-42cd-aa80-e5e38b20109d\") " pod="openshift-marketplace/community-operators-gfn2w" Dec 09 11:30:17 crc kubenswrapper[5002]: I1209 11:30:17.374316 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/8666185d-b040-42cd-aa80-e5e38b20109d-catalog-content\") pod \"community-operators-gfn2w\" (UID: \"8666185d-b040-42cd-aa80-e5e38b20109d\") " pod="openshift-marketplace/community-operators-gfn2w" Dec 09 11:30:17 crc kubenswrapper[5002]: I1209 11:30:17.398965 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbkhq\" (UniqueName: \"kubernetes.io/projected/8666185d-b040-42cd-aa80-e5e38b20109d-kube-api-access-xbkhq\") pod \"community-operators-gfn2w\" (UID: \"8666185d-b040-42cd-aa80-e5e38b20109d\") " pod="openshift-marketplace/community-operators-gfn2w" Dec 09 11:30:17 crc kubenswrapper[5002]: I1209 11:30:17.552048 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gfn2w" Dec 09 11:30:18 crc kubenswrapper[5002]: I1209 11:30:18.115951 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gfn2w"] Dec 09 11:30:18 crc kubenswrapper[5002]: I1209 11:30:18.232296 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gfn2w" event={"ID":"8666185d-b040-42cd-aa80-e5e38b20109d","Type":"ContainerStarted","Data":"687615299a81901d787a7a3dcb3c23e3dda5fa45f66894a42595f034bd4d649f"} Dec 09 11:30:19 crc kubenswrapper[5002]: I1209 11:30:19.244720 5002 generic.go:334] "Generic (PLEG): container finished" podID="8666185d-b040-42cd-aa80-e5e38b20109d" containerID="e05b86f0faa077eb9d20d79d809faecd0b77a1c65a3c6bd51408bb8d58c123db" exitCode=0 Dec 09 11:30:19 crc kubenswrapper[5002]: I1209 11:30:19.245127 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gfn2w" event={"ID":"8666185d-b040-42cd-aa80-e5e38b20109d","Type":"ContainerDied","Data":"e05b86f0faa077eb9d20d79d809faecd0b77a1c65a3c6bd51408bb8d58c123db"} Dec 09 11:30:20 crc kubenswrapper[5002]: I1209 11:30:20.256754 5002 generic.go:334] "Generic (PLEG): container finished" podID="8666185d-b040-42cd-aa80-e5e38b20109d" containerID="41b584b677afbc2f6db01c1dfc1f860ed32d799835dc23514f30c488453fccab" exitCode=0 Dec 09 11:30:20 crc kubenswrapper[5002]: I1209 11:30:20.256889 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gfn2w" event={"ID":"8666185d-b040-42cd-aa80-e5e38b20109d","Type":"ContainerDied","Data":"41b584b677afbc2f6db01c1dfc1f860ed32d799835dc23514f30c488453fccab"} Dec 09 11:30:21 crc kubenswrapper[5002]: I1209 11:30:21.061190 5002 scope.go:117] "RemoveContainer" containerID="24eda190128d46e2bfa806f4839b38f2462cd8acaa8816efdf9934cf2dc46679" Dec 09 11:30:21 crc kubenswrapper[5002]: E1209 11:30:21.061520 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:30:21 crc kubenswrapper[5002]: I1209 11:30:21.266729 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gfn2w" event={"ID":"8666185d-b040-42cd-aa80-e5e38b20109d","Type":"ContainerStarted","Data":"9a2824378a1ad3e33ed8e41e35489cc956ad817c19f5545b58810daa83a69c10"} Dec 09 11:30:21 crc kubenswrapper[5002]: I1209 11:30:21.307541 5002 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gfn2w" podStartSLOduration=2.899789155 podStartE2EDuration="4.30751919s" podCreationTimestamp="2025-12-09 11:30:17 +0000 UTC" firstStartedPulling="2025-12-09 11:30:19.247663611 +0000 UTC m=+5351.639714702" lastFinishedPulling="2025-12-09 11:30:20.655393646 +0000 UTC m=+5353.047444737" observedRunningTime="2025-12-09 11:30:21.306944175 +0000 UTC m=+5353.698995286" watchObservedRunningTime="2025-12-09 11:30:21.30751919 +0000 UTC m=+5353.699570271" Dec 09 11:30:27 crc kubenswrapper[5002]: I1209 11:30:27.552681 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gfn2w" Dec 09 11:30:27 crc kubenswrapper[5002]: I1209 11:30:27.553547 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gfn2w" Dec 09 11:30:27 crc kubenswrapper[5002]: I1209 11:30:27.635591 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gfn2w" Dec 09 11:30:27 crc kubenswrapper[5002]: I1209 11:30:27.846908 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-tdh2j"] Dec 09 11:30:27 crc kubenswrapper[5002]: I1209 11:30:27.848313 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-tdh2j" Dec 09 11:30:27 crc kubenswrapper[5002]: I1209 11:30:27.859829 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-tdh2j"] Dec 09 11:30:27 crc kubenswrapper[5002]: I1209 11:30:27.939407 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-132e-account-create-update-nq22q"] Dec 09 11:30:27 crc kubenswrapper[5002]: I1209 11:30:27.940385 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-132e-account-create-update-nq22q" Dec 09 11:30:27 crc kubenswrapper[5002]: I1209 11:30:27.990569 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 09 11:30:27 crc kubenswrapper[5002]: I1209 11:30:27.992776 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcrl9\" (UniqueName: \"kubernetes.io/projected/7c8f540f-8211-4f60-befd-2b8d255d97aa-kube-api-access-hcrl9\") pod \"barbican-db-create-tdh2j\" (UID: \"7c8f540f-8211-4f60-befd-2b8d255d97aa\") " pod="openstack/barbican-db-create-tdh2j" Dec 09 11:30:27 crc kubenswrapper[5002]: I1209 11:30:27.993024 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c8f540f-8211-4f60-befd-2b8d255d97aa-operator-scripts\") pod \"barbican-db-create-tdh2j\" (UID: \"7c8f540f-8211-4f60-befd-2b8d255d97aa\") " pod="openstack/barbican-db-create-tdh2j" Dec 09 11:30:28 crc kubenswrapper[5002]: I1209 11:30:28.043435 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-132e-account-create-update-nq22q"] Dec 09 11:30:28 crc kubenswrapper[5002]: I1209 11:30:28.101149 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcrl9\" (UniqueName: \"kubernetes.io/projected/7c8f540f-8211-4f60-befd-2b8d255d97aa-kube-api-access-hcrl9\") pod \"barbican-db-create-tdh2j\" (UID: \"7c8f540f-8211-4f60-befd-2b8d255d97aa\") " pod="openstack/barbican-db-create-tdh2j" Dec 09 11:30:28 crc kubenswrapper[5002]: I1209 11:30:28.101218 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c8f540f-8211-4f60-befd-2b8d255d97aa-operator-scripts\") pod \"barbican-db-create-tdh2j\" (UID: \"7c8f540f-8211-4f60-befd-2b8d255d97aa\") " pod="openstack/barbican-db-create-tdh2j" Dec 09 11:30:28 crc kubenswrapper[5002]: I1209 11:30:28.101256 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5254\" (UniqueName: \"kubernetes.io/projected/600f358b-6892-4b2c-8f7a-f1f011c7a1e5-kube-api-access-m5254\") pod \"barbican-132e-account-create-update-nq22q\" (UID: \"600f358b-6892-4b2c-8f7a-f1f011c7a1e5\") " pod="openstack/barbican-132e-account-create-update-nq22q" Dec 09 11:30:28 crc kubenswrapper[5002]: I1209 11:30:28.101282 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/600f358b-6892-4b2c-8f7a-f1f011c7a1e5-operator-scripts\") pod \"barbican-132e-account-create-update-nq22q\" (UID: \"600f358b-6892-4b2c-8f7a-f1f011c7a1e5\") " pod="openstack/barbican-132e-account-create-update-nq22q" Dec 09 11:30:28 crc kubenswrapper[5002]: I1209 11:30:28.102200 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c8f540f-8211-4f60-befd-2b8d255d97aa-operator-scripts\") pod \"barbican-db-create-tdh2j\" (UID: \"7c8f540f-8211-4f60-befd-2b8d255d97aa\") " pod="openstack/barbican-db-create-tdh2j" Dec 09 11:30:28 crc kubenswrapper[5002]: I1209 11:30:28.118380 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcrl9\" (UniqueName: \"kubernetes.io/projected/7c8f540f-8211-4f60-befd-2b8d255d97aa-kube-api-access-hcrl9\") pod 
\"barbican-db-create-tdh2j\" (UID: \"7c8f540f-8211-4f60-befd-2b8d255d97aa\") " pod="openstack/barbican-db-create-tdh2j" Dec 09 11:30:28 crc kubenswrapper[5002]: I1209 11:30:28.203485 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5254\" (UniqueName: \"kubernetes.io/projected/600f358b-6892-4b2c-8f7a-f1f011c7a1e5-kube-api-access-m5254\") pod \"barbican-132e-account-create-update-nq22q\" (UID: \"600f358b-6892-4b2c-8f7a-f1f011c7a1e5\") " pod="openstack/barbican-132e-account-create-update-nq22q" Dec 09 11:30:28 crc kubenswrapper[5002]: I1209 11:30:28.203572 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/600f358b-6892-4b2c-8f7a-f1f011c7a1e5-operator-scripts\") pod \"barbican-132e-account-create-update-nq22q\" (UID: \"600f358b-6892-4b2c-8f7a-f1f011c7a1e5\") " pod="openstack/barbican-132e-account-create-update-nq22q" Dec 09 11:30:28 crc kubenswrapper[5002]: I1209 11:30:28.205673 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/600f358b-6892-4b2c-8f7a-f1f011c7a1e5-operator-scripts\") pod \"barbican-132e-account-create-update-nq22q\" (UID: \"600f358b-6892-4b2c-8f7a-f1f011c7a1e5\") " pod="openstack/barbican-132e-account-create-update-nq22q" Dec 09 11:30:28 crc kubenswrapper[5002]: I1209 11:30:28.224488 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5254\" (UniqueName: \"kubernetes.io/projected/600f358b-6892-4b2c-8f7a-f1f011c7a1e5-kube-api-access-m5254\") pod \"barbican-132e-account-create-update-nq22q\" (UID: \"600f358b-6892-4b2c-8f7a-f1f011c7a1e5\") " pod="openstack/barbican-132e-account-create-update-nq22q" Dec 09 11:30:28 crc kubenswrapper[5002]: I1209 11:30:28.331351 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-tdh2j" Dec 09 11:30:28 crc kubenswrapper[5002]: I1209 11:30:28.338538 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-132e-account-create-update-nq22q" Dec 09 11:30:28 crc kubenswrapper[5002]: I1209 11:30:28.377787 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gfn2w" Dec 09 11:30:28 crc kubenswrapper[5002]: I1209 11:30:28.433072 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gfn2w"] Dec 09 11:30:28 crc kubenswrapper[5002]: I1209 11:30:28.810067 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-132e-account-create-update-nq22q"] Dec 09 11:30:28 crc kubenswrapper[5002]: I1209 11:30:28.902256 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-tdh2j"] Dec 09 11:30:29 crc kubenswrapper[5002]: I1209 11:30:29.334548 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-132e-account-create-update-nq22q" event={"ID":"600f358b-6892-4b2c-8f7a-f1f011c7a1e5","Type":"ContainerStarted","Data":"0bb7b6d3b9f14b2b69c0d19ee3c6187bb8bb1a718105d6809dceec5b568d421e"} Dec 09 11:30:29 crc kubenswrapper[5002]: I1209 11:30:29.336054 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-tdh2j" event={"ID":"7c8f540f-8211-4f60-befd-2b8d255d97aa","Type":"ContainerStarted","Data":"bab06cf694c0ed92b3132788b64b67e5589fed1cf706d50e2dbe3f69dffd82b0"} Dec 09 11:30:30 crc kubenswrapper[5002]: I1209 11:30:30.350335 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gfn2w" podUID="8666185d-b040-42cd-aa80-e5e38b20109d" containerName="registry-server" containerID="cri-o://9a2824378a1ad3e33ed8e41e35489cc956ad817c19f5545b58810daa83a69c10" gracePeriod=2 Dec 09 11:30:31 crc kubenswrapper[5002]: I1209 11:30:31.360935 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-tdh2j" event={"ID":"7c8f540f-8211-4f60-befd-2b8d255d97aa","Type":"ContainerStarted","Data":"661d25efeb4a9e4d58d461b6867ea7e1d8db71eabef039493c4574bb5905533f"} Dec 09 11:30:31 crc kubenswrapper[5002]: I1209 11:30:31.368326 5002 generic.go:334] "Generic (PLEG): container finished" podID="8666185d-b040-42cd-aa80-e5e38b20109d" containerID="9a2824378a1ad3e33ed8e41e35489cc956ad817c19f5545b58810daa83a69c10" exitCode=0 Dec 09 11:30:31 crc kubenswrapper[5002]: I1209 11:30:31.368422 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gfn2w" event={"ID":"8666185d-b040-42cd-aa80-e5e38b20109d","Type":"ContainerDied","Data":"9a2824378a1ad3e33ed8e41e35489cc956ad817c19f5545b58810daa83a69c10"} Dec 09 11:30:31 crc kubenswrapper[5002]: I1209 11:30:31.370731 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-132e-account-create-update-nq22q" event={"ID":"600f358b-6892-4b2c-8f7a-f1f011c7a1e5","Type":"ContainerStarted","Data":"737a0a8e432adfc8cb9edb16c5409db21219565eaef2c6787d5e97490f7d880c"} Dec 09 11:30:31 crc kubenswrapper[5002]: I1209 11:30:31.379684 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-tdh2j" podStartSLOduration=4.37966828 podStartE2EDuration="4.37966828s" podCreationTimestamp="2025-12-09 11:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:30:31.376493325 +0000 UTC m=+5363.768544436" watchObservedRunningTime="2025-12-09 11:30:31.37966828 +0000 UTC 
m=+5363.771719361" Dec 09 11:30:31 crc kubenswrapper[5002]: I1209 11:30:31.397023 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-132e-account-create-update-nq22q" podStartSLOduration=4.397003764 podStartE2EDuration="4.397003764s" podCreationTimestamp="2025-12-09 11:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:30:31.389072371 +0000 UTC m=+5363.781123492" watchObservedRunningTime="2025-12-09 11:30:31.397003764 +0000 UTC m=+5363.789054845" Dec 09 11:30:31 crc kubenswrapper[5002]: I1209 11:30:31.799010 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gfn2w" Dec 09 11:30:31 crc kubenswrapper[5002]: I1209 11:30:31.973517 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8666185d-b040-42cd-aa80-e5e38b20109d-catalog-content\") pod \"8666185d-b040-42cd-aa80-e5e38b20109d\" (UID: \"8666185d-b040-42cd-aa80-e5e38b20109d\") " Dec 09 11:30:31 crc kubenswrapper[5002]: I1209 11:30:31.973618 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbkhq\" (UniqueName: \"kubernetes.io/projected/8666185d-b040-42cd-aa80-e5e38b20109d-kube-api-access-xbkhq\") pod \"8666185d-b040-42cd-aa80-e5e38b20109d\" (UID: \"8666185d-b040-42cd-aa80-e5e38b20109d\") " Dec 09 11:30:31 crc kubenswrapper[5002]: I1209 11:30:31.973806 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8666185d-b040-42cd-aa80-e5e38b20109d-utilities\") pod \"8666185d-b040-42cd-aa80-e5e38b20109d\" (UID: \"8666185d-b040-42cd-aa80-e5e38b20109d\") " Dec 09 11:30:31 crc kubenswrapper[5002]: I1209 11:30:31.975179 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8666185d-b040-42cd-aa80-e5e38b20109d-utilities" (OuterVolumeSpecName: "utilities") pod "8666185d-b040-42cd-aa80-e5e38b20109d" (UID: "8666185d-b040-42cd-aa80-e5e38b20109d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:30:31 crc kubenswrapper[5002]: I1209 11:30:31.988018 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8666185d-b040-42cd-aa80-e5e38b20109d-kube-api-access-xbkhq" (OuterVolumeSpecName: "kube-api-access-xbkhq") pod "8666185d-b040-42cd-aa80-e5e38b20109d" (UID: "8666185d-b040-42cd-aa80-e5e38b20109d"). InnerVolumeSpecName "kube-api-access-xbkhq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:30:32 crc kubenswrapper[5002]: I1209 11:30:32.040466 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8666185d-b040-42cd-aa80-e5e38b20109d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8666185d-b040-42cd-aa80-e5e38b20109d" (UID: "8666185d-b040-42cd-aa80-e5e38b20109d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:30:32 crc kubenswrapper[5002]: I1209 11:30:32.079121 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8666185d-b040-42cd-aa80-e5e38b20109d-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 11:30:32 crc kubenswrapper[5002]: I1209 11:30:32.079180 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8666185d-b040-42cd-aa80-e5e38b20109d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 11:30:32 crc kubenswrapper[5002]: I1209 11:30:32.079216 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbkhq\" (UniqueName: \"kubernetes.io/projected/8666185d-b040-42cd-aa80-e5e38b20109d-kube-api-access-xbkhq\") on node \"crc\" DevicePath \"\"" Dec 09 11:30:32 crc kubenswrapper[5002]: I1209 11:30:32.392137 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gfn2w" event={"ID":"8666185d-b040-42cd-aa80-e5e38b20109d","Type":"ContainerDied","Data":"687615299a81901d787a7a3dcb3c23e3dda5fa45f66894a42595f034bd4d649f"} Dec 09 11:30:32 crc kubenswrapper[5002]: I1209 11:30:32.392220 5002 scope.go:117] "RemoveContainer" containerID="9a2824378a1ad3e33ed8e41e35489cc956ad817c19f5545b58810daa83a69c10" Dec 09 11:30:32 crc kubenswrapper[5002]: I1209 11:30:32.392422 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gfn2w" Dec 09 11:30:32 crc kubenswrapper[5002]: I1209 11:30:32.405074 5002 generic.go:334] "Generic (PLEG): container finished" podID="600f358b-6892-4b2c-8f7a-f1f011c7a1e5" containerID="737a0a8e432adfc8cb9edb16c5409db21219565eaef2c6787d5e97490f7d880c" exitCode=0 Dec 09 11:30:32 crc kubenswrapper[5002]: I1209 11:30:32.405264 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-132e-account-create-update-nq22q" event={"ID":"600f358b-6892-4b2c-8f7a-f1f011c7a1e5","Type":"ContainerDied","Data":"737a0a8e432adfc8cb9edb16c5409db21219565eaef2c6787d5e97490f7d880c"} Dec 09 11:30:32 crc kubenswrapper[5002]: I1209 11:30:32.410788 5002 generic.go:334] "Generic (PLEG): container finished" podID="7c8f540f-8211-4f60-befd-2b8d255d97aa" containerID="661d25efeb4a9e4d58d461b6867ea7e1d8db71eabef039493c4574bb5905533f" exitCode=0 Dec 09 11:30:32 crc kubenswrapper[5002]: I1209 11:30:32.410909 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-tdh2j" event={"ID":"7c8f540f-8211-4f60-befd-2b8d255d97aa","Type":"ContainerDied","Data":"661d25efeb4a9e4d58d461b6867ea7e1d8db71eabef039493c4574bb5905533f"} Dec 09 11:30:32 crc kubenswrapper[5002]: I1209 11:30:32.437352 5002 scope.go:117] "RemoveContainer" containerID="41b584b677afbc2f6db01c1dfc1f860ed32d799835dc23514f30c488453fccab" Dec 09 11:30:32 crc kubenswrapper[5002]: I1209 11:30:32.449985 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gfn2w"] Dec 09 11:30:32 crc kubenswrapper[5002]: I1209 11:30:32.469067 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gfn2w"] Dec 09 11:30:32 crc kubenswrapper[5002]: I1209 11:30:32.473907 5002 scope.go:117] "RemoveContainer" containerID="e05b86f0faa077eb9d20d79d809faecd0b77a1c65a3c6bd51408bb8d58c123db" Dec 09 11:30:33 crc kubenswrapper[5002]: I1209 11:30:33.829749 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-132e-account-create-update-nq22q" Dec 09 11:30:33 crc kubenswrapper[5002]: I1209 11:30:33.831666 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-tdh2j" Dec 09 11:30:33 crc kubenswrapper[5002]: I1209 11:30:33.910226 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcrl9\" (UniqueName: \"kubernetes.io/projected/7c8f540f-8211-4f60-befd-2b8d255d97aa-kube-api-access-hcrl9\") pod \"7c8f540f-8211-4f60-befd-2b8d255d97aa\" (UID: \"7c8f540f-8211-4f60-befd-2b8d255d97aa\") " Dec 09 11:30:33 crc kubenswrapper[5002]: I1209 11:30:33.910314 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/600f358b-6892-4b2c-8f7a-f1f011c7a1e5-operator-scripts\") pod \"600f358b-6892-4b2c-8f7a-f1f011c7a1e5\" (UID: \"600f358b-6892-4b2c-8f7a-f1f011c7a1e5\") " Dec 09 11:30:33 crc kubenswrapper[5002]: I1209 11:30:33.910354 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5254\" (UniqueName: \"kubernetes.io/projected/600f358b-6892-4b2c-8f7a-f1f011c7a1e5-kube-api-access-m5254\") pod \"600f358b-6892-4b2c-8f7a-f1f011c7a1e5\" (UID: \"600f358b-6892-4b2c-8f7a-f1f011c7a1e5\") " Dec 09 11:30:33 crc kubenswrapper[5002]: I1209 11:30:33.910415 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c8f540f-8211-4f60-befd-2b8d255d97aa-operator-scripts\") pod \"7c8f540f-8211-4f60-befd-2b8d255d97aa\" (UID: \"7c8f540f-8211-4f60-befd-2b8d255d97aa\") " Dec 09 11:30:33 crc kubenswrapper[5002]: I1209 11:30:33.911110 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c8f540f-8211-4f60-befd-2b8d255d97aa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7c8f540f-8211-4f60-befd-2b8d255d97aa" (UID: "7c8f540f-8211-4f60-befd-2b8d255d97aa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:30:33 crc kubenswrapper[5002]: I1209 11:30:33.911141 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/600f358b-6892-4b2c-8f7a-f1f011c7a1e5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "600f358b-6892-4b2c-8f7a-f1f011c7a1e5" (UID: "600f358b-6892-4b2c-8f7a-f1f011c7a1e5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:30:33 crc kubenswrapper[5002]: I1209 11:30:33.916316 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c8f540f-8211-4f60-befd-2b8d255d97aa-kube-api-access-hcrl9" (OuterVolumeSpecName: "kube-api-access-hcrl9") pod "7c8f540f-8211-4f60-befd-2b8d255d97aa" (UID: "7c8f540f-8211-4f60-befd-2b8d255d97aa"). InnerVolumeSpecName "kube-api-access-hcrl9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:30:33 crc kubenswrapper[5002]: I1209 11:30:33.926114 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/600f358b-6892-4b2c-8f7a-f1f011c7a1e5-kube-api-access-m5254" (OuterVolumeSpecName: "kube-api-access-m5254") pod "600f358b-6892-4b2c-8f7a-f1f011c7a1e5" (UID: "600f358b-6892-4b2c-8f7a-f1f011c7a1e5"). InnerVolumeSpecName "kube-api-access-m5254". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:30:34 crc kubenswrapper[5002]: I1209 11:30:34.011992 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/600f358b-6892-4b2c-8f7a-f1f011c7a1e5-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:30:34 crc kubenswrapper[5002]: I1209 11:30:34.012034 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5254\" (UniqueName: \"kubernetes.io/projected/600f358b-6892-4b2c-8f7a-f1f011c7a1e5-kube-api-access-m5254\") on node \"crc\" DevicePath \"\"" Dec 09 11:30:34 crc kubenswrapper[5002]: I1209 11:30:34.012050 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c8f540f-8211-4f60-befd-2b8d255d97aa-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:30:34 crc kubenswrapper[5002]: I1209 11:30:34.012062 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcrl9\" (UniqueName: \"kubernetes.io/projected/7c8f540f-8211-4f60-befd-2b8d255d97aa-kube-api-access-hcrl9\") on node \"crc\" DevicePath \"\"" Dec 09 11:30:34 crc kubenswrapper[5002]: I1209 11:30:34.075052 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8666185d-b040-42cd-aa80-e5e38b20109d" path="/var/lib/kubelet/pods/8666185d-b040-42cd-aa80-e5e38b20109d/volumes" Dec 09 11:30:34 crc kubenswrapper[5002]: I1209 11:30:34.443880 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-tdh2j" event={"ID":"7c8f540f-8211-4f60-befd-2b8d255d97aa","Type":"ContainerDied","Data":"bab06cf694c0ed92b3132788b64b67e5589fed1cf706d50e2dbe3f69dffd82b0"} Dec 09 11:30:34 crc kubenswrapper[5002]: I1209 11:30:34.443942 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bab06cf694c0ed92b3132788b64b67e5589fed1cf706d50e2dbe3f69dffd82b0" Dec 09 11:30:34 crc kubenswrapper[5002]: I1209 11:30:34.443955 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-tdh2j" Dec 09 11:30:34 crc kubenswrapper[5002]: I1209 11:30:34.447300 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-132e-account-create-update-nq22q" event={"ID":"600f358b-6892-4b2c-8f7a-f1f011c7a1e5","Type":"ContainerDied","Data":"0bb7b6d3b9f14b2b69c0d19ee3c6187bb8bb1a718105d6809dceec5b568d421e"} Dec 09 11:30:34 crc kubenswrapper[5002]: I1209 11:30:34.447350 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0bb7b6d3b9f14b2b69c0d19ee3c6187bb8bb1a718105d6809dceec5b568d421e" Dec 09 11:30:34 crc kubenswrapper[5002]: I1209 11:30:34.447442 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-132e-account-create-update-nq22q" Dec 09 11:30:35 crc kubenswrapper[5002]: I1209 11:30:35.060263 5002 scope.go:117] "RemoveContainer" containerID="24eda190128d46e2bfa806f4839b38f2462cd8acaa8816efdf9934cf2dc46679" Dec 09 11:30:35 crc kubenswrapper[5002]: E1209 11:30:35.061937 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:30:38 crc kubenswrapper[5002]: I1209 11:30:38.328346 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-sfv58"] Dec 09 11:30:38 crc kubenswrapper[5002]: E1209 11:30:38.330383 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c8f540f-8211-4f60-befd-2b8d255d97aa" containerName="mariadb-database-create" Dec 09 11:30:38 crc kubenswrapper[5002]: I1209 11:30:38.330407 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c8f540f-8211-4f60-befd-2b8d255d97aa" containerName="mariadb-database-create" Dec 09 11:30:38 crc kubenswrapper[5002]: E1209 11:30:38.330424 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8666185d-b040-42cd-aa80-e5e38b20109d" containerName="extract-utilities" Dec 09 11:30:38 crc kubenswrapper[5002]: I1209 11:30:38.330430 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="8666185d-b040-42cd-aa80-e5e38b20109d" containerName="extract-utilities" Dec 09 11:30:38 crc kubenswrapper[5002]: E1209 11:30:38.330439 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8666185d-b040-42cd-aa80-e5e38b20109d" containerName="registry-server" Dec 09 11:30:38 crc kubenswrapper[5002]: I1209 11:30:38.330445 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="8666185d-b040-42cd-aa80-e5e38b20109d" containerName="registry-server" Dec 09 11:30:38 crc kubenswrapper[5002]: E1209 11:30:38.330463 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="600f358b-6892-4b2c-8f7a-f1f011c7a1e5" containerName="mariadb-account-create-update" Dec 09 11:30:38 crc kubenswrapper[5002]: I1209 11:30:38.330469 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="600f358b-6892-4b2c-8f7a-f1f011c7a1e5" containerName="mariadb-account-create-update" Dec 09 11:30:38 crc kubenswrapper[5002]: E1209 11:30:38.330486 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8666185d-b040-42cd-aa80-e5e38b20109d" containerName="extract-content" Dec 09 11:30:38 crc kubenswrapper[5002]: I1209 11:30:38.330491 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="8666185d-b040-42cd-aa80-e5e38b20109d" containerName="extract-content" Dec 09 11:30:38 crc kubenswrapper[5002]: I1209 11:30:38.330681 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="8666185d-b040-42cd-aa80-e5e38b20109d" containerName="registry-server" Dec 09 11:30:38 crc kubenswrapper[5002]: I1209 11:30:38.330708 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="600f358b-6892-4b2c-8f7a-f1f011c7a1e5" containerName="mariadb-account-create-update" Dec 09 11:30:38 crc kubenswrapper[5002]: I1209 11:30:38.330719 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c8f540f-8211-4f60-befd-2b8d255d97aa" 
containerName="mariadb-database-create" Dec 09 11:30:38 crc kubenswrapper[5002]: I1209 11:30:38.331274 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-sfv58" Dec 09 11:30:38 crc kubenswrapper[5002]: I1209 11:30:38.334011 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 09 11:30:38 crc kubenswrapper[5002]: I1209 11:30:38.337066 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-9c25v" Dec 09 11:30:38 crc kubenswrapper[5002]: I1209 11:30:38.341710 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-sfv58"] Dec 09 11:30:38 crc kubenswrapper[5002]: I1209 11:30:38.493059 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56b1e4c9-eb29-4110-8379-85134901c4db-combined-ca-bundle\") pod \"barbican-db-sync-sfv58\" (UID: \"56b1e4c9-eb29-4110-8379-85134901c4db\") " pod="openstack/barbican-db-sync-sfv58" Dec 09 11:30:38 crc kubenswrapper[5002]: I1209 11:30:38.493104 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/56b1e4c9-eb29-4110-8379-85134901c4db-db-sync-config-data\") pod \"barbican-db-sync-sfv58\" (UID: \"56b1e4c9-eb29-4110-8379-85134901c4db\") " pod="openstack/barbican-db-sync-sfv58" Dec 09 11:30:38 crc kubenswrapper[5002]: I1209 11:30:38.493336 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cxsd\" (UniqueName: \"kubernetes.io/projected/56b1e4c9-eb29-4110-8379-85134901c4db-kube-api-access-8cxsd\") pod \"barbican-db-sync-sfv58\" (UID: \"56b1e4c9-eb29-4110-8379-85134901c4db\") " pod="openstack/barbican-db-sync-sfv58" Dec 09 11:30:38 crc kubenswrapper[5002]: I1209 11:30:38.595071 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cxsd\" (UniqueName: \"kubernetes.io/projected/56b1e4c9-eb29-4110-8379-85134901c4db-kube-api-access-8cxsd\") pod \"barbican-db-sync-sfv58\" (UID: \"56b1e4c9-eb29-4110-8379-85134901c4db\") " pod="openstack/barbican-db-sync-sfv58" Dec 09 11:30:38 crc kubenswrapper[5002]: I1209 11:30:38.595289 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56b1e4c9-eb29-4110-8379-85134901c4db-combined-ca-bundle\") pod \"barbican-db-sync-sfv58\" (UID: \"56b1e4c9-eb29-4110-8379-85134901c4db\") " pod="openstack/barbican-db-sync-sfv58" Dec 09 11:30:38 crc kubenswrapper[5002]: I1209 11:30:38.595353 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/56b1e4c9-eb29-4110-8379-85134901c4db-db-sync-config-data\") pod \"barbican-db-sync-sfv58\" (UID: \"56b1e4c9-eb29-4110-8379-85134901c4db\") " pod="openstack/barbican-db-sync-sfv58" Dec 09 11:30:38 crc kubenswrapper[5002]: I1209 11:30:38.599967 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56b1e4c9-eb29-4110-8379-85134901c4db-combined-ca-bundle\") pod \"barbican-db-sync-sfv58\" (UID: \"56b1e4c9-eb29-4110-8379-85134901c4db\") " pod="openstack/barbican-db-sync-sfv58" Dec 09 11:30:38 crc kubenswrapper[5002]: I1209 11:30:38.605705 5002 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/56b1e4c9-eb29-4110-8379-85134901c4db-db-sync-config-data\") pod \"barbican-db-sync-sfv58\" (UID: \"56b1e4c9-eb29-4110-8379-85134901c4db\") " pod="openstack/barbican-db-sync-sfv58" Dec 09 11:30:38 crc kubenswrapper[5002]: I1209 11:30:38.620014 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cxsd\" (UniqueName: \"kubernetes.io/projected/56b1e4c9-eb29-4110-8379-85134901c4db-kube-api-access-8cxsd\") pod \"barbican-db-sync-sfv58\" (UID: \"56b1e4c9-eb29-4110-8379-85134901c4db\") " pod="openstack/barbican-db-sync-sfv58" Dec 09 11:30:38 crc kubenswrapper[5002]: I1209 11:30:38.647968 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-sfv58" Dec 09 11:30:39 crc kubenswrapper[5002]: I1209 11:30:39.115827 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-sfv58"] Dec 09 11:30:39 crc kubenswrapper[5002]: I1209 11:30:39.498067 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-sfv58" event={"ID":"56b1e4c9-eb29-4110-8379-85134901c4db","Type":"ContainerStarted","Data":"4b315336bfdc7b0f0e2cdaf3df1f2b62e25b6819f8f96c21ed763017270ccd5f"} Dec 09 11:30:39 crc kubenswrapper[5002]: I1209 11:30:39.498115 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-sfv58" event={"ID":"56b1e4c9-eb29-4110-8379-85134901c4db","Type":"ContainerStarted","Data":"65f2067c61ae1884ba101345d3640a29974490ec4f1a4edbf66085db79c49e48"} Dec 09 11:30:39 crc kubenswrapper[5002]: I1209 11:30:39.521945 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-sfv58" podStartSLOduration=1.521926077 podStartE2EDuration="1.521926077s" podCreationTimestamp="2025-12-09 11:30:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:30:39.512924086 +0000 UTC m=+5371.904975187" watchObservedRunningTime="2025-12-09 11:30:39.521926077 +0000 UTC m=+5371.913977158" Dec 09 11:30:42 crc kubenswrapper[5002]: I1209 11:30:42.524552 5002 generic.go:334] "Generic (PLEG): container finished" podID="56b1e4c9-eb29-4110-8379-85134901c4db" containerID="4b315336bfdc7b0f0e2cdaf3df1f2b62e25b6819f8f96c21ed763017270ccd5f" exitCode=0 Dec 09 11:30:42 crc kubenswrapper[5002]: I1209 11:30:42.524641 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-sfv58" event={"ID":"56b1e4c9-eb29-4110-8379-85134901c4db","Type":"ContainerDied","Data":"4b315336bfdc7b0f0e2cdaf3df1f2b62e25b6819f8f96c21ed763017270ccd5f"} Dec 09 11:30:43 crc kubenswrapper[5002]: I1209 11:30:43.938505 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-sfv58" Dec 09 11:30:44 crc kubenswrapper[5002]: I1209 11:30:44.096731 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cxsd\" (UniqueName: \"kubernetes.io/projected/56b1e4c9-eb29-4110-8379-85134901c4db-kube-api-access-8cxsd\") pod \"56b1e4c9-eb29-4110-8379-85134901c4db\" (UID: \"56b1e4c9-eb29-4110-8379-85134901c4db\") " Dec 09 11:30:44 crc kubenswrapper[5002]: I1209 11:30:44.096878 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56b1e4c9-eb29-4110-8379-85134901c4db-combined-ca-bundle\") pod \"56b1e4c9-eb29-4110-8379-85134901c4db\" (UID: \"56b1e4c9-eb29-4110-8379-85134901c4db\") " Dec 09 11:30:44 crc kubenswrapper[5002]: I1209 11:30:44.096964 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/56b1e4c9-eb29-4110-8379-85134901c4db-db-sync-config-data\") pod \"56b1e4c9-eb29-4110-8379-85134901c4db\" (UID: \"56b1e4c9-eb29-4110-8379-85134901c4db\") " Dec 09 11:30:44 crc kubenswrapper[5002]: I1209 11:30:44.105493 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56b1e4c9-eb29-4110-8379-85134901c4db-kube-api-access-8cxsd" (OuterVolumeSpecName: "kube-api-access-8cxsd") pod "56b1e4c9-eb29-4110-8379-85134901c4db" (UID: "56b1e4c9-eb29-4110-8379-85134901c4db"). InnerVolumeSpecName "kube-api-access-8cxsd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:30:44 crc kubenswrapper[5002]: I1209 11:30:44.107034 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56b1e4c9-eb29-4110-8379-85134901c4db-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "56b1e4c9-eb29-4110-8379-85134901c4db" (UID: "56b1e4c9-eb29-4110-8379-85134901c4db"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:30:44 crc kubenswrapper[5002]: I1209 11:30:44.123835 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56b1e4c9-eb29-4110-8379-85134901c4db-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "56b1e4c9-eb29-4110-8379-85134901c4db" (UID: "56b1e4c9-eb29-4110-8379-85134901c4db"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:30:44 crc kubenswrapper[5002]: I1209 11:30:44.199391 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cxsd\" (UniqueName: \"kubernetes.io/projected/56b1e4c9-eb29-4110-8379-85134901c4db-kube-api-access-8cxsd\") on node \"crc\" DevicePath \"\"" Dec 09 11:30:44 crc kubenswrapper[5002]: I1209 11:30:44.199421 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56b1e4c9-eb29-4110-8379-85134901c4db-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:30:44 crc kubenswrapper[5002]: I1209 11:30:44.199435 5002 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/56b1e4c9-eb29-4110-8379-85134901c4db-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:30:44 crc kubenswrapper[5002]: I1209 11:30:44.551516 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-sfv58" event={"ID":"56b1e4c9-eb29-4110-8379-85134901c4db","Type":"ContainerDied","Data":"65f2067c61ae1884ba101345d3640a29974490ec4f1a4edbf66085db79c49e48"} Dec 09 11:30:44 crc kubenswrapper[5002]: I1209 11:30:44.552052 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65f2067c61ae1884ba101345d3640a29974490ec4f1a4edbf66085db79c49e48" Dec 09 11:30:44 crc kubenswrapper[5002]: I1209 11:30:44.551610 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-sfv58" Dec 09 11:30:44 crc kubenswrapper[5002]: I1209 11:30:44.785603 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-69b68b4587-6nfgt"] Dec 09 11:30:44 crc kubenswrapper[5002]: E1209 11:30:44.786066 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56b1e4c9-eb29-4110-8379-85134901c4db" containerName="barbican-db-sync" Dec 09 11:30:44 crc kubenswrapper[5002]: I1209 11:30:44.786090 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="56b1e4c9-eb29-4110-8379-85134901c4db" containerName="barbican-db-sync" Dec 09 11:30:44 crc kubenswrapper[5002]: I1209 11:30:44.786289 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="56b1e4c9-eb29-4110-8379-85134901c4db" containerName="barbican-db-sync" Dec 09 11:30:44 crc kubenswrapper[5002]: I1209 11:30:44.787352 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-69b68b4587-6nfgt" Dec 09 11:30:44 crc kubenswrapper[5002]: I1209 11:30:44.789210 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-9c25v" Dec 09 11:30:44 crc kubenswrapper[5002]: I1209 11:30:44.798801 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 09 11:30:44 crc kubenswrapper[5002]: I1209 11:30:44.800294 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 09 11:30:44 crc kubenswrapper[5002]: I1209 11:30:44.823088 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7c6866897d-sgztm"] Dec 09 11:30:44 crc kubenswrapper[5002]: I1209 11:30:44.824693 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7c6866897d-sgztm" Dec 09 11:30:44 crc kubenswrapper[5002]: I1209 11:30:44.829054 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 09 11:30:44 crc kubenswrapper[5002]: I1209 11:30:44.841681 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7c6866897d-sgztm"] Dec 09 11:30:44 crc kubenswrapper[5002]: I1209 11:30:44.893569 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-596df78cd9-x5jj2"] Dec 09 11:30:44 crc kubenswrapper[5002]: I1209 11:30:44.895195 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-596df78cd9-x5jj2" Dec 09 11:30:44 crc kubenswrapper[5002]: I1209 11:30:44.901384 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-69b68b4587-6nfgt"] Dec 09 11:30:44 crc kubenswrapper[5002]: I1209 11:30:44.911021 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-596df78cd9-x5jj2"] Dec 09 11:30:44 crc kubenswrapper[5002]: I1209 11:30:44.930715 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgpr6\" (UniqueName: \"kubernetes.io/projected/04e2ad2f-2d91-4eb3-95bb-62229762bc1c-kube-api-access-rgpr6\") pod \"barbican-keystone-listener-7c6866897d-sgztm\" (UID: \"04e2ad2f-2d91-4eb3-95bb-62229762bc1c\") " pod="openstack/barbican-keystone-listener-7c6866897d-sgztm" Dec 09 11:30:44 crc kubenswrapper[5002]: I1209 11:30:44.930786 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/785d609c-802a-48b2-b58a-7eaa54867270-combined-ca-bundle\") pod \"barbican-worker-69b68b4587-6nfgt\" (UID: \"785d609c-802a-48b2-b58a-7eaa54867270\") " pod="openstack/barbican-worker-69b68b4587-6nfgt" Dec 09 11:30:44 crc kubenswrapper[5002]: I1209 11:30:44.930903 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/785d609c-802a-48b2-b58a-7eaa54867270-config-data\") pod \"barbican-worker-69b68b4587-6nfgt\" (UID: \"785d609c-802a-48b2-b58a-7eaa54867270\") " pod="openstack/barbican-worker-69b68b4587-6nfgt" Dec 09 11:30:44 crc kubenswrapper[5002]: I1209 11:30:44.931001 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-546vd\" (UniqueName: \"kubernetes.io/projected/785d609c-802a-48b2-b58a-7eaa54867270-kube-api-access-546vd\") pod \"barbican-worker-69b68b4587-6nfgt\" (UID: \"785d609c-802a-48b2-b58a-7eaa54867270\") " pod="openstack/barbican-worker-69b68b4587-6nfgt" Dec 09 11:30:44 crc kubenswrapper[5002]: I1209 11:30:44.931020 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04e2ad2f-2d91-4eb3-95bb-62229762bc1c-combined-ca-bundle\") pod \"barbican-keystone-listener-7c6866897d-sgztm\" (UID: \"04e2ad2f-2d91-4eb3-95bb-62229762bc1c\") " pod="openstack/barbican-keystone-listener-7c6866897d-sgztm" Dec 09 11:30:44 crc kubenswrapper[5002]: I1209 11:30:44.931065 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/04e2ad2f-2d91-4eb3-95bb-62229762bc1c-config-data-custom\") pod \"barbican-keystone-listener-7c6866897d-sgztm\" (UID: \"04e2ad2f-2d91-4eb3-95bb-62229762bc1c\") " pod="openstack/barbican-keystone-listener-7c6866897d-sgztm" Dec 09 11:30:44 crc kubenswrapper[5002]: I1209 11:30:44.931111 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04e2ad2f-2d91-4eb3-95bb-62229762bc1c-config-data\") pod \"barbican-keystone-listener-7c6866897d-sgztm\" (UID: \"04e2ad2f-2d91-4eb3-95bb-62229762bc1c\") " pod="openstack/barbican-keystone-listener-7c6866897d-sgztm" Dec 09 11:30:44 crc kubenswrapper[5002]: I1209 11:30:44.931140 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/785d609c-802a-48b2-b58a-7eaa54867270-config-data-custom\") pod \"barbican-worker-69b68b4587-6nfgt\" (UID: \"785d609c-802a-48b2-b58a-7eaa54867270\") " pod="openstack/barbican-worker-69b68b4587-6nfgt" Dec 09 11:30:44 crc kubenswrapper[5002]: I1209 11:30:44.931176 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04e2ad2f-2d91-4eb3-95bb-62229762bc1c-logs\") pod \"barbican-keystone-listener-7c6866897d-sgztm\" (UID: \"04e2ad2f-2d91-4eb3-95bb-62229762bc1c\") " pod="openstack/barbican-keystone-listener-7c6866897d-sgztm" Dec 09 11:30:44 crc kubenswrapper[5002]: I1209 11:30:44.931378 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/785d609c-802a-48b2-b58a-7eaa54867270-logs\") pod \"barbican-worker-69b68b4587-6nfgt\" (UID: \"785d609c-802a-48b2-b58a-7eaa54867270\") " pod="openstack/barbican-worker-69b68b4587-6nfgt" Dec 09 11:30:44 crc kubenswrapper[5002]: I1209 11:30:44.974688 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-746d5c8778-w7l5z"] Dec 09 11:30:44 crc kubenswrapper[5002]: I1209 11:30:44.976496 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-746d5c8778-w7l5z" Dec 09 11:30:44 crc kubenswrapper[5002]: I1209 11:30:44.978757 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 09 11:30:44 crc kubenswrapper[5002]: I1209 11:30:44.988157 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-746d5c8778-w7l5z"] Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.034906 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/785d609c-802a-48b2-b58a-7eaa54867270-logs\") pod \"barbican-worker-69b68b4587-6nfgt\" (UID: \"785d609c-802a-48b2-b58a-7eaa54867270\") " pod="openstack/barbican-worker-69b68b4587-6nfgt" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.034969 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9e373294-c60d-4f3f-b56b-4c73c06b2ce9-ovsdbserver-sb\") pod \"dnsmasq-dns-596df78cd9-x5jj2\" (UID: \"9e373294-c60d-4f3f-b56b-4c73c06b2ce9\") " pod="openstack/dnsmasq-dns-596df78cd9-x5jj2" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.035007 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e373294-c60d-4f3f-b56b-4c73c06b2ce9-ovsdbserver-nb\") pod \"dnsmasq-dns-596df78cd9-x5jj2\" (UID: \"9e373294-c60d-4f3f-b56b-4c73c06b2ce9\") " pod="openstack/dnsmasq-dns-596df78cd9-x5jj2" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.035061 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgpr6\" (UniqueName: \"kubernetes.io/projected/04e2ad2f-2d91-4eb3-95bb-62229762bc1c-kube-api-access-rgpr6\") pod \"barbican-keystone-listener-7c6866897d-sgztm\" (UID: \"04e2ad2f-2d91-4eb3-95bb-62229762bc1c\") " pod="openstack/barbican-keystone-listener-7c6866897d-sgztm" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.035082 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e373294-c60d-4f3f-b56b-4c73c06b2ce9-config\") pod \"dnsmasq-dns-596df78cd9-x5jj2\" (UID: \"9e373294-c60d-4f3f-b56b-4c73c06b2ce9\") " pod="openstack/dnsmasq-dns-596df78cd9-x5jj2" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.035128 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/785d609c-802a-48b2-b58a-7eaa54867270-combined-ca-bundle\") pod \"barbican-worker-69b68b4587-6nfgt\" (UID: \"785d609c-802a-48b2-b58a-7eaa54867270\") " pod="openstack/barbican-worker-69b68b4587-6nfgt" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.035152 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/785d609c-802a-48b2-b58a-7eaa54867270-config-data\") pod \"barbican-worker-69b68b4587-6nfgt\" (UID: \"785d609c-802a-48b2-b58a-7eaa54867270\") " pod="openstack/barbican-worker-69b68b4587-6nfgt" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.035212 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-546vd\" (UniqueName: \"kubernetes.io/projected/785d609c-802a-48b2-b58a-7eaa54867270-kube-api-access-546vd\") pod \"barbican-worker-69b68b4587-6nfgt\" (UID: 
\"785d609c-802a-48b2-b58a-7eaa54867270\") " pod="openstack/barbican-worker-69b68b4587-6nfgt" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.035227 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04e2ad2f-2d91-4eb3-95bb-62229762bc1c-combined-ca-bundle\") pod \"barbican-keystone-listener-7c6866897d-sgztm\" (UID: \"04e2ad2f-2d91-4eb3-95bb-62229762bc1c\") " pod="openstack/barbican-keystone-listener-7c6866897d-sgztm" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.035247 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkv85\" (UniqueName: \"kubernetes.io/projected/9e373294-c60d-4f3f-b56b-4c73c06b2ce9-kube-api-access-vkv85\") pod \"dnsmasq-dns-596df78cd9-x5jj2\" (UID: \"9e373294-c60d-4f3f-b56b-4c73c06b2ce9\") " pod="openstack/dnsmasq-dns-596df78cd9-x5jj2" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.035275 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04e2ad2f-2d91-4eb3-95bb-62229762bc1c-config-data-custom\") pod \"barbican-keystone-listener-7c6866897d-sgztm\" (UID: \"04e2ad2f-2d91-4eb3-95bb-62229762bc1c\") " pod="openstack/barbican-keystone-listener-7c6866897d-sgztm" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.035305 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04e2ad2f-2d91-4eb3-95bb-62229762bc1c-config-data\") pod \"barbican-keystone-listener-7c6866897d-sgztm\" (UID: \"04e2ad2f-2d91-4eb3-95bb-62229762bc1c\") " pod="openstack/barbican-keystone-listener-7c6866897d-sgztm" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.035330 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/785d609c-802a-48b2-b58a-7eaa54867270-config-data-custom\") pod \"barbican-worker-69b68b4587-6nfgt\" (UID: \"785d609c-802a-48b2-b58a-7eaa54867270\") " pod="openstack/barbican-worker-69b68b4587-6nfgt" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.035353 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04e2ad2f-2d91-4eb3-95bb-62229762bc1c-logs\") pod \"barbican-keystone-listener-7c6866897d-sgztm\" (UID: \"04e2ad2f-2d91-4eb3-95bb-62229762bc1c\") " pod="openstack/barbican-keystone-listener-7c6866897d-sgztm" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.035412 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e373294-c60d-4f3f-b56b-4c73c06b2ce9-dns-svc\") pod \"dnsmasq-dns-596df78cd9-x5jj2\" (UID: \"9e373294-c60d-4f3f-b56b-4c73c06b2ce9\") " pod="openstack/dnsmasq-dns-596df78cd9-x5jj2" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.035771 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/785d609c-802a-48b2-b58a-7eaa54867270-logs\") pod \"barbican-worker-69b68b4587-6nfgt\" (UID: \"785d609c-802a-48b2-b58a-7eaa54867270\") " pod="openstack/barbican-worker-69b68b4587-6nfgt" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.040848 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/04e2ad2f-2d91-4eb3-95bb-62229762bc1c-logs\") pod \"barbican-keystone-listener-7c6866897d-sgztm\" (UID: \"04e2ad2f-2d91-4eb3-95bb-62229762bc1c\") " pod="openstack/barbican-keystone-listener-7c6866897d-sgztm" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.044167 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04e2ad2f-2d91-4eb3-95bb-62229762bc1c-config-data-custom\") pod \"barbican-keystone-listener-7c6866897d-sgztm\" (UID: \"04e2ad2f-2d91-4eb3-95bb-62229762bc1c\") " pod="openstack/barbican-keystone-listener-7c6866897d-sgztm" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.045035 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/785d609c-802a-48b2-b58a-7eaa54867270-config-data-custom\") pod \"barbican-worker-69b68b4587-6nfgt\" (UID: \"785d609c-802a-48b2-b58a-7eaa54867270\") " pod="openstack/barbican-worker-69b68b4587-6nfgt" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.046122 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04e2ad2f-2d91-4eb3-95bb-62229762bc1c-config-data\") pod \"barbican-keystone-listener-7c6866897d-sgztm\" (UID: \"04e2ad2f-2d91-4eb3-95bb-62229762bc1c\") " pod="openstack/barbican-keystone-listener-7c6866897d-sgztm" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.049709 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/785d609c-802a-48b2-b58a-7eaa54867270-combined-ca-bundle\") pod \"barbican-worker-69b68b4587-6nfgt\" (UID: \"785d609c-802a-48b2-b58a-7eaa54867270\") " pod="openstack/barbican-worker-69b68b4587-6nfgt" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.053090 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04e2ad2f-2d91-4eb3-95bb-62229762bc1c-combined-ca-bundle\") pod \"barbican-keystone-listener-7c6866897d-sgztm\" (UID: \"04e2ad2f-2d91-4eb3-95bb-62229762bc1c\") " pod="openstack/barbican-keystone-listener-7c6866897d-sgztm" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.054049 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/785d609c-802a-48b2-b58a-7eaa54867270-config-data\") pod \"barbican-worker-69b68b4587-6nfgt\" (UID: \"785d609c-802a-48b2-b58a-7eaa54867270\") " pod="openstack/barbican-worker-69b68b4587-6nfgt" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.055206 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-546vd\" (UniqueName: \"kubernetes.io/projected/785d609c-802a-48b2-b58a-7eaa54867270-kube-api-access-546vd\") pod \"barbican-worker-69b68b4587-6nfgt\" (UID: \"785d609c-802a-48b2-b58a-7eaa54867270\") " pod="openstack/barbican-worker-69b68b4587-6nfgt" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.062653 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgpr6\" (UniqueName: \"kubernetes.io/projected/04e2ad2f-2d91-4eb3-95bb-62229762bc1c-kube-api-access-rgpr6\") pod \"barbican-keystone-listener-7c6866897d-sgztm\" (UID: \"04e2ad2f-2d91-4eb3-95bb-62229762bc1c\") " pod="openstack/barbican-keystone-listener-7c6866897d-sgztm" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.117701 5002 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/barbican-worker-69b68b4587-6nfgt" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.136443 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhcbj\" (UniqueName: \"kubernetes.io/projected/88442869-dcc3-4d84-9ee5-91756a2394ef-kube-api-access-rhcbj\") pod \"barbican-api-746d5c8778-w7l5z\" (UID: \"88442869-dcc3-4d84-9ee5-91756a2394ef\") " pod="openstack/barbican-api-746d5c8778-w7l5z" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.136491 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88442869-dcc3-4d84-9ee5-91756a2394ef-config-data\") pod \"barbican-api-746d5c8778-w7l5z\" (UID: \"88442869-dcc3-4d84-9ee5-91756a2394ef\") " pod="openstack/barbican-api-746d5c8778-w7l5z" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.136525 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e373294-c60d-4f3f-b56b-4c73c06b2ce9-dns-svc\") pod \"dnsmasq-dns-596df78cd9-x5jj2\" (UID: \"9e373294-c60d-4f3f-b56b-4c73c06b2ce9\") " pod="openstack/dnsmasq-dns-596df78cd9-x5jj2" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.136567 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9e373294-c60d-4f3f-b56b-4c73c06b2ce9-ovsdbserver-sb\") pod \"dnsmasq-dns-596df78cd9-x5jj2\" (UID: \"9e373294-c60d-4f3f-b56b-4c73c06b2ce9\") " pod="openstack/dnsmasq-dns-596df78cd9-x5jj2" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.136591 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e373294-c60d-4f3f-b56b-4c73c06b2ce9-ovsdbserver-nb\") pod \"dnsmasq-dns-596df78cd9-x5jj2\" (UID: \"9e373294-c60d-4f3f-b56b-4c73c06b2ce9\") " pod="openstack/dnsmasq-dns-596df78cd9-x5jj2" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.136614 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88442869-dcc3-4d84-9ee5-91756a2394ef-logs\") pod \"barbican-api-746d5c8778-w7l5z\" (UID: \"88442869-dcc3-4d84-9ee5-91756a2394ef\") " pod="openstack/barbican-api-746d5c8778-w7l5z" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.136636 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/88442869-dcc3-4d84-9ee5-91756a2394ef-config-data-custom\") pod \"barbican-api-746d5c8778-w7l5z\" (UID: \"88442869-dcc3-4d84-9ee5-91756a2394ef\") " pod="openstack/barbican-api-746d5c8778-w7l5z" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.136661 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e373294-c60d-4f3f-b56b-4c73c06b2ce9-config\") pod \"dnsmasq-dns-596df78cd9-x5jj2\" (UID: \"9e373294-c60d-4f3f-b56b-4c73c06b2ce9\") " pod="openstack/dnsmasq-dns-596df78cd9-x5jj2" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.136697 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkv85\" (UniqueName: \"kubernetes.io/projected/9e373294-c60d-4f3f-b56b-4c73c06b2ce9-kube-api-access-vkv85\") pod 
\"dnsmasq-dns-596df78cd9-x5jj2\" (UID: \"9e373294-c60d-4f3f-b56b-4c73c06b2ce9\") " pod="openstack/dnsmasq-dns-596df78cd9-x5jj2" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.136725 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88442869-dcc3-4d84-9ee5-91756a2394ef-combined-ca-bundle\") pod \"barbican-api-746d5c8778-w7l5z\" (UID: \"88442869-dcc3-4d84-9ee5-91756a2394ef\") " pod="openstack/barbican-api-746d5c8778-w7l5z" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.137619 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9e373294-c60d-4f3f-b56b-4c73c06b2ce9-ovsdbserver-sb\") pod \"dnsmasq-dns-596df78cd9-x5jj2\" (UID: \"9e373294-c60d-4f3f-b56b-4c73c06b2ce9\") " pod="openstack/dnsmasq-dns-596df78cd9-x5jj2" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.137623 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e373294-c60d-4f3f-b56b-4c73c06b2ce9-ovsdbserver-nb\") pod \"dnsmasq-dns-596df78cd9-x5jj2\" (UID: \"9e373294-c60d-4f3f-b56b-4c73c06b2ce9\") " pod="openstack/dnsmasq-dns-596df78cd9-x5jj2" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.138033 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e373294-c60d-4f3f-b56b-4c73c06b2ce9-config\") pod \"dnsmasq-dns-596df78cd9-x5jj2\" (UID: \"9e373294-c60d-4f3f-b56b-4c73c06b2ce9\") " pod="openstack/dnsmasq-dns-596df78cd9-x5jj2" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.138606 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e373294-c60d-4f3f-b56b-4c73c06b2ce9-dns-svc\") pod \"dnsmasq-dns-596df78cd9-x5jj2\" (UID: \"9e373294-c60d-4f3f-b56b-4c73c06b2ce9\") " pod="openstack/dnsmasq-dns-596df78cd9-x5jj2" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.156356 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkv85\" (UniqueName: \"kubernetes.io/projected/9e373294-c60d-4f3f-b56b-4c73c06b2ce9-kube-api-access-vkv85\") pod \"dnsmasq-dns-596df78cd9-x5jj2\" (UID: \"9e373294-c60d-4f3f-b56b-4c73c06b2ce9\") " pod="openstack/dnsmasq-dns-596df78cd9-x5jj2" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.172449 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7c6866897d-sgztm" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.248996 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88442869-dcc3-4d84-9ee5-91756a2394ef-combined-ca-bundle\") pod \"barbican-api-746d5c8778-w7l5z\" (UID: \"88442869-dcc3-4d84-9ee5-91756a2394ef\") " pod="openstack/barbican-api-746d5c8778-w7l5z" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.249056 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhcbj\" (UniqueName: \"kubernetes.io/projected/88442869-dcc3-4d84-9ee5-91756a2394ef-kube-api-access-rhcbj\") pod \"barbican-api-746d5c8778-w7l5z\" (UID: \"88442869-dcc3-4d84-9ee5-91756a2394ef\") " pod="openstack/barbican-api-746d5c8778-w7l5z" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.249075 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88442869-dcc3-4d84-9ee5-91756a2394ef-config-data\") pod \"barbican-api-746d5c8778-w7l5z\" (UID: \"88442869-dcc3-4d84-9ee5-91756a2394ef\") " pod="openstack/barbican-api-746d5c8778-w7l5z" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.249125 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88442869-dcc3-4d84-9ee5-91756a2394ef-logs\") pod \"barbican-api-746d5c8778-w7l5z\" (UID: \"88442869-dcc3-4d84-9ee5-91756a2394ef\") " pod="openstack/barbican-api-746d5c8778-w7l5z" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.249150 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/88442869-dcc3-4d84-9ee5-91756a2394ef-config-data-custom\") pod \"barbican-api-746d5c8778-w7l5z\" (UID: \"88442869-dcc3-4d84-9ee5-91756a2394ef\") " pod="openstack/barbican-api-746d5c8778-w7l5z" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.250717 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-596df78cd9-x5jj2" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.251261 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88442869-dcc3-4d84-9ee5-91756a2394ef-logs\") pod \"barbican-api-746d5c8778-w7l5z\" (UID: \"88442869-dcc3-4d84-9ee5-91756a2394ef\") " pod="openstack/barbican-api-746d5c8778-w7l5z" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.253452 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/88442869-dcc3-4d84-9ee5-91756a2394ef-config-data-custom\") pod \"barbican-api-746d5c8778-w7l5z\" (UID: \"88442869-dcc3-4d84-9ee5-91756a2394ef\") " pod="openstack/barbican-api-746d5c8778-w7l5z" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.256510 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88442869-dcc3-4d84-9ee5-91756a2394ef-config-data\") pod \"barbican-api-746d5c8778-w7l5z\" (UID: \"88442869-dcc3-4d84-9ee5-91756a2394ef\") " pod="openstack/barbican-api-746d5c8778-w7l5z" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.258509 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88442869-dcc3-4d84-9ee5-91756a2394ef-combined-ca-bundle\") pod \"barbican-api-746d5c8778-w7l5z\" (UID: \"88442869-dcc3-4d84-9ee5-91756a2394ef\") " pod="openstack/barbican-api-746d5c8778-w7l5z" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.271066 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhcbj\" (UniqueName: \"kubernetes.io/projected/88442869-dcc3-4d84-9ee5-91756a2394ef-kube-api-access-rhcbj\") pod \"barbican-api-746d5c8778-w7l5z\" (UID: \"88442869-dcc3-4d84-9ee5-91756a2394ef\") " pod="openstack/barbican-api-746d5c8778-w7l5z" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.305625 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-746d5c8778-w7l5z" Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.556998 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-69b68b4587-6nfgt"] Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.681284 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7c6866897d-sgztm"] Dec 09 11:30:45 crc kubenswrapper[5002]: W1209 11:30:45.687374 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04e2ad2f_2d91_4eb3_95bb_62229762bc1c.slice/crio-13c750d2e426e58afcf25965caf8165a319ec2912aca75abbf91d239f835ace7 WatchSource:0}: Error finding container 13c750d2e426e58afcf25965caf8165a319ec2912aca75abbf91d239f835ace7: Status 404 returned error can't find the container with id 13c750d2e426e58afcf25965caf8165a319ec2912aca75abbf91d239f835ace7 Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.817945 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-596df78cd9-x5jj2"] Dec 09 11:30:45 crc kubenswrapper[5002]: I1209 11:30:45.830129 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-746d5c8778-w7l5z"] Dec 09 11:30:46 crc kubenswrapper[5002]: I1209 11:30:46.190075 5002 scope.go:117] "RemoveContainer" containerID="07303139730ec5d256bff4a041d4b5957d21f0bad9f3bfd59e40a44f3190736a" Dec 09 11:30:46 crc kubenswrapper[5002]: I1209 11:30:46.572476 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-746d5c8778-w7l5z" event={"ID":"88442869-dcc3-4d84-9ee5-91756a2394ef","Type":"ContainerStarted","Data":"1effd571bd3885aa121c8ce99844049406cc3976e6745d0ca5fce5100cd7cb4d"} Dec 09 11:30:46 crc kubenswrapper[5002]: I1209 11:30:46.572549 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-746d5c8778-w7l5z" event={"ID":"88442869-dcc3-4d84-9ee5-91756a2394ef","Type":"ContainerStarted","Data":"de1f6fa4b6b35e7284086b8c5b76d2990b21ab51a74d62ce28adf921b19cd9f5"} Dec 09 11:30:46 crc kubenswrapper[5002]: I1209 11:30:46.575211 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7c6866897d-sgztm" event={"ID":"04e2ad2f-2d91-4eb3-95bb-62229762bc1c","Type":"ContainerStarted","Data":"25cc4c0bc35d1b8a245caa4cac0990825951488bb06939249239674a83999b27"} Dec 09 11:30:46 crc kubenswrapper[5002]: I1209 11:30:46.575256 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7c6866897d-sgztm" event={"ID":"04e2ad2f-2d91-4eb3-95bb-62229762bc1c","Type":"ContainerStarted","Data":"c1dd23599ccf6e907c1f914297b061a6ac1b5af0dac2dc54af80f0cb803eb1c9"} Dec 09 11:30:46 crc kubenswrapper[5002]: I1209 11:30:46.575273 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7c6866897d-sgztm" event={"ID":"04e2ad2f-2d91-4eb3-95bb-62229762bc1c","Type":"ContainerStarted","Data":"13c750d2e426e58afcf25965caf8165a319ec2912aca75abbf91d239f835ace7"} Dec 09 11:30:46 crc kubenswrapper[5002]: I1209 11:30:46.577161 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-596df78cd9-x5jj2" event={"ID":"9e373294-c60d-4f3f-b56b-4c73c06b2ce9","Type":"ContainerStarted","Data":"ac5d62412818034d443155e471c9c77ba0218e880e782e725003fbf323db8041"} Dec 09 11:30:46 crc kubenswrapper[5002]: I1209 11:30:46.577205 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-596df78cd9-x5jj2" event={"ID":"9e373294-c60d-4f3f-b56b-4c73c06b2ce9","Type":"ContainerStarted","Data":"3492010e59380e9ca81197aa160eab0a09f173294c3ea57c65ff08a19f6a8aac"} Dec 09 11:30:46 crc kubenswrapper[5002]: I1209 11:30:46.579262 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-69b68b4587-6nfgt" event={"ID":"785d609c-802a-48b2-b58a-7eaa54867270","Type":"ContainerStarted","Data":"00b2cefc53b80530915603eb53d7ee335e364dcf3fa73f044dedeb9b8adf052d"} Dec 09 11:30:46 crc kubenswrapper[5002]: I1209 11:30:46.579305 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-69b68b4587-6nfgt" event={"ID":"785d609c-802a-48b2-b58a-7eaa54867270","Type":"ContainerStarted","Data":"bc67666c97f5b7965cb5fe798ef9d0f59cd021715b35c5da6e3bb0c40716a794"} Dec 09 11:30:46 crc kubenswrapper[5002]: I1209 11:30:46.579317 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-69b68b4587-6nfgt" event={"ID":"785d609c-802a-48b2-b58a-7eaa54867270","Type":"ContainerStarted","Data":"e34b62f1abb12de6e42831b5a165c86f1fe5dfb94ccd447c168d7f60ea2ba39d"} Dec 09 11:30:46 crc kubenswrapper[5002]: I1209 11:30:46.592799 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7c6866897d-sgztm" podStartSLOduration=2.592780515 podStartE2EDuration="2.592780515s" podCreationTimestamp="2025-12-09 11:30:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:30:46.590544445 +0000 UTC m=+5378.982595526" watchObservedRunningTime="2025-12-09 11:30:46.592780515 +0000 UTC m=+5378.984831596" Dec 09 11:30:46 crc kubenswrapper[5002]: I1209 11:30:46.653802 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-69b68b4587-6nfgt" podStartSLOduration=2.653783677 podStartE2EDuration="2.653783677s" podCreationTimestamp="2025-12-09 11:30:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:30:46.617292231 +0000 UTC m=+5379.009343322" watchObservedRunningTime="2025-12-09 11:30:46.653783677 +0000 UTC m=+5379.045834758" Dec 09 11:30:47 crc kubenswrapper[5002]: I1209 11:30:47.590505 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-746d5c8778-w7l5z" event={"ID":"88442869-dcc3-4d84-9ee5-91756a2394ef","Type":"ContainerStarted","Data":"04e6ba45a32076dc4bc53e91e70661cdb1824dc61cea56293a9ce6e95840faa4"} Dec 09 11:30:47 crc kubenswrapper[5002]: I1209 11:30:47.590828 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-746d5c8778-w7l5z" Dec 09 11:30:47 crc kubenswrapper[5002]: I1209 11:30:47.590845 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-746d5c8778-w7l5z" Dec 09 11:30:47 crc kubenswrapper[5002]: I1209 11:30:47.592089 5002 generic.go:334] "Generic (PLEG): container finished" podID="9e373294-c60d-4f3f-b56b-4c73c06b2ce9" containerID="ac5d62412818034d443155e471c9c77ba0218e880e782e725003fbf323db8041" exitCode=0 Dec 09 11:30:47 crc kubenswrapper[5002]: I1209 11:30:47.592165 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-596df78cd9-x5jj2" event={"ID":"9e373294-c60d-4f3f-b56b-4c73c06b2ce9","Type":"ContainerDied","Data":"ac5d62412818034d443155e471c9c77ba0218e880e782e725003fbf323db8041"} 
Dec 09 11:30:47 crc kubenswrapper[5002]: I1209 11:30:47.638057 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-746d5c8778-w7l5z" podStartSLOduration=3.6380402739999997 podStartE2EDuration="3.638040274s" podCreationTimestamp="2025-12-09 11:30:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:30:47.614437403 +0000 UTC m=+5380.006488524" watchObservedRunningTime="2025-12-09 11:30:47.638040274 +0000 UTC m=+5380.030091355" Dec 09 11:30:48 crc kubenswrapper[5002]: I1209 11:30:48.600882 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-596df78cd9-x5jj2" event={"ID":"9e373294-c60d-4f3f-b56b-4c73c06b2ce9","Type":"ContainerStarted","Data":"bf8f476b3078210f286f2049b6802851486ef1848d27669981515b05259503c2"} Dec 09 11:30:48 crc kubenswrapper[5002]: I1209 11:30:48.623825 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-596df78cd9-x5jj2" podStartSLOduration=4.623786852 podStartE2EDuration="4.623786852s" podCreationTimestamp="2025-12-09 11:30:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:30:48.617765801 +0000 UTC m=+5381.009816892" watchObservedRunningTime="2025-12-09 11:30:48.623786852 +0000 UTC m=+5381.015837933" Dec 09 11:30:49 crc kubenswrapper[5002]: I1209 11:30:49.061210 5002 scope.go:117] "RemoveContainer" containerID="24eda190128d46e2bfa806f4839b38f2462cd8acaa8816efdf9934cf2dc46679" Dec 09 11:30:49 crc kubenswrapper[5002]: E1209 11:30:49.061692 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:30:49 crc kubenswrapper[5002]: I1209 11:30:49.608607 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-596df78cd9-x5jj2" Dec 09 11:30:55 crc kubenswrapper[5002]: I1209 11:30:55.254105 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-596df78cd9-x5jj2" Dec 09 11:30:55 crc kubenswrapper[5002]: I1209 11:30:55.329964 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58865cd75-7vg59"] Dec 09 11:30:55 crc kubenswrapper[5002]: I1209 11:30:55.330786 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58865cd75-7vg59" podUID="bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e" containerName="dnsmasq-dns" containerID="cri-o://9d0624af58cec598367afc30db1261c4a854b1d14cb1a92379f8d3af9fbc6ea1" gracePeriod=10 Dec 09 11:30:55 crc kubenswrapper[5002]: I1209 11:30:55.672696 5002 generic.go:334] "Generic (PLEG): container finished" podID="bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e" containerID="9d0624af58cec598367afc30db1261c4a854b1d14cb1a92379f8d3af9fbc6ea1" exitCode=0 Dec 09 11:30:55 crc kubenswrapper[5002]: I1209 11:30:55.672799 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58865cd75-7vg59" 
event={"ID":"bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e","Type":"ContainerDied","Data":"9d0624af58cec598367afc30db1261c4a854b1d14cb1a92379f8d3af9fbc6ea1"} Dec 09 11:30:55 crc kubenswrapper[5002]: I1209 11:30:55.840556 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58865cd75-7vg59" Dec 09 11:30:55 crc kubenswrapper[5002]: I1209 11:30:55.961283 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e-dns-svc\") pod \"bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e\" (UID: \"bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e\") " Dec 09 11:30:55 crc kubenswrapper[5002]: I1209 11:30:55.961415 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e-config\") pod \"bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e\" (UID: \"bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e\") " Dec 09 11:30:55 crc kubenswrapper[5002]: I1209 11:30:55.961450 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e-ovsdbserver-sb\") pod \"bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e\" (UID: \"bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e\") " Dec 09 11:30:55 crc kubenswrapper[5002]: I1209 11:30:55.961476 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e-ovsdbserver-nb\") pod \"bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e\" (UID: \"bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e\") " Dec 09 11:30:55 crc kubenswrapper[5002]: I1209 11:30:55.961523 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6xrz\" (UniqueName: \"kubernetes.io/projected/bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e-kube-api-access-t6xrz\") pod \"bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e\" (UID: \"bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e\") " Dec 09 11:30:55 crc kubenswrapper[5002]: I1209 11:30:55.969042 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e-kube-api-access-t6xrz" (OuterVolumeSpecName: "kube-api-access-t6xrz") pod "bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e" (UID: "bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e"). InnerVolumeSpecName "kube-api-access-t6xrz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:30:56 crc kubenswrapper[5002]: I1209 11:30:56.014556 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e" (UID: "bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:30:56 crc kubenswrapper[5002]: I1209 11:30:56.026410 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e" (UID: "bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:30:56 crc kubenswrapper[5002]: I1209 11:30:56.037280 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e" (UID: "bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:30:56 crc kubenswrapper[5002]: I1209 11:30:56.039534 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e-config" (OuterVolumeSpecName: "config") pod "bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e" (UID: "bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:30:56 crc kubenswrapper[5002]: I1209 11:30:56.063566 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:30:56 crc kubenswrapper[5002]: I1209 11:30:56.063597 5002 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 09 11:30:56 crc kubenswrapper[5002]: I1209 11:30:56.063610 5002 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 11:30:56 crc kubenswrapper[5002]: I1209 11:30:56.063623 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6xrz\" (UniqueName: \"kubernetes.io/projected/bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e-kube-api-access-t6xrz\") on node \"crc\" DevicePath \"\"" Dec 09 11:30:56 crc kubenswrapper[5002]: I1209 11:30:56.063635 5002 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 11:30:56 crc kubenswrapper[5002]: I1209 11:30:56.683283 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58865cd75-7vg59" event={"ID":"bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e","Type":"ContainerDied","Data":"459bd6e37baf706c6c03869a3862b96b92f5235f4c35e4afe3a474fa8cdb0ee5"} Dec 09 11:30:56 crc kubenswrapper[5002]: I1209 11:30:56.683334 5002 scope.go:117] "RemoveContainer" containerID="9d0624af58cec598367afc30db1261c4a854b1d14cb1a92379f8d3af9fbc6ea1" Dec 09 11:30:56 crc kubenswrapper[5002]: I1209 11:30:56.683440 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58865cd75-7vg59" Dec 09 11:30:56 crc kubenswrapper[5002]: I1209 11:30:56.714352 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58865cd75-7vg59"] Dec 09 11:30:56 crc kubenswrapper[5002]: I1209 11:30:56.720886 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58865cd75-7vg59"] Dec 09 11:30:56 crc kubenswrapper[5002]: I1209 11:30:56.724295 5002 scope.go:117] "RemoveContainer" containerID="914c82607a7c95b17d4bd573522f17f6b92d983a16fe95e794cfc586583572ae" Dec 09 11:30:56 crc kubenswrapper[5002]: I1209 11:30:56.890518 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-746d5c8778-w7l5z" Dec 09 11:30:57 crc kubenswrapper[5002]: I1209 11:30:57.007490 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-746d5c8778-w7l5z" Dec 09 11:30:58 crc kubenswrapper[5002]: I1209 11:30:58.091259 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e" path="/var/lib/kubelet/pods/bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e/volumes" Dec 09 11:31:00 crc kubenswrapper[5002]: I1209 11:31:00.060318 5002 scope.go:117] "RemoveContainer" containerID="24eda190128d46e2bfa806f4839b38f2462cd8acaa8816efdf9934cf2dc46679" Dec 09 11:31:00 crc kubenswrapper[5002]: E1209 11:31:00.061104 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:31:09 crc kubenswrapper[5002]: I1209 11:31:09.579581 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-mmk2t"] Dec 09 11:31:09 crc kubenswrapper[5002]: E1209 11:31:09.580855 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e" containerName="init" Dec 09 11:31:09 crc kubenswrapper[5002]: I1209 11:31:09.580872 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e" containerName="init" Dec 09 11:31:09 crc kubenswrapper[5002]: E1209 11:31:09.580892 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e" containerName="dnsmasq-dns" Dec 09 11:31:09 crc kubenswrapper[5002]: I1209 11:31:09.580899 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e" containerName="dnsmasq-dns" Dec 09 11:31:09 crc kubenswrapper[5002]: I1209 11:31:09.581075 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf5d35ce-bd73-4ac1-bb1d-ba6816b9b70e" containerName="dnsmasq-dns" Dec 09 11:31:09 crc kubenswrapper[5002]: I1209 11:31:09.582064 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-mmk2t" Dec 09 11:31:09 crc kubenswrapper[5002]: I1209 11:31:09.603111 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-mmk2t"] Dec 09 11:31:09 crc kubenswrapper[5002]: I1209 11:31:09.636163 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96vxz\" (UniqueName: \"kubernetes.io/projected/d023e717-31e3-445f-9634-6cc0e92b6870-kube-api-access-96vxz\") pod \"neutron-db-create-mmk2t\" (UID: \"d023e717-31e3-445f-9634-6cc0e92b6870\") " pod="openstack/neutron-db-create-mmk2t" Dec 09 11:31:09 crc kubenswrapper[5002]: I1209 11:31:09.636238 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d023e717-31e3-445f-9634-6cc0e92b6870-operator-scripts\") pod \"neutron-db-create-mmk2t\" (UID: \"d023e717-31e3-445f-9634-6cc0e92b6870\") " pod="openstack/neutron-db-create-mmk2t" Dec 09 11:31:09 crc kubenswrapper[5002]: I1209 11:31:09.681946 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-9fe7-account-create-update-t2szh"] Dec 09 11:31:09 crc kubenswrapper[5002]: I1209 11:31:09.683205 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9fe7-account-create-update-t2szh" Dec 09 11:31:09 crc kubenswrapper[5002]: I1209 11:31:09.684880 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 09 11:31:09 crc kubenswrapper[5002]: I1209 11:31:09.695982 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-9fe7-account-create-update-t2szh"] Dec 09 11:31:09 crc kubenswrapper[5002]: I1209 11:31:09.737550 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96vxz\" (UniqueName: \"kubernetes.io/projected/d023e717-31e3-445f-9634-6cc0e92b6870-kube-api-access-96vxz\") pod \"neutron-db-create-mmk2t\" (UID: \"d023e717-31e3-445f-9634-6cc0e92b6870\") " pod="openstack/neutron-db-create-mmk2t" Dec 09 11:31:09 crc kubenswrapper[5002]: I1209 11:31:09.737610 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d023e717-31e3-445f-9634-6cc0e92b6870-operator-scripts\") pod \"neutron-db-create-mmk2t\" (UID: \"d023e717-31e3-445f-9634-6cc0e92b6870\") " pod="openstack/neutron-db-create-mmk2t" Dec 09 11:31:09 crc kubenswrapper[5002]: I1209 11:31:09.737636 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wtwg\" (UniqueName: \"kubernetes.io/projected/c402dd03-2420-4dcc-a745-0685e3a997c8-kube-api-access-8wtwg\") pod \"neutron-9fe7-account-create-update-t2szh\" (UID: \"c402dd03-2420-4dcc-a745-0685e3a997c8\") " pod="openstack/neutron-9fe7-account-create-update-t2szh" Dec 09 11:31:09 crc kubenswrapper[5002]: I1209 11:31:09.737669 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c402dd03-2420-4dcc-a745-0685e3a997c8-operator-scripts\") pod \"neutron-9fe7-account-create-update-t2szh\" (UID: \"c402dd03-2420-4dcc-a745-0685e3a997c8\") " pod="openstack/neutron-9fe7-account-create-update-t2szh" Dec 09 11:31:09 crc kubenswrapper[5002]: I1209 11:31:09.738725 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d023e717-31e3-445f-9634-6cc0e92b6870-operator-scripts\") pod \"neutron-db-create-mmk2t\" (UID: \"d023e717-31e3-445f-9634-6cc0e92b6870\") " pod="openstack/neutron-db-create-mmk2t" Dec 09 11:31:09 crc kubenswrapper[5002]: I1209 11:31:09.757605 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96vxz\" (UniqueName: \"kubernetes.io/projected/d023e717-31e3-445f-9634-6cc0e92b6870-kube-api-access-96vxz\") pod \"neutron-db-create-mmk2t\" (UID: \"d023e717-31e3-445f-9634-6cc0e92b6870\") " pod="openstack/neutron-db-create-mmk2t" Dec 09 11:31:09 crc kubenswrapper[5002]: I1209 11:31:09.838848 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wtwg\" (UniqueName: \"kubernetes.io/projected/c402dd03-2420-4dcc-a745-0685e3a997c8-kube-api-access-8wtwg\") pod \"neutron-9fe7-account-create-update-t2szh\" (UID: \"c402dd03-2420-4dcc-a745-0685e3a997c8\") " pod="openstack/neutron-9fe7-account-create-update-t2szh" Dec 09 11:31:09 crc kubenswrapper[5002]: I1209 11:31:09.839148 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c402dd03-2420-4dcc-a745-0685e3a997c8-operator-scripts\") pod \"neutron-9fe7-account-create-update-t2szh\" (UID: \"c402dd03-2420-4dcc-a745-0685e3a997c8\") " pod="openstack/neutron-9fe7-account-create-update-t2szh" Dec 09 11:31:09 crc kubenswrapper[5002]: I1209 11:31:09.839825 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c402dd03-2420-4dcc-a745-0685e3a997c8-operator-scripts\") pod \"neutron-9fe7-account-create-update-t2szh\" (UID: \"c402dd03-2420-4dcc-a745-0685e3a997c8\") " pod="openstack/neutron-9fe7-account-create-update-t2szh" Dec 09 11:31:09 crc kubenswrapper[5002]: I1209 11:31:09.853833 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wtwg\" (UniqueName: \"kubernetes.io/projected/c402dd03-2420-4dcc-a745-0685e3a997c8-kube-api-access-8wtwg\") pod \"neutron-9fe7-account-create-update-t2szh\" (UID: \"c402dd03-2420-4dcc-a745-0685e3a997c8\") " pod="openstack/neutron-9fe7-account-create-update-t2szh" Dec 09 11:31:09 crc kubenswrapper[5002]: I1209 11:31:09.900763 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-mmk2t" Dec 09 11:31:10 crc kubenswrapper[5002]: I1209 11:31:10.010320 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-9fe7-account-create-update-t2szh" Dec 09 11:31:10 crc kubenswrapper[5002]: I1209 11:31:10.336405 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-mmk2t"] Dec 09 11:31:10 crc kubenswrapper[5002]: W1209 11:31:10.340796 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd023e717_31e3_445f_9634_6cc0e92b6870.slice/crio-65f4af90f7611f075ca1513275efb0a4b59d5fee66b83a5bdf1b26434741973a WatchSource:0}: Error finding container 65f4af90f7611f075ca1513275efb0a4b59d5fee66b83a5bdf1b26434741973a: Status 404 returned error can't find the container with id 65f4af90f7611f075ca1513275efb0a4b59d5fee66b83a5bdf1b26434741973a Dec 09 11:31:10 crc kubenswrapper[5002]: I1209 11:31:10.463661 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-9fe7-account-create-update-t2szh"] Dec 09 11:31:10 crc kubenswrapper[5002]: W1209 11:31:10.466077 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc402dd03_2420_4dcc_a745_0685e3a997c8.slice/crio-c37c108a341d83a4bcc7033927b21cf038d9fe943b75917bb446e97fb55f0af9 WatchSource:0}: Error finding container c37c108a341d83a4bcc7033927b21cf038d9fe943b75917bb446e97fb55f0af9: Status 404 returned error can't find the container with id c37c108a341d83a4bcc7033927b21cf038d9fe943b75917bb446e97fb55f0af9 Dec 09 11:31:10 crc kubenswrapper[5002]: I1209 11:31:10.796828 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9fe7-account-create-update-t2szh" event={"ID":"c402dd03-2420-4dcc-a745-0685e3a997c8","Type":"ContainerStarted","Data":"47d7a28455611c23e87f0f23ea5e5ae932d18a03dcc49b0037afe1f4eb41c58f"} Dec 09 11:31:10 crc kubenswrapper[5002]: I1209 11:31:10.797745 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9fe7-account-create-update-t2szh" event={"ID":"c402dd03-2420-4dcc-a745-0685e3a997c8","Type":"ContainerStarted","Data":"c37c108a341d83a4bcc7033927b21cf038d9fe943b75917bb446e97fb55f0af9"} Dec 09 11:31:10 crc kubenswrapper[5002]: I1209 11:31:10.799500 5002 generic.go:334] "Generic (PLEG): container finished" podID="d023e717-31e3-445f-9634-6cc0e92b6870" containerID="7c44740fc24f76b2cec5038f0aad39170b0333d0a8578ff0449ee320600e59db" exitCode=0 Dec 09 11:31:10 crc kubenswrapper[5002]: I1209 11:31:10.799558 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-mmk2t" event={"ID":"d023e717-31e3-445f-9634-6cc0e92b6870","Type":"ContainerDied","Data":"7c44740fc24f76b2cec5038f0aad39170b0333d0a8578ff0449ee320600e59db"} Dec 09 11:31:10 crc kubenswrapper[5002]: I1209 11:31:10.799596 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-mmk2t" event={"ID":"d023e717-31e3-445f-9634-6cc0e92b6870","Type":"ContainerStarted","Data":"65f4af90f7611f075ca1513275efb0a4b59d5fee66b83a5bdf1b26434741973a"} Dec 09 11:31:10 crc kubenswrapper[5002]: I1209 11:31:10.818734 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-9fe7-account-create-update-t2szh" podStartSLOduration=1.818715785 podStartE2EDuration="1.818715785s" podCreationTimestamp="2025-12-09 11:31:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:31:10.810917217 +0000 UTC m=+5403.202968308" 
watchObservedRunningTime="2025-12-09 11:31:10.818715785 +0000 UTC m=+5403.210766866" Dec 09 11:31:11 crc kubenswrapper[5002]: I1209 11:31:11.061165 5002 scope.go:117] "RemoveContainer" containerID="24eda190128d46e2bfa806f4839b38f2462cd8acaa8816efdf9934cf2dc46679" Dec 09 11:31:11 crc kubenswrapper[5002]: E1209 11:31:11.061634 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:31:11 crc kubenswrapper[5002]: I1209 11:31:11.809064 5002 generic.go:334] "Generic (PLEG): container finished" podID="c402dd03-2420-4dcc-a745-0685e3a997c8" containerID="47d7a28455611c23e87f0f23ea5e5ae932d18a03dcc49b0037afe1f4eb41c58f" exitCode=0 Dec 09 11:31:11 crc kubenswrapper[5002]: I1209 11:31:11.809206 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9fe7-account-create-update-t2szh" event={"ID":"c402dd03-2420-4dcc-a745-0685e3a997c8","Type":"ContainerDied","Data":"47d7a28455611c23e87f0f23ea5e5ae932d18a03dcc49b0037afe1f4eb41c58f"} Dec 09 11:31:12 crc kubenswrapper[5002]: I1209 11:31:12.203315 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-mmk2t" Dec 09 11:31:12 crc kubenswrapper[5002]: I1209 11:31:12.287409 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d023e717-31e3-445f-9634-6cc0e92b6870-operator-scripts\") pod \"d023e717-31e3-445f-9634-6cc0e92b6870\" (UID: \"d023e717-31e3-445f-9634-6cc0e92b6870\") " Dec 09 11:31:12 crc kubenswrapper[5002]: I1209 11:31:12.288071 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96vxz\" (UniqueName: \"kubernetes.io/projected/d023e717-31e3-445f-9634-6cc0e92b6870-kube-api-access-96vxz\") pod \"d023e717-31e3-445f-9634-6cc0e92b6870\" (UID: \"d023e717-31e3-445f-9634-6cc0e92b6870\") " Dec 09 11:31:12 crc kubenswrapper[5002]: I1209 11:31:12.288196 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d023e717-31e3-445f-9634-6cc0e92b6870-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d023e717-31e3-445f-9634-6cc0e92b6870" (UID: "d023e717-31e3-445f-9634-6cc0e92b6870"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:31:12 crc kubenswrapper[5002]: I1209 11:31:12.288511 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d023e717-31e3-445f-9634-6cc0e92b6870-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:31:12 crc kubenswrapper[5002]: I1209 11:31:12.294488 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d023e717-31e3-445f-9634-6cc0e92b6870-kube-api-access-96vxz" (OuterVolumeSpecName: "kube-api-access-96vxz") pod "d023e717-31e3-445f-9634-6cc0e92b6870" (UID: "d023e717-31e3-445f-9634-6cc0e92b6870"). InnerVolumeSpecName "kube-api-access-96vxz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:31:12 crc kubenswrapper[5002]: I1209 11:31:12.390015 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96vxz\" (UniqueName: \"kubernetes.io/projected/d023e717-31e3-445f-9634-6cc0e92b6870-kube-api-access-96vxz\") on node \"crc\" DevicePath \"\"" Dec 09 11:31:12 crc kubenswrapper[5002]: I1209 11:31:12.820151 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-mmk2t" Dec 09 11:31:12 crc kubenswrapper[5002]: I1209 11:31:12.820143 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-mmk2t" event={"ID":"d023e717-31e3-445f-9634-6cc0e92b6870","Type":"ContainerDied","Data":"65f4af90f7611f075ca1513275efb0a4b59d5fee66b83a5bdf1b26434741973a"} Dec 09 11:31:12 crc kubenswrapper[5002]: I1209 11:31:12.820337 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65f4af90f7611f075ca1513275efb0a4b59d5fee66b83a5bdf1b26434741973a" Dec 09 11:31:13 crc kubenswrapper[5002]: I1209 11:31:13.165628 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9fe7-account-create-update-t2szh" Dec 09 11:31:13 crc kubenswrapper[5002]: I1209 11:31:13.211407 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wtwg\" (UniqueName: \"kubernetes.io/projected/c402dd03-2420-4dcc-a745-0685e3a997c8-kube-api-access-8wtwg\") pod \"c402dd03-2420-4dcc-a745-0685e3a997c8\" (UID: \"c402dd03-2420-4dcc-a745-0685e3a997c8\") " Dec 09 11:31:13 crc kubenswrapper[5002]: I1209 11:31:13.211561 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c402dd03-2420-4dcc-a745-0685e3a997c8-operator-scripts\") pod \"c402dd03-2420-4dcc-a745-0685e3a997c8\" (UID: \"c402dd03-2420-4dcc-a745-0685e3a997c8\") " Dec 09 11:31:13 crc kubenswrapper[5002]: I1209 11:31:13.212470 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c402dd03-2420-4dcc-a745-0685e3a997c8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c402dd03-2420-4dcc-a745-0685e3a997c8" (UID: "c402dd03-2420-4dcc-a745-0685e3a997c8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:31:13 crc kubenswrapper[5002]: I1209 11:31:13.242491 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c402dd03-2420-4dcc-a745-0685e3a997c8-kube-api-access-8wtwg" (OuterVolumeSpecName: "kube-api-access-8wtwg") pod "c402dd03-2420-4dcc-a745-0685e3a997c8" (UID: "c402dd03-2420-4dcc-a745-0685e3a997c8"). InnerVolumeSpecName "kube-api-access-8wtwg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:31:13 crc kubenswrapper[5002]: I1209 11:31:13.313714 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wtwg\" (UniqueName: \"kubernetes.io/projected/c402dd03-2420-4dcc-a745-0685e3a997c8-kube-api-access-8wtwg\") on node \"crc\" DevicePath \"\"" Dec 09 11:31:13 crc kubenswrapper[5002]: I1209 11:31:13.313773 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c402dd03-2420-4dcc-a745-0685e3a997c8-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:31:13 crc kubenswrapper[5002]: I1209 11:31:13.838250 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9fe7-account-create-update-t2szh" event={"ID":"c402dd03-2420-4dcc-a745-0685e3a997c8","Type":"ContainerDied","Data":"c37c108a341d83a4bcc7033927b21cf038d9fe943b75917bb446e97fb55f0af9"} Dec 09 11:31:13 crc kubenswrapper[5002]: I1209 11:31:13.838321 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c37c108a341d83a4bcc7033927b21cf038d9fe943b75917bb446e97fb55f0af9" Dec 09 11:31:13 crc kubenswrapper[5002]: I1209 11:31:13.838405 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9fe7-account-create-update-t2szh" Dec 09 11:31:14 crc kubenswrapper[5002]: I1209 11:31:14.892842 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-6qhmt"] Dec 09 11:31:14 crc kubenswrapper[5002]: E1209 11:31:14.893714 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c402dd03-2420-4dcc-a745-0685e3a997c8" containerName="mariadb-account-create-update" Dec 09 11:31:14 crc kubenswrapper[5002]: I1209 11:31:14.893733 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="c402dd03-2420-4dcc-a745-0685e3a997c8" containerName="mariadb-account-create-update" Dec 09 11:31:14 crc kubenswrapper[5002]: E1209 11:31:14.893767 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d023e717-31e3-445f-9634-6cc0e92b6870" containerName="mariadb-database-create" Dec 09 11:31:14 crc kubenswrapper[5002]: I1209 11:31:14.893775 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="d023e717-31e3-445f-9634-6cc0e92b6870" containerName="mariadb-database-create" Dec 09 11:31:14 crc kubenswrapper[5002]: I1209 11:31:14.894022 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="c402dd03-2420-4dcc-a745-0685e3a997c8" containerName="mariadb-account-create-update" Dec 09 11:31:14 crc kubenswrapper[5002]: I1209 11:31:14.894048 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="d023e717-31e3-445f-9634-6cc0e92b6870" containerName="mariadb-database-create" Dec 09 11:31:14 crc kubenswrapper[5002]: I1209 11:31:14.895379 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-6qhmt" Dec 09 11:31:14 crc kubenswrapper[5002]: I1209 11:31:14.898094 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 09 11:31:14 crc kubenswrapper[5002]: I1209 11:31:14.898222 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 09 11:31:14 crc kubenswrapper[5002]: I1209 11:31:14.898495 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-jrpk7" Dec 09 11:31:14 crc kubenswrapper[5002]: I1209 11:31:14.915448 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-6qhmt"] Dec 09 11:31:14 crc kubenswrapper[5002]: I1209 11:31:14.942855 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6451fbf1-4860-402d-a862-6e320f7177e8-config\") pod \"neutron-db-sync-6qhmt\" (UID: \"6451fbf1-4860-402d-a862-6e320f7177e8\") " pod="openstack/neutron-db-sync-6qhmt" Dec 09 11:31:14 crc kubenswrapper[5002]: I1209 11:31:14.942922 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6451fbf1-4860-402d-a862-6e320f7177e8-combined-ca-bundle\") pod \"neutron-db-sync-6qhmt\" (UID: \"6451fbf1-4860-402d-a862-6e320f7177e8\") " pod="openstack/neutron-db-sync-6qhmt" Dec 09 11:31:14 crc kubenswrapper[5002]: I1209 11:31:14.943177 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rcql\" (UniqueName: \"kubernetes.io/projected/6451fbf1-4860-402d-a862-6e320f7177e8-kube-api-access-7rcql\") pod \"neutron-db-sync-6qhmt\" (UID: \"6451fbf1-4860-402d-a862-6e320f7177e8\") " pod="openstack/neutron-db-sync-6qhmt" Dec 09 11:31:15 crc kubenswrapper[5002]: I1209 11:31:15.045439 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rcql\" (UniqueName: \"kubernetes.io/projected/6451fbf1-4860-402d-a862-6e320f7177e8-kube-api-access-7rcql\") pod \"neutron-db-sync-6qhmt\" (UID: \"6451fbf1-4860-402d-a862-6e320f7177e8\") " pod="openstack/neutron-db-sync-6qhmt" Dec 09 11:31:15 crc kubenswrapper[5002]: I1209 11:31:15.045562 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6451fbf1-4860-402d-a862-6e320f7177e8-config\") pod \"neutron-db-sync-6qhmt\" (UID: \"6451fbf1-4860-402d-a862-6e320f7177e8\") " pod="openstack/neutron-db-sync-6qhmt" Dec 09 11:31:15 crc kubenswrapper[5002]: I1209 11:31:15.045597 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6451fbf1-4860-402d-a862-6e320f7177e8-combined-ca-bundle\") pod \"neutron-db-sync-6qhmt\" (UID: \"6451fbf1-4860-402d-a862-6e320f7177e8\") " pod="openstack/neutron-db-sync-6qhmt" Dec 09 11:31:15 crc kubenswrapper[5002]: I1209 11:31:15.050953 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6451fbf1-4860-402d-a862-6e320f7177e8-combined-ca-bundle\") pod \"neutron-db-sync-6qhmt\" (UID: \"6451fbf1-4860-402d-a862-6e320f7177e8\") " pod="openstack/neutron-db-sync-6qhmt" Dec 09 11:31:15 crc kubenswrapper[5002]: I1209 11:31:15.051499 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/secret/6451fbf1-4860-402d-a862-6e320f7177e8-config\") pod \"neutron-db-sync-6qhmt\" (UID: \"6451fbf1-4860-402d-a862-6e320f7177e8\") " pod="openstack/neutron-db-sync-6qhmt" Dec 09 11:31:15 crc kubenswrapper[5002]: I1209 11:31:15.075569 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rcql\" (UniqueName: \"kubernetes.io/projected/6451fbf1-4860-402d-a862-6e320f7177e8-kube-api-access-7rcql\") pod \"neutron-db-sync-6qhmt\" (UID: \"6451fbf1-4860-402d-a862-6e320f7177e8\") " pod="openstack/neutron-db-sync-6qhmt" Dec 09 11:31:15 crc kubenswrapper[5002]: I1209 11:31:15.218473 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-6qhmt" Dec 09 11:31:15 crc kubenswrapper[5002]: I1209 11:31:15.639747 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-6qhmt"] Dec 09 11:31:15 crc kubenswrapper[5002]: W1209 11:31:15.644368 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6451fbf1_4860_402d_a862_6e320f7177e8.slice/crio-1cbd61451c75d4db76ccc015d8019f62ddf41c61196fefae64bca71129bea4a1 WatchSource:0}: Error finding container 1cbd61451c75d4db76ccc015d8019f62ddf41c61196fefae64bca71129bea4a1: Status 404 returned error can't find the container with id 1cbd61451c75d4db76ccc015d8019f62ddf41c61196fefae64bca71129bea4a1 Dec 09 11:31:15 crc kubenswrapper[5002]: I1209 11:31:15.856036 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-6qhmt" event={"ID":"6451fbf1-4860-402d-a862-6e320f7177e8","Type":"ContainerStarted","Data":"900ae01be6554a19f311dcf8d5d4536cf3153c982b7cb47869b08e41783ec6bd"} Dec 09 11:31:15 crc kubenswrapper[5002]: I1209 11:31:15.856368 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-6qhmt" event={"ID":"6451fbf1-4860-402d-a862-6e320f7177e8","Type":"ContainerStarted","Data":"1cbd61451c75d4db76ccc015d8019f62ddf41c61196fefae64bca71129bea4a1"} Dec 09 11:31:15 crc kubenswrapper[5002]: I1209 11:31:15.870298 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-6qhmt" podStartSLOduration=1.870274419 podStartE2EDuration="1.870274419s" podCreationTimestamp="2025-12-09 11:31:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:31:15.86918456 +0000 UTC m=+5408.261235651" watchObservedRunningTime="2025-12-09 11:31:15.870274419 +0000 UTC m=+5408.262325510" Dec 09 11:31:20 crc kubenswrapper[5002]: I1209 11:31:20.904111 5002 generic.go:334] "Generic (PLEG): container finished" podID="6451fbf1-4860-402d-a862-6e320f7177e8" containerID="900ae01be6554a19f311dcf8d5d4536cf3153c982b7cb47869b08e41783ec6bd" exitCode=0 Dec 09 11:31:20 crc kubenswrapper[5002]: I1209 11:31:20.904242 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-6qhmt" event={"ID":"6451fbf1-4860-402d-a862-6e320f7177e8","Type":"ContainerDied","Data":"900ae01be6554a19f311dcf8d5d4536cf3153c982b7cb47869b08e41783ec6bd"} Dec 09 11:31:22 crc kubenswrapper[5002]: I1209 11:31:22.270241 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-6qhmt" Dec 09 11:31:22 crc kubenswrapper[5002]: I1209 11:31:22.391024 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6451fbf1-4860-402d-a862-6e320f7177e8-combined-ca-bundle\") pod \"6451fbf1-4860-402d-a862-6e320f7177e8\" (UID: \"6451fbf1-4860-402d-a862-6e320f7177e8\") " Dec 09 11:31:22 crc kubenswrapper[5002]: I1209 11:31:22.391083 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6451fbf1-4860-402d-a862-6e320f7177e8-config\") pod \"6451fbf1-4860-402d-a862-6e320f7177e8\" (UID: \"6451fbf1-4860-402d-a862-6e320f7177e8\") " Dec 09 11:31:22 crc kubenswrapper[5002]: I1209 11:31:22.391292 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rcql\" (UniqueName: \"kubernetes.io/projected/6451fbf1-4860-402d-a862-6e320f7177e8-kube-api-access-7rcql\") pod \"6451fbf1-4860-402d-a862-6e320f7177e8\" (UID: \"6451fbf1-4860-402d-a862-6e320f7177e8\") " Dec 09 11:31:22 crc kubenswrapper[5002]: I1209 11:31:22.411118 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6451fbf1-4860-402d-a862-6e320f7177e8-kube-api-access-7rcql" (OuterVolumeSpecName: "kube-api-access-7rcql") pod "6451fbf1-4860-402d-a862-6e320f7177e8" (UID: "6451fbf1-4860-402d-a862-6e320f7177e8"). InnerVolumeSpecName "kube-api-access-7rcql". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:31:22 crc kubenswrapper[5002]: I1209 11:31:22.420285 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6451fbf1-4860-402d-a862-6e320f7177e8-config" (OuterVolumeSpecName: "config") pod "6451fbf1-4860-402d-a862-6e320f7177e8" (UID: "6451fbf1-4860-402d-a862-6e320f7177e8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:31:22 crc kubenswrapper[5002]: I1209 11:31:22.431032 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6451fbf1-4860-402d-a862-6e320f7177e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6451fbf1-4860-402d-a862-6e320f7177e8" (UID: "6451fbf1-4860-402d-a862-6e320f7177e8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:31:22 crc kubenswrapper[5002]: I1209 11:31:22.493587 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rcql\" (UniqueName: \"kubernetes.io/projected/6451fbf1-4860-402d-a862-6e320f7177e8-kube-api-access-7rcql\") on node \"crc\" DevicePath \"\"" Dec 09 11:31:22 crc kubenswrapper[5002]: I1209 11:31:22.493629 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6451fbf1-4860-402d-a862-6e320f7177e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:31:22 crc kubenswrapper[5002]: I1209 11:31:22.493643 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/6451fbf1-4860-402d-a862-6e320f7177e8-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:31:22 crc kubenswrapper[5002]: I1209 11:31:22.925408 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-6qhmt" event={"ID":"6451fbf1-4860-402d-a862-6e320f7177e8","Type":"ContainerDied","Data":"1cbd61451c75d4db76ccc015d8019f62ddf41c61196fefae64bca71129bea4a1"} Dec 09 11:31:22 crc kubenswrapper[5002]: I1209 11:31:22.925467 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1cbd61451c75d4db76ccc015d8019f62ddf41c61196fefae64bca71129bea4a1" Dec 09 11:31:22 crc kubenswrapper[5002]: I1209 11:31:22.925488 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-6qhmt" Dec 09 11:31:23 crc kubenswrapper[5002]: I1209 11:31:23.089613 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f59d59797-cwc9r"] Dec 09 11:31:23 crc kubenswrapper[5002]: E1209 11:31:23.090100 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6451fbf1-4860-402d-a862-6e320f7177e8" containerName="neutron-db-sync" Dec 09 11:31:23 crc kubenswrapper[5002]: I1209 11:31:23.090125 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="6451fbf1-4860-402d-a862-6e320f7177e8" containerName="neutron-db-sync" Dec 09 11:31:23 crc kubenswrapper[5002]: I1209 11:31:23.090363 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="6451fbf1-4860-402d-a862-6e320f7177e8" containerName="neutron-db-sync" Dec 09 11:31:23 crc kubenswrapper[5002]: I1209 11:31:23.091478 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59d59797-cwc9r" Dec 09 11:31:23 crc kubenswrapper[5002]: I1209 11:31:23.137458 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59d59797-cwc9r"] Dec 09 11:31:23 crc kubenswrapper[5002]: I1209 11:31:23.177622 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7fb76d694f-vmgzb"] Dec 09 11:31:23 crc kubenswrapper[5002]: I1209 11:31:23.179732 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7fb76d694f-vmgzb" Dec 09 11:31:23 crc kubenswrapper[5002]: I1209 11:31:23.183253 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-jrpk7" Dec 09 11:31:23 crc kubenswrapper[5002]: I1209 11:31:23.183546 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 09 11:31:23 crc kubenswrapper[5002]: I1209 11:31:23.188470 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 09 11:31:23 crc kubenswrapper[5002]: I1209 11:31:23.196800 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7fb76d694f-vmgzb"] Dec 09 11:31:23 crc kubenswrapper[5002]: I1209 11:31:23.210079 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/05aacd78-7157-4300-88d1-de9770dc85ee-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59d59797-cwc9r\" (UID: \"05aacd78-7157-4300-88d1-de9770dc85ee\") " pod="openstack/dnsmasq-dns-5f59d59797-cwc9r" Dec 09 11:31:23 crc kubenswrapper[5002]: I1209 11:31:23.210154 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/05aacd78-7157-4300-88d1-de9770dc85ee-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59d59797-cwc9r\" (UID: \"05aacd78-7157-4300-88d1-de9770dc85ee\") " pod="openstack/dnsmasq-dns-5f59d59797-cwc9r" Dec 09 11:31:23 crc kubenswrapper[5002]: I1209 11:31:23.210193 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/45c5af3b-9ac2-4d9d-b1e4-e7bf01b80558-config\") pod \"neutron-7fb76d694f-vmgzb\" (UID: \"45c5af3b-9ac2-4d9d-b1e4-e7bf01b80558\") " pod="openstack/neutron-7fb76d694f-vmgzb" Dec 09 11:31:23 crc kubenswrapper[5002]: I1209 11:31:23.210226 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/05aacd78-7157-4300-88d1-de9770dc85ee-dns-svc\") pod \"dnsmasq-dns-5f59d59797-cwc9r\" (UID: \"05aacd78-7157-4300-88d1-de9770dc85ee\") " pod="openstack/dnsmasq-dns-5f59d59797-cwc9r" Dec 09 11:31:23 crc kubenswrapper[5002]: I1209 11:31:23.210254 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bs57\" (UniqueName: \"kubernetes.io/projected/05aacd78-7157-4300-88d1-de9770dc85ee-kube-api-access-4bs57\") pod \"dnsmasq-dns-5f59d59797-cwc9r\" (UID: \"05aacd78-7157-4300-88d1-de9770dc85ee\") " pod="openstack/dnsmasq-dns-5f59d59797-cwc9r" Dec 09 11:31:23 crc kubenswrapper[5002]: I1209 11:31:23.210278 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45c5af3b-9ac2-4d9d-b1e4-e7bf01b80558-combined-ca-bundle\") pod \"neutron-7fb76d694f-vmgzb\" (UID: \"45c5af3b-9ac2-4d9d-b1e4-e7bf01b80558\") " pod="openstack/neutron-7fb76d694f-vmgzb" Dec 09 11:31:23 crc kubenswrapper[5002]: I1209 11:31:23.210303 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/45c5af3b-9ac2-4d9d-b1e4-e7bf01b80558-httpd-config\") pod \"neutron-7fb76d694f-vmgzb\" (UID: \"45c5af3b-9ac2-4d9d-b1e4-e7bf01b80558\") " pod="openstack/neutron-7fb76d694f-vmgzb" Dec 09 11:31:23 
crc kubenswrapper[5002]: I1209 11:31:23.210340 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05aacd78-7157-4300-88d1-de9770dc85ee-config\") pod \"dnsmasq-dns-5f59d59797-cwc9r\" (UID: \"05aacd78-7157-4300-88d1-de9770dc85ee\") " pod="openstack/dnsmasq-dns-5f59d59797-cwc9r" Dec 09 11:31:23 crc kubenswrapper[5002]: I1209 11:31:23.210370 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfkzw\" (UniqueName: \"kubernetes.io/projected/45c5af3b-9ac2-4d9d-b1e4-e7bf01b80558-kube-api-access-zfkzw\") pod \"neutron-7fb76d694f-vmgzb\" (UID: \"45c5af3b-9ac2-4d9d-b1e4-e7bf01b80558\") " pod="openstack/neutron-7fb76d694f-vmgzb" Dec 09 11:31:23 crc kubenswrapper[5002]: I1209 11:31:23.312335 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfkzw\" (UniqueName: \"kubernetes.io/projected/45c5af3b-9ac2-4d9d-b1e4-e7bf01b80558-kube-api-access-zfkzw\") pod \"neutron-7fb76d694f-vmgzb\" (UID: \"45c5af3b-9ac2-4d9d-b1e4-e7bf01b80558\") " pod="openstack/neutron-7fb76d694f-vmgzb" Dec 09 11:31:23 crc kubenswrapper[5002]: I1209 11:31:23.312433 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/05aacd78-7157-4300-88d1-de9770dc85ee-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59d59797-cwc9r\" (UID: \"05aacd78-7157-4300-88d1-de9770dc85ee\") " pod="openstack/dnsmasq-dns-5f59d59797-cwc9r" Dec 09 11:31:23 crc kubenswrapper[5002]: I1209 11:31:23.312471 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/05aacd78-7157-4300-88d1-de9770dc85ee-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59d59797-cwc9r\" (UID: \"05aacd78-7157-4300-88d1-de9770dc85ee\") " pod="openstack/dnsmasq-dns-5f59d59797-cwc9r" Dec 09 11:31:23 crc kubenswrapper[5002]: I1209 11:31:23.312501 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/45c5af3b-9ac2-4d9d-b1e4-e7bf01b80558-config\") pod \"neutron-7fb76d694f-vmgzb\" (UID: \"45c5af3b-9ac2-4d9d-b1e4-e7bf01b80558\") " pod="openstack/neutron-7fb76d694f-vmgzb" Dec 09 11:31:23 crc kubenswrapper[5002]: I1209 11:31:23.312526 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/05aacd78-7157-4300-88d1-de9770dc85ee-dns-svc\") pod \"dnsmasq-dns-5f59d59797-cwc9r\" (UID: \"05aacd78-7157-4300-88d1-de9770dc85ee\") " pod="openstack/dnsmasq-dns-5f59d59797-cwc9r" Dec 09 11:31:23 crc kubenswrapper[5002]: I1209 11:31:23.312547 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bs57\" (UniqueName: \"kubernetes.io/projected/05aacd78-7157-4300-88d1-de9770dc85ee-kube-api-access-4bs57\") pod \"dnsmasq-dns-5f59d59797-cwc9r\" (UID: \"05aacd78-7157-4300-88d1-de9770dc85ee\") " pod="openstack/dnsmasq-dns-5f59d59797-cwc9r" Dec 09 11:31:23 crc kubenswrapper[5002]: I1209 11:31:23.312566 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45c5af3b-9ac2-4d9d-b1e4-e7bf01b80558-combined-ca-bundle\") pod \"neutron-7fb76d694f-vmgzb\" (UID: \"45c5af3b-9ac2-4d9d-b1e4-e7bf01b80558\") " pod="openstack/neutron-7fb76d694f-vmgzb" Dec 09 11:31:23 crc kubenswrapper[5002]: I1209 11:31:23.312589 
5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/45c5af3b-9ac2-4d9d-b1e4-e7bf01b80558-httpd-config\") pod \"neutron-7fb76d694f-vmgzb\" (UID: \"45c5af3b-9ac2-4d9d-b1e4-e7bf01b80558\") " pod="openstack/neutron-7fb76d694f-vmgzb" Dec 09 11:31:23 crc kubenswrapper[5002]: I1209 11:31:23.312618 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05aacd78-7157-4300-88d1-de9770dc85ee-config\") pod \"dnsmasq-dns-5f59d59797-cwc9r\" (UID: \"05aacd78-7157-4300-88d1-de9770dc85ee\") " pod="openstack/dnsmasq-dns-5f59d59797-cwc9r" Dec 09 11:31:23 crc kubenswrapper[5002]: I1209 11:31:23.313710 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/05aacd78-7157-4300-88d1-de9770dc85ee-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59d59797-cwc9r\" (UID: \"05aacd78-7157-4300-88d1-de9770dc85ee\") " pod="openstack/dnsmasq-dns-5f59d59797-cwc9r" Dec 09 11:31:23 crc kubenswrapper[5002]: I1209 11:31:23.313924 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/05aacd78-7157-4300-88d1-de9770dc85ee-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59d59797-cwc9r\" (UID: \"05aacd78-7157-4300-88d1-de9770dc85ee\") " pod="openstack/dnsmasq-dns-5f59d59797-cwc9r" Dec 09 11:31:23 crc kubenswrapper[5002]: I1209 11:31:23.316278 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/05aacd78-7157-4300-88d1-de9770dc85ee-dns-svc\") pod \"dnsmasq-dns-5f59d59797-cwc9r\" (UID: \"05aacd78-7157-4300-88d1-de9770dc85ee\") " pod="openstack/dnsmasq-dns-5f59d59797-cwc9r" Dec 09 11:31:23 crc kubenswrapper[5002]: I1209 11:31:23.316510 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05aacd78-7157-4300-88d1-de9770dc85ee-config\") pod \"dnsmasq-dns-5f59d59797-cwc9r\" (UID: \"05aacd78-7157-4300-88d1-de9770dc85ee\") " pod="openstack/dnsmasq-dns-5f59d59797-cwc9r" Dec 09 11:31:23 crc kubenswrapper[5002]: I1209 11:31:23.316717 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45c5af3b-9ac2-4d9d-b1e4-e7bf01b80558-combined-ca-bundle\") pod \"neutron-7fb76d694f-vmgzb\" (UID: \"45c5af3b-9ac2-4d9d-b1e4-e7bf01b80558\") " pod="openstack/neutron-7fb76d694f-vmgzb" Dec 09 11:31:23 crc kubenswrapper[5002]: I1209 11:31:23.318261 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/45c5af3b-9ac2-4d9d-b1e4-e7bf01b80558-config\") pod \"neutron-7fb76d694f-vmgzb\" (UID: \"45c5af3b-9ac2-4d9d-b1e4-e7bf01b80558\") " pod="openstack/neutron-7fb76d694f-vmgzb" Dec 09 11:31:23 crc kubenswrapper[5002]: I1209 11:31:23.318278 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/45c5af3b-9ac2-4d9d-b1e4-e7bf01b80558-httpd-config\") pod \"neutron-7fb76d694f-vmgzb\" (UID: \"45c5af3b-9ac2-4d9d-b1e4-e7bf01b80558\") " pod="openstack/neutron-7fb76d694f-vmgzb" Dec 09 11:31:23 crc kubenswrapper[5002]: I1209 11:31:23.337539 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfkzw\" (UniqueName: \"kubernetes.io/projected/45c5af3b-9ac2-4d9d-b1e4-e7bf01b80558-kube-api-access-zfkzw\") pod 
\"neutron-7fb76d694f-vmgzb\" (UID: \"45c5af3b-9ac2-4d9d-b1e4-e7bf01b80558\") " pod="openstack/neutron-7fb76d694f-vmgzb" Dec 09 11:31:23 crc kubenswrapper[5002]: I1209 11:31:23.340579 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bs57\" (UniqueName: \"kubernetes.io/projected/05aacd78-7157-4300-88d1-de9770dc85ee-kube-api-access-4bs57\") pod \"dnsmasq-dns-5f59d59797-cwc9r\" (UID: \"05aacd78-7157-4300-88d1-de9770dc85ee\") " pod="openstack/dnsmasq-dns-5f59d59797-cwc9r" Dec 09 11:31:23 crc kubenswrapper[5002]: I1209 11:31:23.463553 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59d59797-cwc9r" Dec 09 11:31:23 crc kubenswrapper[5002]: I1209 11:31:23.510268 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7fb76d694f-vmgzb" Dec 09 11:31:23 crc kubenswrapper[5002]: I1209 11:31:23.947564 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59d59797-cwc9r"] Dec 09 11:31:23 crc kubenswrapper[5002]: W1209 11:31:23.953707 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05aacd78_7157_4300_88d1_de9770dc85ee.slice/crio-587402d4b678d8c17d7a9f356f3d7023f80cc75c74378bb4f884caa27c9ccefd WatchSource:0}: Error finding container 587402d4b678d8c17d7a9f356f3d7023f80cc75c74378bb4f884caa27c9ccefd: Status 404 returned error can't find the container with id 587402d4b678d8c17d7a9f356f3d7023f80cc75c74378bb4f884caa27c9ccefd Dec 09 11:31:24 crc kubenswrapper[5002]: I1209 11:31:24.158345 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7fb76d694f-vmgzb"] Dec 09 11:31:24 crc kubenswrapper[5002]: W1209 11:31:24.171250 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45c5af3b_9ac2_4d9d_b1e4_e7bf01b80558.slice/crio-65e5aa1c61492d371473d58ef9b6939982a77d66250de12503d14fb9fcdf8289 WatchSource:0}: Error finding container 65e5aa1c61492d371473d58ef9b6939982a77d66250de12503d14fb9fcdf8289: Status 404 returned error can't find the container with id 65e5aa1c61492d371473d58ef9b6939982a77d66250de12503d14fb9fcdf8289 Dec 09 11:31:24 crc kubenswrapper[5002]: I1209 11:31:24.946110 5002 generic.go:334] "Generic (PLEG): container finished" podID="05aacd78-7157-4300-88d1-de9770dc85ee" containerID="8b89733bfb6323aa3f601b0b1fc7df5bd875b1443c83d283c4dbbc99eae2c110" exitCode=0 Dec 09 11:31:24 crc kubenswrapper[5002]: I1209 11:31:24.946230 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59d59797-cwc9r" event={"ID":"05aacd78-7157-4300-88d1-de9770dc85ee","Type":"ContainerDied","Data":"8b89733bfb6323aa3f601b0b1fc7df5bd875b1443c83d283c4dbbc99eae2c110"} Dec 09 11:31:24 crc kubenswrapper[5002]: I1209 11:31:24.946546 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59d59797-cwc9r" event={"ID":"05aacd78-7157-4300-88d1-de9770dc85ee","Type":"ContainerStarted","Data":"587402d4b678d8c17d7a9f356f3d7023f80cc75c74378bb4f884caa27c9ccefd"} Dec 09 11:31:24 crc kubenswrapper[5002]: I1209 11:31:24.952478 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7fb76d694f-vmgzb" event={"ID":"45c5af3b-9ac2-4d9d-b1e4-e7bf01b80558","Type":"ContainerStarted","Data":"d6fa89ced37d251404d1dc5385e5db70962e1aa7de1354861c75d20726f17ab1"} Dec 09 11:31:24 crc kubenswrapper[5002]: I1209 11:31:24.952526 5002 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7fb76d694f-vmgzb" event={"ID":"45c5af3b-9ac2-4d9d-b1e4-e7bf01b80558","Type":"ContainerStarted","Data":"45a9ff0e8606669bc9dd4a497f83264dc7f56ab4df7610d48a3db2a289fea151"} Dec 09 11:31:24 crc kubenswrapper[5002]: I1209 11:31:24.952540 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7fb76d694f-vmgzb" event={"ID":"45c5af3b-9ac2-4d9d-b1e4-e7bf01b80558","Type":"ContainerStarted","Data":"65e5aa1c61492d371473d58ef9b6939982a77d66250de12503d14fb9fcdf8289"} Dec 09 11:31:24 crc kubenswrapper[5002]: I1209 11:31:24.952661 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7fb76d694f-vmgzb" Dec 09 11:31:24 crc kubenswrapper[5002]: I1209 11:31:24.995350 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7fb76d694f-vmgzb" podStartSLOduration=1.995328487 podStartE2EDuration="1.995328487s" podCreationTimestamp="2025-12-09 11:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:31:24.986539311 +0000 UTC m=+5417.378590412" watchObservedRunningTime="2025-12-09 11:31:24.995328487 +0000 UTC m=+5417.387379568" Dec 09 11:31:25 crc kubenswrapper[5002]: I1209 11:31:25.976431 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59d59797-cwc9r" event={"ID":"05aacd78-7157-4300-88d1-de9770dc85ee","Type":"ContainerStarted","Data":"aa9364982bacc0ce36d6d153500797f6cce4d70f349ff406cf489cfa5ed7257d"} Dec 09 11:31:26 crc kubenswrapper[5002]: I1209 11:31:26.012579 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f59d59797-cwc9r" podStartSLOduration=3.012561297 podStartE2EDuration="3.012561297s" podCreationTimestamp="2025-12-09 11:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:31:26.009537646 +0000 UTC m=+5418.401588797" watchObservedRunningTime="2025-12-09 11:31:26.012561297 +0000 UTC m=+5418.404612388" Dec 09 11:31:26 crc kubenswrapper[5002]: I1209 11:31:26.065143 5002 scope.go:117] "RemoveContainer" containerID="24eda190128d46e2bfa806f4839b38f2462cd8acaa8816efdf9934cf2dc46679" Dec 09 11:31:26 crc kubenswrapper[5002]: E1209 11:31:26.065322 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:31:26 crc kubenswrapper[5002]: I1209 11:31:26.985881 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f59d59797-cwc9r" Dec 09 11:31:33 crc kubenswrapper[5002]: I1209 11:31:33.464973 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5f59d59797-cwc9r" Dec 09 11:31:33 crc kubenswrapper[5002]: I1209 11:31:33.561264 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-596df78cd9-x5jj2"] Dec 09 11:31:33 crc kubenswrapper[5002]: I1209 11:31:33.561730 5002 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-596df78cd9-x5jj2" podUID="9e373294-c60d-4f3f-b56b-4c73c06b2ce9" containerName="dnsmasq-dns" containerID="cri-o://bf8f476b3078210f286f2049b6802851486ef1848d27669981515b05259503c2" gracePeriod=10 Dec 09 11:31:34 crc kubenswrapper[5002]: I1209 11:31:34.018409 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-596df78cd9-x5jj2" Dec 09 11:31:34 crc kubenswrapper[5002]: I1209 11:31:34.074883 5002 generic.go:334] "Generic (PLEG): container finished" podID="9e373294-c60d-4f3f-b56b-4c73c06b2ce9" containerID="bf8f476b3078210f286f2049b6802851486ef1848d27669981515b05259503c2" exitCode=0 Dec 09 11:31:34 crc kubenswrapper[5002]: I1209 11:31:34.074960 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-596df78cd9-x5jj2" Dec 09 11:31:34 crc kubenswrapper[5002]: I1209 11:31:34.079747 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-596df78cd9-x5jj2" event={"ID":"9e373294-c60d-4f3f-b56b-4c73c06b2ce9","Type":"ContainerDied","Data":"bf8f476b3078210f286f2049b6802851486ef1848d27669981515b05259503c2"} Dec 09 11:31:34 crc kubenswrapper[5002]: I1209 11:31:34.079780 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-596df78cd9-x5jj2" event={"ID":"9e373294-c60d-4f3f-b56b-4c73c06b2ce9","Type":"ContainerDied","Data":"3492010e59380e9ca81197aa160eab0a09f173294c3ea57c65ff08a19f6a8aac"} Dec 09 11:31:34 crc kubenswrapper[5002]: I1209 11:31:34.079796 5002 scope.go:117] "RemoveContainer" containerID="bf8f476b3078210f286f2049b6802851486ef1848d27669981515b05259503c2" Dec 09 11:31:34 crc kubenswrapper[5002]: I1209 11:31:34.107744 5002 scope.go:117] "RemoveContainer" containerID="ac5d62412818034d443155e471c9c77ba0218e880e782e725003fbf323db8041" Dec 09 11:31:34 crc kubenswrapper[5002]: I1209 11:31:34.130384 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e373294-c60d-4f3f-b56b-4c73c06b2ce9-config\") pod \"9e373294-c60d-4f3f-b56b-4c73c06b2ce9\" (UID: \"9e373294-c60d-4f3f-b56b-4c73c06b2ce9\") " Dec 09 11:31:34 crc kubenswrapper[5002]: I1209 11:31:34.130444 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e373294-c60d-4f3f-b56b-4c73c06b2ce9-dns-svc\") pod \"9e373294-c60d-4f3f-b56b-4c73c06b2ce9\" (UID: \"9e373294-c60d-4f3f-b56b-4c73c06b2ce9\") " Dec 09 11:31:34 crc kubenswrapper[5002]: I1209 11:31:34.130563 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkv85\" (UniqueName: \"kubernetes.io/projected/9e373294-c60d-4f3f-b56b-4c73c06b2ce9-kube-api-access-vkv85\") pod \"9e373294-c60d-4f3f-b56b-4c73c06b2ce9\" (UID: \"9e373294-c60d-4f3f-b56b-4c73c06b2ce9\") " Dec 09 11:31:34 crc kubenswrapper[5002]: I1209 11:31:34.130611 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e373294-c60d-4f3f-b56b-4c73c06b2ce9-ovsdbserver-nb\") pod \"9e373294-c60d-4f3f-b56b-4c73c06b2ce9\" (UID: \"9e373294-c60d-4f3f-b56b-4c73c06b2ce9\") " Dec 09 11:31:34 crc kubenswrapper[5002]: I1209 11:31:34.130730 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9e373294-c60d-4f3f-b56b-4c73c06b2ce9-ovsdbserver-sb\") pod \"9e373294-c60d-4f3f-b56b-4c73c06b2ce9\" (UID: 
\"9e373294-c60d-4f3f-b56b-4c73c06b2ce9\") " Dec 09 11:31:34 crc kubenswrapper[5002]: I1209 11:31:34.135185 5002 scope.go:117] "RemoveContainer" containerID="bf8f476b3078210f286f2049b6802851486ef1848d27669981515b05259503c2" Dec 09 11:31:34 crc kubenswrapper[5002]: E1209 11:31:34.136099 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf8f476b3078210f286f2049b6802851486ef1848d27669981515b05259503c2\": container with ID starting with bf8f476b3078210f286f2049b6802851486ef1848d27669981515b05259503c2 not found: ID does not exist" containerID="bf8f476b3078210f286f2049b6802851486ef1848d27669981515b05259503c2" Dec 09 11:31:34 crc kubenswrapper[5002]: I1209 11:31:34.136140 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf8f476b3078210f286f2049b6802851486ef1848d27669981515b05259503c2"} err="failed to get container status \"bf8f476b3078210f286f2049b6802851486ef1848d27669981515b05259503c2\": rpc error: code = NotFound desc = could not find container \"bf8f476b3078210f286f2049b6802851486ef1848d27669981515b05259503c2\": container with ID starting with bf8f476b3078210f286f2049b6802851486ef1848d27669981515b05259503c2 not found: ID does not exist" Dec 09 11:31:34 crc kubenswrapper[5002]: I1209 11:31:34.136167 5002 scope.go:117] "RemoveContainer" containerID="ac5d62412818034d443155e471c9c77ba0218e880e782e725003fbf323db8041" Dec 09 11:31:34 crc kubenswrapper[5002]: E1209 11:31:34.136419 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac5d62412818034d443155e471c9c77ba0218e880e782e725003fbf323db8041\": container with ID starting with ac5d62412818034d443155e471c9c77ba0218e880e782e725003fbf323db8041 not found: ID does not exist" containerID="ac5d62412818034d443155e471c9c77ba0218e880e782e725003fbf323db8041" Dec 09 11:31:34 crc kubenswrapper[5002]: I1209 11:31:34.136443 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac5d62412818034d443155e471c9c77ba0218e880e782e725003fbf323db8041"} err="failed to get container status \"ac5d62412818034d443155e471c9c77ba0218e880e782e725003fbf323db8041\": rpc error: code = NotFound desc = could not find container \"ac5d62412818034d443155e471c9c77ba0218e880e782e725003fbf323db8041\": container with ID starting with ac5d62412818034d443155e471c9c77ba0218e880e782e725003fbf323db8041 not found: ID does not exist" Dec 09 11:31:34 crc kubenswrapper[5002]: I1209 11:31:34.136553 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e373294-c60d-4f3f-b56b-4c73c06b2ce9-kube-api-access-vkv85" (OuterVolumeSpecName: "kube-api-access-vkv85") pod "9e373294-c60d-4f3f-b56b-4c73c06b2ce9" (UID: "9e373294-c60d-4f3f-b56b-4c73c06b2ce9"). InnerVolumeSpecName "kube-api-access-vkv85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:31:34 crc kubenswrapper[5002]: I1209 11:31:34.170294 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e373294-c60d-4f3f-b56b-4c73c06b2ce9-config" (OuterVolumeSpecName: "config") pod "9e373294-c60d-4f3f-b56b-4c73c06b2ce9" (UID: "9e373294-c60d-4f3f-b56b-4c73c06b2ce9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:31:34 crc kubenswrapper[5002]: I1209 11:31:34.170753 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e373294-c60d-4f3f-b56b-4c73c06b2ce9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9e373294-c60d-4f3f-b56b-4c73c06b2ce9" (UID: "9e373294-c60d-4f3f-b56b-4c73c06b2ce9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:31:34 crc kubenswrapper[5002]: I1209 11:31:34.171239 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e373294-c60d-4f3f-b56b-4c73c06b2ce9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9e373294-c60d-4f3f-b56b-4c73c06b2ce9" (UID: "9e373294-c60d-4f3f-b56b-4c73c06b2ce9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:31:34 crc kubenswrapper[5002]: I1209 11:31:34.172881 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e373294-c60d-4f3f-b56b-4c73c06b2ce9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9e373294-c60d-4f3f-b56b-4c73c06b2ce9" (UID: "9e373294-c60d-4f3f-b56b-4c73c06b2ce9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:31:34 crc kubenswrapper[5002]: I1209 11:31:34.232631 5002 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9e373294-c60d-4f3f-b56b-4c73c06b2ce9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 09 11:31:34 crc kubenswrapper[5002]: I1209 11:31:34.232662 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e373294-c60d-4f3f-b56b-4c73c06b2ce9-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:31:34 crc kubenswrapper[5002]: I1209 11:31:34.232671 5002 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e373294-c60d-4f3f-b56b-4c73c06b2ce9-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 11:31:34 crc kubenswrapper[5002]: I1209 11:31:34.232680 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkv85\" (UniqueName: \"kubernetes.io/projected/9e373294-c60d-4f3f-b56b-4c73c06b2ce9-kube-api-access-vkv85\") on node \"crc\" DevicePath \"\"" Dec 09 11:31:34 crc kubenswrapper[5002]: I1209 11:31:34.232689 5002 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e373294-c60d-4f3f-b56b-4c73c06b2ce9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 11:31:34 crc kubenswrapper[5002]: I1209 11:31:34.407800 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-596df78cd9-x5jj2"] Dec 09 11:31:34 crc kubenswrapper[5002]: I1209 11:31:34.414085 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-596df78cd9-x5jj2"] Dec 09 11:31:36 crc kubenswrapper[5002]: I1209 11:31:36.076053 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e373294-c60d-4f3f-b56b-4c73c06b2ce9" path="/var/lib/kubelet/pods/9e373294-c60d-4f3f-b56b-4c73c06b2ce9/volumes" Dec 09 11:31:41 crc kubenswrapper[5002]: I1209 11:31:41.061159 5002 scope.go:117] "RemoveContainer" containerID="24eda190128d46e2bfa806f4839b38f2462cd8acaa8816efdf9934cf2dc46679" Dec 09 11:31:41 crc kubenswrapper[5002]: E1209 11:31:41.061964 5002 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:31:52 crc kubenswrapper[5002]: I1209 11:31:52.061384 5002 scope.go:117] "RemoveContainer" containerID="24eda190128d46e2bfa806f4839b38f2462cd8acaa8816efdf9934cf2dc46679" Dec 09 11:31:52 crc kubenswrapper[5002]: E1209 11:31:52.062653 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:31:53 crc kubenswrapper[5002]: I1209 11:31:53.520443 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7fb76d694f-vmgzb" Dec 09 11:32:01 crc kubenswrapper[5002]: I1209 11:32:01.068047 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-m58rr"] Dec 09 11:32:01 crc kubenswrapper[5002]: E1209 11:32:01.068941 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e373294-c60d-4f3f-b56b-4c73c06b2ce9" containerName="init" Dec 09 11:32:01 crc kubenswrapper[5002]: I1209 11:32:01.068958 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e373294-c60d-4f3f-b56b-4c73c06b2ce9" containerName="init" Dec 09 11:32:01 crc kubenswrapper[5002]: E1209 11:32:01.069002 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e373294-c60d-4f3f-b56b-4c73c06b2ce9" containerName="dnsmasq-dns" Dec 09 11:32:01 crc kubenswrapper[5002]: I1209 11:32:01.069011 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e373294-c60d-4f3f-b56b-4c73c06b2ce9" containerName="dnsmasq-dns" Dec 09 11:32:01 crc kubenswrapper[5002]: I1209 11:32:01.069211 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e373294-c60d-4f3f-b56b-4c73c06b2ce9" containerName="dnsmasq-dns" Dec 09 11:32:01 crc kubenswrapper[5002]: I1209 11:32:01.069947 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-m58rr" Dec 09 11:32:01 crc kubenswrapper[5002]: I1209 11:32:01.087072 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-m58rr"] Dec 09 11:32:01 crc kubenswrapper[5002]: I1209 11:32:01.170712 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-272b-account-create-update-kxfs7"] Dec 09 11:32:01 crc kubenswrapper[5002]: I1209 11:32:01.172181 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-272b-account-create-update-kxfs7" Dec 09 11:32:01 crc kubenswrapper[5002]: I1209 11:32:01.173967 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 09 11:32:01 crc kubenswrapper[5002]: I1209 11:32:01.179023 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-272b-account-create-update-kxfs7"] Dec 09 11:32:01 crc kubenswrapper[5002]: I1209 11:32:01.202325 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8df4965-e06f-4000-9b29-5cd4bfbdc827-operator-scripts\") pod \"glance-db-create-m58rr\" (UID: \"b8df4965-e06f-4000-9b29-5cd4bfbdc827\") " pod="openstack/glance-db-create-m58rr" Dec 09 11:32:01 crc kubenswrapper[5002]: I1209 11:32:01.202522 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjlww\" (UniqueName: \"kubernetes.io/projected/b8df4965-e06f-4000-9b29-5cd4bfbdc827-kube-api-access-xjlww\") pod \"glance-db-create-m58rr\" (UID: \"b8df4965-e06f-4000-9b29-5cd4bfbdc827\") " pod="openstack/glance-db-create-m58rr" Dec 09 11:32:01 crc kubenswrapper[5002]: I1209 11:32:01.202624 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j4jp\" (UniqueName: \"kubernetes.io/projected/d905abdf-06e8-474e-b053-702860fc6a48-kube-api-access-2j4jp\") pod \"glance-272b-account-create-update-kxfs7\" (UID: \"d905abdf-06e8-474e-b053-702860fc6a48\") " pod="openstack/glance-272b-account-create-update-kxfs7" Dec 09 11:32:01 crc kubenswrapper[5002]: I1209 11:32:01.202890 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d905abdf-06e8-474e-b053-702860fc6a48-operator-scripts\") pod \"glance-272b-account-create-update-kxfs7\" (UID: \"d905abdf-06e8-474e-b053-702860fc6a48\") " pod="openstack/glance-272b-account-create-update-kxfs7" Dec 09 11:32:01 crc kubenswrapper[5002]: I1209 11:32:01.304035 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8df4965-e06f-4000-9b29-5cd4bfbdc827-operator-scripts\") pod \"glance-db-create-m58rr\" (UID: \"b8df4965-e06f-4000-9b29-5cd4bfbdc827\") " pod="openstack/glance-db-create-m58rr" Dec 09 11:32:01 crc kubenswrapper[5002]: I1209 11:32:01.304163 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjlww\" (UniqueName: \"kubernetes.io/projected/b8df4965-e06f-4000-9b29-5cd4bfbdc827-kube-api-access-xjlww\") pod \"glance-db-create-m58rr\" (UID: \"b8df4965-e06f-4000-9b29-5cd4bfbdc827\") " pod="openstack/glance-db-create-m58rr" Dec 09 11:32:01 crc kubenswrapper[5002]: I1209 11:32:01.304243 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2j4jp\" (UniqueName: \"kubernetes.io/projected/d905abdf-06e8-474e-b053-702860fc6a48-kube-api-access-2j4jp\") pod \"glance-272b-account-create-update-kxfs7\" (UID: \"d905abdf-06e8-474e-b053-702860fc6a48\") " pod="openstack/glance-272b-account-create-update-kxfs7" Dec 09 11:32:01 crc kubenswrapper[5002]: I1209 11:32:01.304344 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/d905abdf-06e8-474e-b053-702860fc6a48-operator-scripts\") pod \"glance-272b-account-create-update-kxfs7\" (UID: \"d905abdf-06e8-474e-b053-702860fc6a48\") " pod="openstack/glance-272b-account-create-update-kxfs7" Dec 09 11:32:01 crc kubenswrapper[5002]: I1209 11:32:01.304796 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8df4965-e06f-4000-9b29-5cd4bfbdc827-operator-scripts\") pod \"glance-db-create-m58rr\" (UID: \"b8df4965-e06f-4000-9b29-5cd4bfbdc827\") " pod="openstack/glance-db-create-m58rr" Dec 09 11:32:01 crc kubenswrapper[5002]: I1209 11:32:01.305188 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d905abdf-06e8-474e-b053-702860fc6a48-operator-scripts\") pod \"glance-272b-account-create-update-kxfs7\" (UID: \"d905abdf-06e8-474e-b053-702860fc6a48\") " pod="openstack/glance-272b-account-create-update-kxfs7" Dec 09 11:32:01 crc kubenswrapper[5002]: I1209 11:32:01.327602 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j4jp\" (UniqueName: \"kubernetes.io/projected/d905abdf-06e8-474e-b053-702860fc6a48-kube-api-access-2j4jp\") pod \"glance-272b-account-create-update-kxfs7\" (UID: \"d905abdf-06e8-474e-b053-702860fc6a48\") " pod="openstack/glance-272b-account-create-update-kxfs7" Dec 09 11:32:01 crc kubenswrapper[5002]: I1209 11:32:01.329859 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjlww\" (UniqueName: \"kubernetes.io/projected/b8df4965-e06f-4000-9b29-5cd4bfbdc827-kube-api-access-xjlww\") pod \"glance-db-create-m58rr\" (UID: \"b8df4965-e06f-4000-9b29-5cd4bfbdc827\") " pod="openstack/glance-db-create-m58rr" Dec 09 11:32:01 crc kubenswrapper[5002]: I1209 11:32:01.419161 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-m58rr" Dec 09 11:32:01 crc kubenswrapper[5002]: I1209 11:32:01.486661 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-272b-account-create-update-kxfs7" Dec 09 11:32:01 crc kubenswrapper[5002]: I1209 11:32:01.855385 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-m58rr"] Dec 09 11:32:01 crc kubenswrapper[5002]: I1209 11:32:01.945678 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-272b-account-create-update-kxfs7"] Dec 09 11:32:02 crc kubenswrapper[5002]: I1209 11:32:02.120433 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-m58rr" event={"ID":"b8df4965-e06f-4000-9b29-5cd4bfbdc827","Type":"ContainerStarted","Data":"6a2250e0429e97b6c1c0fc36bef5d0e99a3b006bf02ef0dbb35256b686acade6"} Dec 09 11:32:02 crc kubenswrapper[5002]: I1209 11:32:02.120479 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-m58rr" event={"ID":"b8df4965-e06f-4000-9b29-5cd4bfbdc827","Type":"ContainerStarted","Data":"5cbcd9e106e55fe146f8fd245e54ac8fe7b348d48b9c6557a8635d886bcc83ed"} Dec 09 11:32:02 crc kubenswrapper[5002]: I1209 11:32:02.123207 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-272b-account-create-update-kxfs7" event={"ID":"d905abdf-06e8-474e-b053-702860fc6a48","Type":"ContainerStarted","Data":"27b49507d971a35cb7b62603337368fe2b5a39da95b6cfcae2e4a5714b6920cd"} Dec 09 11:32:02 crc kubenswrapper[5002]: I1209 11:32:02.123240 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-272b-account-create-update-kxfs7" event={"ID":"d905abdf-06e8-474e-b053-702860fc6a48","Type":"ContainerStarted","Data":"7fcd1cd9384219c2dd7dba2850a288707ad363934f754591f286785ad52eb32a"} Dec 09 11:32:02 crc kubenswrapper[5002]: I1209 11:32:02.138329 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-m58rr" podStartSLOduration=1.138310517 podStartE2EDuration="1.138310517s" podCreationTimestamp="2025-12-09 11:32:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:32:02.134246188 +0000 UTC m=+5454.526297309" watchObservedRunningTime="2025-12-09 11:32:02.138310517 +0000 UTC m=+5454.530361608" Dec 09 11:32:02 crc kubenswrapper[5002]: I1209 11:32:02.158390 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-272b-account-create-update-kxfs7" podStartSLOduration=1.158368033 podStartE2EDuration="1.158368033s" podCreationTimestamp="2025-12-09 11:32:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:32:02.15639026 +0000 UTC m=+5454.548441351" watchObservedRunningTime="2025-12-09 11:32:02.158368033 +0000 UTC m=+5454.550419114" Dec 09 11:32:03 crc kubenswrapper[5002]: I1209 11:32:03.135687 5002 generic.go:334] "Generic (PLEG): container finished" podID="b8df4965-e06f-4000-9b29-5cd4bfbdc827" containerID="6a2250e0429e97b6c1c0fc36bef5d0e99a3b006bf02ef0dbb35256b686acade6" exitCode=0 Dec 09 11:32:03 crc kubenswrapper[5002]: I1209 11:32:03.135874 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-m58rr" event={"ID":"b8df4965-e06f-4000-9b29-5cd4bfbdc827","Type":"ContainerDied","Data":"6a2250e0429e97b6c1c0fc36bef5d0e99a3b006bf02ef0dbb35256b686acade6"} Dec 09 11:32:03 crc kubenswrapper[5002]: I1209 11:32:03.142752 5002 generic.go:334] "Generic (PLEG): container finished" 
podID="d905abdf-06e8-474e-b053-702860fc6a48" containerID="27b49507d971a35cb7b62603337368fe2b5a39da95b6cfcae2e4a5714b6920cd" exitCode=0 Dec 09 11:32:03 crc kubenswrapper[5002]: I1209 11:32:03.142801 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-272b-account-create-update-kxfs7" event={"ID":"d905abdf-06e8-474e-b053-702860fc6a48","Type":"ContainerDied","Data":"27b49507d971a35cb7b62603337368fe2b5a39da95b6cfcae2e4a5714b6920cd"} Dec 09 11:32:04 crc kubenswrapper[5002]: I1209 11:32:04.551954 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-272b-account-create-update-kxfs7" Dec 09 11:32:04 crc kubenswrapper[5002]: I1209 11:32:04.562739 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-m58rr" Dec 09 11:32:04 crc kubenswrapper[5002]: I1209 11:32:04.670785 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2j4jp\" (UniqueName: \"kubernetes.io/projected/d905abdf-06e8-474e-b053-702860fc6a48-kube-api-access-2j4jp\") pod \"d905abdf-06e8-474e-b053-702860fc6a48\" (UID: \"d905abdf-06e8-474e-b053-702860fc6a48\") " Dec 09 11:32:04 crc kubenswrapper[5002]: I1209 11:32:04.670999 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d905abdf-06e8-474e-b053-702860fc6a48-operator-scripts\") pod \"d905abdf-06e8-474e-b053-702860fc6a48\" (UID: \"d905abdf-06e8-474e-b053-702860fc6a48\") " Dec 09 11:32:04 crc kubenswrapper[5002]: I1209 11:32:04.671074 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8df4965-e06f-4000-9b29-5cd4bfbdc827-operator-scripts\") pod \"b8df4965-e06f-4000-9b29-5cd4bfbdc827\" (UID: \"b8df4965-e06f-4000-9b29-5cd4bfbdc827\") " Dec 09 11:32:04 crc kubenswrapper[5002]: I1209 11:32:04.671260 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjlww\" (UniqueName: \"kubernetes.io/projected/b8df4965-e06f-4000-9b29-5cd4bfbdc827-kube-api-access-xjlww\") pod \"b8df4965-e06f-4000-9b29-5cd4bfbdc827\" (UID: \"b8df4965-e06f-4000-9b29-5cd4bfbdc827\") " Dec 09 11:32:04 crc kubenswrapper[5002]: I1209 11:32:04.671707 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d905abdf-06e8-474e-b053-702860fc6a48-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d905abdf-06e8-474e-b053-702860fc6a48" (UID: "d905abdf-06e8-474e-b053-702860fc6a48"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:04 crc kubenswrapper[5002]: I1209 11:32:04.671877 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8df4965-e06f-4000-9b29-5cd4bfbdc827-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b8df4965-e06f-4000-9b29-5cd4bfbdc827" (UID: "b8df4965-e06f-4000-9b29-5cd4bfbdc827"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:04 crc kubenswrapper[5002]: I1209 11:32:04.677298 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8df4965-e06f-4000-9b29-5cd4bfbdc827-kube-api-access-xjlww" (OuterVolumeSpecName: "kube-api-access-xjlww") pod "b8df4965-e06f-4000-9b29-5cd4bfbdc827" (UID: "b8df4965-e06f-4000-9b29-5cd4bfbdc827"). 
InnerVolumeSpecName "kube-api-access-xjlww". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:04 crc kubenswrapper[5002]: I1209 11:32:04.678875 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d905abdf-06e8-474e-b053-702860fc6a48-kube-api-access-2j4jp" (OuterVolumeSpecName: "kube-api-access-2j4jp") pod "d905abdf-06e8-474e-b053-702860fc6a48" (UID: "d905abdf-06e8-474e-b053-702860fc6a48"). InnerVolumeSpecName "kube-api-access-2j4jp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:04 crc kubenswrapper[5002]: I1209 11:32:04.772993 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjlww\" (UniqueName: \"kubernetes.io/projected/b8df4965-e06f-4000-9b29-5cd4bfbdc827-kube-api-access-xjlww\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:04 crc kubenswrapper[5002]: I1209 11:32:04.773487 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2j4jp\" (UniqueName: \"kubernetes.io/projected/d905abdf-06e8-474e-b053-702860fc6a48-kube-api-access-2j4jp\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:04 crc kubenswrapper[5002]: I1209 11:32:04.773505 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d905abdf-06e8-474e-b053-702860fc6a48-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:04 crc kubenswrapper[5002]: I1209 11:32:04.773544 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8df4965-e06f-4000-9b29-5cd4bfbdc827-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:05 crc kubenswrapper[5002]: I1209 11:32:05.170450 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-272b-account-create-update-kxfs7" event={"ID":"d905abdf-06e8-474e-b053-702860fc6a48","Type":"ContainerDied","Data":"7fcd1cd9384219c2dd7dba2850a288707ad363934f754591f286785ad52eb32a"} Dec 09 11:32:05 crc kubenswrapper[5002]: I1209 11:32:05.170506 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7fcd1cd9384219c2dd7dba2850a288707ad363934f754591f286785ad52eb32a" Dec 09 11:32:05 crc kubenswrapper[5002]: I1209 11:32:05.170585 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-272b-account-create-update-kxfs7" Dec 09 11:32:05 crc kubenswrapper[5002]: I1209 11:32:05.177060 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-m58rr" event={"ID":"b8df4965-e06f-4000-9b29-5cd4bfbdc827","Type":"ContainerDied","Data":"5cbcd9e106e55fe146f8fd245e54ac8fe7b348d48b9c6557a8635d886bcc83ed"} Dec 09 11:32:05 crc kubenswrapper[5002]: I1209 11:32:05.177106 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5cbcd9e106e55fe146f8fd245e54ac8fe7b348d48b9c6557a8635d886bcc83ed" Dec 09 11:32:05 crc kubenswrapper[5002]: I1209 11:32:05.177184 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-m58rr" Dec 09 11:32:06 crc kubenswrapper[5002]: I1209 11:32:06.061038 5002 scope.go:117] "RemoveContainer" containerID="24eda190128d46e2bfa806f4839b38f2462cd8acaa8816efdf9934cf2dc46679" Dec 09 11:32:06 crc kubenswrapper[5002]: E1209 11:32:06.063459 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:32:06 crc kubenswrapper[5002]: I1209 11:32:06.299488 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-l44gb"] Dec 09 11:32:06 crc kubenswrapper[5002]: E1209 11:32:06.300178 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8df4965-e06f-4000-9b29-5cd4bfbdc827" containerName="mariadb-database-create" Dec 09 11:32:06 crc kubenswrapper[5002]: I1209 11:32:06.300266 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8df4965-e06f-4000-9b29-5cd4bfbdc827" containerName="mariadb-database-create" Dec 09 11:32:06 crc kubenswrapper[5002]: E1209 11:32:06.300354 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d905abdf-06e8-474e-b053-702860fc6a48" containerName="mariadb-account-create-update" Dec 09 11:32:06 crc kubenswrapper[5002]: I1209 11:32:06.300424 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="d905abdf-06e8-474e-b053-702860fc6a48" containerName="mariadb-account-create-update" Dec 09 11:32:06 crc kubenswrapper[5002]: I1209 11:32:06.300715 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8df4965-e06f-4000-9b29-5cd4bfbdc827" containerName="mariadb-database-create" Dec 09 11:32:06 crc kubenswrapper[5002]: I1209 11:32:06.300837 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="d905abdf-06e8-474e-b053-702860fc6a48" containerName="mariadb-account-create-update" Dec 09 11:32:06 crc kubenswrapper[5002]: I1209 11:32:06.301623 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-l44gb" Dec 09 11:32:06 crc kubenswrapper[5002]: I1209 11:32:06.305456 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 09 11:32:06 crc kubenswrapper[5002]: I1209 11:32:06.326777 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-dchft" Dec 09 11:32:06 crc kubenswrapper[5002]: I1209 11:32:06.331662 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-l44gb"] Dec 09 11:32:06 crc kubenswrapper[5002]: I1209 11:32:06.407788 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g78c\" (UniqueName: \"kubernetes.io/projected/c88a1594-851e-43a5-b1ad-ba082a94d556-kube-api-access-8g78c\") pod \"glance-db-sync-l44gb\" (UID: \"c88a1594-851e-43a5-b1ad-ba082a94d556\") " pod="openstack/glance-db-sync-l44gb" Dec 09 11:32:06 crc kubenswrapper[5002]: I1209 11:32:06.408185 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c88a1594-851e-43a5-b1ad-ba082a94d556-config-data\") pod \"glance-db-sync-l44gb\" (UID: \"c88a1594-851e-43a5-b1ad-ba082a94d556\") " pod="openstack/glance-db-sync-l44gb" Dec 09 11:32:06 crc kubenswrapper[5002]: I1209 11:32:06.408337 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c88a1594-851e-43a5-b1ad-ba082a94d556-db-sync-config-data\") pod \"glance-db-sync-l44gb\" (UID: \"c88a1594-851e-43a5-b1ad-ba082a94d556\") " pod="openstack/glance-db-sync-l44gb" Dec 09 11:32:06 crc kubenswrapper[5002]: I1209 11:32:06.408485 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c88a1594-851e-43a5-b1ad-ba082a94d556-combined-ca-bundle\") pod \"glance-db-sync-l44gb\" (UID: \"c88a1594-851e-43a5-b1ad-ba082a94d556\") " pod="openstack/glance-db-sync-l44gb" Dec 09 11:32:06 crc kubenswrapper[5002]: I1209 11:32:06.510559 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g78c\" (UniqueName: \"kubernetes.io/projected/c88a1594-851e-43a5-b1ad-ba082a94d556-kube-api-access-8g78c\") pod \"glance-db-sync-l44gb\" (UID: \"c88a1594-851e-43a5-b1ad-ba082a94d556\") " pod="openstack/glance-db-sync-l44gb" Dec 09 11:32:06 crc kubenswrapper[5002]: I1209 11:32:06.512408 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c88a1594-851e-43a5-b1ad-ba082a94d556-config-data\") pod \"glance-db-sync-l44gb\" (UID: \"c88a1594-851e-43a5-b1ad-ba082a94d556\") " pod="openstack/glance-db-sync-l44gb" Dec 09 11:32:06 crc kubenswrapper[5002]: I1209 11:32:06.512574 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c88a1594-851e-43a5-b1ad-ba082a94d556-db-sync-config-data\") pod \"glance-db-sync-l44gb\" (UID: \"c88a1594-851e-43a5-b1ad-ba082a94d556\") " pod="openstack/glance-db-sync-l44gb" Dec 09 11:32:06 crc kubenswrapper[5002]: I1209 11:32:06.512720 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c88a1594-851e-43a5-b1ad-ba082a94d556-combined-ca-bundle\") pod 
\"glance-db-sync-l44gb\" (UID: \"c88a1594-851e-43a5-b1ad-ba082a94d556\") " pod="openstack/glance-db-sync-l44gb" Dec 09 11:32:06 crc kubenswrapper[5002]: I1209 11:32:06.517890 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c88a1594-851e-43a5-b1ad-ba082a94d556-combined-ca-bundle\") pod \"glance-db-sync-l44gb\" (UID: \"c88a1594-851e-43a5-b1ad-ba082a94d556\") " pod="openstack/glance-db-sync-l44gb" Dec 09 11:32:06 crc kubenswrapper[5002]: I1209 11:32:06.518335 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c88a1594-851e-43a5-b1ad-ba082a94d556-config-data\") pod \"glance-db-sync-l44gb\" (UID: \"c88a1594-851e-43a5-b1ad-ba082a94d556\") " pod="openstack/glance-db-sync-l44gb" Dec 09 11:32:06 crc kubenswrapper[5002]: I1209 11:32:06.523763 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c88a1594-851e-43a5-b1ad-ba082a94d556-db-sync-config-data\") pod \"glance-db-sync-l44gb\" (UID: \"c88a1594-851e-43a5-b1ad-ba082a94d556\") " pod="openstack/glance-db-sync-l44gb" Dec 09 11:32:06 crc kubenswrapper[5002]: I1209 11:32:06.527147 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g78c\" (UniqueName: \"kubernetes.io/projected/c88a1594-851e-43a5-b1ad-ba082a94d556-kube-api-access-8g78c\") pod \"glance-db-sync-l44gb\" (UID: \"c88a1594-851e-43a5-b1ad-ba082a94d556\") " pod="openstack/glance-db-sync-l44gb" Dec 09 11:32:06 crc kubenswrapper[5002]: I1209 11:32:06.633649 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-l44gb" Dec 09 11:32:07 crc kubenswrapper[5002]: I1209 11:32:07.172501 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-l44gb"] Dec 09 11:32:07 crc kubenswrapper[5002]: I1209 11:32:07.204035 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-l44gb" event={"ID":"c88a1594-851e-43a5-b1ad-ba082a94d556","Type":"ContainerStarted","Data":"9bd52a0fbaeffc2a257358c2e03ea2b1463518ebceb3cc9d4a5d7e2cf4392518"} Dec 09 11:32:08 crc kubenswrapper[5002]: I1209 11:32:08.219780 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-l44gb" event={"ID":"c88a1594-851e-43a5-b1ad-ba082a94d556","Type":"ContainerStarted","Data":"e9317b7a96ea912c6bdda8365f01af064ecf7b077911e19a228f4f0cdf9849a6"} Dec 09 11:32:08 crc kubenswrapper[5002]: I1209 11:32:08.238790 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-l44gb" podStartSLOduration=2.23877026 podStartE2EDuration="2.23877026s" podCreationTimestamp="2025-12-09 11:32:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:32:08.233288613 +0000 UTC m=+5460.625339704" watchObservedRunningTime="2025-12-09 11:32:08.23877026 +0000 UTC m=+5460.630821331" Dec 09 11:32:11 crc kubenswrapper[5002]: I1209 11:32:11.248154 5002 generic.go:334] "Generic (PLEG): container finished" podID="c88a1594-851e-43a5-b1ad-ba082a94d556" containerID="e9317b7a96ea912c6bdda8365f01af064ecf7b077911e19a228f4f0cdf9849a6" exitCode=0 Dec 09 11:32:11 crc kubenswrapper[5002]: I1209 11:32:11.248255 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-l44gb" 
event={"ID":"c88a1594-851e-43a5-b1ad-ba082a94d556","Type":"ContainerDied","Data":"e9317b7a96ea912c6bdda8365f01af064ecf7b077911e19a228f4f0cdf9849a6"} Dec 09 11:32:12 crc kubenswrapper[5002]: I1209 11:32:12.635128 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-l44gb" Dec 09 11:32:12 crc kubenswrapper[5002]: I1209 11:32:12.737960 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c88a1594-851e-43a5-b1ad-ba082a94d556-config-data\") pod \"c88a1594-851e-43a5-b1ad-ba082a94d556\" (UID: \"c88a1594-851e-43a5-b1ad-ba082a94d556\") " Dec 09 11:32:12 crc kubenswrapper[5002]: I1209 11:32:12.738043 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c88a1594-851e-43a5-b1ad-ba082a94d556-db-sync-config-data\") pod \"c88a1594-851e-43a5-b1ad-ba082a94d556\" (UID: \"c88a1594-851e-43a5-b1ad-ba082a94d556\") " Dec 09 11:32:12 crc kubenswrapper[5002]: I1209 11:32:12.738090 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8g78c\" (UniqueName: \"kubernetes.io/projected/c88a1594-851e-43a5-b1ad-ba082a94d556-kube-api-access-8g78c\") pod \"c88a1594-851e-43a5-b1ad-ba082a94d556\" (UID: \"c88a1594-851e-43a5-b1ad-ba082a94d556\") " Dec 09 11:32:12 crc kubenswrapper[5002]: I1209 11:32:12.738123 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c88a1594-851e-43a5-b1ad-ba082a94d556-combined-ca-bundle\") pod \"c88a1594-851e-43a5-b1ad-ba082a94d556\" (UID: \"c88a1594-851e-43a5-b1ad-ba082a94d556\") " Dec 09 11:32:12 crc kubenswrapper[5002]: I1209 11:32:12.743144 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c88a1594-851e-43a5-b1ad-ba082a94d556-kube-api-access-8g78c" (OuterVolumeSpecName: "kube-api-access-8g78c") pod "c88a1594-851e-43a5-b1ad-ba082a94d556" (UID: "c88a1594-851e-43a5-b1ad-ba082a94d556"). InnerVolumeSpecName "kube-api-access-8g78c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[5002]: I1209 11:32:12.743211 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c88a1594-851e-43a5-b1ad-ba082a94d556-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c88a1594-851e-43a5-b1ad-ba082a94d556" (UID: "c88a1594-851e-43a5-b1ad-ba082a94d556"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[5002]: I1209 11:32:12.759142 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c88a1594-851e-43a5-b1ad-ba082a94d556-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c88a1594-851e-43a5-b1ad-ba082a94d556" (UID: "c88a1594-851e-43a5-b1ad-ba082a94d556"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[5002]: I1209 11:32:12.776956 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c88a1594-851e-43a5-b1ad-ba082a94d556-config-data" (OuterVolumeSpecName: "config-data") pod "c88a1594-851e-43a5-b1ad-ba082a94d556" (UID: "c88a1594-851e-43a5-b1ad-ba082a94d556"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[5002]: I1209 11:32:12.840428 5002 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c88a1594-851e-43a5-b1ad-ba082a94d556-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[5002]: I1209 11:32:12.840464 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8g78c\" (UniqueName: \"kubernetes.io/projected/c88a1594-851e-43a5-b1ad-ba082a94d556-kube-api-access-8g78c\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[5002]: I1209 11:32:12.840481 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c88a1594-851e-43a5-b1ad-ba082a94d556-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[5002]: I1209 11:32:12.840490 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c88a1594-851e-43a5-b1ad-ba082a94d556-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.271250 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-l44gb" event={"ID":"c88a1594-851e-43a5-b1ad-ba082a94d556","Type":"ContainerDied","Data":"9bd52a0fbaeffc2a257358c2e03ea2b1463518ebceb3cc9d4a5d7e2cf4392518"} Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.271307 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9bd52a0fbaeffc2a257358c2e03ea2b1463518ebceb3cc9d4a5d7e2cf4392518" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.271401 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-l44gb" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.565082 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 11:32:13 crc kubenswrapper[5002]: E1209 11:32:13.565473 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c88a1594-851e-43a5-b1ad-ba082a94d556" containerName="glance-db-sync" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.565489 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="c88a1594-851e-43a5-b1ad-ba082a94d556" containerName="glance-db-sync" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.565711 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="c88a1594-851e-43a5-b1ad-ba082a94d556" containerName="glance-db-sync" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.566589 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.569319 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.569565 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.572671 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-dchft" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.572675 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.585351 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.654135 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvtbg\" (UniqueName: \"kubernetes.io/projected/1c3be810-4233-4ba2-bb76-7244e5fee0ef-kube-api-access-tvtbg\") pod \"glance-default-external-api-0\" (UID: \"1c3be810-4233-4ba2-bb76-7244e5fee0ef\") " pod="openstack/glance-default-external-api-0" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.654412 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1c3be810-4233-4ba2-bb76-7244e5fee0ef-ceph\") pod \"glance-default-external-api-0\" (UID: \"1c3be810-4233-4ba2-bb76-7244e5fee0ef\") " pod="openstack/glance-default-external-api-0" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.654446 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c3be810-4233-4ba2-bb76-7244e5fee0ef-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1c3be810-4233-4ba2-bb76-7244e5fee0ef\") " pod="openstack/glance-default-external-api-0" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.654464 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c3be810-4233-4ba2-bb76-7244e5fee0ef-logs\") pod \"glance-default-external-api-0\" (UID: \"1c3be810-4233-4ba2-bb76-7244e5fee0ef\") " pod="openstack/glance-default-external-api-0" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.654507 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c3be810-4233-4ba2-bb76-7244e5fee0ef-config-data\") pod \"glance-default-external-api-0\" (UID: \"1c3be810-4233-4ba2-bb76-7244e5fee0ef\") " pod="openstack/glance-default-external-api-0" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.654523 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c3be810-4233-4ba2-bb76-7244e5fee0ef-scripts\") pod \"glance-default-external-api-0\" (UID: \"1c3be810-4233-4ba2-bb76-7244e5fee0ef\") " pod="openstack/glance-default-external-api-0" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.654543 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1c3be810-4233-4ba2-bb76-7244e5fee0ef-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1c3be810-4233-4ba2-bb76-7244e5fee0ef\") " pod="openstack/glance-default-external-api-0" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.674344 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b9b57f477-zmzv8"] Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.675646 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b9b57f477-zmzv8" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.698243 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b9b57f477-zmzv8"] Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.765576 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.767018 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.768699 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0321744-2cda-4f1f-9ea5-8ffa684b5373-dns-svc\") pod \"dnsmasq-dns-6b9b57f477-zmzv8\" (UID: \"a0321744-2cda-4f1f-9ea5-8ffa684b5373\") " pod="openstack/dnsmasq-dns-6b9b57f477-zmzv8" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.768754 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvtbg\" (UniqueName: \"kubernetes.io/projected/1c3be810-4233-4ba2-bb76-7244e5fee0ef-kube-api-access-tvtbg\") pod \"glance-default-external-api-0\" (UID: \"1c3be810-4233-4ba2-bb76-7244e5fee0ef\") " pod="openstack/glance-default-external-api-0" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.768785 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a0321744-2cda-4f1f-9ea5-8ffa684b5373-ovsdbserver-nb\") pod \"dnsmasq-dns-6b9b57f477-zmzv8\" (UID: \"a0321744-2cda-4f1f-9ea5-8ffa684b5373\") " pod="openstack/dnsmasq-dns-6b9b57f477-zmzv8" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.768834 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4wk4\" (UniqueName: \"kubernetes.io/projected/a0321744-2cda-4f1f-9ea5-8ffa684b5373-kube-api-access-z4wk4\") pod \"dnsmasq-dns-6b9b57f477-zmzv8\" (UID: \"a0321744-2cda-4f1f-9ea5-8ffa684b5373\") " pod="openstack/dnsmasq-dns-6b9b57f477-zmzv8" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.768884 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1c3be810-4233-4ba2-bb76-7244e5fee0ef-ceph\") pod \"glance-default-external-api-0\" (UID: \"1c3be810-4233-4ba2-bb76-7244e5fee0ef\") " pod="openstack/glance-default-external-api-0" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.768917 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c3be810-4233-4ba2-bb76-7244e5fee0ef-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1c3be810-4233-4ba2-bb76-7244e5fee0ef\") " pod="openstack/glance-default-external-api-0" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.768942 5002 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c3be810-4233-4ba2-bb76-7244e5fee0ef-logs\") pod \"glance-default-external-api-0\" (UID: \"1c3be810-4233-4ba2-bb76-7244e5fee0ef\") " pod="openstack/glance-default-external-api-0" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.768964 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a0321744-2cda-4f1f-9ea5-8ffa684b5373-ovsdbserver-sb\") pod \"dnsmasq-dns-6b9b57f477-zmzv8\" (UID: \"a0321744-2cda-4f1f-9ea5-8ffa684b5373\") " pod="openstack/dnsmasq-dns-6b9b57f477-zmzv8" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.768996 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0321744-2cda-4f1f-9ea5-8ffa684b5373-config\") pod \"dnsmasq-dns-6b9b57f477-zmzv8\" (UID: \"a0321744-2cda-4f1f-9ea5-8ffa684b5373\") " pod="openstack/dnsmasq-dns-6b9b57f477-zmzv8" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.769053 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c3be810-4233-4ba2-bb76-7244e5fee0ef-config-data\") pod \"glance-default-external-api-0\" (UID: \"1c3be810-4233-4ba2-bb76-7244e5fee0ef\") " pod="openstack/glance-default-external-api-0" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.769078 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c3be810-4233-4ba2-bb76-7244e5fee0ef-scripts\") pod \"glance-default-external-api-0\" (UID: \"1c3be810-4233-4ba2-bb76-7244e5fee0ef\") " pod="openstack/glance-default-external-api-0" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.769110 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c3be810-4233-4ba2-bb76-7244e5fee0ef-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1c3be810-4233-4ba2-bb76-7244e5fee0ef\") " pod="openstack/glance-default-external-api-0" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.769596 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c3be810-4233-4ba2-bb76-7244e5fee0ef-logs\") pod \"glance-default-external-api-0\" (UID: \"1c3be810-4233-4ba2-bb76-7244e5fee0ef\") " pod="openstack/glance-default-external-api-0" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.769924 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c3be810-4233-4ba2-bb76-7244e5fee0ef-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1c3be810-4233-4ba2-bb76-7244e5fee0ef\") " pod="openstack/glance-default-external-api-0" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.773364 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.777281 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c3be810-4233-4ba2-bb76-7244e5fee0ef-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1c3be810-4233-4ba2-bb76-7244e5fee0ef\") " pod="openstack/glance-default-external-api-0" Dec 09 11:32:13 crc 
Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.778846 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c3be810-4233-4ba2-bb76-7244e5fee0ef-scripts\") pod \"glance-default-external-api-0\" (UID: \"1c3be810-4233-4ba2-bb76-7244e5fee0ef\") " pod="openstack/glance-default-external-api-0"
Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.790509 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1c3be810-4233-4ba2-bb76-7244e5fee0ef-ceph\") pod \"glance-default-external-api-0\" (UID: \"1c3be810-4233-4ba2-bb76-7244e5fee0ef\") " pod="openstack/glance-default-external-api-0"
Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.791826 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c3be810-4233-4ba2-bb76-7244e5fee0ef-config-data\") pod \"glance-default-external-api-0\" (UID: \"1c3be810-4233-4ba2-bb76-7244e5fee0ef\") " pod="openstack/glance-default-external-api-0"
Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.792731 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvtbg\" (UniqueName: \"kubernetes.io/projected/1c3be810-4233-4ba2-bb76-7244e5fee0ef-kube-api-access-tvtbg\") pod \"glance-default-external-api-0\" (UID: \"1c3be810-4233-4ba2-bb76-7244e5fee0ef\") " pod="openstack/glance-default-external-api-0"
Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.794632 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.870551 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftlwb\" (UniqueName: \"kubernetes.io/projected/c1b94715-80cc-4d5d-a617-3d82ce24c5a3-kube-api-access-ftlwb\") pod \"glance-default-internal-api-0\" (UID: \"c1b94715-80cc-4d5d-a617-3d82ce24c5a3\") " pod="openstack/glance-default-internal-api-0"
Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.870612 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a0321744-2cda-4f1f-9ea5-8ffa684b5373-ovsdbserver-sb\") pod \"dnsmasq-dns-6b9b57f477-zmzv8\" (UID: \"a0321744-2cda-4f1f-9ea5-8ffa684b5373\") " pod="openstack/dnsmasq-dns-6b9b57f477-zmzv8"
Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.870637 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0321744-2cda-4f1f-9ea5-8ffa684b5373-config\") pod \"dnsmasq-dns-6b9b57f477-zmzv8\" (UID: \"a0321744-2cda-4f1f-9ea5-8ffa684b5373\") " pod="openstack/dnsmasq-dns-6b9b57f477-zmzv8"
Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.870670 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1b94715-80cc-4d5d-a617-3d82ce24c5a3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c1b94715-80cc-4d5d-a617-3d82ce24c5a3\") " pod="openstack/glance-default-internal-api-0"
Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.870688 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1b94715-80cc-4d5d-a617-3d82ce24c5a3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c1b94715-80cc-4d5d-a617-3d82ce24c5a3\") " pod="openstack/glance-default-internal-api-0"
Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.870723 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c1b94715-80cc-4d5d-a617-3d82ce24c5a3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c1b94715-80cc-4d5d-a617-3d82ce24c5a3\") " pod="openstack/glance-default-internal-api-0"
Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.870758 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c1b94715-80cc-4d5d-a617-3d82ce24c5a3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"c1b94715-80cc-4d5d-a617-3d82ce24c5a3\") " pod="openstack/glance-default-internal-api-0"
Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.870788 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0321744-2cda-4f1f-9ea5-8ffa684b5373-dns-svc\") pod \"dnsmasq-dns-6b9b57f477-zmzv8\" (UID: \"a0321744-2cda-4f1f-9ea5-8ffa684b5373\") " pod="openstack/dnsmasq-dns-6b9b57f477-zmzv8"
Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.870823 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1b94715-80cc-4d5d-a617-3d82ce24c5a3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c1b94715-80cc-4d5d-a617-3d82ce24c5a3\") " pod="openstack/glance-default-internal-api-0"
Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.870840 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a0321744-2cda-4f1f-9ea5-8ffa684b5373-ovsdbserver-nb\") pod \"dnsmasq-dns-6b9b57f477-zmzv8\" (UID: \"a0321744-2cda-4f1f-9ea5-8ffa684b5373\") " pod="openstack/dnsmasq-dns-6b9b57f477-zmzv8"
Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.870855 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1b94715-80cc-4d5d-a617-3d82ce24c5a3-logs\") pod \"glance-default-internal-api-0\" (UID: \"c1b94715-80cc-4d5d-a617-3d82ce24c5a3\") " pod="openstack/glance-default-internal-api-0"
Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.870875 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4wk4\" (UniqueName: \"kubernetes.io/projected/a0321744-2cda-4f1f-9ea5-8ffa684b5373-kube-api-access-z4wk4\") pod \"dnsmasq-dns-6b9b57f477-zmzv8\" (UID: \"a0321744-2cda-4f1f-9ea5-8ffa684b5373\") " pod="openstack/dnsmasq-dns-6b9b57f477-zmzv8"
Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.872221 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a0321744-2cda-4f1f-9ea5-8ffa684b5373-ovsdbserver-sb\") pod \"dnsmasq-dns-6b9b57f477-zmzv8\" (UID: \"a0321744-2cda-4f1f-9ea5-8ffa684b5373\") " pod="openstack/dnsmasq-dns-6b9b57f477-zmzv8"
Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.873496 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0321744-2cda-4f1f-9ea5-8ffa684b5373-config\") pod \"dnsmasq-dns-6b9b57f477-zmzv8\" (UID: \"a0321744-2cda-4f1f-9ea5-8ffa684b5373\") " pod="openstack/dnsmasq-dns-6b9b57f477-zmzv8"
pod="openstack/dnsmasq-dns-6b9b57f477-zmzv8" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.874165 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0321744-2cda-4f1f-9ea5-8ffa684b5373-dns-svc\") pod \"dnsmasq-dns-6b9b57f477-zmzv8\" (UID: \"a0321744-2cda-4f1f-9ea5-8ffa684b5373\") " pod="openstack/dnsmasq-dns-6b9b57f477-zmzv8" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.874603 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a0321744-2cda-4f1f-9ea5-8ffa684b5373-ovsdbserver-nb\") pod \"dnsmasq-dns-6b9b57f477-zmzv8\" (UID: \"a0321744-2cda-4f1f-9ea5-8ffa684b5373\") " pod="openstack/dnsmasq-dns-6b9b57f477-zmzv8" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.884202 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.896705 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4wk4\" (UniqueName: \"kubernetes.io/projected/a0321744-2cda-4f1f-9ea5-8ffa684b5373-kube-api-access-z4wk4\") pod \"dnsmasq-dns-6b9b57f477-zmzv8\" (UID: \"a0321744-2cda-4f1f-9ea5-8ffa684b5373\") " pod="openstack/dnsmasq-dns-6b9b57f477-zmzv8" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.972298 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c1b94715-80cc-4d5d-a617-3d82ce24c5a3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c1b94715-80cc-4d5d-a617-3d82ce24c5a3\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.972374 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c1b94715-80cc-4d5d-a617-3d82ce24c5a3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"c1b94715-80cc-4d5d-a617-3d82ce24c5a3\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.972421 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1b94715-80cc-4d5d-a617-3d82ce24c5a3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c1b94715-80cc-4d5d-a617-3d82ce24c5a3\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.972445 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1b94715-80cc-4d5d-a617-3d82ce24c5a3-logs\") pod \"glance-default-internal-api-0\" (UID: \"c1b94715-80cc-4d5d-a617-3d82ce24c5a3\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.972495 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftlwb\" (UniqueName: \"kubernetes.io/projected/c1b94715-80cc-4d5d-a617-3d82ce24c5a3-kube-api-access-ftlwb\") pod \"glance-default-internal-api-0\" (UID: \"c1b94715-80cc-4d5d-a617-3d82ce24c5a3\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.972549 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1b94715-80cc-4d5d-a617-3d82ce24c5a3-combined-ca-bundle\") pod 
\"glance-default-internal-api-0\" (UID: \"c1b94715-80cc-4d5d-a617-3d82ce24c5a3\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.972572 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1b94715-80cc-4d5d-a617-3d82ce24c5a3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c1b94715-80cc-4d5d-a617-3d82ce24c5a3\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.973010 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c1b94715-80cc-4d5d-a617-3d82ce24c5a3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c1b94715-80cc-4d5d-a617-3d82ce24c5a3\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.973196 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1b94715-80cc-4d5d-a617-3d82ce24c5a3-logs\") pod \"glance-default-internal-api-0\" (UID: \"c1b94715-80cc-4d5d-a617-3d82ce24c5a3\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.980687 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1b94715-80cc-4d5d-a617-3d82ce24c5a3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c1b94715-80cc-4d5d-a617-3d82ce24c5a3\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.981973 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c1b94715-80cc-4d5d-a617-3d82ce24c5a3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"c1b94715-80cc-4d5d-a617-3d82ce24c5a3\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.985246 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1b94715-80cc-4d5d-a617-3d82ce24c5a3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c1b94715-80cc-4d5d-a617-3d82ce24c5a3\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.991228 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b9b57f477-zmzv8" Dec 09 11:32:13 crc kubenswrapper[5002]: I1209 11:32:13.996960 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1b94715-80cc-4d5d-a617-3d82ce24c5a3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c1b94715-80cc-4d5d-a617-3d82ce24c5a3\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:32:14 crc kubenswrapper[5002]: I1209 11:32:14.002136 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftlwb\" (UniqueName: \"kubernetes.io/projected/c1b94715-80cc-4d5d-a617-3d82ce24c5a3-kube-api-access-ftlwb\") pod \"glance-default-internal-api-0\" (UID: \"c1b94715-80cc-4d5d-a617-3d82ce24c5a3\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:32:14 crc kubenswrapper[5002]: I1209 11:32:14.150917 5002 util.go:30] "No sandbox for pod can be found. 
Dec 09 11:32:14 crc kubenswrapper[5002]: I1209 11:32:14.481126 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b9b57f477-zmzv8"]
Dec 09 11:32:14 crc kubenswrapper[5002]: I1209 11:32:14.495144 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 09 11:32:14 crc kubenswrapper[5002]: W1209 11:32:14.500560 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c3be810_4233_4ba2_bb76_7244e5fee0ef.slice/crio-35cf94cd7de48d3949be66417b8572b6db74a4f253d5cd9a8ae98a4283ace2f2 WatchSource:0}: Error finding container 35cf94cd7de48d3949be66417b8572b6db74a4f253d5cd9a8ae98a4283ace2f2: Status 404 returned error can't find the container with id 35cf94cd7de48d3949be66417b8572b6db74a4f253d5cd9a8ae98a4283ace2f2
Dec 09 11:32:14 crc kubenswrapper[5002]: I1209 11:32:14.786265 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 09 11:32:14 crc kubenswrapper[5002]: I1209 11:32:14.818879 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 09 11:32:15 crc kubenswrapper[5002]: I1209 11:32:15.332033 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1c3be810-4233-4ba2-bb76-7244e5fee0ef","Type":"ContainerStarted","Data":"7d689c330ff910d2a71bae079f363572475d4229187e1a13a70535e2251c685b"}
Dec 09 11:32:15 crc kubenswrapper[5002]: I1209 11:32:15.332352 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1c3be810-4233-4ba2-bb76-7244e5fee0ef","Type":"ContainerStarted","Data":"35cf94cd7de48d3949be66417b8572b6db74a4f253d5cd9a8ae98a4283ace2f2"}
Dec 09 11:32:15 crc kubenswrapper[5002]: I1209 11:32:15.337342 5002 generic.go:334] "Generic (PLEG): container finished" podID="a0321744-2cda-4f1f-9ea5-8ffa684b5373" containerID="f186aa1bd90a344ff98f6dd37a3d3342620cce6159599ee1e55dbdd9a14850fc" exitCode=0
Dec 09 11:32:15 crc kubenswrapper[5002]: I1209 11:32:15.337487 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9b57f477-zmzv8" event={"ID":"a0321744-2cda-4f1f-9ea5-8ffa684b5373","Type":"ContainerDied","Data":"f186aa1bd90a344ff98f6dd37a3d3342620cce6159599ee1e55dbdd9a14850fc"}
Dec 09 11:32:15 crc kubenswrapper[5002]: I1209 11:32:15.337516 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9b57f477-zmzv8" event={"ID":"a0321744-2cda-4f1f-9ea5-8ffa684b5373","Type":"ContainerStarted","Data":"4f13571c2636d93a250256470bb9d7af7026d1cce7b1f5da9a95c358b71fa716"}
Dec 09 11:32:15 crc kubenswrapper[5002]: I1209 11:32:15.351514 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c1b94715-80cc-4d5d-a617-3d82ce24c5a3","Type":"ContainerStarted","Data":"3467cd13a717253bddc5870367e45fc71dfd304fc39ae29e44b856cab67d5cfe"}
Dec 09 11:32:16 crc kubenswrapper[5002]: I1209 11:32:16.365619 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1c3be810-4233-4ba2-bb76-7244e5fee0ef","Type":"ContainerStarted","Data":"54c483fbda8dc00c18d0c5de2c07531057470967592bff984fb7acba56da7004"}
Dec 09 11:32:16 crc kubenswrapper[5002]: I1209 11:32:16.367399 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="1c3be810-4233-4ba2-bb76-7244e5fee0ef" containerName="glance-log" containerID="cri-o://7d689c330ff910d2a71bae079f363572475d4229187e1a13a70535e2251c685b" gracePeriod=30
Dec 09 11:32:16 crc kubenswrapper[5002]: I1209 11:32:16.368261 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="1c3be810-4233-4ba2-bb76-7244e5fee0ef" containerName="glance-httpd" containerID="cri-o://54c483fbda8dc00c18d0c5de2c07531057470967592bff984fb7acba56da7004" gracePeriod=30
Dec 09 11:32:16 crc kubenswrapper[5002]: I1209 11:32:16.373751 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9b57f477-zmzv8" event={"ID":"a0321744-2cda-4f1f-9ea5-8ffa684b5373","Type":"ContainerStarted","Data":"a8fe9843eb691abd30db10125382870e732c648f12b2f9276abf088d2f89ebf8"}
Dec 09 11:32:16 crc kubenswrapper[5002]: I1209 11:32:16.374451 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b9b57f477-zmzv8"
Dec 09 11:32:16 crc kubenswrapper[5002]: I1209 11:32:16.380318 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c1b94715-80cc-4d5d-a617-3d82ce24c5a3","Type":"ContainerStarted","Data":"cfbc4675429fb100502a2fe0f58c9c1599bd1eb8d8415c6d2c0e5663050da2de"}
Dec 09 11:32:16 crc kubenswrapper[5002]: I1209 11:32:16.380372 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c1b94715-80cc-4d5d-a617-3d82ce24c5a3","Type":"ContainerStarted","Data":"555b3d0050057b6861e9e454db5f4c471febc862a48cf55cd9b42b1491cb4676"}
Dec 09 11:32:16 crc kubenswrapper[5002]: I1209 11:32:16.395353 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.39533053 podStartE2EDuration="3.39533053s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:32:16.388553959 +0000 UTC m=+5468.780605050" watchObservedRunningTime="2025-12-09 11:32:16.39533053 +0000 UTC m=+5468.787381611"
Dec 09 11:32:16 crc kubenswrapper[5002]: I1209 11:32:16.416877 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b9b57f477-zmzv8" podStartSLOduration=3.416852446 podStartE2EDuration="3.416852446s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:32:16.41512601 +0000 UTC m=+5468.807177151" watchObservedRunningTime="2025-12-09 11:32:16.416852446 +0000 UTC m=+5468.808903547"
Dec 09 11:32:16 crc kubenswrapper[5002]: I1209 11:32:16.444498 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.444471634 podStartE2EDuration="3.444471634s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:32:16.44395696 +0000 UTC m=+5468.836008041" watchObservedRunningTime="2025-12-09 11:32:16.444471634 +0000 UTC m=+5468.836522725"
Dec 09 11:32:17 crc kubenswrapper[5002]: I1209 11:32:17.010125 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
pods=["openstack/glance-default-internal-api-0"] Dec 09 11:32:17 crc kubenswrapper[5002]: I1209 11:32:17.392458 5002 generic.go:334] "Generic (PLEG): container finished" podID="1c3be810-4233-4ba2-bb76-7244e5fee0ef" containerID="54c483fbda8dc00c18d0c5de2c07531057470967592bff984fb7acba56da7004" exitCode=0 Dec 09 11:32:17 crc kubenswrapper[5002]: I1209 11:32:17.392870 5002 generic.go:334] "Generic (PLEG): container finished" podID="1c3be810-4233-4ba2-bb76-7244e5fee0ef" containerID="7d689c330ff910d2a71bae079f363572475d4229187e1a13a70535e2251c685b" exitCode=143 Dec 09 11:32:17 crc kubenswrapper[5002]: I1209 11:32:17.392526 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1c3be810-4233-4ba2-bb76-7244e5fee0ef","Type":"ContainerDied","Data":"54c483fbda8dc00c18d0c5de2c07531057470967592bff984fb7acba56da7004"} Dec 09 11:32:17 crc kubenswrapper[5002]: I1209 11:32:17.392943 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1c3be810-4233-4ba2-bb76-7244e5fee0ef","Type":"ContainerDied","Data":"7d689c330ff910d2a71bae079f363572475d4229187e1a13a70535e2251c685b"} Dec 09 11:32:17 crc kubenswrapper[5002]: I1209 11:32:17.536280 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 09 11:32:17 crc kubenswrapper[5002]: I1209 11:32:17.645798 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c3be810-4233-4ba2-bb76-7244e5fee0ef-logs\") pod \"1c3be810-4233-4ba2-bb76-7244e5fee0ef\" (UID: \"1c3be810-4233-4ba2-bb76-7244e5fee0ef\") " Dec 09 11:32:17 crc kubenswrapper[5002]: I1209 11:32:17.645873 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvtbg\" (UniqueName: \"kubernetes.io/projected/1c3be810-4233-4ba2-bb76-7244e5fee0ef-kube-api-access-tvtbg\") pod \"1c3be810-4233-4ba2-bb76-7244e5fee0ef\" (UID: \"1c3be810-4233-4ba2-bb76-7244e5fee0ef\") " Dec 09 11:32:17 crc kubenswrapper[5002]: I1209 11:32:17.645910 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1c3be810-4233-4ba2-bb76-7244e5fee0ef-ceph\") pod \"1c3be810-4233-4ba2-bb76-7244e5fee0ef\" (UID: \"1c3be810-4233-4ba2-bb76-7244e5fee0ef\") " Dec 09 11:32:17 crc kubenswrapper[5002]: I1209 11:32:17.645978 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c3be810-4233-4ba2-bb76-7244e5fee0ef-httpd-run\") pod \"1c3be810-4233-4ba2-bb76-7244e5fee0ef\" (UID: \"1c3be810-4233-4ba2-bb76-7244e5fee0ef\") " Dec 09 11:32:17 crc kubenswrapper[5002]: I1209 11:32:17.646055 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c3be810-4233-4ba2-bb76-7244e5fee0ef-combined-ca-bundle\") pod \"1c3be810-4233-4ba2-bb76-7244e5fee0ef\" (UID: \"1c3be810-4233-4ba2-bb76-7244e5fee0ef\") " Dec 09 11:32:17 crc kubenswrapper[5002]: I1209 11:32:17.646117 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c3be810-4233-4ba2-bb76-7244e5fee0ef-config-data\") pod \"1c3be810-4233-4ba2-bb76-7244e5fee0ef\" (UID: \"1c3be810-4233-4ba2-bb76-7244e5fee0ef\") " Dec 09 11:32:17 crc kubenswrapper[5002]: I1209 11:32:17.646161 5002 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c3be810-4233-4ba2-bb76-7244e5fee0ef-scripts\") pod \"1c3be810-4233-4ba2-bb76-7244e5fee0ef\" (UID: \"1c3be810-4233-4ba2-bb76-7244e5fee0ef\") " Dec 09 11:32:17 crc kubenswrapper[5002]: I1209 11:32:17.646263 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c3be810-4233-4ba2-bb76-7244e5fee0ef-logs" (OuterVolumeSpecName: "logs") pod "1c3be810-4233-4ba2-bb76-7244e5fee0ef" (UID: "1c3be810-4233-4ba2-bb76-7244e5fee0ef"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:32:17 crc kubenswrapper[5002]: I1209 11:32:17.646741 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c3be810-4233-4ba2-bb76-7244e5fee0ef-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1c3be810-4233-4ba2-bb76-7244e5fee0ef" (UID: "1c3be810-4233-4ba2-bb76-7244e5fee0ef"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:32:17 crc kubenswrapper[5002]: I1209 11:32:17.646868 5002 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c3be810-4233-4ba2-bb76-7244e5fee0ef-logs\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:17 crc kubenswrapper[5002]: I1209 11:32:17.651201 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c3be810-4233-4ba2-bb76-7244e5fee0ef-kube-api-access-tvtbg" (OuterVolumeSpecName: "kube-api-access-tvtbg") pod "1c3be810-4233-4ba2-bb76-7244e5fee0ef" (UID: "1c3be810-4233-4ba2-bb76-7244e5fee0ef"). InnerVolumeSpecName "kube-api-access-tvtbg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:17 crc kubenswrapper[5002]: I1209 11:32:17.651600 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c3be810-4233-4ba2-bb76-7244e5fee0ef-scripts" (OuterVolumeSpecName: "scripts") pod "1c3be810-4233-4ba2-bb76-7244e5fee0ef" (UID: "1c3be810-4233-4ba2-bb76-7244e5fee0ef"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:17 crc kubenswrapper[5002]: I1209 11:32:17.652114 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c3be810-4233-4ba2-bb76-7244e5fee0ef-ceph" (OuterVolumeSpecName: "ceph") pod "1c3be810-4233-4ba2-bb76-7244e5fee0ef" (UID: "1c3be810-4233-4ba2-bb76-7244e5fee0ef"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:17 crc kubenswrapper[5002]: I1209 11:32:17.670219 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c3be810-4233-4ba2-bb76-7244e5fee0ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c3be810-4233-4ba2-bb76-7244e5fee0ef" (UID: "1c3be810-4233-4ba2-bb76-7244e5fee0ef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:17 crc kubenswrapper[5002]: I1209 11:32:17.690946 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c3be810-4233-4ba2-bb76-7244e5fee0ef-config-data" (OuterVolumeSpecName: "config-data") pod "1c3be810-4233-4ba2-bb76-7244e5fee0ef" (UID: "1c3be810-4233-4ba2-bb76-7244e5fee0ef"). InnerVolumeSpecName "config-data". 
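Note the two exit codes above: glance-httpd ended with exitCode=0 (it shut down cleanly within the 30s grace period), while the glance-log tail ended with exitCode=143, the common shell-style 128+signal encoding of SIGTERM (15). A tiny decoder for that convention (Linux/Unix signal numbering assumed):

// exitcode.go - decodes the 128+signal convention behind the exitCode
// values above: 143 = 128 + 15 (SIGTERM); 137 would be SIGKILL.
package main

import (
	"fmt"
	"syscall"
)

func describe(code int) string {
	if code > 128 {
		sig := syscall.Signal(code - 128)
		return fmt.Sprintf("terminated by signal %d (%s)", code-128, sig)
	}
	if code == 0 {
		return "exited cleanly"
	}
	return fmt.Sprintf("exited with error code %d", code)
}

func main() {
	for _, c := range []int{0, 143, 137} {
		fmt.Printf("exitCode=%d: %s\n", c, describe(c))
	}
}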
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:17 crc kubenswrapper[5002]: I1209 11:32:17.748402 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvtbg\" (UniqueName: \"kubernetes.io/projected/1c3be810-4233-4ba2-bb76-7244e5fee0ef-kube-api-access-tvtbg\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:17 crc kubenswrapper[5002]: I1209 11:32:17.748445 5002 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1c3be810-4233-4ba2-bb76-7244e5fee0ef-ceph\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:17 crc kubenswrapper[5002]: I1209 11:32:17.748458 5002 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c3be810-4233-4ba2-bb76-7244e5fee0ef-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:17 crc kubenswrapper[5002]: I1209 11:32:17.748472 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c3be810-4233-4ba2-bb76-7244e5fee0ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:17 crc kubenswrapper[5002]: I1209 11:32:17.748486 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c3be810-4233-4ba2-bb76-7244e5fee0ef-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:17 crc kubenswrapper[5002]: I1209 11:32:17.748497 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c3be810-4233-4ba2-bb76-7244e5fee0ef-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:18 crc kubenswrapper[5002]: I1209 11:32:18.072598 5002 scope.go:117] "RemoveContainer" containerID="24eda190128d46e2bfa806f4839b38f2462cd8acaa8816efdf9934cf2dc46679" Dec 09 11:32:18 crc kubenswrapper[5002]: E1209 11:32:18.073125 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:32:18 crc kubenswrapper[5002]: I1209 11:32:18.406199 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1c3be810-4233-4ba2-bb76-7244e5fee0ef","Type":"ContainerDied","Data":"35cf94cd7de48d3949be66417b8572b6db74a4f253d5cd9a8ae98a4283ace2f2"} Dec 09 11:32:18 crc kubenswrapper[5002]: I1209 11:32:18.406276 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 09 11:32:18 crc kubenswrapper[5002]: I1209 11:32:18.406303 5002 scope.go:117] "RemoveContainer" containerID="54c483fbda8dc00c18d0c5de2c07531057470967592bff984fb7acba56da7004" Dec 09 11:32:18 crc kubenswrapper[5002]: I1209 11:32:18.419199 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c1b94715-80cc-4d5d-a617-3d82ce24c5a3" containerName="glance-log" containerID="cri-o://555b3d0050057b6861e9e454db5f4c471febc862a48cf55cd9b42b1491cb4676" gracePeriod=30 Dec 09 11:32:18 crc kubenswrapper[5002]: I1209 11:32:18.419227 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c1b94715-80cc-4d5d-a617-3d82ce24c5a3" containerName="glance-httpd" containerID="cri-o://cfbc4675429fb100502a2fe0f58c9c1599bd1eb8d8415c6d2c0e5663050da2de" gracePeriod=30 Dec 09 11:32:18 crc kubenswrapper[5002]: I1209 11:32:18.441940 5002 scope.go:117] "RemoveContainer" containerID="7d689c330ff910d2a71bae079f363572475d4229187e1a13a70535e2251c685b" Dec 09 11:32:18 crc kubenswrapper[5002]: I1209 11:32:18.456890 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 11:32:18 crc kubenswrapper[5002]: I1209 11:32:18.477475 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 11:32:18 crc kubenswrapper[5002]: I1209 11:32:18.505951 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 11:32:18 crc kubenswrapper[5002]: E1209 11:32:18.506623 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c3be810-4233-4ba2-bb76-7244e5fee0ef" containerName="glance-httpd" Dec 09 11:32:18 crc kubenswrapper[5002]: I1209 11:32:18.506650 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c3be810-4233-4ba2-bb76-7244e5fee0ef" containerName="glance-httpd" Dec 09 11:32:18 crc kubenswrapper[5002]: E1209 11:32:18.506708 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c3be810-4233-4ba2-bb76-7244e5fee0ef" containerName="glance-log" Dec 09 11:32:18 crc kubenswrapper[5002]: I1209 11:32:18.506718 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c3be810-4233-4ba2-bb76-7244e5fee0ef" containerName="glance-log" Dec 09 11:32:18 crc kubenswrapper[5002]: I1209 11:32:18.511624 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c3be810-4233-4ba2-bb76-7244e5fee0ef" containerName="glance-httpd" Dec 09 11:32:18 crc kubenswrapper[5002]: I1209 11:32:18.511695 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c3be810-4233-4ba2-bb76-7244e5fee0ef" containerName="glance-log" Dec 09 11:32:18 crc kubenswrapper[5002]: I1209 11:32:18.513308 5002 util.go:30] "No sandbox for pod can be found. 
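The machine-config-daemon error above is unrelated to the glance churn: the kubelet is in restart backoff for a crash-looping container and skips the sync until the timer expires. "back-off 5m0s" means the delay has reached the cap; the upstream kubelet defaults are believed to be a 10s initial delay that doubles per crash up to 5m, roughly the schedule this sketch prints:

// backoff.go - illustrates the restart backoff implied by
// "back-off 5m0s restarting failed container": the delay doubles on
// each crash (assumed 10s initial, 5m cap, per upstream defaults).
package main

import (
	"fmt"
	"time"
)

func main() {
	delay, cap := 10*time.Second, 5*time.Minute
	for i := 1; i <= 8; i++ {
		fmt.Printf("crash #%d: next restart in %v\n", i, delay)
		if delay *= 2; delay > cap {
			delay = cap
		}
	}
}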
Dec 09 11:32:18 crc kubenswrapper[5002]: I1209 11:32:18.516409 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Dec 09 11:32:18 crc kubenswrapper[5002]: I1209 11:32:18.594317 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 09 11:32:18 crc kubenswrapper[5002]: I1209 11:32:18.596393 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d-config-data\") pod \"glance-default-external-api-0\" (UID: \"5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d\") " pod="openstack/glance-default-external-api-0"
Dec 09 11:32:18 crc kubenswrapper[5002]: I1209 11:32:18.596452 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d\") " pod="openstack/glance-default-external-api-0"
Dec 09 11:32:18 crc kubenswrapper[5002]: I1209 11:32:18.596473 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d-ceph\") pod \"glance-default-external-api-0\" (UID: \"5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d\") " pod="openstack/glance-default-external-api-0"
Dec 09 11:32:18 crc kubenswrapper[5002]: I1209 11:32:18.596557 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xp54h\" (UniqueName: \"kubernetes.io/projected/5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d-kube-api-access-xp54h\") pod \"glance-default-external-api-0\" (UID: \"5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d\") " pod="openstack/glance-default-external-api-0"
Dec 09 11:32:18 crc kubenswrapper[5002]: I1209 11:32:18.596589 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d\") " pod="openstack/glance-default-external-api-0"
Dec 09 11:32:18 crc kubenswrapper[5002]: I1209 11:32:18.596617 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d-scripts\") pod \"glance-default-external-api-0\" (UID: \"5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d\") " pod="openstack/glance-default-external-api-0"
Dec 09 11:32:18 crc kubenswrapper[5002]: I1209 11:32:18.596639 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d-logs\") pod \"glance-default-external-api-0\" (UID: \"5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d\") " pod="openstack/glance-default-external-api-0"
Dec 09 11:32:18 crc kubenswrapper[5002]: I1209 11:32:18.698107 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d-logs\") pod \"glance-default-external-api-0\" (UID: \"5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d\") " pod="openstack/glance-default-external-api-0"
Dec 09 11:32:18 crc kubenswrapper[5002]: I1209 11:32:18.698267 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d-config-data\") pod \"glance-default-external-api-0\" (UID: \"5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d\") " pod="openstack/glance-default-external-api-0"
Dec 09 11:32:18 crc kubenswrapper[5002]: I1209 11:32:18.698293 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d\") " pod="openstack/glance-default-external-api-0"
Dec 09 11:32:18 crc kubenswrapper[5002]: I1209 11:32:18.698313 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d-ceph\") pod \"glance-default-external-api-0\" (UID: \"5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d\") " pod="openstack/glance-default-external-api-0"
Dec 09 11:32:18 crc kubenswrapper[5002]: I1209 11:32:18.698364 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xp54h\" (UniqueName: \"kubernetes.io/projected/5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d-kube-api-access-xp54h\") pod \"glance-default-external-api-0\" (UID: \"5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d\") " pod="openstack/glance-default-external-api-0"
Dec 09 11:32:18 crc kubenswrapper[5002]: I1209 11:32:18.698382 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d\") " pod="openstack/glance-default-external-api-0"
Dec 09 11:32:18 crc kubenswrapper[5002]: I1209 11:32:18.698408 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d-scripts\") pod \"glance-default-external-api-0\" (UID: \"5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d\") " pod="openstack/glance-default-external-api-0"
Dec 09 11:32:18 crc kubenswrapper[5002]: I1209 11:32:18.698571 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d-logs\") pod \"glance-default-external-api-0\" (UID: \"5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d\") " pod="openstack/glance-default-external-api-0"
Dec 09 11:32:18 crc kubenswrapper[5002]: I1209 11:32:18.698830 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d\") " pod="openstack/glance-default-external-api-0"
Dec 09 11:32:18 crc kubenswrapper[5002]: I1209 11:32:18.703571 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d-ceph\") pod \"glance-default-external-api-0\" (UID: \"5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d\") " pod="openstack/glance-default-external-api-0"
Dec 09 11:32:18 crc kubenswrapper[5002]: I1209 11:32:18.703830 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d-config-data\") pod \"glance-default-external-api-0\" (UID: \"5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d\") " pod="openstack/glance-default-external-api-0"
Dec 09 11:32:18 crc kubenswrapper[5002]: I1209 11:32:18.705851 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d-scripts\") pod \"glance-default-external-api-0\" (UID: \"5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d\") " pod="openstack/glance-default-external-api-0"
Dec 09 11:32:18 crc kubenswrapper[5002]: I1209 11:32:18.706614 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d\") " pod="openstack/glance-default-external-api-0"
Dec 09 11:32:18 crc kubenswrapper[5002]: I1209 11:32:18.716990 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xp54h\" (UniqueName: \"kubernetes.io/projected/5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d-kube-api-access-xp54h\") pod \"glance-default-external-api-0\" (UID: \"5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d\") " pod="openstack/glance-default-external-api-0"
Dec 09 11:32:18 crc kubenswrapper[5002]: I1209 11:32:18.843009 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 09 11:32:19 crc kubenswrapper[5002]: I1209 11:32:19.381056 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 09 11:32:19 crc kubenswrapper[5002]: I1209 11:32:19.423133 5002 generic.go:334] "Generic (PLEG): container finished" podID="c1b94715-80cc-4d5d-a617-3d82ce24c5a3" containerID="cfbc4675429fb100502a2fe0f58c9c1599bd1eb8d8415c6d2c0e5663050da2de" exitCode=0
Dec 09 11:32:19 crc kubenswrapper[5002]: I1209 11:32:19.423171 5002 generic.go:334] "Generic (PLEG): container finished" podID="c1b94715-80cc-4d5d-a617-3d82ce24c5a3" containerID="555b3d0050057b6861e9e454db5f4c471febc862a48cf55cd9b42b1491cb4676" exitCode=143
Dec 09 11:32:19 crc kubenswrapper[5002]: I1209 11:32:19.423180 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c1b94715-80cc-4d5d-a617-3d82ce24c5a3","Type":"ContainerDied","Data":"cfbc4675429fb100502a2fe0f58c9c1599bd1eb8d8415c6d2c0e5663050da2de"}
Dec 09 11:32:19 crc kubenswrapper[5002]: I1209 11:32:19.423236 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c1b94715-80cc-4d5d-a617-3d82ce24c5a3","Type":"ContainerDied","Data":"555b3d0050057b6861e9e454db5f4c471febc862a48cf55cd9b42b1491cb4676"}
Dec 09 11:32:19 crc kubenswrapper[5002]: I1209 11:32:19.424924 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d","Type":"ContainerStarted","Data":"cea9e9832989b520bdc9a7743d44c29484ece69eecb96defacc9211a172138a7"}
Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.074278 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.084752 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c3be810-4233-4ba2-bb76-7244e5fee0ef" path="/var/lib/kubelet/pods/1c3be810-4233-4ba2-bb76-7244e5fee0ef/volumes"
Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.223374 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1b94715-80cc-4d5d-a617-3d82ce24c5a3-scripts\") pod \"c1b94715-80cc-4d5d-a617-3d82ce24c5a3\" (UID: \"c1b94715-80cc-4d5d-a617-3d82ce24c5a3\") "
Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.223505 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1b94715-80cc-4d5d-a617-3d82ce24c5a3-logs\") pod \"c1b94715-80cc-4d5d-a617-3d82ce24c5a3\" (UID: \"c1b94715-80cc-4d5d-a617-3d82ce24c5a3\") "
Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.223530 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c1b94715-80cc-4d5d-a617-3d82ce24c5a3-httpd-run\") pod \"c1b94715-80cc-4d5d-a617-3d82ce24c5a3\" (UID: \"c1b94715-80cc-4d5d-a617-3d82ce24c5a3\") "
Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.224095 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1b94715-80cc-4d5d-a617-3d82ce24c5a3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c1b94715-80cc-4d5d-a617-3d82ce24c5a3" (UID: "c1b94715-80cc-4d5d-a617-3d82ce24c5a3"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.224325 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1b94715-80cc-4d5d-a617-3d82ce24c5a3-logs" (OuterVolumeSpecName: "logs") pod "c1b94715-80cc-4d5d-a617-3d82ce24c5a3" (UID: "c1b94715-80cc-4d5d-a617-3d82ce24c5a3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.224439 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftlwb\" (UniqueName: \"kubernetes.io/projected/c1b94715-80cc-4d5d-a617-3d82ce24c5a3-kube-api-access-ftlwb\") pod \"c1b94715-80cc-4d5d-a617-3d82ce24c5a3\" (UID: \"c1b94715-80cc-4d5d-a617-3d82ce24c5a3\") "
Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.224499 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1b94715-80cc-4d5d-a617-3d82ce24c5a3-combined-ca-bundle\") pod \"c1b94715-80cc-4d5d-a617-3d82ce24c5a3\" (UID: \"c1b94715-80cc-4d5d-a617-3d82ce24c5a3\") "
Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.224520 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c1b94715-80cc-4d5d-a617-3d82ce24c5a3-ceph\") pod \"c1b94715-80cc-4d5d-a617-3d82ce24c5a3\" (UID: \"c1b94715-80cc-4d5d-a617-3d82ce24c5a3\") "
Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.224557 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1b94715-80cc-4d5d-a617-3d82ce24c5a3-config-data\") pod \"c1b94715-80cc-4d5d-a617-3d82ce24c5a3\" (UID: \"c1b94715-80cc-4d5d-a617-3d82ce24c5a3\") "
Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.224979 5002 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1b94715-80cc-4d5d-a617-3d82ce24c5a3-logs\") on node \"crc\" DevicePath \"\""
Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.224990 5002 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c1b94715-80cc-4d5d-a617-3d82ce24c5a3-httpd-run\") on node \"crc\" DevicePath \"\""
Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.228432 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1b94715-80cc-4d5d-a617-3d82ce24c5a3-scripts" (OuterVolumeSpecName: "scripts") pod "c1b94715-80cc-4d5d-a617-3d82ce24c5a3" (UID: "c1b94715-80cc-4d5d-a617-3d82ce24c5a3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.228454 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1b94715-80cc-4d5d-a617-3d82ce24c5a3-kube-api-access-ftlwb" (OuterVolumeSpecName: "kube-api-access-ftlwb") pod "c1b94715-80cc-4d5d-a617-3d82ce24c5a3" (UID: "c1b94715-80cc-4d5d-a617-3d82ce24c5a3"). InnerVolumeSpecName "kube-api-access-ftlwb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.228845 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1b94715-80cc-4d5d-a617-3d82ce24c5a3-ceph" (OuterVolumeSpecName: "ceph") pod "c1b94715-80cc-4d5d-a617-3d82ce24c5a3" (UID: "c1b94715-80cc-4d5d-a617-3d82ce24c5a3"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.249878 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1b94715-80cc-4d5d-a617-3d82ce24c5a3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c1b94715-80cc-4d5d-a617-3d82ce24c5a3" (UID: "c1b94715-80cc-4d5d-a617-3d82ce24c5a3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.272335 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1b94715-80cc-4d5d-a617-3d82ce24c5a3-config-data" (OuterVolumeSpecName: "config-data") pod "c1b94715-80cc-4d5d-a617-3d82ce24c5a3" (UID: "c1b94715-80cc-4d5d-a617-3d82ce24c5a3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.327071 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftlwb\" (UniqueName: \"kubernetes.io/projected/c1b94715-80cc-4d5d-a617-3d82ce24c5a3-kube-api-access-ftlwb\") on node \"crc\" DevicePath \"\""
Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.327109 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1b94715-80cc-4d5d-a617-3d82ce24c5a3-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.327122 5002 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c1b94715-80cc-4d5d-a617-3d82ce24c5a3-ceph\") on node \"crc\" DevicePath \"\""
Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.327135 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1b94715-80cc-4d5d-a617-3d82ce24c5a3-config-data\") on node \"crc\" DevicePath \"\""
Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.327147 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1b94715-80cc-4d5d-a617-3d82ce24c5a3-scripts\") on node \"crc\" DevicePath \"\""
Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.439773 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.439802 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c1b94715-80cc-4d5d-a617-3d82ce24c5a3","Type":"ContainerDied","Data":"3467cd13a717253bddc5870367e45fc71dfd304fc39ae29e44b856cab67d5cfe"}
Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.439987 5002 scope.go:117] "RemoveContainer" containerID="cfbc4675429fb100502a2fe0f58c9c1599bd1eb8d8415c6d2c0e5663050da2de"
Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.444308 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d","Type":"ContainerStarted","Data":"11bd6b82419787317723b421efc0133288435a57b6cf87cdfe4c3b856dbc19da"}
Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.481072 5002 scope.go:117] "RemoveContainer" containerID="555b3d0050057b6861e9e454db5f4c471febc862a48cf55cd9b42b1491cb4676"
Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.481298 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.503898 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.513766 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 09 11:32:20 crc kubenswrapper[5002]: E1209 11:32:20.514274 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1b94715-80cc-4d5d-a617-3d82ce24c5a3" containerName="glance-httpd"
Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.514290 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1b94715-80cc-4d5d-a617-3d82ce24c5a3" containerName="glance-httpd"
Dec 09 11:32:20 crc kubenswrapper[5002]: E1209 11:32:20.514305 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1b94715-80cc-4d5d-a617-3d82ce24c5a3" containerName="glance-log"
Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.514311 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1b94715-80cc-4d5d-a617-3d82ce24c5a3" containerName="glance-log"
Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.514485 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1b94715-80cc-4d5d-a617-3d82ce24c5a3" containerName="glance-log"
Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.514514 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1b94715-80cc-4d5d-a617-3d82ce24c5a3" containerName="glance-httpd"
Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.515846 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.524974 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.554280 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.630898 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b\") " pod="openstack/glance-default-internal-api-0"
Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.630958 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b\") " pod="openstack/glance-default-internal-api-0"
Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.631028 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b-logs\") pod \"glance-default-internal-api-0\" (UID: \"4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b\") " pod="openstack/glance-default-internal-api-0"
Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.631046 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b-ceph\") pod \"glance-default-internal-api-0\" (UID: \"4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b\") " pod="openstack/glance-default-internal-api-0"
Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.631116 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b\") " pod="openstack/glance-default-internal-api-0"
Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.631157 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pqn6\" (UniqueName: \"kubernetes.io/projected/4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b-kube-api-access-6pqn6\") pod \"glance-default-internal-api-0\" (UID: \"4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b\") " pod="openstack/glance-default-internal-api-0"
Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.631186 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b\") " pod="openstack/glance-default-internal-api-0"
Dec 09 11:32:20 crc kubenswrapper[5002]: E1209 11:32:20.647700 5002 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1b94715_80cc_4d5d_a617_3d82ce24c5a3.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1b94715_80cc_4d5d_a617_3d82ce24c5a3.slice/crio-3467cd13a717253bddc5870367e45fc71dfd304fc39ae29e44b856cab67d5cfe\": RecentStats: unable to find data in memory cache]"
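The cadvisor_stats_provider error above appears to be transient fallout from the churn: stats were requested for the just-removed c1b94715 cgroup slice, whose samples were no longer in cadvisor's in-memory cache. In a journal this dense, the W/E records are the ones worth isolating; a small sketch that tallies klog records by severity and source file (record pattern assumed from the lines above):

// triage.go - counts klog records on stdin by severity (leading I/W/E)
// and source file, e.g. "E cadvisor_stats_provider.go", so rare
// warnings and errors stand out against Info-level reconciler traffic.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// Matches e.g. "E1209 11:32:20.647700 5002 cadvisor_stats_provider.go:516]".
var rec = regexp.MustCompile(`([IWE])\d{4} \d{2}:\d{2}:\d{2}\.\d+\s+\d+\s+([\w.]+\.go):\d+\]`)

func main() {
	counts := map[string]int{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 1<<20), 1<<20)
	for sc.Scan() {
		for _, m := range rec.FindAllStringSubmatch(sc.Text(), -1) {
			counts[m[1]+" "+m[2]]++
		}
	}
	for k, n := range counts {
		fmt.Printf("%6d  %s\n", n, k)
	}
}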
memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1b94715_80cc_4d5d_a617_3d82ce24c5a3.slice/crio-3467cd13a717253bddc5870367e45fc71dfd304fc39ae29e44b856cab67d5cfe\": RecentStats: unable to find data in memory cache]" Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.732891 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.733247 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.733314 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b-logs\") pod \"glance-default-internal-api-0\" (UID: \"4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.733332 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b-ceph\") pod \"glance-default-internal-api-0\" (UID: \"4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.733353 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.733375 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pqn6\" (UniqueName: \"kubernetes.io/projected/4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b-kube-api-access-6pqn6\") pod \"glance-default-internal-api-0\" (UID: \"4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.733391 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.733923 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.734134 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.737837 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.738371 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b-ceph\") pod \"glance-default-internal-api-0\" (UID: \"4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.739288 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.739480 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.757000 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pqn6\" (UniqueName: \"kubernetes.io/projected/4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b-kube-api-access-6pqn6\") pod \"glance-default-internal-api-0\" (UID: \"4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:32:20 crc kubenswrapper[5002]: I1209 11:32:20.839934 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 09 11:32:21 crc kubenswrapper[5002]: I1209 11:32:21.446264 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 11:32:21 crc kubenswrapper[5002]: I1209 11:32:21.457286 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d","Type":"ContainerStarted","Data":"95de9323ecc7ac5f2f04f3b35d37d92ccbed743d76744ec9714c03a948831708"} Dec 09 11:32:21 crc kubenswrapper[5002]: I1209 11:32:21.482052 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.4820299439999998 podStartE2EDuration="3.482029944s" podCreationTimestamp="2025-12-09 11:32:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:32:21.477532913 +0000 UTC m=+5473.869583994" watchObservedRunningTime="2025-12-09 11:32:21.482029944 +0000 UTC m=+5473.874081025" Dec 09 11:32:22 crc kubenswrapper[5002]: I1209 11:32:22.073866 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1b94715-80cc-4d5d-a617-3d82ce24c5a3" path="/var/lib/kubelet/pods/c1b94715-80cc-4d5d-a617-3d82ce24c5a3/volumes" Dec 09 11:32:22 crc kubenswrapper[5002]: I1209 11:32:22.469911 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b","Type":"ContainerStarted","Data":"9e1fdd6323ae596bb3e174d484097706c3446d95c512a3dc7a9c52a735175649"} Dec 09 11:32:22 crc kubenswrapper[5002]: I1209 11:32:22.469990 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b","Type":"ContainerStarted","Data":"03e341703efdba3405a458fa7d8616e3693ff42f9e523a6dd89fe07f0c8c0336"} Dec 09 11:32:23 crc kubenswrapper[5002]: I1209 11:32:23.486380 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b","Type":"ContainerStarted","Data":"2ad7495d08109df0a1d09a4f59f28f80dbd0450288de4c579c6ddea94badcc3c"} Dec 09 11:32:23 crc kubenswrapper[5002]: I1209 11:32:23.518546 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.518522218 podStartE2EDuration="3.518522218s" podCreationTimestamp="2025-12-09 11:32:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:32:23.50700628 +0000 UTC m=+5475.899057401" watchObservedRunningTime="2025-12-09 11:32:23.518522218 +0000 UTC m=+5475.910573299" Dec 09 11:32:23 crc kubenswrapper[5002]: I1209 11:32:23.992964 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b9b57f477-zmzv8" Dec 09 11:32:24 crc kubenswrapper[5002]: I1209 11:32:24.052397 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59d59797-cwc9r"] Dec 09 11:32:24 crc kubenswrapper[5002]: I1209 11:32:24.054823 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f59d59797-cwc9r" podUID="05aacd78-7157-4300-88d1-de9770dc85ee" containerName="dnsmasq-dns" 
containerID="cri-o://aa9364982bacc0ce36d6d153500797f6cce4d70f349ff406cf489cfa5ed7257d" gracePeriod=10 Dec 09 11:32:25 crc kubenswrapper[5002]: I1209 11:32:25.308228 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59d59797-cwc9r" Dec 09 11:32:25 crc kubenswrapper[5002]: I1209 11:32:25.415660 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/05aacd78-7157-4300-88d1-de9770dc85ee-dns-svc\") pod \"05aacd78-7157-4300-88d1-de9770dc85ee\" (UID: \"05aacd78-7157-4300-88d1-de9770dc85ee\") " Dec 09 11:32:25 crc kubenswrapper[5002]: I1209 11:32:25.415730 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bs57\" (UniqueName: \"kubernetes.io/projected/05aacd78-7157-4300-88d1-de9770dc85ee-kube-api-access-4bs57\") pod \"05aacd78-7157-4300-88d1-de9770dc85ee\" (UID: \"05aacd78-7157-4300-88d1-de9770dc85ee\") " Dec 09 11:32:25 crc kubenswrapper[5002]: I1209 11:32:25.415841 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/05aacd78-7157-4300-88d1-de9770dc85ee-ovsdbserver-nb\") pod \"05aacd78-7157-4300-88d1-de9770dc85ee\" (UID: \"05aacd78-7157-4300-88d1-de9770dc85ee\") " Dec 09 11:32:25 crc kubenswrapper[5002]: I1209 11:32:25.415864 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/05aacd78-7157-4300-88d1-de9770dc85ee-ovsdbserver-sb\") pod \"05aacd78-7157-4300-88d1-de9770dc85ee\" (UID: \"05aacd78-7157-4300-88d1-de9770dc85ee\") " Dec 09 11:32:25 crc kubenswrapper[5002]: I1209 11:32:25.415890 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05aacd78-7157-4300-88d1-de9770dc85ee-config\") pod \"05aacd78-7157-4300-88d1-de9770dc85ee\" (UID: \"05aacd78-7157-4300-88d1-de9770dc85ee\") " Dec 09 11:32:25 crc kubenswrapper[5002]: I1209 11:32:25.434485 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05aacd78-7157-4300-88d1-de9770dc85ee-kube-api-access-4bs57" (OuterVolumeSpecName: "kube-api-access-4bs57") pod "05aacd78-7157-4300-88d1-de9770dc85ee" (UID: "05aacd78-7157-4300-88d1-de9770dc85ee"). InnerVolumeSpecName "kube-api-access-4bs57". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:25 crc kubenswrapper[5002]: I1209 11:32:25.457580 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05aacd78-7157-4300-88d1-de9770dc85ee-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "05aacd78-7157-4300-88d1-de9770dc85ee" (UID: "05aacd78-7157-4300-88d1-de9770dc85ee"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:25 crc kubenswrapper[5002]: I1209 11:32:25.459404 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05aacd78-7157-4300-88d1-de9770dc85ee-config" (OuterVolumeSpecName: "config") pod "05aacd78-7157-4300-88d1-de9770dc85ee" (UID: "05aacd78-7157-4300-88d1-de9770dc85ee"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:25 crc kubenswrapper[5002]: I1209 11:32:25.460324 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05aacd78-7157-4300-88d1-de9770dc85ee-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "05aacd78-7157-4300-88d1-de9770dc85ee" (UID: "05aacd78-7157-4300-88d1-de9770dc85ee"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:25 crc kubenswrapper[5002]: I1209 11:32:25.462582 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05aacd78-7157-4300-88d1-de9770dc85ee-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "05aacd78-7157-4300-88d1-de9770dc85ee" (UID: "05aacd78-7157-4300-88d1-de9770dc85ee"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:25 crc kubenswrapper[5002]: I1209 11:32:25.510761 5002 generic.go:334] "Generic (PLEG): container finished" podID="05aacd78-7157-4300-88d1-de9770dc85ee" containerID="aa9364982bacc0ce36d6d153500797f6cce4d70f349ff406cf489cfa5ed7257d" exitCode=0 Dec 09 11:32:25 crc kubenswrapper[5002]: I1209 11:32:25.510835 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59d59797-cwc9r" Dec 09 11:32:25 crc kubenswrapper[5002]: I1209 11:32:25.510871 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59d59797-cwc9r" event={"ID":"05aacd78-7157-4300-88d1-de9770dc85ee","Type":"ContainerDied","Data":"aa9364982bacc0ce36d6d153500797f6cce4d70f349ff406cf489cfa5ed7257d"} Dec 09 11:32:25 crc kubenswrapper[5002]: I1209 11:32:25.511227 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59d59797-cwc9r" event={"ID":"05aacd78-7157-4300-88d1-de9770dc85ee","Type":"ContainerDied","Data":"587402d4b678d8c17d7a9f356f3d7023f80cc75c74378bb4f884caa27c9ccefd"} Dec 09 11:32:25 crc kubenswrapper[5002]: I1209 11:32:25.511248 5002 scope.go:117] "RemoveContainer" containerID="aa9364982bacc0ce36d6d153500797f6cce4d70f349ff406cf489cfa5ed7257d" Dec 09 11:32:25 crc kubenswrapper[5002]: I1209 11:32:25.518148 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bs57\" (UniqueName: \"kubernetes.io/projected/05aacd78-7157-4300-88d1-de9770dc85ee-kube-api-access-4bs57\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:25 crc kubenswrapper[5002]: I1209 11:32:25.518176 5002 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/05aacd78-7157-4300-88d1-de9770dc85ee-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:25 crc kubenswrapper[5002]: I1209 11:32:25.518188 5002 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/05aacd78-7157-4300-88d1-de9770dc85ee-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:25 crc kubenswrapper[5002]: I1209 11:32:25.518199 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05aacd78-7157-4300-88d1-de9770dc85ee-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:25 crc kubenswrapper[5002]: I1209 11:32:25.518211 5002 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/05aacd78-7157-4300-88d1-de9770dc85ee-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:25 crc kubenswrapper[5002]: I1209 11:32:25.551376 
5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59d59797-cwc9r"] Dec 09 11:32:25 crc kubenswrapper[5002]: I1209 11:32:25.552418 5002 scope.go:117] "RemoveContainer" containerID="8b89733bfb6323aa3f601b0b1fc7df5bd875b1443c83d283c4dbbc99eae2c110" Dec 09 11:32:25 crc kubenswrapper[5002]: I1209 11:32:25.558318 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f59d59797-cwc9r"] Dec 09 11:32:25 crc kubenswrapper[5002]: I1209 11:32:25.575067 5002 scope.go:117] "RemoveContainer" containerID="aa9364982bacc0ce36d6d153500797f6cce4d70f349ff406cf489cfa5ed7257d" Dec 09 11:32:25 crc kubenswrapper[5002]: E1209 11:32:25.575608 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa9364982bacc0ce36d6d153500797f6cce4d70f349ff406cf489cfa5ed7257d\": container with ID starting with aa9364982bacc0ce36d6d153500797f6cce4d70f349ff406cf489cfa5ed7257d not found: ID does not exist" containerID="aa9364982bacc0ce36d6d153500797f6cce4d70f349ff406cf489cfa5ed7257d" Dec 09 11:32:25 crc kubenswrapper[5002]: I1209 11:32:25.575652 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa9364982bacc0ce36d6d153500797f6cce4d70f349ff406cf489cfa5ed7257d"} err="failed to get container status \"aa9364982bacc0ce36d6d153500797f6cce4d70f349ff406cf489cfa5ed7257d\": rpc error: code = NotFound desc = could not find container \"aa9364982bacc0ce36d6d153500797f6cce4d70f349ff406cf489cfa5ed7257d\": container with ID starting with aa9364982bacc0ce36d6d153500797f6cce4d70f349ff406cf489cfa5ed7257d not found: ID does not exist" Dec 09 11:32:25 crc kubenswrapper[5002]: I1209 11:32:25.575684 5002 scope.go:117] "RemoveContainer" containerID="8b89733bfb6323aa3f601b0b1fc7df5bd875b1443c83d283c4dbbc99eae2c110" Dec 09 11:32:25 crc kubenswrapper[5002]: E1209 11:32:25.576290 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b89733bfb6323aa3f601b0b1fc7df5bd875b1443c83d283c4dbbc99eae2c110\": container with ID starting with 8b89733bfb6323aa3f601b0b1fc7df5bd875b1443c83d283c4dbbc99eae2c110 not found: ID does not exist" containerID="8b89733bfb6323aa3f601b0b1fc7df5bd875b1443c83d283c4dbbc99eae2c110" Dec 09 11:32:25 crc kubenswrapper[5002]: I1209 11:32:25.576339 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b89733bfb6323aa3f601b0b1fc7df5bd875b1443c83d283c4dbbc99eae2c110"} err="failed to get container status \"8b89733bfb6323aa3f601b0b1fc7df5bd875b1443c83d283c4dbbc99eae2c110\": rpc error: code = NotFound desc = could not find container \"8b89733bfb6323aa3f601b0b1fc7df5bd875b1443c83d283c4dbbc99eae2c110\": container with ID starting with 8b89733bfb6323aa3f601b0b1fc7df5bd875b1443c83d283c4dbbc99eae2c110 not found: ID does not exist" Dec 09 11:32:26 crc kubenswrapper[5002]: I1209 11:32:26.076578 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05aacd78-7157-4300-88d1-de9770dc85ee" path="/var/lib/kubelet/pods/05aacd78-7157-4300-88d1-de9770dc85ee/volumes" Dec 09 11:32:28 crc kubenswrapper[5002]: I1209 11:32:28.843374 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 09 11:32:28 crc kubenswrapper[5002]: I1209 11:32:28.844295 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 09 11:32:28 
crc kubenswrapper[5002]: I1209 11:32:28.881700 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 09 11:32:28 crc kubenswrapper[5002]: I1209 11:32:28.902453 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 09 11:32:29 crc kubenswrapper[5002]: I1209 11:32:29.556548 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 09 11:32:29 crc kubenswrapper[5002]: I1209 11:32:29.557213 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 09 11:32:30 crc kubenswrapper[5002]: I1209 11:32:30.840161 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 09 11:32:30 crc kubenswrapper[5002]: I1209 11:32:30.840210 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 09 11:32:30 crc kubenswrapper[5002]: I1209 11:32:30.869439 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 09 11:32:30 crc kubenswrapper[5002]: I1209 11:32:30.883350 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 09 11:32:31 crc kubenswrapper[5002]: I1209 11:32:31.521382 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 09 11:32:31 crc kubenswrapper[5002]: I1209 11:32:31.549932 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 09 11:32:31 crc kubenswrapper[5002]: I1209 11:32:31.576905 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 09 11:32:31 crc kubenswrapper[5002]: I1209 11:32:31.576990 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 09 11:32:32 crc kubenswrapper[5002]: I1209 11:32:32.060992 5002 scope.go:117] "RemoveContainer" containerID="24eda190128d46e2bfa806f4839b38f2462cd8acaa8816efdf9934cf2dc46679" Dec 09 11:32:32 crc kubenswrapper[5002]: E1209 11:32:32.061367 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:32:33 crc kubenswrapper[5002]: I1209 11:32:33.065982 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wnfqh"] Dec 09 11:32:33 crc kubenswrapper[5002]: E1209 11:32:33.067148 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05aacd78-7157-4300-88d1-de9770dc85ee" containerName="init" Dec 09 11:32:33 crc kubenswrapper[5002]: I1209 11:32:33.067183 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="05aacd78-7157-4300-88d1-de9770dc85ee" containerName="init" Dec 09 11:32:33 crc kubenswrapper[5002]: E1209 11:32:33.067234 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05aacd78-7157-4300-88d1-de9770dc85ee" 
containerName="dnsmasq-dns" Dec 09 11:32:33 crc kubenswrapper[5002]: I1209 11:32:33.067250 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="05aacd78-7157-4300-88d1-de9770dc85ee" containerName="dnsmasq-dns" Dec 09 11:32:33 crc kubenswrapper[5002]: I1209 11:32:33.067684 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="05aacd78-7157-4300-88d1-de9770dc85ee" containerName="dnsmasq-dns" Dec 09 11:32:33 crc kubenswrapper[5002]: I1209 11:32:33.071099 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wnfqh" Dec 09 11:32:33 crc kubenswrapper[5002]: I1209 11:32:33.075894 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wnfqh"] Dec 09 11:32:33 crc kubenswrapper[5002]: I1209 11:32:33.224529 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d53a238e-51d7-49fa-be22-3548be4a590d-catalog-content\") pod \"redhat-operators-wnfqh\" (UID: \"d53a238e-51d7-49fa-be22-3548be4a590d\") " pod="openshift-marketplace/redhat-operators-wnfqh" Dec 09 11:32:33 crc kubenswrapper[5002]: I1209 11:32:33.224579 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-286zf\" (UniqueName: \"kubernetes.io/projected/d53a238e-51d7-49fa-be22-3548be4a590d-kube-api-access-286zf\") pod \"redhat-operators-wnfqh\" (UID: \"d53a238e-51d7-49fa-be22-3548be4a590d\") " pod="openshift-marketplace/redhat-operators-wnfqh" Dec 09 11:32:33 crc kubenswrapper[5002]: I1209 11:32:33.224905 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d53a238e-51d7-49fa-be22-3548be4a590d-utilities\") pod \"redhat-operators-wnfqh\" (UID: \"d53a238e-51d7-49fa-be22-3548be4a590d\") " pod="openshift-marketplace/redhat-operators-wnfqh" Dec 09 11:32:33 crc kubenswrapper[5002]: I1209 11:32:33.327272 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d53a238e-51d7-49fa-be22-3548be4a590d-utilities\") pod \"redhat-operators-wnfqh\" (UID: \"d53a238e-51d7-49fa-be22-3548be4a590d\") " pod="openshift-marketplace/redhat-operators-wnfqh" Dec 09 11:32:33 crc kubenswrapper[5002]: I1209 11:32:33.327533 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d53a238e-51d7-49fa-be22-3548be4a590d-catalog-content\") pod \"redhat-operators-wnfqh\" (UID: \"d53a238e-51d7-49fa-be22-3548be4a590d\") " pod="openshift-marketplace/redhat-operators-wnfqh" Dec 09 11:32:33 crc kubenswrapper[5002]: I1209 11:32:33.327570 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-286zf\" (UniqueName: \"kubernetes.io/projected/d53a238e-51d7-49fa-be22-3548be4a590d-kube-api-access-286zf\") pod \"redhat-operators-wnfqh\" (UID: \"d53a238e-51d7-49fa-be22-3548be4a590d\") " pod="openshift-marketplace/redhat-operators-wnfqh" Dec 09 11:32:33 crc kubenswrapper[5002]: I1209 11:32:33.328051 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d53a238e-51d7-49fa-be22-3548be4a590d-utilities\") pod \"redhat-operators-wnfqh\" (UID: \"d53a238e-51d7-49fa-be22-3548be4a590d\") " pod="openshift-marketplace/redhat-operators-wnfqh" Dec 
09 11:32:33 crc kubenswrapper[5002]: I1209 11:32:33.328200 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d53a238e-51d7-49fa-be22-3548be4a590d-catalog-content\") pod \"redhat-operators-wnfqh\" (UID: \"d53a238e-51d7-49fa-be22-3548be4a590d\") " pod="openshift-marketplace/redhat-operators-wnfqh" Dec 09 11:32:33 crc kubenswrapper[5002]: I1209 11:32:33.346452 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-286zf\" (UniqueName: \"kubernetes.io/projected/d53a238e-51d7-49fa-be22-3548be4a590d-kube-api-access-286zf\") pod \"redhat-operators-wnfqh\" (UID: \"d53a238e-51d7-49fa-be22-3548be4a590d\") " pod="openshift-marketplace/redhat-operators-wnfqh" Dec 09 11:32:33 crc kubenswrapper[5002]: I1209 11:32:33.395113 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wnfqh" Dec 09 11:32:33 crc kubenswrapper[5002]: I1209 11:32:33.590225 5002 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 09 11:32:33 crc kubenswrapper[5002]: I1209 11:32:33.590584 5002 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 09 11:32:33 crc kubenswrapper[5002]: I1209 11:32:33.617730 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 09 11:32:33 crc kubenswrapper[5002]: I1209 11:32:33.618288 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 09 11:32:33 crc kubenswrapper[5002]: I1209 11:32:33.854255 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wnfqh"] Dec 09 11:32:34 crc kubenswrapper[5002]: I1209 11:32:34.597884 5002 generic.go:334] "Generic (PLEG): container finished" podID="d53a238e-51d7-49fa-be22-3548be4a590d" containerID="9cf713de871f9493f980662a5886a7a5fa0b2a632368339d1ceb5be6d35d55d3" exitCode=0 Dec 09 11:32:34 crc kubenswrapper[5002]: I1209 11:32:34.597974 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wnfqh" event={"ID":"d53a238e-51d7-49fa-be22-3548be4a590d","Type":"ContainerDied","Data":"9cf713de871f9493f980662a5886a7a5fa0b2a632368339d1ceb5be6d35d55d3"} Dec 09 11:32:34 crc kubenswrapper[5002]: I1209 11:32:34.598209 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wnfqh" event={"ID":"d53a238e-51d7-49fa-be22-3548be4a590d","Type":"ContainerStarted","Data":"9c5d0678e234bc79529b15d755066b3f63a67cce385270e13e05c8dbd3624943"} Dec 09 11:32:34 crc kubenswrapper[5002]: I1209 11:32:34.600446 5002 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 11:32:35 crc kubenswrapper[5002]: I1209 11:32:35.610052 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wnfqh" event={"ID":"d53a238e-51d7-49fa-be22-3548be4a590d","Type":"ContainerStarted","Data":"a514bfbe18bc2fcf49db5171f5daa6cc012512a60dfe9932db3d6b62c9c77007"} Dec 09 11:32:36 crc kubenswrapper[5002]: I1209 11:32:36.623981 5002 generic.go:334] "Generic (PLEG): container finished" podID="d53a238e-51d7-49fa-be22-3548be4a590d" containerID="a514bfbe18bc2fcf49db5171f5daa6cc012512a60dfe9932db3d6b62c9c77007" exitCode=0 Dec 09 11:32:36 crc kubenswrapper[5002]: I1209 11:32:36.624040 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-wnfqh" event={"ID":"d53a238e-51d7-49fa-be22-3548be4a590d","Type":"ContainerDied","Data":"a514bfbe18bc2fcf49db5171f5daa6cc012512a60dfe9932db3d6b62c9c77007"} Dec 09 11:32:37 crc kubenswrapper[5002]: I1209 11:32:37.635944 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wnfqh" event={"ID":"d53a238e-51d7-49fa-be22-3548be4a590d","Type":"ContainerStarted","Data":"3ccd67dcb2ed6179d354ac2b70d924d6f947b1c5320aa49ee815331927c4fd7d"} Dec 09 11:32:41 crc kubenswrapper[5002]: I1209 11:32:41.139679 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wnfqh" podStartSLOduration=5.6837490630000005 podStartE2EDuration="8.139658287s" podCreationTimestamp="2025-12-09 11:32:33 +0000 UTC" firstStartedPulling="2025-12-09 11:32:34.600229152 +0000 UTC m=+5486.992280233" lastFinishedPulling="2025-12-09 11:32:37.056138326 +0000 UTC m=+5489.448189457" observedRunningTime="2025-12-09 11:32:37.663217035 +0000 UTC m=+5490.055268126" watchObservedRunningTime="2025-12-09 11:32:41.139658287 +0000 UTC m=+5493.531709368" Dec 09 11:32:41 crc kubenswrapper[5002]: I1209 11:32:41.141543 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-5fgxf"] Dec 09 11:32:41 crc kubenswrapper[5002]: I1209 11:32:41.143477 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-5fgxf" Dec 09 11:32:41 crc kubenswrapper[5002]: I1209 11:32:41.160158 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-5fgxf"] Dec 09 11:32:41 crc kubenswrapper[5002]: I1209 11:32:41.232621 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-069c-account-create-update-gb6q6"] Dec 09 11:32:41 crc kubenswrapper[5002]: I1209 11:32:41.233651 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-069c-account-create-update-gb6q6" Dec 09 11:32:41 crc kubenswrapper[5002]: I1209 11:32:41.236074 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 09 11:32:41 crc kubenswrapper[5002]: I1209 11:32:41.241248 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-069c-account-create-update-gb6q6"] Dec 09 11:32:41 crc kubenswrapper[5002]: I1209 11:32:41.273832 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce45c659-f71a-4bc5-8a2d-d4f2e230fb0b-operator-scripts\") pod \"placement-db-create-5fgxf\" (UID: \"ce45c659-f71a-4bc5-8a2d-d4f2e230fb0b\") " pod="openstack/placement-db-create-5fgxf" Dec 09 11:32:41 crc kubenswrapper[5002]: I1209 11:32:41.273903 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8ck6\" (UniqueName: \"kubernetes.io/projected/ce45c659-f71a-4bc5-8a2d-d4f2e230fb0b-kube-api-access-g8ck6\") pod \"placement-db-create-5fgxf\" (UID: \"ce45c659-f71a-4bc5-8a2d-d4f2e230fb0b\") " pod="openstack/placement-db-create-5fgxf" Dec 09 11:32:41 crc kubenswrapper[5002]: I1209 11:32:41.377046 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bnk5\" (UniqueName: \"kubernetes.io/projected/d6b5c066-5fc5-40ba-99eb-552de9a2abfb-kube-api-access-5bnk5\") pod \"placement-069c-account-create-update-gb6q6\" (UID: \"d6b5c066-5fc5-40ba-99eb-552de9a2abfb\") " pod="openstack/placement-069c-account-create-update-gb6q6" Dec 09 11:32:41 crc kubenswrapper[5002]: I1209 11:32:41.377226 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce45c659-f71a-4bc5-8a2d-d4f2e230fb0b-operator-scripts\") pod \"placement-db-create-5fgxf\" (UID: \"ce45c659-f71a-4bc5-8a2d-d4f2e230fb0b\") " pod="openstack/placement-db-create-5fgxf" Dec 09 11:32:41 crc kubenswrapper[5002]: I1209 11:32:41.377292 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8ck6\" (UniqueName: \"kubernetes.io/projected/ce45c659-f71a-4bc5-8a2d-d4f2e230fb0b-kube-api-access-g8ck6\") pod \"placement-db-create-5fgxf\" (UID: \"ce45c659-f71a-4bc5-8a2d-d4f2e230fb0b\") " pod="openstack/placement-db-create-5fgxf" Dec 09 11:32:41 crc kubenswrapper[5002]: I1209 11:32:41.378210 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6b5c066-5fc5-40ba-99eb-552de9a2abfb-operator-scripts\") pod \"placement-069c-account-create-update-gb6q6\" (UID: \"d6b5c066-5fc5-40ba-99eb-552de9a2abfb\") " pod="openstack/placement-069c-account-create-update-gb6q6" Dec 09 11:32:41 crc kubenswrapper[5002]: I1209 11:32:41.378654 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce45c659-f71a-4bc5-8a2d-d4f2e230fb0b-operator-scripts\") pod \"placement-db-create-5fgxf\" (UID: \"ce45c659-f71a-4bc5-8a2d-d4f2e230fb0b\") " pod="openstack/placement-db-create-5fgxf" Dec 09 11:32:41 crc kubenswrapper[5002]: I1209 11:32:41.401226 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8ck6\" (UniqueName: 
\"kubernetes.io/projected/ce45c659-f71a-4bc5-8a2d-d4f2e230fb0b-kube-api-access-g8ck6\") pod \"placement-db-create-5fgxf\" (UID: \"ce45c659-f71a-4bc5-8a2d-d4f2e230fb0b\") " pod="openstack/placement-db-create-5fgxf" Dec 09 11:32:41 crc kubenswrapper[5002]: I1209 11:32:41.463190 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-5fgxf" Dec 09 11:32:41 crc kubenswrapper[5002]: I1209 11:32:41.480999 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bnk5\" (UniqueName: \"kubernetes.io/projected/d6b5c066-5fc5-40ba-99eb-552de9a2abfb-kube-api-access-5bnk5\") pod \"placement-069c-account-create-update-gb6q6\" (UID: \"d6b5c066-5fc5-40ba-99eb-552de9a2abfb\") " pod="openstack/placement-069c-account-create-update-gb6q6" Dec 09 11:32:41 crc kubenswrapper[5002]: I1209 11:32:41.481116 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6b5c066-5fc5-40ba-99eb-552de9a2abfb-operator-scripts\") pod \"placement-069c-account-create-update-gb6q6\" (UID: \"d6b5c066-5fc5-40ba-99eb-552de9a2abfb\") " pod="openstack/placement-069c-account-create-update-gb6q6" Dec 09 11:32:41 crc kubenswrapper[5002]: I1209 11:32:41.482110 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6b5c066-5fc5-40ba-99eb-552de9a2abfb-operator-scripts\") pod \"placement-069c-account-create-update-gb6q6\" (UID: \"d6b5c066-5fc5-40ba-99eb-552de9a2abfb\") " pod="openstack/placement-069c-account-create-update-gb6q6" Dec 09 11:32:41 crc kubenswrapper[5002]: I1209 11:32:41.499407 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bnk5\" (UniqueName: \"kubernetes.io/projected/d6b5c066-5fc5-40ba-99eb-552de9a2abfb-kube-api-access-5bnk5\") pod \"placement-069c-account-create-update-gb6q6\" (UID: \"d6b5c066-5fc5-40ba-99eb-552de9a2abfb\") " pod="openstack/placement-069c-account-create-update-gb6q6" Dec 09 11:32:41 crc kubenswrapper[5002]: I1209 11:32:41.549744 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-069c-account-create-update-gb6q6" Dec 09 11:32:41 crc kubenswrapper[5002]: I1209 11:32:41.927572 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-5fgxf"] Dec 09 11:32:41 crc kubenswrapper[5002]: W1209 11:32:41.930878 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce45c659_f71a_4bc5_8a2d_d4f2e230fb0b.slice/crio-66bcf7ec7acffc59fc3696e045d6dcf08e5e7fc6d88950468e28fd46e0baedb4 WatchSource:0}: Error finding container 66bcf7ec7acffc59fc3696e045d6dcf08e5e7fc6d88950468e28fd46e0baedb4: Status 404 returned error can't find the container with id 66bcf7ec7acffc59fc3696e045d6dcf08e5e7fc6d88950468e28fd46e0baedb4 Dec 09 11:32:42 crc kubenswrapper[5002]: W1209 11:32:42.063592 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6b5c066_5fc5_40ba_99eb_552de9a2abfb.slice/crio-b9c220a87c92989dcbb713939ab9860595518fa2551fb19caa7d87a549dde773 WatchSource:0}: Error finding container b9c220a87c92989dcbb713939ab9860595518fa2551fb19caa7d87a549dde773: Status 404 returned error can't find the container with id b9c220a87c92989dcbb713939ab9860595518fa2551fb19caa7d87a549dde773 Dec 09 11:32:42 crc kubenswrapper[5002]: I1209 11:32:42.075026 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-069c-account-create-update-gb6q6"] Dec 09 11:32:42 crc kubenswrapper[5002]: I1209 11:32:42.685604 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-069c-account-create-update-gb6q6" event={"ID":"d6b5c066-5fc5-40ba-99eb-552de9a2abfb","Type":"ContainerStarted","Data":"b9c220a87c92989dcbb713939ab9860595518fa2551fb19caa7d87a549dde773"} Dec 09 11:32:42 crc kubenswrapper[5002]: I1209 11:32:42.686953 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-5fgxf" event={"ID":"ce45c659-f71a-4bc5-8a2d-d4f2e230fb0b","Type":"ContainerStarted","Data":"66bcf7ec7acffc59fc3696e045d6dcf08e5e7fc6d88950468e28fd46e0baedb4"} Dec 09 11:32:43 crc kubenswrapper[5002]: I1209 11:32:43.395543 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wnfqh" Dec 09 11:32:43 crc kubenswrapper[5002]: I1209 11:32:43.395670 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wnfqh" Dec 09 11:32:43 crc kubenswrapper[5002]: I1209 11:32:43.479177 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wnfqh" Dec 09 11:32:43 crc kubenswrapper[5002]: I1209 11:32:43.769212 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wnfqh" Dec 09 11:32:44 crc kubenswrapper[5002]: I1209 11:32:44.721760 5002 generic.go:334] "Generic (PLEG): container finished" podID="ce45c659-f71a-4bc5-8a2d-d4f2e230fb0b" containerID="0e20ac414f2b1aa76e45c85921a3e6fbb34f3ea393422301c6b27ff7b365452f" exitCode=0 Dec 09 11:32:44 crc kubenswrapper[5002]: I1209 11:32:44.721838 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-5fgxf" event={"ID":"ce45c659-f71a-4bc5-8a2d-d4f2e230fb0b","Type":"ContainerDied","Data":"0e20ac414f2b1aa76e45c85921a3e6fbb34f3ea393422301c6b27ff7b365452f"} Dec 09 11:32:44 crc kubenswrapper[5002]: I1209 11:32:44.724418 5002 generic.go:334] "Generic 
(PLEG): container finished" podID="d6b5c066-5fc5-40ba-99eb-552de9a2abfb" containerID="5fccb647da094d58682f4b945b3f22efb76875fa4fe0a6b9c9294851c21f4deb" exitCode=0 Dec 09 11:32:44 crc kubenswrapper[5002]: I1209 11:32:44.724480 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-069c-account-create-update-gb6q6" event={"ID":"d6b5c066-5fc5-40ba-99eb-552de9a2abfb","Type":"ContainerDied","Data":"5fccb647da094d58682f4b945b3f22efb76875fa4fe0a6b9c9294851c21f4deb"} Dec 09 11:32:44 crc kubenswrapper[5002]: I1209 11:32:44.833417 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wnfqh"] Dec 09 11:32:45 crc kubenswrapper[5002]: I1209 11:32:45.736012 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wnfqh" podUID="d53a238e-51d7-49fa-be22-3548be4a590d" containerName="registry-server" containerID="cri-o://3ccd67dcb2ed6179d354ac2b70d924d6f947b1c5320aa49ee815331927c4fd7d" gracePeriod=2 Dec 09 11:32:46 crc kubenswrapper[5002]: I1209 11:32:46.060695 5002 scope.go:117] "RemoveContainer" containerID="24eda190128d46e2bfa806f4839b38f2462cd8acaa8816efdf9934cf2dc46679" Dec 09 11:32:46 crc kubenswrapper[5002]: E1209 11:32:46.061779 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:32:46 crc kubenswrapper[5002]: I1209 11:32:46.072108 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-069c-account-create-update-gb6q6" Dec 09 11:32:46 crc kubenswrapper[5002]: I1209 11:32:46.194108 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bnk5\" (UniqueName: \"kubernetes.io/projected/d6b5c066-5fc5-40ba-99eb-552de9a2abfb-kube-api-access-5bnk5\") pod \"d6b5c066-5fc5-40ba-99eb-552de9a2abfb\" (UID: \"d6b5c066-5fc5-40ba-99eb-552de9a2abfb\") " Dec 09 11:32:46 crc kubenswrapper[5002]: I1209 11:32:46.194259 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6b5c066-5fc5-40ba-99eb-552de9a2abfb-operator-scripts\") pod \"d6b5c066-5fc5-40ba-99eb-552de9a2abfb\" (UID: \"d6b5c066-5fc5-40ba-99eb-552de9a2abfb\") " Dec 09 11:32:46 crc kubenswrapper[5002]: I1209 11:32:46.196642 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6b5c066-5fc5-40ba-99eb-552de9a2abfb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d6b5c066-5fc5-40ba-99eb-552de9a2abfb" (UID: "d6b5c066-5fc5-40ba-99eb-552de9a2abfb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:46 crc kubenswrapper[5002]: I1209 11:32:46.205583 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6b5c066-5fc5-40ba-99eb-552de9a2abfb-kube-api-access-5bnk5" (OuterVolumeSpecName: "kube-api-access-5bnk5") pod "d6b5c066-5fc5-40ba-99eb-552de9a2abfb" (UID: "d6b5c066-5fc5-40ba-99eb-552de9a2abfb"). InnerVolumeSpecName "kube-api-access-5bnk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:46 crc kubenswrapper[5002]: I1209 11:32:46.263617 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-5fgxf" Dec 09 11:32:46 crc kubenswrapper[5002]: I1209 11:32:46.273132 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wnfqh" Dec 09 11:32:46 crc kubenswrapper[5002]: I1209 11:32:46.298342 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6b5c066-5fc5-40ba-99eb-552de9a2abfb-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:46 crc kubenswrapper[5002]: I1209 11:32:46.298392 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bnk5\" (UniqueName: \"kubernetes.io/projected/d6b5c066-5fc5-40ba-99eb-552de9a2abfb-kube-api-access-5bnk5\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:46 crc kubenswrapper[5002]: I1209 11:32:46.307082 5002 scope.go:117] "RemoveContainer" containerID="0f7acec706fd6888f9f7eabc86bf59fe77a708e3db8637ae0bf85e2bcb4c655a" Dec 09 11:32:46 crc kubenswrapper[5002]: I1209 11:32:46.329513 5002 scope.go:117] "RemoveContainer" containerID="29833e54ea38c9e6dd2ba4cc42962c851d100570e218478b0dc10f9edc718553" Dec 09 11:32:46 crc kubenswrapper[5002]: I1209 11:32:46.399912 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d53a238e-51d7-49fa-be22-3548be4a590d-catalog-content\") pod \"d53a238e-51d7-49fa-be22-3548be4a590d\" (UID: \"d53a238e-51d7-49fa-be22-3548be4a590d\") " Dec 09 11:32:46 crc kubenswrapper[5002]: I1209 11:32:46.400007 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8ck6\" (UniqueName: \"kubernetes.io/projected/ce45c659-f71a-4bc5-8a2d-d4f2e230fb0b-kube-api-access-g8ck6\") pod \"ce45c659-f71a-4bc5-8a2d-d4f2e230fb0b\" (UID: \"ce45c659-f71a-4bc5-8a2d-d4f2e230fb0b\") " Dec 09 11:32:46 crc kubenswrapper[5002]: I1209 11:32:46.400134 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d53a238e-51d7-49fa-be22-3548be4a590d-utilities\") pod \"d53a238e-51d7-49fa-be22-3548be4a590d\" (UID: \"d53a238e-51d7-49fa-be22-3548be4a590d\") " Dec 09 11:32:46 crc kubenswrapper[5002]: I1209 11:32:46.400971 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d53a238e-51d7-49fa-be22-3548be4a590d-utilities" (OuterVolumeSpecName: "utilities") pod "d53a238e-51d7-49fa-be22-3548be4a590d" (UID: "d53a238e-51d7-49fa-be22-3548be4a590d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:32:46 crc kubenswrapper[5002]: I1209 11:32:46.401100 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-286zf\" (UniqueName: \"kubernetes.io/projected/d53a238e-51d7-49fa-be22-3548be4a590d-kube-api-access-286zf\") pod \"d53a238e-51d7-49fa-be22-3548be4a590d\" (UID: \"d53a238e-51d7-49fa-be22-3548be4a590d\") " Dec 09 11:32:46 crc kubenswrapper[5002]: I1209 11:32:46.401265 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce45c659-f71a-4bc5-8a2d-d4f2e230fb0b-operator-scripts\") pod \"ce45c659-f71a-4bc5-8a2d-d4f2e230fb0b\" (UID: \"ce45c659-f71a-4bc5-8a2d-d4f2e230fb0b\") " Dec 09 11:32:46 crc kubenswrapper[5002]: I1209 11:32:46.401791 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce45c659-f71a-4bc5-8a2d-d4f2e230fb0b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ce45c659-f71a-4bc5-8a2d-d4f2e230fb0b" (UID: "ce45c659-f71a-4bc5-8a2d-d4f2e230fb0b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:46 crc kubenswrapper[5002]: I1209 11:32:46.402012 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d53a238e-51d7-49fa-be22-3548be4a590d-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:46 crc kubenswrapper[5002]: I1209 11:32:46.402038 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce45c659-f71a-4bc5-8a2d-d4f2e230fb0b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:46 crc kubenswrapper[5002]: I1209 11:32:46.405541 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce45c659-f71a-4bc5-8a2d-d4f2e230fb0b-kube-api-access-g8ck6" (OuterVolumeSpecName: "kube-api-access-g8ck6") pod "ce45c659-f71a-4bc5-8a2d-d4f2e230fb0b" (UID: "ce45c659-f71a-4bc5-8a2d-d4f2e230fb0b"). InnerVolumeSpecName "kube-api-access-g8ck6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:46 crc kubenswrapper[5002]: I1209 11:32:46.410673 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d53a238e-51d7-49fa-be22-3548be4a590d-kube-api-access-286zf" (OuterVolumeSpecName: "kube-api-access-286zf") pod "d53a238e-51d7-49fa-be22-3548be4a590d" (UID: "d53a238e-51d7-49fa-be22-3548be4a590d"). InnerVolumeSpecName "kube-api-access-286zf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:46 crc kubenswrapper[5002]: I1209 11:32:46.504227 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8ck6\" (UniqueName: \"kubernetes.io/projected/ce45c659-f71a-4bc5-8a2d-d4f2e230fb0b-kube-api-access-g8ck6\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:46 crc kubenswrapper[5002]: I1209 11:32:46.504290 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-286zf\" (UniqueName: \"kubernetes.io/projected/d53a238e-51d7-49fa-be22-3548be4a590d-kube-api-access-286zf\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:46 crc kubenswrapper[5002]: I1209 11:32:46.747140 5002 generic.go:334] "Generic (PLEG): container finished" podID="d53a238e-51d7-49fa-be22-3548be4a590d" containerID="3ccd67dcb2ed6179d354ac2b70d924d6f947b1c5320aa49ee815331927c4fd7d" exitCode=0 Dec 09 11:32:46 crc kubenswrapper[5002]: I1209 11:32:46.747178 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wnfqh" Dec 09 11:32:46 crc kubenswrapper[5002]: I1209 11:32:46.747213 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wnfqh" event={"ID":"d53a238e-51d7-49fa-be22-3548be4a590d","Type":"ContainerDied","Data":"3ccd67dcb2ed6179d354ac2b70d924d6f947b1c5320aa49ee815331927c4fd7d"} Dec 09 11:32:46 crc kubenswrapper[5002]: I1209 11:32:46.747310 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wnfqh" event={"ID":"d53a238e-51d7-49fa-be22-3548be4a590d","Type":"ContainerDied","Data":"9c5d0678e234bc79529b15d755066b3f63a67cce385270e13e05c8dbd3624943"} Dec 09 11:32:46 crc kubenswrapper[5002]: I1209 11:32:46.747371 5002 scope.go:117] "RemoveContainer" containerID="3ccd67dcb2ed6179d354ac2b70d924d6f947b1c5320aa49ee815331927c4fd7d" Dec 09 11:32:46 crc kubenswrapper[5002]: I1209 11:32:46.753219 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-069c-account-create-update-gb6q6" event={"ID":"d6b5c066-5fc5-40ba-99eb-552de9a2abfb","Type":"ContainerDied","Data":"b9c220a87c92989dcbb713939ab9860595518fa2551fb19caa7d87a549dde773"} Dec 09 11:32:46 crc kubenswrapper[5002]: I1209 11:32:46.753247 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9c220a87c92989dcbb713939ab9860595518fa2551fb19caa7d87a549dde773" Dec 09 11:32:46 crc kubenswrapper[5002]: I1209 11:32:46.753313 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-069c-account-create-update-gb6q6" Dec 09 11:32:46 crc kubenswrapper[5002]: I1209 11:32:46.755366 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-5fgxf" event={"ID":"ce45c659-f71a-4bc5-8a2d-d4f2e230fb0b","Type":"ContainerDied","Data":"66bcf7ec7acffc59fc3696e045d6dcf08e5e7fc6d88950468e28fd46e0baedb4"} Dec 09 11:32:46 crc kubenswrapper[5002]: I1209 11:32:46.755389 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66bcf7ec7acffc59fc3696e045d6dcf08e5e7fc6d88950468e28fd46e0baedb4" Dec 09 11:32:46 crc kubenswrapper[5002]: I1209 11:32:46.755533 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-5fgxf" Dec 09 11:32:46 crc kubenswrapper[5002]: I1209 11:32:46.786683 5002 scope.go:117] "RemoveContainer" containerID="a514bfbe18bc2fcf49db5171f5daa6cc012512a60dfe9932db3d6b62c9c77007" Dec 09 11:32:46 crc kubenswrapper[5002]: I1209 11:32:46.820747 5002 scope.go:117] "RemoveContainer" containerID="9cf713de871f9493f980662a5886a7a5fa0b2a632368339d1ceb5be6d35d55d3" Dec 09 11:32:46 crc kubenswrapper[5002]: I1209 11:32:46.842135 5002 scope.go:117] "RemoveContainer" containerID="3ccd67dcb2ed6179d354ac2b70d924d6f947b1c5320aa49ee815331927c4fd7d" Dec 09 11:32:46 crc kubenswrapper[5002]: E1209 11:32:46.842801 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ccd67dcb2ed6179d354ac2b70d924d6f947b1c5320aa49ee815331927c4fd7d\": container with ID starting with 3ccd67dcb2ed6179d354ac2b70d924d6f947b1c5320aa49ee815331927c4fd7d not found: ID does not exist" containerID="3ccd67dcb2ed6179d354ac2b70d924d6f947b1c5320aa49ee815331927c4fd7d" Dec 09 11:32:46 crc kubenswrapper[5002]: I1209 11:32:46.842906 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ccd67dcb2ed6179d354ac2b70d924d6f947b1c5320aa49ee815331927c4fd7d"} err="failed to get container status \"3ccd67dcb2ed6179d354ac2b70d924d6f947b1c5320aa49ee815331927c4fd7d\": rpc error: code = NotFound desc = could not find container \"3ccd67dcb2ed6179d354ac2b70d924d6f947b1c5320aa49ee815331927c4fd7d\": container with ID starting with 3ccd67dcb2ed6179d354ac2b70d924d6f947b1c5320aa49ee815331927c4fd7d not found: ID does not exist" Dec 09 11:32:46 crc kubenswrapper[5002]: I1209 11:32:46.842964 5002 scope.go:117] "RemoveContainer" containerID="a514bfbe18bc2fcf49db5171f5daa6cc012512a60dfe9932db3d6b62c9c77007" Dec 09 11:32:46 crc kubenswrapper[5002]: E1209 11:32:46.843350 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a514bfbe18bc2fcf49db5171f5daa6cc012512a60dfe9932db3d6b62c9c77007\": container with ID starting with a514bfbe18bc2fcf49db5171f5daa6cc012512a60dfe9932db3d6b62c9c77007 not found: ID does not exist" containerID="a514bfbe18bc2fcf49db5171f5daa6cc012512a60dfe9932db3d6b62c9c77007" Dec 09 11:32:46 crc kubenswrapper[5002]: I1209 11:32:46.843383 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a514bfbe18bc2fcf49db5171f5daa6cc012512a60dfe9932db3d6b62c9c77007"} err="failed to get container status \"a514bfbe18bc2fcf49db5171f5daa6cc012512a60dfe9932db3d6b62c9c77007\": rpc error: code = NotFound desc = could not find container \"a514bfbe18bc2fcf49db5171f5daa6cc012512a60dfe9932db3d6b62c9c77007\": container with ID starting with a514bfbe18bc2fcf49db5171f5daa6cc012512a60dfe9932db3d6b62c9c77007 not found: ID does not exist" Dec 09 11:32:46 crc kubenswrapper[5002]: I1209 11:32:46.843409 5002 scope.go:117] "RemoveContainer" containerID="9cf713de871f9493f980662a5886a7a5fa0b2a632368339d1ceb5be6d35d55d3" Dec 09 11:32:46 crc kubenswrapper[5002]: E1209 11:32:46.843780 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cf713de871f9493f980662a5886a7a5fa0b2a632368339d1ceb5be6d35d55d3\": container with ID starting with 9cf713de871f9493f980662a5886a7a5fa0b2a632368339d1ceb5be6d35d55d3 not found: ID does not exist" containerID="9cf713de871f9493f980662a5886a7a5fa0b2a632368339d1ceb5be6d35d55d3" Dec 09 
11:32:46 crc kubenswrapper[5002]: I1209 11:32:46.843852 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cf713de871f9493f980662a5886a7a5fa0b2a632368339d1ceb5be6d35d55d3"} err="failed to get container status \"9cf713de871f9493f980662a5886a7a5fa0b2a632368339d1ceb5be6d35d55d3\": rpc error: code = NotFound desc = could not find container \"9cf713de871f9493f980662a5886a7a5fa0b2a632368339d1ceb5be6d35d55d3\": container with ID starting with 9cf713de871f9493f980662a5886a7a5fa0b2a632368339d1ceb5be6d35d55d3 not found: ID does not exist" Dec 09 11:32:47 crc kubenswrapper[5002]: I1209 11:32:47.506876 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d53a238e-51d7-49fa-be22-3548be4a590d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d53a238e-51d7-49fa-be22-3548be4a590d" (UID: "d53a238e-51d7-49fa-be22-3548be4a590d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:32:47 crc kubenswrapper[5002]: I1209 11:32:47.527700 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d53a238e-51d7-49fa-be22-3548be4a590d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:47 crc kubenswrapper[5002]: I1209 11:32:47.689700 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wnfqh"] Dec 09 11:32:47 crc kubenswrapper[5002]: I1209 11:32:47.698650 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wnfqh"] Dec 09 11:32:48 crc kubenswrapper[5002]: I1209 11:32:48.083066 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d53a238e-51d7-49fa-be22-3548be4a590d" path="/var/lib/kubelet/pods/d53a238e-51d7-49fa-be22-3548be4a590d/volumes" Dec 09 11:32:51 crc kubenswrapper[5002]: I1209 11:32:51.514341 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-zwq82"] Dec 09 11:32:51 crc kubenswrapper[5002]: E1209 11:32:51.517859 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d53a238e-51d7-49fa-be22-3548be4a590d" containerName="extract-content" Dec 09 11:32:51 crc kubenswrapper[5002]: I1209 11:32:51.518027 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="d53a238e-51d7-49fa-be22-3548be4a590d" containerName="extract-content" Dec 09 11:32:51 crc kubenswrapper[5002]: E1209 11:32:51.518118 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d53a238e-51d7-49fa-be22-3548be4a590d" containerName="registry-server" Dec 09 11:32:51 crc kubenswrapper[5002]: I1209 11:32:51.518183 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="d53a238e-51d7-49fa-be22-3548be4a590d" containerName="registry-server" Dec 09 11:32:51 crc kubenswrapper[5002]: E1209 11:32:51.518255 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d53a238e-51d7-49fa-be22-3548be4a590d" containerName="extract-utilities" Dec 09 11:32:51 crc kubenswrapper[5002]: I1209 11:32:51.518319 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="d53a238e-51d7-49fa-be22-3548be4a590d" containerName="extract-utilities" Dec 09 11:32:51 crc kubenswrapper[5002]: E1209 11:32:51.518382 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce45c659-f71a-4bc5-8a2d-d4f2e230fb0b" containerName="mariadb-database-create" Dec 09 11:32:51 crc kubenswrapper[5002]: I1209 11:32:51.518437 5002 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="ce45c659-f71a-4bc5-8a2d-d4f2e230fb0b" containerName="mariadb-database-create" Dec 09 11:32:51 crc kubenswrapper[5002]: E1209 11:32:51.518496 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6b5c066-5fc5-40ba-99eb-552de9a2abfb" containerName="mariadb-account-create-update" Dec 09 11:32:51 crc kubenswrapper[5002]: I1209 11:32:51.518556 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6b5c066-5fc5-40ba-99eb-552de9a2abfb" containerName="mariadb-account-create-update" Dec 09 11:32:51 crc kubenswrapper[5002]: I1209 11:32:51.518887 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6b5c066-5fc5-40ba-99eb-552de9a2abfb" containerName="mariadb-account-create-update" Dec 09 11:32:51 crc kubenswrapper[5002]: I1209 11:32:51.518999 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="d53a238e-51d7-49fa-be22-3548be4a590d" containerName="registry-server" Dec 09 11:32:51 crc kubenswrapper[5002]: I1209 11:32:51.519066 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce45c659-f71a-4bc5-8a2d-d4f2e230fb0b" containerName="mariadb-database-create" Dec 09 11:32:51 crc kubenswrapper[5002]: I1209 11:32:51.519725 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-zwq82" Dec 09 11:32:51 crc kubenswrapper[5002]: I1209 11:32:51.527337 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-v7blq" Dec 09 11:32:51 crc kubenswrapper[5002]: I1209 11:32:51.527587 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 09 11:32:51 crc kubenswrapper[5002]: I1209 11:32:51.527708 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 09 11:32:51 crc kubenswrapper[5002]: I1209 11:32:51.533139 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-zwq82"] Dec 09 11:32:51 crc kubenswrapper[5002]: I1209 11:32:51.546265 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74df65d56c-l9t2c"] Dec 09 11:32:51 crc kubenswrapper[5002]: I1209 11:32:51.547851 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74df65d56c-l9t2c" Dec 09 11:32:51 crc kubenswrapper[5002]: I1209 11:32:51.581983 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74df65d56c-l9t2c"] Dec 09 11:32:51 crc kubenswrapper[5002]: I1209 11:32:51.610532 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c13469b-99fb-4eac-b05f-ec9c7f806022-logs\") pod \"placement-db-sync-zwq82\" (UID: \"3c13469b-99fb-4eac-b05f-ec9c7f806022\") " pod="openstack/placement-db-sync-zwq82" Dec 09 11:32:51 crc kubenswrapper[5002]: I1209 11:32:51.610586 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c13469b-99fb-4eac-b05f-ec9c7f806022-scripts\") pod \"placement-db-sync-zwq82\" (UID: \"3c13469b-99fb-4eac-b05f-ec9c7f806022\") " pod="openstack/placement-db-sync-zwq82" Dec 09 11:32:51 crc kubenswrapper[5002]: I1209 11:32:51.610629 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c13469b-99fb-4eac-b05f-ec9c7f806022-combined-ca-bundle\") pod \"placement-db-sync-zwq82\" (UID: \"3c13469b-99fb-4eac-b05f-ec9c7f806022\") " pod="openstack/placement-db-sync-zwq82" Dec 09 11:32:51 crc kubenswrapper[5002]: I1209 11:32:51.610798 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c13469b-99fb-4eac-b05f-ec9c7f806022-config-data\") pod \"placement-db-sync-zwq82\" (UID: \"3c13469b-99fb-4eac-b05f-ec9c7f806022\") " pod="openstack/placement-db-sync-zwq82" Dec 09 11:32:51 crc kubenswrapper[5002]: I1209 11:32:51.610878 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt4k4\" (UniqueName: \"kubernetes.io/projected/3c13469b-99fb-4eac-b05f-ec9c7f806022-kube-api-access-pt4k4\") pod \"placement-db-sync-zwq82\" (UID: \"3c13469b-99fb-4eac-b05f-ec9c7f806022\") " pod="openstack/placement-db-sync-zwq82" Dec 09 11:32:51 crc kubenswrapper[5002]: I1209 11:32:51.713860 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/16eec812-f54c-47c6-a589-678dde276c4a-ovsdbserver-sb\") pod \"dnsmasq-dns-74df65d56c-l9t2c\" (UID: \"16eec812-f54c-47c6-a589-678dde276c4a\") " pod="openstack/dnsmasq-dns-74df65d56c-l9t2c" Dec 09 11:32:51 crc kubenswrapper[5002]: I1209 11:32:51.713924 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c13469b-99fb-4eac-b05f-ec9c7f806022-combined-ca-bundle\") pod \"placement-db-sync-zwq82\" (UID: \"3c13469b-99fb-4eac-b05f-ec9c7f806022\") " pod="openstack/placement-db-sync-zwq82" Dec 09 11:32:51 crc kubenswrapper[5002]: I1209 11:32:51.713980 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj4z8\" (UniqueName: \"kubernetes.io/projected/16eec812-f54c-47c6-a589-678dde276c4a-kube-api-access-bj4z8\") pod \"dnsmasq-dns-74df65d56c-l9t2c\" (UID: \"16eec812-f54c-47c6-a589-678dde276c4a\") " pod="openstack/dnsmasq-dns-74df65d56c-l9t2c" Dec 09 11:32:51 crc kubenswrapper[5002]: I1209 11:32:51.714004 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/3c13469b-99fb-4eac-b05f-ec9c7f806022-config-data\") pod \"placement-db-sync-zwq82\" (UID: \"3c13469b-99fb-4eac-b05f-ec9c7f806022\") " pod="openstack/placement-db-sync-zwq82" Dec 09 11:32:51 crc kubenswrapper[5002]: I1209 11:32:51.714032 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt4k4\" (UniqueName: \"kubernetes.io/projected/3c13469b-99fb-4eac-b05f-ec9c7f806022-kube-api-access-pt4k4\") pod \"placement-db-sync-zwq82\" (UID: \"3c13469b-99fb-4eac-b05f-ec9c7f806022\") " pod="openstack/placement-db-sync-zwq82" Dec 09 11:32:51 crc kubenswrapper[5002]: I1209 11:32:51.714113 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/16eec812-f54c-47c6-a589-678dde276c4a-dns-svc\") pod \"dnsmasq-dns-74df65d56c-l9t2c\" (UID: \"16eec812-f54c-47c6-a589-678dde276c4a\") " pod="openstack/dnsmasq-dns-74df65d56c-l9t2c" Dec 09 11:32:51 crc kubenswrapper[5002]: I1209 11:32:51.714159 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c13469b-99fb-4eac-b05f-ec9c7f806022-logs\") pod \"placement-db-sync-zwq82\" (UID: \"3c13469b-99fb-4eac-b05f-ec9c7f806022\") " pod="openstack/placement-db-sync-zwq82" Dec 09 11:32:51 crc kubenswrapper[5002]: I1209 11:32:51.714190 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/16eec812-f54c-47c6-a589-678dde276c4a-ovsdbserver-nb\") pod \"dnsmasq-dns-74df65d56c-l9t2c\" (UID: \"16eec812-f54c-47c6-a589-678dde276c4a\") " pod="openstack/dnsmasq-dns-74df65d56c-l9t2c" Dec 09 11:32:51 crc kubenswrapper[5002]: I1209 11:32:51.714227 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c13469b-99fb-4eac-b05f-ec9c7f806022-scripts\") pod \"placement-db-sync-zwq82\" (UID: \"3c13469b-99fb-4eac-b05f-ec9c7f806022\") " pod="openstack/placement-db-sync-zwq82" Dec 09 11:32:51 crc kubenswrapper[5002]: I1209 11:32:51.714251 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16eec812-f54c-47c6-a589-678dde276c4a-config\") pod \"dnsmasq-dns-74df65d56c-l9t2c\" (UID: \"16eec812-f54c-47c6-a589-678dde276c4a\") " pod="openstack/dnsmasq-dns-74df65d56c-l9t2c" Dec 09 11:32:51 crc kubenswrapper[5002]: I1209 11:32:51.715581 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c13469b-99fb-4eac-b05f-ec9c7f806022-logs\") pod \"placement-db-sync-zwq82\" (UID: \"3c13469b-99fb-4eac-b05f-ec9c7f806022\") " pod="openstack/placement-db-sync-zwq82" Dec 09 11:32:51 crc kubenswrapper[5002]: I1209 11:32:51.722080 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c13469b-99fb-4eac-b05f-ec9c7f806022-config-data\") pod \"placement-db-sync-zwq82\" (UID: \"3c13469b-99fb-4eac-b05f-ec9c7f806022\") " pod="openstack/placement-db-sync-zwq82" Dec 09 11:32:51 crc kubenswrapper[5002]: I1209 11:32:51.724075 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c13469b-99fb-4eac-b05f-ec9c7f806022-scripts\") pod \"placement-db-sync-zwq82\" (UID: 
\"3c13469b-99fb-4eac-b05f-ec9c7f806022\") " pod="openstack/placement-db-sync-zwq82" Dec 09 11:32:51 crc kubenswrapper[5002]: I1209 11:32:51.725456 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c13469b-99fb-4eac-b05f-ec9c7f806022-combined-ca-bundle\") pod \"placement-db-sync-zwq82\" (UID: \"3c13469b-99fb-4eac-b05f-ec9c7f806022\") " pod="openstack/placement-db-sync-zwq82" Dec 09 11:32:51 crc kubenswrapper[5002]: I1209 11:32:51.732805 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt4k4\" (UniqueName: \"kubernetes.io/projected/3c13469b-99fb-4eac-b05f-ec9c7f806022-kube-api-access-pt4k4\") pod \"placement-db-sync-zwq82\" (UID: \"3c13469b-99fb-4eac-b05f-ec9c7f806022\") " pod="openstack/placement-db-sync-zwq82" Dec 09 11:32:51 crc kubenswrapper[5002]: I1209 11:32:51.816381 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/16eec812-f54c-47c6-a589-678dde276c4a-ovsdbserver-nb\") pod \"dnsmasq-dns-74df65d56c-l9t2c\" (UID: \"16eec812-f54c-47c6-a589-678dde276c4a\") " pod="openstack/dnsmasq-dns-74df65d56c-l9t2c" Dec 09 11:32:51 crc kubenswrapper[5002]: I1209 11:32:51.817717 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/16eec812-f54c-47c6-a589-678dde276c4a-ovsdbserver-nb\") pod \"dnsmasq-dns-74df65d56c-l9t2c\" (UID: \"16eec812-f54c-47c6-a589-678dde276c4a\") " pod="openstack/dnsmasq-dns-74df65d56c-l9t2c" Dec 09 11:32:51 crc kubenswrapper[5002]: I1209 11:32:51.817893 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16eec812-f54c-47c6-a589-678dde276c4a-config\") pod \"dnsmasq-dns-74df65d56c-l9t2c\" (UID: \"16eec812-f54c-47c6-a589-678dde276c4a\") " pod="openstack/dnsmasq-dns-74df65d56c-l9t2c" Dec 09 11:32:51 crc kubenswrapper[5002]: I1209 11:32:51.818627 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16eec812-f54c-47c6-a589-678dde276c4a-config\") pod \"dnsmasq-dns-74df65d56c-l9t2c\" (UID: \"16eec812-f54c-47c6-a589-678dde276c4a\") " pod="openstack/dnsmasq-dns-74df65d56c-l9t2c" Dec 09 11:32:51 crc kubenswrapper[5002]: I1209 11:32:51.818755 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/16eec812-f54c-47c6-a589-678dde276c4a-ovsdbserver-sb\") pod \"dnsmasq-dns-74df65d56c-l9t2c\" (UID: \"16eec812-f54c-47c6-a589-678dde276c4a\") " pod="openstack/dnsmasq-dns-74df65d56c-l9t2c" Dec 09 11:32:51 crc kubenswrapper[5002]: I1209 11:32:51.819476 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/16eec812-f54c-47c6-a589-678dde276c4a-ovsdbserver-sb\") pod \"dnsmasq-dns-74df65d56c-l9t2c\" (UID: \"16eec812-f54c-47c6-a589-678dde276c4a\") " pod="openstack/dnsmasq-dns-74df65d56c-l9t2c" Dec 09 11:32:51 crc kubenswrapper[5002]: I1209 11:32:51.819645 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bj4z8\" (UniqueName: \"kubernetes.io/projected/16eec812-f54c-47c6-a589-678dde276c4a-kube-api-access-bj4z8\") pod \"dnsmasq-dns-74df65d56c-l9t2c\" (UID: \"16eec812-f54c-47c6-a589-678dde276c4a\") " pod="openstack/dnsmasq-dns-74df65d56c-l9t2c" Dec 09 11:32:51 crc 
kubenswrapper[5002]: I1209 11:32:51.820237 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/16eec812-f54c-47c6-a589-678dde276c4a-dns-svc\") pod \"dnsmasq-dns-74df65d56c-l9t2c\" (UID: \"16eec812-f54c-47c6-a589-678dde276c4a\") " pod="openstack/dnsmasq-dns-74df65d56c-l9t2c" Dec 09 11:32:51 crc kubenswrapper[5002]: I1209 11:32:51.820932 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/16eec812-f54c-47c6-a589-678dde276c4a-dns-svc\") pod \"dnsmasq-dns-74df65d56c-l9t2c\" (UID: \"16eec812-f54c-47c6-a589-678dde276c4a\") " pod="openstack/dnsmasq-dns-74df65d56c-l9t2c" Dec 09 11:32:51 crc kubenswrapper[5002]: I1209 11:32:51.837276 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj4z8\" (UniqueName: \"kubernetes.io/projected/16eec812-f54c-47c6-a589-678dde276c4a-kube-api-access-bj4z8\") pod \"dnsmasq-dns-74df65d56c-l9t2c\" (UID: \"16eec812-f54c-47c6-a589-678dde276c4a\") " pod="openstack/dnsmasq-dns-74df65d56c-l9t2c" Dec 09 11:32:51 crc kubenswrapper[5002]: I1209 11:32:51.841368 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-zwq82" Dec 09 11:32:51 crc kubenswrapper[5002]: I1209 11:32:51.874832 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74df65d56c-l9t2c" Dec 09 11:32:52 crc kubenswrapper[5002]: I1209 11:32:52.311673 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-zwq82"] Dec 09 11:32:52 crc kubenswrapper[5002]: I1209 11:32:52.412592 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74df65d56c-l9t2c"] Dec 09 11:32:52 crc kubenswrapper[5002]: W1209 11:32:52.415651 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16eec812_f54c_47c6_a589_678dde276c4a.slice/crio-40adc1e7b694f9eced2f615e323391ea074c4ef56a092ea0bdbd9505e373ce28 WatchSource:0}: Error finding container 40adc1e7b694f9eced2f615e323391ea074c4ef56a092ea0bdbd9505e373ce28: Status 404 returned error can't find the container with id 40adc1e7b694f9eced2f615e323391ea074c4ef56a092ea0bdbd9505e373ce28 Dec 09 11:32:52 crc kubenswrapper[5002]: I1209 11:32:52.817488 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zwq82" event={"ID":"3c13469b-99fb-4eac-b05f-ec9c7f806022","Type":"ContainerStarted","Data":"52242b04f97709bd7e2fb3c8f9c7e6ca7de31a5c540d28fcb628875993952497"} Dec 09 11:32:52 crc kubenswrapper[5002]: I1209 11:32:52.820901 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74df65d56c-l9t2c" event={"ID":"16eec812-f54c-47c6-a589-678dde276c4a","Type":"ContainerStarted","Data":"40adc1e7b694f9eced2f615e323391ea074c4ef56a092ea0bdbd9505e373ce28"} Dec 09 11:32:54 crc kubenswrapper[5002]: I1209 11:32:54.847847 5002 generic.go:334] "Generic (PLEG): container finished" podID="16eec812-f54c-47c6-a589-678dde276c4a" containerID="43cedbe4b3b104d834d8e2e8d92dfc163b26dcfd6ecd5f47edaf171e0619e046" exitCode=0 Dec 09 11:32:54 crc kubenswrapper[5002]: I1209 11:32:54.847960 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74df65d56c-l9t2c" event={"ID":"16eec812-f54c-47c6-a589-678dde276c4a","Type":"ContainerDied","Data":"43cedbe4b3b104d834d8e2e8d92dfc163b26dcfd6ecd5f47edaf171e0619e046"} Dec 09 11:32:54 
crc kubenswrapper[5002]: I1209 11:32:54.851144 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zwq82" event={"ID":"3c13469b-99fb-4eac-b05f-ec9c7f806022","Type":"ContainerStarted","Data":"368e80e160dd49a06b484bf5b0ef26549e076ef6842dcf7c2a30a3464d733cf7"} Dec 09 11:32:54 crc kubenswrapper[5002]: I1209 11:32:54.894264 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-zwq82" podStartSLOduration=3.8942467 podStartE2EDuration="3.8942467s" podCreationTimestamp="2025-12-09 11:32:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:32:54.887130819 +0000 UTC m=+5507.279182050" watchObservedRunningTime="2025-12-09 11:32:54.8942467 +0000 UTC m=+5507.286297781" Dec 09 11:32:55 crc kubenswrapper[5002]: I1209 11:32:55.862050 5002 generic.go:334] "Generic (PLEG): container finished" podID="3c13469b-99fb-4eac-b05f-ec9c7f806022" containerID="368e80e160dd49a06b484bf5b0ef26549e076ef6842dcf7c2a30a3464d733cf7" exitCode=0 Dec 09 11:32:55 crc kubenswrapper[5002]: I1209 11:32:55.862105 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zwq82" event={"ID":"3c13469b-99fb-4eac-b05f-ec9c7f806022","Type":"ContainerDied","Data":"368e80e160dd49a06b484bf5b0ef26549e076ef6842dcf7c2a30a3464d733cf7"} Dec 09 11:32:55 crc kubenswrapper[5002]: I1209 11:32:55.864874 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74df65d56c-l9t2c" event={"ID":"16eec812-f54c-47c6-a589-678dde276c4a","Type":"ContainerStarted","Data":"1701310af47e13f7baace4b38b57de9c5fb3d4eabeda36b3c7ed6c88475a9973"} Dec 09 11:32:55 crc kubenswrapper[5002]: I1209 11:32:55.865117 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74df65d56c-l9t2c" Dec 09 11:32:57 crc kubenswrapper[5002]: I1209 11:32:57.233870 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-zwq82" Dec 09 11:32:57 crc kubenswrapper[5002]: I1209 11:32:57.252217 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74df65d56c-l9t2c" podStartSLOduration=6.252201152 podStartE2EDuration="6.252201152s" podCreationTimestamp="2025-12-09 11:32:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:32:55.901517743 +0000 UTC m=+5508.293568834" watchObservedRunningTime="2025-12-09 11:32:57.252201152 +0000 UTC m=+5509.644252224" Dec 09 11:32:57 crc kubenswrapper[5002]: I1209 11:32:57.328550 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pt4k4\" (UniqueName: \"kubernetes.io/projected/3c13469b-99fb-4eac-b05f-ec9c7f806022-kube-api-access-pt4k4\") pod \"3c13469b-99fb-4eac-b05f-ec9c7f806022\" (UID: \"3c13469b-99fb-4eac-b05f-ec9c7f806022\") " Dec 09 11:32:57 crc kubenswrapper[5002]: I1209 11:32:57.328645 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c13469b-99fb-4eac-b05f-ec9c7f806022-combined-ca-bundle\") pod \"3c13469b-99fb-4eac-b05f-ec9c7f806022\" (UID: \"3c13469b-99fb-4eac-b05f-ec9c7f806022\") " Dec 09 11:32:57 crc kubenswrapper[5002]: I1209 11:32:57.328771 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c13469b-99fb-4eac-b05f-ec9c7f806022-logs\") pod \"3c13469b-99fb-4eac-b05f-ec9c7f806022\" (UID: \"3c13469b-99fb-4eac-b05f-ec9c7f806022\") " Dec 09 11:32:57 crc kubenswrapper[5002]: I1209 11:32:57.328806 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c13469b-99fb-4eac-b05f-ec9c7f806022-scripts\") pod \"3c13469b-99fb-4eac-b05f-ec9c7f806022\" (UID: \"3c13469b-99fb-4eac-b05f-ec9c7f806022\") " Dec 09 11:32:57 crc kubenswrapper[5002]: I1209 11:32:57.328956 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c13469b-99fb-4eac-b05f-ec9c7f806022-config-data\") pod \"3c13469b-99fb-4eac-b05f-ec9c7f806022\" (UID: \"3c13469b-99fb-4eac-b05f-ec9c7f806022\") " Dec 09 11:32:57 crc kubenswrapper[5002]: I1209 11:32:57.331929 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c13469b-99fb-4eac-b05f-ec9c7f806022-logs" (OuterVolumeSpecName: "logs") pod "3c13469b-99fb-4eac-b05f-ec9c7f806022" (UID: "3c13469b-99fb-4eac-b05f-ec9c7f806022"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:32:57 crc kubenswrapper[5002]: I1209 11:32:57.335197 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c13469b-99fb-4eac-b05f-ec9c7f806022-scripts" (OuterVolumeSpecName: "scripts") pod "3c13469b-99fb-4eac-b05f-ec9c7f806022" (UID: "3c13469b-99fb-4eac-b05f-ec9c7f806022"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:57 crc kubenswrapper[5002]: I1209 11:32:57.335450 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c13469b-99fb-4eac-b05f-ec9c7f806022-kube-api-access-pt4k4" (OuterVolumeSpecName: "kube-api-access-pt4k4") pod "3c13469b-99fb-4eac-b05f-ec9c7f806022" (UID: "3c13469b-99fb-4eac-b05f-ec9c7f806022"). InnerVolumeSpecName "kube-api-access-pt4k4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:57 crc kubenswrapper[5002]: I1209 11:32:57.353953 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c13469b-99fb-4eac-b05f-ec9c7f806022-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c13469b-99fb-4eac-b05f-ec9c7f806022" (UID: "3c13469b-99fb-4eac-b05f-ec9c7f806022"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:57 crc kubenswrapper[5002]: I1209 11:32:57.361516 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c13469b-99fb-4eac-b05f-ec9c7f806022-config-data" (OuterVolumeSpecName: "config-data") pod "3c13469b-99fb-4eac-b05f-ec9c7f806022" (UID: "3c13469b-99fb-4eac-b05f-ec9c7f806022"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:57 crc kubenswrapper[5002]: I1209 11:32:57.431377 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c13469b-99fb-4eac-b05f-ec9c7f806022-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:57 crc kubenswrapper[5002]: I1209 11:32:57.431632 5002 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c13469b-99fb-4eac-b05f-ec9c7f806022-logs\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:57 crc kubenswrapper[5002]: I1209 11:32:57.431643 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c13469b-99fb-4eac-b05f-ec9c7f806022-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:57 crc kubenswrapper[5002]: I1209 11:32:57.431652 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c13469b-99fb-4eac-b05f-ec9c7f806022-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:57 crc kubenswrapper[5002]: I1209 11:32:57.431664 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pt4k4\" (UniqueName: \"kubernetes.io/projected/3c13469b-99fb-4eac-b05f-ec9c7f806022-kube-api-access-pt4k4\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:57 crc kubenswrapper[5002]: I1209 11:32:57.943967 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zwq82" event={"ID":"3c13469b-99fb-4eac-b05f-ec9c7f806022","Type":"ContainerDied","Data":"52242b04f97709bd7e2fb3c8f9c7e6ca7de31a5c540d28fcb628875993952497"} Dec 09 11:32:57 crc kubenswrapper[5002]: I1209 11:32:57.944011 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52242b04f97709bd7e2fb3c8f9c7e6ca7de31a5c540d28fcb628875993952497" Dec 09 11:32:57 crc kubenswrapper[5002]: I1209 11:32:57.944093 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-zwq82" Dec 09 11:32:58 crc kubenswrapper[5002]: I1209 11:32:58.067010 5002 scope.go:117] "RemoveContainer" containerID="24eda190128d46e2bfa806f4839b38f2462cd8acaa8816efdf9934cf2dc46679" Dec 09 11:32:58 crc kubenswrapper[5002]: E1209 11:32:58.067305 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:32:58 crc kubenswrapper[5002]: I1209 11:32:58.333443 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-77cc5c8946-bsgmg"] Dec 09 11:32:58 crc kubenswrapper[5002]: E1209 11:32:58.334599 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c13469b-99fb-4eac-b05f-ec9c7f806022" containerName="placement-db-sync" Dec 09 11:32:58 crc kubenswrapper[5002]: I1209 11:32:58.334736 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c13469b-99fb-4eac-b05f-ec9c7f806022" containerName="placement-db-sync" Dec 09 11:32:58 crc kubenswrapper[5002]: I1209 11:32:58.335081 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c13469b-99fb-4eac-b05f-ec9c7f806022" containerName="placement-db-sync" Dec 09 11:32:58 crc kubenswrapper[5002]: I1209 11:32:58.336355 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-77cc5c8946-bsgmg" Dec 09 11:32:58 crc kubenswrapper[5002]: I1209 11:32:58.347763 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-77cc5c8946-bsgmg"] Dec 09 11:32:58 crc kubenswrapper[5002]: I1209 11:32:58.349314 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 09 11:32:58 crc kubenswrapper[5002]: I1209 11:32:58.350110 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 09 11:32:58 crc kubenswrapper[5002]: I1209 11:32:58.353047 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-v7blq" Dec 09 11:32:58 crc kubenswrapper[5002]: I1209 11:32:58.450895 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26231343-3514-4a68-bbb5-95594b525082-scripts\") pod \"placement-77cc5c8946-bsgmg\" (UID: \"26231343-3514-4a68-bbb5-95594b525082\") " pod="openstack/placement-77cc5c8946-bsgmg" Dec 09 11:32:58 crc kubenswrapper[5002]: I1209 11:32:58.450964 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26231343-3514-4a68-bbb5-95594b525082-combined-ca-bundle\") pod \"placement-77cc5c8946-bsgmg\" (UID: \"26231343-3514-4a68-bbb5-95594b525082\") " pod="openstack/placement-77cc5c8946-bsgmg" Dec 09 11:32:58 crc kubenswrapper[5002]: I1209 11:32:58.450994 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxhhl\" (UniqueName: \"kubernetes.io/projected/26231343-3514-4a68-bbb5-95594b525082-kube-api-access-kxhhl\") pod \"placement-77cc5c8946-bsgmg\" (UID: \"26231343-3514-4a68-bbb5-95594b525082\") " 
pod="openstack/placement-77cc5c8946-bsgmg" Dec 09 11:32:58 crc kubenswrapper[5002]: I1209 11:32:58.451459 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26231343-3514-4a68-bbb5-95594b525082-logs\") pod \"placement-77cc5c8946-bsgmg\" (UID: \"26231343-3514-4a68-bbb5-95594b525082\") " pod="openstack/placement-77cc5c8946-bsgmg" Dec 09 11:32:58 crc kubenswrapper[5002]: I1209 11:32:58.451551 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26231343-3514-4a68-bbb5-95594b525082-config-data\") pod \"placement-77cc5c8946-bsgmg\" (UID: \"26231343-3514-4a68-bbb5-95594b525082\") " pod="openstack/placement-77cc5c8946-bsgmg" Dec 09 11:32:58 crc kubenswrapper[5002]: I1209 11:32:58.553162 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26231343-3514-4a68-bbb5-95594b525082-scripts\") pod \"placement-77cc5c8946-bsgmg\" (UID: \"26231343-3514-4a68-bbb5-95594b525082\") " pod="openstack/placement-77cc5c8946-bsgmg" Dec 09 11:32:58 crc kubenswrapper[5002]: I1209 11:32:58.553221 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26231343-3514-4a68-bbb5-95594b525082-combined-ca-bundle\") pod \"placement-77cc5c8946-bsgmg\" (UID: \"26231343-3514-4a68-bbb5-95594b525082\") " pod="openstack/placement-77cc5c8946-bsgmg" Dec 09 11:32:58 crc kubenswrapper[5002]: I1209 11:32:58.553243 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxhhl\" (UniqueName: \"kubernetes.io/projected/26231343-3514-4a68-bbb5-95594b525082-kube-api-access-kxhhl\") pod \"placement-77cc5c8946-bsgmg\" (UID: \"26231343-3514-4a68-bbb5-95594b525082\") " pod="openstack/placement-77cc5c8946-bsgmg" Dec 09 11:32:58 crc kubenswrapper[5002]: I1209 11:32:58.553350 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26231343-3514-4a68-bbb5-95594b525082-logs\") pod \"placement-77cc5c8946-bsgmg\" (UID: \"26231343-3514-4a68-bbb5-95594b525082\") " pod="openstack/placement-77cc5c8946-bsgmg" Dec 09 11:32:58 crc kubenswrapper[5002]: I1209 11:32:58.553369 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26231343-3514-4a68-bbb5-95594b525082-config-data\") pod \"placement-77cc5c8946-bsgmg\" (UID: \"26231343-3514-4a68-bbb5-95594b525082\") " pod="openstack/placement-77cc5c8946-bsgmg" Dec 09 11:32:58 crc kubenswrapper[5002]: I1209 11:32:58.553847 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26231343-3514-4a68-bbb5-95594b525082-logs\") pod \"placement-77cc5c8946-bsgmg\" (UID: \"26231343-3514-4a68-bbb5-95594b525082\") " pod="openstack/placement-77cc5c8946-bsgmg" Dec 09 11:32:58 crc kubenswrapper[5002]: I1209 11:32:58.560511 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26231343-3514-4a68-bbb5-95594b525082-config-data\") pod \"placement-77cc5c8946-bsgmg\" (UID: \"26231343-3514-4a68-bbb5-95594b525082\") " pod="openstack/placement-77cc5c8946-bsgmg" Dec 09 11:32:58 crc kubenswrapper[5002]: I1209 11:32:58.566226 5002 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26231343-3514-4a68-bbb5-95594b525082-scripts\") pod \"placement-77cc5c8946-bsgmg\" (UID: \"26231343-3514-4a68-bbb5-95594b525082\") " pod="openstack/placement-77cc5c8946-bsgmg" Dec 09 11:32:58 crc kubenswrapper[5002]: I1209 11:32:58.566682 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26231343-3514-4a68-bbb5-95594b525082-combined-ca-bundle\") pod \"placement-77cc5c8946-bsgmg\" (UID: \"26231343-3514-4a68-bbb5-95594b525082\") " pod="openstack/placement-77cc5c8946-bsgmg" Dec 09 11:32:58 crc kubenswrapper[5002]: I1209 11:32:58.572285 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxhhl\" (UniqueName: \"kubernetes.io/projected/26231343-3514-4a68-bbb5-95594b525082-kube-api-access-kxhhl\") pod \"placement-77cc5c8946-bsgmg\" (UID: \"26231343-3514-4a68-bbb5-95594b525082\") " pod="openstack/placement-77cc5c8946-bsgmg" Dec 09 11:32:58 crc kubenswrapper[5002]: I1209 11:32:58.667467 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-77cc5c8946-bsgmg" Dec 09 11:32:58 crc kubenswrapper[5002]: I1209 11:32:58.965312 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-77cc5c8946-bsgmg"] Dec 09 11:32:59 crc kubenswrapper[5002]: I1209 11:32:59.962359 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-77cc5c8946-bsgmg" event={"ID":"26231343-3514-4a68-bbb5-95594b525082","Type":"ContainerStarted","Data":"03ed0e6feb80eea5bdb857dc4e77be498843490ebf97eee479af686b5f1e2c78"} Dec 09 11:32:59 crc kubenswrapper[5002]: I1209 11:32:59.963165 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-77cc5c8946-bsgmg" Dec 09 11:32:59 crc kubenswrapper[5002]: I1209 11:32:59.963194 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-77cc5c8946-bsgmg" event={"ID":"26231343-3514-4a68-bbb5-95594b525082","Type":"ContainerStarted","Data":"8e3a517b5288ebd95766ee7cd2c67eeb97d0f122ebad1443394dff6996889462"} Dec 09 11:32:59 crc kubenswrapper[5002]: I1209 11:32:59.963218 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-77cc5c8946-bsgmg" event={"ID":"26231343-3514-4a68-bbb5-95594b525082","Type":"ContainerStarted","Data":"3720613bcecff539bc24628230caa2fcad970eef31d0371a65cf6567b97eaa6b"} Dec 09 11:32:59 crc kubenswrapper[5002]: I1209 11:32:59.963238 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-77cc5c8946-bsgmg" Dec 09 11:32:59 crc kubenswrapper[5002]: I1209 11:32:59.995778 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-77cc5c8946-bsgmg" podStartSLOduration=1.99575583 podStartE2EDuration="1.99575583s" podCreationTimestamp="2025-12-09 11:32:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:32:59.986472362 +0000 UTC m=+5512.378523483" watchObservedRunningTime="2025-12-09 11:32:59.99575583 +0000 UTC m=+5512.387806921" Dec 09 11:33:01 crc kubenswrapper[5002]: I1209 11:33:01.877004 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74df65d56c-l9t2c" Dec 09 11:33:01 crc kubenswrapper[5002]: I1209 11:33:01.955284 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-6b9b57f477-zmzv8"] Dec 09 11:33:01 crc kubenswrapper[5002]: I1209 11:33:01.955574 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b9b57f477-zmzv8" podUID="a0321744-2cda-4f1f-9ea5-8ffa684b5373" containerName="dnsmasq-dns" containerID="cri-o://a8fe9843eb691abd30db10125382870e732c648f12b2f9276abf088d2f89ebf8" gracePeriod=10 Dec 09 11:33:02 crc kubenswrapper[5002]: I1209 11:33:02.990974 5002 generic.go:334] "Generic (PLEG): container finished" podID="a0321744-2cda-4f1f-9ea5-8ffa684b5373" containerID="a8fe9843eb691abd30db10125382870e732c648f12b2f9276abf088d2f89ebf8" exitCode=0 Dec 09 11:33:02 crc kubenswrapper[5002]: I1209 11:33:02.991246 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9b57f477-zmzv8" event={"ID":"a0321744-2cda-4f1f-9ea5-8ffa684b5373","Type":"ContainerDied","Data":"a8fe9843eb691abd30db10125382870e732c648f12b2f9276abf088d2f89ebf8"} Dec 09 11:33:03 crc kubenswrapper[5002]: I1209 11:33:03.363408 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b9b57f477-zmzv8" Dec 09 11:33:03 crc kubenswrapper[5002]: I1209 11:33:03.444980 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0321744-2cda-4f1f-9ea5-8ffa684b5373-dns-svc\") pod \"a0321744-2cda-4f1f-9ea5-8ffa684b5373\" (UID: \"a0321744-2cda-4f1f-9ea5-8ffa684b5373\") " Dec 09 11:33:03 crc kubenswrapper[5002]: I1209 11:33:03.445075 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4wk4\" (UniqueName: \"kubernetes.io/projected/a0321744-2cda-4f1f-9ea5-8ffa684b5373-kube-api-access-z4wk4\") pod \"a0321744-2cda-4f1f-9ea5-8ffa684b5373\" (UID: \"a0321744-2cda-4f1f-9ea5-8ffa684b5373\") " Dec 09 11:33:03 crc kubenswrapper[5002]: I1209 11:33:03.445104 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a0321744-2cda-4f1f-9ea5-8ffa684b5373-ovsdbserver-nb\") pod \"a0321744-2cda-4f1f-9ea5-8ffa684b5373\" (UID: \"a0321744-2cda-4f1f-9ea5-8ffa684b5373\") " Dec 09 11:33:03 crc kubenswrapper[5002]: I1209 11:33:03.445158 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a0321744-2cda-4f1f-9ea5-8ffa684b5373-ovsdbserver-sb\") pod \"a0321744-2cda-4f1f-9ea5-8ffa684b5373\" (UID: \"a0321744-2cda-4f1f-9ea5-8ffa684b5373\") " Dec 09 11:33:03 crc kubenswrapper[5002]: I1209 11:33:03.445185 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0321744-2cda-4f1f-9ea5-8ffa684b5373-config\") pod \"a0321744-2cda-4f1f-9ea5-8ffa684b5373\" (UID: \"a0321744-2cda-4f1f-9ea5-8ffa684b5373\") " Dec 09 11:33:03 crc kubenswrapper[5002]: I1209 11:33:03.450731 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0321744-2cda-4f1f-9ea5-8ffa684b5373-kube-api-access-z4wk4" (OuterVolumeSpecName: "kube-api-access-z4wk4") pod "a0321744-2cda-4f1f-9ea5-8ffa684b5373" (UID: "a0321744-2cda-4f1f-9ea5-8ffa684b5373"). InnerVolumeSpecName "kube-api-access-z4wk4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:33:03 crc kubenswrapper[5002]: I1209 11:33:03.493136 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0321744-2cda-4f1f-9ea5-8ffa684b5373-config" (OuterVolumeSpecName: "config") pod "a0321744-2cda-4f1f-9ea5-8ffa684b5373" (UID: "a0321744-2cda-4f1f-9ea5-8ffa684b5373"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:33:03 crc kubenswrapper[5002]: I1209 11:33:03.495492 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0321744-2cda-4f1f-9ea5-8ffa684b5373-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a0321744-2cda-4f1f-9ea5-8ffa684b5373" (UID: "a0321744-2cda-4f1f-9ea5-8ffa684b5373"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:33:03 crc kubenswrapper[5002]: I1209 11:33:03.499422 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0321744-2cda-4f1f-9ea5-8ffa684b5373-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a0321744-2cda-4f1f-9ea5-8ffa684b5373" (UID: "a0321744-2cda-4f1f-9ea5-8ffa684b5373"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:33:03 crc kubenswrapper[5002]: I1209 11:33:03.503124 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0321744-2cda-4f1f-9ea5-8ffa684b5373-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a0321744-2cda-4f1f-9ea5-8ffa684b5373" (UID: "a0321744-2cda-4f1f-9ea5-8ffa684b5373"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:33:03 crc kubenswrapper[5002]: I1209 11:33:03.549761 5002 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0321744-2cda-4f1f-9ea5-8ffa684b5373-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 11:33:03 crc kubenswrapper[5002]: I1209 11:33:03.549783 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4wk4\" (UniqueName: \"kubernetes.io/projected/a0321744-2cda-4f1f-9ea5-8ffa684b5373-kube-api-access-z4wk4\") on node \"crc\" DevicePath \"\"" Dec 09 11:33:03 crc kubenswrapper[5002]: I1209 11:33:03.549795 5002 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a0321744-2cda-4f1f-9ea5-8ffa684b5373-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 11:33:03 crc kubenswrapper[5002]: I1209 11:33:03.549804 5002 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a0321744-2cda-4f1f-9ea5-8ffa684b5373-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 09 11:33:03 crc kubenswrapper[5002]: I1209 11:33:03.549825 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0321744-2cda-4f1f-9ea5-8ffa684b5373-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:33:04 crc kubenswrapper[5002]: I1209 11:33:04.003196 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9b57f477-zmzv8" event={"ID":"a0321744-2cda-4f1f-9ea5-8ffa684b5373","Type":"ContainerDied","Data":"4f13571c2636d93a250256470bb9d7af7026d1cce7b1f5da9a95c358b71fa716"} Dec 09 11:33:04 crc kubenswrapper[5002]: I1209 11:33:04.003487 5002 scope.go:117] "RemoveContainer" 
containerID="a8fe9843eb691abd30db10125382870e732c648f12b2f9276abf088d2f89ebf8" Dec 09 11:33:04 crc kubenswrapper[5002]: I1209 11:33:04.003309 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b9b57f477-zmzv8" Dec 09 11:33:04 crc kubenswrapper[5002]: I1209 11:33:04.042284 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b9b57f477-zmzv8"] Dec 09 11:33:04 crc kubenswrapper[5002]: I1209 11:33:04.050379 5002 scope.go:117] "RemoveContainer" containerID="f186aa1bd90a344ff98f6dd37a3d3342620cce6159599ee1e55dbdd9a14850fc" Dec 09 11:33:04 crc kubenswrapper[5002]: I1209 11:33:04.052102 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b9b57f477-zmzv8"] Dec 09 11:33:04 crc kubenswrapper[5002]: I1209 11:33:04.070449 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0321744-2cda-4f1f-9ea5-8ffa684b5373" path="/var/lib/kubelet/pods/a0321744-2cda-4f1f-9ea5-8ffa684b5373/volumes" Dec 09 11:33:13 crc kubenswrapper[5002]: I1209 11:33:13.060239 5002 scope.go:117] "RemoveContainer" containerID="24eda190128d46e2bfa806f4839b38f2462cd8acaa8816efdf9934cf2dc46679" Dec 09 11:33:13 crc kubenswrapper[5002]: E1209 11:33:13.061088 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:33:28 crc kubenswrapper[5002]: I1209 11:33:28.074911 5002 scope.go:117] "RemoveContainer" containerID="24eda190128d46e2bfa806f4839b38f2462cd8acaa8816efdf9934cf2dc46679" Dec 09 11:33:28 crc kubenswrapper[5002]: E1209 11:33:28.075892 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:33:29 crc kubenswrapper[5002]: I1209 11:33:29.841618 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-77cc5c8946-bsgmg" Dec 09 11:33:30 crc kubenswrapper[5002]: I1209 11:33:30.849869 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-77cc5c8946-bsgmg" Dec 09 11:33:41 crc kubenswrapper[5002]: I1209 11:33:41.060331 5002 scope.go:117] "RemoveContainer" containerID="24eda190128d46e2bfa806f4839b38f2462cd8acaa8816efdf9934cf2dc46679" Dec 09 11:33:41 crc kubenswrapper[5002]: E1209 11:33:41.061200 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:33:53 crc kubenswrapper[5002]: I1209 11:33:53.060256 5002 scope.go:117] "RemoveContainer" 
containerID="24eda190128d46e2bfa806f4839b38f2462cd8acaa8816efdf9934cf2dc46679" Dec 09 11:33:53 crc kubenswrapper[5002]: E1209 11:33:53.062204 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:33:54 crc kubenswrapper[5002]: I1209 11:33:54.586584 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-wbv78"] Dec 09 11:33:54 crc kubenswrapper[5002]: E1209 11:33:54.587410 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0321744-2cda-4f1f-9ea5-8ffa684b5373" containerName="init" Dec 09 11:33:54 crc kubenswrapper[5002]: I1209 11:33:54.587432 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0321744-2cda-4f1f-9ea5-8ffa684b5373" containerName="init" Dec 09 11:33:54 crc kubenswrapper[5002]: E1209 11:33:54.587457 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0321744-2cda-4f1f-9ea5-8ffa684b5373" containerName="dnsmasq-dns" Dec 09 11:33:54 crc kubenswrapper[5002]: I1209 11:33:54.587465 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0321744-2cda-4f1f-9ea5-8ffa684b5373" containerName="dnsmasq-dns" Dec 09 11:33:54 crc kubenswrapper[5002]: I1209 11:33:54.587685 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0321744-2cda-4f1f-9ea5-8ffa684b5373" containerName="dnsmasq-dns" Dec 09 11:33:54 crc kubenswrapper[5002]: I1209 11:33:54.588497 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-wbv78" Dec 09 11:33:54 crc kubenswrapper[5002]: I1209 11:33:54.600135 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-5a5c-account-create-update-wlfkx"] Dec 09 11:33:54 crc kubenswrapper[5002]: I1209 11:33:54.601529 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-5a5c-account-create-update-wlfkx" Dec 09 11:33:54 crc kubenswrapper[5002]: I1209 11:33:54.603544 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 09 11:33:54 crc kubenswrapper[5002]: I1209 11:33:54.610415 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-wbv78"] Dec 09 11:33:54 crc kubenswrapper[5002]: I1209 11:33:54.618019 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-5a5c-account-create-update-wlfkx"] Dec 09 11:33:54 crc kubenswrapper[5002]: I1209 11:33:54.679679 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-p7j5k"] Dec 09 11:33:54 crc kubenswrapper[5002]: I1209 11:33:54.681606 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-p7j5k" Dec 09 11:33:54 crc kubenswrapper[5002]: I1209 11:33:54.690700 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-p7j5k"] Dec 09 11:33:54 crc kubenswrapper[5002]: I1209 11:33:54.746913 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08af662c-3e6c-4904-b409-bb179c6f45d1-operator-scripts\") pod \"nova-api-db-create-wbv78\" (UID: \"08af662c-3e6c-4904-b409-bb179c6f45d1\") " pod="openstack/nova-api-db-create-wbv78" Dec 09 11:33:54 crc kubenswrapper[5002]: I1209 11:33:54.746997 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7qmz\" (UniqueName: \"kubernetes.io/projected/4d66cb19-429c-4c73-a24c-d8c49803146c-kube-api-access-v7qmz\") pod \"nova-api-5a5c-account-create-update-wlfkx\" (UID: \"4d66cb19-429c-4c73-a24c-d8c49803146c\") " pod="openstack/nova-api-5a5c-account-create-update-wlfkx" Dec 09 11:33:54 crc kubenswrapper[5002]: I1209 11:33:54.747070 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkc9z\" (UniqueName: \"kubernetes.io/projected/08af662c-3e6c-4904-b409-bb179c6f45d1-kube-api-access-xkc9z\") pod \"nova-api-db-create-wbv78\" (UID: \"08af662c-3e6c-4904-b409-bb179c6f45d1\") " pod="openstack/nova-api-db-create-wbv78" Dec 09 11:33:54 crc kubenswrapper[5002]: I1209 11:33:54.747104 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d66cb19-429c-4c73-a24c-d8c49803146c-operator-scripts\") pod \"nova-api-5a5c-account-create-update-wlfkx\" (UID: \"4d66cb19-429c-4c73-a24c-d8c49803146c\") " pod="openstack/nova-api-5a5c-account-create-update-wlfkx" Dec 09 11:33:54 crc kubenswrapper[5002]: I1209 11:33:54.772330 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-lscbm"] Dec 09 11:33:54 crc kubenswrapper[5002]: I1209 11:33:54.774129 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-lscbm" Dec 09 11:33:54 crc kubenswrapper[5002]: I1209 11:33:54.782591 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-lscbm"] Dec 09 11:33:54 crc kubenswrapper[5002]: I1209 11:33:54.807041 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-2680-account-create-update-xh47f"] Dec 09 11:33:54 crc kubenswrapper[5002]: I1209 11:33:54.808397 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-2680-account-create-update-xh47f" Dec 09 11:33:54 crc kubenswrapper[5002]: I1209 11:33:54.820381 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 09 11:33:54 crc kubenswrapper[5002]: I1209 11:33:54.855305 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5f4f809-9551-419b-9aac-57de880d3629-operator-scripts\") pod \"nova-cell1-db-create-lscbm\" (UID: \"b5f4f809-9551-419b-9aac-57de880d3629\") " pod="openstack/nova-cell1-db-create-lscbm" Dec 09 11:33:54 crc kubenswrapper[5002]: I1209 11:33:54.855368 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a5c9ceb-52aa-4490-b2da-4c0a8077cbb3-operator-scripts\") pod \"nova-cell0-db-create-p7j5k\" (UID: \"7a5c9ceb-52aa-4490-b2da-4c0a8077cbb3\") " pod="openstack/nova-cell0-db-create-p7j5k" Dec 09 11:33:54 crc kubenswrapper[5002]: I1209 11:33:54.855418 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7qmz\" (UniqueName: \"kubernetes.io/projected/4d66cb19-429c-4c73-a24c-d8c49803146c-kube-api-access-v7qmz\") pod \"nova-api-5a5c-account-create-update-wlfkx\" (UID: \"4d66cb19-429c-4c73-a24c-d8c49803146c\") " pod="openstack/nova-api-5a5c-account-create-update-wlfkx" Dec 09 11:33:54 crc kubenswrapper[5002]: I1209 11:33:54.855453 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq4zb\" (UniqueName: \"kubernetes.io/projected/7a5c9ceb-52aa-4490-b2da-4c0a8077cbb3-kube-api-access-mq4zb\") pod \"nova-cell0-db-create-p7j5k\" (UID: \"7a5c9ceb-52aa-4490-b2da-4c0a8077cbb3\") " pod="openstack/nova-cell0-db-create-p7j5k" Dec 09 11:33:54 crc kubenswrapper[5002]: I1209 11:33:54.855518 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb9bq\" (UniqueName: \"kubernetes.io/projected/b5f4f809-9551-419b-9aac-57de880d3629-kube-api-access-fb9bq\") pod \"nova-cell1-db-create-lscbm\" (UID: \"b5f4f809-9551-419b-9aac-57de880d3629\") " pod="openstack/nova-cell1-db-create-lscbm" Dec 09 11:33:54 crc kubenswrapper[5002]: I1209 11:33:54.855557 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkc9z\" (UniqueName: \"kubernetes.io/projected/08af662c-3e6c-4904-b409-bb179c6f45d1-kube-api-access-xkc9z\") pod \"nova-api-db-create-wbv78\" (UID: \"08af662c-3e6c-4904-b409-bb179c6f45d1\") " pod="openstack/nova-api-db-create-wbv78" Dec 09 11:33:54 crc kubenswrapper[5002]: I1209 11:33:54.855626 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d66cb19-429c-4c73-a24c-d8c49803146c-operator-scripts\") pod \"nova-api-5a5c-account-create-update-wlfkx\" (UID: \"4d66cb19-429c-4c73-a24c-d8c49803146c\") " pod="openstack/nova-api-5a5c-account-create-update-wlfkx" Dec 09 11:33:54 crc kubenswrapper[5002]: I1209 11:33:54.855749 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08af662c-3e6c-4904-b409-bb179c6f45d1-operator-scripts\") pod \"nova-api-db-create-wbv78\" (UID: \"08af662c-3e6c-4904-b409-bb179c6f45d1\") " pod="openstack/nova-api-db-create-wbv78" Dec 09 11:33:54 crc 
kubenswrapper[5002]: I1209 11:33:54.857233 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08af662c-3e6c-4904-b409-bb179c6f45d1-operator-scripts\") pod \"nova-api-db-create-wbv78\" (UID: \"08af662c-3e6c-4904-b409-bb179c6f45d1\") " pod="openstack/nova-api-db-create-wbv78" Dec 09 11:33:54 crc kubenswrapper[5002]: I1209 11:33:54.860482 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d66cb19-429c-4c73-a24c-d8c49803146c-operator-scripts\") pod \"nova-api-5a5c-account-create-update-wlfkx\" (UID: \"4d66cb19-429c-4c73-a24c-d8c49803146c\") " pod="openstack/nova-api-5a5c-account-create-update-wlfkx" Dec 09 11:33:54 crc kubenswrapper[5002]: I1209 11:33:54.888910 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-2680-account-create-update-xh47f"] Dec 09 11:33:54 crc kubenswrapper[5002]: I1209 11:33:54.911586 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7qmz\" (UniqueName: \"kubernetes.io/projected/4d66cb19-429c-4c73-a24c-d8c49803146c-kube-api-access-v7qmz\") pod \"nova-api-5a5c-account-create-update-wlfkx\" (UID: \"4d66cb19-429c-4c73-a24c-d8c49803146c\") " pod="openstack/nova-api-5a5c-account-create-update-wlfkx" Dec 09 11:33:54 crc kubenswrapper[5002]: I1209 11:33:54.914015 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkc9z\" (UniqueName: \"kubernetes.io/projected/08af662c-3e6c-4904-b409-bb179c6f45d1-kube-api-access-xkc9z\") pod \"nova-api-db-create-wbv78\" (UID: \"08af662c-3e6c-4904-b409-bb179c6f45d1\") " pod="openstack/nova-api-db-create-wbv78" Dec 09 11:33:54 crc kubenswrapper[5002]: I1209 11:33:54.928886 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-5a5c-account-create-update-wlfkx" Dec 09 11:33:54 crc kubenswrapper[5002]: I1209 11:33:54.959574 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb9bq\" (UniqueName: \"kubernetes.io/projected/b5f4f809-9551-419b-9aac-57de880d3629-kube-api-access-fb9bq\") pod \"nova-cell1-db-create-lscbm\" (UID: \"b5f4f809-9551-419b-9aac-57de880d3629\") " pod="openstack/nova-cell1-db-create-lscbm" Dec 09 11:33:54 crc kubenswrapper[5002]: I1209 11:33:54.959751 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xmhq\" (UniqueName: \"kubernetes.io/projected/3f5b89ee-003e-4323-ac0d-e9e22f329941-kube-api-access-5xmhq\") pod \"nova-cell0-2680-account-create-update-xh47f\" (UID: \"3f5b89ee-003e-4323-ac0d-e9e22f329941\") " pod="openstack/nova-cell0-2680-account-create-update-xh47f" Dec 09 11:33:54 crc kubenswrapper[5002]: I1209 11:33:54.959870 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5f4f809-9551-419b-9aac-57de880d3629-operator-scripts\") pod \"nova-cell1-db-create-lscbm\" (UID: \"b5f4f809-9551-419b-9aac-57de880d3629\") " pod="openstack/nova-cell1-db-create-lscbm" Dec 09 11:33:54 crc kubenswrapper[5002]: I1209 11:33:54.959897 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a5c9ceb-52aa-4490-b2da-4c0a8077cbb3-operator-scripts\") pod \"nova-cell0-db-create-p7j5k\" (UID: \"7a5c9ceb-52aa-4490-b2da-4c0a8077cbb3\") " pod="openstack/nova-cell0-db-create-p7j5k" Dec 09 11:33:54 crc kubenswrapper[5002]: I1209 11:33:54.959929 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f5b89ee-003e-4323-ac0d-e9e22f329941-operator-scripts\") pod \"nova-cell0-2680-account-create-update-xh47f\" (UID: \"3f5b89ee-003e-4323-ac0d-e9e22f329941\") " pod="openstack/nova-cell0-2680-account-create-update-xh47f" Dec 09 11:33:54 crc kubenswrapper[5002]: I1209 11:33:54.959958 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mq4zb\" (UniqueName: \"kubernetes.io/projected/7a5c9ceb-52aa-4490-b2da-4c0a8077cbb3-kube-api-access-mq4zb\") pod \"nova-cell0-db-create-p7j5k\" (UID: \"7a5c9ceb-52aa-4490-b2da-4c0a8077cbb3\") " pod="openstack/nova-cell0-db-create-p7j5k" Dec 09 11:33:54 crc kubenswrapper[5002]: I1209 11:33:54.961773 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5f4f809-9551-419b-9aac-57de880d3629-operator-scripts\") pod \"nova-cell1-db-create-lscbm\" (UID: \"b5f4f809-9551-419b-9aac-57de880d3629\") " pod="openstack/nova-cell1-db-create-lscbm" Dec 09 11:33:54 crc kubenswrapper[5002]: I1209 11:33:54.962098 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a5c9ceb-52aa-4490-b2da-4c0a8077cbb3-operator-scripts\") pod \"nova-cell0-db-create-p7j5k\" (UID: \"7a5c9ceb-52aa-4490-b2da-4c0a8077cbb3\") " pod="openstack/nova-cell0-db-create-p7j5k" Dec 09 11:33:55 crc kubenswrapper[5002]: I1209 11:33:55.006954 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq4zb\" (UniqueName: 
\"kubernetes.io/projected/7a5c9ceb-52aa-4490-b2da-4c0a8077cbb3-kube-api-access-mq4zb\") pod \"nova-cell0-db-create-p7j5k\" (UID: \"7a5c9ceb-52aa-4490-b2da-4c0a8077cbb3\") " pod="openstack/nova-cell0-db-create-p7j5k" Dec 09 11:33:55 crc kubenswrapper[5002]: I1209 11:33:55.021408 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-b588-account-create-update-tjhwj"] Dec 09 11:33:55 crc kubenswrapper[5002]: I1209 11:33:55.022479 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-b588-account-create-update-tjhwj" Dec 09 11:33:55 crc kubenswrapper[5002]: I1209 11:33:55.025785 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb9bq\" (UniqueName: \"kubernetes.io/projected/b5f4f809-9551-419b-9aac-57de880d3629-kube-api-access-fb9bq\") pod \"nova-cell1-db-create-lscbm\" (UID: \"b5f4f809-9551-419b-9aac-57de880d3629\") " pod="openstack/nova-cell1-db-create-lscbm" Dec 09 11:33:55 crc kubenswrapper[5002]: I1209 11:33:55.035347 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 09 11:33:55 crc kubenswrapper[5002]: I1209 11:33:55.035712 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-b588-account-create-update-tjhwj"] Dec 09 11:33:55 crc kubenswrapper[5002]: I1209 11:33:55.079774 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xmhq\" (UniqueName: \"kubernetes.io/projected/3f5b89ee-003e-4323-ac0d-e9e22f329941-kube-api-access-5xmhq\") pod \"nova-cell0-2680-account-create-update-xh47f\" (UID: \"3f5b89ee-003e-4323-ac0d-e9e22f329941\") " pod="openstack/nova-cell0-2680-account-create-update-xh47f" Dec 09 11:33:55 crc kubenswrapper[5002]: I1209 11:33:55.079967 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f5b89ee-003e-4323-ac0d-e9e22f329941-operator-scripts\") pod \"nova-cell0-2680-account-create-update-xh47f\" (UID: \"3f5b89ee-003e-4323-ac0d-e9e22f329941\") " pod="openstack/nova-cell0-2680-account-create-update-xh47f" Dec 09 11:33:55 crc kubenswrapper[5002]: I1209 11:33:55.080662 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f5b89ee-003e-4323-ac0d-e9e22f329941-operator-scripts\") pod \"nova-cell0-2680-account-create-update-xh47f\" (UID: \"3f5b89ee-003e-4323-ac0d-e9e22f329941\") " pod="openstack/nova-cell0-2680-account-create-update-xh47f" Dec 09 11:33:55 crc kubenswrapper[5002]: I1209 11:33:55.090319 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-lscbm" Dec 09 11:33:55 crc kubenswrapper[5002]: I1209 11:33:55.096448 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xmhq\" (UniqueName: \"kubernetes.io/projected/3f5b89ee-003e-4323-ac0d-e9e22f329941-kube-api-access-5xmhq\") pod \"nova-cell0-2680-account-create-update-xh47f\" (UID: \"3f5b89ee-003e-4323-ac0d-e9e22f329941\") " pod="openstack/nova-cell0-2680-account-create-update-xh47f" Dec 09 11:33:55 crc kubenswrapper[5002]: I1209 11:33:55.134885 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-2680-account-create-update-xh47f" Dec 09 11:33:55 crc kubenswrapper[5002]: I1209 11:33:55.185536 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hslwr\" (UniqueName: \"kubernetes.io/projected/01960093-3aa2-428d-a587-3647c93e64e7-kube-api-access-hslwr\") pod \"nova-cell1-b588-account-create-update-tjhwj\" (UID: \"01960093-3aa2-428d-a587-3647c93e64e7\") " pod="openstack/nova-cell1-b588-account-create-update-tjhwj" Dec 09 11:33:55 crc kubenswrapper[5002]: I1209 11:33:55.185671 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01960093-3aa2-428d-a587-3647c93e64e7-operator-scripts\") pod \"nova-cell1-b588-account-create-update-tjhwj\" (UID: \"01960093-3aa2-428d-a587-3647c93e64e7\") " pod="openstack/nova-cell1-b588-account-create-update-tjhwj" Dec 09 11:33:55 crc kubenswrapper[5002]: I1209 11:33:55.205497 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-wbv78" Dec 09 11:33:55 crc kubenswrapper[5002]: I1209 11:33:55.287780 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hslwr\" (UniqueName: \"kubernetes.io/projected/01960093-3aa2-428d-a587-3647c93e64e7-kube-api-access-hslwr\") pod \"nova-cell1-b588-account-create-update-tjhwj\" (UID: \"01960093-3aa2-428d-a587-3647c93e64e7\") " pod="openstack/nova-cell1-b588-account-create-update-tjhwj" Dec 09 11:33:55 crc kubenswrapper[5002]: I1209 11:33:55.288140 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01960093-3aa2-428d-a587-3647c93e64e7-operator-scripts\") pod \"nova-cell1-b588-account-create-update-tjhwj\" (UID: \"01960093-3aa2-428d-a587-3647c93e64e7\") " pod="openstack/nova-cell1-b588-account-create-update-tjhwj" Dec 09 11:33:55 crc kubenswrapper[5002]: I1209 11:33:55.288827 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01960093-3aa2-428d-a587-3647c93e64e7-operator-scripts\") pod \"nova-cell1-b588-account-create-update-tjhwj\" (UID: \"01960093-3aa2-428d-a587-3647c93e64e7\") " pod="openstack/nova-cell1-b588-account-create-update-tjhwj" Dec 09 11:33:55 crc kubenswrapper[5002]: I1209 11:33:55.302358 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-p7j5k" Dec 09 11:33:55 crc kubenswrapper[5002]: I1209 11:33:55.308918 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hslwr\" (UniqueName: \"kubernetes.io/projected/01960093-3aa2-428d-a587-3647c93e64e7-kube-api-access-hslwr\") pod \"nova-cell1-b588-account-create-update-tjhwj\" (UID: \"01960093-3aa2-428d-a587-3647c93e64e7\") " pod="openstack/nova-cell1-b588-account-create-update-tjhwj" Dec 09 11:33:55 crc kubenswrapper[5002]: I1209 11:33:55.398423 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-b588-account-create-update-tjhwj" Dec 09 11:33:55 crc kubenswrapper[5002]: I1209 11:33:55.522974 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-5a5c-account-create-update-wlfkx"] Dec 09 11:33:55 crc kubenswrapper[5002]: I1209 11:33:55.549398 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5a5c-account-create-update-wlfkx" event={"ID":"4d66cb19-429c-4c73-a24c-d8c49803146c","Type":"ContainerStarted","Data":"fcc14f790af230730b42723e0c69116b86871118e9342e0180421e767ec2c1de"} Dec 09 11:33:55 crc kubenswrapper[5002]: I1209 11:33:55.608453 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-lscbm"] Dec 09 11:33:55 crc kubenswrapper[5002]: I1209 11:33:55.737274 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-2680-account-create-update-xh47f"] Dec 09 11:33:55 crc kubenswrapper[5002]: W1209 11:33:55.739508 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f5b89ee_003e_4323_ac0d_e9e22f329941.slice/crio-b7363e48472ad36d52cd8256c655d9b925e1bdc4d4127e61f40a2ca0f356649a WatchSource:0}: Error finding container b7363e48472ad36d52cd8256c655d9b925e1bdc4d4127e61f40a2ca0f356649a: Status 404 returned error can't find the container with id b7363e48472ad36d52cd8256c655d9b925e1bdc4d4127e61f40a2ca0f356649a Dec 09 11:33:55 crc kubenswrapper[5002]: W1209 11:33:55.788093 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08af662c_3e6c_4904_b409_bb179c6f45d1.slice/crio-64f6c7db48940ac228c1ddd62ff9a307cc64bb672b43a0ce0062d077ddaf5ce9 WatchSource:0}: Error finding container 64f6c7db48940ac228c1ddd62ff9a307cc64bb672b43a0ce0062d077ddaf5ce9: Status 404 returned error can't find the container with id 64f6c7db48940ac228c1ddd62ff9a307cc64bb672b43a0ce0062d077ddaf5ce9 Dec 09 11:33:55 crc kubenswrapper[5002]: I1209 11:33:55.788435 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-wbv78"] Dec 09 11:33:55 crc kubenswrapper[5002]: I1209 11:33:55.856488 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-p7j5k"] Dec 09 11:33:55 crc kubenswrapper[5002]: W1209 11:33:55.868593 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a5c9ceb_52aa_4490_b2da_4c0a8077cbb3.slice/crio-ead596395595ed37ed554dec3284a89014668b66650660f7bb6b9c3a29dd74fc WatchSource:0}: Error finding container ead596395595ed37ed554dec3284a89014668b66650660f7bb6b9c3a29dd74fc: Status 404 returned error can't find the container with id ead596395595ed37ed554dec3284a89014668b66650660f7bb6b9c3a29dd74fc Dec 09 11:33:55 crc kubenswrapper[5002]: I1209 11:33:55.947276 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-b588-account-create-update-tjhwj"] Dec 09 11:33:55 crc kubenswrapper[5002]: W1209 11:33:55.954870 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01960093_3aa2_428d_a587_3647c93e64e7.slice/crio-db703ddaa03bb21d83c8b81b5895d3ea09783cda6328c790922d06a460860dc0 WatchSource:0}: Error finding container db703ddaa03bb21d83c8b81b5895d3ea09783cda6328c790922d06a460860dc0: Status 404 returned error can't find the container with id 
Dec 09 11:33:56 crc kubenswrapper[5002]: I1209 11:33:56.560926 5002 generic.go:334] "Generic (PLEG): container finished" podID="b5f4f809-9551-419b-9aac-57de880d3629" containerID="d016bff16ba99528c7723b173b133ea96bad49711180df12a533544c72ea88a0" exitCode=0 Dec 09 11:33:56 crc kubenswrapper[5002]: I1209 11:33:56.561223 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-lscbm" event={"ID":"b5f4f809-9551-419b-9aac-57de880d3629","Type":"ContainerDied","Data":"d016bff16ba99528c7723b173b133ea96bad49711180df12a533544c72ea88a0"} Dec 09 11:33:56 crc kubenswrapper[5002]: I1209 11:33:56.561250 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-lscbm" event={"ID":"b5f4f809-9551-419b-9aac-57de880d3629","Type":"ContainerStarted","Data":"82e39d3a6e05272b54e59120120ef503d2faa2b46075ff2b7e8ea00aa416dbbf"} Dec 09 11:33:56 crc kubenswrapper[5002]: I1209 11:33:56.563203 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-wbv78" event={"ID":"08af662c-3e6c-4904-b409-bb179c6f45d1","Type":"ContainerDied","Data":"d9b186599c730c3ced7d8b2d84556ffd618c00151e035c62bce2480676ab9269"} Dec 09 11:33:56 crc kubenswrapper[5002]: I1209 11:33:56.563170 5002 generic.go:334] "Generic (PLEG): container finished" podID="08af662c-3e6c-4904-b409-bb179c6f45d1" containerID="d9b186599c730c3ced7d8b2d84556ffd618c00151e035c62bce2480676ab9269" exitCode=0 Dec 09 11:33:56 crc kubenswrapper[5002]: I1209 11:33:56.563297 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-wbv78" event={"ID":"08af662c-3e6c-4904-b409-bb179c6f45d1","Type":"ContainerStarted","Data":"64f6c7db48940ac228c1ddd62ff9a307cc64bb672b43a0ce0062d077ddaf5ce9"} Dec 09 11:33:56 crc kubenswrapper[5002]: I1209 11:33:56.566681 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-b588-account-create-update-tjhwj" event={"ID":"01960093-3aa2-428d-a587-3647c93e64e7","Type":"ContainerStarted","Data":"c8be20e4e0e5c214f785b464bd36592547711f4fddd96159be4aa632d51e3dbf"} Dec 09 11:33:56 crc kubenswrapper[5002]: I1209 11:33:56.566731 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-b588-account-create-update-tjhwj" event={"ID":"01960093-3aa2-428d-a587-3647c93e64e7","Type":"ContainerStarted","Data":"db703ddaa03bb21d83c8b81b5895d3ea09783cda6328c790922d06a460860dc0"} Dec 09 11:33:56 crc kubenswrapper[5002]: I1209 11:33:56.569186 5002 generic.go:334] "Generic (PLEG): container finished" podID="4d66cb19-429c-4c73-a24c-d8c49803146c" containerID="bc3058c8296bc813b787621a3c504285a0ae7c0905d7ec99d8318031a21d4e2f" exitCode=0 Dec 09 11:33:56 crc kubenswrapper[5002]: I1209 11:33:56.569263 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5a5c-account-create-update-wlfkx" event={"ID":"4d66cb19-429c-4c73-a24c-d8c49803146c","Type":"ContainerDied","Data":"bc3058c8296bc813b787621a3c504285a0ae7c0905d7ec99d8318031a21d4e2f"} Dec 09 11:33:56 crc kubenswrapper[5002]: I1209 11:33:56.571037 5002 generic.go:334] "Generic (PLEG): container finished" podID="7a5c9ceb-52aa-4490-b2da-4c0a8077cbb3" containerID="4943ee674f928e1daa38145cc1df49f274cc9de9527018f716f2a60fb5fd4d20" exitCode=0 Dec 09 11:33:56 crc kubenswrapper[5002]: I1209 11:33:56.571116 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-p7j5k" event={"ID":"7a5c9ceb-52aa-4490-b2da-4c0a8077cbb3","Type":"ContainerDied","Data":"4943ee674f928e1daa38145cc1df49f274cc9de9527018f716f2a60fb5fd4d20"}
event={"ID":"7a5c9ceb-52aa-4490-b2da-4c0a8077cbb3","Type":"ContainerDied","Data":"4943ee674f928e1daa38145cc1df49f274cc9de9527018f716f2a60fb5fd4d20"} Dec 09 11:33:56 crc kubenswrapper[5002]: I1209 11:33:56.571283 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-p7j5k" event={"ID":"7a5c9ceb-52aa-4490-b2da-4c0a8077cbb3","Type":"ContainerStarted","Data":"ead596395595ed37ed554dec3284a89014668b66650660f7bb6b9c3a29dd74fc"} Dec 09 11:33:56 crc kubenswrapper[5002]: I1209 11:33:56.573070 5002 generic.go:334] "Generic (PLEG): container finished" podID="3f5b89ee-003e-4323-ac0d-e9e22f329941" containerID="c62a7bd81f29cab3e89ac6d0e06dd2164973b7b6399f640e691f17b8ab0f37c3" exitCode=0 Dec 09 11:33:56 crc kubenswrapper[5002]: I1209 11:33:56.573108 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2680-account-create-update-xh47f" event={"ID":"3f5b89ee-003e-4323-ac0d-e9e22f329941","Type":"ContainerDied","Data":"c62a7bd81f29cab3e89ac6d0e06dd2164973b7b6399f640e691f17b8ab0f37c3"} Dec 09 11:33:56 crc kubenswrapper[5002]: I1209 11:33:56.573131 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2680-account-create-update-xh47f" event={"ID":"3f5b89ee-003e-4323-ac0d-e9e22f329941","Type":"ContainerStarted","Data":"b7363e48472ad36d52cd8256c655d9b925e1bdc4d4127e61f40a2ca0f356649a"} Dec 09 11:33:56 crc kubenswrapper[5002]: I1209 11:33:56.633670 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-b588-account-create-update-tjhwj" podStartSLOduration=2.633654251 podStartE2EDuration="2.633654251s" podCreationTimestamp="2025-12-09 11:33:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:33:56.631429441 +0000 UTC m=+5569.023480532" watchObservedRunningTime="2025-12-09 11:33:56.633654251 +0000 UTC m=+5569.025705332" Dec 09 11:33:57 crc kubenswrapper[5002]: I1209 11:33:57.583798 5002 generic.go:334] "Generic (PLEG): container finished" podID="01960093-3aa2-428d-a587-3647c93e64e7" containerID="c8be20e4e0e5c214f785b464bd36592547711f4fddd96159be4aa632d51e3dbf" exitCode=0 Dec 09 11:33:57 crc kubenswrapper[5002]: I1209 11:33:57.583863 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-b588-account-create-update-tjhwj" event={"ID":"01960093-3aa2-428d-a587-3647c93e64e7","Type":"ContainerDied","Data":"c8be20e4e0e5c214f785b464bd36592547711f4fddd96159be4aa632d51e3dbf"} Dec 09 11:33:57 crc kubenswrapper[5002]: I1209 11:33:57.955622 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-5a5c-account-create-update-wlfkx" Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.036692 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7qmz\" (UniqueName: \"kubernetes.io/projected/4d66cb19-429c-4c73-a24c-d8c49803146c-kube-api-access-v7qmz\") pod \"4d66cb19-429c-4c73-a24c-d8c49803146c\" (UID: \"4d66cb19-429c-4c73-a24c-d8c49803146c\") " Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.036767 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d66cb19-429c-4c73-a24c-d8c49803146c-operator-scripts\") pod \"4d66cb19-429c-4c73-a24c-d8c49803146c\" (UID: \"4d66cb19-429c-4c73-a24c-d8c49803146c\") " Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.037694 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d66cb19-429c-4c73-a24c-d8c49803146c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4d66cb19-429c-4c73-a24c-d8c49803146c" (UID: "4d66cb19-429c-4c73-a24c-d8c49803146c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.044084 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d66cb19-429c-4c73-a24c-d8c49803146c-kube-api-access-v7qmz" (OuterVolumeSpecName: "kube-api-access-v7qmz") pod "4d66cb19-429c-4c73-a24c-d8c49803146c" (UID: "4d66cb19-429c-4c73-a24c-d8c49803146c"). InnerVolumeSpecName "kube-api-access-v7qmz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.113860 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-lscbm" Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.119104 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-wbv78" Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.132397 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-p7j5k" Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.138768 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7qmz\" (UniqueName: \"kubernetes.io/projected/4d66cb19-429c-4c73-a24c-d8c49803146c-kube-api-access-v7qmz\") on node \"crc\" DevicePath \"\"" Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.138791 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d66cb19-429c-4c73-a24c-d8c49803146c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.143185 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-2680-account-create-update-xh47f" Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.239618 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a5c9ceb-52aa-4490-b2da-4c0a8077cbb3-operator-scripts\") pod \"7a5c9ceb-52aa-4490-b2da-4c0a8077cbb3\" (UID: \"7a5c9ceb-52aa-4490-b2da-4c0a8077cbb3\") " Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.239696 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fb9bq\" (UniqueName: \"kubernetes.io/projected/b5f4f809-9551-419b-9aac-57de880d3629-kube-api-access-fb9bq\") pod \"b5f4f809-9551-419b-9aac-57de880d3629\" (UID: \"b5f4f809-9551-419b-9aac-57de880d3629\") " Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.239856 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5f4f809-9551-419b-9aac-57de880d3629-operator-scripts\") pod \"b5f4f809-9551-419b-9aac-57de880d3629\" (UID: \"b5f4f809-9551-419b-9aac-57de880d3629\") " Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.239881 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xmhq\" (UniqueName: \"kubernetes.io/projected/3f5b89ee-003e-4323-ac0d-e9e22f329941-kube-api-access-5xmhq\") pod \"3f5b89ee-003e-4323-ac0d-e9e22f329941\" (UID: \"3f5b89ee-003e-4323-ac0d-e9e22f329941\") " Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.239934 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f5b89ee-003e-4323-ac0d-e9e22f329941-operator-scripts\") pod \"3f5b89ee-003e-4323-ac0d-e9e22f329941\" (UID: \"3f5b89ee-003e-4323-ac0d-e9e22f329941\") " Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.239985 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mq4zb\" (UniqueName: \"kubernetes.io/projected/7a5c9ceb-52aa-4490-b2da-4c0a8077cbb3-kube-api-access-mq4zb\") pod \"7a5c9ceb-52aa-4490-b2da-4c0a8077cbb3\" (UID: \"7a5c9ceb-52aa-4490-b2da-4c0a8077cbb3\") " Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.240036 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkc9z\" (UniqueName: \"kubernetes.io/projected/08af662c-3e6c-4904-b409-bb179c6f45d1-kube-api-access-xkc9z\") pod \"08af662c-3e6c-4904-b409-bb179c6f45d1\" (UID: \"08af662c-3e6c-4904-b409-bb179c6f45d1\") " Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.240075 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08af662c-3e6c-4904-b409-bb179c6f45d1-operator-scripts\") pod \"08af662c-3e6c-4904-b409-bb179c6f45d1\" (UID: \"08af662c-3e6c-4904-b409-bb179c6f45d1\") " Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.240857 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a5c9ceb-52aa-4490-b2da-4c0a8077cbb3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7a5c9ceb-52aa-4490-b2da-4c0a8077cbb3" (UID: "7a5c9ceb-52aa-4490-b2da-4c0a8077cbb3"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.240887 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5f4f809-9551-419b-9aac-57de880d3629-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b5f4f809-9551-419b-9aac-57de880d3629" (UID: "b5f4f809-9551-419b-9aac-57de880d3629"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.240914 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08af662c-3e6c-4904-b409-bb179c6f45d1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "08af662c-3e6c-4904-b409-bb179c6f45d1" (UID: "08af662c-3e6c-4904-b409-bb179c6f45d1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.241038 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08af662c-3e6c-4904-b409-bb179c6f45d1-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.241058 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a5c9ceb-52aa-4490-b2da-4c0a8077cbb3-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.241070 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5f4f809-9551-419b-9aac-57de880d3629-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.241525 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f5b89ee-003e-4323-ac0d-e9e22f329941-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3f5b89ee-003e-4323-ac0d-e9e22f329941" (UID: "3f5b89ee-003e-4323-ac0d-e9e22f329941"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.242662 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08af662c-3e6c-4904-b409-bb179c6f45d1-kube-api-access-xkc9z" (OuterVolumeSpecName: "kube-api-access-xkc9z") pod "08af662c-3e6c-4904-b409-bb179c6f45d1" (UID: "08af662c-3e6c-4904-b409-bb179c6f45d1"). InnerVolumeSpecName "kube-api-access-xkc9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.242693 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f5b89ee-003e-4323-ac0d-e9e22f329941-kube-api-access-5xmhq" (OuterVolumeSpecName: "kube-api-access-5xmhq") pod "3f5b89ee-003e-4323-ac0d-e9e22f329941" (UID: "3f5b89ee-003e-4323-ac0d-e9e22f329941"). InnerVolumeSpecName "kube-api-access-5xmhq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.243148 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5f4f809-9551-419b-9aac-57de880d3629-kube-api-access-fb9bq" (OuterVolumeSpecName: "kube-api-access-fb9bq") pod "b5f4f809-9551-419b-9aac-57de880d3629" (UID: "b5f4f809-9551-419b-9aac-57de880d3629"). InnerVolumeSpecName "kube-api-access-fb9bq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.243331 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a5c9ceb-52aa-4490-b2da-4c0a8077cbb3-kube-api-access-mq4zb" (OuterVolumeSpecName: "kube-api-access-mq4zb") pod "7a5c9ceb-52aa-4490-b2da-4c0a8077cbb3" (UID: "7a5c9ceb-52aa-4490-b2da-4c0a8077cbb3"). InnerVolumeSpecName "kube-api-access-mq4zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.342431 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f5b89ee-003e-4323-ac0d-e9e22f329941-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.342461 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mq4zb\" (UniqueName: \"kubernetes.io/projected/7a5c9ceb-52aa-4490-b2da-4c0a8077cbb3-kube-api-access-mq4zb\") on node \"crc\" DevicePath \"\"" Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.342472 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkc9z\" (UniqueName: \"kubernetes.io/projected/08af662c-3e6c-4904-b409-bb179c6f45d1-kube-api-access-xkc9z\") on node \"crc\" DevicePath \"\"" Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.342481 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fb9bq\" (UniqueName: \"kubernetes.io/projected/b5f4f809-9551-419b-9aac-57de880d3629-kube-api-access-fb9bq\") on node \"crc\" DevicePath \"\"" Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.342491 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xmhq\" (UniqueName: \"kubernetes.io/projected/3f5b89ee-003e-4323-ac0d-e9e22f329941-kube-api-access-5xmhq\") on node \"crc\" DevicePath \"\"" Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.596043 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-p7j5k" event={"ID":"7a5c9ceb-52aa-4490-b2da-4c0a8077cbb3","Type":"ContainerDied","Data":"ead596395595ed37ed554dec3284a89014668b66650660f7bb6b9c3a29dd74fc"} Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.596116 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ead596395595ed37ed554dec3284a89014668b66650660f7bb6b9c3a29dd74fc" Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.596137 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-p7j5k" Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.598342 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2680-account-create-update-xh47f" event={"ID":"3f5b89ee-003e-4323-ac0d-e9e22f329941","Type":"ContainerDied","Data":"b7363e48472ad36d52cd8256c655d9b925e1bdc4d4127e61f40a2ca0f356649a"} Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.598391 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7363e48472ad36d52cd8256c655d9b925e1bdc4d4127e61f40a2ca0f356649a" Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.598386 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-2680-account-create-update-xh47f" Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.600843 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-lscbm" Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.600799 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-lscbm" event={"ID":"b5f4f809-9551-419b-9aac-57de880d3629","Type":"ContainerDied","Data":"82e39d3a6e05272b54e59120120ef503d2faa2b46075ff2b7e8ea00aa416dbbf"} Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.600915 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82e39d3a6e05272b54e59120120ef503d2faa2b46075ff2b7e8ea00aa416dbbf" Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.602607 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-wbv78" Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.602618 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-wbv78" event={"ID":"08af662c-3e6c-4904-b409-bb179c6f45d1","Type":"ContainerDied","Data":"64f6c7db48940ac228c1ddd62ff9a307cc64bb672b43a0ce0062d077ddaf5ce9"} Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.602676 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64f6c7db48940ac228c1ddd62ff9a307cc64bb672b43a0ce0062d077ddaf5ce9" Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.604614 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-5a5c-account-create-update-wlfkx" Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.604599 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5a5c-account-create-update-wlfkx" event={"ID":"4d66cb19-429c-4c73-a24c-d8c49803146c","Type":"ContainerDied","Data":"fcc14f790af230730b42723e0c69116b86871118e9342e0180421e767ec2c1de"} Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.604852 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcc14f790af230730b42723e0c69116b86871118e9342e0180421e767ec2c1de" Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.832211 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-b588-account-create-update-tjhwj" Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.956784 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hslwr\" (UniqueName: \"kubernetes.io/projected/01960093-3aa2-428d-a587-3647c93e64e7-kube-api-access-hslwr\") pod \"01960093-3aa2-428d-a587-3647c93e64e7\" (UID: \"01960093-3aa2-428d-a587-3647c93e64e7\") " Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.956885 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01960093-3aa2-428d-a587-3647c93e64e7-operator-scripts\") pod \"01960093-3aa2-428d-a587-3647c93e64e7\" (UID: \"01960093-3aa2-428d-a587-3647c93e64e7\") " Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.957709 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01960093-3aa2-428d-a587-3647c93e64e7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "01960093-3aa2-428d-a587-3647c93e64e7" (UID: "01960093-3aa2-428d-a587-3647c93e64e7"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:33:58 crc kubenswrapper[5002]: I1209 11:33:58.961623 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01960093-3aa2-428d-a587-3647c93e64e7-kube-api-access-hslwr" (OuterVolumeSpecName: "kube-api-access-hslwr") pod "01960093-3aa2-428d-a587-3647c93e64e7" (UID: "01960093-3aa2-428d-a587-3647c93e64e7"). InnerVolumeSpecName "kube-api-access-hslwr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:33:59 crc kubenswrapper[5002]: I1209 11:33:59.058456 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hslwr\" (UniqueName: \"kubernetes.io/projected/01960093-3aa2-428d-a587-3647c93e64e7-kube-api-access-hslwr\") on node \"crc\" DevicePath \"\"" Dec 09 11:33:59 crc kubenswrapper[5002]: I1209 11:33:59.058482 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01960093-3aa2-428d-a587-3647c93e64e7-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:33:59 crc kubenswrapper[5002]: I1209 11:33:59.614439 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-b588-account-create-update-tjhwj" event={"ID":"01960093-3aa2-428d-a587-3647c93e64e7","Type":"ContainerDied","Data":"db703ddaa03bb21d83c8b81b5895d3ea09783cda6328c790922d06a460860dc0"} Dec 09 11:33:59 crc kubenswrapper[5002]: I1209 11:33:59.614481 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-b588-account-create-update-tjhwj" Dec 09 11:33:59 crc kubenswrapper[5002]: I1209 11:33:59.614487 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db703ddaa03bb21d83c8b81b5895d3ea09783cda6328c790922d06a460860dc0" Dec 09 11:34:00 crc kubenswrapper[5002]: I1209 11:34:00.080773 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bdrr7"] Dec 09 11:34:00 crc kubenswrapper[5002]: E1209 11:34:00.081256 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01960093-3aa2-428d-a587-3647c93e64e7" containerName="mariadb-account-create-update" Dec 09 11:34:00 crc kubenswrapper[5002]: I1209 11:34:00.081275 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="01960093-3aa2-428d-a587-3647c93e64e7" containerName="mariadb-account-create-update" Dec 09 11:34:00 crc kubenswrapper[5002]: E1209 11:34:00.081306 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d66cb19-429c-4c73-a24c-d8c49803146c" containerName="mariadb-account-create-update" Dec 09 11:34:00 crc kubenswrapper[5002]: I1209 11:34:00.081313 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d66cb19-429c-4c73-a24c-d8c49803146c" containerName="mariadb-account-create-update" Dec 09 11:34:00 crc kubenswrapper[5002]: E1209 11:34:00.081325 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08af662c-3e6c-4904-b409-bb179c6f45d1" containerName="mariadb-database-create" Dec 09 11:34:00 crc kubenswrapper[5002]: I1209 11:34:00.081334 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="08af662c-3e6c-4904-b409-bb179c6f45d1" containerName="mariadb-database-create" Dec 09 11:34:00 crc kubenswrapper[5002]: E1209 11:34:00.081345 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5f4f809-9551-419b-9aac-57de880d3629" containerName="mariadb-database-create" Dec 09 11:34:00 crc kubenswrapper[5002]: I1209 11:34:00.081352 5002 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="b5f4f809-9551-419b-9aac-57de880d3629" containerName="mariadb-database-create" Dec 09 11:34:00 crc kubenswrapper[5002]: E1209 11:34:00.081368 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a5c9ceb-52aa-4490-b2da-4c0a8077cbb3" containerName="mariadb-database-create" Dec 09 11:34:00 crc kubenswrapper[5002]: I1209 11:34:00.081375 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a5c9ceb-52aa-4490-b2da-4c0a8077cbb3" containerName="mariadb-database-create" Dec 09 11:34:00 crc kubenswrapper[5002]: E1209 11:34:00.081396 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f5b89ee-003e-4323-ac0d-e9e22f329941" containerName="mariadb-account-create-update" Dec 09 11:34:00 crc kubenswrapper[5002]: I1209 11:34:00.081403 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f5b89ee-003e-4323-ac0d-e9e22f329941" containerName="mariadb-account-create-update" Dec 09 11:34:00 crc kubenswrapper[5002]: I1209 11:34:00.081623 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="08af662c-3e6c-4904-b409-bb179c6f45d1" containerName="mariadb-database-create" Dec 09 11:34:00 crc kubenswrapper[5002]: I1209 11:34:00.081640 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d66cb19-429c-4c73-a24c-d8c49803146c" containerName="mariadb-account-create-update" Dec 09 11:34:00 crc kubenswrapper[5002]: I1209 11:34:00.081657 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f5b89ee-003e-4323-ac0d-e9e22f329941" containerName="mariadb-account-create-update" Dec 09 11:34:00 crc kubenswrapper[5002]: I1209 11:34:00.081671 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5f4f809-9551-419b-9aac-57de880d3629" containerName="mariadb-database-create" Dec 09 11:34:00 crc kubenswrapper[5002]: I1209 11:34:00.081683 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a5c9ceb-52aa-4490-b2da-4c0a8077cbb3" containerName="mariadb-database-create" Dec 09 11:34:00 crc kubenswrapper[5002]: I1209 11:34:00.081697 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="01960093-3aa2-428d-a587-3647c93e64e7" containerName="mariadb-account-create-update" Dec 09 11:34:00 crc kubenswrapper[5002]: I1209 11:34:00.082486 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-bdrr7" Dec 09 11:34:00 crc kubenswrapper[5002]: I1209 11:34:00.088380 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 09 11:34:00 crc kubenswrapper[5002]: I1209 11:34:00.088523 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 09 11:34:00 crc kubenswrapper[5002]: I1209 11:34:00.088586 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-nxdd2" Dec 09 11:34:00 crc kubenswrapper[5002]: I1209 11:34:00.111967 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bdrr7"] Dec 09 11:34:00 crc kubenswrapper[5002]: I1209 11:34:00.183186 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99st4\" (UniqueName: \"kubernetes.io/projected/977406ee-d447-4350-9dac-57ff3973d465-kube-api-access-99st4\") pod \"nova-cell0-conductor-db-sync-bdrr7\" (UID: \"977406ee-d447-4350-9dac-57ff3973d465\") " pod="openstack/nova-cell0-conductor-db-sync-bdrr7" Dec 09 11:34:00 crc kubenswrapper[5002]: I1209 11:34:00.183399 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/977406ee-d447-4350-9dac-57ff3973d465-scripts\") pod \"nova-cell0-conductor-db-sync-bdrr7\" (UID: \"977406ee-d447-4350-9dac-57ff3973d465\") " pod="openstack/nova-cell0-conductor-db-sync-bdrr7" Dec 09 11:34:00 crc kubenswrapper[5002]: I1209 11:34:00.183450 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/977406ee-d447-4350-9dac-57ff3973d465-config-data\") pod \"nova-cell0-conductor-db-sync-bdrr7\" (UID: \"977406ee-d447-4350-9dac-57ff3973d465\") " pod="openstack/nova-cell0-conductor-db-sync-bdrr7" Dec 09 11:34:00 crc kubenswrapper[5002]: I1209 11:34:00.183503 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/977406ee-d447-4350-9dac-57ff3973d465-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-bdrr7\" (UID: \"977406ee-d447-4350-9dac-57ff3973d465\") " pod="openstack/nova-cell0-conductor-db-sync-bdrr7" Dec 09 11:34:00 crc kubenswrapper[5002]: I1209 11:34:00.285434 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/977406ee-d447-4350-9dac-57ff3973d465-scripts\") pod \"nova-cell0-conductor-db-sync-bdrr7\" (UID: \"977406ee-d447-4350-9dac-57ff3973d465\") " pod="openstack/nova-cell0-conductor-db-sync-bdrr7" Dec 09 11:34:00 crc kubenswrapper[5002]: I1209 11:34:00.285500 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/977406ee-d447-4350-9dac-57ff3973d465-config-data\") pod \"nova-cell0-conductor-db-sync-bdrr7\" (UID: \"977406ee-d447-4350-9dac-57ff3973d465\") " pod="openstack/nova-cell0-conductor-db-sync-bdrr7" Dec 09 11:34:00 crc kubenswrapper[5002]: I1209 11:34:00.285563 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/977406ee-d447-4350-9dac-57ff3973d465-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-bdrr7\" (UID: 
\"977406ee-d447-4350-9dac-57ff3973d465\") " pod="openstack/nova-cell0-conductor-db-sync-bdrr7" Dec 09 11:34:00 crc kubenswrapper[5002]: I1209 11:34:00.285618 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99st4\" (UniqueName: \"kubernetes.io/projected/977406ee-d447-4350-9dac-57ff3973d465-kube-api-access-99st4\") pod \"nova-cell0-conductor-db-sync-bdrr7\" (UID: \"977406ee-d447-4350-9dac-57ff3973d465\") " pod="openstack/nova-cell0-conductor-db-sync-bdrr7" Dec 09 11:34:00 crc kubenswrapper[5002]: I1209 11:34:00.292300 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/977406ee-d447-4350-9dac-57ff3973d465-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-bdrr7\" (UID: \"977406ee-d447-4350-9dac-57ff3973d465\") " pod="openstack/nova-cell0-conductor-db-sync-bdrr7" Dec 09 11:34:00 crc kubenswrapper[5002]: I1209 11:34:00.292842 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/977406ee-d447-4350-9dac-57ff3973d465-config-data\") pod \"nova-cell0-conductor-db-sync-bdrr7\" (UID: \"977406ee-d447-4350-9dac-57ff3973d465\") " pod="openstack/nova-cell0-conductor-db-sync-bdrr7" Dec 09 11:34:00 crc kubenswrapper[5002]: I1209 11:34:00.293005 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/977406ee-d447-4350-9dac-57ff3973d465-scripts\") pod \"nova-cell0-conductor-db-sync-bdrr7\" (UID: \"977406ee-d447-4350-9dac-57ff3973d465\") " pod="openstack/nova-cell0-conductor-db-sync-bdrr7" Dec 09 11:34:00 crc kubenswrapper[5002]: I1209 11:34:00.305085 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99st4\" (UniqueName: \"kubernetes.io/projected/977406ee-d447-4350-9dac-57ff3973d465-kube-api-access-99st4\") pod \"nova-cell0-conductor-db-sync-bdrr7\" (UID: \"977406ee-d447-4350-9dac-57ff3973d465\") " pod="openstack/nova-cell0-conductor-db-sync-bdrr7" Dec 09 11:34:00 crc kubenswrapper[5002]: I1209 11:34:00.408295 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-bdrr7" Dec 09 11:34:00 crc kubenswrapper[5002]: I1209 11:34:00.702266 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bdrr7"] Dec 09 11:34:01 crc kubenswrapper[5002]: I1209 11:34:01.643685 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-bdrr7" event={"ID":"977406ee-d447-4350-9dac-57ff3973d465","Type":"ContainerStarted","Data":"b3d392e02c2adf871c8bfa7fec0bd8307d58287da6ff4b2875b7e99a0e4df62a"} Dec 09 11:34:01 crc kubenswrapper[5002]: I1209 11:34:01.644279 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-bdrr7" event={"ID":"977406ee-d447-4350-9dac-57ff3973d465","Type":"ContainerStarted","Data":"d3f29c6e173202938f240c1038bacfcec2e51f891fd00769ba40eb3cc87f44ec"} Dec 09 11:34:01 crc kubenswrapper[5002]: I1209 11:34:01.671908 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-bdrr7" podStartSLOduration=1.671880668 podStartE2EDuration="1.671880668s" podCreationTimestamp="2025-12-09 11:34:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:01.665500448 +0000 UTC m=+5574.057551539" watchObservedRunningTime="2025-12-09 11:34:01.671880668 +0000 UTC m=+5574.063931789" Dec 09 11:34:06 crc kubenswrapper[5002]: I1209 11:34:06.702341 5002 generic.go:334] "Generic (PLEG): container finished" podID="977406ee-d447-4350-9dac-57ff3973d465" containerID="b3d392e02c2adf871c8bfa7fec0bd8307d58287da6ff4b2875b7e99a0e4df62a" exitCode=0 Dec 09 11:34:06 crc kubenswrapper[5002]: I1209 11:34:06.702458 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-bdrr7" event={"ID":"977406ee-d447-4350-9dac-57ff3973d465","Type":"ContainerDied","Data":"b3d392e02c2adf871c8bfa7fec0bd8307d58287da6ff4b2875b7e99a0e4df62a"} Dec 09 11:34:08 crc kubenswrapper[5002]: I1209 11:34:08.006242 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-bdrr7" Dec 09 11:34:08 crc kubenswrapper[5002]: I1209 11:34:08.066283 5002 scope.go:117] "RemoveContainer" containerID="24eda190128d46e2bfa806f4839b38f2462cd8acaa8816efdf9934cf2dc46679" Dec 09 11:34:08 crc kubenswrapper[5002]: I1209 11:34:08.129296 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/977406ee-d447-4350-9dac-57ff3973d465-config-data\") pod \"977406ee-d447-4350-9dac-57ff3973d465\" (UID: \"977406ee-d447-4350-9dac-57ff3973d465\") " Dec 09 11:34:08 crc kubenswrapper[5002]: I1209 11:34:08.129361 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/977406ee-d447-4350-9dac-57ff3973d465-combined-ca-bundle\") pod \"977406ee-d447-4350-9dac-57ff3973d465\" (UID: \"977406ee-d447-4350-9dac-57ff3973d465\") " Dec 09 11:34:08 crc kubenswrapper[5002]: I1209 11:34:08.129416 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99st4\" (UniqueName: \"kubernetes.io/projected/977406ee-d447-4350-9dac-57ff3973d465-kube-api-access-99st4\") pod \"977406ee-d447-4350-9dac-57ff3973d465\" (UID: \"977406ee-d447-4350-9dac-57ff3973d465\") " Dec 09 11:34:08 crc kubenswrapper[5002]: I1209 11:34:08.129549 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/977406ee-d447-4350-9dac-57ff3973d465-scripts\") pod \"977406ee-d447-4350-9dac-57ff3973d465\" (UID: \"977406ee-d447-4350-9dac-57ff3973d465\") " Dec 09 11:34:08 crc kubenswrapper[5002]: I1209 11:34:08.135244 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/977406ee-d447-4350-9dac-57ff3973d465-scripts" (OuterVolumeSpecName: "scripts") pod "977406ee-d447-4350-9dac-57ff3973d465" (UID: "977406ee-d447-4350-9dac-57ff3973d465"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:34:08 crc kubenswrapper[5002]: I1209 11:34:08.135241 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/977406ee-d447-4350-9dac-57ff3973d465-kube-api-access-99st4" (OuterVolumeSpecName: "kube-api-access-99st4") pod "977406ee-d447-4350-9dac-57ff3973d465" (UID: "977406ee-d447-4350-9dac-57ff3973d465"). InnerVolumeSpecName "kube-api-access-99st4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:34:08 crc kubenswrapper[5002]: I1209 11:34:08.163904 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/977406ee-d447-4350-9dac-57ff3973d465-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "977406ee-d447-4350-9dac-57ff3973d465" (UID: "977406ee-d447-4350-9dac-57ff3973d465"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:34:08 crc kubenswrapper[5002]: I1209 11:34:08.168600 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/977406ee-d447-4350-9dac-57ff3973d465-config-data" (OuterVolumeSpecName: "config-data") pod "977406ee-d447-4350-9dac-57ff3973d465" (UID: "977406ee-d447-4350-9dac-57ff3973d465"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:34:08 crc kubenswrapper[5002]: I1209 11:34:08.232668 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/977406ee-d447-4350-9dac-57ff3973d465-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:34:08 crc kubenswrapper[5002]: I1209 11:34:08.232997 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/977406ee-d447-4350-9dac-57ff3973d465-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:34:08 crc kubenswrapper[5002]: I1209 11:34:08.233012 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/977406ee-d447-4350-9dac-57ff3973d465-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:34:08 crc kubenswrapper[5002]: I1209 11:34:08.233025 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99st4\" (UniqueName: \"kubernetes.io/projected/977406ee-d447-4350-9dac-57ff3973d465-kube-api-access-99st4\") on node \"crc\" DevicePath \"\"" Dec 09 11:34:08 crc kubenswrapper[5002]: I1209 11:34:08.725661 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-bdrr7" event={"ID":"977406ee-d447-4350-9dac-57ff3973d465","Type":"ContainerDied","Data":"d3f29c6e173202938f240c1038bacfcec2e51f891fd00769ba40eb3cc87f44ec"} Dec 09 11:34:08 crc kubenswrapper[5002]: I1209 11:34:08.725735 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3f29c6e173202938f240c1038bacfcec2e51f891fd00769ba40eb3cc87f44ec" Dec 09 11:34:08 crc kubenswrapper[5002]: I1209 11:34:08.725737 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-bdrr7" Dec 09 11:34:08 crc kubenswrapper[5002]: I1209 11:34:08.749802 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerStarted","Data":"c9279947ed00c5b6531641df0eb3e04f34e3d816632d088e326b1acbc67d09a2"} Dec 09 11:34:08 crc kubenswrapper[5002]: I1209 11:34:08.988890 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 09 11:34:08 crc kubenswrapper[5002]: E1209 11:34:08.989516 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="977406ee-d447-4350-9dac-57ff3973d465" containerName="nova-cell0-conductor-db-sync" Dec 09 11:34:08 crc kubenswrapper[5002]: I1209 11:34:08.989529 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="977406ee-d447-4350-9dac-57ff3973d465" containerName="nova-cell0-conductor-db-sync" Dec 09 11:34:09 crc kubenswrapper[5002]: I1209 11:34:09.015988 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="977406ee-d447-4350-9dac-57ff3973d465" containerName="nova-cell0-conductor-db-sync" Dec 09 11:34:09 crc kubenswrapper[5002]: I1209 11:34:09.039945 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 09 11:34:09 crc kubenswrapper[5002]: I1209 11:34:09.040048 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 09 11:34:09 crc kubenswrapper[5002]: I1209 11:34:09.044746 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-nxdd2" Dec 09 11:34:09 crc kubenswrapper[5002]: I1209 11:34:09.045015 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 09 11:34:09 crc kubenswrapper[5002]: I1209 11:34:09.187601 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bc03877-56b5-44c0-9565-ce459d9f28da-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"1bc03877-56b5-44c0-9565-ce459d9f28da\") " pod="openstack/nova-cell0-conductor-0" Dec 09 11:34:09 crc kubenswrapper[5002]: I1209 11:34:09.187646 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kcp2\" (UniqueName: \"kubernetes.io/projected/1bc03877-56b5-44c0-9565-ce459d9f28da-kube-api-access-7kcp2\") pod \"nova-cell0-conductor-0\" (UID: \"1bc03877-56b5-44c0-9565-ce459d9f28da\") " pod="openstack/nova-cell0-conductor-0" Dec 09 11:34:09 crc kubenswrapper[5002]: I1209 11:34:09.187728 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bc03877-56b5-44c0-9565-ce459d9f28da-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"1bc03877-56b5-44c0-9565-ce459d9f28da\") " pod="openstack/nova-cell0-conductor-0" Dec 09 11:34:09 crc kubenswrapper[5002]: I1209 11:34:09.289745 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bc03877-56b5-44c0-9565-ce459d9f28da-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"1bc03877-56b5-44c0-9565-ce459d9f28da\") " pod="openstack/nova-cell0-conductor-0" Dec 09 11:34:09 crc kubenswrapper[5002]: I1209 11:34:09.289990 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bc03877-56b5-44c0-9565-ce459d9f28da-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"1bc03877-56b5-44c0-9565-ce459d9f28da\") " pod="openstack/nova-cell0-conductor-0" Dec 09 11:34:09 crc kubenswrapper[5002]: I1209 11:34:09.290028 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kcp2\" (UniqueName: \"kubernetes.io/projected/1bc03877-56b5-44c0-9565-ce459d9f28da-kube-api-access-7kcp2\") pod \"nova-cell0-conductor-0\" (UID: \"1bc03877-56b5-44c0-9565-ce459d9f28da\") " pod="openstack/nova-cell0-conductor-0" Dec 09 11:34:09 crc kubenswrapper[5002]: I1209 11:34:09.296282 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bc03877-56b5-44c0-9565-ce459d9f28da-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"1bc03877-56b5-44c0-9565-ce459d9f28da\") " pod="openstack/nova-cell0-conductor-0" Dec 09 11:34:09 crc kubenswrapper[5002]: I1209 11:34:09.296379 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bc03877-56b5-44c0-9565-ce459d9f28da-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"1bc03877-56b5-44c0-9565-ce459d9f28da\") " pod="openstack/nova-cell0-conductor-0" Dec 09 11:34:09 crc kubenswrapper[5002]: I1209 11:34:09.309107 5002 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kcp2\" (UniqueName: \"kubernetes.io/projected/1bc03877-56b5-44c0-9565-ce459d9f28da-kube-api-access-7kcp2\") pod \"nova-cell0-conductor-0\" (UID: \"1bc03877-56b5-44c0-9565-ce459d9f28da\") " pod="openstack/nova-cell0-conductor-0" Dec 09 11:34:09 crc kubenswrapper[5002]: I1209 11:34:09.374921 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 09 11:34:09 crc kubenswrapper[5002]: I1209 11:34:09.841886 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 09 11:34:10 crc kubenswrapper[5002]: I1209 11:34:10.782580 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"1bc03877-56b5-44c0-9565-ce459d9f28da","Type":"ContainerStarted","Data":"0f4aef0a2426b1d7d164e37f0f54bc4b014cab5d7b256879db415d42eea46ca0"} Dec 09 11:34:10 crc kubenswrapper[5002]: I1209 11:34:10.783050 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"1bc03877-56b5-44c0-9565-ce459d9f28da","Type":"ContainerStarted","Data":"5ebe7d63153bd97ec1d9a3746c2e50eedbcfc28a2cfcba84efa0072399de1e6d"} Dec 09 11:34:10 crc kubenswrapper[5002]: I1209 11:34:10.783096 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 09 11:34:10 crc kubenswrapper[5002]: I1209 11:34:10.807348 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.807325894 podStartE2EDuration="2.807325894s" podCreationTimestamp="2025-12-09 11:34:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:10.802086603 +0000 UTC m=+5583.194137714" watchObservedRunningTime="2025-12-09 11:34:10.807325894 +0000 UTC m=+5583.199376975" Dec 09 11:34:19 crc kubenswrapper[5002]: I1209 11:34:19.410972 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 09 11:34:19 crc kubenswrapper[5002]: I1209 11:34:19.834865 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-qd2mx"] Dec 09 11:34:19 crc kubenswrapper[5002]: I1209 11:34:19.836499 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-qd2mx" Dec 09 11:34:19 crc kubenswrapper[5002]: I1209 11:34:19.839271 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 09 11:34:19 crc kubenswrapper[5002]: I1209 11:34:19.846624 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-qd2mx"] Dec 09 11:34:19 crc kubenswrapper[5002]: I1209 11:34:19.850986 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 09 11:34:19 crc kubenswrapper[5002]: I1209 11:34:19.981129 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fb0a2a3-32f7-4406-9a40-db702f2e2786-scripts\") pod \"nova-cell0-cell-mapping-qd2mx\" (UID: \"3fb0a2a3-32f7-4406-9a40-db702f2e2786\") " pod="openstack/nova-cell0-cell-mapping-qd2mx" Dec 09 11:34:19 crc kubenswrapper[5002]: I1209 11:34:19.981283 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fb0a2a3-32f7-4406-9a40-db702f2e2786-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-qd2mx\" (UID: \"3fb0a2a3-32f7-4406-9a40-db702f2e2786\") " pod="openstack/nova-cell0-cell-mapping-qd2mx" Dec 09 11:34:19 crc kubenswrapper[5002]: I1209 11:34:19.981386 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kwbg\" (UniqueName: \"kubernetes.io/projected/3fb0a2a3-32f7-4406-9a40-db702f2e2786-kube-api-access-2kwbg\") pod \"nova-cell0-cell-mapping-qd2mx\" (UID: \"3fb0a2a3-32f7-4406-9a40-db702f2e2786\") " pod="openstack/nova-cell0-cell-mapping-qd2mx" Dec 09 11:34:19 crc kubenswrapper[5002]: I1209 11:34:19.981425 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fb0a2a3-32f7-4406-9a40-db702f2e2786-config-data\") pod \"nova-cell0-cell-mapping-qd2mx\" (UID: \"3fb0a2a3-32f7-4406-9a40-db702f2e2786\") " pod="openstack/nova-cell0-cell-mapping-qd2mx" Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.002479 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.004293 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.007346 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.022935 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.024960 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.028884 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.038949 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.082731 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.083677 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kwbg\" (UniqueName: \"kubernetes.io/projected/3fb0a2a3-32f7-4406-9a40-db702f2e2786-kube-api-access-2kwbg\") pod \"nova-cell0-cell-mapping-qd2mx\" (UID: \"3fb0a2a3-32f7-4406-9a40-db702f2e2786\") " pod="openstack/nova-cell0-cell-mapping-qd2mx" Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.083719 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fb0a2a3-32f7-4406-9a40-db702f2e2786-config-data\") pod \"nova-cell0-cell-mapping-qd2mx\" (UID: \"3fb0a2a3-32f7-4406-9a40-db702f2e2786\") " pod="openstack/nova-cell0-cell-mapping-qd2mx" Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.083795 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79b301c9-b2dc-4e36-9940-cd6ac885fd6d-logs\") pod \"nova-api-0\" (UID: \"79b301c9-b2dc-4e36-9940-cd6ac885fd6d\") " pod="openstack/nova-api-0" Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.083875 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fb0a2a3-32f7-4406-9a40-db702f2e2786-scripts\") pod \"nova-cell0-cell-mapping-qd2mx\" (UID: \"3fb0a2a3-32f7-4406-9a40-db702f2e2786\") " pod="openstack/nova-cell0-cell-mapping-qd2mx" Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.083933 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdqg2\" (UniqueName: \"kubernetes.io/projected/79b301c9-b2dc-4e36-9940-cd6ac885fd6d-kube-api-access-tdqg2\") pod \"nova-api-0\" (UID: \"79b301c9-b2dc-4e36-9940-cd6ac885fd6d\") " pod="openstack/nova-api-0" Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.083970 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79b301c9-b2dc-4e36-9940-cd6ac885fd6d-config-data\") pod \"nova-api-0\" (UID: \"79b301c9-b2dc-4e36-9940-cd6ac885fd6d\") " pod="openstack/nova-api-0" Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.084015 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9faf59d-a3e1-4d74-acd5-481eebd94f63-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e9faf59d-a3e1-4d74-acd5-481eebd94f63\") " pod="openstack/nova-scheduler-0" Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.084033 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9faf59d-a3e1-4d74-acd5-481eebd94f63-config-data\") pod \"nova-scheduler-0\" (UID: \"e9faf59d-a3e1-4d74-acd5-481eebd94f63\") " pod="openstack/nova-scheduler-0" 
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.084050 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79b301c9-b2dc-4e36-9940-cd6ac885fd6d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"79b301c9-b2dc-4e36-9940-cd6ac885fd6d\") " pod="openstack/nova-api-0"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.084111 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fb0a2a3-32f7-4406-9a40-db702f2e2786-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-qd2mx\" (UID: \"3fb0a2a3-32f7-4406-9a40-db702f2e2786\") " pod="openstack/nova-cell0-cell-mapping-qd2mx"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.084165 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5stz\" (UniqueName: \"kubernetes.io/projected/e9faf59d-a3e1-4d74-acd5-481eebd94f63-kube-api-access-f5stz\") pod \"nova-scheduler-0\" (UID: \"e9faf59d-a3e1-4d74-acd5-481eebd94f63\") " pod="openstack/nova-scheduler-0"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.090610 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fb0a2a3-32f7-4406-9a40-db702f2e2786-config-data\") pod \"nova-cell0-cell-mapping-qd2mx\" (UID: \"3fb0a2a3-32f7-4406-9a40-db702f2e2786\") " pod="openstack/nova-cell0-cell-mapping-qd2mx"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.092797 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fb0a2a3-32f7-4406-9a40-db702f2e2786-scripts\") pod \"nova-cell0-cell-mapping-qd2mx\" (UID: \"3fb0a2a3-32f7-4406-9a40-db702f2e2786\") " pod="openstack/nova-cell0-cell-mapping-qd2mx"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.115485 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kwbg\" (UniqueName: \"kubernetes.io/projected/3fb0a2a3-32f7-4406-9a40-db702f2e2786-kube-api-access-2kwbg\") pod \"nova-cell0-cell-mapping-qd2mx\" (UID: \"3fb0a2a3-32f7-4406-9a40-db702f2e2786\") " pod="openstack/nova-cell0-cell-mapping-qd2mx"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.124271 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fb0a2a3-32f7-4406-9a40-db702f2e2786-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-qd2mx\" (UID: \"3fb0a2a3-32f7-4406-9a40-db702f2e2786\") " pod="openstack/nova-cell0-cell-mapping-qd2mx"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.139276 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.140711 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.151175 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.159349 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.187151 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-qd2mx"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.187724 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79b301c9-b2dc-4e36-9940-cd6ac885fd6d-logs\") pod \"nova-api-0\" (UID: \"79b301c9-b2dc-4e36-9940-cd6ac885fd6d\") " pod="openstack/nova-api-0"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.187833 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdqg2\" (UniqueName: \"kubernetes.io/projected/79b301c9-b2dc-4e36-9940-cd6ac885fd6d-kube-api-access-tdqg2\") pod \"nova-api-0\" (UID: \"79b301c9-b2dc-4e36-9940-cd6ac885fd6d\") " pod="openstack/nova-api-0"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.187868 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79b301c9-b2dc-4e36-9940-cd6ac885fd6d-config-data\") pod \"nova-api-0\" (UID: \"79b301c9-b2dc-4e36-9940-cd6ac885fd6d\") " pod="openstack/nova-api-0"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.187894 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9faf59d-a3e1-4d74-acd5-481eebd94f63-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e9faf59d-a3e1-4d74-acd5-481eebd94f63\") " pod="openstack/nova-scheduler-0"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.187916 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9faf59d-a3e1-4d74-acd5-481eebd94f63-config-data\") pod \"nova-scheduler-0\" (UID: \"e9faf59d-a3e1-4d74-acd5-481eebd94f63\") " pod="openstack/nova-scheduler-0"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.187938 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79b301c9-b2dc-4e36-9940-cd6ac885fd6d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"79b301c9-b2dc-4e36-9940-cd6ac885fd6d\") " pod="openstack/nova-api-0"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.188017 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5stz\" (UniqueName: \"kubernetes.io/projected/e9faf59d-a3e1-4d74-acd5-481eebd94f63-kube-api-access-f5stz\") pod \"nova-scheduler-0\" (UID: \"e9faf59d-a3e1-4d74-acd5-481eebd94f63\") " pod="openstack/nova-scheduler-0"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.188829 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79b301c9-b2dc-4e36-9940-cd6ac885fd6d-logs\") pod \"nova-api-0\" (UID: \"79b301c9-b2dc-4e36-9940-cd6ac885fd6d\") " pod="openstack/nova-api-0"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.196037 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79b301c9-b2dc-4e36-9940-cd6ac885fd6d-config-data\") pod \"nova-api-0\" (UID: \"79b301c9-b2dc-4e36-9940-cd6ac885fd6d\") " pod="openstack/nova-api-0"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.196675 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9faf59d-a3e1-4d74-acd5-481eebd94f63-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e9faf59d-a3e1-4d74-acd5-481eebd94f63\") " pod="openstack/nova-scheduler-0"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.203397 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9faf59d-a3e1-4d74-acd5-481eebd94f63-config-data\") pod \"nova-scheduler-0\" (UID: \"e9faf59d-a3e1-4d74-acd5-481eebd94f63\") " pod="openstack/nova-scheduler-0"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.204547 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79b301c9-b2dc-4e36-9940-cd6ac885fd6d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"79b301c9-b2dc-4e36-9940-cd6ac885fd6d\") " pod="openstack/nova-api-0"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.216376 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5stz\" (UniqueName: \"kubernetes.io/projected/e9faf59d-a3e1-4d74-acd5-481eebd94f63-kube-api-access-f5stz\") pod \"nova-scheduler-0\" (UID: \"e9faf59d-a3e1-4d74-acd5-481eebd94f63\") " pod="openstack/nova-scheduler-0"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.230636 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdqg2\" (UniqueName: \"kubernetes.io/projected/79b301c9-b2dc-4e36-9940-cd6ac885fd6d-kube-api-access-tdqg2\") pod \"nova-api-0\" (UID: \"79b301c9-b2dc-4e36-9940-cd6ac885fd6d\") " pod="openstack/nova-api-0"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.259697 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.260895 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.271279 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.289514 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d160668f-3435-4b33-ac14-34d6a19c9b62-config-data\") pod \"nova-metadata-0\" (UID: \"d160668f-3435-4b33-ac14-34d6a19c9b62\") " pod="openstack/nova-metadata-0"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.289556 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlsnd\" (UniqueName: \"kubernetes.io/projected/d160668f-3435-4b33-ac14-34d6a19c9b62-kube-api-access-dlsnd\") pod \"nova-metadata-0\" (UID: \"d160668f-3435-4b33-ac14-34d6a19c9b62\") " pod="openstack/nova-metadata-0"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.289584 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d160668f-3435-4b33-ac14-34d6a19c9b62-logs\") pod \"nova-metadata-0\" (UID: \"d160668f-3435-4b33-ac14-34d6a19c9b62\") " pod="openstack/nova-metadata-0"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.289649 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d160668f-3435-4b33-ac14-34d6a19c9b62-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d160668f-3435-4b33-ac14-34d6a19c9b62\") " pod="openstack/nova-metadata-0"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.300217 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-696f9966c7-n4m46"]
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.303108 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-696f9966c7-n4m46"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.321780 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-696f9966c7-n4m46"]
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.331664 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.342357 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.353374 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.391086 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d160668f-3435-4b33-ac14-34d6a19c9b62-config-data\") pod \"nova-metadata-0\" (UID: \"d160668f-3435-4b33-ac14-34d6a19c9b62\") " pod="openstack/nova-metadata-0"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.391131 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlsnd\" (UniqueName: \"kubernetes.io/projected/d160668f-3435-4b33-ac14-34d6a19c9b62-kube-api-access-dlsnd\") pod \"nova-metadata-0\" (UID: \"d160668f-3435-4b33-ac14-34d6a19c9b62\") " pod="openstack/nova-metadata-0"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.391162 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrqj5\" (UniqueName: \"kubernetes.io/projected/902ce8a4-b7c1-4b23-a506-f66fb2c84cb0-kube-api-access-lrqj5\") pod \"dnsmasq-dns-696f9966c7-n4m46\" (UID: \"902ce8a4-b7c1-4b23-a506-f66fb2c84cb0\") " pod="openstack/dnsmasq-dns-696f9966c7-n4m46"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.391185 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d160668f-3435-4b33-ac14-34d6a19c9b62-logs\") pod \"nova-metadata-0\" (UID: \"d160668f-3435-4b33-ac14-34d6a19c9b62\") " pod="openstack/nova-metadata-0"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.391216 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/902ce8a4-b7c1-4b23-a506-f66fb2c84cb0-ovsdbserver-sb\") pod \"dnsmasq-dns-696f9966c7-n4m46\" (UID: \"902ce8a4-b7c1-4b23-a506-f66fb2c84cb0\") " pod="openstack/dnsmasq-dns-696f9966c7-n4m46"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.391279 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d160668f-3435-4b33-ac14-34d6a19c9b62-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d160668f-3435-4b33-ac14-34d6a19c9b62\") " pod="openstack/nova-metadata-0"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.391949 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/902ce8a4-b7c1-4b23-a506-f66fb2c84cb0-dns-svc\") pod \"dnsmasq-dns-696f9966c7-n4m46\" (UID: \"902ce8a4-b7c1-4b23-a506-f66fb2c84cb0\") " pod="openstack/dnsmasq-dns-696f9966c7-n4m46"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.392103 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3483538-8c2c-4c39-a9f1-5c8b6779d43a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b3483538-8c2c-4c39-a9f1-5c8b6779d43a\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.392263 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/902ce8a4-b7c1-4b23-a506-f66fb2c84cb0-ovsdbserver-nb\") pod \"dnsmasq-dns-696f9966c7-n4m46\" (UID: \"902ce8a4-b7c1-4b23-a506-f66fb2c84cb0\") " pod="openstack/dnsmasq-dns-696f9966c7-n4m46"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.392489 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/902ce8a4-b7c1-4b23-a506-f66fb2c84cb0-config\") pod \"dnsmasq-dns-696f9966c7-n4m46\" (UID: \"902ce8a4-b7c1-4b23-a506-f66fb2c84cb0\") " pod="openstack/dnsmasq-dns-696f9966c7-n4m46"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.392694 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccwq7\" (UniqueName: \"kubernetes.io/projected/b3483538-8c2c-4c39-a9f1-5c8b6779d43a-kube-api-access-ccwq7\") pod \"nova-cell1-novncproxy-0\" (UID: \"b3483538-8c2c-4c39-a9f1-5c8b6779d43a\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.393120 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3483538-8c2c-4c39-a9f1-5c8b6779d43a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b3483538-8c2c-4c39-a9f1-5c8b6779d43a\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.394838 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d160668f-3435-4b33-ac14-34d6a19c9b62-config-data\") pod \"nova-metadata-0\" (UID: \"d160668f-3435-4b33-ac14-34d6a19c9b62\") " pod="openstack/nova-metadata-0"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.395104 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d160668f-3435-4b33-ac14-34d6a19c9b62-logs\") pod \"nova-metadata-0\" (UID: \"d160668f-3435-4b33-ac14-34d6a19c9b62\") " pod="openstack/nova-metadata-0"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.400277 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d160668f-3435-4b33-ac14-34d6a19c9b62-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d160668f-3435-4b33-ac14-34d6a19c9b62\") " pod="openstack/nova-metadata-0"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.425441 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlsnd\" (UniqueName: \"kubernetes.io/projected/d160668f-3435-4b33-ac14-34d6a19c9b62-kube-api-access-dlsnd\") pod \"nova-metadata-0\" (UID: \"d160668f-3435-4b33-ac14-34d6a19c9b62\") " pod="openstack/nova-metadata-0"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.495200 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccwq7\" (UniqueName: \"kubernetes.io/projected/b3483538-8c2c-4c39-a9f1-5c8b6779d43a-kube-api-access-ccwq7\") pod \"nova-cell1-novncproxy-0\" (UID: \"b3483538-8c2c-4c39-a9f1-5c8b6779d43a\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.495346 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3483538-8c2c-4c39-a9f1-5c8b6779d43a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b3483538-8c2c-4c39-a9f1-5c8b6779d43a\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.495435 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrqj5\" (UniqueName: \"kubernetes.io/projected/902ce8a4-b7c1-4b23-a506-f66fb2c84cb0-kube-api-access-lrqj5\") pod \"dnsmasq-dns-696f9966c7-n4m46\" (UID: \"902ce8a4-b7c1-4b23-a506-f66fb2c84cb0\") " pod="openstack/dnsmasq-dns-696f9966c7-n4m46"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.495482 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/902ce8a4-b7c1-4b23-a506-f66fb2c84cb0-ovsdbserver-sb\") pod \"dnsmasq-dns-696f9966c7-n4m46\" (UID: \"902ce8a4-b7c1-4b23-a506-f66fb2c84cb0\") " pod="openstack/dnsmasq-dns-696f9966c7-n4m46"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.495589 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/902ce8a4-b7c1-4b23-a506-f66fb2c84cb0-dns-svc\") pod \"dnsmasq-dns-696f9966c7-n4m46\" (UID: \"902ce8a4-b7c1-4b23-a506-f66fb2c84cb0\") " pod="openstack/dnsmasq-dns-696f9966c7-n4m46"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.495616 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3483538-8c2c-4c39-a9f1-5c8b6779d43a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b3483538-8c2c-4c39-a9f1-5c8b6779d43a\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.495638 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/902ce8a4-b7c1-4b23-a506-f66fb2c84cb0-ovsdbserver-nb\") pod \"dnsmasq-dns-696f9966c7-n4m46\" (UID: \"902ce8a4-b7c1-4b23-a506-f66fb2c84cb0\") " pod="openstack/dnsmasq-dns-696f9966c7-n4m46"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.495678 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/902ce8a4-b7c1-4b23-a506-f66fb2c84cb0-config\") pod \"dnsmasq-dns-696f9966c7-n4m46\" (UID: \"902ce8a4-b7c1-4b23-a506-f66fb2c84cb0\") " pod="openstack/dnsmasq-dns-696f9966c7-n4m46"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.496747 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/902ce8a4-b7c1-4b23-a506-f66fb2c84cb0-config\") pod \"dnsmasq-dns-696f9966c7-n4m46\" (UID: \"902ce8a4-b7c1-4b23-a506-f66fb2c84cb0\") " pod="openstack/dnsmasq-dns-696f9966c7-n4m46"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.497620 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/902ce8a4-b7c1-4b23-a506-f66fb2c84cb0-ovsdbserver-sb\") pod \"dnsmasq-dns-696f9966c7-n4m46\" (UID: \"902ce8a4-b7c1-4b23-a506-f66fb2c84cb0\") " pod="openstack/dnsmasq-dns-696f9966c7-n4m46"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.497864 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/902ce8a4-b7c1-4b23-a506-f66fb2c84cb0-dns-svc\") pod \"dnsmasq-dns-696f9966c7-n4m46\" (UID: \"902ce8a4-b7c1-4b23-a506-f66fb2c84cb0\") " pod="openstack/dnsmasq-dns-696f9966c7-n4m46"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.498508 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/902ce8a4-b7c1-4b23-a506-f66fb2c84cb0-ovsdbserver-nb\") pod \"dnsmasq-dns-696f9966c7-n4m46\" (UID: \"902ce8a4-b7c1-4b23-a506-f66fb2c84cb0\") " pod="openstack/dnsmasq-dns-696f9966c7-n4m46"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.513363 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3483538-8c2c-4c39-a9f1-5c8b6779d43a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b3483538-8c2c-4c39-a9f1-5c8b6779d43a\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.513629 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3483538-8c2c-4c39-a9f1-5c8b6779d43a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b3483538-8c2c-4c39-a9f1-5c8b6779d43a\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.518739 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccwq7\" (UniqueName: \"kubernetes.io/projected/b3483538-8c2c-4c39-a9f1-5c8b6779d43a-kube-api-access-ccwq7\") pod \"nova-cell1-novncproxy-0\" (UID: \"b3483538-8c2c-4c39-a9f1-5c8b6779d43a\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.518965 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrqj5\" (UniqueName: \"kubernetes.io/projected/902ce8a4-b7c1-4b23-a506-f66fb2c84cb0-kube-api-access-lrqj5\") pod \"dnsmasq-dns-696f9966c7-n4m46\" (UID: \"902ce8a4-b7c1-4b23-a506-f66fb2c84cb0\") " pod="openstack/dnsmasq-dns-696f9966c7-n4m46"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.670715 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.695895 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.719100 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-696f9966c7-n4m46"
Need to start a new one" pod="openstack/dnsmasq-dns-696f9966c7-n4m46" Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.852634 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-qd2mx"] Dec 09 11:34:20 crc kubenswrapper[5002]: W1209 11:34:20.906777 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3fb0a2a3_32f7_4406_9a40_db702f2e2786.slice/crio-713cc5925a4e2762db8203567301fdef93e5bd6bba8bbd0529007ad655a1b1fd WatchSource:0}: Error finding container 713cc5925a4e2762db8203567301fdef93e5bd6bba8bbd0529007ad655a1b1fd: Status 404 returned error can't find the container with id 713cc5925a4e2762db8203567301fdef93e5bd6bba8bbd0529007ad655a1b1fd Dec 09 11:34:20 crc kubenswrapper[5002]: I1209 11:34:20.940117 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 09 11:34:21 crc kubenswrapper[5002]: I1209 11:34:21.006957 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xxkmb"] Dec 09 11:34:21 crc kubenswrapper[5002]: I1209 11:34:21.009731 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xxkmb" Dec 09 11:34:21 crc kubenswrapper[5002]: I1209 11:34:21.024971 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 09 11:34:21 crc kubenswrapper[5002]: I1209 11:34:21.025066 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 09 11:34:21 crc kubenswrapper[5002]: W1209 11:34:21.026071 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9faf59d_a3e1_4d74_acd5_481eebd94f63.slice/crio-0e739f6dc53124f6994b215ab19565b78ddf25ef4030a98b6aeb284b912fd585 WatchSource:0}: Error finding container 0e739f6dc53124f6994b215ab19565b78ddf25ef4030a98b6aeb284b912fd585: Status 404 returned error can't find the container with id 0e739f6dc53124f6994b215ab19565b78ddf25ef4030a98b6aeb284b912fd585 Dec 09 11:34:21 crc kubenswrapper[5002]: I1209 11:34:21.034628 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xxkmb"] Dec 09 11:34:21 crc kubenswrapper[5002]: I1209 11:34:21.064399 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 11:34:21 crc kubenswrapper[5002]: I1209 11:34:21.110557 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e771094a-9f24-43c1-be1f-3efca4e1b42c-scripts\") pod \"nova-cell1-conductor-db-sync-xxkmb\" (UID: \"e771094a-9f24-43c1-be1f-3efca4e1b42c\") " pod="openstack/nova-cell1-conductor-db-sync-xxkmb" Dec 09 11:34:21 crc kubenswrapper[5002]: I1209 11:34:21.110880 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e771094a-9f24-43c1-be1f-3efca4e1b42c-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-xxkmb\" (UID: \"e771094a-9f24-43c1-be1f-3efca4e1b42c\") " pod="openstack/nova-cell1-conductor-db-sync-xxkmb" Dec 09 11:34:21 crc kubenswrapper[5002]: I1209 11:34:21.110922 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e771094a-9f24-43c1-be1f-3efca4e1b42c-config-data\") pod \"nova-cell1-conductor-db-sync-xxkmb\" (UID: \"e771094a-9f24-43c1-be1f-3efca4e1b42c\") " pod="openstack/nova-cell1-conductor-db-sync-xxkmb" Dec 09 11:34:21 crc kubenswrapper[5002]: I1209 11:34:21.110951 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f62wd\" (UniqueName: \"kubernetes.io/projected/e771094a-9f24-43c1-be1f-3efca4e1b42c-kube-api-access-f62wd\") pod \"nova-cell1-conductor-db-sync-xxkmb\" (UID: \"e771094a-9f24-43c1-be1f-3efca4e1b42c\") " pod="openstack/nova-cell1-conductor-db-sync-xxkmb" Dec 09 11:34:21 crc kubenswrapper[5002]: I1209 11:34:21.194926 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 11:34:21 crc kubenswrapper[5002]: W1209 11:34:21.203150 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd160668f_3435_4b33_ac14_34d6a19c9b62.slice/crio-9e2a56495add99dcfbd41c06a4bceb2cff38adfec7188e1d665580af1cf718b5 WatchSource:0}: Error finding container 9e2a56495add99dcfbd41c06a4bceb2cff38adfec7188e1d665580af1cf718b5: Status 404 returned error can't find the container with id 9e2a56495add99dcfbd41c06a4bceb2cff38adfec7188e1d665580af1cf718b5 Dec 09 11:34:21 crc kubenswrapper[5002]: I1209 11:34:21.215844 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f62wd\" (UniqueName: \"kubernetes.io/projected/e771094a-9f24-43c1-be1f-3efca4e1b42c-kube-api-access-f62wd\") pod \"nova-cell1-conductor-db-sync-xxkmb\" (UID: \"e771094a-9f24-43c1-be1f-3efca4e1b42c\") " pod="openstack/nova-cell1-conductor-db-sync-xxkmb" Dec 09 11:34:21 crc kubenswrapper[5002]: I1209 11:34:21.215971 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e771094a-9f24-43c1-be1f-3efca4e1b42c-scripts\") pod \"nova-cell1-conductor-db-sync-xxkmb\" (UID: \"e771094a-9f24-43c1-be1f-3efca4e1b42c\") " pod="openstack/nova-cell1-conductor-db-sync-xxkmb" Dec 09 11:34:21 crc kubenswrapper[5002]: I1209 11:34:21.216028 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e771094a-9f24-43c1-be1f-3efca4e1b42c-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-xxkmb\" (UID: \"e771094a-9f24-43c1-be1f-3efca4e1b42c\") " pod="openstack/nova-cell1-conductor-db-sync-xxkmb" Dec 09 11:34:21 crc kubenswrapper[5002]: I1209 11:34:21.216433 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e771094a-9f24-43c1-be1f-3efca4e1b42c-config-data\") pod \"nova-cell1-conductor-db-sync-xxkmb\" (UID: \"e771094a-9f24-43c1-be1f-3efca4e1b42c\") " pod="openstack/nova-cell1-conductor-db-sync-xxkmb" Dec 09 11:34:21 crc kubenswrapper[5002]: I1209 11:34:21.221734 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e771094a-9f24-43c1-be1f-3efca4e1b42c-config-data\") pod \"nova-cell1-conductor-db-sync-xxkmb\" (UID: \"e771094a-9f24-43c1-be1f-3efca4e1b42c\") " pod="openstack/nova-cell1-conductor-db-sync-xxkmb" Dec 09 11:34:21 crc kubenswrapper[5002]: I1209 11:34:21.222238 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e771094a-9f24-43c1-be1f-3efca4e1b42c-scripts\") pod \"nova-cell1-conductor-db-sync-xxkmb\" (UID: \"e771094a-9f24-43c1-be1f-3efca4e1b42c\") " pod="openstack/nova-cell1-conductor-db-sync-xxkmb" Dec 09 11:34:21 crc kubenswrapper[5002]: I1209 11:34:21.224744 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e771094a-9f24-43c1-be1f-3efca4e1b42c-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-xxkmb\" (UID: \"e771094a-9f24-43c1-be1f-3efca4e1b42c\") " pod="openstack/nova-cell1-conductor-db-sync-xxkmb" Dec 09 11:34:21 crc kubenswrapper[5002]: I1209 11:34:21.238133 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f62wd\" (UniqueName: \"kubernetes.io/projected/e771094a-9f24-43c1-be1f-3efca4e1b42c-kube-api-access-f62wd\") pod \"nova-cell1-conductor-db-sync-xxkmb\" (UID: \"e771094a-9f24-43c1-be1f-3efca4e1b42c\") " pod="openstack/nova-cell1-conductor-db-sync-xxkmb" Dec 09 11:34:21 crc kubenswrapper[5002]: I1209 11:34:21.323106 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 09 11:34:21 crc kubenswrapper[5002]: I1209 11:34:21.344282 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xxkmb" Dec 09 11:34:21 crc kubenswrapper[5002]: I1209 11:34:21.434741 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-696f9966c7-n4m46"] Dec 09 11:34:21 crc kubenswrapper[5002]: W1209 11:34:21.454661 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod902ce8a4_b7c1_4b23_a506_f66fb2c84cb0.slice/crio-72b83b503b2ba78c9f7744aa21e836956d24071948013d03beb51eb281e5cd46 WatchSource:0}: Error finding container 72b83b503b2ba78c9f7744aa21e836956d24071948013d03beb51eb281e5cd46: Status 404 returned error can't find the container with id 72b83b503b2ba78c9f7744aa21e836956d24071948013d03beb51eb281e5cd46 Dec 09 11:34:21 crc kubenswrapper[5002]: I1209 11:34:21.915981 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"79b301c9-b2dc-4e36-9940-cd6ac885fd6d","Type":"ContainerStarted","Data":"cb82877223675fe9b44ffcb1eaa5bd6256170c97eb8e82caf200449a75461b05"} Dec 09 11:34:21 crc kubenswrapper[5002]: I1209 11:34:21.916336 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"79b301c9-b2dc-4e36-9940-cd6ac885fd6d","Type":"ContainerStarted","Data":"033d6d1d7f3c660571f1423051490f89b87cbe00a1c1834ac9fe96cf15c5ce09"} Dec 09 11:34:21 crc kubenswrapper[5002]: I1209 11:34:21.919654 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xxkmb"] Dec 09 11:34:21 crc kubenswrapper[5002]: I1209 11:34:21.922102 5002 generic.go:334] "Generic (PLEG): container finished" podID="902ce8a4-b7c1-4b23-a506-f66fb2c84cb0" containerID="1464fed4c2ffc7539d32bdc329de1df2b4ee9271af080b41cf86c5a18184e769" exitCode=0 Dec 09 11:34:21 crc kubenswrapper[5002]: I1209 11:34:21.922188 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-696f9966c7-n4m46" event={"ID":"902ce8a4-b7c1-4b23-a506-f66fb2c84cb0","Type":"ContainerDied","Data":"1464fed4c2ffc7539d32bdc329de1df2b4ee9271af080b41cf86c5a18184e769"} Dec 09 11:34:21 crc kubenswrapper[5002]: I1209 11:34:21.922220 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-696f9966c7-n4m46" event={"ID":"902ce8a4-b7c1-4b23-a506-f66fb2c84cb0","Type":"ContainerStarted","Data":"72b83b503b2ba78c9f7744aa21e836956d24071948013d03beb51eb281e5cd46"} Dec 09 11:34:21 crc kubenswrapper[5002]: I1209 11:34:21.923744 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b3483538-8c2c-4c39-a9f1-5c8b6779d43a","Type":"ContainerStarted","Data":"aa4b67030850ec07ff3d003861c1733d259c14d2364627e83d950e4c96243b51"} Dec 09 11:34:21 crc kubenswrapper[5002]: I1209 11:34:21.923775 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b3483538-8c2c-4c39-a9f1-5c8b6779d43a","Type":"ContainerStarted","Data":"74657b77970a967a07566401238df14e9d7b6b635c85c4e98bb6b08e2755ede3"} Dec 09 11:34:21 crc kubenswrapper[5002]: I1209 11:34:21.938032 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-qd2mx" event={"ID":"3fb0a2a3-32f7-4406-9a40-db702f2e2786","Type":"ContainerStarted","Data":"bc386f309010208a9e507929a5a7930742f3b1063b25febf38482d67eb33280f"} Dec 09 11:34:21 crc kubenswrapper[5002]: I1209 11:34:21.938089 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-qd2mx" event={"ID":"3fb0a2a3-32f7-4406-9a40-db702f2e2786","Type":"ContainerStarted","Data":"713cc5925a4e2762db8203567301fdef93e5bd6bba8bbd0529007ad655a1b1fd"} Dec 09 11:34:21 crc kubenswrapper[5002]: I1209 11:34:21.953427 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d160668f-3435-4b33-ac14-34d6a19c9b62","Type":"ContainerStarted","Data":"131a3e7f0e5ef161f1820210852aa4f1f988a413c734d128c5a9d922428b5c5c"} Dec 09 11:34:21 crc kubenswrapper[5002]: I1209 11:34:21.953676 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d160668f-3435-4b33-ac14-34d6a19c9b62","Type":"ContainerStarted","Data":"9e2a56495add99dcfbd41c06a4bceb2cff38adfec7188e1d665580af1cf718b5"} Dec 09 11:34:21 crc kubenswrapper[5002]: I1209 11:34:21.963774 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e9faf59d-a3e1-4d74-acd5-481eebd94f63","Type":"ContainerStarted","Data":"ced19961c74f67c4400fff13faf2518cec0899932c3ec03808ce8ef4152854e7"} Dec 09 11:34:21 crc kubenswrapper[5002]: I1209 11:34:21.964035 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e9faf59d-a3e1-4d74-acd5-481eebd94f63","Type":"ContainerStarted","Data":"0e739f6dc53124f6994b215ab19565b78ddf25ef4030a98b6aeb284b912fd585"} Dec 09 11:34:22 crc kubenswrapper[5002]: I1209 11:34:22.041485 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.041464715 podStartE2EDuration="2.041464715s" podCreationTimestamp="2025-12-09 11:34:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:21.999684817 +0000 UTC m=+5594.391735898" watchObservedRunningTime="2025-12-09 11:34:22.041464715 +0000 UTC m=+5594.433515796" Dec 09 11:34:22 crc kubenswrapper[5002]: I1209 11:34:22.048356 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-qd2mx" podStartSLOduration=3.048295237 podStartE2EDuration="3.048295237s" podCreationTimestamp="2025-12-09 11:34:19 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:22.0181194 +0000 UTC m=+5594.410170481" watchObservedRunningTime="2025-12-09 11:34:22.048295237 +0000 UTC m=+5594.440346328" Dec 09 11:34:22 crc kubenswrapper[5002]: I1209 11:34:22.072554 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.072537786 podStartE2EDuration="3.072537786s" podCreationTimestamp="2025-12-09 11:34:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:22.040850918 +0000 UTC m=+5594.432901999" watchObservedRunningTime="2025-12-09 11:34:22.072537786 +0000 UTC m=+5594.464588867" Dec 09 11:34:22 crc kubenswrapper[5002]: I1209 11:34:22.975966 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"79b301c9-b2dc-4e36-9940-cd6ac885fd6d","Type":"ContainerStarted","Data":"f13fe00e77baba4e4b1c63fb0b98b8e877ad92f8495f7fad687cac355e2f676a"} Dec 09 11:34:22 crc kubenswrapper[5002]: I1209 11:34:22.979579 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-696f9966c7-n4m46" event={"ID":"902ce8a4-b7c1-4b23-a506-f66fb2c84cb0","Type":"ContainerStarted","Data":"dbf109bd991efc11573cd767cb66e79d5b108a2b043b7bf68501792d400c35e4"} Dec 09 11:34:22 crc kubenswrapper[5002]: I1209 11:34:22.979673 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-696f9966c7-n4m46" Dec 09 11:34:22 crc kubenswrapper[5002]: I1209 11:34:22.981465 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xxkmb" event={"ID":"e771094a-9f24-43c1-be1f-3efca4e1b42c","Type":"ContainerStarted","Data":"3e59c03533020fe2ca5756b8f99eb2364b35c8f23674f71f840e505b68b56eac"} Dec 09 11:34:22 crc kubenswrapper[5002]: I1209 11:34:22.981490 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xxkmb" event={"ID":"e771094a-9f24-43c1-be1f-3efca4e1b42c","Type":"ContainerStarted","Data":"f64ad1380b60420db324e03de1b9450b24f10ef1ec0ece1bdb729369c01b6b1f"} Dec 09 11:34:22 crc kubenswrapper[5002]: I1209 11:34:22.986194 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d160668f-3435-4b33-ac14-34d6a19c9b62","Type":"ContainerStarted","Data":"056c67fc6f0c843c33687ec709708de4c6c3eabd9841795d69e0d0cccdb7d14b"} Dec 09 11:34:22 crc kubenswrapper[5002]: I1209 11:34:22.999281 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.999261705 podStartE2EDuration="3.999261705s" podCreationTimestamp="2025-12-09 11:34:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:22.998626118 +0000 UTC m=+5595.390677199" watchObservedRunningTime="2025-12-09 11:34:22.999261705 +0000 UTC m=+5595.391312786" Dec 09 11:34:23 crc kubenswrapper[5002]: I1209 11:34:23.019504 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-xxkmb" podStartSLOduration=3.019482106 podStartE2EDuration="3.019482106s" podCreationTimestamp="2025-12-09 11:34:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 
11:34:23.013043174 +0000 UTC m=+5595.405094255" watchObservedRunningTime="2025-12-09 11:34:23.019482106 +0000 UTC m=+5595.411533197" Dec 09 11:34:23 crc kubenswrapper[5002]: I1209 11:34:23.042401 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.042380079 podStartE2EDuration="3.042380079s" podCreationTimestamp="2025-12-09 11:34:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:23.038882565 +0000 UTC m=+5595.430933646" watchObservedRunningTime="2025-12-09 11:34:23.042380079 +0000 UTC m=+5595.434431160" Dec 09 11:34:25 crc kubenswrapper[5002]: I1209 11:34:25.354776 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 09 11:34:25 crc kubenswrapper[5002]: I1209 11:34:25.671727 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 09 11:34:25 crc kubenswrapper[5002]: I1209 11:34:25.671790 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 09 11:34:25 crc kubenswrapper[5002]: I1209 11:34:25.697563 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:34:26 crc kubenswrapper[5002]: I1209 11:34:26.016057 5002 generic.go:334] "Generic (PLEG): container finished" podID="e771094a-9f24-43c1-be1f-3efca4e1b42c" containerID="3e59c03533020fe2ca5756b8f99eb2364b35c8f23674f71f840e505b68b56eac" exitCode=0 Dec 09 11:34:26 crc kubenswrapper[5002]: I1209 11:34:26.016103 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xxkmb" event={"ID":"e771094a-9f24-43c1-be1f-3efca4e1b42c","Type":"ContainerDied","Data":"3e59c03533020fe2ca5756b8f99eb2364b35c8f23674f71f840e505b68b56eac"} Dec 09 11:34:26 crc kubenswrapper[5002]: I1209 11:34:26.043376 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-696f9966c7-n4m46" podStartSLOduration=6.043351661 podStartE2EDuration="6.043351661s" podCreationTimestamp="2025-12-09 11:34:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:23.063581366 +0000 UTC m=+5595.455632447" watchObservedRunningTime="2025-12-09 11:34:26.043351661 +0000 UTC m=+5598.435402742" Dec 09 11:34:27 crc kubenswrapper[5002]: I1209 11:34:27.026906 5002 generic.go:334] "Generic (PLEG): container finished" podID="3fb0a2a3-32f7-4406-9a40-db702f2e2786" containerID="bc386f309010208a9e507929a5a7930742f3b1063b25febf38482d67eb33280f" exitCode=0 Dec 09 11:34:27 crc kubenswrapper[5002]: I1209 11:34:27.027012 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-qd2mx" event={"ID":"3fb0a2a3-32f7-4406-9a40-db702f2e2786","Type":"ContainerDied","Data":"bc386f309010208a9e507929a5a7930742f3b1063b25febf38482d67eb33280f"} Dec 09 11:34:27 crc kubenswrapper[5002]: I1209 11:34:27.472248 5002 util.go:48] "No ready sandbox for pod can be found. 
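[Annotation] Each "SyncLoop (PLEG): event for pod" entry above carries its payload as plain JSON between the braces, which makes the container lifecycle easy to tabulate: the dnsmasq pod shows the init-container pattern (ContainerDied with exitCode=0, then the main container starts), and the two db-sync jobs finish the same way. A scraping sketch, assumed tooling rather than anything shipped with kubelet:

```go
// plegevents.go - sketch: extract the PLEG event payloads from kubelet journal
// lines on stdin; the brace-delimited blob after "event=" is plain JSON.
package main

import (
	"bufio"
	"encoding/json"
	"fmt"
	"os"
	"strings"
)

type plegEvent struct {
	ID   string // pod UID
	Type string // ContainerStarted, ContainerDied, ...
	Data string // container or sandbox ID
}

func main() {
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 1<<20), 1<<20)
	for sc.Scan() {
		line := sc.Text()
		i := strings.Index(line, "event={")
		if i < 0 {
			continue
		}
		payload := line[i+len("event="):]
		if j := strings.IndexByte(payload, '}'); j >= 0 { // payload has no nested braces
			payload = payload[:j+1]
		}
		var ev plegEvent
		if err := json.Unmarshal([]byte(payload), &ev); err != nil {
			continue
		}
		fmt.Printf("%s %-16s %.12s\n", ev.ID, ev.Type, ev.Data)
	}
}
```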
Dec 09 11:34:27 crc kubenswrapper[5002]: I1209 11:34:27.555919 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f62wd\" (UniqueName: \"kubernetes.io/projected/e771094a-9f24-43c1-be1f-3efca4e1b42c-kube-api-access-f62wd\") pod \"e771094a-9f24-43c1-be1f-3efca4e1b42c\" (UID: \"e771094a-9f24-43c1-be1f-3efca4e1b42c\") "
Dec 09 11:34:27 crc kubenswrapper[5002]: I1209 11:34:27.556162 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e771094a-9f24-43c1-be1f-3efca4e1b42c-combined-ca-bundle\") pod \"e771094a-9f24-43c1-be1f-3efca4e1b42c\" (UID: \"e771094a-9f24-43c1-be1f-3efca4e1b42c\") "
Dec 09 11:34:27 crc kubenswrapper[5002]: I1209 11:34:27.556217 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e771094a-9f24-43c1-be1f-3efca4e1b42c-config-data\") pod \"e771094a-9f24-43c1-be1f-3efca4e1b42c\" (UID: \"e771094a-9f24-43c1-be1f-3efca4e1b42c\") "
Dec 09 11:34:27 crc kubenswrapper[5002]: I1209 11:34:27.556290 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e771094a-9f24-43c1-be1f-3efca4e1b42c-scripts\") pod \"e771094a-9f24-43c1-be1f-3efca4e1b42c\" (UID: \"e771094a-9f24-43c1-be1f-3efca4e1b42c\") "
Dec 09 11:34:27 crc kubenswrapper[5002]: I1209 11:34:27.562727 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e771094a-9f24-43c1-be1f-3efca4e1b42c-scripts" (OuterVolumeSpecName: "scripts") pod "e771094a-9f24-43c1-be1f-3efca4e1b42c" (UID: "e771094a-9f24-43c1-be1f-3efca4e1b42c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:34:27 crc kubenswrapper[5002]: I1209 11:34:27.563235 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e771094a-9f24-43c1-be1f-3efca4e1b42c-kube-api-access-f62wd" (OuterVolumeSpecName: "kube-api-access-f62wd") pod "e771094a-9f24-43c1-be1f-3efca4e1b42c" (UID: "e771094a-9f24-43c1-be1f-3efca4e1b42c"). InnerVolumeSpecName "kube-api-access-f62wd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 11:34:27 crc kubenswrapper[5002]: I1209 11:34:27.582583 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e771094a-9f24-43c1-be1f-3efca4e1b42c-config-data" (OuterVolumeSpecName: "config-data") pod "e771094a-9f24-43c1-be1f-3efca4e1b42c" (UID: "e771094a-9f24-43c1-be1f-3efca4e1b42c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:34:27 crc kubenswrapper[5002]: I1209 11:34:27.585455 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e771094a-9f24-43c1-be1f-3efca4e1b42c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e771094a-9f24-43c1-be1f-3efca4e1b42c" (UID: "e771094a-9f24-43c1-be1f-3efca4e1b42c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:34:27 crc kubenswrapper[5002]: I1209 11:34:27.660795 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e771094a-9f24-43c1-be1f-3efca4e1b42c-scripts\") on node \"crc\" DevicePath \"\""
Dec 09 11:34:27 crc kubenswrapper[5002]: I1209 11:34:27.660914 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f62wd\" (UniqueName: \"kubernetes.io/projected/e771094a-9f24-43c1-be1f-3efca4e1b42c-kube-api-access-f62wd\") on node \"crc\" DevicePath \"\""
Dec 09 11:34:27 crc kubenswrapper[5002]: I1209 11:34:27.660941 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e771094a-9f24-43c1-be1f-3efca4e1b42c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 09 11:34:27 crc kubenswrapper[5002]: I1209 11:34:27.660958 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e771094a-9f24-43c1-be1f-3efca4e1b42c-config-data\") on node \"crc\" DevicePath \"\""
Dec 09 11:34:28 crc kubenswrapper[5002]: I1209 11:34:28.038974 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xxkmb"
Dec 09 11:34:28 crc kubenswrapper[5002]: I1209 11:34:28.038967 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xxkmb" event={"ID":"e771094a-9f24-43c1-be1f-3efca4e1b42c","Type":"ContainerDied","Data":"f64ad1380b60420db324e03de1b9450b24f10ef1ec0ece1bdb729369c01b6b1f"}
Dec 09 11:34:28 crc kubenswrapper[5002]: I1209 11:34:28.039042 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f64ad1380b60420db324e03de1b9450b24f10ef1ec0ece1bdb729369c01b6b1f"
Dec 09 11:34:28 crc kubenswrapper[5002]: I1209 11:34:28.138582 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Dec 09 11:34:28 crc kubenswrapper[5002]: E1209 11:34:28.139047 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e771094a-9f24-43c1-be1f-3efca4e1b42c" containerName="nova-cell1-conductor-db-sync"
Dec 09 11:34:28 crc kubenswrapper[5002]: I1209 11:34:28.139069 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="e771094a-9f24-43c1-be1f-3efca4e1b42c" containerName="nova-cell1-conductor-db-sync"
Dec 09 11:34:28 crc kubenswrapper[5002]: I1209 11:34:28.139302 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="e771094a-9f24-43c1-be1f-3efca4e1b42c" containerName="nova-cell1-conductor-db-sync"
Dec 09 11:34:28 crc kubenswrapper[5002]: I1209 11:34:28.140157 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Dec 09 11:34:28 crc kubenswrapper[5002]: I1209 11:34:28.146955 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Dec 09 11:34:28 crc kubenswrapper[5002]: I1209 11:34:28.165363 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Dec 09 11:34:28 crc kubenswrapper[5002]: I1209 11:34:28.278556 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e280fc-dcb3-4d43-828d-89c8abf26988-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"90e280fc-dcb3-4d43-828d-89c8abf26988\") " pod="openstack/nova-cell1-conductor-0"
Dec 09 11:34:28 crc kubenswrapper[5002]: I1209 11:34:28.278713 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx4hz\" (UniqueName: \"kubernetes.io/projected/90e280fc-dcb3-4d43-828d-89c8abf26988-kube-api-access-xx4hz\") pod \"nova-cell1-conductor-0\" (UID: \"90e280fc-dcb3-4d43-828d-89c8abf26988\") " pod="openstack/nova-cell1-conductor-0"
Dec 09 11:34:28 crc kubenswrapper[5002]: I1209 11:34:28.278911 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90e280fc-dcb3-4d43-828d-89c8abf26988-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"90e280fc-dcb3-4d43-828d-89c8abf26988\") " pod="openstack/nova-cell1-conductor-0"
Dec 09 11:34:28 crc kubenswrapper[5002]: I1209 11:34:28.380785 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e280fc-dcb3-4d43-828d-89c8abf26988-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"90e280fc-dcb3-4d43-828d-89c8abf26988\") " pod="openstack/nova-cell1-conductor-0"
Dec 09 11:34:28 crc kubenswrapper[5002]: I1209 11:34:28.381206 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx4hz\" (UniqueName: \"kubernetes.io/projected/90e280fc-dcb3-4d43-828d-89c8abf26988-kube-api-access-xx4hz\") pod \"nova-cell1-conductor-0\" (UID: \"90e280fc-dcb3-4d43-828d-89c8abf26988\") " pod="openstack/nova-cell1-conductor-0"
Dec 09 11:34:28 crc kubenswrapper[5002]: I1209 11:34:28.381252 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90e280fc-dcb3-4d43-828d-89c8abf26988-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"90e280fc-dcb3-4d43-828d-89c8abf26988\") " pod="openstack/nova-cell1-conductor-0"
Dec 09 11:34:28 crc kubenswrapper[5002]: I1209 11:34:28.385734 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90e280fc-dcb3-4d43-828d-89c8abf26988-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"90e280fc-dcb3-4d43-828d-89c8abf26988\") " pod="openstack/nova-cell1-conductor-0"
Dec 09 11:34:28 crc kubenswrapper[5002]: I1209 11:34:28.386917 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e280fc-dcb3-4d43-828d-89c8abf26988-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"90e280fc-dcb3-4d43-828d-89c8abf26988\") " pod="openstack/nova-cell1-conductor-0"
Dec 09 11:34:28 crc kubenswrapper[5002]: I1209 11:34:28.404844 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx4hz\" (UniqueName: \"kubernetes.io/projected/90e280fc-dcb3-4d43-828d-89c8abf26988-kube-api-access-xx4hz\") pod \"nova-cell1-conductor-0\" (UID: \"90e280fc-dcb3-4d43-828d-89c8abf26988\") " pod="openstack/nova-cell1-conductor-0"
Dec 09 11:34:28 crc kubenswrapper[5002]: I1209 11:34:28.458005 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Dec 09 11:34:28 crc kubenswrapper[5002]: I1209 11:34:28.470224 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-qd2mx"
Dec 09 11:34:28 crc kubenswrapper[5002]: I1209 11:34:28.583855 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fb0a2a3-32f7-4406-9a40-db702f2e2786-combined-ca-bundle\") pod \"3fb0a2a3-32f7-4406-9a40-db702f2e2786\" (UID: \"3fb0a2a3-32f7-4406-9a40-db702f2e2786\") "
Dec 09 11:34:28 crc kubenswrapper[5002]: I1209 11:34:28.583981 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fb0a2a3-32f7-4406-9a40-db702f2e2786-scripts\") pod \"3fb0a2a3-32f7-4406-9a40-db702f2e2786\" (UID: \"3fb0a2a3-32f7-4406-9a40-db702f2e2786\") "
Dec 09 11:34:28 crc kubenswrapper[5002]: I1209 11:34:28.584114 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kwbg\" (UniqueName: \"kubernetes.io/projected/3fb0a2a3-32f7-4406-9a40-db702f2e2786-kube-api-access-2kwbg\") pod \"3fb0a2a3-32f7-4406-9a40-db702f2e2786\" (UID: \"3fb0a2a3-32f7-4406-9a40-db702f2e2786\") "
Dec 09 11:34:28 crc kubenswrapper[5002]: I1209 11:34:28.584137 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fb0a2a3-32f7-4406-9a40-db702f2e2786-config-data\") pod \"3fb0a2a3-32f7-4406-9a40-db702f2e2786\" (UID: \"3fb0a2a3-32f7-4406-9a40-db702f2e2786\") "
Dec 09 11:34:28 crc kubenswrapper[5002]: I1209 11:34:28.592768 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fb0a2a3-32f7-4406-9a40-db702f2e2786-kube-api-access-2kwbg" (OuterVolumeSpecName: "kube-api-access-2kwbg") pod "3fb0a2a3-32f7-4406-9a40-db702f2e2786" (UID: "3fb0a2a3-32f7-4406-9a40-db702f2e2786"). InnerVolumeSpecName "kube-api-access-2kwbg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 11:34:28 crc kubenswrapper[5002]: I1209 11:34:28.593539 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fb0a2a3-32f7-4406-9a40-db702f2e2786-scripts" (OuterVolumeSpecName: "scripts") pod "3fb0a2a3-32f7-4406-9a40-db702f2e2786" (UID: "3fb0a2a3-32f7-4406-9a40-db702f2e2786"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:34:28 crc kubenswrapper[5002]: I1209 11:34:28.614515 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fb0a2a3-32f7-4406-9a40-db702f2e2786-config-data" (OuterVolumeSpecName: "config-data") pod "3fb0a2a3-32f7-4406-9a40-db702f2e2786" (UID: "3fb0a2a3-32f7-4406-9a40-db702f2e2786"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
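The kube-api-access-* volumes being attached and mounted here are the projected service-account token volumes Kubernetes injects into pods. Expressed with client-go types they look roughly like the sketch below; the volume name matches the log, but the expiry and items are assumptions, and the downward-API namespace source that normally completes the projection is omitted:

```go
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

// Rough shape of a kube-api-access-* projected volume.
func kubeAPIAccessVolume() corev1.Volume {
	expiry := int64(3607) // illustrative; not read from this cluster
	return corev1.Volume{
		Name: "kube-api-access-xx4hz",
		VolumeSource: corev1.VolumeSource{
			Projected: &corev1.ProjectedVolumeSource{
				Sources: []corev1.VolumeProjection{
					{ServiceAccountToken: &corev1.ServiceAccountTokenProjection{
						Path:              "token",
						ExpirationSeconds: &expiry,
					}},
					{ConfigMap: &corev1.ConfigMapProjection{
						LocalObjectReference: corev1.LocalObjectReference{Name: "kube-root-ca.crt"},
						Items:                []corev1.KeyToPath{{Key: "ca.crt", Path: "ca.crt"}},
					}},
				},
			},
		},
	}
}

func main() {
	fmt.Println(kubeAPIAccessVolume().Name)
}
```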
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:34:28 crc kubenswrapper[5002]: I1209 11:34:28.624880 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fb0a2a3-32f7-4406-9a40-db702f2e2786-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3fb0a2a3-32f7-4406-9a40-db702f2e2786" (UID: "3fb0a2a3-32f7-4406-9a40-db702f2e2786"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:34:28 crc kubenswrapper[5002]: I1209 11:34:28.686592 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kwbg\" (UniqueName: \"kubernetes.io/projected/3fb0a2a3-32f7-4406-9a40-db702f2e2786-kube-api-access-2kwbg\") on node \"crc\" DevicePath \"\"" Dec 09 11:34:28 crc kubenswrapper[5002]: I1209 11:34:28.686626 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fb0a2a3-32f7-4406-9a40-db702f2e2786-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:34:28 crc kubenswrapper[5002]: I1209 11:34:28.686642 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fb0a2a3-32f7-4406-9a40-db702f2e2786-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:34:28 crc kubenswrapper[5002]: I1209 11:34:28.686655 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fb0a2a3-32f7-4406-9a40-db702f2e2786-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:34:28 crc kubenswrapper[5002]: I1209 11:34:28.908520 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 09 11:34:28 crc kubenswrapper[5002]: W1209 11:34:28.922558 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90e280fc_dcb3_4d43_828d_89c8abf26988.slice/crio-1598f12a7d883046533a4134c774c9fba49de612f60ec3a332bbfa5f9871bcd7 WatchSource:0}: Error finding container 1598f12a7d883046533a4134c774c9fba49de612f60ec3a332bbfa5f9871bcd7: Status 404 returned error can't find the container with id 1598f12a7d883046533a4134c774c9fba49de612f60ec3a332bbfa5f9871bcd7 Dec 09 11:34:29 crc kubenswrapper[5002]: I1209 11:34:29.050456 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-qd2mx" Dec 09 11:34:29 crc kubenswrapper[5002]: I1209 11:34:29.050525 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-qd2mx" event={"ID":"3fb0a2a3-32f7-4406-9a40-db702f2e2786","Type":"ContainerDied","Data":"713cc5925a4e2762db8203567301fdef93e5bd6bba8bbd0529007ad655a1b1fd"} Dec 09 11:34:29 crc kubenswrapper[5002]: I1209 11:34:29.051449 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="713cc5925a4e2762db8203567301fdef93e5bd6bba8bbd0529007ad655a1b1fd" Dec 09 11:34:29 crc kubenswrapper[5002]: I1209 11:34:29.051857 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"90e280fc-dcb3-4d43-828d-89c8abf26988","Type":"ContainerStarted","Data":"1598f12a7d883046533a4134c774c9fba49de612f60ec3a332bbfa5f9871bcd7"} Dec 09 11:34:29 crc kubenswrapper[5002]: I1209 11:34:29.227763 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 09 11:34:29 crc kubenswrapper[5002]: I1209 11:34:29.228032 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="79b301c9-b2dc-4e36-9940-cd6ac885fd6d" containerName="nova-api-log" containerID="cri-o://cb82877223675fe9b44ffcb1eaa5bd6256170c97eb8e82caf200449a75461b05" gracePeriod=30 Dec 09 11:34:29 crc kubenswrapper[5002]: I1209 11:34:29.228146 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="79b301c9-b2dc-4e36-9940-cd6ac885fd6d" containerName="nova-api-api" containerID="cri-o://f13fe00e77baba4e4b1c63fb0b98b8e877ad92f8495f7fad687cac355e2f676a" gracePeriod=30 Dec 09 11:34:29 crc kubenswrapper[5002]: I1209 11:34:29.245442 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 11:34:29 crc kubenswrapper[5002]: I1209 11:34:29.245647 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="e9faf59d-a3e1-4d74-acd5-481eebd94f63" containerName="nova-scheduler-scheduler" containerID="cri-o://ced19961c74f67c4400fff13faf2518cec0899932c3ec03808ce8ef4152854e7" gracePeriod=30 Dec 09 11:34:29 crc kubenswrapper[5002]: I1209 11:34:29.271083 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 11:34:29 crc kubenswrapper[5002]: I1209 11:34:29.271686 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d160668f-3435-4b33-ac14-34d6a19c9b62" containerName="nova-metadata-log" containerID="cri-o://131a3e7f0e5ef161f1820210852aa4f1f988a413c734d128c5a9d922428b5c5c" gracePeriod=30 Dec 09 11:34:29 crc kubenswrapper[5002]: I1209 11:34:29.272392 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d160668f-3435-4b33-ac14-34d6a19c9b62" containerName="nova-metadata-metadata" containerID="cri-o://056c67fc6f0c843c33687ec709708de4c6c3eabd9841795d69e0d0cccdb7d14b" gracePeriod=30 Dec 09 11:34:29 crc kubenswrapper[5002]: I1209 11:34:29.780515 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 09 11:34:29 crc kubenswrapper[5002]: I1209 11:34:29.845965 5002 util.go:48] "No ready sandbox for pod can be found. 
Dec 09 11:34:29 crc kubenswrapper[5002]: I1209 11:34:29.911104 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79b301c9-b2dc-4e36-9940-cd6ac885fd6d-config-data\") pod \"79b301c9-b2dc-4e36-9940-cd6ac885fd6d\" (UID: \"79b301c9-b2dc-4e36-9940-cd6ac885fd6d\") "
Dec 09 11:34:29 crc kubenswrapper[5002]: I1209 11:34:29.911152 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdqg2\" (UniqueName: \"kubernetes.io/projected/79b301c9-b2dc-4e36-9940-cd6ac885fd6d-kube-api-access-tdqg2\") pod \"79b301c9-b2dc-4e36-9940-cd6ac885fd6d\" (UID: \"79b301c9-b2dc-4e36-9940-cd6ac885fd6d\") "
Dec 09 11:34:29 crc kubenswrapper[5002]: I1209 11:34:29.911171 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlsnd\" (UniqueName: \"kubernetes.io/projected/d160668f-3435-4b33-ac14-34d6a19c9b62-kube-api-access-dlsnd\") pod \"d160668f-3435-4b33-ac14-34d6a19c9b62\" (UID: \"d160668f-3435-4b33-ac14-34d6a19c9b62\") "
Dec 09 11:34:29 crc kubenswrapper[5002]: I1209 11:34:29.911247 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d160668f-3435-4b33-ac14-34d6a19c9b62-config-data\") pod \"d160668f-3435-4b33-ac14-34d6a19c9b62\" (UID: \"d160668f-3435-4b33-ac14-34d6a19c9b62\") "
Dec 09 11:34:29 crc kubenswrapper[5002]: I1209 11:34:29.911300 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79b301c9-b2dc-4e36-9940-cd6ac885fd6d-combined-ca-bundle\") pod \"79b301c9-b2dc-4e36-9940-cd6ac885fd6d\" (UID: \"79b301c9-b2dc-4e36-9940-cd6ac885fd6d\") "
Dec 09 11:34:29 crc kubenswrapper[5002]: I1209 11:34:29.911321 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79b301c9-b2dc-4e36-9940-cd6ac885fd6d-logs\") pod \"79b301c9-b2dc-4e36-9940-cd6ac885fd6d\" (UID: \"79b301c9-b2dc-4e36-9940-cd6ac885fd6d\") "
Dec 09 11:34:29 crc kubenswrapper[5002]: I1209 11:34:29.911336 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d160668f-3435-4b33-ac14-34d6a19c9b62-combined-ca-bundle\") pod \"d160668f-3435-4b33-ac14-34d6a19c9b62\" (UID: \"d160668f-3435-4b33-ac14-34d6a19c9b62\") "
Dec 09 11:34:29 crc kubenswrapper[5002]: I1209 11:34:29.911404 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d160668f-3435-4b33-ac14-34d6a19c9b62-logs\") pod \"d160668f-3435-4b33-ac14-34d6a19c9b62\" (UID: \"d160668f-3435-4b33-ac14-34d6a19c9b62\") "
Dec 09 11:34:29 crc kubenswrapper[5002]: I1209 11:34:29.912601 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79b301c9-b2dc-4e36-9940-cd6ac885fd6d-logs" (OuterVolumeSpecName: "logs") pod "79b301c9-b2dc-4e36-9940-cd6ac885fd6d" (UID: "79b301c9-b2dc-4e36-9940-cd6ac885fd6d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 11:34:29 crc kubenswrapper[5002]: I1209 11:34:29.913061 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d160668f-3435-4b33-ac14-34d6a19c9b62-logs" (OuterVolumeSpecName: "logs") pod "d160668f-3435-4b33-ac14-34d6a19c9b62" (UID: "d160668f-3435-4b33-ac14-34d6a19c9b62"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 11:34:29 crc kubenswrapper[5002]: I1209 11:34:29.920948 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79b301c9-b2dc-4e36-9940-cd6ac885fd6d-kube-api-access-tdqg2" (OuterVolumeSpecName: "kube-api-access-tdqg2") pod "79b301c9-b2dc-4e36-9940-cd6ac885fd6d" (UID: "79b301c9-b2dc-4e36-9940-cd6ac885fd6d"). InnerVolumeSpecName "kube-api-access-tdqg2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 11:34:29 crc kubenswrapper[5002]: I1209 11:34:29.920983 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d160668f-3435-4b33-ac14-34d6a19c9b62-kube-api-access-dlsnd" (OuterVolumeSpecName: "kube-api-access-dlsnd") pod "d160668f-3435-4b33-ac14-34d6a19c9b62" (UID: "d160668f-3435-4b33-ac14-34d6a19c9b62"). InnerVolumeSpecName "kube-api-access-dlsnd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 11:34:29 crc kubenswrapper[5002]: I1209 11:34:29.946745 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79b301c9-b2dc-4e36-9940-cd6ac885fd6d-config-data" (OuterVolumeSpecName: "config-data") pod "79b301c9-b2dc-4e36-9940-cd6ac885fd6d" (UID: "79b301c9-b2dc-4e36-9940-cd6ac885fd6d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:34:29 crc kubenswrapper[5002]: I1209 11:34:29.964912 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79b301c9-b2dc-4e36-9940-cd6ac885fd6d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "79b301c9-b2dc-4e36-9940-cd6ac885fd6d" (UID: "79b301c9-b2dc-4e36-9940-cd6ac885fd6d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:34:29 crc kubenswrapper[5002]: I1209 11:34:29.964959 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d160668f-3435-4b33-ac14-34d6a19c9b62-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d160668f-3435-4b33-ac14-34d6a19c9b62" (UID: "d160668f-3435-4b33-ac14-34d6a19c9b62"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:34:29 crc kubenswrapper[5002]: I1209 11:34:29.964912 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d160668f-3435-4b33-ac14-34d6a19c9b62-config-data" (OuterVolumeSpecName: "config-data") pod "d160668f-3435-4b33-ac14-34d6a19c9b62" (UID: "d160668f-3435-4b33-ac14-34d6a19c9b62"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.014055 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79b301c9-b2dc-4e36-9940-cd6ac885fd6d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.014101 5002 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79b301c9-b2dc-4e36-9940-cd6ac885fd6d-logs\") on node \"crc\" DevicePath \"\"" Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.014117 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d160668f-3435-4b33-ac14-34d6a19c9b62-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.014129 5002 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d160668f-3435-4b33-ac14-34d6a19c9b62-logs\") on node \"crc\" DevicePath \"\"" Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.014140 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79b301c9-b2dc-4e36-9940-cd6ac885fd6d-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.014151 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdqg2\" (UniqueName: \"kubernetes.io/projected/79b301c9-b2dc-4e36-9940-cd6ac885fd6d-kube-api-access-tdqg2\") on node \"crc\" DevicePath \"\"" Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.014165 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlsnd\" (UniqueName: \"kubernetes.io/projected/d160668f-3435-4b33-ac14-34d6a19c9b62-kube-api-access-dlsnd\") on node \"crc\" DevicePath \"\"" Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.014176 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d160668f-3435-4b33-ac14-34d6a19c9b62-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.065478 5002 generic.go:334] "Generic (PLEG): container finished" podID="d160668f-3435-4b33-ac14-34d6a19c9b62" containerID="056c67fc6f0c843c33687ec709708de4c6c3eabd9841795d69e0d0cccdb7d14b" exitCode=0 Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.065519 5002 generic.go:334] "Generic (PLEG): container finished" podID="d160668f-3435-4b33-ac14-34d6a19c9b62" containerID="131a3e7f0e5ef161f1820210852aa4f1f988a413c734d128c5a9d922428b5c5c" exitCode=143 Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.065626 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.069117 5002 generic.go:334] "Generic (PLEG): container finished" podID="79b301c9-b2dc-4e36-9940-cd6ac885fd6d" containerID="f13fe00e77baba4e4b1c63fb0b98b8e877ad92f8495f7fad687cac355e2f676a" exitCode=0 Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.069187 5002 generic.go:334] "Generic (PLEG): container finished" podID="79b301c9-b2dc-4e36-9940-cd6ac885fd6d" containerID="cb82877223675fe9b44ffcb1eaa5bd6256170c97eb8e82caf200449a75461b05" exitCode=143 Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.069396 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.076804 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d160668f-3435-4b33-ac14-34d6a19c9b62","Type":"ContainerDied","Data":"056c67fc6f0c843c33687ec709708de4c6c3eabd9841795d69e0d0cccdb7d14b"} Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.076876 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d160668f-3435-4b33-ac14-34d6a19c9b62","Type":"ContainerDied","Data":"131a3e7f0e5ef161f1820210852aa4f1f988a413c734d128c5a9d922428b5c5c"} Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.076892 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d160668f-3435-4b33-ac14-34d6a19c9b62","Type":"ContainerDied","Data":"9e2a56495add99dcfbd41c06a4bceb2cff38adfec7188e1d665580af1cf718b5"} Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.076906 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"79b301c9-b2dc-4e36-9940-cd6ac885fd6d","Type":"ContainerDied","Data":"f13fe00e77baba4e4b1c63fb0b98b8e877ad92f8495f7fad687cac355e2f676a"} Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.076919 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"79b301c9-b2dc-4e36-9940-cd6ac885fd6d","Type":"ContainerDied","Data":"cb82877223675fe9b44ffcb1eaa5bd6256170c97eb8e82caf200449a75461b05"} Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.076932 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"79b301c9-b2dc-4e36-9940-cd6ac885fd6d","Type":"ContainerDied","Data":"033d6d1d7f3c660571f1423051490f89b87cbe00a1c1834ac9fe96cf15c5ce09"} Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.076954 5002 scope.go:117] "RemoveContainer" containerID="056c67fc6f0c843c33687ec709708de4c6c3eabd9841795d69e0d0cccdb7d14b" Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.083184 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"90e280fc-dcb3-4d43-828d-89c8abf26988","Type":"ContainerStarted","Data":"ba5d508c985be31ea67f932d9b586a33685597b5adc65e900f42835227f33821"} Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.083366 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.118840 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.118803286 podStartE2EDuration="2.118803286s" podCreationTimestamp="2025-12-09 11:34:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:30.109578809 +0000 UTC m=+5602.501629920" watchObservedRunningTime="2025-12-09 11:34:30.118803286 +0000 UTC m=+5602.510854367" Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.136694 5002 scope.go:117] "RemoveContainer" containerID="131a3e7f0e5ef161f1820210852aa4f1f988a413c734d128c5a9d922428b5c5c" Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.137226 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.159402 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 09 11:34:30 crc 
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.171091 5002 scope.go:117] "RemoveContainer" containerID="056c67fc6f0c843c33687ec709708de4c6c3eabd9841795d69e0d0cccdb7d14b"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.172404 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Dec 09 11:34:30 crc kubenswrapper[5002]: E1209 11:34:30.172895 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79b301c9-b2dc-4e36-9940-cd6ac885fd6d" containerName="nova-api-api"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.172920 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="79b301c9-b2dc-4e36-9940-cd6ac885fd6d" containerName="nova-api-api"
Dec 09 11:34:30 crc kubenswrapper[5002]: E1209 11:34:30.172956 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d160668f-3435-4b33-ac14-34d6a19c9b62" containerName="nova-metadata-metadata"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.172966 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="d160668f-3435-4b33-ac14-34d6a19c9b62" containerName="nova-metadata-metadata"
Dec 09 11:34:30 crc kubenswrapper[5002]: E1209 11:34:30.172984 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79b301c9-b2dc-4e36-9940-cd6ac885fd6d" containerName="nova-api-log"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.172992 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="79b301c9-b2dc-4e36-9940-cd6ac885fd6d" containerName="nova-api-log"
Dec 09 11:34:30 crc kubenswrapper[5002]: E1209 11:34:30.173008 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d160668f-3435-4b33-ac14-34d6a19c9b62" containerName="nova-metadata-log"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.173016 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="d160668f-3435-4b33-ac14-34d6a19c9b62" containerName="nova-metadata-log"
Dec 09 11:34:30 crc kubenswrapper[5002]: E1209 11:34:30.173028 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fb0a2a3-32f7-4406-9a40-db702f2e2786" containerName="nova-manage"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.173035 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fb0a2a3-32f7-4406-9a40-db702f2e2786" containerName="nova-manage"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.173270 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="79b301c9-b2dc-4e36-9940-cd6ac885fd6d" containerName="nova-api-api"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.173289 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="d160668f-3435-4b33-ac14-34d6a19c9b62" containerName="nova-metadata-metadata"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.173302 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fb0a2a3-32f7-4406-9a40-db702f2e2786" containerName="nova-manage"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.173314 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="d160668f-3435-4b33-ac14-34d6a19c9b62" containerName="nova-metadata-log"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.173323 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="79b301c9-b2dc-4e36-9940-cd6ac885fd6d" containerName="nova-api-log"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.174530 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 09 11:34:30 crc kubenswrapper[5002]: E1209 11:34:30.185182 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"056c67fc6f0c843c33687ec709708de4c6c3eabd9841795d69e0d0cccdb7d14b\": container with ID starting with 056c67fc6f0c843c33687ec709708de4c6c3eabd9841795d69e0d0cccdb7d14b not found: ID does not exist" containerID="056c67fc6f0c843c33687ec709708de4c6c3eabd9841795d69e0d0cccdb7d14b"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.185238 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"056c67fc6f0c843c33687ec709708de4c6c3eabd9841795d69e0d0cccdb7d14b"} err="failed to get container status \"056c67fc6f0c843c33687ec709708de4c6c3eabd9841795d69e0d0cccdb7d14b\": rpc error: code = NotFound desc = could not find container \"056c67fc6f0c843c33687ec709708de4c6c3eabd9841795d69e0d0cccdb7d14b\": container with ID starting with 056c67fc6f0c843c33687ec709708de4c6c3eabd9841795d69e0d0cccdb7d14b not found: ID does not exist"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.185271 5002 scope.go:117] "RemoveContainer" containerID="131a3e7f0e5ef161f1820210852aa4f1f988a413c734d128c5a9d922428b5c5c"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.185553 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Dec 09 11:34:30 crc kubenswrapper[5002]: E1209 11:34:30.195116 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"131a3e7f0e5ef161f1820210852aa4f1f988a413c734d128c5a9d922428b5c5c\": container with ID starting with 131a3e7f0e5ef161f1820210852aa4f1f988a413c734d128c5a9d922428b5c5c not found: ID does not exist" containerID="131a3e7f0e5ef161f1820210852aa4f1f988a413c734d128c5a9d922428b5c5c"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.195167 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"131a3e7f0e5ef161f1820210852aa4f1f988a413c734d128c5a9d922428b5c5c"} err="failed to get container status \"131a3e7f0e5ef161f1820210852aa4f1f988a413c734d128c5a9d922428b5c5c\": rpc error: code = NotFound desc = could not find container \"131a3e7f0e5ef161f1820210852aa4f1f988a413c734d128c5a9d922428b5c5c\": container with ID starting with 131a3e7f0e5ef161f1820210852aa4f1f988a413c734d128c5a9d922428b5c5c not found: ID does not exist"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.195201 5002 scope.go:117] "RemoveContainer" containerID="056c67fc6f0c843c33687ec709708de4c6c3eabd9841795d69e0d0cccdb7d14b"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.195568 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"056c67fc6f0c843c33687ec709708de4c6c3eabd9841795d69e0d0cccdb7d14b"} err="failed to get container status \"056c67fc6f0c843c33687ec709708de4c6c3eabd9841795d69e0d0cccdb7d14b\": rpc error: code = NotFound desc = could not find container \"056c67fc6f0c843c33687ec709708de4c6c3eabd9841795d69e0d0cccdb7d14b\": container with ID starting with 056c67fc6f0c843c33687ec709708de4c6c3eabd9841795d69e0d0cccdb7d14b not found: ID does not exist"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.195591 5002 scope.go:117] "RemoveContainer" containerID="131a3e7f0e5ef161f1820210852aa4f1f988a413c734d128c5a9d922428b5c5c"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.195844 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"131a3e7f0e5ef161f1820210852aa4f1f988a413c734d128c5a9d922428b5c5c"} err="failed to get container status \"131a3e7f0e5ef161f1820210852aa4f1f988a413c734d128c5a9d922428b5c5c\": rpc error: code = NotFound desc = could not find container \"131a3e7f0e5ef161f1820210852aa4f1f988a413c734d128c5a9d922428b5c5c\": container with ID starting with 131a3e7f0e5ef161f1820210852aa4f1f988a413c734d128c5a9d922428b5c5c not found: ID does not exist"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.195869 5002 scope.go:117] "RemoveContainer" containerID="f13fe00e77baba4e4b1c63fb0b98b8e877ad92f8495f7fad687cac355e2f676a"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.217135 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84da9958-b32b-4279-a0c6-25dcaeb857af-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"84da9958-b32b-4279-a0c6-25dcaeb857af\") " pod="openstack/nova-api-0"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.217203 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84da9958-b32b-4279-a0c6-25dcaeb857af-logs\") pod \"nova-api-0\" (UID: \"84da9958-b32b-4279-a0c6-25dcaeb857af\") " pod="openstack/nova-api-0"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.217291 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sstqp\" (UniqueName: \"kubernetes.io/projected/84da9958-b32b-4279-a0c6-25dcaeb857af-kube-api-access-sstqp\") pod \"nova-api-0\" (UID: \"84da9958-b32b-4279-a0c6-25dcaeb857af\") " pod="openstack/nova-api-0"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.217422 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84da9958-b32b-4279-a0c6-25dcaeb857af-config-data\") pod \"nova-api-0\" (UID: \"84da9958-b32b-4279-a0c6-25dcaeb857af\") " pod="openstack/nova-api-0"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.221761 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.244455 5002 scope.go:117] "RemoveContainer" containerID="cb82877223675fe9b44ffcb1eaa5bd6256170c97eb8e82caf200449a75461b05"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.247572 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.258387 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.267198 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.267414 5002 scope.go:117] "RemoveContainer" containerID="f13fe00e77baba4e4b1c63fb0b98b8e877ad92f8495f7fad687cac355e2f676a"
Dec 09 11:34:30 crc kubenswrapper[5002]: E1209 11:34:30.267805 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f13fe00e77baba4e4b1c63fb0b98b8e877ad92f8495f7fad687cac355e2f676a\": container with ID starting with f13fe00e77baba4e4b1c63fb0b98b8e877ad92f8495f7fad687cac355e2f676a not found: ID does not exist" containerID="f13fe00e77baba4e4b1c63fb0b98b8e877ad92f8495f7fad687cac355e2f676a"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.267864 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f13fe00e77baba4e4b1c63fb0b98b8e877ad92f8495f7fad687cac355e2f676a"} err="failed to get container status \"f13fe00e77baba4e4b1c63fb0b98b8e877ad92f8495f7fad687cac355e2f676a\": rpc error: code = NotFound desc = could not find container \"f13fe00e77baba4e4b1c63fb0b98b8e877ad92f8495f7fad687cac355e2f676a\": container with ID starting with f13fe00e77baba4e4b1c63fb0b98b8e877ad92f8495f7fad687cac355e2f676a not found: ID does not exist"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.267947 5002 scope.go:117] "RemoveContainer" containerID="cb82877223675fe9b44ffcb1eaa5bd6256170c97eb8e82caf200449a75461b05"
Dec 09 11:34:30 crc kubenswrapper[5002]: E1209 11:34:30.268420 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb82877223675fe9b44ffcb1eaa5bd6256170c97eb8e82caf200449a75461b05\": container with ID starting with cb82877223675fe9b44ffcb1eaa5bd6256170c97eb8e82caf200449a75461b05 not found: ID does not exist" containerID="cb82877223675fe9b44ffcb1eaa5bd6256170c97eb8e82caf200449a75461b05"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.268443 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb82877223675fe9b44ffcb1eaa5bd6256170c97eb8e82caf200449a75461b05"} err="failed to get container status \"cb82877223675fe9b44ffcb1eaa5bd6256170c97eb8e82caf200449a75461b05\": rpc error: code = NotFound desc = could not find container \"cb82877223675fe9b44ffcb1eaa5bd6256170c97eb8e82caf200449a75461b05\": container with ID starting with cb82877223675fe9b44ffcb1eaa5bd6256170c97eb8e82caf200449a75461b05 not found: ID does not exist"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.268456 5002 scope.go:117] "RemoveContainer" containerID="f13fe00e77baba4e4b1c63fb0b98b8e877ad92f8495f7fad687cac355e2f676a"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.268663 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f13fe00e77baba4e4b1c63fb0b98b8e877ad92f8495f7fad687cac355e2f676a"} err="failed to get container status \"f13fe00e77baba4e4b1c63fb0b98b8e877ad92f8495f7fad687cac355e2f676a\": rpc error: code = NotFound desc = could not find container \"f13fe00e77baba4e4b1c63fb0b98b8e877ad92f8495f7fad687cac355e2f676a\": container with ID starting with f13fe00e77baba4e4b1c63fb0b98b8e877ad92f8495f7fad687cac355e2f676a not found: ID does not exist"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.268675 5002 scope.go:117] "RemoveContainer" containerID="cb82877223675fe9b44ffcb1eaa5bd6256170c97eb8e82caf200449a75461b05"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.269262 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb82877223675fe9b44ffcb1eaa5bd6256170c97eb8e82caf200449a75461b05"} err="failed to get container status \"cb82877223675fe9b44ffcb1eaa5bd6256170c97eb8e82caf200449a75461b05\": rpc error: code = NotFound desc = could not find container \"cb82877223675fe9b44ffcb1eaa5bd6256170c97eb8e82caf200449a75461b05\": container with ID starting with cb82877223675fe9b44ffcb1eaa5bd6256170c97eb8e82caf200449a75461b05 not found: ID does not exist"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.269368 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.272715 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.276005 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.319458 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84da9958-b32b-4279-a0c6-25dcaeb857af-logs\") pod \"nova-api-0\" (UID: \"84da9958-b32b-4279-a0c6-25dcaeb857af\") " pod="openstack/nova-api-0"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.319524 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d91cb14-b0c2-42b5-9803-711c6d9af174-config-data\") pod \"nova-metadata-0\" (UID: \"1d91cb14-b0c2-42b5-9803-711c6d9af174\") " pod="openstack/nova-metadata-0"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.319562 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sstqp\" (UniqueName: \"kubernetes.io/projected/84da9958-b32b-4279-a0c6-25dcaeb857af-kube-api-access-sstqp\") pod \"nova-api-0\" (UID: \"84da9958-b32b-4279-a0c6-25dcaeb857af\") " pod="openstack/nova-api-0"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.319597 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84da9958-b32b-4279-a0c6-25dcaeb857af-config-data\") pod \"nova-api-0\" (UID: \"84da9958-b32b-4279-a0c6-25dcaeb857af\") " pod="openstack/nova-api-0"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.319656 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld5dh\" (UniqueName: \"kubernetes.io/projected/1d91cb14-b0c2-42b5-9803-711c6d9af174-kube-api-access-ld5dh\") pod \"nova-metadata-0\" (UID: \"1d91cb14-b0c2-42b5-9803-711c6d9af174\") " pod="openstack/nova-metadata-0"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.319672 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d91cb14-b0c2-42b5-9803-711c6d9af174-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1d91cb14-b0c2-42b5-9803-711c6d9af174\") " pod="openstack/nova-metadata-0"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.319705 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84da9958-b32b-4279-a0c6-25dcaeb857af-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"84da9958-b32b-4279-a0c6-25dcaeb857af\") " pod="openstack/nova-api-0"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.319720 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d91cb14-b0c2-42b5-9803-711c6d9af174-logs\") pod \"nova-metadata-0\" (UID: \"1d91cb14-b0c2-42b5-9803-711c6d9af174\") " pod="openstack/nova-metadata-0"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.320196 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84da9958-b32b-4279-a0c6-25dcaeb857af-logs\") pod \"nova-api-0\" (UID: \"84da9958-b32b-4279-a0c6-25dcaeb857af\") " pod="openstack/nova-api-0"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.322973 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84da9958-b32b-4279-a0c6-25dcaeb857af-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"84da9958-b32b-4279-a0c6-25dcaeb857af\") " pod="openstack/nova-api-0"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.323101 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84da9958-b32b-4279-a0c6-25dcaeb857af-config-data\") pod \"nova-api-0\" (UID: \"84da9958-b32b-4279-a0c6-25dcaeb857af\") " pod="openstack/nova-api-0"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.337727 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sstqp\" (UniqueName: \"kubernetes.io/projected/84da9958-b32b-4279-a0c6-25dcaeb857af-kube-api-access-sstqp\") pod \"nova-api-0\" (UID: \"84da9958-b32b-4279-a0c6-25dcaeb857af\") " pod="openstack/nova-api-0"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.421209 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ld5dh\" (UniqueName: \"kubernetes.io/projected/1d91cb14-b0c2-42b5-9803-711c6d9af174-kube-api-access-ld5dh\") pod \"nova-metadata-0\" (UID: \"1d91cb14-b0c2-42b5-9803-711c6d9af174\") " pod="openstack/nova-metadata-0"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.421267 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d91cb14-b0c2-42b5-9803-711c6d9af174-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1d91cb14-b0c2-42b5-9803-711c6d9af174\") " pod="openstack/nova-metadata-0"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.421302 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d91cb14-b0c2-42b5-9803-711c6d9af174-logs\") pod \"nova-metadata-0\" (UID: \"1d91cb14-b0c2-42b5-9803-711c6d9af174\") " pod="openstack/nova-metadata-0"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.421359 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d91cb14-b0c2-42b5-9803-711c6d9af174-config-data\") pod \"nova-metadata-0\" (UID: \"1d91cb14-b0c2-42b5-9803-711c6d9af174\") " pod="openstack/nova-metadata-0"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.422023 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d91cb14-b0c2-42b5-9803-711c6d9af174-logs\") pod \"nova-metadata-0\" (UID: \"1d91cb14-b0c2-42b5-9803-711c6d9af174\") " pod="openstack/nova-metadata-0"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.431757 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d91cb14-b0c2-42b5-9803-711c6d9af174-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1d91cb14-b0c2-42b5-9803-711c6d9af174\") " pod="openstack/nova-metadata-0"
Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.431959 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d91cb14-b0c2-42b5-9803-711c6d9af174-config-data\") pod \"nova-metadata-0\" (UID: \"1d91cb14-b0c2-42b5-9803-711c6d9af174\") " pod="openstack/nova-metadata-0"
\"1d91cb14-b0c2-42b5-9803-711c6d9af174\") " pod="openstack/nova-metadata-0" Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.443261 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld5dh\" (UniqueName: \"kubernetes.io/projected/1d91cb14-b0c2-42b5-9803-711c6d9af174-kube-api-access-ld5dh\") pod \"nova-metadata-0\" (UID: \"1d91cb14-b0c2-42b5-9803-711c6d9af174\") " pod="openstack/nova-metadata-0" Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.517237 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.600601 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.697004 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.716519 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.722006 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-696f9966c7-n4m46" Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.795239 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74df65d56c-l9t2c"] Dec 09 11:34:30 crc kubenswrapper[5002]: I1209 11:34:30.795462 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74df65d56c-l9t2c" podUID="16eec812-f54c-47c6-a589-678dde276c4a" containerName="dnsmasq-dns" containerID="cri-o://1701310af47e13f7baace4b38b57de9c5fb3d4eabeda36b3c7ed6c88475a9973" gracePeriod=10 Dec 09 11:34:31 crc kubenswrapper[5002]: I1209 11:34:31.001683 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 09 11:34:31 crc kubenswrapper[5002]: I1209 11:34:31.196871 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"84da9958-b32b-4279-a0c6-25dcaeb857af","Type":"ContainerStarted","Data":"e8f328d7083cdeb486781e82832727ba330f2e4d19c896e423b57362eba2d879"} Dec 09 11:34:31 crc kubenswrapper[5002]: I1209 11:34:31.204423 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 11:34:31 crc kubenswrapper[5002]: I1209 11:34:31.255925 5002 generic.go:334] "Generic (PLEG): container finished" podID="16eec812-f54c-47c6-a589-678dde276c4a" containerID="1701310af47e13f7baace4b38b57de9c5fb3d4eabeda36b3c7ed6c88475a9973" exitCode=0 Dec 09 11:34:31 crc kubenswrapper[5002]: I1209 11:34:31.256074 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74df65d56c-l9t2c" event={"ID":"16eec812-f54c-47c6-a589-678dde276c4a","Type":"ContainerDied","Data":"1701310af47e13f7baace4b38b57de9c5fb3d4eabeda36b3c7ed6c88475a9973"} Dec 09 11:34:31 crc kubenswrapper[5002]: I1209 11:34:31.289449 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:34:31 crc kubenswrapper[5002]: I1209 11:34:31.382994 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74df65d56c-l9t2c" Dec 09 11:34:31 crc kubenswrapper[5002]: I1209 11:34:31.455857 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/16eec812-f54c-47c6-a589-678dde276c4a-dns-svc\") pod \"16eec812-f54c-47c6-a589-678dde276c4a\" (UID: \"16eec812-f54c-47c6-a589-678dde276c4a\") " Dec 09 11:34:31 crc kubenswrapper[5002]: I1209 11:34:31.455945 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/16eec812-f54c-47c6-a589-678dde276c4a-ovsdbserver-sb\") pod \"16eec812-f54c-47c6-a589-678dde276c4a\" (UID: \"16eec812-f54c-47c6-a589-678dde276c4a\") " Dec 09 11:34:31 crc kubenswrapper[5002]: I1209 11:34:31.456009 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/16eec812-f54c-47c6-a589-678dde276c4a-ovsdbserver-nb\") pod \"16eec812-f54c-47c6-a589-678dde276c4a\" (UID: \"16eec812-f54c-47c6-a589-678dde276c4a\") " Dec 09 11:34:31 crc kubenswrapper[5002]: I1209 11:34:31.456092 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16eec812-f54c-47c6-a589-678dde276c4a-config\") pod \"16eec812-f54c-47c6-a589-678dde276c4a\" (UID: \"16eec812-f54c-47c6-a589-678dde276c4a\") " Dec 09 11:34:31 crc kubenswrapper[5002]: I1209 11:34:31.456192 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bj4z8\" (UniqueName: \"kubernetes.io/projected/16eec812-f54c-47c6-a589-678dde276c4a-kube-api-access-bj4z8\") pod \"16eec812-f54c-47c6-a589-678dde276c4a\" (UID: \"16eec812-f54c-47c6-a589-678dde276c4a\") " Dec 09 11:34:31 crc kubenswrapper[5002]: I1209 11:34:31.459710 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16eec812-f54c-47c6-a589-678dde276c4a-kube-api-access-bj4z8" (OuterVolumeSpecName: "kube-api-access-bj4z8") pod "16eec812-f54c-47c6-a589-678dde276c4a" (UID: "16eec812-f54c-47c6-a589-678dde276c4a"). InnerVolumeSpecName "kube-api-access-bj4z8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:34:31 crc kubenswrapper[5002]: I1209 11:34:31.507456 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16eec812-f54c-47c6-a589-678dde276c4a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "16eec812-f54c-47c6-a589-678dde276c4a" (UID: "16eec812-f54c-47c6-a589-678dde276c4a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:34:31 crc kubenswrapper[5002]: I1209 11:34:31.529354 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16eec812-f54c-47c6-a589-678dde276c4a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "16eec812-f54c-47c6-a589-678dde276c4a" (UID: "16eec812-f54c-47c6-a589-678dde276c4a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:34:31 crc kubenswrapper[5002]: I1209 11:34:31.530216 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16eec812-f54c-47c6-a589-678dde276c4a-config" (OuterVolumeSpecName: "config") pod "16eec812-f54c-47c6-a589-678dde276c4a" (UID: "16eec812-f54c-47c6-a589-678dde276c4a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:34:31 crc kubenswrapper[5002]: I1209 11:34:31.537740 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16eec812-f54c-47c6-a589-678dde276c4a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "16eec812-f54c-47c6-a589-678dde276c4a" (UID: "16eec812-f54c-47c6-a589-678dde276c4a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:34:31 crc kubenswrapper[5002]: I1209 11:34:31.558526 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bj4z8\" (UniqueName: \"kubernetes.io/projected/16eec812-f54c-47c6-a589-678dde276c4a-kube-api-access-bj4z8\") on node \"crc\" DevicePath \"\"" Dec 09 11:34:31 crc kubenswrapper[5002]: I1209 11:34:31.558565 5002 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/16eec812-f54c-47c6-a589-678dde276c4a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 11:34:31 crc kubenswrapper[5002]: I1209 11:34:31.558579 5002 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/16eec812-f54c-47c6-a589-678dde276c4a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 09 11:34:31 crc kubenswrapper[5002]: I1209 11:34:31.558593 5002 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/16eec812-f54c-47c6-a589-678dde276c4a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 11:34:31 crc kubenswrapper[5002]: I1209 11:34:31.558605 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16eec812-f54c-47c6-a589-678dde276c4a-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:34:32 crc kubenswrapper[5002]: I1209 11:34:32.070681 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79b301c9-b2dc-4e36-9940-cd6ac885fd6d" path="/var/lib/kubelet/pods/79b301c9-b2dc-4e36-9940-cd6ac885fd6d/volumes" Dec 09 11:34:32 crc kubenswrapper[5002]: I1209 11:34:32.071555 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d160668f-3435-4b33-ac14-34d6a19c9b62" path="/var/lib/kubelet/pods/d160668f-3435-4b33-ac14-34d6a19c9b62/volumes" Dec 09 11:34:32 crc kubenswrapper[5002]: I1209 11:34:32.266201 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"84da9958-b32b-4279-a0c6-25dcaeb857af","Type":"ContainerStarted","Data":"f66a9b6309169cf9769d3e459eef0516a1aeea865415a1b4b60f21ba7d425dc3"} Dec 09 11:34:32 crc kubenswrapper[5002]: I1209 11:34:32.266312 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"84da9958-b32b-4279-a0c6-25dcaeb857af","Type":"ContainerStarted","Data":"35d471ead1600a036b804e060a1f5044fce6c2704a91082fb6e3da7aab2d290e"} Dec 09 11:34:32 crc kubenswrapper[5002]: I1209 11:34:32.267915 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1d91cb14-b0c2-42b5-9803-711c6d9af174","Type":"ContainerStarted","Data":"e8e832526211217777a7440c662e8cb9b3b5584abab65702ef6d78e063eeab90"} Dec 09 11:34:32 crc kubenswrapper[5002]: I1209 11:34:32.267975 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1d91cb14-b0c2-42b5-9803-711c6d9af174","Type":"ContainerStarted","Data":"3d050d6134b33b6d6c3c5a3f715f2ea41386bb69cdceff35cf125975e187a9c6"} Dec 09 11:34:32 crc kubenswrapper[5002]: I1209 
11:34:32.267994 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1d91cb14-b0c2-42b5-9803-711c6d9af174","Type":"ContainerStarted","Data":"9c4e3ad1fcd33f0f8d764018b74d70ce5cfd6c83967ae8f5d90f2a6323caaa19"} Dec 09 11:34:32 crc kubenswrapper[5002]: I1209 11:34:32.269700 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74df65d56c-l9t2c" Dec 09 11:34:32 crc kubenswrapper[5002]: I1209 11:34:32.269739 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74df65d56c-l9t2c" event={"ID":"16eec812-f54c-47c6-a589-678dde276c4a","Type":"ContainerDied","Data":"40adc1e7b694f9eced2f615e323391ea074c4ef56a092ea0bdbd9505e373ce28"} Dec 09 11:34:32 crc kubenswrapper[5002]: I1209 11:34:32.269770 5002 scope.go:117] "RemoveContainer" containerID="1701310af47e13f7baace4b38b57de9c5fb3d4eabeda36b3c7ed6c88475a9973" Dec 09 11:34:32 crc kubenswrapper[5002]: I1209 11:34:32.289628 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.289608403 podStartE2EDuration="2.289608403s" podCreationTimestamp="2025-12-09 11:34:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:32.282917415 +0000 UTC m=+5604.674968506" watchObservedRunningTime="2025-12-09 11:34:32.289608403 +0000 UTC m=+5604.681659484" Dec 09 11:34:32 crc kubenswrapper[5002]: I1209 11:34:32.291291 5002 scope.go:117] "RemoveContainer" containerID="43cedbe4b3b104d834d8e2e8d92dfc163b26dcfd6ecd5f47edaf171e0619e046" Dec 09 11:34:32 crc kubenswrapper[5002]: I1209 11:34:32.308313 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74df65d56c-l9t2c"] Dec 09 11:34:32 crc kubenswrapper[5002]: I1209 11:34:32.323753 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74df65d56c-l9t2c"] Dec 09 11:34:32 crc kubenswrapper[5002]: I1209 11:34:32.326659 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.326643854 podStartE2EDuration="2.326643854s" podCreationTimestamp="2025-12-09 11:34:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:32.315365273 +0000 UTC m=+5604.707416374" watchObservedRunningTime="2025-12-09 11:34:32.326643854 +0000 UTC m=+5604.718694925" Dec 09 11:34:34 crc kubenswrapper[5002]: I1209 11:34:34.073425 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16eec812-f54c-47c6-a589-678dde276c4a" path="/var/lib/kubelet/pods/16eec812-f54c-47c6-a589-678dde276c4a/volumes" Dec 09 11:34:34 crc kubenswrapper[5002]: I1209 11:34:34.287271 5002 generic.go:334] "Generic (PLEG): container finished" podID="e9faf59d-a3e1-4d74-acd5-481eebd94f63" containerID="ced19961c74f67c4400fff13faf2518cec0899932c3ec03808ce8ef4152854e7" exitCode=0 Dec 09 11:34:34 crc kubenswrapper[5002]: I1209 11:34:34.287315 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e9faf59d-a3e1-4d74-acd5-481eebd94f63","Type":"ContainerDied","Data":"ced19961c74f67c4400fff13faf2518cec0899932c3ec03808ce8ef4152854e7"} Dec 09 11:34:35 crc kubenswrapper[5002]: I1209 11:34:35.171884 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 11:34:35 crc kubenswrapper[5002]: I1209 11:34:35.225247 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9faf59d-a3e1-4d74-acd5-481eebd94f63-combined-ca-bundle\") pod \"e9faf59d-a3e1-4d74-acd5-481eebd94f63\" (UID: \"e9faf59d-a3e1-4d74-acd5-481eebd94f63\") " Dec 09 11:34:35 crc kubenswrapper[5002]: I1209 11:34:35.225382 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9faf59d-a3e1-4d74-acd5-481eebd94f63-config-data\") pod \"e9faf59d-a3e1-4d74-acd5-481eebd94f63\" (UID: \"e9faf59d-a3e1-4d74-acd5-481eebd94f63\") " Dec 09 11:34:35 crc kubenswrapper[5002]: I1209 11:34:35.225472 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5stz\" (UniqueName: \"kubernetes.io/projected/e9faf59d-a3e1-4d74-acd5-481eebd94f63-kube-api-access-f5stz\") pod \"e9faf59d-a3e1-4d74-acd5-481eebd94f63\" (UID: \"e9faf59d-a3e1-4d74-acd5-481eebd94f63\") " Dec 09 11:34:35 crc kubenswrapper[5002]: I1209 11:34:35.232186 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9faf59d-a3e1-4d74-acd5-481eebd94f63-kube-api-access-f5stz" (OuterVolumeSpecName: "kube-api-access-f5stz") pod "e9faf59d-a3e1-4d74-acd5-481eebd94f63" (UID: "e9faf59d-a3e1-4d74-acd5-481eebd94f63"). InnerVolumeSpecName "kube-api-access-f5stz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:34:35 crc kubenswrapper[5002]: I1209 11:34:35.251834 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9faf59d-a3e1-4d74-acd5-481eebd94f63-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e9faf59d-a3e1-4d74-acd5-481eebd94f63" (UID: "e9faf59d-a3e1-4d74-acd5-481eebd94f63"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:34:35 crc kubenswrapper[5002]: I1209 11:34:35.256621 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9faf59d-a3e1-4d74-acd5-481eebd94f63-config-data" (OuterVolumeSpecName: "config-data") pod "e9faf59d-a3e1-4d74-acd5-481eebd94f63" (UID: "e9faf59d-a3e1-4d74-acd5-481eebd94f63"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:34:35 crc kubenswrapper[5002]: I1209 11:34:35.300428 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e9faf59d-a3e1-4d74-acd5-481eebd94f63","Type":"ContainerDied","Data":"0e739f6dc53124f6994b215ab19565b78ddf25ef4030a98b6aeb284b912fd585"} Dec 09 11:34:35 crc kubenswrapper[5002]: I1209 11:34:35.300597 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 11:34:35 crc kubenswrapper[5002]: I1209 11:34:35.300690 5002 scope.go:117] "RemoveContainer" containerID="ced19961c74f67c4400fff13faf2518cec0899932c3ec03808ce8ef4152854e7" Dec 09 11:34:35 crc kubenswrapper[5002]: I1209 11:34:35.333184 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9faf59d-a3e1-4d74-acd5-481eebd94f63-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:34:35 crc kubenswrapper[5002]: I1209 11:34:35.333228 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5stz\" (UniqueName: \"kubernetes.io/projected/e9faf59d-a3e1-4d74-acd5-481eebd94f63-kube-api-access-f5stz\") on node \"crc\" DevicePath \"\"" Dec 09 11:34:35 crc kubenswrapper[5002]: I1209 11:34:35.333246 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9faf59d-a3e1-4d74-acd5-481eebd94f63-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:34:35 crc kubenswrapper[5002]: I1209 11:34:35.340521 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 11:34:35 crc kubenswrapper[5002]: I1209 11:34:35.355126 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 11:34:35 crc kubenswrapper[5002]: I1209 11:34:35.365003 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 11:34:35 crc kubenswrapper[5002]: E1209 11:34:35.365491 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16eec812-f54c-47c6-a589-678dde276c4a" containerName="dnsmasq-dns" Dec 09 11:34:35 crc kubenswrapper[5002]: I1209 11:34:35.365514 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="16eec812-f54c-47c6-a589-678dde276c4a" containerName="dnsmasq-dns" Dec 09 11:34:35 crc kubenswrapper[5002]: E1209 11:34:35.365541 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16eec812-f54c-47c6-a589-678dde276c4a" containerName="init" Dec 09 11:34:35 crc kubenswrapper[5002]: I1209 11:34:35.365549 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="16eec812-f54c-47c6-a589-678dde276c4a" containerName="init" Dec 09 11:34:35 crc kubenswrapper[5002]: E1209 11:34:35.365583 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9faf59d-a3e1-4d74-acd5-481eebd94f63" containerName="nova-scheduler-scheduler" Dec 09 11:34:35 crc kubenswrapper[5002]: I1209 11:34:35.365591 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9faf59d-a3e1-4d74-acd5-481eebd94f63" containerName="nova-scheduler-scheduler" Dec 09 11:34:35 crc kubenswrapper[5002]: I1209 11:34:35.365834 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="16eec812-f54c-47c6-a589-678dde276c4a" containerName="dnsmasq-dns" Dec 09 11:34:35 crc kubenswrapper[5002]: I1209 11:34:35.365869 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9faf59d-a3e1-4d74-acd5-481eebd94f63" containerName="nova-scheduler-scheduler" Dec 09 11:34:35 crc kubenswrapper[5002]: I1209 11:34:35.366678 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 11:34:35 crc kubenswrapper[5002]: I1209 11:34:35.371933 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 09 11:34:35 crc kubenswrapper[5002]: I1209 11:34:35.372740 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 11:34:35 crc kubenswrapper[5002]: I1209 11:34:35.434547 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/752b8b3b-bb27-4905-9f0a-32af13e00b64-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"752b8b3b-bb27-4905-9f0a-32af13e00b64\") " pod="openstack/nova-scheduler-0" Dec 09 11:34:35 crc kubenswrapper[5002]: I1209 11:34:35.434612 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdst7\" (UniqueName: \"kubernetes.io/projected/752b8b3b-bb27-4905-9f0a-32af13e00b64-kube-api-access-sdst7\") pod \"nova-scheduler-0\" (UID: \"752b8b3b-bb27-4905-9f0a-32af13e00b64\") " pod="openstack/nova-scheduler-0" Dec 09 11:34:35 crc kubenswrapper[5002]: I1209 11:34:35.434907 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/752b8b3b-bb27-4905-9f0a-32af13e00b64-config-data\") pod \"nova-scheduler-0\" (UID: \"752b8b3b-bb27-4905-9f0a-32af13e00b64\") " pod="openstack/nova-scheduler-0" Dec 09 11:34:35 crc kubenswrapper[5002]: I1209 11:34:35.539757 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/752b8b3b-bb27-4905-9f0a-32af13e00b64-config-data\") pod \"nova-scheduler-0\" (UID: \"752b8b3b-bb27-4905-9f0a-32af13e00b64\") " pod="openstack/nova-scheduler-0" Dec 09 11:34:35 crc kubenswrapper[5002]: I1209 11:34:35.539899 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/752b8b3b-bb27-4905-9f0a-32af13e00b64-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"752b8b3b-bb27-4905-9f0a-32af13e00b64\") " pod="openstack/nova-scheduler-0" Dec 09 11:34:35 crc kubenswrapper[5002]: I1209 11:34:35.539928 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdst7\" (UniqueName: \"kubernetes.io/projected/752b8b3b-bb27-4905-9f0a-32af13e00b64-kube-api-access-sdst7\") pod \"nova-scheduler-0\" (UID: \"752b8b3b-bb27-4905-9f0a-32af13e00b64\") " pod="openstack/nova-scheduler-0" Dec 09 11:34:35 crc kubenswrapper[5002]: I1209 11:34:35.549750 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/752b8b3b-bb27-4905-9f0a-32af13e00b64-config-data\") pod \"nova-scheduler-0\" (UID: \"752b8b3b-bb27-4905-9f0a-32af13e00b64\") " pod="openstack/nova-scheduler-0" Dec 09 11:34:35 crc kubenswrapper[5002]: I1209 11:34:35.576659 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/752b8b3b-bb27-4905-9f0a-32af13e00b64-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"752b8b3b-bb27-4905-9f0a-32af13e00b64\") " pod="openstack/nova-scheduler-0" Dec 09 11:34:35 crc kubenswrapper[5002]: I1209 11:34:35.579557 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdst7\" (UniqueName: 
\"kubernetes.io/projected/752b8b3b-bb27-4905-9f0a-32af13e00b64-kube-api-access-sdst7\") pod \"nova-scheduler-0\" (UID: \"752b8b3b-bb27-4905-9f0a-32af13e00b64\") " pod="openstack/nova-scheduler-0" Dec 09 11:34:35 crc kubenswrapper[5002]: I1209 11:34:35.600849 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 09 11:34:35 crc kubenswrapper[5002]: I1209 11:34:35.601274 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 09 11:34:35 crc kubenswrapper[5002]: I1209 11:34:35.686357 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 11:34:36 crc kubenswrapper[5002]: I1209 11:34:36.073033 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9faf59d-a3e1-4d74-acd5-481eebd94f63" path="/var/lib/kubelet/pods/e9faf59d-a3e1-4d74-acd5-481eebd94f63/volumes" Dec 09 11:34:36 crc kubenswrapper[5002]: I1209 11:34:36.135169 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 11:34:36 crc kubenswrapper[5002]: W1209 11:34:36.141041 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod752b8b3b_bb27_4905_9f0a_32af13e00b64.slice/crio-ec845a9852aaaf8a1b18e422ab1476dd4ec37264d3cdb5d13e6f269eab52bf86 WatchSource:0}: Error finding container ec845a9852aaaf8a1b18e422ab1476dd4ec37264d3cdb5d13e6f269eab52bf86: Status 404 returned error can't find the container with id ec845a9852aaaf8a1b18e422ab1476dd4ec37264d3cdb5d13e6f269eab52bf86 Dec 09 11:34:36 crc kubenswrapper[5002]: I1209 11:34:36.308900 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"752b8b3b-bb27-4905-9f0a-32af13e00b64","Type":"ContainerStarted","Data":"ec845a9852aaaf8a1b18e422ab1476dd4ec37264d3cdb5d13e6f269eab52bf86"} Dec 09 11:34:37 crc kubenswrapper[5002]: I1209 11:34:37.320549 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"752b8b3b-bb27-4905-9f0a-32af13e00b64","Type":"ContainerStarted","Data":"6fb67cbc5aa10d627dfc05a900b094b7b232440b615e5910bbcd69eaf6c47322"} Dec 09 11:34:37 crc kubenswrapper[5002]: I1209 11:34:37.343663 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.343640892 podStartE2EDuration="2.343640892s" podCreationTimestamp="2025-12-09 11:34:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:37.336477341 +0000 UTC m=+5609.728528442" watchObservedRunningTime="2025-12-09 11:34:37.343640892 +0000 UTC m=+5609.735691963" Dec 09 11:34:38 crc kubenswrapper[5002]: I1209 11:34:38.490757 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 09 11:34:38 crc kubenswrapper[5002]: I1209 11:34:38.993239 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-8kjkp"] Dec 09 11:34:38 crc kubenswrapper[5002]: I1209 11:34:38.994974 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-8kjkp" Dec 09 11:34:38 crc kubenswrapper[5002]: I1209 11:34:38.997870 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 09 11:34:38 crc kubenswrapper[5002]: I1209 11:34:38.997969 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 09 11:34:39 crc kubenswrapper[5002]: I1209 11:34:39.005766 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-8kjkp"] Dec 09 11:34:39 crc kubenswrapper[5002]: I1209 11:34:39.111580 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8013239c-dfd4-407a-aa67-d57b2ce2aac5-scripts\") pod \"nova-cell1-cell-mapping-8kjkp\" (UID: \"8013239c-dfd4-407a-aa67-d57b2ce2aac5\") " pod="openstack/nova-cell1-cell-mapping-8kjkp" Dec 09 11:34:39 crc kubenswrapper[5002]: I1209 11:34:39.111996 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8013239c-dfd4-407a-aa67-d57b2ce2aac5-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-8kjkp\" (UID: \"8013239c-dfd4-407a-aa67-d57b2ce2aac5\") " pod="openstack/nova-cell1-cell-mapping-8kjkp" Dec 09 11:34:39 crc kubenswrapper[5002]: I1209 11:34:39.112026 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlq94\" (UniqueName: \"kubernetes.io/projected/8013239c-dfd4-407a-aa67-d57b2ce2aac5-kube-api-access-hlq94\") pod \"nova-cell1-cell-mapping-8kjkp\" (UID: \"8013239c-dfd4-407a-aa67-d57b2ce2aac5\") " pod="openstack/nova-cell1-cell-mapping-8kjkp" Dec 09 11:34:39 crc kubenswrapper[5002]: I1209 11:34:39.112074 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8013239c-dfd4-407a-aa67-d57b2ce2aac5-config-data\") pod \"nova-cell1-cell-mapping-8kjkp\" (UID: \"8013239c-dfd4-407a-aa67-d57b2ce2aac5\") " pod="openstack/nova-cell1-cell-mapping-8kjkp" Dec 09 11:34:39 crc kubenswrapper[5002]: I1209 11:34:39.213484 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8013239c-dfd4-407a-aa67-d57b2ce2aac5-scripts\") pod \"nova-cell1-cell-mapping-8kjkp\" (UID: \"8013239c-dfd4-407a-aa67-d57b2ce2aac5\") " pod="openstack/nova-cell1-cell-mapping-8kjkp" Dec 09 11:34:39 crc kubenswrapper[5002]: I1209 11:34:39.213935 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8013239c-dfd4-407a-aa67-d57b2ce2aac5-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-8kjkp\" (UID: \"8013239c-dfd4-407a-aa67-d57b2ce2aac5\") " pod="openstack/nova-cell1-cell-mapping-8kjkp" Dec 09 11:34:39 crc kubenswrapper[5002]: I1209 11:34:39.214054 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlq94\" (UniqueName: \"kubernetes.io/projected/8013239c-dfd4-407a-aa67-d57b2ce2aac5-kube-api-access-hlq94\") pod \"nova-cell1-cell-mapping-8kjkp\" (UID: \"8013239c-dfd4-407a-aa67-d57b2ce2aac5\") " pod="openstack/nova-cell1-cell-mapping-8kjkp" Dec 09 11:34:39 crc kubenswrapper[5002]: I1209 11:34:39.214172 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/8013239c-dfd4-407a-aa67-d57b2ce2aac5-config-data\") pod \"nova-cell1-cell-mapping-8kjkp\" (UID: \"8013239c-dfd4-407a-aa67-d57b2ce2aac5\") " pod="openstack/nova-cell1-cell-mapping-8kjkp" Dec 09 11:34:39 crc kubenswrapper[5002]: I1209 11:34:39.219467 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8013239c-dfd4-407a-aa67-d57b2ce2aac5-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-8kjkp\" (UID: \"8013239c-dfd4-407a-aa67-d57b2ce2aac5\") " pod="openstack/nova-cell1-cell-mapping-8kjkp" Dec 09 11:34:39 crc kubenswrapper[5002]: I1209 11:34:39.219474 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8013239c-dfd4-407a-aa67-d57b2ce2aac5-config-data\") pod \"nova-cell1-cell-mapping-8kjkp\" (UID: \"8013239c-dfd4-407a-aa67-d57b2ce2aac5\") " pod="openstack/nova-cell1-cell-mapping-8kjkp" Dec 09 11:34:39 crc kubenswrapper[5002]: I1209 11:34:39.221865 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8013239c-dfd4-407a-aa67-d57b2ce2aac5-scripts\") pod \"nova-cell1-cell-mapping-8kjkp\" (UID: \"8013239c-dfd4-407a-aa67-d57b2ce2aac5\") " pod="openstack/nova-cell1-cell-mapping-8kjkp" Dec 09 11:34:39 crc kubenswrapper[5002]: I1209 11:34:39.237337 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlq94\" (UniqueName: \"kubernetes.io/projected/8013239c-dfd4-407a-aa67-d57b2ce2aac5-kube-api-access-hlq94\") pod \"nova-cell1-cell-mapping-8kjkp\" (UID: \"8013239c-dfd4-407a-aa67-d57b2ce2aac5\") " pod="openstack/nova-cell1-cell-mapping-8kjkp" Dec 09 11:34:39 crc kubenswrapper[5002]: I1209 11:34:39.325434 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-8kjkp" Dec 09 11:34:39 crc kubenswrapper[5002]: I1209 11:34:39.797428 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-8kjkp"] Dec 09 11:34:40 crc kubenswrapper[5002]: I1209 11:34:40.371682 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-8kjkp" event={"ID":"8013239c-dfd4-407a-aa67-d57b2ce2aac5","Type":"ContainerStarted","Data":"634ad4e82176747ed0439293087d0fa0499d64dac7e1b6fb3087fd5178f4cf7d"} Dec 09 11:34:40 crc kubenswrapper[5002]: I1209 11:34:40.372076 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-8kjkp" event={"ID":"8013239c-dfd4-407a-aa67-d57b2ce2aac5","Type":"ContainerStarted","Data":"e13b2c600bcac3f534d11f63cca8563b0d516d9dfef489b1dafdd78e9503e673"} Dec 09 11:34:40 crc kubenswrapper[5002]: I1209 11:34:40.393334 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-8kjkp" podStartSLOduration=2.393309318 podStartE2EDuration="2.393309318s" podCreationTimestamp="2025-12-09 11:34:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:40.387092352 +0000 UTC m=+5612.779143433" watchObservedRunningTime="2025-12-09 11:34:40.393309318 +0000 UTC m=+5612.785360409" Dec 09 11:34:40 crc kubenswrapper[5002]: I1209 11:34:40.517617 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 09 11:34:40 crc kubenswrapper[5002]: I1209 11:34:40.517669 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 09 11:34:40 crc kubenswrapper[5002]: I1209 11:34:40.601791 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 09 11:34:40 crc kubenswrapper[5002]: I1209 11:34:40.602069 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 09 11:34:40 crc kubenswrapper[5002]: I1209 11:34:40.687118 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 09 11:34:41 crc kubenswrapper[5002]: I1209 11:34:41.601055 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="84da9958-b32b-4279-a0c6-25dcaeb857af" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.62:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 09 11:34:41 crc kubenswrapper[5002]: I1209 11:34:41.601282 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="84da9958-b32b-4279-a0c6-25dcaeb857af" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.62:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 09 11:34:41 crc kubenswrapper[5002]: I1209 11:34:41.685107 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="1d91cb14-b0c2-42b5-9803-711c6d9af174" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.63:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 09 11:34:41 crc kubenswrapper[5002]: I1209 11:34:41.685381 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" 
podUID="1d91cb14-b0c2-42b5-9803-711c6d9af174" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.63:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 09 11:34:45 crc kubenswrapper[5002]: I1209 11:34:45.429328 5002 generic.go:334] "Generic (PLEG): container finished" podID="8013239c-dfd4-407a-aa67-d57b2ce2aac5" containerID="634ad4e82176747ed0439293087d0fa0499d64dac7e1b6fb3087fd5178f4cf7d" exitCode=0 Dec 09 11:34:45 crc kubenswrapper[5002]: I1209 11:34:45.429597 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-8kjkp" event={"ID":"8013239c-dfd4-407a-aa67-d57b2ce2aac5","Type":"ContainerDied","Data":"634ad4e82176747ed0439293087d0fa0499d64dac7e1b6fb3087fd5178f4cf7d"} Dec 09 11:34:45 crc kubenswrapper[5002]: I1209 11:34:45.687397 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 09 11:34:45 crc kubenswrapper[5002]: I1209 11:34:45.721243 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 09 11:34:46 crc kubenswrapper[5002]: I1209 11:34:46.477238 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 09 11:34:46 crc kubenswrapper[5002]: I1209 11:34:46.508046 5002 scope.go:117] "RemoveContainer" containerID="b680270350965904d5be19221fc23cfb017210cc89b8668f43e079f9af6459cb" Dec 09 11:34:46 crc kubenswrapper[5002]: I1209 11:34:46.802995 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-8kjkp" Dec 09 11:34:46 crc kubenswrapper[5002]: I1209 11:34:46.992663 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlq94\" (UniqueName: \"kubernetes.io/projected/8013239c-dfd4-407a-aa67-d57b2ce2aac5-kube-api-access-hlq94\") pod \"8013239c-dfd4-407a-aa67-d57b2ce2aac5\" (UID: \"8013239c-dfd4-407a-aa67-d57b2ce2aac5\") " Dec 09 11:34:46 crc kubenswrapper[5002]: I1209 11:34:46.992777 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8013239c-dfd4-407a-aa67-d57b2ce2aac5-config-data\") pod \"8013239c-dfd4-407a-aa67-d57b2ce2aac5\" (UID: \"8013239c-dfd4-407a-aa67-d57b2ce2aac5\") " Dec 09 11:34:46 crc kubenswrapper[5002]: I1209 11:34:46.992877 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8013239c-dfd4-407a-aa67-d57b2ce2aac5-combined-ca-bundle\") pod \"8013239c-dfd4-407a-aa67-d57b2ce2aac5\" (UID: \"8013239c-dfd4-407a-aa67-d57b2ce2aac5\") " Dec 09 11:34:46 crc kubenswrapper[5002]: I1209 11:34:46.992911 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8013239c-dfd4-407a-aa67-d57b2ce2aac5-scripts\") pod \"8013239c-dfd4-407a-aa67-d57b2ce2aac5\" (UID: \"8013239c-dfd4-407a-aa67-d57b2ce2aac5\") " Dec 09 11:34:46 crc kubenswrapper[5002]: I1209 11:34:46.998949 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8013239c-dfd4-407a-aa67-d57b2ce2aac5-scripts" (OuterVolumeSpecName: "scripts") pod "8013239c-dfd4-407a-aa67-d57b2ce2aac5" (UID: "8013239c-dfd4-407a-aa67-d57b2ce2aac5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:34:46 crc kubenswrapper[5002]: I1209 11:34:46.998971 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8013239c-dfd4-407a-aa67-d57b2ce2aac5-kube-api-access-hlq94" (OuterVolumeSpecName: "kube-api-access-hlq94") pod "8013239c-dfd4-407a-aa67-d57b2ce2aac5" (UID: "8013239c-dfd4-407a-aa67-d57b2ce2aac5"). InnerVolumeSpecName "kube-api-access-hlq94". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:34:47 crc kubenswrapper[5002]: I1209 11:34:47.018117 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8013239c-dfd4-407a-aa67-d57b2ce2aac5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8013239c-dfd4-407a-aa67-d57b2ce2aac5" (UID: "8013239c-dfd4-407a-aa67-d57b2ce2aac5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:34:47 crc kubenswrapper[5002]: I1209 11:34:47.031863 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8013239c-dfd4-407a-aa67-d57b2ce2aac5-config-data" (OuterVolumeSpecName: "config-data") pod "8013239c-dfd4-407a-aa67-d57b2ce2aac5" (UID: "8013239c-dfd4-407a-aa67-d57b2ce2aac5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:34:47 crc kubenswrapper[5002]: I1209 11:34:47.096523 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8013239c-dfd4-407a-aa67-d57b2ce2aac5-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:34:47 crc kubenswrapper[5002]: I1209 11:34:47.096567 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8013239c-dfd4-407a-aa67-d57b2ce2aac5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:34:47 crc kubenswrapper[5002]: I1209 11:34:47.096582 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8013239c-dfd4-407a-aa67-d57b2ce2aac5-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:34:47 crc kubenswrapper[5002]: I1209 11:34:47.096595 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlq94\" (UniqueName: \"kubernetes.io/projected/8013239c-dfd4-407a-aa67-d57b2ce2aac5-kube-api-access-hlq94\") on node \"crc\" DevicePath \"\"" Dec 09 11:34:47 crc kubenswrapper[5002]: I1209 11:34:47.450078 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-8kjkp" Dec 09 11:34:47 crc kubenswrapper[5002]: I1209 11:34:47.450102 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-8kjkp" event={"ID":"8013239c-dfd4-407a-aa67-d57b2ce2aac5","Type":"ContainerDied","Data":"e13b2c600bcac3f534d11f63cca8563b0d516d9dfef489b1dafdd78e9503e673"} Dec 09 11:34:47 crc kubenswrapper[5002]: I1209 11:34:47.450202 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e13b2c600bcac3f534d11f63cca8563b0d516d9dfef489b1dafdd78e9503e673" Dec 09 11:34:47 crc kubenswrapper[5002]: I1209 11:34:47.643192 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 09 11:34:47 crc kubenswrapper[5002]: I1209 11:34:47.644299 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="84da9958-b32b-4279-a0c6-25dcaeb857af" containerName="nova-api-log" containerID="cri-o://35d471ead1600a036b804e060a1f5044fce6c2704a91082fb6e3da7aab2d290e" gracePeriod=30 Dec 09 11:34:47 crc kubenswrapper[5002]: I1209 11:34:47.644375 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="84da9958-b32b-4279-a0c6-25dcaeb857af" containerName="nova-api-api" containerID="cri-o://f66a9b6309169cf9769d3e459eef0516a1aeea865415a1b4b60f21ba7d425dc3" gracePeriod=30 Dec 09 11:34:47 crc kubenswrapper[5002]: I1209 11:34:47.664682 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 11:34:47 crc kubenswrapper[5002]: I1209 11:34:47.683199 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 11:34:47 crc kubenswrapper[5002]: I1209 11:34:47.683479 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1d91cb14-b0c2-42b5-9803-711c6d9af174" containerName="nova-metadata-log" containerID="cri-o://3d050d6134b33b6d6c3c5a3f715f2ea41386bb69cdceff35cf125975e187a9c6" gracePeriod=30 Dec 09 11:34:47 crc kubenswrapper[5002]: I1209 11:34:47.683538 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1d91cb14-b0c2-42b5-9803-711c6d9af174" containerName="nova-metadata-metadata" containerID="cri-o://e8e832526211217777a7440c662e8cb9b3b5584abab65702ef6d78e063eeab90" gracePeriod=30 Dec 09 11:34:48 crc kubenswrapper[5002]: I1209 11:34:48.461276 5002 generic.go:334] "Generic (PLEG): container finished" podID="84da9958-b32b-4279-a0c6-25dcaeb857af" containerID="35d471ead1600a036b804e060a1f5044fce6c2704a91082fb6e3da7aab2d290e" exitCode=143 Dec 09 11:34:48 crc kubenswrapper[5002]: I1209 11:34:48.461310 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"84da9958-b32b-4279-a0c6-25dcaeb857af","Type":"ContainerDied","Data":"35d471ead1600a036b804e060a1f5044fce6c2704a91082fb6e3da7aab2d290e"} Dec 09 11:34:48 crc kubenswrapper[5002]: I1209 11:34:48.463349 5002 generic.go:334] "Generic (PLEG): container finished" podID="1d91cb14-b0c2-42b5-9803-711c6d9af174" containerID="3d050d6134b33b6d6c3c5a3f715f2ea41386bb69cdceff35cf125975e187a9c6" exitCode=143 Dec 09 11:34:48 crc kubenswrapper[5002]: I1209 11:34:48.463401 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1d91cb14-b0c2-42b5-9803-711c6d9af174","Type":"ContainerDied","Data":"3d050d6134b33b6d6c3c5a3f715f2ea41386bb69cdceff35cf125975e187a9c6"} 
Dec 09 11:34:48 crc kubenswrapper[5002]: I1209 11:34:48.463501 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="752b8b3b-bb27-4905-9f0a-32af13e00b64" containerName="nova-scheduler-scheduler" containerID="cri-o://6fb67cbc5aa10d627dfc05a900b094b7b232440b615e5910bbcd69eaf6c47322" gracePeriod=30 Dec 09 11:34:50 crc kubenswrapper[5002]: E1209 11:34:50.697170 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6fb67cbc5aa10d627dfc05a900b094b7b232440b615e5910bbcd69eaf6c47322" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 09 11:34:50 crc kubenswrapper[5002]: E1209 11:34:50.701231 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6fb67cbc5aa10d627dfc05a900b094b7b232440b615e5910bbcd69eaf6c47322" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 09 11:34:50 crc kubenswrapper[5002]: E1209 11:34:50.703490 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6fb67cbc5aa10d627dfc05a900b094b7b232440b615e5910bbcd69eaf6c47322" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 09 11:34:50 crc kubenswrapper[5002]: E1209 11:34:50.703553 5002 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="752b8b3b-bb27-4905-9f0a-32af13e00b64" containerName="nova-scheduler-scheduler" Dec 09 11:34:51 crc kubenswrapper[5002]: I1209 11:34:51.492311 5002 generic.go:334] "Generic (PLEG): container finished" podID="84da9958-b32b-4279-a0c6-25dcaeb857af" containerID="f66a9b6309169cf9769d3e459eef0516a1aeea865415a1b4b60f21ba7d425dc3" exitCode=0 Dec 09 11:34:51 crc kubenswrapper[5002]: I1209 11:34:51.492383 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"84da9958-b32b-4279-a0c6-25dcaeb857af","Type":"ContainerDied","Data":"f66a9b6309169cf9769d3e459eef0516a1aeea865415a1b4b60f21ba7d425dc3"} Dec 09 11:34:51 crc kubenswrapper[5002]: I1209 11:34:51.494510 5002 generic.go:334] "Generic (PLEG): container finished" podID="1d91cb14-b0c2-42b5-9803-711c6d9af174" containerID="e8e832526211217777a7440c662e8cb9b3b5584abab65702ef6d78e063eeab90" exitCode=0 Dec 09 11:34:51 crc kubenswrapper[5002]: I1209 11:34:51.494559 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1d91cb14-b0c2-42b5-9803-711c6d9af174","Type":"ContainerDied","Data":"e8e832526211217777a7440c662e8cb9b3b5584abab65702ef6d78e063eeab90"} Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.197206 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.204982 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.383888 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d91cb14-b0c2-42b5-9803-711c6d9af174-combined-ca-bundle\") pod \"1d91cb14-b0c2-42b5-9803-711c6d9af174\" (UID: \"1d91cb14-b0c2-42b5-9803-711c6d9af174\") " Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.383991 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sstqp\" (UniqueName: \"kubernetes.io/projected/84da9958-b32b-4279-a0c6-25dcaeb857af-kube-api-access-sstqp\") pod \"84da9958-b32b-4279-a0c6-25dcaeb857af\" (UID: \"84da9958-b32b-4279-a0c6-25dcaeb857af\") " Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.384053 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84da9958-b32b-4279-a0c6-25dcaeb857af-combined-ca-bundle\") pod \"84da9958-b32b-4279-a0c6-25dcaeb857af\" (UID: \"84da9958-b32b-4279-a0c6-25dcaeb857af\") " Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.384806 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84da9958-b32b-4279-a0c6-25dcaeb857af-config-data\") pod \"84da9958-b32b-4279-a0c6-25dcaeb857af\" (UID: \"84da9958-b32b-4279-a0c6-25dcaeb857af\") " Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.385091 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d91cb14-b0c2-42b5-9803-711c6d9af174-config-data\") pod \"1d91cb14-b0c2-42b5-9803-711c6d9af174\" (UID: \"1d91cb14-b0c2-42b5-9803-711c6d9af174\") " Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.385120 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ld5dh\" (UniqueName: \"kubernetes.io/projected/1d91cb14-b0c2-42b5-9803-711c6d9af174-kube-api-access-ld5dh\") pod \"1d91cb14-b0c2-42b5-9803-711c6d9af174\" (UID: \"1d91cb14-b0c2-42b5-9803-711c6d9af174\") " Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.385177 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d91cb14-b0c2-42b5-9803-711c6d9af174-logs\") pod \"1d91cb14-b0c2-42b5-9803-711c6d9af174\" (UID: \"1d91cb14-b0c2-42b5-9803-711c6d9af174\") " Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.385203 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84da9958-b32b-4279-a0c6-25dcaeb857af-logs\") pod \"84da9958-b32b-4279-a0c6-25dcaeb857af\" (UID: \"84da9958-b32b-4279-a0c6-25dcaeb857af\") " Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.385754 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d91cb14-b0c2-42b5-9803-711c6d9af174-logs" (OuterVolumeSpecName: "logs") pod "1d91cb14-b0c2-42b5-9803-711c6d9af174" (UID: "1d91cb14-b0c2-42b5-9803-711c6d9af174"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.386016 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84da9958-b32b-4279-a0c6-25dcaeb857af-logs" (OuterVolumeSpecName: "logs") pod "84da9958-b32b-4279-a0c6-25dcaeb857af" (UID: "84da9958-b32b-4279-a0c6-25dcaeb857af"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.386353 5002 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d91cb14-b0c2-42b5-9803-711c6d9af174-logs\") on node \"crc\" DevicePath \"\"" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.386378 5002 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84da9958-b32b-4279-a0c6-25dcaeb857af-logs\") on node \"crc\" DevicePath \"\"" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.394409 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84da9958-b32b-4279-a0c6-25dcaeb857af-kube-api-access-sstqp" (OuterVolumeSpecName: "kube-api-access-sstqp") pod "84da9958-b32b-4279-a0c6-25dcaeb857af" (UID: "84da9958-b32b-4279-a0c6-25dcaeb857af"). InnerVolumeSpecName "kube-api-access-sstqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.408715 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d91cb14-b0c2-42b5-9803-711c6d9af174-kube-api-access-ld5dh" (OuterVolumeSpecName: "kube-api-access-ld5dh") pod "1d91cb14-b0c2-42b5-9803-711c6d9af174" (UID: "1d91cb14-b0c2-42b5-9803-711c6d9af174"). InnerVolumeSpecName "kube-api-access-ld5dh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.414731 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84da9958-b32b-4279-a0c6-25dcaeb857af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84da9958-b32b-4279-a0c6-25dcaeb857af" (UID: "84da9958-b32b-4279-a0c6-25dcaeb857af"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.419960 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d91cb14-b0c2-42b5-9803-711c6d9af174-config-data" (OuterVolumeSpecName: "config-data") pod "1d91cb14-b0c2-42b5-9803-711c6d9af174" (UID: "1d91cb14-b0c2-42b5-9803-711c6d9af174"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.421048 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84da9958-b32b-4279-a0c6-25dcaeb857af-config-data" (OuterVolumeSpecName: "config-data") pod "84da9958-b32b-4279-a0c6-25dcaeb857af" (UID: "84da9958-b32b-4279-a0c6-25dcaeb857af"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.424730 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d91cb14-b0c2-42b5-9803-711c6d9af174-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d91cb14-b0c2-42b5-9803-711c6d9af174" (UID: "1d91cb14-b0c2-42b5-9803-711c6d9af174"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.490263 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sstqp\" (UniqueName: \"kubernetes.io/projected/84da9958-b32b-4279-a0c6-25dcaeb857af-kube-api-access-sstqp\") on node \"crc\" DevicePath \"\"" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.490310 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84da9958-b32b-4279-a0c6-25dcaeb857af-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.490324 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84da9958-b32b-4279-a0c6-25dcaeb857af-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.490339 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d91cb14-b0c2-42b5-9803-711c6d9af174-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.490354 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ld5dh\" (UniqueName: \"kubernetes.io/projected/1d91cb14-b0c2-42b5-9803-711c6d9af174-kube-api-access-ld5dh\") on node \"crc\" DevicePath \"\"" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.490368 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d91cb14-b0c2-42b5-9803-711c6d9af174-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.503999 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"84da9958-b32b-4279-a0c6-25dcaeb857af","Type":"ContainerDied","Data":"e8f328d7083cdeb486781e82832727ba330f2e4d19c896e423b57362eba2d879"} Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.504054 5002 scope.go:117] "RemoveContainer" containerID="f66a9b6309169cf9769d3e459eef0516a1aeea865415a1b4b60f21ba7d425dc3" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.504174 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.513779 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1d91cb14-b0c2-42b5-9803-711c6d9af174","Type":"ContainerDied","Data":"9c4e3ad1fcd33f0f8d764018b74d70ce5cfd6c83967ae8f5d90f2a6323caaa19"} Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.513894 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.554977 5002 scope.go:117] "RemoveContainer" containerID="35d471ead1600a036b804e060a1f5044fce6c2704a91082fb6e3da7aab2d290e" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.603958 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.619734 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.621032 5002 scope.go:117] "RemoveContainer" containerID="e8e832526211217777a7440c662e8cb9b3b5584abab65702ef6d78e063eeab90" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.631061 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.639890 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.650895 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 09 11:34:52 crc kubenswrapper[5002]: E1209 11:34:52.651230 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d91cb14-b0c2-42b5-9803-711c6d9af174" containerName="nova-metadata-metadata" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.651247 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d91cb14-b0c2-42b5-9803-711c6d9af174" containerName="nova-metadata-metadata" Dec 09 11:34:52 crc kubenswrapper[5002]: E1209 11:34:52.651272 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84da9958-b32b-4279-a0c6-25dcaeb857af" containerName="nova-api-api" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.651279 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="84da9958-b32b-4279-a0c6-25dcaeb857af" containerName="nova-api-api" Dec 09 11:34:52 crc kubenswrapper[5002]: E1209 11:34:52.651289 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d91cb14-b0c2-42b5-9803-711c6d9af174" containerName="nova-metadata-log" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.651295 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d91cb14-b0c2-42b5-9803-711c6d9af174" containerName="nova-metadata-log" Dec 09 11:34:52 crc kubenswrapper[5002]: E1209 11:34:52.651305 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8013239c-dfd4-407a-aa67-d57b2ce2aac5" containerName="nova-manage" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.651311 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="8013239c-dfd4-407a-aa67-d57b2ce2aac5" containerName="nova-manage" Dec 09 11:34:52 crc kubenswrapper[5002]: E1209 11:34:52.651326 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84da9958-b32b-4279-a0c6-25dcaeb857af" containerName="nova-api-log" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.651332 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="84da9958-b32b-4279-a0c6-25dcaeb857af" containerName="nova-api-log" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.651492 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="8013239c-dfd4-407a-aa67-d57b2ce2aac5" containerName="nova-manage" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.651506 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d91cb14-b0c2-42b5-9803-711c6d9af174" containerName="nova-metadata-log" Dec 09 11:34:52 
crc kubenswrapper[5002]: I1209 11:34:52.651516 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="84da9958-b32b-4279-a0c6-25dcaeb857af" containerName="nova-api-api" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.651531 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d91cb14-b0c2-42b5-9803-711c6d9af174" containerName="nova-metadata-metadata" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.651541 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="84da9958-b32b-4279-a0c6-25dcaeb857af" containerName="nova-api-log" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.652533 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.655794 5002 scope.go:117] "RemoveContainer" containerID="3d050d6134b33b6d6c3c5a3f715f2ea41386bb69cdceff35cf125975e187a9c6" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.656159 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.662060 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.664010 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.670334 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.674922 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.688475 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.803136 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e7a608e-8ea7-4852-b11a-148d988edf80-config-data\") pod \"nova-metadata-0\" (UID: \"3e7a608e-8ea7-4852-b11a-148d988edf80\") " pod="openstack/nova-metadata-0" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.803220 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e7a608e-8ea7-4852-b11a-148d988edf80-logs\") pod \"nova-metadata-0\" (UID: \"3e7a608e-8ea7-4852-b11a-148d988edf80\") " pod="openstack/nova-metadata-0" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.803243 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e7a608e-8ea7-4852-b11a-148d988edf80-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3e7a608e-8ea7-4852-b11a-148d988edf80\") " pod="openstack/nova-metadata-0" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.803289 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnjqf\" (UniqueName: \"kubernetes.io/projected/3e7a608e-8ea7-4852-b11a-148d988edf80-kube-api-access-dnjqf\") pod \"nova-metadata-0\" (UID: \"3e7a608e-8ea7-4852-b11a-148d988edf80\") " pod="openstack/nova-metadata-0" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.803316 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/508684a5-04cd-49b4-b3e0-716ccad025d9-config-data\") pod \"nova-api-0\" (UID: \"508684a5-04cd-49b4-b3e0-716ccad025d9\") " pod="openstack/nova-api-0" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.803342 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfjxz\" (UniqueName: \"kubernetes.io/projected/508684a5-04cd-49b4-b3e0-716ccad025d9-kube-api-access-sfjxz\") pod \"nova-api-0\" (UID: \"508684a5-04cd-49b4-b3e0-716ccad025d9\") " pod="openstack/nova-api-0" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.804280 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/508684a5-04cd-49b4-b3e0-716ccad025d9-logs\") pod \"nova-api-0\" (UID: \"508684a5-04cd-49b4-b3e0-716ccad025d9\") " pod="openstack/nova-api-0" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.804354 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/508684a5-04cd-49b4-b3e0-716ccad025d9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"508684a5-04cd-49b4-b3e0-716ccad025d9\") " pod="openstack/nova-api-0" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.906387 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/508684a5-04cd-49b4-b3e0-716ccad025d9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"508684a5-04cd-49b4-b3e0-716ccad025d9\") " pod="openstack/nova-api-0" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.906666 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e7a608e-8ea7-4852-b11a-148d988edf80-config-data\") pod \"nova-metadata-0\" (UID: \"3e7a608e-8ea7-4852-b11a-148d988edf80\") " pod="openstack/nova-metadata-0" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.906710 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e7a608e-8ea7-4852-b11a-148d988edf80-logs\") pod \"nova-metadata-0\" (UID: \"3e7a608e-8ea7-4852-b11a-148d988edf80\") " pod="openstack/nova-metadata-0" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.906725 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e7a608e-8ea7-4852-b11a-148d988edf80-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3e7a608e-8ea7-4852-b11a-148d988edf80\") " pod="openstack/nova-metadata-0" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.906764 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnjqf\" (UniqueName: \"kubernetes.io/projected/3e7a608e-8ea7-4852-b11a-148d988edf80-kube-api-access-dnjqf\") pod \"nova-metadata-0\" (UID: \"3e7a608e-8ea7-4852-b11a-148d988edf80\") " pod="openstack/nova-metadata-0" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.906784 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/508684a5-04cd-49b4-b3e0-716ccad025d9-config-data\") pod \"nova-api-0\" (UID: \"508684a5-04cd-49b4-b3e0-716ccad025d9\") " pod="openstack/nova-api-0" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.906830 5002 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfjxz\" (UniqueName: \"kubernetes.io/projected/508684a5-04cd-49b4-b3e0-716ccad025d9-kube-api-access-sfjxz\") pod \"nova-api-0\" (UID: \"508684a5-04cd-49b4-b3e0-716ccad025d9\") " pod="openstack/nova-api-0" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.906890 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/508684a5-04cd-49b4-b3e0-716ccad025d9-logs\") pod \"nova-api-0\" (UID: \"508684a5-04cd-49b4-b3e0-716ccad025d9\") " pod="openstack/nova-api-0" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.907139 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e7a608e-8ea7-4852-b11a-148d988edf80-logs\") pod \"nova-metadata-0\" (UID: \"3e7a608e-8ea7-4852-b11a-148d988edf80\") " pod="openstack/nova-metadata-0" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.907305 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/508684a5-04cd-49b4-b3e0-716ccad025d9-logs\") pod \"nova-api-0\" (UID: \"508684a5-04cd-49b4-b3e0-716ccad025d9\") " pod="openstack/nova-api-0" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.910574 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e7a608e-8ea7-4852-b11a-148d988edf80-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3e7a608e-8ea7-4852-b11a-148d988edf80\") " pod="openstack/nova-metadata-0" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.910582 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/508684a5-04cd-49b4-b3e0-716ccad025d9-config-data\") pod \"nova-api-0\" (UID: \"508684a5-04cd-49b4-b3e0-716ccad025d9\") " pod="openstack/nova-api-0" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.911136 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/508684a5-04cd-49b4-b3e0-716ccad025d9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"508684a5-04cd-49b4-b3e0-716ccad025d9\") " pod="openstack/nova-api-0" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.911472 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e7a608e-8ea7-4852-b11a-148d988edf80-config-data\") pod \"nova-metadata-0\" (UID: \"3e7a608e-8ea7-4852-b11a-148d988edf80\") " pod="openstack/nova-metadata-0" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.923066 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfjxz\" (UniqueName: \"kubernetes.io/projected/508684a5-04cd-49b4-b3e0-716ccad025d9-kube-api-access-sfjxz\") pod \"nova-api-0\" (UID: \"508684a5-04cd-49b4-b3e0-716ccad025d9\") " pod="openstack/nova-api-0" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.923454 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnjqf\" (UniqueName: \"kubernetes.io/projected/3e7a608e-8ea7-4852-b11a-148d988edf80-kube-api-access-dnjqf\") pod \"nova-metadata-0\" (UID: \"3e7a608e-8ea7-4852-b11a-148d988edf80\") " pod="openstack/nova-metadata-0" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.985983 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 09 11:34:52 crc kubenswrapper[5002]: I1209 11:34:52.998658 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 11:34:53 crc kubenswrapper[5002]: I1209 11:34:53.043912 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 11:34:53 crc kubenswrapper[5002]: I1209 11:34:53.113360 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/752b8b3b-bb27-4905-9f0a-32af13e00b64-config-data\") pod \"752b8b3b-bb27-4905-9f0a-32af13e00b64\" (UID: \"752b8b3b-bb27-4905-9f0a-32af13e00b64\") " Dec 09 11:34:53 crc kubenswrapper[5002]: I1209 11:34:53.113439 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdst7\" (UniqueName: \"kubernetes.io/projected/752b8b3b-bb27-4905-9f0a-32af13e00b64-kube-api-access-sdst7\") pod \"752b8b3b-bb27-4905-9f0a-32af13e00b64\" (UID: \"752b8b3b-bb27-4905-9f0a-32af13e00b64\") " Dec 09 11:34:53 crc kubenswrapper[5002]: I1209 11:34:53.113503 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/752b8b3b-bb27-4905-9f0a-32af13e00b64-combined-ca-bundle\") pod \"752b8b3b-bb27-4905-9f0a-32af13e00b64\" (UID: \"752b8b3b-bb27-4905-9f0a-32af13e00b64\") " Dec 09 11:34:53 crc kubenswrapper[5002]: I1209 11:34:53.119346 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/752b8b3b-bb27-4905-9f0a-32af13e00b64-kube-api-access-sdst7" (OuterVolumeSpecName: "kube-api-access-sdst7") pod "752b8b3b-bb27-4905-9f0a-32af13e00b64" (UID: "752b8b3b-bb27-4905-9f0a-32af13e00b64"). InnerVolumeSpecName "kube-api-access-sdst7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:34:53 crc kubenswrapper[5002]: I1209 11:34:53.144316 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/752b8b3b-bb27-4905-9f0a-32af13e00b64-config-data" (OuterVolumeSpecName: "config-data") pod "752b8b3b-bb27-4905-9f0a-32af13e00b64" (UID: "752b8b3b-bb27-4905-9f0a-32af13e00b64"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:34:53 crc kubenswrapper[5002]: I1209 11:34:53.157145 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/752b8b3b-bb27-4905-9f0a-32af13e00b64-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "752b8b3b-bb27-4905-9f0a-32af13e00b64" (UID: "752b8b3b-bb27-4905-9f0a-32af13e00b64"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:34:55 crc kubenswrapper[5002]: I1209 11:34:53.219206 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/752b8b3b-bb27-4905-9f0a-32af13e00b64-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:34:55 crc kubenswrapper[5002]: I1209 11:34:53.219446 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdst7\" (UniqueName: \"kubernetes.io/projected/752b8b3b-bb27-4905-9f0a-32af13e00b64-kube-api-access-sdst7\") on node \"crc\" DevicePath \"\"" Dec 09 11:34:55 crc kubenswrapper[5002]: I1209 11:34:53.219460 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/752b8b3b-bb27-4905-9f0a-32af13e00b64-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:34:55 crc kubenswrapper[5002]: I1209 11:34:53.483850 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 09 11:34:55 crc kubenswrapper[5002]: W1209 11:34:53.490418 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod508684a5_04cd_49b4_b3e0_716ccad025d9.slice/crio-8298ec65f139658a35e6545c0252564013353584123a403cf5c165c722d0ce74 WatchSource:0}: Error finding container 8298ec65f139658a35e6545c0252564013353584123a403cf5c165c722d0ce74: Status 404 returned error can't find the container with id 8298ec65f139658a35e6545c0252564013353584123a403cf5c165c722d0ce74 Dec 09 11:34:55 crc kubenswrapper[5002]: I1209 11:34:53.523924 5002 generic.go:334] "Generic (PLEG): container finished" podID="752b8b3b-bb27-4905-9f0a-32af13e00b64" containerID="6fb67cbc5aa10d627dfc05a900b094b7b232440b615e5910bbcd69eaf6c47322" exitCode=0 Dec 09 11:34:55 crc kubenswrapper[5002]: I1209 11:34:53.523966 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 11:34:55 crc kubenswrapper[5002]: I1209 11:34:53.524047 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"752b8b3b-bb27-4905-9f0a-32af13e00b64","Type":"ContainerDied","Data":"6fb67cbc5aa10d627dfc05a900b094b7b232440b615e5910bbcd69eaf6c47322"} Dec 09 11:34:55 crc kubenswrapper[5002]: I1209 11:34:53.524113 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"752b8b3b-bb27-4905-9f0a-32af13e00b64","Type":"ContainerDied","Data":"ec845a9852aaaf8a1b18e422ab1476dd4ec37264d3cdb5d13e6f269eab52bf86"} Dec 09 11:34:55 crc kubenswrapper[5002]: I1209 11:34:53.524145 5002 scope.go:117] "RemoveContainer" containerID="6fb67cbc5aa10d627dfc05a900b094b7b232440b615e5910bbcd69eaf6c47322" Dec 09 11:34:55 crc kubenswrapper[5002]: I1209 11:34:53.532036 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"508684a5-04cd-49b4-b3e0-716ccad025d9","Type":"ContainerStarted","Data":"8298ec65f139658a35e6545c0252564013353584123a403cf5c165c722d0ce74"} Dec 09 11:34:55 crc kubenswrapper[5002]: I1209 11:34:53.606053 5002 scope.go:117] "RemoveContainer" containerID="6fb67cbc5aa10d627dfc05a900b094b7b232440b615e5910bbcd69eaf6c47322" Dec 09 11:34:55 crc kubenswrapper[5002]: E1209 11:34:53.607196 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fb67cbc5aa10d627dfc05a900b094b7b232440b615e5910bbcd69eaf6c47322\": container with ID starting with 6fb67cbc5aa10d627dfc05a900b094b7b232440b615e5910bbcd69eaf6c47322 not found: ID does not exist" containerID="6fb67cbc5aa10d627dfc05a900b094b7b232440b615e5910bbcd69eaf6c47322" Dec 09 11:34:55 crc kubenswrapper[5002]: I1209 11:34:53.607226 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fb67cbc5aa10d627dfc05a900b094b7b232440b615e5910bbcd69eaf6c47322"} err="failed to get container status \"6fb67cbc5aa10d627dfc05a900b094b7b232440b615e5910bbcd69eaf6c47322\": rpc error: code = NotFound desc = could not find container \"6fb67cbc5aa10d627dfc05a900b094b7b232440b615e5910bbcd69eaf6c47322\": container with ID starting with 6fb67cbc5aa10d627dfc05a900b094b7b232440b615e5910bbcd69eaf6c47322 not found: ID does not exist" Dec 09 11:34:55 crc kubenswrapper[5002]: I1209 11:34:53.627699 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 11:34:55 crc kubenswrapper[5002]: I1209 11:34:53.642116 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 11:34:55 crc kubenswrapper[5002]: I1209 11:34:53.665297 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 11:34:55 crc kubenswrapper[5002]: E1209 11:34:53.665805 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="752b8b3b-bb27-4905-9f0a-32af13e00b64" containerName="nova-scheduler-scheduler" Dec 09 11:34:55 crc kubenswrapper[5002]: I1209 11:34:53.665837 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="752b8b3b-bb27-4905-9f0a-32af13e00b64" containerName="nova-scheduler-scheduler" Dec 09 11:34:55 crc kubenswrapper[5002]: I1209 11:34:53.666094 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="752b8b3b-bb27-4905-9f0a-32af13e00b64" containerName="nova-scheduler-scheduler" Dec 09 11:34:55 crc kubenswrapper[5002]: I1209 11:34:53.666973 5002 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 11:34:55 crc kubenswrapper[5002]: I1209 11:34:53.669087 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 09 11:34:55 crc kubenswrapper[5002]: I1209 11:34:53.677714 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 11:34:55 crc kubenswrapper[5002]: I1209 11:34:53.736634 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2-config-data\") pod \"nova-scheduler-0\" (UID: \"4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2\") " pod="openstack/nova-scheduler-0" Dec 09 11:34:55 crc kubenswrapper[5002]: I1209 11:34:53.737233 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bstp\" (UniqueName: \"kubernetes.io/projected/4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2-kube-api-access-7bstp\") pod \"nova-scheduler-0\" (UID: \"4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2\") " pod="openstack/nova-scheduler-0" Dec 09 11:34:55 crc kubenswrapper[5002]: I1209 11:34:53.737316 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2\") " pod="openstack/nova-scheduler-0" Dec 09 11:34:55 crc kubenswrapper[5002]: I1209 11:34:53.839225 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2\") " pod="openstack/nova-scheduler-0" Dec 09 11:34:55 crc kubenswrapper[5002]: I1209 11:34:53.839337 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2-config-data\") pod \"nova-scheduler-0\" (UID: \"4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2\") " pod="openstack/nova-scheduler-0" Dec 09 11:34:55 crc kubenswrapper[5002]: I1209 11:34:53.839370 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bstp\" (UniqueName: \"kubernetes.io/projected/4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2-kube-api-access-7bstp\") pod \"nova-scheduler-0\" (UID: \"4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2\") " pod="openstack/nova-scheduler-0" Dec 09 11:34:55 crc kubenswrapper[5002]: I1209 11:34:53.866185 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2\") " pod="openstack/nova-scheduler-0" Dec 09 11:34:55 crc kubenswrapper[5002]: I1209 11:34:53.883502 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2-config-data\") pod \"nova-scheduler-0\" (UID: \"4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2\") " pod="openstack/nova-scheduler-0" Dec 09 11:34:55 crc kubenswrapper[5002]: I1209 11:34:53.891276 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bstp\" (UniqueName: 
\"kubernetes.io/projected/4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2-kube-api-access-7bstp\") pod \"nova-scheduler-0\" (UID: \"4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2\") " pod="openstack/nova-scheduler-0" Dec 09 11:34:55 crc kubenswrapper[5002]: I1209 11:34:53.990242 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 11:34:55 crc kubenswrapper[5002]: I1209 11:34:54.073739 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d91cb14-b0c2-42b5-9803-711c6d9af174" path="/var/lib/kubelet/pods/1d91cb14-b0c2-42b5-9803-711c6d9af174/volumes" Dec 09 11:34:55 crc kubenswrapper[5002]: I1209 11:34:54.074462 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="752b8b3b-bb27-4905-9f0a-32af13e00b64" path="/var/lib/kubelet/pods/752b8b3b-bb27-4905-9f0a-32af13e00b64/volumes" Dec 09 11:34:55 crc kubenswrapper[5002]: I1209 11:34:54.075419 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84da9958-b32b-4279-a0c6-25dcaeb857af" path="/var/lib/kubelet/pods/84da9958-b32b-4279-a0c6-25dcaeb857af/volumes" Dec 09 11:34:55 crc kubenswrapper[5002]: I1209 11:34:54.544281 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"508684a5-04cd-49b4-b3e0-716ccad025d9","Type":"ContainerStarted","Data":"8c2e29170c137ebf193d874d180b8f59acbdfc002c01df8ce8d34c22fd802cc1"} Dec 09 11:34:55 crc kubenswrapper[5002]: I1209 11:34:54.544374 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"508684a5-04cd-49b4-b3e0-716ccad025d9","Type":"ContainerStarted","Data":"eba09b67b8bd4964616fabee2ba3552e2e3d53427bf901903b70f474a64f25f2"} Dec 09 11:34:55 crc kubenswrapper[5002]: I1209 11:34:55.577529 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.577513292 podStartE2EDuration="3.577513292s" podCreationTimestamp="2025-12-09 11:34:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:55.574422829 +0000 UTC m=+5627.966473920" watchObservedRunningTime="2025-12-09 11:34:55.577513292 +0000 UTC m=+5627.969564373" Dec 09 11:34:55 crc kubenswrapper[5002]: I1209 11:34:55.654857 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 11:34:55 crc kubenswrapper[5002]: I1209 11:34:55.664332 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 11:34:56 crc kubenswrapper[5002]: I1209 11:34:56.566664 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2","Type":"ContainerStarted","Data":"8a91c1f212ac18d512531d99263be14280296db593fe5c90de534dfaaeef66d7"} Dec 09 11:34:56 crc kubenswrapper[5002]: I1209 11:34:56.567067 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2","Type":"ContainerStarted","Data":"b5d0b83d3e00a4eb8e616b99102b72ba63acda49b683ed60d25baafa8ee1f599"} Dec 09 11:34:56 crc kubenswrapper[5002]: I1209 11:34:56.568232 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3e7a608e-8ea7-4852-b11a-148d988edf80","Type":"ContainerStarted","Data":"6f28cf966e43a907662091d0d63d04f817aae004b749004b4811848bd325ae45"} Dec 09 11:34:56 crc kubenswrapper[5002]: I1209 11:34:56.568282 5002 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3e7a608e-8ea7-4852-b11a-148d988edf80","Type":"ContainerStarted","Data":"5c434b80a34154b996afc300fcd845f4d30f3d76b2b35fa8aff5d6d2869a2792"} Dec 09 11:34:56 crc kubenswrapper[5002]: I1209 11:34:56.568296 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3e7a608e-8ea7-4852-b11a-148d988edf80","Type":"ContainerStarted","Data":"f9d110f0ea12470ecef4d1eca10da4768a84775e9fbfe5987d3727761446f27a"} Dec 09 11:34:56 crc kubenswrapper[5002]: I1209 11:34:56.586583 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.586567133 podStartE2EDuration="3.586567133s" podCreationTimestamp="2025-12-09 11:34:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:56.581694113 +0000 UTC m=+5628.973745214" watchObservedRunningTime="2025-12-09 11:34:56.586567133 +0000 UTC m=+5628.978618214" Dec 09 11:34:56 crc kubenswrapper[5002]: I1209 11:34:56.606067 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=4.606046604 podStartE2EDuration="4.606046604s" podCreationTimestamp="2025-12-09 11:34:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:56.5991505 +0000 UTC m=+5628.991201591" watchObservedRunningTime="2025-12-09 11:34:56.606046604 +0000 UTC m=+5628.998097685" Dec 09 11:34:58 crc kubenswrapper[5002]: I1209 11:34:57.999800 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 09 11:34:58 crc kubenswrapper[5002]: I1209 11:34:58.000175 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 09 11:34:58 crc kubenswrapper[5002]: I1209 11:34:58.990879 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 09 11:35:02 crc kubenswrapper[5002]: I1209 11:35:02.986436 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 09 11:35:02 crc kubenswrapper[5002]: I1209 11:35:02.986852 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 09 11:35:02 crc kubenswrapper[5002]: I1209 11:35:02.999637 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 09 11:35:02 crc kubenswrapper[5002]: I1209 11:35:02.999675 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 09 11:35:03 crc kubenswrapper[5002]: I1209 11:35:03.990693 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 09 11:35:04 crc kubenswrapper[5002]: I1209 11:35:04.022017 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 09 11:35:04 crc kubenswrapper[5002]: I1209 11:35:04.155055 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3e7a608e-8ea7-4852-b11a-148d988edf80" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.67:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 09 11:35:04 crc 
kubenswrapper[5002]: I1209 11:35:04.155105 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="508684a5-04cd-49b4-b3e0-716ccad025d9" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.66:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 09 11:35:04 crc kubenswrapper[5002]: I1209 11:35:04.155053 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="508684a5-04cd-49b4-b3e0-716ccad025d9" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.66:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 09 11:35:04 crc kubenswrapper[5002]: I1209 11:35:04.155195 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3e7a608e-8ea7-4852-b11a-148d988edf80" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.67:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 09 11:35:04 crc kubenswrapper[5002]: I1209 11:35:04.684993 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 09 11:35:12 crc kubenswrapper[5002]: I1209 11:35:12.991569 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 09 11:35:12 crc kubenswrapper[5002]: I1209 11:35:12.992298 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 09 11:35:12 crc kubenswrapper[5002]: I1209 11:35:12.992703 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 09 11:35:12 crc kubenswrapper[5002]: I1209 11:35:12.992735 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 09 11:35:12 crc kubenswrapper[5002]: I1209 11:35:12.998851 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 09 11:35:13 crc kubenswrapper[5002]: I1209 11:35:13.002954 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 09 11:35:13 crc kubenswrapper[5002]: I1209 11:35:13.004452 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 09 11:35:13 crc kubenswrapper[5002]: I1209 11:35:13.004503 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 09 11:35:13 crc kubenswrapper[5002]: I1209 11:35:13.006044 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 09 11:35:13 crc kubenswrapper[5002]: I1209 11:35:13.213672 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-64b8d7d4fc-qfxwb"] Dec 09 11:35:13 crc kubenswrapper[5002]: I1209 11:35:13.215089 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64b8d7d4fc-qfxwb" Dec 09 11:35:13 crc kubenswrapper[5002]: I1209 11:35:13.233610 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64b8d7d4fc-qfxwb"] Dec 09 11:35:13 crc kubenswrapper[5002]: I1209 11:35:13.298456 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/089e55db-3c5c-4320-92e2-2aeca1510fed-ovsdbserver-sb\") pod \"dnsmasq-dns-64b8d7d4fc-qfxwb\" (UID: \"089e55db-3c5c-4320-92e2-2aeca1510fed\") " pod="openstack/dnsmasq-dns-64b8d7d4fc-qfxwb" Dec 09 11:35:13 crc kubenswrapper[5002]: I1209 11:35:13.298695 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/089e55db-3c5c-4320-92e2-2aeca1510fed-config\") pod \"dnsmasq-dns-64b8d7d4fc-qfxwb\" (UID: \"089e55db-3c5c-4320-92e2-2aeca1510fed\") " pod="openstack/dnsmasq-dns-64b8d7d4fc-qfxwb" Dec 09 11:35:13 crc kubenswrapper[5002]: I1209 11:35:13.298784 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/089e55db-3c5c-4320-92e2-2aeca1510fed-ovsdbserver-nb\") pod \"dnsmasq-dns-64b8d7d4fc-qfxwb\" (UID: \"089e55db-3c5c-4320-92e2-2aeca1510fed\") " pod="openstack/dnsmasq-dns-64b8d7d4fc-qfxwb" Dec 09 11:35:13 crc kubenswrapper[5002]: I1209 11:35:13.298839 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbzh5\" (UniqueName: \"kubernetes.io/projected/089e55db-3c5c-4320-92e2-2aeca1510fed-kube-api-access-bbzh5\") pod \"dnsmasq-dns-64b8d7d4fc-qfxwb\" (UID: \"089e55db-3c5c-4320-92e2-2aeca1510fed\") " pod="openstack/dnsmasq-dns-64b8d7d4fc-qfxwb" Dec 09 11:35:13 crc kubenswrapper[5002]: I1209 11:35:13.298946 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/089e55db-3c5c-4320-92e2-2aeca1510fed-dns-svc\") pod \"dnsmasq-dns-64b8d7d4fc-qfxwb\" (UID: \"089e55db-3c5c-4320-92e2-2aeca1510fed\") " pod="openstack/dnsmasq-dns-64b8d7d4fc-qfxwb" Dec 09 11:35:13 crc kubenswrapper[5002]: I1209 11:35:13.400624 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/089e55db-3c5c-4320-92e2-2aeca1510fed-config\") pod \"dnsmasq-dns-64b8d7d4fc-qfxwb\" (UID: \"089e55db-3c5c-4320-92e2-2aeca1510fed\") " pod="openstack/dnsmasq-dns-64b8d7d4fc-qfxwb" Dec 09 11:35:13 crc kubenswrapper[5002]: I1209 11:35:13.400912 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/089e55db-3c5c-4320-92e2-2aeca1510fed-ovsdbserver-nb\") pod \"dnsmasq-dns-64b8d7d4fc-qfxwb\" (UID: \"089e55db-3c5c-4320-92e2-2aeca1510fed\") " pod="openstack/dnsmasq-dns-64b8d7d4fc-qfxwb" Dec 09 11:35:13 crc kubenswrapper[5002]: I1209 11:35:13.401030 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbzh5\" (UniqueName: \"kubernetes.io/projected/089e55db-3c5c-4320-92e2-2aeca1510fed-kube-api-access-bbzh5\") pod \"dnsmasq-dns-64b8d7d4fc-qfxwb\" (UID: \"089e55db-3c5c-4320-92e2-2aeca1510fed\") " pod="openstack/dnsmasq-dns-64b8d7d4fc-qfxwb" Dec 09 11:35:13 crc kubenswrapper[5002]: I1209 11:35:13.401104 5002 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/089e55db-3c5c-4320-92e2-2aeca1510fed-dns-svc\") pod \"dnsmasq-dns-64b8d7d4fc-qfxwb\" (UID: \"089e55db-3c5c-4320-92e2-2aeca1510fed\") " pod="openstack/dnsmasq-dns-64b8d7d4fc-qfxwb" Dec 09 11:35:13 crc kubenswrapper[5002]: I1209 11:35:13.401203 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/089e55db-3c5c-4320-92e2-2aeca1510fed-ovsdbserver-sb\") pod \"dnsmasq-dns-64b8d7d4fc-qfxwb\" (UID: \"089e55db-3c5c-4320-92e2-2aeca1510fed\") " pod="openstack/dnsmasq-dns-64b8d7d4fc-qfxwb" Dec 09 11:35:13 crc kubenswrapper[5002]: I1209 11:35:13.401757 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/089e55db-3c5c-4320-92e2-2aeca1510fed-config\") pod \"dnsmasq-dns-64b8d7d4fc-qfxwb\" (UID: \"089e55db-3c5c-4320-92e2-2aeca1510fed\") " pod="openstack/dnsmasq-dns-64b8d7d4fc-qfxwb" Dec 09 11:35:13 crc kubenswrapper[5002]: I1209 11:35:13.402125 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/089e55db-3c5c-4320-92e2-2aeca1510fed-ovsdbserver-sb\") pod \"dnsmasq-dns-64b8d7d4fc-qfxwb\" (UID: \"089e55db-3c5c-4320-92e2-2aeca1510fed\") " pod="openstack/dnsmasq-dns-64b8d7d4fc-qfxwb" Dec 09 11:35:13 crc kubenswrapper[5002]: I1209 11:35:13.403972 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/089e55db-3c5c-4320-92e2-2aeca1510fed-ovsdbserver-nb\") pod \"dnsmasq-dns-64b8d7d4fc-qfxwb\" (UID: \"089e55db-3c5c-4320-92e2-2aeca1510fed\") " pod="openstack/dnsmasq-dns-64b8d7d4fc-qfxwb" Dec 09 11:35:13 crc kubenswrapper[5002]: I1209 11:35:13.403987 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/089e55db-3c5c-4320-92e2-2aeca1510fed-dns-svc\") pod \"dnsmasq-dns-64b8d7d4fc-qfxwb\" (UID: \"089e55db-3c5c-4320-92e2-2aeca1510fed\") " pod="openstack/dnsmasq-dns-64b8d7d4fc-qfxwb" Dec 09 11:35:13 crc kubenswrapper[5002]: I1209 11:35:13.427989 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbzh5\" (UniqueName: \"kubernetes.io/projected/089e55db-3c5c-4320-92e2-2aeca1510fed-kube-api-access-bbzh5\") pod \"dnsmasq-dns-64b8d7d4fc-qfxwb\" (UID: \"089e55db-3c5c-4320-92e2-2aeca1510fed\") " pod="openstack/dnsmasq-dns-64b8d7d4fc-qfxwb" Dec 09 11:35:13 crc kubenswrapper[5002]: I1209 11:35:13.536444 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64b8d7d4fc-qfxwb" Dec 09 11:35:13 crc kubenswrapper[5002]: I1209 11:35:13.750912 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 09 11:35:13 crc kubenswrapper[5002]: I1209 11:35:13.995959 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64b8d7d4fc-qfxwb"] Dec 09 11:35:14 crc kubenswrapper[5002]: I1209 11:35:14.758719 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64b8d7d4fc-qfxwb" event={"ID":"089e55db-3c5c-4320-92e2-2aeca1510fed","Type":"ContainerStarted","Data":"4a4254140a826c2e4019575c95758182f4d9fd69af3e11e91728f88a6dfdf6b2"} Dec 09 11:35:15 crc kubenswrapper[5002]: I1209 11:35:15.770102 5002 generic.go:334] "Generic (PLEG): container finished" podID="089e55db-3c5c-4320-92e2-2aeca1510fed" containerID="e7381888e89b8cd833bebcd7157a2127bcf3e548f640b9542569dbb6f8b20442" exitCode=0 Dec 09 11:35:15 crc kubenswrapper[5002]: I1209 11:35:15.770223 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64b8d7d4fc-qfxwb" event={"ID":"089e55db-3c5c-4320-92e2-2aeca1510fed","Type":"ContainerDied","Data":"e7381888e89b8cd833bebcd7157a2127bcf3e548f640b9542569dbb6f8b20442"} Dec 09 11:35:16 crc kubenswrapper[5002]: I1209 11:35:16.782125 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64b8d7d4fc-qfxwb" event={"ID":"089e55db-3c5c-4320-92e2-2aeca1510fed","Type":"ContainerStarted","Data":"9ffe50c95f5d1659c9beabc0a62a3cc93d0610e0541c1c44d943da5199f06214"} Dec 09 11:35:16 crc kubenswrapper[5002]: I1209 11:35:16.783940 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-64b8d7d4fc-qfxwb" Dec 09 11:35:16 crc kubenswrapper[5002]: I1209 11:35:16.811673 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-64b8d7d4fc-qfxwb" podStartSLOduration=3.811649264 podStartE2EDuration="3.811649264s" podCreationTimestamp="2025-12-09 11:35:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:35:16.801502312 +0000 UTC m=+5649.193553393" watchObservedRunningTime="2025-12-09 11:35:16.811649264 +0000 UTC m=+5649.203700345" Dec 09 11:35:23 crc kubenswrapper[5002]: I1209 11:35:23.538002 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-64b8d7d4fc-qfxwb" Dec 09 11:35:23 crc kubenswrapper[5002]: I1209 11:35:23.614710 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-696f9966c7-n4m46"] Dec 09 11:35:23 crc kubenswrapper[5002]: I1209 11:35:23.615114 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-696f9966c7-n4m46" podUID="902ce8a4-b7c1-4b23-a506-f66fb2c84cb0" containerName="dnsmasq-dns" containerID="cri-o://dbf109bd991efc11573cd767cb66e79d5b108a2b043b7bf68501792d400c35e4" gracePeriod=10 Dec 09 11:35:23 crc kubenswrapper[5002]: I1209 11:35:23.868095 5002 generic.go:334] "Generic (PLEG): container finished" podID="902ce8a4-b7c1-4b23-a506-f66fb2c84cb0" containerID="dbf109bd991efc11573cd767cb66e79d5b108a2b043b7bf68501792d400c35e4" exitCode=0 Dec 09 11:35:23 crc kubenswrapper[5002]: I1209 11:35:23.868145 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-696f9966c7-n4m46" 
event={"ID":"902ce8a4-b7c1-4b23-a506-f66fb2c84cb0","Type":"ContainerDied","Data":"dbf109bd991efc11573cd767cb66e79d5b108a2b043b7bf68501792d400c35e4"} Dec 09 11:35:24 crc kubenswrapper[5002]: I1209 11:35:24.122706 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-696f9966c7-n4m46" Dec 09 11:35:24 crc kubenswrapper[5002]: I1209 11:35:24.200634 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/902ce8a4-b7c1-4b23-a506-f66fb2c84cb0-ovsdbserver-nb\") pod \"902ce8a4-b7c1-4b23-a506-f66fb2c84cb0\" (UID: \"902ce8a4-b7c1-4b23-a506-f66fb2c84cb0\") " Dec 09 11:35:24 crc kubenswrapper[5002]: I1209 11:35:24.200746 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/902ce8a4-b7c1-4b23-a506-f66fb2c84cb0-dns-svc\") pod \"902ce8a4-b7c1-4b23-a506-f66fb2c84cb0\" (UID: \"902ce8a4-b7c1-4b23-a506-f66fb2c84cb0\") " Dec 09 11:35:24 crc kubenswrapper[5002]: I1209 11:35:24.200804 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrqj5\" (UniqueName: \"kubernetes.io/projected/902ce8a4-b7c1-4b23-a506-f66fb2c84cb0-kube-api-access-lrqj5\") pod \"902ce8a4-b7c1-4b23-a506-f66fb2c84cb0\" (UID: \"902ce8a4-b7c1-4b23-a506-f66fb2c84cb0\") " Dec 09 11:35:24 crc kubenswrapper[5002]: I1209 11:35:24.200902 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/902ce8a4-b7c1-4b23-a506-f66fb2c84cb0-ovsdbserver-sb\") pod \"902ce8a4-b7c1-4b23-a506-f66fb2c84cb0\" (UID: \"902ce8a4-b7c1-4b23-a506-f66fb2c84cb0\") " Dec 09 11:35:24 crc kubenswrapper[5002]: I1209 11:35:24.201002 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/902ce8a4-b7c1-4b23-a506-f66fb2c84cb0-config\") pod \"902ce8a4-b7c1-4b23-a506-f66fb2c84cb0\" (UID: \"902ce8a4-b7c1-4b23-a506-f66fb2c84cb0\") " Dec 09 11:35:24 crc kubenswrapper[5002]: I1209 11:35:24.206175 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/902ce8a4-b7c1-4b23-a506-f66fb2c84cb0-kube-api-access-lrqj5" (OuterVolumeSpecName: "kube-api-access-lrqj5") pod "902ce8a4-b7c1-4b23-a506-f66fb2c84cb0" (UID: "902ce8a4-b7c1-4b23-a506-f66fb2c84cb0"). InnerVolumeSpecName "kube-api-access-lrqj5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:35:24 crc kubenswrapper[5002]: I1209 11:35:24.250420 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/902ce8a4-b7c1-4b23-a506-f66fb2c84cb0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "902ce8a4-b7c1-4b23-a506-f66fb2c84cb0" (UID: "902ce8a4-b7c1-4b23-a506-f66fb2c84cb0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:35:24 crc kubenswrapper[5002]: I1209 11:35:24.256392 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/902ce8a4-b7c1-4b23-a506-f66fb2c84cb0-config" (OuterVolumeSpecName: "config") pod "902ce8a4-b7c1-4b23-a506-f66fb2c84cb0" (UID: "902ce8a4-b7c1-4b23-a506-f66fb2c84cb0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:35:24 crc kubenswrapper[5002]: I1209 11:35:24.262734 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/902ce8a4-b7c1-4b23-a506-f66fb2c84cb0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "902ce8a4-b7c1-4b23-a506-f66fb2c84cb0" (UID: "902ce8a4-b7c1-4b23-a506-f66fb2c84cb0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:35:24 crc kubenswrapper[5002]: I1209 11:35:24.263732 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/902ce8a4-b7c1-4b23-a506-f66fb2c84cb0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "902ce8a4-b7c1-4b23-a506-f66fb2c84cb0" (UID: "902ce8a4-b7c1-4b23-a506-f66fb2c84cb0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:35:24 crc kubenswrapper[5002]: I1209 11:35:24.302623 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/902ce8a4-b7c1-4b23-a506-f66fb2c84cb0-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:35:24 crc kubenswrapper[5002]: I1209 11:35:24.302649 5002 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/902ce8a4-b7c1-4b23-a506-f66fb2c84cb0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 11:35:24 crc kubenswrapper[5002]: I1209 11:35:24.302661 5002 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/902ce8a4-b7c1-4b23-a506-f66fb2c84cb0-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 11:35:24 crc kubenswrapper[5002]: I1209 11:35:24.302672 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrqj5\" (UniqueName: \"kubernetes.io/projected/902ce8a4-b7c1-4b23-a506-f66fb2c84cb0-kube-api-access-lrqj5\") on node \"crc\" DevicePath \"\"" Dec 09 11:35:24 crc kubenswrapper[5002]: I1209 11:35:24.302685 5002 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/902ce8a4-b7c1-4b23-a506-f66fb2c84cb0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 09 11:35:24 crc kubenswrapper[5002]: I1209 11:35:24.878394 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-696f9966c7-n4m46" event={"ID":"902ce8a4-b7c1-4b23-a506-f66fb2c84cb0","Type":"ContainerDied","Data":"72b83b503b2ba78c9f7744aa21e836956d24071948013d03beb51eb281e5cd46"} Dec 09 11:35:24 crc kubenswrapper[5002]: I1209 11:35:24.878765 5002 scope.go:117] "RemoveContainer" containerID="dbf109bd991efc11573cd767cb66e79d5b108a2b043b7bf68501792d400c35e4" Dec 09 11:35:24 crc kubenswrapper[5002]: I1209 11:35:24.879009 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-696f9966c7-n4m46" Dec 09 11:35:24 crc kubenswrapper[5002]: I1209 11:35:24.915750 5002 scope.go:117] "RemoveContainer" containerID="1464fed4c2ffc7539d32bdc329de1df2b4ee9271af080b41cf86c5a18184e769" Dec 09 11:35:24 crc kubenswrapper[5002]: I1209 11:35:24.924583 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-696f9966c7-n4m46"] Dec 09 11:35:24 crc kubenswrapper[5002]: I1209 11:35:24.932931 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-696f9966c7-n4m46"] Dec 09 11:35:26 crc kubenswrapper[5002]: I1209 11:35:26.075163 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="902ce8a4-b7c1-4b23-a506-f66fb2c84cb0" path="/var/lib/kubelet/pods/902ce8a4-b7c1-4b23-a506-f66fb2c84cb0/volumes" Dec 09 11:35:26 crc kubenswrapper[5002]: I1209 11:35:26.961304 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-x6t76"] Dec 09 11:35:26 crc kubenswrapper[5002]: E1209 11:35:26.961722 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="902ce8a4-b7c1-4b23-a506-f66fb2c84cb0" containerName="dnsmasq-dns" Dec 09 11:35:26 crc kubenswrapper[5002]: I1209 11:35:26.961743 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="902ce8a4-b7c1-4b23-a506-f66fb2c84cb0" containerName="dnsmasq-dns" Dec 09 11:35:26 crc kubenswrapper[5002]: E1209 11:35:26.961770 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="902ce8a4-b7c1-4b23-a506-f66fb2c84cb0" containerName="init" Dec 09 11:35:26 crc kubenswrapper[5002]: I1209 11:35:26.961781 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="902ce8a4-b7c1-4b23-a506-f66fb2c84cb0" containerName="init" Dec 09 11:35:26 crc kubenswrapper[5002]: I1209 11:35:26.962018 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="902ce8a4-b7c1-4b23-a506-f66fb2c84cb0" containerName="dnsmasq-dns" Dec 09 11:35:26 crc kubenswrapper[5002]: I1209 11:35:26.962614 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-x6t76" Dec 09 11:35:26 crc kubenswrapper[5002]: I1209 11:35:26.979434 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-x6t76"] Dec 09 11:35:27 crc kubenswrapper[5002]: I1209 11:35:27.054617 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b9fdf9c-44bc-48c9-8651-5394aca76af1-operator-scripts\") pod \"cinder-db-create-x6t76\" (UID: \"9b9fdf9c-44bc-48c9-8651-5394aca76af1\") " pod="openstack/cinder-db-create-x6t76" Dec 09 11:35:27 crc kubenswrapper[5002]: I1209 11:35:27.054676 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fbr7\" (UniqueName: \"kubernetes.io/projected/9b9fdf9c-44bc-48c9-8651-5394aca76af1-kube-api-access-9fbr7\") pod \"cinder-db-create-x6t76\" (UID: \"9b9fdf9c-44bc-48c9-8651-5394aca76af1\") " pod="openstack/cinder-db-create-x6t76" Dec 09 11:35:27 crc kubenswrapper[5002]: I1209 11:35:27.077501 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-3a2d-account-create-update-j2627"] Dec 09 11:35:27 crc kubenswrapper[5002]: I1209 11:35:27.078613 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-3a2d-account-create-update-j2627" Dec 09 11:35:27 crc kubenswrapper[5002]: I1209 11:35:27.083200 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 09 11:35:27 crc kubenswrapper[5002]: I1209 11:35:27.094487 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-3a2d-account-create-update-j2627"] Dec 09 11:35:27 crc kubenswrapper[5002]: I1209 11:35:27.158326 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fbr7\" (UniqueName: \"kubernetes.io/projected/9b9fdf9c-44bc-48c9-8651-5394aca76af1-kube-api-access-9fbr7\") pod \"cinder-db-create-x6t76\" (UID: \"9b9fdf9c-44bc-48c9-8651-5394aca76af1\") " pod="openstack/cinder-db-create-x6t76" Dec 09 11:35:27 crc kubenswrapper[5002]: I1209 11:35:27.158600 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7ef5176-0dfd-4c7b-8b18-088cd7959673-operator-scripts\") pod \"cinder-3a2d-account-create-update-j2627\" (UID: \"e7ef5176-0dfd-4c7b-8b18-088cd7959673\") " pod="openstack/cinder-3a2d-account-create-update-j2627" Dec 09 11:35:27 crc kubenswrapper[5002]: I1209 11:35:27.158665 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrv6g\" (UniqueName: \"kubernetes.io/projected/e7ef5176-0dfd-4c7b-8b18-088cd7959673-kube-api-access-qrv6g\") pod \"cinder-3a2d-account-create-update-j2627\" (UID: \"e7ef5176-0dfd-4c7b-8b18-088cd7959673\") " pod="openstack/cinder-3a2d-account-create-update-j2627" Dec 09 11:35:27 crc kubenswrapper[5002]: I1209 11:35:27.158731 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b9fdf9c-44bc-48c9-8651-5394aca76af1-operator-scripts\") pod \"cinder-db-create-x6t76\" (UID: \"9b9fdf9c-44bc-48c9-8651-5394aca76af1\") " pod="openstack/cinder-db-create-x6t76" Dec 09 11:35:27 crc kubenswrapper[5002]: I1209 11:35:27.159352 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b9fdf9c-44bc-48c9-8651-5394aca76af1-operator-scripts\") pod \"cinder-db-create-x6t76\" (UID: \"9b9fdf9c-44bc-48c9-8651-5394aca76af1\") " pod="openstack/cinder-db-create-x6t76" Dec 09 11:35:27 crc kubenswrapper[5002]: I1209 11:35:27.190306 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fbr7\" (UniqueName: \"kubernetes.io/projected/9b9fdf9c-44bc-48c9-8651-5394aca76af1-kube-api-access-9fbr7\") pod \"cinder-db-create-x6t76\" (UID: \"9b9fdf9c-44bc-48c9-8651-5394aca76af1\") " pod="openstack/cinder-db-create-x6t76" Dec 09 11:35:27 crc kubenswrapper[5002]: I1209 11:35:27.260544 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrv6g\" (UniqueName: \"kubernetes.io/projected/e7ef5176-0dfd-4c7b-8b18-088cd7959673-kube-api-access-qrv6g\") pod \"cinder-3a2d-account-create-update-j2627\" (UID: \"e7ef5176-0dfd-4c7b-8b18-088cd7959673\") " pod="openstack/cinder-3a2d-account-create-update-j2627" Dec 09 11:35:27 crc kubenswrapper[5002]: I1209 11:35:27.260708 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7ef5176-0dfd-4c7b-8b18-088cd7959673-operator-scripts\") pod 
\"cinder-3a2d-account-create-update-j2627\" (UID: \"e7ef5176-0dfd-4c7b-8b18-088cd7959673\") " pod="openstack/cinder-3a2d-account-create-update-j2627" Dec 09 11:35:27 crc kubenswrapper[5002]: I1209 11:35:27.261519 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7ef5176-0dfd-4c7b-8b18-088cd7959673-operator-scripts\") pod \"cinder-3a2d-account-create-update-j2627\" (UID: \"e7ef5176-0dfd-4c7b-8b18-088cd7959673\") " pod="openstack/cinder-3a2d-account-create-update-j2627" Dec 09 11:35:27 crc kubenswrapper[5002]: I1209 11:35:27.278792 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrv6g\" (UniqueName: \"kubernetes.io/projected/e7ef5176-0dfd-4c7b-8b18-088cd7959673-kube-api-access-qrv6g\") pod \"cinder-3a2d-account-create-update-j2627\" (UID: \"e7ef5176-0dfd-4c7b-8b18-088cd7959673\") " pod="openstack/cinder-3a2d-account-create-update-j2627" Dec 09 11:35:27 crc kubenswrapper[5002]: I1209 11:35:27.299280 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-x6t76" Dec 09 11:35:27 crc kubenswrapper[5002]: I1209 11:35:27.406202 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-3a2d-account-create-update-j2627" Dec 09 11:35:27 crc kubenswrapper[5002]: I1209 11:35:27.776775 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-x6t76"] Dec 09 11:35:27 crc kubenswrapper[5002]: I1209 11:35:27.914509 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-x6t76" event={"ID":"9b9fdf9c-44bc-48c9-8651-5394aca76af1","Type":"ContainerStarted","Data":"e577b5e0063157cbd9a5fb3a3d3c2d2672e91363786879843bda1fef1cdf6321"} Dec 09 11:35:27 crc kubenswrapper[5002]: I1209 11:35:27.950790 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-3a2d-account-create-update-j2627"] Dec 09 11:35:28 crc kubenswrapper[5002]: I1209 11:35:28.946089 5002 generic.go:334] "Generic (PLEG): container finished" podID="e7ef5176-0dfd-4c7b-8b18-088cd7959673" containerID="0106580d39d6e64710016e0d7529a027fc971a4c865017b02395d114443f7614" exitCode=0 Dec 09 11:35:28 crc kubenswrapper[5002]: I1209 11:35:28.946157 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3a2d-account-create-update-j2627" event={"ID":"e7ef5176-0dfd-4c7b-8b18-088cd7959673","Type":"ContainerDied","Data":"0106580d39d6e64710016e0d7529a027fc971a4c865017b02395d114443f7614"} Dec 09 11:35:28 crc kubenswrapper[5002]: I1209 11:35:28.946219 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3a2d-account-create-update-j2627" event={"ID":"e7ef5176-0dfd-4c7b-8b18-088cd7959673","Type":"ContainerStarted","Data":"f8e76edd1f3361843e68f9fd68082cda86227a7e7ebfbe8a746ed840ffe919eb"} Dec 09 11:35:28 crc kubenswrapper[5002]: I1209 11:35:28.951551 5002 generic.go:334] "Generic (PLEG): container finished" podID="9b9fdf9c-44bc-48c9-8651-5394aca76af1" containerID="a443ecafaeb19ab0b0061806e2bf6231088995fa3701d88012ba579c3ee0022e" exitCode=0 Dec 09 11:35:28 crc kubenswrapper[5002]: I1209 11:35:28.951607 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-x6t76" event={"ID":"9b9fdf9c-44bc-48c9-8651-5394aca76af1","Type":"ContainerDied","Data":"a443ecafaeb19ab0b0061806e2bf6231088995fa3701d88012ba579c3ee0022e"} Dec 09 11:35:30 crc kubenswrapper[5002]: I1209 11:35:30.430729 5002 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/cinder-3a2d-account-create-update-j2627" Dec 09 11:35:30 crc kubenswrapper[5002]: I1209 11:35:30.436246 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-x6t76" Dec 09 11:35:30 crc kubenswrapper[5002]: I1209 11:35:30.517355 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fbr7\" (UniqueName: \"kubernetes.io/projected/9b9fdf9c-44bc-48c9-8651-5394aca76af1-kube-api-access-9fbr7\") pod \"9b9fdf9c-44bc-48c9-8651-5394aca76af1\" (UID: \"9b9fdf9c-44bc-48c9-8651-5394aca76af1\") " Dec 09 11:35:30 crc kubenswrapper[5002]: I1209 11:35:30.517773 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7ef5176-0dfd-4c7b-8b18-088cd7959673-operator-scripts\") pod \"e7ef5176-0dfd-4c7b-8b18-088cd7959673\" (UID: \"e7ef5176-0dfd-4c7b-8b18-088cd7959673\") " Dec 09 11:35:30 crc kubenswrapper[5002]: I1209 11:35:30.517800 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrv6g\" (UniqueName: \"kubernetes.io/projected/e7ef5176-0dfd-4c7b-8b18-088cd7959673-kube-api-access-qrv6g\") pod \"e7ef5176-0dfd-4c7b-8b18-088cd7959673\" (UID: \"e7ef5176-0dfd-4c7b-8b18-088cd7959673\") " Dec 09 11:35:30 crc kubenswrapper[5002]: I1209 11:35:30.517895 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b9fdf9c-44bc-48c9-8651-5394aca76af1-operator-scripts\") pod \"9b9fdf9c-44bc-48c9-8651-5394aca76af1\" (UID: \"9b9fdf9c-44bc-48c9-8651-5394aca76af1\") " Dec 09 11:35:30 crc kubenswrapper[5002]: I1209 11:35:30.518317 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b9fdf9c-44bc-48c9-8651-5394aca76af1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9b9fdf9c-44bc-48c9-8651-5394aca76af1" (UID: "9b9fdf9c-44bc-48c9-8651-5394aca76af1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:35:30 crc kubenswrapper[5002]: I1209 11:35:30.518418 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7ef5176-0dfd-4c7b-8b18-088cd7959673-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e7ef5176-0dfd-4c7b-8b18-088cd7959673" (UID: "e7ef5176-0dfd-4c7b-8b18-088cd7959673"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:35:30 crc kubenswrapper[5002]: I1209 11:35:30.534720 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7ef5176-0dfd-4c7b-8b18-088cd7959673-kube-api-access-qrv6g" (OuterVolumeSpecName: "kube-api-access-qrv6g") pod "e7ef5176-0dfd-4c7b-8b18-088cd7959673" (UID: "e7ef5176-0dfd-4c7b-8b18-088cd7959673"). InnerVolumeSpecName "kube-api-access-qrv6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:35:30 crc kubenswrapper[5002]: I1209 11:35:30.534892 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b9fdf9c-44bc-48c9-8651-5394aca76af1-kube-api-access-9fbr7" (OuterVolumeSpecName: "kube-api-access-9fbr7") pod "9b9fdf9c-44bc-48c9-8651-5394aca76af1" (UID: "9b9fdf9c-44bc-48c9-8651-5394aca76af1"). InnerVolumeSpecName "kube-api-access-9fbr7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:35:30 crc kubenswrapper[5002]: I1209 11:35:30.621180 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7ef5176-0dfd-4c7b-8b18-088cd7959673-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:35:30 crc kubenswrapper[5002]: I1209 11:35:30.621237 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrv6g\" (UniqueName: \"kubernetes.io/projected/e7ef5176-0dfd-4c7b-8b18-088cd7959673-kube-api-access-qrv6g\") on node \"crc\" DevicePath \"\"" Dec 09 11:35:30 crc kubenswrapper[5002]: I1209 11:35:30.621258 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b9fdf9c-44bc-48c9-8651-5394aca76af1-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:35:30 crc kubenswrapper[5002]: I1209 11:35:30.621279 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fbr7\" (UniqueName: \"kubernetes.io/projected/9b9fdf9c-44bc-48c9-8651-5394aca76af1-kube-api-access-9fbr7\") on node \"crc\" DevicePath \"\"" Dec 09 11:35:30 crc kubenswrapper[5002]: I1209 11:35:30.974719 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-x6t76" event={"ID":"9b9fdf9c-44bc-48c9-8651-5394aca76af1","Type":"ContainerDied","Data":"e577b5e0063157cbd9a5fb3a3d3c2d2672e91363786879843bda1fef1cdf6321"} Dec 09 11:35:30 crc kubenswrapper[5002]: I1209 11:35:30.974757 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e577b5e0063157cbd9a5fb3a3d3c2d2672e91363786879843bda1fef1cdf6321" Dec 09 11:35:30 crc kubenswrapper[5002]: I1209 11:35:30.974757 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-x6t76" Dec 09 11:35:30 crc kubenswrapper[5002]: I1209 11:35:30.976889 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3a2d-account-create-update-j2627" event={"ID":"e7ef5176-0dfd-4c7b-8b18-088cd7959673","Type":"ContainerDied","Data":"f8e76edd1f3361843e68f9fd68082cda86227a7e7ebfbe8a746ed840ffe919eb"} Dec 09 11:35:30 crc kubenswrapper[5002]: I1209 11:35:30.976917 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8e76edd1f3361843e68f9fd68082cda86227a7e7ebfbe8a746ed840ffe919eb" Dec 09 11:35:30 crc kubenswrapper[5002]: I1209 11:35:30.976954 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-3a2d-account-create-update-j2627" Dec 09 11:35:32 crc kubenswrapper[5002]: I1209 11:35:32.354251 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-7t5r2"] Dec 09 11:35:32 crc kubenswrapper[5002]: E1209 11:35:32.355035 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b9fdf9c-44bc-48c9-8651-5394aca76af1" containerName="mariadb-database-create" Dec 09 11:35:32 crc kubenswrapper[5002]: I1209 11:35:32.355052 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b9fdf9c-44bc-48c9-8651-5394aca76af1" containerName="mariadb-database-create" Dec 09 11:35:32 crc kubenswrapper[5002]: E1209 11:35:32.355092 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7ef5176-0dfd-4c7b-8b18-088cd7959673" containerName="mariadb-account-create-update" Dec 09 11:35:32 crc kubenswrapper[5002]: I1209 11:35:32.355100 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7ef5176-0dfd-4c7b-8b18-088cd7959673" containerName="mariadb-account-create-update" Dec 09 11:35:32 crc kubenswrapper[5002]: I1209 11:35:32.355342 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7ef5176-0dfd-4c7b-8b18-088cd7959673" containerName="mariadb-account-create-update" Dec 09 11:35:32 crc kubenswrapper[5002]: I1209 11:35:32.355362 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b9fdf9c-44bc-48c9-8651-5394aca76af1" containerName="mariadb-database-create" Dec 09 11:35:32 crc kubenswrapper[5002]: I1209 11:35:32.356299 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-7t5r2" Dec 09 11:35:32 crc kubenswrapper[5002]: I1209 11:35:32.359526 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-svccs" Dec 09 11:35:32 crc kubenswrapper[5002]: I1209 11:35:32.359540 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 09 11:35:32 crc kubenswrapper[5002]: I1209 11:35:32.359596 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 09 11:35:32 crc kubenswrapper[5002]: I1209 11:35:32.364995 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-7t5r2"] Dec 09 11:35:32 crc kubenswrapper[5002]: I1209 11:35:32.460709 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc57b111-13f4-4d51-9ea3-26261b356a72-config-data\") pod \"cinder-db-sync-7t5r2\" (UID: \"dc57b111-13f4-4d51-9ea3-26261b356a72\") " pod="openstack/cinder-db-sync-7t5r2" Dec 09 11:35:32 crc kubenswrapper[5002]: I1209 11:35:32.460765 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qknsk\" (UniqueName: \"kubernetes.io/projected/dc57b111-13f4-4d51-9ea3-26261b356a72-kube-api-access-qknsk\") pod \"cinder-db-sync-7t5r2\" (UID: \"dc57b111-13f4-4d51-9ea3-26261b356a72\") " pod="openstack/cinder-db-sync-7t5r2" Dec 09 11:35:32 crc kubenswrapper[5002]: I1209 11:35:32.460801 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc57b111-13f4-4d51-9ea3-26261b356a72-etc-machine-id\") pod \"cinder-db-sync-7t5r2\" (UID: \"dc57b111-13f4-4d51-9ea3-26261b356a72\") " pod="openstack/cinder-db-sync-7t5r2" Dec 09 11:35:32 crc kubenswrapper[5002]: 
I1209 11:35:32.460865 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc57b111-13f4-4d51-9ea3-26261b356a72-scripts\") pod \"cinder-db-sync-7t5r2\" (UID: \"dc57b111-13f4-4d51-9ea3-26261b356a72\") " pod="openstack/cinder-db-sync-7t5r2" Dec 09 11:35:32 crc kubenswrapper[5002]: I1209 11:35:32.460897 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc57b111-13f4-4d51-9ea3-26261b356a72-combined-ca-bundle\") pod \"cinder-db-sync-7t5r2\" (UID: \"dc57b111-13f4-4d51-9ea3-26261b356a72\") " pod="openstack/cinder-db-sync-7t5r2" Dec 09 11:35:32 crc kubenswrapper[5002]: I1209 11:35:32.460934 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dc57b111-13f4-4d51-9ea3-26261b356a72-db-sync-config-data\") pod \"cinder-db-sync-7t5r2\" (UID: \"dc57b111-13f4-4d51-9ea3-26261b356a72\") " pod="openstack/cinder-db-sync-7t5r2" Dec 09 11:35:32 crc kubenswrapper[5002]: I1209 11:35:32.562249 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc57b111-13f4-4d51-9ea3-26261b356a72-config-data\") pod \"cinder-db-sync-7t5r2\" (UID: \"dc57b111-13f4-4d51-9ea3-26261b356a72\") " pod="openstack/cinder-db-sync-7t5r2" Dec 09 11:35:32 crc kubenswrapper[5002]: I1209 11:35:32.562344 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qknsk\" (UniqueName: \"kubernetes.io/projected/dc57b111-13f4-4d51-9ea3-26261b356a72-kube-api-access-qknsk\") pod \"cinder-db-sync-7t5r2\" (UID: \"dc57b111-13f4-4d51-9ea3-26261b356a72\") " pod="openstack/cinder-db-sync-7t5r2" Dec 09 11:35:32 crc kubenswrapper[5002]: I1209 11:35:32.562381 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc57b111-13f4-4d51-9ea3-26261b356a72-etc-machine-id\") pod \"cinder-db-sync-7t5r2\" (UID: \"dc57b111-13f4-4d51-9ea3-26261b356a72\") " pod="openstack/cinder-db-sync-7t5r2" Dec 09 11:35:32 crc kubenswrapper[5002]: I1209 11:35:32.562464 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc57b111-13f4-4d51-9ea3-26261b356a72-scripts\") pod \"cinder-db-sync-7t5r2\" (UID: \"dc57b111-13f4-4d51-9ea3-26261b356a72\") " pod="openstack/cinder-db-sync-7t5r2" Dec 09 11:35:32 crc kubenswrapper[5002]: I1209 11:35:32.562511 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc57b111-13f4-4d51-9ea3-26261b356a72-combined-ca-bundle\") pod \"cinder-db-sync-7t5r2\" (UID: \"dc57b111-13f4-4d51-9ea3-26261b356a72\") " pod="openstack/cinder-db-sync-7t5r2" Dec 09 11:35:32 crc kubenswrapper[5002]: I1209 11:35:32.562560 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc57b111-13f4-4d51-9ea3-26261b356a72-etc-machine-id\") pod \"cinder-db-sync-7t5r2\" (UID: \"dc57b111-13f4-4d51-9ea3-26261b356a72\") " pod="openstack/cinder-db-sync-7t5r2" Dec 09 11:35:32 crc kubenswrapper[5002]: I1209 11:35:32.562585 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/dc57b111-13f4-4d51-9ea3-26261b356a72-db-sync-config-data\") pod \"cinder-db-sync-7t5r2\" (UID: \"dc57b111-13f4-4d51-9ea3-26261b356a72\") " pod="openstack/cinder-db-sync-7t5r2" Dec 09 11:35:32 crc kubenswrapper[5002]: I1209 11:35:32.568383 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dc57b111-13f4-4d51-9ea3-26261b356a72-db-sync-config-data\") pod \"cinder-db-sync-7t5r2\" (UID: \"dc57b111-13f4-4d51-9ea3-26261b356a72\") " pod="openstack/cinder-db-sync-7t5r2" Dec 09 11:35:32 crc kubenswrapper[5002]: I1209 11:35:32.568936 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc57b111-13f4-4d51-9ea3-26261b356a72-combined-ca-bundle\") pod \"cinder-db-sync-7t5r2\" (UID: \"dc57b111-13f4-4d51-9ea3-26261b356a72\") " pod="openstack/cinder-db-sync-7t5r2" Dec 09 11:35:32 crc kubenswrapper[5002]: I1209 11:35:32.569041 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc57b111-13f4-4d51-9ea3-26261b356a72-scripts\") pod \"cinder-db-sync-7t5r2\" (UID: \"dc57b111-13f4-4d51-9ea3-26261b356a72\") " pod="openstack/cinder-db-sync-7t5r2" Dec 09 11:35:32 crc kubenswrapper[5002]: I1209 11:35:32.569069 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc57b111-13f4-4d51-9ea3-26261b356a72-config-data\") pod \"cinder-db-sync-7t5r2\" (UID: \"dc57b111-13f4-4d51-9ea3-26261b356a72\") " pod="openstack/cinder-db-sync-7t5r2" Dec 09 11:35:32 crc kubenswrapper[5002]: I1209 11:35:32.582130 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qknsk\" (UniqueName: \"kubernetes.io/projected/dc57b111-13f4-4d51-9ea3-26261b356a72-kube-api-access-qknsk\") pod \"cinder-db-sync-7t5r2\" (UID: \"dc57b111-13f4-4d51-9ea3-26261b356a72\") " pod="openstack/cinder-db-sync-7t5r2" Dec 09 11:35:32 crc kubenswrapper[5002]: I1209 11:35:32.680199 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-7t5r2" Dec 09 11:35:33 crc kubenswrapper[5002]: I1209 11:35:33.169527 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-7t5r2"] Dec 09 11:35:34 crc kubenswrapper[5002]: I1209 11:35:34.005201 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-7t5r2" event={"ID":"dc57b111-13f4-4d51-9ea3-26261b356a72","Type":"ContainerStarted","Data":"2e1a378880476ce70b8a2de64ee1edfdff36107a284ade3fc53d8b8074c1fbaa"} Dec 09 11:35:34 crc kubenswrapper[5002]: I1209 11:35:34.005709 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-7t5r2" event={"ID":"dc57b111-13f4-4d51-9ea3-26261b356a72","Type":"ContainerStarted","Data":"4efae6774c42fc207bea5ada9b190199c979b4ba361a735749004749fe1be150"} Dec 09 11:35:34 crc kubenswrapper[5002]: I1209 11:35:34.031393 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-7t5r2" podStartSLOduration=2.031375795 podStartE2EDuration="2.031375795s" podCreationTimestamp="2025-12-09 11:35:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:35:34.029249118 +0000 UTC m=+5666.421300199" watchObservedRunningTime="2025-12-09 11:35:34.031375795 +0000 UTC m=+5666.423426866" Dec 09 11:35:38 crc kubenswrapper[5002]: I1209 11:35:38.052352 5002 generic.go:334] "Generic (PLEG): container finished" podID="dc57b111-13f4-4d51-9ea3-26261b356a72" containerID="2e1a378880476ce70b8a2de64ee1edfdff36107a284ade3fc53d8b8074c1fbaa" exitCode=0 Dec 09 11:35:38 crc kubenswrapper[5002]: I1209 11:35:38.052467 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-7t5r2" event={"ID":"dc57b111-13f4-4d51-9ea3-26261b356a72","Type":"ContainerDied","Data":"2e1a378880476ce70b8a2de64ee1edfdff36107a284ade3fc53d8b8074c1fbaa"} Dec 09 11:35:39 crc kubenswrapper[5002]: I1209 11:35:39.413116 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-7t5r2" Dec 09 11:35:39 crc kubenswrapper[5002]: I1209 11:35:39.602615 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qknsk\" (UniqueName: \"kubernetes.io/projected/dc57b111-13f4-4d51-9ea3-26261b356a72-kube-api-access-qknsk\") pod \"dc57b111-13f4-4d51-9ea3-26261b356a72\" (UID: \"dc57b111-13f4-4d51-9ea3-26261b356a72\") " Dec 09 11:35:39 crc kubenswrapper[5002]: I1209 11:35:39.602660 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc57b111-13f4-4d51-9ea3-26261b356a72-config-data\") pod \"dc57b111-13f4-4d51-9ea3-26261b356a72\" (UID: \"dc57b111-13f4-4d51-9ea3-26261b356a72\") " Dec 09 11:35:39 crc kubenswrapper[5002]: I1209 11:35:39.602789 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc57b111-13f4-4d51-9ea3-26261b356a72-combined-ca-bundle\") pod \"dc57b111-13f4-4d51-9ea3-26261b356a72\" (UID: \"dc57b111-13f4-4d51-9ea3-26261b356a72\") " Dec 09 11:35:39 crc kubenswrapper[5002]: I1209 11:35:39.602833 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc57b111-13f4-4d51-9ea3-26261b356a72-etc-machine-id\") pod \"dc57b111-13f4-4d51-9ea3-26261b356a72\" (UID: \"dc57b111-13f4-4d51-9ea3-26261b356a72\") " Dec 09 11:35:39 crc kubenswrapper[5002]: I1209 11:35:39.602898 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dc57b111-13f4-4d51-9ea3-26261b356a72-db-sync-config-data\") pod \"dc57b111-13f4-4d51-9ea3-26261b356a72\" (UID: \"dc57b111-13f4-4d51-9ea3-26261b356a72\") " Dec 09 11:35:39 crc kubenswrapper[5002]: I1209 11:35:39.602961 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc57b111-13f4-4d51-9ea3-26261b356a72-scripts\") pod \"dc57b111-13f4-4d51-9ea3-26261b356a72\" (UID: \"dc57b111-13f4-4d51-9ea3-26261b356a72\") " Dec 09 11:35:39 crc kubenswrapper[5002]: I1209 11:35:39.604796 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc57b111-13f4-4d51-9ea3-26261b356a72-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "dc57b111-13f4-4d51-9ea3-26261b356a72" (UID: "dc57b111-13f4-4d51-9ea3-26261b356a72"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:35:39 crc kubenswrapper[5002]: I1209 11:35:39.609974 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc57b111-13f4-4d51-9ea3-26261b356a72-scripts" (OuterVolumeSpecName: "scripts") pod "dc57b111-13f4-4d51-9ea3-26261b356a72" (UID: "dc57b111-13f4-4d51-9ea3-26261b356a72"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:35:39 crc kubenswrapper[5002]: I1209 11:35:39.611166 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc57b111-13f4-4d51-9ea3-26261b356a72-kube-api-access-qknsk" (OuterVolumeSpecName: "kube-api-access-qknsk") pod "dc57b111-13f4-4d51-9ea3-26261b356a72" (UID: "dc57b111-13f4-4d51-9ea3-26261b356a72"). InnerVolumeSpecName "kube-api-access-qknsk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:35:39 crc kubenswrapper[5002]: I1209 11:35:39.612365 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc57b111-13f4-4d51-9ea3-26261b356a72-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "dc57b111-13f4-4d51-9ea3-26261b356a72" (UID: "dc57b111-13f4-4d51-9ea3-26261b356a72"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:35:39 crc kubenswrapper[5002]: I1209 11:35:39.630553 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc57b111-13f4-4d51-9ea3-26261b356a72-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc57b111-13f4-4d51-9ea3-26261b356a72" (UID: "dc57b111-13f4-4d51-9ea3-26261b356a72"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:35:39 crc kubenswrapper[5002]: I1209 11:35:39.651283 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc57b111-13f4-4d51-9ea3-26261b356a72-config-data" (OuterVolumeSpecName: "config-data") pod "dc57b111-13f4-4d51-9ea3-26261b356a72" (UID: "dc57b111-13f4-4d51-9ea3-26261b356a72"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:35:39 crc kubenswrapper[5002]: I1209 11:35:39.705681 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc57b111-13f4-4d51-9ea3-26261b356a72-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:35:39 crc kubenswrapper[5002]: I1209 11:35:39.705752 5002 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc57b111-13f4-4d51-9ea3-26261b356a72-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 09 11:35:39 crc kubenswrapper[5002]: I1209 11:35:39.705777 5002 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dc57b111-13f4-4d51-9ea3-26261b356a72-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:35:39 crc kubenswrapper[5002]: I1209 11:35:39.705800 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc57b111-13f4-4d51-9ea3-26261b356a72-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:35:39 crc kubenswrapper[5002]: I1209 11:35:39.705862 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qknsk\" (UniqueName: \"kubernetes.io/projected/dc57b111-13f4-4d51-9ea3-26261b356a72-kube-api-access-qknsk\") on node \"crc\" DevicePath \"\"" Dec 09 11:35:39 crc kubenswrapper[5002]: I1209 11:35:39.705891 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc57b111-13f4-4d51-9ea3-26261b356a72-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.085304 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-7t5r2" event={"ID":"dc57b111-13f4-4d51-9ea3-26261b356a72","Type":"ContainerDied","Data":"4efae6774c42fc207bea5ada9b190199c979b4ba361a735749004749fe1be150"} Dec 09 11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.085378 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4efae6774c42fc207bea5ada9b190199c979b4ba361a735749004749fe1be150" Dec 09 11:35:40 crc kubenswrapper[5002]: 
I1209 11:35:40.085432 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-7t5r2" Dec 09 11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.485595 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58dc68b6c7-9s9kk"] Dec 09 11:35:40 crc kubenswrapper[5002]: E1209 11:35:40.512953 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc57b111-13f4-4d51-9ea3-26261b356a72" containerName="cinder-db-sync" Dec 09 11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.513004 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc57b111-13f4-4d51-9ea3-26261b356a72" containerName="cinder-db-sync" Dec 09 11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.513441 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc57b111-13f4-4d51-9ea3-26261b356a72" containerName="cinder-db-sync" Dec 09 11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.514556 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dc68b6c7-9s9kk"] Dec 09 11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.514669 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dc68b6c7-9s9kk" Dec 09 11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.625277 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7c18105-50bb-4814-aebb-df97e8948e01-config\") pod \"dnsmasq-dns-58dc68b6c7-9s9kk\" (UID: \"e7c18105-50bb-4814-aebb-df97e8948e01\") " pod="openstack/dnsmasq-dns-58dc68b6c7-9s9kk" Dec 09 11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.625375 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s96bx\" (UniqueName: \"kubernetes.io/projected/e7c18105-50bb-4814-aebb-df97e8948e01-kube-api-access-s96bx\") pod \"dnsmasq-dns-58dc68b6c7-9s9kk\" (UID: \"e7c18105-50bb-4814-aebb-df97e8948e01\") " pod="openstack/dnsmasq-dns-58dc68b6c7-9s9kk" Dec 09 11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.625544 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7c18105-50bb-4814-aebb-df97e8948e01-dns-svc\") pod \"dnsmasq-dns-58dc68b6c7-9s9kk\" (UID: \"e7c18105-50bb-4814-aebb-df97e8948e01\") " pod="openstack/dnsmasq-dns-58dc68b6c7-9s9kk" Dec 09 11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.625718 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7c18105-50bb-4814-aebb-df97e8948e01-ovsdbserver-nb\") pod \"dnsmasq-dns-58dc68b6c7-9s9kk\" (UID: \"e7c18105-50bb-4814-aebb-df97e8948e01\") " pod="openstack/dnsmasq-dns-58dc68b6c7-9s9kk" Dec 09 11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.625746 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7c18105-50bb-4814-aebb-df97e8948e01-ovsdbserver-sb\") pod \"dnsmasq-dns-58dc68b6c7-9s9kk\" (UID: \"e7c18105-50bb-4814-aebb-df97e8948e01\") " pod="openstack/dnsmasq-dns-58dc68b6c7-9s9kk" Dec 09 11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.655775 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 09 11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.657752 5002 util.go:30] "No sandbox for pod can be found. 
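
Every pod in this log walks the same volume state machine: VerifyControllerAttachedVolume, then MountVolume started and MountVolume.SetUp succeeded on the way up; UnmountVolume started, UnmountVolume.TearDown succeeded, and finally "Volume detached ... DevicePath" on the way down, as the cinder-db-sync-7t5r2 teardown above shows for all six of its volumes. A quick way to spot a volume stuck mid-teardown is to pair the start and detach records by UniqueName. A sketch (note the message text in the journal carries escaped quotes, \"...\", which the patterns below tolerate):

    package main

    import (
        "bufio"
        "fmt"
        "os"
        "regexp"
    )

    var (
        unmountRe  = regexp.MustCompile(`UnmountVolume started for volume .*?\(UniqueName: \\?"([^"\\]+)\\?"`)
        detachedRe = regexp.MustCompile(`Volume detached for volume .*?\(UniqueName: \\?"([^"\\]+)\\?"`)
    )

    func main() {
        started := map[string]bool{}
        detached := map[string]bool{}
        sc := bufio.NewScanner(os.Stdin)
        sc.Buffer(make([]byte, 0, 1<<20), 1<<20)
        for sc.Scan() {
            line := sc.Text()
            for _, m := range unmountRe.FindAllStringSubmatch(line, -1) {
                started[m[1]] = true
            }
            for _, m := range detachedRe.FindAllStringSubmatch(line, -1) {
                detached[m[1]] = true
            }
        }
        // Anything left here began unmounting but never reported detachment.
        for v := range started {
            if !detached[v] {
                fmt.Println("unmount started but no detach record:", v)
            }
        }
    }
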
Need to start a new one" pod="openstack/cinder-api-0" Dec 09 11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.659950 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 09 11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.660034 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 09 11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.661527 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-svccs" Dec 09 11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.661786 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 09 11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.672033 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 09 11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.729094 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7c18105-50bb-4814-aebb-df97e8948e01-ovsdbserver-nb\") pod \"dnsmasq-dns-58dc68b6c7-9s9kk\" (UID: \"e7c18105-50bb-4814-aebb-df97e8948e01\") " pod="openstack/dnsmasq-dns-58dc68b6c7-9s9kk" Dec 09 11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.729144 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7c18105-50bb-4814-aebb-df97e8948e01-ovsdbserver-sb\") pod \"dnsmasq-dns-58dc68b6c7-9s9kk\" (UID: \"e7c18105-50bb-4814-aebb-df97e8948e01\") " pod="openstack/dnsmasq-dns-58dc68b6c7-9s9kk" Dec 09 11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.729278 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7c18105-50bb-4814-aebb-df97e8948e01-config\") pod \"dnsmasq-dns-58dc68b6c7-9s9kk\" (UID: \"e7c18105-50bb-4814-aebb-df97e8948e01\") " pod="openstack/dnsmasq-dns-58dc68b6c7-9s9kk" Dec 09 11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.729330 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s96bx\" (UniqueName: \"kubernetes.io/projected/e7c18105-50bb-4814-aebb-df97e8948e01-kube-api-access-s96bx\") pod \"dnsmasq-dns-58dc68b6c7-9s9kk\" (UID: \"e7c18105-50bb-4814-aebb-df97e8948e01\") " pod="openstack/dnsmasq-dns-58dc68b6c7-9s9kk" Dec 09 11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.729374 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7c18105-50bb-4814-aebb-df97e8948e01-dns-svc\") pod \"dnsmasq-dns-58dc68b6c7-9s9kk\" (UID: \"e7c18105-50bb-4814-aebb-df97e8948e01\") " pod="openstack/dnsmasq-dns-58dc68b6c7-9s9kk" Dec 09 11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.730481 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7c18105-50bb-4814-aebb-df97e8948e01-ovsdbserver-sb\") pod \"dnsmasq-dns-58dc68b6c7-9s9kk\" (UID: \"e7c18105-50bb-4814-aebb-df97e8948e01\") " pod="openstack/dnsmasq-dns-58dc68b6c7-9s9kk" Dec 09 11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.730516 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7c18105-50bb-4814-aebb-df97e8948e01-config\") pod \"dnsmasq-dns-58dc68b6c7-9s9kk\" (UID: \"e7c18105-50bb-4814-aebb-df97e8948e01\") " pod="openstack/dnsmasq-dns-58dc68b6c7-9s9kk" Dec 09 
11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.730540 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7c18105-50bb-4814-aebb-df97e8948e01-dns-svc\") pod \"dnsmasq-dns-58dc68b6c7-9s9kk\" (UID: \"e7c18105-50bb-4814-aebb-df97e8948e01\") " pod="openstack/dnsmasq-dns-58dc68b6c7-9s9kk" Dec 09 11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.730749 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7c18105-50bb-4814-aebb-df97e8948e01-ovsdbserver-nb\") pod \"dnsmasq-dns-58dc68b6c7-9s9kk\" (UID: \"e7c18105-50bb-4814-aebb-df97e8948e01\") " pod="openstack/dnsmasq-dns-58dc68b6c7-9s9kk" Dec 09 11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.761347 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s96bx\" (UniqueName: \"kubernetes.io/projected/e7c18105-50bb-4814-aebb-df97e8948e01-kube-api-access-s96bx\") pod \"dnsmasq-dns-58dc68b6c7-9s9kk\" (UID: \"e7c18105-50bb-4814-aebb-df97e8948e01\") " pod="openstack/dnsmasq-dns-58dc68b6c7-9s9kk" Dec 09 11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.830398 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4b341517-4db7-4996-8aa3-d2bc4ad0ef6b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4b341517-4db7-4996-8aa3-d2bc4ad0ef6b\") " pod="openstack/cinder-api-0" Dec 09 11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.830779 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4b341517-4db7-4996-8aa3-d2bc4ad0ef6b-config-data-custom\") pod \"cinder-api-0\" (UID: \"4b341517-4db7-4996-8aa3-d2bc4ad0ef6b\") " pod="openstack/cinder-api-0" Dec 09 11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.830870 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b341517-4db7-4996-8aa3-d2bc4ad0ef6b-logs\") pod \"cinder-api-0\" (UID: \"4b341517-4db7-4996-8aa3-d2bc4ad0ef6b\") " pod="openstack/cinder-api-0" Dec 09 11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.830903 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjnbl\" (UniqueName: \"kubernetes.io/projected/4b341517-4db7-4996-8aa3-d2bc4ad0ef6b-kube-api-access-bjnbl\") pod \"cinder-api-0\" (UID: \"4b341517-4db7-4996-8aa3-d2bc4ad0ef6b\") " pod="openstack/cinder-api-0" Dec 09 11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.830923 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b341517-4db7-4996-8aa3-d2bc4ad0ef6b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4b341517-4db7-4996-8aa3-d2bc4ad0ef6b\") " pod="openstack/cinder-api-0" Dec 09 11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.831293 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b341517-4db7-4996-8aa3-d2bc4ad0ef6b-config-data\") pod \"cinder-api-0\" (UID: \"4b341517-4db7-4996-8aa3-d2bc4ad0ef6b\") " pod="openstack/cinder-api-0" Dec 09 11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.831509 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/4b341517-4db7-4996-8aa3-d2bc4ad0ef6b-scripts\") pod \"cinder-api-0\" (UID: \"4b341517-4db7-4996-8aa3-d2bc4ad0ef6b\") " pod="openstack/cinder-api-0" Dec 09 11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.844222 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dc68b6c7-9s9kk" Dec 09 11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.932909 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4b341517-4db7-4996-8aa3-d2bc4ad0ef6b-config-data-custom\") pod \"cinder-api-0\" (UID: \"4b341517-4db7-4996-8aa3-d2bc4ad0ef6b\") " pod="openstack/cinder-api-0" Dec 09 11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.933567 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b341517-4db7-4996-8aa3-d2bc4ad0ef6b-logs\") pod \"cinder-api-0\" (UID: \"4b341517-4db7-4996-8aa3-d2bc4ad0ef6b\") " pod="openstack/cinder-api-0" Dec 09 11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.933657 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjnbl\" (UniqueName: \"kubernetes.io/projected/4b341517-4db7-4996-8aa3-d2bc4ad0ef6b-kube-api-access-bjnbl\") pod \"cinder-api-0\" (UID: \"4b341517-4db7-4996-8aa3-d2bc4ad0ef6b\") " pod="openstack/cinder-api-0" Dec 09 11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.933727 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b341517-4db7-4996-8aa3-d2bc4ad0ef6b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4b341517-4db7-4996-8aa3-d2bc4ad0ef6b\") " pod="openstack/cinder-api-0" Dec 09 11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.933877 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b341517-4db7-4996-8aa3-d2bc4ad0ef6b-config-data\") pod \"cinder-api-0\" (UID: \"4b341517-4db7-4996-8aa3-d2bc4ad0ef6b\") " pod="openstack/cinder-api-0" Dec 09 11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.933975 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b341517-4db7-4996-8aa3-d2bc4ad0ef6b-scripts\") pod \"cinder-api-0\" (UID: \"4b341517-4db7-4996-8aa3-d2bc4ad0ef6b\") " pod="openstack/cinder-api-0" Dec 09 11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.934052 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4b341517-4db7-4996-8aa3-d2bc4ad0ef6b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4b341517-4db7-4996-8aa3-d2bc4ad0ef6b\") " pod="openstack/cinder-api-0" Dec 09 11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.934192 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4b341517-4db7-4996-8aa3-d2bc4ad0ef6b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4b341517-4db7-4996-8aa3-d2bc4ad0ef6b\") " pod="openstack/cinder-api-0" Dec 09 11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.934561 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b341517-4db7-4996-8aa3-d2bc4ad0ef6b-logs\") pod \"cinder-api-0\" (UID: \"4b341517-4db7-4996-8aa3-d2bc4ad0ef6b\") " pod="openstack/cinder-api-0" Dec 09 11:35:40 crc 
kubenswrapper[5002]: I1209 11:35:40.942116 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b341517-4db7-4996-8aa3-d2bc4ad0ef6b-scripts\") pod \"cinder-api-0\" (UID: \"4b341517-4db7-4996-8aa3-d2bc4ad0ef6b\") " pod="openstack/cinder-api-0" Dec 09 11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.942386 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4b341517-4db7-4996-8aa3-d2bc4ad0ef6b-config-data-custom\") pod \"cinder-api-0\" (UID: \"4b341517-4db7-4996-8aa3-d2bc4ad0ef6b\") " pod="openstack/cinder-api-0" Dec 09 11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.942480 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b341517-4db7-4996-8aa3-d2bc4ad0ef6b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4b341517-4db7-4996-8aa3-d2bc4ad0ef6b\") " pod="openstack/cinder-api-0" Dec 09 11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.946733 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b341517-4db7-4996-8aa3-d2bc4ad0ef6b-config-data\") pod \"cinder-api-0\" (UID: \"4b341517-4db7-4996-8aa3-d2bc4ad0ef6b\") " pod="openstack/cinder-api-0" Dec 09 11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.959344 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjnbl\" (UniqueName: \"kubernetes.io/projected/4b341517-4db7-4996-8aa3-d2bc4ad0ef6b-kube-api-access-bjnbl\") pod \"cinder-api-0\" (UID: \"4b341517-4db7-4996-8aa3-d2bc4ad0ef6b\") " pod="openstack/cinder-api-0" Dec 09 11:35:40 crc kubenswrapper[5002]: I1209 11:35:40.976096 5002 util.go:30] "No sandbox for pod can be found. 
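
The cinder-api-0 and dnsmasq-dns-58dc68b6c7-9s9kk mounts above also show the per-plugin timing pattern: host-path (etc-machine-id) and empty-dir (logs) volumes report MountVolume.SetUp succeeded immediately within the same reconcile pass, while secret, configmap, and projected volumes (including the auto-injected kube-api-access-* service-account token volume) complete a few milliseconds later, once their payloads are assembled. A sketch that tallies SetUp successes by plugin, parsed from the kubernetes.io/<plugin>/ prefix of each UniqueName:

    package main

    import (
        "bufio"
        "fmt"
        "os"
        "regexp"
    )

    var setupRe = regexp.MustCompile(
        `MountVolume\.SetUp succeeded for volume .*?\(UniqueName: \\?"kubernetes\.io/([^/"\\]+)/`)

    func main() {
        counts := map[string]int{}
        sc := bufio.NewScanner(os.Stdin)
        sc.Buffer(make([]byte, 0, 1<<20), 1<<20)
        for sc.Scan() {
            for _, m := range setupRe.FindAllStringSubmatch(sc.Text(), -1) {
                counts[m[1]]++ // e.g. configmap, secret, projected, host-path
            }
        }
        for plugin, n := range counts {
            fmt.Printf("%-12s %d\n", plugin, n)
        }
    }
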
Need to start a new one" pod="openstack/cinder-api-0" Dec 09 11:35:41 crc kubenswrapper[5002]: I1209 11:35:41.386328 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dc68b6c7-9s9kk"] Dec 09 11:35:41 crc kubenswrapper[5002]: I1209 11:35:41.498991 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 09 11:35:41 crc kubenswrapper[5002]: W1209 11:35:41.508205 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b341517_4db7_4996_8aa3_d2bc4ad0ef6b.slice/crio-a86837081964b41e343bcd933e880b5ac05cb4087ef7bbe140cb8aee26bcb0e0 WatchSource:0}: Error finding container a86837081964b41e343bcd933e880b5ac05cb4087ef7bbe140cb8aee26bcb0e0: Status 404 returned error can't find the container with id a86837081964b41e343bcd933e880b5ac05cb4087ef7bbe140cb8aee26bcb0e0 Dec 09 11:35:42 crc kubenswrapper[5002]: I1209 11:35:42.117019 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4b341517-4db7-4996-8aa3-d2bc4ad0ef6b","Type":"ContainerStarted","Data":"b22170a9eb9d79cdeb24167950009ea054636ad8af84c361a819433e849048e9"} Dec 09 11:35:42 crc kubenswrapper[5002]: I1209 11:35:42.117382 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4b341517-4db7-4996-8aa3-d2bc4ad0ef6b","Type":"ContainerStarted","Data":"a86837081964b41e343bcd933e880b5ac05cb4087ef7bbe140cb8aee26bcb0e0"} Dec 09 11:35:42 crc kubenswrapper[5002]: I1209 11:35:42.122786 5002 generic.go:334] "Generic (PLEG): container finished" podID="e7c18105-50bb-4814-aebb-df97e8948e01" containerID="632d80b35a4374799f90b84c6841468d945f85dfec8ee9ef5b70b7083f4ea4c4" exitCode=0 Dec 09 11:35:42 crc kubenswrapper[5002]: I1209 11:35:42.122893 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dc68b6c7-9s9kk" event={"ID":"e7c18105-50bb-4814-aebb-df97e8948e01","Type":"ContainerDied","Data":"632d80b35a4374799f90b84c6841468d945f85dfec8ee9ef5b70b7083f4ea4c4"} Dec 09 11:35:42 crc kubenswrapper[5002]: I1209 11:35:42.122969 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dc68b6c7-9s9kk" event={"ID":"e7c18105-50bb-4814-aebb-df97e8948e01","Type":"ContainerStarted","Data":"4e303effba94515a8e34d5ff108296121c4cf8929a6a64e8d08b18efaa17c189"} Dec 09 11:35:43 crc kubenswrapper[5002]: I1209 11:35:43.132914 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4b341517-4db7-4996-8aa3-d2bc4ad0ef6b","Type":"ContainerStarted","Data":"ec1a49143e2d278fc2d98442cce33d0a91681ed1f1f627b1a3ec982c442f83af"} Dec 09 11:35:43 crc kubenswrapper[5002]: I1209 11:35:43.133752 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 09 11:35:43 crc kubenswrapper[5002]: I1209 11:35:43.134472 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dc68b6c7-9s9kk" event={"ID":"e7c18105-50bb-4814-aebb-df97e8948e01","Type":"ContainerStarted","Data":"22214553af323e7765f9637d4e163d535bd46c20164d2b33de78f9cd8238f4e8"} Dec 09 11:35:43 crc kubenswrapper[5002]: I1209 11:35:43.135284 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58dc68b6c7-9s9kk" Dec 09 11:35:43 crc kubenswrapper[5002]: I1209 11:35:43.157794 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.157770948 
podStartE2EDuration="3.157770948s" podCreationTimestamp="2025-12-09 11:35:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:35:43.153464233 +0000 UTC m=+5675.545515324" watchObservedRunningTime="2025-12-09 11:35:43.157770948 +0000 UTC m=+5675.549822029" Dec 09 11:35:43 crc kubenswrapper[5002]: I1209 11:35:43.180324 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58dc68b6c7-9s9kk" podStartSLOduration=3.180307271 podStartE2EDuration="3.180307271s" podCreationTimestamp="2025-12-09 11:35:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:35:43.176095258 +0000 UTC m=+5675.568146349" watchObservedRunningTime="2025-12-09 11:35:43.180307271 +0000 UTC m=+5675.572358352" Dec 09 11:35:50 crc kubenswrapper[5002]: I1209 11:35:50.844982 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58dc68b6c7-9s9kk" Dec 09 11:35:50 crc kubenswrapper[5002]: I1209 11:35:50.956766 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64b8d7d4fc-qfxwb"] Dec 09 11:35:50 crc kubenswrapper[5002]: I1209 11:35:50.957053 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-64b8d7d4fc-qfxwb" podUID="089e55db-3c5c-4320-92e2-2aeca1510fed" containerName="dnsmasq-dns" containerID="cri-o://9ffe50c95f5d1659c9beabc0a62a3cc93d0610e0541c1c44d943da5199f06214" gracePeriod=10 Dec 09 11:35:51 crc kubenswrapper[5002]: I1209 11:35:51.967138 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64b8d7d4fc-qfxwb" Dec 09 11:35:52 crc kubenswrapper[5002]: I1209 11:35:52.076338 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/089e55db-3c5c-4320-92e2-2aeca1510fed-ovsdbserver-sb\") pod \"089e55db-3c5c-4320-92e2-2aeca1510fed\" (UID: \"089e55db-3c5c-4320-92e2-2aeca1510fed\") " Dec 09 11:35:52 crc kubenswrapper[5002]: I1209 11:35:52.076400 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/089e55db-3c5c-4320-92e2-2aeca1510fed-dns-svc\") pod \"089e55db-3c5c-4320-92e2-2aeca1510fed\" (UID: \"089e55db-3c5c-4320-92e2-2aeca1510fed\") " Dec 09 11:35:52 crc kubenswrapper[5002]: I1209 11:35:52.076513 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/089e55db-3c5c-4320-92e2-2aeca1510fed-ovsdbserver-nb\") pod \"089e55db-3c5c-4320-92e2-2aeca1510fed\" (UID: \"089e55db-3c5c-4320-92e2-2aeca1510fed\") " Dec 09 11:35:52 crc kubenswrapper[5002]: I1209 11:35:52.076554 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbzh5\" (UniqueName: \"kubernetes.io/projected/089e55db-3c5c-4320-92e2-2aeca1510fed-kube-api-access-bbzh5\") pod \"089e55db-3c5c-4320-92e2-2aeca1510fed\" (UID: \"089e55db-3c5c-4320-92e2-2aeca1510fed\") " Dec 09 11:35:52 crc kubenswrapper[5002]: I1209 11:35:52.076599 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/089e55db-3c5c-4320-92e2-2aeca1510fed-config\") pod \"089e55db-3c5c-4320-92e2-2aeca1510fed\" (UID: 
\"089e55db-3c5c-4320-92e2-2aeca1510fed\") " Dec 09 11:35:52 crc kubenswrapper[5002]: I1209 11:35:52.089483 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/089e55db-3c5c-4320-92e2-2aeca1510fed-kube-api-access-bbzh5" (OuterVolumeSpecName: "kube-api-access-bbzh5") pod "089e55db-3c5c-4320-92e2-2aeca1510fed" (UID: "089e55db-3c5c-4320-92e2-2aeca1510fed"). InnerVolumeSpecName "kube-api-access-bbzh5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:35:52 crc kubenswrapper[5002]: I1209 11:35:52.140911 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/089e55db-3c5c-4320-92e2-2aeca1510fed-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "089e55db-3c5c-4320-92e2-2aeca1510fed" (UID: "089e55db-3c5c-4320-92e2-2aeca1510fed"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:35:52 crc kubenswrapper[5002]: I1209 11:35:52.141390 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/089e55db-3c5c-4320-92e2-2aeca1510fed-config" (OuterVolumeSpecName: "config") pod "089e55db-3c5c-4320-92e2-2aeca1510fed" (UID: "089e55db-3c5c-4320-92e2-2aeca1510fed"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:35:52 crc kubenswrapper[5002]: I1209 11:35:52.143224 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/089e55db-3c5c-4320-92e2-2aeca1510fed-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "089e55db-3c5c-4320-92e2-2aeca1510fed" (UID: "089e55db-3c5c-4320-92e2-2aeca1510fed"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:35:52 crc kubenswrapper[5002]: I1209 11:35:52.154759 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/089e55db-3c5c-4320-92e2-2aeca1510fed-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "089e55db-3c5c-4320-92e2-2aeca1510fed" (UID: "089e55db-3c5c-4320-92e2-2aeca1510fed"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:35:52 crc kubenswrapper[5002]: I1209 11:35:52.178971 5002 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/089e55db-3c5c-4320-92e2-2aeca1510fed-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 09 11:35:52 crc kubenswrapper[5002]: I1209 11:35:52.179014 5002 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/089e55db-3c5c-4320-92e2-2aeca1510fed-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 11:35:52 crc kubenswrapper[5002]: I1209 11:35:52.179023 5002 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/089e55db-3c5c-4320-92e2-2aeca1510fed-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 11:35:52 crc kubenswrapper[5002]: I1209 11:35:52.179032 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbzh5\" (UniqueName: \"kubernetes.io/projected/089e55db-3c5c-4320-92e2-2aeca1510fed-kube-api-access-bbzh5\") on node \"crc\" DevicePath \"\"" Dec 09 11:35:52 crc kubenswrapper[5002]: I1209 11:35:52.179041 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/089e55db-3c5c-4320-92e2-2aeca1510fed-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:35:52 crc kubenswrapper[5002]: I1209 11:35:52.229053 5002 generic.go:334] "Generic (PLEG): container finished" podID="089e55db-3c5c-4320-92e2-2aeca1510fed" containerID="9ffe50c95f5d1659c9beabc0a62a3cc93d0610e0541c1c44d943da5199f06214" exitCode=0 Dec 09 11:35:52 crc kubenswrapper[5002]: I1209 11:35:52.229097 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64b8d7d4fc-qfxwb" event={"ID":"089e55db-3c5c-4320-92e2-2aeca1510fed","Type":"ContainerDied","Data":"9ffe50c95f5d1659c9beabc0a62a3cc93d0610e0541c1c44d943da5199f06214"} Dec 09 11:35:52 crc kubenswrapper[5002]: I1209 11:35:52.229127 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64b8d7d4fc-qfxwb" event={"ID":"089e55db-3c5c-4320-92e2-2aeca1510fed","Type":"ContainerDied","Data":"4a4254140a826c2e4019575c95758182f4d9fd69af3e11e91728f88a6dfdf6b2"} Dec 09 11:35:52 crc kubenswrapper[5002]: I1209 11:35:52.229126 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64b8d7d4fc-qfxwb" Dec 09 11:35:52 crc kubenswrapper[5002]: I1209 11:35:52.229148 5002 scope.go:117] "RemoveContainer" containerID="9ffe50c95f5d1659c9beabc0a62a3cc93d0610e0541c1c44d943da5199f06214" Dec 09 11:35:52 crc kubenswrapper[5002]: I1209 11:35:52.272475 5002 scope.go:117] "RemoveContainer" containerID="e7381888e89b8cd833bebcd7157a2127bcf3e548f640b9542569dbb6f8b20442" Dec 09 11:35:52 crc kubenswrapper[5002]: I1209 11:35:52.285498 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64b8d7d4fc-qfxwb"] Dec 09 11:35:52 crc kubenswrapper[5002]: I1209 11:35:52.292255 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-64b8d7d4fc-qfxwb"] Dec 09 11:35:52 crc kubenswrapper[5002]: I1209 11:35:52.299174 5002 scope.go:117] "RemoveContainer" containerID="9ffe50c95f5d1659c9beabc0a62a3cc93d0610e0541c1c44d943da5199f06214" Dec 09 11:35:52 crc kubenswrapper[5002]: E1209 11:35:52.299713 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ffe50c95f5d1659c9beabc0a62a3cc93d0610e0541c1c44d943da5199f06214\": container with ID starting with 9ffe50c95f5d1659c9beabc0a62a3cc93d0610e0541c1c44d943da5199f06214 not found: ID does not exist" containerID="9ffe50c95f5d1659c9beabc0a62a3cc93d0610e0541c1c44d943da5199f06214" Dec 09 11:35:52 crc kubenswrapper[5002]: I1209 11:35:52.299747 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ffe50c95f5d1659c9beabc0a62a3cc93d0610e0541c1c44d943da5199f06214"} err="failed to get container status \"9ffe50c95f5d1659c9beabc0a62a3cc93d0610e0541c1c44d943da5199f06214\": rpc error: code = NotFound desc = could not find container \"9ffe50c95f5d1659c9beabc0a62a3cc93d0610e0541c1c44d943da5199f06214\": container with ID starting with 9ffe50c95f5d1659c9beabc0a62a3cc93d0610e0541c1c44d943da5199f06214 not found: ID does not exist" Dec 09 11:35:52 crc kubenswrapper[5002]: I1209 11:35:52.299772 5002 scope.go:117] "RemoveContainer" containerID="e7381888e89b8cd833bebcd7157a2127bcf3e548f640b9542569dbb6f8b20442" Dec 09 11:35:52 crc kubenswrapper[5002]: E1209 11:35:52.300303 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7381888e89b8cd833bebcd7157a2127bcf3e548f640b9542569dbb6f8b20442\": container with ID starting with e7381888e89b8cd833bebcd7157a2127bcf3e548f640b9542569dbb6f8b20442 not found: ID does not exist" containerID="e7381888e89b8cd833bebcd7157a2127bcf3e548f640b9542569dbb6f8b20442" Dec 09 11:35:52 crc kubenswrapper[5002]: I1209 11:35:52.300332 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7381888e89b8cd833bebcd7157a2127bcf3e548f640b9542569dbb6f8b20442"} err="failed to get container status \"e7381888e89b8cd833bebcd7157a2127bcf3e548f640b9542569dbb6f8b20442\": rpc error: code = NotFound desc = could not find container \"e7381888e89b8cd833bebcd7157a2127bcf3e548f640b9542569dbb6f8b20442\": container with ID starting with e7381888e89b8cd833bebcd7157a2127bcf3e548f640b9542569dbb6f8b20442 not found: ID does not exist" Dec 09 11:35:52 crc kubenswrapper[5002]: I1209 11:35:52.328407 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 09 11:35:52 crc kubenswrapper[5002]: I1209 11:35:52.328707 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" 
podUID="508684a5-04cd-49b4-b3e0-716ccad025d9" containerName="nova-api-log" containerID="cri-o://eba09b67b8bd4964616fabee2ba3552e2e3d53427bf901903b70f474a64f25f2" gracePeriod=30 Dec 09 11:35:52 crc kubenswrapper[5002]: I1209 11:35:52.329098 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="508684a5-04cd-49b4-b3e0-716ccad025d9" containerName="nova-api-api" containerID="cri-o://8c2e29170c137ebf193d874d180b8f59acbdfc002c01df8ce8d34c22fd802cc1" gracePeriod=30 Dec 09 11:35:52 crc kubenswrapper[5002]: I1209 11:35:52.347380 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 09 11:35:52 crc kubenswrapper[5002]: I1209 11:35:52.347963 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="1bc03877-56b5-44c0-9565-ce459d9f28da" containerName="nova-cell0-conductor-conductor" containerID="cri-o://0f4aef0a2426b1d7d164e37f0f54bc4b014cab5d7b256879db415d42eea46ca0" gracePeriod=30 Dec 09 11:35:52 crc kubenswrapper[5002]: I1209 11:35:52.362415 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 09 11:35:52 crc kubenswrapper[5002]: I1209 11:35:52.362936 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="b3483538-8c2c-4c39-a9f1-5c8b6779d43a" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://aa4b67030850ec07ff3d003861c1733d259c14d2364627e83d950e4c96243b51" gracePeriod=30 Dec 09 11:35:52 crc kubenswrapper[5002]: I1209 11:35:52.382170 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 11:35:52 crc kubenswrapper[5002]: I1209 11:35:52.382482 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2" containerName="nova-scheduler-scheduler" containerID="cri-o://8a91c1f212ac18d512531d99263be14280296db593fe5c90de534dfaaeef66d7" gracePeriod=30 Dec 09 11:35:52 crc kubenswrapper[5002]: I1209 11:35:52.407702 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 11:35:52 crc kubenswrapper[5002]: I1209 11:35:52.407970 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3e7a608e-8ea7-4852-b11a-148d988edf80" containerName="nova-metadata-log" containerID="cri-o://5c434b80a34154b996afc300fcd845f4d30f3d76b2b35fa8aff5d6d2869a2792" gracePeriod=30 Dec 09 11:35:52 crc kubenswrapper[5002]: I1209 11:35:52.408461 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3e7a608e-8ea7-4852-b11a-148d988edf80" containerName="nova-metadata-metadata" containerID="cri-o://6f28cf966e43a907662091d0d63d04f817aae004b749004b4811848bd325ae45" gracePeriod=30 Dec 09 11:35:53 crc kubenswrapper[5002]: I1209 11:35:53.096279 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 09 11:35:53 crc kubenswrapper[5002]: I1209 11:35:53.243132 5002 generic.go:334] "Generic (PLEG): container finished" podID="b3483538-8c2c-4c39-a9f1-5c8b6779d43a" containerID="aa4b67030850ec07ff3d003861c1733d259c14d2364627e83d950e4c96243b51" exitCode=0 Dec 09 11:35:53 crc kubenswrapper[5002]: I1209 11:35:53.243196 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"b3483538-8c2c-4c39-a9f1-5c8b6779d43a","Type":"ContainerDied","Data":"aa4b67030850ec07ff3d003861c1733d259c14d2364627e83d950e4c96243b51"} Dec 09 11:35:53 crc kubenswrapper[5002]: I1209 11:35:53.245419 5002 generic.go:334] "Generic (PLEG): container finished" podID="508684a5-04cd-49b4-b3e0-716ccad025d9" containerID="eba09b67b8bd4964616fabee2ba3552e2e3d53427bf901903b70f474a64f25f2" exitCode=143 Dec 09 11:35:53 crc kubenswrapper[5002]: I1209 11:35:53.245482 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"508684a5-04cd-49b4-b3e0-716ccad025d9","Type":"ContainerDied","Data":"eba09b67b8bd4964616fabee2ba3552e2e3d53427bf901903b70f474a64f25f2"} Dec 09 11:35:53 crc kubenswrapper[5002]: I1209 11:35:53.247835 5002 generic.go:334] "Generic (PLEG): container finished" podID="3e7a608e-8ea7-4852-b11a-148d988edf80" containerID="5c434b80a34154b996afc300fcd845f4d30f3d76b2b35fa8aff5d6d2869a2792" exitCode=143 Dec 09 11:35:53 crc kubenswrapper[5002]: I1209 11:35:53.247918 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3e7a608e-8ea7-4852-b11a-148d988edf80","Type":"ContainerDied","Data":"5c434b80a34154b996afc300fcd845f4d30f3d76b2b35fa8aff5d6d2869a2792"} Dec 09 11:35:53 crc kubenswrapper[5002]: I1209 11:35:53.783108 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:35:53 crc kubenswrapper[5002]: I1209 11:35:53.920325 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3483538-8c2c-4c39-a9f1-5c8b6779d43a-combined-ca-bundle\") pod \"b3483538-8c2c-4c39-a9f1-5c8b6779d43a\" (UID: \"b3483538-8c2c-4c39-a9f1-5c8b6779d43a\") " Dec 09 11:35:53 crc kubenswrapper[5002]: I1209 11:35:53.920894 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3483538-8c2c-4c39-a9f1-5c8b6779d43a-config-data\") pod \"b3483538-8c2c-4c39-a9f1-5c8b6779d43a\" (UID: \"b3483538-8c2c-4c39-a9f1-5c8b6779d43a\") " Dec 09 11:35:53 crc kubenswrapper[5002]: I1209 11:35:53.920941 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccwq7\" (UniqueName: \"kubernetes.io/projected/b3483538-8c2c-4c39-a9f1-5c8b6779d43a-kube-api-access-ccwq7\") pod \"b3483538-8c2c-4c39-a9f1-5c8b6779d43a\" (UID: \"b3483538-8c2c-4c39-a9f1-5c8b6779d43a\") " Dec 09 11:35:53 crc kubenswrapper[5002]: I1209 11:35:53.927129 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3483538-8c2c-4c39-a9f1-5c8b6779d43a-kube-api-access-ccwq7" (OuterVolumeSpecName: "kube-api-access-ccwq7") pod "b3483538-8c2c-4c39-a9f1-5c8b6779d43a" (UID: "b3483538-8c2c-4c39-a9f1-5c8b6779d43a"). InnerVolumeSpecName "kube-api-access-ccwq7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:35:53 crc kubenswrapper[5002]: I1209 11:35:53.949070 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3483538-8c2c-4c39-a9f1-5c8b6779d43a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b3483538-8c2c-4c39-a9f1-5c8b6779d43a" (UID: "b3483538-8c2c-4c39-a9f1-5c8b6779d43a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:35:53 crc kubenswrapper[5002]: I1209 11:35:53.950607 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 09 11:35:53 crc kubenswrapper[5002]: I1209 11:35:53.954444 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3483538-8c2c-4c39-a9f1-5c8b6779d43a-config-data" (OuterVolumeSpecName: "config-data") pod "b3483538-8c2c-4c39-a9f1-5c8b6779d43a" (UID: "b3483538-8c2c-4c39-a9f1-5c8b6779d43a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:35:53 crc kubenswrapper[5002]: E1209 11:35:53.995356 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8a91c1f212ac18d512531d99263be14280296db593fe5c90de534dfaaeef66d7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 09 11:35:53 crc kubenswrapper[5002]: E1209 11:35:53.996547 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8a91c1f212ac18d512531d99263be14280296db593fe5c90de534dfaaeef66d7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 09 11:35:53 crc kubenswrapper[5002]: E1209 11:35:53.997759 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8a91c1f212ac18d512531d99263be14280296db593fe5c90de534dfaaeef66d7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 09 11:35:53 crc kubenswrapper[5002]: E1209 11:35:53.997786 5002 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2" containerName="nova-scheduler-scheduler" Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.025637 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3483538-8c2c-4c39-a9f1-5c8b6779d43a-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.025677 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccwq7\" (UniqueName: \"kubernetes.io/projected/b3483538-8c2c-4c39-a9f1-5c8b6779d43a-kube-api-access-ccwq7\") on node \"crc\" DevicePath \"\"" Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.025690 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3483538-8c2c-4c39-a9f1-5c8b6779d43a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.075283 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="089e55db-3c5c-4320-92e2-2aeca1510fed" path="/var/lib/kubelet/pods/089e55db-3c5c-4320-92e2-2aeca1510fed/volumes" Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.126759 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kcp2\" (UniqueName: \"kubernetes.io/projected/1bc03877-56b5-44c0-9565-ce459d9f28da-kube-api-access-7kcp2\") pod \"1bc03877-56b5-44c0-9565-ce459d9f28da\" (UID: \"1bc03877-56b5-44c0-9565-ce459d9f28da\") " Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.126924 5002 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bc03877-56b5-44c0-9565-ce459d9f28da-config-data\") pod \"1bc03877-56b5-44c0-9565-ce459d9f28da\" (UID: \"1bc03877-56b5-44c0-9565-ce459d9f28da\") " Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.126988 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bc03877-56b5-44c0-9565-ce459d9f28da-combined-ca-bundle\") pod \"1bc03877-56b5-44c0-9565-ce459d9f28da\" (UID: \"1bc03877-56b5-44c0-9565-ce459d9f28da\") " Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.134950 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bc03877-56b5-44c0-9565-ce459d9f28da-kube-api-access-7kcp2" (OuterVolumeSpecName: "kube-api-access-7kcp2") pod "1bc03877-56b5-44c0-9565-ce459d9f28da" (UID: "1bc03877-56b5-44c0-9565-ce459d9f28da"). InnerVolumeSpecName "kube-api-access-7kcp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.150602 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bc03877-56b5-44c0-9565-ce459d9f28da-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1bc03877-56b5-44c0-9565-ce459d9f28da" (UID: "1bc03877-56b5-44c0-9565-ce459d9f28da"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.153919 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bc03877-56b5-44c0-9565-ce459d9f28da-config-data" (OuterVolumeSpecName: "config-data") pod "1bc03877-56b5-44c0-9565-ce459d9f28da" (UID: "1bc03877-56b5-44c0-9565-ce459d9f28da"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.228539 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kcp2\" (UniqueName: \"kubernetes.io/projected/1bc03877-56b5-44c0-9565-ce459d9f28da-kube-api-access-7kcp2\") on node \"crc\" DevicePath \"\"" Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.228572 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bc03877-56b5-44c0-9565-ce459d9f28da-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.228582 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bc03877-56b5-44c0-9565-ce459d9f28da-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.286630 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b3483538-8c2c-4c39-a9f1-5c8b6779d43a","Type":"ContainerDied","Data":"74657b77970a967a07566401238df14e9d7b6b635c85c4e98bb6b08e2755ede3"} Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.286704 5002 scope.go:117] "RemoveContainer" containerID="aa4b67030850ec07ff3d003861c1733d259c14d2364627e83d950e4c96243b51" Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.286761 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.304799 5002 generic.go:334] "Generic (PLEG): container finished" podID="1bc03877-56b5-44c0-9565-ce459d9f28da" containerID="0f4aef0a2426b1d7d164e37f0f54bc4b014cab5d7b256879db415d42eea46ca0" exitCode=0 Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.304851 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"1bc03877-56b5-44c0-9565-ce459d9f28da","Type":"ContainerDied","Data":"0f4aef0a2426b1d7d164e37f0f54bc4b014cab5d7b256879db415d42eea46ca0"} Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.304876 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"1bc03877-56b5-44c0-9565-ce459d9f28da","Type":"ContainerDied","Data":"5ebe7d63153bd97ec1d9a3746c2e50eedbcfc28a2cfcba84efa0072399de1e6d"} Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.304933 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.360935 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.361371 5002 scope.go:117] "RemoveContainer" containerID="0f4aef0a2426b1d7d164e37f0f54bc4b014cab5d7b256879db415d42eea46ca0" Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.385609 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.394189 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.401356 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.408099 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 09 11:35:54 crc kubenswrapper[5002]: E1209 11:35:54.408614 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bc03877-56b5-44c0-9565-ce459d9f28da" containerName="nova-cell0-conductor-conductor" Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.408631 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bc03877-56b5-44c0-9565-ce459d9f28da" containerName="nova-cell0-conductor-conductor" Dec 09 11:35:54 crc kubenswrapper[5002]: E1209 11:35:54.408656 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3483538-8c2c-4c39-a9f1-5c8b6779d43a" containerName="nova-cell1-novncproxy-novncproxy" Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.408663 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3483538-8c2c-4c39-a9f1-5c8b6779d43a" containerName="nova-cell1-novncproxy-novncproxy" Dec 09 11:35:54 crc kubenswrapper[5002]: E1209 11:35:54.408672 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="089e55db-3c5c-4320-92e2-2aeca1510fed" containerName="init" Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.408678 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="089e55db-3c5c-4320-92e2-2aeca1510fed" containerName="init" Dec 09 11:35:54 crc kubenswrapper[5002]: E1209 11:35:54.408691 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="089e55db-3c5c-4320-92e2-2aeca1510fed" containerName="dnsmasq-dns" Dec 09 11:35:54 crc 
kubenswrapper[5002]: I1209 11:35:54.408697 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="089e55db-3c5c-4320-92e2-2aeca1510fed" containerName="dnsmasq-dns" Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.408899 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3483538-8c2c-4c39-a9f1-5c8b6779d43a" containerName="nova-cell1-novncproxy-novncproxy" Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.408916 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="089e55db-3c5c-4320-92e2-2aeca1510fed" containerName="dnsmasq-dns" Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.408931 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bc03877-56b5-44c0-9565-ce459d9f28da" containerName="nova-cell0-conductor-conductor" Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.409653 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.430774 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.431872 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.431887 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.431951 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.441867 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.442049 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.453421 5002 scope.go:117] "RemoveContainer" containerID="0f4aef0a2426b1d7d164e37f0f54bc4b014cab5d7b256879db415d42eea46ca0" Dec 09 11:35:54 crc kubenswrapper[5002]: E1209 11:35:54.465837 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f4aef0a2426b1d7d164e37f0f54bc4b014cab5d7b256879db415d42eea46ca0\": container with ID starting with 0f4aef0a2426b1d7d164e37f0f54bc4b014cab5d7b256879db415d42eea46ca0 not found: ID does not exist" containerID="0f4aef0a2426b1d7d164e37f0f54bc4b014cab5d7b256879db415d42eea46ca0" Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.465880 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f4aef0a2426b1d7d164e37f0f54bc4b014cab5d7b256879db415d42eea46ca0"} err="failed to get container status \"0f4aef0a2426b1d7d164e37f0f54bc4b014cab5d7b256879db415d42eea46ca0\": rpc error: code = NotFound desc = could not find container \"0f4aef0a2426b1d7d164e37f0f54bc4b014cab5d7b256879db415d42eea46ca0\": container with ID starting with 0f4aef0a2426b1d7d164e37f0f54bc4b014cab5d7b256879db415d42eea46ca0 not found: ID does not exist" Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.538895 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b037088-45da-4b2b-85d1-777da6060b4f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"6b037088-45da-4b2b-85d1-777da6060b4f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.539345 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1\") " pod="openstack/nova-cell0-conductor-0" Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.539422 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b037088-45da-4b2b-85d1-777da6060b4f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b037088-45da-4b2b-85d1-777da6060b4f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.539460 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcv4b\" (UniqueName: \"kubernetes.io/projected/6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1-kube-api-access-pcv4b\") pod \"nova-cell0-conductor-0\" (UID: \"6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1\") " pod="openstack/nova-cell0-conductor-0" Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.539480 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1\") " pod="openstack/nova-cell0-conductor-0" Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.539502 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqz4r\" (UniqueName: \"kubernetes.io/projected/6b037088-45da-4b2b-85d1-777da6060b4f-kube-api-access-qqz4r\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b037088-45da-4b2b-85d1-777da6060b4f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.640978 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1\") " pod="openstack/nova-cell0-conductor-0" Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.641115 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b037088-45da-4b2b-85d1-777da6060b4f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b037088-45da-4b2b-85d1-777da6060b4f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.641156 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcv4b\" (UniqueName: \"kubernetes.io/projected/6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1-kube-api-access-pcv4b\") pod \"nova-cell0-conductor-0\" (UID: \"6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1\") " pod="openstack/nova-cell0-conductor-0" Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.641183 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: 
\"6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1\") " pod="openstack/nova-cell0-conductor-0" Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.641222 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqz4r\" (UniqueName: \"kubernetes.io/projected/6b037088-45da-4b2b-85d1-777da6060b4f-kube-api-access-qqz4r\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b037088-45da-4b2b-85d1-777da6060b4f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.641318 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b037088-45da-4b2b-85d1-777da6060b4f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b037088-45da-4b2b-85d1-777da6060b4f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.646764 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b037088-45da-4b2b-85d1-777da6060b4f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b037088-45da-4b2b-85d1-777da6060b4f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.647497 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b037088-45da-4b2b-85d1-777da6060b4f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b037088-45da-4b2b-85d1-777da6060b4f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.649840 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1\") " pod="openstack/nova-cell0-conductor-0" Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.665922 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1\") " pod="openstack/nova-cell0-conductor-0" Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.667324 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqz4r\" (UniqueName: \"kubernetes.io/projected/6b037088-45da-4b2b-85d1-777da6060b4f-kube-api-access-qqz4r\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b037088-45da-4b2b-85d1-777da6060b4f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.673234 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcv4b\" (UniqueName: \"kubernetes.io/projected/6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1-kube-api-access-pcv4b\") pod \"nova-cell0-conductor-0\" (UID: \"6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1\") " pod="openstack/nova-cell0-conductor-0" Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.754169 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 09 11:35:54 crc kubenswrapper[5002]: I1209 11:35:54.764321 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:35:55 crc kubenswrapper[5002]: I1209 11:35:55.267485 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 09 11:35:55 crc kubenswrapper[5002]: I1209 11:35:55.325100 5002 generic.go:334] "Generic (PLEG): container finished" podID="4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2" containerID="8a91c1f212ac18d512531d99263be14280296db593fe5c90de534dfaaeef66d7" exitCode=0 Dec 09 11:35:55 crc kubenswrapper[5002]: I1209 11:35:55.325175 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2","Type":"ContainerDied","Data":"8a91c1f212ac18d512531d99263be14280296db593fe5c90de534dfaaeef66d7"} Dec 09 11:35:55 crc kubenswrapper[5002]: I1209 11:35:55.327169 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6b037088-45da-4b2b-85d1-777da6060b4f","Type":"ContainerStarted","Data":"8f1ae04a09d1f84c99fb5ad09b2a54c44129daee45b336664f7937a4a32d76cd"} Dec 09 11:35:55 crc kubenswrapper[5002]: I1209 11:35:55.338929 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 09 11:35:55 crc kubenswrapper[5002]: I1209 11:35:55.447621 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 11:35:55 crc kubenswrapper[5002]: I1209 11:35:55.498171 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="508684a5-04cd-49b4-b3e0-716ccad025d9" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.66:8774/\": read tcp 10.217.0.2:48200->10.217.1.66:8774: read: connection reset by peer" Dec 09 11:35:55 crc kubenswrapper[5002]: I1209 11:35:55.498172 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="508684a5-04cd-49b4-b3e0-716ccad025d9" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.66:8774/\": read tcp 10.217.0.2:48194->10.217.1.66:8774: read: connection reset by peer" Dec 09 11:35:55 crc kubenswrapper[5002]: I1209 11:35:55.538885 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 09 11:35:55 crc kubenswrapper[5002]: I1209 11:35:55.539313 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="90e280fc-dcb3-4d43-828d-89c8abf26988" containerName="nova-cell1-conductor-conductor" containerID="cri-o://ba5d508c985be31ea67f932d9b586a33685597b5adc65e900f42835227f33821" gracePeriod=30 Dec 09 11:35:55 crc kubenswrapper[5002]: I1209 11:35:55.558573 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2-combined-ca-bundle\") pod \"4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2\" (UID: \"4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2\") " Dec 09 11:35:55 crc kubenswrapper[5002]: I1209 11:35:55.558668 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2-config-data\") pod \"4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2\" (UID: \"4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2\") " Dec 09 11:35:55 crc kubenswrapper[5002]: I1209 11:35:55.558748 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-7bstp\" (UniqueName: \"kubernetes.io/projected/4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2-kube-api-access-7bstp\") pod \"4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2\" (UID: \"4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2\") " Dec 09 11:35:55 crc kubenswrapper[5002]: I1209 11:35:55.563934 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2-kube-api-access-7bstp" (OuterVolumeSpecName: "kube-api-access-7bstp") pod "4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2" (UID: "4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2"). InnerVolumeSpecName "kube-api-access-7bstp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:35:55 crc kubenswrapper[5002]: I1209 11:35:55.571050 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="3e7a608e-8ea7-4852-b11a-148d988edf80" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.67:8775/\": read tcp 10.217.0.2:35512->10.217.1.67:8775: read: connection reset by peer" Dec 09 11:35:55 crc kubenswrapper[5002]: I1209 11:35:55.571050 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="3e7a608e-8ea7-4852-b11a-148d988edf80" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.67:8775/\": read tcp 10.217.0.2:35526->10.217.1.67:8775: read: connection reset by peer" Dec 09 11:35:55 crc kubenswrapper[5002]: I1209 11:35:55.586949 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2" (UID: "4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:35:55 crc kubenswrapper[5002]: I1209 11:35:55.589371 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2-config-data" (OuterVolumeSpecName: "config-data") pod "4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2" (UID: "4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:35:55 crc kubenswrapper[5002]: I1209 11:35:55.660746 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:35:55 crc kubenswrapper[5002]: I1209 11:35:55.660784 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bstp\" (UniqueName: \"kubernetes.io/projected/4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2-kube-api-access-7bstp\") on node \"crc\" DevicePath \"\"" Dec 09 11:35:55 crc kubenswrapper[5002]: I1209 11:35:55.660798 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:35:56 crc kubenswrapper[5002]: E1209 11:35:56.016842 5002 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e7a608e_8ea7_4852_b11a_148d988edf80.slice/crio-conmon-6f28cf966e43a907662091d0d63d04f817aae004b749004b4811848bd325ae45.scope\": RecentStats: unable to find data in memory cache]" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.077243 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bc03877-56b5-44c0-9565-ce459d9f28da" path="/var/lib/kubelet/pods/1bc03877-56b5-44c0-9565-ce459d9f28da/volumes" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.078344 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3483538-8c2c-4c39-a9f1-5c8b6779d43a" path="/var/lib/kubelet/pods/b3483538-8c2c-4c39-a9f1-5c8b6779d43a/volumes" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.203704 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.339205 5002 generic.go:334] "Generic (PLEG): container finished" podID="3e7a608e-8ea7-4852-b11a-148d988edf80" containerID="6f28cf966e43a907662091d0d63d04f817aae004b749004b4811848bd325ae45" exitCode=0 Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.339273 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3e7a608e-8ea7-4852-b11a-148d988edf80","Type":"ContainerDied","Data":"6f28cf966e43a907662091d0d63d04f817aae004b749004b4811848bd325ae45"} Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.345491 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1","Type":"ContainerStarted","Data":"5ebd010935b181e73099be1ce513a2ab84bb15b83e52724c0b3aac325675e772"} Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.345536 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1","Type":"ContainerStarted","Data":"9ef541577938a089596ea3b9f0db2391e229f83220bd4d1ecdefbc0450b7c9bb"} Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.345585 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.351461 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2","Type":"ContainerDied","Data":"b5d0b83d3e00a4eb8e616b99102b72ba63acda49b683ed60d25baafa8ee1f599"} Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.351528 5002 scope.go:117] "RemoveContainer" containerID="8a91c1f212ac18d512531d99263be14280296db593fe5c90de534dfaaeef66d7" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.351628 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.362618 5002 generic.go:334] "Generic (PLEG): container finished" podID="508684a5-04cd-49b4-b3e0-716ccad025d9" containerID="8c2e29170c137ebf193d874d180b8f59acbdfc002c01df8ce8d34c22fd802cc1" exitCode=0 Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.362673 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"508684a5-04cd-49b4-b3e0-716ccad025d9","Type":"ContainerDied","Data":"8c2e29170c137ebf193d874d180b8f59acbdfc002c01df8ce8d34c22fd802cc1"} Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.362703 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"508684a5-04cd-49b4-b3e0-716ccad025d9","Type":"ContainerDied","Data":"8298ec65f139658a35e6545c0252564013353584123a403cf5c165c722d0ce74"} Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.362770 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.363967 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.363959269 podStartE2EDuration="2.363959269s" podCreationTimestamp="2025-12-09 11:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:35:56.361739 +0000 UTC m=+5688.753790101" watchObservedRunningTime="2025-12-09 11:35:56.363959269 +0000 UTC m=+5688.756010350" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.369546 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6b037088-45da-4b2b-85d1-777da6060b4f","Type":"ContainerStarted","Data":"6ad077aa059a4b33e580d13e0ce4ea362eb7f56035ced24c2a36adf1d31591b4"} Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.385015 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/508684a5-04cd-49b4-b3e0-716ccad025d9-config-data\") pod \"508684a5-04cd-49b4-b3e0-716ccad025d9\" (UID: \"508684a5-04cd-49b4-b3e0-716ccad025d9\") " Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.385279 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/508684a5-04cd-49b4-b3e0-716ccad025d9-logs\") pod \"508684a5-04cd-49b4-b3e0-716ccad025d9\" (UID: \"508684a5-04cd-49b4-b3e0-716ccad025d9\") " Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.385443 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfjxz\" (UniqueName: \"kubernetes.io/projected/508684a5-04cd-49b4-b3e0-716ccad025d9-kube-api-access-sfjxz\") pod \"508684a5-04cd-49b4-b3e0-716ccad025d9\" (UID: \"508684a5-04cd-49b4-b3e0-716ccad025d9\") " Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.385736 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/508684a5-04cd-49b4-b3e0-716ccad025d9-combined-ca-bundle\") pod \"508684a5-04cd-49b4-b3e0-716ccad025d9\" (UID: \"508684a5-04cd-49b4-b3e0-716ccad025d9\") " Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.386841 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/508684a5-04cd-49b4-b3e0-716ccad025d9-logs" (OuterVolumeSpecName: "logs") pod "508684a5-04cd-49b4-b3e0-716ccad025d9" (UID: "508684a5-04cd-49b4-b3e0-716ccad025d9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.394613 5002 scope.go:117] "RemoveContainer" containerID="8c2e29170c137ebf193d874d180b8f59acbdfc002c01df8ce8d34c22fd802cc1" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.416969 5002 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/508684a5-04cd-49b4-b3e0-716ccad025d9-logs\") on node \"crc\" DevicePath \"\"" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.429368 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/508684a5-04cd-49b4-b3e0-716ccad025d9-kube-api-access-sfjxz" (OuterVolumeSpecName: "kube-api-access-sfjxz") pod "508684a5-04cd-49b4-b3e0-716ccad025d9" (UID: "508684a5-04cd-49b4-b3e0-716ccad025d9"). 
InnerVolumeSpecName "kube-api-access-sfjxz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.466244 5002 scope.go:117] "RemoveContainer" containerID="eba09b67b8bd4964616fabee2ba3552e2e3d53427bf901903b70f474a64f25f2" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.476386 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.483983 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/508684a5-04cd-49b4-b3e0-716ccad025d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "508684a5-04cd-49b4-b3e0-716ccad025d9" (UID: "508684a5-04cd-49b4-b3e0-716ccad025d9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.521015 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/508684a5-04cd-49b4-b3e0-716ccad025d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.521055 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfjxz\" (UniqueName: \"kubernetes.io/projected/508684a5-04cd-49b4-b3e0-716ccad025d9-kube-api-access-sfjxz\") on node \"crc\" DevicePath \"\"" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.570329 5002 scope.go:117] "RemoveContainer" containerID="8c2e29170c137ebf193d874d180b8f59acbdfc002c01df8ce8d34c22fd802cc1" Dec 09 11:35:56 crc kubenswrapper[5002]: E1209 11:35:56.571916 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c2e29170c137ebf193d874d180b8f59acbdfc002c01df8ce8d34c22fd802cc1\": container with ID starting with 8c2e29170c137ebf193d874d180b8f59acbdfc002c01df8ce8d34c22fd802cc1 not found: ID does not exist" containerID="8c2e29170c137ebf193d874d180b8f59acbdfc002c01df8ce8d34c22fd802cc1" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.572185 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c2e29170c137ebf193d874d180b8f59acbdfc002c01df8ce8d34c22fd802cc1"} err="failed to get container status \"8c2e29170c137ebf193d874d180b8f59acbdfc002c01df8ce8d34c22fd802cc1\": rpc error: code = NotFound desc = could not find container \"8c2e29170c137ebf193d874d180b8f59acbdfc002c01df8ce8d34c22fd802cc1\": container with ID starting with 8c2e29170c137ebf193d874d180b8f59acbdfc002c01df8ce8d34c22fd802cc1 not found: ID does not exist" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.572207 5002 scope.go:117] "RemoveContainer" containerID="eba09b67b8bd4964616fabee2ba3552e2e3d53427bf901903b70f474a64f25f2" Dec 09 11:35:56 crc kubenswrapper[5002]: E1209 11:35:56.580388 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eba09b67b8bd4964616fabee2ba3552e2e3d53427bf901903b70f474a64f25f2\": container with ID starting with eba09b67b8bd4964616fabee2ba3552e2e3d53427bf901903b70f474a64f25f2 not found: ID does not exist" containerID="eba09b67b8bd4964616fabee2ba3552e2e3d53427bf901903b70f474a64f25f2" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.580495 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eba09b67b8bd4964616fabee2ba3552e2e3d53427bf901903b70f474a64f25f2"} 
err="failed to get container status \"eba09b67b8bd4964616fabee2ba3552e2e3d53427bf901903b70f474a64f25f2\": rpc error: code = NotFound desc = could not find container \"eba09b67b8bd4964616fabee2ba3552e2e3d53427bf901903b70f474a64f25f2\": container with ID starting with eba09b67b8bd4964616fabee2ba3552e2e3d53427bf901903b70f474a64f25f2 not found: ID does not exist" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.583674 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.587217 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/508684a5-04cd-49b4-b3e0-716ccad025d9-config-data" (OuterVolumeSpecName: "config-data") pod "508684a5-04cd-49b4-b3e0-716ccad025d9" (UID: "508684a5-04cd-49b4-b3e0-716ccad025d9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.603857 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.6038364659999997 podStartE2EDuration="2.603836466s" podCreationTimestamp="2025-12-09 11:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:35:56.45144889 +0000 UTC m=+5688.843499981" watchObservedRunningTime="2025-12-09 11:35:56.603836466 +0000 UTC m=+5688.995887547" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.607347 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 11:35:56 crc kubenswrapper[5002]: E1209 11:35:56.607786 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="508684a5-04cd-49b4-b3e0-716ccad025d9" containerName="nova-api-api" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.607823 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="508684a5-04cd-49b4-b3e0-716ccad025d9" containerName="nova-api-api" Dec 09 11:35:56 crc kubenswrapper[5002]: E1209 11:35:56.607849 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2" containerName="nova-scheduler-scheduler" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.607856 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2" containerName="nova-scheduler-scheduler" Dec 09 11:35:56 crc kubenswrapper[5002]: E1209 11:35:56.607864 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="508684a5-04cd-49b4-b3e0-716ccad025d9" containerName="nova-api-log" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.607870 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="508684a5-04cd-49b4-b3e0-716ccad025d9" containerName="nova-api-log" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.608058 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="508684a5-04cd-49b4-b3e0-716ccad025d9" containerName="nova-api-log" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.608080 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2" containerName="nova-scheduler-scheduler" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.608094 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="508684a5-04cd-49b4-b3e0-716ccad025d9" containerName="nova-api-api" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.608658 5002 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.608746 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.619236 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.626198 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8\") " pod="openstack/nova-scheduler-0" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.626325 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8-config-data\") pod \"nova-scheduler-0\" (UID: \"4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8\") " pod="openstack/nova-scheduler-0" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.626357 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfz9l\" (UniqueName: \"kubernetes.io/projected/4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8-kube-api-access-rfz9l\") pod \"nova-scheduler-0\" (UID: \"4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8\") " pod="openstack/nova-scheduler-0" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.626439 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/508684a5-04cd-49b4-b3e0-716ccad025d9-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.674446 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.736371 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8-config-data\") pod \"nova-scheduler-0\" (UID: \"4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8\") " pod="openstack/nova-scheduler-0" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.739852 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfz9l\" (UniqueName: \"kubernetes.io/projected/4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8-kube-api-access-rfz9l\") pod \"nova-scheduler-0\" (UID: \"4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8\") " pod="openstack/nova-scheduler-0" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.740135 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8\") " pod="openstack/nova-scheduler-0" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.752561 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8-config-data\") pod \"nova-scheduler-0\" (UID: \"4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8\") " pod="openstack/nova-scheduler-0" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.764453 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8\") " pod="openstack/nova-scheduler-0" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.780472 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfz9l\" (UniqueName: \"kubernetes.io/projected/4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8-kube-api-access-rfz9l\") pod \"nova-scheduler-0\" (UID: \"4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8\") " pod="openstack/nova-scheduler-0" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.799065 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.830287 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.849098 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e7a608e-8ea7-4852-b11a-148d988edf80-combined-ca-bundle\") pod \"3e7a608e-8ea7-4852-b11a-148d988edf80\" (UID: \"3e7a608e-8ea7-4852-b11a-148d988edf80\") " Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.849139 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e7a608e-8ea7-4852-b11a-148d988edf80-logs\") pod \"3e7a608e-8ea7-4852-b11a-148d988edf80\" (UID: \"3e7a608e-8ea7-4852-b11a-148d988edf80\") " Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.849212 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e7a608e-8ea7-4852-b11a-148d988edf80-config-data\") pod \"3e7a608e-8ea7-4852-b11a-148d988edf80\" (UID: 
\"3e7a608e-8ea7-4852-b11a-148d988edf80\") " Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.849356 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnjqf\" (UniqueName: \"kubernetes.io/projected/3e7a608e-8ea7-4852-b11a-148d988edf80-kube-api-access-dnjqf\") pod \"3e7a608e-8ea7-4852-b11a-148d988edf80\" (UID: \"3e7a608e-8ea7-4852-b11a-148d988edf80\") " Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.850303 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e7a608e-8ea7-4852-b11a-148d988edf80-logs" (OuterVolumeSpecName: "logs") pod "3e7a608e-8ea7-4852-b11a-148d988edf80" (UID: "3e7a608e-8ea7-4852-b11a-148d988edf80"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.853136 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e7a608e-8ea7-4852-b11a-148d988edf80-kube-api-access-dnjqf" (OuterVolumeSpecName: "kube-api-access-dnjqf") pod "3e7a608e-8ea7-4852-b11a-148d988edf80" (UID: "3e7a608e-8ea7-4852-b11a-148d988edf80"). InnerVolumeSpecName "kube-api-access-dnjqf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.880233 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e7a608e-8ea7-4852-b11a-148d988edf80-config-data" (OuterVolumeSpecName: "config-data") pod "3e7a608e-8ea7-4852-b11a-148d988edf80" (UID: "3e7a608e-8ea7-4852-b11a-148d988edf80"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.895958 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 09 11:35:56 crc kubenswrapper[5002]: E1209 11:35:56.896369 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e7a608e-8ea7-4852-b11a-148d988edf80" containerName="nova-metadata-log" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.896388 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e7a608e-8ea7-4852-b11a-148d988edf80" containerName="nova-metadata-log" Dec 09 11:35:56 crc kubenswrapper[5002]: E1209 11:35:56.896406 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e7a608e-8ea7-4852-b11a-148d988edf80" containerName="nova-metadata-metadata" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.896419 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e7a608e-8ea7-4852-b11a-148d988edf80" containerName="nova-metadata-metadata" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.896597 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e7a608e-8ea7-4852-b11a-148d988edf80" containerName="nova-metadata-log" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.896619 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e7a608e-8ea7-4852-b11a-148d988edf80" containerName="nova-metadata-metadata" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.897583 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.910053 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.915738 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.931995 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e7a608e-8ea7-4852-b11a-148d988edf80-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e7a608e-8ea7-4852-b11a-148d988edf80" (UID: "3e7a608e-8ea7-4852-b11a-148d988edf80"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.954750 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c46fe918-c220-4c57-812e-7a085b199362-config-data\") pod \"nova-api-0\" (UID: \"c46fe918-c220-4c57-812e-7a085b199362\") " pod="openstack/nova-api-0" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.954833 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hslx\" (UniqueName: \"kubernetes.io/projected/c46fe918-c220-4c57-812e-7a085b199362-kube-api-access-5hslx\") pod \"nova-api-0\" (UID: \"c46fe918-c220-4c57-812e-7a085b199362\") " pod="openstack/nova-api-0" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.954902 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c46fe918-c220-4c57-812e-7a085b199362-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c46fe918-c220-4c57-812e-7a085b199362\") " pod="openstack/nova-api-0" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.955251 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c46fe918-c220-4c57-812e-7a085b199362-logs\") pod \"nova-api-0\" (UID: \"c46fe918-c220-4c57-812e-7a085b199362\") " pod="openstack/nova-api-0" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.955557 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e7a608e-8ea7-4852-b11a-148d988edf80-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.955633 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnjqf\" (UniqueName: \"kubernetes.io/projected/3e7a608e-8ea7-4852-b11a-148d988edf80-kube-api-access-dnjqf\") on node \"crc\" DevicePath \"\"" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.955689 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e7a608e-8ea7-4852-b11a-148d988edf80-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.955748 5002 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e7a608e-8ea7-4852-b11a-148d988edf80-logs\") on node \"crc\" DevicePath \"\"" Dec 09 11:35:56 crc kubenswrapper[5002]: I1209 11:35:56.956335 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 11:35:57 crc kubenswrapper[5002]: I1209 11:35:57.058732 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c46fe918-c220-4c57-812e-7a085b199362-config-data\") pod \"nova-api-0\" (UID: \"c46fe918-c220-4c57-812e-7a085b199362\") " pod="openstack/nova-api-0" Dec 09 11:35:57 crc kubenswrapper[5002]: I1209 11:35:57.058783 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hslx\" (UniqueName: \"kubernetes.io/projected/c46fe918-c220-4c57-812e-7a085b199362-kube-api-access-5hslx\") pod \"nova-api-0\" (UID: \"c46fe918-c220-4c57-812e-7a085b199362\") " pod="openstack/nova-api-0" Dec 09 11:35:57 crc kubenswrapper[5002]: I1209 11:35:57.058851 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c46fe918-c220-4c57-812e-7a085b199362-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c46fe918-c220-4c57-812e-7a085b199362\") " pod="openstack/nova-api-0" Dec 09 11:35:57 crc kubenswrapper[5002]: I1209 11:35:57.058937 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c46fe918-c220-4c57-812e-7a085b199362-logs\") pod \"nova-api-0\" (UID: \"c46fe918-c220-4c57-812e-7a085b199362\") " pod="openstack/nova-api-0" Dec 09 11:35:57 crc kubenswrapper[5002]: I1209 11:35:57.061089 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c46fe918-c220-4c57-812e-7a085b199362-logs\") pod \"nova-api-0\" (UID: \"c46fe918-c220-4c57-812e-7a085b199362\") " pod="openstack/nova-api-0" Dec 09 11:35:57 crc kubenswrapper[5002]: I1209 11:35:57.065043 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c46fe918-c220-4c57-812e-7a085b199362-config-data\") pod \"nova-api-0\" (UID: \"c46fe918-c220-4c57-812e-7a085b199362\") " pod="openstack/nova-api-0" Dec 09 11:35:57 crc kubenswrapper[5002]: I1209 11:35:57.074680 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c46fe918-c220-4c57-812e-7a085b199362-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c46fe918-c220-4c57-812e-7a085b199362\") " pod="openstack/nova-api-0" Dec 09 11:35:57 crc kubenswrapper[5002]: I1209 11:35:57.077090 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hslx\" (UniqueName: \"kubernetes.io/projected/c46fe918-c220-4c57-812e-7a085b199362-kube-api-access-5hslx\") pod \"nova-api-0\" (UID: \"c46fe918-c220-4c57-812e-7a085b199362\") " pod="openstack/nova-api-0" Dec 09 11:35:57 crc kubenswrapper[5002]: I1209 11:35:57.225120 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 09 11:35:57 crc kubenswrapper[5002]: I1209 11:35:57.393426 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 11:35:57 crc kubenswrapper[5002]: W1209 11:35:57.409037 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4356bbcb_7ff8_4553_ac2e_bc86d2a9c1d8.slice/crio-b645a19032d129bdc850b33463612cf05f6314c0748ac5c46b79b9db8ec5d3ba WatchSource:0}: Error finding container b645a19032d129bdc850b33463612cf05f6314c0748ac5c46b79b9db8ec5d3ba: Status 404 returned error can't find the container with id b645a19032d129bdc850b33463612cf05f6314c0748ac5c46b79b9db8ec5d3ba Dec 09 11:35:57 crc kubenswrapper[5002]: I1209 11:35:57.426877 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3e7a608e-8ea7-4852-b11a-148d988edf80","Type":"ContainerDied","Data":"f9d110f0ea12470ecef4d1eca10da4768a84775e9fbfe5987d3727761446f27a"} Dec 09 11:35:57 crc kubenswrapper[5002]: I1209 11:35:57.426926 5002 scope.go:117] "RemoveContainer" containerID="6f28cf966e43a907662091d0d63d04f817aae004b749004b4811848bd325ae45" Dec 09 11:35:57 crc kubenswrapper[5002]: I1209 11:35:57.427060 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 11:35:57 crc kubenswrapper[5002]: I1209 11:35:57.515741 5002 scope.go:117] "RemoveContainer" containerID="5c434b80a34154b996afc300fcd845f4d30f3d76b2b35fa8aff5d6d2869a2792" Dec 09 11:35:57 crc kubenswrapper[5002]: I1209 11:35:57.524577 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 11:35:57 crc kubenswrapper[5002]: I1209 11:35:57.535883 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 11:35:57 crc kubenswrapper[5002]: I1209 11:35:57.540712 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 09 11:35:57 crc kubenswrapper[5002]: I1209 11:35:57.542130 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 11:35:57 crc kubenswrapper[5002]: I1209 11:35:57.545201 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 09 11:35:57 crc kubenswrapper[5002]: I1209 11:35:57.563896 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 11:35:57 crc kubenswrapper[5002]: I1209 11:35:57.575787 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd771216-04c2-4469-9146-6a732a36843d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dd771216-04c2-4469-9146-6a732a36843d\") " pod="openstack/nova-metadata-0" Dec 09 11:35:57 crc kubenswrapper[5002]: I1209 11:35:57.575870 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd771216-04c2-4469-9146-6a732a36843d-config-data\") pod \"nova-metadata-0\" (UID: \"dd771216-04c2-4469-9146-6a732a36843d\") " pod="openstack/nova-metadata-0" Dec 09 11:35:57 crc kubenswrapper[5002]: I1209 11:35:57.575911 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq42m\" (UniqueName: \"kubernetes.io/projected/dd771216-04c2-4469-9146-6a732a36843d-kube-api-access-zq42m\") pod \"nova-metadata-0\" (UID: \"dd771216-04c2-4469-9146-6a732a36843d\") " pod="openstack/nova-metadata-0" Dec 09 11:35:57 crc kubenswrapper[5002]: I1209 11:35:57.575945 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd771216-04c2-4469-9146-6a732a36843d-logs\") pod \"nova-metadata-0\" (UID: \"dd771216-04c2-4469-9146-6a732a36843d\") " pod="openstack/nova-metadata-0" Dec 09 11:35:57 crc kubenswrapper[5002]: I1209 11:35:57.677698 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd771216-04c2-4469-9146-6a732a36843d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dd771216-04c2-4469-9146-6a732a36843d\") " pod="openstack/nova-metadata-0" Dec 09 11:35:57 crc kubenswrapper[5002]: I1209 11:35:57.677832 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd771216-04c2-4469-9146-6a732a36843d-config-data\") pod \"nova-metadata-0\" (UID: \"dd771216-04c2-4469-9146-6a732a36843d\") " pod="openstack/nova-metadata-0" Dec 09 11:35:57 crc kubenswrapper[5002]: I1209 11:35:57.678377 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq42m\" (UniqueName: \"kubernetes.io/projected/dd771216-04c2-4469-9146-6a732a36843d-kube-api-access-zq42m\") pod \"nova-metadata-0\" (UID: \"dd771216-04c2-4469-9146-6a732a36843d\") " pod="openstack/nova-metadata-0" Dec 09 11:35:57 crc kubenswrapper[5002]: I1209 11:35:57.678425 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd771216-04c2-4469-9146-6a732a36843d-logs\") pod \"nova-metadata-0\" (UID: \"dd771216-04c2-4469-9146-6a732a36843d\") " pod="openstack/nova-metadata-0" Dec 09 11:35:57 crc kubenswrapper[5002]: I1209 11:35:57.680513 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/dd771216-04c2-4469-9146-6a732a36843d-logs\") pod \"nova-metadata-0\" (UID: \"dd771216-04c2-4469-9146-6a732a36843d\") " pod="openstack/nova-metadata-0" Dec 09 11:35:57 crc kubenswrapper[5002]: I1209 11:35:57.681183 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd771216-04c2-4469-9146-6a732a36843d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dd771216-04c2-4469-9146-6a732a36843d\") " pod="openstack/nova-metadata-0" Dec 09 11:35:57 crc kubenswrapper[5002]: I1209 11:35:57.682441 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd771216-04c2-4469-9146-6a732a36843d-config-data\") pod \"nova-metadata-0\" (UID: \"dd771216-04c2-4469-9146-6a732a36843d\") " pod="openstack/nova-metadata-0" Dec 09 11:35:57 crc kubenswrapper[5002]: I1209 11:35:57.693014 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq42m\" (UniqueName: \"kubernetes.io/projected/dd771216-04c2-4469-9146-6a732a36843d-kube-api-access-zq42m\") pod \"nova-metadata-0\" (UID: \"dd771216-04c2-4469-9146-6a732a36843d\") " pod="openstack/nova-metadata-0" Dec 09 11:35:57 crc kubenswrapper[5002]: W1209 11:35:57.709090 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc46fe918_c220_4c57_812e_7a085b199362.slice/crio-7b92fffe4b161f51da5de6be9cf521b7451e8261d8afedf2f90ca0da8e3a0f12 WatchSource:0}: Error finding container 7b92fffe4b161f51da5de6be9cf521b7451e8261d8afedf2f90ca0da8e3a0f12: Status 404 returned error can't find the container with id 7b92fffe4b161f51da5de6be9cf521b7451e8261d8afedf2f90ca0da8e3a0f12 Dec 09 11:35:57 crc kubenswrapper[5002]: I1209 11:35:57.714917 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 09 11:35:57 crc kubenswrapper[5002]: I1209 11:35:57.878403 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 11:35:58 crc kubenswrapper[5002]: I1209 11:35:58.085303 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e7a608e-8ea7-4852-b11a-148d988edf80" path="/var/lib/kubelet/pods/3e7a608e-8ea7-4852-b11a-148d988edf80/volumes" Dec 09 11:35:58 crc kubenswrapper[5002]: I1209 11:35:58.086741 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2" path="/var/lib/kubelet/pods/4e3abcbc-4d1c-44fe-8eda-3e2b401e0ab2/volumes" Dec 09 11:35:58 crc kubenswrapper[5002]: I1209 11:35:58.088113 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="508684a5-04cd-49b4-b3e0-716ccad025d9" path="/var/lib/kubelet/pods/508684a5-04cd-49b4-b3e0-716ccad025d9/volumes" Dec 09 11:35:58 crc kubenswrapper[5002]: I1209 11:35:58.334043 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 11:35:58 crc kubenswrapper[5002]: W1209 11:35:58.339197 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd771216_04c2_4469_9146_6a732a36843d.slice/crio-061bcacb81e49f1850ad96ad24125d6c2c4b7345885f80879aece99ed3c55362 WatchSource:0}: Error finding container 061bcacb81e49f1850ad96ad24125d6c2c4b7345885f80879aece99ed3c55362: Status 404 returned error can't find the container with id 061bcacb81e49f1850ad96ad24125d6c2c4b7345885f80879aece99ed3c55362 Dec 09 11:35:58 crc kubenswrapper[5002]: I1209 11:35:58.451202 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c46fe918-c220-4c57-812e-7a085b199362","Type":"ContainerStarted","Data":"a5f4153c494e051ca3a95763a488b6d59775fcea18f2eef4a7f65e480a887a30"} Dec 09 11:35:58 crc kubenswrapper[5002]: I1209 11:35:58.451277 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c46fe918-c220-4c57-812e-7a085b199362","Type":"ContainerStarted","Data":"7b92fffe4b161f51da5de6be9cf521b7451e8261d8afedf2f90ca0da8e3a0f12"} Dec 09 11:35:58 crc kubenswrapper[5002]: I1209 11:35:58.456874 5002 generic.go:334] "Generic (PLEG): container finished" podID="90e280fc-dcb3-4d43-828d-89c8abf26988" containerID="ba5d508c985be31ea67f932d9b586a33685597b5adc65e900f42835227f33821" exitCode=0 Dec 09 11:35:58 crc kubenswrapper[5002]: I1209 11:35:58.457143 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"90e280fc-dcb3-4d43-828d-89c8abf26988","Type":"ContainerDied","Data":"ba5d508c985be31ea67f932d9b586a33685597b5adc65e900f42835227f33821"} Dec 09 11:35:58 crc kubenswrapper[5002]: E1209 11:35:58.460779 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ba5d508c985be31ea67f932d9b586a33685597b5adc65e900f42835227f33821 is running failed: container process not found" containerID="ba5d508c985be31ea67f932d9b586a33685597b5adc65e900f42835227f33821" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 09 11:35:58 crc kubenswrapper[5002]: E1209 11:35:58.464418 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ba5d508c985be31ea67f932d9b586a33685597b5adc65e900f42835227f33821 is running failed: container process not found" containerID="ba5d508c985be31ea67f932d9b586a33685597b5adc65e900f42835227f33821" 
cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 09 11:35:58 crc kubenswrapper[5002]: I1209 11:35:58.464576 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dd771216-04c2-4469-9146-6a732a36843d","Type":"ContainerStarted","Data":"061bcacb81e49f1850ad96ad24125d6c2c4b7345885f80879aece99ed3c55362"} Dec 09 11:35:58 crc kubenswrapper[5002]: E1209 11:35:58.466751 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ba5d508c985be31ea67f932d9b586a33685597b5adc65e900f42835227f33821 is running failed: container process not found" containerID="ba5d508c985be31ea67f932d9b586a33685597b5adc65e900f42835227f33821" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 09 11:35:58 crc kubenswrapper[5002]: E1209 11:35:58.466787 5002 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ba5d508c985be31ea67f932d9b586a33685597b5adc65e900f42835227f33821 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="90e280fc-dcb3-4d43-828d-89c8abf26988" containerName="nova-cell1-conductor-conductor" Dec 09 11:35:58 crc kubenswrapper[5002]: I1209 11:35:58.471440 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8","Type":"ContainerStarted","Data":"c8450d3a608834ab9f31afd013a7fb75dc5024672797bf598790495aec53f4ac"} Dec 09 11:35:58 crc kubenswrapper[5002]: I1209 11:35:58.471498 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8","Type":"ContainerStarted","Data":"b645a19032d129bdc850b33463612cf05f6314c0748ac5c46b79b9db8ec5d3ba"} Dec 09 11:35:58 crc kubenswrapper[5002]: I1209 11:35:58.577784 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 09 11:35:58 crc kubenswrapper[5002]: I1209 11:35:58.602637 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.6026120109999997 podStartE2EDuration="2.602612011s" podCreationTimestamp="2025-12-09 11:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:35:58.48967282 +0000 UTC m=+5690.881723901" watchObservedRunningTime="2025-12-09 11:35:58.602612011 +0000 UTC m=+5690.994663092" Dec 09 11:35:58 crc kubenswrapper[5002]: I1209 11:35:58.698539 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xx4hz\" (UniqueName: \"kubernetes.io/projected/90e280fc-dcb3-4d43-828d-89c8abf26988-kube-api-access-xx4hz\") pod \"90e280fc-dcb3-4d43-828d-89c8abf26988\" (UID: \"90e280fc-dcb3-4d43-828d-89c8abf26988\") " Dec 09 11:35:58 crc kubenswrapper[5002]: I1209 11:35:58.698802 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e280fc-dcb3-4d43-828d-89c8abf26988-combined-ca-bundle\") pod \"90e280fc-dcb3-4d43-828d-89c8abf26988\" (UID: \"90e280fc-dcb3-4d43-828d-89c8abf26988\") " Dec 09 11:35:58 crc kubenswrapper[5002]: I1209 11:35:58.698860 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90e280fc-dcb3-4d43-828d-89c8abf26988-config-data\") pod \"90e280fc-dcb3-4d43-828d-89c8abf26988\" (UID: \"90e280fc-dcb3-4d43-828d-89c8abf26988\") " Dec 09 11:35:58 crc kubenswrapper[5002]: I1209 11:35:58.702491 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90e280fc-dcb3-4d43-828d-89c8abf26988-kube-api-access-xx4hz" (OuterVolumeSpecName: "kube-api-access-xx4hz") pod "90e280fc-dcb3-4d43-828d-89c8abf26988" (UID: "90e280fc-dcb3-4d43-828d-89c8abf26988"). InnerVolumeSpecName "kube-api-access-xx4hz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:35:58 crc kubenswrapper[5002]: I1209 11:35:58.720913 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90e280fc-dcb3-4d43-828d-89c8abf26988-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "90e280fc-dcb3-4d43-828d-89c8abf26988" (UID: "90e280fc-dcb3-4d43-828d-89c8abf26988"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:35:58 crc kubenswrapper[5002]: I1209 11:35:58.744981 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90e280fc-dcb3-4d43-828d-89c8abf26988-config-data" (OuterVolumeSpecName: "config-data") pod "90e280fc-dcb3-4d43-828d-89c8abf26988" (UID: "90e280fc-dcb3-4d43-828d-89c8abf26988"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:35:58 crc kubenswrapper[5002]: I1209 11:35:58.800906 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e280fc-dcb3-4d43-828d-89c8abf26988-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:35:58 crc kubenswrapper[5002]: I1209 11:35:58.800939 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90e280fc-dcb3-4d43-828d-89c8abf26988-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:35:58 crc kubenswrapper[5002]: I1209 11:35:58.800948 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xx4hz\" (UniqueName: \"kubernetes.io/projected/90e280fc-dcb3-4d43-828d-89c8abf26988-kube-api-access-xx4hz\") on node \"crc\" DevicePath \"\"" Dec 09 11:35:59 crc kubenswrapper[5002]: I1209 11:35:59.485886 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 09 11:35:59 crc kubenswrapper[5002]: I1209 11:35:59.488886 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"90e280fc-dcb3-4d43-828d-89c8abf26988","Type":"ContainerDied","Data":"1598f12a7d883046533a4134c774c9fba49de612f60ec3a332bbfa5f9871bcd7"} Dec 09 11:35:59 crc kubenswrapper[5002]: I1209 11:35:59.488947 5002 scope.go:117] "RemoveContainer" containerID="ba5d508c985be31ea67f932d9b586a33685597b5adc65e900f42835227f33821" Dec 09 11:35:59 crc kubenswrapper[5002]: I1209 11:35:59.512153 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dd771216-04c2-4469-9146-6a732a36843d","Type":"ContainerStarted","Data":"458af308bf65a903ca6da8a820f6f3734105a18e2c74e18e70f8c34b16c61eaf"} Dec 09 11:35:59 crc kubenswrapper[5002]: I1209 11:35:59.512217 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dd771216-04c2-4469-9146-6a732a36843d","Type":"ContainerStarted","Data":"0ae179bd45135554ae40623e3f8436aebb71e338b5e536f221e0e5bb9258cbbc"} Dec 09 11:35:59 crc kubenswrapper[5002]: I1209 11:35:59.518578 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c46fe918-c220-4c57-812e-7a085b199362","Type":"ContainerStarted","Data":"79506c16e9a3624e2e56c4d1192962fd11abf39bba5efc9e0da33fae1bea68a5"} Dec 09 11:35:59 crc kubenswrapper[5002]: I1209 11:35:59.538207 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.538186637 podStartE2EDuration="2.538186637s" podCreationTimestamp="2025-12-09 11:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:35:59.531118908 +0000 UTC m=+5691.923170019" watchObservedRunningTime="2025-12-09 11:35:59.538186637 +0000 UTC m=+5691.930237728" Dec 09 11:35:59 crc kubenswrapper[5002]: I1209 11:35:59.569009 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 09 11:35:59 crc kubenswrapper[5002]: I1209 11:35:59.584348 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 09 11:35:59 crc kubenswrapper[5002]: I1209 11:35:59.584632 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.584614029 podStartE2EDuration="3.584614029s" 
podCreationTimestamp="2025-12-09 11:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:35:59.558586823 +0000 UTC m=+5691.950637914" watchObservedRunningTime="2025-12-09 11:35:59.584614029 +0000 UTC m=+5691.976665110" Dec 09 11:35:59 crc kubenswrapper[5002]: I1209 11:35:59.607426 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 09 11:35:59 crc kubenswrapper[5002]: E1209 11:35:59.607967 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90e280fc-dcb3-4d43-828d-89c8abf26988" containerName="nova-cell1-conductor-conductor" Dec 09 11:35:59 crc kubenswrapper[5002]: I1209 11:35:59.607987 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="90e280fc-dcb3-4d43-828d-89c8abf26988" containerName="nova-cell1-conductor-conductor" Dec 09 11:35:59 crc kubenswrapper[5002]: I1209 11:35:59.608223 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="90e280fc-dcb3-4d43-828d-89c8abf26988" containerName="nova-cell1-conductor-conductor" Dec 09 11:35:59 crc kubenswrapper[5002]: I1209 11:35:59.609052 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 09 11:35:59 crc kubenswrapper[5002]: I1209 11:35:59.614189 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 09 11:35:59 crc kubenswrapper[5002]: I1209 11:35:59.617471 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 09 11:35:59 crc kubenswrapper[5002]: I1209 11:35:59.716800 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d9ef5f3-f7e3-435c-a053-f63dff9e5a09-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7d9ef5f3-f7e3-435c-a053-f63dff9e5a09\") " pod="openstack/nova-cell1-conductor-0" Dec 09 11:35:59 crc kubenswrapper[5002]: I1209 11:35:59.716988 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d9ef5f3-f7e3-435c-a053-f63dff9e5a09-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7d9ef5f3-f7e3-435c-a053-f63dff9e5a09\") " pod="openstack/nova-cell1-conductor-0" Dec 09 11:35:59 crc kubenswrapper[5002]: I1209 11:35:59.717094 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95h97\" (UniqueName: \"kubernetes.io/projected/7d9ef5f3-f7e3-435c-a053-f63dff9e5a09-kube-api-access-95h97\") pod \"nova-cell1-conductor-0\" (UID: \"7d9ef5f3-f7e3-435c-a053-f63dff9e5a09\") " pod="openstack/nova-cell1-conductor-0" Dec 09 11:35:59 crc kubenswrapper[5002]: I1209 11:35:59.765157 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:35:59 crc kubenswrapper[5002]: I1209 11:35:59.819347 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d9ef5f3-f7e3-435c-a053-f63dff9e5a09-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7d9ef5f3-f7e3-435c-a053-f63dff9e5a09\") " pod="openstack/nova-cell1-conductor-0" Dec 09 11:35:59 crc kubenswrapper[5002]: I1209 11:35:59.819412 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/7d9ef5f3-f7e3-435c-a053-f63dff9e5a09-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7d9ef5f3-f7e3-435c-a053-f63dff9e5a09\") " pod="openstack/nova-cell1-conductor-0" Dec 09 11:35:59 crc kubenswrapper[5002]: I1209 11:35:59.819483 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95h97\" (UniqueName: \"kubernetes.io/projected/7d9ef5f3-f7e3-435c-a053-f63dff9e5a09-kube-api-access-95h97\") pod \"nova-cell1-conductor-0\" (UID: \"7d9ef5f3-f7e3-435c-a053-f63dff9e5a09\") " pod="openstack/nova-cell1-conductor-0" Dec 09 11:35:59 crc kubenswrapper[5002]: I1209 11:35:59.823967 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d9ef5f3-f7e3-435c-a053-f63dff9e5a09-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7d9ef5f3-f7e3-435c-a053-f63dff9e5a09\") " pod="openstack/nova-cell1-conductor-0" Dec 09 11:35:59 crc kubenswrapper[5002]: I1209 11:35:59.824475 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d9ef5f3-f7e3-435c-a053-f63dff9e5a09-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7d9ef5f3-f7e3-435c-a053-f63dff9e5a09\") " pod="openstack/nova-cell1-conductor-0" Dec 09 11:35:59 crc kubenswrapper[5002]: I1209 11:35:59.842713 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95h97\" (UniqueName: \"kubernetes.io/projected/7d9ef5f3-f7e3-435c-a053-f63dff9e5a09-kube-api-access-95h97\") pod \"nova-cell1-conductor-0\" (UID: \"7d9ef5f3-f7e3-435c-a053-f63dff9e5a09\") " pod="openstack/nova-cell1-conductor-0" Dec 09 11:35:59 crc kubenswrapper[5002]: I1209 11:35:59.926337 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 09 11:36:00 crc kubenswrapper[5002]: I1209 11:36:00.084766 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90e280fc-dcb3-4d43-828d-89c8abf26988" path="/var/lib/kubelet/pods/90e280fc-dcb3-4d43-828d-89c8abf26988/volumes" Dec 09 11:36:00 crc kubenswrapper[5002]: I1209 11:36:00.427245 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 09 11:36:00 crc kubenswrapper[5002]: I1209 11:36:00.536387 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"7d9ef5f3-f7e3-435c-a053-f63dff9e5a09","Type":"ContainerStarted","Data":"59dfbed8097003e16b2f892d0f431bba75ded2ff53dd1f3352d8a6c94d85d267"} Dec 09 11:36:01 crc kubenswrapper[5002]: I1209 11:36:01.549166 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"7d9ef5f3-f7e3-435c-a053-f63dff9e5a09","Type":"ContainerStarted","Data":"4dcc8a09bf27504e0d7254ed726d01ff3d84b152bc832cf0f854262abe188c0f"} Dec 09 11:36:01 crc kubenswrapper[5002]: I1209 11:36:01.549549 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 09 11:36:01 crc kubenswrapper[5002]: I1209 11:36:01.581761 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.581740101 podStartE2EDuration="2.581740101s" podCreationTimestamp="2025-12-09 11:35:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:36:01.56749856 +0000 UTC m=+5693.959549651" watchObservedRunningTime="2025-12-09 11:36:01.581740101 +0000 UTC m=+5693.973791182" Dec 09 11:36:01 crc kubenswrapper[5002]: I1209 11:36:01.957246 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 09 11:36:02 crc kubenswrapper[5002]: I1209 11:36:02.878665 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 09 11:36:02 crc kubenswrapper[5002]: I1209 11:36:02.879054 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 09 11:36:04 crc kubenswrapper[5002]: I1209 11:36:04.765172 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:36:04 crc kubenswrapper[5002]: I1209 11:36:04.777125 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:36:04 crc kubenswrapper[5002]: I1209 11:36:04.807963 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 09 11:36:05 crc kubenswrapper[5002]: I1209 11:36:05.606197 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:36:06 crc kubenswrapper[5002]: I1209 11:36:06.957077 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 09 11:36:06 crc kubenswrapper[5002]: I1209 11:36:06.990430 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 09 11:36:07 crc kubenswrapper[5002]: I1209 11:36:07.226634 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 09 11:36:07 crc 
kubenswrapper[5002]: I1209 11:36:07.226722 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 09 11:36:07 crc kubenswrapper[5002]: I1209 11:36:07.676648 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 09 11:36:07 crc kubenswrapper[5002]: I1209 11:36:07.878862 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 09 11:36:07 crc kubenswrapper[5002]: I1209 11:36:07.878927 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 09 11:36:08 crc kubenswrapper[5002]: I1209 11:36:08.267038 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c46fe918-c220-4c57-812e-7a085b199362" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.78:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 09 11:36:08 crc kubenswrapper[5002]: I1209 11:36:08.308097 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c46fe918-c220-4c57-812e-7a085b199362" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.78:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 09 11:36:08 crc kubenswrapper[5002]: I1209 11:36:08.961057 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="dd771216-04c2-4469-9146-6a732a36843d" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.79:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 09 11:36:08 crc kubenswrapper[5002]: I1209 11:36:08.961058 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="dd771216-04c2-4469-9146-6a732a36843d" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.79:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 09 11:36:09 crc kubenswrapper[5002]: I1209 11:36:09.955220 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 09 11:36:11 crc kubenswrapper[5002]: I1209 11:36:11.082476 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 09 11:36:11 crc kubenswrapper[5002]: I1209 11:36:11.084866 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 09 11:36:11 crc kubenswrapper[5002]: I1209 11:36:11.086704 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 09 11:36:11 crc kubenswrapper[5002]: I1209 11:36:11.096993 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 09 11:36:11 crc kubenswrapper[5002]: I1209 11:36:11.134037 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bea6c83a-466a-43bd-b336-9328f839cf88-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"bea6c83a-466a-43bd-b336-9328f839cf88\") " pod="openstack/cinder-scheduler-0" Dec 09 11:36:11 crc kubenswrapper[5002]: I1209 11:36:11.134088 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65hk2\" (UniqueName: \"kubernetes.io/projected/bea6c83a-466a-43bd-b336-9328f839cf88-kube-api-access-65hk2\") pod \"cinder-scheduler-0\" (UID: \"bea6c83a-466a-43bd-b336-9328f839cf88\") " pod="openstack/cinder-scheduler-0" Dec 09 11:36:11 crc kubenswrapper[5002]: I1209 11:36:11.134173 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bea6c83a-466a-43bd-b336-9328f839cf88-config-data\") pod \"cinder-scheduler-0\" (UID: \"bea6c83a-466a-43bd-b336-9328f839cf88\") " pod="openstack/cinder-scheduler-0" Dec 09 11:36:11 crc kubenswrapper[5002]: I1209 11:36:11.134208 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bea6c83a-466a-43bd-b336-9328f839cf88-scripts\") pod \"cinder-scheduler-0\" (UID: \"bea6c83a-466a-43bd-b336-9328f839cf88\") " pod="openstack/cinder-scheduler-0" Dec 09 11:36:11 crc kubenswrapper[5002]: I1209 11:36:11.134256 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bea6c83a-466a-43bd-b336-9328f839cf88-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"bea6c83a-466a-43bd-b336-9328f839cf88\") " pod="openstack/cinder-scheduler-0" Dec 09 11:36:11 crc kubenswrapper[5002]: I1209 11:36:11.134380 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bea6c83a-466a-43bd-b336-9328f839cf88-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"bea6c83a-466a-43bd-b336-9328f839cf88\") " pod="openstack/cinder-scheduler-0" Dec 09 11:36:11 crc kubenswrapper[5002]: I1209 11:36:11.236322 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bea6c83a-466a-43bd-b336-9328f839cf88-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"bea6c83a-466a-43bd-b336-9328f839cf88\") " pod="openstack/cinder-scheduler-0" Dec 09 11:36:11 crc kubenswrapper[5002]: I1209 11:36:11.236406 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bea6c83a-466a-43bd-b336-9328f839cf88-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"bea6c83a-466a-43bd-b336-9328f839cf88\") " pod="openstack/cinder-scheduler-0" Dec 09 11:36:11 crc kubenswrapper[5002]: I1209 11:36:11.236457 5002 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-65hk2\" (UniqueName: \"kubernetes.io/projected/bea6c83a-466a-43bd-b336-9328f839cf88-kube-api-access-65hk2\") pod \"cinder-scheduler-0\" (UID: \"bea6c83a-466a-43bd-b336-9328f839cf88\") " pod="openstack/cinder-scheduler-0" Dec 09 11:36:11 crc kubenswrapper[5002]: I1209 11:36:11.236546 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bea6c83a-466a-43bd-b336-9328f839cf88-config-data\") pod \"cinder-scheduler-0\" (UID: \"bea6c83a-466a-43bd-b336-9328f839cf88\") " pod="openstack/cinder-scheduler-0" Dec 09 11:36:11 crc kubenswrapper[5002]: I1209 11:36:11.236593 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bea6c83a-466a-43bd-b336-9328f839cf88-scripts\") pod \"cinder-scheduler-0\" (UID: \"bea6c83a-466a-43bd-b336-9328f839cf88\") " pod="openstack/cinder-scheduler-0" Dec 09 11:36:11 crc kubenswrapper[5002]: I1209 11:36:11.236668 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bea6c83a-466a-43bd-b336-9328f839cf88-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"bea6c83a-466a-43bd-b336-9328f839cf88\") " pod="openstack/cinder-scheduler-0" Dec 09 11:36:11 crc kubenswrapper[5002]: I1209 11:36:11.237589 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bea6c83a-466a-43bd-b336-9328f839cf88-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"bea6c83a-466a-43bd-b336-9328f839cf88\") " pod="openstack/cinder-scheduler-0" Dec 09 11:36:11 crc kubenswrapper[5002]: I1209 11:36:11.244440 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bea6c83a-466a-43bd-b336-9328f839cf88-scripts\") pod \"cinder-scheduler-0\" (UID: \"bea6c83a-466a-43bd-b336-9328f839cf88\") " pod="openstack/cinder-scheduler-0" Dec 09 11:36:11 crc kubenswrapper[5002]: I1209 11:36:11.244730 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bea6c83a-466a-43bd-b336-9328f839cf88-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"bea6c83a-466a-43bd-b336-9328f839cf88\") " pod="openstack/cinder-scheduler-0" Dec 09 11:36:11 crc kubenswrapper[5002]: I1209 11:36:11.252579 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bea6c83a-466a-43bd-b336-9328f839cf88-config-data\") pod \"cinder-scheduler-0\" (UID: \"bea6c83a-466a-43bd-b336-9328f839cf88\") " pod="openstack/cinder-scheduler-0" Dec 09 11:36:11 crc kubenswrapper[5002]: I1209 11:36:11.253565 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bea6c83a-466a-43bd-b336-9328f839cf88-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"bea6c83a-466a-43bd-b336-9328f839cf88\") " pod="openstack/cinder-scheduler-0" Dec 09 11:36:11 crc kubenswrapper[5002]: I1209 11:36:11.261418 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65hk2\" (UniqueName: \"kubernetes.io/projected/bea6c83a-466a-43bd-b336-9328f839cf88-kube-api-access-65hk2\") pod \"cinder-scheduler-0\" (UID: \"bea6c83a-466a-43bd-b336-9328f839cf88\") " pod="openstack/cinder-scheduler-0" Dec 09 11:36:11 
crc kubenswrapper[5002]: I1209 11:36:11.412998 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 09 11:36:11 crc kubenswrapper[5002]: I1209 11:36:11.859369 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 09 11:36:11 crc kubenswrapper[5002]: W1209 11:36:11.873224 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbea6c83a_466a_43bd_b336_9328f839cf88.slice/crio-3a0afdb182a0cf27b47fd8e3b7f6d367accb3705e8a6a0d05c3d7cddefe6c8cd WatchSource:0}: Error finding container 3a0afdb182a0cf27b47fd8e3b7f6d367accb3705e8a6a0d05c3d7cddefe6c8cd: Status 404 returned error can't find the container with id 3a0afdb182a0cf27b47fd8e3b7f6d367accb3705e8a6a0d05c3d7cddefe6c8cd Dec 09 11:36:12 crc kubenswrapper[5002]: I1209 11:36:12.363696 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 09 11:36:12 crc kubenswrapper[5002]: I1209 11:36:12.364383 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="4b341517-4db7-4996-8aa3-d2bc4ad0ef6b" containerName="cinder-api-log" containerID="cri-o://b22170a9eb9d79cdeb24167950009ea054636ad8af84c361a819433e849048e9" gracePeriod=30 Dec 09 11:36:12 crc kubenswrapper[5002]: I1209 11:36:12.364488 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="4b341517-4db7-4996-8aa3-d2bc4ad0ef6b" containerName="cinder-api" containerID="cri-o://ec1a49143e2d278fc2d98442cce33d0a91681ed1f1f627b1a3ec982c442f83af" gracePeriod=30 Dec 09 11:36:12 crc kubenswrapper[5002]: I1209 11:36:12.730912 5002 generic.go:334] "Generic (PLEG): container finished" podID="4b341517-4db7-4996-8aa3-d2bc4ad0ef6b" containerID="b22170a9eb9d79cdeb24167950009ea054636ad8af84c361a819433e849048e9" exitCode=143 Dec 09 11:36:12 crc kubenswrapper[5002]: I1209 11:36:12.731175 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4b341517-4db7-4996-8aa3-d2bc4ad0ef6b","Type":"ContainerDied","Data":"b22170a9eb9d79cdeb24167950009ea054636ad8af84c361a819433e849048e9"} Dec 09 11:36:12 crc kubenswrapper[5002]: I1209 11:36:12.740802 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"bea6c83a-466a-43bd-b336-9328f839cf88","Type":"ContainerStarted","Data":"bf5268736e978035bf26b4117148146270ec502b4973ec90dc9f35d7781e241f"} Dec 09 11:36:12 crc kubenswrapper[5002]: I1209 11:36:12.740859 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"bea6c83a-466a-43bd-b336-9328f839cf88","Type":"ContainerStarted","Data":"3a0afdb182a0cf27b47fd8e3b7f6d367accb3705e8a6a0d05c3d7cddefe6c8cd"} Dec 09 11:36:12 crc kubenswrapper[5002]: I1209 11:36:12.988219 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 09 11:36:12 crc kubenswrapper[5002]: I1209 11:36:12.990251 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.000781 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.001889 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.096942 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/44fcf36e-889c-47c3-96a7-cdc8263c3a01-run\") pod \"cinder-volume-volume1-0\" (UID: \"44fcf36e-889c-47c3-96a7-cdc8263c3a01\") " pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.097039 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/44fcf36e-889c-47c3-96a7-cdc8263c3a01-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"44fcf36e-889c-47c3-96a7-cdc8263c3a01\") " pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.097156 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/44fcf36e-889c-47c3-96a7-cdc8263c3a01-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"44fcf36e-889c-47c3-96a7-cdc8263c3a01\") " pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.097472 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/44fcf36e-889c-47c3-96a7-cdc8263c3a01-dev\") pod \"cinder-volume-volume1-0\" (UID: \"44fcf36e-889c-47c3-96a7-cdc8263c3a01\") " pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.097508 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/44fcf36e-889c-47c3-96a7-cdc8263c3a01-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"44fcf36e-889c-47c3-96a7-cdc8263c3a01\") " pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.097544 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/44fcf36e-889c-47c3-96a7-cdc8263c3a01-sys\") pod \"cinder-volume-volume1-0\" (UID: \"44fcf36e-889c-47c3-96a7-cdc8263c3a01\") " pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.097587 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44fcf36e-889c-47c3-96a7-cdc8263c3a01-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"44fcf36e-889c-47c3-96a7-cdc8263c3a01\") " pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.097611 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/44fcf36e-889c-47c3-96a7-cdc8263c3a01-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"44fcf36e-889c-47c3-96a7-cdc8263c3a01\") " pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:13 crc 
kubenswrapper[5002]: I1209 11:36:13.097652 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/44fcf36e-889c-47c3-96a7-cdc8263c3a01-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"44fcf36e-889c-47c3-96a7-cdc8263c3a01\") " pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.097701 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/44fcf36e-889c-47c3-96a7-cdc8263c3a01-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"44fcf36e-889c-47c3-96a7-cdc8263c3a01\") " pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.097737 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/44fcf36e-889c-47c3-96a7-cdc8263c3a01-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"44fcf36e-889c-47c3-96a7-cdc8263c3a01\") " pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.097893 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htp5s\" (UniqueName: \"kubernetes.io/projected/44fcf36e-889c-47c3-96a7-cdc8263c3a01-kube-api-access-htp5s\") pod \"cinder-volume-volume1-0\" (UID: \"44fcf36e-889c-47c3-96a7-cdc8263c3a01\") " pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.097961 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44fcf36e-889c-47c3-96a7-cdc8263c3a01-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"44fcf36e-889c-47c3-96a7-cdc8263c3a01\") " pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.097997 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44fcf36e-889c-47c3-96a7-cdc8263c3a01-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"44fcf36e-889c-47c3-96a7-cdc8263c3a01\") " pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.098082 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/44fcf36e-889c-47c3-96a7-cdc8263c3a01-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"44fcf36e-889c-47c3-96a7-cdc8263c3a01\") " pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.098130 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/44fcf36e-889c-47c3-96a7-cdc8263c3a01-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"44fcf36e-889c-47c3-96a7-cdc8263c3a01\") " pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.200091 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/44fcf36e-889c-47c3-96a7-cdc8263c3a01-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"44fcf36e-889c-47c3-96a7-cdc8263c3a01\") " pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.200181 5002 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/44fcf36e-889c-47c3-96a7-cdc8263c3a01-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"44fcf36e-889c-47c3-96a7-cdc8263c3a01\") " pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.200209 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htp5s\" (UniqueName: \"kubernetes.io/projected/44fcf36e-889c-47c3-96a7-cdc8263c3a01-kube-api-access-htp5s\") pod \"cinder-volume-volume1-0\" (UID: \"44fcf36e-889c-47c3-96a7-cdc8263c3a01\") " pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.200235 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44fcf36e-889c-47c3-96a7-cdc8263c3a01-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"44fcf36e-889c-47c3-96a7-cdc8263c3a01\") " pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.200261 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44fcf36e-889c-47c3-96a7-cdc8263c3a01-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"44fcf36e-889c-47c3-96a7-cdc8263c3a01\") " pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.200293 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/44fcf36e-889c-47c3-96a7-cdc8263c3a01-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"44fcf36e-889c-47c3-96a7-cdc8263c3a01\") " pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.200329 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/44fcf36e-889c-47c3-96a7-cdc8263c3a01-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"44fcf36e-889c-47c3-96a7-cdc8263c3a01\") " pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.200383 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/44fcf36e-889c-47c3-96a7-cdc8263c3a01-run\") pod \"cinder-volume-volume1-0\" (UID: \"44fcf36e-889c-47c3-96a7-cdc8263c3a01\") " pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.200411 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/44fcf36e-889c-47c3-96a7-cdc8263c3a01-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"44fcf36e-889c-47c3-96a7-cdc8263c3a01\") " pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.200477 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/44fcf36e-889c-47c3-96a7-cdc8263c3a01-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"44fcf36e-889c-47c3-96a7-cdc8263c3a01\") " pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.200502 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/44fcf36e-889c-47c3-96a7-cdc8263c3a01-dev\") pod 
\"cinder-volume-volume1-0\" (UID: \"44fcf36e-889c-47c3-96a7-cdc8263c3a01\") " pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.200519 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/44fcf36e-889c-47c3-96a7-cdc8263c3a01-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"44fcf36e-889c-47c3-96a7-cdc8263c3a01\") " pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.200539 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/44fcf36e-889c-47c3-96a7-cdc8263c3a01-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"44fcf36e-889c-47c3-96a7-cdc8263c3a01\") " pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.200572 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/44fcf36e-889c-47c3-96a7-cdc8263c3a01-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"44fcf36e-889c-47c3-96a7-cdc8263c3a01\") " pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.200601 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/44fcf36e-889c-47c3-96a7-cdc8263c3a01-sys\") pod \"cinder-volume-volume1-0\" (UID: \"44fcf36e-889c-47c3-96a7-cdc8263c3a01\") " pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.200656 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44fcf36e-889c-47c3-96a7-cdc8263c3a01-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"44fcf36e-889c-47c3-96a7-cdc8263c3a01\") " pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.200687 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/44fcf36e-889c-47c3-96a7-cdc8263c3a01-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"44fcf36e-889c-47c3-96a7-cdc8263c3a01\") " pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.200689 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/44fcf36e-889c-47c3-96a7-cdc8263c3a01-sys\") pod \"cinder-volume-volume1-0\" (UID: \"44fcf36e-889c-47c3-96a7-cdc8263c3a01\") " pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.200737 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/44fcf36e-889c-47c3-96a7-cdc8263c3a01-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"44fcf36e-889c-47c3-96a7-cdc8263c3a01\") " pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.200795 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/44fcf36e-889c-47c3-96a7-cdc8263c3a01-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"44fcf36e-889c-47c3-96a7-cdc8263c3a01\") " pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.200843 5002 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/44fcf36e-889c-47c3-96a7-cdc8263c3a01-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"44fcf36e-889c-47c3-96a7-cdc8263c3a01\") " pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.200950 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/44fcf36e-889c-47c3-96a7-cdc8263c3a01-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"44fcf36e-889c-47c3-96a7-cdc8263c3a01\") " pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.200987 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/44fcf36e-889c-47c3-96a7-cdc8263c3a01-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"44fcf36e-889c-47c3-96a7-cdc8263c3a01\") " pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.201132 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/44fcf36e-889c-47c3-96a7-cdc8263c3a01-dev\") pod \"cinder-volume-volume1-0\" (UID: \"44fcf36e-889c-47c3-96a7-cdc8263c3a01\") " pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.201198 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/44fcf36e-889c-47c3-96a7-cdc8263c3a01-run\") pod \"cinder-volume-volume1-0\" (UID: \"44fcf36e-889c-47c3-96a7-cdc8263c3a01\") " pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.201336 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/44fcf36e-889c-47c3-96a7-cdc8263c3a01-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"44fcf36e-889c-47c3-96a7-cdc8263c3a01\") " pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.204868 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44fcf36e-889c-47c3-96a7-cdc8263c3a01-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"44fcf36e-889c-47c3-96a7-cdc8263c3a01\") " pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.205541 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44fcf36e-889c-47c3-96a7-cdc8263c3a01-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"44fcf36e-889c-47c3-96a7-cdc8263c3a01\") " pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.205548 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/44fcf36e-889c-47c3-96a7-cdc8263c3a01-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"44fcf36e-889c-47c3-96a7-cdc8263c3a01\") " pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.208897 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/44fcf36e-889c-47c3-96a7-cdc8263c3a01-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"44fcf36e-889c-47c3-96a7-cdc8263c3a01\") " pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:13 
crc kubenswrapper[5002]: I1209 11:36:13.211444 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44fcf36e-889c-47c3-96a7-cdc8263c3a01-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"44fcf36e-889c-47c3-96a7-cdc8263c3a01\") " pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.222770 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htp5s\" (UniqueName: \"kubernetes.io/projected/44fcf36e-889c-47c3-96a7-cdc8263c3a01-kube-api-access-htp5s\") pod \"cinder-volume-volume1-0\" (UID: \"44fcf36e-889c-47c3-96a7-cdc8263c3a01\") " pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.308471 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.697548 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.700035 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.701950 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.714525 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.765701 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"bea6c83a-466a-43bd-b336-9328f839cf88","Type":"ContainerStarted","Data":"7305b27c755f2e22a8faa6ffde78e3cb7bd0e4583b4b8fbca1cfe085a506d986"} Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.788177 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.788158219 podStartE2EDuration="2.788158219s" podCreationTimestamp="2025-12-09 11:36:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:36:13.781423529 +0000 UTC m=+5706.173474620" watchObservedRunningTime="2025-12-09 11:36:13.788158219 +0000 UTC m=+5706.180209300" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.812486 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f\") " pod="openstack/cinder-backup-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.812539 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f\") " pod="openstack/cinder-backup-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.812561 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f-sys\") pod \"cinder-backup-0\" (UID: \"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f\") " pod="openstack/cinder-backup-0" Dec 09 11:36:13 crc 
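
The run of records above shows the kubelet's two-phase volume handling for openstack/cinder-volume-volume1-0: reconciler_common.go:218 emits "operationExecutor.MountVolume started" as the desired-state reconciler notices each volume that is not yet mounted, and operation_generator.go:637 emits "MountVolume.SetUp succeeded" once the corresponding operation finishes. The host-path volumes (sys, dev, etc-machine-id, lib-modules) succeed within the same millisecond they start, while the secret and projected volumes (combined-ca-bundle, config-data, scripts, ceph, kube-api-access-htp5s) trail by several milliseconds, since their contents are materialized from API objects. A minimal sketch of that started/succeeded pattern, with invented types and a stubbed SetUp rather than kubelet's actual plumbing:

```go
// Schematic model of the started/succeeded pairs above -- not kubelet source.
// Volume names and plugin strings are copied from the log; the types, the
// reconcile function and the stubbed setUp are invented for illustration.
package main

import (
	"fmt"
	"sync"
)

type volume struct {
	name   string // e.g. "etc-machine-id"
	plugin string // e.g. "kubernetes.io/host-path"
}

// reconcile compares the pod's desired volumes against what is already
// mounted and kicks off one asynchronous SetUp per missing volume, mirroring
// "operationExecutor.MountVolume started" followed (later, per volume) by
// "MountVolume.SetUp succeeded".
func reconcile(pod string, desired []volume, mounted map[string]bool, setUp func(volume) error) {
	var wg sync.WaitGroup
	for _, v := range desired {
		if mounted[v.name] {
			continue // actual state already matches desired state
		}
		fmt.Printf("operationExecutor.MountVolume started for volume %q pod=%q\n", v.name, pod)
		wg.Add(1)
		go func(v volume) { // one operation per volume, run concurrently
			defer wg.Done()
			if err := setUp(v); err != nil {
				fmt.Printf("MountVolume.SetUp failed for volume %q: %v\n", v.name, err)
				return
			}
			fmt.Printf("MountVolume.SetUp succeeded for volume %q pod=%q\n", v.name, pod)
		}(v)
	}
	wg.Wait() // the real reconciler loops forever instead of waiting
}

func main() {
	desired := []volume{
		{"etc-machine-id", "kubernetes.io/host-path"},
		{"config-data", "kubernetes.io/secret"},
		{"kube-api-access-htp5s", "kubernetes.io/projected"},
	}
	// A bind mount or secret write would happen here; stubbed to always succeed.
	reconcile("openstack/cinder-volume-volume1-0", desired, map[string]bool{}, func(volume) error { return nil })
}
```

Running each SetUp on its own goroutine reproduces the interleaving above, where "started" lines for several volumes appear before the first "succeeded" line for any of them.
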
kubenswrapper[5002]: I1209 11:36:13.812644 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f\") " pod="openstack/cinder-backup-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.812712 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f-dev\") pod \"cinder-backup-0\" (UID: \"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f\") " pod="openstack/cinder-backup-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.812739 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f\") " pod="openstack/cinder-backup-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.812775 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f-ceph\") pod \"cinder-backup-0\" (UID: \"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f\") " pod="openstack/cinder-backup-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.812837 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96zbn\" (UniqueName: \"kubernetes.io/projected/494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f-kube-api-access-96zbn\") pod \"cinder-backup-0\" (UID: \"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f\") " pod="openstack/cinder-backup-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.812868 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f-config-data-custom\") pod \"cinder-backup-0\" (UID: \"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f\") " pod="openstack/cinder-backup-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.812890 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f-run\") pod \"cinder-backup-0\" (UID: \"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f\") " pod="openstack/cinder-backup-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.813009 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f\") " pod="openstack/cinder-backup-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.813127 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f-scripts\") pod \"cinder-backup-0\" (UID: \"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f\") " pod="openstack/cinder-backup-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.813166 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" 
(UniqueName: \"kubernetes.io/host-path/494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f\") " pod="openstack/cinder-backup-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.813204 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f-config-data\") pod \"cinder-backup-0\" (UID: \"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f\") " pod="openstack/cinder-backup-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.813271 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f-etc-nvme\") pod \"cinder-backup-0\" (UID: \"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f\") " pod="openstack/cinder-backup-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.813385 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f-lib-modules\") pod \"cinder-backup-0\" (UID: \"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f\") " pod="openstack/cinder-backup-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.915205 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f-config-data-custom\") pod \"cinder-backup-0\" (UID: \"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f\") " pod="openstack/cinder-backup-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.915557 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f-run\") pod \"cinder-backup-0\" (UID: \"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f\") " pod="openstack/cinder-backup-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.915616 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f\") " pod="openstack/cinder-backup-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.915658 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f-scripts\") pod \"cinder-backup-0\" (UID: \"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f\") " pod="openstack/cinder-backup-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.915675 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f\") " pod="openstack/cinder-backup-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.915706 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f-config-data\") pod \"cinder-backup-0\" (UID: \"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f\") " pod="openstack/cinder-backup-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.915713 5002 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f-run\") pod \"cinder-backup-0\" (UID: \"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f\") " pod="openstack/cinder-backup-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.915730 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f-etc-nvme\") pod \"cinder-backup-0\" (UID: \"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f\") " pod="openstack/cinder-backup-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.915775 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f-etc-nvme\") pod \"cinder-backup-0\" (UID: \"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f\") " pod="openstack/cinder-backup-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.915842 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f\") " pod="openstack/cinder-backup-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.915893 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f-lib-modules\") pod \"cinder-backup-0\" (UID: \"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f\") " pod="openstack/cinder-backup-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.915938 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f\") " pod="openstack/cinder-backup-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.916000 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f\") " pod="openstack/cinder-backup-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.916054 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f-sys\") pod \"cinder-backup-0\" (UID: \"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f\") " pod="openstack/cinder-backup-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.916130 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f\") " pod="openstack/cinder-backup-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.916258 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f-dev\") pod \"cinder-backup-0\" (UID: \"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f\") " pod="openstack/cinder-backup-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.916291 5002 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f\") " pod="openstack/cinder-backup-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.916314 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f-sys\") pod \"cinder-backup-0\" (UID: \"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f\") " pod="openstack/cinder-backup-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.916344 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f-ceph\") pod \"cinder-backup-0\" (UID: \"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f\") " pod="openstack/cinder-backup-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.916613 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f-dev\") pod \"cinder-backup-0\" (UID: \"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f\") " pod="openstack/cinder-backup-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.917086 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f\") " pod="openstack/cinder-backup-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.917161 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96zbn\" (UniqueName: \"kubernetes.io/projected/494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f-kube-api-access-96zbn\") pod \"cinder-backup-0\" (UID: \"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f\") " pod="openstack/cinder-backup-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.917494 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f\") " pod="openstack/cinder-backup-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.917521 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f\") " pod="openstack/cinder-backup-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.917542 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f-lib-modules\") pod \"cinder-backup-0\" (UID: \"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f\") " pod="openstack/cinder-backup-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.917660 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f\") " pod="openstack/cinder-backup-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.921135 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f\") " pod="openstack/cinder-backup-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.924906 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f-config-data-custom\") pod \"cinder-backup-0\" (UID: \"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f\") " pod="openstack/cinder-backup-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.925706 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f-scripts\") pod \"cinder-backup-0\" (UID: \"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f\") " pod="openstack/cinder-backup-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.926104 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f-ceph\") pod \"cinder-backup-0\" (UID: \"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f\") " pod="openstack/cinder-backup-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.929180 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f-config-data\") pod \"cinder-backup-0\" (UID: \"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f\") " pod="openstack/cinder-backup-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.934724 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96zbn\" (UniqueName: \"kubernetes.io/projected/494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f-kube-api-access-96zbn\") pod \"cinder-backup-0\" (UID: \"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f\") " pod="openstack/cinder-backup-0" Dec 09 11:36:13 crc kubenswrapper[5002]: I1209 11:36:13.937739 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 09 11:36:14 crc kubenswrapper[5002]: I1209 11:36:14.036908 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Dec 09 11:36:14 crc kubenswrapper[5002]: I1209 11:36:14.621445 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Dec 09 11:36:14 crc kubenswrapper[5002]: I1209 11:36:14.774783 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f","Type":"ContainerStarted","Data":"f320f6f869bb8d3598f99aa97e9a87a8e4613456df852a3273b8108c6d0325e8"} Dec 09 11:36:14 crc kubenswrapper[5002]: I1209 11:36:14.776538 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"44fcf36e-889c-47c3-96a7-cdc8263c3a01","Type":"ContainerStarted","Data":"b984d8b3788d7f99fdc1ee7bbaad1de3c920137d1e07b805d32cf53324627ece"} Dec 09 11:36:15 crc kubenswrapper[5002]: I1209 11:36:15.977369 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="4b341517-4db7-4996-8aa3-d2bc4ad0ef6b" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.1.74:8776/healthcheck\": dial tcp 10.217.1.74:8776: connect: connection refused" Dec 09 11:36:16 crc kubenswrapper[5002]: I1209 11:36:16.413122 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 09 11:36:16 crc kubenswrapper[5002]: I1209 11:36:16.652288 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 09 11:36:16 crc kubenswrapper[5002]: I1209 11:36:16.774564 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b341517-4db7-4996-8aa3-d2bc4ad0ef6b-combined-ca-bundle\") pod \"4b341517-4db7-4996-8aa3-d2bc4ad0ef6b\" (UID: \"4b341517-4db7-4996-8aa3-d2bc4ad0ef6b\") " Dec 09 11:36:16 crc kubenswrapper[5002]: I1209 11:36:16.774628 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4b341517-4db7-4996-8aa3-d2bc4ad0ef6b-etc-machine-id\") pod \"4b341517-4db7-4996-8aa3-d2bc4ad0ef6b\" (UID: \"4b341517-4db7-4996-8aa3-d2bc4ad0ef6b\") " Dec 09 11:36:16 crc kubenswrapper[5002]: I1209 11:36:16.774712 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b341517-4db7-4996-8aa3-d2bc4ad0ef6b-logs\") pod \"4b341517-4db7-4996-8aa3-d2bc4ad0ef6b\" (UID: \"4b341517-4db7-4996-8aa3-d2bc4ad0ef6b\") " Dec 09 11:36:16 crc kubenswrapper[5002]: I1209 11:36:16.774824 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4b341517-4db7-4996-8aa3-d2bc4ad0ef6b-config-data-custom\") pod \"4b341517-4db7-4996-8aa3-d2bc4ad0ef6b\" (UID: \"4b341517-4db7-4996-8aa3-d2bc4ad0ef6b\") " Dec 09 11:36:16 crc kubenswrapper[5002]: I1209 11:36:16.774854 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjnbl\" (UniqueName: \"kubernetes.io/projected/4b341517-4db7-4996-8aa3-d2bc4ad0ef6b-kube-api-access-bjnbl\") pod \"4b341517-4db7-4996-8aa3-d2bc4ad0ef6b\" (UID: \"4b341517-4db7-4996-8aa3-d2bc4ad0ef6b\") " Dec 09 11:36:16 crc kubenswrapper[5002]: I1209 11:36:16.774911 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b341517-4db7-4996-8aa3-d2bc4ad0ef6b-scripts\") pod 
\"4b341517-4db7-4996-8aa3-d2bc4ad0ef6b\" (UID: \"4b341517-4db7-4996-8aa3-d2bc4ad0ef6b\") " Dec 09 11:36:16 crc kubenswrapper[5002]: I1209 11:36:16.774972 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b341517-4db7-4996-8aa3-d2bc4ad0ef6b-config-data\") pod \"4b341517-4db7-4996-8aa3-d2bc4ad0ef6b\" (UID: \"4b341517-4db7-4996-8aa3-d2bc4ad0ef6b\") " Dec 09 11:36:16 crc kubenswrapper[5002]: I1209 11:36:16.776114 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4b341517-4db7-4996-8aa3-d2bc4ad0ef6b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "4b341517-4db7-4996-8aa3-d2bc4ad0ef6b" (UID: "4b341517-4db7-4996-8aa3-d2bc4ad0ef6b"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:36:16 crc kubenswrapper[5002]: I1209 11:36:16.777488 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b341517-4db7-4996-8aa3-d2bc4ad0ef6b-logs" (OuterVolumeSpecName: "logs") pod "4b341517-4db7-4996-8aa3-d2bc4ad0ef6b" (UID: "4b341517-4db7-4996-8aa3-d2bc4ad0ef6b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:36:16 crc kubenswrapper[5002]: I1209 11:36:16.780685 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b341517-4db7-4996-8aa3-d2bc4ad0ef6b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4b341517-4db7-4996-8aa3-d2bc4ad0ef6b" (UID: "4b341517-4db7-4996-8aa3-d2bc4ad0ef6b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:36:16 crc kubenswrapper[5002]: I1209 11:36:16.781352 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b341517-4db7-4996-8aa3-d2bc4ad0ef6b-scripts" (OuterVolumeSpecName: "scripts") pod "4b341517-4db7-4996-8aa3-d2bc4ad0ef6b" (UID: "4b341517-4db7-4996-8aa3-d2bc4ad0ef6b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:36:16 crc kubenswrapper[5002]: I1209 11:36:16.781672 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b341517-4db7-4996-8aa3-d2bc4ad0ef6b-kube-api-access-bjnbl" (OuterVolumeSpecName: "kube-api-access-bjnbl") pod "4b341517-4db7-4996-8aa3-d2bc4ad0ef6b" (UID: "4b341517-4db7-4996-8aa3-d2bc4ad0ef6b"). InnerVolumeSpecName "kube-api-access-bjnbl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:36:16 crc kubenswrapper[5002]: I1209 11:36:16.800579 5002 generic.go:334] "Generic (PLEG): container finished" podID="4b341517-4db7-4996-8aa3-d2bc4ad0ef6b" containerID="ec1a49143e2d278fc2d98442cce33d0a91681ed1f1f627b1a3ec982c442f83af" exitCode=0 Dec 09 11:36:16 crc kubenswrapper[5002]: I1209 11:36:16.800636 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4b341517-4db7-4996-8aa3-d2bc4ad0ef6b","Type":"ContainerDied","Data":"ec1a49143e2d278fc2d98442cce33d0a91681ed1f1f627b1a3ec982c442f83af"} Dec 09 11:36:16 crc kubenswrapper[5002]: I1209 11:36:16.800672 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4b341517-4db7-4996-8aa3-d2bc4ad0ef6b","Type":"ContainerDied","Data":"a86837081964b41e343bcd933e880b5ac05cb4087ef7bbe140cb8aee26bcb0e0"} Dec 09 11:36:16 crc kubenswrapper[5002]: I1209 11:36:16.800694 5002 scope.go:117] "RemoveContainer" containerID="ec1a49143e2d278fc2d98442cce33d0a91681ed1f1f627b1a3ec982c442f83af" Dec 09 11:36:16 crc kubenswrapper[5002]: I1209 11:36:16.800917 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 09 11:36:16 crc kubenswrapper[5002]: I1209 11:36:16.824717 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b341517-4db7-4996-8aa3-d2bc4ad0ef6b-config-data" (OuterVolumeSpecName: "config-data") pod "4b341517-4db7-4996-8aa3-d2bc4ad0ef6b" (UID: "4b341517-4db7-4996-8aa3-d2bc4ad0ef6b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:36:16 crc kubenswrapper[5002]: I1209 11:36:16.831417 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b341517-4db7-4996-8aa3-d2bc4ad0ef6b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b341517-4db7-4996-8aa3-d2bc4ad0ef6b" (UID: "4b341517-4db7-4996-8aa3-d2bc4ad0ef6b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:36:16 crc kubenswrapper[5002]: I1209 11:36:16.877246 5002 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4b341517-4db7-4996-8aa3-d2bc4ad0ef6b-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 09 11:36:16 crc kubenswrapper[5002]: I1209 11:36:16.877293 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjnbl\" (UniqueName: \"kubernetes.io/projected/4b341517-4db7-4996-8aa3-d2bc4ad0ef6b-kube-api-access-bjnbl\") on node \"crc\" DevicePath \"\"" Dec 09 11:36:16 crc kubenswrapper[5002]: I1209 11:36:16.877306 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b341517-4db7-4996-8aa3-d2bc4ad0ef6b-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:36:16 crc kubenswrapper[5002]: I1209 11:36:16.877320 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b341517-4db7-4996-8aa3-d2bc4ad0ef6b-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:36:16 crc kubenswrapper[5002]: I1209 11:36:16.877334 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b341517-4db7-4996-8aa3-d2bc4ad0ef6b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:36:16 crc kubenswrapper[5002]: I1209 11:36:16.877345 5002 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4b341517-4db7-4996-8aa3-d2bc4ad0ef6b-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 09 11:36:16 crc kubenswrapper[5002]: I1209 11:36:16.877357 5002 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b341517-4db7-4996-8aa3-d2bc4ad0ef6b-logs\") on node \"crc\" DevicePath \"\"" Dec 09 11:36:16 crc kubenswrapper[5002]: I1209 11:36:16.918389 5002 scope.go:117] "RemoveContainer" containerID="b22170a9eb9d79cdeb24167950009ea054636ad8af84c361a819433e849048e9" Dec 09 11:36:16 crc kubenswrapper[5002]: I1209 11:36:16.939751 5002 scope.go:117] "RemoveContainer" containerID="ec1a49143e2d278fc2d98442cce33d0a91681ed1f1f627b1a3ec982c442f83af" Dec 09 11:36:16 crc kubenswrapper[5002]: E1209 11:36:16.941948 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec1a49143e2d278fc2d98442cce33d0a91681ed1f1f627b1a3ec982c442f83af\": container with ID starting with ec1a49143e2d278fc2d98442cce33d0a91681ed1f1f627b1a3ec982c442f83af not found: ID does not exist" containerID="ec1a49143e2d278fc2d98442cce33d0a91681ed1f1f627b1a3ec982c442f83af" Dec 09 11:36:16 crc kubenswrapper[5002]: I1209 11:36:16.941988 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec1a49143e2d278fc2d98442cce33d0a91681ed1f1f627b1a3ec982c442f83af"} err="failed to get container status \"ec1a49143e2d278fc2d98442cce33d0a91681ed1f1f627b1a3ec982c442f83af\": rpc error: code = NotFound desc = could not find container \"ec1a49143e2d278fc2d98442cce33d0a91681ed1f1f627b1a3ec982c442f83af\": container with ID starting with ec1a49143e2d278fc2d98442cce33d0a91681ed1f1f627b1a3ec982c442f83af not found: ID does not exist" Dec 09 11:36:16 crc kubenswrapper[5002]: I1209 11:36:16.942037 5002 scope.go:117] "RemoveContainer" containerID="b22170a9eb9d79cdeb24167950009ea054636ad8af84c361a819433e849048e9" Dec 09 11:36:16 crc kubenswrapper[5002]: E1209 
11:36:16.942593 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b22170a9eb9d79cdeb24167950009ea054636ad8af84c361a819433e849048e9\": container with ID starting with b22170a9eb9d79cdeb24167950009ea054636ad8af84c361a819433e849048e9 not found: ID does not exist" containerID="b22170a9eb9d79cdeb24167950009ea054636ad8af84c361a819433e849048e9" Dec 09 11:36:16 crc kubenswrapper[5002]: I1209 11:36:16.942615 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b22170a9eb9d79cdeb24167950009ea054636ad8af84c361a819433e849048e9"} err="failed to get container status \"b22170a9eb9d79cdeb24167950009ea054636ad8af84c361a819433e849048e9\": rpc error: code = NotFound desc = could not find container \"b22170a9eb9d79cdeb24167950009ea054636ad8af84c361a819433e849048e9\": container with ID starting with b22170a9eb9d79cdeb24167950009ea054636ad8af84c361a819433e849048e9 not found: ID does not exist" Dec 09 11:36:17 crc kubenswrapper[5002]: I1209 11:36:17.135420 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 09 11:36:17 crc kubenswrapper[5002]: I1209 11:36:17.145233 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 09 11:36:17 crc kubenswrapper[5002]: I1209 11:36:17.156943 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 09 11:36:17 crc kubenswrapper[5002]: E1209 11:36:17.157331 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b341517-4db7-4996-8aa3-d2bc4ad0ef6b" containerName="cinder-api" Dec 09 11:36:17 crc kubenswrapper[5002]: I1209 11:36:17.157351 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b341517-4db7-4996-8aa3-d2bc4ad0ef6b" containerName="cinder-api" Dec 09 11:36:17 crc kubenswrapper[5002]: E1209 11:36:17.157364 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b341517-4db7-4996-8aa3-d2bc4ad0ef6b" containerName="cinder-api-log" Dec 09 11:36:17 crc kubenswrapper[5002]: I1209 11:36:17.157373 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b341517-4db7-4996-8aa3-d2bc4ad0ef6b" containerName="cinder-api-log" Dec 09 11:36:17 crc kubenswrapper[5002]: I1209 11:36:17.157595 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b341517-4db7-4996-8aa3-d2bc4ad0ef6b" containerName="cinder-api" Dec 09 11:36:17 crc kubenswrapper[5002]: I1209 11:36:17.157626 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b341517-4db7-4996-8aa3-d2bc4ad0ef6b" containerName="cinder-api-log" Dec 09 11:36:17 crc kubenswrapper[5002]: I1209 11:36:17.159592 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 09 11:36:17 crc kubenswrapper[5002]: I1209 11:36:17.161521 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 09 11:36:17 crc kubenswrapper[5002]: I1209 11:36:17.174887 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 09 11:36:17 crc kubenswrapper[5002]: I1209 11:36:17.231081 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 09 11:36:17 crc kubenswrapper[5002]: I1209 11:36:17.232666 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 09 11:36:17 crc kubenswrapper[5002]: I1209 11:36:17.234102 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 09 11:36:17 crc kubenswrapper[5002]: I1209 11:36:17.236742 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 09 11:36:17 crc kubenswrapper[5002]: I1209 11:36:17.283948 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/89609918-9011-4ef3-82eb-d8f791fc2979-etc-machine-id\") pod \"cinder-api-0\" (UID: \"89609918-9011-4ef3-82eb-d8f791fc2979\") " pod="openstack/cinder-api-0" Dec 09 11:36:17 crc kubenswrapper[5002]: I1209 11:36:17.284017 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89609918-9011-4ef3-82eb-d8f791fc2979-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"89609918-9011-4ef3-82eb-d8f791fc2979\") " pod="openstack/cinder-api-0" Dec 09 11:36:17 crc kubenswrapper[5002]: I1209 11:36:17.284048 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/89609918-9011-4ef3-82eb-d8f791fc2979-config-data-custom\") pod \"cinder-api-0\" (UID: \"89609918-9011-4ef3-82eb-d8f791fc2979\") " pod="openstack/cinder-api-0" Dec 09 11:36:17 crc kubenswrapper[5002]: I1209 11:36:17.284087 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89609918-9011-4ef3-82eb-d8f791fc2979-logs\") pod \"cinder-api-0\" (UID: \"89609918-9011-4ef3-82eb-d8f791fc2979\") " pod="openstack/cinder-api-0" Dec 09 11:36:17 crc kubenswrapper[5002]: I1209 11:36:17.284105 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89609918-9011-4ef3-82eb-d8f791fc2979-config-data\") pod \"cinder-api-0\" (UID: \"89609918-9011-4ef3-82eb-d8f791fc2979\") " pod="openstack/cinder-api-0" Dec 09 11:36:17 crc kubenswrapper[5002]: I1209 11:36:17.284251 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nq9h\" (UniqueName: \"kubernetes.io/projected/89609918-9011-4ef3-82eb-d8f791fc2979-kube-api-access-4nq9h\") pod \"cinder-api-0\" (UID: \"89609918-9011-4ef3-82eb-d8f791fc2979\") " pod="openstack/cinder-api-0" Dec 09 11:36:17 crc kubenswrapper[5002]: I1209 11:36:17.284534 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89609918-9011-4ef3-82eb-d8f791fc2979-scripts\") pod \"cinder-api-0\" 
(UID: \"89609918-9011-4ef3-82eb-d8f791fc2979\") " pod="openstack/cinder-api-0" Dec 09 11:36:17 crc kubenswrapper[5002]: I1209 11:36:17.386907 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/89609918-9011-4ef3-82eb-d8f791fc2979-etc-machine-id\") pod \"cinder-api-0\" (UID: \"89609918-9011-4ef3-82eb-d8f791fc2979\") " pod="openstack/cinder-api-0" Dec 09 11:36:17 crc kubenswrapper[5002]: I1209 11:36:17.387097 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/89609918-9011-4ef3-82eb-d8f791fc2979-etc-machine-id\") pod \"cinder-api-0\" (UID: \"89609918-9011-4ef3-82eb-d8f791fc2979\") " pod="openstack/cinder-api-0" Dec 09 11:36:17 crc kubenswrapper[5002]: I1209 11:36:17.387457 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89609918-9011-4ef3-82eb-d8f791fc2979-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"89609918-9011-4ef3-82eb-d8f791fc2979\") " pod="openstack/cinder-api-0" Dec 09 11:36:17 crc kubenswrapper[5002]: I1209 11:36:17.387596 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/89609918-9011-4ef3-82eb-d8f791fc2979-config-data-custom\") pod \"cinder-api-0\" (UID: \"89609918-9011-4ef3-82eb-d8f791fc2979\") " pod="openstack/cinder-api-0" Dec 09 11:36:17 crc kubenswrapper[5002]: I1209 11:36:17.387634 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89609918-9011-4ef3-82eb-d8f791fc2979-logs\") pod \"cinder-api-0\" (UID: \"89609918-9011-4ef3-82eb-d8f791fc2979\") " pod="openstack/cinder-api-0" Dec 09 11:36:17 crc kubenswrapper[5002]: I1209 11:36:17.387655 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89609918-9011-4ef3-82eb-d8f791fc2979-config-data\") pod \"cinder-api-0\" (UID: \"89609918-9011-4ef3-82eb-d8f791fc2979\") " pod="openstack/cinder-api-0" Dec 09 11:36:17 crc kubenswrapper[5002]: I1209 11:36:17.387701 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nq9h\" (UniqueName: \"kubernetes.io/projected/89609918-9011-4ef3-82eb-d8f791fc2979-kube-api-access-4nq9h\") pod \"cinder-api-0\" (UID: \"89609918-9011-4ef3-82eb-d8f791fc2979\") " pod="openstack/cinder-api-0" Dec 09 11:36:17 crc kubenswrapper[5002]: I1209 11:36:17.387918 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89609918-9011-4ef3-82eb-d8f791fc2979-scripts\") pod \"cinder-api-0\" (UID: \"89609918-9011-4ef3-82eb-d8f791fc2979\") " pod="openstack/cinder-api-0" Dec 09 11:36:17 crc kubenswrapper[5002]: I1209 11:36:17.388602 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89609918-9011-4ef3-82eb-d8f791fc2979-logs\") pod \"cinder-api-0\" (UID: \"89609918-9011-4ef3-82eb-d8f791fc2979\") " pod="openstack/cinder-api-0" Dec 09 11:36:17 crc kubenswrapper[5002]: I1209 11:36:17.391083 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89609918-9011-4ef3-82eb-d8f791fc2979-combined-ca-bundle\") pod \"cinder-api-0\" (UID: 
\"89609918-9011-4ef3-82eb-d8f791fc2979\") " pod="openstack/cinder-api-0" Dec 09 11:36:17 crc kubenswrapper[5002]: I1209 11:36:17.393620 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89609918-9011-4ef3-82eb-d8f791fc2979-scripts\") pod \"cinder-api-0\" (UID: \"89609918-9011-4ef3-82eb-d8f791fc2979\") " pod="openstack/cinder-api-0" Dec 09 11:36:17 crc kubenswrapper[5002]: I1209 11:36:17.393860 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/89609918-9011-4ef3-82eb-d8f791fc2979-config-data-custom\") pod \"cinder-api-0\" (UID: \"89609918-9011-4ef3-82eb-d8f791fc2979\") " pod="openstack/cinder-api-0" Dec 09 11:36:17 crc kubenswrapper[5002]: I1209 11:36:17.403357 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89609918-9011-4ef3-82eb-d8f791fc2979-config-data\") pod \"cinder-api-0\" (UID: \"89609918-9011-4ef3-82eb-d8f791fc2979\") " pod="openstack/cinder-api-0" Dec 09 11:36:17 crc kubenswrapper[5002]: I1209 11:36:17.412610 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nq9h\" (UniqueName: \"kubernetes.io/projected/89609918-9011-4ef3-82eb-d8f791fc2979-kube-api-access-4nq9h\") pod \"cinder-api-0\" (UID: \"89609918-9011-4ef3-82eb-d8f791fc2979\") " pod="openstack/cinder-api-0" Dec 09 11:36:17 crc kubenswrapper[5002]: I1209 11:36:17.482248 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 09 11:36:17 crc kubenswrapper[5002]: I1209 11:36:17.810074 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f","Type":"ContainerStarted","Data":"b4d17970977f2c43ebd2d9db5b6604849f1613246f381e960cac76df305ea3f7"} Dec 09 11:36:17 crc kubenswrapper[5002]: I1209 11:36:17.810434 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f","Type":"ContainerStarted","Data":"6112f9a5fd92770608cdf6f7355cc22eb7fdbc0c86fc8ed91828f11b1d3b8de1"} Dec 09 11:36:17 crc kubenswrapper[5002]: I1209 11:36:17.812766 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"44fcf36e-889c-47c3-96a7-cdc8263c3a01","Type":"ContainerStarted","Data":"eb8c9886c12fce6821d61174aaa473b4eaee6381ce39d506b738c9620de9bcd4"} Dec 09 11:36:17 crc kubenswrapper[5002]: I1209 11:36:17.812796 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"44fcf36e-889c-47c3-96a7-cdc8263c3a01","Type":"ContainerStarted","Data":"86c0a44d4707303a89d38b8030c569c43e8ff7c5161ce8d77a14f90b26f682a9"} Dec 09 11:36:17 crc kubenswrapper[5002]: I1209 11:36:17.813134 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 09 11:36:17 crc kubenswrapper[5002]: I1209 11:36:17.816738 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 09 11:36:17 crc kubenswrapper[5002]: I1209 11:36:17.884126 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 09 11:36:17 crc kubenswrapper[5002]: I1209 11:36:17.885900 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 09 11:36:17 crc 
kubenswrapper[5002]: I1209 11:36:17.890831 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 09 11:36:17 crc kubenswrapper[5002]: I1209 11:36:17.965330 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 09 11:36:18 crc kubenswrapper[5002]: I1209 11:36:18.085866 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b341517-4db7-4996-8aa3-d2bc4ad0ef6b" path="/var/lib/kubelet/pods/4b341517-4db7-4996-8aa3-d2bc4ad0ef6b/volumes" Dec 09 11:36:18 crc kubenswrapper[5002]: I1209 11:36:18.830077 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"89609918-9011-4ef3-82eb-d8f791fc2979","Type":"ContainerStarted","Data":"7b4b2b10cdc2256b8397e9452495ab2e3476fb2de7dc2fb8a625d95b8c88b14a"} Dec 09 11:36:18 crc kubenswrapper[5002]: I1209 11:36:18.830437 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"89609918-9011-4ef3-82eb-d8f791fc2979","Type":"ContainerStarted","Data":"a20295d4fab7e8527fefb81a831f903f69d3adbdcfdefb7cba47788f7bf6737a"} Dec 09 11:36:18 crc kubenswrapper[5002]: I1209 11:36:18.840350 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 09 11:36:18 crc kubenswrapper[5002]: I1209 11:36:18.857791 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=3.983216385 podStartE2EDuration="5.857772857s" podCreationTimestamp="2025-12-09 11:36:13 +0000 UTC" firstStartedPulling="2025-12-09 11:36:14.628824947 +0000 UTC m=+5707.020876028" lastFinishedPulling="2025-12-09 11:36:16.503381419 +0000 UTC m=+5708.895432500" observedRunningTime="2025-12-09 11:36:18.855295161 +0000 UTC m=+5711.247346282" watchObservedRunningTime="2025-12-09 11:36:18.857772857 +0000 UTC m=+5711.249823938" Dec 09 11:36:18 crc kubenswrapper[5002]: I1209 11:36:18.910710 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=4.500164753 podStartE2EDuration="6.910677382s" podCreationTimestamp="2025-12-09 11:36:12 +0000 UTC" firstStartedPulling="2025-12-09 11:36:13.920601932 +0000 UTC m=+5706.312653013" lastFinishedPulling="2025-12-09 11:36:16.331114561 +0000 UTC m=+5708.723165642" observedRunningTime="2025-12-09 11:36:18.909158562 +0000 UTC m=+5711.301209643" watchObservedRunningTime="2025-12-09 11:36:18.910677382 +0000 UTC m=+5711.302728463" Dec 09 11:36:19 crc kubenswrapper[5002]: I1209 11:36:19.037348 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Dec 09 11:36:19 crc kubenswrapper[5002]: I1209 11:36:19.838626 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"89609918-9011-4ef3-82eb-d8f791fc2979","Type":"ContainerStarted","Data":"b0928f8a499bc2bd2883873d19c8c116d5cc641bf7aa831e215171660a9023c0"} Dec 09 11:36:19 crc kubenswrapper[5002]: I1209 11:36:19.865152 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.865130332 podStartE2EDuration="2.865130332s" podCreationTimestamp="2025-12-09 11:36:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:36:19.855195947 +0000 UTC m=+5712.247247038" watchObservedRunningTime="2025-12-09 11:36:19.865130332 +0000 UTC 
m=+5712.257181433" Dec 09 11:36:20 crc kubenswrapper[5002]: I1209 11:36:20.848434 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 09 11:36:21 crc kubenswrapper[5002]: I1209 11:36:21.638003 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 09 11:36:21 crc kubenswrapper[5002]: I1209 11:36:21.696207 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 09 11:36:21 crc kubenswrapper[5002]: I1209 11:36:21.857202 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="bea6c83a-466a-43bd-b336-9328f839cf88" containerName="cinder-scheduler" containerID="cri-o://bf5268736e978035bf26b4117148146270ec502b4973ec90dc9f35d7781e241f" gracePeriod=30 Dec 09 11:36:21 crc kubenswrapper[5002]: I1209 11:36:21.857323 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="bea6c83a-466a-43bd-b336-9328f839cf88" containerName="probe" containerID="cri-o://7305b27c755f2e22a8faa6ffde78e3cb7bd0e4583b4b8fbca1cfe085a506d986" gracePeriod=30 Dec 09 11:36:22 crc kubenswrapper[5002]: I1209 11:36:22.868931 5002 generic.go:334] "Generic (PLEG): container finished" podID="bea6c83a-466a-43bd-b336-9328f839cf88" containerID="7305b27c755f2e22a8faa6ffde78e3cb7bd0e4583b4b8fbca1cfe085a506d986" exitCode=0 Dec 09 11:36:22 crc kubenswrapper[5002]: I1209 11:36:22.869532 5002 generic.go:334] "Generic (PLEG): container finished" podID="bea6c83a-466a-43bd-b336-9328f839cf88" containerID="bf5268736e978035bf26b4117148146270ec502b4973ec90dc9f35d7781e241f" exitCode=0 Dec 09 11:36:22 crc kubenswrapper[5002]: I1209 11:36:22.868989 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"bea6c83a-466a-43bd-b336-9328f839cf88","Type":"ContainerDied","Data":"7305b27c755f2e22a8faa6ffde78e3cb7bd0e4583b4b8fbca1cfe085a506d986"} Dec 09 11:36:22 crc kubenswrapper[5002]: I1209 11:36:22.869596 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"bea6c83a-466a-43bd-b336-9328f839cf88","Type":"ContainerDied","Data":"bf5268736e978035bf26b4117148146270ec502b4973ec90dc9f35d7781e241f"} Dec 09 11:36:23 crc kubenswrapper[5002]: I1209 11:36:23.309350 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:23 crc kubenswrapper[5002]: I1209 11:36:23.366331 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 09 11:36:23 crc kubenswrapper[5002]: I1209 11:36:23.428798 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bea6c83a-466a-43bd-b336-9328f839cf88-config-data-custom\") pod \"bea6c83a-466a-43bd-b336-9328f839cf88\" (UID: \"bea6c83a-466a-43bd-b336-9328f839cf88\") " Dec 09 11:36:23 crc kubenswrapper[5002]: I1209 11:36:23.428994 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bea6c83a-466a-43bd-b336-9328f839cf88-config-data\") pod \"bea6c83a-466a-43bd-b336-9328f839cf88\" (UID: \"bea6c83a-466a-43bd-b336-9328f839cf88\") " Dec 09 11:36:23 crc kubenswrapper[5002]: I1209 11:36:23.429080 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bea6c83a-466a-43bd-b336-9328f839cf88-etc-machine-id\") pod \"bea6c83a-466a-43bd-b336-9328f839cf88\" (UID: \"bea6c83a-466a-43bd-b336-9328f839cf88\") " Dec 09 11:36:23 crc kubenswrapper[5002]: I1209 11:36:23.429164 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bea6c83a-466a-43bd-b336-9328f839cf88-combined-ca-bundle\") pod \"bea6c83a-466a-43bd-b336-9328f839cf88\" (UID: \"bea6c83a-466a-43bd-b336-9328f839cf88\") " Dec 09 11:36:23 crc kubenswrapper[5002]: I1209 11:36:23.429195 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65hk2\" (UniqueName: \"kubernetes.io/projected/bea6c83a-466a-43bd-b336-9328f839cf88-kube-api-access-65hk2\") pod \"bea6c83a-466a-43bd-b336-9328f839cf88\" (UID: \"bea6c83a-466a-43bd-b336-9328f839cf88\") " Dec 09 11:36:23 crc kubenswrapper[5002]: I1209 11:36:23.429239 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bea6c83a-466a-43bd-b336-9328f839cf88-scripts\") pod \"bea6c83a-466a-43bd-b336-9328f839cf88\" (UID: \"bea6c83a-466a-43bd-b336-9328f839cf88\") " Dec 09 11:36:23 crc kubenswrapper[5002]: I1209 11:36:23.429260 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bea6c83a-466a-43bd-b336-9328f839cf88-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "bea6c83a-466a-43bd-b336-9328f839cf88" (UID: "bea6c83a-466a-43bd-b336-9328f839cf88"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:36:23 crc kubenswrapper[5002]: I1209 11:36:23.429853 5002 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bea6c83a-466a-43bd-b336-9328f839cf88-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 09 11:36:23 crc kubenswrapper[5002]: I1209 11:36:23.440452 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bea6c83a-466a-43bd-b336-9328f839cf88-scripts" (OuterVolumeSpecName: "scripts") pod "bea6c83a-466a-43bd-b336-9328f839cf88" (UID: "bea6c83a-466a-43bd-b336-9328f839cf88"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:36:23 crc kubenswrapper[5002]: I1209 11:36:23.440529 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bea6c83a-466a-43bd-b336-9328f839cf88-kube-api-access-65hk2" (OuterVolumeSpecName: "kube-api-access-65hk2") pod "bea6c83a-466a-43bd-b336-9328f839cf88" (UID: "bea6c83a-466a-43bd-b336-9328f839cf88"). InnerVolumeSpecName "kube-api-access-65hk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:36:23 crc kubenswrapper[5002]: I1209 11:36:23.444137 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bea6c83a-466a-43bd-b336-9328f839cf88-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "bea6c83a-466a-43bd-b336-9328f839cf88" (UID: "bea6c83a-466a-43bd-b336-9328f839cf88"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:36:23 crc kubenswrapper[5002]: I1209 11:36:23.482846 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bea6c83a-466a-43bd-b336-9328f839cf88-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bea6c83a-466a-43bd-b336-9328f839cf88" (UID: "bea6c83a-466a-43bd-b336-9328f839cf88"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:36:23 crc kubenswrapper[5002]: I1209 11:36:23.531119 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bea6c83a-466a-43bd-b336-9328f839cf88-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:36:23 crc kubenswrapper[5002]: I1209 11:36:23.531160 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65hk2\" (UniqueName: \"kubernetes.io/projected/bea6c83a-466a-43bd-b336-9328f839cf88-kube-api-access-65hk2\") on node \"crc\" DevicePath \"\"" Dec 09 11:36:23 crc kubenswrapper[5002]: I1209 11:36:23.531176 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bea6c83a-466a-43bd-b336-9328f839cf88-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:36:23 crc kubenswrapper[5002]: I1209 11:36:23.531188 5002 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bea6c83a-466a-43bd-b336-9328f839cf88-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 09 11:36:23 crc kubenswrapper[5002]: I1209 11:36:23.546108 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bea6c83a-466a-43bd-b336-9328f839cf88-config-data" (OuterVolumeSpecName: "config-data") pod "bea6c83a-466a-43bd-b336-9328f839cf88" (UID: "bea6c83a-466a-43bd-b336-9328f839cf88"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:36:23 crc kubenswrapper[5002]: I1209 11:36:23.557851 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Dec 09 11:36:23 crc kubenswrapper[5002]: I1209 11:36:23.632799 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bea6c83a-466a-43bd-b336-9328f839cf88-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:36:23 crc kubenswrapper[5002]: I1209 11:36:23.881174 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 09 11:36:23 crc kubenswrapper[5002]: I1209 11:36:23.882055 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"bea6c83a-466a-43bd-b336-9328f839cf88","Type":"ContainerDied","Data":"3a0afdb182a0cf27b47fd8e3b7f6d367accb3705e8a6a0d05c3d7cddefe6c8cd"} Dec 09 11:36:23 crc kubenswrapper[5002]: I1209 11:36:23.882219 5002 scope.go:117] "RemoveContainer" containerID="7305b27c755f2e22a8faa6ffde78e3cb7bd0e4583b4b8fbca1cfe085a506d986" Dec 09 11:36:23 crc kubenswrapper[5002]: I1209 11:36:23.925896 5002 scope.go:117] "RemoveContainer" containerID="bf5268736e978035bf26b4117148146270ec502b4973ec90dc9f35d7781e241f" Dec 09 11:36:23 crc kubenswrapper[5002]: I1209 11:36:23.932115 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 09 11:36:23 crc kubenswrapper[5002]: I1209 11:36:23.941285 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 09 11:36:23 crc kubenswrapper[5002]: I1209 11:36:23.972708 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 09 11:36:23 crc kubenswrapper[5002]: E1209 11:36:23.973242 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bea6c83a-466a-43bd-b336-9328f839cf88" containerName="cinder-scheduler" Dec 09 11:36:23 crc kubenswrapper[5002]: I1209 11:36:23.973261 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="bea6c83a-466a-43bd-b336-9328f839cf88" containerName="cinder-scheduler" Dec 09 11:36:23 crc kubenswrapper[5002]: E1209 11:36:23.973298 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bea6c83a-466a-43bd-b336-9328f839cf88" containerName="probe" Dec 09 11:36:23 crc kubenswrapper[5002]: I1209 11:36:23.973305 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="bea6c83a-466a-43bd-b336-9328f839cf88" containerName="probe" Dec 09 11:36:23 crc kubenswrapper[5002]: I1209 11:36:23.973476 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="bea6c83a-466a-43bd-b336-9328f839cf88" containerName="probe" Dec 09 11:36:23 crc kubenswrapper[5002]: I1209 11:36:23.973497 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="bea6c83a-466a-43bd-b336-9328f839cf88" containerName="cinder-scheduler" Dec 09 11:36:23 crc kubenswrapper[5002]: I1209 11:36:23.974661 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 09 11:36:23 crc kubenswrapper[5002]: I1209 11:36:23.977218 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 09 11:36:23 crc kubenswrapper[5002]: I1209 11:36:23.982715 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 09 11:36:24 crc kubenswrapper[5002]: I1209 11:36:24.039615 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ebae47db-99fe-43b2-bbf8-00d7ba18b901-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ebae47db-99fe-43b2-bbf8-00d7ba18b901\") " pod="openstack/cinder-scheduler-0" Dec 09 11:36:24 crc kubenswrapper[5002]: I1209 11:36:24.039667 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebae47db-99fe-43b2-bbf8-00d7ba18b901-scripts\") pod \"cinder-scheduler-0\" (UID: \"ebae47db-99fe-43b2-bbf8-00d7ba18b901\") " pod="openstack/cinder-scheduler-0" Dec 09 11:36:24 crc kubenswrapper[5002]: I1209 11:36:24.039690 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebae47db-99fe-43b2-bbf8-00d7ba18b901-config-data\") pod \"cinder-scheduler-0\" (UID: \"ebae47db-99fe-43b2-bbf8-00d7ba18b901\") " pod="openstack/cinder-scheduler-0" Dec 09 11:36:24 crc kubenswrapper[5002]: I1209 11:36:24.039709 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ebae47db-99fe-43b2-bbf8-00d7ba18b901-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ebae47db-99fe-43b2-bbf8-00d7ba18b901\") " pod="openstack/cinder-scheduler-0" Dec 09 11:36:24 crc kubenswrapper[5002]: I1209 11:36:24.039948 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpg9w\" (UniqueName: \"kubernetes.io/projected/ebae47db-99fe-43b2-bbf8-00d7ba18b901-kube-api-access-dpg9w\") pod \"cinder-scheduler-0\" (UID: \"ebae47db-99fe-43b2-bbf8-00d7ba18b901\") " pod="openstack/cinder-scheduler-0" Dec 09 11:36:24 crc kubenswrapper[5002]: I1209 11:36:24.040137 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebae47db-99fe-43b2-bbf8-00d7ba18b901-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ebae47db-99fe-43b2-bbf8-00d7ba18b901\") " pod="openstack/cinder-scheduler-0" Dec 09 11:36:24 crc kubenswrapper[5002]: I1209 11:36:24.075458 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bea6c83a-466a-43bd-b336-9328f839cf88" path="/var/lib/kubelet/pods/bea6c83a-466a-43bd-b336-9328f839cf88/volumes" Dec 09 11:36:24 crc kubenswrapper[5002]: I1209 11:36:24.141952 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebae47db-99fe-43b2-bbf8-00d7ba18b901-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ebae47db-99fe-43b2-bbf8-00d7ba18b901\") " pod="openstack/cinder-scheduler-0" Dec 09 11:36:24 crc kubenswrapper[5002]: I1209 11:36:24.142122 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/ebae47db-99fe-43b2-bbf8-00d7ba18b901-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ebae47db-99fe-43b2-bbf8-00d7ba18b901\") " pod="openstack/cinder-scheduler-0" Dec 09 11:36:24 crc kubenswrapper[5002]: I1209 11:36:24.142146 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebae47db-99fe-43b2-bbf8-00d7ba18b901-scripts\") pod \"cinder-scheduler-0\" (UID: \"ebae47db-99fe-43b2-bbf8-00d7ba18b901\") " pod="openstack/cinder-scheduler-0" Dec 09 11:36:24 crc kubenswrapper[5002]: I1209 11:36:24.142163 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebae47db-99fe-43b2-bbf8-00d7ba18b901-config-data\") pod \"cinder-scheduler-0\" (UID: \"ebae47db-99fe-43b2-bbf8-00d7ba18b901\") " pod="openstack/cinder-scheduler-0" Dec 09 11:36:24 crc kubenswrapper[5002]: I1209 11:36:24.142179 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ebae47db-99fe-43b2-bbf8-00d7ba18b901-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ebae47db-99fe-43b2-bbf8-00d7ba18b901\") " pod="openstack/cinder-scheduler-0" Dec 09 11:36:24 crc kubenswrapper[5002]: I1209 11:36:24.142247 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpg9w\" (UniqueName: \"kubernetes.io/projected/ebae47db-99fe-43b2-bbf8-00d7ba18b901-kube-api-access-dpg9w\") pod \"cinder-scheduler-0\" (UID: \"ebae47db-99fe-43b2-bbf8-00d7ba18b901\") " pod="openstack/cinder-scheduler-0" Dec 09 11:36:24 crc kubenswrapper[5002]: I1209 11:36:24.143852 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ebae47db-99fe-43b2-bbf8-00d7ba18b901-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ebae47db-99fe-43b2-bbf8-00d7ba18b901\") " pod="openstack/cinder-scheduler-0" Dec 09 11:36:24 crc kubenswrapper[5002]: I1209 11:36:24.148141 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebae47db-99fe-43b2-bbf8-00d7ba18b901-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ebae47db-99fe-43b2-bbf8-00d7ba18b901\") " pod="openstack/cinder-scheduler-0" Dec 09 11:36:24 crc kubenswrapper[5002]: I1209 11:36:24.157172 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ebae47db-99fe-43b2-bbf8-00d7ba18b901-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ebae47db-99fe-43b2-bbf8-00d7ba18b901\") " pod="openstack/cinder-scheduler-0" Dec 09 11:36:24 crc kubenswrapper[5002]: I1209 11:36:24.157839 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebae47db-99fe-43b2-bbf8-00d7ba18b901-config-data\") pod \"cinder-scheduler-0\" (UID: \"ebae47db-99fe-43b2-bbf8-00d7ba18b901\") " pod="openstack/cinder-scheduler-0" Dec 09 11:36:24 crc kubenswrapper[5002]: I1209 11:36:24.158291 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebae47db-99fe-43b2-bbf8-00d7ba18b901-scripts\") pod \"cinder-scheduler-0\" (UID: \"ebae47db-99fe-43b2-bbf8-00d7ba18b901\") " pod="openstack/cinder-scheduler-0" Dec 09 11:36:24 crc kubenswrapper[5002]: I1209 11:36:24.162132 5002 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-dpg9w\" (UniqueName: \"kubernetes.io/projected/ebae47db-99fe-43b2-bbf8-00d7ba18b901-kube-api-access-dpg9w\") pod \"cinder-scheduler-0\" (UID: \"ebae47db-99fe-43b2-bbf8-00d7ba18b901\") " pod="openstack/cinder-scheduler-0" Dec 09 11:36:24 crc kubenswrapper[5002]: I1209 11:36:24.291186 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Dec 09 11:36:24 crc kubenswrapper[5002]: I1209 11:36:24.291762 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 09 11:36:24 crc kubenswrapper[5002]: I1209 11:36:24.756909 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 09 11:36:24 crc kubenswrapper[5002]: I1209 11:36:24.898527 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ebae47db-99fe-43b2-bbf8-00d7ba18b901","Type":"ContainerStarted","Data":"5647ff64e85931938a8fcc2c2f545182819760efda6a284bdf43d6afe686da39"} Dec 09 11:36:25 crc kubenswrapper[5002]: I1209 11:36:25.920968 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ebae47db-99fe-43b2-bbf8-00d7ba18b901","Type":"ContainerStarted","Data":"87a92918cc47d8cd92234b6d03e6fb9589ec95999dea0833e6ea0ba49e5dd223"} Dec 09 11:36:25 crc kubenswrapper[5002]: I1209 11:36:25.923614 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ebae47db-99fe-43b2-bbf8-00d7ba18b901","Type":"ContainerStarted","Data":"733dd5c6a42672fa66756cd6e61551d47d686feaf6645ae68b0b4a48cab35813"} Dec 09 11:36:25 crc kubenswrapper[5002]: I1209 11:36:25.945331 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.9453097120000002 podStartE2EDuration="2.945309712s" podCreationTimestamp="2025-12-09 11:36:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:36:25.937971036 +0000 UTC m=+5718.330022147" watchObservedRunningTime="2025-12-09 11:36:25.945309712 +0000 UTC m=+5718.337360793" Dec 09 11:36:29 crc kubenswrapper[5002]: I1209 11:36:29.292897 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 09 11:36:29 crc kubenswrapper[5002]: I1209 11:36:29.411187 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 09 11:36:34 crc kubenswrapper[5002]: I1209 11:36:34.489647 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 09 11:36:37 crc kubenswrapper[5002]: I1209 11:36:37.964702 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 11:36:37 crc kubenswrapper[5002]: I1209 11:36:37.965563 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 11:37:07 crc 
kubenswrapper[5002]: I1209 11:37:07.964908 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 11:37:07 crc kubenswrapper[5002]: I1209 11:37:07.965386 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 11:37:37 crc kubenswrapper[5002]: I1209 11:37:37.965381 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 11:37:37 crc kubenswrapper[5002]: I1209 11:37:37.966060 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 11:37:37 crc kubenswrapper[5002]: I1209 11:37:37.966145 5002 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" Dec 09 11:37:37 crc kubenswrapper[5002]: I1209 11:37:37.967307 5002 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c9279947ed00c5b6531641df0eb3e04f34e3d816632d088e326b1acbc67d09a2"} pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 11:37:37 crc kubenswrapper[5002]: I1209 11:37:37.967411 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" containerID="cri-o://c9279947ed00c5b6531641df0eb3e04f34e3d816632d088e326b1acbc67d09a2" gracePeriod=600 Dec 09 11:37:38 crc kubenswrapper[5002]: I1209 11:37:38.660795 5002 generic.go:334] "Generic (PLEG): container finished" podID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerID="c9279947ed00c5b6531641df0eb3e04f34e3d816632d088e326b1acbc67d09a2" exitCode=0 Dec 09 11:37:38 crc kubenswrapper[5002]: I1209 11:37:38.660921 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerDied","Data":"c9279947ed00c5b6531641df0eb3e04f34e3d816632d088e326b1acbc67d09a2"} Dec 09 11:37:38 crc kubenswrapper[5002]: I1209 11:37:38.661296 5002 scope.go:117] "RemoveContainer" containerID="24eda190128d46e2bfa806f4839b38f2462cd8acaa8816efdf9934cf2dc46679" Dec 09 11:37:41 crc kubenswrapper[5002]: I1209 11:37:41.702740 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" 
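
The machine-config-daemon entries above show a liveness probe failing with connection refused at 11:36:37, 11:37:07, and 11:37:37, i.e. every 30 seconds, with the restart decision arriving on the third failure. That is consistent with periodSeconds: 30 and failureThreshold: 3, though the pod spec itself is not in the log, so treat those numbers as inferred. A self-contained sketch of such a probe loop:

    package main

    import (
        "fmt"
        "net/http"
        "time"
    )

    func main() {
        const (
            url       = "http://127.0.0.1:8798/health" // endpoint taken from the log
            period    = 30 * time.Second               // spacing of the failures above
            threshold = 3                              // three failures precede the restart
        )
        failures := 0
        for {
            healthy := false
            if resp, err := http.Get(url); err == nil {
                healthy = resp.StatusCode == http.StatusOK
                resp.Body.Close()
            }
            if healthy {
                failures = 0 // consecutive-failure counter resets on any success
            } else {
                failures++
                fmt.Printf("Probe failed (%d/%d)\n", failures, threshold)
                if failures >= threshold {
                    fmt.Println("failed liveness probe, will be restarted")
                    return // the kubelet kills the container with its grace period here
                }
            }
            time.Sleep(period)
        }
    }
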
event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerStarted","Data":"39962d0376837cc534e6b0a62303166efdae767fb36cfb81ae7c7eb077d56c3e"} Dec 09 11:37:51 crc kubenswrapper[5002]: I1209 11:37:51.053630 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-bjz5w"] Dec 09 11:37:51 crc kubenswrapper[5002]: I1209 11:37:51.063871 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-bjz5w"] Dec 09 11:37:52 crc kubenswrapper[5002]: I1209 11:37:52.026131 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-3513-account-create-update-5vpps"] Dec 09 11:37:52 crc kubenswrapper[5002]: I1209 11:37:52.037040 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-3513-account-create-update-5vpps"] Dec 09 11:37:52 crc kubenswrapper[5002]: I1209 11:37:52.079836 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62049528-7763-4cd2-bafd-4511f883ec84" path="/var/lib/kubelet/pods/62049528-7763-4cd2-bafd-4511f883ec84/volumes" Dec 09 11:37:52 crc kubenswrapper[5002]: I1209 11:37:52.080536 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba634598-bd87-4c8f-a7c1-9db3e3324383" path="/var/lib/kubelet/pods/ba634598-bd87-4c8f-a7c1-9db3e3324383/volumes" Dec 09 11:37:56 crc kubenswrapper[5002]: I1209 11:37:56.831743 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sjgjq"] Dec 09 11:37:56 crc kubenswrapper[5002]: I1209 11:37:56.834483 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sjgjq" Dec 09 11:37:56 crc kubenswrapper[5002]: I1209 11:37:56.866746 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sjgjq"] Dec 09 11:37:56 crc kubenswrapper[5002]: I1209 11:37:56.889776 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64094d17-8abb-461e-a4e8-cfdd0e81d77f-catalog-content\") pod \"certified-operators-sjgjq\" (UID: \"64094d17-8abb-461e-a4e8-cfdd0e81d77f\") " pod="openshift-marketplace/certified-operators-sjgjq" Dec 09 11:37:56 crc kubenswrapper[5002]: I1209 11:37:56.889919 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdm7q\" (UniqueName: \"kubernetes.io/projected/64094d17-8abb-461e-a4e8-cfdd0e81d77f-kube-api-access-zdm7q\") pod \"certified-operators-sjgjq\" (UID: \"64094d17-8abb-461e-a4e8-cfdd0e81d77f\") " pod="openshift-marketplace/certified-operators-sjgjq" Dec 09 11:37:56 crc kubenswrapper[5002]: I1209 11:37:56.889955 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64094d17-8abb-461e-a4e8-cfdd0e81d77f-utilities\") pod \"certified-operators-sjgjq\" (UID: \"64094d17-8abb-461e-a4e8-cfdd0e81d77f\") " pod="openshift-marketplace/certified-operators-sjgjq" Dec 09 11:37:56 crc kubenswrapper[5002]: I1209 11:37:56.992074 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdm7q\" (UniqueName: \"kubernetes.io/projected/64094d17-8abb-461e-a4e8-cfdd0e81d77f-kube-api-access-zdm7q\") pod \"certified-operators-sjgjq\" (UID: \"64094d17-8abb-461e-a4e8-cfdd0e81d77f\") " pod="openshift-marketplace/certified-operators-sjgjq" Dec 09 11:37:56 crc 
kubenswrapper[5002]: I1209 11:37:56.992141 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64094d17-8abb-461e-a4e8-cfdd0e81d77f-utilities\") pod \"certified-operators-sjgjq\" (UID: \"64094d17-8abb-461e-a4e8-cfdd0e81d77f\") " pod="openshift-marketplace/certified-operators-sjgjq" Dec 09 11:37:56 crc kubenswrapper[5002]: I1209 11:37:56.992208 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64094d17-8abb-461e-a4e8-cfdd0e81d77f-catalog-content\") pod \"certified-operators-sjgjq\" (UID: \"64094d17-8abb-461e-a4e8-cfdd0e81d77f\") " pod="openshift-marketplace/certified-operators-sjgjq" Dec 09 11:37:56 crc kubenswrapper[5002]: I1209 11:37:56.992652 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64094d17-8abb-461e-a4e8-cfdd0e81d77f-catalog-content\") pod \"certified-operators-sjgjq\" (UID: \"64094d17-8abb-461e-a4e8-cfdd0e81d77f\") " pod="openshift-marketplace/certified-operators-sjgjq" Dec 09 11:37:56 crc kubenswrapper[5002]: I1209 11:37:56.993067 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64094d17-8abb-461e-a4e8-cfdd0e81d77f-utilities\") pod \"certified-operators-sjgjq\" (UID: \"64094d17-8abb-461e-a4e8-cfdd0e81d77f\") " pod="openshift-marketplace/certified-operators-sjgjq" Dec 09 11:37:57 crc kubenswrapper[5002]: I1209 11:37:57.019986 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdm7q\" (UniqueName: \"kubernetes.io/projected/64094d17-8abb-461e-a4e8-cfdd0e81d77f-kube-api-access-zdm7q\") pod \"certified-operators-sjgjq\" (UID: \"64094d17-8abb-461e-a4e8-cfdd0e81d77f\") " pod="openshift-marketplace/certified-operators-sjgjq" Dec 09 11:37:57 crc kubenswrapper[5002]: I1209 11:37:57.167605 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sjgjq" Dec 09 11:37:57 crc kubenswrapper[5002]: I1209 11:37:57.750562 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sjgjq"] Dec 09 11:37:57 crc kubenswrapper[5002]: I1209 11:37:57.891346 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjgjq" event={"ID":"64094d17-8abb-461e-a4e8-cfdd0e81d77f","Type":"ContainerStarted","Data":"80e598f35051205ffbb1adb1a8e497c537dc18c92ae1f1fbe7f92b5e46cdf164"} Dec 09 11:37:58 crc kubenswrapper[5002]: I1209 11:37:58.039241 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-qw2g9"] Dec 09 11:37:58 crc kubenswrapper[5002]: I1209 11:37:58.053411 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-qw2g9"] Dec 09 11:37:58 crc kubenswrapper[5002]: I1209 11:37:58.081441 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13cc15be-2b29-4477-bc55-dde55b36019d" path="/var/lib/kubelet/pods/13cc15be-2b29-4477-bc55-dde55b36019d/volumes" Dec 09 11:37:58 crc kubenswrapper[5002]: I1209 11:37:58.900259 5002 generic.go:334] "Generic (PLEG): container finished" podID="64094d17-8abb-461e-a4e8-cfdd0e81d77f" containerID="dde8908a2860aa1a7a28c903cfde8a9bd1db5ab9198dfe2fe70a440d91448de3" exitCode=0 Dec 09 11:37:58 crc kubenswrapper[5002]: I1209 11:37:58.900325 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjgjq" event={"ID":"64094d17-8abb-461e-a4e8-cfdd0e81d77f","Type":"ContainerDied","Data":"dde8908a2860aa1a7a28c903cfde8a9bd1db5ab9198dfe2fe70a440d91448de3"} Dec 09 11:37:58 crc kubenswrapper[5002]: I1209 11:37:58.902593 5002 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 11:37:59 crc kubenswrapper[5002]: I1209 11:37:59.911358 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjgjq" event={"ID":"64094d17-8abb-461e-a4e8-cfdd0e81d77f","Type":"ContainerStarted","Data":"4c432f4582c6c7bf1d3abed69c5b8d317e802f2bbf34d3040e4a4e5d0d02bff0"} Dec 09 11:38:00 crc kubenswrapper[5002]: I1209 11:38:00.926608 5002 generic.go:334] "Generic (PLEG): container finished" podID="64094d17-8abb-461e-a4e8-cfdd0e81d77f" containerID="4c432f4582c6c7bf1d3abed69c5b8d317e802f2bbf34d3040e4a4e5d0d02bff0" exitCode=0 Dec 09 11:38:00 crc kubenswrapper[5002]: I1209 11:38:00.926673 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjgjq" event={"ID":"64094d17-8abb-461e-a4e8-cfdd0e81d77f","Type":"ContainerDied","Data":"4c432f4582c6c7bf1d3abed69c5b8d317e802f2bbf34d3040e4a4e5d0d02bff0"} Dec 09 11:38:01 crc kubenswrapper[5002]: I1209 11:38:01.941116 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjgjq" event={"ID":"64094d17-8abb-461e-a4e8-cfdd0e81d77f","Type":"ContainerStarted","Data":"857cc62124846f5a0d84f6ba9922b33bff6a6e058e7a713481c875d04301ebef"} Dec 09 11:38:01 crc kubenswrapper[5002]: I1209 11:38:01.985155 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sjgjq" podStartSLOduration=3.5658210070000003 podStartE2EDuration="5.985130671s" podCreationTimestamp="2025-12-09 11:37:56 +0000 UTC" firstStartedPulling="2025-12-09 11:37:58.902329899 +0000 UTC m=+5811.294380980" lastFinishedPulling="2025-12-09 
Dec 09 11:38:07 crc kubenswrapper[5002]: I1209 11:38:07.168721 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sjgjq"
Dec 09 11:38:07 crc kubenswrapper[5002]: I1209 11:38:07.169586 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sjgjq"
Dec 09 11:38:07 crc kubenswrapper[5002]: I1209 11:38:07.243771 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sjgjq"
Dec 09 11:38:08 crc kubenswrapper[5002]: I1209 11:38:08.074249 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sjgjq"
Dec 09 11:38:08 crc kubenswrapper[5002]: I1209 11:38:08.137563 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sjgjq"]
Dec 09 11:38:10 crc kubenswrapper[5002]: I1209 11:38:10.032442 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sjgjq" podUID="64094d17-8abb-461e-a4e8-cfdd0e81d77f" containerName="registry-server" containerID="cri-o://857cc62124846f5a0d84f6ba9922b33bff6a6e058e7a713481c875d04301ebef" gracePeriod=2
Dec 09 11:38:11 crc kubenswrapper[5002]: I1209 11:38:11.053497 5002 generic.go:334] "Generic (PLEG): container finished" podID="64094d17-8abb-461e-a4e8-cfdd0e81d77f" containerID="857cc62124846f5a0d84f6ba9922b33bff6a6e058e7a713481c875d04301ebef" exitCode=0
Dec 09 11:38:11 crc kubenswrapper[5002]: I1209 11:38:11.053842 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjgjq" event={"ID":"64094d17-8abb-461e-a4e8-cfdd0e81d77f","Type":"ContainerDied","Data":"857cc62124846f5a0d84f6ba9922b33bff6a6e058e7a713481c875d04301ebef"}
Dec 09 11:38:11 crc kubenswrapper[5002]: I1209 11:38:11.194106 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sjgjq"
Dec 09 11:38:11 crc kubenswrapper[5002]: I1209 11:38:11.276516 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdm7q\" (UniqueName: \"kubernetes.io/projected/64094d17-8abb-461e-a4e8-cfdd0e81d77f-kube-api-access-zdm7q\") pod \"64094d17-8abb-461e-a4e8-cfdd0e81d77f\" (UID: \"64094d17-8abb-461e-a4e8-cfdd0e81d77f\") "
Dec 09 11:38:11 crc kubenswrapper[5002]: I1209 11:38:11.276656 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64094d17-8abb-461e-a4e8-cfdd0e81d77f-catalog-content\") pod \"64094d17-8abb-461e-a4e8-cfdd0e81d77f\" (UID: \"64094d17-8abb-461e-a4e8-cfdd0e81d77f\") "
Dec 09 11:38:11 crc kubenswrapper[5002]: I1209 11:38:11.276888 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64094d17-8abb-461e-a4e8-cfdd0e81d77f-utilities\") pod \"64094d17-8abb-461e-a4e8-cfdd0e81d77f\" (UID: \"64094d17-8abb-461e-a4e8-cfdd0e81d77f\") "
Dec 09 11:38:11 crc kubenswrapper[5002]: I1209 11:38:11.277678 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64094d17-8abb-461e-a4e8-cfdd0e81d77f-utilities" (OuterVolumeSpecName: "utilities") pod "64094d17-8abb-461e-a4e8-cfdd0e81d77f" (UID: "64094d17-8abb-461e-a4e8-cfdd0e81d77f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 11:38:11 crc kubenswrapper[5002]: I1209 11:38:11.282362 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64094d17-8abb-461e-a4e8-cfdd0e81d77f-kube-api-access-zdm7q" (OuterVolumeSpecName: "kube-api-access-zdm7q") pod "64094d17-8abb-461e-a4e8-cfdd0e81d77f" (UID: "64094d17-8abb-461e-a4e8-cfdd0e81d77f"). InnerVolumeSpecName "kube-api-access-zdm7q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 11:38:11 crc kubenswrapper[5002]: I1209 11:38:11.333603 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64094d17-8abb-461e-a4e8-cfdd0e81d77f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "64094d17-8abb-461e-a4e8-cfdd0e81d77f" (UID: "64094d17-8abb-461e-a4e8-cfdd0e81d77f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 11:38:11 crc kubenswrapper[5002]: I1209 11:38:11.378781 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64094d17-8abb-461e-a4e8-cfdd0e81d77f-utilities\") on node \"crc\" DevicePath \"\""
Dec 09 11:38:11 crc kubenswrapper[5002]: I1209 11:38:11.378828 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdm7q\" (UniqueName: \"kubernetes.io/projected/64094d17-8abb-461e-a4e8-cfdd0e81d77f-kube-api-access-zdm7q\") on node \"crc\" DevicePath \"\""
Dec 09 11:38:11 crc kubenswrapper[5002]: I1209 11:38:11.378839 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64094d17-8abb-461e-a4e8-cfdd0e81d77f-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 09 11:38:12 crc kubenswrapper[5002]: I1209 11:38:12.040360 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-22c2h"]
Dec 09 11:38:12 crc kubenswrapper[5002]: I1209 11:38:12.050897 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-22c2h"]
Dec 09 11:38:12 crc kubenswrapper[5002]: I1209 11:38:12.065962 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sjgjq"
Dec 09 11:38:12 crc kubenswrapper[5002]: I1209 11:38:12.075438 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cc546f3-0611-40c8-96bd-04cf184f0ff2" path="/var/lib/kubelet/pods/7cc546f3-0611-40c8-96bd-04cf184f0ff2/volumes"
Dec 09 11:38:12 crc kubenswrapper[5002]: I1209 11:38:12.076370 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjgjq" event={"ID":"64094d17-8abb-461e-a4e8-cfdd0e81d77f","Type":"ContainerDied","Data":"80e598f35051205ffbb1adb1a8e497c537dc18c92ae1f1fbe7f92b5e46cdf164"}
Dec 09 11:38:12 crc kubenswrapper[5002]: I1209 11:38:12.076411 5002 scope.go:117] "RemoveContainer" containerID="857cc62124846f5a0d84f6ba9922b33bff6a6e058e7a713481c875d04301ebef"
Dec 09 11:38:12 crc kubenswrapper[5002]: I1209 11:38:12.108241 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sjgjq"]
Dec 09 11:38:12 crc kubenswrapper[5002]: I1209 11:38:12.118557 5002 scope.go:117] "RemoveContainer" containerID="4c432f4582c6c7bf1d3abed69c5b8d317e802f2bbf34d3040e4a4e5d0d02bff0"
Dec 09 11:38:12 crc kubenswrapper[5002]: I1209 11:38:12.122188 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sjgjq"]
Dec 09 11:38:12 crc kubenswrapper[5002]: I1209 11:38:12.152103 5002 scope.go:117] "RemoveContainer" containerID="dde8908a2860aa1a7a28c903cfde8a9bd1db5ab9198dfe2fe70a440d91448de3"
Dec 09 11:38:14 crc kubenswrapper[5002]: I1209 11:38:14.083981 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64094d17-8abb-461e-a4e8-cfdd0e81d77f" path="/var/lib/kubelet/pods/64094d17-8abb-461e-a4e8-cfdd0e81d77f/volumes"
Dec 09 11:38:20 crc kubenswrapper[5002]: I1209 11:38:20.550014 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-9glwj"]
Dec 09 11:38:20 crc kubenswrapper[5002]: E1209 11:38:20.550971 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64094d17-8abb-461e-a4e8-cfdd0e81d77f" containerName="extract-content"
Dec 09 11:38:20 crc kubenswrapper[5002]: I1209 11:38:20.550988 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="64094d17-8abb-461e-a4e8-cfdd0e81d77f" containerName="extract-content"
assignment" podUID="64094d17-8abb-461e-a4e8-cfdd0e81d77f" containerName="extract-content" Dec 09 11:38:20 crc kubenswrapper[5002]: E1209 11:38:20.551003 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64094d17-8abb-461e-a4e8-cfdd0e81d77f" containerName="registry-server" Dec 09 11:38:20 crc kubenswrapper[5002]: I1209 11:38:20.551010 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="64094d17-8abb-461e-a4e8-cfdd0e81d77f" containerName="registry-server" Dec 09 11:38:20 crc kubenswrapper[5002]: E1209 11:38:20.551030 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64094d17-8abb-461e-a4e8-cfdd0e81d77f" containerName="extract-utilities" Dec 09 11:38:20 crc kubenswrapper[5002]: I1209 11:38:20.551038 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="64094d17-8abb-461e-a4e8-cfdd0e81d77f" containerName="extract-utilities" Dec 09 11:38:20 crc kubenswrapper[5002]: I1209 11:38:20.551302 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="64094d17-8abb-461e-a4e8-cfdd0e81d77f" containerName="registry-server" Dec 09 11:38:20 crc kubenswrapper[5002]: I1209 11:38:20.552073 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9glwj" Dec 09 11:38:20 crc kubenswrapper[5002]: I1209 11:38:20.553928 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 09 11:38:20 crc kubenswrapper[5002]: I1209 11:38:20.554172 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-hwfgp" Dec 09 11:38:20 crc kubenswrapper[5002]: I1209 11:38:20.570354 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-ff8qg"] Dec 09 11:38:20 crc kubenswrapper[5002]: I1209 11:38:20.577960 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-ff8qg" Dec 09 11:38:20 crc kubenswrapper[5002]: I1209 11:38:20.581788 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9glwj"] Dec 09 11:38:20 crc kubenswrapper[5002]: I1209 11:38:20.601217 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-ff8qg"] Dec 09 11:38:20 crc kubenswrapper[5002]: I1209 11:38:20.667925 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5707bebc-bcab-4d7a-a4b9-28cb8b6a9b86-scripts\") pod \"ovn-controller-ovs-ff8qg\" (UID: \"5707bebc-bcab-4d7a-a4b9-28cb8b6a9b86\") " pod="openstack/ovn-controller-ovs-ff8qg" Dec 09 11:38:20 crc kubenswrapper[5002]: I1209 11:38:20.667966 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgw22\" (UniqueName: \"kubernetes.io/projected/5707bebc-bcab-4d7a-a4b9-28cb8b6a9b86-kube-api-access-qgw22\") pod \"ovn-controller-ovs-ff8qg\" (UID: \"5707bebc-bcab-4d7a-a4b9-28cb8b6a9b86\") " pod="openstack/ovn-controller-ovs-ff8qg" Dec 09 11:38:20 crc kubenswrapper[5002]: I1209 11:38:20.667998 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/5707bebc-bcab-4d7a-a4b9-28cb8b6a9b86-var-lib\") pod \"ovn-controller-ovs-ff8qg\" (UID: \"5707bebc-bcab-4d7a-a4b9-28cb8b6a9b86\") " pod="openstack/ovn-controller-ovs-ff8qg" Dec 09 11:38:20 crc kubenswrapper[5002]: I1209 11:38:20.668075 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3e229031-30d4-44e3-897b-b4c4252e7b99-var-run\") pod \"ovn-controller-9glwj\" (UID: \"3e229031-30d4-44e3-897b-b4c4252e7b99\") " pod="openstack/ovn-controller-9glwj" Dec 09 11:38:20 crc kubenswrapper[5002]: I1209 11:38:20.668142 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3e229031-30d4-44e3-897b-b4c4252e7b99-var-run-ovn\") pod \"ovn-controller-9glwj\" (UID: \"3e229031-30d4-44e3-897b-b4c4252e7b99\") " pod="openstack/ovn-controller-9glwj" Dec 09 11:38:20 crc kubenswrapper[5002]: I1209 11:38:20.668170 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3e229031-30d4-44e3-897b-b4c4252e7b99-var-log-ovn\") pod \"ovn-controller-9glwj\" (UID: \"3e229031-30d4-44e3-897b-b4c4252e7b99\") " pod="openstack/ovn-controller-9glwj" Dec 09 11:38:20 crc kubenswrapper[5002]: I1209 11:38:20.668216 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/5707bebc-bcab-4d7a-a4b9-28cb8b6a9b86-etc-ovs\") pod \"ovn-controller-ovs-ff8qg\" (UID: \"5707bebc-bcab-4d7a-a4b9-28cb8b6a9b86\") " pod="openstack/ovn-controller-ovs-ff8qg" Dec 09 11:38:20 crc kubenswrapper[5002]: I1209 11:38:20.668302 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5707bebc-bcab-4d7a-a4b9-28cb8b6a9b86-var-run\") pod \"ovn-controller-ovs-ff8qg\" (UID: \"5707bebc-bcab-4d7a-a4b9-28cb8b6a9b86\") " pod="openstack/ovn-controller-ovs-ff8qg" Dec 09 11:38:20 crc kubenswrapper[5002]: I1209 11:38:20.668396 
5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdd45\" (UniqueName: \"kubernetes.io/projected/3e229031-30d4-44e3-897b-b4c4252e7b99-kube-api-access-fdd45\") pod \"ovn-controller-9glwj\" (UID: \"3e229031-30d4-44e3-897b-b4c4252e7b99\") " pod="openstack/ovn-controller-9glwj" Dec 09 11:38:20 crc kubenswrapper[5002]: I1209 11:38:20.668444 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e229031-30d4-44e3-897b-b4c4252e7b99-scripts\") pod \"ovn-controller-9glwj\" (UID: \"3e229031-30d4-44e3-897b-b4c4252e7b99\") " pod="openstack/ovn-controller-9glwj" Dec 09 11:38:20 crc kubenswrapper[5002]: I1209 11:38:20.668463 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/5707bebc-bcab-4d7a-a4b9-28cb8b6a9b86-var-log\") pod \"ovn-controller-ovs-ff8qg\" (UID: \"5707bebc-bcab-4d7a-a4b9-28cb8b6a9b86\") " pod="openstack/ovn-controller-ovs-ff8qg" Dec 09 11:38:20 crc kubenswrapper[5002]: I1209 11:38:20.770514 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5707bebc-bcab-4d7a-a4b9-28cb8b6a9b86-scripts\") pod \"ovn-controller-ovs-ff8qg\" (UID: \"5707bebc-bcab-4d7a-a4b9-28cb8b6a9b86\") " pod="openstack/ovn-controller-ovs-ff8qg" Dec 09 11:38:20 crc kubenswrapper[5002]: I1209 11:38:20.770598 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgw22\" (UniqueName: \"kubernetes.io/projected/5707bebc-bcab-4d7a-a4b9-28cb8b6a9b86-kube-api-access-qgw22\") pod \"ovn-controller-ovs-ff8qg\" (UID: \"5707bebc-bcab-4d7a-a4b9-28cb8b6a9b86\") " pod="openstack/ovn-controller-ovs-ff8qg" Dec 09 11:38:20 crc kubenswrapper[5002]: I1209 11:38:20.770655 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/5707bebc-bcab-4d7a-a4b9-28cb8b6a9b86-var-lib\") pod \"ovn-controller-ovs-ff8qg\" (UID: \"5707bebc-bcab-4d7a-a4b9-28cb8b6a9b86\") " pod="openstack/ovn-controller-ovs-ff8qg" Dec 09 11:38:20 crc kubenswrapper[5002]: I1209 11:38:20.770691 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3e229031-30d4-44e3-897b-b4c4252e7b99-var-run\") pod \"ovn-controller-9glwj\" (UID: \"3e229031-30d4-44e3-897b-b4c4252e7b99\") " pod="openstack/ovn-controller-9glwj" Dec 09 11:38:20 crc kubenswrapper[5002]: I1209 11:38:20.770770 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3e229031-30d4-44e3-897b-b4c4252e7b99-var-run-ovn\") pod \"ovn-controller-9glwj\" (UID: \"3e229031-30d4-44e3-897b-b4c4252e7b99\") " pod="openstack/ovn-controller-9glwj" Dec 09 11:38:20 crc kubenswrapper[5002]: I1209 11:38:20.770836 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3e229031-30d4-44e3-897b-b4c4252e7b99-var-log-ovn\") pod \"ovn-controller-9glwj\" (UID: \"3e229031-30d4-44e3-897b-b4c4252e7b99\") " pod="openstack/ovn-controller-9glwj" Dec 09 11:38:20 crc kubenswrapper[5002]: I1209 11:38:20.770905 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: 
\"kubernetes.io/host-path/5707bebc-bcab-4d7a-a4b9-28cb8b6a9b86-etc-ovs\") pod \"ovn-controller-ovs-ff8qg\" (UID: \"5707bebc-bcab-4d7a-a4b9-28cb8b6a9b86\") " pod="openstack/ovn-controller-ovs-ff8qg" Dec 09 11:38:20 crc kubenswrapper[5002]: I1209 11:38:20.770984 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5707bebc-bcab-4d7a-a4b9-28cb8b6a9b86-var-run\") pod \"ovn-controller-ovs-ff8qg\" (UID: \"5707bebc-bcab-4d7a-a4b9-28cb8b6a9b86\") " pod="openstack/ovn-controller-ovs-ff8qg" Dec 09 11:38:20 crc kubenswrapper[5002]: I1209 11:38:20.771056 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/5707bebc-bcab-4d7a-a4b9-28cb8b6a9b86-var-lib\") pod \"ovn-controller-ovs-ff8qg\" (UID: \"5707bebc-bcab-4d7a-a4b9-28cb8b6a9b86\") " pod="openstack/ovn-controller-ovs-ff8qg" Dec 09 11:38:20 crc kubenswrapper[5002]: I1209 11:38:20.771063 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3e229031-30d4-44e3-897b-b4c4252e7b99-var-log-ovn\") pod \"ovn-controller-9glwj\" (UID: \"3e229031-30d4-44e3-897b-b4c4252e7b99\") " pod="openstack/ovn-controller-9glwj" Dec 09 11:38:20 crc kubenswrapper[5002]: I1209 11:38:20.771093 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdd45\" (UniqueName: \"kubernetes.io/projected/3e229031-30d4-44e3-897b-b4c4252e7b99-kube-api-access-fdd45\") pod \"ovn-controller-9glwj\" (UID: \"3e229031-30d4-44e3-897b-b4c4252e7b99\") " pod="openstack/ovn-controller-9glwj" Dec 09 11:38:20 crc kubenswrapper[5002]: I1209 11:38:20.771106 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/5707bebc-bcab-4d7a-a4b9-28cb8b6a9b86-etc-ovs\") pod \"ovn-controller-ovs-ff8qg\" (UID: \"5707bebc-bcab-4d7a-a4b9-28cb8b6a9b86\") " pod="openstack/ovn-controller-ovs-ff8qg" Dec 09 11:38:20 crc kubenswrapper[5002]: I1209 11:38:20.771120 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3e229031-30d4-44e3-897b-b4c4252e7b99-var-run-ovn\") pod \"ovn-controller-9glwj\" (UID: \"3e229031-30d4-44e3-897b-b4c4252e7b99\") " pod="openstack/ovn-controller-9glwj" Dec 09 11:38:20 crc kubenswrapper[5002]: I1209 11:38:20.771155 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5707bebc-bcab-4d7a-a4b9-28cb8b6a9b86-var-run\") pod \"ovn-controller-ovs-ff8qg\" (UID: \"5707bebc-bcab-4d7a-a4b9-28cb8b6a9b86\") " pod="openstack/ovn-controller-ovs-ff8qg" Dec 09 11:38:20 crc kubenswrapper[5002]: I1209 11:38:20.771168 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3e229031-30d4-44e3-897b-b4c4252e7b99-var-run\") pod \"ovn-controller-9glwj\" (UID: \"3e229031-30d4-44e3-897b-b4c4252e7b99\") " pod="openstack/ovn-controller-9glwj" Dec 09 11:38:20 crc kubenswrapper[5002]: I1209 11:38:20.771279 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e229031-30d4-44e3-897b-b4c4252e7b99-scripts\") pod \"ovn-controller-9glwj\" (UID: \"3e229031-30d4-44e3-897b-b4c4252e7b99\") " pod="openstack/ovn-controller-9glwj" Dec 09 11:38:20 crc kubenswrapper[5002]: I1209 11:38:20.771308 5002 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/5707bebc-bcab-4d7a-a4b9-28cb8b6a9b86-var-log\") pod \"ovn-controller-ovs-ff8qg\" (UID: \"5707bebc-bcab-4d7a-a4b9-28cb8b6a9b86\") " pod="openstack/ovn-controller-ovs-ff8qg" Dec 09 11:38:20 crc kubenswrapper[5002]: I1209 11:38:20.771513 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/5707bebc-bcab-4d7a-a4b9-28cb8b6a9b86-var-log\") pod \"ovn-controller-ovs-ff8qg\" (UID: \"5707bebc-bcab-4d7a-a4b9-28cb8b6a9b86\") " pod="openstack/ovn-controller-ovs-ff8qg" Dec 09 11:38:20 crc kubenswrapper[5002]: I1209 11:38:20.772945 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5707bebc-bcab-4d7a-a4b9-28cb8b6a9b86-scripts\") pod \"ovn-controller-ovs-ff8qg\" (UID: \"5707bebc-bcab-4d7a-a4b9-28cb8b6a9b86\") " pod="openstack/ovn-controller-ovs-ff8qg" Dec 09 11:38:20 crc kubenswrapper[5002]: I1209 11:38:20.773580 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e229031-30d4-44e3-897b-b4c4252e7b99-scripts\") pod \"ovn-controller-9glwj\" (UID: \"3e229031-30d4-44e3-897b-b4c4252e7b99\") " pod="openstack/ovn-controller-9glwj" Dec 09 11:38:20 crc kubenswrapper[5002]: I1209 11:38:20.790453 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgw22\" (UniqueName: \"kubernetes.io/projected/5707bebc-bcab-4d7a-a4b9-28cb8b6a9b86-kube-api-access-qgw22\") pod \"ovn-controller-ovs-ff8qg\" (UID: \"5707bebc-bcab-4d7a-a4b9-28cb8b6a9b86\") " pod="openstack/ovn-controller-ovs-ff8qg" Dec 09 11:38:20 crc kubenswrapper[5002]: I1209 11:38:20.791379 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdd45\" (UniqueName: \"kubernetes.io/projected/3e229031-30d4-44e3-897b-b4c4252e7b99-kube-api-access-fdd45\") pod \"ovn-controller-9glwj\" (UID: \"3e229031-30d4-44e3-897b-b4c4252e7b99\") " pod="openstack/ovn-controller-9glwj" Dec 09 11:38:20 crc kubenswrapper[5002]: I1209 11:38:20.876823 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9glwj" Dec 09 11:38:20 crc kubenswrapper[5002]: I1209 11:38:20.901738 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-ff8qg" Dec 09 11:38:21 crc kubenswrapper[5002]: I1209 11:38:21.343353 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9glwj"] Dec 09 11:38:21 crc kubenswrapper[5002]: I1209 11:38:21.768943 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-ff8qg"] Dec 09 11:38:21 crc kubenswrapper[5002]: W1209 11:38:21.772588 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5707bebc_bcab_4d7a_a4b9_28cb8b6a9b86.slice/crio-293580b040db3a3a4186f9cfc1283003a87648988be8fed69aada2c71a1c4a45 WatchSource:0}: Error finding container 293580b040db3a3a4186f9cfc1283003a87648988be8fed69aada2c71a1c4a45: Status 404 returned error can't find the container with id 293580b040db3a3a4186f9cfc1283003a87648988be8fed69aada2c71a1c4a45 Dec 09 11:38:21 crc kubenswrapper[5002]: I1209 11:38:21.840428 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-wxgg4"] Dec 09 11:38:21 crc kubenswrapper[5002]: I1209 11:38:21.843523 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-wxgg4" Dec 09 11:38:21 crc kubenswrapper[5002]: I1209 11:38:21.851042 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 09 11:38:21 crc kubenswrapper[5002]: I1209 11:38:21.852514 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-wxgg4"] Dec 09 11:38:21 crc kubenswrapper[5002]: I1209 11:38:21.995371 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/f7449046-0fa6-4724-9f8c-dd9cff7bc95d-ovn-rundir\") pod \"ovn-controller-metrics-wxgg4\" (UID: \"f7449046-0fa6-4724-9f8c-dd9cff7bc95d\") " pod="openstack/ovn-controller-metrics-wxgg4" Dec 09 11:38:21 crc kubenswrapper[5002]: I1209 11:38:21.995451 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d2hn\" (UniqueName: \"kubernetes.io/projected/f7449046-0fa6-4724-9f8c-dd9cff7bc95d-kube-api-access-2d2hn\") pod \"ovn-controller-metrics-wxgg4\" (UID: \"f7449046-0fa6-4724-9f8c-dd9cff7bc95d\") " pod="openstack/ovn-controller-metrics-wxgg4" Dec 09 11:38:21 crc kubenswrapper[5002]: I1209 11:38:21.995507 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7449046-0fa6-4724-9f8c-dd9cff7bc95d-config\") pod \"ovn-controller-metrics-wxgg4\" (UID: \"f7449046-0fa6-4724-9f8c-dd9cff7bc95d\") " pod="openstack/ovn-controller-metrics-wxgg4" Dec 09 11:38:21 crc kubenswrapper[5002]: I1209 11:38:21.995575 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/f7449046-0fa6-4724-9f8c-dd9cff7bc95d-ovs-rundir\") pod \"ovn-controller-metrics-wxgg4\" (UID: \"f7449046-0fa6-4724-9f8c-dd9cff7bc95d\") " pod="openstack/ovn-controller-metrics-wxgg4" Dec 09 11:38:22 crc kubenswrapper[5002]: I1209 11:38:22.097689 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/f7449046-0fa6-4724-9f8c-dd9cff7bc95d-ovn-rundir\") pod \"ovn-controller-metrics-wxgg4\" (UID: 
\"f7449046-0fa6-4724-9f8c-dd9cff7bc95d\") " pod="openstack/ovn-controller-metrics-wxgg4" Dec 09 11:38:22 crc kubenswrapper[5002]: I1209 11:38:22.098165 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/f7449046-0fa6-4724-9f8c-dd9cff7bc95d-ovn-rundir\") pod \"ovn-controller-metrics-wxgg4\" (UID: \"f7449046-0fa6-4724-9f8c-dd9cff7bc95d\") " pod="openstack/ovn-controller-metrics-wxgg4" Dec 09 11:38:22 crc kubenswrapper[5002]: I1209 11:38:22.098215 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d2hn\" (UniqueName: \"kubernetes.io/projected/f7449046-0fa6-4724-9f8c-dd9cff7bc95d-kube-api-access-2d2hn\") pod \"ovn-controller-metrics-wxgg4\" (UID: \"f7449046-0fa6-4724-9f8c-dd9cff7bc95d\") " pod="openstack/ovn-controller-metrics-wxgg4" Dec 09 11:38:22 crc kubenswrapper[5002]: I1209 11:38:22.098304 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7449046-0fa6-4724-9f8c-dd9cff7bc95d-config\") pod \"ovn-controller-metrics-wxgg4\" (UID: \"f7449046-0fa6-4724-9f8c-dd9cff7bc95d\") " pod="openstack/ovn-controller-metrics-wxgg4" Dec 09 11:38:22 crc kubenswrapper[5002]: I1209 11:38:22.098398 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/f7449046-0fa6-4724-9f8c-dd9cff7bc95d-ovs-rundir\") pod \"ovn-controller-metrics-wxgg4\" (UID: \"f7449046-0fa6-4724-9f8c-dd9cff7bc95d\") " pod="openstack/ovn-controller-metrics-wxgg4" Dec 09 11:38:22 crc kubenswrapper[5002]: I1209 11:38:22.098587 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/f7449046-0fa6-4724-9f8c-dd9cff7bc95d-ovs-rundir\") pod \"ovn-controller-metrics-wxgg4\" (UID: \"f7449046-0fa6-4724-9f8c-dd9cff7bc95d\") " pod="openstack/ovn-controller-metrics-wxgg4" Dec 09 11:38:22 crc kubenswrapper[5002]: I1209 11:38:22.099272 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7449046-0fa6-4724-9f8c-dd9cff7bc95d-config\") pod \"ovn-controller-metrics-wxgg4\" (UID: \"f7449046-0fa6-4724-9f8c-dd9cff7bc95d\") " pod="openstack/ovn-controller-metrics-wxgg4" Dec 09 11:38:22 crc kubenswrapper[5002]: I1209 11:38:22.123578 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d2hn\" (UniqueName: \"kubernetes.io/projected/f7449046-0fa6-4724-9f8c-dd9cff7bc95d-kube-api-access-2d2hn\") pod \"ovn-controller-metrics-wxgg4\" (UID: \"f7449046-0fa6-4724-9f8c-dd9cff7bc95d\") " pod="openstack/ovn-controller-metrics-wxgg4" Dec 09 11:38:22 crc kubenswrapper[5002]: I1209 11:38:22.201038 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9glwj" event={"ID":"3e229031-30d4-44e3-897b-b4c4252e7b99","Type":"ContainerStarted","Data":"0ae97220bdf71b9847fb365a29e436e7dfa445285afe74c18380cb6eafbb799a"} Dec 09 11:38:22 crc kubenswrapper[5002]: I1209 11:38:22.201105 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9glwj" event={"ID":"3e229031-30d4-44e3-897b-b4c4252e7b99","Type":"ContainerStarted","Data":"d54eb912a46b49fbf2159383b84bd4780903474e903c3dff72a8e0067ac31b79"} Dec 09 11:38:22 crc kubenswrapper[5002]: I1209 11:38:22.201172 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-9glwj" Dec 09 11:38:22 crc 
kubenswrapper[5002]: I1209 11:38:22.202748 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-ff8qg" event={"ID":"5707bebc-bcab-4d7a-a4b9-28cb8b6a9b86","Type":"ContainerStarted","Data":"a7d71e633dafdd49298446c63331b0dd34a2721bd1b0a89b5006bedb78323f51"}
Dec 09 11:38:22 crc kubenswrapper[5002]: I1209 11:38:22.202799 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-ff8qg" event={"ID":"5707bebc-bcab-4d7a-a4b9-28cb8b6a9b86","Type":"ContainerStarted","Data":"293580b040db3a3a4186f9cfc1283003a87648988be8fed69aada2c71a1c4a45"}
Dec 09 11:38:22 crc kubenswrapper[5002]: I1209 11:38:22.220732 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-9glwj" podStartSLOduration=2.220715544 podStartE2EDuration="2.220715544s" podCreationTimestamp="2025-12-09 11:38:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:38:22.2142068 +0000 UTC m=+5834.606257871" watchObservedRunningTime="2025-12-09 11:38:22.220715544 +0000 UTC m=+5834.612766625"
Dec 09 11:38:22 crc kubenswrapper[5002]: I1209 11:38:22.231035 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-wxgg4"
Dec 09 11:38:22 crc kubenswrapper[5002]: I1209 11:38:22.395665 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-create-4rz5z"]
Dec 09 11:38:22 crc kubenswrapper[5002]: I1209 11:38:22.397533 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-4rz5z"
Dec 09 11:38:22 crc kubenswrapper[5002]: I1209 11:38:22.407113 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-4rz5z"]
Dec 09 11:38:22 crc kubenswrapper[5002]: I1209 11:38:22.504735 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4a693ed-64c3-4ee0-9b56-505f565ac236-operator-scripts\") pod \"octavia-db-create-4rz5z\" (UID: \"c4a693ed-64c3-4ee0-9b56-505f565ac236\") " pod="openstack/octavia-db-create-4rz5z"
Dec 09 11:38:22 crc kubenswrapper[5002]: I1209 11:38:22.504807 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfbs4\" (UniqueName: \"kubernetes.io/projected/c4a693ed-64c3-4ee0-9b56-505f565ac236-kube-api-access-rfbs4\") pod \"octavia-db-create-4rz5z\" (UID: \"c4a693ed-64c3-4ee0-9b56-505f565ac236\") " pod="openstack/octavia-db-create-4rz5z"
Dec 09 11:38:22 crc kubenswrapper[5002]: I1209 11:38:22.606163 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfbs4\" (UniqueName: \"kubernetes.io/projected/c4a693ed-64c3-4ee0-9b56-505f565ac236-kube-api-access-rfbs4\") pod \"octavia-db-create-4rz5z\" (UID: \"c4a693ed-64c3-4ee0-9b56-505f565ac236\") " pod="openstack/octavia-db-create-4rz5z"
Dec 09 11:38:22 crc kubenswrapper[5002]: I1209 11:38:22.606334 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4a693ed-64c3-4ee0-9b56-505f565ac236-operator-scripts\") pod \"octavia-db-create-4rz5z\" (UID: \"c4a693ed-64c3-4ee0-9b56-505f565ac236\") " pod="openstack/octavia-db-create-4rz5z"
Dec 09 11:38:22 crc kubenswrapper[5002]: I1209 11:38:22.607022 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4a693ed-64c3-4ee0-9b56-505f565ac236-operator-scripts\") pod \"octavia-db-create-4rz5z\" (UID: \"c4a693ed-64c3-4ee0-9b56-505f565ac236\") " pod="openstack/octavia-db-create-4rz5z"
Dec 09 11:38:22 crc kubenswrapper[5002]: I1209 11:38:22.631977 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfbs4\" (UniqueName: \"kubernetes.io/projected/c4a693ed-64c3-4ee0-9b56-505f565ac236-kube-api-access-rfbs4\") pod \"octavia-db-create-4rz5z\" (UID: \"c4a693ed-64c3-4ee0-9b56-505f565ac236\") " pod="openstack/octavia-db-create-4rz5z"
Dec 09 11:38:22 crc kubenswrapper[5002]: I1209 11:38:22.723125 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-4rz5z"
Dec 09 11:38:22 crc kubenswrapper[5002]: I1209 11:38:22.730432 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-wxgg4"]
Dec 09 11:38:23 crc kubenswrapper[5002]: I1209 11:38:23.218099 5002 generic.go:334] "Generic (PLEG): container finished" podID="5707bebc-bcab-4d7a-a4b9-28cb8b6a9b86" containerID="a7d71e633dafdd49298446c63331b0dd34a2721bd1b0a89b5006bedb78323f51" exitCode=0
Dec 09 11:38:23 crc kubenswrapper[5002]: I1209 11:38:23.218465 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-ff8qg" event={"ID":"5707bebc-bcab-4d7a-a4b9-28cb8b6a9b86","Type":"ContainerDied","Data":"a7d71e633dafdd49298446c63331b0dd34a2721bd1b0a89b5006bedb78323f51"}
Dec 09 11:38:23 crc kubenswrapper[5002]: I1209 11:38:23.230133 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-wxgg4" event={"ID":"f7449046-0fa6-4724-9f8c-dd9cff7bc95d","Type":"ContainerStarted","Data":"470c63a920554075afb9c2c5a54c1d64fee97359196b42bcc3ce6769018f08c4"}
Dec 09 11:38:23 crc kubenswrapper[5002]: I1209 11:38:23.230178 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-wxgg4" event={"ID":"f7449046-0fa6-4724-9f8c-dd9cff7bc95d","Type":"ContainerStarted","Data":"49cc85c5e6534975713f04bab941c232e5d639504a96a84b2598fc9ad5c9a3f1"}
Dec 09 11:38:23 crc kubenswrapper[5002]: I1209 11:38:23.234378 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-4rz5z"]
Dec 09 11:38:23 crc kubenswrapper[5002]: I1209 11:38:23.262671 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-wxgg4" podStartSLOduration=2.262648325 podStartE2EDuration="2.262648325s" podCreationTimestamp="2025-12-09 11:38:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:38:23.256714876 +0000 UTC m=+5835.648765957" watchObservedRunningTime="2025-12-09 11:38:23.262648325 +0000 UTC m=+5835.654699396"
Dec 09 11:38:23 crc kubenswrapper[5002]: I1209 11:38:23.792302 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-9572-account-create-update-tndnz"]
Dec 09 11:38:23 crc kubenswrapper[5002]: I1209 11:38:23.794213 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-9572-account-create-update-tndnz"
Dec 09 11:38:23 crc kubenswrapper[5002]: I1209 11:38:23.796043 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-db-secret"
Dec 09 11:38:23 crc kubenswrapper[5002]: I1209 11:38:23.801346 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-9572-account-create-update-tndnz"]
Dec 09 11:38:23 crc kubenswrapper[5002]: I1209 11:38:23.930260 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/acb2a13d-3ba7-4bf4-be68-aac295d2fd0e-operator-scripts\") pod \"octavia-9572-account-create-update-tndnz\" (UID: \"acb2a13d-3ba7-4bf4-be68-aac295d2fd0e\") " pod="openstack/octavia-9572-account-create-update-tndnz"
Dec 09 11:38:23 crc kubenswrapper[5002]: I1209 11:38:23.930617 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fflsq\" (UniqueName: \"kubernetes.io/projected/acb2a13d-3ba7-4bf4-be68-aac295d2fd0e-kube-api-access-fflsq\") pod \"octavia-9572-account-create-update-tndnz\" (UID: \"acb2a13d-3ba7-4bf4-be68-aac295d2fd0e\") " pod="openstack/octavia-9572-account-create-update-tndnz"
Dec 09 11:38:24 crc kubenswrapper[5002]: I1209 11:38:24.032913 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/acb2a13d-3ba7-4bf4-be68-aac295d2fd0e-operator-scripts\") pod \"octavia-9572-account-create-update-tndnz\" (UID: \"acb2a13d-3ba7-4bf4-be68-aac295d2fd0e\") " pod="openstack/octavia-9572-account-create-update-tndnz"
Dec 09 11:38:24 crc kubenswrapper[5002]: I1209 11:38:24.033004 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fflsq\" (UniqueName: \"kubernetes.io/projected/acb2a13d-3ba7-4bf4-be68-aac295d2fd0e-kube-api-access-fflsq\") pod \"octavia-9572-account-create-update-tndnz\" (UID: \"acb2a13d-3ba7-4bf4-be68-aac295d2fd0e\") " pod="openstack/octavia-9572-account-create-update-tndnz"
Dec 09 11:38:24 crc kubenswrapper[5002]: I1209 11:38:24.034176 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/acb2a13d-3ba7-4bf4-be68-aac295d2fd0e-operator-scripts\") pod \"octavia-9572-account-create-update-tndnz\" (UID: \"acb2a13d-3ba7-4bf4-be68-aac295d2fd0e\") " pod="openstack/octavia-9572-account-create-update-tndnz"
Dec 09 11:38:24 crc kubenswrapper[5002]: I1209 11:38:24.053212 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fflsq\" (UniqueName: \"kubernetes.io/projected/acb2a13d-3ba7-4bf4-be68-aac295d2fd0e-kube-api-access-fflsq\") pod \"octavia-9572-account-create-update-tndnz\" (UID: \"acb2a13d-3ba7-4bf4-be68-aac295d2fd0e\") " pod="openstack/octavia-9572-account-create-update-tndnz"
Dec 09 11:38:24 crc kubenswrapper[5002]: I1209 11:38:24.165635 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-9572-account-create-update-tndnz"
Dec 09 11:38:24 crc kubenswrapper[5002]: I1209 11:38:24.244350 5002 generic.go:334] "Generic (PLEG): container finished" podID="c4a693ed-64c3-4ee0-9b56-505f565ac236" containerID="1a26833391e2a4a79ef081dd09398845556a322dc5ac2726c9549d9e3d629d16" exitCode=0
Dec 09 11:38:24 crc kubenswrapper[5002]: I1209 11:38:24.244414 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-4rz5z" event={"ID":"c4a693ed-64c3-4ee0-9b56-505f565ac236","Type":"ContainerDied","Data":"1a26833391e2a4a79ef081dd09398845556a322dc5ac2726c9549d9e3d629d16"}
Dec 09 11:38:24 crc kubenswrapper[5002]: I1209 11:38:24.244441 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-4rz5z" event={"ID":"c4a693ed-64c3-4ee0-9b56-505f565ac236","Type":"ContainerStarted","Data":"eddddfadc348f9dc6566ef75be2843db75a1d68f6447344844b64c67dd349f0a"}
Dec 09 11:38:24 crc kubenswrapper[5002]: I1209 11:38:24.252776 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-ff8qg" event={"ID":"5707bebc-bcab-4d7a-a4b9-28cb8b6a9b86","Type":"ContainerStarted","Data":"d135e231964ca717206e6cc1aaa65846f2d40fdf7ae28d0b96b3933ecbfad780"}
Dec 09 11:38:24 crc kubenswrapper[5002]: I1209 11:38:24.252825 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-ff8qg" event={"ID":"5707bebc-bcab-4d7a-a4b9-28cb8b6a9b86","Type":"ContainerStarted","Data":"a3d752f4740acea45146a8720fef5d639e32df8bb3e1c70b8814ec73eaf32036"}
Dec 09 11:38:24 crc kubenswrapper[5002]: I1209 11:38:24.324387 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-ff8qg" podStartSLOduration=4.324362945 podStartE2EDuration="4.324362945s" podCreationTimestamp="2025-12-09 11:38:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:38:24.30996293 +0000 UTC m=+5836.702014021" watchObservedRunningTime="2025-12-09 11:38:24.324362945 +0000 UTC m=+5836.716414046"
Dec 09 11:38:24 crc kubenswrapper[5002]: I1209 11:38:24.728497 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-9572-account-create-update-tndnz"]
Dec 09 11:38:25 crc kubenswrapper[5002]: I1209 11:38:25.263970 5002 generic.go:334] "Generic (PLEG): container finished" podID="acb2a13d-3ba7-4bf4-be68-aac295d2fd0e" containerID="00ccc1de5aa50d50b2926b04b4d727d3c300a33afd8b92b01a506e300aac9a61" exitCode=0
Dec 09 11:38:25 crc kubenswrapper[5002]: I1209 11:38:25.264200 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-9572-account-create-update-tndnz" event={"ID":"acb2a13d-3ba7-4bf4-be68-aac295d2fd0e","Type":"ContainerDied","Data":"00ccc1de5aa50d50b2926b04b4d727d3c300a33afd8b92b01a506e300aac9a61"}
Dec 09 11:38:25 crc kubenswrapper[5002]: I1209 11:38:25.264319 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-9572-account-create-update-tndnz" event={"ID":"acb2a13d-3ba7-4bf4-be68-aac295d2fd0e","Type":"ContainerStarted","Data":"d07e3cf4319792df188c18de2808aa8af5e504ca8d1f5c295e187297623797ed"}
Dec 09 11:38:25 crc kubenswrapper[5002]: I1209 11:38:25.264871 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-ff8qg"
Dec 09 11:38:25 crc kubenswrapper[5002]: I1209 11:38:25.264894 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-ff8qg"
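Each generic.go:334 "container finished ... exitCode=0" line above is the pod lifecycle event generator (PLEG) noticing a runtime state change during its relist and feeding it into the sync loop as the ContainerDied and ContainerStarted events that follow; for one-shot containers like mariadb-database-create and mariadb-account-create-update, exit code 0 followed by ContainerDied is the success path. The same transitions are visible from the API side as terminated container states. A sketch of watching for them (same client-go imports as the sketch above, plus "fmt"; the pod name is taken from this log):

// Watch the job pod and print container exit codes as the PLEG-driven
// status updates land in the API object.
func printExitCodes(ctx context.Context, client kubernetes.Interface, ns, name string) error {
	w, err := client.CoreV1().Pods(ns).Watch(ctx, metav1.ListOptions{
		FieldSelector: "metadata.name=" + name, // e.g. "octavia-db-create-4rz5z"
	})
	if err != nil {
		return err
	}
	defer w.Stop()
	for ev := range w.ResultChan() {
		pod, ok := ev.Object.(*corev1.Pod)
		if !ok {
			continue
		}
		for _, cs := range pod.Status.ContainerStatuses {
			if t := cs.State.Terminated; t != nil {
				// e.g. "mariadb-database-create terminated: exitCode=0"
				fmt.Printf("%s terminated: exitCode=%d\n", cs.Name, t.ExitCode)
				return nil
			}
		}
	}
	return nil
}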
pod="openstack/ovn-controller-ovs-ff8qg" Dec 09 11:38:25 crc kubenswrapper[5002]: I1209 11:38:25.579368 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-4rz5z" Dec 09 11:38:25 crc kubenswrapper[5002]: I1209 11:38:25.662157 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4a693ed-64c3-4ee0-9b56-505f565ac236-operator-scripts\") pod \"c4a693ed-64c3-4ee0-9b56-505f565ac236\" (UID: \"c4a693ed-64c3-4ee0-9b56-505f565ac236\") " Dec 09 11:38:25 crc kubenswrapper[5002]: I1209 11:38:25.662373 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfbs4\" (UniqueName: \"kubernetes.io/projected/c4a693ed-64c3-4ee0-9b56-505f565ac236-kube-api-access-rfbs4\") pod \"c4a693ed-64c3-4ee0-9b56-505f565ac236\" (UID: \"c4a693ed-64c3-4ee0-9b56-505f565ac236\") " Dec 09 11:38:25 crc kubenswrapper[5002]: I1209 11:38:25.663289 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4a693ed-64c3-4ee0-9b56-505f565ac236-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c4a693ed-64c3-4ee0-9b56-505f565ac236" (UID: "c4a693ed-64c3-4ee0-9b56-505f565ac236"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:38:25 crc kubenswrapper[5002]: I1209 11:38:25.668348 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4a693ed-64c3-4ee0-9b56-505f565ac236-kube-api-access-rfbs4" (OuterVolumeSpecName: "kube-api-access-rfbs4") pod "c4a693ed-64c3-4ee0-9b56-505f565ac236" (UID: "c4a693ed-64c3-4ee0-9b56-505f565ac236"). InnerVolumeSpecName "kube-api-access-rfbs4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:38:25 crc kubenswrapper[5002]: I1209 11:38:25.766992 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfbs4\" (UniqueName: \"kubernetes.io/projected/c4a693ed-64c3-4ee0-9b56-505f565ac236-kube-api-access-rfbs4\") on node \"crc\" DevicePath \"\"" Dec 09 11:38:25 crc kubenswrapper[5002]: I1209 11:38:25.767040 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4a693ed-64c3-4ee0-9b56-505f565ac236-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:38:26 crc kubenswrapper[5002]: I1209 11:38:26.276896 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-4rz5z" event={"ID":"c4a693ed-64c3-4ee0-9b56-505f565ac236","Type":"ContainerDied","Data":"eddddfadc348f9dc6566ef75be2843db75a1d68f6447344844b64c67dd349f0a"} Dec 09 11:38:26 crc kubenswrapper[5002]: I1209 11:38:26.277179 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eddddfadc348f9dc6566ef75be2843db75a1d68f6447344844b64c67dd349f0a" Dec 09 11:38:26 crc kubenswrapper[5002]: I1209 11:38:26.277246 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-4rz5z" Dec 09 11:38:26 crc kubenswrapper[5002]: I1209 11:38:26.679226 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-9572-account-create-update-tndnz" Dec 09 11:38:26 crc kubenswrapper[5002]: I1209 11:38:26.791591 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fflsq\" (UniqueName: \"kubernetes.io/projected/acb2a13d-3ba7-4bf4-be68-aac295d2fd0e-kube-api-access-fflsq\") pod \"acb2a13d-3ba7-4bf4-be68-aac295d2fd0e\" (UID: \"acb2a13d-3ba7-4bf4-be68-aac295d2fd0e\") " Dec 09 11:38:26 crc kubenswrapper[5002]: I1209 11:38:26.791714 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/acb2a13d-3ba7-4bf4-be68-aac295d2fd0e-operator-scripts\") pod \"acb2a13d-3ba7-4bf4-be68-aac295d2fd0e\" (UID: \"acb2a13d-3ba7-4bf4-be68-aac295d2fd0e\") " Dec 09 11:38:26 crc kubenswrapper[5002]: I1209 11:38:26.792440 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acb2a13d-3ba7-4bf4-be68-aac295d2fd0e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "acb2a13d-3ba7-4bf4-be68-aac295d2fd0e" (UID: "acb2a13d-3ba7-4bf4-be68-aac295d2fd0e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:38:26 crc kubenswrapper[5002]: I1209 11:38:26.796621 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acb2a13d-3ba7-4bf4-be68-aac295d2fd0e-kube-api-access-fflsq" (OuterVolumeSpecName: "kube-api-access-fflsq") pod "acb2a13d-3ba7-4bf4-be68-aac295d2fd0e" (UID: "acb2a13d-3ba7-4bf4-be68-aac295d2fd0e"). InnerVolumeSpecName "kube-api-access-fflsq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:38:26 crc kubenswrapper[5002]: I1209 11:38:26.893514 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/acb2a13d-3ba7-4bf4-be68-aac295d2fd0e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:38:26 crc kubenswrapper[5002]: I1209 11:38:26.893548 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fflsq\" (UniqueName: \"kubernetes.io/projected/acb2a13d-3ba7-4bf4-be68-aac295d2fd0e-kube-api-access-fflsq\") on node \"crc\" DevicePath \"\"" Dec 09 11:38:27 crc kubenswrapper[5002]: I1209 11:38:27.287602 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-9572-account-create-update-tndnz" event={"ID":"acb2a13d-3ba7-4bf4-be68-aac295d2fd0e","Type":"ContainerDied","Data":"d07e3cf4319792df188c18de2808aa8af5e504ca8d1f5c295e187297623797ed"} Dec 09 11:38:27 crc kubenswrapper[5002]: I1209 11:38:27.287905 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d07e3cf4319792df188c18de2808aa8af5e504ca8d1f5c295e187297623797ed" Dec 09 11:38:27 crc kubenswrapper[5002]: I1209 11:38:27.287651 5002 util.go:48] "No ready sandbox for pod can be found. 
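Completion then runs the mount path in reverse: operationExecutor.UnmountVolume and UnmountVolume.TearDown remove the ConfigMap and projected-token volumes, and reconciler_common.go:293 records each as detached from node crc before the sandbox itself is torn down (the ContainerDied events for the sandbox IDs above). The Succeeded pod objects stay behind in the API until an owner or TTL controller deletes them; a cleanup sketch, under the assumption that nothing else is already doing this (same imports as the sketches above):

// Delete completed one-shot pods in a namespace.
func deleteSucceeded(ctx context.Context, client kubernetes.Interface, ns string) error {
	pods, err := client.CoreV1().Pods(ns).List(ctx, metav1.ListOptions{
		FieldSelector: "status.phase=Succeeded",
	})
	if err != nil {
		return err
	}
	for _, p := range pods.Items {
		if err := client.CoreV1().Pods(ns).Delete(ctx, p.Name, metav1.DeleteOptions{}); err != nil {
			return err
		}
	}
	return nil
}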
Dec 09 11:38:29 crc kubenswrapper[5002]: I1209 11:38:29.386080 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-persistence-db-create-kbfr9"]
Dec 09 11:38:29 crc kubenswrapper[5002]: E1209 11:38:29.386487 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4a693ed-64c3-4ee0-9b56-505f565ac236" containerName="mariadb-database-create"
Dec 09 11:38:29 crc kubenswrapper[5002]: I1209 11:38:29.386502 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4a693ed-64c3-4ee0-9b56-505f565ac236" containerName="mariadb-database-create"
Dec 09 11:38:29 crc kubenswrapper[5002]: E1209 11:38:29.386529 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acb2a13d-3ba7-4bf4-be68-aac295d2fd0e" containerName="mariadb-account-create-update"
Dec 09 11:38:29 crc kubenswrapper[5002]: I1209 11:38:29.386537 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="acb2a13d-3ba7-4bf4-be68-aac295d2fd0e" containerName="mariadb-account-create-update"
Dec 09 11:38:29 crc kubenswrapper[5002]: I1209 11:38:29.386710 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4a693ed-64c3-4ee0-9b56-505f565ac236" containerName="mariadb-database-create"
Dec 09 11:38:29 crc kubenswrapper[5002]: I1209 11:38:29.386730 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="acb2a13d-3ba7-4bf4-be68-aac295d2fd0e" containerName="mariadb-account-create-update"
Dec 09 11:38:29 crc kubenswrapper[5002]: I1209 11:38:29.387503 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-kbfr9"
Dec 09 11:38:29 crc kubenswrapper[5002]: I1209 11:38:29.437243 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-kbfr9"]
Dec 09 11:38:29 crc kubenswrapper[5002]: I1209 11:38:29.547373 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1def173-a869-4bd2-9013-32339fd3ff8f-operator-scripts\") pod \"octavia-persistence-db-create-kbfr9\" (UID: \"d1def173-a869-4bd2-9013-32339fd3ff8f\") " pod="openstack/octavia-persistence-db-create-kbfr9"
Dec 09 11:38:29 crc kubenswrapper[5002]: I1209 11:38:29.547482 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9m4rc\" (UniqueName: \"kubernetes.io/projected/d1def173-a869-4bd2-9013-32339fd3ff8f-kube-api-access-9m4rc\") pod \"octavia-persistence-db-create-kbfr9\" (UID: \"d1def173-a869-4bd2-9013-32339fd3ff8f\") " pod="openstack/octavia-persistence-db-create-kbfr9"
Dec 09 11:38:29 crc kubenswrapper[5002]: I1209 11:38:29.649720 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9m4rc\" (UniqueName: \"kubernetes.io/projected/d1def173-a869-4bd2-9013-32339fd3ff8f-kube-api-access-9m4rc\") pod \"octavia-persistence-db-create-kbfr9\" (UID: \"d1def173-a869-4bd2-9013-32339fd3ff8f\") " pod="openstack/octavia-persistence-db-create-kbfr9"
Dec 09 11:38:29 crc kubenswrapper[5002]: I1209 11:38:29.649873 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1def173-a869-4bd2-9013-32339fd3ff8f-operator-scripts\") pod \"octavia-persistence-db-create-kbfr9\" (UID: \"d1def173-a869-4bd2-9013-32339fd3ff8f\") " pod="openstack/octavia-persistence-db-create-kbfr9"
Dec 09 11:38:29 crc kubenswrapper[5002]: I1209 11:38:29.650670 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1def173-a869-4bd2-9013-32339fd3ff8f-operator-scripts\") pod \"octavia-persistence-db-create-kbfr9\" (UID: \"d1def173-a869-4bd2-9013-32339fd3ff8f\") " pod="openstack/octavia-persistence-db-create-kbfr9"
Dec 09 11:38:29 crc kubenswrapper[5002]: I1209 11:38:29.678512 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9m4rc\" (UniqueName: \"kubernetes.io/projected/d1def173-a869-4bd2-9013-32339fd3ff8f-kube-api-access-9m4rc\") pod \"octavia-persistence-db-create-kbfr9\" (UID: \"d1def173-a869-4bd2-9013-32339fd3ff8f\") " pod="openstack/octavia-persistence-db-create-kbfr9"
Dec 09 11:38:29 crc kubenswrapper[5002]: I1209 11:38:29.707209 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-kbfr9"
Dec 09 11:38:30 crc kubenswrapper[5002]: I1209 11:38:30.076915 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-5d20-account-create-update-hklfw"]
Dec 09 11:38:30 crc kubenswrapper[5002]: I1209 11:38:30.078431 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-5d20-account-create-update-hklfw"
Dec 09 11:38:30 crc kubenswrapper[5002]: I1209 11:38:30.080474 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-persistence-db-secret"
Dec 09 11:38:30 crc kubenswrapper[5002]: I1209 11:38:30.083475 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-5d20-account-create-update-hklfw"]
Dec 09 11:38:30 crc kubenswrapper[5002]: I1209 11:38:30.163891 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ada679ee-2fbf-4aef-bf7a-82eaa257438a-operator-scripts\") pod \"octavia-5d20-account-create-update-hklfw\" (UID: \"ada679ee-2fbf-4aef-bf7a-82eaa257438a\") " pod="openstack/octavia-5d20-account-create-update-hklfw"
Dec 09 11:38:30 crc kubenswrapper[5002]: I1209 11:38:30.164009 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfg2c\" (UniqueName: \"kubernetes.io/projected/ada679ee-2fbf-4aef-bf7a-82eaa257438a-kube-api-access-rfg2c\") pod \"octavia-5d20-account-create-update-hklfw\" (UID: \"ada679ee-2fbf-4aef-bf7a-82eaa257438a\") " pod="openstack/octavia-5d20-account-create-update-hklfw"
Dec 09 11:38:30 crc kubenswrapper[5002]: I1209 11:38:30.200370 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-kbfr9"]
Dec 09 11:38:30 crc kubenswrapper[5002]: I1209 11:38:30.266191 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ada679ee-2fbf-4aef-bf7a-82eaa257438a-operator-scripts\") pod \"octavia-5d20-account-create-update-hklfw\" (UID: \"ada679ee-2fbf-4aef-bf7a-82eaa257438a\") " pod="openstack/octavia-5d20-account-create-update-hklfw"
Dec 09 11:38:30 crc kubenswrapper[5002]: I1209 11:38:30.266743 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfg2c\" (UniqueName: \"kubernetes.io/projected/ada679ee-2fbf-4aef-bf7a-82eaa257438a-kube-api-access-rfg2c\") pod \"octavia-5d20-account-create-update-hklfw\" (UID: \"ada679ee-2fbf-4aef-bf7a-82eaa257438a\") " pod="openstack/octavia-5d20-account-create-update-hklfw"
Dec 09 11:38:30 crc kubenswrapper[5002]: I1209 11:38:30.267377 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ada679ee-2fbf-4aef-bf7a-82eaa257438a-operator-scripts\") pod \"octavia-5d20-account-create-update-hklfw\" (UID: \"ada679ee-2fbf-4aef-bf7a-82eaa257438a\") " pod="openstack/octavia-5d20-account-create-update-hklfw"
Dec 09 11:38:30 crc kubenswrapper[5002]: I1209 11:38:30.294945 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfg2c\" (UniqueName: \"kubernetes.io/projected/ada679ee-2fbf-4aef-bf7a-82eaa257438a-kube-api-access-rfg2c\") pod \"octavia-5d20-account-create-update-hklfw\" (UID: \"ada679ee-2fbf-4aef-bf7a-82eaa257438a\") " pod="openstack/octavia-5d20-account-create-update-hklfw"
Dec 09 11:38:30 crc kubenswrapper[5002]: I1209 11:38:30.323899 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-kbfr9" event={"ID":"d1def173-a869-4bd2-9013-32339fd3ff8f","Type":"ContainerStarted","Data":"73714bac0953de26abc1af30618515cf7bf7497d1bdea18a5d861da507116509"}
Dec 09 11:38:30 crc kubenswrapper[5002]: I1209 11:38:30.406055 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-5d20-account-create-update-hklfw"
Dec 09 11:38:30 crc kubenswrapper[5002]: I1209 11:38:30.932234 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-5d20-account-create-update-hklfw"]
Dec 09 11:38:31 crc kubenswrapper[5002]: I1209 11:38:31.337090 5002 generic.go:334] "Generic (PLEG): container finished" podID="d1def173-a869-4bd2-9013-32339fd3ff8f" containerID="aad73eaa74f82fba0bc03f2558f65514bfddb37fde2a01411140bb881a849a62" exitCode=0
Dec 09 11:38:31 crc kubenswrapper[5002]: I1209 11:38:31.337163 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-kbfr9" event={"ID":"d1def173-a869-4bd2-9013-32339fd3ff8f","Type":"ContainerDied","Data":"aad73eaa74f82fba0bc03f2558f65514bfddb37fde2a01411140bb881a849a62"}
Dec 09 11:38:31 crc kubenswrapper[5002]: I1209 11:38:31.340159 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-5d20-account-create-update-hklfw" event={"ID":"ada679ee-2fbf-4aef-bf7a-82eaa257438a","Type":"ContainerStarted","Data":"d8ed1401f64444376d1fdd43134a78bef62b903fc936675ec019b767876ce995"}
Dec 09 11:38:31 crc kubenswrapper[5002]: I1209 11:38:31.340207 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-5d20-account-create-update-hklfw" event={"ID":"ada679ee-2fbf-4aef-bf7a-82eaa257438a","Type":"ContainerStarted","Data":"2daf098067d0ac4b9013f921888dddf9961f310e75f0b1b2aab32007c2d875d2"}
Dec 09 11:38:31 crc kubenswrapper[5002]: I1209 11:38:31.385860 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-5d20-account-create-update-hklfw" podStartSLOduration=1.3858100819999999 podStartE2EDuration="1.385810082s" podCreationTimestamp="2025-12-09 11:38:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:38:31.377299374 +0000 UTC m=+5843.769350475" watchObservedRunningTime="2025-12-09 11:38:31.385810082 +0000 UTC m=+5843.777861183"
Dec 09 11:38:32 crc kubenswrapper[5002]: I1209 11:38:32.350124 5002 generic.go:334] "Generic (PLEG): container finished" podID="ada679ee-2fbf-4aef-bf7a-82eaa257438a" containerID="d8ed1401f64444376d1fdd43134a78bef62b903fc936675ec019b767876ce995" exitCode=0
Dec 09 11:38:32 crc kubenswrapper[5002]: I1209 11:38:32.350252 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-5d20-account-create-update-hklfw" event={"ID":"ada679ee-2fbf-4aef-bf7a-82eaa257438a","Type":"ContainerDied","Data":"d8ed1401f64444376d1fdd43134a78bef62b903fc936675ec019b767876ce995"}
Dec 09 11:38:32 crc kubenswrapper[5002]: I1209 11:38:32.789921 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-kbfr9"
Dec 09 11:38:32 crc kubenswrapper[5002]: I1209 11:38:32.917987 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1def173-a869-4bd2-9013-32339fd3ff8f-operator-scripts\") pod \"d1def173-a869-4bd2-9013-32339fd3ff8f\" (UID: \"d1def173-a869-4bd2-9013-32339fd3ff8f\") "
Dec 09 11:38:32 crc kubenswrapper[5002]: I1209 11:38:32.918047 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9m4rc\" (UniqueName: \"kubernetes.io/projected/d1def173-a869-4bd2-9013-32339fd3ff8f-kube-api-access-9m4rc\") pod \"d1def173-a869-4bd2-9013-32339fd3ff8f\" (UID: \"d1def173-a869-4bd2-9013-32339fd3ff8f\") "
Dec 09 11:38:32 crc kubenswrapper[5002]: I1209 11:38:32.918788 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1def173-a869-4bd2-9013-32339fd3ff8f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d1def173-a869-4bd2-9013-32339fd3ff8f" (UID: "d1def173-a869-4bd2-9013-32339fd3ff8f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 11:38:32 crc kubenswrapper[5002]: I1209 11:38:32.924504 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1def173-a869-4bd2-9013-32339fd3ff8f-kube-api-access-9m4rc" (OuterVolumeSpecName: "kube-api-access-9m4rc") pod "d1def173-a869-4bd2-9013-32339fd3ff8f" (UID: "d1def173-a869-4bd2-9013-32339fd3ff8f"). InnerVolumeSpecName "kube-api-access-9m4rc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 11:38:33 crc kubenswrapper[5002]: I1209 11:38:33.020662 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1def173-a869-4bd2-9013-32339fd3ff8f-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 09 11:38:33 crc kubenswrapper[5002]: I1209 11:38:33.020696 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9m4rc\" (UniqueName: \"kubernetes.io/projected/d1def173-a869-4bd2-9013-32339fd3ff8f-kube-api-access-9m4rc\") on node \"crc\" DevicePath \"\""
Dec 09 11:38:33 crc kubenswrapper[5002]: I1209 11:38:33.362690 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-kbfr9"
Dec 09 11:38:33 crc kubenswrapper[5002]: I1209 11:38:33.362717 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-kbfr9" event={"ID":"d1def173-a869-4bd2-9013-32339fd3ff8f","Type":"ContainerDied","Data":"73714bac0953de26abc1af30618515cf7bf7497d1bdea18a5d861da507116509"}
Dec 09 11:38:33 crc kubenswrapper[5002]: I1209 11:38:33.362768 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73714bac0953de26abc1af30618515cf7bf7497d1bdea18a5d861da507116509"
Dec 09 11:38:33 crc kubenswrapper[5002]: I1209 11:38:33.727013 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-5d20-account-create-update-hklfw"
Dec 09 11:38:33 crc kubenswrapper[5002]: I1209 11:38:33.835890 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfg2c\" (UniqueName: \"kubernetes.io/projected/ada679ee-2fbf-4aef-bf7a-82eaa257438a-kube-api-access-rfg2c\") pod \"ada679ee-2fbf-4aef-bf7a-82eaa257438a\" (UID: \"ada679ee-2fbf-4aef-bf7a-82eaa257438a\") "
Dec 09 11:38:33 crc kubenswrapper[5002]: I1209 11:38:33.836327 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ada679ee-2fbf-4aef-bf7a-82eaa257438a-operator-scripts\") pod \"ada679ee-2fbf-4aef-bf7a-82eaa257438a\" (UID: \"ada679ee-2fbf-4aef-bf7a-82eaa257438a\") "
Dec 09 11:38:33 crc kubenswrapper[5002]: I1209 11:38:33.836864 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ada679ee-2fbf-4aef-bf7a-82eaa257438a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ada679ee-2fbf-4aef-bf7a-82eaa257438a" (UID: "ada679ee-2fbf-4aef-bf7a-82eaa257438a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 11:38:33 crc kubenswrapper[5002]: I1209 11:38:33.837423 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ada679ee-2fbf-4aef-bf7a-82eaa257438a-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 09 11:38:33 crc kubenswrapper[5002]: I1209 11:38:33.840632 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ada679ee-2fbf-4aef-bf7a-82eaa257438a-kube-api-access-rfg2c" (OuterVolumeSpecName: "kube-api-access-rfg2c") pod "ada679ee-2fbf-4aef-bf7a-82eaa257438a" (UID: "ada679ee-2fbf-4aef-bf7a-82eaa257438a"). InnerVolumeSpecName "kube-api-access-rfg2c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 11:38:33 crc kubenswrapper[5002]: I1209 11:38:33.939014 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfg2c\" (UniqueName: \"kubernetes.io/projected/ada679ee-2fbf-4aef-bf7a-82eaa257438a-kube-api-access-rfg2c\") on node \"crc\" DevicePath \"\""
Dec 09 11:38:34 crc kubenswrapper[5002]: I1209 11:38:34.372501 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-5d20-account-create-update-hklfw" event={"ID":"ada679ee-2fbf-4aef-bf7a-82eaa257438a","Type":"ContainerDied","Data":"2daf098067d0ac4b9013f921888dddf9961f310e75f0b1b2aab32007c2d875d2"}
Dec 09 11:38:34 crc kubenswrapper[5002]: I1209 11:38:34.372546 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2daf098067d0ac4b9013f921888dddf9961f310e75f0b1b2aab32007c2d875d2"
Dec 09 11:38:34 crc kubenswrapper[5002]: I1209 11:38:34.372603 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-5d20-account-create-update-hklfw"
Dec 09 11:38:35 crc kubenswrapper[5002]: I1209 11:38:35.472329 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-api-7f5cdc56bb-d9kbw"]
Dec 09 11:38:35 crc kubenswrapper[5002]: E1209 11:38:35.473096 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1def173-a869-4bd2-9013-32339fd3ff8f" containerName="mariadb-database-create"
Dec 09 11:38:35 crc kubenswrapper[5002]: I1209 11:38:35.473116 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1def173-a869-4bd2-9013-32339fd3ff8f" containerName="mariadb-database-create"
Dec 09 11:38:35 crc kubenswrapper[5002]: E1209 11:38:35.473156 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ada679ee-2fbf-4aef-bf7a-82eaa257438a" containerName="mariadb-account-create-update"
Dec 09 11:38:35 crc kubenswrapper[5002]: I1209 11:38:35.473166 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="ada679ee-2fbf-4aef-bf7a-82eaa257438a" containerName="mariadb-account-create-update"
Dec 09 11:38:35 crc kubenswrapper[5002]: I1209 11:38:35.473381 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1def173-a869-4bd2-9013-32339fd3ff8f" containerName="mariadb-database-create"
Dec 09 11:38:35 crc kubenswrapper[5002]: I1209 11:38:35.473399 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="ada679ee-2fbf-4aef-bf7a-82eaa257438a" containerName="mariadb-account-create-update"
Dec 09 11:38:35 crc kubenswrapper[5002]: I1209 11:38:35.477016 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-7f5cdc56bb-d9kbw"
Dec 09 11:38:35 crc kubenswrapper[5002]: I1209 11:38:35.479918 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-scripts"
Dec 09 11:38:35 crc kubenswrapper[5002]: I1209 11:38:35.480160 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-octavia-dockercfg-65g8c"
Dec 09 11:38:35 crc kubenswrapper[5002]: I1209 11:38:35.480533 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-config-data"
Dec 09 11:38:35 crc kubenswrapper[5002]: I1209 11:38:35.487251 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-7f5cdc56bb-d9kbw"]
Dec 09 11:38:35 crc kubenswrapper[5002]: I1209 11:38:35.566688 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8e7b10c-91b5-4632-9ded-327e013f19db-scripts\") pod \"octavia-api-7f5cdc56bb-d9kbw\" (UID: \"a8e7b10c-91b5-4632-9ded-327e013f19db\") " pod="openstack/octavia-api-7f5cdc56bb-d9kbw"
Dec 09 11:38:35 crc kubenswrapper[5002]: I1209 11:38:35.567039 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/a8e7b10c-91b5-4632-9ded-327e013f19db-config-data-merged\") pod \"octavia-api-7f5cdc56bb-d9kbw\" (UID: \"a8e7b10c-91b5-4632-9ded-327e013f19db\") " pod="openstack/octavia-api-7f5cdc56bb-d9kbw"
Dec 09 11:38:35 crc kubenswrapper[5002]: I1209 11:38:35.567189 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8e7b10c-91b5-4632-9ded-327e013f19db-combined-ca-bundle\") pod \"octavia-api-7f5cdc56bb-d9kbw\" (UID: \"a8e7b10c-91b5-4632-9ded-327e013f19db\") " pod="openstack/octavia-api-7f5cdc56bb-d9kbw"
Dec 09 11:38:35 crc kubenswrapper[5002]: I1209 11:38:35.567322 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/a8e7b10c-91b5-4632-9ded-327e013f19db-octavia-run\") pod \"octavia-api-7f5cdc56bb-d9kbw\" (UID: \"a8e7b10c-91b5-4632-9ded-327e013f19db\") " pod="openstack/octavia-api-7f5cdc56bb-d9kbw"
Dec 09 11:38:35 crc kubenswrapper[5002]: I1209 11:38:35.567590 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8e7b10c-91b5-4632-9ded-327e013f19db-config-data\") pod \"octavia-api-7f5cdc56bb-d9kbw\" (UID: \"a8e7b10c-91b5-4632-9ded-327e013f19db\") " pod="openstack/octavia-api-7f5cdc56bb-d9kbw"
Dec 09 11:38:35 crc kubenswrapper[5002]: I1209 11:38:35.669510 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8e7b10c-91b5-4632-9ded-327e013f19db-scripts\") pod \"octavia-api-7f5cdc56bb-d9kbw\" (UID: \"a8e7b10c-91b5-4632-9ded-327e013f19db\") " pod="openstack/octavia-api-7f5cdc56bb-d9kbw"
Dec 09 11:38:35 crc kubenswrapper[5002]: I1209 11:38:35.669852 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/a8e7b10c-91b5-4632-9ded-327e013f19db-config-data-merged\") pod \"octavia-api-7f5cdc56bb-d9kbw\" (UID: \"a8e7b10c-91b5-4632-9ded-327e013f19db\") " pod="openstack/octavia-api-7f5cdc56bb-d9kbw"
Dec 09 11:38:35 crc kubenswrapper[5002]: I1209 11:38:35.670007 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8e7b10c-91b5-4632-9ded-327e013f19db-combined-ca-bundle\") pod \"octavia-api-7f5cdc56bb-d9kbw\" (UID: \"a8e7b10c-91b5-4632-9ded-327e013f19db\") " pod="openstack/octavia-api-7f5cdc56bb-d9kbw"
Dec 09 11:38:35 crc kubenswrapper[5002]: I1209 11:38:35.670134 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/a8e7b10c-91b5-4632-9ded-327e013f19db-octavia-run\") pod \"octavia-api-7f5cdc56bb-d9kbw\" (UID: \"a8e7b10c-91b5-4632-9ded-327e013f19db\") " pod="openstack/octavia-api-7f5cdc56bb-d9kbw"
Dec 09 11:38:35 crc kubenswrapper[5002]: I1209 11:38:35.670299 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8e7b10c-91b5-4632-9ded-327e013f19db-config-data\") pod \"octavia-api-7f5cdc56bb-d9kbw\" (UID: \"a8e7b10c-91b5-4632-9ded-327e013f19db\") " pod="openstack/octavia-api-7f5cdc56bb-d9kbw"
Dec 09 11:38:35 crc kubenswrapper[5002]: I1209 11:38:35.670510 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/a8e7b10c-91b5-4632-9ded-327e013f19db-config-data-merged\") pod \"octavia-api-7f5cdc56bb-d9kbw\" (UID: \"a8e7b10c-91b5-4632-9ded-327e013f19db\") " pod="openstack/octavia-api-7f5cdc56bb-d9kbw"
Dec 09 11:38:35 crc kubenswrapper[5002]: I1209 11:38:35.670751 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/a8e7b10c-91b5-4632-9ded-327e013f19db-octavia-run\") pod \"octavia-api-7f5cdc56bb-d9kbw\" (UID: \"a8e7b10c-91b5-4632-9ded-327e013f19db\") " pod="openstack/octavia-api-7f5cdc56bb-d9kbw"
Dec 09 11:38:35 crc kubenswrapper[5002]: I1209 11:38:35.676154 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8e7b10c-91b5-4632-9ded-327e013f19db-config-data\") pod \"octavia-api-7f5cdc56bb-d9kbw\" (UID: \"a8e7b10c-91b5-4632-9ded-327e013f19db\") " pod="openstack/octavia-api-7f5cdc56bb-d9kbw"
Dec 09 11:38:35 crc kubenswrapper[5002]: I1209 11:38:35.679523 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8e7b10c-91b5-4632-9ded-327e013f19db-scripts\") pod \"octavia-api-7f5cdc56bb-d9kbw\" (UID: \"a8e7b10c-91b5-4632-9ded-327e013f19db\") " pod="openstack/octavia-api-7f5cdc56bb-d9kbw"
Dec 09 11:38:35 crc kubenswrapper[5002]: I1209 11:38:35.686117 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8e7b10c-91b5-4632-9ded-327e013f19db-combined-ca-bundle\") pod \"octavia-api-7f5cdc56bb-d9kbw\" (UID: \"a8e7b10c-91b5-4632-9ded-327e013f19db\") " pod="openstack/octavia-api-7f5cdc56bb-d9kbw"
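octavia-api's volume set differs from the job pods above: alongside the scripts, config-data, and combined-ca-bundle Secrets it mounts two emptyDirs, octavia-run and config-data-merged. A "merged" emptyDir like this is the usual hand-off point for an init step that renders configuration before the long-running containers start, which matches the ContainerStarted/ContainerDied sequence for this pod below. A sketch of that layout (volume names from the log; the image, mount paths, and merge command are assumptions, since the log carries none of them):

// Same corev1/metav1 imports as the first sketch.
var octaviaAPISpec = corev1.PodSpec{
	Volumes: []corev1.Volume{
		{Name: "config-data", VolumeSource: corev1.VolumeSource{
			Secret: &corev1.SecretVolumeSource{SecretName: "octavia-api-config-data"},
		}},
		{Name: "config-data-merged", VolumeSource: corev1.VolumeSource{
			EmptyDir: &corev1.EmptyDirVolumeSource{},
		}},
	},
	InitContainers: []corev1.Container{{
		Name:    "init",
		Image:   "quay.io/example/octavia-api:latest", // placeholder
		Command: []string{"/bin/sh", "-c", "cp /config-data/* /config-data-merged/"}, // assumed merge step
		VolumeMounts: []corev1.VolumeMount{
			{Name: "config-data", MountPath: "/config-data"},
			{Name: "config-data-merged", MountPath: "/config-data-merged"},
		},
	}},
	Containers: []corev1.Container{{
		Name:  "octavia-api",
		Image: "quay.io/example/octavia-api:latest", // placeholder
		VolumeMounts: []corev1.VolumeMount{
			{Name: "config-data-merged", MountPath: "/etc/octavia"}, // assumed path
		},
	}},
}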
Dec 09 11:38:35 crc kubenswrapper[5002]: I1209 11:38:35.811278 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-7f5cdc56bb-d9kbw"
Dec 09 11:38:36 crc kubenswrapper[5002]: I1209 11:38:36.293756 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-7f5cdc56bb-d9kbw"]
Dec 09 11:38:36 crc kubenswrapper[5002]: I1209 11:38:36.390705 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-7f5cdc56bb-d9kbw" event={"ID":"a8e7b10c-91b5-4632-9ded-327e013f19db","Type":"ContainerStarted","Data":"e4d6fbc6d4a94083fa64bda2a1a8da013a58a378ba3719bac57ff29bd86fd646"}
Dec 09 11:38:45 crc kubenswrapper[5002]: I1209 11:38:45.514055 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-7f5cdc56bb-d9kbw" event={"ID":"a8e7b10c-91b5-4632-9ded-327e013f19db","Type":"ContainerStarted","Data":"8e23fa390142c364d505d293483a7e6fbc1673c3167bc897348604f014302970"}
Dec 09 11:38:46 crc kubenswrapper[5002]: I1209 11:38:46.527315 5002 generic.go:334] "Generic (PLEG): container finished" podID="a8e7b10c-91b5-4632-9ded-327e013f19db" containerID="8e23fa390142c364d505d293483a7e6fbc1673c3167bc897348604f014302970" exitCode=0
Dec 09 11:38:46 crc kubenswrapper[5002]: I1209 11:38:46.527358 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-7f5cdc56bb-d9kbw" event={"ID":"a8e7b10c-91b5-4632-9ded-327e013f19db","Type":"ContainerDied","Data":"8e23fa390142c364d505d293483a7e6fbc1673c3167bc897348604f014302970"}
Dec 09 11:38:46 crc kubenswrapper[5002]: I1209 11:38:46.923389 5002 scope.go:117] "RemoveContainer" containerID="f407febb120c9e6659c4785dd15bf5b765a05ed089d1b2040ed420e5dd438441"
Dec 09 11:38:46 crc kubenswrapper[5002]: I1209 11:38:46.956889 5002 scope.go:117] "RemoveContainer" containerID="974269e3701da7e1cff1171d1be5ffcfb4a1fc19631b0a3628e4c9713bc1afcd"
Dec 09 11:38:47 crc kubenswrapper[5002]: I1209 11:38:47.004466 5002 scope.go:117] "RemoveContainer" containerID="102d6b1e6d4be24cf22ae770d8c2354b76c69c07e396b1c5b329a14dfce8c125"
Dec 09 11:38:47 crc kubenswrapper[5002]: I1209 11:38:47.039143 5002 scope.go:117] "RemoveContainer" containerID="7d8fd369e838ef25f4aa137e4efdca0a3c0c18df473a76e109e6b92a716d2fb6"
Dec 09 11:38:47 crc kubenswrapper[5002]: I1209 11:38:47.545414 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-7f5cdc56bb-d9kbw" event={"ID":"a8e7b10c-91b5-4632-9ded-327e013f19db","Type":"ContainerStarted","Data":"e3107a9149cc25e124ae06dba26fc0b9f110709cee688994553ae782052a9e95"}
Dec 09 11:38:47 crc kubenswrapper[5002]: I1209 11:38:47.545467 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-7f5cdc56bb-d9kbw" event={"ID":"a8e7b10c-91b5-4632-9ded-327e013f19db","Type":"ContainerStarted","Data":"3735fd9d3b0214c052b5383a66bbac98ab22e7002d3f9620f59333b7325d3af4"}
Dec 09 11:38:47 crc kubenswrapper[5002]: I1209 11:38:47.545638 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-7f5cdc56bb-d9kbw"
Dec 09 11:38:47 crc kubenswrapper[5002]: I1209 11:38:47.566220 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-api-7f5cdc56bb-d9kbw" podStartSLOduration=3.726910938 podStartE2EDuration="12.566197161s" podCreationTimestamp="2025-12-09 11:38:35 +0000 UTC" firstStartedPulling="2025-12-09 11:38:36.304327097 +0000 UTC m=+5848.696378178" lastFinishedPulling="2025-12-09 11:38:45.14361332 +0000 UTC m=+5857.535664401" observedRunningTime="2025-12-09 11:38:47.563082038 +0000 UTC m=+5859.955133129" watchObservedRunningTime="2025-12-09 11:38:47.566197161 +0000 UTC m=+5859.958248242"
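This latency entry is the first in this window with real pull timestamps: the image pull ran from 11:38:36.304 to 11:38:45.143, i.e. 8.839286223s, and podStartSLOduration=3.726910938 is exactly podStartE2EDuration (12.566197161s) minus that pull window. For the earlier pods the pull fields are zero values ("0001-01-01 00:00:00 +0000 UTC"), meaning no pull happened and the SLO duration equals the end-to-end duration. A sketch of the arithmetic (not the kubelet's own code, but it reproduces the figures logged here; assumes "time" is imported):

// SLO-relevant startup time excludes image-pull time.
func startSLO(created, running, firstPull, lastPull time.Time) time.Duration {
	e2e := running.Sub(created)
	if firstPull.IsZero() || lastPull.IsZero() {
		return e2e // no pull: SLO == E2E, as for the job pods above
	}
	// 12.566197161s - 8.839286223s = 3.726910938s for octavia-api above.
	return e2e - lastPull.Sub(firstPull)
}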
Dec 09 11:38:48 crc kubenswrapper[5002]: I1209 11:38:48.554453 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-7f5cdc56bb-d9kbw"
Dec 09 11:38:55 crc kubenswrapper[5002]: I1209 11:38:55.928480 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-9glwj" podUID="3e229031-30d4-44e3-897b-b4c4252e7b99" containerName="ovn-controller" probeResult="failure" output=<
Dec 09 11:38:55 crc kubenswrapper[5002]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Dec 09 11:38:55 crc kubenswrapper[5002]: >
Dec 09 11:38:55 crc kubenswrapper[5002]: I1209 11:38:55.959201 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-ff8qg"
Dec 09 11:38:55 crc kubenswrapper[5002]: I1209 11:38:55.959665 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-ff8qg"
Dec 09 11:38:56 crc kubenswrapper[5002]: I1209 11:38:56.102650 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-9glwj-config-4drjr"]
Dec 09 11:38:56 crc kubenswrapper[5002]: I1209 11:38:56.103863 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9glwj-config-4drjr"
Dec 09 11:38:56 crc kubenswrapper[5002]: I1209 11:38:56.109701 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Dec 09 11:38:56 crc kubenswrapper[5002]: I1209 11:38:56.179467 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssrkz\" (UniqueName: \"kubernetes.io/projected/a5edbbee-2307-4ce7-bd68-4d2510758050-kube-api-access-ssrkz\") pod \"ovn-controller-9glwj-config-4drjr\" (UID: \"a5edbbee-2307-4ce7-bd68-4d2510758050\") " pod="openstack/ovn-controller-9glwj-config-4drjr"
Dec 09 11:38:56 crc kubenswrapper[5002]: I1209 11:38:56.179559 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a5edbbee-2307-4ce7-bd68-4d2510758050-var-log-ovn\") pod \"ovn-controller-9glwj-config-4drjr\" (UID: \"a5edbbee-2307-4ce7-bd68-4d2510758050\") " pod="openstack/ovn-controller-9glwj-config-4drjr"
Dec 09 11:38:56 crc kubenswrapper[5002]: I1209 11:38:56.179616 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a5edbbee-2307-4ce7-bd68-4d2510758050-var-run\") pod \"ovn-controller-9glwj-config-4drjr\" (UID: \"a5edbbee-2307-4ce7-bd68-4d2510758050\") " pod="openstack/ovn-controller-9glwj-config-4drjr"
Dec 09 11:38:56 crc kubenswrapper[5002]: I1209 11:38:56.179656 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a5edbbee-2307-4ce7-bd68-4d2510758050-scripts\") pod \"ovn-controller-9glwj-config-4drjr\" (UID: \"a5edbbee-2307-4ce7-bd68-4d2510758050\") " pod="openstack/ovn-controller-9glwj-config-4drjr"
Dec 09 11:38:56 crc kubenswrapper[5002]: I1209 11:38:56.179696 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a5edbbee-2307-4ce7-bd68-4d2510758050-var-run-ovn\") pod \"ovn-controller-9glwj-config-4drjr\" (UID: \"a5edbbee-2307-4ce7-bd68-4d2510758050\") " pod="openstack/ovn-controller-9glwj-config-4drjr"
Dec 09 11:38:56 crc kubenswrapper[5002]: I1209 11:38:56.179802 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a5edbbee-2307-4ce7-bd68-4d2510758050-additional-scripts\") pod \"ovn-controller-9glwj-config-4drjr\" (UID: \"a5edbbee-2307-4ce7-bd68-4d2510758050\") " pod="openstack/ovn-controller-9glwj-config-4drjr"
Dec 09 11:38:56 crc kubenswrapper[5002]: I1209 11:38:56.197253 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9glwj-config-4drjr"]
Dec 09 11:38:56 crc kubenswrapper[5002]: I1209 11:38:56.281699 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a5edbbee-2307-4ce7-bd68-4d2510758050-var-log-ovn\") pod \"ovn-controller-9glwj-config-4drjr\" (UID: \"a5edbbee-2307-4ce7-bd68-4d2510758050\") " pod="openstack/ovn-controller-9glwj-config-4drjr"
Dec 09 11:38:56 crc kubenswrapper[5002]: I1209 11:38:56.281774 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a5edbbee-2307-4ce7-bd68-4d2510758050-var-run\") pod \"ovn-controller-9glwj-config-4drjr\" (UID: \"a5edbbee-2307-4ce7-bd68-4d2510758050\") " pod="openstack/ovn-controller-9glwj-config-4drjr"
Dec 09 11:38:56 crc kubenswrapper[5002]: I1209 11:38:56.281833 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a5edbbee-2307-4ce7-bd68-4d2510758050-scripts\") pod \"ovn-controller-9glwj-config-4drjr\" (UID: \"a5edbbee-2307-4ce7-bd68-4d2510758050\") " pod="openstack/ovn-controller-9glwj-config-4drjr"
Dec 09 11:38:56 crc kubenswrapper[5002]: I1209 11:38:56.281942 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a5edbbee-2307-4ce7-bd68-4d2510758050-var-run-ovn\") pod \"ovn-controller-9glwj-config-4drjr\" (UID: \"a5edbbee-2307-4ce7-bd68-4d2510758050\") " pod="openstack/ovn-controller-9glwj-config-4drjr"
Dec 09 11:38:56 crc kubenswrapper[5002]: I1209 11:38:56.281978 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a5edbbee-2307-4ce7-bd68-4d2510758050-additional-scripts\") pod \"ovn-controller-9glwj-config-4drjr\" (UID: \"a5edbbee-2307-4ce7-bd68-4d2510758050\") " pod="openstack/ovn-controller-9glwj-config-4drjr"
Dec 09 11:38:56 crc kubenswrapper[5002]: I1209 11:38:56.282048 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssrkz\" (UniqueName: \"kubernetes.io/projected/a5edbbee-2307-4ce7-bd68-4d2510758050-kube-api-access-ssrkz\") pod \"ovn-controller-9glwj-config-4drjr\" (UID: \"a5edbbee-2307-4ce7-bd68-4d2510758050\") " pod="openstack/ovn-controller-9glwj-config-4drjr"
Dec 09 11:38:56 crc kubenswrapper[5002]: I1209 11:38:56.282169 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a5edbbee-2307-4ce7-bd68-4d2510758050-var-run\") pod \"ovn-controller-9glwj-config-4drjr\" (UID: \"a5edbbee-2307-4ce7-bd68-4d2510758050\") " pod="openstack/ovn-controller-9glwj-config-4drjr"
Dec 09 11:38:56 crc kubenswrapper[5002]: I1209 11:38:56.282174 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a5edbbee-2307-4ce7-bd68-4d2510758050-var-log-ovn\") pod \"ovn-controller-9glwj-config-4drjr\" (UID: \"a5edbbee-2307-4ce7-bd68-4d2510758050\") " pod="openstack/ovn-controller-9glwj-config-4drjr"
Dec 09 11:38:56 crc kubenswrapper[5002]: I1209 11:38:56.282221 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a5edbbee-2307-4ce7-bd68-4d2510758050-var-run-ovn\") pod \"ovn-controller-9glwj-config-4drjr\" (UID: \"a5edbbee-2307-4ce7-bd68-4d2510758050\") " pod="openstack/ovn-controller-9glwj-config-4drjr"
Dec 09 11:38:56 crc kubenswrapper[5002]: I1209 11:38:56.283199 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a5edbbee-2307-4ce7-bd68-4d2510758050-additional-scripts\") pod \"ovn-controller-9glwj-config-4drjr\" (UID: \"a5edbbee-2307-4ce7-bd68-4d2510758050\") " pod="openstack/ovn-controller-9glwj-config-4drjr"
Dec 09 11:38:56 crc kubenswrapper[5002]: I1209 11:38:56.284209 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a5edbbee-2307-4ce7-bd68-4d2510758050-scripts\") pod \"ovn-controller-9glwj-config-4drjr\" (UID: \"a5edbbee-2307-4ce7-bd68-4d2510758050\") " pod="openstack/ovn-controller-9glwj-config-4drjr"
Dec 09 11:38:56 crc kubenswrapper[5002]: I1209 11:38:56.302011 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssrkz\" (UniqueName: \"kubernetes.io/projected/a5edbbee-2307-4ce7-bd68-4d2510758050-kube-api-access-ssrkz\") pod \"ovn-controller-9glwj-config-4drjr\" (UID: \"a5edbbee-2307-4ce7-bd68-4d2510758050\") " pod="openstack/ovn-controller-9glwj-config-4drjr"
Dec 09 11:38:56 crc kubenswrapper[5002]: I1209 11:38:56.468244 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9glwj-config-4drjr"
Dec 09 11:38:57 crc kubenswrapper[5002]: I1209 11:38:57.061050 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9glwj-config-4drjr"]
Dec 09 11:38:57 crc kubenswrapper[5002]: W1209 11:38:57.073597 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5edbbee_2307_4ce7_bd68_4d2510758050.slice/crio-1a04e1183d5cf06337e0a453f228ddcddb477caade7ecd5ec88dd7839263b29f WatchSource:0}: Error finding container 1a04e1183d5cf06337e0a453f228ddcddb477caade7ecd5ec88dd7839263b29f: Status 404 returned error can't find the container with id 1a04e1183d5cf06337e0a453f228ddcddb477caade7ecd5ec88dd7839263b29f
Dec 09 11:38:57 crc kubenswrapper[5002]: I1209 11:38:57.188088 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-rsyslog-5gzkr"]
Dec 09 11:38:57 crc kubenswrapper[5002]: I1209 11:38:57.190810 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-5gzkr"
Dec 09 11:38:57 crc kubenswrapper[5002]: I1209 11:38:57.206461 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-scripts"
Dec 09 11:38:57 crc kubenswrapper[5002]: I1209 11:38:57.206736 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"octavia-hmport-map"
Dec 09 11:38:57 crc kubenswrapper[5002]: I1209 11:38:57.206942 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-config-data"
Dec 09 11:38:57 crc kubenswrapper[5002]: I1209 11:38:57.207041 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-5gzkr"]
Dec 09 11:38:57 crc kubenswrapper[5002]: I1209 11:38:57.313074 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/a44e68c5-c542-4f82-af1f-a3405ef59656-hm-ports\") pod \"octavia-rsyslog-5gzkr\" (UID: \"a44e68c5-c542-4f82-af1f-a3405ef59656\") " pod="openstack/octavia-rsyslog-5gzkr"
Dec 09 11:38:57 crc kubenswrapper[5002]: I1209 11:38:57.313135 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a44e68c5-c542-4f82-af1f-a3405ef59656-config-data\") pod \"octavia-rsyslog-5gzkr\" (UID: \"a44e68c5-c542-4f82-af1f-a3405ef59656\") " pod="openstack/octavia-rsyslog-5gzkr"
Dec 09 11:38:57 crc kubenswrapper[5002]: I1209 11:38:57.313305 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/a44e68c5-c542-4f82-af1f-a3405ef59656-config-data-merged\") pod \"octavia-rsyslog-5gzkr\" (UID: \"a44e68c5-c542-4f82-af1f-a3405ef59656\") " pod="openstack/octavia-rsyslog-5gzkr"
Dec 09 11:38:57 crc kubenswrapper[5002]: I1209 11:38:57.313368 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a44e68c5-c542-4f82-af1f-a3405ef59656-scripts\") pod \"octavia-rsyslog-5gzkr\" (UID: \"a44e68c5-c542-4f82-af1f-a3405ef59656\") " pod="openstack/octavia-rsyslog-5gzkr"
Dec 09 11:38:57 crc kubenswrapper[5002]: I1209 11:38:57.415582 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/a44e68c5-c542-4f82-af1f-a3405ef59656-hm-ports\") pod \"octavia-rsyslog-5gzkr\" (UID: \"a44e68c5-c542-4f82-af1f-a3405ef59656\") " pod="openstack/octavia-rsyslog-5gzkr"
Dec 09 11:38:57 crc kubenswrapper[5002]: I1209 11:38:57.415629 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a44e68c5-c542-4f82-af1f-a3405ef59656-config-data\") pod \"octavia-rsyslog-5gzkr\" (UID: \"a44e68c5-c542-4f82-af1f-a3405ef59656\") " pod="openstack/octavia-rsyslog-5gzkr"
Dec 09 11:38:57 crc kubenswrapper[5002]: I1209 11:38:57.415691 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a44e68c5-c542-4f82-af1f-a3405ef59656-scripts\") pod \"octavia-rsyslog-5gzkr\" (UID: \"a44e68c5-c542-4f82-af1f-a3405ef59656\") " pod="openstack/octavia-rsyslog-5gzkr"
Dec 09 11:38:57 crc kubenswrapper[5002]: I1209 11:38:57.415709 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/a44e68c5-c542-4f82-af1f-a3405ef59656-config-data-merged\") pod \"octavia-rsyslog-5gzkr\" (UID: \"a44e68c5-c542-4f82-af1f-a3405ef59656\") " pod="openstack/octavia-rsyslog-5gzkr"
Dec 09 11:38:57 crc kubenswrapper[5002]: I1209 11:38:57.416335 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/a44e68c5-c542-4f82-af1f-a3405ef59656-config-data-merged\") pod \"octavia-rsyslog-5gzkr\" (UID: \"a44e68c5-c542-4f82-af1f-a3405ef59656\") " pod="openstack/octavia-rsyslog-5gzkr"
Dec 09 11:38:57 crc kubenswrapper[5002]: I1209 11:38:57.416759 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/a44e68c5-c542-4f82-af1f-a3405ef59656-hm-ports\") pod \"octavia-rsyslog-5gzkr\" (UID: \"a44e68c5-c542-4f82-af1f-a3405ef59656\") " pod="openstack/octavia-rsyslog-5gzkr"
Dec 09 11:38:57 crc kubenswrapper[5002]: I1209 11:38:57.421587 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a44e68c5-c542-4f82-af1f-a3405ef59656-scripts\") pod \"octavia-rsyslog-5gzkr\" (UID: \"a44e68c5-c542-4f82-af1f-a3405ef59656\") " pod="openstack/octavia-rsyslog-5gzkr"
Dec 09 11:38:57 crc kubenswrapper[5002]: I1209 11:38:57.429266 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a44e68c5-c542-4f82-af1f-a3405ef59656-config-data\") pod \"octavia-rsyslog-5gzkr\" (UID: \"a44e68c5-c542-4f82-af1f-a3405ef59656\") " pod="openstack/octavia-rsyslog-5gzkr"
Dec 09 11:38:57 crc kubenswrapper[5002]: I1209 11:38:57.530073 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-5gzkr"
Dec 09 11:38:57 crc kubenswrapper[5002]: I1209 11:38:57.650355 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9glwj-config-4drjr" event={"ID":"a5edbbee-2307-4ce7-bd68-4d2510758050","Type":"ContainerStarted","Data":"7a5bcd0034d52d8610c57f3b65ce9c3e149fbea92fdecac4fe3eeef5b17be908"}
Dec 09 11:38:57 crc kubenswrapper[5002]: I1209 11:38:57.650552 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9glwj-config-4drjr" event={"ID":"a5edbbee-2307-4ce7-bd68-4d2510758050","Type":"ContainerStarted","Data":"1a04e1183d5cf06337e0a453f228ddcddb477caade7ecd5ec88dd7839263b29f"}
Dec 09 11:38:57 crc kubenswrapper[5002]: I1209 11:38:57.677800 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-9glwj-config-4drjr" podStartSLOduration=1.677778547 podStartE2EDuration="1.677778547s" podCreationTimestamp="2025-12-09 11:38:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:38:57.666517376 +0000 UTC m=+5870.058568457" watchObservedRunningTime="2025-12-09 11:38:57.677778547 +0000 UTC m=+5870.069829628"
Dec 09 11:38:58 crc kubenswrapper[5002]: I1209 11:38:58.097784 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-59f8cff499-cs89r"]
Dec 09 11:38:58 crc kubenswrapper[5002]: I1209 11:38:58.103073 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-cs89r"
Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-cs89r" Dec 09 11:38:58 crc kubenswrapper[5002]: I1209 11:38:58.106086 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-cs89r"] Dec 09 11:38:58 crc kubenswrapper[5002]: I1209 11:38:58.107247 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Dec 09 11:38:58 crc kubenswrapper[5002]: I1209 11:38:58.173984 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-5gzkr"] Dec 09 11:38:58 crc kubenswrapper[5002]: W1209 11:38:58.206174 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda44e68c5_c542_4f82_af1f_a3405ef59656.slice/crio-424c978f9096df199d69cf3bf13f875d81808ad86cdd39a30fe8eb66992229c2 WatchSource:0}: Error finding container 424c978f9096df199d69cf3bf13f875d81808ad86cdd39a30fe8eb66992229c2: Status 404 returned error can't find the container with id 424c978f9096df199d69cf3bf13f875d81808ad86cdd39a30fe8eb66992229c2 Dec 09 11:38:58 crc kubenswrapper[5002]: I1209 11:38:58.249574 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4f95a775-11dd-4a27-af99-e10c0c2823f3-httpd-config\") pod \"octavia-image-upload-59f8cff499-cs89r\" (UID: \"4f95a775-11dd-4a27-af99-e10c0c2823f3\") " pod="openstack/octavia-image-upload-59f8cff499-cs89r" Dec 09 11:38:58 crc kubenswrapper[5002]: I1209 11:38:58.250293 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/4f95a775-11dd-4a27-af99-e10c0c2823f3-amphora-image\") pod \"octavia-image-upload-59f8cff499-cs89r\" (UID: \"4f95a775-11dd-4a27-af99-e10c0c2823f3\") " pod="openstack/octavia-image-upload-59f8cff499-cs89r" Dec 09 11:38:58 crc kubenswrapper[5002]: I1209 11:38:58.351911 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/4f95a775-11dd-4a27-af99-e10c0c2823f3-amphora-image\") pod \"octavia-image-upload-59f8cff499-cs89r\" (UID: \"4f95a775-11dd-4a27-af99-e10c0c2823f3\") " pod="openstack/octavia-image-upload-59f8cff499-cs89r" Dec 09 11:38:58 crc kubenswrapper[5002]: I1209 11:38:58.352000 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4f95a775-11dd-4a27-af99-e10c0c2823f3-httpd-config\") pod \"octavia-image-upload-59f8cff499-cs89r\" (UID: \"4f95a775-11dd-4a27-af99-e10c0c2823f3\") " pod="openstack/octavia-image-upload-59f8cff499-cs89r" Dec 09 11:38:58 crc kubenswrapper[5002]: I1209 11:38:58.352343 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/4f95a775-11dd-4a27-af99-e10c0c2823f3-amphora-image\") pod \"octavia-image-upload-59f8cff499-cs89r\" (UID: \"4f95a775-11dd-4a27-af99-e10c0c2823f3\") " pod="openstack/octavia-image-upload-59f8cff499-cs89r" Dec 09 11:38:58 crc kubenswrapper[5002]: I1209 11:38:58.358732 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4f95a775-11dd-4a27-af99-e10c0c2823f3-httpd-config\") pod \"octavia-image-upload-59f8cff499-cs89r\" (UID: \"4f95a775-11dd-4a27-af99-e10c0c2823f3\") " pod="openstack/octavia-image-upload-59f8cff499-cs89r" Dec 
09 11:38:58 crc kubenswrapper[5002]: I1209 11:38:58.431128 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-cs89r" Dec 09 11:38:58 crc kubenswrapper[5002]: I1209 11:38:58.663537 5002 generic.go:334] "Generic (PLEG): container finished" podID="a5edbbee-2307-4ce7-bd68-4d2510758050" containerID="7a5bcd0034d52d8610c57f3b65ce9c3e149fbea92fdecac4fe3eeef5b17be908" exitCode=0 Dec 09 11:38:58 crc kubenswrapper[5002]: I1209 11:38:58.663621 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9glwj-config-4drjr" event={"ID":"a5edbbee-2307-4ce7-bd68-4d2510758050","Type":"ContainerDied","Data":"7a5bcd0034d52d8610c57f3b65ce9c3e149fbea92fdecac4fe3eeef5b17be908"} Dec 09 11:38:58 crc kubenswrapper[5002]: I1209 11:38:58.667469 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-5gzkr" event={"ID":"a44e68c5-c542-4f82-af1f-a3405ef59656","Type":"ContainerStarted","Data":"424c978f9096df199d69cf3bf13f875d81808ad86cdd39a30fe8eb66992229c2"} Dec 09 11:38:58 crc kubenswrapper[5002]: I1209 11:38:58.912705 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-cs89r"] Dec 09 11:38:58 crc kubenswrapper[5002]: W1209 11:38:58.921723 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f95a775_11dd_4a27_af99_e10c0c2823f3.slice/crio-c5f11f30312bdd5b7665414ff5dbf80395f5b083ba0d942174021eea9c3f3338 WatchSource:0}: Error finding container c5f11f30312bdd5b7665414ff5dbf80395f5b083ba0d942174021eea9c3f3338: Status 404 returned error can't find the container with id c5f11f30312bdd5b7665414ff5dbf80395f5b083ba0d942174021eea9c3f3338 Dec 09 11:38:59 crc kubenswrapper[5002]: I1209 11:38:59.344524 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-sync-lcrd7"] Dec 09 11:38:59 crc kubenswrapper[5002]: I1209 11:38:59.348377 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-lcrd7" Dec 09 11:38:59 crc kubenswrapper[5002]: I1209 11:38:59.353123 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-scripts" Dec 09 11:38:59 crc kubenswrapper[5002]: I1209 11:38:59.356018 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-lcrd7"] Dec 09 11:38:59 crc kubenswrapper[5002]: I1209 11:38:59.477404 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d660fbb0-443c-4a2d-9e92-1a89bac33302-config-data\") pod \"octavia-db-sync-lcrd7\" (UID: \"d660fbb0-443c-4a2d-9e92-1a89bac33302\") " pod="openstack/octavia-db-sync-lcrd7" Dec 09 11:38:59 crc kubenswrapper[5002]: I1209 11:38:59.477695 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d660fbb0-443c-4a2d-9e92-1a89bac33302-combined-ca-bundle\") pod \"octavia-db-sync-lcrd7\" (UID: \"d660fbb0-443c-4a2d-9e92-1a89bac33302\") " pod="openstack/octavia-db-sync-lcrd7" Dec 09 11:38:59 crc kubenswrapper[5002]: I1209 11:38:59.477726 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d660fbb0-443c-4a2d-9e92-1a89bac33302-config-data-merged\") pod \"octavia-db-sync-lcrd7\" (UID: \"d660fbb0-443c-4a2d-9e92-1a89bac33302\") " pod="openstack/octavia-db-sync-lcrd7" Dec 09 11:38:59 crc kubenswrapper[5002]: I1209 11:38:59.477796 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d660fbb0-443c-4a2d-9e92-1a89bac33302-scripts\") pod \"octavia-db-sync-lcrd7\" (UID: \"d660fbb0-443c-4a2d-9e92-1a89bac33302\") " pod="openstack/octavia-db-sync-lcrd7" Dec 09 11:38:59 crc kubenswrapper[5002]: I1209 11:38:59.580240 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d660fbb0-443c-4a2d-9e92-1a89bac33302-config-data-merged\") pod \"octavia-db-sync-lcrd7\" (UID: \"d660fbb0-443c-4a2d-9e92-1a89bac33302\") " pod="openstack/octavia-db-sync-lcrd7" Dec 09 11:38:59 crc kubenswrapper[5002]: I1209 11:38:59.580332 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d660fbb0-443c-4a2d-9e92-1a89bac33302-scripts\") pod \"octavia-db-sync-lcrd7\" (UID: \"d660fbb0-443c-4a2d-9e92-1a89bac33302\") " pod="openstack/octavia-db-sync-lcrd7" Dec 09 11:38:59 crc kubenswrapper[5002]: I1209 11:38:59.580531 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d660fbb0-443c-4a2d-9e92-1a89bac33302-config-data\") pod \"octavia-db-sync-lcrd7\" (UID: \"d660fbb0-443c-4a2d-9e92-1a89bac33302\") " pod="openstack/octavia-db-sync-lcrd7" Dec 09 11:38:59 crc kubenswrapper[5002]: I1209 11:38:59.580561 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d660fbb0-443c-4a2d-9e92-1a89bac33302-combined-ca-bundle\") pod \"octavia-db-sync-lcrd7\" (UID: \"d660fbb0-443c-4a2d-9e92-1a89bac33302\") " pod="openstack/octavia-db-sync-lcrd7" Dec 09 11:38:59 crc kubenswrapper[5002]: I1209 11:38:59.580703 5002 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d660fbb0-443c-4a2d-9e92-1a89bac33302-config-data-merged\") pod \"octavia-db-sync-lcrd7\" (UID: \"d660fbb0-443c-4a2d-9e92-1a89bac33302\") " pod="openstack/octavia-db-sync-lcrd7" Dec 09 11:38:59 crc kubenswrapper[5002]: I1209 11:38:59.588190 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d660fbb0-443c-4a2d-9e92-1a89bac33302-combined-ca-bundle\") pod \"octavia-db-sync-lcrd7\" (UID: \"d660fbb0-443c-4a2d-9e92-1a89bac33302\") " pod="openstack/octavia-db-sync-lcrd7" Dec 09 11:38:59 crc kubenswrapper[5002]: I1209 11:38:59.588199 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d660fbb0-443c-4a2d-9e92-1a89bac33302-scripts\") pod \"octavia-db-sync-lcrd7\" (UID: \"d660fbb0-443c-4a2d-9e92-1a89bac33302\") " pod="openstack/octavia-db-sync-lcrd7" Dec 09 11:38:59 crc kubenswrapper[5002]: I1209 11:38:59.588322 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d660fbb0-443c-4a2d-9e92-1a89bac33302-config-data\") pod \"octavia-db-sync-lcrd7\" (UID: \"d660fbb0-443c-4a2d-9e92-1a89bac33302\") " pod="openstack/octavia-db-sync-lcrd7" Dec 09 11:38:59 crc kubenswrapper[5002]: I1209 11:38:59.677965 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-lcrd7" Dec 09 11:38:59 crc kubenswrapper[5002]: I1209 11:38:59.711658 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-cs89r" event={"ID":"4f95a775-11dd-4a27-af99-e10c0c2823f3","Type":"ContainerStarted","Data":"c5f11f30312bdd5b7665414ff5dbf80395f5b083ba0d942174021eea9c3f3338"} Dec 09 11:39:00 crc kubenswrapper[5002]: I1209 11:39:00.078576 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9glwj-config-4drjr" Dec 09 11:39:00 crc kubenswrapper[5002]: I1209 11:39:00.198085 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a5edbbee-2307-4ce7-bd68-4d2510758050-var-run\") pod \"a5edbbee-2307-4ce7-bd68-4d2510758050\" (UID: \"a5edbbee-2307-4ce7-bd68-4d2510758050\") " Dec 09 11:39:00 crc kubenswrapper[5002]: I1209 11:39:00.198368 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a5edbbee-2307-4ce7-bd68-4d2510758050-additional-scripts\") pod \"a5edbbee-2307-4ce7-bd68-4d2510758050\" (UID: \"a5edbbee-2307-4ce7-bd68-4d2510758050\") " Dec 09 11:39:00 crc kubenswrapper[5002]: I1209 11:39:00.198189 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a5edbbee-2307-4ce7-bd68-4d2510758050-var-run" (OuterVolumeSpecName: "var-run") pod "a5edbbee-2307-4ce7-bd68-4d2510758050" (UID: "a5edbbee-2307-4ce7-bd68-4d2510758050"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:39:00 crc kubenswrapper[5002]: I1209 11:39:00.198536 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssrkz\" (UniqueName: \"kubernetes.io/projected/a5edbbee-2307-4ce7-bd68-4d2510758050-kube-api-access-ssrkz\") pod \"a5edbbee-2307-4ce7-bd68-4d2510758050\" (UID: \"a5edbbee-2307-4ce7-bd68-4d2510758050\") " Dec 09 11:39:00 crc kubenswrapper[5002]: I1209 11:39:00.198613 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a5edbbee-2307-4ce7-bd68-4d2510758050-var-run-ovn\") pod \"a5edbbee-2307-4ce7-bd68-4d2510758050\" (UID: \"a5edbbee-2307-4ce7-bd68-4d2510758050\") " Dec 09 11:39:00 crc kubenswrapper[5002]: I1209 11:39:00.198671 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a5edbbee-2307-4ce7-bd68-4d2510758050-var-log-ovn\") pod \"a5edbbee-2307-4ce7-bd68-4d2510758050\" (UID: \"a5edbbee-2307-4ce7-bd68-4d2510758050\") " Dec 09 11:39:00 crc kubenswrapper[5002]: I1209 11:39:00.198691 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a5edbbee-2307-4ce7-bd68-4d2510758050-scripts\") pod \"a5edbbee-2307-4ce7-bd68-4d2510758050\" (UID: \"a5edbbee-2307-4ce7-bd68-4d2510758050\") " Dec 09 11:39:00 crc kubenswrapper[5002]: I1209 11:39:00.198906 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a5edbbee-2307-4ce7-bd68-4d2510758050-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "a5edbbee-2307-4ce7-bd68-4d2510758050" (UID: "a5edbbee-2307-4ce7-bd68-4d2510758050"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:39:00 crc kubenswrapper[5002]: I1209 11:39:00.198982 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a5edbbee-2307-4ce7-bd68-4d2510758050-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "a5edbbee-2307-4ce7-bd68-4d2510758050" (UID: "a5edbbee-2307-4ce7-bd68-4d2510758050"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:39:00 crc kubenswrapper[5002]: I1209 11:39:00.199072 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5edbbee-2307-4ce7-bd68-4d2510758050-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "a5edbbee-2307-4ce7-bd68-4d2510758050" (UID: "a5edbbee-2307-4ce7-bd68-4d2510758050"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:39:00 crc kubenswrapper[5002]: I1209 11:39:00.199453 5002 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a5edbbee-2307-4ce7-bd68-4d2510758050-var-run\") on node \"crc\" DevicePath \"\"" Dec 09 11:39:00 crc kubenswrapper[5002]: I1209 11:39:00.199467 5002 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a5edbbee-2307-4ce7-bd68-4d2510758050-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:39:00 crc kubenswrapper[5002]: I1209 11:39:00.199477 5002 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a5edbbee-2307-4ce7-bd68-4d2510758050-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 09 11:39:00 crc kubenswrapper[5002]: I1209 11:39:00.199486 5002 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a5edbbee-2307-4ce7-bd68-4d2510758050-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 09 11:39:00 crc kubenswrapper[5002]: I1209 11:39:00.199891 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5edbbee-2307-4ce7-bd68-4d2510758050-scripts" (OuterVolumeSpecName: "scripts") pod "a5edbbee-2307-4ce7-bd68-4d2510758050" (UID: "a5edbbee-2307-4ce7-bd68-4d2510758050"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:39:00 crc kubenswrapper[5002]: I1209 11:39:00.207536 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5edbbee-2307-4ce7-bd68-4d2510758050-kube-api-access-ssrkz" (OuterVolumeSpecName: "kube-api-access-ssrkz") pod "a5edbbee-2307-4ce7-bd68-4d2510758050" (UID: "a5edbbee-2307-4ce7-bd68-4d2510758050"). InnerVolumeSpecName "kube-api-access-ssrkz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:39:00 crc kubenswrapper[5002]: I1209 11:39:00.236256 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-lcrd7"] Dec 09 11:39:00 crc kubenswrapper[5002]: I1209 11:39:00.301694 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssrkz\" (UniqueName: \"kubernetes.io/projected/a5edbbee-2307-4ce7-bd68-4d2510758050-kube-api-access-ssrkz\") on node \"crc\" DevicePath \"\"" Dec 09 11:39:00 crc kubenswrapper[5002]: I1209 11:39:00.301723 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a5edbbee-2307-4ce7-bd68-4d2510758050-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:39:00 crc kubenswrapper[5002]: I1209 11:39:00.762715 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-9glwj-config-4drjr"] Dec 09 11:39:00 crc kubenswrapper[5002]: I1209 11:39:00.770893 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-lcrd7" event={"ID":"d660fbb0-443c-4a2d-9e92-1a89bac33302","Type":"ContainerStarted","Data":"380b4e48afdfc8578aac3bd39ef0cb5796bbd8f14b642ba8e9f0d535956124b7"} Dec 09 11:39:00 crc kubenswrapper[5002]: I1209 11:39:00.775777 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-9glwj-config-4drjr"] Dec 09 11:39:00 crc kubenswrapper[5002]: I1209 11:39:00.778207 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a04e1183d5cf06337e0a453f228ddcddb477caade7ecd5ec88dd7839263b29f" Dec 09 11:39:00 crc kubenswrapper[5002]: I1209 11:39:00.778278 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9glwj-config-4drjr" Dec 09 11:39:00 crc kubenswrapper[5002]: I1209 11:39:00.856216 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-9glwj-config-b444r"] Dec 09 11:39:00 crc kubenswrapper[5002]: E1209 11:39:00.856604 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5edbbee-2307-4ce7-bd68-4d2510758050" containerName="ovn-config" Dec 09 11:39:00 crc kubenswrapper[5002]: I1209 11:39:00.856622 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5edbbee-2307-4ce7-bd68-4d2510758050" containerName="ovn-config" Dec 09 11:39:00 crc kubenswrapper[5002]: I1209 11:39:00.856802 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5edbbee-2307-4ce7-bd68-4d2510758050" containerName="ovn-config" Dec 09 11:39:00 crc kubenswrapper[5002]: I1209 11:39:00.857442 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9glwj-config-b444r" Dec 09 11:39:00 crc kubenswrapper[5002]: I1209 11:39:00.859658 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 09 11:39:00 crc kubenswrapper[5002]: I1209 11:39:00.900928 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9glwj-config-b444r"] Dec 09 11:39:01 crc kubenswrapper[5002]: I1209 11:39:01.034802 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/eccc3e93-7acc-4ba6-8b72-afb47a23bf63-var-run\") pod \"ovn-controller-9glwj-config-b444r\" (UID: \"eccc3e93-7acc-4ba6-8b72-afb47a23bf63\") " pod="openstack/ovn-controller-9glwj-config-b444r" Dec 09 11:39:01 crc kubenswrapper[5002]: I1209 11:39:01.035021 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/eccc3e93-7acc-4ba6-8b72-afb47a23bf63-var-log-ovn\") pod \"ovn-controller-9glwj-config-b444r\" (UID: \"eccc3e93-7acc-4ba6-8b72-afb47a23bf63\") " pod="openstack/ovn-controller-9glwj-config-b444r" Dec 09 11:39:01 crc kubenswrapper[5002]: I1209 11:39:01.035145 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/eccc3e93-7acc-4ba6-8b72-afb47a23bf63-var-run-ovn\") pod \"ovn-controller-9glwj-config-b444r\" (UID: \"eccc3e93-7acc-4ba6-8b72-afb47a23bf63\") " pod="openstack/ovn-controller-9glwj-config-b444r" Dec 09 11:39:01 crc kubenswrapper[5002]: I1209 11:39:01.035222 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/eccc3e93-7acc-4ba6-8b72-afb47a23bf63-additional-scripts\") pod \"ovn-controller-9glwj-config-b444r\" (UID: \"eccc3e93-7acc-4ba6-8b72-afb47a23bf63\") " pod="openstack/ovn-controller-9glwj-config-b444r" Dec 09 11:39:01 crc kubenswrapper[5002]: I1209 11:39:01.035252 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45krt\" (UniqueName: \"kubernetes.io/projected/eccc3e93-7acc-4ba6-8b72-afb47a23bf63-kube-api-access-45krt\") pod \"ovn-controller-9glwj-config-b444r\" (UID: \"eccc3e93-7acc-4ba6-8b72-afb47a23bf63\") " pod="openstack/ovn-controller-9glwj-config-b444r" Dec 09 11:39:01 crc kubenswrapper[5002]: I1209 11:39:01.035318 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eccc3e93-7acc-4ba6-8b72-afb47a23bf63-scripts\") pod \"ovn-controller-9glwj-config-b444r\" (UID: \"eccc3e93-7acc-4ba6-8b72-afb47a23bf63\") " pod="openstack/ovn-controller-9glwj-config-b444r" Dec 09 11:39:01 crc kubenswrapper[5002]: I1209 11:39:01.083300 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-9glwj" Dec 09 11:39:01 crc kubenswrapper[5002]: I1209 11:39:01.136695 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/eccc3e93-7acc-4ba6-8b72-afb47a23bf63-var-log-ovn\") pod \"ovn-controller-9glwj-config-b444r\" (UID: \"eccc3e93-7acc-4ba6-8b72-afb47a23bf63\") " pod="openstack/ovn-controller-9glwj-config-b444r" Dec 09 11:39:01 crc kubenswrapper[5002]: I1209 11:39:01.137180 5002 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/eccc3e93-7acc-4ba6-8b72-afb47a23bf63-var-log-ovn\") pod \"ovn-controller-9glwj-config-b444r\" (UID: \"eccc3e93-7acc-4ba6-8b72-afb47a23bf63\") " pod="openstack/ovn-controller-9glwj-config-b444r" Dec 09 11:39:01 crc kubenswrapper[5002]: I1209 11:39:01.137307 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/eccc3e93-7acc-4ba6-8b72-afb47a23bf63-var-run-ovn\") pod \"ovn-controller-9glwj-config-b444r\" (UID: \"eccc3e93-7acc-4ba6-8b72-afb47a23bf63\") " pod="openstack/ovn-controller-9glwj-config-b444r" Dec 09 11:39:01 crc kubenswrapper[5002]: I1209 11:39:01.137392 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/eccc3e93-7acc-4ba6-8b72-afb47a23bf63-var-run-ovn\") pod \"ovn-controller-9glwj-config-b444r\" (UID: \"eccc3e93-7acc-4ba6-8b72-afb47a23bf63\") " pod="openstack/ovn-controller-9glwj-config-b444r" Dec 09 11:39:01 crc kubenswrapper[5002]: I1209 11:39:01.137414 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/eccc3e93-7acc-4ba6-8b72-afb47a23bf63-additional-scripts\") pod \"ovn-controller-9glwj-config-b444r\" (UID: \"eccc3e93-7acc-4ba6-8b72-afb47a23bf63\") " pod="openstack/ovn-controller-9glwj-config-b444r" Dec 09 11:39:01 crc kubenswrapper[5002]: I1209 11:39:01.137436 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45krt\" (UniqueName: \"kubernetes.io/projected/eccc3e93-7acc-4ba6-8b72-afb47a23bf63-kube-api-access-45krt\") pod \"ovn-controller-9glwj-config-b444r\" (UID: \"eccc3e93-7acc-4ba6-8b72-afb47a23bf63\") " pod="openstack/ovn-controller-9glwj-config-b444r" Dec 09 11:39:01 crc kubenswrapper[5002]: I1209 11:39:01.138176 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/eccc3e93-7acc-4ba6-8b72-afb47a23bf63-additional-scripts\") pod \"ovn-controller-9glwj-config-b444r\" (UID: \"eccc3e93-7acc-4ba6-8b72-afb47a23bf63\") " pod="openstack/ovn-controller-9glwj-config-b444r" Dec 09 11:39:01 crc kubenswrapper[5002]: I1209 11:39:01.138248 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eccc3e93-7acc-4ba6-8b72-afb47a23bf63-scripts\") pod \"ovn-controller-9glwj-config-b444r\" (UID: \"eccc3e93-7acc-4ba6-8b72-afb47a23bf63\") " pod="openstack/ovn-controller-9glwj-config-b444r" Dec 09 11:39:01 crc kubenswrapper[5002]: I1209 11:39:01.140106 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eccc3e93-7acc-4ba6-8b72-afb47a23bf63-scripts\") pod \"ovn-controller-9glwj-config-b444r\" (UID: \"eccc3e93-7acc-4ba6-8b72-afb47a23bf63\") " pod="openstack/ovn-controller-9glwj-config-b444r" Dec 09 11:39:01 crc kubenswrapper[5002]: I1209 11:39:01.140329 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/eccc3e93-7acc-4ba6-8b72-afb47a23bf63-var-run\") pod \"ovn-controller-9glwj-config-b444r\" (UID: \"eccc3e93-7acc-4ba6-8b72-afb47a23bf63\") " pod="openstack/ovn-controller-9glwj-config-b444r" Dec 09 11:39:01 crc kubenswrapper[5002]: I1209 11:39:01.140421 5002 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/eccc3e93-7acc-4ba6-8b72-afb47a23bf63-var-run\") pod \"ovn-controller-9glwj-config-b444r\" (UID: \"eccc3e93-7acc-4ba6-8b72-afb47a23bf63\") " pod="openstack/ovn-controller-9glwj-config-b444r" Dec 09 11:39:01 crc kubenswrapper[5002]: I1209 11:39:01.168465 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45krt\" (UniqueName: \"kubernetes.io/projected/eccc3e93-7acc-4ba6-8b72-afb47a23bf63-kube-api-access-45krt\") pod \"ovn-controller-9glwj-config-b444r\" (UID: \"eccc3e93-7acc-4ba6-8b72-afb47a23bf63\") " pod="openstack/ovn-controller-9glwj-config-b444r" Dec 09 11:39:01 crc kubenswrapper[5002]: I1209 11:39:01.193540 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9glwj-config-b444r" Dec 09 11:39:01 crc kubenswrapper[5002]: I1209 11:39:01.798038 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-5gzkr" event={"ID":"a44e68c5-c542-4f82-af1f-a3405ef59656","Type":"ContainerStarted","Data":"ebb75b38be2dcfc4faf4c7b2174931be16217b9834146426bd436578541954bb"} Dec 09 11:39:01 crc kubenswrapper[5002]: I1209 11:39:01.805137 5002 generic.go:334] "Generic (PLEG): container finished" podID="d660fbb0-443c-4a2d-9e92-1a89bac33302" containerID="0842b6fbe1e457931cb81a051aae1496d027ef1ec2164002202ebc57531755f3" exitCode=0 Dec 09 11:39:01 crc kubenswrapper[5002]: I1209 11:39:01.805189 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-lcrd7" event={"ID":"d660fbb0-443c-4a2d-9e92-1a89bac33302","Type":"ContainerDied","Data":"0842b6fbe1e457931cb81a051aae1496d027ef1ec2164002202ebc57531755f3"} Dec 09 11:39:02 crc kubenswrapper[5002]: I1209 11:39:02.076773 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5edbbee-2307-4ce7-bd68-4d2510758050" path="/var/lib/kubelet/pods/a5edbbee-2307-4ce7-bd68-4d2510758050/volumes" Dec 09 11:39:02 crc kubenswrapper[5002]: I1209 11:39:02.822653 5002 generic.go:334] "Generic (PLEG): container finished" podID="a44e68c5-c542-4f82-af1f-a3405ef59656" containerID="ebb75b38be2dcfc4faf4c7b2174931be16217b9834146426bd436578541954bb" exitCode=0 Dec 09 11:39:02 crc kubenswrapper[5002]: I1209 11:39:02.823029 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-5gzkr" event={"ID":"a44e68c5-c542-4f82-af1f-a3405ef59656","Type":"ContainerDied","Data":"ebb75b38be2dcfc4faf4c7b2174931be16217b9834146426bd436578541954bb"} Dec 09 11:39:03 crc kubenswrapper[5002]: W1209 11:39:03.285265 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeccc3e93_7acc_4ba6_8b72_afb47a23bf63.slice/crio-b9313c44e7ce137082454677572216ed84b33526f5490a07f4b056b94726917b WatchSource:0}: Error finding container b9313c44e7ce137082454677572216ed84b33526f5490a07f4b056b94726917b: Status 404 returned error can't find the container with id b9313c44e7ce137082454677572216ed84b33526f5490a07f4b056b94726917b Dec 09 11:39:03 crc kubenswrapper[5002]: I1209 11:39:03.291002 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9glwj-config-b444r"] Dec 09 11:39:03 crc kubenswrapper[5002]: I1209 11:39:03.842385 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-lcrd7" 
event={"ID":"d660fbb0-443c-4a2d-9e92-1a89bac33302","Type":"ContainerStarted","Data":"38b392a7acf133fa7119c93e414e97230818278d2853225f3a89404e4fca77eb"} Dec 09 11:39:03 crc kubenswrapper[5002]: I1209 11:39:03.848881 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9glwj-config-b444r" event={"ID":"eccc3e93-7acc-4ba6-8b72-afb47a23bf63","Type":"ContainerStarted","Data":"1c9cf717bec0dff95ea4ce5c72fcb3dbe2732a7bee5f7873eb95f4376473da47"} Dec 09 11:39:03 crc kubenswrapper[5002]: I1209 11:39:03.848937 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9glwj-config-b444r" event={"ID":"eccc3e93-7acc-4ba6-8b72-afb47a23bf63","Type":"ContainerStarted","Data":"b9313c44e7ce137082454677572216ed84b33526f5490a07f4b056b94726917b"} Dec 09 11:39:03 crc kubenswrapper[5002]: I1209 11:39:03.872754 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-db-sync-lcrd7" podStartSLOduration=4.872735618 podStartE2EDuration="4.872735618s" podCreationTimestamp="2025-12-09 11:38:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:39:03.869519962 +0000 UTC m=+5876.261571053" watchObservedRunningTime="2025-12-09 11:39:03.872735618 +0000 UTC m=+5876.264786699" Dec 09 11:39:03 crc kubenswrapper[5002]: I1209 11:39:03.912843 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-9glwj-config-b444r" podStartSLOduration=3.91280163 podStartE2EDuration="3.91280163s" podCreationTimestamp="2025-12-09 11:39:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:39:03.887647177 +0000 UTC m=+5876.279698258" watchObservedRunningTime="2025-12-09 11:39:03.91280163 +0000 UTC m=+5876.304852701" Dec 09 11:39:04 crc kubenswrapper[5002]: I1209 11:39:04.862701 5002 generic.go:334] "Generic (PLEG): container finished" podID="eccc3e93-7acc-4ba6-8b72-afb47a23bf63" containerID="1c9cf717bec0dff95ea4ce5c72fcb3dbe2732a7bee5f7873eb95f4376473da47" exitCode=0 Dec 09 11:39:04 crc kubenswrapper[5002]: I1209 11:39:04.862957 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9glwj-config-b444r" event={"ID":"eccc3e93-7acc-4ba6-8b72-afb47a23bf63","Type":"ContainerDied","Data":"1c9cf717bec0dff95ea4ce5c72fcb3dbe2732a7bee5f7873eb95f4376473da47"} Dec 09 11:39:04 crc kubenswrapper[5002]: I1209 11:39:04.869977 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-5gzkr" event={"ID":"a44e68c5-c542-4f82-af1f-a3405ef59656","Type":"ContainerStarted","Data":"dc836e7674797c27cc190af27f2554011ecb2a479eb0ab96a6b6ef7d26b2a234"} Dec 09 11:39:06 crc kubenswrapper[5002]: I1209 11:39:06.287682 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9glwj-config-b444r" Dec 09 11:39:06 crc kubenswrapper[5002]: I1209 11:39:06.361367 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eccc3e93-7acc-4ba6-8b72-afb47a23bf63-scripts\") pod \"eccc3e93-7acc-4ba6-8b72-afb47a23bf63\" (UID: \"eccc3e93-7acc-4ba6-8b72-afb47a23bf63\") " Dec 09 11:39:06 crc kubenswrapper[5002]: I1209 11:39:06.361438 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45krt\" (UniqueName: \"kubernetes.io/projected/eccc3e93-7acc-4ba6-8b72-afb47a23bf63-kube-api-access-45krt\") pod \"eccc3e93-7acc-4ba6-8b72-afb47a23bf63\" (UID: \"eccc3e93-7acc-4ba6-8b72-afb47a23bf63\") " Dec 09 11:39:06 crc kubenswrapper[5002]: I1209 11:39:06.361491 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/eccc3e93-7acc-4ba6-8b72-afb47a23bf63-additional-scripts\") pod \"eccc3e93-7acc-4ba6-8b72-afb47a23bf63\" (UID: \"eccc3e93-7acc-4ba6-8b72-afb47a23bf63\") " Dec 09 11:39:06 crc kubenswrapper[5002]: I1209 11:39:06.361553 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/eccc3e93-7acc-4ba6-8b72-afb47a23bf63-var-run\") pod \"eccc3e93-7acc-4ba6-8b72-afb47a23bf63\" (UID: \"eccc3e93-7acc-4ba6-8b72-afb47a23bf63\") " Dec 09 11:39:06 crc kubenswrapper[5002]: I1209 11:39:06.361585 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/eccc3e93-7acc-4ba6-8b72-afb47a23bf63-var-log-ovn\") pod \"eccc3e93-7acc-4ba6-8b72-afb47a23bf63\" (UID: \"eccc3e93-7acc-4ba6-8b72-afb47a23bf63\") " Dec 09 11:39:06 crc kubenswrapper[5002]: I1209 11:39:06.361671 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/eccc3e93-7acc-4ba6-8b72-afb47a23bf63-var-run-ovn\") pod \"eccc3e93-7acc-4ba6-8b72-afb47a23bf63\" (UID: \"eccc3e93-7acc-4ba6-8b72-afb47a23bf63\") " Dec 09 11:39:06 crc kubenswrapper[5002]: I1209 11:39:06.361971 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eccc3e93-7acc-4ba6-8b72-afb47a23bf63-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "eccc3e93-7acc-4ba6-8b72-afb47a23bf63" (UID: "eccc3e93-7acc-4ba6-8b72-afb47a23bf63"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:39:06 crc kubenswrapper[5002]: I1209 11:39:06.361976 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eccc3e93-7acc-4ba6-8b72-afb47a23bf63-var-run" (OuterVolumeSpecName: "var-run") pod "eccc3e93-7acc-4ba6-8b72-afb47a23bf63" (UID: "eccc3e93-7acc-4ba6-8b72-afb47a23bf63"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:39:06 crc kubenswrapper[5002]: I1209 11:39:06.362076 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eccc3e93-7acc-4ba6-8b72-afb47a23bf63-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "eccc3e93-7acc-4ba6-8b72-afb47a23bf63" (UID: "eccc3e93-7acc-4ba6-8b72-afb47a23bf63"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:39:06 crc kubenswrapper[5002]: I1209 11:39:06.362582 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eccc3e93-7acc-4ba6-8b72-afb47a23bf63-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "eccc3e93-7acc-4ba6-8b72-afb47a23bf63" (UID: "eccc3e93-7acc-4ba6-8b72-afb47a23bf63"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:39:06 crc kubenswrapper[5002]: I1209 11:39:06.363013 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eccc3e93-7acc-4ba6-8b72-afb47a23bf63-scripts" (OuterVolumeSpecName: "scripts") pod "eccc3e93-7acc-4ba6-8b72-afb47a23bf63" (UID: "eccc3e93-7acc-4ba6-8b72-afb47a23bf63"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:39:06 crc kubenswrapper[5002]: I1209 11:39:06.363287 5002 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/eccc3e93-7acc-4ba6-8b72-afb47a23bf63-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 09 11:39:06 crc kubenswrapper[5002]: I1209 11:39:06.363303 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eccc3e93-7acc-4ba6-8b72-afb47a23bf63-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:39:06 crc kubenswrapper[5002]: I1209 11:39:06.363311 5002 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/eccc3e93-7acc-4ba6-8b72-afb47a23bf63-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:39:06 crc kubenswrapper[5002]: I1209 11:39:06.363323 5002 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/eccc3e93-7acc-4ba6-8b72-afb47a23bf63-var-run\") on node \"crc\" DevicePath \"\"" Dec 09 11:39:06 crc kubenswrapper[5002]: I1209 11:39:06.363331 5002 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/eccc3e93-7acc-4ba6-8b72-afb47a23bf63-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 09 11:39:06 crc kubenswrapper[5002]: I1209 11:39:06.368002 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eccc3e93-7acc-4ba6-8b72-afb47a23bf63-kube-api-access-45krt" (OuterVolumeSpecName: "kube-api-access-45krt") pod "eccc3e93-7acc-4ba6-8b72-afb47a23bf63" (UID: "eccc3e93-7acc-4ba6-8b72-afb47a23bf63"). InnerVolumeSpecName "kube-api-access-45krt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:39:06 crc kubenswrapper[5002]: I1209 11:39:06.470325 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45krt\" (UniqueName: \"kubernetes.io/projected/eccc3e93-7acc-4ba6-8b72-afb47a23bf63-kube-api-access-45krt\") on node \"crc\" DevicePath \"\"" Dec 09 11:39:06 crc kubenswrapper[5002]: I1209 11:39:06.890158 5002 generic.go:334] "Generic (PLEG): container finished" podID="d660fbb0-443c-4a2d-9e92-1a89bac33302" containerID="38b392a7acf133fa7119c93e414e97230818278d2853225f3a89404e4fca77eb" exitCode=0 Dec 09 11:39:06 crc kubenswrapper[5002]: I1209 11:39:06.890232 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-lcrd7" event={"ID":"d660fbb0-443c-4a2d-9e92-1a89bac33302","Type":"ContainerDied","Data":"38b392a7acf133fa7119c93e414e97230818278d2853225f3a89404e4fca77eb"} Dec 09 11:39:06 crc kubenswrapper[5002]: I1209 11:39:06.892198 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9glwj-config-b444r" Dec 09 11:39:06 crc kubenswrapper[5002]: I1209 11:39:06.892217 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9glwj-config-b444r" event={"ID":"eccc3e93-7acc-4ba6-8b72-afb47a23bf63","Type":"ContainerDied","Data":"b9313c44e7ce137082454677572216ed84b33526f5490a07f4b056b94726917b"} Dec 09 11:39:06 crc kubenswrapper[5002]: I1209 11:39:06.892271 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9313c44e7ce137082454677572216ed84b33526f5490a07f4b056b94726917b" Dec 09 11:39:06 crc kubenswrapper[5002]: I1209 11:39:06.892739 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-rsyslog-5gzkr" Dec 09 11:39:06 crc kubenswrapper[5002]: I1209 11:39:06.939417 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-rsyslog-5gzkr" podStartSLOduration=3.783992376 podStartE2EDuration="9.939402009s" podCreationTimestamp="2025-12-09 11:38:57 +0000 UTC" firstStartedPulling="2025-12-09 11:38:58.213147598 +0000 UTC m=+5870.605198679" lastFinishedPulling="2025-12-09 11:39:04.368557231 +0000 UTC m=+5876.760608312" observedRunningTime="2025-12-09 11:39:06.930654085 +0000 UTC m=+5879.322705176" watchObservedRunningTime="2025-12-09 11:39:06.939402009 +0000 UTC m=+5879.331453090" Dec 09 11:39:07 crc kubenswrapper[5002]: I1209 11:39:07.366305 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-9glwj-config-b444r"] Dec 09 11:39:07 crc kubenswrapper[5002]: I1209 11:39:07.377956 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-9glwj-config-b444r"] Dec 09 11:39:08 crc kubenswrapper[5002]: I1209 11:39:08.090914 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eccc3e93-7acc-4ba6-8b72-afb47a23bf63" path="/var/lib/kubelet/pods/eccc3e93-7acc-4ba6-8b72-afb47a23bf63/volumes" Dec 09 11:39:09 crc kubenswrapper[5002]: I1209 11:39:09.690874 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-lcrd7" Dec 09 11:39:09 crc kubenswrapper[5002]: I1209 11:39:09.740013 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d660fbb0-443c-4a2d-9e92-1a89bac33302-combined-ca-bundle\") pod \"d660fbb0-443c-4a2d-9e92-1a89bac33302\" (UID: \"d660fbb0-443c-4a2d-9e92-1a89bac33302\") " Dec 09 11:39:09 crc kubenswrapper[5002]: I1209 11:39:09.740138 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d660fbb0-443c-4a2d-9e92-1a89bac33302-config-data-merged\") pod \"d660fbb0-443c-4a2d-9e92-1a89bac33302\" (UID: \"d660fbb0-443c-4a2d-9e92-1a89bac33302\") " Dec 09 11:39:09 crc kubenswrapper[5002]: I1209 11:39:09.740248 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d660fbb0-443c-4a2d-9e92-1a89bac33302-config-data\") pod \"d660fbb0-443c-4a2d-9e92-1a89bac33302\" (UID: \"d660fbb0-443c-4a2d-9e92-1a89bac33302\") " Dec 09 11:39:09 crc kubenswrapper[5002]: I1209 11:39:09.740354 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d660fbb0-443c-4a2d-9e92-1a89bac33302-scripts\") pod \"d660fbb0-443c-4a2d-9e92-1a89bac33302\" (UID: \"d660fbb0-443c-4a2d-9e92-1a89bac33302\") " Dec 09 11:39:09 crc kubenswrapper[5002]: I1209 11:39:09.745865 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d660fbb0-443c-4a2d-9e92-1a89bac33302-config-data" (OuterVolumeSpecName: "config-data") pod "d660fbb0-443c-4a2d-9e92-1a89bac33302" (UID: "d660fbb0-443c-4a2d-9e92-1a89bac33302"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:39:09 crc kubenswrapper[5002]: I1209 11:39:09.754507 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d660fbb0-443c-4a2d-9e92-1a89bac33302-scripts" (OuterVolumeSpecName: "scripts") pod "d660fbb0-443c-4a2d-9e92-1a89bac33302" (UID: "d660fbb0-443c-4a2d-9e92-1a89bac33302"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:39:09 crc kubenswrapper[5002]: I1209 11:39:09.769699 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d660fbb0-443c-4a2d-9e92-1a89bac33302-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "d660fbb0-443c-4a2d-9e92-1a89bac33302" (UID: "d660fbb0-443c-4a2d-9e92-1a89bac33302"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:39:09 crc kubenswrapper[5002]: I1209 11:39:09.770771 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d660fbb0-443c-4a2d-9e92-1a89bac33302-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d660fbb0-443c-4a2d-9e92-1a89bac33302" (UID: "d660fbb0-443c-4a2d-9e92-1a89bac33302"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:39:09 crc kubenswrapper[5002]: I1209 11:39:09.841701 5002 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d660fbb0-443c-4a2d-9e92-1a89bac33302-config-data-merged\") on node \"crc\" DevicePath \"\"" Dec 09 11:39:09 crc kubenswrapper[5002]: I1209 11:39:09.841967 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d660fbb0-443c-4a2d-9e92-1a89bac33302-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:39:09 crc kubenswrapper[5002]: I1209 11:39:09.842034 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d660fbb0-443c-4a2d-9e92-1a89bac33302-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:39:09 crc kubenswrapper[5002]: I1209 11:39:09.842101 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d660fbb0-443c-4a2d-9e92-1a89bac33302-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:39:09 crc kubenswrapper[5002]: I1209 11:39:09.928157 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-lcrd7" event={"ID":"d660fbb0-443c-4a2d-9e92-1a89bac33302","Type":"ContainerDied","Data":"380b4e48afdfc8578aac3bd39ef0cb5796bbd8f14b642ba8e9f0d535956124b7"} Dec 09 11:39:09 crc kubenswrapper[5002]: I1209 11:39:09.928196 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="380b4e48afdfc8578aac3bd39ef0cb5796bbd8f14b642ba8e9f0d535956124b7" Dec 09 11:39:09 crc kubenswrapper[5002]: I1209 11:39:09.928246 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-lcrd7" Dec 09 11:39:10 crc kubenswrapper[5002]: I1209 11:39:10.439380 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-7f5cdc56bb-d9kbw" Dec 09 11:39:10 crc kubenswrapper[5002]: I1209 11:39:10.471271 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-7f5cdc56bb-d9kbw" Dec 09 11:39:12 crc kubenswrapper[5002]: I1209 11:39:12.559682 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-rsyslog-5gzkr" Dec 09 11:39:12 crc kubenswrapper[5002]: I1209 11:39:12.954499 5002 generic.go:334] "Generic (PLEG): container finished" podID="4f95a775-11dd-4a27-af99-e10c0c2823f3" containerID="075f54b60ea092ea182017047b5150c749df3903a6880f6b2daf440969fea3f5" exitCode=0 Dec 09 11:39:12 crc kubenswrapper[5002]: I1209 11:39:12.954549 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-cs89r" event={"ID":"4f95a775-11dd-4a27-af99-e10c0c2823f3","Type":"ContainerDied","Data":"075f54b60ea092ea182017047b5150c749df3903a6880f6b2daf440969fea3f5"} Dec 09 11:39:14 crc kubenswrapper[5002]: I1209 11:39:14.973432 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-cs89r" event={"ID":"4f95a775-11dd-4a27-af99-e10c0c2823f3","Type":"ContainerStarted","Data":"d5445ca96c0ea3ef5a0f0a32a9682715db2cc30bd94e25cc10d2d907191d06ba"} Dec 09 11:39:14 crc kubenswrapper[5002]: I1209 11:39:14.992443 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-59f8cff499-cs89r" podStartSLOduration=1.900870574 podStartE2EDuration="16.992421009s" podCreationTimestamp="2025-12-09 
11:38:58 +0000 UTC" firstStartedPulling="2025-12-09 11:38:58.925061752 +0000 UTC m=+5871.317112833" lastFinishedPulling="2025-12-09 11:39:14.016612187 +0000 UTC m=+5886.408663268" observedRunningTime="2025-12-09 11:39:14.988351001 +0000 UTC m=+5887.380402082" watchObservedRunningTime="2025-12-09 11:39:14.992421009 +0000 UTC m=+5887.384472090" Dec 09 11:39:40 crc kubenswrapper[5002]: I1209 11:39:40.227976 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-cs89r"] Dec 09 11:39:40 crc kubenswrapper[5002]: I1209 11:39:40.228618 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-image-upload-59f8cff499-cs89r" podUID="4f95a775-11dd-4a27-af99-e10c0c2823f3" containerName="octavia-amphora-httpd" containerID="cri-o://d5445ca96c0ea3ef5a0f0a32a9682715db2cc30bd94e25cc10d2d907191d06ba" gracePeriod=30 Dec 09 11:39:40 crc kubenswrapper[5002]: I1209 11:39:40.806546 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-cs89r" Dec 09 11:39:40 crc kubenswrapper[5002]: I1209 11:39:40.894629 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4f95a775-11dd-4a27-af99-e10c0c2823f3-httpd-config\") pod \"4f95a775-11dd-4a27-af99-e10c0c2823f3\" (UID: \"4f95a775-11dd-4a27-af99-e10c0c2823f3\") " Dec 09 11:39:40 crc kubenswrapper[5002]: I1209 11:39:40.894691 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/4f95a775-11dd-4a27-af99-e10c0c2823f3-amphora-image\") pod \"4f95a775-11dd-4a27-af99-e10c0c2823f3\" (UID: \"4f95a775-11dd-4a27-af99-e10c0c2823f3\") " Dec 09 11:39:40 crc kubenswrapper[5002]: I1209 11:39:40.938216 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f95a775-11dd-4a27-af99-e10c0c2823f3-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "4f95a775-11dd-4a27-af99-e10c0c2823f3" (UID: "4f95a775-11dd-4a27-af99-e10c0c2823f3"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:39:40 crc kubenswrapper[5002]: I1209 11:39:40.993066 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f95a775-11dd-4a27-af99-e10c0c2823f3-amphora-image" (OuterVolumeSpecName: "amphora-image") pod "4f95a775-11dd-4a27-af99-e10c0c2823f3" (UID: "4f95a775-11dd-4a27-af99-e10c0c2823f3"). InnerVolumeSpecName "amphora-image". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:39:40 crc kubenswrapper[5002]: I1209 11:39:40.997039 5002 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4f95a775-11dd-4a27-af99-e10c0c2823f3-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:39:40 crc kubenswrapper[5002]: I1209 11:39:40.997073 5002 reconciler_common.go:293] "Volume detached for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/4f95a775-11dd-4a27-af99-e10c0c2823f3-amphora-image\") on node \"crc\" DevicePath \"\"" Dec 09 11:39:41 crc kubenswrapper[5002]: I1209 11:39:41.247682 5002 generic.go:334] "Generic (PLEG): container finished" podID="4f95a775-11dd-4a27-af99-e10c0c2823f3" containerID="d5445ca96c0ea3ef5a0f0a32a9682715db2cc30bd94e25cc10d2d907191d06ba" exitCode=0 Dec 09 11:39:41 crc kubenswrapper[5002]: I1209 11:39:41.247742 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-cs89r" Dec 09 11:39:41 crc kubenswrapper[5002]: I1209 11:39:41.247772 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-cs89r" event={"ID":"4f95a775-11dd-4a27-af99-e10c0c2823f3","Type":"ContainerDied","Data":"d5445ca96c0ea3ef5a0f0a32a9682715db2cc30bd94e25cc10d2d907191d06ba"} Dec 09 11:39:41 crc kubenswrapper[5002]: I1209 11:39:41.248145 5002 scope.go:117] "RemoveContainer" containerID="d5445ca96c0ea3ef5a0f0a32a9682715db2cc30bd94e25cc10d2d907191d06ba" Dec 09 11:39:41 crc kubenswrapper[5002]: I1209 11:39:41.248268 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-cs89r" event={"ID":"4f95a775-11dd-4a27-af99-e10c0c2823f3","Type":"ContainerDied","Data":"c5f11f30312bdd5b7665414ff5dbf80395f5b083ba0d942174021eea9c3f3338"} Dec 09 11:39:41 crc kubenswrapper[5002]: I1209 11:39:41.297782 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-cs89r"] Dec 09 11:39:41 crc kubenswrapper[5002]: I1209 11:39:41.298910 5002 scope.go:117] "RemoveContainer" containerID="075f54b60ea092ea182017047b5150c749df3903a6880f6b2daf440969fea3f5" Dec 09 11:39:41 crc kubenswrapper[5002]: I1209 11:39:41.306708 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-cs89r"] Dec 09 11:39:41 crc kubenswrapper[5002]: I1209 11:39:41.324022 5002 scope.go:117] "RemoveContainer" containerID="d5445ca96c0ea3ef5a0f0a32a9682715db2cc30bd94e25cc10d2d907191d06ba" Dec 09 11:39:41 crc kubenswrapper[5002]: E1209 11:39:41.324529 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5445ca96c0ea3ef5a0f0a32a9682715db2cc30bd94e25cc10d2d907191d06ba\": container with ID starting with d5445ca96c0ea3ef5a0f0a32a9682715db2cc30bd94e25cc10d2d907191d06ba not found: ID does not exist" containerID="d5445ca96c0ea3ef5a0f0a32a9682715db2cc30bd94e25cc10d2d907191d06ba" Dec 09 11:39:41 crc kubenswrapper[5002]: I1209 11:39:41.324560 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5445ca96c0ea3ef5a0f0a32a9682715db2cc30bd94e25cc10d2d907191d06ba"} err="failed to get container status \"d5445ca96c0ea3ef5a0f0a32a9682715db2cc30bd94e25cc10d2d907191d06ba\": rpc error: code = NotFound desc = could not find container \"d5445ca96c0ea3ef5a0f0a32a9682715db2cc30bd94e25cc10d2d907191d06ba\": container with ID starting with 
Dec 09 11:39:41 crc kubenswrapper[5002]: I1209 11:39:41.324580 5002 scope.go:117] "RemoveContainer" containerID="075f54b60ea092ea182017047b5150c749df3903a6880f6b2daf440969fea3f5"
Dec 09 11:39:41 crc kubenswrapper[5002]: E1209 11:39:41.325003 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"075f54b60ea092ea182017047b5150c749df3903a6880f6b2daf440969fea3f5\": container with ID starting with 075f54b60ea092ea182017047b5150c749df3903a6880f6b2daf440969fea3f5 not found: ID does not exist" containerID="075f54b60ea092ea182017047b5150c749df3903a6880f6b2daf440969fea3f5"
Dec 09 11:39:41 crc kubenswrapper[5002]: I1209 11:39:41.325044 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"075f54b60ea092ea182017047b5150c749df3903a6880f6b2daf440969fea3f5"} err="failed to get container status \"075f54b60ea092ea182017047b5150c749df3903a6880f6b2daf440969fea3f5\": rpc error: code = NotFound desc = could not find container \"075f54b60ea092ea182017047b5150c749df3903a6880f6b2daf440969fea3f5\": container with ID starting with 075f54b60ea092ea182017047b5150c749df3903a6880f6b2daf440969fea3f5 not found: ID does not exist"
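Each "ContainerStatus from runtime service failed ... NotFound" / "DeleteContainer returned error" pair above is benign: the containers were already removed when the pod was killed, so the follow-up removal finds nothing. Cleanup code typically treats NotFound as success; a sketch of that pattern using gRPC status codes (the error type CRI calls carry), illustrative rather than the kubelet's actual code:

```go
package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// ignoreNotFound converts a CRI "container does not exist" error into
// success: the container the caller wants gone is already gone, so the
// removal is idempotent and the kubelet just logs it and moves on.
func ignoreNotFound(err error) error {
	if err == nil {
		return nil
	}
	if s, ok := status.FromError(err); ok && s.Code() == codes.NotFound {
		return nil // already deleted: treat removal as a no-op success
	}
	return err
}

func main() {
	// Simulate the runtime's NotFound response seen in the log above.
	err := status.Error(codes.NotFound, "could not find container")
	fmt.Println(ignoreNotFound(err)) // <nil>: cleanup proceeds
}
```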
Dec 09 11:39:42 crc kubenswrapper[5002]: I1209 11:39:42.076960 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f95a775-11dd-4a27-af99-e10c0c2823f3" path="/var/lib/kubelet/pods/4f95a775-11dd-4a27-af99-e10c0c2823f3/volumes"
Dec 09 11:39:44 crc kubenswrapper[5002]: I1209 11:39:44.580944 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-59f8cff499-njkxr"]
Dec 09 11:39:44 crc kubenswrapper[5002]: E1209 11:39:44.581738 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eccc3e93-7acc-4ba6-8b72-afb47a23bf63" containerName="ovn-config"
Dec 09 11:39:44 crc kubenswrapper[5002]: I1209 11:39:44.581754 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="eccc3e93-7acc-4ba6-8b72-afb47a23bf63" containerName="ovn-config"
Dec 09 11:39:44 crc kubenswrapper[5002]: E1209 11:39:44.581771 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f95a775-11dd-4a27-af99-e10c0c2823f3" containerName="init"
Dec 09 11:39:44 crc kubenswrapper[5002]: I1209 11:39:44.581778 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f95a775-11dd-4a27-af99-e10c0c2823f3" containerName="init"
Dec 09 11:39:44 crc kubenswrapper[5002]: E1209 11:39:44.581797 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f95a775-11dd-4a27-af99-e10c0c2823f3" containerName="octavia-amphora-httpd"
Dec 09 11:39:44 crc kubenswrapper[5002]: I1209 11:39:44.581805 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f95a775-11dd-4a27-af99-e10c0c2823f3" containerName="octavia-amphora-httpd"
Dec 09 11:39:44 crc kubenswrapper[5002]: E1209 11:39:44.581855 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d660fbb0-443c-4a2d-9e92-1a89bac33302" containerName="octavia-db-sync"
Dec 09 11:39:44 crc kubenswrapper[5002]: I1209 11:39:44.581864 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="d660fbb0-443c-4a2d-9e92-1a89bac33302" containerName="octavia-db-sync"
Dec 09 11:39:44 crc kubenswrapper[5002]: E1209 11:39:44.581877 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d660fbb0-443c-4a2d-9e92-1a89bac33302" containerName="init"
Dec 09 11:39:44 crc kubenswrapper[5002]: I1209 11:39:44.581883 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="d660fbb0-443c-4a2d-9e92-1a89bac33302" containerName="init"
Dec 09 11:39:44 crc kubenswrapper[5002]: I1209 11:39:44.582131 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="eccc3e93-7acc-4ba6-8b72-afb47a23bf63" containerName="ovn-config"
Dec 09 11:39:44 crc kubenswrapper[5002]: I1209 11:39:44.582166 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f95a775-11dd-4a27-af99-e10c0c2823f3" containerName="octavia-amphora-httpd"
Dec 09 11:39:44 crc kubenswrapper[5002]: I1209 11:39:44.582181 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="d660fbb0-443c-4a2d-9e92-1a89bac33302" containerName="octavia-db-sync"
Dec 09 11:39:44 crc kubenswrapper[5002]: I1209 11:39:44.583457 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-njkxr"
Dec 09 11:39:44 crc kubenswrapper[5002]: I1209 11:39:44.585675 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data"
Dec 09 11:39:44 crc kubenswrapper[5002]: I1209 11:39:44.602339 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-njkxr"]
Dec 09 11:39:44 crc kubenswrapper[5002]: I1209 11:39:44.665965 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e83bfed7-2c34-4972-bcac-89bb8621051c-httpd-config\") pod \"octavia-image-upload-59f8cff499-njkxr\" (UID: \"e83bfed7-2c34-4972-bcac-89bb8621051c\") " pod="openstack/octavia-image-upload-59f8cff499-njkxr"
Dec 09 11:39:44 crc kubenswrapper[5002]: I1209 11:39:44.666149 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/e83bfed7-2c34-4972-bcac-89bb8621051c-amphora-image\") pod \"octavia-image-upload-59f8cff499-njkxr\" (UID: \"e83bfed7-2c34-4972-bcac-89bb8621051c\") " pod="openstack/octavia-image-upload-59f8cff499-njkxr"
Dec 09 11:39:44 crc kubenswrapper[5002]: I1209 11:39:44.767425 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/e83bfed7-2c34-4972-bcac-89bb8621051c-amphora-image\") pod \"octavia-image-upload-59f8cff499-njkxr\" (UID: \"e83bfed7-2c34-4972-bcac-89bb8621051c\") " pod="openstack/octavia-image-upload-59f8cff499-njkxr"
Dec 09 11:39:44 crc kubenswrapper[5002]: I1209 11:39:44.767524 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e83bfed7-2c34-4972-bcac-89bb8621051c-httpd-config\") pod \"octavia-image-upload-59f8cff499-njkxr\" (UID: \"e83bfed7-2c34-4972-bcac-89bb8621051c\") " pod="openstack/octavia-image-upload-59f8cff499-njkxr"
Dec 09 11:39:44 crc kubenswrapper[5002]: I1209 11:39:44.767930 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/e83bfed7-2c34-4972-bcac-89bb8621051c-amphora-image\") pod \"octavia-image-upload-59f8cff499-njkxr\" (UID: \"e83bfed7-2c34-4972-bcac-89bb8621051c\") " pod="openstack/octavia-image-upload-59f8cff499-njkxr"
Dec 09 11:39:44 crc kubenswrapper[5002]: I1209 11:39:44.774143 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e83bfed7-2c34-4972-bcac-89bb8621051c-httpd-config\") pod \"octavia-image-upload-59f8cff499-njkxr\" (UID: \"e83bfed7-2c34-4972-bcac-89bb8621051c\") " pod="openstack/octavia-image-upload-59f8cff499-njkxr"
Dec 09 11:39:44 crc kubenswrapper[5002]: I1209 11:39:44.919414 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-njkxr"
Dec 09 11:39:45 crc kubenswrapper[5002]: I1209 11:39:45.598055 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-njkxr"]
Dec 09 11:39:46 crc kubenswrapper[5002]: I1209 11:39:46.301081 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-njkxr" event={"ID":"e83bfed7-2c34-4972-bcac-89bb8621051c","Type":"ContainerStarted","Data":"5c2557dd5e019e9777c1438b48c84bba7d47a2d46b5502e589f66942a88af833"}
Dec 09 11:39:47 crc kubenswrapper[5002]: I1209 11:39:47.315919 5002 generic.go:334] "Generic (PLEG): container finished" podID="e83bfed7-2c34-4972-bcac-89bb8621051c" containerID="0e61ed68a1115dc02fafe33f26ea6efe27c8c9046915395c237c0ea4331568d8" exitCode=0
Dec 09 11:39:47 crc kubenswrapper[5002]: I1209 11:39:47.316034 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-njkxr" event={"ID":"e83bfed7-2c34-4972-bcac-89bb8621051c","Type":"ContainerDied","Data":"0e61ed68a1115dc02fafe33f26ea6efe27c8c9046915395c237c0ea4331568d8"}
Dec 09 11:39:50 crc kubenswrapper[5002]: I1209 11:39:50.345663 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-njkxr" event={"ID":"e83bfed7-2c34-4972-bcac-89bb8621051c","Type":"ContainerStarted","Data":"0da255fdea6c2b4ba9b3de690f57ecd622b0d5302ecf1216ce4a1229c352c4ce"}
Dec 09 11:39:50 crc kubenswrapper[5002]: I1209 11:39:50.369959 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-59f8cff499-njkxr" podStartSLOduration=2.829666185 podStartE2EDuration="6.369940954s" podCreationTimestamp="2025-12-09 11:39:44 +0000 UTC" firstStartedPulling="2025-12-09 11:39:45.614971203 +0000 UTC m=+5918.007022294" lastFinishedPulling="2025-12-09 11:39:49.155245982 +0000 UTC m=+5921.547297063" observedRunningTime="2025-12-09 11:39:50.366755439 +0000 UTC m=+5922.758806590" watchObservedRunningTime="2025-12-09 11:39:50.369940954 +0000 UTC m=+5922.761992045"
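The startup-latency record for octavia-image-upload-59f8cff499-njkxr is internally consistent: podStartE2EDuration (6.369940954s) is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration (2.829666185) is the E2E duration minus the image-pull window (lastFinishedPulling minus firstStartedPulling, about 3.54s). A small Go check of that arithmetic using the wall-clock timestamps from the record; the last digits differ by roughly 10ns from the logged value because the tracker works from the monotonic m=+ offsets:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Layout matching the log's "2025-12-09 11:39:44 +0000 UTC" format;
	// the fractional seconds are optional in Go's parser.
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	parse := func(s string) time.Time {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		return t
	}
	created := parse("2025-12-09 11:39:44 +0000 UTC")
	firstPull := parse("2025-12-09 11:39:45.614971203 +0000 UTC")
	lastPull := parse("2025-12-09 11:39:49.155245982 +0000 UTC")
	watched := parse("2025-12-09 11:39:50.369940954 +0000 UTC")

	e2e := watched.Sub(created)   // 6.369940954s = podStartE2EDuration
	pull := lastPull.Sub(firstPull) // 3.540274779s image-pull window
	slo := e2e - pull             // ~2.829666175s ≈ podStartSLOduration
	fmt.Println(e2e, pull, slo)
}
```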
Dec 09 11:40:03 crc kubenswrapper[5002]: I1209 11:40:03.933338 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-healthmanager-jdm5x"]
Dec 09 11:40:03 crc kubenswrapper[5002]: I1209 11:40:03.936591 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-jdm5x"
Dec 09 11:40:03 crc kubenswrapper[5002]: I1209 11:40:03.942653 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-scripts"
Dec 09 11:40:03 crc kubenswrapper[5002]: I1209 11:40:03.942926 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-config-data"
Dec 09 11:40:03 crc kubenswrapper[5002]: I1209 11:40:03.943047 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-certs-secret"
Dec 09 11:40:03 crc kubenswrapper[5002]: I1209 11:40:03.946989 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-jdm5x"]
Dec 09 11:40:04 crc kubenswrapper[5002]: I1209 11:40:04.060271 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/3b1a1027-677c-4d78-b807-84091ef5fa84-config-data-merged\") pod \"octavia-healthmanager-jdm5x\" (UID: \"3b1a1027-677c-4d78-b807-84091ef5fa84\") " pod="openstack/octavia-healthmanager-jdm5x"
Dec 09 11:40:04 crc kubenswrapper[5002]: I1209 11:40:04.060517 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b1a1027-677c-4d78-b807-84091ef5fa84-config-data\") pod \"octavia-healthmanager-jdm5x\" (UID: \"3b1a1027-677c-4d78-b807-84091ef5fa84\") " pod="openstack/octavia-healthmanager-jdm5x"
Dec 09 11:40:04 crc kubenswrapper[5002]: I1209 11:40:04.060569 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/3b1a1027-677c-4d78-b807-84091ef5fa84-hm-ports\") pod \"octavia-healthmanager-jdm5x\" (UID: \"3b1a1027-677c-4d78-b807-84091ef5fa84\") " pod="openstack/octavia-healthmanager-jdm5x"
Dec 09 11:40:04 crc kubenswrapper[5002]: I1209 11:40:04.060855 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/3b1a1027-677c-4d78-b807-84091ef5fa84-amphora-certs\") pod \"octavia-healthmanager-jdm5x\" (UID: \"3b1a1027-677c-4d78-b807-84091ef5fa84\") " pod="openstack/octavia-healthmanager-jdm5x"
Dec 09 11:40:04 crc kubenswrapper[5002]: I1209 11:40:04.060963 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b1a1027-677c-4d78-b807-84091ef5fa84-scripts\") pod \"octavia-healthmanager-jdm5x\" (UID: \"3b1a1027-677c-4d78-b807-84091ef5fa84\") " pod="openstack/octavia-healthmanager-jdm5x"
Dec 09 11:40:04 crc kubenswrapper[5002]: I1209 11:40:04.061003 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1a1027-677c-4d78-b807-84091ef5fa84-combined-ca-bundle\") pod \"octavia-healthmanager-jdm5x\" (UID: \"3b1a1027-677c-4d78-b807-84091ef5fa84\") " pod="openstack/octavia-healthmanager-jdm5x"
Dec 09 11:40:04 crc kubenswrapper[5002]: I1209 11:40:04.163030 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/3b1a1027-677c-4d78-b807-84091ef5fa84-hm-ports\") pod \"octavia-healthmanager-jdm5x\" (UID: \"3b1a1027-677c-4d78-b807-84091ef5fa84\") " pod="openstack/octavia-healthmanager-jdm5x"
Dec 09 11:40:04 crc kubenswrapper[5002]: I1209 11:40:04.163135 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/3b1a1027-677c-4d78-b807-84091ef5fa84-amphora-certs\") pod \"octavia-healthmanager-jdm5x\" (UID: \"3b1a1027-677c-4d78-b807-84091ef5fa84\") " pod="openstack/octavia-healthmanager-jdm5x"
Dec 09 11:40:04 crc kubenswrapper[5002]: I1209 11:40:04.163166 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b1a1027-677c-4d78-b807-84091ef5fa84-scripts\") pod \"octavia-healthmanager-jdm5x\" (UID: \"3b1a1027-677c-4d78-b807-84091ef5fa84\") " pod="openstack/octavia-healthmanager-jdm5x"
Dec 09 11:40:04 crc kubenswrapper[5002]: I1209 11:40:04.163285 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1a1027-677c-4d78-b807-84091ef5fa84-combined-ca-bundle\") pod \"octavia-healthmanager-jdm5x\" (UID: \"3b1a1027-677c-4d78-b807-84091ef5fa84\") " pod="openstack/octavia-healthmanager-jdm5x"
Dec 09 11:40:04 crc kubenswrapper[5002]: I1209 11:40:04.163331 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/3b1a1027-677c-4d78-b807-84091ef5fa84-config-data-merged\") pod \"octavia-healthmanager-jdm5x\" (UID: \"3b1a1027-677c-4d78-b807-84091ef5fa84\") " pod="openstack/octavia-healthmanager-jdm5x"
Dec 09 11:40:04 crc kubenswrapper[5002]: I1209 11:40:04.163405 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b1a1027-677c-4d78-b807-84091ef5fa84-config-data\") pod \"octavia-healthmanager-jdm5x\" (UID: \"3b1a1027-677c-4d78-b807-84091ef5fa84\") " pod="openstack/octavia-healthmanager-jdm5x"
Dec 09 11:40:04 crc kubenswrapper[5002]: I1209 11:40:04.164991 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/3b1a1027-677c-4d78-b807-84091ef5fa84-config-data-merged\") pod \"octavia-healthmanager-jdm5x\" (UID: \"3b1a1027-677c-4d78-b807-84091ef5fa84\") " pod="openstack/octavia-healthmanager-jdm5x"
Dec 09 11:40:04 crc kubenswrapper[5002]: I1209 11:40:04.165427 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/3b1a1027-677c-4d78-b807-84091ef5fa84-hm-ports\") pod \"octavia-healthmanager-jdm5x\" (UID: \"3b1a1027-677c-4d78-b807-84091ef5fa84\") " pod="openstack/octavia-healthmanager-jdm5x"
Dec 09 11:40:04 crc kubenswrapper[5002]: I1209 11:40:04.169663 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/3b1a1027-677c-4d78-b807-84091ef5fa84-amphora-certs\") pod \"octavia-healthmanager-jdm5x\" (UID: \"3b1a1027-677c-4d78-b807-84091ef5fa84\") " pod="openstack/octavia-healthmanager-jdm5x"
Dec 09 11:40:04 crc kubenswrapper[5002]: I1209 11:40:04.184449 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1a1027-677c-4d78-b807-84091ef5fa84-combined-ca-bundle\") pod \"octavia-healthmanager-jdm5x\" (UID: \"3b1a1027-677c-4d78-b807-84091ef5fa84\") " pod="openstack/octavia-healthmanager-jdm5x"
Dec 09 11:40:04 crc kubenswrapper[5002]: I1209 11:40:04.184591 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b1a1027-677c-4d78-b807-84091ef5fa84-config-data\") pod \"octavia-healthmanager-jdm5x\" (UID: \"3b1a1027-677c-4d78-b807-84091ef5fa84\") " pod="openstack/octavia-healthmanager-jdm5x"
Dec 09 11:40:04 crc kubenswrapper[5002]: I1209 11:40:04.185319 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b1a1027-677c-4d78-b807-84091ef5fa84-scripts\") pod \"octavia-healthmanager-jdm5x\" (UID: \"3b1a1027-677c-4d78-b807-84091ef5fa84\") " pod="openstack/octavia-healthmanager-jdm5x"
Dec 09 11:40:04 crc kubenswrapper[5002]: I1209 11:40:04.253610 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-jdm5x"
Dec 09 11:40:04 crc kubenswrapper[5002]: I1209 11:40:04.867299 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-jdm5x"]
Dec 09 11:40:04 crc kubenswrapper[5002]: W1209 11:40:04.872291 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b1a1027_677c_4d78_b807_84091ef5fa84.slice/crio-a99e47ccd78263858660a81da9b22f0847fb71ed41eb2307e1ed6ed8afab163b WatchSource:0}: Error finding container a99e47ccd78263858660a81da9b22f0847fb71ed41eb2307e1ed6ed8afab163b: Status 404 returned error can't find the container with id a99e47ccd78263858660a81da9b22f0847fb71ed41eb2307e1ed6ed8afab163b
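The cAdvisor warning just above ("Failed to process watch event ... Status 404") is a startup race rather than a failure: the cgroup for crio-a99e47cc... appears on disk before CRI-O has finished registering the container, so the first lookup 404s and a later pass succeeds (the ContainerStarted event for the same ID follows shortly). A self-contained sketch of that retry-with-backoff shape; lookup and errNotFound are illustrative stand-ins, not kubelet or cAdvisor APIs:

```go
package main

import (
	"errors"
	"fmt"
	"time"
)

// errNotFound stands in for the runtime's "container not found" answer
// (the Status 404 in the warning above). Hypothetical sentinel.
var errNotFound = errors.New("container not found")

// resolveContainer retries a lookup that can race with container
// registration, backing off between attempts.
func resolveContainer(lookup func() error, attempts int, wait time.Duration) error {
	var err error
	for i := 0; i < attempts; i++ {
		if err = lookup(); err == nil {
			return nil
		}
		if !errors.Is(err, errNotFound) {
			return err // a real failure: do not retry
		}
		time.Sleep(wait)
		wait *= 2 // exponential backoff between attempts
	}
	return err
}

func main() {
	calls := 0
	err := resolveContainer(func() error {
		calls++
		if calls < 3 {
			return errNotFound // first lookups race with registration
		}
		return nil
	}, 5, 10*time.Millisecond)
	fmt.Println("resolved after", calls, "calls, err =", err)
}
```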
Dec 09 11:40:05 crc kubenswrapper[5002]: I1209 11:40:05.365804 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-housekeeping-ljd8f"]
Dec 09 11:40:05 crc kubenswrapper[5002]: I1209 11:40:05.368515 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-housekeeping-ljd8f"
Dec 09 11:40:05 crc kubenswrapper[5002]: I1209 11:40:05.370910 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-config-data"
Dec 09 11:40:05 crc kubenswrapper[5002]: I1209 11:40:05.374244 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-scripts"
Dec 09 11:40:05 crc kubenswrapper[5002]: I1209 11:40:05.378973 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-ljd8f"]
Dec 09 11:40:05 crc kubenswrapper[5002]: I1209 11:40:05.488343 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/649fd26a-484b-46f2-b1d5-5acffdf62265-config-data\") pod \"octavia-housekeeping-ljd8f\" (UID: \"649fd26a-484b-46f2-b1d5-5acffdf62265\") " pod="openstack/octavia-housekeeping-ljd8f"
Dec 09 11:40:05 crc kubenswrapper[5002]: I1209 11:40:05.488401 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/649fd26a-484b-46f2-b1d5-5acffdf62265-amphora-certs\") pod \"octavia-housekeeping-ljd8f\" (UID: \"649fd26a-484b-46f2-b1d5-5acffdf62265\") " pod="openstack/octavia-housekeeping-ljd8f"
Dec 09 11:40:05 crc kubenswrapper[5002]: I1209 11:40:05.488501 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/649fd26a-484b-46f2-b1d5-5acffdf62265-combined-ca-bundle\") pod \"octavia-housekeeping-ljd8f\" (UID: \"649fd26a-484b-46f2-b1d5-5acffdf62265\") " pod="openstack/octavia-housekeeping-ljd8f"
Dec 09 11:40:05 crc kubenswrapper[5002]: I1209 11:40:05.488559 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/649fd26a-484b-46f2-b1d5-5acffdf62265-hm-ports\") pod \"octavia-housekeeping-ljd8f\" (UID: \"649fd26a-484b-46f2-b1d5-5acffdf62265\") " pod="openstack/octavia-housekeeping-ljd8f"
Dec 09 11:40:05 crc kubenswrapper[5002]: I1209 11:40:05.488661 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/649fd26a-484b-46f2-b1d5-5acffdf62265-scripts\") pod \"octavia-housekeeping-ljd8f\" (UID: \"649fd26a-484b-46f2-b1d5-5acffdf62265\") " pod="openstack/octavia-housekeeping-ljd8f"
Dec 09 11:40:05 crc kubenswrapper[5002]: I1209 11:40:05.488714 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/649fd26a-484b-46f2-b1d5-5acffdf62265-config-data-merged\") pod \"octavia-housekeeping-ljd8f\" (UID: \"649fd26a-484b-46f2-b1d5-5acffdf62265\") " pod="openstack/octavia-housekeeping-ljd8f"
Dec 09 11:40:05 crc kubenswrapper[5002]: I1209 11:40:05.500544 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-jdm5x" event={"ID":"3b1a1027-677c-4d78-b807-84091ef5fa84","Type":"ContainerStarted","Data":"6ba1a705791d7dd4a37c3b640259db15bd2446a242ee166d1c06c3912e775dcb"}
Dec 09 11:40:05 crc kubenswrapper[5002]: I1209 11:40:05.500594 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-jdm5x" event={"ID":"3b1a1027-677c-4d78-b807-84091ef5fa84","Type":"ContainerStarted","Data":"a99e47ccd78263858660a81da9b22f0847fb71ed41eb2307e1ed6ed8afab163b"}
Dec 09 11:40:05 crc kubenswrapper[5002]: I1209 11:40:05.590680 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/649fd26a-484b-46f2-b1d5-5acffdf62265-config-data\") pod \"octavia-housekeeping-ljd8f\" (UID: \"649fd26a-484b-46f2-b1d5-5acffdf62265\") " pod="openstack/octavia-housekeeping-ljd8f"
Dec 09 11:40:05 crc kubenswrapper[5002]: I1209 11:40:05.591054 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/649fd26a-484b-46f2-b1d5-5acffdf62265-amphora-certs\") pod \"octavia-housekeeping-ljd8f\" (UID: \"649fd26a-484b-46f2-b1d5-5acffdf62265\") " pod="openstack/octavia-housekeeping-ljd8f"
Dec 09 11:40:05 crc kubenswrapper[5002]: I1209 11:40:05.591079 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/649fd26a-484b-46f2-b1d5-5acffdf62265-combined-ca-bundle\") pod \"octavia-housekeeping-ljd8f\" (UID: \"649fd26a-484b-46f2-b1d5-5acffdf62265\") " pod="openstack/octavia-housekeeping-ljd8f"
Dec 09 11:40:05 crc kubenswrapper[5002]: I1209 11:40:05.591106 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/649fd26a-484b-46f2-b1d5-5acffdf62265-hm-ports\") pod \"octavia-housekeeping-ljd8f\" (UID: \"649fd26a-484b-46f2-b1d5-5acffdf62265\") " pod="openstack/octavia-housekeeping-ljd8f"
Dec 09 11:40:05 crc kubenswrapper[5002]: I1209 11:40:05.591182 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/649fd26a-484b-46f2-b1d5-5acffdf62265-scripts\") pod \"octavia-housekeeping-ljd8f\" (UID: \"649fd26a-484b-46f2-b1d5-5acffdf62265\") " pod="openstack/octavia-housekeeping-ljd8f"
Dec 09 11:40:05 crc kubenswrapper[5002]: I1209 11:40:05.591206 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/649fd26a-484b-46f2-b1d5-5acffdf62265-config-data-merged\") pod \"octavia-housekeeping-ljd8f\" (UID: \"649fd26a-484b-46f2-b1d5-5acffdf62265\") " pod="openstack/octavia-housekeeping-ljd8f"
Dec 09 11:40:05 crc kubenswrapper[5002]: I1209 11:40:05.591666 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/649fd26a-484b-46f2-b1d5-5acffdf62265-config-data-merged\") pod \"octavia-housekeeping-ljd8f\" (UID: \"649fd26a-484b-46f2-b1d5-5acffdf62265\") " pod="openstack/octavia-housekeeping-ljd8f"
Dec 09 11:40:05 crc kubenswrapper[5002]: I1209 11:40:05.592474 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/649fd26a-484b-46f2-b1d5-5acffdf62265-hm-ports\") pod \"octavia-housekeeping-ljd8f\" (UID: \"649fd26a-484b-46f2-b1d5-5acffdf62265\") " pod="openstack/octavia-housekeeping-ljd8f"
Dec 09 11:40:05 crc kubenswrapper[5002]: I1209 11:40:05.598539 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/649fd26a-484b-46f2-b1d5-5acffdf62265-combined-ca-bundle\") pod \"octavia-housekeeping-ljd8f\" (UID: \"649fd26a-484b-46f2-b1d5-5acffdf62265\") " pod="openstack/octavia-housekeeping-ljd8f"
Dec 09 11:40:05 crc kubenswrapper[5002]: I1209 11:40:05.599510 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/649fd26a-484b-46f2-b1d5-5acffdf62265-config-data\") pod \"octavia-housekeeping-ljd8f\" (UID: \"649fd26a-484b-46f2-b1d5-5acffdf62265\") " pod="openstack/octavia-housekeeping-ljd8f"
Dec 09 11:40:05 crc kubenswrapper[5002]: I1209 11:40:05.601673 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/649fd26a-484b-46f2-b1d5-5acffdf62265-scripts\") pod \"octavia-housekeeping-ljd8f\" (UID: \"649fd26a-484b-46f2-b1d5-5acffdf62265\") " pod="openstack/octavia-housekeeping-ljd8f"
Dec 09 11:40:05 crc kubenswrapper[5002]: I1209 11:40:05.604700 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/649fd26a-484b-46f2-b1d5-5acffdf62265-amphora-certs\") pod \"octavia-housekeeping-ljd8f\" (UID: \"649fd26a-484b-46f2-b1d5-5acffdf62265\") " pod="openstack/octavia-housekeeping-ljd8f"
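Every pod in this section goes through the same three-step volume dance: "VerifyControllerAttachedVolume" (the volume enters the desired state), "MountVolume started", then "MountVolume.SetUp succeeded"; the UnmountVolume entries earlier are the same reconciler running in reverse on deletion. A toy desired-state/actual-state loop showing the shape of that reconciliation, with names that are illustrative and not the kubelet volume manager's API:

```go
package main

import "fmt"

type volumeName string

// reconcile drives the actual state (mounted) toward the desired state,
// printing log lines shaped like the reconciler_common entries above.
func reconcile(desired, mounted map[volumeName]bool) {
	for v := range desired {
		if !mounted[v] {
			fmt.Printf("MountVolume started for volume %q\n", v)
			mounted[v] = true // stands in for the actual SetUp call
			fmt.Printf("MountVolume.SetUp succeeded for volume %q\n", v)
		}
	}
	for v := range mounted {
		if !desired[v] {
			fmt.Printf("UnmountVolume started for volume %q\n", v)
			delete(mounted, v) // deleting while ranging is safe in Go
		}
	}
}

func main() {
	desired := map[volumeName]bool{"config-data": true, "scripts": true, "hm-ports": true}
	mounted := map[volumeName]bool{}
	reconcile(desired, mounted) // mounts everything, as in the log above
}
```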
Dec 09 11:40:05 crc kubenswrapper[5002]: I1209 11:40:05.690090 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-housekeeping-ljd8f"
Dec 09 11:40:06 crc kubenswrapper[5002]: I1209 11:40:06.172902 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-worker-l54zc"]
Dec 09 11:40:06 crc kubenswrapper[5002]: I1209 11:40:06.175259 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-worker-l54zc"
Dec 09 11:40:06 crc kubenswrapper[5002]: I1209 11:40:06.180335 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-scripts"
Dec 09 11:40:06 crc kubenswrapper[5002]: I1209 11:40:06.185613 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-l54zc"]
Dec 09 11:40:06 crc kubenswrapper[5002]: I1209 11:40:06.187978 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-config-data"
Dec 09 11:40:06 crc kubenswrapper[5002]: I1209 11:40:06.281940 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-ljd8f"]
Dec 09 11:40:06 crc kubenswrapper[5002]: I1209 11:40:06.310780 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1f5f591-fa99-49ac-a17c-b643790ffdba-config-data\") pod \"octavia-worker-l54zc\" (UID: \"a1f5f591-fa99-49ac-a17c-b643790ffdba\") " pod="openstack/octavia-worker-l54zc"
Dec 09 11:40:06 crc kubenswrapper[5002]: I1209 11:40:06.310846 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1f5f591-fa99-49ac-a17c-b643790ffdba-combined-ca-bundle\") pod \"octavia-worker-l54zc\" (UID: \"a1f5f591-fa99-49ac-a17c-b643790ffdba\") " pod="openstack/octavia-worker-l54zc"
Dec 09 11:40:06 crc kubenswrapper[5002]: I1209 11:40:06.310925 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/a1f5f591-fa99-49ac-a17c-b643790ffdba-hm-ports\") pod \"octavia-worker-l54zc\" (UID: \"a1f5f591-fa99-49ac-a17c-b643790ffdba\") " pod="openstack/octavia-worker-l54zc"
Dec 09 11:40:06 crc kubenswrapper[5002]: I1209 11:40:06.311204 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1f5f591-fa99-49ac-a17c-b643790ffdba-scripts\") pod \"octavia-worker-l54zc\" (UID: \"a1f5f591-fa99-49ac-a17c-b643790ffdba\") " pod="openstack/octavia-worker-l54zc"
Dec 09 11:40:06 crc kubenswrapper[5002]: I1209 11:40:06.311340 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/a1f5f591-fa99-49ac-a17c-b643790ffdba-config-data-merged\") pod \"octavia-worker-l54zc\" (UID: \"a1f5f591-fa99-49ac-a17c-b643790ffdba\") " pod="openstack/octavia-worker-l54zc"
Dec 09 11:40:06 crc kubenswrapper[5002]: I1209 11:40:06.311551 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/a1f5f591-fa99-49ac-a17c-b643790ffdba-amphora-certs\") pod \"octavia-worker-l54zc\" (UID: \"a1f5f591-fa99-49ac-a17c-b643790ffdba\") " pod="openstack/octavia-worker-l54zc"
Dec 09 11:40:06 crc kubenswrapper[5002]: I1209 11:40:06.412955 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/a1f5f591-fa99-49ac-a17c-b643790ffdba-amphora-certs\") pod \"octavia-worker-l54zc\" (UID: \"a1f5f591-fa99-49ac-a17c-b643790ffdba\") " pod="openstack/octavia-worker-l54zc"
Dec 09 11:40:06 crc kubenswrapper[5002]: I1209 11:40:06.413285 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1f5f591-fa99-49ac-a17c-b643790ffdba-config-data\") pod \"octavia-worker-l54zc\" (UID: \"a1f5f591-fa99-49ac-a17c-b643790ffdba\") " pod="openstack/octavia-worker-l54zc"
Dec 09 11:40:06 crc kubenswrapper[5002]: I1209 11:40:06.413309 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1f5f591-fa99-49ac-a17c-b643790ffdba-combined-ca-bundle\") pod \"octavia-worker-l54zc\" (UID: \"a1f5f591-fa99-49ac-a17c-b643790ffdba\") " pod="openstack/octavia-worker-l54zc"
Dec 09 11:40:06 crc kubenswrapper[5002]: I1209 11:40:06.413371 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/a1f5f591-fa99-49ac-a17c-b643790ffdba-hm-ports\") pod \"octavia-worker-l54zc\" (UID: \"a1f5f591-fa99-49ac-a17c-b643790ffdba\") " pod="openstack/octavia-worker-l54zc"
Dec 09 11:40:06 crc kubenswrapper[5002]: I1209 11:40:06.413439 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1f5f591-fa99-49ac-a17c-b643790ffdba-scripts\") pod \"octavia-worker-l54zc\" (UID: \"a1f5f591-fa99-49ac-a17c-b643790ffdba\") " pod="openstack/octavia-worker-l54zc"
Dec 09 11:40:06 crc kubenswrapper[5002]: I1209 11:40:06.413490 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/a1f5f591-fa99-49ac-a17c-b643790ffdba-config-data-merged\") pod \"octavia-worker-l54zc\" (UID: \"a1f5f591-fa99-49ac-a17c-b643790ffdba\") " pod="openstack/octavia-worker-l54zc"
Dec 09 11:40:06 crc kubenswrapper[5002]: I1209 11:40:06.414146 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/a1f5f591-fa99-49ac-a17c-b643790ffdba-config-data-merged\") pod \"octavia-worker-l54zc\" (UID: \"a1f5f591-fa99-49ac-a17c-b643790ffdba\") " pod="openstack/octavia-worker-l54zc"
Dec 09 11:40:06 crc kubenswrapper[5002]: I1209 11:40:06.414989 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/a1f5f591-fa99-49ac-a17c-b643790ffdba-hm-ports\") pod \"octavia-worker-l54zc\" (UID: \"a1f5f591-fa99-49ac-a17c-b643790ffdba\") " pod="openstack/octavia-worker-l54zc"
Dec 09 11:40:06 crc kubenswrapper[5002]: I1209 11:40:06.418482 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/a1f5f591-fa99-49ac-a17c-b643790ffdba-amphora-certs\") pod \"octavia-worker-l54zc\" (UID: \"a1f5f591-fa99-49ac-a17c-b643790ffdba\") " pod="openstack/octavia-worker-l54zc"
Dec 09 11:40:06 crc kubenswrapper[5002]: I1209 11:40:06.419336 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1f5f591-fa99-49ac-a17c-b643790ffdba-config-data\") pod \"octavia-worker-l54zc\" (UID: \"a1f5f591-fa99-49ac-a17c-b643790ffdba\") " pod="openstack/octavia-worker-l54zc"
Dec 09 11:40:06 crc kubenswrapper[5002]: I1209 11:40:06.420898 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1f5f591-fa99-49ac-a17c-b643790ffdba-scripts\") pod \"octavia-worker-l54zc\" (UID: \"a1f5f591-fa99-49ac-a17c-b643790ffdba\") " pod="openstack/octavia-worker-l54zc"
Dec 09 11:40:06 crc kubenswrapper[5002]: I1209 11:40:06.433226 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1f5f591-fa99-49ac-a17c-b643790ffdba-combined-ca-bundle\") pod \"octavia-worker-l54zc\" (UID: \"a1f5f591-fa99-49ac-a17c-b643790ffdba\") " pod="openstack/octavia-worker-l54zc"
Dec 09 11:40:06 crc kubenswrapper[5002]: I1209 11:40:06.507995 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-worker-l54zc"
Dec 09 11:40:06 crc kubenswrapper[5002]: I1209 11:40:06.515206 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-ljd8f" event={"ID":"649fd26a-484b-46f2-b1d5-5acffdf62265","Type":"ContainerStarted","Data":"717d7336ac93a012473a478f3963bc156986e842d492fddb4c914d4a6c9e3ba9"}
Dec 09 11:40:07 crc kubenswrapper[5002]: I1209 11:40:07.021525 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-l54zc"]
Dec 09 11:40:07 crc kubenswrapper[5002]: W1209 11:40:07.114804 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1f5f591_fa99_49ac_a17c_b643790ffdba.slice/crio-3ce397fd4099178e1c2cfe5f65043628362120ab9414353f1253f0038ef098d8 WatchSource:0}: Error finding container 3ce397fd4099178e1c2cfe5f65043628362120ab9414353f1253f0038ef098d8: Status 404 returned error can't find the container with id 3ce397fd4099178e1c2cfe5f65043628362120ab9414353f1253f0038ef098d8
Dec 09 11:40:07 crc kubenswrapper[5002]: I1209 11:40:07.528998 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-l54zc" event={"ID":"a1f5f591-fa99-49ac-a17c-b643790ffdba","Type":"ContainerStarted","Data":"3ce397fd4099178e1c2cfe5f65043628362120ab9414353f1253f0038ef098d8"}
Dec 09 11:40:07 crc kubenswrapper[5002]: I1209 11:40:07.531220 5002 generic.go:334] "Generic (PLEG): container finished" podID="3b1a1027-677c-4d78-b807-84091ef5fa84" containerID="6ba1a705791d7dd4a37c3b640259db15bd2446a242ee166d1c06c3912e775dcb" exitCode=0
Dec 09 11:40:07 crc kubenswrapper[5002]: I1209 11:40:07.531253 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-jdm5x" event={"ID":"3b1a1027-677c-4d78-b807-84091ef5fa84","Type":"ContainerDied","Data":"6ba1a705791d7dd4a37c3b640259db15bd2446a242ee166d1c06c3912e775dcb"}
Dec 09 11:40:07 crc kubenswrapper[5002]: I1209 11:40:07.964529 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 09 11:40:07 crc kubenswrapper[5002]: I1209 11:40:07.965426 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 09 11:40:08 crc kubenswrapper[5002]: I1209 11:40:08.545316 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-ljd8f" event={"ID":"649fd26a-484b-46f2-b1d5-5acffdf62265","Type":"ContainerStarted","Data":"3697f8a0ca2ca97552e53d90eb9defcca3c144b8baa1293fbe9c88836fc99138"}
Dec 09 11:40:08 crc kubenswrapper[5002]: I1209 11:40:08.552424 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-jdm5x" event={"ID":"3b1a1027-677c-4d78-b807-84091ef5fa84","Type":"ContainerStarted","Data":"279d51b2b09b46cde2cf19e476a070427e9ae2be457ef8041cb2c2bfeb271857"}
Dec 09 11:40:08 crc kubenswrapper[5002]: I1209 11:40:08.553065 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-healthmanager-jdm5x"
Dec 09 11:40:08 crc kubenswrapper[5002]: I1209 11:40:08.598025 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-healthmanager-jdm5x" podStartSLOduration=5.598006397 podStartE2EDuration="5.598006397s" podCreationTimestamp="2025-12-09 11:40:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:40:08.588266537 +0000 UTC m=+5940.980317628" watchObservedRunningTime="2025-12-09 11:40:08.598006397 +0000 UTC m=+5940.990057478"
Dec 09 11:40:09 crc kubenswrapper[5002]: I1209 11:40:09.570163 5002 generic.go:334] "Generic (PLEG): container finished" podID="649fd26a-484b-46f2-b1d5-5acffdf62265" containerID="3697f8a0ca2ca97552e53d90eb9defcca3c144b8baa1293fbe9c88836fc99138" exitCode=0
Dec 09 11:40:09 crc kubenswrapper[5002]: I1209 11:40:09.570207 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-ljd8f" event={"ID":"649fd26a-484b-46f2-b1d5-5acffdf62265","Type":"ContainerDied","Data":"3697f8a0ca2ca97552e53d90eb9defcca3c144b8baa1293fbe9c88836fc99138"}
Dec 09 11:40:09 crc kubenswrapper[5002]: I1209 11:40:09.575839 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-l54zc" event={"ID":"a1f5f591-fa99-49ac-a17c-b643790ffdba","Type":"ContainerStarted","Data":"e8572a4411dfde54999c42e1bcec2d8358b0403876913ffb406fe35ca987fb96"}
Dec 09 11:40:10 crc kubenswrapper[5002]: I1209 11:40:10.589643 5002 generic.go:334] "Generic (PLEG): container finished" podID="a1f5f591-fa99-49ac-a17c-b643790ffdba" containerID="e8572a4411dfde54999c42e1bcec2d8358b0403876913ffb406fe35ca987fb96" exitCode=0
Dec 09 11:40:10 crc kubenswrapper[5002]: I1209 11:40:10.589834 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-l54zc" event={"ID":"a1f5f591-fa99-49ac-a17c-b643790ffdba","Type":"ContainerDied","Data":"e8572a4411dfde54999c42e1bcec2d8358b0403876913ffb406fe35ca987fb96"}
Dec 09 11:40:10 crc kubenswrapper[5002]: I1209 11:40:10.592403 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-ljd8f" event={"ID":"649fd26a-484b-46f2-b1d5-5acffdf62265","Type":"ContainerStarted","Data":"0122e523e5e4a37bbdaafa6a86f3826009b36224bdaa5a1069d47891a6f49116"}
Dec 09 11:40:10 crc kubenswrapper[5002]: I1209 11:40:10.593383 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-housekeeping-ljd8f"
Dec 09 11:40:10 crc kubenswrapper[5002]: I1209 11:40:10.635231 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-housekeeping-ljd8f" podStartSLOduration=4.429382286 podStartE2EDuration="5.63520208s" podCreationTimestamp="2025-12-09 11:40:05 +0000 UTC" firstStartedPulling="2025-12-09 11:40:06.265584367 +0000 UTC m=+5938.657635448" lastFinishedPulling="2025-12-09 11:40:07.471404161 +0000 UTC m=+5939.863455242" observedRunningTime="2025-12-09 11:40:10.634303486 +0000 UTC m=+5943.026354567" watchObservedRunningTime="2025-12-09 11:40:10.63520208 +0000 UTC m=+5943.027253181"
Dec 09 11:40:11 crc kubenswrapper[5002]: I1209 11:40:11.603428 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-l54zc" event={"ID":"a1f5f591-fa99-49ac-a17c-b643790ffdba","Type":"ContainerStarted","Data":"eec489984f10f0c7de7af8204b48650b3c6447c49eb621a6460cdf69f2464dbf"}
Dec 09 11:40:11 crc kubenswrapper[5002]: I1209 11:40:11.605213 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-worker-l54zc"
Dec 09 11:40:11 crc kubenswrapper[5002]: I1209 11:40:11.637550 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-worker-l54zc" podStartSLOduration=4.326227355 podStartE2EDuration="5.637530221s" podCreationTimestamp="2025-12-09 11:40:06 +0000 UTC" firstStartedPulling="2025-12-09 11:40:07.117615668 +0000 UTC m=+5939.509666799" lastFinishedPulling="2025-12-09 11:40:08.428918584 +0000 UTC m=+5940.820969665" observedRunningTime="2025-12-09 11:40:11.634257674 +0000 UTC m=+5944.026308755" watchObservedRunningTime="2025-12-09 11:40:11.637530221 +0000 UTC m=+5944.029581302"
Dec 09 11:40:19 crc kubenswrapper[5002]: I1209 11:40:19.285014 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-healthmanager-jdm5x"
Dec 09 11:40:20 crc kubenswrapper[5002]: I1209 11:40:20.745039 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-housekeeping-ljd8f"
Dec 09 11:40:21 crc kubenswrapper[5002]: I1209 11:40:21.546992 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-worker-l54zc"
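All three Octavia daemons follow the same pattern: an init container runs to completion (exitCode=0), the main container starts, and the readiness probe flips from "" to "ready" some seconds later (about 11s for octavia-healthmanager-jdm5x). Externally that transition surfaces as the pod's Ready condition; a client-go sketch that reads it, assuming a default kubeconfig:

```go
package main

import (
	"context"
	"fmt"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

// isPodReady reports what the "SyncLoop (probe)" transition reflects:
// the pod's Ready condition becomes True once readiness probes pass.
func isPodReady(pod *corev1.Pod) bool {
	for _, c := range pod.Status.Conditions {
		if c.Type == corev1.PodReady {
			return c.Status == corev1.ConditionTrue
		}
	}
	return false
}

func main() {
	config, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	clientset, err := kubernetes.NewForConfig(config)
	if err != nil {
		panic(err)
	}
	pod, err := clientset.CoreV1().Pods("openstack").Get(context.TODO(),
		"octavia-healthmanager-jdm5x", metav1.GetOptions{})
	if err != nil {
		panic(err)
	}
	fmt.Println("ready:", isPodReady(pod))
}
```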
Dec 09 11:40:34 crc kubenswrapper[5002]: I1209 11:40:34.058928 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-tdh2j"]
Dec 09 11:40:34 crc kubenswrapper[5002]: I1209 11:40:34.075288 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-132e-account-create-update-nq22q"]
Dec 09 11:40:34 crc kubenswrapper[5002]: I1209 11:40:34.083694 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-tdh2j"]
Dec 09 11:40:34 crc kubenswrapper[5002]: I1209 11:40:34.090947 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-132e-account-create-update-nq22q"]
Dec 09 11:40:36 crc kubenswrapper[5002]: I1209 11:40:36.077316 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="600f358b-6892-4b2c-8f7a-f1f011c7a1e5" path="/var/lib/kubelet/pods/600f358b-6892-4b2c-8f7a-f1f011c7a1e5/volumes"
Dec 09 11:40:36 crc kubenswrapper[5002]: I1209 11:40:36.080353 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c8f540f-8211-4f60-befd-2b8d255d97aa" path="/var/lib/kubelet/pods/7c8f540f-8211-4f60-befd-2b8d255d97aa/volumes"
Dec 09 11:40:37 crc kubenswrapper[5002]: I1209 11:40:37.964321 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 09 11:40:37 crc kubenswrapper[5002]: I1209 11:40:37.964891 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 09 11:40:44 crc kubenswrapper[5002]: I1209 11:40:44.033550 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-sfv58"]
Dec 09 11:40:44 crc kubenswrapper[5002]: I1209 11:40:44.045274 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-sfv58"]
Dec 09 11:40:44 crc kubenswrapper[5002]: I1209 11:40:44.073061 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56b1e4c9-eb29-4110-8379-85134901c4db" path="/var/lib/kubelet/pods/56b1e4c9-eb29-4110-8379-85134901c4db/volumes"
Dec 09 11:40:47 crc kubenswrapper[5002]: I1209 11:40:47.192530 5002 scope.go:117] "RemoveContainer" containerID="661d25efeb4a9e4d58d461b6867ea7e1d8db71eabef039493c4574bb5905533f"
Dec 09 11:40:47 crc kubenswrapper[5002]: I1209 11:40:47.223688 5002 scope.go:117] "RemoveContainer" containerID="4b315336bfdc7b0f0e2cdaf3df1f2b62e25b6819f8f96c21ed763017270ccd5f"
Dec 09 11:40:47 crc kubenswrapper[5002]: I1209 11:40:47.293524 5002 scope.go:117] "RemoveContainer" containerID="737a0a8e432adfc8cb9edb16c5409db21219565eaef2c6787d5e97490f7d880c"
Dec 09 11:41:07 crc kubenswrapper[5002]: I1209 11:41:07.964758 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 09 11:41:07 crc kubenswrapper[5002]: I1209 11:41:07.965725 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 09 11:41:07 crc kubenswrapper[5002]: I1209 11:41:07.965797 5002 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6"
Dec 09 11:41:07 crc kubenswrapper[5002]: I1209 11:41:07.966904 5002 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"39962d0376837cc534e6b0a62303166efdae767fb36cfb81ae7c7eb077d56c3e"} pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 09 11:41:07 crc kubenswrapper[5002]: I1209 11:41:07.967011 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" containerID="cri-o://39962d0376837cc534e6b0a62303166efdae767fb36cfb81ae7c7eb077d56c3e" gracePeriod=600
Dec 09 11:41:08 crc kubenswrapper[5002]: E1209 11:41:08.102782 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 11:41:08 crc kubenswrapper[5002]: I1209 11:41:08.199427 5002 generic.go:334] "Generic (PLEG): container finished" podID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerID="39962d0376837cc534e6b0a62303166efdae767fb36cfb81ae7c7eb077d56c3e" exitCode=0
Dec 09 11:41:08 crc kubenswrapper[5002]: I1209 11:41:08.199508 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerDied","Data":"39962d0376837cc534e6b0a62303166efdae767fb36cfb81ae7c7eb077d56c3e"}
Dec 09 11:41:08 crc kubenswrapper[5002]: I1209 11:41:08.199970 5002 scope.go:117] "RemoveContainer" containerID="c9279947ed00c5b6531641df0eb3e04f34e3d816632d088e326b1acbc67d09a2"
Dec 09 11:41:08 crc kubenswrapper[5002]: I1209 11:41:08.200703 5002 scope.go:117] "RemoveContainer" containerID="39962d0376837cc534e6b0a62303166efdae767fb36cfb81ae7c7eb077d56c3e"
Dec 09 11:41:08 crc kubenswrapper[5002]: E1209 11:41:08.201005 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
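The machine-config-daemon entries show the full liveness-failure path: repeated probe failures against http://127.0.0.1:8798/health, a kill with the pod's 600s grace period, then a restart throttled by CrashLoopBackOff (exponential back-off capped at 5m0s, per the error text). The probe itself is just an HTTP GET where a transport error such as the "connection refused" above, or a status outside the 2xx/3xx range, counts as a failure; a minimal stand-in, not the kubelet's prober:

```go
package main

import (
	"fmt"
	"net/http"
	"time"
)

// probe performs an HTTP GET the way an HTTP liveness probe does: any
// transport error (e.g. connect: connection refused) or a status code
// outside 200-399 is reported as a failure.
func probe(url string) error {
	client := &http.Client{Timeout: 1 * time.Second}
	resp, err := client.Get(url)
	if err != nil {
		return err // e.g. dial tcp 127.0.0.1:8798: connect: connection refused
	}
	defer resp.Body.Close()
	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
		return fmt.Errorf("unhealthy status %d", resp.StatusCode)
	}
	return nil
}

func main() {
	// Endpoint taken from the log; on a healthy node this returns nil.
	fmt.Println(probe("http://127.0.0.1:8798/health"))
}
```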
Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.009932 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-57b4f846bf-cvsfb"]
Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.011916 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-57b4f846bf-cvsfb"
Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.014944 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts"
Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.014961 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon"
Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.015212 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-dm55t"
Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.026253 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data"
Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.027866 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-57b4f846bf-cvsfb"]
Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.069744 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.070324 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d" containerName="glance-log" containerID="cri-o://11bd6b82419787317723b421efc0133288435a57b6cf87cdfe4c3b856dbc19da" gracePeriod=30
Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.070425 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d" containerName="glance-httpd" containerID="cri-o://95de9323ecc7ac5f2f04f3b35d37d92ccbed743d76744ec9714c03a948831708" gracePeriod=30
Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.156691 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.156986 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b" containerName="glance-log" containerID="cri-o://9e1fdd6323ae596bb3e174d484097706c3446d95c512a3dc7a9c52a735175649" gracePeriod=30
Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.157059 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b" containerName="glance-httpd" containerID="cri-o://2ad7495d08109df0a1d09a4f59f28f80dbd0450288de4c579c6ddea94badcc3c" gracePeriod=30
Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.161142 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc2facf6-dfa2-431a-990c-3ea84fc68c9a-scripts\") pod \"horizon-57b4f846bf-cvsfb\" (UID: \"dc2facf6-dfa2-431a-990c-3ea84fc68c9a\") " pod="openstack/horizon-57b4f846bf-cvsfb"
Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.161341 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhxv2\" (UniqueName: \"kubernetes.io/projected/dc2facf6-dfa2-431a-990c-3ea84fc68c9a-kube-api-access-lhxv2\") pod \"horizon-57b4f846bf-cvsfb\" (UID: \"dc2facf6-dfa2-431a-990c-3ea84fc68c9a\") " pod="openstack/horizon-57b4f846bf-cvsfb"
Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.161485 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dc2facf6-dfa2-431a-990c-3ea84fc68c9a-config-data\") pod \"horizon-57b4f846bf-cvsfb\" (UID: \"dc2facf6-dfa2-431a-990c-3ea84fc68c9a\") " pod="openstack/horizon-57b4f846bf-cvsfb"
Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.161592 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc2facf6-dfa2-431a-990c-3ea84fc68c9a-logs\") pod \"horizon-57b4f846bf-cvsfb\" (UID: \"dc2facf6-dfa2-431a-990c-3ea84fc68c9a\") " pod="openstack/horizon-57b4f846bf-cvsfb"
Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.161690 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dc2facf6-dfa2-431a-990c-3ea84fc68c9a-horizon-secret-key\") pod \"horizon-57b4f846bf-cvsfb\" (UID: \"dc2facf6-dfa2-431a-990c-3ea84fc68c9a\") " pod="openstack/horizon-57b4f846bf-cvsfb"
Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.221228 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-76d4949bc5-j4vhr"]
Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.232169 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-76d4949bc5-j4vhr"
Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.236520 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-76d4949bc5-j4vhr"]
Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.268913 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/46d15617-ecbb-44d1-aa58-e11b6d40cc25-config-data\") pod \"horizon-76d4949bc5-j4vhr\" (UID: \"46d15617-ecbb-44d1-aa58-e11b6d40cc25\") " pod="openstack/horizon-76d4949bc5-j4vhr"
Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.269010 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/46d15617-ecbb-44d1-aa58-e11b6d40cc25-scripts\") pod \"horizon-76d4949bc5-j4vhr\" (UID: \"46d15617-ecbb-44d1-aa58-e11b6d40cc25\") " pod="openstack/horizon-76d4949bc5-j4vhr"
Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.269046 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc2facf6-dfa2-431a-990c-3ea84fc68c9a-scripts\") pod \"horizon-57b4f846bf-cvsfb\" (UID: \"dc2facf6-dfa2-431a-990c-3ea84fc68c9a\") " pod="openstack/horizon-57b4f846bf-cvsfb"
Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.269070 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhxv2\" (UniqueName: \"kubernetes.io/projected/dc2facf6-dfa2-431a-990c-3ea84fc68c9a-kube-api-access-lhxv2\") pod \"horizon-57b4f846bf-cvsfb\" (UID: \"dc2facf6-dfa2-431a-990c-3ea84fc68c9a\") " pod="openstack/horizon-57b4f846bf-cvsfb"
Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.269150 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46d15617-ecbb-44d1-aa58-e11b6d40cc25-logs\") pod \"horizon-76d4949bc5-j4vhr\" (UID: \"46d15617-ecbb-44d1-aa58-e11b6d40cc25\") " pod="openstack/horizon-76d4949bc5-j4vhr"
Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.269205 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dc2facf6-dfa2-431a-990c-3ea84fc68c9a-config-data\") pod \"horizon-57b4f846bf-cvsfb\" (UID: \"dc2facf6-dfa2-431a-990c-3ea84fc68c9a\") " pod="openstack/horizon-57b4f846bf-cvsfb"
Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.269297 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnlp6\" (UniqueName: \"kubernetes.io/projected/46d15617-ecbb-44d1-aa58-e11b6d40cc25-kube-api-access-jnlp6\") pod \"horizon-76d4949bc5-j4vhr\" (UID: \"46d15617-ecbb-44d1-aa58-e11b6d40cc25\") " pod="openstack/horizon-76d4949bc5-j4vhr"
Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.270855 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/46d15617-ecbb-44d1-aa58-e11b6d40cc25-horizon-secret-key\") pod \"horizon-76d4949bc5-j4vhr\" (UID: \"46d15617-ecbb-44d1-aa58-e11b6d40cc25\") " pod="openstack/horizon-76d4949bc5-j4vhr"
Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.270945 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc2facf6-dfa2-431a-990c-3ea84fc68c9a-logs\") pod \"horizon-57b4f846bf-cvsfb\" (UID: \"dc2facf6-dfa2-431a-990c-3ea84fc68c9a\") " pod="openstack/horizon-57b4f846bf-cvsfb"
\"kubernetes.io/empty-dir/dc2facf6-dfa2-431a-990c-3ea84fc68c9a-logs\") pod \"horizon-57b4f846bf-cvsfb\" (UID: \"dc2facf6-dfa2-431a-990c-3ea84fc68c9a\") " pod="openstack/horizon-57b4f846bf-cvsfb" Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.270970 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc2facf6-dfa2-431a-990c-3ea84fc68c9a-scripts\") pod \"horizon-57b4f846bf-cvsfb\" (UID: \"dc2facf6-dfa2-431a-990c-3ea84fc68c9a\") " pod="openstack/horizon-57b4f846bf-cvsfb" Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.271342 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dc2facf6-dfa2-431a-990c-3ea84fc68c9a-horizon-secret-key\") pod \"horizon-57b4f846bf-cvsfb\" (UID: \"dc2facf6-dfa2-431a-990c-3ea84fc68c9a\") " pod="openstack/horizon-57b4f846bf-cvsfb" Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.271970 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc2facf6-dfa2-431a-990c-3ea84fc68c9a-logs\") pod \"horizon-57b4f846bf-cvsfb\" (UID: \"dc2facf6-dfa2-431a-990c-3ea84fc68c9a\") " pod="openstack/horizon-57b4f846bf-cvsfb" Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.275141 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dc2facf6-dfa2-431a-990c-3ea84fc68c9a-config-data\") pod \"horizon-57b4f846bf-cvsfb\" (UID: \"dc2facf6-dfa2-431a-990c-3ea84fc68c9a\") " pod="openstack/horizon-57b4f846bf-cvsfb" Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.279239 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dc2facf6-dfa2-431a-990c-3ea84fc68c9a-horizon-secret-key\") pod \"horizon-57b4f846bf-cvsfb\" (UID: \"dc2facf6-dfa2-431a-990c-3ea84fc68c9a\") " pod="openstack/horizon-57b4f846bf-cvsfb" Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.293330 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhxv2\" (UniqueName: \"kubernetes.io/projected/dc2facf6-dfa2-431a-990c-3ea84fc68c9a-kube-api-access-lhxv2\") pod \"horizon-57b4f846bf-cvsfb\" (UID: \"dc2facf6-dfa2-431a-990c-3ea84fc68c9a\") " pod="openstack/horizon-57b4f846bf-cvsfb" Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.342401 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-57b4f846bf-cvsfb" Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.373638 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46d15617-ecbb-44d1-aa58-e11b6d40cc25-logs\") pod \"horizon-76d4949bc5-j4vhr\" (UID: \"46d15617-ecbb-44d1-aa58-e11b6d40cc25\") " pod="openstack/horizon-76d4949bc5-j4vhr" Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.374117 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnlp6\" (UniqueName: \"kubernetes.io/projected/46d15617-ecbb-44d1-aa58-e11b6d40cc25-kube-api-access-jnlp6\") pod \"horizon-76d4949bc5-j4vhr\" (UID: \"46d15617-ecbb-44d1-aa58-e11b6d40cc25\") " pod="openstack/horizon-76d4949bc5-j4vhr" Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.374151 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/46d15617-ecbb-44d1-aa58-e11b6d40cc25-horizon-secret-key\") pod \"horizon-76d4949bc5-j4vhr\" (UID: \"46d15617-ecbb-44d1-aa58-e11b6d40cc25\") " pod="openstack/horizon-76d4949bc5-j4vhr" Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.374177 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46d15617-ecbb-44d1-aa58-e11b6d40cc25-logs\") pod \"horizon-76d4949bc5-j4vhr\" (UID: \"46d15617-ecbb-44d1-aa58-e11b6d40cc25\") " pod="openstack/horizon-76d4949bc5-j4vhr" Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.374257 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/46d15617-ecbb-44d1-aa58-e11b6d40cc25-config-data\") pod \"horizon-76d4949bc5-j4vhr\" (UID: \"46d15617-ecbb-44d1-aa58-e11b6d40cc25\") " pod="openstack/horizon-76d4949bc5-j4vhr" Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.374395 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/46d15617-ecbb-44d1-aa58-e11b6d40cc25-scripts\") pod \"horizon-76d4949bc5-j4vhr\" (UID: \"46d15617-ecbb-44d1-aa58-e11b6d40cc25\") " pod="openstack/horizon-76d4949bc5-j4vhr" Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.375176 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/46d15617-ecbb-44d1-aa58-e11b6d40cc25-scripts\") pod \"horizon-76d4949bc5-j4vhr\" (UID: \"46d15617-ecbb-44d1-aa58-e11b6d40cc25\") " pod="openstack/horizon-76d4949bc5-j4vhr" Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.375705 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/46d15617-ecbb-44d1-aa58-e11b6d40cc25-config-data\") pod \"horizon-76d4949bc5-j4vhr\" (UID: \"46d15617-ecbb-44d1-aa58-e11b6d40cc25\") " pod="openstack/horizon-76d4949bc5-j4vhr" Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.379229 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/46d15617-ecbb-44d1-aa58-e11b6d40cc25-horizon-secret-key\") pod \"horizon-76d4949bc5-j4vhr\" (UID: \"46d15617-ecbb-44d1-aa58-e11b6d40cc25\") " pod="openstack/horizon-76d4949bc5-j4vhr" Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.393013 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-jnlp6\" (UniqueName: \"kubernetes.io/projected/46d15617-ecbb-44d1-aa58-e11b6d40cc25-kube-api-access-jnlp6\") pod \"horizon-76d4949bc5-j4vhr\" (UID: \"46d15617-ecbb-44d1-aa58-e11b6d40cc25\") " pod="openstack/horizon-76d4949bc5-j4vhr" Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.568751 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-76d4949bc5-j4vhr" Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.715711 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-57b4f846bf-cvsfb"] Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.752095 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-68dbf898cc-4fcz6"] Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.758202 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-68dbf898cc-4fcz6" Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.782639 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-68dbf898cc-4fcz6"] Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.852151 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-57b4f846bf-cvsfb"] Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.881371 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f4050891-a1f6-4bf8-b612-d516dcac072a-config-data\") pod \"horizon-68dbf898cc-4fcz6\" (UID: \"f4050891-a1f6-4bf8-b612-d516dcac072a\") " pod="openstack/horizon-68dbf898cc-4fcz6" Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.882133 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f4050891-a1f6-4bf8-b612-d516dcac072a-horizon-secret-key\") pod \"horizon-68dbf898cc-4fcz6\" (UID: \"f4050891-a1f6-4bf8-b612-d516dcac072a\") " pod="openstack/horizon-68dbf898cc-4fcz6" Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.882278 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lrbq\" (UniqueName: \"kubernetes.io/projected/f4050891-a1f6-4bf8-b612-d516dcac072a-kube-api-access-6lrbq\") pod \"horizon-68dbf898cc-4fcz6\" (UID: \"f4050891-a1f6-4bf8-b612-d516dcac072a\") " pod="openstack/horizon-68dbf898cc-4fcz6" Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.882385 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f4050891-a1f6-4bf8-b612-d516dcac072a-scripts\") pod \"horizon-68dbf898cc-4fcz6\" (UID: \"f4050891-a1f6-4bf8-b612-d516dcac072a\") " pod="openstack/horizon-68dbf898cc-4fcz6" Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.882425 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4050891-a1f6-4bf8-b612-d516dcac072a-logs\") pod \"horizon-68dbf898cc-4fcz6\" (UID: \"f4050891-a1f6-4bf8-b612-d516dcac072a\") " pod="openstack/horizon-68dbf898cc-4fcz6" Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.984395 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lrbq\" (UniqueName: \"kubernetes.io/projected/f4050891-a1f6-4bf8-b612-d516dcac072a-kube-api-access-6lrbq\") pod \"horizon-68dbf898cc-4fcz6\" (UID: 
\"f4050891-a1f6-4bf8-b612-d516dcac072a\") " pod="openstack/horizon-68dbf898cc-4fcz6" Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.985005 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f4050891-a1f6-4bf8-b612-d516dcac072a-scripts\") pod \"horizon-68dbf898cc-4fcz6\" (UID: \"f4050891-a1f6-4bf8-b612-d516dcac072a\") " pod="openstack/horizon-68dbf898cc-4fcz6" Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.985151 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4050891-a1f6-4bf8-b612-d516dcac072a-logs\") pod \"horizon-68dbf898cc-4fcz6\" (UID: \"f4050891-a1f6-4bf8-b612-d516dcac072a\") " pod="openstack/horizon-68dbf898cc-4fcz6" Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.985275 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f4050891-a1f6-4bf8-b612-d516dcac072a-config-data\") pod \"horizon-68dbf898cc-4fcz6\" (UID: \"f4050891-a1f6-4bf8-b612-d516dcac072a\") " pod="openstack/horizon-68dbf898cc-4fcz6" Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.985568 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f4050891-a1f6-4bf8-b612-d516dcac072a-horizon-secret-key\") pod \"horizon-68dbf898cc-4fcz6\" (UID: \"f4050891-a1f6-4bf8-b612-d516dcac072a\") " pod="openstack/horizon-68dbf898cc-4fcz6" Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.985793 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f4050891-a1f6-4bf8-b612-d516dcac072a-scripts\") pod \"horizon-68dbf898cc-4fcz6\" (UID: \"f4050891-a1f6-4bf8-b612-d516dcac072a\") " pod="openstack/horizon-68dbf898cc-4fcz6" Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.986266 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4050891-a1f6-4bf8-b612-d516dcac072a-logs\") pod \"horizon-68dbf898cc-4fcz6\" (UID: \"f4050891-a1f6-4bf8-b612-d516dcac072a\") " pod="openstack/horizon-68dbf898cc-4fcz6" Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.987067 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f4050891-a1f6-4bf8-b612-d516dcac072a-config-data\") pod \"horizon-68dbf898cc-4fcz6\" (UID: \"f4050891-a1f6-4bf8-b612-d516dcac072a\") " pod="openstack/horizon-68dbf898cc-4fcz6" Dec 09 11:41:09 crc kubenswrapper[5002]: I1209 11:41:09.991503 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f4050891-a1f6-4bf8-b612-d516dcac072a-horizon-secret-key\") pod \"horizon-68dbf898cc-4fcz6\" (UID: \"f4050891-a1f6-4bf8-b612-d516dcac072a\") " pod="openstack/horizon-68dbf898cc-4fcz6" Dec 09 11:41:10 crc kubenswrapper[5002]: I1209 11:41:09.999976 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lrbq\" (UniqueName: \"kubernetes.io/projected/f4050891-a1f6-4bf8-b612-d516dcac072a-kube-api-access-6lrbq\") pod \"horizon-68dbf898cc-4fcz6\" (UID: \"f4050891-a1f6-4bf8-b612-d516dcac072a\") " pod="openstack/horizon-68dbf898cc-4fcz6" Dec 09 11:41:10 crc kubenswrapper[5002]: I1209 11:41:10.090362 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/horizon-76d4949bc5-j4vhr"] Dec 09 11:41:10 crc kubenswrapper[5002]: I1209 11:41:10.101161 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-68dbf898cc-4fcz6" Dec 09 11:41:10 crc kubenswrapper[5002]: I1209 11:41:10.224923 5002 generic.go:334] "Generic (PLEG): container finished" podID="5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d" containerID="11bd6b82419787317723b421efc0133288435a57b6cf87cdfe4c3b856dbc19da" exitCode=143 Dec 09 11:41:10 crc kubenswrapper[5002]: I1209 11:41:10.224987 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d","Type":"ContainerDied","Data":"11bd6b82419787317723b421efc0133288435a57b6cf87cdfe4c3b856dbc19da"} Dec 09 11:41:10 crc kubenswrapper[5002]: I1209 11:41:10.228607 5002 generic.go:334] "Generic (PLEG): container finished" podID="4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b" containerID="9e1fdd6323ae596bb3e174d484097706c3446d95c512a3dc7a9c52a735175649" exitCode=143 Dec 09 11:41:10 crc kubenswrapper[5002]: I1209 11:41:10.228676 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b","Type":"ContainerDied","Data":"9e1fdd6323ae596bb3e174d484097706c3446d95c512a3dc7a9c52a735175649"} Dec 09 11:41:10 crc kubenswrapper[5002]: I1209 11:41:10.232480 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57b4f846bf-cvsfb" event={"ID":"dc2facf6-dfa2-431a-990c-3ea84fc68c9a","Type":"ContainerStarted","Data":"45130d4976480e37d0f45588e82ad8793a8b9b663e47c6ca04d2a43562aadde4"} Dec 09 11:41:10 crc kubenswrapper[5002]: I1209 11:41:10.238145 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76d4949bc5-j4vhr" event={"ID":"46d15617-ecbb-44d1-aa58-e11b6d40cc25","Type":"ContainerStarted","Data":"d049ad467d5d22c1e7c551c7b308c0e010aac1fcb5288a0eaf8d8b0c464c8dd2"} Dec 09 11:41:10 crc kubenswrapper[5002]: I1209 11:41:10.562828 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-68dbf898cc-4fcz6"] Dec 09 11:41:10 crc kubenswrapper[5002]: W1209 11:41:10.576511 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4050891_a1f6_4bf8_b612_d516dcac072a.slice/crio-6d44d3109e57a866592e294800588a368113c3cd5411ef12b35f63627eb3ae6d WatchSource:0}: Error finding container 6d44d3109e57a866592e294800588a368113c3cd5411ef12b35f63627eb3ae6d: Status 404 returned error can't find the container with id 6d44d3109e57a866592e294800588a368113c3cd5411ef12b35f63627eb3ae6d Dec 09 11:41:11 crc kubenswrapper[5002]: I1209 11:41:11.254840 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68dbf898cc-4fcz6" event={"ID":"f4050891-a1f6-4bf8-b612-d516dcac072a","Type":"ContainerStarted","Data":"6d44d3109e57a866592e294800588a368113c3cd5411ef12b35f63627eb3ae6d"} Dec 09 11:41:13 crc kubenswrapper[5002]: I1209 11:41:13.046190 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-9fe7-account-create-update-t2szh"] Dec 09 11:41:13 crc kubenswrapper[5002]: I1209 11:41:13.059489 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-9fe7-account-create-update-t2szh"] Dec 09 11:41:13 crc kubenswrapper[5002]: I1209 11:41:13.070591 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-mmk2t"] Dec 09 11:41:13 crc kubenswrapper[5002]: 
I1209 11:41:13.081114 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-mmk2t"] Dec 09 11:41:13 crc kubenswrapper[5002]: I1209 11:41:13.275058 5002 generic.go:334] "Generic (PLEG): container finished" podID="4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b" containerID="2ad7495d08109df0a1d09a4f59f28f80dbd0450288de4c579c6ddea94badcc3c" exitCode=0 Dec 09 11:41:13 crc kubenswrapper[5002]: I1209 11:41:13.275148 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b","Type":"ContainerDied","Data":"2ad7495d08109df0a1d09a4f59f28f80dbd0450288de4c579c6ddea94badcc3c"} Dec 09 11:41:13 crc kubenswrapper[5002]: I1209 11:41:13.278749 5002 generic.go:334] "Generic (PLEG): container finished" podID="5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d" containerID="95de9323ecc7ac5f2f04f3b35d37d92ccbed743d76744ec9714c03a948831708" exitCode=0 Dec 09 11:41:13 crc kubenswrapper[5002]: I1209 11:41:13.278786 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d","Type":"ContainerDied","Data":"95de9323ecc7ac5f2f04f3b35d37d92ccbed743d76744ec9714c03a948831708"} Dec 09 11:41:14 crc kubenswrapper[5002]: I1209 11:41:14.071915 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c402dd03-2420-4dcc-a745-0685e3a997c8" path="/var/lib/kubelet/pods/c402dd03-2420-4dcc-a745-0685e3a997c8/volumes" Dec 09 11:41:14 crc kubenswrapper[5002]: I1209 11:41:14.258399 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d023e717-31e3-445f-9634-6cc0e92b6870" path="/var/lib/kubelet/pods/d023e717-31e3-445f-9634-6cc0e92b6870/volumes" Dec 09 11:41:16 crc kubenswrapper[5002]: I1209 11:41:16.528768 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 09 11:41:16 crc kubenswrapper[5002]: I1209 11:41:16.629625 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pqn6\" (UniqueName: \"kubernetes.io/projected/4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b-kube-api-access-6pqn6\") pod \"4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b\" (UID: \"4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b\") " Dec 09 11:41:16 crc kubenswrapper[5002]: I1209 11:41:16.629718 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b-config-data\") pod \"4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b\" (UID: \"4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b\") " Dec 09 11:41:16 crc kubenswrapper[5002]: I1209 11:41:16.629778 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b-httpd-run\") pod \"4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b\" (UID: \"4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b\") " Dec 09 11:41:16 crc kubenswrapper[5002]: I1209 11:41:16.629805 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b-combined-ca-bundle\") pod \"4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b\" (UID: \"4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b\") " Dec 09 11:41:16 crc kubenswrapper[5002]: I1209 11:41:16.629852 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b-scripts\") pod \"4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b\" (UID: \"4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b\") " Dec 09 11:41:16 crc kubenswrapper[5002]: I1209 11:41:16.629877 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b-ceph\") pod \"4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b\" (UID: \"4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b\") " Dec 09 11:41:16 crc kubenswrapper[5002]: I1209 11:41:16.629918 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b-logs\") pod \"4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b\" (UID: \"4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b\") " Dec 09 11:41:16 crc kubenswrapper[5002]: I1209 11:41:16.630874 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b-logs" (OuterVolumeSpecName: "logs") pod "4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b" (UID: "4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:41:16 crc kubenswrapper[5002]: I1209 11:41:16.632605 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b" (UID: "4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:41:16 crc kubenswrapper[5002]: I1209 11:41:16.633655 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b-kube-api-access-6pqn6" (OuterVolumeSpecName: "kube-api-access-6pqn6") pod "4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b" (UID: "4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b"). InnerVolumeSpecName "kube-api-access-6pqn6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:41:16 crc kubenswrapper[5002]: I1209 11:41:16.634860 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b-ceph" (OuterVolumeSpecName: "ceph") pod "4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b" (UID: "4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:41:16 crc kubenswrapper[5002]: I1209 11:41:16.637049 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b-scripts" (OuterVolumeSpecName: "scripts") pod "4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b" (UID: "4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:41:16 crc kubenswrapper[5002]: I1209 11:41:16.667689 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 09 11:41:16 crc kubenswrapper[5002]: I1209 11:41:16.676999 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b" (UID: "4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:41:16 crc kubenswrapper[5002]: I1209 11:41:16.725541 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b-config-data" (OuterVolumeSpecName: "config-data") pod "4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b" (UID: "4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:41:16 crc kubenswrapper[5002]: I1209 11:41:16.733772 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pqn6\" (UniqueName: \"kubernetes.io/projected/4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b-kube-api-access-6pqn6\") on node \"crc\" DevicePath \"\"" Dec 09 11:41:16 crc kubenswrapper[5002]: I1209 11:41:16.733806 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:41:16 crc kubenswrapper[5002]: I1209 11:41:16.733837 5002 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 09 11:41:16 crc kubenswrapper[5002]: I1209 11:41:16.733856 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:41:16 crc kubenswrapper[5002]: I1209 11:41:16.733868 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:41:16 crc kubenswrapper[5002]: I1209 11:41:16.733876 5002 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b-ceph\") on node \"crc\" DevicePath \"\"" Dec 09 11:41:16 crc kubenswrapper[5002]: I1209 11:41:16.733885 5002 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b-logs\") on node \"crc\" DevicePath \"\"" Dec 09 11:41:16 crc kubenswrapper[5002]: I1209 11:41:16.837042 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d-combined-ca-bundle\") pod \"5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d\" (UID: \"5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d\") " Dec 09 11:41:16 crc kubenswrapper[5002]: I1209 11:41:16.837262 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d-scripts\") pod \"5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d\" (UID: \"5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d\") " Dec 09 11:41:16 crc kubenswrapper[5002]: I1209 11:41:16.837300 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d-logs\") pod \"5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d\" (UID: \"5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d\") " Dec 09 11:41:16 crc kubenswrapper[5002]: I1209 11:41:16.837349 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d-ceph\") pod \"5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d\" (UID: \"5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d\") " Dec 09 11:41:16 crc kubenswrapper[5002]: I1209 11:41:16.837404 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xp54h\" (UniqueName: \"kubernetes.io/projected/5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d-kube-api-access-xp54h\") pod \"5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d\" 
(UID: \"5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d\") " Dec 09 11:41:16 crc kubenswrapper[5002]: I1209 11:41:16.837461 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d-httpd-run\") pod \"5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d\" (UID: \"5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d\") " Dec 09 11:41:16 crc kubenswrapper[5002]: I1209 11:41:16.838847 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d-logs" (OuterVolumeSpecName: "logs") pod "5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d" (UID: "5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:41:16 crc kubenswrapper[5002]: I1209 11:41:16.839411 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d-config-data\") pod \"5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d\" (UID: \"5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d\") " Dec 09 11:41:16 crc kubenswrapper[5002]: I1209 11:41:16.840616 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d" (UID: "5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:41:16 crc kubenswrapper[5002]: I1209 11:41:16.841392 5002 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d-logs\") on node \"crc\" DevicePath \"\"" Dec 09 11:41:16 crc kubenswrapper[5002]: I1209 11:41:16.841418 5002 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 09 11:41:16 crc kubenswrapper[5002]: I1209 11:41:16.849597 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d-scripts" (OuterVolumeSpecName: "scripts") pod "5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d" (UID: "5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:41:16 crc kubenswrapper[5002]: I1209 11:41:16.849750 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d-kube-api-access-xp54h" (OuterVolumeSpecName: "kube-api-access-xp54h") pod "5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d" (UID: "5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d"). InnerVolumeSpecName "kube-api-access-xp54h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:41:16 crc kubenswrapper[5002]: I1209 11:41:16.850104 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d-ceph" (OuterVolumeSpecName: "ceph") pod "5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d" (UID: "5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:41:16 crc kubenswrapper[5002]: I1209 11:41:16.879343 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d" (UID: "5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:41:16 crc kubenswrapper[5002]: I1209 11:41:16.896966 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d-config-data" (OuterVolumeSpecName: "config-data") pod "5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d" (UID: "5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:41:16 crc kubenswrapper[5002]: I1209 11:41:16.942244 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:41:16 crc kubenswrapper[5002]: I1209 11:41:16.942276 5002 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d-ceph\") on node \"crc\" DevicePath \"\"" Dec 09 11:41:16 crc kubenswrapper[5002]: I1209 11:41:16.942291 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xp54h\" (UniqueName: \"kubernetes.io/projected/5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d-kube-api-access-xp54h\") on node \"crc\" DevicePath \"\"" Dec 09 11:41:16 crc kubenswrapper[5002]: I1209 11:41:16.942301 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:41:16 crc kubenswrapper[5002]: I1209 11:41:16.942310 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.315522 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57b4f846bf-cvsfb" event={"ID":"dc2facf6-dfa2-431a-990c-3ea84fc68c9a","Type":"ContainerStarted","Data":"f828e7da8511fee0f5e22e44498d4d65d8f944038ddef26a16e1371e474191b2"} Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.315579 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57b4f846bf-cvsfb" event={"ID":"dc2facf6-dfa2-431a-990c-3ea84fc68c9a","Type":"ContainerStarted","Data":"3e23337b00595597c892834a902a89fd0a72d0b8eb97bd66e8574b7d51edc592"} Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.315670 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-57b4f846bf-cvsfb" podUID="dc2facf6-dfa2-431a-990c-3ea84fc68c9a" containerName="horizon-log" containerID="cri-o://3e23337b00595597c892834a902a89fd0a72d0b8eb97bd66e8574b7d51edc592" gracePeriod=30 Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.315717 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-57b4f846bf-cvsfb" podUID="dc2facf6-dfa2-431a-990c-3ea84fc68c9a" containerName="horizon" 
containerID="cri-o://f828e7da8511fee0f5e22e44498d4d65d8f944038ddef26a16e1371e474191b2" gracePeriod=30 Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.320799 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76d4949bc5-j4vhr" event={"ID":"46d15617-ecbb-44d1-aa58-e11b6d40cc25","Type":"ContainerStarted","Data":"74c0450337f14d8144b94f3f2ba6e587df110d16a39c00eb2009f3931524240e"} Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.320867 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76d4949bc5-j4vhr" event={"ID":"46d15617-ecbb-44d1-aa58-e11b6d40cc25","Type":"ContainerStarted","Data":"859b6ba1875aa02b0fe6422dceba243c53a343217e25a307f90ddfe174205394"} Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.323685 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d","Type":"ContainerDied","Data":"cea9e9832989b520bdc9a7743d44c29484ece69eecb96defacc9211a172138a7"} Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.323738 5002 scope.go:117] "RemoveContainer" containerID="95de9323ecc7ac5f2f04f3b35d37d92ccbed743d76744ec9714c03a948831708" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.323948 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.326462 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68dbf898cc-4fcz6" event={"ID":"f4050891-a1f6-4bf8-b612-d516dcac072a","Type":"ContainerStarted","Data":"ce071d4e632eaabf87d73b39d82e96c163b1f99773a26ef04c7b1b00c0c4daff"} Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.326514 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68dbf898cc-4fcz6" event={"ID":"f4050891-a1f6-4bf8-b612-d516dcac072a","Type":"ContainerStarted","Data":"fa0c95cd6a5a2021273ed0c0ea0a76748db43764b31355aa204820d63547a606"} Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.330530 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b","Type":"ContainerDied","Data":"03e341703efdba3405a458fa7d8616e3693ff42f9e523a6dd89fe07f0c8c0336"} Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.330719 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.353752 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-57b4f846bf-cvsfb" podStartSLOduration=2.975138384 podStartE2EDuration="9.353731366s" podCreationTimestamp="2025-12-09 11:41:08 +0000 UTC" firstStartedPulling="2025-12-09 11:41:09.854459838 +0000 UTC m=+6002.246510919" lastFinishedPulling="2025-12-09 11:41:16.23305282 +0000 UTC m=+6008.625103901" observedRunningTime="2025-12-09 11:41:17.342699511 +0000 UTC m=+6009.734750592" watchObservedRunningTime="2025-12-09 11:41:17.353731366 +0000 UTC m=+6009.745782447" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.360765 5002 scope.go:117] "RemoveContainer" containerID="11bd6b82419787317723b421efc0133288435a57b6cf87cdfe4c3b856dbc19da" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.384707 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-76d4949bc5-j4vhr" podStartSLOduration=2.214095676 podStartE2EDuration="8.384685004s" podCreationTimestamp="2025-12-09 11:41:09 +0000 UTC" firstStartedPulling="2025-12-09 11:41:10.094467978 +0000 UTC m=+6002.486519059" lastFinishedPulling="2025-12-09 11:41:16.265057306 +0000 UTC m=+6008.657108387" observedRunningTime="2025-12-09 11:41:17.360895758 +0000 UTC m=+6009.752946859" watchObservedRunningTime="2025-12-09 11:41:17.384685004 +0000 UTC m=+6009.776736085" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.399647 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-68dbf898cc-4fcz6" podStartSLOduration=2.729888113 podStartE2EDuration="8.399626174s" podCreationTimestamp="2025-12-09 11:41:09 +0000 UTC" firstStartedPulling="2025-12-09 11:41:10.579998425 +0000 UTC m=+6002.972049516" lastFinishedPulling="2025-12-09 11:41:16.249736496 +0000 UTC m=+6008.641787577" observedRunningTime="2025-12-09 11:41:17.382140866 +0000 UTC m=+6009.774191957" watchObservedRunningTime="2025-12-09 11:41:17.399626174 +0000 UTC m=+6009.791677255" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.400134 5002 scope.go:117] "RemoveContainer" containerID="2ad7495d08109df0a1d09a4f59f28f80dbd0450288de4c579c6ddea94badcc3c" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.425630 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.434250 5002 scope.go:117] "RemoveContainer" containerID="9e1fdd6323ae596bb3e174d484097706c3446d95c512a3dc7a9c52a735175649" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.449262 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.479095 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.492954 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 11:41:17 crc kubenswrapper[5002]: E1209 11:41:17.493489 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b" containerName="glance-log" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.493516 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b" containerName="glance-log" Dec 09 11:41:17 crc 
kubenswrapper[5002]: E1209 11:41:17.493531 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d" containerName="glance-httpd" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.493541 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d" containerName="glance-httpd" Dec 09 11:41:17 crc kubenswrapper[5002]: E1209 11:41:17.493562 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b" containerName="glance-httpd" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.493570 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b" containerName="glance-httpd" Dec 09 11:41:17 crc kubenswrapper[5002]: E1209 11:41:17.493590 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d" containerName="glance-log" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.493597 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d" containerName="glance-log" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.493888 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b" containerName="glance-log" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.493910 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d" containerName="glance-httpd" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.493926 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d" containerName="glance-log" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.493935 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b" containerName="glance-httpd" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.495252 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.497864 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-dchft" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.499452 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.499900 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.500643 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.515754 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.527319 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.529103 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.531182 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.539045 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.552272 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e75ade8b-f2b9-4102-a485-11ad67b3dd5f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e75ade8b-f2b9-4102-a485-11ad67b3dd5f\") " pod="openstack/glance-default-external-api-0" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.552378 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2535b42-a1ab-41bb-9b0e-7970e69d34a3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c2535b42-a1ab-41bb-9b0e-7970e69d34a3\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.552410 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2535b42-a1ab-41bb-9b0e-7970e69d34a3-logs\") pod \"glance-default-internal-api-0\" (UID: \"c2535b42-a1ab-41bb-9b0e-7970e69d34a3\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.552437 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2535b42-a1ab-41bb-9b0e-7970e69d34a3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c2535b42-a1ab-41bb-9b0e-7970e69d34a3\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.552461 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c2535b42-a1ab-41bb-9b0e-7970e69d34a3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"c2535b42-a1ab-41bb-9b0e-7970e69d34a3\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.552502 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e75ade8b-f2b9-4102-a485-11ad67b3dd5f-scripts\") pod \"glance-default-external-api-0\" (UID: \"e75ade8b-f2b9-4102-a485-11ad67b3dd5f\") " pod="openstack/glance-default-external-api-0" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.552863 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c2535b42-a1ab-41bb-9b0e-7970e69d34a3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c2535b42-a1ab-41bb-9b0e-7970e69d34a3\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.552939 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e75ade8b-f2b9-4102-a485-11ad67b3dd5f-logs\") pod \"glance-default-external-api-0\" (UID: \"e75ade8b-f2b9-4102-a485-11ad67b3dd5f\") 
" pod="openstack/glance-default-external-api-0" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.552960 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r5cw\" (UniqueName: \"kubernetes.io/projected/c2535b42-a1ab-41bb-9b0e-7970e69d34a3-kube-api-access-7r5cw\") pod \"glance-default-internal-api-0\" (UID: \"c2535b42-a1ab-41bb-9b0e-7970e69d34a3\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.552988 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e75ade8b-f2b9-4102-a485-11ad67b3dd5f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e75ade8b-f2b9-4102-a485-11ad67b3dd5f\") " pod="openstack/glance-default-external-api-0" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.553007 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll6hl\" (UniqueName: \"kubernetes.io/projected/e75ade8b-f2b9-4102-a485-11ad67b3dd5f-kube-api-access-ll6hl\") pod \"glance-default-external-api-0\" (UID: \"e75ade8b-f2b9-4102-a485-11ad67b3dd5f\") " pod="openstack/glance-default-external-api-0" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.553050 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e75ade8b-f2b9-4102-a485-11ad67b3dd5f-config-data\") pod \"glance-default-external-api-0\" (UID: \"e75ade8b-f2b9-4102-a485-11ad67b3dd5f\") " pod="openstack/glance-default-external-api-0" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.553080 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e75ade8b-f2b9-4102-a485-11ad67b3dd5f-ceph\") pod \"glance-default-external-api-0\" (UID: \"e75ade8b-f2b9-4102-a485-11ad67b3dd5f\") " pod="openstack/glance-default-external-api-0" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.553105 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2535b42-a1ab-41bb-9b0e-7970e69d34a3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c2535b42-a1ab-41bb-9b0e-7970e69d34a3\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.653561 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e75ade8b-f2b9-4102-a485-11ad67b3dd5f-config-data\") pod \"glance-default-external-api-0\" (UID: \"e75ade8b-f2b9-4102-a485-11ad67b3dd5f\") " pod="openstack/glance-default-external-api-0" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.653612 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e75ade8b-f2b9-4102-a485-11ad67b3dd5f-ceph\") pod \"glance-default-external-api-0\" (UID: \"e75ade8b-f2b9-4102-a485-11ad67b3dd5f\") " pod="openstack/glance-default-external-api-0" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.653636 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2535b42-a1ab-41bb-9b0e-7970e69d34a3-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"c2535b42-a1ab-41bb-9b0e-7970e69d34a3\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.653663 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e75ade8b-f2b9-4102-a485-11ad67b3dd5f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e75ade8b-f2b9-4102-a485-11ad67b3dd5f\") " pod="openstack/glance-default-external-api-0" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.653713 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2535b42-a1ab-41bb-9b0e-7970e69d34a3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c2535b42-a1ab-41bb-9b0e-7970e69d34a3\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.653729 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2535b42-a1ab-41bb-9b0e-7970e69d34a3-logs\") pod \"glance-default-internal-api-0\" (UID: \"c2535b42-a1ab-41bb-9b0e-7970e69d34a3\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.653744 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2535b42-a1ab-41bb-9b0e-7970e69d34a3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c2535b42-a1ab-41bb-9b0e-7970e69d34a3\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.653758 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c2535b42-a1ab-41bb-9b0e-7970e69d34a3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"c2535b42-a1ab-41bb-9b0e-7970e69d34a3\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.653786 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e75ade8b-f2b9-4102-a485-11ad67b3dd5f-scripts\") pod \"glance-default-external-api-0\" (UID: \"e75ade8b-f2b9-4102-a485-11ad67b3dd5f\") " pod="openstack/glance-default-external-api-0" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.653826 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c2535b42-a1ab-41bb-9b0e-7970e69d34a3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c2535b42-a1ab-41bb-9b0e-7970e69d34a3\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.653864 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e75ade8b-f2b9-4102-a485-11ad67b3dd5f-logs\") pod \"glance-default-external-api-0\" (UID: \"e75ade8b-f2b9-4102-a485-11ad67b3dd5f\") " pod="openstack/glance-default-external-api-0" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.653881 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7r5cw\" (UniqueName: \"kubernetes.io/projected/c2535b42-a1ab-41bb-9b0e-7970e69d34a3-kube-api-access-7r5cw\") pod \"glance-default-internal-api-0\" (UID: \"c2535b42-a1ab-41bb-9b0e-7970e69d34a3\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:41:17 crc 
kubenswrapper[5002]: I1209 11:41:17.653906 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e75ade8b-f2b9-4102-a485-11ad67b3dd5f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e75ade8b-f2b9-4102-a485-11ad67b3dd5f\") " pod="openstack/glance-default-external-api-0" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.653922 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll6hl\" (UniqueName: \"kubernetes.io/projected/e75ade8b-f2b9-4102-a485-11ad67b3dd5f-kube-api-access-ll6hl\") pod \"glance-default-external-api-0\" (UID: \"e75ade8b-f2b9-4102-a485-11ad67b3dd5f\") " pod="openstack/glance-default-external-api-0" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.654498 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2535b42-a1ab-41bb-9b0e-7970e69d34a3-logs\") pod \"glance-default-internal-api-0\" (UID: \"c2535b42-a1ab-41bb-9b0e-7970e69d34a3\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.655989 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c2535b42-a1ab-41bb-9b0e-7970e69d34a3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c2535b42-a1ab-41bb-9b0e-7970e69d34a3\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.657304 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e75ade8b-f2b9-4102-a485-11ad67b3dd5f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e75ade8b-f2b9-4102-a485-11ad67b3dd5f\") " pod="openstack/glance-default-external-api-0" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.659037 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e75ade8b-f2b9-4102-a485-11ad67b3dd5f-logs\") pod \"glance-default-external-api-0\" (UID: \"e75ade8b-f2b9-4102-a485-11ad67b3dd5f\") " pod="openstack/glance-default-external-api-0" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.661362 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2535b42-a1ab-41bb-9b0e-7970e69d34a3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c2535b42-a1ab-41bb-9b0e-7970e69d34a3\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.665343 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e75ade8b-f2b9-4102-a485-11ad67b3dd5f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e75ade8b-f2b9-4102-a485-11ad67b3dd5f\") " pod="openstack/glance-default-external-api-0" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.674190 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e75ade8b-f2b9-4102-a485-11ad67b3dd5f-config-data\") pod \"glance-default-external-api-0\" (UID: \"e75ade8b-f2b9-4102-a485-11ad67b3dd5f\") " pod="openstack/glance-default-external-api-0" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.674555 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/c2535b42-a1ab-41bb-9b0e-7970e69d34a3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c2535b42-a1ab-41bb-9b0e-7970e69d34a3\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.678847 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e75ade8b-f2b9-4102-a485-11ad67b3dd5f-ceph\") pod \"glance-default-external-api-0\" (UID: \"e75ade8b-f2b9-4102-a485-11ad67b3dd5f\") " pod="openstack/glance-default-external-api-0" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.679417 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll6hl\" (UniqueName: \"kubernetes.io/projected/e75ade8b-f2b9-4102-a485-11ad67b3dd5f-kube-api-access-ll6hl\") pod \"glance-default-external-api-0\" (UID: \"e75ade8b-f2b9-4102-a485-11ad67b3dd5f\") " pod="openstack/glance-default-external-api-0" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.687772 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e75ade8b-f2b9-4102-a485-11ad67b3dd5f-scripts\") pod \"glance-default-external-api-0\" (UID: \"e75ade8b-f2b9-4102-a485-11ad67b3dd5f\") " pod="openstack/glance-default-external-api-0" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.693504 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2535b42-a1ab-41bb-9b0e-7970e69d34a3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c2535b42-a1ab-41bb-9b0e-7970e69d34a3\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.693527 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r5cw\" (UniqueName: \"kubernetes.io/projected/c2535b42-a1ab-41bb-9b0e-7970e69d34a3-kube-api-access-7r5cw\") pod \"glance-default-internal-api-0\" (UID: \"c2535b42-a1ab-41bb-9b0e-7970e69d34a3\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.694196 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c2535b42-a1ab-41bb-9b0e-7970e69d34a3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"c2535b42-a1ab-41bb-9b0e-7970e69d34a3\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.819935 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 09 11:41:17 crc kubenswrapper[5002]: I1209 11:41:17.848345 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 09 11:41:18 crc kubenswrapper[5002]: I1209 11:41:18.086189 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b" path="/var/lib/kubelet/pods/4dbc5fe6-6a0d-41b9-a92b-a6ebfe50fc5b/volumes" Dec 09 11:41:18 crc kubenswrapper[5002]: I1209 11:41:18.087020 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d" path="/var/lib/kubelet/pods/5e09d71b-d0a2-4fb5-91e0-c75ac7972e9d/volumes" Dec 09 11:41:18 crc kubenswrapper[5002]: I1209 11:41:18.418603 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 11:41:19 crc kubenswrapper[5002]: I1209 11:41:19.342544 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-57b4f846bf-cvsfb" Dec 09 11:41:19 crc kubenswrapper[5002]: I1209 11:41:19.356459 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e75ade8b-f2b9-4102-a485-11ad67b3dd5f","Type":"ContainerStarted","Data":"6a4ce5eaf8cbb4af15dd33709c0c7e219e56c69507561da6828cdab2f7ea4e5d"} Dec 09 11:41:19 crc kubenswrapper[5002]: I1209 11:41:19.356519 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e75ade8b-f2b9-4102-a485-11ad67b3dd5f","Type":"ContainerStarted","Data":"3cc9e4400e8c4958170d53231a932a36998451bad3d5cd1f4341eb54e3381940"} Dec 09 11:41:19 crc kubenswrapper[5002]: I1209 11:41:19.470460 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 11:41:19 crc kubenswrapper[5002]: I1209 11:41:19.569025 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-76d4949bc5-j4vhr" Dec 09 11:41:19 crc kubenswrapper[5002]: I1209 11:41:19.569076 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-76d4949bc5-j4vhr" Dec 09 11:41:20 crc kubenswrapper[5002]: I1209 11:41:20.101592 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-68dbf898cc-4fcz6" Dec 09 11:41:20 crc kubenswrapper[5002]: I1209 11:41:20.103067 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-68dbf898cc-4fcz6" Dec 09 11:41:20 crc kubenswrapper[5002]: I1209 11:41:20.370154 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c2535b42-a1ab-41bb-9b0e-7970e69d34a3","Type":"ContainerStarted","Data":"85e8aafaf9e8db895321180e98eb4e0c5f75e704a1dc1ebf3cffb13bc4fd92eb"} Dec 09 11:41:20 crc kubenswrapper[5002]: I1209 11:41:20.370212 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c2535b42-a1ab-41bb-9b0e-7970e69d34a3","Type":"ContainerStarted","Data":"b3e6ee0eb2c2ecc4996cb4b145c9ca8da7b73051f5987d54c270e0a530aa9397"} Dec 09 11:41:20 crc kubenswrapper[5002]: I1209 11:41:20.373802 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e75ade8b-f2b9-4102-a485-11ad67b3dd5f","Type":"ContainerStarted","Data":"1ace592df6669655f0cfcbd6bd9bab75288c143373447aee3818e62bdaaab242"} Dec 09 11:41:20 crc kubenswrapper[5002]: I1209 11:41:20.406802 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" 
podStartSLOduration=3.406783843 podStartE2EDuration="3.406783843s" podCreationTimestamp="2025-12-09 11:41:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:41:20.399759065 +0000 UTC m=+6012.791810176" watchObservedRunningTime="2025-12-09 11:41:20.406783843 +0000 UTC m=+6012.798834924" Dec 09 11:41:21 crc kubenswrapper[5002]: I1209 11:41:21.060403 5002 scope.go:117] "RemoveContainer" containerID="39962d0376837cc534e6b0a62303166efdae767fb36cfb81ae7c7eb077d56c3e" Dec 09 11:41:21 crc kubenswrapper[5002]: E1209 11:41:21.061135 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:41:21 crc kubenswrapper[5002]: I1209 11:41:21.384476 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c2535b42-a1ab-41bb-9b0e-7970e69d34a3","Type":"ContainerStarted","Data":"36def276eda562658118f7a34a23bbc055a59218a4822c89cea58af9bd183475"} Dec 09 11:41:21 crc kubenswrapper[5002]: I1209 11:41:21.413559 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.413536212 podStartE2EDuration="4.413536212s" podCreationTimestamp="2025-12-09 11:41:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:41:21.402964629 +0000 UTC m=+6013.795015720" watchObservedRunningTime="2025-12-09 11:41:21.413536212 +0000 UTC m=+6013.805587303" Dec 09 11:41:22 crc kubenswrapper[5002]: I1209 11:41:22.087517 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-6qhmt"] Dec 09 11:41:22 crc kubenswrapper[5002]: I1209 11:41:22.087561 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-6qhmt"] Dec 09 11:41:24 crc kubenswrapper[5002]: I1209 11:41:24.312171 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6451fbf1-4860-402d-a862-6e320f7177e8" path="/var/lib/kubelet/pods/6451fbf1-4860-402d-a862-6e320f7177e8/volumes" Dec 09 11:41:27 crc kubenswrapper[5002]: I1209 11:41:27.821224 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 09 11:41:27 crc kubenswrapper[5002]: I1209 11:41:27.822951 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 09 11:41:27 crc kubenswrapper[5002]: I1209 11:41:27.849212 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 09 11:41:27 crc kubenswrapper[5002]: I1209 11:41:27.850417 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 09 11:41:27 crc kubenswrapper[5002]: I1209 11:41:27.860303 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 09 11:41:27 crc kubenswrapper[5002]: I1209 11:41:27.874507 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/glance-default-external-api-0" Dec 09 11:41:27 crc kubenswrapper[5002]: I1209 11:41:27.919180 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 09 11:41:27 crc kubenswrapper[5002]: I1209 11:41:27.924118 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 09 11:41:28 crc kubenswrapper[5002]: I1209 11:41:28.456638 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 09 11:41:28 crc kubenswrapper[5002]: I1209 11:41:28.457025 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 09 11:41:28 crc kubenswrapper[5002]: I1209 11:41:28.457598 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 09 11:41:28 crc kubenswrapper[5002]: I1209 11:41:28.457617 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 09 11:41:29 crc kubenswrapper[5002]: I1209 11:41:29.572083 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-76d4949bc5-j4vhr" podUID="46d15617-ecbb-44d1-aa58-e11b6d40cc25" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.105:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.105:8080: connect: connection refused" Dec 09 11:41:30 crc kubenswrapper[5002]: I1209 11:41:30.112385 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-68dbf898cc-4fcz6" podUID="f4050891-a1f6-4bf8-b612-d516dcac072a" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.106:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.106:8080: connect: connection refused" Dec 09 11:41:31 crc kubenswrapper[5002]: I1209 11:41:31.035041 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 09 11:41:31 crc kubenswrapper[5002]: I1209 11:41:31.035257 5002 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 09 11:41:31 crc kubenswrapper[5002]: I1209 11:41:31.131953 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 09 11:41:31 crc kubenswrapper[5002]: I1209 11:41:31.132091 5002 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 09 11:41:31 crc kubenswrapper[5002]: I1209 11:41:31.164261 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 09 11:41:31 crc kubenswrapper[5002]: I1209 11:41:31.267302 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 09 11:41:36 crc kubenswrapper[5002]: I1209 11:41:36.062638 5002 scope.go:117] "RemoveContainer" containerID="39962d0376837cc534e6b0a62303166efdae767fb36cfb81ae7c7eb077d56c3e" Dec 09 11:41:36 crc kubenswrapper[5002]: E1209 11:41:36.063391 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:41:41 crc kubenswrapper[5002]: I1209 11:41:41.324808 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-76d4949bc5-j4vhr" Dec 09 11:41:41 crc kubenswrapper[5002]: I1209 11:41:41.857801 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-68dbf898cc-4fcz6" Dec 09 11:41:43 crc kubenswrapper[5002]: I1209 11:41:43.003660 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-76d4949bc5-j4vhr" Dec 09 11:41:43 crc kubenswrapper[5002]: I1209 11:41:43.504117 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-68dbf898cc-4fcz6" Dec 09 11:41:43 crc kubenswrapper[5002]: I1209 11:41:43.624553 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-76d4949bc5-j4vhr"] Dec 09 11:41:43 crc kubenswrapper[5002]: I1209 11:41:43.633176 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-76d4949bc5-j4vhr" podUID="46d15617-ecbb-44d1-aa58-e11b6d40cc25" containerName="horizon-log" containerID="cri-o://859b6ba1875aa02b0fe6422dceba243c53a343217e25a307f90ddfe174205394" gracePeriod=30 Dec 09 11:41:43 crc kubenswrapper[5002]: I1209 11:41:43.633245 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-76d4949bc5-j4vhr" podUID="46d15617-ecbb-44d1-aa58-e11b6d40cc25" containerName="horizon" containerID="cri-o://74c0450337f14d8144b94f3f2ba6e587df110d16a39c00eb2009f3931524240e" gracePeriod=30 Dec 09 11:41:47 crc kubenswrapper[5002]: I1209 11:41:47.415014 5002 scope.go:117] "RemoveContainer" containerID="7c44740fc24f76b2cec5038f0aad39170b0333d0a8578ff0449ee320600e59db" Dec 09 11:41:47 crc kubenswrapper[5002]: I1209 11:41:47.484022 5002 scope.go:117] "RemoveContainer" containerID="900ae01be6554a19f311dcf8d5d4536cf3153c982b7cb47869b08e41783ec6bd" Dec 09 11:41:47 crc kubenswrapper[5002]: I1209 11:41:47.548791 5002 scope.go:117] "RemoveContainer" containerID="47d7a28455611c23e87f0f23ea5e5ae932d18a03dcc49b0037afe1f4eb41c58f" Dec 09 11:41:47 crc kubenswrapper[5002]: I1209 11:41:47.676639 5002 generic.go:334] "Generic (PLEG): container finished" podID="dc2facf6-dfa2-431a-990c-3ea84fc68c9a" containerID="f828e7da8511fee0f5e22e44498d4d65d8f944038ddef26a16e1371e474191b2" exitCode=137 Dec 09 11:41:47 crc kubenswrapper[5002]: I1209 11:41:47.677041 5002 generic.go:334] "Generic (PLEG): container finished" podID="dc2facf6-dfa2-431a-990c-3ea84fc68c9a" containerID="3e23337b00595597c892834a902a89fd0a72d0b8eb97bd66e8574b7d51edc592" exitCode=137 Dec 09 11:41:47 crc kubenswrapper[5002]: I1209 11:41:47.677001 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57b4f846bf-cvsfb" event={"ID":"dc2facf6-dfa2-431a-990c-3ea84fc68c9a","Type":"ContainerDied","Data":"f828e7da8511fee0f5e22e44498d4d65d8f944038ddef26a16e1371e474191b2"} Dec 09 11:41:47 crc kubenswrapper[5002]: I1209 11:41:47.677312 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57b4f846bf-cvsfb" event={"ID":"dc2facf6-dfa2-431a-990c-3ea84fc68c9a","Type":"ContainerDied","Data":"3e23337b00595597c892834a902a89fd0a72d0b8eb97bd66e8574b7d51edc592"} Dec 09 11:41:47 crc kubenswrapper[5002]: I1209 11:41:47.680085 5002 generic.go:334] "Generic (PLEG): container finished" podID="46d15617-ecbb-44d1-aa58-e11b6d40cc25" 
containerID="74c0450337f14d8144b94f3f2ba6e587df110d16a39c00eb2009f3931524240e" exitCode=0 Dec 09 11:41:47 crc kubenswrapper[5002]: I1209 11:41:47.680108 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76d4949bc5-j4vhr" event={"ID":"46d15617-ecbb-44d1-aa58-e11b6d40cc25","Type":"ContainerDied","Data":"74c0450337f14d8144b94f3f2ba6e587df110d16a39c00eb2009f3931524240e"} Dec 09 11:41:47 crc kubenswrapper[5002]: I1209 11:41:47.715217 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-57b4f846bf-cvsfb" Dec 09 11:41:47 crc kubenswrapper[5002]: I1209 11:41:47.870595 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc2facf6-dfa2-431a-990c-3ea84fc68c9a-logs\") pod \"dc2facf6-dfa2-431a-990c-3ea84fc68c9a\" (UID: \"dc2facf6-dfa2-431a-990c-3ea84fc68c9a\") " Dec 09 11:41:47 crc kubenswrapper[5002]: I1209 11:41:47.871127 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dc2facf6-dfa2-431a-990c-3ea84fc68c9a-config-data\") pod \"dc2facf6-dfa2-431a-990c-3ea84fc68c9a\" (UID: \"dc2facf6-dfa2-431a-990c-3ea84fc68c9a\") " Dec 09 11:41:47 crc kubenswrapper[5002]: I1209 11:41:47.871251 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dc2facf6-dfa2-431a-990c-3ea84fc68c9a-horizon-secret-key\") pod \"dc2facf6-dfa2-431a-990c-3ea84fc68c9a\" (UID: \"dc2facf6-dfa2-431a-990c-3ea84fc68c9a\") " Dec 09 11:41:47 crc kubenswrapper[5002]: I1209 11:41:47.871368 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc2facf6-dfa2-431a-990c-3ea84fc68c9a-logs" (OuterVolumeSpecName: "logs") pod "dc2facf6-dfa2-431a-990c-3ea84fc68c9a" (UID: "dc2facf6-dfa2-431a-990c-3ea84fc68c9a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:41:47 crc kubenswrapper[5002]: I1209 11:41:47.871472 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc2facf6-dfa2-431a-990c-3ea84fc68c9a-scripts\") pod \"dc2facf6-dfa2-431a-990c-3ea84fc68c9a\" (UID: \"dc2facf6-dfa2-431a-990c-3ea84fc68c9a\") " Dec 09 11:41:47 crc kubenswrapper[5002]: I1209 11:41:47.871573 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhxv2\" (UniqueName: \"kubernetes.io/projected/dc2facf6-dfa2-431a-990c-3ea84fc68c9a-kube-api-access-lhxv2\") pod \"dc2facf6-dfa2-431a-990c-3ea84fc68c9a\" (UID: \"dc2facf6-dfa2-431a-990c-3ea84fc68c9a\") " Dec 09 11:41:47 crc kubenswrapper[5002]: I1209 11:41:47.872703 5002 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc2facf6-dfa2-431a-990c-3ea84fc68c9a-logs\") on node \"crc\" DevicePath \"\"" Dec 09 11:41:47 crc kubenswrapper[5002]: I1209 11:41:47.878135 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc2facf6-dfa2-431a-990c-3ea84fc68c9a-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "dc2facf6-dfa2-431a-990c-3ea84fc68c9a" (UID: "dc2facf6-dfa2-431a-990c-3ea84fc68c9a"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:41:47 crc kubenswrapper[5002]: I1209 11:41:47.878211 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc2facf6-dfa2-431a-990c-3ea84fc68c9a-kube-api-access-lhxv2" (OuterVolumeSpecName: "kube-api-access-lhxv2") pod "dc2facf6-dfa2-431a-990c-3ea84fc68c9a" (UID: "dc2facf6-dfa2-431a-990c-3ea84fc68c9a"). InnerVolumeSpecName "kube-api-access-lhxv2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:41:47 crc kubenswrapper[5002]: I1209 11:41:47.904723 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc2facf6-dfa2-431a-990c-3ea84fc68c9a-config-data" (OuterVolumeSpecName: "config-data") pod "dc2facf6-dfa2-431a-990c-3ea84fc68c9a" (UID: "dc2facf6-dfa2-431a-990c-3ea84fc68c9a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:41:47 crc kubenswrapper[5002]: I1209 11:41:47.905171 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc2facf6-dfa2-431a-990c-3ea84fc68c9a-scripts" (OuterVolumeSpecName: "scripts") pod "dc2facf6-dfa2-431a-990c-3ea84fc68c9a" (UID: "dc2facf6-dfa2-431a-990c-3ea84fc68c9a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:41:47 crc kubenswrapper[5002]: I1209 11:41:47.975073 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dc2facf6-dfa2-431a-990c-3ea84fc68c9a-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:41:47 crc kubenswrapper[5002]: I1209 11:41:47.975141 5002 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dc2facf6-dfa2-431a-990c-3ea84fc68c9a-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 09 11:41:47 crc kubenswrapper[5002]: I1209 11:41:47.975156 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc2facf6-dfa2-431a-990c-3ea84fc68c9a-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:41:47 crc kubenswrapper[5002]: I1209 11:41:47.975190 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhxv2\" (UniqueName: \"kubernetes.io/projected/dc2facf6-dfa2-431a-990c-3ea84fc68c9a-kube-api-access-lhxv2\") on node \"crc\" DevicePath \"\"" Dec 09 11:41:48 crc kubenswrapper[5002]: I1209 11:41:48.697688 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57b4f846bf-cvsfb" event={"ID":"dc2facf6-dfa2-431a-990c-3ea84fc68c9a","Type":"ContainerDied","Data":"45130d4976480e37d0f45588e82ad8793a8b9b663e47c6ca04d2a43562aadde4"} Dec 09 11:41:48 crc kubenswrapper[5002]: I1209 11:41:48.698040 5002 scope.go:117] "RemoveContainer" containerID="f828e7da8511fee0f5e22e44498d4d65d8f944038ddef26a16e1371e474191b2" Dec 09 11:41:48 crc kubenswrapper[5002]: I1209 11:41:48.697988 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-57b4f846bf-cvsfb" Dec 09 11:41:48 crc kubenswrapper[5002]: I1209 11:41:48.725365 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-57b4f846bf-cvsfb"] Dec 09 11:41:48 crc kubenswrapper[5002]: I1209 11:41:48.735268 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-57b4f846bf-cvsfb"] Dec 09 11:41:48 crc kubenswrapper[5002]: I1209 11:41:48.864984 5002 scope.go:117] "RemoveContainer" containerID="3e23337b00595597c892834a902a89fd0a72d0b8eb97bd66e8574b7d51edc592" Dec 09 11:41:49 crc kubenswrapper[5002]: I1209 11:41:49.060656 5002 scope.go:117] "RemoveContainer" containerID="39962d0376837cc534e6b0a62303166efdae767fb36cfb81ae7c7eb077d56c3e" Dec 09 11:41:49 crc kubenswrapper[5002]: E1209 11:41:49.061086 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:41:49 crc kubenswrapper[5002]: I1209 11:41:49.570053 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-76d4949bc5-j4vhr" podUID="46d15617-ecbb-44d1-aa58-e11b6d40cc25" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.105:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.105:8080: connect: connection refused" Dec 09 11:41:50 crc kubenswrapper[5002]: I1209 11:41:50.074165 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc2facf6-dfa2-431a-990c-3ea84fc68c9a" path="/var/lib/kubelet/pods/dc2facf6-dfa2-431a-990c-3ea84fc68c9a/volumes" Dec 09 11:41:52 crc kubenswrapper[5002]: I1209 11:41:52.596009 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-77b6c8965c-q29mq"] Dec 09 11:41:52 crc kubenswrapper[5002]: E1209 11:41:52.597196 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc2facf6-dfa2-431a-990c-3ea84fc68c9a" containerName="horizon-log" Dec 09 11:41:52 crc kubenswrapper[5002]: I1209 11:41:52.597215 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc2facf6-dfa2-431a-990c-3ea84fc68c9a" containerName="horizon-log" Dec 09 11:41:52 crc kubenswrapper[5002]: E1209 11:41:52.597229 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc2facf6-dfa2-431a-990c-3ea84fc68c9a" containerName="horizon" Dec 09 11:41:52 crc kubenswrapper[5002]: I1209 11:41:52.597255 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc2facf6-dfa2-431a-990c-3ea84fc68c9a" containerName="horizon" Dec 09 11:41:52 crc kubenswrapper[5002]: I1209 11:41:52.597550 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc2facf6-dfa2-431a-990c-3ea84fc68c9a" containerName="horizon-log" Dec 09 11:41:52 crc kubenswrapper[5002]: I1209 11:41:52.597568 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc2facf6-dfa2-431a-990c-3ea84fc68c9a" containerName="horizon" Dec 09 11:41:52 crc kubenswrapper[5002]: I1209 11:41:52.598853 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-77b6c8965c-q29mq" Dec 09 11:41:52 crc kubenswrapper[5002]: I1209 11:41:52.605329 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-77b6c8965c-q29mq"] Dec 09 11:41:52 crc kubenswrapper[5002]: I1209 11:41:52.688177 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e575f542-5dff-4ec7-b430-8e18382bc2e0-config-data\") pod \"horizon-77b6c8965c-q29mq\" (UID: \"e575f542-5dff-4ec7-b430-8e18382bc2e0\") " pod="openstack/horizon-77b6c8965c-q29mq" Dec 09 11:41:52 crc kubenswrapper[5002]: I1209 11:41:52.688287 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e575f542-5dff-4ec7-b430-8e18382bc2e0-horizon-secret-key\") pod \"horizon-77b6c8965c-q29mq\" (UID: \"e575f542-5dff-4ec7-b430-8e18382bc2e0\") " pod="openstack/horizon-77b6c8965c-q29mq" Dec 09 11:41:52 crc kubenswrapper[5002]: I1209 11:41:52.688591 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e575f542-5dff-4ec7-b430-8e18382bc2e0-scripts\") pod \"horizon-77b6c8965c-q29mq\" (UID: \"e575f542-5dff-4ec7-b430-8e18382bc2e0\") " pod="openstack/horizon-77b6c8965c-q29mq" Dec 09 11:41:52 crc kubenswrapper[5002]: I1209 11:41:52.688821 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e575f542-5dff-4ec7-b430-8e18382bc2e0-logs\") pod \"horizon-77b6c8965c-q29mq\" (UID: \"e575f542-5dff-4ec7-b430-8e18382bc2e0\") " pod="openstack/horizon-77b6c8965c-q29mq" Dec 09 11:41:52 crc kubenswrapper[5002]: I1209 11:41:52.689087 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6blk4\" (UniqueName: \"kubernetes.io/projected/e575f542-5dff-4ec7-b430-8e18382bc2e0-kube-api-access-6blk4\") pod \"horizon-77b6c8965c-q29mq\" (UID: \"e575f542-5dff-4ec7-b430-8e18382bc2e0\") " pod="openstack/horizon-77b6c8965c-q29mq" Dec 09 11:41:52 crc kubenswrapper[5002]: I1209 11:41:52.791317 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e575f542-5dff-4ec7-b430-8e18382bc2e0-scripts\") pod \"horizon-77b6c8965c-q29mq\" (UID: \"e575f542-5dff-4ec7-b430-8e18382bc2e0\") " pod="openstack/horizon-77b6c8965c-q29mq" Dec 09 11:41:52 crc kubenswrapper[5002]: I1209 11:41:52.791401 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e575f542-5dff-4ec7-b430-8e18382bc2e0-logs\") pod \"horizon-77b6c8965c-q29mq\" (UID: \"e575f542-5dff-4ec7-b430-8e18382bc2e0\") " pod="openstack/horizon-77b6c8965c-q29mq" Dec 09 11:41:52 crc kubenswrapper[5002]: I1209 11:41:52.791526 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6blk4\" (UniqueName: \"kubernetes.io/projected/e575f542-5dff-4ec7-b430-8e18382bc2e0-kube-api-access-6blk4\") pod \"horizon-77b6c8965c-q29mq\" (UID: \"e575f542-5dff-4ec7-b430-8e18382bc2e0\") " pod="openstack/horizon-77b6c8965c-q29mq" Dec 09 11:41:52 crc kubenswrapper[5002]: I1209 11:41:52.791618 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/e575f542-5dff-4ec7-b430-8e18382bc2e0-config-data\") pod \"horizon-77b6c8965c-q29mq\" (UID: \"e575f542-5dff-4ec7-b430-8e18382bc2e0\") " pod="openstack/horizon-77b6c8965c-q29mq" Dec 09 11:41:52 crc kubenswrapper[5002]: I1209 11:41:52.791635 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e575f542-5dff-4ec7-b430-8e18382bc2e0-horizon-secret-key\") pod \"horizon-77b6c8965c-q29mq\" (UID: \"e575f542-5dff-4ec7-b430-8e18382bc2e0\") " pod="openstack/horizon-77b6c8965c-q29mq" Dec 09 11:41:52 crc kubenswrapper[5002]: I1209 11:41:52.791920 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e575f542-5dff-4ec7-b430-8e18382bc2e0-logs\") pod \"horizon-77b6c8965c-q29mq\" (UID: \"e575f542-5dff-4ec7-b430-8e18382bc2e0\") " pod="openstack/horizon-77b6c8965c-q29mq" Dec 09 11:41:52 crc kubenswrapper[5002]: I1209 11:41:52.792484 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e575f542-5dff-4ec7-b430-8e18382bc2e0-scripts\") pod \"horizon-77b6c8965c-q29mq\" (UID: \"e575f542-5dff-4ec7-b430-8e18382bc2e0\") " pod="openstack/horizon-77b6c8965c-q29mq" Dec 09 11:41:52 crc kubenswrapper[5002]: I1209 11:41:52.792952 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e575f542-5dff-4ec7-b430-8e18382bc2e0-config-data\") pod \"horizon-77b6c8965c-q29mq\" (UID: \"e575f542-5dff-4ec7-b430-8e18382bc2e0\") " pod="openstack/horizon-77b6c8965c-q29mq" Dec 09 11:41:52 crc kubenswrapper[5002]: I1209 11:41:52.807734 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e575f542-5dff-4ec7-b430-8e18382bc2e0-horizon-secret-key\") pod \"horizon-77b6c8965c-q29mq\" (UID: \"e575f542-5dff-4ec7-b430-8e18382bc2e0\") " pod="openstack/horizon-77b6c8965c-q29mq" Dec 09 11:41:52 crc kubenswrapper[5002]: I1209 11:41:52.811339 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6blk4\" (UniqueName: \"kubernetes.io/projected/e575f542-5dff-4ec7-b430-8e18382bc2e0-kube-api-access-6blk4\") pod \"horizon-77b6c8965c-q29mq\" (UID: \"e575f542-5dff-4ec7-b430-8e18382bc2e0\") " pod="openstack/horizon-77b6c8965c-q29mq" Dec 09 11:41:52 crc kubenswrapper[5002]: I1209 11:41:52.929653 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-77b6c8965c-q29mq" Dec 09 11:41:53 crc kubenswrapper[5002]: I1209 11:41:53.424217 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-77b6c8965c-q29mq"] Dec 09 11:41:53 crc kubenswrapper[5002]: I1209 11:41:53.750726 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77b6c8965c-q29mq" event={"ID":"e575f542-5dff-4ec7-b430-8e18382bc2e0","Type":"ContainerStarted","Data":"5f7653c680abc431a7646e704e42f8fd7c8e6686d14b98d60ce7544a560192fc"} Dec 09 11:41:53 crc kubenswrapper[5002]: I1209 11:41:53.750764 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77b6c8965c-q29mq" event={"ID":"e575f542-5dff-4ec7-b430-8e18382bc2e0","Type":"ContainerStarted","Data":"94081e9d77ca6a92ad208bed211503c4ce602470b5c9abe03f5ff4545d9bc66e"} Dec 09 11:41:53 crc kubenswrapper[5002]: I1209 11:41:53.968989 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-ffd2-account-create-update-7dng6"] Dec 09 11:41:53 crc kubenswrapper[5002]: I1209 11:41:53.971923 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-ffd2-account-create-update-7dng6" Dec 09 11:41:53 crc kubenswrapper[5002]: I1209 11:41:53.975070 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Dec 09 11:41:53 crc kubenswrapper[5002]: I1209 11:41:53.983095 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-nbj95"] Dec 09 11:41:53 crc kubenswrapper[5002]: I1209 11:41:53.984351 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-nbj95" Dec 09 11:41:54 crc kubenswrapper[5002]: I1209 11:41:54.005446 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-ffd2-account-create-update-7dng6"] Dec 09 11:41:54 crc kubenswrapper[5002]: I1209 11:41:54.016887 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-nbj95"] Dec 09 11:41:54 crc kubenswrapper[5002]: I1209 11:41:54.118718 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59341756-042a-491d-a6de-f3a3f8d06cd2-operator-scripts\") pod \"heat-ffd2-account-create-update-7dng6\" (UID: \"59341756-042a-491d-a6de-f3a3f8d06cd2\") " pod="openstack/heat-ffd2-account-create-update-7dng6" Dec 09 11:41:54 crc kubenswrapper[5002]: I1209 11:41:54.118773 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndwbg\" (UniqueName: \"kubernetes.io/projected/4700fedc-db85-433e-b657-cecd4f746ca2-kube-api-access-ndwbg\") pod \"heat-db-create-nbj95\" (UID: \"4700fedc-db85-433e-b657-cecd4f746ca2\") " pod="openstack/heat-db-create-nbj95" Dec 09 11:41:54 crc kubenswrapper[5002]: I1209 11:41:54.118848 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4700fedc-db85-433e-b657-cecd4f746ca2-operator-scripts\") pod \"heat-db-create-nbj95\" (UID: \"4700fedc-db85-433e-b657-cecd4f746ca2\") " pod="openstack/heat-db-create-nbj95" Dec 09 11:41:54 crc kubenswrapper[5002]: I1209 11:41:54.118868 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5wg2\" (UniqueName: \"kubernetes.io/projected/59341756-042a-491d-a6de-f3a3f8d06cd2-kube-api-access-t5wg2\") pod 
\"heat-ffd2-account-create-update-7dng6\" (UID: \"59341756-042a-491d-a6de-f3a3f8d06cd2\") " pod="openstack/heat-ffd2-account-create-update-7dng6" Dec 09 11:41:54 crc kubenswrapper[5002]: I1209 11:41:54.220960 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59341756-042a-491d-a6de-f3a3f8d06cd2-operator-scripts\") pod \"heat-ffd2-account-create-update-7dng6\" (UID: \"59341756-042a-491d-a6de-f3a3f8d06cd2\") " pod="openstack/heat-ffd2-account-create-update-7dng6" Dec 09 11:41:54 crc kubenswrapper[5002]: I1209 11:41:54.221022 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndwbg\" (UniqueName: \"kubernetes.io/projected/4700fedc-db85-433e-b657-cecd4f746ca2-kube-api-access-ndwbg\") pod \"heat-db-create-nbj95\" (UID: \"4700fedc-db85-433e-b657-cecd4f746ca2\") " pod="openstack/heat-db-create-nbj95" Dec 09 11:41:54 crc kubenswrapper[5002]: I1209 11:41:54.221124 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4700fedc-db85-433e-b657-cecd4f746ca2-operator-scripts\") pod \"heat-db-create-nbj95\" (UID: \"4700fedc-db85-433e-b657-cecd4f746ca2\") " pod="openstack/heat-db-create-nbj95" Dec 09 11:41:54 crc kubenswrapper[5002]: I1209 11:41:54.221154 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5wg2\" (UniqueName: \"kubernetes.io/projected/59341756-042a-491d-a6de-f3a3f8d06cd2-kube-api-access-t5wg2\") pod \"heat-ffd2-account-create-update-7dng6\" (UID: \"59341756-042a-491d-a6de-f3a3f8d06cd2\") " pod="openstack/heat-ffd2-account-create-update-7dng6" Dec 09 11:41:54 crc kubenswrapper[5002]: I1209 11:41:54.221826 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59341756-042a-491d-a6de-f3a3f8d06cd2-operator-scripts\") pod \"heat-ffd2-account-create-update-7dng6\" (UID: \"59341756-042a-491d-a6de-f3a3f8d06cd2\") " pod="openstack/heat-ffd2-account-create-update-7dng6" Dec 09 11:41:54 crc kubenswrapper[5002]: I1209 11:41:54.222085 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4700fedc-db85-433e-b657-cecd4f746ca2-operator-scripts\") pod \"heat-db-create-nbj95\" (UID: \"4700fedc-db85-433e-b657-cecd4f746ca2\") " pod="openstack/heat-db-create-nbj95" Dec 09 11:41:54 crc kubenswrapper[5002]: I1209 11:41:54.257747 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndwbg\" (UniqueName: \"kubernetes.io/projected/4700fedc-db85-433e-b657-cecd4f746ca2-kube-api-access-ndwbg\") pod \"heat-db-create-nbj95\" (UID: \"4700fedc-db85-433e-b657-cecd4f746ca2\") " pod="openstack/heat-db-create-nbj95" Dec 09 11:41:54 crc kubenswrapper[5002]: I1209 11:41:54.270449 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5wg2\" (UniqueName: \"kubernetes.io/projected/59341756-042a-491d-a6de-f3a3f8d06cd2-kube-api-access-t5wg2\") pod \"heat-ffd2-account-create-update-7dng6\" (UID: \"59341756-042a-491d-a6de-f3a3f8d06cd2\") " pod="openstack/heat-ffd2-account-create-update-7dng6" Dec 09 11:41:54 crc kubenswrapper[5002]: I1209 11:41:54.299421 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-ffd2-account-create-update-7dng6" Dec 09 11:41:54 crc kubenswrapper[5002]: I1209 11:41:54.328389 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-nbj95" Dec 09 11:41:54 crc kubenswrapper[5002]: I1209 11:41:54.760956 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77b6c8965c-q29mq" event={"ID":"e575f542-5dff-4ec7-b430-8e18382bc2e0","Type":"ContainerStarted","Data":"94946dcd7c4a53fa20042c344cba8e5c37a8dcad5c6d9fbae41fbf8055c3f290"} Dec 09 11:41:54 crc kubenswrapper[5002]: I1209 11:41:54.793222 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-77b6c8965c-q29mq" podStartSLOduration=2.793200827 podStartE2EDuration="2.793200827s" podCreationTimestamp="2025-12-09 11:41:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:41:54.78357276 +0000 UTC m=+6047.175623841" watchObservedRunningTime="2025-12-09 11:41:54.793200827 +0000 UTC m=+6047.185251908" Dec 09 11:41:54 crc kubenswrapper[5002]: I1209 11:41:54.860442 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-ffd2-account-create-update-7dng6"] Dec 09 11:41:54 crc kubenswrapper[5002]: W1209 11:41:54.936439 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4700fedc_db85_433e_b657_cecd4f746ca2.slice/crio-19f50b711e7e31a145a71288c009447be86716d6f40b6c9205515c7406e2305c WatchSource:0}: Error finding container 19f50b711e7e31a145a71288c009447be86716d6f40b6c9205515c7406e2305c: Status 404 returned error can't find the container with id 19f50b711e7e31a145a71288c009447be86716d6f40b6c9205515c7406e2305c Dec 09 11:41:54 crc kubenswrapper[5002]: I1209 11:41:54.937234 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-nbj95"] Dec 09 11:41:55 crc kubenswrapper[5002]: I1209 11:41:55.771251 5002 generic.go:334] "Generic (PLEG): container finished" podID="59341756-042a-491d-a6de-f3a3f8d06cd2" containerID="707fbd821f391fe27359a758a6735fa7aafcb1ae3a67e8f3c0015f0841e5b3e8" exitCode=0 Dec 09 11:41:55 crc kubenswrapper[5002]: I1209 11:41:55.771432 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-ffd2-account-create-update-7dng6" event={"ID":"59341756-042a-491d-a6de-f3a3f8d06cd2","Type":"ContainerDied","Data":"707fbd821f391fe27359a758a6735fa7aafcb1ae3a67e8f3c0015f0841e5b3e8"} Dec 09 11:41:55 crc kubenswrapper[5002]: I1209 11:41:55.771667 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-ffd2-account-create-update-7dng6" event={"ID":"59341756-042a-491d-a6de-f3a3f8d06cd2","Type":"ContainerStarted","Data":"2c9e33ca38fbaee87542d5995fac6d20c711302c2bbc2cd8f918878b239f19c5"} Dec 09 11:41:55 crc kubenswrapper[5002]: I1209 11:41:55.774405 5002 generic.go:334] "Generic (PLEG): container finished" podID="4700fedc-db85-433e-b657-cecd4f746ca2" containerID="9cb4bcc27e93416cb6b88f623e6f71e134169d4614fdd7ea8266151b23fc4e28" exitCode=0 Dec 09 11:41:55 crc kubenswrapper[5002]: I1209 11:41:55.774500 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-nbj95" event={"ID":"4700fedc-db85-433e-b657-cecd4f746ca2","Type":"ContainerDied","Data":"9cb4bcc27e93416cb6b88f623e6f71e134169d4614fdd7ea8266151b23fc4e28"} Dec 09 11:41:55 crc kubenswrapper[5002]: I1209 11:41:55.774533 5002 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-nbj95" event={"ID":"4700fedc-db85-433e-b657-cecd4f746ca2","Type":"ContainerStarted","Data":"19f50b711e7e31a145a71288c009447be86716d6f40b6c9205515c7406e2305c"} Dec 09 11:41:57 crc kubenswrapper[5002]: I1209 11:41:57.220333 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-nbj95" Dec 09 11:41:57 crc kubenswrapper[5002]: I1209 11:41:57.229312 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-ffd2-account-create-update-7dng6" Dec 09 11:41:57 crc kubenswrapper[5002]: I1209 11:41:57.316094 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndwbg\" (UniqueName: \"kubernetes.io/projected/4700fedc-db85-433e-b657-cecd4f746ca2-kube-api-access-ndwbg\") pod \"4700fedc-db85-433e-b657-cecd4f746ca2\" (UID: \"4700fedc-db85-433e-b657-cecd4f746ca2\") " Dec 09 11:41:57 crc kubenswrapper[5002]: I1209 11:41:57.316163 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59341756-042a-491d-a6de-f3a3f8d06cd2-operator-scripts\") pod \"59341756-042a-491d-a6de-f3a3f8d06cd2\" (UID: \"59341756-042a-491d-a6de-f3a3f8d06cd2\") " Dec 09 11:41:57 crc kubenswrapper[5002]: I1209 11:41:57.316184 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5wg2\" (UniqueName: \"kubernetes.io/projected/59341756-042a-491d-a6de-f3a3f8d06cd2-kube-api-access-t5wg2\") pod \"59341756-042a-491d-a6de-f3a3f8d06cd2\" (UID: \"59341756-042a-491d-a6de-f3a3f8d06cd2\") " Dec 09 11:41:57 crc kubenswrapper[5002]: I1209 11:41:57.316250 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4700fedc-db85-433e-b657-cecd4f746ca2-operator-scripts\") pod \"4700fedc-db85-433e-b657-cecd4f746ca2\" (UID: \"4700fedc-db85-433e-b657-cecd4f746ca2\") " Dec 09 11:41:57 crc kubenswrapper[5002]: I1209 11:41:57.316650 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59341756-042a-491d-a6de-f3a3f8d06cd2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "59341756-042a-491d-a6de-f3a3f8d06cd2" (UID: "59341756-042a-491d-a6de-f3a3f8d06cd2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:41:57 crc kubenswrapper[5002]: I1209 11:41:57.316995 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4700fedc-db85-433e-b657-cecd4f746ca2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4700fedc-db85-433e-b657-cecd4f746ca2" (UID: "4700fedc-db85-433e-b657-cecd4f746ca2"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:41:57 crc kubenswrapper[5002]: I1209 11:41:57.317496 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59341756-042a-491d-a6de-f3a3f8d06cd2-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:41:57 crc kubenswrapper[5002]: I1209 11:41:57.317517 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4700fedc-db85-433e-b657-cecd4f746ca2-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:41:57 crc kubenswrapper[5002]: I1209 11:41:57.323040 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4700fedc-db85-433e-b657-cecd4f746ca2-kube-api-access-ndwbg" (OuterVolumeSpecName: "kube-api-access-ndwbg") pod "4700fedc-db85-433e-b657-cecd4f746ca2" (UID: "4700fedc-db85-433e-b657-cecd4f746ca2"). InnerVolumeSpecName "kube-api-access-ndwbg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:41:57 crc kubenswrapper[5002]: I1209 11:41:57.328953 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59341756-042a-491d-a6de-f3a3f8d06cd2-kube-api-access-t5wg2" (OuterVolumeSpecName: "kube-api-access-t5wg2") pod "59341756-042a-491d-a6de-f3a3f8d06cd2" (UID: "59341756-042a-491d-a6de-f3a3f8d06cd2"). InnerVolumeSpecName "kube-api-access-t5wg2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:41:57 crc kubenswrapper[5002]: I1209 11:41:57.419543 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndwbg\" (UniqueName: \"kubernetes.io/projected/4700fedc-db85-433e-b657-cecd4f746ca2-kube-api-access-ndwbg\") on node \"crc\" DevicePath \"\"" Dec 09 11:41:57 crc kubenswrapper[5002]: I1209 11:41:57.419573 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5wg2\" (UniqueName: \"kubernetes.io/projected/59341756-042a-491d-a6de-f3a3f8d06cd2-kube-api-access-t5wg2\") on node \"crc\" DevicePath \"\"" Dec 09 11:41:57 crc kubenswrapper[5002]: I1209 11:41:57.802061 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-ffd2-account-create-update-7dng6" Dec 09 11:41:57 crc kubenswrapper[5002]: I1209 11:41:57.802079 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-ffd2-account-create-update-7dng6" event={"ID":"59341756-042a-491d-a6de-f3a3f8d06cd2","Type":"ContainerDied","Data":"2c9e33ca38fbaee87542d5995fac6d20c711302c2bbc2cd8f918878b239f19c5"} Dec 09 11:41:57 crc kubenswrapper[5002]: I1209 11:41:57.802155 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c9e33ca38fbaee87542d5995fac6d20c711302c2bbc2cd8f918878b239f19c5" Dec 09 11:41:57 crc kubenswrapper[5002]: I1209 11:41:57.805027 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-nbj95" event={"ID":"4700fedc-db85-433e-b657-cecd4f746ca2","Type":"ContainerDied","Data":"19f50b711e7e31a145a71288c009447be86716d6f40b6c9205515c7406e2305c"} Dec 09 11:41:57 crc kubenswrapper[5002]: I1209 11:41:57.805049 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19f50b711e7e31a145a71288c009447be86716d6f40b6c9205515c7406e2305c" Dec 09 11:41:57 crc kubenswrapper[5002]: I1209 11:41:57.805134 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-nbj95" Dec 09 11:41:59 crc kubenswrapper[5002]: I1209 11:41:59.154459 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-8wvkc"] Dec 09 11:41:59 crc kubenswrapper[5002]: E1209 11:41:59.156965 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4700fedc-db85-433e-b657-cecd4f746ca2" containerName="mariadb-database-create" Dec 09 11:41:59 crc kubenswrapper[5002]: I1209 11:41:59.157068 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="4700fedc-db85-433e-b657-cecd4f746ca2" containerName="mariadb-database-create" Dec 09 11:41:59 crc kubenswrapper[5002]: E1209 11:41:59.157137 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59341756-042a-491d-a6de-f3a3f8d06cd2" containerName="mariadb-account-create-update" Dec 09 11:41:59 crc kubenswrapper[5002]: I1209 11:41:59.157212 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="59341756-042a-491d-a6de-f3a3f8d06cd2" containerName="mariadb-account-create-update" Dec 09 11:41:59 crc kubenswrapper[5002]: I1209 11:41:59.157440 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="4700fedc-db85-433e-b657-cecd4f746ca2" containerName="mariadb-database-create" Dec 09 11:41:59 crc kubenswrapper[5002]: I1209 11:41:59.157504 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="59341756-042a-491d-a6de-f3a3f8d06cd2" containerName="mariadb-account-create-update" Dec 09 11:41:59 crc kubenswrapper[5002]: I1209 11:41:59.158245 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-8wvkc" Dec 09 11:41:59 crc kubenswrapper[5002]: I1209 11:41:59.160347 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-29xdw" Dec 09 11:41:59 crc kubenswrapper[5002]: I1209 11:41:59.160499 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Dec 09 11:41:59 crc kubenswrapper[5002]: I1209 11:41:59.172651 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-8wvkc"] Dec 09 11:41:59 crc kubenswrapper[5002]: I1209 11:41:59.254605 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkqfr\" (UniqueName: \"kubernetes.io/projected/665c443a-dd3a-417d-a5b4-8a9851fe80c7-kube-api-access-zkqfr\") pod \"heat-db-sync-8wvkc\" (UID: \"665c443a-dd3a-417d-a5b4-8a9851fe80c7\") " pod="openstack/heat-db-sync-8wvkc" Dec 09 11:41:59 crc kubenswrapper[5002]: I1209 11:41:59.254676 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/665c443a-dd3a-417d-a5b4-8a9851fe80c7-config-data\") pod \"heat-db-sync-8wvkc\" (UID: \"665c443a-dd3a-417d-a5b4-8a9851fe80c7\") " pod="openstack/heat-db-sync-8wvkc" Dec 09 11:41:59 crc kubenswrapper[5002]: I1209 11:41:59.254708 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/665c443a-dd3a-417d-a5b4-8a9851fe80c7-combined-ca-bundle\") pod \"heat-db-sync-8wvkc\" (UID: \"665c443a-dd3a-417d-a5b4-8a9851fe80c7\") " pod="openstack/heat-db-sync-8wvkc" Dec 09 11:41:59 crc kubenswrapper[5002]: I1209 11:41:59.360521 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkqfr\" (UniqueName: 
\"kubernetes.io/projected/665c443a-dd3a-417d-a5b4-8a9851fe80c7-kube-api-access-zkqfr\") pod \"heat-db-sync-8wvkc\" (UID: \"665c443a-dd3a-417d-a5b4-8a9851fe80c7\") " pod="openstack/heat-db-sync-8wvkc" Dec 09 11:41:59 crc kubenswrapper[5002]: I1209 11:41:59.360892 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/665c443a-dd3a-417d-a5b4-8a9851fe80c7-config-data\") pod \"heat-db-sync-8wvkc\" (UID: \"665c443a-dd3a-417d-a5b4-8a9851fe80c7\") " pod="openstack/heat-db-sync-8wvkc" Dec 09 11:41:59 crc kubenswrapper[5002]: I1209 11:41:59.361074 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/665c443a-dd3a-417d-a5b4-8a9851fe80c7-combined-ca-bundle\") pod \"heat-db-sync-8wvkc\" (UID: \"665c443a-dd3a-417d-a5b4-8a9851fe80c7\") " pod="openstack/heat-db-sync-8wvkc" Dec 09 11:41:59 crc kubenswrapper[5002]: I1209 11:41:59.369082 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/665c443a-dd3a-417d-a5b4-8a9851fe80c7-config-data\") pod \"heat-db-sync-8wvkc\" (UID: \"665c443a-dd3a-417d-a5b4-8a9851fe80c7\") " pod="openstack/heat-db-sync-8wvkc" Dec 09 11:41:59 crc kubenswrapper[5002]: I1209 11:41:59.373241 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/665c443a-dd3a-417d-a5b4-8a9851fe80c7-combined-ca-bundle\") pod \"heat-db-sync-8wvkc\" (UID: \"665c443a-dd3a-417d-a5b4-8a9851fe80c7\") " pod="openstack/heat-db-sync-8wvkc" Dec 09 11:41:59 crc kubenswrapper[5002]: I1209 11:41:59.397992 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkqfr\" (UniqueName: \"kubernetes.io/projected/665c443a-dd3a-417d-a5b4-8a9851fe80c7-kube-api-access-zkqfr\") pod \"heat-db-sync-8wvkc\" (UID: \"665c443a-dd3a-417d-a5b4-8a9851fe80c7\") " pod="openstack/heat-db-sync-8wvkc" Dec 09 11:41:59 crc kubenswrapper[5002]: I1209 11:41:59.477231 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-8wvkc" Dec 09 11:41:59 crc kubenswrapper[5002]: I1209 11:41:59.570376 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-76d4949bc5-j4vhr" podUID="46d15617-ecbb-44d1-aa58-e11b6d40cc25" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.105:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.105:8080: connect: connection refused" Dec 09 11:41:59 crc kubenswrapper[5002]: I1209 11:41:59.959865 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-8wvkc"] Dec 09 11:42:00 crc kubenswrapper[5002]: I1209 11:42:00.860662 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-8wvkc" event={"ID":"665c443a-dd3a-417d-a5b4-8a9851fe80c7","Type":"ContainerStarted","Data":"b075229a1a56f5b3d66f11dec3b0da8feb78f7a845a44aad64d575d2bb0f2da0"} Dec 09 11:42:02 crc kubenswrapper[5002]: I1209 11:42:02.930779 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-77b6c8965c-q29mq" Dec 09 11:42:02 crc kubenswrapper[5002]: I1209 11:42:02.931756 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-77b6c8965c-q29mq" Dec 09 11:42:03 crc kubenswrapper[5002]: I1209 11:42:03.061136 5002 scope.go:117] "RemoveContainer" containerID="39962d0376837cc534e6b0a62303166efdae767fb36cfb81ae7c7eb077d56c3e" Dec 09 11:42:03 crc kubenswrapper[5002]: E1209 11:42:03.061409 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:42:05 crc kubenswrapper[5002]: I1209 11:42:05.061435 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-m58rr"] Dec 09 11:42:05 crc kubenswrapper[5002]: I1209 11:42:05.071631 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-272b-account-create-update-kxfs7"] Dec 09 11:42:05 crc kubenswrapper[5002]: I1209 11:42:05.082270 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-m58rr"] Dec 09 11:42:05 crc kubenswrapper[5002]: I1209 11:42:05.090644 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-272b-account-create-update-kxfs7"] Dec 09 11:42:06 crc kubenswrapper[5002]: I1209 11:42:06.228050 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8df4965-e06f-4000-9b29-5cd4bfbdc827" path="/var/lib/kubelet/pods/b8df4965-e06f-4000-9b29-5cd4bfbdc827/volumes" Dec 09 11:42:06 crc kubenswrapper[5002]: I1209 11:42:06.232303 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d905abdf-06e8-474e-b053-702860fc6a48" path="/var/lib/kubelet/pods/d905abdf-06e8-474e-b053-702860fc6a48/volumes" Dec 09 11:42:07 crc kubenswrapper[5002]: I1209 11:42:07.928549 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-8wvkc" event={"ID":"665c443a-dd3a-417d-a5b4-8a9851fe80c7","Type":"ContainerStarted","Data":"d9be45632cab05c8d66b8dcd9fc2dbcb754354846d336d02aed7cabd2c1bdb19"} Dec 09 11:42:07 crc kubenswrapper[5002]: I1209 11:42:07.954356 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/heat-db-sync-8wvkc" podStartSLOduration=2.202298624 podStartE2EDuration="8.954338735s" podCreationTimestamp="2025-12-09 11:41:59 +0000 UTC" firstStartedPulling="2025-12-09 11:41:59.965004978 +0000 UTC m=+6052.357056059" lastFinishedPulling="2025-12-09 11:42:06.717045049 +0000 UTC m=+6059.109096170" observedRunningTime="2025-12-09 11:42:07.945739515 +0000 UTC m=+6060.337790586" watchObservedRunningTime="2025-12-09 11:42:07.954338735 +0000 UTC m=+6060.346389816" Dec 09 11:42:08 crc kubenswrapper[5002]: I1209 11:42:08.945244 5002 generic.go:334] "Generic (PLEG): container finished" podID="665c443a-dd3a-417d-a5b4-8a9851fe80c7" containerID="d9be45632cab05c8d66b8dcd9fc2dbcb754354846d336d02aed7cabd2c1bdb19" exitCode=0 Dec 09 11:42:08 crc kubenswrapper[5002]: I1209 11:42:08.945469 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-8wvkc" event={"ID":"665c443a-dd3a-417d-a5b4-8a9851fe80c7","Type":"ContainerDied","Data":"d9be45632cab05c8d66b8dcd9fc2dbcb754354846d336d02aed7cabd2c1bdb19"} Dec 09 11:42:09 crc kubenswrapper[5002]: I1209 11:42:09.569904 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-76d4949bc5-j4vhr" podUID="46d15617-ecbb-44d1-aa58-e11b6d40cc25" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.105:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.105:8080: connect: connection refused" Dec 09 11:42:09 crc kubenswrapper[5002]: I1209 11:42:09.570337 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-76d4949bc5-j4vhr" Dec 09 11:42:10 crc kubenswrapper[5002]: I1209 11:42:10.334057 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-8wvkc" Dec 09 11:42:10 crc kubenswrapper[5002]: I1209 11:42:10.509731 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/665c443a-dd3a-417d-a5b4-8a9851fe80c7-combined-ca-bundle\") pod \"665c443a-dd3a-417d-a5b4-8a9851fe80c7\" (UID: \"665c443a-dd3a-417d-a5b4-8a9851fe80c7\") " Dec 09 11:42:10 crc kubenswrapper[5002]: I1209 11:42:10.509983 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/665c443a-dd3a-417d-a5b4-8a9851fe80c7-config-data\") pod \"665c443a-dd3a-417d-a5b4-8a9851fe80c7\" (UID: \"665c443a-dd3a-417d-a5b4-8a9851fe80c7\") " Dec 09 11:42:10 crc kubenswrapper[5002]: I1209 11:42:10.510072 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkqfr\" (UniqueName: \"kubernetes.io/projected/665c443a-dd3a-417d-a5b4-8a9851fe80c7-kube-api-access-zkqfr\") pod \"665c443a-dd3a-417d-a5b4-8a9851fe80c7\" (UID: \"665c443a-dd3a-417d-a5b4-8a9851fe80c7\") " Dec 09 11:42:10 crc kubenswrapper[5002]: I1209 11:42:10.532223 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/665c443a-dd3a-417d-a5b4-8a9851fe80c7-kube-api-access-zkqfr" (OuterVolumeSpecName: "kube-api-access-zkqfr") pod "665c443a-dd3a-417d-a5b4-8a9851fe80c7" (UID: "665c443a-dd3a-417d-a5b4-8a9851fe80c7"). InnerVolumeSpecName "kube-api-access-zkqfr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:42:10 crc kubenswrapper[5002]: I1209 11:42:10.547118 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/665c443a-dd3a-417d-a5b4-8a9851fe80c7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "665c443a-dd3a-417d-a5b4-8a9851fe80c7" (UID: "665c443a-dd3a-417d-a5b4-8a9851fe80c7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:42:10 crc kubenswrapper[5002]: I1209 11:42:10.612393 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkqfr\" (UniqueName: \"kubernetes.io/projected/665c443a-dd3a-417d-a5b4-8a9851fe80c7-kube-api-access-zkqfr\") on node \"crc\" DevicePath \"\"" Dec 09 11:42:10 crc kubenswrapper[5002]: I1209 11:42:10.612439 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/665c443a-dd3a-417d-a5b4-8a9851fe80c7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:42:10 crc kubenswrapper[5002]: I1209 11:42:10.625413 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/665c443a-dd3a-417d-a5b4-8a9851fe80c7-config-data" (OuterVolumeSpecName: "config-data") pod "665c443a-dd3a-417d-a5b4-8a9851fe80c7" (UID: "665c443a-dd3a-417d-a5b4-8a9851fe80c7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:42:10 crc kubenswrapper[5002]: I1209 11:42:10.714690 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/665c443a-dd3a-417d-a5b4-8a9851fe80c7-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:42:10 crc kubenswrapper[5002]: I1209 11:42:10.974439 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-8wvkc" event={"ID":"665c443a-dd3a-417d-a5b4-8a9851fe80c7","Type":"ContainerDied","Data":"b075229a1a56f5b3d66f11dec3b0da8feb78f7a845a44aad64d575d2bb0f2da0"} Dec 09 11:42:10 crc kubenswrapper[5002]: I1209 11:42:10.974494 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b075229a1a56f5b3d66f11dec3b0da8feb78f7a845a44aad64d575d2bb0f2da0" Dec 09 11:42:10 crc kubenswrapper[5002]: I1209 11:42:10.974552 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-8wvkc" Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.057967 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-5844579b4-rxqx6"] Dec 09 11:42:12 crc kubenswrapper[5002]: E1209 11:42:12.058625 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="665c443a-dd3a-417d-a5b4-8a9851fe80c7" containerName="heat-db-sync" Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.058636 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="665c443a-dd3a-417d-a5b4-8a9851fe80c7" containerName="heat-db-sync" Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.058909 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="665c443a-dd3a-417d-a5b4-8a9851fe80c7" containerName="heat-db-sync" Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.059794 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-5844579b4-rxqx6" Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.069730 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.070153 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.070482 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-29xdw" Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.094624 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-54698c9446-z6mqw"] Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.097209 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-5844579b4-rxqx6"] Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.097327 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-54698c9446-z6mqw" Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.109303 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.109881 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-54698c9446-z6mqw"] Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.146238 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fcab3bf4-1659-4429-93f1-c941380fa773-config-data-custom\") pod \"heat-engine-5844579b4-rxqx6\" (UID: \"fcab3bf4-1659-4429-93f1-c941380fa773\") " pod="openstack/heat-engine-5844579b4-rxqx6" Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.146318 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcab3bf4-1659-4429-93f1-c941380fa773-config-data\") pod \"heat-engine-5844579b4-rxqx6\" (UID: \"fcab3bf4-1659-4429-93f1-c941380fa773\") " pod="openstack/heat-engine-5844579b4-rxqx6" Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.146377 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4942\" (UniqueName: \"kubernetes.io/projected/fcab3bf4-1659-4429-93f1-c941380fa773-kube-api-access-j4942\") pod \"heat-engine-5844579b4-rxqx6\" (UID: \"fcab3bf4-1659-4429-93f1-c941380fa773\") " pod="openstack/heat-engine-5844579b4-rxqx6" Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.146401 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcab3bf4-1659-4429-93f1-c941380fa773-combined-ca-bundle\") pod \"heat-engine-5844579b4-rxqx6\" (UID: \"fcab3bf4-1659-4429-93f1-c941380fa773\") " pod="openstack/heat-engine-5844579b4-rxqx6" Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.173865 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-74d764bbbb-8rsqh"] Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.175224 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-74d764bbbb-8rsqh" Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.181156 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.188429 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-74d764bbbb-8rsqh"] Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.248883 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcab3bf4-1659-4429-93f1-c941380fa773-config-data\") pod \"heat-engine-5844579b4-rxqx6\" (UID: \"fcab3bf4-1659-4429-93f1-c941380fa773\") " pod="openstack/heat-engine-5844579b4-rxqx6" Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.248945 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/671233cb-6429-4ecb-ad1d-9a40d9407b60-config-data\") pod \"heat-api-54698c9446-z6mqw\" (UID: \"671233cb-6429-4ecb-ad1d-9a40d9407b60\") " pod="openstack/heat-api-54698c9446-z6mqw" Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.249110 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4942\" (UniqueName: \"kubernetes.io/projected/fcab3bf4-1659-4429-93f1-c941380fa773-kube-api-access-j4942\") pod \"heat-engine-5844579b4-rxqx6\" (UID: \"fcab3bf4-1659-4429-93f1-c941380fa773\") " pod="openstack/heat-engine-5844579b4-rxqx6" Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.249133 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcab3bf4-1659-4429-93f1-c941380fa773-combined-ca-bundle\") pod \"heat-engine-5844579b4-rxqx6\" (UID: \"fcab3bf4-1659-4429-93f1-c941380fa773\") " pod="openstack/heat-engine-5844579b4-rxqx6" Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.249244 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/671233cb-6429-4ecb-ad1d-9a40d9407b60-config-data-custom\") pod \"heat-api-54698c9446-z6mqw\" (UID: \"671233cb-6429-4ecb-ad1d-9a40d9407b60\") " pod="openstack/heat-api-54698c9446-z6mqw" Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.249296 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pslwr\" (UniqueName: \"kubernetes.io/projected/671233cb-6429-4ecb-ad1d-9a40d9407b60-kube-api-access-pslwr\") pod \"heat-api-54698c9446-z6mqw\" (UID: \"671233cb-6429-4ecb-ad1d-9a40d9407b60\") " pod="openstack/heat-api-54698c9446-z6mqw" Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.249383 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/671233cb-6429-4ecb-ad1d-9a40d9407b60-combined-ca-bundle\") pod \"heat-api-54698c9446-z6mqw\" (UID: \"671233cb-6429-4ecb-ad1d-9a40d9407b60\") " pod="openstack/heat-api-54698c9446-z6mqw" Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.249409 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fcab3bf4-1659-4429-93f1-c941380fa773-config-data-custom\") pod \"heat-engine-5844579b4-rxqx6\" (UID: \"fcab3bf4-1659-4429-93f1-c941380fa773\") " 
pod="openstack/heat-engine-5844579b4-rxqx6" Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.258902 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcab3bf4-1659-4429-93f1-c941380fa773-combined-ca-bundle\") pod \"heat-engine-5844579b4-rxqx6\" (UID: \"fcab3bf4-1659-4429-93f1-c941380fa773\") " pod="openstack/heat-engine-5844579b4-rxqx6" Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.263003 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcab3bf4-1659-4429-93f1-c941380fa773-config-data\") pod \"heat-engine-5844579b4-rxqx6\" (UID: \"fcab3bf4-1659-4429-93f1-c941380fa773\") " pod="openstack/heat-engine-5844579b4-rxqx6" Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.278227 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fcab3bf4-1659-4429-93f1-c941380fa773-config-data-custom\") pod \"heat-engine-5844579b4-rxqx6\" (UID: \"fcab3bf4-1659-4429-93f1-c941380fa773\") " pod="openstack/heat-engine-5844579b4-rxqx6" Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.283660 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4942\" (UniqueName: \"kubernetes.io/projected/fcab3bf4-1659-4429-93f1-c941380fa773-kube-api-access-j4942\") pod \"heat-engine-5844579b4-rxqx6\" (UID: \"fcab3bf4-1659-4429-93f1-c941380fa773\") " pod="openstack/heat-engine-5844579b4-rxqx6" Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.351208 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09b6ad8c-27bd-4485-94ac-ff455749056e-combined-ca-bundle\") pod \"heat-cfnapi-74d764bbbb-8rsqh\" (UID: \"09b6ad8c-27bd-4485-94ac-ff455749056e\") " pod="openstack/heat-cfnapi-74d764bbbb-8rsqh" Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.351283 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/671233cb-6429-4ecb-ad1d-9a40d9407b60-config-data\") pod \"heat-api-54698c9446-z6mqw\" (UID: \"671233cb-6429-4ecb-ad1d-9a40d9407b60\") " pod="openstack/heat-api-54698c9446-z6mqw" Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.351324 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/09b6ad8c-27bd-4485-94ac-ff455749056e-config-data-custom\") pod \"heat-cfnapi-74d764bbbb-8rsqh\" (UID: \"09b6ad8c-27bd-4485-94ac-ff455749056e\") " pod="openstack/heat-cfnapi-74d764bbbb-8rsqh" Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.351420 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/671233cb-6429-4ecb-ad1d-9a40d9407b60-config-data-custom\") pod \"heat-api-54698c9446-z6mqw\" (UID: \"671233cb-6429-4ecb-ad1d-9a40d9407b60\") " pod="openstack/heat-api-54698c9446-z6mqw" Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.351451 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pslwr\" (UniqueName: \"kubernetes.io/projected/671233cb-6429-4ecb-ad1d-9a40d9407b60-kube-api-access-pslwr\") pod \"heat-api-54698c9446-z6mqw\" (UID: \"671233cb-6429-4ecb-ad1d-9a40d9407b60\") " 
pod="openstack/heat-api-54698c9446-z6mqw" Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.351515 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlndj\" (UniqueName: \"kubernetes.io/projected/09b6ad8c-27bd-4485-94ac-ff455749056e-kube-api-access-qlndj\") pod \"heat-cfnapi-74d764bbbb-8rsqh\" (UID: \"09b6ad8c-27bd-4485-94ac-ff455749056e\") " pod="openstack/heat-cfnapi-74d764bbbb-8rsqh" Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.351542 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/671233cb-6429-4ecb-ad1d-9a40d9407b60-combined-ca-bundle\") pod \"heat-api-54698c9446-z6mqw\" (UID: \"671233cb-6429-4ecb-ad1d-9a40d9407b60\") " pod="openstack/heat-api-54698c9446-z6mqw" Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.351560 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09b6ad8c-27bd-4485-94ac-ff455749056e-config-data\") pod \"heat-cfnapi-74d764bbbb-8rsqh\" (UID: \"09b6ad8c-27bd-4485-94ac-ff455749056e\") " pod="openstack/heat-cfnapi-74d764bbbb-8rsqh" Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.355766 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/671233cb-6429-4ecb-ad1d-9a40d9407b60-combined-ca-bundle\") pod \"heat-api-54698c9446-z6mqw\" (UID: \"671233cb-6429-4ecb-ad1d-9a40d9407b60\") " pod="openstack/heat-api-54698c9446-z6mqw" Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.356347 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/671233cb-6429-4ecb-ad1d-9a40d9407b60-config-data\") pod \"heat-api-54698c9446-z6mqw\" (UID: \"671233cb-6429-4ecb-ad1d-9a40d9407b60\") " pod="openstack/heat-api-54698c9446-z6mqw" Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.357381 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/671233cb-6429-4ecb-ad1d-9a40d9407b60-config-data-custom\") pod \"heat-api-54698c9446-z6mqw\" (UID: \"671233cb-6429-4ecb-ad1d-9a40d9407b60\") " pod="openstack/heat-api-54698c9446-z6mqw" Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.377085 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pslwr\" (UniqueName: \"kubernetes.io/projected/671233cb-6429-4ecb-ad1d-9a40d9407b60-kube-api-access-pslwr\") pod \"heat-api-54698c9446-z6mqw\" (UID: \"671233cb-6429-4ecb-ad1d-9a40d9407b60\") " pod="openstack/heat-api-54698c9446-z6mqw" Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.396534 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-5844579b4-rxqx6" Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.448957 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-54698c9446-z6mqw" Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.454089 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlndj\" (UniqueName: \"kubernetes.io/projected/09b6ad8c-27bd-4485-94ac-ff455749056e-kube-api-access-qlndj\") pod \"heat-cfnapi-74d764bbbb-8rsqh\" (UID: \"09b6ad8c-27bd-4485-94ac-ff455749056e\") " pod="openstack/heat-cfnapi-74d764bbbb-8rsqh" Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.454307 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09b6ad8c-27bd-4485-94ac-ff455749056e-config-data\") pod \"heat-cfnapi-74d764bbbb-8rsqh\" (UID: \"09b6ad8c-27bd-4485-94ac-ff455749056e\") " pod="openstack/heat-cfnapi-74d764bbbb-8rsqh" Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.454479 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09b6ad8c-27bd-4485-94ac-ff455749056e-combined-ca-bundle\") pod \"heat-cfnapi-74d764bbbb-8rsqh\" (UID: \"09b6ad8c-27bd-4485-94ac-ff455749056e\") " pod="openstack/heat-cfnapi-74d764bbbb-8rsqh" Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.454608 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/09b6ad8c-27bd-4485-94ac-ff455749056e-config-data-custom\") pod \"heat-cfnapi-74d764bbbb-8rsqh\" (UID: \"09b6ad8c-27bd-4485-94ac-ff455749056e\") " pod="openstack/heat-cfnapi-74d764bbbb-8rsqh" Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.460234 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09b6ad8c-27bd-4485-94ac-ff455749056e-combined-ca-bundle\") pod \"heat-cfnapi-74d764bbbb-8rsqh\" (UID: \"09b6ad8c-27bd-4485-94ac-ff455749056e\") " pod="openstack/heat-cfnapi-74d764bbbb-8rsqh" Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.461115 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/09b6ad8c-27bd-4485-94ac-ff455749056e-config-data-custom\") pod \"heat-cfnapi-74d764bbbb-8rsqh\" (UID: \"09b6ad8c-27bd-4485-94ac-ff455749056e\") " pod="openstack/heat-cfnapi-74d764bbbb-8rsqh" Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.462852 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09b6ad8c-27bd-4485-94ac-ff455749056e-config-data\") pod \"heat-cfnapi-74d764bbbb-8rsqh\" (UID: \"09b6ad8c-27bd-4485-94ac-ff455749056e\") " pod="openstack/heat-cfnapi-74d764bbbb-8rsqh" Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.475724 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlndj\" (UniqueName: \"kubernetes.io/projected/09b6ad8c-27bd-4485-94ac-ff455749056e-kube-api-access-qlndj\") pod \"heat-cfnapi-74d764bbbb-8rsqh\" (UID: \"09b6ad8c-27bd-4485-94ac-ff455749056e\") " pod="openstack/heat-cfnapi-74d764bbbb-8rsqh" Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.531630 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-74d764bbbb-8rsqh" Dec 09 11:42:12 crc kubenswrapper[5002]: W1209 11:42:12.937937 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfcab3bf4_1659_4429_93f1_c941380fa773.slice/crio-46a210fe07ceb338d8c0b6b6753cd6aa365f85f4f3e2f5471dbf24ba032abdae WatchSource:0}: Error finding container 46a210fe07ceb338d8c0b6b6753cd6aa365f85f4f3e2f5471dbf24ba032abdae: Status 404 returned error can't find the container with id 46a210fe07ceb338d8c0b6b6753cd6aa365f85f4f3e2f5471dbf24ba032abdae Dec 09 11:42:12 crc kubenswrapper[5002]: I1209 11:42:12.944249 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-5844579b4-rxqx6"] Dec 09 11:42:13 crc kubenswrapper[5002]: I1209 11:42:13.039660 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-l44gb"] Dec 09 11:42:13 crc kubenswrapper[5002]: I1209 11:42:13.053498 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5844579b4-rxqx6" event={"ID":"fcab3bf4-1659-4429-93f1-c941380fa773","Type":"ContainerStarted","Data":"46a210fe07ceb338d8c0b6b6753cd6aa365f85f4f3e2f5471dbf24ba032abdae"} Dec 09 11:42:13 crc kubenswrapper[5002]: I1209 11:42:13.054747 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-l44gb"] Dec 09 11:42:13 crc kubenswrapper[5002]: I1209 11:42:13.114421 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-54698c9446-z6mqw"] Dec 09 11:42:13 crc kubenswrapper[5002]: W1209 11:42:13.114694 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod671233cb_6429_4ecb_ad1d_9a40d9407b60.slice/crio-db2d35ed964340ebda704686a863bff047d3fa00570e82fcee5beff720a1e121 WatchSource:0}: Error finding container db2d35ed964340ebda704686a863bff047d3fa00570e82fcee5beff720a1e121: Status 404 returned error can't find the container with id db2d35ed964340ebda704686a863bff047d3fa00570e82fcee5beff720a1e121 Dec 09 11:42:13 crc kubenswrapper[5002]: W1209 11:42:13.252362 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09b6ad8c_27bd_4485_94ac_ff455749056e.slice/crio-5e9624eecf3affe11f6271688f92667438f1a531c564416ca320f890d3300fcf WatchSource:0}: Error finding container 5e9624eecf3affe11f6271688f92667438f1a531c564416ca320f890d3300fcf: Status 404 returned error can't find the container with id 5e9624eecf3affe11f6271688f92667438f1a531c564416ca320f890d3300fcf Dec 09 11:42:13 crc kubenswrapper[5002]: I1209 11:42:13.254331 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-74d764bbbb-8rsqh"] Dec 09 11:42:14 crc kubenswrapper[5002]: I1209 11:42:14.081180 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c88a1594-851e-43a5-b1ad-ba082a94d556" path="/var/lib/kubelet/pods/c88a1594-851e-43a5-b1ad-ba082a94d556/volumes" Dec 09 11:42:14 crc kubenswrapper[5002]: I1209 11:42:14.101158 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-74d764bbbb-8rsqh" event={"ID":"09b6ad8c-27bd-4485-94ac-ff455749056e","Type":"ContainerStarted","Data":"5e9624eecf3affe11f6271688f92667438f1a531c564416ca320f890d3300fcf"} Dec 09 11:42:14 crc kubenswrapper[5002]: I1209 11:42:14.110014 5002 generic.go:334] "Generic (PLEG): container finished" podID="46d15617-ecbb-44d1-aa58-e11b6d40cc25" 
containerID="859b6ba1875aa02b0fe6422dceba243c53a343217e25a307f90ddfe174205394" exitCode=137 Dec 09 11:42:14 crc kubenswrapper[5002]: I1209 11:42:14.110108 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76d4949bc5-j4vhr" event={"ID":"46d15617-ecbb-44d1-aa58-e11b6d40cc25","Type":"ContainerDied","Data":"859b6ba1875aa02b0fe6422dceba243c53a343217e25a307f90ddfe174205394"} Dec 09 11:42:14 crc kubenswrapper[5002]: I1209 11:42:14.113693 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-54698c9446-z6mqw" event={"ID":"671233cb-6429-4ecb-ad1d-9a40d9407b60","Type":"ContainerStarted","Data":"db2d35ed964340ebda704686a863bff047d3fa00570e82fcee5beff720a1e121"} Dec 09 11:42:14 crc kubenswrapper[5002]: I1209 11:42:14.118408 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5844579b4-rxqx6" event={"ID":"fcab3bf4-1659-4429-93f1-c941380fa773","Type":"ContainerStarted","Data":"67f7940520b14c6a1539381cefdcbd36284df18e7893d5854102f9377e250db3"} Dec 09 11:42:14 crc kubenswrapper[5002]: I1209 11:42:14.119193 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-5844579b4-rxqx6" Dec 09 11:42:14 crc kubenswrapper[5002]: I1209 11:42:14.149573 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-5844579b4-rxqx6" podStartSLOduration=2.149550731 podStartE2EDuration="2.149550731s" podCreationTimestamp="2025-12-09 11:42:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:42:14.137829938 +0000 UTC m=+6066.529881039" watchObservedRunningTime="2025-12-09 11:42:14.149550731 +0000 UTC m=+6066.541601812" Dec 09 11:42:14 crc kubenswrapper[5002]: I1209 11:42:14.187101 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-76d4949bc5-j4vhr" Dec 09 11:42:14 crc kubenswrapper[5002]: I1209 11:42:14.296063 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/46d15617-ecbb-44d1-aa58-e11b6d40cc25-horizon-secret-key\") pod \"46d15617-ecbb-44d1-aa58-e11b6d40cc25\" (UID: \"46d15617-ecbb-44d1-aa58-e11b6d40cc25\") " Dec 09 11:42:14 crc kubenswrapper[5002]: I1209 11:42:14.296164 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/46d15617-ecbb-44d1-aa58-e11b6d40cc25-scripts\") pod \"46d15617-ecbb-44d1-aa58-e11b6d40cc25\" (UID: \"46d15617-ecbb-44d1-aa58-e11b6d40cc25\") " Dec 09 11:42:14 crc kubenswrapper[5002]: I1209 11:42:14.296185 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/46d15617-ecbb-44d1-aa58-e11b6d40cc25-config-data\") pod \"46d15617-ecbb-44d1-aa58-e11b6d40cc25\" (UID: \"46d15617-ecbb-44d1-aa58-e11b6d40cc25\") " Dec 09 11:42:14 crc kubenswrapper[5002]: I1209 11:42:14.297539 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46d15617-ecbb-44d1-aa58-e11b6d40cc25-logs\") pod \"46d15617-ecbb-44d1-aa58-e11b6d40cc25\" (UID: \"46d15617-ecbb-44d1-aa58-e11b6d40cc25\") " Dec 09 11:42:14 crc kubenswrapper[5002]: I1209 11:42:14.297753 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnlp6\" (UniqueName: \"kubernetes.io/projected/46d15617-ecbb-44d1-aa58-e11b6d40cc25-kube-api-access-jnlp6\") pod \"46d15617-ecbb-44d1-aa58-e11b6d40cc25\" (UID: \"46d15617-ecbb-44d1-aa58-e11b6d40cc25\") " Dec 09 11:42:14 crc kubenswrapper[5002]: I1209 11:42:14.298330 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46d15617-ecbb-44d1-aa58-e11b6d40cc25-logs" (OuterVolumeSpecName: "logs") pod "46d15617-ecbb-44d1-aa58-e11b6d40cc25" (UID: "46d15617-ecbb-44d1-aa58-e11b6d40cc25"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:42:14 crc kubenswrapper[5002]: I1209 11:42:14.300308 5002 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46d15617-ecbb-44d1-aa58-e11b6d40cc25-logs\") on node \"crc\" DevicePath \"\"" Dec 09 11:42:14 crc kubenswrapper[5002]: I1209 11:42:14.313198 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46d15617-ecbb-44d1-aa58-e11b6d40cc25-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "46d15617-ecbb-44d1-aa58-e11b6d40cc25" (UID: "46d15617-ecbb-44d1-aa58-e11b6d40cc25"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:42:14 crc kubenswrapper[5002]: I1209 11:42:14.313386 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46d15617-ecbb-44d1-aa58-e11b6d40cc25-kube-api-access-jnlp6" (OuterVolumeSpecName: "kube-api-access-jnlp6") pod "46d15617-ecbb-44d1-aa58-e11b6d40cc25" (UID: "46d15617-ecbb-44d1-aa58-e11b6d40cc25"). InnerVolumeSpecName "kube-api-access-jnlp6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:42:14 crc kubenswrapper[5002]: I1209 11:42:14.330207 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46d15617-ecbb-44d1-aa58-e11b6d40cc25-scripts" (OuterVolumeSpecName: "scripts") pod "46d15617-ecbb-44d1-aa58-e11b6d40cc25" (UID: "46d15617-ecbb-44d1-aa58-e11b6d40cc25"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:42:14 crc kubenswrapper[5002]: I1209 11:42:14.333764 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46d15617-ecbb-44d1-aa58-e11b6d40cc25-config-data" (OuterVolumeSpecName: "config-data") pod "46d15617-ecbb-44d1-aa58-e11b6d40cc25" (UID: "46d15617-ecbb-44d1-aa58-e11b6d40cc25"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:42:14 crc kubenswrapper[5002]: I1209 11:42:14.402622 5002 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/46d15617-ecbb-44d1-aa58-e11b6d40cc25-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 09 11:42:14 crc kubenswrapper[5002]: I1209 11:42:14.402665 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/46d15617-ecbb-44d1-aa58-e11b6d40cc25-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:42:14 crc kubenswrapper[5002]: I1209 11:42:14.402676 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/46d15617-ecbb-44d1-aa58-e11b6d40cc25-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:42:14 crc kubenswrapper[5002]: I1209 11:42:14.402688 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnlp6\" (UniqueName: \"kubernetes.io/projected/46d15617-ecbb-44d1-aa58-e11b6d40cc25-kube-api-access-jnlp6\") on node \"crc\" DevicePath \"\"" Dec 09 11:42:15 crc kubenswrapper[5002]: I1209 11:42:15.001889 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-77b6c8965c-q29mq" Dec 09 11:42:15 crc kubenswrapper[5002]: I1209 11:42:15.149676 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-76d4949bc5-j4vhr" Dec 09 11:42:15 crc kubenswrapper[5002]: I1209 11:42:15.150193 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76d4949bc5-j4vhr" event={"ID":"46d15617-ecbb-44d1-aa58-e11b6d40cc25","Type":"ContainerDied","Data":"d049ad467d5d22c1e7c551c7b308c0e010aac1fcb5288a0eaf8d8b0c464c8dd2"} Dec 09 11:42:15 crc kubenswrapper[5002]: I1209 11:42:15.150227 5002 scope.go:117] "RemoveContainer" containerID="74c0450337f14d8144b94f3f2ba6e587df110d16a39c00eb2009f3931524240e" Dec 09 11:42:15 crc kubenswrapper[5002]: I1209 11:42:15.188725 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-76d4949bc5-j4vhr"] Dec 09 11:42:15 crc kubenswrapper[5002]: I1209 11:42:15.200398 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-76d4949bc5-j4vhr"] Dec 09 11:42:15 crc kubenswrapper[5002]: I1209 11:42:15.927688 5002 scope.go:117] "RemoveContainer" containerID="859b6ba1875aa02b0fe6422dceba243c53a343217e25a307f90ddfe174205394" Dec 09 11:42:16 crc kubenswrapper[5002]: I1209 11:42:16.061195 5002 scope.go:117] "RemoveContainer" containerID="39962d0376837cc534e6b0a62303166efdae767fb36cfb81ae7c7eb077d56c3e" Dec 09 11:42:16 crc kubenswrapper[5002]: E1209 11:42:16.061696 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:42:16 crc kubenswrapper[5002]: I1209 11:42:16.071801 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46d15617-ecbb-44d1-aa58-e11b6d40cc25" path="/var/lib/kubelet/pods/46d15617-ecbb-44d1-aa58-e11b6d40cc25/volumes" Dec 09 11:42:16 crc kubenswrapper[5002]: I1209 11:42:16.846254 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-77b6c8965c-q29mq" Dec 09 11:42:16 crc kubenswrapper[5002]: I1209 11:42:16.933555 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-68dbf898cc-4fcz6"] Dec 09 11:42:16 crc kubenswrapper[5002]: I1209 11:42:16.934124 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-68dbf898cc-4fcz6" podUID="f4050891-a1f6-4bf8-b612-d516dcac072a" containerName="horizon-log" containerID="cri-o://fa0c95cd6a5a2021273ed0c0ea0a76748db43764b31355aa204820d63547a606" gracePeriod=30 Dec 09 11:42:16 crc kubenswrapper[5002]: I1209 11:42:16.934182 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-68dbf898cc-4fcz6" podUID="f4050891-a1f6-4bf8-b612-d516dcac072a" containerName="horizon" containerID="cri-o://ce071d4e632eaabf87d73b39d82e96c163b1f99773a26ef04c7b1b00c0c4daff" gracePeriod=30 Dec 09 11:42:17 crc kubenswrapper[5002]: I1209 11:42:17.197861 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-74d764bbbb-8rsqh" event={"ID":"09b6ad8c-27bd-4485-94ac-ff455749056e","Type":"ContainerStarted","Data":"1dd45f28ce74f7113f6b4c3f3fd80a88446076eafdb8fa250eae95cd789c5a1c"} Dec 09 11:42:17 crc kubenswrapper[5002]: I1209 11:42:17.198007 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-74d764bbbb-8rsqh" Dec 09 11:42:17 crc kubenswrapper[5002]: 
I1209 11:42:17.200147 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-54698c9446-z6mqw" event={"ID":"671233cb-6429-4ecb-ad1d-9a40d9407b60","Type":"ContainerStarted","Data":"f87c9dd43af35f1922e6ecd1982d59228816184a2e6da0a740b40067823b8d75"} Dec 09 11:42:17 crc kubenswrapper[5002]: I1209 11:42:17.200304 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-54698c9446-z6mqw" Dec 09 11:42:17 crc kubenswrapper[5002]: I1209 11:42:17.215148 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-74d764bbbb-8rsqh" podStartSLOduration=2.335761852 podStartE2EDuration="5.215130132s" podCreationTimestamp="2025-12-09 11:42:12 +0000 UTC" firstStartedPulling="2025-12-09 11:42:13.275300216 +0000 UTC m=+6065.667351297" lastFinishedPulling="2025-12-09 11:42:16.154668466 +0000 UTC m=+6068.546719577" observedRunningTime="2025-12-09 11:42:17.211703641 +0000 UTC m=+6069.603754722" watchObservedRunningTime="2025-12-09 11:42:17.215130132 +0000 UTC m=+6069.607181203" Dec 09 11:42:17 crc kubenswrapper[5002]: I1209 11:42:17.231199 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-54698c9446-z6mqw" podStartSLOduration=2.4179323090000002 podStartE2EDuration="5.231175671s" podCreationTimestamp="2025-12-09 11:42:12 +0000 UTC" firstStartedPulling="2025-12-09 11:42:13.116294672 +0000 UTC m=+6065.508345753" lastFinishedPulling="2025-12-09 11:42:15.929538034 +0000 UTC m=+6068.321589115" observedRunningTime="2025-12-09 11:42:17.225277874 +0000 UTC m=+6069.617328975" watchObservedRunningTime="2025-12-09 11:42:17.231175671 +0000 UTC m=+6069.623226752" Dec 09 11:42:20 crc kubenswrapper[5002]: I1209 11:42:20.102284 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-68dbf898cc-4fcz6" podUID="f4050891-a1f6-4bf8-b612-d516dcac072a" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.106:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.106:8080: connect: connection refused" Dec 09 11:42:20 crc kubenswrapper[5002]: I1209 11:42:20.232739 5002 generic.go:334] "Generic (PLEG): container finished" podID="f4050891-a1f6-4bf8-b612-d516dcac072a" containerID="ce071d4e632eaabf87d73b39d82e96c163b1f99773a26ef04c7b1b00c0c4daff" exitCode=0 Dec 09 11:42:20 crc kubenswrapper[5002]: I1209 11:42:20.232804 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68dbf898cc-4fcz6" event={"ID":"f4050891-a1f6-4bf8-b612-d516dcac072a","Type":"ContainerDied","Data":"ce071d4e632eaabf87d73b39d82e96c163b1f99773a26ef04c7b1b00c0c4daff"} Dec 09 11:42:23 crc kubenswrapper[5002]: I1209 11:42:23.820170 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-54698c9446-z6mqw" Dec 09 11:42:23 crc kubenswrapper[5002]: I1209 11:42:23.952530 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-74d764bbbb-8rsqh" Dec 09 11:42:29 crc kubenswrapper[5002]: I1209 11:42:29.060446 5002 scope.go:117] "RemoveContainer" containerID="39962d0376837cc534e6b0a62303166efdae767fb36cfb81ae7c7eb077d56c3e" Dec 09 11:42:29 crc kubenswrapper[5002]: E1209 11:42:29.061602 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:42:30 crc kubenswrapper[5002]: I1209 11:42:30.101952 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-68dbf898cc-4fcz6" podUID="f4050891-a1f6-4bf8-b612-d516dcac072a" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.106:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.106:8080: connect: connection refused" Dec 09 11:42:32 crc kubenswrapper[5002]: I1209 11:42:32.455334 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-5844579b4-rxqx6" Dec 09 11:42:40 crc kubenswrapper[5002]: I1209 11:42:40.102761 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-68dbf898cc-4fcz6" podUID="f4050891-a1f6-4bf8-b612-d516dcac072a" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.106:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.106:8080: connect: connection refused" Dec 09 11:42:40 crc kubenswrapper[5002]: I1209 11:42:40.103546 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-68dbf898cc-4fcz6" Dec 09 11:42:41 crc kubenswrapper[5002]: I1209 11:42:41.750892 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92104j42l"] Dec 09 11:42:41 crc kubenswrapper[5002]: E1209 11:42:41.751808 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46d15617-ecbb-44d1-aa58-e11b6d40cc25" containerName="horizon-log" Dec 09 11:42:41 crc kubenswrapper[5002]: I1209 11:42:41.752056 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="46d15617-ecbb-44d1-aa58-e11b6d40cc25" containerName="horizon-log" Dec 09 11:42:41 crc kubenswrapper[5002]: E1209 11:42:41.752103 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46d15617-ecbb-44d1-aa58-e11b6d40cc25" containerName="horizon" Dec 09 11:42:41 crc kubenswrapper[5002]: I1209 11:42:41.752112 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="46d15617-ecbb-44d1-aa58-e11b6d40cc25" containerName="horizon" Dec 09 11:42:41 crc kubenswrapper[5002]: I1209 11:42:41.752375 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="46d15617-ecbb-44d1-aa58-e11b6d40cc25" containerName="horizon-log" Dec 09 11:42:41 crc kubenswrapper[5002]: I1209 11:42:41.752397 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="46d15617-ecbb-44d1-aa58-e11b6d40cc25" containerName="horizon" Dec 09 11:42:41 crc kubenswrapper[5002]: I1209 11:42:41.754321 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92104j42l" Dec 09 11:42:41 crc kubenswrapper[5002]: I1209 11:42:41.757580 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 09 11:42:41 crc kubenswrapper[5002]: I1209 11:42:41.763052 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92104j42l"] Dec 09 11:42:41 crc kubenswrapper[5002]: I1209 11:42:41.860905 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4a24ce03-952b-4bbe-aa94-26132c1f7655-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92104j42l\" (UID: \"4a24ce03-952b-4bbe-aa94-26132c1f7655\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92104j42l" Dec 09 11:42:41 crc kubenswrapper[5002]: I1209 11:42:41.860999 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66rmt\" (UniqueName: \"kubernetes.io/projected/4a24ce03-952b-4bbe-aa94-26132c1f7655-kube-api-access-66rmt\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92104j42l\" (UID: \"4a24ce03-952b-4bbe-aa94-26132c1f7655\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92104j42l" Dec 09 11:42:41 crc kubenswrapper[5002]: I1209 11:42:41.861060 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4a24ce03-952b-4bbe-aa94-26132c1f7655-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92104j42l\" (UID: \"4a24ce03-952b-4bbe-aa94-26132c1f7655\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92104j42l" Dec 09 11:42:41 crc kubenswrapper[5002]: I1209 11:42:41.962917 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4a24ce03-952b-4bbe-aa94-26132c1f7655-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92104j42l\" (UID: \"4a24ce03-952b-4bbe-aa94-26132c1f7655\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92104j42l" Dec 09 11:42:41 crc kubenswrapper[5002]: I1209 11:42:41.963122 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4a24ce03-952b-4bbe-aa94-26132c1f7655-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92104j42l\" (UID: \"4a24ce03-952b-4bbe-aa94-26132c1f7655\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92104j42l" Dec 09 11:42:41 crc kubenswrapper[5002]: I1209 11:42:41.963226 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66rmt\" (UniqueName: \"kubernetes.io/projected/4a24ce03-952b-4bbe-aa94-26132c1f7655-kube-api-access-66rmt\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92104j42l\" (UID: \"4a24ce03-952b-4bbe-aa94-26132c1f7655\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92104j42l" Dec 09 11:42:41 crc kubenswrapper[5002]: I1209 11:42:41.964047 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/4a24ce03-952b-4bbe-aa94-26132c1f7655-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92104j42l\" (UID: \"4a24ce03-952b-4bbe-aa94-26132c1f7655\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92104j42l" Dec 09 11:42:41 crc kubenswrapper[5002]: I1209 11:42:41.965022 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4a24ce03-952b-4bbe-aa94-26132c1f7655-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92104j42l\" (UID: \"4a24ce03-952b-4bbe-aa94-26132c1f7655\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92104j42l" Dec 09 11:42:41 crc kubenswrapper[5002]: I1209 11:42:41.987897 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66rmt\" (UniqueName: \"kubernetes.io/projected/4a24ce03-952b-4bbe-aa94-26132c1f7655-kube-api-access-66rmt\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92104j42l\" (UID: \"4a24ce03-952b-4bbe-aa94-26132c1f7655\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92104j42l" Dec 09 11:42:42 crc kubenswrapper[5002]: I1209 11:42:42.084362 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92104j42l" Dec 09 11:42:42 crc kubenswrapper[5002]: I1209 11:42:42.703903 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92104j42l"] Dec 09 11:42:43 crc kubenswrapper[5002]: I1209 11:42:43.479429 5002 generic.go:334] "Generic (PLEG): container finished" podID="4a24ce03-952b-4bbe-aa94-26132c1f7655" containerID="0f51fd9cb5be277f46e0581ca8be68b6e66296ceb4b64cf8a07eef1a5f5945c9" exitCode=0 Dec 09 11:42:43 crc kubenswrapper[5002]: I1209 11:42:43.479545 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92104j42l" event={"ID":"4a24ce03-952b-4bbe-aa94-26132c1f7655","Type":"ContainerDied","Data":"0f51fd9cb5be277f46e0581ca8be68b6e66296ceb4b64cf8a07eef1a5f5945c9"} Dec 09 11:42:43 crc kubenswrapper[5002]: I1209 11:42:43.479774 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92104j42l" event={"ID":"4a24ce03-952b-4bbe-aa94-26132c1f7655","Type":"ContainerStarted","Data":"018333047f5f0a6599afe1788763484de296e10fc8f17aa8ac36f3a9aea28575"} Dec 09 11:42:44 crc kubenswrapper[5002]: I1209 11:42:44.060675 5002 scope.go:117] "RemoveContainer" containerID="39962d0376837cc534e6b0a62303166efdae767fb36cfb81ae7c7eb077d56c3e" Dec 09 11:42:44 crc kubenswrapper[5002]: E1209 11:42:44.061130 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:42:44 crc kubenswrapper[5002]: I1209 11:42:44.083380 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fw9tt"] Dec 09 11:42:44 crc kubenswrapper[5002]: I1209 11:42:44.086975 5002 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fw9tt"] Dec 09 11:42:44 crc kubenswrapper[5002]: I1209 11:42:44.087125 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fw9tt" Dec 09 11:42:44 crc kubenswrapper[5002]: I1209 11:42:44.222123 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/245ee9b5-cf0d-4046-b4c8-666039a8eacc-utilities\") pod \"redhat-operators-fw9tt\" (UID: \"245ee9b5-cf0d-4046-b4c8-666039a8eacc\") " pod="openshift-marketplace/redhat-operators-fw9tt" Dec 09 11:42:44 crc kubenswrapper[5002]: I1209 11:42:44.222770 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/245ee9b5-cf0d-4046-b4c8-666039a8eacc-catalog-content\") pod \"redhat-operators-fw9tt\" (UID: \"245ee9b5-cf0d-4046-b4c8-666039a8eacc\") " pod="openshift-marketplace/redhat-operators-fw9tt" Dec 09 11:42:44 crc kubenswrapper[5002]: I1209 11:42:44.223042 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4mnb\" (UniqueName: \"kubernetes.io/projected/245ee9b5-cf0d-4046-b4c8-666039a8eacc-kube-api-access-b4mnb\") pod \"redhat-operators-fw9tt\" (UID: \"245ee9b5-cf0d-4046-b4c8-666039a8eacc\") " pod="openshift-marketplace/redhat-operators-fw9tt" Dec 09 11:42:44 crc kubenswrapper[5002]: I1209 11:42:44.324692 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/245ee9b5-cf0d-4046-b4c8-666039a8eacc-utilities\") pod \"redhat-operators-fw9tt\" (UID: \"245ee9b5-cf0d-4046-b4c8-666039a8eacc\") " pod="openshift-marketplace/redhat-operators-fw9tt" Dec 09 11:42:44 crc kubenswrapper[5002]: I1209 11:42:44.324765 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/245ee9b5-cf0d-4046-b4c8-666039a8eacc-catalog-content\") pod \"redhat-operators-fw9tt\" (UID: \"245ee9b5-cf0d-4046-b4c8-666039a8eacc\") " pod="openshift-marketplace/redhat-operators-fw9tt" Dec 09 11:42:44 crc kubenswrapper[5002]: I1209 11:42:44.324804 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4mnb\" (UniqueName: \"kubernetes.io/projected/245ee9b5-cf0d-4046-b4c8-666039a8eacc-kube-api-access-b4mnb\") pod \"redhat-operators-fw9tt\" (UID: \"245ee9b5-cf0d-4046-b4c8-666039a8eacc\") " pod="openshift-marketplace/redhat-operators-fw9tt" Dec 09 11:42:44 crc kubenswrapper[5002]: I1209 11:42:44.325179 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/245ee9b5-cf0d-4046-b4c8-666039a8eacc-utilities\") pod \"redhat-operators-fw9tt\" (UID: \"245ee9b5-cf0d-4046-b4c8-666039a8eacc\") " pod="openshift-marketplace/redhat-operators-fw9tt" Dec 09 11:42:44 crc kubenswrapper[5002]: I1209 11:42:44.325307 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/245ee9b5-cf0d-4046-b4c8-666039a8eacc-catalog-content\") pod \"redhat-operators-fw9tt\" (UID: \"245ee9b5-cf0d-4046-b4c8-666039a8eacc\") " pod="openshift-marketplace/redhat-operators-fw9tt" Dec 09 11:42:44 crc kubenswrapper[5002]: I1209 11:42:44.348181 5002 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-b4mnb\" (UniqueName: \"kubernetes.io/projected/245ee9b5-cf0d-4046-b4c8-666039a8eacc-kube-api-access-b4mnb\") pod \"redhat-operators-fw9tt\" (UID: \"245ee9b5-cf0d-4046-b4c8-666039a8eacc\") " pod="openshift-marketplace/redhat-operators-fw9tt" Dec 09 11:42:44 crc kubenswrapper[5002]: I1209 11:42:44.459718 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fw9tt" Dec 09 11:42:45 crc kubenswrapper[5002]: I1209 11:42:45.326574 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fw9tt"] Dec 09 11:42:45 crc kubenswrapper[5002]: I1209 11:42:45.500572 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fw9tt" event={"ID":"245ee9b5-cf0d-4046-b4c8-666039a8eacc","Type":"ContainerStarted","Data":"0ff1b5eb5aa82e42133ea3e5bb347eafe4553d91656b8bd72eb8e01afc5a481b"} Dec 09 11:42:46 crc kubenswrapper[5002]: I1209 11:42:46.516971 5002 generic.go:334] "Generic (PLEG): container finished" podID="4a24ce03-952b-4bbe-aa94-26132c1f7655" containerID="e99d90c3685d1e3712461d7ffa5b89badbeec8f8293dedcdbdbad189ffd8e4b4" exitCode=0 Dec 09 11:42:46 crc kubenswrapper[5002]: I1209 11:42:46.517173 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92104j42l" event={"ID":"4a24ce03-952b-4bbe-aa94-26132c1f7655","Type":"ContainerDied","Data":"e99d90c3685d1e3712461d7ffa5b89badbeec8f8293dedcdbdbad189ffd8e4b4"} Dec 09 11:42:46 crc kubenswrapper[5002]: I1209 11:42:46.521231 5002 generic.go:334] "Generic (PLEG): container finished" podID="245ee9b5-cf0d-4046-b4c8-666039a8eacc" containerID="82501750b20e812ac7ecbd26445f5270a910a89031bc417ab4ec3d3fba069f02" exitCode=0 Dec 09 11:42:46 crc kubenswrapper[5002]: I1209 11:42:46.521281 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fw9tt" event={"ID":"245ee9b5-cf0d-4046-b4c8-666039a8eacc","Type":"ContainerDied","Data":"82501750b20e812ac7ecbd26445f5270a910a89031bc417ab4ec3d3fba069f02"} Dec 09 11:42:47 crc kubenswrapper[5002]: I1209 11:42:47.052105 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-5fgxf"] Dec 09 11:42:47 crc kubenswrapper[5002]: I1209 11:42:47.060241 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-069c-account-create-update-gb6q6"] Dec 09 11:42:47 crc kubenswrapper[5002]: I1209 11:42:47.070764 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-5fgxf"] Dec 09 11:42:47 crc kubenswrapper[5002]: I1209 11:42:47.080098 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-069c-account-create-update-gb6q6"] Dec 09 11:42:47 crc kubenswrapper[5002]: I1209 11:42:47.535287 5002 generic.go:334] "Generic (PLEG): container finished" podID="f4050891-a1f6-4bf8-b612-d516dcac072a" containerID="fa0c95cd6a5a2021273ed0c0ea0a76748db43764b31355aa204820d63547a606" exitCode=137 Dec 09 11:42:47 crc kubenswrapper[5002]: I1209 11:42:47.535370 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68dbf898cc-4fcz6" event={"ID":"f4050891-a1f6-4bf8-b612-d516dcac072a","Type":"ContainerDied","Data":"fa0c95cd6a5a2021273ed0c0ea0a76748db43764b31355aa204820d63547a606"} Dec 09 11:42:47 crc kubenswrapper[5002]: I1209 11:42:47.535702 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68dbf898cc-4fcz6" 
event={"ID":"f4050891-a1f6-4bf8-b612-d516dcac072a","Type":"ContainerDied","Data":"6d44d3109e57a866592e294800588a368113c3cd5411ef12b35f63627eb3ae6d"} Dec 09 11:42:47 crc kubenswrapper[5002]: I1209 11:42:47.535720 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d44d3109e57a866592e294800588a368113c3cd5411ef12b35f63627eb3ae6d" Dec 09 11:42:47 crc kubenswrapper[5002]: I1209 11:42:47.538362 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92104j42l" event={"ID":"4a24ce03-952b-4bbe-aa94-26132c1f7655","Type":"ContainerStarted","Data":"d57f0005cb62ef71c8cf51d62b6bee17bcd5fd7bb1e88ce96fed4e087b08a0bc"} Dec 09 11:42:47 crc kubenswrapper[5002]: I1209 11:42:47.565275 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92104j42l" podStartSLOduration=4.791829032 podStartE2EDuration="6.565255279s" podCreationTimestamp="2025-12-09 11:42:41 +0000 UTC" firstStartedPulling="2025-12-09 11:42:43.481562394 +0000 UTC m=+6095.873613475" lastFinishedPulling="2025-12-09 11:42:45.254988641 +0000 UTC m=+6097.647039722" observedRunningTime="2025-12-09 11:42:47.552020165 +0000 UTC m=+6099.944071246" watchObservedRunningTime="2025-12-09 11:42:47.565255279 +0000 UTC m=+6099.957306360" Dec 09 11:42:47 crc kubenswrapper[5002]: I1209 11:42:47.574755 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-68dbf898cc-4fcz6" Dec 09 11:42:47 crc kubenswrapper[5002]: I1209 11:42:47.694805 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lrbq\" (UniqueName: \"kubernetes.io/projected/f4050891-a1f6-4bf8-b612-d516dcac072a-kube-api-access-6lrbq\") pod \"f4050891-a1f6-4bf8-b612-d516dcac072a\" (UID: \"f4050891-a1f6-4bf8-b612-d516dcac072a\") " Dec 09 11:42:47 crc kubenswrapper[5002]: I1209 11:42:47.695056 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f4050891-a1f6-4bf8-b612-d516dcac072a-horizon-secret-key\") pod \"f4050891-a1f6-4bf8-b612-d516dcac072a\" (UID: \"f4050891-a1f6-4bf8-b612-d516dcac072a\") " Dec 09 11:42:47 crc kubenswrapper[5002]: I1209 11:42:47.695097 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f4050891-a1f6-4bf8-b612-d516dcac072a-config-data\") pod \"f4050891-a1f6-4bf8-b612-d516dcac072a\" (UID: \"f4050891-a1f6-4bf8-b612-d516dcac072a\") " Dec 09 11:42:47 crc kubenswrapper[5002]: I1209 11:42:47.695124 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f4050891-a1f6-4bf8-b612-d516dcac072a-scripts\") pod \"f4050891-a1f6-4bf8-b612-d516dcac072a\" (UID: \"f4050891-a1f6-4bf8-b612-d516dcac072a\") " Dec 09 11:42:47 crc kubenswrapper[5002]: I1209 11:42:47.695240 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4050891-a1f6-4bf8-b612-d516dcac072a-logs\") pod \"f4050891-a1f6-4bf8-b612-d516dcac072a\" (UID: \"f4050891-a1f6-4bf8-b612-d516dcac072a\") " Dec 09 11:42:47 crc kubenswrapper[5002]: I1209 11:42:47.695714 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4050891-a1f6-4bf8-b612-d516dcac072a-logs" 
(OuterVolumeSpecName: "logs") pod "f4050891-a1f6-4bf8-b612-d516dcac072a" (UID: "f4050891-a1f6-4bf8-b612-d516dcac072a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:42:47 crc kubenswrapper[5002]: I1209 11:42:47.695959 5002 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4050891-a1f6-4bf8-b612-d516dcac072a-logs\") on node \"crc\" DevicePath \"\"" Dec 09 11:42:47 crc kubenswrapper[5002]: I1209 11:42:47.700269 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4050891-a1f6-4bf8-b612-d516dcac072a-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "f4050891-a1f6-4bf8-b612-d516dcac072a" (UID: "f4050891-a1f6-4bf8-b612-d516dcac072a"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:42:47 crc kubenswrapper[5002]: I1209 11:42:47.702896 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4050891-a1f6-4bf8-b612-d516dcac072a-kube-api-access-6lrbq" (OuterVolumeSpecName: "kube-api-access-6lrbq") pod "f4050891-a1f6-4bf8-b612-d516dcac072a" (UID: "f4050891-a1f6-4bf8-b612-d516dcac072a"). InnerVolumeSpecName "kube-api-access-6lrbq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:42:47 crc kubenswrapper[5002]: I1209 11:42:47.717758 5002 scope.go:117] "RemoveContainer" containerID="27b49507d971a35cb7b62603337368fe2b5a39da95b6cfcae2e4a5714b6920cd" Dec 09 11:42:47 crc kubenswrapper[5002]: I1209 11:42:47.721710 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4050891-a1f6-4bf8-b612-d516dcac072a-scripts" (OuterVolumeSpecName: "scripts") pod "f4050891-a1f6-4bf8-b612-d516dcac072a" (UID: "f4050891-a1f6-4bf8-b612-d516dcac072a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:42:47 crc kubenswrapper[5002]: I1209 11:42:47.722689 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4050891-a1f6-4bf8-b612-d516dcac072a-config-data" (OuterVolumeSpecName: "config-data") pod "f4050891-a1f6-4bf8-b612-d516dcac072a" (UID: "f4050891-a1f6-4bf8-b612-d516dcac072a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:42:47 crc kubenswrapper[5002]: I1209 11:42:47.798960 5002 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f4050891-a1f6-4bf8-b612-d516dcac072a-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 09 11:42:47 crc kubenswrapper[5002]: I1209 11:42:47.799041 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f4050891-a1f6-4bf8-b612-d516dcac072a-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:42:47 crc kubenswrapper[5002]: I1209 11:42:47.799069 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f4050891-a1f6-4bf8-b612-d516dcac072a-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:42:47 crc kubenswrapper[5002]: I1209 11:42:47.799095 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lrbq\" (UniqueName: \"kubernetes.io/projected/f4050891-a1f6-4bf8-b612-d516dcac072a-kube-api-access-6lrbq\") on node \"crc\" DevicePath \"\"" Dec 09 11:42:47 crc kubenswrapper[5002]: I1209 11:42:47.819572 5002 scope.go:117] "RemoveContainer" containerID="6a2250e0429e97b6c1c0fc36bef5d0e99a3b006bf02ef0dbb35256b686acade6" Dec 09 11:42:47 crc kubenswrapper[5002]: I1209 11:42:47.859616 5002 scope.go:117] "RemoveContainer" containerID="e9317b7a96ea912c6bdda8365f01af064ecf7b077911e19a228f4f0cdf9849a6" Dec 09 11:42:48 crc kubenswrapper[5002]: I1209 11:42:48.070967 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce45c659-f71a-4bc5-8a2d-d4f2e230fb0b" path="/var/lib/kubelet/pods/ce45c659-f71a-4bc5-8a2d-d4f2e230fb0b/volumes" Dec 09 11:42:48 crc kubenswrapper[5002]: I1209 11:42:48.071657 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6b5c066-5fc5-40ba-99eb-552de9a2abfb" path="/var/lib/kubelet/pods/d6b5c066-5fc5-40ba-99eb-552de9a2abfb/volumes" Dec 09 11:42:48 crc kubenswrapper[5002]: I1209 11:42:48.549449 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fw9tt" event={"ID":"245ee9b5-cf0d-4046-b4c8-666039a8eacc","Type":"ContainerStarted","Data":"24aeb3a07fa6035b052b77ff64bcfed6f92598c2d7d4ba54a96657a28c09ced4"} Dec 09 11:42:48 crc kubenswrapper[5002]: I1209 11:42:48.551653 5002 generic.go:334] "Generic (PLEG): container finished" podID="4a24ce03-952b-4bbe-aa94-26132c1f7655" containerID="d57f0005cb62ef71c8cf51d62b6bee17bcd5fd7bb1e88ce96fed4e087b08a0bc" exitCode=0 Dec 09 11:42:48 crc kubenswrapper[5002]: I1209 11:42:48.551745 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-68dbf898cc-4fcz6" Dec 09 11:42:48 crc kubenswrapper[5002]: I1209 11:42:48.551734 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92104j42l" event={"ID":"4a24ce03-952b-4bbe-aa94-26132c1f7655","Type":"ContainerDied","Data":"d57f0005cb62ef71c8cf51d62b6bee17bcd5fd7bb1e88ce96fed4e087b08a0bc"} Dec 09 11:42:48 crc kubenswrapper[5002]: I1209 11:42:48.609863 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-68dbf898cc-4fcz6"] Dec 09 11:42:48 crc kubenswrapper[5002]: I1209 11:42:48.625076 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-68dbf898cc-4fcz6"] Dec 09 11:42:50 crc kubenswrapper[5002]: I1209 11:42:50.001166 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92104j42l" Dec 09 11:42:50 crc kubenswrapper[5002]: I1209 11:42:50.066342 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66rmt\" (UniqueName: \"kubernetes.io/projected/4a24ce03-952b-4bbe-aa94-26132c1f7655-kube-api-access-66rmt\") pod \"4a24ce03-952b-4bbe-aa94-26132c1f7655\" (UID: \"4a24ce03-952b-4bbe-aa94-26132c1f7655\") " Dec 09 11:42:50 crc kubenswrapper[5002]: I1209 11:42:50.066630 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4a24ce03-952b-4bbe-aa94-26132c1f7655-bundle\") pod \"4a24ce03-952b-4bbe-aa94-26132c1f7655\" (UID: \"4a24ce03-952b-4bbe-aa94-26132c1f7655\") " Dec 09 11:42:50 crc kubenswrapper[5002]: I1209 11:42:50.066884 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4a24ce03-952b-4bbe-aa94-26132c1f7655-util\") pod \"4a24ce03-952b-4bbe-aa94-26132c1f7655\" (UID: \"4a24ce03-952b-4bbe-aa94-26132c1f7655\") " Dec 09 11:42:50 crc kubenswrapper[5002]: I1209 11:42:50.068282 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a24ce03-952b-4bbe-aa94-26132c1f7655-bundle" (OuterVolumeSpecName: "bundle") pod "4a24ce03-952b-4bbe-aa94-26132c1f7655" (UID: "4a24ce03-952b-4bbe-aa94-26132c1f7655"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:42:50 crc kubenswrapper[5002]: I1209 11:42:50.072302 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a24ce03-952b-4bbe-aa94-26132c1f7655-kube-api-access-66rmt" (OuterVolumeSpecName: "kube-api-access-66rmt") pod "4a24ce03-952b-4bbe-aa94-26132c1f7655" (UID: "4a24ce03-952b-4bbe-aa94-26132c1f7655"). InnerVolumeSpecName "kube-api-access-66rmt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:42:50 crc kubenswrapper[5002]: I1209 11:42:50.078375 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a24ce03-952b-4bbe-aa94-26132c1f7655-util" (OuterVolumeSpecName: "util") pod "4a24ce03-952b-4bbe-aa94-26132c1f7655" (UID: "4a24ce03-952b-4bbe-aa94-26132c1f7655"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:42:50 crc kubenswrapper[5002]: I1209 11:42:50.086261 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4050891-a1f6-4bf8-b612-d516dcac072a" path="/var/lib/kubelet/pods/f4050891-a1f6-4bf8-b612-d516dcac072a/volumes" Dec 09 11:42:50 crc kubenswrapper[5002]: I1209 11:42:50.170219 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66rmt\" (UniqueName: \"kubernetes.io/projected/4a24ce03-952b-4bbe-aa94-26132c1f7655-kube-api-access-66rmt\") on node \"crc\" DevicePath \"\"" Dec 09 11:42:50 crc kubenswrapper[5002]: I1209 11:42:50.170261 5002 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4a24ce03-952b-4bbe-aa94-26132c1f7655-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:42:50 crc kubenswrapper[5002]: I1209 11:42:50.170272 5002 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4a24ce03-952b-4bbe-aa94-26132c1f7655-util\") on node \"crc\" DevicePath \"\"" Dec 09 11:42:50 crc kubenswrapper[5002]: I1209 11:42:50.573715 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92104j42l" event={"ID":"4a24ce03-952b-4bbe-aa94-26132c1f7655","Type":"ContainerDied","Data":"018333047f5f0a6599afe1788763484de296e10fc8f17aa8ac36f3a9aea28575"} Dec 09 11:42:50 crc kubenswrapper[5002]: I1209 11:42:50.573755 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="018333047f5f0a6599afe1788763484de296e10fc8f17aa8ac36f3a9aea28575" Dec 09 11:42:50 crc kubenswrapper[5002]: I1209 11:42:50.573837 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92104j42l" Dec 09 11:42:50 crc kubenswrapper[5002]: I1209 11:42:50.577617 5002 generic.go:334] "Generic (PLEG): container finished" podID="245ee9b5-cf0d-4046-b4c8-666039a8eacc" containerID="24aeb3a07fa6035b052b77ff64bcfed6f92598c2d7d4ba54a96657a28c09ced4" exitCode=0 Dec 09 11:42:50 crc kubenswrapper[5002]: I1209 11:42:50.577660 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fw9tt" event={"ID":"245ee9b5-cf0d-4046-b4c8-666039a8eacc","Type":"ContainerDied","Data":"24aeb3a07fa6035b052b77ff64bcfed6f92598c2d7d4ba54a96657a28c09ced4"} Dec 09 11:42:52 crc kubenswrapper[5002]: I1209 11:42:52.596543 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fw9tt" event={"ID":"245ee9b5-cf0d-4046-b4c8-666039a8eacc","Type":"ContainerStarted","Data":"6fdf88d40c3bb0f02ecda408e9dc6177c3c0027d2aa9ac896901a258a17e127e"} Dec 09 11:42:52 crc kubenswrapper[5002]: I1209 11:42:52.615995 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fw9tt" podStartSLOduration=3.553794872 podStartE2EDuration="8.615978961s" podCreationTimestamp="2025-12-09 11:42:44 +0000 UTC" firstStartedPulling="2025-12-09 11:42:46.523362089 +0000 UTC m=+6098.915413180" lastFinishedPulling="2025-12-09 11:42:51.585546188 +0000 UTC m=+6103.977597269" observedRunningTime="2025-12-09 11:42:52.613366371 +0000 UTC m=+6105.005417452" watchObservedRunningTime="2025-12-09 11:42:52.615978961 +0000 UTC m=+6105.008030042" Dec 09 11:42:54 crc kubenswrapper[5002]: I1209 11:42:54.460416 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fw9tt" Dec 09 11:42:54 crc kubenswrapper[5002]: I1209 11:42:54.461018 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fw9tt" Dec 09 11:42:55 crc kubenswrapper[5002]: I1209 11:42:55.574453 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fw9tt" podUID="245ee9b5-cf0d-4046-b4c8-666039a8eacc" containerName="registry-server" probeResult="failure" output=< Dec 09 11:42:55 crc kubenswrapper[5002]: timeout: failed to connect service ":50051" within 1s Dec 09 11:42:55 crc kubenswrapper[5002]: > Dec 09 11:42:56 crc kubenswrapper[5002]: I1209 11:42:56.064738 5002 scope.go:117] "RemoveContainer" containerID="39962d0376837cc534e6b0a62303166efdae767fb36cfb81ae7c7eb077d56c3e" Dec 09 11:42:56 crc kubenswrapper[5002]: E1209 11:42:56.065280 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:42:58 crc kubenswrapper[5002]: I1209 11:42:58.039379 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-zwq82"] Dec 09 11:42:58 crc kubenswrapper[5002]: I1209 11:42:58.047383 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-zwq82"] Dec 09 11:42:58 crc kubenswrapper[5002]: I1209 11:42:58.072050 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3c13469b-99fb-4eac-b05f-ec9c7f806022" path="/var/lib/kubelet/pods/3c13469b-99fb-4eac-b05f-ec9c7f806022/volumes" Dec 09 11:42:58 crc kubenswrapper[5002]: I1209 11:42:58.627153 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-2d286"] Dec 09 11:42:58 crc kubenswrapper[5002]: E1209 11:42:58.628016 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4050891-a1f6-4bf8-b612-d516dcac072a" containerName="horizon" Dec 09 11:42:58 crc kubenswrapper[5002]: I1209 11:42:58.628041 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4050891-a1f6-4bf8-b612-d516dcac072a" containerName="horizon" Dec 09 11:42:58 crc kubenswrapper[5002]: E1209 11:42:58.628073 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4050891-a1f6-4bf8-b612-d516dcac072a" containerName="horizon-log" Dec 09 11:42:58 crc kubenswrapper[5002]: I1209 11:42:58.628081 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4050891-a1f6-4bf8-b612-d516dcac072a" containerName="horizon-log" Dec 09 11:42:58 crc kubenswrapper[5002]: E1209 11:42:58.628100 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a24ce03-952b-4bbe-aa94-26132c1f7655" containerName="util" Dec 09 11:42:58 crc kubenswrapper[5002]: I1209 11:42:58.628109 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a24ce03-952b-4bbe-aa94-26132c1f7655" containerName="util" Dec 09 11:42:58 crc kubenswrapper[5002]: E1209 11:42:58.628129 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a24ce03-952b-4bbe-aa94-26132c1f7655" containerName="pull" Dec 09 11:42:58 crc kubenswrapper[5002]: I1209 11:42:58.628138 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a24ce03-952b-4bbe-aa94-26132c1f7655" containerName="pull" Dec 09 11:42:58 crc kubenswrapper[5002]: E1209 11:42:58.628164 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a24ce03-952b-4bbe-aa94-26132c1f7655" containerName="extract" Dec 09 11:42:58 crc kubenswrapper[5002]: I1209 11:42:58.628171 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a24ce03-952b-4bbe-aa94-26132c1f7655" containerName="extract" Dec 09 11:42:58 crc kubenswrapper[5002]: I1209 11:42:58.628420 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4050891-a1f6-4bf8-b612-d516dcac072a" containerName="horizon-log" Dec 09 11:42:58 crc kubenswrapper[5002]: I1209 11:42:58.628437 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a24ce03-952b-4bbe-aa94-26132c1f7655" containerName="extract" Dec 09 11:42:58 crc kubenswrapper[5002]: I1209 11:42:58.628457 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4050891-a1f6-4bf8-b612-d516dcac072a" containerName="horizon" Dec 09 11:42:58 crc kubenswrapper[5002]: I1209 11:42:58.629348 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-2d286" Dec 09 11:42:58 crc kubenswrapper[5002]: I1209 11:42:58.631629 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-z9hpp" Dec 09 11:42:58 crc kubenswrapper[5002]: I1209 11:42:58.631965 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Dec 09 11:42:58 crc kubenswrapper[5002]: I1209 11:42:58.632128 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Dec 09 11:42:58 crc kubenswrapper[5002]: I1209 11:42:58.728024 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-2d286"] Dec 09 11:42:58 crc kubenswrapper[5002]: I1209 11:42:58.778746 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-b764dd5f8-kswbr"] Dec 09 11:42:58 crc kubenswrapper[5002]: I1209 11:42:58.780435 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b764dd5f8-kswbr" Dec 09 11:42:58 crc kubenswrapper[5002]: I1209 11:42:58.782631 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-8t4l2" Dec 09 11:42:58 crc kubenswrapper[5002]: I1209 11:42:58.788190 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Dec 09 11:42:58 crc kubenswrapper[5002]: I1209 11:42:58.790457 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-b764dd5f8-8x8l9"] Dec 09 11:42:58 crc kubenswrapper[5002]: I1209 11:42:58.791669 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b764dd5f8-8x8l9" Dec 09 11:42:58 crc kubenswrapper[5002]: I1209 11:42:58.802985 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr42w\" (UniqueName: \"kubernetes.io/projected/37dc16a6-bb85-4730-bb3f-acebf2ad7404-kube-api-access-nr42w\") pod \"obo-prometheus-operator-668cf9dfbb-2d286\" (UID: \"37dc16a6-bb85-4730-bb3f-acebf2ad7404\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-2d286" Dec 09 11:42:58 crc kubenswrapper[5002]: I1209 11:42:58.809099 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-b764dd5f8-kswbr"] Dec 09 11:42:58 crc kubenswrapper[5002]: I1209 11:42:58.818701 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-b764dd5f8-8x8l9"] Dec 09 11:42:58 crc kubenswrapper[5002]: I1209 11:42:58.904517 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nr42w\" (UniqueName: \"kubernetes.io/projected/37dc16a6-bb85-4730-bb3f-acebf2ad7404-kube-api-access-nr42w\") pod \"obo-prometheus-operator-668cf9dfbb-2d286\" (UID: \"37dc16a6-bb85-4730-bb3f-acebf2ad7404\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-2d286" Dec 09 11:42:58 crc kubenswrapper[5002]: I1209 11:42:58.904615 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/183516b3-813f-49e0-9b21-0baa88a22cc6-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-b764dd5f8-8x8l9\" (UID: \"183516b3-813f-49e0-9b21-0baa88a22cc6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b764dd5f8-8x8l9" Dec 09 11:42:58 crc kubenswrapper[5002]: I1209 11:42:58.904677 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ec9ffe1c-2fa3-440d-a9de-de56929b29db-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-b764dd5f8-kswbr\" (UID: \"ec9ffe1c-2fa3-440d-a9de-de56929b29db\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b764dd5f8-kswbr" Dec 09 11:42:58 crc kubenswrapper[5002]: I1209 11:42:58.904843 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ec9ffe1c-2fa3-440d-a9de-de56929b29db-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-b764dd5f8-kswbr\" (UID: \"ec9ffe1c-2fa3-440d-a9de-de56929b29db\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b764dd5f8-kswbr" Dec 09 11:42:58 crc kubenswrapper[5002]: I1209 11:42:58.904875 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/183516b3-813f-49e0-9b21-0baa88a22cc6-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-b764dd5f8-8x8l9\" (UID: \"183516b3-813f-49e0-9b21-0baa88a22cc6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b764dd5f8-8x8l9" Dec 09 11:42:58 crc kubenswrapper[5002]: I1209 11:42:58.946493 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr42w\" (UniqueName: 
\"kubernetes.io/projected/37dc16a6-bb85-4730-bb3f-acebf2ad7404-kube-api-access-nr42w\") pod \"obo-prometheus-operator-668cf9dfbb-2d286\" (UID: \"37dc16a6-bb85-4730-bb3f-acebf2ad7404\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-2d286" Dec 09 11:42:58 crc kubenswrapper[5002]: I1209 11:42:58.960754 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-2d286" Dec 09 11:42:58 crc kubenswrapper[5002]: I1209 11:42:58.981894 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-qjfz8"] Dec 09 11:42:58 crc kubenswrapper[5002]: I1209 11:42:58.983637 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-qjfz8" Dec 09 11:42:58 crc kubenswrapper[5002]: I1209 11:42:58.996007 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-qjfz8"] Dec 09 11:42:58 crc kubenswrapper[5002]: I1209 11:42:58.998525 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Dec 09 11:42:59 crc kubenswrapper[5002]: I1209 11:42:59.018880 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-hbgdx" Dec 09 11:42:59 crc kubenswrapper[5002]: I1209 11:42:59.019476 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ec9ffe1c-2fa3-440d-a9de-de56929b29db-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-b764dd5f8-kswbr\" (UID: \"ec9ffe1c-2fa3-440d-a9de-de56929b29db\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b764dd5f8-kswbr" Dec 09 11:42:59 crc kubenswrapper[5002]: I1209 11:42:59.019519 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/183516b3-813f-49e0-9b21-0baa88a22cc6-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-b764dd5f8-8x8l9\" (UID: \"183516b3-813f-49e0-9b21-0baa88a22cc6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b764dd5f8-8x8l9" Dec 09 11:42:59 crc kubenswrapper[5002]: I1209 11:42:59.019627 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/183516b3-813f-49e0-9b21-0baa88a22cc6-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-b764dd5f8-8x8l9\" (UID: \"183516b3-813f-49e0-9b21-0baa88a22cc6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b764dd5f8-8x8l9" Dec 09 11:42:59 crc kubenswrapper[5002]: I1209 11:42:59.019666 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ec9ffe1c-2fa3-440d-a9de-de56929b29db-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-b764dd5f8-kswbr\" (UID: \"ec9ffe1c-2fa3-440d-a9de-de56929b29db\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b764dd5f8-kswbr" Dec 09 11:42:59 crc kubenswrapper[5002]: I1209 11:42:59.029946 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/183516b3-813f-49e0-9b21-0baa88a22cc6-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-b764dd5f8-8x8l9\" (UID: 
\"183516b3-813f-49e0-9b21-0baa88a22cc6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b764dd5f8-8x8l9" Dec 09 11:42:59 crc kubenswrapper[5002]: I1209 11:42:59.029951 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ec9ffe1c-2fa3-440d-a9de-de56929b29db-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-b764dd5f8-kswbr\" (UID: \"ec9ffe1c-2fa3-440d-a9de-de56929b29db\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b764dd5f8-kswbr" Dec 09 11:42:59 crc kubenswrapper[5002]: I1209 11:42:59.031388 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/183516b3-813f-49e0-9b21-0baa88a22cc6-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-b764dd5f8-8x8l9\" (UID: \"183516b3-813f-49e0-9b21-0baa88a22cc6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b764dd5f8-8x8l9" Dec 09 11:42:59 crc kubenswrapper[5002]: I1209 11:42:59.042798 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ec9ffe1c-2fa3-440d-a9de-de56929b29db-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-b764dd5f8-kswbr\" (UID: \"ec9ffe1c-2fa3-440d-a9de-de56929b29db\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b764dd5f8-kswbr" Dec 09 11:42:59 crc kubenswrapper[5002]: I1209 11:42:59.104593 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b764dd5f8-kswbr" Dec 09 11:42:59 crc kubenswrapper[5002]: I1209 11:42:59.118905 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b764dd5f8-8x8l9" Dec 09 11:42:59 crc kubenswrapper[5002]: I1209 11:42:59.120654 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxr8j\" (UniqueName: \"kubernetes.io/projected/f110f2fc-51fa-41da-8787-682fe8ceb5ff-kube-api-access-sxr8j\") pod \"observability-operator-d8bb48f5d-qjfz8\" (UID: \"f110f2fc-51fa-41da-8787-682fe8ceb5ff\") " pod="openshift-operators/observability-operator-d8bb48f5d-qjfz8" Dec 09 11:42:59 crc kubenswrapper[5002]: I1209 11:42:59.120773 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/f110f2fc-51fa-41da-8787-682fe8ceb5ff-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-qjfz8\" (UID: \"f110f2fc-51fa-41da-8787-682fe8ceb5ff\") " pod="openshift-operators/observability-operator-d8bb48f5d-qjfz8" Dec 09 11:42:59 crc kubenswrapper[5002]: I1209 11:42:59.223726 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxr8j\" (UniqueName: \"kubernetes.io/projected/f110f2fc-51fa-41da-8787-682fe8ceb5ff-kube-api-access-sxr8j\") pod \"observability-operator-d8bb48f5d-qjfz8\" (UID: \"f110f2fc-51fa-41da-8787-682fe8ceb5ff\") " pod="openshift-operators/observability-operator-d8bb48f5d-qjfz8" Dec 09 11:42:59 crc kubenswrapper[5002]: I1209 11:42:59.224039 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/f110f2fc-51fa-41da-8787-682fe8ceb5ff-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-qjfz8\" (UID: \"f110f2fc-51fa-41da-8787-682fe8ceb5ff\") " pod="openshift-operators/observability-operator-d8bb48f5d-qjfz8" Dec 09 11:42:59 crc kubenswrapper[5002]: I1209 11:42:59.243560 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/f110f2fc-51fa-41da-8787-682fe8ceb5ff-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-qjfz8\" (UID: \"f110f2fc-51fa-41da-8787-682fe8ceb5ff\") " pod="openshift-operators/observability-operator-d8bb48f5d-qjfz8" Dec 09 11:42:59 crc kubenswrapper[5002]: I1209 11:42:59.261553 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxr8j\" (UniqueName: \"kubernetes.io/projected/f110f2fc-51fa-41da-8787-682fe8ceb5ff-kube-api-access-sxr8j\") pod \"observability-operator-d8bb48f5d-qjfz8\" (UID: \"f110f2fc-51fa-41da-8787-682fe8ceb5ff\") " pod="openshift-operators/observability-operator-d8bb48f5d-qjfz8" Dec 09 11:42:59 crc kubenswrapper[5002]: I1209 11:42:59.294615 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5446b9c989-87dfp"] Dec 09 11:42:59 crc kubenswrapper[5002]: I1209 11:42:59.296596 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-87dfp" Dec 09 11:42:59 crc kubenswrapper[5002]: I1209 11:42:59.300695 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-xmq26" Dec 09 11:42:59 crc kubenswrapper[5002]: I1209 11:42:59.323164 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-87dfp"] Dec 09 11:42:59 crc kubenswrapper[5002]: I1209 11:42:59.421898 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-qjfz8" Dec 09 11:42:59 crc kubenswrapper[5002]: I1209 11:42:59.436884 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sngtc\" (UniqueName: \"kubernetes.io/projected/5cc80363-bb46-4085-b008-4dd0eecd9a09-kube-api-access-sngtc\") pod \"perses-operator-5446b9c989-87dfp\" (UID: \"5cc80363-bb46-4085-b008-4dd0eecd9a09\") " pod="openshift-operators/perses-operator-5446b9c989-87dfp" Dec 09 11:42:59 crc kubenswrapper[5002]: I1209 11:42:59.436982 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/5cc80363-bb46-4085-b008-4dd0eecd9a09-openshift-service-ca\") pod \"perses-operator-5446b9c989-87dfp\" (UID: \"5cc80363-bb46-4085-b008-4dd0eecd9a09\") " pod="openshift-operators/perses-operator-5446b9c989-87dfp" Dec 09 11:42:59 crc kubenswrapper[5002]: I1209 11:42:59.539198 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sngtc\" (UniqueName: \"kubernetes.io/projected/5cc80363-bb46-4085-b008-4dd0eecd9a09-kube-api-access-sngtc\") pod \"perses-operator-5446b9c989-87dfp\" (UID: \"5cc80363-bb46-4085-b008-4dd0eecd9a09\") " pod="openshift-operators/perses-operator-5446b9c989-87dfp" Dec 09 11:42:59 crc kubenswrapper[5002]: I1209 11:42:59.539287 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/5cc80363-bb46-4085-b008-4dd0eecd9a09-openshift-service-ca\") pod \"perses-operator-5446b9c989-87dfp\" (UID: \"5cc80363-bb46-4085-b008-4dd0eecd9a09\") " pod="openshift-operators/perses-operator-5446b9c989-87dfp" Dec 09 11:42:59 crc kubenswrapper[5002]: I1209 11:42:59.540252 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/5cc80363-bb46-4085-b008-4dd0eecd9a09-openshift-service-ca\") pod \"perses-operator-5446b9c989-87dfp\" (UID: \"5cc80363-bb46-4085-b008-4dd0eecd9a09\") " pod="openshift-operators/perses-operator-5446b9c989-87dfp" Dec 09 11:42:59 crc kubenswrapper[5002]: I1209 11:42:59.558132 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sngtc\" (UniqueName: \"kubernetes.io/projected/5cc80363-bb46-4085-b008-4dd0eecd9a09-kube-api-access-sngtc\") pod \"perses-operator-5446b9c989-87dfp\" (UID: \"5cc80363-bb46-4085-b008-4dd0eecd9a09\") " pod="openshift-operators/perses-operator-5446b9c989-87dfp" Dec 09 11:42:59 crc kubenswrapper[5002]: I1209 11:42:59.639490 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-2d286"] Dec 09 11:42:59 crc kubenswrapper[5002]: W1209 11:42:59.643628 5002 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37dc16a6_bb85_4730_bb3f_acebf2ad7404.slice/crio-f8c69b73ddec8fe5bb0a2d0d9f434398ea1861ab83b226a4621ebee5e65e954c WatchSource:0}: Error finding container f8c69b73ddec8fe5bb0a2d0d9f434398ea1861ab83b226a4621ebee5e65e954c: Status 404 returned error can't find the container with id f8c69b73ddec8fe5bb0a2d0d9f434398ea1861ab83b226a4621ebee5e65e954c Dec 09 11:42:59 crc kubenswrapper[5002]: I1209 11:42:59.647798 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-87dfp" Dec 09 11:42:59 crc kubenswrapper[5002]: I1209 11:42:59.648890 5002 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 11:42:59 crc kubenswrapper[5002]: W1209 11:42:59.651396 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec9ffe1c_2fa3_440d_a9de_de56929b29db.slice/crio-f1ce9a11c4d55057b4c4787662b4d0ed63756141af4afcc378e42d62b36aa520 WatchSource:0}: Error finding container f1ce9a11c4d55057b4c4787662b4d0ed63756141af4afcc378e42d62b36aa520: Status 404 returned error can't find the container with id f1ce9a11c4d55057b4c4787662b4d0ed63756141af4afcc378e42d62b36aa520 Dec 09 11:42:59 crc kubenswrapper[5002]: I1209 11:42:59.653265 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-b764dd5f8-kswbr"] Dec 09 11:42:59 crc kubenswrapper[5002]: I1209 11:42:59.660321 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-2d286" event={"ID":"37dc16a6-bb85-4730-bb3f-acebf2ad7404","Type":"ContainerStarted","Data":"f8c69b73ddec8fe5bb0a2d0d9f434398ea1861ab83b226a4621ebee5e65e954c"} Dec 09 11:42:59 crc kubenswrapper[5002]: I1209 11:42:59.912561 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-qjfz8"] Dec 09 11:42:59 crc kubenswrapper[5002]: W1209 11:42:59.943075 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf110f2fc_51fa_41da_8787_682fe8ceb5ff.slice/crio-8d92dda09a2939189a01080439a69c8baf0f4c9b76480a27cf04e47a20bd8a04 WatchSource:0}: Error finding container 8d92dda09a2939189a01080439a69c8baf0f4c9b76480a27cf04e47a20bd8a04: Status 404 returned error can't find the container with id 8d92dda09a2939189a01080439a69c8baf0f4c9b76480a27cf04e47a20bd8a04 Dec 09 11:43:00 crc kubenswrapper[5002]: I1209 11:43:00.064019 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-b764dd5f8-8x8l9"] Dec 09 11:43:00 crc kubenswrapper[5002]: W1209 11:43:00.453710 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5cc80363_bb46_4085_b008_4dd0eecd9a09.slice/crio-b27e0b6f68f141fca17032a362bd7588f0c586d310e874da83f4dbac576faed5 WatchSource:0}: Error finding container b27e0b6f68f141fca17032a362bd7588f0c586d310e874da83f4dbac576faed5: Status 404 returned error can't find the container with id b27e0b6f68f141fca17032a362bd7588f0c586d310e874da83f4dbac576faed5 Dec 09 11:43:00 crc kubenswrapper[5002]: I1209 11:43:00.454240 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-87dfp"] Dec 09 11:43:00 crc kubenswrapper[5002]: I1209 
11:43:00.677185 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b764dd5f8-8x8l9" event={"ID":"183516b3-813f-49e0-9b21-0baa88a22cc6","Type":"ContainerStarted","Data":"1305b7507ccba8821e82d558a2673d3a48cf6b610749b58b4fcd46a6b3d90c82"} Dec 09 11:43:00 crc kubenswrapper[5002]: I1209 11:43:00.678451 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-87dfp" event={"ID":"5cc80363-bb46-4085-b008-4dd0eecd9a09","Type":"ContainerStarted","Data":"b27e0b6f68f141fca17032a362bd7588f0c586d310e874da83f4dbac576faed5"} Dec 09 11:43:00 crc kubenswrapper[5002]: I1209 11:43:00.679997 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-qjfz8" event={"ID":"f110f2fc-51fa-41da-8787-682fe8ceb5ff","Type":"ContainerStarted","Data":"8d92dda09a2939189a01080439a69c8baf0f4c9b76480a27cf04e47a20bd8a04"} Dec 09 11:43:00 crc kubenswrapper[5002]: I1209 11:43:00.681030 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b764dd5f8-kswbr" event={"ID":"ec9ffe1c-2fa3-440d-a9de-de56929b29db","Type":"ContainerStarted","Data":"f1ce9a11c4d55057b4c4787662b4d0ed63756141af4afcc378e42d62b36aa520"} Dec 09 11:43:05 crc kubenswrapper[5002]: I1209 11:43:05.527899 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fw9tt" podUID="245ee9b5-cf0d-4046-b4c8-666039a8eacc" containerName="registry-server" probeResult="failure" output=< Dec 09 11:43:05 crc kubenswrapper[5002]: timeout: failed to connect service ":50051" within 1s Dec 09 11:43:05 crc kubenswrapper[5002]: > Dec 09 11:43:08 crc kubenswrapper[5002]: I1209 11:43:08.080676 5002 scope.go:117] "RemoveContainer" containerID="39962d0376837cc534e6b0a62303166efdae767fb36cfb81ae7c7eb077d56c3e" Dec 09 11:43:08 crc kubenswrapper[5002]: E1209 11:43:08.081383 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:43:10 crc kubenswrapper[5002]: I1209 11:43:10.835899 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-87dfp" event={"ID":"5cc80363-bb46-4085-b008-4dd0eecd9a09","Type":"ContainerStarted","Data":"852c272236ca66a2da6a652dfc6d5c6cb69346b19bc191c6575ab9c76f74ae31"} Dec 09 11:43:10 crc kubenswrapper[5002]: I1209 11:43:10.836456 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5446b9c989-87dfp" Dec 09 11:43:10 crc kubenswrapper[5002]: I1209 11:43:10.837428 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-qjfz8" event={"ID":"f110f2fc-51fa-41da-8787-682fe8ceb5ff","Type":"ContainerStarted","Data":"f17efc3ddc31e2a13708950461e847da5333777d4b0b60ec71056b22c75384e7"} Dec 09 11:43:10 crc kubenswrapper[5002]: I1209 11:43:10.838712 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-d8bb48f5d-qjfz8" Dec 09 11:43:10 crc kubenswrapper[5002]: I1209 11:43:10.841060 5002 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b764dd5f8-kswbr" event={"ID":"ec9ffe1c-2fa3-440d-a9de-de56929b29db","Type":"ContainerStarted","Data":"9336885d32402783151efd3b123712504995574f92f0c7ab61b360994ca6e8e8"} Dec 09 11:43:10 crc kubenswrapper[5002]: I1209 11:43:10.842903 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-d8bb48f5d-qjfz8" Dec 09 11:43:10 crc kubenswrapper[5002]: I1209 11:43:10.842932 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b764dd5f8-8x8l9" event={"ID":"183516b3-813f-49e0-9b21-0baa88a22cc6","Type":"ContainerStarted","Data":"3b7f06643f739e16883fc83e7884199dd77350708454ab43345a02d74dc05e0c"} Dec 09 11:43:10 crc kubenswrapper[5002]: I1209 11:43:10.844689 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-2d286" event={"ID":"37dc16a6-bb85-4730-bb3f-acebf2ad7404","Type":"ContainerStarted","Data":"fcbb6b4bc58d1c7da84cbaeaaa8cfdbb92494d8a574dcad0f28aba7f9315b906"} Dec 09 11:43:10 crc kubenswrapper[5002]: I1209 11:43:10.864356 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5446b9c989-87dfp" podStartSLOduration=2.453095886 podStartE2EDuration="11.864338347s" podCreationTimestamp="2025-12-09 11:42:59 +0000 UTC" firstStartedPulling="2025-12-09 11:43:00.459554559 +0000 UTC m=+6112.851605640" lastFinishedPulling="2025-12-09 11:43:09.87079702 +0000 UTC m=+6122.262848101" observedRunningTime="2025-12-09 11:43:10.856032435 +0000 UTC m=+6123.248083516" watchObservedRunningTime="2025-12-09 11:43:10.864338347 +0000 UTC m=+6123.256389428" Dec 09 11:43:10 crc kubenswrapper[5002]: I1209 11:43:10.874986 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-2d286" podStartSLOduration=2.654185675 podStartE2EDuration="12.874971541s" podCreationTimestamp="2025-12-09 11:42:58 +0000 UTC" firstStartedPulling="2025-12-09 11:42:59.648462053 +0000 UTC m=+6112.040513134" lastFinishedPulling="2025-12-09 11:43:09.869247919 +0000 UTC m=+6122.261299000" observedRunningTime="2025-12-09 11:43:10.872745292 +0000 UTC m=+6123.264796373" watchObservedRunningTime="2025-12-09 11:43:10.874971541 +0000 UTC m=+6123.267022622" Dec 09 11:43:10 crc kubenswrapper[5002]: I1209 11:43:10.919478 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b764dd5f8-kswbr" podStartSLOduration=2.711953011 podStartE2EDuration="12.919453331s" podCreationTimestamp="2025-12-09 11:42:58 +0000 UTC" firstStartedPulling="2025-12-09 11:42:59.661980985 +0000 UTC m=+6112.054032066" lastFinishedPulling="2025-12-09 11:43:09.869481295 +0000 UTC m=+6122.261532386" observedRunningTime="2025-12-09 11:43:10.899507778 +0000 UTC m=+6123.291558869" watchObservedRunningTime="2025-12-09 11:43:10.919453331 +0000 UTC m=+6123.311504412" Dec 09 11:43:10 crc kubenswrapper[5002]: I1209 11:43:10.990420 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-d8bb48f5d-qjfz8" podStartSLOduration=3.042255356 podStartE2EDuration="12.990400979s" podCreationTimestamp="2025-12-09 11:42:58 +0000 UTC" firstStartedPulling="2025-12-09 11:42:59.953605605 +0000 UTC m=+6112.345656686" lastFinishedPulling="2025-12-09 
11:43:09.901751218 +0000 UTC m=+6122.293802309" observedRunningTime="2025-12-09 11:43:10.98034689 +0000 UTC m=+6123.372397991" watchObservedRunningTime="2025-12-09 11:43:10.990400979 +0000 UTC m=+6123.382452060" Dec 09 11:43:15 crc kubenswrapper[5002]: I1209 11:43:15.519135 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fw9tt" podUID="245ee9b5-cf0d-4046-b4c8-666039a8eacc" containerName="registry-server" probeResult="failure" output=< Dec 09 11:43:15 crc kubenswrapper[5002]: timeout: failed to connect service ":50051" within 1s Dec 09 11:43:15 crc kubenswrapper[5002]: > Dec 09 11:43:19 crc kubenswrapper[5002]: I1209 11:43:19.651971 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5446b9c989-87dfp" Dec 09 11:43:19 crc kubenswrapper[5002]: I1209 11:43:19.675344 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b764dd5f8-8x8l9" podStartSLOduration=11.921569438 podStartE2EDuration="21.675318861s" podCreationTimestamp="2025-12-09 11:42:58 +0000 UTC" firstStartedPulling="2025-12-09 11:43:00.109377632 +0000 UTC m=+6112.501428713" lastFinishedPulling="2025-12-09 11:43:09.863127045 +0000 UTC m=+6122.255178136" observedRunningTime="2025-12-09 11:43:11.046342705 +0000 UTC m=+6123.438393786" watchObservedRunningTime="2025-12-09 11:43:19.675318861 +0000 UTC m=+6132.067369932" Dec 09 11:43:20 crc kubenswrapper[5002]: I1209 11:43:20.060048 5002 scope.go:117] "RemoveContainer" containerID="39962d0376837cc534e6b0a62303166efdae767fb36cfb81ae7c7eb077d56c3e" Dec 09 11:43:20 crc kubenswrapper[5002]: E1209 11:43:20.060325 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:43:22 crc kubenswrapper[5002]: I1209 11:43:22.833583 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 09 11:43:22 crc kubenswrapper[5002]: I1209 11:43:22.834190 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="7c20bcd3-3ee7-4073-92e9-dd7d79e048b6" containerName="openstackclient" containerID="cri-o://11d3bbafb80743c1c3019622d4bf1eba4fc2afb99b128f8cfabc5b6c598e0ba9" gracePeriod=2 Dec 09 11:43:22 crc kubenswrapper[5002]: I1209 11:43:22.878882 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 09 11:43:22 crc kubenswrapper[5002]: I1209 11:43:22.919867 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 09 11:43:22 crc kubenswrapper[5002]: E1209 11:43:22.920319 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c20bcd3-3ee7-4073-92e9-dd7d79e048b6" containerName="openstackclient" Dec 09 11:43:22 crc kubenswrapper[5002]: I1209 11:43:22.920336 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c20bcd3-3ee7-4073-92e9-dd7d79e048b6" containerName="openstackclient" Dec 09 11:43:22 crc kubenswrapper[5002]: I1209 11:43:22.920539 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c20bcd3-3ee7-4073-92e9-dd7d79e048b6" 
containerName="openstackclient" Dec 09 11:43:22 crc kubenswrapper[5002]: I1209 11:43:22.921213 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 09 11:43:22 crc kubenswrapper[5002]: I1209 11:43:22.933286 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2bdcec8a-0ec1-47ef-986c-3ad7d4ed7b4b-openstack-config-secret\") pod \"openstackclient\" (UID: \"2bdcec8a-0ec1-47ef-986c-3ad7d4ed7b4b\") " pod="openstack/openstackclient" Dec 09 11:43:22 crc kubenswrapper[5002]: I1209 11:43:22.933401 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2bdcec8a-0ec1-47ef-986c-3ad7d4ed7b4b-openstack-config\") pod \"openstackclient\" (UID: \"2bdcec8a-0ec1-47ef-986c-3ad7d4ed7b4b\") " pod="openstack/openstackclient" Dec 09 11:43:22 crc kubenswrapper[5002]: I1209 11:43:22.933443 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s2qq\" (UniqueName: \"kubernetes.io/projected/2bdcec8a-0ec1-47ef-986c-3ad7d4ed7b4b-kube-api-access-6s2qq\") pod \"openstackclient\" (UID: \"2bdcec8a-0ec1-47ef-986c-3ad7d4ed7b4b\") " pod="openstack/openstackclient" Dec 09 11:43:22 crc kubenswrapper[5002]: I1209 11:43:22.980430 5002 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="7c20bcd3-3ee7-4073-92e9-dd7d79e048b6" podUID="2bdcec8a-0ec1-47ef-986c-3ad7d4ed7b4b" Dec 09 11:43:22 crc kubenswrapper[5002]: I1209 11:43:22.997029 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 09 11:43:23 crc kubenswrapper[5002]: I1209 11:43:23.111460 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2bdcec8a-0ec1-47ef-986c-3ad7d4ed7b4b-openstack-config-secret\") pod \"openstackclient\" (UID: \"2bdcec8a-0ec1-47ef-986c-3ad7d4ed7b4b\") " pod="openstack/openstackclient" Dec 09 11:43:23 crc kubenswrapper[5002]: I1209 11:43:23.111793 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2bdcec8a-0ec1-47ef-986c-3ad7d4ed7b4b-openstack-config\") pod \"openstackclient\" (UID: \"2bdcec8a-0ec1-47ef-986c-3ad7d4ed7b4b\") " pod="openstack/openstackclient" Dec 09 11:43:23 crc kubenswrapper[5002]: I1209 11:43:23.123121 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s2qq\" (UniqueName: \"kubernetes.io/projected/2bdcec8a-0ec1-47ef-986c-3ad7d4ed7b4b-kube-api-access-6s2qq\") pod \"openstackclient\" (UID: \"2bdcec8a-0ec1-47ef-986c-3ad7d4ed7b4b\") " pod="openstack/openstackclient" Dec 09 11:43:23 crc kubenswrapper[5002]: I1209 11:43:23.125830 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2bdcec8a-0ec1-47ef-986c-3ad7d4ed7b4b-openstack-config\") pod \"openstackclient\" (UID: \"2bdcec8a-0ec1-47ef-986c-3ad7d4ed7b4b\") " pod="openstack/openstackclient" Dec 09 11:43:23 crc kubenswrapper[5002]: I1209 11:43:23.159729 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/2bdcec8a-0ec1-47ef-986c-3ad7d4ed7b4b-openstack-config-secret\") pod \"openstackclient\" (UID: \"2bdcec8a-0ec1-47ef-986c-3ad7d4ed7b4b\") " pod="openstack/openstackclient" Dec 09 11:43:23 crc kubenswrapper[5002]: I1209 11:43:23.202259 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s2qq\" (UniqueName: \"kubernetes.io/projected/2bdcec8a-0ec1-47ef-986c-3ad7d4ed7b4b-kube-api-access-6s2qq\") pod \"openstackclient\" (UID: \"2bdcec8a-0ec1-47ef-986c-3ad7d4ed7b4b\") " pod="openstack/openstackclient" Dec 09 11:43:23 crc kubenswrapper[5002]: I1209 11:43:23.220666 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 09 11:43:23 crc kubenswrapper[5002]: I1209 11:43:23.222003 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 09 11:43:23 crc kubenswrapper[5002]: I1209 11:43:23.237438 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-p57b6" Dec 09 11:43:23 crc kubenswrapper[5002]: I1209 11:43:23.285636 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 09 11:43:23 crc kubenswrapper[5002]: I1209 11:43:23.286534 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 09 11:43:23 crc kubenswrapper[5002]: I1209 11:43:23.331207 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdbg2\" (UniqueName: \"kubernetes.io/projected/dedd9325-8ad0-49bf-9e6d-f14ea11b47dc-kube-api-access-vdbg2\") pod \"kube-state-metrics-0\" (UID: \"dedd9325-8ad0-49bf-9e6d-f14ea11b47dc\") " pod="openstack/kube-state-metrics-0" Dec 09 11:43:23 crc kubenswrapper[5002]: I1209 11:43:23.434733 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdbg2\" (UniqueName: \"kubernetes.io/projected/dedd9325-8ad0-49bf-9e6d-f14ea11b47dc-kube-api-access-vdbg2\") pod \"kube-state-metrics-0\" (UID: \"dedd9325-8ad0-49bf-9e6d-f14ea11b47dc\") " pod="openstack/kube-state-metrics-0" Dec 09 11:43:23 crc kubenswrapper[5002]: I1209 11:43:23.465163 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdbg2\" (UniqueName: \"kubernetes.io/projected/dedd9325-8ad0-49bf-9e6d-f14ea11b47dc-kube-api-access-vdbg2\") pod \"kube-state-metrics-0\" (UID: \"dedd9325-8ad0-49bf-9e6d-f14ea11b47dc\") " pod="openstack/kube-state-metrics-0" Dec 09 11:43:23 crc kubenswrapper[5002]: I1209 11:43:23.578280 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 09 11:43:24 crc kubenswrapper[5002]: I1209 11:43:24.100212 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Dec 09 11:43:24 crc kubenswrapper[5002]: I1209 11:43:24.108046 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Dec 09 11:43:24 crc kubenswrapper[5002]: I1209 11:43:24.109899 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Dec 09 11:43:24 crc kubenswrapper[5002]: I1209 11:43:24.115723 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Dec 09 11:43:24 crc kubenswrapper[5002]: I1209 11:43:24.116153 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Dec 09 11:43:24 crc kubenswrapper[5002]: I1209 11:43:24.116314 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Dec 09 11:43:24 crc kubenswrapper[5002]: I1209 11:43:24.116457 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-wzqdn" Dec 09 11:43:24 crc kubenswrapper[5002]: I1209 11:43:24.116585 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Dec 09 11:43:24 crc kubenswrapper[5002]: I1209 11:43:24.157025 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/3fe6db24-e908-49ab-be4c-5482809166e0-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"3fe6db24-e908-49ab-be4c-5482809166e0\") " pod="openstack/alertmanager-metric-storage-0" Dec 09 11:43:24 crc kubenswrapper[5002]: I1209 11:43:24.157318 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3fe6db24-e908-49ab-be4c-5482809166e0-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"3fe6db24-e908-49ab-be4c-5482809166e0\") " pod="openstack/alertmanager-metric-storage-0" Dec 09 11:43:24 crc kubenswrapper[5002]: I1209 11:43:24.157537 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3fe6db24-e908-49ab-be4c-5482809166e0-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"3fe6db24-e908-49ab-be4c-5482809166e0\") " pod="openstack/alertmanager-metric-storage-0" Dec 09 11:43:24 crc kubenswrapper[5002]: I1209 11:43:24.157722 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/3fe6db24-e908-49ab-be4c-5482809166e0-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"3fe6db24-e908-49ab-be4c-5482809166e0\") " pod="openstack/alertmanager-metric-storage-0" Dec 09 11:43:24 crc kubenswrapper[5002]: I1209 11:43:24.157876 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v44q\" (UniqueName: \"kubernetes.io/projected/3fe6db24-e908-49ab-be4c-5482809166e0-kube-api-access-9v44q\") pod \"alertmanager-metric-storage-0\" (UID: \"3fe6db24-e908-49ab-be4c-5482809166e0\") " pod="openstack/alertmanager-metric-storage-0" Dec 09 11:43:24 crc kubenswrapper[5002]: I1209 11:43:24.158055 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3fe6db24-e908-49ab-be4c-5482809166e0-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"3fe6db24-e908-49ab-be4c-5482809166e0\") " pod="openstack/alertmanager-metric-storage-0" Dec 09 11:43:24 crc kubenswrapper[5002]: I1209 11:43:24.158219 5002 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/3fe6db24-e908-49ab-be4c-5482809166e0-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"3fe6db24-e908-49ab-be4c-5482809166e0\") " pod="openstack/alertmanager-metric-storage-0" Dec 09 11:43:24 crc kubenswrapper[5002]: I1209 11:43:24.254884 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 09 11:43:24 crc kubenswrapper[5002]: I1209 11:43:24.259532 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3fe6db24-e908-49ab-be4c-5482809166e0-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"3fe6db24-e908-49ab-be4c-5482809166e0\") " pod="openstack/alertmanager-metric-storage-0" Dec 09 11:43:24 crc kubenswrapper[5002]: I1209 11:43:24.259612 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/3fe6db24-e908-49ab-be4c-5482809166e0-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"3fe6db24-e908-49ab-be4c-5482809166e0\") " pod="openstack/alertmanager-metric-storage-0" Dec 09 11:43:24 crc kubenswrapper[5002]: I1209 11:43:24.259639 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v44q\" (UniqueName: \"kubernetes.io/projected/3fe6db24-e908-49ab-be4c-5482809166e0-kube-api-access-9v44q\") pod \"alertmanager-metric-storage-0\" (UID: \"3fe6db24-e908-49ab-be4c-5482809166e0\") " pod="openstack/alertmanager-metric-storage-0" Dec 09 11:43:24 crc kubenswrapper[5002]: I1209 11:43:24.259670 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3fe6db24-e908-49ab-be4c-5482809166e0-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"3fe6db24-e908-49ab-be4c-5482809166e0\") " pod="openstack/alertmanager-metric-storage-0" Dec 09 11:43:24 crc kubenswrapper[5002]: I1209 11:43:24.259724 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/3fe6db24-e908-49ab-be4c-5482809166e0-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"3fe6db24-e908-49ab-be4c-5482809166e0\") " pod="openstack/alertmanager-metric-storage-0" Dec 09 11:43:24 crc kubenswrapper[5002]: I1209 11:43:24.259743 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/3fe6db24-e908-49ab-be4c-5482809166e0-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"3fe6db24-e908-49ab-be4c-5482809166e0\") " pod="openstack/alertmanager-metric-storage-0" Dec 09 11:43:24 crc kubenswrapper[5002]: I1209 11:43:24.259782 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3fe6db24-e908-49ab-be4c-5482809166e0-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"3fe6db24-e908-49ab-be4c-5482809166e0\") " pod="openstack/alertmanager-metric-storage-0" Dec 09 11:43:24 crc kubenswrapper[5002]: I1209 11:43:24.270305 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/3fe6db24-e908-49ab-be4c-5482809166e0-alertmanager-metric-storage-db\") pod 
\"alertmanager-metric-storage-0\" (UID: \"3fe6db24-e908-49ab-be4c-5482809166e0\") " pod="openstack/alertmanager-metric-storage-0" Dec 09 11:43:24 crc kubenswrapper[5002]: I1209 11:43:24.280074 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/3fe6db24-e908-49ab-be4c-5482809166e0-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"3fe6db24-e908-49ab-be4c-5482809166e0\") " pod="openstack/alertmanager-metric-storage-0" Dec 09 11:43:24 crc kubenswrapper[5002]: I1209 11:43:24.282131 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/3fe6db24-e908-49ab-be4c-5482809166e0-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"3fe6db24-e908-49ab-be4c-5482809166e0\") " pod="openstack/alertmanager-metric-storage-0" Dec 09 11:43:24 crc kubenswrapper[5002]: I1209 11:43:24.283203 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3fe6db24-e908-49ab-be4c-5482809166e0-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"3fe6db24-e908-49ab-be4c-5482809166e0\") " pod="openstack/alertmanager-metric-storage-0" Dec 09 11:43:24 crc kubenswrapper[5002]: I1209 11:43:24.284774 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v44q\" (UniqueName: \"kubernetes.io/projected/3fe6db24-e908-49ab-be4c-5482809166e0-kube-api-access-9v44q\") pod \"alertmanager-metric-storage-0\" (UID: \"3fe6db24-e908-49ab-be4c-5482809166e0\") " pod="openstack/alertmanager-metric-storage-0" Dec 09 11:43:24 crc kubenswrapper[5002]: I1209 11:43:24.290002 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3fe6db24-e908-49ab-be4c-5482809166e0-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"3fe6db24-e908-49ab-be4c-5482809166e0\") " pod="openstack/alertmanager-metric-storage-0" Dec 09 11:43:24 crc kubenswrapper[5002]: I1209 11:43:24.308204 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3fe6db24-e908-49ab-be4c-5482809166e0-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"3fe6db24-e908-49ab-be4c-5482809166e0\") " pod="openstack/alertmanager-metric-storage-0" Dec 09 11:43:24 crc kubenswrapper[5002]: I1209 11:43:24.443100 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Dec 09 11:43:24 crc kubenswrapper[5002]: I1209 11:43:24.510132 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fw9tt" Dec 09 11:43:24 crc kubenswrapper[5002]: I1209 11:43:24.550935 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 09 11:43:24 crc kubenswrapper[5002]: W1209 11:43:24.583544 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddedd9325_8ad0_49bf_9e6d_f14ea11b47dc.slice/crio-135dc321617eaa924d0c8111a18e65923d745c8af4670fe318bd283f24a24c3f WatchSource:0}: Error finding container 135dc321617eaa924d0c8111a18e65923d745c8af4670fe318bd283f24a24c3f: Status 404 returned error can't find the container with id 135dc321617eaa924d0c8111a18e65923d745c8af4670fe318bd283f24a24c3f Dec 09 11:43:24 crc kubenswrapper[5002]: I1209 11:43:24.608055 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fw9tt" Dec 09 11:43:24 crc kubenswrapper[5002]: I1209 11:43:24.990863 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 09 11:43:24 crc kubenswrapper[5002]: I1209 11:43:24.995319 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 09 11:43:24 crc kubenswrapper[5002]: I1209 11:43:24.997928 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 09 11:43:24 crc kubenswrapper[5002]: I1209 11:43:24.999217 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 09 11:43:24 crc kubenswrapper[5002]: I1209 11:43:24.999440 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 09 11:43:24 crc kubenswrapper[5002]: I1209 11:43:24.999519 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-fhlz8" Dec 09 11:43:24 crc kubenswrapper[5002]: I1209 11:43:24.999606 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 09 11:43:24 crc kubenswrapper[5002]: I1209 11:43:24.999768 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 09 11:43:25 crc kubenswrapper[5002]: I1209 11:43:25.123803 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"dedd9325-8ad0-49bf-9e6d-f14ea11b47dc","Type":"ContainerStarted","Data":"135dc321617eaa924d0c8111a18e65923d745c8af4670fe318bd283f24a24c3f"} Dec 09 11:43:25 crc kubenswrapper[5002]: I1209 11:43:25.131006 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"2bdcec8a-0ec1-47ef-986c-3ad7d4ed7b4b","Type":"ContainerStarted","Data":"9566857fb096b2ff66bff793e990f4d0af2676ece410b3e561498a18e5c88bb9"} Dec 09 11:43:25 crc kubenswrapper[5002]: I1209 11:43:25.179623 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qw6x\" (UniqueName: \"kubernetes.io/projected/49ee88e1-200f-4c9b-8411-cedfd8cf9afb-kube-api-access-7qw6x\") pod \"prometheus-metric-storage-0\" (UID: 
\"49ee88e1-200f-4c9b-8411-cedfd8cf9afb\") " pod="openstack/prometheus-metric-storage-0" Dec 09 11:43:25 crc kubenswrapper[5002]: I1209 11:43:25.179707 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/49ee88e1-200f-4c9b-8411-cedfd8cf9afb-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"49ee88e1-200f-4c9b-8411-cedfd8cf9afb\") " pod="openstack/prometheus-metric-storage-0" Dec 09 11:43:25 crc kubenswrapper[5002]: I1209 11:43:25.179790 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/49ee88e1-200f-4c9b-8411-cedfd8cf9afb-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"49ee88e1-200f-4c9b-8411-cedfd8cf9afb\") " pod="openstack/prometheus-metric-storage-0" Dec 09 11:43:25 crc kubenswrapper[5002]: I1209 11:43:25.180283 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/49ee88e1-200f-4c9b-8411-cedfd8cf9afb-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"49ee88e1-200f-4c9b-8411-cedfd8cf9afb\") " pod="openstack/prometheus-metric-storage-0" Dec 09 11:43:25 crc kubenswrapper[5002]: I1209 11:43:25.180515 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-26ffea12-027a-4123-bcd4-a01999aa3d5e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-26ffea12-027a-4123-bcd4-a01999aa3d5e\") pod \"prometheus-metric-storage-0\" (UID: \"49ee88e1-200f-4c9b-8411-cedfd8cf9afb\") " pod="openstack/prometheus-metric-storage-0" Dec 09 11:43:25 crc kubenswrapper[5002]: I1209 11:43:25.180549 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/49ee88e1-200f-4c9b-8411-cedfd8cf9afb-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"49ee88e1-200f-4c9b-8411-cedfd8cf9afb\") " pod="openstack/prometheus-metric-storage-0" Dec 09 11:43:25 crc kubenswrapper[5002]: I1209 11:43:25.180577 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/49ee88e1-200f-4c9b-8411-cedfd8cf9afb-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"49ee88e1-200f-4c9b-8411-cedfd8cf9afb\") " pod="openstack/prometheus-metric-storage-0" Dec 09 11:43:25 crc kubenswrapper[5002]: I1209 11:43:25.180634 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/49ee88e1-200f-4c9b-8411-cedfd8cf9afb-config\") pod \"prometheus-metric-storage-0\" (UID: \"49ee88e1-200f-4c9b-8411-cedfd8cf9afb\") " pod="openstack/prometheus-metric-storage-0" Dec 09 11:43:25 crc kubenswrapper[5002]: I1209 11:43:25.187339 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Dec 09 11:43:25 crc kubenswrapper[5002]: I1209 11:43:25.200904 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fw9tt"] Dec 09 11:43:25 crc kubenswrapper[5002]: I1209 11:43:25.282343 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-26ffea12-027a-4123-bcd4-a01999aa3d5e\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-26ffea12-027a-4123-bcd4-a01999aa3d5e\") pod \"prometheus-metric-storage-0\" (UID: \"49ee88e1-200f-4c9b-8411-cedfd8cf9afb\") " pod="openstack/prometheus-metric-storage-0" Dec 09 11:43:25 crc kubenswrapper[5002]: I1209 11:43:25.282404 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/49ee88e1-200f-4c9b-8411-cedfd8cf9afb-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"49ee88e1-200f-4c9b-8411-cedfd8cf9afb\") " pod="openstack/prometheus-metric-storage-0" Dec 09 11:43:25 crc kubenswrapper[5002]: I1209 11:43:25.282427 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/49ee88e1-200f-4c9b-8411-cedfd8cf9afb-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"49ee88e1-200f-4c9b-8411-cedfd8cf9afb\") " pod="openstack/prometheus-metric-storage-0" Dec 09 11:43:25 crc kubenswrapper[5002]: I1209 11:43:25.282460 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/49ee88e1-200f-4c9b-8411-cedfd8cf9afb-config\") pod \"prometheus-metric-storage-0\" (UID: \"49ee88e1-200f-4c9b-8411-cedfd8cf9afb\") " pod="openstack/prometheus-metric-storage-0" Dec 09 11:43:25 crc kubenswrapper[5002]: I1209 11:43:25.282527 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qw6x\" (UniqueName: \"kubernetes.io/projected/49ee88e1-200f-4c9b-8411-cedfd8cf9afb-kube-api-access-7qw6x\") pod \"prometheus-metric-storage-0\" (UID: \"49ee88e1-200f-4c9b-8411-cedfd8cf9afb\") " pod="openstack/prometheus-metric-storage-0" Dec 09 11:43:25 crc kubenswrapper[5002]: I1209 11:43:25.282553 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/49ee88e1-200f-4c9b-8411-cedfd8cf9afb-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"49ee88e1-200f-4c9b-8411-cedfd8cf9afb\") " pod="openstack/prometheus-metric-storage-0" Dec 09 11:43:25 crc kubenswrapper[5002]: I1209 11:43:25.282584 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/49ee88e1-200f-4c9b-8411-cedfd8cf9afb-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"49ee88e1-200f-4c9b-8411-cedfd8cf9afb\") " pod="openstack/prometheus-metric-storage-0" Dec 09 11:43:25 crc kubenswrapper[5002]: I1209 11:43:25.282639 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/49ee88e1-200f-4c9b-8411-cedfd8cf9afb-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"49ee88e1-200f-4c9b-8411-cedfd8cf9afb\") " pod="openstack/prometheus-metric-storage-0" Dec 09 11:43:25 crc kubenswrapper[5002]: I1209 11:43:25.287848 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/49ee88e1-200f-4c9b-8411-cedfd8cf9afb-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"49ee88e1-200f-4c9b-8411-cedfd8cf9afb\") " pod="openstack/prometheus-metric-storage-0" Dec 09 11:43:25 crc kubenswrapper[5002]: I1209 11:43:25.288029 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/49ee88e1-200f-4c9b-8411-cedfd8cf9afb-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"49ee88e1-200f-4c9b-8411-cedfd8cf9afb\") " pod="openstack/prometheus-metric-storage-0" Dec 09 11:43:25 crc kubenswrapper[5002]: I1209 11:43:25.288234 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/49ee88e1-200f-4c9b-8411-cedfd8cf9afb-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"49ee88e1-200f-4c9b-8411-cedfd8cf9afb\") " pod="openstack/prometheus-metric-storage-0" Dec 09 11:43:25 crc kubenswrapper[5002]: I1209 11:43:25.288336 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/49ee88e1-200f-4c9b-8411-cedfd8cf9afb-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"49ee88e1-200f-4c9b-8411-cedfd8cf9afb\") " pod="openstack/prometheus-metric-storage-0" Dec 09 11:43:25 crc kubenswrapper[5002]: I1209 11:43:25.288743 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/49ee88e1-200f-4c9b-8411-cedfd8cf9afb-config\") pod \"prometheus-metric-storage-0\" (UID: \"49ee88e1-200f-4c9b-8411-cedfd8cf9afb\") " pod="openstack/prometheus-metric-storage-0" Dec 09 11:43:25 crc kubenswrapper[5002]: I1209 11:43:25.291782 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/49ee88e1-200f-4c9b-8411-cedfd8cf9afb-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"49ee88e1-200f-4c9b-8411-cedfd8cf9afb\") " pod="openstack/prometheus-metric-storage-0" Dec 09 11:43:25 crc kubenswrapper[5002]: I1209 11:43:25.301179 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qw6x\" (UniqueName: \"kubernetes.io/projected/49ee88e1-200f-4c9b-8411-cedfd8cf9afb-kube-api-access-7qw6x\") pod \"prometheus-metric-storage-0\" (UID: \"49ee88e1-200f-4c9b-8411-cedfd8cf9afb\") " pod="openstack/prometheus-metric-storage-0" Dec 09 11:43:25 crc kubenswrapper[5002]: I1209 11:43:25.455987 5002 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 09 11:43:25 crc kubenswrapper[5002]: I1209 11:43:25.456047 5002 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-26ffea12-027a-4123-bcd4-a01999aa3d5e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-26ffea12-027a-4123-bcd4-a01999aa3d5e\") pod \"prometheus-metric-storage-0\" (UID: \"49ee88e1-200f-4c9b-8411-cedfd8cf9afb\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b0f862d90af7db1a1a88d2792dab2dbef643814251c994ee62a31b612a0c13f9/globalmount\"" pod="openstack/prometheus-metric-storage-0"
Dec 09 11:43:25 crc kubenswrapper[5002]: I1209 11:43:25.824122 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-26ffea12-027a-4123-bcd4-a01999aa3d5e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-26ffea12-027a-4123-bcd4-a01999aa3d5e\") pod \"prometheus-metric-storage-0\" (UID: \"49ee88e1-200f-4c9b-8411-cedfd8cf9afb\") " pod="openstack/prometheus-metric-storage-0"
Dec 09 11:43:25 crc kubenswrapper[5002]: I1209 11:43:25.858963 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Dec 09 11:43:25 crc kubenswrapper[5002]: I1209 11:43:25.946626 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Dec 09 11:43:26 crc kubenswrapper[5002]: I1209 11:43:26.155477 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"2bdcec8a-0ec1-47ef-986c-3ad7d4ed7b4b","Type":"ContainerStarted","Data":"576159c3463513f183a5eaa094dfc48e90207c0731e192591ecdba4d8a337fdd"}
Dec 09 11:43:26 crc kubenswrapper[5002]: I1209 11:43:26.159448 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"3fe6db24-e908-49ab-be4c-5482809166e0","Type":"ContainerStarted","Data":"e816352bd86bc5c3aeb336d565ca5edfe673efbf21accafd2b8d522723d7a6a0"}
Dec 09 11:43:26 crc kubenswrapper[5002]: I1209 11:43:26.187503 5002 generic.go:334] "Generic (PLEG): container finished" podID="7c20bcd3-3ee7-4073-92e9-dd7d79e048b6" containerID="11d3bbafb80743c1c3019622d4bf1eba4fc2afb99b128f8cfabc5b6c598e0ba9" exitCode=137
Dec 09 11:43:26 crc kubenswrapper[5002]: I1209 11:43:26.187770 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fw9tt" podUID="245ee9b5-cf0d-4046-b4c8-666039a8eacc" containerName="registry-server" containerID="cri-o://6fdf88d40c3bb0f02ecda408e9dc6177c3c0027d2aa9ac896901a258a17e127e" gracePeriod=2
Dec 09 11:43:26 crc kubenswrapper[5002]: I1209 11:43:26.640778 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Dec 09 11:43:26 crc kubenswrapper[5002]: I1209 11:43:26.675264 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=4.675244522 podStartE2EDuration="4.675244522s" podCreationTimestamp="2025-12-09 11:43:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:43:26.187558067 +0000 UTC m=+6138.579609148" watchObservedRunningTime="2025-12-09 11:43:26.675244522 +0000 UTC m=+6139.067295603"
Dec 09 11:43:26 crc kubenswrapper[5002]: I1209 11:43:26.675791 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Dec 09 11:43:26 crc kubenswrapper[5002]: W1209 11:43:26.681998 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49ee88e1_200f_4c9b_8411_cedfd8cf9afb.slice/crio-0a0fb0bf55c807f18f0295b96cb7fda97acbf5efb98a96e4e68cc2162ff25ef3 WatchSource:0}: Error finding container 0a0fb0bf55c807f18f0295b96cb7fda97acbf5efb98a96e4e68cc2162ff25ef3: Status 404 returned error can't find the container with id 0a0fb0bf55c807f18f0295b96cb7fda97acbf5efb98a96e4e68cc2162ff25ef3
Dec 09 11:43:26 crc kubenswrapper[5002]: I1209 11:43:26.719982 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7c20bcd3-3ee7-4073-92e9-dd7d79e048b6-openstack-config\") pod \"7c20bcd3-3ee7-4073-92e9-dd7d79e048b6\" (UID: \"7c20bcd3-3ee7-4073-92e9-dd7d79e048b6\") "
Dec 09 11:43:26 crc kubenswrapper[5002]: I1209 11:43:26.720217 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6wgl\" (UniqueName: \"kubernetes.io/projected/7c20bcd3-3ee7-4073-92e9-dd7d79e048b6-kube-api-access-p6wgl\") pod \"7c20bcd3-3ee7-4073-92e9-dd7d79e048b6\" (UID: \"7c20bcd3-3ee7-4073-92e9-dd7d79e048b6\") "
Dec 09 11:43:26 crc kubenswrapper[5002]: I1209 11:43:26.720984 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7c20bcd3-3ee7-4073-92e9-dd7d79e048b6-openstack-config-secret\") pod \"7c20bcd3-3ee7-4073-92e9-dd7d79e048b6\" (UID: \"7c20bcd3-3ee7-4073-92e9-dd7d79e048b6\") "
Dec 09 11:43:26 crc kubenswrapper[5002]: I1209 11:43:26.725338 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c20bcd3-3ee7-4073-92e9-dd7d79e048b6-kube-api-access-p6wgl" (OuterVolumeSpecName: "kube-api-access-p6wgl") pod "7c20bcd3-3ee7-4073-92e9-dd7d79e048b6" (UID: "7c20bcd3-3ee7-4073-92e9-dd7d79e048b6"). InnerVolumeSpecName "kube-api-access-p6wgl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 11:43:26 crc kubenswrapper[5002]: I1209 11:43:26.759950 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c20bcd3-3ee7-4073-92e9-dd7d79e048b6-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "7c20bcd3-3ee7-4073-92e9-dd7d79e048b6" (UID: "7c20bcd3-3ee7-4073-92e9-dd7d79e048b6"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 11:43:26 crc kubenswrapper[5002]: I1209 11:43:26.794071 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c20bcd3-3ee7-4073-92e9-dd7d79e048b6-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "7c20bcd3-3ee7-4073-92e9-dd7d79e048b6" (UID: "7c20bcd3-3ee7-4073-92e9-dd7d79e048b6"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:43:26 crc kubenswrapper[5002]: I1209 11:43:26.823144 5002 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7c20bcd3-3ee7-4073-92e9-dd7d79e048b6-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Dec 09 11:43:26 crc kubenswrapper[5002]: I1209 11:43:26.823179 5002 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7c20bcd3-3ee7-4073-92e9-dd7d79e048b6-openstack-config\") on node \"crc\" DevicePath \"\""
Dec 09 11:43:26 crc kubenswrapper[5002]: I1209 11:43:26.823188 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6wgl\" (UniqueName: \"kubernetes.io/projected/7c20bcd3-3ee7-4073-92e9-dd7d79e048b6-kube-api-access-p6wgl\") on node \"crc\" DevicePath \"\""
Dec 09 11:43:27 crc kubenswrapper[5002]: I1209 11:43:27.198757 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"dedd9325-8ad0-49bf-9e6d-f14ea11b47dc","Type":"ContainerStarted","Data":"453d16c25564b7eabef909ad94c484e1630bc4d645880c29dac6fb8a00ceb441"}
Dec 09 11:43:27 crc kubenswrapper[5002]: I1209 11:43:27.200957 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Dec 09 11:43:27 crc kubenswrapper[5002]: I1209 11:43:27.203531 5002 generic.go:334] "Generic (PLEG): container finished" podID="245ee9b5-cf0d-4046-b4c8-666039a8eacc" containerID="6fdf88d40c3bb0f02ecda408e9dc6177c3c0027d2aa9ac896901a258a17e127e" exitCode=0
Dec 09 11:43:27 crc kubenswrapper[5002]: I1209 11:43:27.203601 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fw9tt" event={"ID":"245ee9b5-cf0d-4046-b4c8-666039a8eacc","Type":"ContainerDied","Data":"6fdf88d40c3bb0f02ecda408e9dc6177c3c0027d2aa9ac896901a258a17e127e"}
Dec 09 11:43:27 crc kubenswrapper[5002]: I1209 11:43:27.203686 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fw9tt" event={"ID":"245ee9b5-cf0d-4046-b4c8-666039a8eacc","Type":"ContainerDied","Data":"0ff1b5eb5aa82e42133ea3e5bb347eafe4553d91656b8bd72eb8e01afc5a481b"}
Dec 09 11:43:27 crc kubenswrapper[5002]: I1209 11:43:27.203705 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ff1b5eb5aa82e42133ea3e5bb347eafe4553d91656b8bd72eb8e01afc5a481b"
Dec 09 11:43:27 crc kubenswrapper[5002]: I1209 11:43:27.206948 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"49ee88e1-200f-4c9b-8411-cedfd8cf9afb","Type":"ContainerStarted","Data":"0a0fb0bf55c807f18f0295b96cb7fda97acbf5efb98a96e4e68cc2162ff25ef3"}
Dec 09 11:43:27 crc kubenswrapper[5002]: I1209 11:43:27.208652 5002 scope.go:117] "RemoveContainer" containerID="11d3bbafb80743c1c3019622d4bf1eba4fc2afb99b128f8cfabc5b6c598e0ba9"
Dec 09 11:43:27 crc kubenswrapper[5002]: I1209 11:43:27.208672 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Dec 09 11:43:27 crc kubenswrapper[5002]: I1209 11:43:27.231537 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.729191117 podStartE2EDuration="4.231512152s" podCreationTimestamp="2025-12-09 11:43:23 +0000 UTC" firstStartedPulling="2025-12-09 11:43:24.586055889 +0000 UTC m=+6136.978106970" lastFinishedPulling="2025-12-09 11:43:26.088376914 +0000 UTC m=+6138.480428005" observedRunningTime="2025-12-09 11:43:27.218988097 +0000 UTC m=+6139.611039178" watchObservedRunningTime="2025-12-09 11:43:27.231512152 +0000 UTC m=+6139.623563233"
Dec 09 11:43:27 crc kubenswrapper[5002]: I1209 11:43:27.261154 5002 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="7c20bcd3-3ee7-4073-92e9-dd7d79e048b6" podUID="2bdcec8a-0ec1-47ef-986c-3ad7d4ed7b4b"
Dec 09 11:43:27 crc kubenswrapper[5002]: I1209 11:43:27.269160 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fw9tt"
Dec 09 11:43:27 crc kubenswrapper[5002]: I1209 11:43:27.335736 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/245ee9b5-cf0d-4046-b4c8-666039a8eacc-utilities\") pod \"245ee9b5-cf0d-4046-b4c8-666039a8eacc\" (UID: \"245ee9b5-cf0d-4046-b4c8-666039a8eacc\") "
Dec 09 11:43:27 crc kubenswrapper[5002]: I1209 11:43:27.335844 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4mnb\" (UniqueName: \"kubernetes.io/projected/245ee9b5-cf0d-4046-b4c8-666039a8eacc-kube-api-access-b4mnb\") pod \"245ee9b5-cf0d-4046-b4c8-666039a8eacc\" (UID: \"245ee9b5-cf0d-4046-b4c8-666039a8eacc\") "
Dec 09 11:43:27 crc kubenswrapper[5002]: I1209 11:43:27.335915 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/245ee9b5-cf0d-4046-b4c8-666039a8eacc-catalog-content\") pod \"245ee9b5-cf0d-4046-b4c8-666039a8eacc\" (UID: \"245ee9b5-cf0d-4046-b4c8-666039a8eacc\") "
Dec 09 11:43:27 crc kubenswrapper[5002]: I1209 11:43:27.338069 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/245ee9b5-cf0d-4046-b4c8-666039a8eacc-utilities" (OuterVolumeSpecName: "utilities") pod "245ee9b5-cf0d-4046-b4c8-666039a8eacc" (UID: "245ee9b5-cf0d-4046-b4c8-666039a8eacc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 11:43:27 crc kubenswrapper[5002]: I1209 11:43:27.340866 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/245ee9b5-cf0d-4046-b4c8-666039a8eacc-kube-api-access-b4mnb" (OuterVolumeSpecName: "kube-api-access-b4mnb") pod "245ee9b5-cf0d-4046-b4c8-666039a8eacc" (UID: "245ee9b5-cf0d-4046-b4c8-666039a8eacc"). InnerVolumeSpecName "kube-api-access-b4mnb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 11:43:27 crc kubenswrapper[5002]: I1209 11:43:27.438539 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/245ee9b5-cf0d-4046-b4c8-666039a8eacc-utilities\") on node \"crc\" DevicePath \"\""
Dec 09 11:43:27 crc kubenswrapper[5002]: I1209 11:43:27.438590 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4mnb\" (UniqueName: \"kubernetes.io/projected/245ee9b5-cf0d-4046-b4c8-666039a8eacc-kube-api-access-b4mnb\") on node \"crc\" DevicePath \"\""
Dec 09 11:43:27 crc kubenswrapper[5002]: I1209 11:43:27.439859 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/245ee9b5-cf0d-4046-b4c8-666039a8eacc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "245ee9b5-cf0d-4046-b4c8-666039a8eacc" (UID: "245ee9b5-cf0d-4046-b4c8-666039a8eacc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 11:43:27 crc kubenswrapper[5002]: I1209 11:43:27.540420 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/245ee9b5-cf0d-4046-b4c8-666039a8eacc-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 09 11:43:28 crc kubenswrapper[5002]: I1209 11:43:28.082057 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c20bcd3-3ee7-4073-92e9-dd7d79e048b6" path="/var/lib/kubelet/pods/7c20bcd3-3ee7-4073-92e9-dd7d79e048b6/volumes"
Dec 09 11:43:28 crc kubenswrapper[5002]: I1209 11:43:28.219097 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fw9tt"
Dec 09 11:43:28 crc kubenswrapper[5002]: I1209 11:43:28.261250 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fw9tt"]
Dec 09 11:43:28 crc kubenswrapper[5002]: I1209 11:43:28.281076 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fw9tt"]
Dec 09 11:43:30 crc kubenswrapper[5002]: I1209 11:43:30.073602 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="245ee9b5-cf0d-4046-b4c8-666039a8eacc" path="/var/lib/kubelet/pods/245ee9b5-cf0d-4046-b4c8-666039a8eacc/volumes"
Dec 09 11:43:33 crc kubenswrapper[5002]: I1209 11:43:33.272503 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"49ee88e1-200f-4c9b-8411-cedfd8cf9afb","Type":"ContainerStarted","Data":"e1f0692545748ddad13002aa2ea8e6b0ec06ad1378972808f7df54cd5325ae7a"}
Dec 09 11:43:33 crc kubenswrapper[5002]: I1209 11:43:33.274711 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"3fe6db24-e908-49ab-be4c-5482809166e0","Type":"ContainerStarted","Data":"f55c0a3a9879e9988008ba7a1291d002eeceaf0a8011aa331380433017971a3b"}
Dec 09 11:43:33 crc kubenswrapper[5002]: I1209 11:43:33.583282 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Dec 09 11:43:35 crc kubenswrapper[5002]: I1209 11:43:35.060602 5002 scope.go:117] "RemoveContainer" containerID="39962d0376837cc534e6b0a62303166efdae767fb36cfb81ae7c7eb077d56c3e"
Dec 09 11:43:35 crc kubenswrapper[5002]: E1209 11:43:35.061410 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 11:43:41 crc kubenswrapper[5002]: I1209 11:43:41.361719 5002 generic.go:334] "Generic (PLEG): container finished" podID="3fe6db24-e908-49ab-be4c-5482809166e0" containerID="f55c0a3a9879e9988008ba7a1291d002eeceaf0a8011aa331380433017971a3b" exitCode=0
Dec 09 11:43:41 crc kubenswrapper[5002]: I1209 11:43:41.361822 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"3fe6db24-e908-49ab-be4c-5482809166e0","Type":"ContainerDied","Data":"f55c0a3a9879e9988008ba7a1291d002eeceaf0a8011aa331380433017971a3b"}
Dec 09 11:43:41 crc kubenswrapper[5002]: I1209 11:43:41.363535 5002 generic.go:334] "Generic (PLEG): container finished" podID="49ee88e1-200f-4c9b-8411-cedfd8cf9afb" containerID="e1f0692545748ddad13002aa2ea8e6b0ec06ad1378972808f7df54cd5325ae7a" exitCode=0
Dec 09 11:43:41 crc kubenswrapper[5002]: I1209 11:43:41.363566 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"49ee88e1-200f-4c9b-8411-cedfd8cf9afb","Type":"ContainerDied","Data":"e1f0692545748ddad13002aa2ea8e6b0ec06ad1378972808f7df54cd5325ae7a"}
Dec 09 11:43:46 crc kubenswrapper[5002]: I1209 11:43:46.420474 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"3fe6db24-e908-49ab-be4c-5482809166e0","Type":"ContainerStarted","Data":"88078f65eb246e825609ceb5935ceaf5b84c9e388f50d058c17a6658eb63d525"}
Dec 09 11:43:47 crc kubenswrapper[5002]: I1209 11:43:47.059963 5002 scope.go:117] "RemoveContainer" containerID="39962d0376837cc534e6b0a62303166efdae767fb36cfb81ae7c7eb077d56c3e"
Dec 09 11:43:47 crc kubenswrapper[5002]: E1209 11:43:47.060609 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 11:43:47 crc kubenswrapper[5002]: I1209 11:43:47.991602 5002 scope.go:117] "RemoveContainer" containerID="5fccb647da094d58682f4b945b3f22efb76875fa4fe0a6b9c9294851c21f4deb"
Dec 09 11:43:48 crc kubenswrapper[5002]: I1209 11:43:48.475927 5002 scope.go:117] "RemoveContainer" containerID="0e20ac414f2b1aa76e45c85921a3e6fbb34f3ea393422301c6b27ff7b365452f"
Dec 09 11:43:48 crc kubenswrapper[5002]: I1209 11:43:48.549981 5002 scope.go:117] "RemoveContainer" containerID="368e80e160dd49a06b484bf5b0ef26549e076ef6842dcf7c2a30a3464d733cf7"
Dec 09 11:43:49 crc kubenswrapper[5002]: I1209 11:43:49.450016 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"49ee88e1-200f-4c9b-8411-cedfd8cf9afb","Type":"ContainerStarted","Data":"09c0b6f6b407ecdaf104b9b0d74bb8d4f2dc5e42747cf0b5ca87f3ff7de4dcc5"}
Dec 09 11:43:52 crc kubenswrapper[5002]: I1209 11:43:52.480959 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"3fe6db24-e908-49ab-be4c-5482809166e0","Type":"ContainerStarted","Data":"ad87f9d8cbeefb8cb20a78ec6eaecd55b73b902eb3b2ee673af79dd39f393c75"}
Dec 09 11:43:52 crc kubenswrapper[5002]: I1209 11:43:52.481607 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0"
Dec 09 11:43:52 crc kubenswrapper[5002]: I1209 11:43:52.484612 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0"
Dec 09 11:43:52 crc kubenswrapper[5002]: I1209 11:43:52.511938 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=8.610132096 podStartE2EDuration="28.511920128s" podCreationTimestamp="2025-12-09 11:43:24 +0000 UTC" firstStartedPulling="2025-12-09 11:43:25.188497784 +0000 UTC m=+6137.580548865" lastFinishedPulling="2025-12-09 11:43:45.090285816 +0000 UTC m=+6157.482336897" observedRunningTime="2025-12-09 11:43:52.503134143 +0000 UTC m=+6164.895185224" watchObservedRunningTime="2025-12-09 11:43:52.511920128 +0000 UTC m=+6164.903971229"
Dec 09 11:43:53 crc kubenswrapper[5002]: I1209 11:43:53.499149 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"49ee88e1-200f-4c9b-8411-cedfd8cf9afb","Type":"ContainerStarted","Data":"abca24ce979895bfed31188f9a64670c4588d5f26130f3c9b404001b2d45341d"}
Dec 09 11:43:58 crc kubenswrapper[5002]: I1209 11:43:58.058133 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-5a5c-account-create-update-wlfkx"]
Dec 09 11:43:58 crc kubenswrapper[5002]: I1209 11:43:58.071911 5002 scope.go:117] "RemoveContainer" containerID="39962d0376837cc534e6b0a62303166efdae767fb36cfb81ae7c7eb077d56c3e"
Dec 09 11:43:58 crc kubenswrapper[5002]: E1209 11:43:58.072216 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 11:43:58 crc kubenswrapper[5002]: I1209 11:43:58.076046 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-5a5c-account-create-update-wlfkx"]
Dec 09 11:43:59 crc kubenswrapper[5002]: I1209 11:43:59.043176 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-wbv78"]
Dec 09 11:43:59 crc kubenswrapper[5002]: I1209 11:43:59.053255 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-wbv78"]
Dec 09 11:43:59 crc kubenswrapper[5002]: I1209 11:43:59.066989 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-p7j5k"]
Dec 09 11:43:59 crc kubenswrapper[5002]: I1209 11:43:59.074647 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-lscbm"]
Dec 09 11:43:59 crc kubenswrapper[5002]: I1209 11:43:59.083885 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-2680-account-create-update-xh47f"]
Dec 09 11:43:59 crc kubenswrapper[5002]: I1209 11:43:59.092326 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-b588-account-create-update-tjhwj"]
Dec 09 11:43:59 crc kubenswrapper[5002]: I1209 11:43:59.100191 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-p7j5k"]
Dec 09 11:43:59 crc kubenswrapper[5002]: I1209 11:43:59.108695 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-lscbm"]
Dec 09 11:43:59 crc kubenswrapper[5002]: I1209 11:43:59.117763 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-2680-account-create-update-xh47f"]
Dec 09 11:43:59 crc kubenswrapper[5002]: I1209 11:43:59.126611 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-b588-account-create-update-tjhwj"]
Dec 09 11:44:02 crc kubenswrapper[5002]: I1209 11:44:02.375584 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01960093-3aa2-428d-a587-3647c93e64e7" path="/var/lib/kubelet/pods/01960093-3aa2-428d-a587-3647c93e64e7/volumes"
Dec 09 11:44:02 crc kubenswrapper[5002]: I1209 11:44:02.377715 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08af662c-3e6c-4904-b409-bb179c6f45d1" path="/var/lib/kubelet/pods/08af662c-3e6c-4904-b409-bb179c6f45d1/volumes"
Dec 09 11:44:02 crc kubenswrapper[5002]: I1209 11:44:02.382242 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f5b89ee-003e-4323-ac0d-e9e22f329941" path="/var/lib/kubelet/pods/3f5b89ee-003e-4323-ac0d-e9e22f329941/volumes"
Dec 09 11:44:02 crc kubenswrapper[5002]: I1209 11:44:02.498849 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d66cb19-429c-4c73-a24c-d8c49803146c" path="/var/lib/kubelet/pods/4d66cb19-429c-4c73-a24c-d8c49803146c/volumes"
Dec 09 11:44:02 crc kubenswrapper[5002]: I1209 11:44:02.499402 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a5c9ceb-52aa-4490-b2da-4c0a8077cbb3" path="/var/lib/kubelet/pods/7a5c9ceb-52aa-4490-b2da-4c0a8077cbb3/volumes"
Dec 09 11:44:02 crc kubenswrapper[5002]: I1209 11:44:02.500440 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5f4f809-9551-419b-9aac-57de880d3629" path="/var/lib/kubelet/pods/b5f4f809-9551-419b-9aac-57de880d3629/volumes"
Dec 09 11:44:02 crc kubenswrapper[5002]: E1209 11:44:02.501576 5002 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="2.442s"
Dec 09 11:44:05 crc kubenswrapper[5002]: I1209 11:44:05.626641 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"49ee88e1-200f-4c9b-8411-cedfd8cf9afb","Type":"ContainerStarted","Data":"5af507cc6d17c71f5822013b9ba7ba87241d3fb6dde4bf0fe119c365a5c7ac34"}
Dec 09 11:44:06 crc kubenswrapper[5002]: I1209 11:44:06.678028 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=5.080345548 podStartE2EDuration="43.678010708s" podCreationTimestamp="2025-12-09 11:43:23 +0000 UTC" firstStartedPulling="2025-12-09 11:43:26.68751469 +0000 UTC m=+6139.079565771" lastFinishedPulling="2025-12-09 11:44:05.28517986 +0000 UTC m=+6177.677230931" observedRunningTime="2025-12-09 11:44:06.66279048 +0000 UTC m=+6179.054841601" watchObservedRunningTime="2025-12-09 11:44:06.678010708 +0000 UTC m=+6179.070061779"
Dec 09 11:44:08 crc kubenswrapper[5002]: I1209 11:44:08.050835 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bdrr7"]
Dec 09 11:44:08 crc kubenswrapper[5002]: I1209 11:44:08.080225 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bdrr7"]
Dec 09 11:44:09 crc kubenswrapper[5002]: I1209 11:44:09.947276 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-index-s67vq" podUID="267396c4-1ded-4436-9caa-a45bb8a54e75" containerName="registry-server" probeResult="failure" output=<
Dec 09 11:44:09 crc kubenswrapper[5002]: timeout: health rpc did not complete within 1s
Dec 09 11:44:09 crc kubenswrapper[5002]: >
Dec 09 11:44:10 crc kubenswrapper[5002]: I1209 11:44:10.080428 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="977406ee-d447-4350-9dac-57ff3973d465" path="/var/lib/kubelet/pods/977406ee-d447-4350-9dac-57ff3973d465/volumes"
Dec 09 11:44:10 crc kubenswrapper[5002]: I1209 11:44:10.948353 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0"
Dec 09 11:44:10 crc kubenswrapper[5002]: I1209 11:44:10.948750 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0"
Dec 09 11:44:10 crc kubenswrapper[5002]: I1209 11:44:10.950536 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0"
Dec 09 11:44:11 crc kubenswrapper[5002]: I1209 11:44:11.060911 5002 scope.go:117] "RemoveContainer" containerID="39962d0376837cc534e6b0a62303166efdae767fb36cfb81ae7c7eb077d56c3e"
Dec 09 11:44:11 crc kubenswrapper[5002]: E1209 11:44:11.061221 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 11:44:11 crc kubenswrapper[5002]: I1209 11:44:11.748712 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0"
Dec 09 11:44:13 crc kubenswrapper[5002]: I1209 11:44:13.685710 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 09 11:44:13 crc kubenswrapper[5002]: E1209 11:44:13.686511 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="245ee9b5-cf0d-4046-b4c8-666039a8eacc" containerName="registry-server"
Dec 09 11:44:13 crc kubenswrapper[5002]: I1209 11:44:13.686522 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="245ee9b5-cf0d-4046-b4c8-666039a8eacc" containerName="registry-server"
Dec 09 11:44:13 crc kubenswrapper[5002]: E1209 11:44:13.686556 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="245ee9b5-cf0d-4046-b4c8-666039a8eacc" containerName="extract-content"
Dec 09 11:44:13 crc kubenswrapper[5002]: I1209 11:44:13.686562 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="245ee9b5-cf0d-4046-b4c8-666039a8eacc" containerName="extract-content"
Dec 09 11:44:13 crc kubenswrapper[5002]: E1209 11:44:13.686581 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="245ee9b5-cf0d-4046-b4c8-666039a8eacc" containerName="extract-utilities"
Dec 09 11:44:13 crc kubenswrapper[5002]: I1209 11:44:13.686586 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="245ee9b5-cf0d-4046-b4c8-666039a8eacc" containerName="extract-utilities"
Dec 09 11:44:13 crc kubenswrapper[5002]: I1209 11:44:13.686780 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="245ee9b5-cf0d-4046-b4c8-666039a8eacc" containerName="registry-server"
Dec 09 11:44:13 crc kubenswrapper[5002]: I1209 11:44:13.688598 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 09 11:44:13 crc kubenswrapper[5002]: I1209 11:44:13.691324 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 09 11:44:13 crc kubenswrapper[5002]: I1209 11:44:13.691644 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 09 11:44:13 crc kubenswrapper[5002]: I1209 11:44:13.706578 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 09 11:44:13 crc kubenswrapper[5002]: I1209 11:44:13.760577 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ffa689c7-aa9c-49e5-81ca-607f501e98bd-run-httpd\") pod \"ceilometer-0\" (UID: \"ffa689c7-aa9c-49e5-81ca-607f501e98bd\") " pod="openstack/ceilometer-0"
Dec 09 11:44:13 crc kubenswrapper[5002]: I1209 11:44:13.760644 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9g2ng\" (UniqueName: \"kubernetes.io/projected/ffa689c7-aa9c-49e5-81ca-607f501e98bd-kube-api-access-9g2ng\") pod \"ceilometer-0\" (UID: \"ffa689c7-aa9c-49e5-81ca-607f501e98bd\") " pod="openstack/ceilometer-0"
Dec 09 11:44:13 crc kubenswrapper[5002]: I1209 11:44:13.760741 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffa689c7-aa9c-49e5-81ca-607f501e98bd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ffa689c7-aa9c-49e5-81ca-607f501e98bd\") " pod="openstack/ceilometer-0"
Dec 09 11:44:13 crc kubenswrapper[5002]: I1209 11:44:13.760758 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ffa689c7-aa9c-49e5-81ca-607f501e98bd-log-httpd\") pod \"ceilometer-0\" (UID: \"ffa689c7-aa9c-49e5-81ca-607f501e98bd\") " pod="openstack/ceilometer-0"
Dec 09 11:44:13 crc kubenswrapper[5002]: I1209 11:44:13.760780 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffa689c7-aa9c-49e5-81ca-607f501e98bd-config-data\") pod \"ceilometer-0\" (UID: \"ffa689c7-aa9c-49e5-81ca-607f501e98bd\") " pod="openstack/ceilometer-0"
Dec 09 11:44:13 crc kubenswrapper[5002]: I1209 11:44:13.760808 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ffa689c7-aa9c-49e5-81ca-607f501e98bd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ffa689c7-aa9c-49e5-81ca-607f501e98bd\") " pod="openstack/ceilometer-0"
Dec 09 11:44:13 crc kubenswrapper[5002]: I1209 11:44:13.760852 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffa689c7-aa9c-49e5-81ca-607f501e98bd-scripts\") pod \"ceilometer-0\" (UID: \"ffa689c7-aa9c-49e5-81ca-607f501e98bd\") " pod="openstack/ceilometer-0"
Dec 09 11:44:13 crc kubenswrapper[5002]: I1209 11:44:13.862332 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ffa689c7-aa9c-49e5-81ca-607f501e98bd-log-httpd\") pod \"ceilometer-0\" (UID: \"ffa689c7-aa9c-49e5-81ca-607f501e98bd\") " pod="openstack/ceilometer-0"
Dec 09 11:44:13 crc kubenswrapper[5002]: I1209 11:44:13.862379 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffa689c7-aa9c-49e5-81ca-607f501e98bd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ffa689c7-aa9c-49e5-81ca-607f501e98bd\") " pod="openstack/ceilometer-0"
Dec 09 11:44:13 crc kubenswrapper[5002]: I1209 11:44:13.862413 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffa689c7-aa9c-49e5-81ca-607f501e98bd-config-data\") pod \"ceilometer-0\" (UID: \"ffa689c7-aa9c-49e5-81ca-607f501e98bd\") " pod="openstack/ceilometer-0"
Dec 09 11:44:13 crc kubenswrapper[5002]: I1209 11:44:13.862453 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ffa689c7-aa9c-49e5-81ca-607f501e98bd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ffa689c7-aa9c-49e5-81ca-607f501e98bd\") " pod="openstack/ceilometer-0"
Dec 09 11:44:13 crc kubenswrapper[5002]: I1209 11:44:13.862483 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffa689c7-aa9c-49e5-81ca-607f501e98bd-scripts\") pod \"ceilometer-0\" (UID: \"ffa689c7-aa9c-49e5-81ca-607f501e98bd\") " pod="openstack/ceilometer-0"
Dec 09 11:44:13 crc kubenswrapper[5002]: I1209 11:44:13.862585 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ffa689c7-aa9c-49e5-81ca-607f501e98bd-run-httpd\") pod \"ceilometer-0\" (UID: \"ffa689c7-aa9c-49e5-81ca-607f501e98bd\") " pod="openstack/ceilometer-0"
Dec 09 11:44:13 crc kubenswrapper[5002]: I1209 11:44:13.862623 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9g2ng\" (UniqueName: \"kubernetes.io/projected/ffa689c7-aa9c-49e5-81ca-607f501e98bd-kube-api-access-9g2ng\") pod \"ceilometer-0\" (UID: \"ffa689c7-aa9c-49e5-81ca-607f501e98bd\") " pod="openstack/ceilometer-0"
Dec 09 11:44:13 crc kubenswrapper[5002]: I1209 11:44:13.862870 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ffa689c7-aa9c-49e5-81ca-607f501e98bd-log-httpd\") pod \"ceilometer-0\" (UID: \"ffa689c7-aa9c-49e5-81ca-607f501e98bd\") " pod="openstack/ceilometer-0"
Dec 09 11:44:13 crc kubenswrapper[5002]: I1209 11:44:13.863938 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ffa689c7-aa9c-49e5-81ca-607f501e98bd-run-httpd\") pod \"ceilometer-0\" (UID: \"ffa689c7-aa9c-49e5-81ca-607f501e98bd\") " pod="openstack/ceilometer-0"
Dec 09 11:44:13 crc kubenswrapper[5002]: I1209 11:44:13.873399 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ffa689c7-aa9c-49e5-81ca-607f501e98bd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ffa689c7-aa9c-49e5-81ca-607f501e98bd\") " pod="openstack/ceilometer-0"
Dec 09 11:44:13 crc kubenswrapper[5002]: I1209 11:44:13.873654 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffa689c7-aa9c-49e5-81ca-607f501e98bd-scripts\") pod \"ceilometer-0\" (UID: \"ffa689c7-aa9c-49e5-81ca-607f501e98bd\") " pod="openstack/ceilometer-0"
Dec 09 11:44:13 crc kubenswrapper[5002]: I1209 11:44:13.874036 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffa689c7-aa9c-49e5-81ca-607f501e98bd-config-data\") pod \"ceilometer-0\" (UID: \"ffa689c7-aa9c-49e5-81ca-607f501e98bd\") " pod="openstack/ceilometer-0"
Dec 09 11:44:13 crc kubenswrapper[5002]: I1209 11:44:13.875683 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffa689c7-aa9c-49e5-81ca-607f501e98bd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ffa689c7-aa9c-49e5-81ca-607f501e98bd\") " pod="openstack/ceilometer-0"
Dec 09 11:44:13 crc kubenswrapper[5002]: I1209 11:44:13.884761 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9g2ng\" (UniqueName: \"kubernetes.io/projected/ffa689c7-aa9c-49e5-81ca-607f501e98bd-kube-api-access-9g2ng\") pod \"ceilometer-0\" (UID: \"ffa689c7-aa9c-49e5-81ca-607f501e98bd\") " pod="openstack/ceilometer-0"
Dec 09 11:44:14 crc kubenswrapper[5002]: I1209 11:44:14.022592 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 09 11:44:14 crc kubenswrapper[5002]: I1209 11:44:14.572993 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 09 11:44:14 crc kubenswrapper[5002]: I1209 11:44:14.791362 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ffa689c7-aa9c-49e5-81ca-607f501e98bd","Type":"ContainerStarted","Data":"32dd72b4877aab023e68a833b8da031b82d752adcff359b0e228865e632d7745"}
Dec 09 11:44:22 crc kubenswrapper[5002]: I1209 11:44:22.060641 5002 scope.go:117] "RemoveContainer" containerID="39962d0376837cc534e6b0a62303166efdae767fb36cfb81ae7c7eb077d56c3e"
Dec 09 11:44:22 crc kubenswrapper[5002]: E1209 11:44:22.061463 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 11:44:25 crc kubenswrapper[5002]: I1209 11:44:25.671056 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-8cbcs" podUID="21791258-f31e-49b6-8470-1915bc504a3f" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 09 11:44:28 crc kubenswrapper[5002]: I1209 11:44:28.041489 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xxkmb"]
Dec 09 11:44:28 crc kubenswrapper[5002]: I1209 11:44:28.050448 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xxkmb"]
Dec 09 11:44:28 crc kubenswrapper[5002]: I1209 11:44:28.071555 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e771094a-9f24-43c1-be1f-3efca4e1b42c" path="/var/lib/kubelet/pods/e771094a-9f24-43c1-be1f-3efca4e1b42c/volumes"
Dec 09 11:44:29 crc kubenswrapper[5002]: I1209 11:44:29.033336 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-qd2mx"]
Dec 09 11:44:29 crc kubenswrapper[5002]: I1209 11:44:29.043750 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-qd2mx"]
Dec 09 11:44:30 crc kubenswrapper[5002]: I1209 11:44:30.087292 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fb0a2a3-32f7-4406-9a40-db702f2e2786" path="/var/lib/kubelet/pods/3fb0a2a3-32f7-4406-9a40-db702f2e2786/volumes"
Dec 09 11:44:32 crc kubenswrapper[5002]: I1209 11:44:32.287277 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qmhrf"]
Dec 09 11:44:32 crc kubenswrapper[5002]: I1209 11:44:32.289868 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qmhrf"
Dec 09 11:44:32 crc kubenswrapper[5002]: I1209 11:44:32.297828 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qmhrf"]
Dec 09 11:44:32 crc kubenswrapper[5002]: I1209 11:44:32.356452 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6853242a-1640-4e72-a63f-61931ee9ea3f-catalog-content\") pod \"community-operators-qmhrf\" (UID: \"6853242a-1640-4e72-a63f-61931ee9ea3f\") " pod="openshift-marketplace/community-operators-qmhrf"
Dec 09 11:44:32 crc kubenswrapper[5002]: I1209 11:44:32.356614 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6853242a-1640-4e72-a63f-61931ee9ea3f-utilities\") pod \"community-operators-qmhrf\" (UID: \"6853242a-1640-4e72-a63f-61931ee9ea3f\") " pod="openshift-marketplace/community-operators-qmhrf"
Dec 09 11:44:32 crc kubenswrapper[5002]: I1209 11:44:32.356706 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4x8c\" (UniqueName: \"kubernetes.io/projected/6853242a-1640-4e72-a63f-61931ee9ea3f-kube-api-access-m4x8c\") pod \"community-operators-qmhrf\" (UID: \"6853242a-1640-4e72-a63f-61931ee9ea3f\") " pod="openshift-marketplace/community-operators-qmhrf"
Dec 09 11:44:32 crc kubenswrapper[5002]: I1209 11:44:32.459171 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4x8c\" (UniqueName: \"kubernetes.io/projected/6853242a-1640-4e72-a63f-61931ee9ea3f-kube-api-access-m4x8c\") pod \"community-operators-qmhrf\" (UID: \"6853242a-1640-4e72-a63f-61931ee9ea3f\") " pod="openshift-marketplace/community-operators-qmhrf"
Dec 09 11:44:32 crc kubenswrapper[5002]: I1209 11:44:32.459258 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6853242a-1640-4e72-a63f-61931ee9ea3f-catalog-content\") pod \"community-operators-qmhrf\" (UID: \"6853242a-1640-4e72-a63f-61931ee9ea3f\") " pod="openshift-marketplace/community-operators-qmhrf"
Dec 09 11:44:32 crc kubenswrapper[5002]: I1209 11:44:32.459342 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6853242a-1640-4e72-a63f-61931ee9ea3f-utilities\") pod \"community-operators-qmhrf\" (UID: \"6853242a-1640-4e72-a63f-61931ee9ea3f\") " pod="openshift-marketplace/community-operators-qmhrf"
Dec 09 11:44:32 crc kubenswrapper[5002]: I1209 11:44:32.460164 5002
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6853242a-1640-4e72-a63f-61931ee9ea3f-catalog-content\") pod \"community-operators-qmhrf\" (UID: \"6853242a-1640-4e72-a63f-61931ee9ea3f\") " pod="openshift-marketplace/community-operators-qmhrf" Dec 09 11:44:32 crc kubenswrapper[5002]: I1209 11:44:32.460210 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6853242a-1640-4e72-a63f-61931ee9ea3f-utilities\") pod \"community-operators-qmhrf\" (UID: \"6853242a-1640-4e72-a63f-61931ee9ea3f\") " pod="openshift-marketplace/community-operators-qmhrf" Dec 09 11:44:32 crc kubenswrapper[5002]: I1209 11:44:32.476265 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4x8c\" (UniqueName: \"kubernetes.io/projected/6853242a-1640-4e72-a63f-61931ee9ea3f-kube-api-access-m4x8c\") pod \"community-operators-qmhrf\" (UID: \"6853242a-1640-4e72-a63f-61931ee9ea3f\") " pod="openshift-marketplace/community-operators-qmhrf" Dec 09 11:44:32 crc kubenswrapper[5002]: I1209 11:44:32.619410 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qmhrf" Dec 09 11:44:33 crc kubenswrapper[5002]: I1209 11:44:33.060898 5002 scope.go:117] "RemoveContainer" containerID="39962d0376837cc534e6b0a62303166efdae767fb36cfb81ae7c7eb077d56c3e" Dec 09 11:44:33 crc kubenswrapper[5002]: E1209 11:44:33.061207 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:44:34 crc kubenswrapper[5002]: I1209 11:44:34.691704 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cq7rt"] Dec 09 11:44:34 crc kubenswrapper[5002]: I1209 11:44:34.695304 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cq7rt" Dec 09 11:44:34 crc kubenswrapper[5002]: I1209 11:44:34.726836 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cq7rt"] Dec 09 11:44:34 crc kubenswrapper[5002]: I1209 11:44:34.813397 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27ff2b0a-bd1e-4263-bd46-532a5790f1f3-utilities\") pod \"redhat-marketplace-cq7rt\" (UID: \"27ff2b0a-bd1e-4263-bd46-532a5790f1f3\") " pod="openshift-marketplace/redhat-marketplace-cq7rt" Dec 09 11:44:34 crc kubenswrapper[5002]: I1209 11:44:34.813697 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27ff2b0a-bd1e-4263-bd46-532a5790f1f3-catalog-content\") pod \"redhat-marketplace-cq7rt\" (UID: \"27ff2b0a-bd1e-4263-bd46-532a5790f1f3\") " pod="openshift-marketplace/redhat-marketplace-cq7rt" Dec 09 11:44:34 crc kubenswrapper[5002]: I1209 11:44:34.813753 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9fgm\" (UniqueName: \"kubernetes.io/projected/27ff2b0a-bd1e-4263-bd46-532a5790f1f3-kube-api-access-r9fgm\") pod \"redhat-marketplace-cq7rt\" (UID: \"27ff2b0a-bd1e-4263-bd46-532a5790f1f3\") " pod="openshift-marketplace/redhat-marketplace-cq7rt" Dec 09 11:44:34 crc kubenswrapper[5002]: I1209 11:44:34.916213 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27ff2b0a-bd1e-4263-bd46-532a5790f1f3-utilities\") pod \"redhat-marketplace-cq7rt\" (UID: \"27ff2b0a-bd1e-4263-bd46-532a5790f1f3\") " pod="openshift-marketplace/redhat-marketplace-cq7rt" Dec 09 11:44:34 crc kubenswrapper[5002]: I1209 11:44:34.916456 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27ff2b0a-bd1e-4263-bd46-532a5790f1f3-catalog-content\") pod \"redhat-marketplace-cq7rt\" (UID: \"27ff2b0a-bd1e-4263-bd46-532a5790f1f3\") " pod="openshift-marketplace/redhat-marketplace-cq7rt" Dec 09 11:44:34 crc kubenswrapper[5002]: I1209 11:44:34.916560 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9fgm\" (UniqueName: \"kubernetes.io/projected/27ff2b0a-bd1e-4263-bd46-532a5790f1f3-kube-api-access-r9fgm\") pod \"redhat-marketplace-cq7rt\" (UID: \"27ff2b0a-bd1e-4263-bd46-532a5790f1f3\") " pod="openshift-marketplace/redhat-marketplace-cq7rt" Dec 09 11:44:34 crc kubenswrapper[5002]: I1209 11:44:34.916662 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27ff2b0a-bd1e-4263-bd46-532a5790f1f3-utilities\") pod \"redhat-marketplace-cq7rt\" (UID: \"27ff2b0a-bd1e-4263-bd46-532a5790f1f3\") " pod="openshift-marketplace/redhat-marketplace-cq7rt" Dec 09 11:44:34 crc kubenswrapper[5002]: I1209 11:44:34.917014 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27ff2b0a-bd1e-4263-bd46-532a5790f1f3-catalog-content\") pod \"redhat-marketplace-cq7rt\" (UID: \"27ff2b0a-bd1e-4263-bd46-532a5790f1f3\") " pod="openshift-marketplace/redhat-marketplace-cq7rt" Dec 09 11:44:34 crc kubenswrapper[5002]: I1209 11:44:34.947791 5002 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-r9fgm\" (UniqueName: \"kubernetes.io/projected/27ff2b0a-bd1e-4263-bd46-532a5790f1f3-kube-api-access-r9fgm\") pod \"redhat-marketplace-cq7rt\" (UID: \"27ff2b0a-bd1e-4263-bd46-532a5790f1f3\") " pod="openshift-marketplace/redhat-marketplace-cq7rt" Dec 09 11:44:35 crc kubenswrapper[5002]: I1209 11:44:35.120995 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cq7rt" Dec 09 11:44:35 crc kubenswrapper[5002]: I1209 11:44:35.574614 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qmhrf"] Dec 09 11:44:36 crc kubenswrapper[5002]: I1209 11:44:36.017098 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qmhrf" event={"ID":"6853242a-1640-4e72-a63f-61931ee9ea3f","Type":"ContainerStarted","Data":"f547e8679209f57676cdd6fb41377bbb5f2c9d352a284a0be130f46ae5283552"} Dec 09 11:44:36 crc kubenswrapper[5002]: W1209 11:44:36.150383 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27ff2b0a_bd1e_4263_bd46_532a5790f1f3.slice/crio-2a1d9e03d63286418db686f92329d86516871277984724bc6ccd571adccc9192 WatchSource:0}: Error finding container 2a1d9e03d63286418db686f92329d86516871277984724bc6ccd571adccc9192: Status 404 returned error can't find the container with id 2a1d9e03d63286418db686f92329d86516871277984724bc6ccd571adccc9192 Dec 09 11:44:36 crc kubenswrapper[5002]: I1209 11:44:36.159165 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cq7rt"] Dec 09 11:44:37 crc kubenswrapper[5002]: I1209 11:44:37.034699 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cq7rt" event={"ID":"27ff2b0a-bd1e-4263-bd46-532a5790f1f3","Type":"ContainerStarted","Data":"2a1d9e03d63286418db686f92329d86516871277984724bc6ccd571adccc9192"} Dec 09 11:44:40 crc kubenswrapper[5002]: I1209 11:44:39.055646 5002 generic.go:334] "Generic (PLEG): container finished" podID="27ff2b0a-bd1e-4263-bd46-532a5790f1f3" containerID="f823d6527fe9f4ae1d7c208322bf7e841de5d3ba91441b08c5aae0e4fd4589bc" exitCode=0 Dec 09 11:44:40 crc kubenswrapper[5002]: I1209 11:44:39.055767 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cq7rt" event={"ID":"27ff2b0a-bd1e-4263-bd46-532a5790f1f3","Type":"ContainerDied","Data":"f823d6527fe9f4ae1d7c208322bf7e841de5d3ba91441b08c5aae0e4fd4589bc"} Dec 09 11:44:40 crc kubenswrapper[5002]: I1209 11:44:39.061661 5002 generic.go:334] "Generic (PLEG): container finished" podID="6853242a-1640-4e72-a63f-61931ee9ea3f" containerID="9ac4e5497a8525dd1daeabc5cd5c44539bcf3191e268cfa112e9eb1ccfa97d1c" exitCode=0 Dec 09 11:44:40 crc kubenswrapper[5002]: I1209 11:44:39.061692 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qmhrf" event={"ID":"6853242a-1640-4e72-a63f-61931ee9ea3f","Type":"ContainerDied","Data":"9ac4e5497a8525dd1daeabc5cd5c44539bcf3191e268cfa112e9eb1ccfa97d1c"} Dec 09 11:44:40 crc kubenswrapper[5002]: I1209 11:44:39.064404 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ffa689c7-aa9c-49e5-81ca-607f501e98bd","Type":"ContainerStarted","Data":"4cfee57eca2ef3b1e95e5ef1f9370d679a5879b48559af91d1cc7158006aac06"} Dec 09 11:44:42 crc kubenswrapper[5002]: I1209 11:44:42.101398 5002 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cq7rt" event={"ID":"27ff2b0a-bd1e-4263-bd46-532a5790f1f3","Type":"ContainerStarted","Data":"842c1833926ff8a2978e076719887281cf291307ee24b59bc53c950bf31fc9bf"} Dec 09 11:44:42 crc kubenswrapper[5002]: I1209 11:44:42.108261 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qmhrf" event={"ID":"6853242a-1640-4e72-a63f-61931ee9ea3f","Type":"ContainerStarted","Data":"eb69c4d3ec3c48a1d30ac9d22291af2ada93379f84695287065c23ccd83ae433"} Dec 09 11:44:42 crc kubenswrapper[5002]: I1209 11:44:42.121781 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ffa689c7-aa9c-49e5-81ca-607f501e98bd","Type":"ContainerStarted","Data":"c93d7e425a8b020b67a6e13b9f4b0bb0d940be09678dabb174f33bdf9a910839"} Dec 09 11:44:44 crc kubenswrapper[5002]: I1209 11:44:44.060903 5002 scope.go:117] "RemoveContainer" containerID="39962d0376837cc534e6b0a62303166efdae767fb36cfb81ae7c7eb077d56c3e" Dec 09 11:44:44 crc kubenswrapper[5002]: E1209 11:44:44.061796 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:44:45 crc kubenswrapper[5002]: I1209 11:44:45.150789 5002 generic.go:334] "Generic (PLEG): container finished" podID="27ff2b0a-bd1e-4263-bd46-532a5790f1f3" containerID="842c1833926ff8a2978e076719887281cf291307ee24b59bc53c950bf31fc9bf" exitCode=0 Dec 09 11:44:45 crc kubenswrapper[5002]: I1209 11:44:45.150846 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cq7rt" event={"ID":"27ff2b0a-bd1e-4263-bd46-532a5790f1f3","Type":"ContainerDied","Data":"842c1833926ff8a2978e076719887281cf291307ee24b59bc53c950bf31fc9bf"} Dec 09 11:44:46 crc kubenswrapper[5002]: I1209 11:44:46.354274 5002 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 1.055179245s: [/var/lib/containers/storage/overlay/c84b64e6fb5a9d4ae067a552805ddc419f52caddf328cd7ee7d41e3931e6eb96/diff /var/log/pods/openstack_alertmanager-metric-storage-0_3fe6db24-e908-49ab-be4c-5482809166e0/alertmanager/0.log]; will not log again for this container unless duration exceeds 2s Dec 09 11:44:47 crc kubenswrapper[5002]: I1209 11:44:47.053488 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-8kjkp"] Dec 09 11:44:47 crc kubenswrapper[5002]: I1209 11:44:47.063415 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-8kjkp"] Dec 09 11:44:48 crc kubenswrapper[5002]: I1209 11:44:48.109688 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8013239c-dfd4-407a-aa67-d57b2ce2aac5" path="/var/lib/kubelet/pods/8013239c-dfd4-407a-aa67-d57b2ce2aac5/volumes" Dec 09 11:44:48 crc kubenswrapper[5002]: I1209 11:44:48.785624 5002 scope.go:117] "RemoveContainer" containerID="3e59c03533020fe2ca5756b8f99eb2364b35c8f23674f71f840e505b68b56eac" Dec 09 11:44:49 crc kubenswrapper[5002]: I1209 11:44:49.912711 5002 scope.go:117] "RemoveContainer" containerID="b3d392e02c2adf871c8bfa7fec0bd8307d58287da6ff4b2875b7e99a0e4df62a" Dec 09 11:44:49 crc kubenswrapper[5002]: I1209 11:44:49.959170 5002 
scope.go:117] "RemoveContainer" containerID="4943ee674f928e1daa38145cc1df49f274cc9de9527018f716f2a60fb5fd4d20" Dec 09 11:44:50 crc kubenswrapper[5002]: I1209 11:44:50.011486 5002 scope.go:117] "RemoveContainer" containerID="d016bff16ba99528c7723b173b133ea96bad49711180df12a533544c72ea88a0" Dec 09 11:44:50 crc kubenswrapper[5002]: I1209 11:44:50.082559 5002 scope.go:117] "RemoveContainer" containerID="c62a7bd81f29cab3e89ac6d0e06dd2164973b7b6399f640e691f17b8ab0f37c3" Dec 09 11:44:50 crc kubenswrapper[5002]: I1209 11:44:50.105525 5002 scope.go:117] "RemoveContainer" containerID="bc386f309010208a9e507929a5a7930742f3b1063b25febf38482d67eb33280f" Dec 09 11:44:50 crc kubenswrapper[5002]: I1209 11:44:50.159831 5002 scope.go:117] "RemoveContainer" containerID="bc3058c8296bc813b787621a3c504285a0ae7c0905d7ec99d8318031a21d4e2f" Dec 09 11:44:50 crc kubenswrapper[5002]: I1209 11:44:50.184766 5002 scope.go:117] "RemoveContainer" containerID="d9b186599c730c3ced7d8b2d84556ffd618c00151e035c62bce2480676ab9269" Dec 09 11:44:50 crc kubenswrapper[5002]: I1209 11:44:50.217436 5002 scope.go:117] "RemoveContainer" containerID="c8be20e4e0e5c214f785b464bd36592547711f4fddd96159be4aa632d51e3dbf" Dec 09 11:44:50 crc kubenswrapper[5002]: I1209 11:44:50.257047 5002 scope.go:117] "RemoveContainer" containerID="634ad4e82176747ed0439293087d0fa0499d64dac7e1b6fb3087fd5178f4cf7d" Dec 09 11:44:50 crc kubenswrapper[5002]: I1209 11:44:50.876883 5002 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.090806408s: [/var/lib/containers/storage/overlay/ccd2780fa90658ee6e92ce4470edeae4c6a9f2bde60747d1d6f959a43e50fe94/diff /var/log/pods/openstack_prometheus-metric-storage-0_49ee88e1-200f-4c9b-8411-cedfd8cf9afb/prometheus/0.log]; will not log again for this container unless duration exceeds 2s Dec 09 11:44:51 crc kubenswrapper[5002]: I1209 11:44:51.251917 5002 generic.go:334] "Generic (PLEG): container finished" podID="6853242a-1640-4e72-a63f-61931ee9ea3f" containerID="eb69c4d3ec3c48a1d30ac9d22291af2ada93379f84695287065c23ccd83ae433" exitCode=0 Dec 09 11:44:51 crc kubenswrapper[5002]: I1209 11:44:51.251983 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qmhrf" event={"ID":"6853242a-1640-4e72-a63f-61931ee9ea3f","Type":"ContainerDied","Data":"eb69c4d3ec3c48a1d30ac9d22291af2ada93379f84695287065c23ccd83ae433"} Dec 09 11:44:55 crc kubenswrapper[5002]: I1209 11:44:55.295288 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cq7rt" event={"ID":"27ff2b0a-bd1e-4263-bd46-532a5790f1f3","Type":"ContainerStarted","Data":"22c41dc8cccc2fe5895054ac656eab3940ed2407e8412ca93455a87c1fb3f6c5"} Dec 09 11:44:55 crc kubenswrapper[5002]: I1209 11:44:55.297395 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qmhrf" event={"ID":"6853242a-1640-4e72-a63f-61931ee9ea3f","Type":"ContainerStarted","Data":"d2d75b4285fd009c67d50b1769cb947fa04d53ee9fa3fe058daf0901ca89c605"} Dec 09 11:44:55 crc kubenswrapper[5002]: I1209 11:44:55.299321 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ffa689c7-aa9c-49e5-81ca-607f501e98bd","Type":"ContainerStarted","Data":"9d6ffc9330b05f051acdd6f43db4040e61c1c555a99f16ab1c4028fc34f3d4fa"} Dec 09 11:44:55 crc kubenswrapper[5002]: I1209 11:44:55.321999 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cq7rt" podStartSLOduration=6.855858427 
podStartE2EDuration="21.321967869s" podCreationTimestamp="2025-12-09 11:44:34 +0000 UTC" firstStartedPulling="2025-12-09 11:44:39.057602812 +0000 UTC m=+6211.449653893" lastFinishedPulling="2025-12-09 11:44:53.523712254 +0000 UTC m=+6225.915763335" observedRunningTime="2025-12-09 11:44:55.313889373 +0000 UTC m=+6227.705940444" watchObservedRunningTime="2025-12-09 11:44:55.321967869 +0000 UTC m=+6227.714018950" Dec 09 11:44:55 crc kubenswrapper[5002]: I1209 11:44:55.334115 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qmhrf" podStartSLOduration=9.130122207 podStartE2EDuration="23.334091464s" podCreationTimestamp="2025-12-09 11:44:32 +0000 UTC" firstStartedPulling="2025-12-09 11:44:40.076285001 +0000 UTC m=+6212.468336082" lastFinishedPulling="2025-12-09 11:44:54.280254258 +0000 UTC m=+6226.672305339" observedRunningTime="2025-12-09 11:44:55.333358984 +0000 UTC m=+6227.725410085" watchObservedRunningTime="2025-12-09 11:44:55.334091464 +0000 UTC m=+6227.726142555" Dec 09 11:44:57 crc kubenswrapper[5002]: I1209 11:44:57.319186 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ffa689c7-aa9c-49e5-81ca-607f501e98bd","Type":"ContainerStarted","Data":"4550af5a865220af0921f592d91bb58856e750d6cb2b2a81e0d9839b752be76c"} Dec 09 11:44:57 crc kubenswrapper[5002]: I1209 11:44:57.319775 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 09 11:44:57 crc kubenswrapper[5002]: I1209 11:44:57.375836 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.942581146 podStartE2EDuration="44.375806781s" podCreationTimestamp="2025-12-09 11:44:13 +0000 UTC" firstStartedPulling="2025-12-09 11:44:14.592031931 +0000 UTC m=+6186.984083032" lastFinishedPulling="2025-12-09 11:44:56.025257586 +0000 UTC m=+6228.417308667" observedRunningTime="2025-12-09 11:44:57.36904166 +0000 UTC m=+6229.761092741" watchObservedRunningTime="2025-12-09 11:44:57.375806781 +0000 UTC m=+6229.767857862" Dec 09 11:44:58 crc kubenswrapper[5002]: I1209 11:44:58.069444 5002 scope.go:117] "RemoveContainer" containerID="39962d0376837cc534e6b0a62303166efdae767fb36cfb81ae7c7eb077d56c3e" Dec 09 11:44:58 crc kubenswrapper[5002]: E1209 11:44:58.069683 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:45:00 crc kubenswrapper[5002]: I1209 11:45:00.157297 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421345-6ksl5"] Dec 09 11:45:00 crc kubenswrapper[5002]: I1209 11:45:00.159464 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421345-6ksl5" Dec 09 11:45:00 crc kubenswrapper[5002]: I1209 11:45:00.161406 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 09 11:45:00 crc kubenswrapper[5002]: I1209 11:45:00.161754 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 09 11:45:00 crc kubenswrapper[5002]: I1209 11:45:00.172328 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421345-6ksl5"] Dec 09 11:45:00 crc kubenswrapper[5002]: I1209 11:45:00.293125 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/84b92a8a-c566-499c-903d-9fc0021b343e-secret-volume\") pod \"collect-profiles-29421345-6ksl5\" (UID: \"84b92a8a-c566-499c-903d-9fc0021b343e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421345-6ksl5" Dec 09 11:45:00 crc kubenswrapper[5002]: I1209 11:45:00.293373 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/84b92a8a-c566-499c-903d-9fc0021b343e-config-volume\") pod \"collect-profiles-29421345-6ksl5\" (UID: \"84b92a8a-c566-499c-903d-9fc0021b343e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421345-6ksl5" Dec 09 11:45:00 crc kubenswrapper[5002]: I1209 11:45:00.293728 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lssq6\" (UniqueName: \"kubernetes.io/projected/84b92a8a-c566-499c-903d-9fc0021b343e-kube-api-access-lssq6\") pod \"collect-profiles-29421345-6ksl5\" (UID: \"84b92a8a-c566-499c-903d-9fc0021b343e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421345-6ksl5" Dec 09 11:45:00 crc kubenswrapper[5002]: I1209 11:45:00.395432 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lssq6\" (UniqueName: \"kubernetes.io/projected/84b92a8a-c566-499c-903d-9fc0021b343e-kube-api-access-lssq6\") pod \"collect-profiles-29421345-6ksl5\" (UID: \"84b92a8a-c566-499c-903d-9fc0021b343e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421345-6ksl5" Dec 09 11:45:00 crc kubenswrapper[5002]: I1209 11:45:00.395595 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/84b92a8a-c566-499c-903d-9fc0021b343e-secret-volume\") pod \"collect-profiles-29421345-6ksl5\" (UID: \"84b92a8a-c566-499c-903d-9fc0021b343e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421345-6ksl5" Dec 09 11:45:00 crc kubenswrapper[5002]: I1209 11:45:00.395671 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/84b92a8a-c566-499c-903d-9fc0021b343e-config-volume\") pod \"collect-profiles-29421345-6ksl5\" (UID: \"84b92a8a-c566-499c-903d-9fc0021b343e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421345-6ksl5" Dec 09 11:45:00 crc kubenswrapper[5002]: I1209 11:45:00.396626 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/84b92a8a-c566-499c-903d-9fc0021b343e-config-volume\") pod 
\"collect-profiles-29421345-6ksl5\" (UID: \"84b92a8a-c566-499c-903d-9fc0021b343e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421345-6ksl5" Dec 09 11:45:00 crc kubenswrapper[5002]: I1209 11:45:00.402166 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/84b92a8a-c566-499c-903d-9fc0021b343e-secret-volume\") pod \"collect-profiles-29421345-6ksl5\" (UID: \"84b92a8a-c566-499c-903d-9fc0021b343e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421345-6ksl5" Dec 09 11:45:00 crc kubenswrapper[5002]: I1209 11:45:00.424714 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lssq6\" (UniqueName: \"kubernetes.io/projected/84b92a8a-c566-499c-903d-9fc0021b343e-kube-api-access-lssq6\") pod \"collect-profiles-29421345-6ksl5\" (UID: \"84b92a8a-c566-499c-903d-9fc0021b343e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421345-6ksl5" Dec 09 11:45:00 crc kubenswrapper[5002]: I1209 11:45:00.493287 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421345-6ksl5" Dec 09 11:45:00 crc kubenswrapper[5002]: I1209 11:45:00.991642 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421345-6ksl5"] Dec 09 11:45:00 crc kubenswrapper[5002]: W1209 11:45:00.993409 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84b92a8a_c566_499c_903d_9fc0021b343e.slice/crio-f15c04bf137056e09e8a89bc622a03d972940c67920b98ea8b755016ee9afb78 WatchSource:0}: Error finding container f15c04bf137056e09e8a89bc622a03d972940c67920b98ea8b755016ee9afb78: Status 404 returned error can't find the container with id f15c04bf137056e09e8a89bc622a03d972940c67920b98ea8b755016ee9afb78 Dec 09 11:45:01 crc kubenswrapper[5002]: I1209 11:45:01.375583 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421345-6ksl5" event={"ID":"84b92a8a-c566-499c-903d-9fc0021b343e","Type":"ContainerStarted","Data":"8a1b453c23b1d66d7543315ea300dee42cc788781abcda5d26c38ed4ee58aa23"} Dec 09 11:45:01 crc kubenswrapper[5002]: I1209 11:45:01.375929 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421345-6ksl5" event={"ID":"84b92a8a-c566-499c-903d-9fc0021b343e","Type":"ContainerStarted","Data":"f15c04bf137056e09e8a89bc622a03d972940c67920b98ea8b755016ee9afb78"} Dec 09 11:45:01 crc kubenswrapper[5002]: I1209 11:45:01.398114 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29421345-6ksl5" podStartSLOduration=1.398094176 podStartE2EDuration="1.398094176s" podCreationTimestamp="2025-12-09 11:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:45:01.395043274 +0000 UTC m=+6233.787094355" watchObservedRunningTime="2025-12-09 11:45:01.398094176 +0000 UTC m=+6233.790145257" Dec 09 11:45:02 crc kubenswrapper[5002]: I1209 11:45:02.385955 5002 generic.go:334] "Generic (PLEG): container finished" podID="84b92a8a-c566-499c-903d-9fc0021b343e" containerID="8a1b453c23b1d66d7543315ea300dee42cc788781abcda5d26c38ed4ee58aa23" exitCode=0 Dec 09 11:45:02 crc kubenswrapper[5002]: I1209 11:45:02.386005 
5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421345-6ksl5" event={"ID":"84b92a8a-c566-499c-903d-9fc0021b343e","Type":"ContainerDied","Data":"8a1b453c23b1d66d7543315ea300dee42cc788781abcda5d26c38ed4ee58aa23"} Dec 09 11:45:02 crc kubenswrapper[5002]: I1209 11:45:02.414412 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-78vks"] Dec 09 11:45:02 crc kubenswrapper[5002]: I1209 11:45:02.416352 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-78vks" Dec 09 11:45:02 crc kubenswrapper[5002]: I1209 11:45:02.442861 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-78vks"] Dec 09 11:45:02 crc kubenswrapper[5002]: I1209 11:45:02.520462 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-364d-account-create-update-5gppk"] Dec 09 11:45:02 crc kubenswrapper[5002]: I1209 11:45:02.522351 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-364d-account-create-update-5gppk" Dec 09 11:45:02 crc kubenswrapper[5002]: I1209 11:45:02.525032 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Dec 09 11:45:02 crc kubenswrapper[5002]: I1209 11:45:02.530184 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-364d-account-create-update-5gppk"] Dec 09 11:45:02 crc kubenswrapper[5002]: I1209 11:45:02.541049 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpnrc\" (UniqueName: \"kubernetes.io/projected/45156329-405f-4f04-9a55-3568465720b3-kube-api-access-fpnrc\") pod \"aodh-db-create-78vks\" (UID: \"45156329-405f-4f04-9a55-3568465720b3\") " pod="openstack/aodh-db-create-78vks" Dec 09 11:45:02 crc kubenswrapper[5002]: I1209 11:45:02.541195 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45156329-405f-4f04-9a55-3568465720b3-operator-scripts\") pod \"aodh-db-create-78vks\" (UID: \"45156329-405f-4f04-9a55-3568465720b3\") " pod="openstack/aodh-db-create-78vks" Dec 09 11:45:02 crc kubenswrapper[5002]: I1209 11:45:02.620787 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qmhrf" Dec 09 11:45:02 crc kubenswrapper[5002]: I1209 11:45:02.620854 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qmhrf" Dec 09 11:45:02 crc kubenswrapper[5002]: I1209 11:45:02.642854 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpnrc\" (UniqueName: \"kubernetes.io/projected/45156329-405f-4f04-9a55-3568465720b3-kube-api-access-fpnrc\") pod \"aodh-db-create-78vks\" (UID: \"45156329-405f-4f04-9a55-3568465720b3\") " pod="openstack/aodh-db-create-78vks" Dec 09 11:45:02 crc kubenswrapper[5002]: I1209 11:45:02.642992 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c94563f6-d63b-478e-a796-9615ab525f8b-operator-scripts\") pod \"aodh-364d-account-create-update-5gppk\" (UID: \"c94563f6-d63b-478e-a796-9615ab525f8b\") " pod="openstack/aodh-364d-account-create-update-5gppk" Dec 09 11:45:02 crc kubenswrapper[5002]: I1209 11:45:02.643020 5002 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxqcj\" (UniqueName: \"kubernetes.io/projected/c94563f6-d63b-478e-a796-9615ab525f8b-kube-api-access-zxqcj\") pod \"aodh-364d-account-create-update-5gppk\" (UID: \"c94563f6-d63b-478e-a796-9615ab525f8b\") " pod="openstack/aodh-364d-account-create-update-5gppk" Dec 09 11:45:02 crc kubenswrapper[5002]: I1209 11:45:02.643059 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45156329-405f-4f04-9a55-3568465720b3-operator-scripts\") pod \"aodh-db-create-78vks\" (UID: \"45156329-405f-4f04-9a55-3568465720b3\") " pod="openstack/aodh-db-create-78vks" Dec 09 11:45:02 crc kubenswrapper[5002]: I1209 11:45:02.643727 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45156329-405f-4f04-9a55-3568465720b3-operator-scripts\") pod \"aodh-db-create-78vks\" (UID: \"45156329-405f-4f04-9a55-3568465720b3\") " pod="openstack/aodh-db-create-78vks" Dec 09 11:45:02 crc kubenswrapper[5002]: I1209 11:45:02.667677 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpnrc\" (UniqueName: \"kubernetes.io/projected/45156329-405f-4f04-9a55-3568465720b3-kube-api-access-fpnrc\") pod \"aodh-db-create-78vks\" (UID: \"45156329-405f-4f04-9a55-3568465720b3\") " pod="openstack/aodh-db-create-78vks" Dec 09 11:45:02 crc kubenswrapper[5002]: I1209 11:45:02.691779 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qmhrf" Dec 09 11:45:02 crc kubenswrapper[5002]: I1209 11:45:02.744883 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c94563f6-d63b-478e-a796-9615ab525f8b-operator-scripts\") pod \"aodh-364d-account-create-update-5gppk\" (UID: \"c94563f6-d63b-478e-a796-9615ab525f8b\") " pod="openstack/aodh-364d-account-create-update-5gppk" Dec 09 11:45:02 crc kubenswrapper[5002]: I1209 11:45:02.744948 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxqcj\" (UniqueName: \"kubernetes.io/projected/c94563f6-d63b-478e-a796-9615ab525f8b-kube-api-access-zxqcj\") pod \"aodh-364d-account-create-update-5gppk\" (UID: \"c94563f6-d63b-478e-a796-9615ab525f8b\") " pod="openstack/aodh-364d-account-create-update-5gppk" Dec 09 11:45:02 crc kubenswrapper[5002]: I1209 11:45:02.745735 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c94563f6-d63b-478e-a796-9615ab525f8b-operator-scripts\") pod \"aodh-364d-account-create-update-5gppk\" (UID: \"c94563f6-d63b-478e-a796-9615ab525f8b\") " pod="openstack/aodh-364d-account-create-update-5gppk" Dec 09 11:45:02 crc kubenswrapper[5002]: I1209 11:45:02.745860 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-78vks" Dec 09 11:45:02 crc kubenswrapper[5002]: I1209 11:45:02.768691 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxqcj\" (UniqueName: \"kubernetes.io/projected/c94563f6-d63b-478e-a796-9615ab525f8b-kube-api-access-zxqcj\") pod \"aodh-364d-account-create-update-5gppk\" (UID: \"c94563f6-d63b-478e-a796-9615ab525f8b\") " pod="openstack/aodh-364d-account-create-update-5gppk" Dec 09 11:45:02 crc kubenswrapper[5002]: I1209 11:45:02.843649 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-364d-account-create-update-5gppk" Dec 09 11:45:03 crc kubenswrapper[5002]: I1209 11:45:03.372829 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-78vks"] Dec 09 11:45:03 crc kubenswrapper[5002]: I1209 11:45:03.398355 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-78vks" event={"ID":"45156329-405f-4f04-9a55-3568465720b3","Type":"ContainerStarted","Data":"3ccf81e90d5e92df662bc4653a0706359f631d7d43e208422972f7a6deec1eeb"} Dec 09 11:45:03 crc kubenswrapper[5002]: I1209 11:45:03.460198 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qmhrf" Dec 09 11:45:03 crc kubenswrapper[5002]: I1209 11:45:03.490377 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-364d-account-create-update-5gppk"] Dec 09 11:45:03 crc kubenswrapper[5002]: I1209 11:45:03.519773 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qmhrf"] Dec 09 11:45:03 crc kubenswrapper[5002]: I1209 11:45:03.815249 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421345-6ksl5" Dec 09 11:45:03 crc kubenswrapper[5002]: I1209 11:45:03.874232 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/84b92a8a-c566-499c-903d-9fc0021b343e-config-volume\") pod \"84b92a8a-c566-499c-903d-9fc0021b343e\" (UID: \"84b92a8a-c566-499c-903d-9fc0021b343e\") " Dec 09 11:45:03 crc kubenswrapper[5002]: I1209 11:45:03.874331 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lssq6\" (UniqueName: \"kubernetes.io/projected/84b92a8a-c566-499c-903d-9fc0021b343e-kube-api-access-lssq6\") pod \"84b92a8a-c566-499c-903d-9fc0021b343e\" (UID: \"84b92a8a-c566-499c-903d-9fc0021b343e\") " Dec 09 11:45:03 crc kubenswrapper[5002]: I1209 11:45:03.874410 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/84b92a8a-c566-499c-903d-9fc0021b343e-secret-volume\") pod \"84b92a8a-c566-499c-903d-9fc0021b343e\" (UID: \"84b92a8a-c566-499c-903d-9fc0021b343e\") " Dec 09 11:45:03 crc kubenswrapper[5002]: I1209 11:45:03.875673 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84b92a8a-c566-499c-903d-9fc0021b343e-config-volume" (OuterVolumeSpecName: "config-volume") pod "84b92a8a-c566-499c-903d-9fc0021b343e" (UID: "84b92a8a-c566-499c-903d-9fc0021b343e"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:45:03 crc kubenswrapper[5002]: I1209 11:45:03.881020 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84b92a8a-c566-499c-903d-9fc0021b343e-kube-api-access-lssq6" (OuterVolumeSpecName: "kube-api-access-lssq6") pod "84b92a8a-c566-499c-903d-9fc0021b343e" (UID: "84b92a8a-c566-499c-903d-9fc0021b343e"). InnerVolumeSpecName "kube-api-access-lssq6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:45:03 crc kubenswrapper[5002]: I1209 11:45:03.881111 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84b92a8a-c566-499c-903d-9fc0021b343e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "84b92a8a-c566-499c-903d-9fc0021b343e" (UID: "84b92a8a-c566-499c-903d-9fc0021b343e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:45:03 crc kubenswrapper[5002]: I1209 11:45:03.977350 5002 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/84b92a8a-c566-499c-903d-9fc0021b343e-config-volume\") on node \"crc\" DevicePath \"\"" Dec 09 11:45:03 crc kubenswrapper[5002]: I1209 11:45:03.977988 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lssq6\" (UniqueName: \"kubernetes.io/projected/84b92a8a-c566-499c-903d-9fc0021b343e-kube-api-access-lssq6\") on node \"crc\" DevicePath \"\"" Dec 09 11:45:03 crc kubenswrapper[5002]: I1209 11:45:03.978006 5002 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/84b92a8a-c566-499c-903d-9fc0021b343e-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 09 11:45:04 crc kubenswrapper[5002]: I1209 11:45:04.416738 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-364d-account-create-update-5gppk" event={"ID":"c94563f6-d63b-478e-a796-9615ab525f8b","Type":"ContainerStarted","Data":"636e8e457008c6d06bcea2a0074f00b0ef6e363760f3d379a16dd2bbeaea2918"} Dec 09 11:45:04 crc kubenswrapper[5002]: I1209 11:45:04.416792 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-364d-account-create-update-5gppk" event={"ID":"c94563f6-d63b-478e-a796-9615ab525f8b","Type":"ContainerStarted","Data":"9a10dbf70b936b7f0b240d8fec0bd83b3b0d97451237423d59c9ceb6a5cd16db"} Dec 09 11:45:04 crc kubenswrapper[5002]: I1209 11:45:04.419846 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421345-6ksl5" event={"ID":"84b92a8a-c566-499c-903d-9fc0021b343e","Type":"ContainerDied","Data":"f15c04bf137056e09e8a89bc622a03d972940c67920b98ea8b755016ee9afb78"} Dec 09 11:45:04 crc kubenswrapper[5002]: I1209 11:45:04.420163 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f15c04bf137056e09e8a89bc622a03d972940c67920b98ea8b755016ee9afb78" Dec 09 11:45:04 crc kubenswrapper[5002]: I1209 11:45:04.419864 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421345-6ksl5" Dec 09 11:45:04 crc kubenswrapper[5002]: I1209 11:45:04.422420 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-78vks" event={"ID":"45156329-405f-4f04-9a55-3568465720b3","Type":"ContainerStarted","Data":"19c556d2629036d9a997f5c5b1a3bbf0146f2d067156fff6ad0847815dd15620"} Dec 09 11:45:04 crc kubenswrapper[5002]: I1209 11:45:04.440266 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-364d-account-create-update-5gppk" podStartSLOduration=2.440156885 podStartE2EDuration="2.440156885s" podCreationTimestamp="2025-12-09 11:45:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:45:04.429292554 +0000 UTC m=+6236.821343655" watchObservedRunningTime="2025-12-09 11:45:04.440156885 +0000 UTC m=+6236.832207966" Dec 09 11:45:04 crc kubenswrapper[5002]: I1209 11:45:04.453696 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-create-78vks" podStartSLOduration=2.453678048 podStartE2EDuration="2.453678048s" podCreationTimestamp="2025-12-09 11:45:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:45:04.448697314 +0000 UTC m=+6236.840748425" watchObservedRunningTime="2025-12-09 11:45:04.453678048 +0000 UTC m=+6236.845729129" Dec 09 11:45:04 crc kubenswrapper[5002]: I1209 11:45:04.493738 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421300-mvt9r"] Dec 09 11:45:04 crc kubenswrapper[5002]: I1209 11:45:04.502625 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421300-mvt9r"] Dec 09 11:45:05 crc kubenswrapper[5002]: I1209 11:45:05.122921 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cq7rt" Dec 09 11:45:05 crc kubenswrapper[5002]: I1209 11:45:05.122978 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cq7rt" Dec 09 11:45:05 crc kubenswrapper[5002]: I1209 11:45:05.169067 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cq7rt" Dec 09 11:45:05 crc kubenswrapper[5002]: I1209 11:45:05.453920 5002 generic.go:334] "Generic (PLEG): container finished" podID="c94563f6-d63b-478e-a796-9615ab525f8b" containerID="636e8e457008c6d06bcea2a0074f00b0ef6e363760f3d379a16dd2bbeaea2918" exitCode=0 Dec 09 11:45:05 crc kubenswrapper[5002]: I1209 11:45:05.454178 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-364d-account-create-update-5gppk" event={"ID":"c94563f6-d63b-478e-a796-9615ab525f8b","Type":"ContainerDied","Data":"636e8e457008c6d06bcea2a0074f00b0ef6e363760f3d379a16dd2bbeaea2918"} Dec 09 11:45:05 crc kubenswrapper[5002]: I1209 11:45:05.459694 5002 generic.go:334] "Generic (PLEG): container finished" podID="45156329-405f-4f04-9a55-3568465720b3" containerID="19c556d2629036d9a997f5c5b1a3bbf0146f2d067156fff6ad0847815dd15620" exitCode=0 Dec 09 11:45:05 crc kubenswrapper[5002]: I1209 11:45:05.459786 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-78vks" 
event={"ID":"45156329-405f-4f04-9a55-3568465720b3","Type":"ContainerDied","Data":"19c556d2629036d9a997f5c5b1a3bbf0146f2d067156fff6ad0847815dd15620"} Dec 09 11:45:05 crc kubenswrapper[5002]: I1209 11:45:05.460121 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qmhrf" podUID="6853242a-1640-4e72-a63f-61931ee9ea3f" containerName="registry-server" containerID="cri-o://d2d75b4285fd009c67d50b1769cb947fa04d53ee9fa3fe058daf0901ca89c605" gracePeriod=2 Dec 09 11:45:05 crc kubenswrapper[5002]: I1209 11:45:05.519507 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cq7rt" Dec 09 11:45:05 crc kubenswrapper[5002]: I1209 11:45:05.891154 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cq7rt"] Dec 09 11:45:06 crc kubenswrapper[5002]: I1209 11:45:06.075410 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf7f5a20-7e58-4d15-ad78-e9b07d787bd8" path="/var/lib/kubelet/pods/cf7f5a20-7e58-4d15-ad78-e9b07d787bd8/volumes" Dec 09 11:45:06 crc kubenswrapper[5002]: I1209 11:45:06.472399 5002 generic.go:334] "Generic (PLEG): container finished" podID="6853242a-1640-4e72-a63f-61931ee9ea3f" containerID="d2d75b4285fd009c67d50b1769cb947fa04d53ee9fa3fe058daf0901ca89c605" exitCode=0 Dec 09 11:45:06 crc kubenswrapper[5002]: I1209 11:45:06.472992 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qmhrf" event={"ID":"6853242a-1640-4e72-a63f-61931ee9ea3f","Type":"ContainerDied","Data":"d2d75b4285fd009c67d50b1769cb947fa04d53ee9fa3fe058daf0901ca89c605"} Dec 09 11:45:06 crc kubenswrapper[5002]: I1209 11:45:06.473036 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qmhrf" event={"ID":"6853242a-1640-4e72-a63f-61931ee9ea3f","Type":"ContainerDied","Data":"f547e8679209f57676cdd6fb41377bbb5f2c9d352a284a0be130f46ae5283552"} Dec 09 11:45:06 crc kubenswrapper[5002]: I1209 11:45:06.473046 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f547e8679209f57676cdd6fb41377bbb5f2c9d352a284a0be130f46ae5283552" Dec 09 11:45:06 crc kubenswrapper[5002]: I1209 11:45:06.545570 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qmhrf" Dec 09 11:45:06 crc kubenswrapper[5002]: I1209 11:45:06.629580 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6853242a-1640-4e72-a63f-61931ee9ea3f-catalog-content\") pod \"6853242a-1640-4e72-a63f-61931ee9ea3f\" (UID: \"6853242a-1640-4e72-a63f-61931ee9ea3f\") " Dec 09 11:45:06 crc kubenswrapper[5002]: I1209 11:45:06.629764 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4x8c\" (UniqueName: \"kubernetes.io/projected/6853242a-1640-4e72-a63f-61931ee9ea3f-kube-api-access-m4x8c\") pod \"6853242a-1640-4e72-a63f-61931ee9ea3f\" (UID: \"6853242a-1640-4e72-a63f-61931ee9ea3f\") " Dec 09 11:45:06 crc kubenswrapper[5002]: I1209 11:45:06.630930 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6853242a-1640-4e72-a63f-61931ee9ea3f-utilities\") pod \"6853242a-1640-4e72-a63f-61931ee9ea3f\" (UID: \"6853242a-1640-4e72-a63f-61931ee9ea3f\") " Dec 09 11:45:06 crc kubenswrapper[5002]: I1209 11:45:06.631617 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6853242a-1640-4e72-a63f-61931ee9ea3f-utilities" (OuterVolumeSpecName: "utilities") pod "6853242a-1640-4e72-a63f-61931ee9ea3f" (UID: "6853242a-1640-4e72-a63f-61931ee9ea3f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:45:06 crc kubenswrapper[5002]: I1209 11:45:06.637671 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6853242a-1640-4e72-a63f-61931ee9ea3f-kube-api-access-m4x8c" (OuterVolumeSpecName: "kube-api-access-m4x8c") pod "6853242a-1640-4e72-a63f-61931ee9ea3f" (UID: "6853242a-1640-4e72-a63f-61931ee9ea3f"). InnerVolumeSpecName "kube-api-access-m4x8c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:45:06 crc kubenswrapper[5002]: I1209 11:45:06.675267 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6853242a-1640-4e72-a63f-61931ee9ea3f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6853242a-1640-4e72-a63f-61931ee9ea3f" (UID: "6853242a-1640-4e72-a63f-61931ee9ea3f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:45:06 crc kubenswrapper[5002]: I1209 11:45:06.733759 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6853242a-1640-4e72-a63f-61931ee9ea3f-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 11:45:06 crc kubenswrapper[5002]: I1209 11:45:06.733798 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6853242a-1640-4e72-a63f-61931ee9ea3f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 11:45:06 crc kubenswrapper[5002]: I1209 11:45:06.733904 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4x8c\" (UniqueName: \"kubernetes.io/projected/6853242a-1640-4e72-a63f-61931ee9ea3f-kube-api-access-m4x8c\") on node \"crc\" DevicePath \"\"" Dec 09 11:45:07 crc kubenswrapper[5002]: I1209 11:45:07.014425 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-364d-account-create-update-5gppk" Dec 09 11:45:07 crc kubenswrapper[5002]: I1209 11:45:07.023679 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-78vks" Dec 09 11:45:07 crc kubenswrapper[5002]: I1209 11:45:07.145172 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpnrc\" (UniqueName: \"kubernetes.io/projected/45156329-405f-4f04-9a55-3568465720b3-kube-api-access-fpnrc\") pod \"45156329-405f-4f04-9a55-3568465720b3\" (UID: \"45156329-405f-4f04-9a55-3568465720b3\") " Dec 09 11:45:07 crc kubenswrapper[5002]: I1209 11:45:07.145320 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45156329-405f-4f04-9a55-3568465720b3-operator-scripts\") pod \"45156329-405f-4f04-9a55-3568465720b3\" (UID: \"45156329-405f-4f04-9a55-3568465720b3\") " Dec 09 11:45:07 crc kubenswrapper[5002]: I1209 11:45:07.145617 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxqcj\" (UniqueName: \"kubernetes.io/projected/c94563f6-d63b-478e-a796-9615ab525f8b-kube-api-access-zxqcj\") pod \"c94563f6-d63b-478e-a796-9615ab525f8b\" (UID: \"c94563f6-d63b-478e-a796-9615ab525f8b\") " Dec 09 11:45:07 crc kubenswrapper[5002]: I1209 11:45:07.145709 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c94563f6-d63b-478e-a796-9615ab525f8b-operator-scripts\") pod \"c94563f6-d63b-478e-a796-9615ab525f8b\" (UID: \"c94563f6-d63b-478e-a796-9615ab525f8b\") " Dec 09 11:45:07 crc kubenswrapper[5002]: I1209 11:45:07.146281 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c94563f6-d63b-478e-a796-9615ab525f8b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c94563f6-d63b-478e-a796-9615ab525f8b" (UID: "c94563f6-d63b-478e-a796-9615ab525f8b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:45:07 crc kubenswrapper[5002]: I1209 11:45:07.147037 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45156329-405f-4f04-9a55-3568465720b3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "45156329-405f-4f04-9a55-3568465720b3" (UID: "45156329-405f-4f04-9a55-3568465720b3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:45:07 crc kubenswrapper[5002]: I1209 11:45:07.150097 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45156329-405f-4f04-9a55-3568465720b3-kube-api-access-fpnrc" (OuterVolumeSpecName: "kube-api-access-fpnrc") pod "45156329-405f-4f04-9a55-3568465720b3" (UID: "45156329-405f-4f04-9a55-3568465720b3"). InnerVolumeSpecName "kube-api-access-fpnrc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:45:07 crc kubenswrapper[5002]: I1209 11:45:07.150150 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c94563f6-d63b-478e-a796-9615ab525f8b-kube-api-access-zxqcj" (OuterVolumeSpecName: "kube-api-access-zxqcj") pod "c94563f6-d63b-478e-a796-9615ab525f8b" (UID: "c94563f6-d63b-478e-a796-9615ab525f8b"). InnerVolumeSpecName "kube-api-access-zxqcj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:45:07 crc kubenswrapper[5002]: I1209 11:45:07.248755 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxqcj\" (UniqueName: \"kubernetes.io/projected/c94563f6-d63b-478e-a796-9615ab525f8b-kube-api-access-zxqcj\") on node \"crc\" DevicePath \"\"" Dec 09 11:45:07 crc kubenswrapper[5002]: I1209 11:45:07.248840 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c94563f6-d63b-478e-a796-9615ab525f8b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:45:07 crc kubenswrapper[5002]: I1209 11:45:07.248862 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpnrc\" (UniqueName: \"kubernetes.io/projected/45156329-405f-4f04-9a55-3568465720b3-kube-api-access-fpnrc\") on node \"crc\" DevicePath \"\"" Dec 09 11:45:07 crc kubenswrapper[5002]: I1209 11:45:07.248880 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45156329-405f-4f04-9a55-3568465720b3-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:45:07 crc kubenswrapper[5002]: I1209 11:45:07.484594 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-364d-account-create-update-5gppk" Dec 09 11:45:07 crc kubenswrapper[5002]: I1209 11:45:07.484594 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-364d-account-create-update-5gppk" event={"ID":"c94563f6-d63b-478e-a796-9615ab525f8b","Type":"ContainerDied","Data":"9a10dbf70b936b7f0b240d8fec0bd83b3b0d97451237423d59c9ceb6a5cd16db"} Dec 09 11:45:07 crc kubenswrapper[5002]: I1209 11:45:07.484707 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a10dbf70b936b7f0b240d8fec0bd83b3b0d97451237423d59c9ceb6a5cd16db" Dec 09 11:45:07 crc kubenswrapper[5002]: I1209 11:45:07.486807 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-78vks" event={"ID":"45156329-405f-4f04-9a55-3568465720b3","Type":"ContainerDied","Data":"3ccf81e90d5e92df662bc4653a0706359f631d7d43e208422972f7a6deec1eeb"} Dec 09 11:45:07 crc kubenswrapper[5002]: I1209 11:45:07.486863 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ccf81e90d5e92df662bc4653a0706359f631d7d43e208422972f7a6deec1eeb" Dec 09 11:45:07 crc kubenswrapper[5002]: I1209 11:45:07.486923 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cq7rt" podUID="27ff2b0a-bd1e-4263-bd46-532a5790f1f3" containerName="registry-server" containerID="cri-o://22c41dc8cccc2fe5895054ac656eab3940ed2407e8412ca93455a87c1fb3f6c5" gracePeriod=2 Dec 09 11:45:07 crc kubenswrapper[5002]: I1209 11:45:07.487213 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-78vks" Dec 09 11:45:07 crc kubenswrapper[5002]: I1209 11:45:07.487271 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qmhrf" Dec 09 11:45:07 crc kubenswrapper[5002]: I1209 11:45:07.541248 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qmhrf"] Dec 09 11:45:07 crc kubenswrapper[5002]: I1209 11:45:07.549708 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qmhrf"] Dec 09 11:45:07 crc kubenswrapper[5002]: I1209 11:45:07.858211 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-6sw44"] Dec 09 11:45:07 crc kubenswrapper[5002]: E1209 11:45:07.859012 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45156329-405f-4f04-9a55-3568465720b3" containerName="mariadb-database-create" Dec 09 11:45:07 crc kubenswrapper[5002]: I1209 11:45:07.859041 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="45156329-405f-4f04-9a55-3568465720b3" containerName="mariadb-database-create" Dec 09 11:45:07 crc kubenswrapper[5002]: E1209 11:45:07.859082 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6853242a-1640-4e72-a63f-61931ee9ea3f" containerName="extract-utilities" Dec 09 11:45:07 crc kubenswrapper[5002]: I1209 11:45:07.859091 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="6853242a-1640-4e72-a63f-61931ee9ea3f" containerName="extract-utilities" Dec 09 11:45:07 crc kubenswrapper[5002]: E1209 11:45:07.859108 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6853242a-1640-4e72-a63f-61931ee9ea3f" containerName="registry-server" Dec 09 11:45:07 crc kubenswrapper[5002]: I1209 11:45:07.859117 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="6853242a-1640-4e72-a63f-61931ee9ea3f" containerName="registry-server" Dec 09 11:45:07 crc kubenswrapper[5002]: E1209 11:45:07.859130 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c94563f6-d63b-478e-a796-9615ab525f8b" containerName="mariadb-account-create-update" Dec 09 11:45:07 crc kubenswrapper[5002]: I1209 11:45:07.859138 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="c94563f6-d63b-478e-a796-9615ab525f8b" containerName="mariadb-account-create-update" Dec 09 11:45:07 crc kubenswrapper[5002]: E1209 11:45:07.859154 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6853242a-1640-4e72-a63f-61931ee9ea3f" containerName="extract-content" Dec 09 11:45:07 crc kubenswrapper[5002]: I1209 11:45:07.859163 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="6853242a-1640-4e72-a63f-61931ee9ea3f" containerName="extract-content" Dec 09 11:45:07 crc kubenswrapper[5002]: E1209 11:45:07.859196 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84b92a8a-c566-499c-903d-9fc0021b343e" containerName="collect-profiles" Dec 09 11:45:07 crc kubenswrapper[5002]: I1209 11:45:07.859206 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="84b92a8a-c566-499c-903d-9fc0021b343e" containerName="collect-profiles" Dec 09 11:45:07 crc kubenswrapper[5002]: I1209 11:45:07.859465 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="84b92a8a-c566-499c-903d-9fc0021b343e" containerName="collect-profiles" Dec 09 11:45:07 crc kubenswrapper[5002]: I1209 11:45:07.859490 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="45156329-405f-4f04-9a55-3568465720b3" containerName="mariadb-database-create" Dec 09 11:45:07 crc kubenswrapper[5002]: I1209 11:45:07.859510 5002 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6853242a-1640-4e72-a63f-61931ee9ea3f" containerName="registry-server" Dec 09 11:45:07 crc kubenswrapper[5002]: I1209 11:45:07.859526 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="c94563f6-d63b-478e-a796-9615ab525f8b" containerName="mariadb-account-create-update" Dec 09 11:45:07 crc kubenswrapper[5002]: I1209 11:45:07.860489 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-6sw44" Dec 09 11:45:07 crc kubenswrapper[5002]: I1209 11:45:07.863224 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 09 11:45:07 crc kubenswrapper[5002]: I1209 11:45:07.863387 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 09 11:45:07 crc kubenswrapper[5002]: I1209 11:45:07.863626 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-5qdcd" Dec 09 11:45:07 crc kubenswrapper[5002]: I1209 11:45:07.863804 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 09 11:45:07 crc kubenswrapper[5002]: I1209 11:45:07.869578 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-6sw44"] Dec 09 11:45:07 crc kubenswrapper[5002]: I1209 11:45:07.961287 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cq7rt" Dec 09 11:45:07 crc kubenswrapper[5002]: I1209 11:45:07.962771 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vmn9\" (UniqueName: \"kubernetes.io/projected/8067dc57-a44d-4997-bfe0-945cb47eabe3-kube-api-access-6vmn9\") pod \"aodh-db-sync-6sw44\" (UID: \"8067dc57-a44d-4997-bfe0-945cb47eabe3\") " pod="openstack/aodh-db-sync-6sw44" Dec 09 11:45:07 crc kubenswrapper[5002]: I1209 11:45:07.963792 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8067dc57-a44d-4997-bfe0-945cb47eabe3-combined-ca-bundle\") pod \"aodh-db-sync-6sw44\" (UID: \"8067dc57-a44d-4997-bfe0-945cb47eabe3\") " pod="openstack/aodh-db-sync-6sw44" Dec 09 11:45:07 crc kubenswrapper[5002]: I1209 11:45:07.963835 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8067dc57-a44d-4997-bfe0-945cb47eabe3-scripts\") pod \"aodh-db-sync-6sw44\" (UID: \"8067dc57-a44d-4997-bfe0-945cb47eabe3\") " pod="openstack/aodh-db-sync-6sw44" Dec 09 11:45:07 crc kubenswrapper[5002]: I1209 11:45:07.964003 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8067dc57-a44d-4997-bfe0-945cb47eabe3-config-data\") pod \"aodh-db-sync-6sw44\" (UID: \"8067dc57-a44d-4997-bfe0-945cb47eabe3\") " pod="openstack/aodh-db-sync-6sw44" Dec 09 11:45:08 crc kubenswrapper[5002]: I1209 11:45:08.065094 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9fgm\" (UniqueName: \"kubernetes.io/projected/27ff2b0a-bd1e-4263-bd46-532a5790f1f3-kube-api-access-r9fgm\") pod \"27ff2b0a-bd1e-4263-bd46-532a5790f1f3\" (UID: \"27ff2b0a-bd1e-4263-bd46-532a5790f1f3\") " Dec 09 11:45:08 crc kubenswrapper[5002]: I1209 11:45:08.065354 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/27ff2b0a-bd1e-4263-bd46-532a5790f1f3-utilities\") pod \"27ff2b0a-bd1e-4263-bd46-532a5790f1f3\" (UID: \"27ff2b0a-bd1e-4263-bd46-532a5790f1f3\") " Dec 09 11:45:08 crc kubenswrapper[5002]: I1209 11:45:08.065398 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27ff2b0a-bd1e-4263-bd46-532a5790f1f3-catalog-content\") pod \"27ff2b0a-bd1e-4263-bd46-532a5790f1f3\" (UID: \"27ff2b0a-bd1e-4263-bd46-532a5790f1f3\") " Dec 09 11:45:08 crc kubenswrapper[5002]: I1209 11:45:08.065752 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8067dc57-a44d-4997-bfe0-945cb47eabe3-config-data\") pod \"aodh-db-sync-6sw44\" (UID: \"8067dc57-a44d-4997-bfe0-945cb47eabe3\") " pod="openstack/aodh-db-sync-6sw44" Dec 09 11:45:08 crc kubenswrapper[5002]: I1209 11:45:08.065906 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vmn9\" (UniqueName: \"kubernetes.io/projected/8067dc57-a44d-4997-bfe0-945cb47eabe3-kube-api-access-6vmn9\") pod \"aodh-db-sync-6sw44\" (UID: \"8067dc57-a44d-4997-bfe0-945cb47eabe3\") " pod="openstack/aodh-db-sync-6sw44" Dec 09 11:45:08 crc kubenswrapper[5002]: I1209 11:45:08.065948 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8067dc57-a44d-4997-bfe0-945cb47eabe3-scripts\") pod \"aodh-db-sync-6sw44\" (UID: \"8067dc57-a44d-4997-bfe0-945cb47eabe3\") " pod="openstack/aodh-db-sync-6sw44" Dec 09 11:45:08 crc kubenswrapper[5002]: I1209 11:45:08.065965 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8067dc57-a44d-4997-bfe0-945cb47eabe3-combined-ca-bundle\") pod \"aodh-db-sync-6sw44\" (UID: \"8067dc57-a44d-4997-bfe0-945cb47eabe3\") " pod="openstack/aodh-db-sync-6sw44" Dec 09 11:45:08 crc kubenswrapper[5002]: I1209 11:45:08.066718 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27ff2b0a-bd1e-4263-bd46-532a5790f1f3-utilities" (OuterVolumeSpecName: "utilities") pod "27ff2b0a-bd1e-4263-bd46-532a5790f1f3" (UID: "27ff2b0a-bd1e-4263-bd46-532a5790f1f3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:45:08 crc kubenswrapper[5002]: I1209 11:45:08.068744 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 09 11:45:08 crc kubenswrapper[5002]: I1209 11:45:08.073543 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27ff2b0a-bd1e-4263-bd46-532a5790f1f3-kube-api-access-r9fgm" (OuterVolumeSpecName: "kube-api-access-r9fgm") pod "27ff2b0a-bd1e-4263-bd46-532a5790f1f3" (UID: "27ff2b0a-bd1e-4263-bd46-532a5790f1f3"). InnerVolumeSpecName "kube-api-access-r9fgm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:45:08 crc kubenswrapper[5002]: I1209 11:45:08.073605 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 09 11:45:08 crc kubenswrapper[5002]: I1209 11:45:08.075889 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8067dc57-a44d-4997-bfe0-945cb47eabe3-combined-ca-bundle\") pod \"aodh-db-sync-6sw44\" (UID: \"8067dc57-a44d-4997-bfe0-945cb47eabe3\") " pod="openstack/aodh-db-sync-6sw44" Dec 09 11:45:08 crc kubenswrapper[5002]: I1209 11:45:08.078170 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6853242a-1640-4e72-a63f-61931ee9ea3f" path="/var/lib/kubelet/pods/6853242a-1640-4e72-a63f-61931ee9ea3f/volumes" Dec 09 11:45:08 crc kubenswrapper[5002]: I1209 11:45:08.080266 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8067dc57-a44d-4997-bfe0-945cb47eabe3-config-data\") pod \"aodh-db-sync-6sw44\" (UID: \"8067dc57-a44d-4997-bfe0-945cb47eabe3\") " pod="openstack/aodh-db-sync-6sw44" Dec 09 11:45:08 crc kubenswrapper[5002]: I1209 11:45:08.084656 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8067dc57-a44d-4997-bfe0-945cb47eabe3-scripts\") pod \"aodh-db-sync-6sw44\" (UID: \"8067dc57-a44d-4997-bfe0-945cb47eabe3\") " pod="openstack/aodh-db-sync-6sw44" Dec 09 11:45:08 crc kubenswrapper[5002]: I1209 11:45:08.085274 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vmn9\" (UniqueName: \"kubernetes.io/projected/8067dc57-a44d-4997-bfe0-945cb47eabe3-kube-api-access-6vmn9\") pod \"aodh-db-sync-6sw44\" (UID: \"8067dc57-a44d-4997-bfe0-945cb47eabe3\") " pod="openstack/aodh-db-sync-6sw44" Dec 09 11:45:08 crc kubenswrapper[5002]: I1209 11:45:08.091019 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27ff2b0a-bd1e-4263-bd46-532a5790f1f3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "27ff2b0a-bd1e-4263-bd46-532a5790f1f3" (UID: "27ff2b0a-bd1e-4263-bd46-532a5790f1f3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:45:08 crc kubenswrapper[5002]: I1209 11:45:08.168234 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27ff2b0a-bd1e-4263-bd46-532a5790f1f3-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 11:45:08 crc kubenswrapper[5002]: I1209 11:45:08.168276 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27ff2b0a-bd1e-4263-bd46-532a5790f1f3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 11:45:08 crc kubenswrapper[5002]: I1209 11:45:08.168291 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9fgm\" (UniqueName: \"kubernetes.io/projected/27ff2b0a-bd1e-4263-bd46-532a5790f1f3-kube-api-access-r9fgm\") on node \"crc\" DevicePath \"\"" Dec 09 11:45:08 crc kubenswrapper[5002]: I1209 11:45:08.259658 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-5qdcd" Dec 09 11:45:08 crc kubenswrapper[5002]: I1209 11:45:08.267998 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-6sw44" Dec 09 11:45:08 crc kubenswrapper[5002]: I1209 11:45:08.509404 5002 generic.go:334] "Generic (PLEG): container finished" podID="27ff2b0a-bd1e-4263-bd46-532a5790f1f3" containerID="22c41dc8cccc2fe5895054ac656eab3940ed2407e8412ca93455a87c1fb3f6c5" exitCode=0 Dec 09 11:45:08 crc kubenswrapper[5002]: I1209 11:45:08.509787 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cq7rt" Dec 09 11:45:08 crc kubenswrapper[5002]: I1209 11:45:08.509773 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cq7rt" event={"ID":"27ff2b0a-bd1e-4263-bd46-532a5790f1f3","Type":"ContainerDied","Data":"22c41dc8cccc2fe5895054ac656eab3940ed2407e8412ca93455a87c1fb3f6c5"} Dec 09 11:45:08 crc kubenswrapper[5002]: I1209 11:45:08.509867 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cq7rt" event={"ID":"27ff2b0a-bd1e-4263-bd46-532a5790f1f3","Type":"ContainerDied","Data":"2a1d9e03d63286418db686f92329d86516871277984724bc6ccd571adccc9192"} Dec 09 11:45:08 crc kubenswrapper[5002]: I1209 11:45:08.509889 5002 scope.go:117] "RemoveContainer" containerID="22c41dc8cccc2fe5895054ac656eab3940ed2407e8412ca93455a87c1fb3f6c5" Dec 09 11:45:08 crc kubenswrapper[5002]: I1209 11:45:08.537547 5002 scope.go:117] "RemoveContainer" containerID="842c1833926ff8a2978e076719887281cf291307ee24b59bc53c950bf31fc9bf" Dec 09 11:45:08 crc kubenswrapper[5002]: I1209 11:45:08.554839 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cq7rt"] Dec 09 11:45:08 crc kubenswrapper[5002]: I1209 11:45:08.568493 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cq7rt"] Dec 09 11:45:08 crc kubenswrapper[5002]: I1209 11:45:08.576891 5002 scope.go:117] "RemoveContainer" containerID="f823d6527fe9f4ae1d7c208322bf7e841de5d3ba91441b08c5aae0e4fd4589bc" Dec 09 11:45:08 crc kubenswrapper[5002]: I1209 11:45:08.602695 5002 scope.go:117] "RemoveContainer" containerID="22c41dc8cccc2fe5895054ac656eab3940ed2407e8412ca93455a87c1fb3f6c5" Dec 09 11:45:08 crc kubenswrapper[5002]: E1209 11:45:08.603244 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22c41dc8cccc2fe5895054ac656eab3940ed2407e8412ca93455a87c1fb3f6c5\": container with ID starting with 22c41dc8cccc2fe5895054ac656eab3940ed2407e8412ca93455a87c1fb3f6c5 not found: ID does not exist" containerID="22c41dc8cccc2fe5895054ac656eab3940ed2407e8412ca93455a87c1fb3f6c5" Dec 09 11:45:08 crc kubenswrapper[5002]: I1209 11:45:08.603293 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22c41dc8cccc2fe5895054ac656eab3940ed2407e8412ca93455a87c1fb3f6c5"} err="failed to get container status \"22c41dc8cccc2fe5895054ac656eab3940ed2407e8412ca93455a87c1fb3f6c5\": rpc error: code = NotFound desc = could not find container \"22c41dc8cccc2fe5895054ac656eab3940ed2407e8412ca93455a87c1fb3f6c5\": container with ID starting with 22c41dc8cccc2fe5895054ac656eab3940ed2407e8412ca93455a87c1fb3f6c5 not found: ID does not exist" Dec 09 11:45:08 crc kubenswrapper[5002]: I1209 11:45:08.603323 5002 scope.go:117] "RemoveContainer" containerID="842c1833926ff8a2978e076719887281cf291307ee24b59bc53c950bf31fc9bf" Dec 09 11:45:08 crc kubenswrapper[5002]: E1209 11:45:08.603612 5002 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"842c1833926ff8a2978e076719887281cf291307ee24b59bc53c950bf31fc9bf\": container with ID starting with 842c1833926ff8a2978e076719887281cf291307ee24b59bc53c950bf31fc9bf not found: ID does not exist" containerID="842c1833926ff8a2978e076719887281cf291307ee24b59bc53c950bf31fc9bf" Dec 09 11:45:08 crc kubenswrapper[5002]: I1209 11:45:08.603639 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"842c1833926ff8a2978e076719887281cf291307ee24b59bc53c950bf31fc9bf"} err="failed to get container status \"842c1833926ff8a2978e076719887281cf291307ee24b59bc53c950bf31fc9bf\": rpc error: code = NotFound desc = could not find container \"842c1833926ff8a2978e076719887281cf291307ee24b59bc53c950bf31fc9bf\": container with ID starting with 842c1833926ff8a2978e076719887281cf291307ee24b59bc53c950bf31fc9bf not found: ID does not exist" Dec 09 11:45:08 crc kubenswrapper[5002]: I1209 11:45:08.603653 5002 scope.go:117] "RemoveContainer" containerID="f823d6527fe9f4ae1d7c208322bf7e841de5d3ba91441b08c5aae0e4fd4589bc" Dec 09 11:45:08 crc kubenswrapper[5002]: E1209 11:45:08.603856 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f823d6527fe9f4ae1d7c208322bf7e841de5d3ba91441b08c5aae0e4fd4589bc\": container with ID starting with f823d6527fe9f4ae1d7c208322bf7e841de5d3ba91441b08c5aae0e4fd4589bc not found: ID does not exist" containerID="f823d6527fe9f4ae1d7c208322bf7e841de5d3ba91441b08c5aae0e4fd4589bc" Dec 09 11:45:08 crc kubenswrapper[5002]: I1209 11:45:08.603882 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f823d6527fe9f4ae1d7c208322bf7e841de5d3ba91441b08c5aae0e4fd4589bc"} err="failed to get container status \"f823d6527fe9f4ae1d7c208322bf7e841de5d3ba91441b08c5aae0e4fd4589bc\": rpc error: code = NotFound desc = could not find container \"f823d6527fe9f4ae1d7c208322bf7e841de5d3ba91441b08c5aae0e4fd4589bc\": container with ID starting with f823d6527fe9f4ae1d7c208322bf7e841de5d3ba91441b08c5aae0e4fd4589bc not found: ID does not exist" Dec 09 11:45:08 crc kubenswrapper[5002]: I1209 11:45:08.741639 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-6sw44"] Dec 09 11:45:08 crc kubenswrapper[5002]: W1209 11:45:08.747472 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8067dc57_a44d_4997_bfe0_945cb47eabe3.slice/crio-9bf944ab3356f80f131d760dfbaf9eb36d4b53e3c215fb005ba115344d781330 WatchSource:0}: Error finding container 9bf944ab3356f80f131d760dfbaf9eb36d4b53e3c215fb005ba115344d781330: Status 404 returned error can't find the container with id 9bf944ab3356f80f131d760dfbaf9eb36d4b53e3c215fb005ba115344d781330 Dec 09 11:45:09 crc kubenswrapper[5002]: I1209 11:45:09.526102 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-6sw44" event={"ID":"8067dc57-a44d-4997-bfe0-945cb47eabe3","Type":"ContainerStarted","Data":"9bf944ab3356f80f131d760dfbaf9eb36d4b53e3c215fb005ba115344d781330"} Dec 09 11:45:10 crc kubenswrapper[5002]: I1209 11:45:10.073058 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27ff2b0a-bd1e-4263-bd46-532a5790f1f3" path="/var/lib/kubelet/pods/27ff2b0a-bd1e-4263-bd46-532a5790f1f3/volumes" Dec 09 11:45:11 crc kubenswrapper[5002]: I1209 11:45:11.062407 5002 scope.go:117] "RemoveContainer" 
containerID="39962d0376837cc534e6b0a62303166efdae767fb36cfb81ae7c7eb077d56c3e" Dec 09 11:45:11 crc kubenswrapper[5002]: E1209 11:45:11.062955 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:45:14 crc kubenswrapper[5002]: I1209 11:45:14.032972 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 09 11:45:15 crc kubenswrapper[5002]: I1209 11:45:15.495800 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 09 11:45:16 crc kubenswrapper[5002]: I1209 11:45:16.598127 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-6sw44" event={"ID":"8067dc57-a44d-4997-bfe0-945cb47eabe3","Type":"ContainerStarted","Data":"049f26688284e6813c69c3aa3f4ff7847df86e26ade416de72508c7c7eab60c5"} Dec 09 11:45:19 crc kubenswrapper[5002]: I1209 11:45:19.632492 5002 generic.go:334] "Generic (PLEG): container finished" podID="8067dc57-a44d-4997-bfe0-945cb47eabe3" containerID="049f26688284e6813c69c3aa3f4ff7847df86e26ade416de72508c7c7eab60c5" exitCode=0 Dec 09 11:45:19 crc kubenswrapper[5002]: I1209 11:45:19.632594 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-6sw44" event={"ID":"8067dc57-a44d-4997-bfe0-945cb47eabe3","Type":"ContainerDied","Data":"049f26688284e6813c69c3aa3f4ff7847df86e26ade416de72508c7c7eab60c5"} Dec 09 11:45:21 crc kubenswrapper[5002]: I1209 11:45:21.130126 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-6sw44" Dec 09 11:45:21 crc kubenswrapper[5002]: I1209 11:45:21.248695 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8067dc57-a44d-4997-bfe0-945cb47eabe3-combined-ca-bundle\") pod \"8067dc57-a44d-4997-bfe0-945cb47eabe3\" (UID: \"8067dc57-a44d-4997-bfe0-945cb47eabe3\") " Dec 09 11:45:21 crc kubenswrapper[5002]: I1209 11:45:21.248895 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vmn9\" (UniqueName: \"kubernetes.io/projected/8067dc57-a44d-4997-bfe0-945cb47eabe3-kube-api-access-6vmn9\") pod \"8067dc57-a44d-4997-bfe0-945cb47eabe3\" (UID: \"8067dc57-a44d-4997-bfe0-945cb47eabe3\") " Dec 09 11:45:21 crc kubenswrapper[5002]: I1209 11:45:21.249048 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8067dc57-a44d-4997-bfe0-945cb47eabe3-scripts\") pod \"8067dc57-a44d-4997-bfe0-945cb47eabe3\" (UID: \"8067dc57-a44d-4997-bfe0-945cb47eabe3\") " Dec 09 11:45:21 crc kubenswrapper[5002]: I1209 11:45:21.249068 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8067dc57-a44d-4997-bfe0-945cb47eabe3-config-data\") pod \"8067dc57-a44d-4997-bfe0-945cb47eabe3\" (UID: \"8067dc57-a44d-4997-bfe0-945cb47eabe3\") " Dec 09 11:45:21 crc kubenswrapper[5002]: I1209 11:45:21.257073 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8067dc57-a44d-4997-bfe0-945cb47eabe3-kube-api-access-6vmn9" (OuterVolumeSpecName: "kube-api-access-6vmn9") pod "8067dc57-a44d-4997-bfe0-945cb47eabe3" (UID: "8067dc57-a44d-4997-bfe0-945cb47eabe3"). InnerVolumeSpecName "kube-api-access-6vmn9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:45:21 crc kubenswrapper[5002]: I1209 11:45:21.258563 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8067dc57-a44d-4997-bfe0-945cb47eabe3-scripts" (OuterVolumeSpecName: "scripts") pod "8067dc57-a44d-4997-bfe0-945cb47eabe3" (UID: "8067dc57-a44d-4997-bfe0-945cb47eabe3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:45:21 crc kubenswrapper[5002]: I1209 11:45:21.285152 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8067dc57-a44d-4997-bfe0-945cb47eabe3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8067dc57-a44d-4997-bfe0-945cb47eabe3" (UID: "8067dc57-a44d-4997-bfe0-945cb47eabe3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:45:21 crc kubenswrapper[5002]: I1209 11:45:21.285274 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8067dc57-a44d-4997-bfe0-945cb47eabe3-config-data" (OuterVolumeSpecName: "config-data") pod "8067dc57-a44d-4997-bfe0-945cb47eabe3" (UID: "8067dc57-a44d-4997-bfe0-945cb47eabe3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:45:21 crc kubenswrapper[5002]: I1209 11:45:21.351013 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vmn9\" (UniqueName: \"kubernetes.io/projected/8067dc57-a44d-4997-bfe0-945cb47eabe3-kube-api-access-6vmn9\") on node \"crc\" DevicePath \"\"" Dec 09 11:45:21 crc kubenswrapper[5002]: I1209 11:45:21.351304 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8067dc57-a44d-4997-bfe0-945cb47eabe3-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:45:21 crc kubenswrapper[5002]: I1209 11:45:21.351315 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8067dc57-a44d-4997-bfe0-945cb47eabe3-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:45:21 crc kubenswrapper[5002]: I1209 11:45:21.351324 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8067dc57-a44d-4997-bfe0-945cb47eabe3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:45:21 crc kubenswrapper[5002]: I1209 11:45:21.659987 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-6sw44" event={"ID":"8067dc57-a44d-4997-bfe0-945cb47eabe3","Type":"ContainerDied","Data":"9bf944ab3356f80f131d760dfbaf9eb36d4b53e3c215fb005ba115344d781330"} Dec 09 11:45:21 crc kubenswrapper[5002]: I1209 11:45:21.660044 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9bf944ab3356f80f131d760dfbaf9eb36d4b53e3c215fb005ba115344d781330" Dec 09 11:45:21 crc kubenswrapper[5002]: I1209 11:45:21.660132 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-6sw44" Dec 09 11:45:22 crc kubenswrapper[5002]: I1209 11:45:22.060339 5002 scope.go:117] "RemoveContainer" containerID="39962d0376837cc534e6b0a62303166efdae767fb36cfb81ae7c7eb077d56c3e" Dec 09 11:45:22 crc kubenswrapper[5002]: E1209 11:45:22.060841 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:45:23 crc kubenswrapper[5002]: I1209 11:45:23.018619 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Dec 09 11:45:23 crc kubenswrapper[5002]: E1209 11:45:23.019830 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27ff2b0a-bd1e-4263-bd46-532a5790f1f3" containerName="extract-content" Dec 09 11:45:23 crc kubenswrapper[5002]: I1209 11:45:23.019851 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="27ff2b0a-bd1e-4263-bd46-532a5790f1f3" containerName="extract-content" Dec 09 11:45:23 crc kubenswrapper[5002]: E1209 11:45:23.019901 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27ff2b0a-bd1e-4263-bd46-532a5790f1f3" containerName="extract-utilities" Dec 09 11:45:23 crc kubenswrapper[5002]: I1209 11:45:23.019912 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="27ff2b0a-bd1e-4263-bd46-532a5790f1f3" containerName="extract-utilities" Dec 09 11:45:23 crc kubenswrapper[5002]: E1209 11:45:23.019931 5002 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8067dc57-a44d-4997-bfe0-945cb47eabe3" containerName="aodh-db-sync" Dec 09 11:45:23 crc kubenswrapper[5002]: I1209 11:45:23.019940 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="8067dc57-a44d-4997-bfe0-945cb47eabe3" containerName="aodh-db-sync" Dec 09 11:45:23 crc kubenswrapper[5002]: E1209 11:45:23.019970 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27ff2b0a-bd1e-4263-bd46-532a5790f1f3" containerName="registry-server" Dec 09 11:45:23 crc kubenswrapper[5002]: I1209 11:45:23.019979 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="27ff2b0a-bd1e-4263-bd46-532a5790f1f3" containerName="registry-server" Dec 09 11:45:23 crc kubenswrapper[5002]: I1209 11:45:23.020224 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="8067dc57-a44d-4997-bfe0-945cb47eabe3" containerName="aodh-db-sync" Dec 09 11:45:23 crc kubenswrapper[5002]: I1209 11:45:23.020250 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="27ff2b0a-bd1e-4263-bd46-532a5790f1f3" containerName="registry-server" Dec 09 11:45:23 crc kubenswrapper[5002]: I1209 11:45:23.022945 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 09 11:45:23 crc kubenswrapper[5002]: I1209 11:45:23.024756 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 09 11:45:23 crc kubenswrapper[5002]: I1209 11:45:23.025021 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-5qdcd" Dec 09 11:45:23 crc kubenswrapper[5002]: I1209 11:45:23.026661 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 09 11:45:23 crc kubenswrapper[5002]: I1209 11:45:23.046507 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 09 11:45:23 crc kubenswrapper[5002]: I1209 11:45:23.085396 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6b2002d-986f-41bd-a7d8-1b59680b72e7-combined-ca-bundle\") pod \"aodh-0\" (UID: \"d6b2002d-986f-41bd-a7d8-1b59680b72e7\") " pod="openstack/aodh-0" Dec 09 11:45:23 crc kubenswrapper[5002]: I1209 11:45:23.085520 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6b2002d-986f-41bd-a7d8-1b59680b72e7-scripts\") pod \"aodh-0\" (UID: \"d6b2002d-986f-41bd-a7d8-1b59680b72e7\") " pod="openstack/aodh-0" Dec 09 11:45:23 crc kubenswrapper[5002]: I1209 11:45:23.085570 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdjmf\" (UniqueName: \"kubernetes.io/projected/d6b2002d-986f-41bd-a7d8-1b59680b72e7-kube-api-access-hdjmf\") pod \"aodh-0\" (UID: \"d6b2002d-986f-41bd-a7d8-1b59680b72e7\") " pod="openstack/aodh-0" Dec 09 11:45:23 crc kubenswrapper[5002]: I1209 11:45:23.085674 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6b2002d-986f-41bd-a7d8-1b59680b72e7-config-data\") pod \"aodh-0\" (UID: \"d6b2002d-986f-41bd-a7d8-1b59680b72e7\") " pod="openstack/aodh-0" Dec 09 11:45:23 crc kubenswrapper[5002]: I1209 11:45:23.187499 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d6b2002d-986f-41bd-a7d8-1b59680b72e7-scripts\") pod \"aodh-0\" (UID: \"d6b2002d-986f-41bd-a7d8-1b59680b72e7\") " pod="openstack/aodh-0" Dec 09 11:45:23 crc kubenswrapper[5002]: I1209 11:45:23.187572 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdjmf\" (UniqueName: \"kubernetes.io/projected/d6b2002d-986f-41bd-a7d8-1b59680b72e7-kube-api-access-hdjmf\") pod \"aodh-0\" (UID: \"d6b2002d-986f-41bd-a7d8-1b59680b72e7\") " pod="openstack/aodh-0" Dec 09 11:45:23 crc kubenswrapper[5002]: I1209 11:45:23.187633 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6b2002d-986f-41bd-a7d8-1b59680b72e7-config-data\") pod \"aodh-0\" (UID: \"d6b2002d-986f-41bd-a7d8-1b59680b72e7\") " pod="openstack/aodh-0" Dec 09 11:45:23 crc kubenswrapper[5002]: I1209 11:45:23.187725 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6b2002d-986f-41bd-a7d8-1b59680b72e7-combined-ca-bundle\") pod \"aodh-0\" (UID: \"d6b2002d-986f-41bd-a7d8-1b59680b72e7\") " pod="openstack/aodh-0" Dec 09 11:45:23 crc kubenswrapper[5002]: I1209 11:45:23.192632 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6b2002d-986f-41bd-a7d8-1b59680b72e7-scripts\") pod \"aodh-0\" (UID: \"d6b2002d-986f-41bd-a7d8-1b59680b72e7\") " pod="openstack/aodh-0" Dec 09 11:45:23 crc kubenswrapper[5002]: I1209 11:45:23.193220 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6b2002d-986f-41bd-a7d8-1b59680b72e7-config-data\") pod \"aodh-0\" (UID: \"d6b2002d-986f-41bd-a7d8-1b59680b72e7\") " pod="openstack/aodh-0" Dec 09 11:45:23 crc kubenswrapper[5002]: I1209 11:45:23.199666 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6b2002d-986f-41bd-a7d8-1b59680b72e7-combined-ca-bundle\") pod \"aodh-0\" (UID: \"d6b2002d-986f-41bd-a7d8-1b59680b72e7\") " pod="openstack/aodh-0" Dec 09 11:45:23 crc kubenswrapper[5002]: I1209 11:45:23.204490 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdjmf\" (UniqueName: \"kubernetes.io/projected/d6b2002d-986f-41bd-a7d8-1b59680b72e7-kube-api-access-hdjmf\") pod \"aodh-0\" (UID: \"d6b2002d-986f-41bd-a7d8-1b59680b72e7\") " pod="openstack/aodh-0" Dec 09 11:45:23 crc kubenswrapper[5002]: I1209 11:45:23.350938 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 09 11:45:23 crc kubenswrapper[5002]: I1209 11:45:23.892145 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 09 11:45:24 crc kubenswrapper[5002]: I1209 11:45:24.692042 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d6b2002d-986f-41bd-a7d8-1b59680b72e7","Type":"ContainerStarted","Data":"589830a9a8cdf6de45fbfa263abd785b04fcedcffc37c7cb167795324cc4d030"} Dec 09 11:45:24 crc kubenswrapper[5002]: I1209 11:45:24.927202 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 09 11:45:24 crc kubenswrapper[5002]: I1209 11:45:24.927645 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ffa689c7-aa9c-49e5-81ca-607f501e98bd" containerName="ceilometer-central-agent" containerID="cri-o://4cfee57eca2ef3b1e95e5ef1f9370d679a5879b48559af91d1cc7158006aac06" gracePeriod=30 Dec 09 11:45:24 crc kubenswrapper[5002]: I1209 11:45:24.927715 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ffa689c7-aa9c-49e5-81ca-607f501e98bd" containerName="ceilometer-notification-agent" containerID="cri-o://c93d7e425a8b020b67a6e13b9f4b0bb0d940be09678dabb174f33bdf9a910839" gracePeriod=30 Dec 09 11:45:24 crc kubenswrapper[5002]: I1209 11:45:24.927723 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ffa689c7-aa9c-49e5-81ca-607f501e98bd" containerName="sg-core" containerID="cri-o://9d6ffc9330b05f051acdd6f43db4040e61c1c555a99f16ab1c4028fc34f3d4fa" gracePeriod=30 Dec 09 11:45:24 crc kubenswrapper[5002]: I1209 11:45:24.927781 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ffa689c7-aa9c-49e5-81ca-607f501e98bd" containerName="proxy-httpd" containerID="cri-o://4550af5a865220af0921f592d91bb58856e750d6cb2b2a81e0d9839b752be76c" gracePeriod=30 Dec 09 11:45:25 crc kubenswrapper[5002]: I1209 11:45:25.709502 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d6b2002d-986f-41bd-a7d8-1b59680b72e7","Type":"ContainerStarted","Data":"18c03e3afcf3e4f206061866a2c76b8171b1d950e7b50db7fa8c95d8741591c1"} Dec 09 11:45:25 crc kubenswrapper[5002]: I1209 11:45:25.716164 5002 generic.go:334] "Generic (PLEG): container finished" podID="ffa689c7-aa9c-49e5-81ca-607f501e98bd" containerID="4550af5a865220af0921f592d91bb58856e750d6cb2b2a81e0d9839b752be76c" exitCode=0 Dec 09 11:45:25 crc kubenswrapper[5002]: I1209 11:45:25.716200 5002 generic.go:334] "Generic (PLEG): container finished" podID="ffa689c7-aa9c-49e5-81ca-607f501e98bd" containerID="9d6ffc9330b05f051acdd6f43db4040e61c1c555a99f16ab1c4028fc34f3d4fa" exitCode=2 Dec 09 11:45:25 crc kubenswrapper[5002]: I1209 11:45:25.716212 5002 generic.go:334] "Generic (PLEG): container finished" podID="ffa689c7-aa9c-49e5-81ca-607f501e98bd" containerID="4cfee57eca2ef3b1e95e5ef1f9370d679a5879b48559af91d1cc7158006aac06" exitCode=0 Dec 09 11:45:25 crc kubenswrapper[5002]: I1209 11:45:25.716236 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ffa689c7-aa9c-49e5-81ca-607f501e98bd","Type":"ContainerDied","Data":"4550af5a865220af0921f592d91bb58856e750d6cb2b2a81e0d9839b752be76c"} Dec 09 11:45:25 crc kubenswrapper[5002]: I1209 11:45:25.716265 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ffa689c7-aa9c-49e5-81ca-607f501e98bd","Type":"ContainerDied","Data":"9d6ffc9330b05f051acdd6f43db4040e61c1c555a99f16ab1c4028fc34f3d4fa"} Dec 09 11:45:25 crc kubenswrapper[5002]: I1209 11:45:25.716278 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ffa689c7-aa9c-49e5-81ca-607f501e98bd","Type":"ContainerDied","Data":"4cfee57eca2ef3b1e95e5ef1f9370d679a5879b48559af91d1cc7158006aac06"} Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.428434 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.577474 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ffa689c7-aa9c-49e5-81ca-607f501e98bd-sg-core-conf-yaml\") pod \"ffa689c7-aa9c-49e5-81ca-607f501e98bd\" (UID: \"ffa689c7-aa9c-49e5-81ca-607f501e98bd\") " Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.577913 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffa689c7-aa9c-49e5-81ca-607f501e98bd-scripts\") pod \"ffa689c7-aa9c-49e5-81ca-607f501e98bd\" (UID: \"ffa689c7-aa9c-49e5-81ca-607f501e98bd\") " Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.577962 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ffa689c7-aa9c-49e5-81ca-607f501e98bd-run-httpd\") pod \"ffa689c7-aa9c-49e5-81ca-607f501e98bd\" (UID: \"ffa689c7-aa9c-49e5-81ca-607f501e98bd\") " Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.578065 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ffa689c7-aa9c-49e5-81ca-607f501e98bd-log-httpd\") pod \"ffa689c7-aa9c-49e5-81ca-607f501e98bd\" (UID: \"ffa689c7-aa9c-49e5-81ca-607f501e98bd\") " Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.578268 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffa689c7-aa9c-49e5-81ca-607f501e98bd-combined-ca-bundle\") pod \"ffa689c7-aa9c-49e5-81ca-607f501e98bd\" (UID: \"ffa689c7-aa9c-49e5-81ca-607f501e98bd\") " Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.578326 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9g2ng\" (UniqueName: \"kubernetes.io/projected/ffa689c7-aa9c-49e5-81ca-607f501e98bd-kube-api-access-9g2ng\") pod \"ffa689c7-aa9c-49e5-81ca-607f501e98bd\" (UID: \"ffa689c7-aa9c-49e5-81ca-607f501e98bd\") " Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.578358 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffa689c7-aa9c-49e5-81ca-607f501e98bd-config-data\") pod \"ffa689c7-aa9c-49e5-81ca-607f501e98bd\" (UID: \"ffa689c7-aa9c-49e5-81ca-607f501e98bd\") " Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.578519 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffa689c7-aa9c-49e5-81ca-607f501e98bd-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ffa689c7-aa9c-49e5-81ca-607f501e98bd" (UID: "ffa689c7-aa9c-49e5-81ca-607f501e98bd"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.579018 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffa689c7-aa9c-49e5-81ca-607f501e98bd-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ffa689c7-aa9c-49e5-81ca-607f501e98bd" (UID: "ffa689c7-aa9c-49e5-81ca-607f501e98bd"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.579117 5002 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ffa689c7-aa9c-49e5-81ca-607f501e98bd-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.581932 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffa689c7-aa9c-49e5-81ca-607f501e98bd-kube-api-access-9g2ng" (OuterVolumeSpecName: "kube-api-access-9g2ng") pod "ffa689c7-aa9c-49e5-81ca-607f501e98bd" (UID: "ffa689c7-aa9c-49e5-81ca-607f501e98bd"). InnerVolumeSpecName "kube-api-access-9g2ng". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.582113 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffa689c7-aa9c-49e5-81ca-607f501e98bd-scripts" (OuterVolumeSpecName: "scripts") pod "ffa689c7-aa9c-49e5-81ca-607f501e98bd" (UID: "ffa689c7-aa9c-49e5-81ca-607f501e98bd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.610355 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffa689c7-aa9c-49e5-81ca-607f501e98bd-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ffa689c7-aa9c-49e5-81ca-607f501e98bd" (UID: "ffa689c7-aa9c-49e5-81ca-607f501e98bd"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.673927 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffa689c7-aa9c-49e5-81ca-607f501e98bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ffa689c7-aa9c-49e5-81ca-607f501e98bd" (UID: "ffa689c7-aa9c-49e5-81ca-607f501e98bd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.681412 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffa689c7-aa9c-49e5-81ca-607f501e98bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.681452 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9g2ng\" (UniqueName: \"kubernetes.io/projected/ffa689c7-aa9c-49e5-81ca-607f501e98bd-kube-api-access-9g2ng\") on node \"crc\" DevicePath \"\"" Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.681466 5002 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ffa689c7-aa9c-49e5-81ca-607f501e98bd-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.681478 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffa689c7-aa9c-49e5-81ca-607f501e98bd-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.681490 5002 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ffa689c7-aa9c-49e5-81ca-607f501e98bd-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.682268 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffa689c7-aa9c-49e5-81ca-607f501e98bd-config-data" (OuterVolumeSpecName: "config-data") pod "ffa689c7-aa9c-49e5-81ca-607f501e98bd" (UID: "ffa689c7-aa9c-49e5-81ca-607f501e98bd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.731743 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d6b2002d-986f-41bd-a7d8-1b59680b72e7","Type":"ContainerStarted","Data":"713068ac99e6a26fe4468dca548eda749956788de1259ff57b2d9b5e8e01ea8d"} Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.734908 5002 generic.go:334] "Generic (PLEG): container finished" podID="ffa689c7-aa9c-49e5-81ca-607f501e98bd" containerID="c93d7e425a8b020b67a6e13b9f4b0bb0d940be09678dabb174f33bdf9a910839" exitCode=0 Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.734962 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ffa689c7-aa9c-49e5-81ca-607f501e98bd","Type":"ContainerDied","Data":"c93d7e425a8b020b67a6e13b9f4b0bb0d940be09678dabb174f33bdf9a910839"} Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.734993 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ffa689c7-aa9c-49e5-81ca-607f501e98bd","Type":"ContainerDied","Data":"32dd72b4877aab023e68a833b8da031b82d752adcff359b0e228865e632d7745"} Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.735013 5002 scope.go:117] "RemoveContainer" containerID="4550af5a865220af0921f592d91bb58856e750d6cb2b2a81e0d9839b752be76c" Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.735153 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.793774 5002 scope.go:117] "RemoveContainer" containerID="9d6ffc9330b05f051acdd6f43db4040e61c1c555a99f16ab1c4028fc34f3d4fa" Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.798372 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffa689c7-aa9c-49e5-81ca-607f501e98bd-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.803292 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.817571 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.829520 5002 scope.go:117] "RemoveContainer" containerID="c93d7e425a8b020b67a6e13b9f4b0bb0d940be09678dabb174f33bdf9a910839" Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.845846 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 09 11:45:26 crc kubenswrapper[5002]: E1209 11:45:26.848279 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffa689c7-aa9c-49e5-81ca-607f501e98bd" containerName="proxy-httpd" Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.848305 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffa689c7-aa9c-49e5-81ca-607f501e98bd" containerName="proxy-httpd" Dec 09 11:45:26 crc kubenswrapper[5002]: E1209 11:45:26.848336 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffa689c7-aa9c-49e5-81ca-607f501e98bd" containerName="ceilometer-notification-agent" Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.848342 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffa689c7-aa9c-49e5-81ca-607f501e98bd" containerName="ceilometer-notification-agent" Dec 09 11:45:26 crc kubenswrapper[5002]: E1209 11:45:26.848369 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffa689c7-aa9c-49e5-81ca-607f501e98bd" containerName="sg-core" Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.848375 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffa689c7-aa9c-49e5-81ca-607f501e98bd" containerName="sg-core" Dec 09 11:45:26 crc kubenswrapper[5002]: E1209 11:45:26.848388 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffa689c7-aa9c-49e5-81ca-607f501e98bd" containerName="ceilometer-central-agent" Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.848394 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffa689c7-aa9c-49e5-81ca-607f501e98bd" containerName="ceilometer-central-agent" Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.848597 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffa689c7-aa9c-49e5-81ca-607f501e98bd" containerName="ceilometer-central-agent" Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.848793 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffa689c7-aa9c-49e5-81ca-607f501e98bd" containerName="ceilometer-notification-agent" Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.848808 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffa689c7-aa9c-49e5-81ca-607f501e98bd" containerName="proxy-httpd" Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.848843 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffa689c7-aa9c-49e5-81ca-607f501e98bd" containerName="sg-core" Dec 09 
11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.849175 5002 scope.go:117] "RemoveContainer" containerID="4cfee57eca2ef3b1e95e5ef1f9370d679a5879b48559af91d1cc7158006aac06" Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.853301 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.856138 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.856152 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.859990 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.880762 5002 scope.go:117] "RemoveContainer" containerID="4550af5a865220af0921f592d91bb58856e750d6cb2b2a81e0d9839b752be76c" Dec 09 11:45:26 crc kubenswrapper[5002]: E1209 11:45:26.881313 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4550af5a865220af0921f592d91bb58856e750d6cb2b2a81e0d9839b752be76c\": container with ID starting with 4550af5a865220af0921f592d91bb58856e750d6cb2b2a81e0d9839b752be76c not found: ID does not exist" containerID="4550af5a865220af0921f592d91bb58856e750d6cb2b2a81e0d9839b752be76c" Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.881410 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4550af5a865220af0921f592d91bb58856e750d6cb2b2a81e0d9839b752be76c"} err="failed to get container status \"4550af5a865220af0921f592d91bb58856e750d6cb2b2a81e0d9839b752be76c\": rpc error: code = NotFound desc = could not find container \"4550af5a865220af0921f592d91bb58856e750d6cb2b2a81e0d9839b752be76c\": container with ID starting with 4550af5a865220af0921f592d91bb58856e750d6cb2b2a81e0d9839b752be76c not found: ID does not exist" Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.881551 5002 scope.go:117] "RemoveContainer" containerID="9d6ffc9330b05f051acdd6f43db4040e61c1c555a99f16ab1c4028fc34f3d4fa" Dec 09 11:45:26 crc kubenswrapper[5002]: E1209 11:45:26.881964 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d6ffc9330b05f051acdd6f43db4040e61c1c555a99f16ab1c4028fc34f3d4fa\": container with ID starting with 9d6ffc9330b05f051acdd6f43db4040e61c1c555a99f16ab1c4028fc34f3d4fa not found: ID does not exist" containerID="9d6ffc9330b05f051acdd6f43db4040e61c1c555a99f16ab1c4028fc34f3d4fa" Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.882012 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d6ffc9330b05f051acdd6f43db4040e61c1c555a99f16ab1c4028fc34f3d4fa"} err="failed to get container status \"9d6ffc9330b05f051acdd6f43db4040e61c1c555a99f16ab1c4028fc34f3d4fa\": rpc error: code = NotFound desc = could not find container \"9d6ffc9330b05f051acdd6f43db4040e61c1c555a99f16ab1c4028fc34f3d4fa\": container with ID starting with 9d6ffc9330b05f051acdd6f43db4040e61c1c555a99f16ab1c4028fc34f3d4fa not found: ID does not exist" Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.882042 5002 scope.go:117] "RemoveContainer" containerID="c93d7e425a8b020b67a6e13b9f4b0bb0d940be09678dabb174f33bdf9a910839" Dec 09 11:45:26 crc kubenswrapper[5002]: E1209 11:45:26.883433 5002 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c93d7e425a8b020b67a6e13b9f4b0bb0d940be09678dabb174f33bdf9a910839\": container with ID starting with c93d7e425a8b020b67a6e13b9f4b0bb0d940be09678dabb174f33bdf9a910839 not found: ID does not exist" containerID="c93d7e425a8b020b67a6e13b9f4b0bb0d940be09678dabb174f33bdf9a910839" Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.883483 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c93d7e425a8b020b67a6e13b9f4b0bb0d940be09678dabb174f33bdf9a910839"} err="failed to get container status \"c93d7e425a8b020b67a6e13b9f4b0bb0d940be09678dabb174f33bdf9a910839\": rpc error: code = NotFound desc = could not find container \"c93d7e425a8b020b67a6e13b9f4b0bb0d940be09678dabb174f33bdf9a910839\": container with ID starting with c93d7e425a8b020b67a6e13b9f4b0bb0d940be09678dabb174f33bdf9a910839 not found: ID does not exist" Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.883507 5002 scope.go:117] "RemoveContainer" containerID="4cfee57eca2ef3b1e95e5ef1f9370d679a5879b48559af91d1cc7158006aac06" Dec 09 11:45:26 crc kubenswrapper[5002]: E1209 11:45:26.884186 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cfee57eca2ef3b1e95e5ef1f9370d679a5879b48559af91d1cc7158006aac06\": container with ID starting with 4cfee57eca2ef3b1e95e5ef1f9370d679a5879b48559af91d1cc7158006aac06 not found: ID does not exist" containerID="4cfee57eca2ef3b1e95e5ef1f9370d679a5879b48559af91d1cc7158006aac06" Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.884232 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cfee57eca2ef3b1e95e5ef1f9370d679a5879b48559af91d1cc7158006aac06"} err="failed to get container status \"4cfee57eca2ef3b1e95e5ef1f9370d679a5879b48559af91d1cc7158006aac06\": rpc error: code = NotFound desc = could not find container \"4cfee57eca2ef3b1e95e5ef1f9370d679a5879b48559af91d1cc7158006aac06\": container with ID starting with 4cfee57eca2ef3b1e95e5ef1f9370d679a5879b48559af91d1cc7158006aac06 not found: ID does not exist" Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.899544 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0f4215e-37c9-4892-888e-ce9bd74ece35-log-httpd\") pod \"ceilometer-0\" (UID: \"a0f4215e-37c9-4892-888e-ce9bd74ece35\") " pod="openstack/ceilometer-0" Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.899778 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0f4215e-37c9-4892-888e-ce9bd74ece35-scripts\") pod \"ceilometer-0\" (UID: \"a0f4215e-37c9-4892-888e-ce9bd74ece35\") " pod="openstack/ceilometer-0" Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.899999 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0f4215e-37c9-4892-888e-ce9bd74ece35-run-httpd\") pod \"ceilometer-0\" (UID: \"a0f4215e-37c9-4892-888e-ce9bd74ece35\") " pod="openstack/ceilometer-0" Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.900110 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a0f4215e-37c9-4892-888e-ce9bd74ece35-config-data\") pod \"ceilometer-0\" (UID: \"a0f4215e-37c9-4892-888e-ce9bd74ece35\") " pod="openstack/ceilometer-0" Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.900185 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zx6p\" (UniqueName: \"kubernetes.io/projected/a0f4215e-37c9-4892-888e-ce9bd74ece35-kube-api-access-8zx6p\") pod \"ceilometer-0\" (UID: \"a0f4215e-37c9-4892-888e-ce9bd74ece35\") " pod="openstack/ceilometer-0" Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.900271 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a0f4215e-37c9-4892-888e-ce9bd74ece35-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a0f4215e-37c9-4892-888e-ce9bd74ece35\") " pod="openstack/ceilometer-0" Dec 09 11:45:26 crc kubenswrapper[5002]: I1209 11:45:26.900341 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0f4215e-37c9-4892-888e-ce9bd74ece35-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a0f4215e-37c9-4892-888e-ce9bd74ece35\") " pod="openstack/ceilometer-0" Dec 09 11:45:27 crc kubenswrapper[5002]: I1209 11:45:27.002932 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0f4215e-37c9-4892-888e-ce9bd74ece35-run-httpd\") pod \"ceilometer-0\" (UID: \"a0f4215e-37c9-4892-888e-ce9bd74ece35\") " pod="openstack/ceilometer-0" Dec 09 11:45:27 crc kubenswrapper[5002]: I1209 11:45:27.003020 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0f4215e-37c9-4892-888e-ce9bd74ece35-config-data\") pod \"ceilometer-0\" (UID: \"a0f4215e-37c9-4892-888e-ce9bd74ece35\") " pod="openstack/ceilometer-0" Dec 09 11:45:27 crc kubenswrapper[5002]: I1209 11:45:27.003052 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zx6p\" (UniqueName: \"kubernetes.io/projected/a0f4215e-37c9-4892-888e-ce9bd74ece35-kube-api-access-8zx6p\") pod \"ceilometer-0\" (UID: \"a0f4215e-37c9-4892-888e-ce9bd74ece35\") " pod="openstack/ceilometer-0" Dec 09 11:45:27 crc kubenswrapper[5002]: I1209 11:45:27.003099 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a0f4215e-37c9-4892-888e-ce9bd74ece35-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a0f4215e-37c9-4892-888e-ce9bd74ece35\") " pod="openstack/ceilometer-0" Dec 09 11:45:27 crc kubenswrapper[5002]: I1209 11:45:27.003115 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0f4215e-37c9-4892-888e-ce9bd74ece35-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a0f4215e-37c9-4892-888e-ce9bd74ece35\") " pod="openstack/ceilometer-0" Dec 09 11:45:27 crc kubenswrapper[5002]: I1209 11:45:27.003226 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0f4215e-37c9-4892-888e-ce9bd74ece35-log-httpd\") pod \"ceilometer-0\" (UID: \"a0f4215e-37c9-4892-888e-ce9bd74ece35\") " pod="openstack/ceilometer-0" Dec 09 11:45:27 crc kubenswrapper[5002]: I1209 11:45:27.003251 5002 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0f4215e-37c9-4892-888e-ce9bd74ece35-scripts\") pod \"ceilometer-0\" (UID: \"a0f4215e-37c9-4892-888e-ce9bd74ece35\") " pod="openstack/ceilometer-0" Dec 09 11:45:27 crc kubenswrapper[5002]: I1209 11:45:27.004410 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0f4215e-37c9-4892-888e-ce9bd74ece35-run-httpd\") pod \"ceilometer-0\" (UID: \"a0f4215e-37c9-4892-888e-ce9bd74ece35\") " pod="openstack/ceilometer-0" Dec 09 11:45:27 crc kubenswrapper[5002]: I1209 11:45:27.005272 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0f4215e-37c9-4892-888e-ce9bd74ece35-log-httpd\") pod \"ceilometer-0\" (UID: \"a0f4215e-37c9-4892-888e-ce9bd74ece35\") " pod="openstack/ceilometer-0" Dec 09 11:45:27 crc kubenswrapper[5002]: I1209 11:45:27.009305 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0f4215e-37c9-4892-888e-ce9bd74ece35-scripts\") pod \"ceilometer-0\" (UID: \"a0f4215e-37c9-4892-888e-ce9bd74ece35\") " pod="openstack/ceilometer-0" Dec 09 11:45:27 crc kubenswrapper[5002]: I1209 11:45:27.009625 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0f4215e-37c9-4892-888e-ce9bd74ece35-config-data\") pod \"ceilometer-0\" (UID: \"a0f4215e-37c9-4892-888e-ce9bd74ece35\") " pod="openstack/ceilometer-0" Dec 09 11:45:27 crc kubenswrapper[5002]: I1209 11:45:27.011551 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0f4215e-37c9-4892-888e-ce9bd74ece35-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a0f4215e-37c9-4892-888e-ce9bd74ece35\") " pod="openstack/ceilometer-0" Dec 09 11:45:27 crc kubenswrapper[5002]: I1209 11:45:27.012270 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a0f4215e-37c9-4892-888e-ce9bd74ece35-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a0f4215e-37c9-4892-888e-ce9bd74ece35\") " pod="openstack/ceilometer-0" Dec 09 11:45:27 crc kubenswrapper[5002]: I1209 11:45:27.030913 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zx6p\" (UniqueName: \"kubernetes.io/projected/a0f4215e-37c9-4892-888e-ce9bd74ece35-kube-api-access-8zx6p\") pod \"ceilometer-0\" (UID: \"a0f4215e-37c9-4892-888e-ce9bd74ece35\") " pod="openstack/ceilometer-0" Dec 09 11:45:27 crc kubenswrapper[5002]: I1209 11:45:27.186108 5002 util.go:30] "No sandbox for pod can be found. 
Dec 09 11:45:27 crc kubenswrapper[5002]: I1209 11:45:27.709146 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 09 11:45:27 crc kubenswrapper[5002]: W1209 11:45:27.837938 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0f4215e_37c9_4892_888e_ce9bd74ece35.slice/crio-4183e5929a220338582269b94bf7ca9e0571db4d8c68c985c64932de0e57c570 WatchSource:0}: Error finding container 4183e5929a220338582269b94bf7ca9e0571db4d8c68c985c64932de0e57c570: Status 404 returned error can't find the container with id 4183e5929a220338582269b94bf7ca9e0571db4d8c68c985c64932de0e57c570
Dec 09 11:45:28 crc kubenswrapper[5002]: I1209 11:45:28.088789 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffa689c7-aa9c-49e5-81ca-607f501e98bd" path="/var/lib/kubelet/pods/ffa689c7-aa9c-49e5-81ca-607f501e98bd/volumes"
Dec 09 11:45:28 crc kubenswrapper[5002]: I1209 11:45:28.758938 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0f4215e-37c9-4892-888e-ce9bd74ece35","Type":"ContainerStarted","Data":"bbdb56412785dfdc9ba983e618a3a9950aaaf769987127947e4a6e870b81f224"}
Dec 09 11:45:28 crc kubenswrapper[5002]: I1209 11:45:28.759281 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0f4215e-37c9-4892-888e-ce9bd74ece35","Type":"ContainerStarted","Data":"4183e5929a220338582269b94bf7ca9e0571db4d8c68c985c64932de0e57c570"}
Dec 09 11:45:28 crc kubenswrapper[5002]: I1209 11:45:28.761002 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d6b2002d-986f-41bd-a7d8-1b59680b72e7","Type":"ContainerStarted","Data":"3b6a49c848a8603384b7936cddb00c9719ff81e0fb05a81a571911beee2c69cd"}
Dec 09 11:45:29 crc kubenswrapper[5002]: I1209 11:45:29.773962 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0f4215e-37c9-4892-888e-ce9bd74ece35","Type":"ContainerStarted","Data":"4fbb1ccd354656789909bc4d182996a73299e8845664e451baa791722c051d16"}
Dec 09 11:45:30 crc kubenswrapper[5002]: I1209 11:45:30.792592 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d6b2002d-986f-41bd-a7d8-1b59680b72e7","Type":"ContainerStarted","Data":"3f09aa37d25ec591a903d5e163d4b46ec7d5cd3206dda85986ddc838c4c81048"}
Dec 09 11:45:30 crc kubenswrapper[5002]: I1209 11:45:30.796996 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0f4215e-37c9-4892-888e-ce9bd74ece35","Type":"ContainerStarted","Data":"514b7afac22fec7fb5a4558a856ef18041628d5091e489495792c3ddcfdfbf57"}
Dec 09 11:45:30 crc kubenswrapper[5002]: I1209 11:45:30.822596 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.826955005 podStartE2EDuration="8.822575455s" podCreationTimestamp="2025-12-09 11:45:22 +0000 UTC" firstStartedPulling="2025-12-09 11:45:23.903140561 +0000 UTC m=+6256.295191642" lastFinishedPulling="2025-12-09 11:45:29.898761011 +0000 UTC m=+6262.290812092" observedRunningTime="2025-12-09 11:45:30.816947235 +0000 UTC m=+6263.208998316" watchObservedRunningTime="2025-12-09 11:45:30.822575455 +0000 UTC m=+6263.214626536"
Dec 09 11:45:31 crc kubenswrapper[5002]: I1209 11:45:31.041991 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-3a2d-account-create-update-j2627"]
Dec 09 11:45:31 crc kubenswrapper[5002]: I1209 11:45:31.057052 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-3a2d-account-create-update-j2627"]
Dec 09 11:45:31 crc kubenswrapper[5002]: I1209 11:45:31.066933 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-x6t76"]
Dec 09 11:45:31 crc kubenswrapper[5002]: I1209 11:45:31.075311 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-x6t76"]
Dec 09 11:45:32 crc kubenswrapper[5002]: I1209 11:45:32.082574 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b9fdf9c-44bc-48c9-8651-5394aca76af1" path="/var/lib/kubelet/pods/9b9fdf9c-44bc-48c9-8651-5394aca76af1/volumes"
Dec 09 11:45:32 crc kubenswrapper[5002]: I1209 11:45:32.083891 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7ef5176-0dfd-4c7b-8b18-088cd7959673" path="/var/lib/kubelet/pods/e7ef5176-0dfd-4c7b-8b18-088cd7959673/volumes"
Dec 09 11:45:32 crc kubenswrapper[5002]: I1209 11:45:32.817013 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0f4215e-37c9-4892-888e-ce9bd74ece35","Type":"ContainerStarted","Data":"5f0125f21b371e4f5fb61040909e63f0660711e15f25a5cedfecde1bf6bda100"}
Dec 09 11:45:32 crc kubenswrapper[5002]: I1209 11:45:32.817380 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 09 11:45:32 crc kubenswrapper[5002]: I1209 11:45:32.840849 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.905917089 podStartE2EDuration="6.840825553s" podCreationTimestamp="2025-12-09 11:45:26 +0000 UTC" firstStartedPulling="2025-12-09 11:45:27.840738798 +0000 UTC m=+6260.232789879" lastFinishedPulling="2025-12-09 11:45:31.775647262 +0000 UTC m=+6264.167698343" observedRunningTime="2025-12-09 11:45:32.836793635 +0000 UTC m=+6265.228844716" watchObservedRunningTime="2025-12-09 11:45:32.840825553 +0000 UTC m=+6265.232876634"
Dec 09 11:45:34 crc kubenswrapper[5002]: I1209 11:45:34.061069 5002 scope.go:117] "RemoveContainer" containerID="39962d0376837cc534e6b0a62303166efdae767fb36cfb81ae7c7eb077d56c3e"
Dec 09 11:45:34 crc kubenswrapper[5002]: E1209 11:45:34.061313 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 11:45:37 crc kubenswrapper[5002]: I1209 11:45:37.116607 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-w2hx7"]
Dec 09 11:45:37 crc kubenswrapper[5002]: I1209 11:45:37.118141 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-w2hx7"
Dec 09 11:45:37 crc kubenswrapper[5002]: I1209 11:45:37.138520 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-w2hx7"]
Dec 09 11:45:37 crc kubenswrapper[5002]: I1209 11:45:37.138599 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94ee5751-ddf8-468a-b9e1-feb60a74481c-operator-scripts\") pod \"manila-db-create-w2hx7\" (UID: \"94ee5751-ddf8-468a-b9e1-feb60a74481c\") " pod="openstack/manila-db-create-w2hx7"
Dec 09 11:45:37 crc kubenswrapper[5002]: I1209 11:45:37.138823 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4glz8\" (UniqueName: \"kubernetes.io/projected/94ee5751-ddf8-468a-b9e1-feb60a74481c-kube-api-access-4glz8\") pod \"manila-db-create-w2hx7\" (UID: \"94ee5751-ddf8-468a-b9e1-feb60a74481c\") " pod="openstack/manila-db-create-w2hx7"
Dec 09 11:45:37 crc kubenswrapper[5002]: I1209 11:45:37.225378 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-1042-account-create-update-z8pfb"]
Dec 09 11:45:37 crc kubenswrapper[5002]: I1209 11:45:37.226999 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-1042-account-create-update-z8pfb"
Dec 09 11:45:37 crc kubenswrapper[5002]: I1209 11:45:37.228661 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret"
Dec 09 11:45:37 crc kubenswrapper[5002]: I1209 11:45:37.241335 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4glz8\" (UniqueName: \"kubernetes.io/projected/94ee5751-ddf8-468a-b9e1-feb60a74481c-kube-api-access-4glz8\") pod \"manila-db-create-w2hx7\" (UID: \"94ee5751-ddf8-468a-b9e1-feb60a74481c\") " pod="openstack/manila-db-create-w2hx7"
Dec 09 11:45:37 crc kubenswrapper[5002]: I1209 11:45:37.241345 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-1042-account-create-update-z8pfb"]
Dec 09 11:45:37 crc kubenswrapper[5002]: I1209 11:45:37.242189 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94ee5751-ddf8-468a-b9e1-feb60a74481c-operator-scripts\") pod \"manila-db-create-w2hx7\" (UID: \"94ee5751-ddf8-468a-b9e1-feb60a74481c\") " pod="openstack/manila-db-create-w2hx7"
Dec 09 11:45:37 crc kubenswrapper[5002]: I1209 11:45:37.243055 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94ee5751-ddf8-468a-b9e1-feb60a74481c-operator-scripts\") pod \"manila-db-create-w2hx7\" (UID: \"94ee5751-ddf8-468a-b9e1-feb60a74481c\") " pod="openstack/manila-db-create-w2hx7"
Dec 09 11:45:37 crc kubenswrapper[5002]: I1209 11:45:37.271139 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4glz8\" (UniqueName: \"kubernetes.io/projected/94ee5751-ddf8-468a-b9e1-feb60a74481c-kube-api-access-4glz8\") pod \"manila-db-create-w2hx7\" (UID: \"94ee5751-ddf8-468a-b9e1-feb60a74481c\") " pod="openstack/manila-db-create-w2hx7"
Dec 09 11:45:37 crc kubenswrapper[5002]: I1209 11:45:37.343591 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1532e90-d56a-4f6c-9f0f-3aa418f6498a-operator-scripts\") pod \"manila-1042-account-create-update-z8pfb\" (UID: \"f1532e90-d56a-4f6c-9f0f-3aa418f6498a\") " pod="openstack/manila-1042-account-create-update-z8pfb"
\"manila-1042-account-create-update-z8pfb\" (UID: \"f1532e90-d56a-4f6c-9f0f-3aa418f6498a\") " pod="openstack/manila-1042-account-create-update-z8pfb" Dec 09 11:45:37 crc kubenswrapper[5002]: I1209 11:45:37.343784 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8jxs\" (UniqueName: \"kubernetes.io/projected/f1532e90-d56a-4f6c-9f0f-3aa418f6498a-kube-api-access-k8jxs\") pod \"manila-1042-account-create-update-z8pfb\" (UID: \"f1532e90-d56a-4f6c-9f0f-3aa418f6498a\") " pod="openstack/manila-1042-account-create-update-z8pfb" Dec 09 11:45:37 crc kubenswrapper[5002]: I1209 11:45:37.440616 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-w2hx7" Dec 09 11:45:37 crc kubenswrapper[5002]: I1209 11:45:37.445766 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1532e90-d56a-4f6c-9f0f-3aa418f6498a-operator-scripts\") pod \"manila-1042-account-create-update-z8pfb\" (UID: \"f1532e90-d56a-4f6c-9f0f-3aa418f6498a\") " pod="openstack/manila-1042-account-create-update-z8pfb" Dec 09 11:45:37 crc kubenswrapper[5002]: I1209 11:45:37.445955 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8jxs\" (UniqueName: \"kubernetes.io/projected/f1532e90-d56a-4f6c-9f0f-3aa418f6498a-kube-api-access-k8jxs\") pod \"manila-1042-account-create-update-z8pfb\" (UID: \"f1532e90-d56a-4f6c-9f0f-3aa418f6498a\") " pod="openstack/manila-1042-account-create-update-z8pfb" Dec 09 11:45:37 crc kubenswrapper[5002]: I1209 11:45:37.447473 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1532e90-d56a-4f6c-9f0f-3aa418f6498a-operator-scripts\") pod \"manila-1042-account-create-update-z8pfb\" (UID: \"f1532e90-d56a-4f6c-9f0f-3aa418f6498a\") " pod="openstack/manila-1042-account-create-update-z8pfb" Dec 09 11:45:37 crc kubenswrapper[5002]: I1209 11:45:37.465878 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8jxs\" (UniqueName: \"kubernetes.io/projected/f1532e90-d56a-4f6c-9f0f-3aa418f6498a-kube-api-access-k8jxs\") pod \"manila-1042-account-create-update-z8pfb\" (UID: \"f1532e90-d56a-4f6c-9f0f-3aa418f6498a\") " pod="openstack/manila-1042-account-create-update-z8pfb" Dec 09 11:45:37 crc kubenswrapper[5002]: I1209 11:45:37.557129 5002 util.go:30] "No sandbox for pod can be found. 
Dec 09 11:45:37 crc kubenswrapper[5002]: I1209 11:45:37.960911 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-w2hx7"]
Dec 09 11:45:37 crc kubenswrapper[5002]: W1209 11:45:37.964222 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94ee5751_ddf8_468a_b9e1_feb60a74481c.slice/crio-2b4cfa2e4f5913adfacefec965ef909402c4b4d8e39ffa89df3821742f233a4f WatchSource:0}: Error finding container 2b4cfa2e4f5913adfacefec965ef909402c4b4d8e39ffa89df3821742f233a4f: Status 404 returned error can't find the container with id 2b4cfa2e4f5913adfacefec965ef909402c4b4d8e39ffa89df3821742f233a4f
Dec 09 11:45:38 crc kubenswrapper[5002]: I1209 11:45:38.141899 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-1042-account-create-update-z8pfb"]
Dec 09 11:45:38 crc kubenswrapper[5002]: W1209 11:45:38.144365 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1532e90_d56a_4f6c_9f0f_3aa418f6498a.slice/crio-00d47eb05e0f9d4f1a0b72f28b6c3b8f1de9a24102dc4d2e7e8c2eb794e40ae8 WatchSource:0}: Error finding container 00d47eb05e0f9d4f1a0b72f28b6c3b8f1de9a24102dc4d2e7e8c2eb794e40ae8: Status 404 returned error can't find the container with id 00d47eb05e0f9d4f1a0b72f28b6c3b8f1de9a24102dc4d2e7e8c2eb794e40ae8
Dec 09 11:45:38 crc kubenswrapper[5002]: I1209 11:45:38.881115 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-w2hx7" event={"ID":"94ee5751-ddf8-468a-b9e1-feb60a74481c","Type":"ContainerStarted","Data":"2b4cfa2e4f5913adfacefec965ef909402c4b4d8e39ffa89df3821742f233a4f"}
Dec 09 11:45:38 crc kubenswrapper[5002]: I1209 11:45:38.882666 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-1042-account-create-update-z8pfb" event={"ID":"f1532e90-d56a-4f6c-9f0f-3aa418f6498a","Type":"ContainerStarted","Data":"00d47eb05e0f9d4f1a0b72f28b6c3b8f1de9a24102dc4d2e7e8c2eb794e40ae8"}
Dec 09 11:45:39 crc kubenswrapper[5002]: I1209 11:45:39.893793 5002 generic.go:334] "Generic (PLEG): container finished" podID="94ee5751-ddf8-468a-b9e1-feb60a74481c" containerID="44fb72e7119cc0ef6bc8a804a747b1a9d09bafd531f0db2bb4ab798bc2bb712c" exitCode=0
Dec 09 11:45:39 crc kubenswrapper[5002]: I1209 11:45:39.893857 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-w2hx7" event={"ID":"94ee5751-ddf8-468a-b9e1-feb60a74481c","Type":"ContainerDied","Data":"44fb72e7119cc0ef6bc8a804a747b1a9d09bafd531f0db2bb4ab798bc2bb712c"}
Dec 09 11:45:39 crc kubenswrapper[5002]: I1209 11:45:39.896586 5002 generic.go:334] "Generic (PLEG): container finished" podID="f1532e90-d56a-4f6c-9f0f-3aa418f6498a" containerID="611aaee36f227f6bf7b43a35a0857458b868ae07fff5c6d9b65bfef529969114" exitCode=0
Dec 09 11:45:39 crc kubenswrapper[5002]: I1209 11:45:39.896617 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-1042-account-create-update-z8pfb" event={"ID":"f1532e90-d56a-4f6c-9f0f-3aa418f6498a","Type":"ContainerDied","Data":"611aaee36f227f6bf7b43a35a0857458b868ae07fff5c6d9b65bfef529969114"}
Dec 09 11:45:40 crc kubenswrapper[5002]: I1209 11:45:40.032892 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-7t5r2"]
Dec 09 11:45:40 crc kubenswrapper[5002]: I1209 11:45:40.042539 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-7t5r2"]
pods=["openstack/cinder-db-sync-7t5r2"] Dec 09 11:45:40 crc kubenswrapper[5002]: I1209 11:45:40.070400 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc57b111-13f4-4d51-9ea3-26261b356a72" path="/var/lib/kubelet/pods/dc57b111-13f4-4d51-9ea3-26261b356a72/volumes" Dec 09 11:45:41 crc kubenswrapper[5002]: I1209 11:45:41.448982 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-1042-account-create-update-z8pfb" Dec 09 11:45:41 crc kubenswrapper[5002]: I1209 11:45:41.455789 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-w2hx7" Dec 09 11:45:41 crc kubenswrapper[5002]: I1209 11:45:41.646736 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94ee5751-ddf8-468a-b9e1-feb60a74481c-operator-scripts\") pod \"94ee5751-ddf8-468a-b9e1-feb60a74481c\" (UID: \"94ee5751-ddf8-468a-b9e1-feb60a74481c\") " Dec 09 11:45:41 crc kubenswrapper[5002]: I1209 11:45:41.647190 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94ee5751-ddf8-468a-b9e1-feb60a74481c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "94ee5751-ddf8-468a-b9e1-feb60a74481c" (UID: "94ee5751-ddf8-468a-b9e1-feb60a74481c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:45:41 crc kubenswrapper[5002]: I1209 11:45:41.647342 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8jxs\" (UniqueName: \"kubernetes.io/projected/f1532e90-d56a-4f6c-9f0f-3aa418f6498a-kube-api-access-k8jxs\") pod \"f1532e90-d56a-4f6c-9f0f-3aa418f6498a\" (UID: \"f1532e90-d56a-4f6c-9f0f-3aa418f6498a\") " Dec 09 11:45:41 crc kubenswrapper[5002]: I1209 11:45:41.647386 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4glz8\" (UniqueName: \"kubernetes.io/projected/94ee5751-ddf8-468a-b9e1-feb60a74481c-kube-api-access-4glz8\") pod \"94ee5751-ddf8-468a-b9e1-feb60a74481c\" (UID: \"94ee5751-ddf8-468a-b9e1-feb60a74481c\") " Dec 09 11:45:41 crc kubenswrapper[5002]: I1209 11:45:41.647694 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1532e90-d56a-4f6c-9f0f-3aa418f6498a-operator-scripts\") pod \"f1532e90-d56a-4f6c-9f0f-3aa418f6498a\" (UID: \"f1532e90-d56a-4f6c-9f0f-3aa418f6498a\") " Dec 09 11:45:41 crc kubenswrapper[5002]: I1209 11:45:41.648113 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1532e90-d56a-4f6c-9f0f-3aa418f6498a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f1532e90-d56a-4f6c-9f0f-3aa418f6498a" (UID: "f1532e90-d56a-4f6c-9f0f-3aa418f6498a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:45:41 crc kubenswrapper[5002]: I1209 11:45:41.648319 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1532e90-d56a-4f6c-9f0f-3aa418f6498a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:45:41 crc kubenswrapper[5002]: I1209 11:45:41.648334 5002 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94ee5751-ddf8-468a-b9e1-feb60a74481c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:45:41 crc kubenswrapper[5002]: I1209 11:45:41.653045 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1532e90-d56a-4f6c-9f0f-3aa418f6498a-kube-api-access-k8jxs" (OuterVolumeSpecName: "kube-api-access-k8jxs") pod "f1532e90-d56a-4f6c-9f0f-3aa418f6498a" (UID: "f1532e90-d56a-4f6c-9f0f-3aa418f6498a"). InnerVolumeSpecName "kube-api-access-k8jxs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:45:41 crc kubenswrapper[5002]: I1209 11:45:41.653454 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94ee5751-ddf8-468a-b9e1-feb60a74481c-kube-api-access-4glz8" (OuterVolumeSpecName: "kube-api-access-4glz8") pod "94ee5751-ddf8-468a-b9e1-feb60a74481c" (UID: "94ee5751-ddf8-468a-b9e1-feb60a74481c"). InnerVolumeSpecName "kube-api-access-4glz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:45:41 crc kubenswrapper[5002]: I1209 11:45:41.749971 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8jxs\" (UniqueName: \"kubernetes.io/projected/f1532e90-d56a-4f6c-9f0f-3aa418f6498a-kube-api-access-k8jxs\") on node \"crc\" DevicePath \"\"" Dec 09 11:45:41 crc kubenswrapper[5002]: I1209 11:45:41.750022 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4glz8\" (UniqueName: \"kubernetes.io/projected/94ee5751-ddf8-468a-b9e1-feb60a74481c-kube-api-access-4glz8\") on node \"crc\" DevicePath \"\"" Dec 09 11:45:41 crc kubenswrapper[5002]: I1209 11:45:41.922731 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-w2hx7" event={"ID":"94ee5751-ddf8-468a-b9e1-feb60a74481c","Type":"ContainerDied","Data":"2b4cfa2e4f5913adfacefec965ef909402c4b4d8e39ffa89df3821742f233a4f"} Dec 09 11:45:41 crc kubenswrapper[5002]: I1209 11:45:41.922766 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b4cfa2e4f5913adfacefec965ef909402c4b4d8e39ffa89df3821742f233a4f" Dec 09 11:45:41 crc kubenswrapper[5002]: I1209 11:45:41.922853 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-w2hx7" Dec 09 11:45:41 crc kubenswrapper[5002]: I1209 11:45:41.926236 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-1042-account-create-update-z8pfb" event={"ID":"f1532e90-d56a-4f6c-9f0f-3aa418f6498a","Type":"ContainerDied","Data":"00d47eb05e0f9d4f1a0b72f28b6c3b8f1de9a24102dc4d2e7e8c2eb794e40ae8"} Dec 09 11:45:41 crc kubenswrapper[5002]: I1209 11:45:41.926264 5002 util.go:48] "No ready sandbox for pod can be found. 
Dec 09 11:45:41 crc kubenswrapper[5002]: I1209 11:45:41.926283 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00d47eb05e0f9d4f1a0b72f28b6c3b8f1de9a24102dc4d2e7e8c2eb794e40ae8"
Dec 09 11:45:41 crc kubenswrapper[5002]: E1209 11:45:41.988437 5002 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94ee5751_ddf8_468a_b9e1_feb60a74481c.slice\": RecentStats: unable to find data in memory cache]"
Dec 09 11:45:47 crc kubenswrapper[5002]: I1209 11:45:47.061180 5002 scope.go:117] "RemoveContainer" containerID="39962d0376837cc534e6b0a62303166efdae767fb36cfb81ae7c7eb077d56c3e"
Dec 09 11:45:47 crc kubenswrapper[5002]: E1209 11:45:47.063224 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 11:45:47 crc kubenswrapper[5002]: I1209 11:45:47.765637 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-qf6zt"]
Dec 09 11:45:47 crc kubenswrapper[5002]: E1209 11:45:47.766619 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94ee5751-ddf8-468a-b9e1-feb60a74481c" containerName="mariadb-database-create"
Dec 09 11:45:47 crc kubenswrapper[5002]: I1209 11:45:47.766637 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="94ee5751-ddf8-468a-b9e1-feb60a74481c" containerName="mariadb-database-create"
Dec 09 11:45:47 crc kubenswrapper[5002]: E1209 11:45:47.766674 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1532e90-d56a-4f6c-9f0f-3aa418f6498a" containerName="mariadb-account-create-update"
Dec 09 11:45:47 crc kubenswrapper[5002]: I1209 11:45:47.766682 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1532e90-d56a-4f6c-9f0f-3aa418f6498a" containerName="mariadb-account-create-update"
Dec 09 11:45:47 crc kubenswrapper[5002]: I1209 11:45:47.766965 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="94ee5751-ddf8-468a-b9e1-feb60a74481c" containerName="mariadb-database-create"
Dec 09 11:45:47 crc kubenswrapper[5002]: I1209 11:45:47.766998 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1532e90-d56a-4f6c-9f0f-3aa418f6498a" containerName="mariadb-account-create-update"
Dec 09 11:45:47 crc kubenswrapper[5002]: I1209 11:45:47.767938 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-qf6zt"
Dec 09 11:45:47 crc kubenswrapper[5002]: I1209 11:45:47.772123 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-t5zsh"
Dec 09 11:45:47 crc kubenswrapper[5002]: I1209 11:45:47.775084 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data"
Dec 09 11:45:47 crc kubenswrapper[5002]: I1209 11:45:47.775434 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-qf6zt"]
Dec 09 11:45:47 crc kubenswrapper[5002]: I1209 11:45:47.893696 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/a0393597-f5c1-4aea-83fd-dd94aa5018a8-job-config-data\") pod \"manila-db-sync-qf6zt\" (UID: \"a0393597-f5c1-4aea-83fd-dd94aa5018a8\") " pod="openstack/manila-db-sync-qf6zt"
Dec 09 11:45:47 crc kubenswrapper[5002]: I1209 11:45:47.893778 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0393597-f5c1-4aea-83fd-dd94aa5018a8-config-data\") pod \"manila-db-sync-qf6zt\" (UID: \"a0393597-f5c1-4aea-83fd-dd94aa5018a8\") " pod="openstack/manila-db-sync-qf6zt"
Dec 09 11:45:47 crc kubenswrapper[5002]: I1209 11:45:47.893846 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0393597-f5c1-4aea-83fd-dd94aa5018a8-combined-ca-bundle\") pod \"manila-db-sync-qf6zt\" (UID: \"a0393597-f5c1-4aea-83fd-dd94aa5018a8\") " pod="openstack/manila-db-sync-qf6zt"
Dec 09 11:45:47 crc kubenswrapper[5002]: I1209 11:45:47.893998 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5txdt\" (UniqueName: \"kubernetes.io/projected/a0393597-f5c1-4aea-83fd-dd94aa5018a8-kube-api-access-5txdt\") pod \"manila-db-sync-qf6zt\" (UID: \"a0393597-f5c1-4aea-83fd-dd94aa5018a8\") " pod="openstack/manila-db-sync-qf6zt"
Dec 09 11:45:47 crc kubenswrapper[5002]: I1209 11:45:47.997061 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/a0393597-f5c1-4aea-83fd-dd94aa5018a8-job-config-data\") pod \"manila-db-sync-qf6zt\" (UID: \"a0393597-f5c1-4aea-83fd-dd94aa5018a8\") " pod="openstack/manila-db-sync-qf6zt"
Dec 09 11:45:47 crc kubenswrapper[5002]: I1209 11:45:47.997126 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0393597-f5c1-4aea-83fd-dd94aa5018a8-config-data\") pod \"manila-db-sync-qf6zt\" (UID: \"a0393597-f5c1-4aea-83fd-dd94aa5018a8\") " pod="openstack/manila-db-sync-qf6zt"
Dec 09 11:45:47 crc kubenswrapper[5002]: I1209 11:45:47.997171 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0393597-f5c1-4aea-83fd-dd94aa5018a8-combined-ca-bundle\") pod \"manila-db-sync-qf6zt\" (UID: \"a0393597-f5c1-4aea-83fd-dd94aa5018a8\") " pod="openstack/manila-db-sync-qf6zt"
Dec 09 11:45:47 crc kubenswrapper[5002]: I1209 11:45:47.997257 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5txdt\" (UniqueName: \"kubernetes.io/projected/a0393597-f5c1-4aea-83fd-dd94aa5018a8-kube-api-access-5txdt\") pod \"manila-db-sync-qf6zt\" (UID: \"a0393597-f5c1-4aea-83fd-dd94aa5018a8\") " pod="openstack/manila-db-sync-qf6zt"
\"a0393597-f5c1-4aea-83fd-dd94aa5018a8\") " pod="openstack/manila-db-sync-qf6zt" Dec 09 11:45:48 crc kubenswrapper[5002]: I1209 11:45:48.003021 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0393597-f5c1-4aea-83fd-dd94aa5018a8-combined-ca-bundle\") pod \"manila-db-sync-qf6zt\" (UID: \"a0393597-f5c1-4aea-83fd-dd94aa5018a8\") " pod="openstack/manila-db-sync-qf6zt" Dec 09 11:45:48 crc kubenswrapper[5002]: I1209 11:45:48.003586 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0393597-f5c1-4aea-83fd-dd94aa5018a8-config-data\") pod \"manila-db-sync-qf6zt\" (UID: \"a0393597-f5c1-4aea-83fd-dd94aa5018a8\") " pod="openstack/manila-db-sync-qf6zt" Dec 09 11:45:48 crc kubenswrapper[5002]: I1209 11:45:48.010351 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/a0393597-f5c1-4aea-83fd-dd94aa5018a8-job-config-data\") pod \"manila-db-sync-qf6zt\" (UID: \"a0393597-f5c1-4aea-83fd-dd94aa5018a8\") " pod="openstack/manila-db-sync-qf6zt" Dec 09 11:45:48 crc kubenswrapper[5002]: I1209 11:45:48.013189 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5txdt\" (UniqueName: \"kubernetes.io/projected/a0393597-f5c1-4aea-83fd-dd94aa5018a8-kube-api-access-5txdt\") pod \"manila-db-sync-qf6zt\" (UID: \"a0393597-f5c1-4aea-83fd-dd94aa5018a8\") " pod="openstack/manila-db-sync-qf6zt" Dec 09 11:45:48 crc kubenswrapper[5002]: I1209 11:45:48.140580 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-qf6zt" Dec 09 11:45:48 crc kubenswrapper[5002]: I1209 11:45:48.828722 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-qf6zt"] Dec 09 11:45:48 crc kubenswrapper[5002]: I1209 11:45:48.997759 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-qf6zt" event={"ID":"a0393597-f5c1-4aea-83fd-dd94aa5018a8","Type":"ContainerStarted","Data":"cac37e27ee2d46630bd0daaa2f32039d0f797049b39d8de9774788cfcd620174"} Dec 09 11:45:50 crc kubenswrapper[5002]: I1209 11:45:50.520946 5002 scope.go:117] "RemoveContainer" containerID="7a5bcd0034d52d8610c57f3b65ce9c3e149fbea92fdecac4fe3eeef5b17be908" Dec 09 11:45:52 crc kubenswrapper[5002]: I1209 11:45:52.824945 5002 scope.go:117] "RemoveContainer" containerID="a443ecafaeb19ab0b0061806e2bf6231088995fa3701d88012ba579c3ee0022e" Dec 09 11:45:52 crc kubenswrapper[5002]: I1209 11:45:52.907203 5002 scope.go:117] "RemoveContainer" containerID="1c9cf717bec0dff95ea4ce5c72fcb3dbe2732a7bee5f7873eb95f4376473da47" Dec 09 11:45:53 crc kubenswrapper[5002]: I1209 11:45:53.066112 5002 scope.go:117] "RemoveContainer" containerID="2e1a378880476ce70b8a2de64ee1edfdff36107a284ade3fc53d8b8074c1fbaa" Dec 09 11:45:53 crc kubenswrapper[5002]: I1209 11:45:53.119274 5002 scope.go:117] "RemoveContainer" containerID="5da5993d7769a91dbc972d90a4db88c2c61901167fc3a3ff4fca524165e1363f" Dec 09 11:45:53 crc kubenswrapper[5002]: I1209 11:45:53.143119 5002 scope.go:117] "RemoveContainer" containerID="0106580d39d6e64710016e0d7529a027fc971a4c865017b02395d114443f7614" Dec 09 11:45:54 crc kubenswrapper[5002]: I1209 11:45:54.055323 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-qf6zt" 
event={"ID":"a0393597-f5c1-4aea-83fd-dd94aa5018a8","Type":"ContainerStarted","Data":"ea06cabb52460abfed294bbd120945f25757f9c70c094be62966355530e890ec"} Dec 09 11:45:54 crc kubenswrapper[5002]: I1209 11:45:54.079871 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-qf6zt" podStartSLOduration=3.038229776 podStartE2EDuration="7.079849297s" podCreationTimestamp="2025-12-09 11:45:47 +0000 UTC" firstStartedPulling="2025-12-09 11:45:48.839904929 +0000 UTC m=+6281.231956010" lastFinishedPulling="2025-12-09 11:45:52.88152441 +0000 UTC m=+6285.273575531" observedRunningTime="2025-12-09 11:45:54.079361384 +0000 UTC m=+6286.471412465" watchObservedRunningTime="2025-12-09 11:45:54.079849297 +0000 UTC m=+6286.471900378" Dec 09 11:45:56 crc kubenswrapper[5002]: I1209 11:45:56.078964 5002 generic.go:334] "Generic (PLEG): container finished" podID="a0393597-f5c1-4aea-83fd-dd94aa5018a8" containerID="ea06cabb52460abfed294bbd120945f25757f9c70c094be62966355530e890ec" exitCode=0 Dec 09 11:45:56 crc kubenswrapper[5002]: I1209 11:45:56.079067 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-qf6zt" event={"ID":"a0393597-f5c1-4aea-83fd-dd94aa5018a8","Type":"ContainerDied","Data":"ea06cabb52460abfed294bbd120945f25757f9c70c094be62966355530e890ec"} Dec 09 11:45:57 crc kubenswrapper[5002]: I1209 11:45:57.193503 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 09 11:45:57 crc kubenswrapper[5002]: I1209 11:45:57.771484 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-qf6zt" Dec 09 11:45:57 crc kubenswrapper[5002]: I1209 11:45:57.915366 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/a0393597-f5c1-4aea-83fd-dd94aa5018a8-job-config-data\") pod \"a0393597-f5c1-4aea-83fd-dd94aa5018a8\" (UID: \"a0393597-f5c1-4aea-83fd-dd94aa5018a8\") " Dec 09 11:45:57 crc kubenswrapper[5002]: I1209 11:45:57.915655 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0393597-f5c1-4aea-83fd-dd94aa5018a8-combined-ca-bundle\") pod \"a0393597-f5c1-4aea-83fd-dd94aa5018a8\" (UID: \"a0393597-f5c1-4aea-83fd-dd94aa5018a8\") " Dec 09 11:45:57 crc kubenswrapper[5002]: I1209 11:45:57.915721 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5txdt\" (UniqueName: \"kubernetes.io/projected/a0393597-f5c1-4aea-83fd-dd94aa5018a8-kube-api-access-5txdt\") pod \"a0393597-f5c1-4aea-83fd-dd94aa5018a8\" (UID: \"a0393597-f5c1-4aea-83fd-dd94aa5018a8\") " Dec 09 11:45:57 crc kubenswrapper[5002]: I1209 11:45:57.915753 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0393597-f5c1-4aea-83fd-dd94aa5018a8-config-data\") pod \"a0393597-f5c1-4aea-83fd-dd94aa5018a8\" (UID: \"a0393597-f5c1-4aea-83fd-dd94aa5018a8\") " Dec 09 11:45:57 crc kubenswrapper[5002]: I1209 11:45:57.921712 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0393597-f5c1-4aea-83fd-dd94aa5018a8-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "a0393597-f5c1-4aea-83fd-dd94aa5018a8" (UID: "a0393597-f5c1-4aea-83fd-dd94aa5018a8"). InnerVolumeSpecName "job-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:45:57 crc kubenswrapper[5002]: I1209 11:45:57.926332 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0393597-f5c1-4aea-83fd-dd94aa5018a8-config-data" (OuterVolumeSpecName: "config-data") pod "a0393597-f5c1-4aea-83fd-dd94aa5018a8" (UID: "a0393597-f5c1-4aea-83fd-dd94aa5018a8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:45:57 crc kubenswrapper[5002]: I1209 11:45:57.937756 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0393597-f5c1-4aea-83fd-dd94aa5018a8-kube-api-access-5txdt" (OuterVolumeSpecName: "kube-api-access-5txdt") pod "a0393597-f5c1-4aea-83fd-dd94aa5018a8" (UID: "a0393597-f5c1-4aea-83fd-dd94aa5018a8"). InnerVolumeSpecName "kube-api-access-5txdt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:45:57 crc kubenswrapper[5002]: I1209 11:45:57.951620 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0393597-f5c1-4aea-83fd-dd94aa5018a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0393597-f5c1-4aea-83fd-dd94aa5018a8" (UID: "a0393597-f5c1-4aea-83fd-dd94aa5018a8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.017889 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0393597-f5c1-4aea-83fd-dd94aa5018a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.017931 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5txdt\" (UniqueName: \"kubernetes.io/projected/a0393597-f5c1-4aea-83fd-dd94aa5018a8-kube-api-access-5txdt\") on node \"crc\" DevicePath \"\"" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.017945 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0393597-f5c1-4aea-83fd-dd94aa5018a8-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.017959 5002 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/a0393597-f5c1-4aea-83fd-dd94aa5018a8-job-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.112758 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-qf6zt" event={"ID":"a0393597-f5c1-4aea-83fd-dd94aa5018a8","Type":"ContainerDied","Data":"cac37e27ee2d46630bd0daaa2f32039d0f797049b39d8de9774788cfcd620174"} Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.112838 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cac37e27ee2d46630bd0daaa2f32039d0f797049b39d8de9774788cfcd620174" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.112859 5002 util.go:48] "No ready sandbox for pod can be found. 
Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.628008 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"]
Dec 09 11:45:58 crc kubenswrapper[5002]: E1209 11:45:58.628872 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0393597-f5c1-4aea-83fd-dd94aa5018a8" containerName="manila-db-sync"
Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.628887 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0393597-f5c1-4aea-83fd-dd94aa5018a8" containerName="manila-db-sync"
Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.629134 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0393597-f5c1-4aea-83fd-dd94aa5018a8" containerName="manila-db-sync"
Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.630452 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0"
Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.633093 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-t5zsh"
Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.633282 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data"
Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.641311 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7865f250-f9ab-4946-bf60-7cd667f2dbfa-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"7865f250-f9ab-4946-bf60-7cd667f2dbfa\") " pod="openstack/manila-scheduler-0"
Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.641455 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q47m7\" (UniqueName: \"kubernetes.io/projected/7865f250-f9ab-4946-bf60-7cd667f2dbfa-kube-api-access-q47m7\") pod \"manila-scheduler-0\" (UID: \"7865f250-f9ab-4946-bf60-7cd667f2dbfa\") " pod="openstack/manila-scheduler-0"
Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.641558 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7865f250-f9ab-4946-bf60-7cd667f2dbfa-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"7865f250-f9ab-4946-bf60-7cd667f2dbfa\") " pod="openstack/manila-scheduler-0"
Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.641628 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7865f250-f9ab-4946-bf60-7cd667f2dbfa-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"7865f250-f9ab-4946-bf60-7cd667f2dbfa\") " pod="openstack/manila-scheduler-0"
Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.641670 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7865f250-f9ab-4946-bf60-7cd667f2dbfa-scripts\") pod \"manila-scheduler-0\" (UID: \"7865f250-f9ab-4946-bf60-7cd667f2dbfa\") " pod="openstack/manila-scheduler-0"
Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.641970 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7865f250-f9ab-4946-bf60-7cd667f2dbfa-config-data\") pod \"manila-scheduler-0\" (UID: \"7865f250-f9ab-4946-bf60-7cd667f2dbfa\") " pod="openstack/manila-scheduler-0"
\"manila-scheduler-0\" (UID: \"7865f250-f9ab-4946-bf60-7cd667f2dbfa\") " pod="openstack/manila-scheduler-0" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.643648 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.645562 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.649255 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.649883 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.650065 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.658213 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.673146 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.701576 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55586cc989-qn4kg"] Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.703765 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55586cc989-qn4kg" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.717871 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55586cc989-qn4kg"] Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.743450 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22z7c\" (UniqueName: \"kubernetes.io/projected/bfb0ee03-26c6-4869-904a-519db3cf6dd0-kube-api-access-22z7c\") pod \"dnsmasq-dns-55586cc989-qn4kg\" (UID: \"bfb0ee03-26c6-4869-904a-519db3cf6dd0\") " pod="openstack/dnsmasq-dns-55586cc989-qn4kg" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.743542 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7865f250-f9ab-4946-bf60-7cd667f2dbfa-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"7865f250-f9ab-4946-bf60-7cd667f2dbfa\") " pod="openstack/manila-scheduler-0" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.743578 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7865f250-f9ab-4946-bf60-7cd667f2dbfa-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"7865f250-f9ab-4946-bf60-7cd667f2dbfa\") " pod="openstack/manila-scheduler-0" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.743617 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7865f250-f9ab-4946-bf60-7cd667f2dbfa-scripts\") pod \"manila-scheduler-0\" (UID: \"7865f250-f9ab-4946-bf60-7cd667f2dbfa\") " pod="openstack/manila-scheduler-0" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.743653 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b4edcd85-a7e2-4745-bb74-b3babfa3caf0-etc-machine-id\") pod 
\"manila-share-share1-0\" (UID: \"b4edcd85-a7e2-4745-bb74-b3babfa3caf0\") " pod="openstack/manila-share-share1-0" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.743675 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/b4edcd85-a7e2-4745-bb74-b3babfa3caf0-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"b4edcd85-a7e2-4745-bb74-b3babfa3caf0\") " pod="openstack/manila-share-share1-0" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.743696 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b4edcd85-a7e2-4745-bb74-b3babfa3caf0-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"b4edcd85-a7e2-4745-bb74-b3babfa3caf0\") " pod="openstack/manila-share-share1-0" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.745565 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfb0ee03-26c6-4869-904a-519db3cf6dd0-config\") pod \"dnsmasq-dns-55586cc989-qn4kg\" (UID: \"bfb0ee03-26c6-4869-904a-519db3cf6dd0\") " pod="openstack/dnsmasq-dns-55586cc989-qn4kg" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.745690 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bfb0ee03-26c6-4869-904a-519db3cf6dd0-ovsdbserver-sb\") pod \"dnsmasq-dns-55586cc989-qn4kg\" (UID: \"bfb0ee03-26c6-4869-904a-519db3cf6dd0\") " pod="openstack/dnsmasq-dns-55586cc989-qn4kg" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.745758 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4edcd85-a7e2-4745-bb74-b3babfa3caf0-scripts\") pod \"manila-share-share1-0\" (UID: \"b4edcd85-a7e2-4745-bb74-b3babfa3caf0\") " pod="openstack/manila-share-share1-0" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.745788 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4edcd85-a7e2-4745-bb74-b3babfa3caf0-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"b4edcd85-a7e2-4745-bb74-b3babfa3caf0\") " pod="openstack/manila-share-share1-0" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.745892 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4edcd85-a7e2-4745-bb74-b3babfa3caf0-config-data\") pod \"manila-share-share1-0\" (UID: \"b4edcd85-a7e2-4745-bb74-b3babfa3caf0\") " pod="openstack/manila-share-share1-0" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.745977 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7865f250-f9ab-4946-bf60-7cd667f2dbfa-config-data\") pod \"manila-scheduler-0\" (UID: \"7865f250-f9ab-4946-bf60-7cd667f2dbfa\") " pod="openstack/manila-scheduler-0" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.746024 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfb0ee03-26c6-4869-904a-519db3cf6dd0-dns-svc\") pod \"dnsmasq-dns-55586cc989-qn4kg\" (UID: 
\"bfb0ee03-26c6-4869-904a-519db3cf6dd0\") " pod="openstack/dnsmasq-dns-55586cc989-qn4kg" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.746084 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b4edcd85-a7e2-4745-bb74-b3babfa3caf0-ceph\") pod \"manila-share-share1-0\" (UID: \"b4edcd85-a7e2-4745-bb74-b3babfa3caf0\") " pod="openstack/manila-share-share1-0" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.746118 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7865f250-f9ab-4946-bf60-7cd667f2dbfa-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"7865f250-f9ab-4946-bf60-7cd667f2dbfa\") " pod="openstack/manila-scheduler-0" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.746161 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxs62\" (UniqueName: \"kubernetes.io/projected/b4edcd85-a7e2-4745-bb74-b3babfa3caf0-kube-api-access-dxs62\") pod \"manila-share-share1-0\" (UID: \"b4edcd85-a7e2-4745-bb74-b3babfa3caf0\") " pod="openstack/manila-share-share1-0" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.746214 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bfb0ee03-26c6-4869-904a-519db3cf6dd0-ovsdbserver-nb\") pod \"dnsmasq-dns-55586cc989-qn4kg\" (UID: \"bfb0ee03-26c6-4869-904a-519db3cf6dd0\") " pod="openstack/dnsmasq-dns-55586cc989-qn4kg" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.746312 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q47m7\" (UniqueName: \"kubernetes.io/projected/7865f250-f9ab-4946-bf60-7cd667f2dbfa-kube-api-access-q47m7\") pod \"manila-scheduler-0\" (UID: \"7865f250-f9ab-4946-bf60-7cd667f2dbfa\") " pod="openstack/manila-scheduler-0" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.747793 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7865f250-f9ab-4946-bf60-7cd667f2dbfa-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"7865f250-f9ab-4946-bf60-7cd667f2dbfa\") " pod="openstack/manila-scheduler-0" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.752545 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7865f250-f9ab-4946-bf60-7cd667f2dbfa-scripts\") pod \"manila-scheduler-0\" (UID: \"7865f250-f9ab-4946-bf60-7cd667f2dbfa\") " pod="openstack/manila-scheduler-0" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.768501 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7865f250-f9ab-4946-bf60-7cd667f2dbfa-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"7865f250-f9ab-4946-bf60-7cd667f2dbfa\") " pod="openstack/manila-scheduler-0" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.779422 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7865f250-f9ab-4946-bf60-7cd667f2dbfa-config-data\") pod \"manila-scheduler-0\" (UID: \"7865f250-f9ab-4946-bf60-7cd667f2dbfa\") " pod="openstack/manila-scheduler-0" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.790877 5002 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7865f250-f9ab-4946-bf60-7cd667f2dbfa-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"7865f250-f9ab-4946-bf60-7cd667f2dbfa\") " pod="openstack/manila-scheduler-0" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.797462 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q47m7\" (UniqueName: \"kubernetes.io/projected/7865f250-f9ab-4946-bf60-7cd667f2dbfa-kube-api-access-q47m7\") pod \"manila-scheduler-0\" (UID: \"7865f250-f9ab-4946-bf60-7cd667f2dbfa\") " pod="openstack/manila-scheduler-0" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.850126 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4edcd85-a7e2-4745-bb74-b3babfa3caf0-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"b4edcd85-a7e2-4745-bb74-b3babfa3caf0\") " pod="openstack/manila-share-share1-0" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.850181 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4edcd85-a7e2-4745-bb74-b3babfa3caf0-scripts\") pod \"manila-share-share1-0\" (UID: \"b4edcd85-a7e2-4745-bb74-b3babfa3caf0\") " pod="openstack/manila-share-share1-0" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.850233 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4edcd85-a7e2-4745-bb74-b3babfa3caf0-config-data\") pod \"manila-share-share1-0\" (UID: \"b4edcd85-a7e2-4745-bb74-b3babfa3caf0\") " pod="openstack/manila-share-share1-0" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.850287 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfb0ee03-26c6-4869-904a-519db3cf6dd0-dns-svc\") pod \"dnsmasq-dns-55586cc989-qn4kg\" (UID: \"bfb0ee03-26c6-4869-904a-519db3cf6dd0\") " pod="openstack/dnsmasq-dns-55586cc989-qn4kg" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.850321 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b4edcd85-a7e2-4745-bb74-b3babfa3caf0-ceph\") pod \"manila-share-share1-0\" (UID: \"b4edcd85-a7e2-4745-bb74-b3babfa3caf0\") " pod="openstack/manila-share-share1-0" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.850347 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxs62\" (UniqueName: \"kubernetes.io/projected/b4edcd85-a7e2-4745-bb74-b3babfa3caf0-kube-api-access-dxs62\") pod \"manila-share-share1-0\" (UID: \"b4edcd85-a7e2-4745-bb74-b3babfa3caf0\") " pod="openstack/manila-share-share1-0" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.850378 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bfb0ee03-26c6-4869-904a-519db3cf6dd0-ovsdbserver-nb\") pod \"dnsmasq-dns-55586cc989-qn4kg\" (UID: \"bfb0ee03-26c6-4869-904a-519db3cf6dd0\") " pod="openstack/dnsmasq-dns-55586cc989-qn4kg" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.850447 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22z7c\" (UniqueName: \"kubernetes.io/projected/bfb0ee03-26c6-4869-904a-519db3cf6dd0-kube-api-access-22z7c\") pod 
\"dnsmasq-dns-55586cc989-qn4kg\" (UID: \"bfb0ee03-26c6-4869-904a-519db3cf6dd0\") " pod="openstack/dnsmasq-dns-55586cc989-qn4kg" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.850515 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b4edcd85-a7e2-4745-bb74-b3babfa3caf0-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"b4edcd85-a7e2-4745-bb74-b3babfa3caf0\") " pod="openstack/manila-share-share1-0" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.850535 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/b4edcd85-a7e2-4745-bb74-b3babfa3caf0-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"b4edcd85-a7e2-4745-bb74-b3babfa3caf0\") " pod="openstack/manila-share-share1-0" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.850575 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b4edcd85-a7e2-4745-bb74-b3babfa3caf0-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"b4edcd85-a7e2-4745-bb74-b3babfa3caf0\") " pod="openstack/manila-share-share1-0" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.850631 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfb0ee03-26c6-4869-904a-519db3cf6dd0-config\") pod \"dnsmasq-dns-55586cc989-qn4kg\" (UID: \"bfb0ee03-26c6-4869-904a-519db3cf6dd0\") " pod="openstack/dnsmasq-dns-55586cc989-qn4kg" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.850679 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bfb0ee03-26c6-4869-904a-519db3cf6dd0-ovsdbserver-sb\") pod \"dnsmasq-dns-55586cc989-qn4kg\" (UID: \"bfb0ee03-26c6-4869-904a-519db3cf6dd0\") " pod="openstack/dnsmasq-dns-55586cc989-qn4kg" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.851700 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bfb0ee03-26c6-4869-904a-519db3cf6dd0-ovsdbserver-sb\") pod \"dnsmasq-dns-55586cc989-qn4kg\" (UID: \"bfb0ee03-26c6-4869-904a-519db3cf6dd0\") " pod="openstack/dnsmasq-dns-55586cc989-qn4kg" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.852500 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bfb0ee03-26c6-4869-904a-519db3cf6dd0-ovsdbserver-nb\") pod \"dnsmasq-dns-55586cc989-qn4kg\" (UID: \"bfb0ee03-26c6-4869-904a-519db3cf6dd0\") " pod="openstack/dnsmasq-dns-55586cc989-qn4kg" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.853309 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfb0ee03-26c6-4869-904a-519db3cf6dd0-dns-svc\") pod \"dnsmasq-dns-55586cc989-qn4kg\" (UID: \"bfb0ee03-26c6-4869-904a-519db3cf6dd0\") " pod="openstack/dnsmasq-dns-55586cc989-qn4kg" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.855505 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/b4edcd85-a7e2-4745-bb74-b3babfa3caf0-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"b4edcd85-a7e2-4745-bb74-b3babfa3caf0\") " pod="openstack/manila-share-share1-0" Dec 09 11:45:58 crc 
kubenswrapper[5002]: I1209 11:45:58.857660 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b4edcd85-a7e2-4745-bb74-b3babfa3caf0-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"b4edcd85-a7e2-4745-bb74-b3babfa3caf0\") " pod="openstack/manila-share-share1-0" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.859845 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfb0ee03-26c6-4869-904a-519db3cf6dd0-config\") pod \"dnsmasq-dns-55586cc989-qn4kg\" (UID: \"bfb0ee03-26c6-4869-904a-519db3cf6dd0\") " pod="openstack/dnsmasq-dns-55586cc989-qn4kg" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.865093 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4edcd85-a7e2-4745-bb74-b3babfa3caf0-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"b4edcd85-a7e2-4745-bb74-b3babfa3caf0\") " pod="openstack/manila-share-share1-0" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.866508 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4edcd85-a7e2-4745-bb74-b3babfa3caf0-config-data\") pod \"manila-share-share1-0\" (UID: \"b4edcd85-a7e2-4745-bb74-b3babfa3caf0\") " pod="openstack/manila-share-share1-0" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.870321 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4edcd85-a7e2-4745-bb74-b3babfa3caf0-scripts\") pod \"manila-share-share1-0\" (UID: \"b4edcd85-a7e2-4745-bb74-b3babfa3caf0\") " pod="openstack/manila-share-share1-0" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.875560 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b4edcd85-a7e2-4745-bb74-b3babfa3caf0-ceph\") pod \"manila-share-share1-0\" (UID: \"b4edcd85-a7e2-4745-bb74-b3babfa3caf0\") " pod="openstack/manila-share-share1-0" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.886761 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxs62\" (UniqueName: \"kubernetes.io/projected/b4edcd85-a7e2-4745-bb74-b3babfa3caf0-kube-api-access-dxs62\") pod \"manila-share-share1-0\" (UID: \"b4edcd85-a7e2-4745-bb74-b3babfa3caf0\") " pod="openstack/manila-share-share1-0" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.892929 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b4edcd85-a7e2-4745-bb74-b3babfa3caf0-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"b4edcd85-a7e2-4745-bb74-b3babfa3caf0\") " pod="openstack/manila-share-share1-0" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.893428 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.895528 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.898062 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.899622 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22z7c\" (UniqueName: \"kubernetes.io/projected/bfb0ee03-26c6-4869-904a-519db3cf6dd0-kube-api-access-22z7c\") pod \"dnsmasq-dns-55586cc989-qn4kg\" (UID: \"bfb0ee03-26c6-4869-904a-519db3cf6dd0\") " pod="openstack/dnsmasq-dns-55586cc989-qn4kg" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.916871 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.960604 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Dec 09 11:45:58 crc kubenswrapper[5002]: I1209 11:45:58.974806 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Dec 09 11:45:59 crc kubenswrapper[5002]: I1209 11:45:59.029753 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55586cc989-qn4kg" Dec 09 11:45:59 crc kubenswrapper[5002]: I1209 11:45:59.068173 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/570b8a7a-298d-45bf-b6e3-eeb433ca5d87-config-data\") pod \"manila-api-0\" (UID: \"570b8a7a-298d-45bf-b6e3-eeb433ca5d87\") " pod="openstack/manila-api-0" Dec 09 11:45:59 crc kubenswrapper[5002]: I1209 11:45:59.068501 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/570b8a7a-298d-45bf-b6e3-eeb433ca5d87-scripts\") pod \"manila-api-0\" (UID: \"570b8a7a-298d-45bf-b6e3-eeb433ca5d87\") " pod="openstack/manila-api-0" Dec 09 11:45:59 crc kubenswrapper[5002]: I1209 11:45:59.068526 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/570b8a7a-298d-45bf-b6e3-eeb433ca5d87-logs\") pod \"manila-api-0\" (UID: \"570b8a7a-298d-45bf-b6e3-eeb433ca5d87\") " pod="openstack/manila-api-0" Dec 09 11:45:59 crc kubenswrapper[5002]: I1209 11:45:59.068590 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/570b8a7a-298d-45bf-b6e3-eeb433ca5d87-etc-machine-id\") pod \"manila-api-0\" (UID: \"570b8a7a-298d-45bf-b6e3-eeb433ca5d87\") " pod="openstack/manila-api-0" Dec 09 11:45:59 crc kubenswrapper[5002]: I1209 11:45:59.068622 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/570b8a7a-298d-45bf-b6e3-eeb433ca5d87-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"570b8a7a-298d-45bf-b6e3-eeb433ca5d87\") " pod="openstack/manila-api-0" Dec 09 11:45:59 crc kubenswrapper[5002]: I1209 11:45:59.068650 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/570b8a7a-298d-45bf-b6e3-eeb433ca5d87-config-data-custom\") pod \"manila-api-0\" (UID: \"570b8a7a-298d-45bf-b6e3-eeb433ca5d87\") " pod="openstack/manila-api-0" Dec 09 11:45:59 crc 
kubenswrapper[5002]: I1209 11:45:59.068697 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9q87\" (UniqueName: \"kubernetes.io/projected/570b8a7a-298d-45bf-b6e3-eeb433ca5d87-kube-api-access-n9q87\") pod \"manila-api-0\" (UID: \"570b8a7a-298d-45bf-b6e3-eeb433ca5d87\") " pod="openstack/manila-api-0" Dec 09 11:45:59 crc kubenswrapper[5002]: I1209 11:45:59.170346 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9q87\" (UniqueName: \"kubernetes.io/projected/570b8a7a-298d-45bf-b6e3-eeb433ca5d87-kube-api-access-n9q87\") pod \"manila-api-0\" (UID: \"570b8a7a-298d-45bf-b6e3-eeb433ca5d87\") " pod="openstack/manila-api-0" Dec 09 11:45:59 crc kubenswrapper[5002]: I1209 11:45:59.170424 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/570b8a7a-298d-45bf-b6e3-eeb433ca5d87-config-data\") pod \"manila-api-0\" (UID: \"570b8a7a-298d-45bf-b6e3-eeb433ca5d87\") " pod="openstack/manila-api-0" Dec 09 11:45:59 crc kubenswrapper[5002]: I1209 11:45:59.170508 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/570b8a7a-298d-45bf-b6e3-eeb433ca5d87-scripts\") pod \"manila-api-0\" (UID: \"570b8a7a-298d-45bf-b6e3-eeb433ca5d87\") " pod="openstack/manila-api-0" Dec 09 11:45:59 crc kubenswrapper[5002]: I1209 11:45:59.170532 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/570b8a7a-298d-45bf-b6e3-eeb433ca5d87-logs\") pod \"manila-api-0\" (UID: \"570b8a7a-298d-45bf-b6e3-eeb433ca5d87\") " pod="openstack/manila-api-0" Dec 09 11:45:59 crc kubenswrapper[5002]: I1209 11:45:59.170608 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/570b8a7a-298d-45bf-b6e3-eeb433ca5d87-etc-machine-id\") pod \"manila-api-0\" (UID: \"570b8a7a-298d-45bf-b6e3-eeb433ca5d87\") " pod="openstack/manila-api-0" Dec 09 11:45:59 crc kubenswrapper[5002]: I1209 11:45:59.170649 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/570b8a7a-298d-45bf-b6e3-eeb433ca5d87-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"570b8a7a-298d-45bf-b6e3-eeb433ca5d87\") " pod="openstack/manila-api-0" Dec 09 11:45:59 crc kubenswrapper[5002]: I1209 11:45:59.170678 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/570b8a7a-298d-45bf-b6e3-eeb433ca5d87-config-data-custom\") pod \"manila-api-0\" (UID: \"570b8a7a-298d-45bf-b6e3-eeb433ca5d87\") " pod="openstack/manila-api-0" Dec 09 11:45:59 crc kubenswrapper[5002]: I1209 11:45:59.171552 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/570b8a7a-298d-45bf-b6e3-eeb433ca5d87-etc-machine-id\") pod \"manila-api-0\" (UID: \"570b8a7a-298d-45bf-b6e3-eeb433ca5d87\") " pod="openstack/manila-api-0" Dec 09 11:45:59 crc kubenswrapper[5002]: I1209 11:45:59.176756 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/570b8a7a-298d-45bf-b6e3-eeb433ca5d87-config-data-custom\") pod \"manila-api-0\" (UID: \"570b8a7a-298d-45bf-b6e3-eeb433ca5d87\") " pod="openstack/manila-api-0" Dec 
09 11:45:59 crc kubenswrapper[5002]: I1209 11:45:59.177200 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/570b8a7a-298d-45bf-b6e3-eeb433ca5d87-logs\") pod \"manila-api-0\" (UID: \"570b8a7a-298d-45bf-b6e3-eeb433ca5d87\") " pod="openstack/manila-api-0" Dec 09 11:45:59 crc kubenswrapper[5002]: I1209 11:45:59.180296 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/570b8a7a-298d-45bf-b6e3-eeb433ca5d87-config-data\") pod \"manila-api-0\" (UID: \"570b8a7a-298d-45bf-b6e3-eeb433ca5d87\") " pod="openstack/manila-api-0" Dec 09 11:45:59 crc kubenswrapper[5002]: I1209 11:45:59.184238 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/570b8a7a-298d-45bf-b6e3-eeb433ca5d87-scripts\") pod \"manila-api-0\" (UID: \"570b8a7a-298d-45bf-b6e3-eeb433ca5d87\") " pod="openstack/manila-api-0" Dec 09 11:45:59 crc kubenswrapper[5002]: I1209 11:45:59.186343 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/570b8a7a-298d-45bf-b6e3-eeb433ca5d87-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"570b8a7a-298d-45bf-b6e3-eeb433ca5d87\") " pod="openstack/manila-api-0" Dec 09 11:45:59 crc kubenswrapper[5002]: I1209 11:45:59.198311 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9q87\" (UniqueName: \"kubernetes.io/projected/570b8a7a-298d-45bf-b6e3-eeb433ca5d87-kube-api-access-n9q87\") pod \"manila-api-0\" (UID: \"570b8a7a-298d-45bf-b6e3-eeb433ca5d87\") " pod="openstack/manila-api-0" Dec 09 11:45:59 crc kubenswrapper[5002]: I1209 11:45:59.356219 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Dec 09 11:45:59 crc kubenswrapper[5002]: I1209 11:45:59.571979 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Dec 09 11:45:59 crc kubenswrapper[5002]: I1209 11:45:59.666185 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Dec 09 11:45:59 crc kubenswrapper[5002]: W1209 11:45:59.668212 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4edcd85_a7e2_4745_bb74_b3babfa3caf0.slice/crio-32d20437a00c8b0b7c0abcb3c44d6b969a0b4c34b80f8d3c1b84062974554267 WatchSource:0}: Error finding container 32d20437a00c8b0b7c0abcb3c44d6b969a0b4c34b80f8d3c1b84062974554267: Status 404 returned error can't find the container with id 32d20437a00c8b0b7c0abcb3c44d6b969a0b4c34b80f8d3c1b84062974554267 Dec 09 11:45:59 crc kubenswrapper[5002]: I1209 11:45:59.710683 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55586cc989-qn4kg"] Dec 09 11:45:59 crc kubenswrapper[5002]: W1209 11:45:59.717498 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbfb0ee03_26c6_4869_904a_519db3cf6dd0.slice/crio-668c77304a933c092b1abb98e5ed9a7a85a9943a007f5c6e4f495dcbf4e9f1d5 WatchSource:0}: Error finding container 668c77304a933c092b1abb98e5ed9a7a85a9943a007f5c6e4f495dcbf4e9f1d5: Status 404 returned error can't find the container with id 668c77304a933c092b1abb98e5ed9a7a85a9943a007f5c6e4f495dcbf4e9f1d5 Dec 09 11:45:59 crc kubenswrapper[5002]: W1209 11:45:59.967927 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod570b8a7a_298d_45bf_b6e3_eeb433ca5d87.slice/crio-2af5f3631dd4002fc12bedc6b4836c162587a00b3a0b6d538e7e73379083e752 WatchSource:0}: Error finding container 2af5f3631dd4002fc12bedc6b4836c162587a00b3a0b6d538e7e73379083e752: Status 404 returned error can't find the container with id 2af5f3631dd4002fc12bedc6b4836c162587a00b3a0b6d538e7e73379083e752 Dec 09 11:45:59 crc kubenswrapper[5002]: I1209 11:45:59.985079 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Dec 09 11:46:00 crc kubenswrapper[5002]: I1209 11:46:00.061080 5002 scope.go:117] "RemoveContainer" containerID="39962d0376837cc534e6b0a62303166efdae767fb36cfb81ae7c7eb077d56c3e" Dec 09 11:46:00 crc kubenswrapper[5002]: E1209 11:46:00.061386 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:46:00 crc kubenswrapper[5002]: I1209 11:46:00.151487 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"570b8a7a-298d-45bf-b6e3-eeb433ca5d87","Type":"ContainerStarted","Data":"2af5f3631dd4002fc12bedc6b4836c162587a00b3a0b6d538e7e73379083e752"} Dec 09 11:46:00 crc kubenswrapper[5002]: I1209 11:46:00.152680 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" 
event={"ID":"b4edcd85-a7e2-4745-bb74-b3babfa3caf0","Type":"ContainerStarted","Data":"32d20437a00c8b0b7c0abcb3c44d6b969a0b4c34b80f8d3c1b84062974554267"} Dec 09 11:46:00 crc kubenswrapper[5002]: I1209 11:46:00.160364 5002 generic.go:334] "Generic (PLEG): container finished" podID="bfb0ee03-26c6-4869-904a-519db3cf6dd0" containerID="c4408813849003899ec2f36d1464583978407a99e49c8a86ca988edd9bccb03b" exitCode=0 Dec 09 11:46:00 crc kubenswrapper[5002]: I1209 11:46:00.160470 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55586cc989-qn4kg" event={"ID":"bfb0ee03-26c6-4869-904a-519db3cf6dd0","Type":"ContainerDied","Data":"c4408813849003899ec2f36d1464583978407a99e49c8a86ca988edd9bccb03b"} Dec 09 11:46:00 crc kubenswrapper[5002]: I1209 11:46:00.160503 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55586cc989-qn4kg" event={"ID":"bfb0ee03-26c6-4869-904a-519db3cf6dd0","Type":"ContainerStarted","Data":"668c77304a933c092b1abb98e5ed9a7a85a9943a007f5c6e4f495dcbf4e9f1d5"} Dec 09 11:46:00 crc kubenswrapper[5002]: I1209 11:46:00.170570 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"7865f250-f9ab-4946-bf60-7cd667f2dbfa","Type":"ContainerStarted","Data":"93a59ad362e435dd2d3aff44f77be0293cc1ea7c9ad1054f6f8009030f844eb2"} Dec 09 11:46:01 crc kubenswrapper[5002]: I1209 11:46:01.181597 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"570b8a7a-298d-45bf-b6e3-eeb433ca5d87","Type":"ContainerStarted","Data":"bef3541fafa98f547eea3964df608199ca00a07aba1fd787f69b4e8118a73863"} Dec 09 11:46:01 crc kubenswrapper[5002]: I1209 11:46:01.182262 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"570b8a7a-298d-45bf-b6e3-eeb433ca5d87","Type":"ContainerStarted","Data":"b25c48eaa392b22a10b5046f469665b5ef202420d949417e0300f554f14c3156"} Dec 09 11:46:01 crc kubenswrapper[5002]: I1209 11:46:01.182459 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Dec 09 11:46:01 crc kubenswrapper[5002]: I1209 11:46:01.188510 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55586cc989-qn4kg" event={"ID":"bfb0ee03-26c6-4869-904a-519db3cf6dd0","Type":"ContainerStarted","Data":"03f71dc9eb0315728217af22e84ceed384d946dde5f375c38faf7d6cbbb98e87"} Dec 09 11:46:01 crc kubenswrapper[5002]: I1209 11:46:01.188878 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55586cc989-qn4kg" Dec 09 11:46:01 crc kubenswrapper[5002]: I1209 11:46:01.216010 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=3.215987665 podStartE2EDuration="3.215987665s" podCreationTimestamp="2025-12-09 11:45:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:46:01.199122513 +0000 UTC m=+6293.591173594" watchObservedRunningTime="2025-12-09 11:46:01.215987665 +0000 UTC m=+6293.608038746" Dec 09 11:46:01 crc kubenswrapper[5002]: I1209 11:46:01.249053 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55586cc989-qn4kg" podStartSLOduration=3.24903356 podStartE2EDuration="3.24903356s" podCreationTimestamp="2025-12-09 11:45:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-09 11:46:01.218318877 +0000 UTC m=+6293.610369958" watchObservedRunningTime="2025-12-09 11:46:01.24903356 +0000 UTC m=+6293.641084641" Dec 09 11:46:02 crc kubenswrapper[5002]: I1209 11:46:02.209004 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"7865f250-f9ab-4946-bf60-7cd667f2dbfa","Type":"ContainerStarted","Data":"c0bf400bfe1daeac0dec3121ae2c86b048c99f4e815a7e03ac10a150e1082724"} Dec 09 11:46:03 crc kubenswrapper[5002]: I1209 11:46:03.221352 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"7865f250-f9ab-4946-bf60-7cd667f2dbfa","Type":"ContainerStarted","Data":"927068f6e8c0842aba44ed5a793fa45e85b6dbc5ce65fc13ea169ca31da76a2a"} Dec 09 11:46:03 crc kubenswrapper[5002]: I1209 11:46:03.253435 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=4.234106355 podStartE2EDuration="5.253416466s" podCreationTimestamp="2025-12-09 11:45:58 +0000 UTC" firstStartedPulling="2025-12-09 11:45:59.584193769 +0000 UTC m=+6291.976244850" lastFinishedPulling="2025-12-09 11:46:00.60350388 +0000 UTC m=+6292.995554961" observedRunningTime="2025-12-09 11:46:03.242265518 +0000 UTC m=+6295.634316599" watchObservedRunningTime="2025-12-09 11:46:03.253416466 +0000 UTC m=+6295.645467547" Dec 09 11:46:06 crc kubenswrapper[5002]: I1209 11:46:06.253746 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"b4edcd85-a7e2-4745-bb74-b3babfa3caf0","Type":"ContainerStarted","Data":"1d62f7abb85a5e2bdf4807adab6d461a3363c208f52ccc27d950cbaf19eb7d9e"} Dec 09 11:46:07 crc kubenswrapper[5002]: I1209 11:46:07.269798 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"b4edcd85-a7e2-4745-bb74-b3babfa3caf0","Type":"ContainerStarted","Data":"0e74ce2da7a973187a170b4befb6e7ee72f4e4da1ca4f1787ab8d12f607ca823"} Dec 09 11:46:07 crc kubenswrapper[5002]: I1209 11:46:07.309111 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.421992891 podStartE2EDuration="9.309085655s" podCreationTimestamp="2025-12-09 11:45:58 +0000 UTC" firstStartedPulling="2025-12-09 11:45:59.670948041 +0000 UTC m=+6292.062999112" lastFinishedPulling="2025-12-09 11:46:05.558040795 +0000 UTC m=+6297.950091876" observedRunningTime="2025-12-09 11:46:07.298410019 +0000 UTC m=+6299.690461120" watchObservedRunningTime="2025-12-09 11:46:07.309085655 +0000 UTC m=+6299.701136736" Dec 09 11:46:08 crc kubenswrapper[5002]: I1209 11:46:08.961415 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Dec 09 11:46:08 crc kubenswrapper[5002]: I1209 11:46:08.975761 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Dec 09 11:46:09 crc kubenswrapper[5002]: I1209 11:46:09.031976 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55586cc989-qn4kg" Dec 09 11:46:09 crc kubenswrapper[5002]: I1209 11:46:09.102732 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dc68b6c7-9s9kk"] Dec 09 11:46:09 crc kubenswrapper[5002]: I1209 11:46:09.290400 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58dc68b6c7-9s9kk" podUID="e7c18105-50bb-4814-aebb-df97e8948e01" 
containerName="dnsmasq-dns" containerID="cri-o://22214553af323e7765f9637d4e163d535bd46c20164d2b33de78f9cd8238f4e8" gracePeriod=10 Dec 09 11:46:10 crc kubenswrapper[5002]: I1209 11:46:10.001022 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dc68b6c7-9s9kk" Dec 09 11:46:10 crc kubenswrapper[5002]: I1209 11:46:10.082212 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7c18105-50bb-4814-aebb-df97e8948e01-config\") pod \"e7c18105-50bb-4814-aebb-df97e8948e01\" (UID: \"e7c18105-50bb-4814-aebb-df97e8948e01\") " Dec 09 11:46:10 crc kubenswrapper[5002]: I1209 11:46:10.082257 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7c18105-50bb-4814-aebb-df97e8948e01-dns-svc\") pod \"e7c18105-50bb-4814-aebb-df97e8948e01\" (UID: \"e7c18105-50bb-4814-aebb-df97e8948e01\") " Dec 09 11:46:10 crc kubenswrapper[5002]: I1209 11:46:10.082420 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7c18105-50bb-4814-aebb-df97e8948e01-ovsdbserver-nb\") pod \"e7c18105-50bb-4814-aebb-df97e8948e01\" (UID: \"e7c18105-50bb-4814-aebb-df97e8948e01\") " Dec 09 11:46:10 crc kubenswrapper[5002]: I1209 11:46:10.082520 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s96bx\" (UniqueName: \"kubernetes.io/projected/e7c18105-50bb-4814-aebb-df97e8948e01-kube-api-access-s96bx\") pod \"e7c18105-50bb-4814-aebb-df97e8948e01\" (UID: \"e7c18105-50bb-4814-aebb-df97e8948e01\") " Dec 09 11:46:10 crc kubenswrapper[5002]: I1209 11:46:10.082603 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7c18105-50bb-4814-aebb-df97e8948e01-ovsdbserver-sb\") pod \"e7c18105-50bb-4814-aebb-df97e8948e01\" (UID: \"e7c18105-50bb-4814-aebb-df97e8948e01\") " Dec 09 11:46:10 crc kubenswrapper[5002]: I1209 11:46:10.156928 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7c18105-50bb-4814-aebb-df97e8948e01-kube-api-access-s96bx" (OuterVolumeSpecName: "kube-api-access-s96bx") pod "e7c18105-50bb-4814-aebb-df97e8948e01" (UID: "e7c18105-50bb-4814-aebb-df97e8948e01"). InnerVolumeSpecName "kube-api-access-s96bx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:46:10 crc kubenswrapper[5002]: I1209 11:46:10.184912 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s96bx\" (UniqueName: \"kubernetes.io/projected/e7c18105-50bb-4814-aebb-df97e8948e01-kube-api-access-s96bx\") on node \"crc\" DevicePath \"\"" Dec 09 11:46:10 crc kubenswrapper[5002]: I1209 11:46:10.238161 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7c18105-50bb-4814-aebb-df97e8948e01-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e7c18105-50bb-4814-aebb-df97e8948e01" (UID: "e7c18105-50bb-4814-aebb-df97e8948e01"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:46:10 crc kubenswrapper[5002]: I1209 11:46:10.238580 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7c18105-50bb-4814-aebb-df97e8948e01-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e7c18105-50bb-4814-aebb-df97e8948e01" (UID: "e7c18105-50bb-4814-aebb-df97e8948e01"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:46:10 crc kubenswrapper[5002]: I1209 11:46:10.240280 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7c18105-50bb-4814-aebb-df97e8948e01-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e7c18105-50bb-4814-aebb-df97e8948e01" (UID: "e7c18105-50bb-4814-aebb-df97e8948e01"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:46:10 crc kubenswrapper[5002]: I1209 11:46:10.247119 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7c18105-50bb-4814-aebb-df97e8948e01-config" (OuterVolumeSpecName: "config") pod "e7c18105-50bb-4814-aebb-df97e8948e01" (UID: "e7c18105-50bb-4814-aebb-df97e8948e01"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:46:10 crc kubenswrapper[5002]: I1209 11:46:10.286860 5002 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7c18105-50bb-4814-aebb-df97e8948e01-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 09 11:46:10 crc kubenswrapper[5002]: I1209 11:46:10.287417 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7c18105-50bb-4814-aebb-df97e8948e01-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:46:10 crc kubenswrapper[5002]: I1209 11:46:10.287481 5002 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7c18105-50bb-4814-aebb-df97e8948e01-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 11:46:10 crc kubenswrapper[5002]: I1209 11:46:10.287553 5002 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7c18105-50bb-4814-aebb-df97e8948e01-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 11:46:10 crc kubenswrapper[5002]: I1209 11:46:10.301901 5002 generic.go:334] "Generic (PLEG): container finished" podID="e7c18105-50bb-4814-aebb-df97e8948e01" containerID="22214553af323e7765f9637d4e163d535bd46c20164d2b33de78f9cd8238f4e8" exitCode=0 Dec 09 11:46:10 crc kubenswrapper[5002]: I1209 11:46:10.301946 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dc68b6c7-9s9kk" event={"ID":"e7c18105-50bb-4814-aebb-df97e8948e01","Type":"ContainerDied","Data":"22214553af323e7765f9637d4e163d535bd46c20164d2b33de78f9cd8238f4e8"} Dec 09 11:46:10 crc kubenswrapper[5002]: I1209 11:46:10.301974 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dc68b6c7-9s9kk" event={"ID":"e7c18105-50bb-4814-aebb-df97e8948e01","Type":"ContainerDied","Data":"4e303effba94515a8e34d5ff108296121c4cf8929a6a64e8d08b18efaa17c189"} Dec 09 11:46:10 crc kubenswrapper[5002]: I1209 11:46:10.301996 5002 scope.go:117] "RemoveContainer" containerID="22214553af323e7765f9637d4e163d535bd46c20164d2b33de78f9cd8238f4e8" Dec 09 11:46:10 crc kubenswrapper[5002]: I1209 11:46:10.302020 5002 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dc68b6c7-9s9kk" Dec 09 11:46:10 crc kubenswrapper[5002]: I1209 11:46:10.340593 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dc68b6c7-9s9kk"] Dec 09 11:46:10 crc kubenswrapper[5002]: I1209 11:46:10.340971 5002 scope.go:117] "RemoveContainer" containerID="632d80b35a4374799f90b84c6841468d945f85dfec8ee9ef5b70b7083f4ea4c4" Dec 09 11:46:10 crc kubenswrapper[5002]: I1209 11:46:10.355958 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58dc68b6c7-9s9kk"] Dec 09 11:46:10 crc kubenswrapper[5002]: I1209 11:46:10.374153 5002 scope.go:117] "RemoveContainer" containerID="22214553af323e7765f9637d4e163d535bd46c20164d2b33de78f9cd8238f4e8" Dec 09 11:46:10 crc kubenswrapper[5002]: E1209 11:46:10.374582 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22214553af323e7765f9637d4e163d535bd46c20164d2b33de78f9cd8238f4e8\": container with ID starting with 22214553af323e7765f9637d4e163d535bd46c20164d2b33de78f9cd8238f4e8 not found: ID does not exist" containerID="22214553af323e7765f9637d4e163d535bd46c20164d2b33de78f9cd8238f4e8" Dec 09 11:46:10 crc kubenswrapper[5002]: I1209 11:46:10.374621 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22214553af323e7765f9637d4e163d535bd46c20164d2b33de78f9cd8238f4e8"} err="failed to get container status \"22214553af323e7765f9637d4e163d535bd46c20164d2b33de78f9cd8238f4e8\": rpc error: code = NotFound desc = could not find container \"22214553af323e7765f9637d4e163d535bd46c20164d2b33de78f9cd8238f4e8\": container with ID starting with 22214553af323e7765f9637d4e163d535bd46c20164d2b33de78f9cd8238f4e8 not found: ID does not exist" Dec 09 11:46:10 crc kubenswrapper[5002]: I1209 11:46:10.374647 5002 scope.go:117] "RemoveContainer" containerID="632d80b35a4374799f90b84c6841468d945f85dfec8ee9ef5b70b7083f4ea4c4" Dec 09 11:46:10 crc kubenswrapper[5002]: E1209 11:46:10.375316 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"632d80b35a4374799f90b84c6841468d945f85dfec8ee9ef5b70b7083f4ea4c4\": container with ID starting with 632d80b35a4374799f90b84c6841468d945f85dfec8ee9ef5b70b7083f4ea4c4 not found: ID does not exist" containerID="632d80b35a4374799f90b84c6841468d945f85dfec8ee9ef5b70b7083f4ea4c4" Dec 09 11:46:10 crc kubenswrapper[5002]: I1209 11:46:10.375397 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"632d80b35a4374799f90b84c6841468d945f85dfec8ee9ef5b70b7083f4ea4c4"} err="failed to get container status \"632d80b35a4374799f90b84c6841468d945f85dfec8ee9ef5b70b7083f4ea4c4\": rpc error: code = NotFound desc = could not find container \"632d80b35a4374799f90b84c6841468d945f85dfec8ee9ef5b70b7083f4ea4c4\": container with ID starting with 632d80b35a4374799f90b84c6841468d945f85dfec8ee9ef5b70b7083f4ea4c4 not found: ID does not exist" Dec 09 11:46:11 crc kubenswrapper[5002]: I1209 11:46:11.664881 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 09 11:46:11 crc kubenswrapper[5002]: I1209 11:46:11.665419 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a0f4215e-37c9-4892-888e-ce9bd74ece35" containerName="ceilometer-central-agent" 
containerID="cri-o://bbdb56412785dfdc9ba983e618a3a9950aaaf769987127947e4a6e870b81f224" gracePeriod=30 Dec 09 11:46:11 crc kubenswrapper[5002]: I1209 11:46:11.665513 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a0f4215e-37c9-4892-888e-ce9bd74ece35" containerName="ceilometer-notification-agent" containerID="cri-o://4fbb1ccd354656789909bc4d182996a73299e8845664e451baa791722c051d16" gracePeriod=30 Dec 09 11:46:11 crc kubenswrapper[5002]: I1209 11:46:11.665508 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a0f4215e-37c9-4892-888e-ce9bd74ece35" containerName="sg-core" containerID="cri-o://514b7afac22fec7fb5a4558a856ef18041628d5091e489495792c3ddcfdfbf57" gracePeriod=30 Dec 09 11:46:11 crc kubenswrapper[5002]: I1209 11:46:11.665537 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a0f4215e-37c9-4892-888e-ce9bd74ece35" containerName="proxy-httpd" containerID="cri-o://5f0125f21b371e4f5fb61040909e63f0660711e15f25a5cedfecde1bf6bda100" gracePeriod=30 Dec 09 11:46:12 crc kubenswrapper[5002]: I1209 11:46:12.061683 5002 scope.go:117] "RemoveContainer" containerID="39962d0376837cc534e6b0a62303166efdae767fb36cfb81ae7c7eb077d56c3e" Dec 09 11:46:12 crc kubenswrapper[5002]: I1209 11:46:12.080494 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7c18105-50bb-4814-aebb-df97e8948e01" path="/var/lib/kubelet/pods/e7c18105-50bb-4814-aebb-df97e8948e01/volumes" Dec 09 11:46:12 crc kubenswrapper[5002]: I1209 11:46:12.333327 5002 generic.go:334] "Generic (PLEG): container finished" podID="a0f4215e-37c9-4892-888e-ce9bd74ece35" containerID="5f0125f21b371e4f5fb61040909e63f0660711e15f25a5cedfecde1bf6bda100" exitCode=0 Dec 09 11:46:12 crc kubenswrapper[5002]: I1209 11:46:12.333932 5002 generic.go:334] "Generic (PLEG): container finished" podID="a0f4215e-37c9-4892-888e-ce9bd74ece35" containerID="514b7afac22fec7fb5a4558a856ef18041628d5091e489495792c3ddcfdfbf57" exitCode=2 Dec 09 11:46:12 crc kubenswrapper[5002]: I1209 11:46:12.333948 5002 generic.go:334] "Generic (PLEG): container finished" podID="a0f4215e-37c9-4892-888e-ce9bd74ece35" containerID="bbdb56412785dfdc9ba983e618a3a9950aaaf769987127947e4a6e870b81f224" exitCode=0 Dec 09 11:46:12 crc kubenswrapper[5002]: I1209 11:46:12.333407 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0f4215e-37c9-4892-888e-ce9bd74ece35","Type":"ContainerDied","Data":"5f0125f21b371e4f5fb61040909e63f0660711e15f25a5cedfecde1bf6bda100"} Dec 09 11:46:12 crc kubenswrapper[5002]: I1209 11:46:12.334004 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0f4215e-37c9-4892-888e-ce9bd74ece35","Type":"ContainerDied","Data":"514b7afac22fec7fb5a4558a856ef18041628d5091e489495792c3ddcfdfbf57"} Dec 09 11:46:12 crc kubenswrapper[5002]: I1209 11:46:12.334024 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0f4215e-37c9-4892-888e-ce9bd74ece35","Type":"ContainerDied","Data":"bbdb56412785dfdc9ba983e618a3a9950aaaf769987127947e4a6e870b81f224"} Dec 09 11:46:12 crc kubenswrapper[5002]: I1209 11:46:12.729760 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 09 11:46:12 crc kubenswrapper[5002]: I1209 11:46:12.852717 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0f4215e-37c9-4892-888e-ce9bd74ece35-run-httpd\") pod \"a0f4215e-37c9-4892-888e-ce9bd74ece35\" (UID: \"a0f4215e-37c9-4892-888e-ce9bd74ece35\") " Dec 09 11:46:12 crc kubenswrapper[5002]: I1209 11:46:12.852849 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0f4215e-37c9-4892-888e-ce9bd74ece35-config-data\") pod \"a0f4215e-37c9-4892-888e-ce9bd74ece35\" (UID: \"a0f4215e-37c9-4892-888e-ce9bd74ece35\") " Dec 09 11:46:12 crc kubenswrapper[5002]: I1209 11:46:12.852985 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a0f4215e-37c9-4892-888e-ce9bd74ece35-sg-core-conf-yaml\") pod \"a0f4215e-37c9-4892-888e-ce9bd74ece35\" (UID: \"a0f4215e-37c9-4892-888e-ce9bd74ece35\") " Dec 09 11:46:12 crc kubenswrapper[5002]: I1209 11:46:12.853026 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zx6p\" (UniqueName: \"kubernetes.io/projected/a0f4215e-37c9-4892-888e-ce9bd74ece35-kube-api-access-8zx6p\") pod \"a0f4215e-37c9-4892-888e-ce9bd74ece35\" (UID: \"a0f4215e-37c9-4892-888e-ce9bd74ece35\") " Dec 09 11:46:12 crc kubenswrapper[5002]: I1209 11:46:12.853074 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0f4215e-37c9-4892-888e-ce9bd74ece35-scripts\") pod \"a0f4215e-37c9-4892-888e-ce9bd74ece35\" (UID: \"a0f4215e-37c9-4892-888e-ce9bd74ece35\") " Dec 09 11:46:12 crc kubenswrapper[5002]: I1209 11:46:12.853110 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0f4215e-37c9-4892-888e-ce9bd74ece35-combined-ca-bundle\") pod \"a0f4215e-37c9-4892-888e-ce9bd74ece35\" (UID: \"a0f4215e-37c9-4892-888e-ce9bd74ece35\") " Dec 09 11:46:12 crc kubenswrapper[5002]: I1209 11:46:12.853221 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0f4215e-37c9-4892-888e-ce9bd74ece35-log-httpd\") pod \"a0f4215e-37c9-4892-888e-ce9bd74ece35\" (UID: \"a0f4215e-37c9-4892-888e-ce9bd74ece35\") " Dec 09 11:46:12 crc kubenswrapper[5002]: I1209 11:46:12.853357 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0f4215e-37c9-4892-888e-ce9bd74ece35-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a0f4215e-37c9-4892-888e-ce9bd74ece35" (UID: "a0f4215e-37c9-4892-888e-ce9bd74ece35"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:46:12 crc kubenswrapper[5002]: I1209 11:46:12.853790 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0f4215e-37c9-4892-888e-ce9bd74ece35-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a0f4215e-37c9-4892-888e-ce9bd74ece35" (UID: "a0f4215e-37c9-4892-888e-ce9bd74ece35"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:46:12 crc kubenswrapper[5002]: I1209 11:46:12.854006 5002 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0f4215e-37c9-4892-888e-ce9bd74ece35-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 11:46:12 crc kubenswrapper[5002]: I1209 11:46:12.854021 5002 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0f4215e-37c9-4892-888e-ce9bd74ece35-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 11:46:12 crc kubenswrapper[5002]: I1209 11:46:12.862382 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0f4215e-37c9-4892-888e-ce9bd74ece35-scripts" (OuterVolumeSpecName: "scripts") pod "a0f4215e-37c9-4892-888e-ce9bd74ece35" (UID: "a0f4215e-37c9-4892-888e-ce9bd74ece35"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:46:12 crc kubenswrapper[5002]: I1209 11:46:12.865112 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0f4215e-37c9-4892-888e-ce9bd74ece35-kube-api-access-8zx6p" (OuterVolumeSpecName: "kube-api-access-8zx6p") pod "a0f4215e-37c9-4892-888e-ce9bd74ece35" (UID: "a0f4215e-37c9-4892-888e-ce9bd74ece35"). InnerVolumeSpecName "kube-api-access-8zx6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:46:12 crc kubenswrapper[5002]: I1209 11:46:12.890469 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0f4215e-37c9-4892-888e-ce9bd74ece35-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a0f4215e-37c9-4892-888e-ce9bd74ece35" (UID: "a0f4215e-37c9-4892-888e-ce9bd74ece35"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:46:12 crc kubenswrapper[5002]: I1209 11:46:12.956472 5002 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a0f4215e-37c9-4892-888e-ce9bd74ece35-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 09 11:46:12 crc kubenswrapper[5002]: I1209 11:46:12.956825 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zx6p\" (UniqueName: \"kubernetes.io/projected/a0f4215e-37c9-4892-888e-ce9bd74ece35-kube-api-access-8zx6p\") on node \"crc\" DevicePath \"\"" Dec 09 11:46:12 crc kubenswrapper[5002]: I1209 11:46:12.956860 5002 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0f4215e-37c9-4892-888e-ce9bd74ece35-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:46:12 crc kubenswrapper[5002]: I1209 11:46:12.990532 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0f4215e-37c9-4892-888e-ce9bd74ece35-config-data" (OuterVolumeSpecName: "config-data") pod "a0f4215e-37c9-4892-888e-ce9bd74ece35" (UID: "a0f4215e-37c9-4892-888e-ce9bd74ece35"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:12.999697 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0f4215e-37c9-4892-888e-ce9bd74ece35-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0f4215e-37c9-4892-888e-ce9bd74ece35" (UID: "a0f4215e-37c9-4892-888e-ce9bd74ece35"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.059001 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0f4215e-37c9-4892-888e-ce9bd74ece35-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.059038 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0f4215e-37c9-4892-888e-ce9bd74ece35-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.347541 5002 generic.go:334] "Generic (PLEG): container finished" podID="a0f4215e-37c9-4892-888e-ce9bd74ece35" containerID="4fbb1ccd354656789909bc4d182996a73299e8845664e451baa791722c051d16" exitCode=0 Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.347595 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0f4215e-37c9-4892-888e-ce9bd74ece35","Type":"ContainerDied","Data":"4fbb1ccd354656789909bc4d182996a73299e8845664e451baa791722c051d16"} Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.347621 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0f4215e-37c9-4892-888e-ce9bd74ece35","Type":"ContainerDied","Data":"4183e5929a220338582269b94bf7ca9e0571db4d8c68c985c64932de0e57c570"} Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.347639 5002 scope.go:117] "RemoveContainer" containerID="5f0125f21b371e4f5fb61040909e63f0660711e15f25a5cedfecde1bf6bda100" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.347763 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.353870 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerStarted","Data":"0b77c9efff9501c1fc3a03ad7a9700e7bb9032c2f86b3d0fba67afd52dc18d5f"} Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.385718 5002 scope.go:117] "RemoveContainer" containerID="514b7afac22fec7fb5a4558a856ef18041628d5091e489495792c3ddcfdfbf57" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.425347 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.450002 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.463234 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.466100 5002 scope.go:117] "RemoveContainer" containerID="4fbb1ccd354656789909bc4d182996a73299e8845664e451baa791722c051d16" Dec 09 11:46:13 crc kubenswrapper[5002]: E1209 11:46:13.468984 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7c18105-50bb-4814-aebb-df97e8948e01" containerName="init" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.469005 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7c18105-50bb-4814-aebb-df97e8948e01" containerName="init" Dec 09 11:46:13 crc kubenswrapper[5002]: E1209 11:46:13.469032 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0f4215e-37c9-4892-888e-ce9bd74ece35" containerName="ceilometer-notification-agent" Dec 09 11:46:13 crc 
kubenswrapper[5002]: I1209 11:46:13.469041 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0f4215e-37c9-4892-888e-ce9bd74ece35" containerName="ceilometer-notification-agent" Dec 09 11:46:13 crc kubenswrapper[5002]: E1209 11:46:13.469053 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0f4215e-37c9-4892-888e-ce9bd74ece35" containerName="sg-core" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.469059 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0f4215e-37c9-4892-888e-ce9bd74ece35" containerName="sg-core" Dec 09 11:46:13 crc kubenswrapper[5002]: E1209 11:46:13.469082 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0f4215e-37c9-4892-888e-ce9bd74ece35" containerName="proxy-httpd" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.469087 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0f4215e-37c9-4892-888e-ce9bd74ece35" containerName="proxy-httpd" Dec 09 11:46:13 crc kubenswrapper[5002]: E1209 11:46:13.469104 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7c18105-50bb-4814-aebb-df97e8948e01" containerName="dnsmasq-dns" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.469112 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7c18105-50bb-4814-aebb-df97e8948e01" containerName="dnsmasq-dns" Dec 09 11:46:13 crc kubenswrapper[5002]: E1209 11:46:13.469126 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0f4215e-37c9-4892-888e-ce9bd74ece35" containerName="ceilometer-central-agent" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.469132 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0f4215e-37c9-4892-888e-ce9bd74ece35" containerName="ceilometer-central-agent" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.469385 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0f4215e-37c9-4892-888e-ce9bd74ece35" containerName="ceilometer-notification-agent" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.469399 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0f4215e-37c9-4892-888e-ce9bd74ece35" containerName="ceilometer-central-agent" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.469409 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0f4215e-37c9-4892-888e-ce9bd74ece35" containerName="sg-core" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.469424 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0f4215e-37c9-4892-888e-ce9bd74ece35" containerName="proxy-httpd" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.469432 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7c18105-50bb-4814-aebb-df97e8948e01" containerName="dnsmasq-dns" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.472181 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.474484 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.474630 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.483033 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.502496 5002 scope.go:117] "RemoveContainer" containerID="bbdb56412785dfdc9ba983e618a3a9950aaaf769987127947e4a6e870b81f224" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.525989 5002 scope.go:117] "RemoveContainer" containerID="5f0125f21b371e4f5fb61040909e63f0660711e15f25a5cedfecde1bf6bda100" Dec 09 11:46:13 crc kubenswrapper[5002]: E1209 11:46:13.526762 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f0125f21b371e4f5fb61040909e63f0660711e15f25a5cedfecde1bf6bda100\": container with ID starting with 5f0125f21b371e4f5fb61040909e63f0660711e15f25a5cedfecde1bf6bda100 not found: ID does not exist" containerID="5f0125f21b371e4f5fb61040909e63f0660711e15f25a5cedfecde1bf6bda100" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.526827 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f0125f21b371e4f5fb61040909e63f0660711e15f25a5cedfecde1bf6bda100"} err="failed to get container status \"5f0125f21b371e4f5fb61040909e63f0660711e15f25a5cedfecde1bf6bda100\": rpc error: code = NotFound desc = could not find container \"5f0125f21b371e4f5fb61040909e63f0660711e15f25a5cedfecde1bf6bda100\": container with ID starting with 5f0125f21b371e4f5fb61040909e63f0660711e15f25a5cedfecde1bf6bda100 not found: ID does not exist" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.526858 5002 scope.go:117] "RemoveContainer" containerID="514b7afac22fec7fb5a4558a856ef18041628d5091e489495792c3ddcfdfbf57" Dec 09 11:46:13 crc kubenswrapper[5002]: E1209 11:46:13.527206 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"514b7afac22fec7fb5a4558a856ef18041628d5091e489495792c3ddcfdfbf57\": container with ID starting with 514b7afac22fec7fb5a4558a856ef18041628d5091e489495792c3ddcfdfbf57 not found: ID does not exist" containerID="514b7afac22fec7fb5a4558a856ef18041628d5091e489495792c3ddcfdfbf57" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.527245 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"514b7afac22fec7fb5a4558a856ef18041628d5091e489495792c3ddcfdfbf57"} err="failed to get container status \"514b7afac22fec7fb5a4558a856ef18041628d5091e489495792c3ddcfdfbf57\": rpc error: code = NotFound desc = could not find container \"514b7afac22fec7fb5a4558a856ef18041628d5091e489495792c3ddcfdfbf57\": container with ID starting with 514b7afac22fec7fb5a4558a856ef18041628d5091e489495792c3ddcfdfbf57 not found: ID does not exist" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.527273 5002 scope.go:117] "RemoveContainer" containerID="4fbb1ccd354656789909bc4d182996a73299e8845664e451baa791722c051d16" Dec 09 11:46:13 crc kubenswrapper[5002]: E1209 11:46:13.527641 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4fbb1ccd354656789909bc4d182996a73299e8845664e451baa791722c051d16\": container with ID starting with 4fbb1ccd354656789909bc4d182996a73299e8845664e451baa791722c051d16 not found: ID does not exist" containerID="4fbb1ccd354656789909bc4d182996a73299e8845664e451baa791722c051d16" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.527661 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fbb1ccd354656789909bc4d182996a73299e8845664e451baa791722c051d16"} err="failed to get container status \"4fbb1ccd354656789909bc4d182996a73299e8845664e451baa791722c051d16\": rpc error: code = NotFound desc = could not find container \"4fbb1ccd354656789909bc4d182996a73299e8845664e451baa791722c051d16\": container with ID starting with 4fbb1ccd354656789909bc4d182996a73299e8845664e451baa791722c051d16 not found: ID does not exist" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.527673 5002 scope.go:117] "RemoveContainer" containerID="bbdb56412785dfdc9ba983e618a3a9950aaaf769987127947e4a6e870b81f224" Dec 09 11:46:13 crc kubenswrapper[5002]: E1209 11:46:13.527899 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbdb56412785dfdc9ba983e618a3a9950aaaf769987127947e4a6e870b81f224\": container with ID starting with bbdb56412785dfdc9ba983e618a3a9950aaaf769987127947e4a6e870b81f224 not found: ID does not exist" containerID="bbdb56412785dfdc9ba983e618a3a9950aaaf769987127947e4a6e870b81f224" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.527918 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbdb56412785dfdc9ba983e618a3a9950aaaf769987127947e4a6e870b81f224"} err="failed to get container status \"bbdb56412785dfdc9ba983e618a3a9950aaaf769987127947e4a6e870b81f224\": rpc error: code = NotFound desc = could not find container \"bbdb56412785dfdc9ba983e618a3a9950aaaf769987127947e4a6e870b81f224\": container with ID starting with bbdb56412785dfdc9ba983e618a3a9950aaaf769987127947e4a6e870b81f224 not found: ID does not exist" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.571653 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1db29fd6-56f8-4153-9760-638a8baac418-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1db29fd6-56f8-4153-9760-638a8baac418\") " pod="openstack/ceilometer-0" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.571745 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1db29fd6-56f8-4153-9760-638a8baac418-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1db29fd6-56f8-4153-9760-638a8baac418\") " pod="openstack/ceilometer-0" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.571776 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1db29fd6-56f8-4153-9760-638a8baac418-run-httpd\") pod \"ceilometer-0\" (UID: \"1db29fd6-56f8-4153-9760-638a8baac418\") " pod="openstack/ceilometer-0" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.571851 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1db29fd6-56f8-4153-9760-638a8baac418-log-httpd\") pod \"ceilometer-0\" (UID: 
\"1db29fd6-56f8-4153-9760-638a8baac418\") " pod="openstack/ceilometer-0" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.572176 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1db29fd6-56f8-4153-9760-638a8baac418-config-data\") pod \"ceilometer-0\" (UID: \"1db29fd6-56f8-4153-9760-638a8baac418\") " pod="openstack/ceilometer-0" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.572349 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89jbg\" (UniqueName: \"kubernetes.io/projected/1db29fd6-56f8-4153-9760-638a8baac418-kube-api-access-89jbg\") pod \"ceilometer-0\" (UID: \"1db29fd6-56f8-4153-9760-638a8baac418\") " pod="openstack/ceilometer-0" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.572419 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1db29fd6-56f8-4153-9760-638a8baac418-scripts\") pod \"ceilometer-0\" (UID: \"1db29fd6-56f8-4153-9760-638a8baac418\") " pod="openstack/ceilometer-0" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.674338 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1db29fd6-56f8-4153-9760-638a8baac418-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1db29fd6-56f8-4153-9760-638a8baac418\") " pod="openstack/ceilometer-0" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.674411 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1db29fd6-56f8-4153-9760-638a8baac418-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1db29fd6-56f8-4153-9760-638a8baac418\") " pod="openstack/ceilometer-0" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.674432 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1db29fd6-56f8-4153-9760-638a8baac418-run-httpd\") pod \"ceilometer-0\" (UID: \"1db29fd6-56f8-4153-9760-638a8baac418\") " pod="openstack/ceilometer-0" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.674481 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1db29fd6-56f8-4153-9760-638a8baac418-log-httpd\") pod \"ceilometer-0\" (UID: \"1db29fd6-56f8-4153-9760-638a8baac418\") " pod="openstack/ceilometer-0" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.674538 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1db29fd6-56f8-4153-9760-638a8baac418-config-data\") pod \"ceilometer-0\" (UID: \"1db29fd6-56f8-4153-9760-638a8baac418\") " pod="openstack/ceilometer-0" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.674607 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89jbg\" (UniqueName: \"kubernetes.io/projected/1db29fd6-56f8-4153-9760-638a8baac418-kube-api-access-89jbg\") pod \"ceilometer-0\" (UID: \"1db29fd6-56f8-4153-9760-638a8baac418\") " pod="openstack/ceilometer-0" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.674643 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/1db29fd6-56f8-4153-9760-638a8baac418-scripts\") pod \"ceilometer-0\" (UID: \"1db29fd6-56f8-4153-9760-638a8baac418\") " pod="openstack/ceilometer-0" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.676012 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1db29fd6-56f8-4153-9760-638a8baac418-run-httpd\") pod \"ceilometer-0\" (UID: \"1db29fd6-56f8-4153-9760-638a8baac418\") " pod="openstack/ceilometer-0" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.676210 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1db29fd6-56f8-4153-9760-638a8baac418-log-httpd\") pod \"ceilometer-0\" (UID: \"1db29fd6-56f8-4153-9760-638a8baac418\") " pod="openstack/ceilometer-0" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.680430 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1db29fd6-56f8-4153-9760-638a8baac418-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1db29fd6-56f8-4153-9760-638a8baac418\") " pod="openstack/ceilometer-0" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.680471 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1db29fd6-56f8-4153-9760-638a8baac418-scripts\") pod \"ceilometer-0\" (UID: \"1db29fd6-56f8-4153-9760-638a8baac418\") " pod="openstack/ceilometer-0" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.680837 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1db29fd6-56f8-4153-9760-638a8baac418-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1db29fd6-56f8-4153-9760-638a8baac418\") " pod="openstack/ceilometer-0" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.681061 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1db29fd6-56f8-4153-9760-638a8baac418-config-data\") pod \"ceilometer-0\" (UID: \"1db29fd6-56f8-4153-9760-638a8baac418\") " pod="openstack/ceilometer-0" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.690914 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89jbg\" (UniqueName: \"kubernetes.io/projected/1db29fd6-56f8-4153-9760-638a8baac418-kube-api-access-89jbg\") pod \"ceilometer-0\" (UID: \"1db29fd6-56f8-4153-9760-638a8baac418\") " pod="openstack/ceilometer-0" Dec 09 11:46:13 crc kubenswrapper[5002]: I1209 11:46:13.793353 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 09 11:46:14 crc kubenswrapper[5002]: I1209 11:46:14.073271 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0f4215e-37c9-4892-888e-ce9bd74ece35" path="/var/lib/kubelet/pods/a0f4215e-37c9-4892-888e-ce9bd74ece35/volumes" Dec 09 11:46:14 crc kubenswrapper[5002]: I1209 11:46:14.310080 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 09 11:46:14 crc kubenswrapper[5002]: I1209 11:46:14.364982 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1db29fd6-56f8-4153-9760-638a8baac418","Type":"ContainerStarted","Data":"84c96de7801864e7e2bb2e587dcf2bb064e681a64548b92daf1c286dfc12837f"} Dec 09 11:46:15 crc kubenswrapper[5002]: I1209 11:46:15.376079 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1db29fd6-56f8-4153-9760-638a8baac418","Type":"ContainerStarted","Data":"0ecea3adaca09920a9a11bb6b4d0d1708a2044f29ae6578fac5c9f8de5f74daa"} Dec 09 11:46:16 crc kubenswrapper[5002]: I1209 11:46:16.386586 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1db29fd6-56f8-4153-9760-638a8baac418","Type":"ContainerStarted","Data":"26c773778b21b622f02d58731c24c709564fda1253cf82358c77a9e535794e5c"} Dec 09 11:46:17 crc kubenswrapper[5002]: I1209 11:46:17.400274 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1db29fd6-56f8-4153-9760-638a8baac418","Type":"ContainerStarted","Data":"c0aecb87f8c0c3c726909554ab86e75d13a2f2476a1fc07d6659685a0ac4eb26"} Dec 09 11:46:18 crc kubenswrapper[5002]: I1209 11:46:18.411539 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1db29fd6-56f8-4153-9760-638a8baac418","Type":"ContainerStarted","Data":"05ead64ec5fbe468f06e476017aff4a51608eb97bb39f714554966009768e038"} Dec 09 11:46:18 crc kubenswrapper[5002]: I1209 11:46:18.412138 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 09 11:46:18 crc kubenswrapper[5002]: I1209 11:46:18.441896 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.215987086 podStartE2EDuration="5.441873391s" podCreationTimestamp="2025-12-09 11:46:13 +0000 UTC" firstStartedPulling="2025-12-09 11:46:14.307550005 +0000 UTC m=+6306.699601086" lastFinishedPulling="2025-12-09 11:46:17.53343631 +0000 UTC m=+6309.925487391" observedRunningTime="2025-12-09 11:46:18.43364393 +0000 UTC m=+6310.825695011" watchObservedRunningTime="2025-12-09 11:46:18.441873391 +0000 UTC m=+6310.833924482" Dec 09 11:46:20 crc kubenswrapper[5002]: I1209 11:46:20.557161 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Dec 09 11:46:20 crc kubenswrapper[5002]: I1209 11:46:20.749713 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Dec 09 11:46:20 crc kubenswrapper[5002]: I1209 11:46:20.830939 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Dec 09 11:46:43 crc kubenswrapper[5002]: I1209 11:46:43.853139 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 09 11:47:08 crc kubenswrapper[5002]: I1209 11:47:08.607111 5002 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-7b54c866bc-k2jw6"] Dec 09 11:47:08 crc kubenswrapper[5002]: I1209 11:47:08.609792 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b54c866bc-k2jw6" Dec 09 11:47:08 crc kubenswrapper[5002]: I1209 11:47:08.612094 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1" Dec 09 11:47:08 crc kubenswrapper[5002]: I1209 11:47:08.621026 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b54c866bc-k2jw6"] Dec 09 11:47:08 crc kubenswrapper[5002]: I1209 11:47:08.683215 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6752ef9c-92c9-4664-8d55-1848c909be7a-dns-svc\") pod \"dnsmasq-dns-7b54c866bc-k2jw6\" (UID: \"6752ef9c-92c9-4664-8d55-1848c909be7a\") " pod="openstack/dnsmasq-dns-7b54c866bc-k2jw6" Dec 09 11:47:08 crc kubenswrapper[5002]: I1209 11:47:08.683396 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6752ef9c-92c9-4664-8d55-1848c909be7a-config\") pod \"dnsmasq-dns-7b54c866bc-k2jw6\" (UID: \"6752ef9c-92c9-4664-8d55-1848c909be7a\") " pod="openstack/dnsmasq-dns-7b54c866bc-k2jw6" Dec 09 11:47:08 crc kubenswrapper[5002]: I1209 11:47:08.683611 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6752ef9c-92c9-4664-8d55-1848c909be7a-ovsdbserver-sb\") pod \"dnsmasq-dns-7b54c866bc-k2jw6\" (UID: \"6752ef9c-92c9-4664-8d55-1848c909be7a\") " pod="openstack/dnsmasq-dns-7b54c866bc-k2jw6" Dec 09 11:47:08 crc kubenswrapper[5002]: I1209 11:47:08.683665 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6752ef9c-92c9-4664-8d55-1848c909be7a-ovsdbserver-nb\") pod \"dnsmasq-dns-7b54c866bc-k2jw6\" (UID: \"6752ef9c-92c9-4664-8d55-1848c909be7a\") " pod="openstack/dnsmasq-dns-7b54c866bc-k2jw6" Dec 09 11:47:08 crc kubenswrapper[5002]: I1209 11:47:08.683692 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2z95\" (UniqueName: \"kubernetes.io/projected/6752ef9c-92c9-4664-8d55-1848c909be7a-kube-api-access-v2z95\") pod \"dnsmasq-dns-7b54c866bc-k2jw6\" (UID: \"6752ef9c-92c9-4664-8d55-1848c909be7a\") " pod="openstack/dnsmasq-dns-7b54c866bc-k2jw6" Dec 09 11:47:08 crc kubenswrapper[5002]: I1209 11:47:08.683902 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/6752ef9c-92c9-4664-8d55-1848c909be7a-openstack-cell1\") pod \"dnsmasq-dns-7b54c866bc-k2jw6\" (UID: \"6752ef9c-92c9-4664-8d55-1848c909be7a\") " pod="openstack/dnsmasq-dns-7b54c866bc-k2jw6" Dec 09 11:47:08 crc kubenswrapper[5002]: I1209 11:47:08.786658 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/6752ef9c-92c9-4664-8d55-1848c909be7a-openstack-cell1\") pod \"dnsmasq-dns-7b54c866bc-k2jw6\" (UID: \"6752ef9c-92c9-4664-8d55-1848c909be7a\") " pod="openstack/dnsmasq-dns-7b54c866bc-k2jw6" Dec 09 11:47:08 crc kubenswrapper[5002]: I1209 11:47:08.786721 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/6752ef9c-92c9-4664-8d55-1848c909be7a-dns-svc\") pod \"dnsmasq-dns-7b54c866bc-k2jw6\" (UID: \"6752ef9c-92c9-4664-8d55-1848c909be7a\") " pod="openstack/dnsmasq-dns-7b54c866bc-k2jw6" Dec 09 11:47:08 crc kubenswrapper[5002]: I1209 11:47:08.786801 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6752ef9c-92c9-4664-8d55-1848c909be7a-config\") pod \"dnsmasq-dns-7b54c866bc-k2jw6\" (UID: \"6752ef9c-92c9-4664-8d55-1848c909be7a\") " pod="openstack/dnsmasq-dns-7b54c866bc-k2jw6" Dec 09 11:47:08 crc kubenswrapper[5002]: I1209 11:47:08.787000 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6752ef9c-92c9-4664-8d55-1848c909be7a-ovsdbserver-sb\") pod \"dnsmasq-dns-7b54c866bc-k2jw6\" (UID: \"6752ef9c-92c9-4664-8d55-1848c909be7a\") " pod="openstack/dnsmasq-dns-7b54c866bc-k2jw6" Dec 09 11:47:08 crc kubenswrapper[5002]: I1209 11:47:08.787037 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6752ef9c-92c9-4664-8d55-1848c909be7a-ovsdbserver-nb\") pod \"dnsmasq-dns-7b54c866bc-k2jw6\" (UID: \"6752ef9c-92c9-4664-8d55-1848c909be7a\") " pod="openstack/dnsmasq-dns-7b54c866bc-k2jw6" Dec 09 11:47:08 crc kubenswrapper[5002]: I1209 11:47:08.787071 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2z95\" (UniqueName: \"kubernetes.io/projected/6752ef9c-92c9-4664-8d55-1848c909be7a-kube-api-access-v2z95\") pod \"dnsmasq-dns-7b54c866bc-k2jw6\" (UID: \"6752ef9c-92c9-4664-8d55-1848c909be7a\") " pod="openstack/dnsmasq-dns-7b54c866bc-k2jw6" Dec 09 11:47:08 crc kubenswrapper[5002]: I1209 11:47:08.788053 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6752ef9c-92c9-4664-8d55-1848c909be7a-dns-svc\") pod \"dnsmasq-dns-7b54c866bc-k2jw6\" (UID: \"6752ef9c-92c9-4664-8d55-1848c909be7a\") " pod="openstack/dnsmasq-dns-7b54c866bc-k2jw6" Dec 09 11:47:08 crc kubenswrapper[5002]: I1209 11:47:08.788085 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/6752ef9c-92c9-4664-8d55-1848c909be7a-openstack-cell1\") pod \"dnsmasq-dns-7b54c866bc-k2jw6\" (UID: \"6752ef9c-92c9-4664-8d55-1848c909be7a\") " pod="openstack/dnsmasq-dns-7b54c866bc-k2jw6" Dec 09 11:47:08 crc kubenswrapper[5002]: I1209 11:47:08.788059 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6752ef9c-92c9-4664-8d55-1848c909be7a-config\") pod \"dnsmasq-dns-7b54c866bc-k2jw6\" (UID: \"6752ef9c-92c9-4664-8d55-1848c909be7a\") " pod="openstack/dnsmasq-dns-7b54c866bc-k2jw6" Dec 09 11:47:08 crc kubenswrapper[5002]: I1209 11:47:08.788061 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6752ef9c-92c9-4664-8d55-1848c909be7a-ovsdbserver-nb\") pod \"dnsmasq-dns-7b54c866bc-k2jw6\" (UID: \"6752ef9c-92c9-4664-8d55-1848c909be7a\") " pod="openstack/dnsmasq-dns-7b54c866bc-k2jw6" Dec 09 11:47:08 crc kubenswrapper[5002]: I1209 11:47:08.788220 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6752ef9c-92c9-4664-8d55-1848c909be7a-ovsdbserver-sb\") pod \"dnsmasq-dns-7b54c866bc-k2jw6\" (UID: 
\"6752ef9c-92c9-4664-8d55-1848c909be7a\") " pod="openstack/dnsmasq-dns-7b54c866bc-k2jw6" Dec 09 11:47:08 crc kubenswrapper[5002]: I1209 11:47:08.813527 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2z95\" (UniqueName: \"kubernetes.io/projected/6752ef9c-92c9-4664-8d55-1848c909be7a-kube-api-access-v2z95\") pod \"dnsmasq-dns-7b54c866bc-k2jw6\" (UID: \"6752ef9c-92c9-4664-8d55-1848c909be7a\") " pod="openstack/dnsmasq-dns-7b54c866bc-k2jw6" Dec 09 11:47:08 crc kubenswrapper[5002]: I1209 11:47:08.928397 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b54c866bc-k2jw6" Dec 09 11:47:09 crc kubenswrapper[5002]: I1209 11:47:09.461619 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b54c866bc-k2jw6"] Dec 09 11:47:09 crc kubenswrapper[5002]: I1209 11:47:09.960512 5002 generic.go:334] "Generic (PLEG): container finished" podID="6752ef9c-92c9-4664-8d55-1848c909be7a" containerID="6e1c9c503cc517a53caebdbf750ab788d22d4d9167dbdc76acd9b5daa6ee7127" exitCode=0 Dec 09 11:47:09 crc kubenswrapper[5002]: I1209 11:47:09.960623 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b54c866bc-k2jw6" event={"ID":"6752ef9c-92c9-4664-8d55-1848c909be7a","Type":"ContainerDied","Data":"6e1c9c503cc517a53caebdbf750ab788d22d4d9167dbdc76acd9b5daa6ee7127"} Dec 09 11:47:09 crc kubenswrapper[5002]: I1209 11:47:09.960873 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b54c866bc-k2jw6" event={"ID":"6752ef9c-92c9-4664-8d55-1848c909be7a","Type":"ContainerStarted","Data":"38caee3a080be6ad0dec18186e5765612da295a63e1965ec9c93a2ce422a4dfd"} Dec 09 11:47:10 crc kubenswrapper[5002]: I1209 11:47:10.971972 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b54c866bc-k2jw6" event={"ID":"6752ef9c-92c9-4664-8d55-1848c909be7a","Type":"ContainerStarted","Data":"7e64376f71b116ac049f79d50183881ac9ac4a8778ae660ffb743ed1f27519be"} Dec 09 11:47:10 crc kubenswrapper[5002]: I1209 11:47:10.972240 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b54c866bc-k2jw6" Dec 09 11:47:10 crc kubenswrapper[5002]: I1209 11:47:10.990212 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b54c866bc-k2jw6" podStartSLOduration=2.99019298 podStartE2EDuration="2.99019298s" podCreationTimestamp="2025-12-09 11:47:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:47:10.987889148 +0000 UTC m=+6363.379940239" watchObservedRunningTime="2025-12-09 11:47:10.99019298 +0000 UTC m=+6363.382244061" Dec 09 11:47:18 crc kubenswrapper[5002]: I1209 11:47:18.930603 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b54c866bc-k2jw6" Dec 09 11:47:18 crc kubenswrapper[5002]: I1209 11:47:18.990518 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55586cc989-qn4kg"] Dec 09 11:47:18 crc kubenswrapper[5002]: I1209 11:47:18.990975 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55586cc989-qn4kg" podUID="bfb0ee03-26c6-4869-904a-519db3cf6dd0" containerName="dnsmasq-dns" containerID="cri-o://03f71dc9eb0315728217af22e84ceed384d946dde5f375c38faf7d6cbbb98e87" gracePeriod=10 Dec 09 11:47:19 crc kubenswrapper[5002]: I1209 11:47:19.031057 5002 
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-55586cc989-qn4kg" podUID="bfb0ee03-26c6-4869-904a-519db3cf6dd0" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.141:5353: connect: connection refused" Dec 09 11:47:19 crc kubenswrapper[5002]: I1209 11:47:19.149358 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d6cd869d9-7gk6k"] Dec 09 11:47:19 crc kubenswrapper[5002]: I1209 11:47:19.151717 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d6cd869d9-7gk6k" Dec 09 11:47:19 crc kubenswrapper[5002]: I1209 11:47:19.187919 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d6cd869d9-7gk6k"] Dec 09 11:47:19 crc kubenswrapper[5002]: I1209 11:47:19.324004 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e6f3711-6d8b-451e-84db-d89343c6f1cd-ovsdbserver-nb\") pod \"dnsmasq-dns-d6cd869d9-7gk6k\" (UID: \"4e6f3711-6d8b-451e-84db-d89343c6f1cd\") " pod="openstack/dnsmasq-dns-d6cd869d9-7gk6k" Dec 09 11:47:19 crc kubenswrapper[5002]: I1209 11:47:19.324073 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e6f3711-6d8b-451e-84db-d89343c6f1cd-dns-svc\") pod \"dnsmasq-dns-d6cd869d9-7gk6k\" (UID: \"4e6f3711-6d8b-451e-84db-d89343c6f1cd\") " pod="openstack/dnsmasq-dns-d6cd869d9-7gk6k" Dec 09 11:47:19 crc kubenswrapper[5002]: I1209 11:47:19.324098 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/4e6f3711-6d8b-451e-84db-d89343c6f1cd-openstack-cell1\") pod \"dnsmasq-dns-d6cd869d9-7gk6k\" (UID: \"4e6f3711-6d8b-451e-84db-d89343c6f1cd\") " pod="openstack/dnsmasq-dns-d6cd869d9-7gk6k" Dec 09 11:47:19 crc kubenswrapper[5002]: I1209 11:47:19.324117 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e6f3711-6d8b-451e-84db-d89343c6f1cd-config\") pod \"dnsmasq-dns-d6cd869d9-7gk6k\" (UID: \"4e6f3711-6d8b-451e-84db-d89343c6f1cd\") " pod="openstack/dnsmasq-dns-d6cd869d9-7gk6k" Dec 09 11:47:19 crc kubenswrapper[5002]: I1209 11:47:19.324151 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e6f3711-6d8b-451e-84db-d89343c6f1cd-ovsdbserver-sb\") pod \"dnsmasq-dns-d6cd869d9-7gk6k\" (UID: \"4e6f3711-6d8b-451e-84db-d89343c6f1cd\") " pod="openstack/dnsmasq-dns-d6cd869d9-7gk6k" Dec 09 11:47:19 crc kubenswrapper[5002]: I1209 11:47:19.324398 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ptdp\" (UniqueName: \"kubernetes.io/projected/4e6f3711-6d8b-451e-84db-d89343c6f1cd-kube-api-access-6ptdp\") pod \"dnsmasq-dns-d6cd869d9-7gk6k\" (UID: \"4e6f3711-6d8b-451e-84db-d89343c6f1cd\") " pod="openstack/dnsmasq-dns-d6cd869d9-7gk6k" Dec 09 11:47:19 crc kubenswrapper[5002]: I1209 11:47:19.426172 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/4e6f3711-6d8b-451e-84db-d89343c6f1cd-openstack-cell1\") pod \"dnsmasq-dns-d6cd869d9-7gk6k\" (UID: \"4e6f3711-6d8b-451e-84db-d89343c6f1cd\") " 
pod="openstack/dnsmasq-dns-d6cd869d9-7gk6k" Dec 09 11:47:19 crc kubenswrapper[5002]: I1209 11:47:19.426430 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e6f3711-6d8b-451e-84db-d89343c6f1cd-config\") pod \"dnsmasq-dns-d6cd869d9-7gk6k\" (UID: \"4e6f3711-6d8b-451e-84db-d89343c6f1cd\") " pod="openstack/dnsmasq-dns-d6cd869d9-7gk6k" Dec 09 11:47:19 crc kubenswrapper[5002]: I1209 11:47:19.426489 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e6f3711-6d8b-451e-84db-d89343c6f1cd-ovsdbserver-sb\") pod \"dnsmasq-dns-d6cd869d9-7gk6k\" (UID: \"4e6f3711-6d8b-451e-84db-d89343c6f1cd\") " pod="openstack/dnsmasq-dns-d6cd869d9-7gk6k" Dec 09 11:47:19 crc kubenswrapper[5002]: I1209 11:47:19.426562 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ptdp\" (UniqueName: \"kubernetes.io/projected/4e6f3711-6d8b-451e-84db-d89343c6f1cd-kube-api-access-6ptdp\") pod \"dnsmasq-dns-d6cd869d9-7gk6k\" (UID: \"4e6f3711-6d8b-451e-84db-d89343c6f1cd\") " pod="openstack/dnsmasq-dns-d6cd869d9-7gk6k" Dec 09 11:47:19 crc kubenswrapper[5002]: I1209 11:47:19.426644 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e6f3711-6d8b-451e-84db-d89343c6f1cd-ovsdbserver-nb\") pod \"dnsmasq-dns-d6cd869d9-7gk6k\" (UID: \"4e6f3711-6d8b-451e-84db-d89343c6f1cd\") " pod="openstack/dnsmasq-dns-d6cd869d9-7gk6k" Dec 09 11:47:19 crc kubenswrapper[5002]: I1209 11:47:19.426688 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e6f3711-6d8b-451e-84db-d89343c6f1cd-dns-svc\") pod \"dnsmasq-dns-d6cd869d9-7gk6k\" (UID: \"4e6f3711-6d8b-451e-84db-d89343c6f1cd\") " pod="openstack/dnsmasq-dns-d6cd869d9-7gk6k" Dec 09 11:47:19 crc kubenswrapper[5002]: I1209 11:47:19.427236 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/4e6f3711-6d8b-451e-84db-d89343c6f1cd-openstack-cell1\") pod \"dnsmasq-dns-d6cd869d9-7gk6k\" (UID: \"4e6f3711-6d8b-451e-84db-d89343c6f1cd\") " pod="openstack/dnsmasq-dns-d6cd869d9-7gk6k" Dec 09 11:47:19 crc kubenswrapper[5002]: I1209 11:47:19.427403 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e6f3711-6d8b-451e-84db-d89343c6f1cd-dns-svc\") pod \"dnsmasq-dns-d6cd869d9-7gk6k\" (UID: \"4e6f3711-6d8b-451e-84db-d89343c6f1cd\") " pod="openstack/dnsmasq-dns-d6cd869d9-7gk6k" Dec 09 11:47:19 crc kubenswrapper[5002]: I1209 11:47:19.427910 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e6f3711-6d8b-451e-84db-d89343c6f1cd-ovsdbserver-sb\") pod \"dnsmasq-dns-d6cd869d9-7gk6k\" (UID: \"4e6f3711-6d8b-451e-84db-d89343c6f1cd\") " pod="openstack/dnsmasq-dns-d6cd869d9-7gk6k" Dec 09 11:47:19 crc kubenswrapper[5002]: I1209 11:47:19.428186 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e6f3711-6d8b-451e-84db-d89343c6f1cd-ovsdbserver-nb\") pod \"dnsmasq-dns-d6cd869d9-7gk6k\" (UID: \"4e6f3711-6d8b-451e-84db-d89343c6f1cd\") " pod="openstack/dnsmasq-dns-d6cd869d9-7gk6k" Dec 09 11:47:19 crc kubenswrapper[5002]: I1209 11:47:19.428479 5002 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e6f3711-6d8b-451e-84db-d89343c6f1cd-config\") pod \"dnsmasq-dns-d6cd869d9-7gk6k\" (UID: \"4e6f3711-6d8b-451e-84db-d89343c6f1cd\") " pod="openstack/dnsmasq-dns-d6cd869d9-7gk6k" Dec 09 11:47:19 crc kubenswrapper[5002]: I1209 11:47:19.450206 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ptdp\" (UniqueName: \"kubernetes.io/projected/4e6f3711-6d8b-451e-84db-d89343c6f1cd-kube-api-access-6ptdp\") pod \"dnsmasq-dns-d6cd869d9-7gk6k\" (UID: \"4e6f3711-6d8b-451e-84db-d89343c6f1cd\") " pod="openstack/dnsmasq-dns-d6cd869d9-7gk6k" Dec 09 11:47:19 crc kubenswrapper[5002]: I1209 11:47:19.488589 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d6cd869d9-7gk6k" Dec 09 11:47:20 crc kubenswrapper[5002]: I1209 11:47:20.044752 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d6cd869d9-7gk6k"] Dec 09 11:47:20 crc kubenswrapper[5002]: I1209 11:47:20.088013 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d6cd869d9-7gk6k" event={"ID":"4e6f3711-6d8b-451e-84db-d89343c6f1cd","Type":"ContainerStarted","Data":"6cfd150a83a94cd23bb76344b9d251746911184da33c5e7d677f0a5cfc0bfd35"} Dec 09 11:47:22 crc kubenswrapper[5002]: I1209 11:47:22.124337 5002 generic.go:334] "Generic (PLEG): container finished" podID="bfb0ee03-26c6-4869-904a-519db3cf6dd0" containerID="03f71dc9eb0315728217af22e84ceed384d946dde5f375c38faf7d6cbbb98e87" exitCode=0 Dec 09 11:47:22 crc kubenswrapper[5002]: I1209 11:47:22.124454 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55586cc989-qn4kg" event={"ID":"bfb0ee03-26c6-4869-904a-519db3cf6dd0","Type":"ContainerDied","Data":"03f71dc9eb0315728217af22e84ceed384d946dde5f375c38faf7d6cbbb98e87"} Dec 09 11:47:22 crc kubenswrapper[5002]: I1209 11:47:22.142995 5002 generic.go:334] "Generic (PLEG): container finished" podID="4e6f3711-6d8b-451e-84db-d89343c6f1cd" containerID="f707ceb4dea195075774f1116520027170f3a9642b6633696fa3f3ad06a33bcc" exitCode=0 Dec 09 11:47:22 crc kubenswrapper[5002]: I1209 11:47:22.143033 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d6cd869d9-7gk6k" event={"ID":"4e6f3711-6d8b-451e-84db-d89343c6f1cd","Type":"ContainerDied","Data":"f707ceb4dea195075774f1116520027170f3a9642b6633696fa3f3ad06a33bcc"} Dec 09 11:47:22 crc kubenswrapper[5002]: I1209 11:47:22.359744 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55586cc989-qn4kg" Dec 09 11:47:22 crc kubenswrapper[5002]: I1209 11:47:22.420000 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bfb0ee03-26c6-4869-904a-519db3cf6dd0-ovsdbserver-nb\") pod \"bfb0ee03-26c6-4869-904a-519db3cf6dd0\" (UID: \"bfb0ee03-26c6-4869-904a-519db3cf6dd0\") " Dec 09 11:47:22 crc kubenswrapper[5002]: I1209 11:47:22.420088 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfb0ee03-26c6-4869-904a-519db3cf6dd0-dns-svc\") pod \"bfb0ee03-26c6-4869-904a-519db3cf6dd0\" (UID: \"bfb0ee03-26c6-4869-904a-519db3cf6dd0\") " Dec 09 11:47:22 crc kubenswrapper[5002]: I1209 11:47:22.420233 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22z7c\" (UniqueName: \"kubernetes.io/projected/bfb0ee03-26c6-4869-904a-519db3cf6dd0-kube-api-access-22z7c\") pod \"bfb0ee03-26c6-4869-904a-519db3cf6dd0\" (UID: \"bfb0ee03-26c6-4869-904a-519db3cf6dd0\") " Dec 09 11:47:22 crc kubenswrapper[5002]: I1209 11:47:22.420353 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfb0ee03-26c6-4869-904a-519db3cf6dd0-config\") pod \"bfb0ee03-26c6-4869-904a-519db3cf6dd0\" (UID: \"bfb0ee03-26c6-4869-904a-519db3cf6dd0\") " Dec 09 11:47:22 crc kubenswrapper[5002]: I1209 11:47:22.420381 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bfb0ee03-26c6-4869-904a-519db3cf6dd0-ovsdbserver-sb\") pod \"bfb0ee03-26c6-4869-904a-519db3cf6dd0\" (UID: \"bfb0ee03-26c6-4869-904a-519db3cf6dd0\") " Dec 09 11:47:22 crc kubenswrapper[5002]: I1209 11:47:22.426349 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfb0ee03-26c6-4869-904a-519db3cf6dd0-kube-api-access-22z7c" (OuterVolumeSpecName: "kube-api-access-22z7c") pod "bfb0ee03-26c6-4869-904a-519db3cf6dd0" (UID: "bfb0ee03-26c6-4869-904a-519db3cf6dd0"). InnerVolumeSpecName "kube-api-access-22z7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:47:22 crc kubenswrapper[5002]: I1209 11:47:22.500675 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfb0ee03-26c6-4869-904a-519db3cf6dd0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bfb0ee03-26c6-4869-904a-519db3cf6dd0" (UID: "bfb0ee03-26c6-4869-904a-519db3cf6dd0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:47:22 crc kubenswrapper[5002]: I1209 11:47:22.509750 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfb0ee03-26c6-4869-904a-519db3cf6dd0-config" (OuterVolumeSpecName: "config") pod "bfb0ee03-26c6-4869-904a-519db3cf6dd0" (UID: "bfb0ee03-26c6-4869-904a-519db3cf6dd0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:47:22 crc kubenswrapper[5002]: I1209 11:47:22.519237 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfb0ee03-26c6-4869-904a-519db3cf6dd0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bfb0ee03-26c6-4869-904a-519db3cf6dd0" (UID: "bfb0ee03-26c6-4869-904a-519db3cf6dd0"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:47:22 crc kubenswrapper[5002]: I1209 11:47:22.523484 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22z7c\" (UniqueName: \"kubernetes.io/projected/bfb0ee03-26c6-4869-904a-519db3cf6dd0-kube-api-access-22z7c\") on node \"crc\" DevicePath \"\"" Dec 09 11:47:22 crc kubenswrapper[5002]: I1209 11:47:22.523515 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfb0ee03-26c6-4869-904a-519db3cf6dd0-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:47:22 crc kubenswrapper[5002]: I1209 11:47:22.523526 5002 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bfb0ee03-26c6-4869-904a-519db3cf6dd0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 11:47:22 crc kubenswrapper[5002]: I1209 11:47:22.523534 5002 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfb0ee03-26c6-4869-904a-519db3cf6dd0-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 11:47:22 crc kubenswrapper[5002]: I1209 11:47:22.524113 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfb0ee03-26c6-4869-904a-519db3cf6dd0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bfb0ee03-26c6-4869-904a-519db3cf6dd0" (UID: "bfb0ee03-26c6-4869-904a-519db3cf6dd0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:47:22 crc kubenswrapper[5002]: I1209 11:47:22.625428 5002 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bfb0ee03-26c6-4869-904a-519db3cf6dd0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 09 11:47:23 crc kubenswrapper[5002]: I1209 11:47:23.166894 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55586cc989-qn4kg" event={"ID":"bfb0ee03-26c6-4869-904a-519db3cf6dd0","Type":"ContainerDied","Data":"668c77304a933c092b1abb98e5ed9a7a85a9943a007f5c6e4f495dcbf4e9f1d5"} Dec 09 11:47:23 crc kubenswrapper[5002]: I1209 11:47:23.167355 5002 scope.go:117] "RemoveContainer" containerID="03f71dc9eb0315728217af22e84ceed384d946dde5f375c38faf7d6cbbb98e87" Dec 09 11:47:23 crc kubenswrapper[5002]: I1209 11:47:23.166927 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55586cc989-qn4kg" Dec 09 11:47:23 crc kubenswrapper[5002]: I1209 11:47:23.174911 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d6cd869d9-7gk6k" event={"ID":"4e6f3711-6d8b-451e-84db-d89343c6f1cd","Type":"ContainerStarted","Data":"eca97621d6d33d4a9de406a73824032702be8c0a96b7542ea9bd434d4415a231"} Dec 09 11:47:23 crc kubenswrapper[5002]: I1209 11:47:23.175189 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d6cd869d9-7gk6k" Dec 09 11:47:23 crc kubenswrapper[5002]: I1209 11:47:23.196585 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d6cd869d9-7gk6k" podStartSLOduration=4.196563981 podStartE2EDuration="4.196563981s" podCreationTimestamp="2025-12-09 11:47:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:47:23.194438164 +0000 UTC m=+6375.586489265" watchObservedRunningTime="2025-12-09 11:47:23.196563981 +0000 UTC m=+6375.588615062" Dec 09 11:47:23 crc kubenswrapper[5002]: I1209 11:47:23.243152 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55586cc989-qn4kg"] Dec 09 11:47:23 crc kubenswrapper[5002]: I1209 11:47:23.247277 5002 scope.go:117] "RemoveContainer" containerID="c4408813849003899ec2f36d1464583978407a99e49c8a86ca988edd9bccb03b" Dec 09 11:47:23 crc kubenswrapper[5002]: I1209 11:47:23.254970 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55586cc989-qn4kg"] Dec 09 11:47:24 crc kubenswrapper[5002]: I1209 11:47:24.103186 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfb0ee03-26c6-4869-904a-519db3cf6dd0" path="/var/lib/kubelet/pods/bfb0ee03-26c6-4869-904a-519db3cf6dd0/volumes" Dec 09 11:47:29 crc kubenswrapper[5002]: I1209 11:47:29.490009 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d6cd869d9-7gk6k" Dec 09 11:47:29 crc kubenswrapper[5002]: I1209 11:47:29.587931 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b54c866bc-k2jw6"] Dec 09 11:47:29 crc kubenswrapper[5002]: I1209 11:47:29.588168 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b54c866bc-k2jw6" podUID="6752ef9c-92c9-4664-8d55-1848c909be7a" containerName="dnsmasq-dns" containerID="cri-o://7e64376f71b116ac049f79d50183881ac9ac4a8778ae660ffb743ed1f27519be" gracePeriod=10 Dec 09 11:47:30 crc kubenswrapper[5002]: I1209 11:47:30.249166 5002 generic.go:334] "Generic (PLEG): container finished" podID="6752ef9c-92c9-4664-8d55-1848c909be7a" containerID="7e64376f71b116ac049f79d50183881ac9ac4a8778ae660ffb743ed1f27519be" exitCode=0 Dec 09 11:47:30 crc kubenswrapper[5002]: I1209 11:47:30.249447 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b54c866bc-k2jw6" event={"ID":"6752ef9c-92c9-4664-8d55-1848c909be7a","Type":"ContainerDied","Data":"7e64376f71b116ac049f79d50183881ac9ac4a8778ae660ffb743ed1f27519be"} Dec 09 11:47:30 crc kubenswrapper[5002]: I1209 11:47:30.493328 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b54c866bc-k2jw6" Dec 09 11:47:30 crc kubenswrapper[5002]: I1209 11:47:30.694901 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6752ef9c-92c9-4664-8d55-1848c909be7a-config\") pod \"6752ef9c-92c9-4664-8d55-1848c909be7a\" (UID: \"6752ef9c-92c9-4664-8d55-1848c909be7a\") " Dec 09 11:47:30 crc kubenswrapper[5002]: I1209 11:47:30.694971 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6752ef9c-92c9-4664-8d55-1848c909be7a-ovsdbserver-nb\") pod \"6752ef9c-92c9-4664-8d55-1848c909be7a\" (UID: \"6752ef9c-92c9-4664-8d55-1848c909be7a\") " Dec 09 11:47:30 crc kubenswrapper[5002]: I1209 11:47:30.695003 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2z95\" (UniqueName: \"kubernetes.io/projected/6752ef9c-92c9-4664-8d55-1848c909be7a-kube-api-access-v2z95\") pod \"6752ef9c-92c9-4664-8d55-1848c909be7a\" (UID: \"6752ef9c-92c9-4664-8d55-1848c909be7a\") " Dec 09 11:47:30 crc kubenswrapper[5002]: I1209 11:47:30.695179 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/6752ef9c-92c9-4664-8d55-1848c909be7a-openstack-cell1\") pod \"6752ef9c-92c9-4664-8d55-1848c909be7a\" (UID: \"6752ef9c-92c9-4664-8d55-1848c909be7a\") " Dec 09 11:47:30 crc kubenswrapper[5002]: I1209 11:47:30.695351 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6752ef9c-92c9-4664-8d55-1848c909be7a-dns-svc\") pod \"6752ef9c-92c9-4664-8d55-1848c909be7a\" (UID: \"6752ef9c-92c9-4664-8d55-1848c909be7a\") " Dec 09 11:47:30 crc kubenswrapper[5002]: I1209 11:47:30.695425 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6752ef9c-92c9-4664-8d55-1848c909be7a-ovsdbserver-sb\") pod \"6752ef9c-92c9-4664-8d55-1848c909be7a\" (UID: \"6752ef9c-92c9-4664-8d55-1848c909be7a\") " Dec 09 11:47:30 crc kubenswrapper[5002]: I1209 11:47:30.721112 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6752ef9c-92c9-4664-8d55-1848c909be7a-kube-api-access-v2z95" (OuterVolumeSpecName: "kube-api-access-v2z95") pod "6752ef9c-92c9-4664-8d55-1848c909be7a" (UID: "6752ef9c-92c9-4664-8d55-1848c909be7a"). InnerVolumeSpecName "kube-api-access-v2z95". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:47:30 crc kubenswrapper[5002]: I1209 11:47:30.760346 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6752ef9c-92c9-4664-8d55-1848c909be7a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6752ef9c-92c9-4664-8d55-1848c909be7a" (UID: "6752ef9c-92c9-4664-8d55-1848c909be7a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:47:30 crc kubenswrapper[5002]: I1209 11:47:30.773703 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6752ef9c-92c9-4664-8d55-1848c909be7a-config" (OuterVolumeSpecName: "config") pod "6752ef9c-92c9-4664-8d55-1848c909be7a" (UID: "6752ef9c-92c9-4664-8d55-1848c909be7a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:47:30 crc kubenswrapper[5002]: I1209 11:47:30.784204 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6752ef9c-92c9-4664-8d55-1848c909be7a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6752ef9c-92c9-4664-8d55-1848c909be7a" (UID: "6752ef9c-92c9-4664-8d55-1848c909be7a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:47:30 crc kubenswrapper[5002]: I1209 11:47:30.789281 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6752ef9c-92c9-4664-8d55-1848c909be7a-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "6752ef9c-92c9-4664-8d55-1848c909be7a" (UID: "6752ef9c-92c9-4664-8d55-1848c909be7a"). InnerVolumeSpecName "openstack-cell1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:47:30 crc kubenswrapper[5002]: I1209 11:47:30.797312 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6752ef9c-92c9-4664-8d55-1848c909be7a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6752ef9c-92c9-4664-8d55-1848c909be7a" (UID: "6752ef9c-92c9-4664-8d55-1848c909be7a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:47:30 crc kubenswrapper[5002]: I1209 11:47:30.797445 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6752ef9c-92c9-4664-8d55-1848c909be7a-dns-svc\") pod \"6752ef9c-92c9-4664-8d55-1848c909be7a\" (UID: \"6752ef9c-92c9-4664-8d55-1848c909be7a\") " Dec 09 11:47:30 crc kubenswrapper[5002]: W1209 11:47:30.797571 5002 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/6752ef9c-92c9-4664-8d55-1848c909be7a/volumes/kubernetes.io~configmap/dns-svc Dec 09 11:47:30 crc kubenswrapper[5002]: I1209 11:47:30.797584 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6752ef9c-92c9-4664-8d55-1848c909be7a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6752ef9c-92c9-4664-8d55-1848c909be7a" (UID: "6752ef9c-92c9-4664-8d55-1848c909be7a"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:47:30 crc kubenswrapper[5002]: I1209 11:47:30.797895 5002 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6752ef9c-92c9-4664-8d55-1848c909be7a-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:47:30 crc kubenswrapper[5002]: I1209 11:47:30.797915 5002 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6752ef9c-92c9-4664-8d55-1848c909be7a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 11:47:30 crc kubenswrapper[5002]: I1209 11:47:30.797927 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2z95\" (UniqueName: \"kubernetes.io/projected/6752ef9c-92c9-4664-8d55-1848c909be7a-kube-api-access-v2z95\") on node \"crc\" DevicePath \"\"" Dec 09 11:47:30 crc kubenswrapper[5002]: I1209 11:47:30.797936 5002 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/6752ef9c-92c9-4664-8d55-1848c909be7a-openstack-cell1\") on node \"crc\" DevicePath \"\"" Dec 09 11:47:30 crc kubenswrapper[5002]: I1209 11:47:30.797944 5002 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6752ef9c-92c9-4664-8d55-1848c909be7a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 11:47:30 crc kubenswrapper[5002]: I1209 11:47:30.797951 5002 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6752ef9c-92c9-4664-8d55-1848c909be7a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 09 11:47:31 crc kubenswrapper[5002]: I1209 11:47:31.260412 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b54c866bc-k2jw6" event={"ID":"6752ef9c-92c9-4664-8d55-1848c909be7a","Type":"ContainerDied","Data":"38caee3a080be6ad0dec18186e5765612da295a63e1965ec9c93a2ce422a4dfd"} Dec 09 11:47:31 crc kubenswrapper[5002]: I1209 11:47:31.260468 5002 scope.go:117] "RemoveContainer" containerID="7e64376f71b116ac049f79d50183881ac9ac4a8778ae660ffb743ed1f27519be" Dec 09 11:47:31 crc kubenswrapper[5002]: I1209 11:47:31.260486 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b54c866bc-k2jw6" Dec 09 11:47:31 crc kubenswrapper[5002]: I1209 11:47:31.283187 5002 scope.go:117] "RemoveContainer" containerID="6e1c9c503cc517a53caebdbf750ab788d22d4d9167dbdc76acd9b5daa6ee7127" Dec 09 11:47:31 crc kubenswrapper[5002]: I1209 11:47:31.298885 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b54c866bc-k2jw6"] Dec 09 11:47:31 crc kubenswrapper[5002]: I1209 11:47:31.308263 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b54c866bc-k2jw6"] Dec 09 11:47:32 crc kubenswrapper[5002]: I1209 11:47:32.073828 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6752ef9c-92c9-4664-8d55-1848c909be7a" path="/var/lib/kubelet/pods/6752ef9c-92c9-4664-8d55-1848c909be7a/volumes" Dec 09 11:47:40 crc kubenswrapper[5002]: I1209 11:47:40.637038 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdnffp"] Dec 09 11:47:40 crc kubenswrapper[5002]: E1209 11:47:40.638414 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6752ef9c-92c9-4664-8d55-1848c909be7a" containerName="dnsmasq-dns" Dec 09 11:47:40 crc kubenswrapper[5002]: I1209 11:47:40.638437 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="6752ef9c-92c9-4664-8d55-1848c909be7a" containerName="dnsmasq-dns" Dec 09 11:47:40 crc kubenswrapper[5002]: E1209 11:47:40.638472 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6752ef9c-92c9-4664-8d55-1848c909be7a" containerName="init" Dec 09 11:47:40 crc kubenswrapper[5002]: I1209 11:47:40.638483 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="6752ef9c-92c9-4664-8d55-1848c909be7a" containerName="init" Dec 09 11:47:40 crc kubenswrapper[5002]: E1209 11:47:40.638514 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfb0ee03-26c6-4869-904a-519db3cf6dd0" containerName="dnsmasq-dns" Dec 09 11:47:40 crc kubenswrapper[5002]: I1209 11:47:40.638526 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb0ee03-26c6-4869-904a-519db3cf6dd0" containerName="dnsmasq-dns" Dec 09 11:47:40 crc kubenswrapper[5002]: E1209 11:47:40.638541 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfb0ee03-26c6-4869-904a-519db3cf6dd0" containerName="init" Dec 09 11:47:40 crc kubenswrapper[5002]: I1209 11:47:40.638552 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb0ee03-26c6-4869-904a-519db3cf6dd0" containerName="init" Dec 09 11:47:40 crc kubenswrapper[5002]: I1209 11:47:40.638974 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfb0ee03-26c6-4869-904a-519db3cf6dd0" containerName="dnsmasq-dns" Dec 09 11:47:40 crc kubenswrapper[5002]: I1209 11:47:40.639029 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="6752ef9c-92c9-4664-8d55-1848c909be7a" containerName="dnsmasq-dns" Dec 09 11:47:40 crc kubenswrapper[5002]: I1209 11:47:40.640261 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdnffp" Dec 09 11:47:40 crc kubenswrapper[5002]: I1209 11:47:40.642295 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 09 11:47:40 crc kubenswrapper[5002]: I1209 11:47:40.642489 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 09 11:47:40 crc kubenswrapper[5002]: I1209 11:47:40.643485 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 09 11:47:40 crc kubenswrapper[5002]: I1209 11:47:40.643770 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ngftr" Dec 09 11:47:40 crc kubenswrapper[5002]: I1209 11:47:40.650290 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdnffp"] Dec 09 11:47:40 crc kubenswrapper[5002]: I1209 11:47:40.707613 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/03b994f6-6474-46af-8a64-6e9f9169e073-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cdnffp\" (UID: \"03b994f6-6474-46af-8a64-6e9f9169e073\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdnffp" Dec 09 11:47:40 crc kubenswrapper[5002]: I1209 11:47:40.707715 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03b994f6-6474-46af-8a64-6e9f9169e073-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cdnffp\" (UID: \"03b994f6-6474-46af-8a64-6e9f9169e073\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdnffp" Dec 09 11:47:40 crc kubenswrapper[5002]: I1209 11:47:40.707792 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/03b994f6-6474-46af-8a64-6e9f9169e073-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cdnffp\" (UID: \"03b994f6-6474-46af-8a64-6e9f9169e073\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdnffp" Dec 09 11:47:40 crc kubenswrapper[5002]: I1209 11:47:40.708059 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfrf5\" (UniqueName: \"kubernetes.io/projected/03b994f6-6474-46af-8a64-6e9f9169e073-kube-api-access-dfrf5\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cdnffp\" (UID: \"03b994f6-6474-46af-8a64-6e9f9169e073\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdnffp" Dec 09 11:47:40 crc kubenswrapper[5002]: I1209 11:47:40.708286 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/03b994f6-6474-46af-8a64-6e9f9169e073-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cdnffp\" (UID: \"03b994f6-6474-46af-8a64-6e9f9169e073\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdnffp" Dec 09 11:47:40 crc kubenswrapper[5002]: I1209 11:47:40.810742 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ceph\" (UniqueName: \"kubernetes.io/secret/03b994f6-6474-46af-8a64-6e9f9169e073-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cdnffp\" (UID: \"03b994f6-6474-46af-8a64-6e9f9169e073\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdnffp" Dec 09 11:47:40 crc kubenswrapper[5002]: I1209 11:47:40.811288 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfrf5\" (UniqueName: \"kubernetes.io/projected/03b994f6-6474-46af-8a64-6e9f9169e073-kube-api-access-dfrf5\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cdnffp\" (UID: \"03b994f6-6474-46af-8a64-6e9f9169e073\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdnffp" Dec 09 11:47:40 crc kubenswrapper[5002]: I1209 11:47:40.811453 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/03b994f6-6474-46af-8a64-6e9f9169e073-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cdnffp\" (UID: \"03b994f6-6474-46af-8a64-6e9f9169e073\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdnffp" Dec 09 11:47:40 crc kubenswrapper[5002]: I1209 11:47:40.812416 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/03b994f6-6474-46af-8a64-6e9f9169e073-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cdnffp\" (UID: \"03b994f6-6474-46af-8a64-6e9f9169e073\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdnffp" Dec 09 11:47:40 crc kubenswrapper[5002]: I1209 11:47:40.812486 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03b994f6-6474-46af-8a64-6e9f9169e073-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cdnffp\" (UID: \"03b994f6-6474-46af-8a64-6e9f9169e073\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdnffp" Dec 09 11:47:40 crc kubenswrapper[5002]: I1209 11:47:40.818054 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/03b994f6-6474-46af-8a64-6e9f9169e073-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cdnffp\" (UID: \"03b994f6-6474-46af-8a64-6e9f9169e073\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdnffp" Dec 09 11:47:40 crc kubenswrapper[5002]: I1209 11:47:40.818254 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03b994f6-6474-46af-8a64-6e9f9169e073-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cdnffp\" (UID: \"03b994f6-6474-46af-8a64-6e9f9169e073\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdnffp" Dec 09 11:47:40 crc kubenswrapper[5002]: I1209 11:47:40.820027 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/03b994f6-6474-46af-8a64-6e9f9169e073-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cdnffp\" (UID: \"03b994f6-6474-46af-8a64-6e9f9169e073\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdnffp" Dec 09 11:47:40 crc 
kubenswrapper[5002]: I1209 11:47:40.821667 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/03b994f6-6474-46af-8a64-6e9f9169e073-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cdnffp\" (UID: \"03b994f6-6474-46af-8a64-6e9f9169e073\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdnffp" Dec 09 11:47:40 crc kubenswrapper[5002]: I1209 11:47:40.826414 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfrf5\" (UniqueName: \"kubernetes.io/projected/03b994f6-6474-46af-8a64-6e9f9169e073-kube-api-access-dfrf5\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cdnffp\" (UID: \"03b994f6-6474-46af-8a64-6e9f9169e073\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdnffp" Dec 09 11:47:40 crc kubenswrapper[5002]: I1209 11:47:40.983731 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdnffp" Dec 09 11:47:41 crc kubenswrapper[5002]: I1209 11:47:41.816098 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdnffp"] Dec 09 11:47:42 crc kubenswrapper[5002]: I1209 11:47:42.378367 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdnffp" event={"ID":"03b994f6-6474-46af-8a64-6e9f9169e073","Type":"ContainerStarted","Data":"a04d8fb70c0e773070fb698d63ebd222cbf75c473da4e97529219a1c91734dbe"} Dec 09 11:47:53 crc kubenswrapper[5002]: I1209 11:47:53.427858 5002 scope.go:117] "RemoveContainer" containerID="fa0c95cd6a5a2021273ed0c0ea0a76748db43764b31355aa204820d63547a606" Dec 09 11:47:54 crc kubenswrapper[5002]: I1209 11:47:54.342275 5002 scope.go:117] "RemoveContainer" containerID="ce071d4e632eaabf87d73b39d82e96c163b1f99773a26ef04c7b1b00c0c4daff" Dec 09 11:47:55 crc kubenswrapper[5002]: I1209 11:47:55.524594 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdnffp" event={"ID":"03b994f6-6474-46af-8a64-6e9f9169e073","Type":"ContainerStarted","Data":"3e30477252b7a04c15aa3a7b03afcd98c75829bff01efdce003200987dcd1cea"} Dec 09 11:47:55 crc kubenswrapper[5002]: I1209 11:47:55.554711 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdnffp" podStartSLOduration=2.812295877 podStartE2EDuration="15.554687134s" podCreationTimestamp="2025-12-09 11:47:40 +0000 UTC" firstStartedPulling="2025-12-09 11:47:41.819750842 +0000 UTC m=+6394.211801923" lastFinishedPulling="2025-12-09 11:47:54.562142099 +0000 UTC m=+6406.954193180" observedRunningTime="2025-12-09 11:47:55.544473901 +0000 UTC m=+6407.936524992" watchObservedRunningTime="2025-12-09 11:47:55.554687134 +0000 UTC m=+6407.946738215" Dec 09 11:48:08 crc kubenswrapper[5002]: I1209 11:48:08.676521 5002 generic.go:334] "Generic (PLEG): container finished" podID="03b994f6-6474-46af-8a64-6e9f9169e073" containerID="3e30477252b7a04c15aa3a7b03afcd98c75829bff01efdce003200987dcd1cea" exitCode=0 Dec 09 11:48:08 crc kubenswrapper[5002]: I1209 11:48:08.676678 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdnffp" 
event={"ID":"03b994f6-6474-46af-8a64-6e9f9169e073","Type":"ContainerDied","Data":"3e30477252b7a04c15aa3a7b03afcd98c75829bff01efdce003200987dcd1cea"} Dec 09 11:48:10 crc kubenswrapper[5002]: I1209 11:48:10.206958 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdnffp" Dec 09 11:48:10 crc kubenswrapper[5002]: I1209 11:48:10.291611 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfrf5\" (UniqueName: \"kubernetes.io/projected/03b994f6-6474-46af-8a64-6e9f9169e073-kube-api-access-dfrf5\") pod \"03b994f6-6474-46af-8a64-6e9f9169e073\" (UID: \"03b994f6-6474-46af-8a64-6e9f9169e073\") " Dec 09 11:48:10 crc kubenswrapper[5002]: I1209 11:48:10.291691 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/03b994f6-6474-46af-8a64-6e9f9169e073-ssh-key\") pod \"03b994f6-6474-46af-8a64-6e9f9169e073\" (UID: \"03b994f6-6474-46af-8a64-6e9f9169e073\") " Dec 09 11:48:10 crc kubenswrapper[5002]: I1209 11:48:10.291796 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/03b994f6-6474-46af-8a64-6e9f9169e073-ceph\") pod \"03b994f6-6474-46af-8a64-6e9f9169e073\" (UID: \"03b994f6-6474-46af-8a64-6e9f9169e073\") " Dec 09 11:48:10 crc kubenswrapper[5002]: I1209 11:48:10.291973 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03b994f6-6474-46af-8a64-6e9f9169e073-pre-adoption-validation-combined-ca-bundle\") pod \"03b994f6-6474-46af-8a64-6e9f9169e073\" (UID: \"03b994f6-6474-46af-8a64-6e9f9169e073\") " Dec 09 11:48:10 crc kubenswrapper[5002]: I1209 11:48:10.292040 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/03b994f6-6474-46af-8a64-6e9f9169e073-inventory\") pod \"03b994f6-6474-46af-8a64-6e9f9169e073\" (UID: \"03b994f6-6474-46af-8a64-6e9f9169e073\") " Dec 09 11:48:10 crc kubenswrapper[5002]: I1209 11:48:10.298036 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03b994f6-6474-46af-8a64-6e9f9169e073-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: "pre-adoption-validation-combined-ca-bundle") pod "03b994f6-6474-46af-8a64-6e9f9169e073" (UID: "03b994f6-6474-46af-8a64-6e9f9169e073"). InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:48:10 crc kubenswrapper[5002]: I1209 11:48:10.305043 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03b994f6-6474-46af-8a64-6e9f9169e073-kube-api-access-dfrf5" (OuterVolumeSpecName: "kube-api-access-dfrf5") pod "03b994f6-6474-46af-8a64-6e9f9169e073" (UID: "03b994f6-6474-46af-8a64-6e9f9169e073"). InnerVolumeSpecName "kube-api-access-dfrf5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:48:10 crc kubenswrapper[5002]: I1209 11:48:10.305394 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03b994f6-6474-46af-8a64-6e9f9169e073-ceph" (OuterVolumeSpecName: "ceph") pod "03b994f6-6474-46af-8a64-6e9f9169e073" (UID: "03b994f6-6474-46af-8a64-6e9f9169e073"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:48:10 crc kubenswrapper[5002]: I1209 11:48:10.325941 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03b994f6-6474-46af-8a64-6e9f9169e073-inventory" (OuterVolumeSpecName: "inventory") pod "03b994f6-6474-46af-8a64-6e9f9169e073" (UID: "03b994f6-6474-46af-8a64-6e9f9169e073"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:48:10 crc kubenswrapper[5002]: I1209 11:48:10.329127 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03b994f6-6474-46af-8a64-6e9f9169e073-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "03b994f6-6474-46af-8a64-6e9f9169e073" (UID: "03b994f6-6474-46af-8a64-6e9f9169e073"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:48:10 crc kubenswrapper[5002]: I1209 11:48:10.397994 5002 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/03b994f6-6474-46af-8a64-6e9f9169e073-ceph\") on node \"crc\" DevicePath \"\"" Dec 09 11:48:10 crc kubenswrapper[5002]: I1209 11:48:10.398039 5002 reconciler_common.go:293] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03b994f6-6474-46af-8a64-6e9f9169e073-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:48:10 crc kubenswrapper[5002]: I1209 11:48:10.398058 5002 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/03b994f6-6474-46af-8a64-6e9f9169e073-inventory\") on node \"crc\" DevicePath \"\"" Dec 09 11:48:10 crc kubenswrapper[5002]: I1209 11:48:10.398071 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfrf5\" (UniqueName: \"kubernetes.io/projected/03b994f6-6474-46af-8a64-6e9f9169e073-kube-api-access-dfrf5\") on node \"crc\" DevicePath \"\"" Dec 09 11:48:10 crc kubenswrapper[5002]: I1209 11:48:10.398084 5002 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/03b994f6-6474-46af-8a64-6e9f9169e073-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 09 11:48:10 crc kubenswrapper[5002]: I1209 11:48:10.697567 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdnffp" event={"ID":"03b994f6-6474-46af-8a64-6e9f9169e073","Type":"ContainerDied","Data":"a04d8fb70c0e773070fb698d63ebd222cbf75c473da4e97529219a1c91734dbe"} Dec 09 11:48:10 crc kubenswrapper[5002]: I1209 11:48:10.697989 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a04d8fb70c0e773070fb698d63ebd222cbf75c473da4e97529219a1c91734dbe" Dec 09 11:48:10 crc kubenswrapper[5002]: I1209 11:48:10.697638 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdnffp" Dec 09 11:48:13 crc kubenswrapper[5002]: I1209 11:48:13.224670 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-9rspg"] Dec 09 11:48:13 crc kubenswrapper[5002]: E1209 11:48:13.225614 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03b994f6-6474-46af-8a64-6e9f9169e073" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Dec 09 11:48:13 crc kubenswrapper[5002]: I1209 11:48:13.225634 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="03b994f6-6474-46af-8a64-6e9f9169e073" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Dec 09 11:48:13 crc kubenswrapper[5002]: I1209 11:48:13.226062 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="03b994f6-6474-46af-8a64-6e9f9169e073" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Dec 09 11:48:13 crc kubenswrapper[5002]: I1209 11:48:13.227606 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-9rspg" Dec 09 11:48:13 crc kubenswrapper[5002]: I1209 11:48:13.229552 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 09 11:48:13 crc kubenswrapper[5002]: I1209 11:48:13.229739 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 09 11:48:13 crc kubenswrapper[5002]: I1209 11:48:13.229835 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 09 11:48:13 crc kubenswrapper[5002]: I1209 11:48:13.230060 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ngftr" Dec 09 11:48:13 crc kubenswrapper[5002]: I1209 11:48:13.236453 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-9rspg"] Dec 09 11:48:13 crc kubenswrapper[5002]: I1209 11:48:13.361193 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vz5t\" (UniqueName: \"kubernetes.io/projected/17cd99c9-1797-4270-9cdb-0589c1797d27-kube-api-access-2vz5t\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-9rspg\" (UID: \"17cd99c9-1797-4270-9cdb-0589c1797d27\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-9rspg" Dec 09 11:48:13 crc kubenswrapper[5002]: I1209 11:48:13.361348 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/17cd99c9-1797-4270-9cdb-0589c1797d27-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-9rspg\" (UID: \"17cd99c9-1797-4270-9cdb-0589c1797d27\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-9rspg" Dec 09 11:48:13 crc kubenswrapper[5002]: I1209 11:48:13.361423 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17cd99c9-1797-4270-9cdb-0589c1797d27-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-9rspg\" (UID: \"17cd99c9-1797-4270-9cdb-0589c1797d27\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-9rspg" Dec 09 11:48:13 crc 
kubenswrapper[5002]: I1209 11:48:13.361508 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17cd99c9-1797-4270-9cdb-0589c1797d27-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-9rspg\" (UID: \"17cd99c9-1797-4270-9cdb-0589c1797d27\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-9rspg" Dec 09 11:48:13 crc kubenswrapper[5002]: I1209 11:48:13.361624 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17cd99c9-1797-4270-9cdb-0589c1797d27-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-9rspg\" (UID: \"17cd99c9-1797-4270-9cdb-0589c1797d27\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-9rspg" Dec 09 11:48:13 crc kubenswrapper[5002]: I1209 11:48:13.463475 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/17cd99c9-1797-4270-9cdb-0589c1797d27-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-9rspg\" (UID: \"17cd99c9-1797-4270-9cdb-0589c1797d27\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-9rspg" Dec 09 11:48:13 crc kubenswrapper[5002]: I1209 11:48:13.463951 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17cd99c9-1797-4270-9cdb-0589c1797d27-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-9rspg\" (UID: \"17cd99c9-1797-4270-9cdb-0589c1797d27\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-9rspg" Dec 09 11:48:13 crc kubenswrapper[5002]: I1209 11:48:13.464011 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17cd99c9-1797-4270-9cdb-0589c1797d27-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-9rspg\" (UID: \"17cd99c9-1797-4270-9cdb-0589c1797d27\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-9rspg" Dec 09 11:48:13 crc kubenswrapper[5002]: I1209 11:48:13.464124 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17cd99c9-1797-4270-9cdb-0589c1797d27-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-9rspg\" (UID: \"17cd99c9-1797-4270-9cdb-0589c1797d27\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-9rspg" Dec 09 11:48:13 crc kubenswrapper[5002]: I1209 11:48:13.464294 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vz5t\" (UniqueName: \"kubernetes.io/projected/17cd99c9-1797-4270-9cdb-0589c1797d27-kube-api-access-2vz5t\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-9rspg\" (UID: \"17cd99c9-1797-4270-9cdb-0589c1797d27\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-9rspg" Dec 09 11:48:13 crc kubenswrapper[5002]: I1209 11:48:13.470629 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17cd99c9-1797-4270-9cdb-0589c1797d27-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-9rspg\" (UID: \"17cd99c9-1797-4270-9cdb-0589c1797d27\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-9rspg" Dec 09 11:48:13 crc kubenswrapper[5002]: I1209 11:48:13.471072 5002 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/17cd99c9-1797-4270-9cdb-0589c1797d27-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-9rspg\" (UID: \"17cd99c9-1797-4270-9cdb-0589c1797d27\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-9rspg" Dec 09 11:48:13 crc kubenswrapper[5002]: I1209 11:48:13.472327 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17cd99c9-1797-4270-9cdb-0589c1797d27-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-9rspg\" (UID: \"17cd99c9-1797-4270-9cdb-0589c1797d27\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-9rspg" Dec 09 11:48:13 crc kubenswrapper[5002]: I1209 11:48:13.477502 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17cd99c9-1797-4270-9cdb-0589c1797d27-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-9rspg\" (UID: \"17cd99c9-1797-4270-9cdb-0589c1797d27\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-9rspg" Dec 09 11:48:13 crc kubenswrapper[5002]: I1209 11:48:13.485295 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vz5t\" (UniqueName: \"kubernetes.io/projected/17cd99c9-1797-4270-9cdb-0589c1797d27-kube-api-access-2vz5t\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-9rspg\" (UID: \"17cd99c9-1797-4270-9cdb-0589c1797d27\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-9rspg" Dec 09 11:48:13 crc kubenswrapper[5002]: I1209 11:48:13.565479 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-9rspg" Dec 09 11:48:14 crc kubenswrapper[5002]: I1209 11:48:14.131505 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-9rspg"] Dec 09 11:48:14 crc kubenswrapper[5002]: I1209 11:48:14.136314 5002 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 11:48:14 crc kubenswrapper[5002]: I1209 11:48:14.747068 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-9rspg" event={"ID":"17cd99c9-1797-4270-9cdb-0589c1797d27","Type":"ContainerStarted","Data":"82d036f8296354d45f75d96b565605300536dd4cc977c1449062021f7ad7898a"} Dec 09 11:48:15 crc kubenswrapper[5002]: I1209 11:48:15.757427 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-9rspg" event={"ID":"17cd99c9-1797-4270-9cdb-0589c1797d27","Type":"ContainerStarted","Data":"ff956f6c172e46d4bf8bd46747c0659c0db9544509e7602b0913d5c434524afc"} Dec 09 11:48:15 crc kubenswrapper[5002]: I1209 11:48:15.784887 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-9rspg" podStartSLOduration=2.343455076 podStartE2EDuration="2.784867949s" podCreationTimestamp="2025-12-09 11:48:13 +0000 UTC" firstStartedPulling="2025-12-09 11:48:14.136065417 +0000 UTC m=+6426.528116498" lastFinishedPulling="2025-12-09 11:48:14.57747826 +0000 UTC m=+6426.969529371" observedRunningTime="2025-12-09 11:48:15.77744636 +0000 UTC m=+6428.169497511" watchObservedRunningTime="2025-12-09 11:48:15.784867949 +0000 UTC m=+6428.176919030" Dec 09 11:48:26 crc kubenswrapper[5002]: I1209 11:48:26.040094 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-create-4rz5z"] Dec 09 11:48:26 crc kubenswrapper[5002]: I1209 11:48:26.049347 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-create-4rz5z"] Dec 09 11:48:26 crc kubenswrapper[5002]: I1209 11:48:26.075707 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4a693ed-64c3-4ee0-9b56-505f565ac236" path="/var/lib/kubelet/pods/c4a693ed-64c3-4ee0-9b56-505f565ac236/volumes" Dec 09 11:48:27 crc kubenswrapper[5002]: I1209 11:48:27.040004 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-9572-account-create-update-tndnz"] Dec 09 11:48:27 crc kubenswrapper[5002]: I1209 11:48:27.049948 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-9572-account-create-update-tndnz"] Dec 09 11:48:28 crc kubenswrapper[5002]: I1209 11:48:28.083531 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acb2a13d-3ba7-4bf4-be68-aac295d2fd0e" path="/var/lib/kubelet/pods/acb2a13d-3ba7-4bf4-be68-aac295d2fd0e/volumes" Dec 09 11:48:33 crc kubenswrapper[5002]: I1209 11:48:33.040429 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-persistence-db-create-kbfr9"] Dec 09 11:48:33 crc kubenswrapper[5002]: I1209 11:48:33.056147 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-persistence-db-create-kbfr9"] Dec 09 11:48:34 crc kubenswrapper[5002]: I1209 11:48:34.028073 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-5d20-account-create-update-hklfw"] Dec 09 11:48:34 crc kubenswrapper[5002]: I1209 11:48:34.038724 5002 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/octavia-5d20-account-create-update-hklfw"] Dec 09 11:48:34 crc kubenswrapper[5002]: I1209 11:48:34.074775 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ada679ee-2fbf-4aef-bf7a-82eaa257438a" path="/var/lib/kubelet/pods/ada679ee-2fbf-4aef-bf7a-82eaa257438a/volumes" Dec 09 11:48:34 crc kubenswrapper[5002]: I1209 11:48:34.075412 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1def173-a869-4bd2-9013-32339fd3ff8f" path="/var/lib/kubelet/pods/d1def173-a869-4bd2-9013-32339fd3ff8f/volumes" Dec 09 11:48:37 crc kubenswrapper[5002]: I1209 11:48:37.964375 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 11:48:37 crc kubenswrapper[5002]: I1209 11:48:37.966103 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 11:48:54 crc kubenswrapper[5002]: I1209 11:48:54.661728 5002 scope.go:117] "RemoveContainer" containerID="00ccc1de5aa50d50b2926b04b4d727d3c300a33afd8b92b01a506e300aac9a61" Dec 09 11:48:54 crc kubenswrapper[5002]: I1209 11:48:54.692302 5002 scope.go:117] "RemoveContainer" containerID="6fdf88d40c3bb0f02ecda408e9dc6177c3c0027d2aa9ac896901a258a17e127e" Dec 09 11:48:54 crc kubenswrapper[5002]: I1209 11:48:54.777135 5002 scope.go:117] "RemoveContainer" containerID="82501750b20e812ac7ecbd26445f5270a910a89031bc417ab4ec3d3fba069f02" Dec 09 11:48:54 crc kubenswrapper[5002]: I1209 11:48:54.806598 5002 scope.go:117] "RemoveContainer" containerID="aad73eaa74f82fba0bc03f2558f65514bfddb37fde2a01411140bb881a849a62" Dec 09 11:48:54 crc kubenswrapper[5002]: I1209 11:48:54.867258 5002 scope.go:117] "RemoveContainer" containerID="24aeb3a07fa6035b052b77ff64bcfed6f92598c2d7d4ba54a96657a28c09ced4" Dec 09 11:48:54 crc kubenswrapper[5002]: I1209 11:48:54.901065 5002 scope.go:117] "RemoveContainer" containerID="d8ed1401f64444376d1fdd43134a78bef62b903fc936675ec019b767876ce995" Dec 09 11:48:54 crc kubenswrapper[5002]: I1209 11:48:54.950937 5002 scope.go:117] "RemoveContainer" containerID="1a26833391e2a4a79ef081dd09398845556a322dc5ac2726c9549d9e3d629d16" Dec 09 11:49:07 crc kubenswrapper[5002]: I1209 11:49:07.964310 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 11:49:07 crc kubenswrapper[5002]: I1209 11:49:07.964849 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 11:49:10 crc kubenswrapper[5002]: I1209 11:49:10.044746 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-sync-lcrd7"] Dec 09 11:49:10 crc kubenswrapper[5002]: I1209 
11:49:10.055012 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-sync-lcrd7"] Dec 09 11:49:10 crc kubenswrapper[5002]: I1209 11:49:10.077328 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d660fbb0-443c-4a2d-9e92-1a89bac33302" path="/var/lib/kubelet/pods/d660fbb0-443c-4a2d-9e92-1a89bac33302/volumes" Dec 09 11:49:37 crc kubenswrapper[5002]: I1209 11:49:37.964797 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 11:49:37 crc kubenswrapper[5002]: I1209 11:49:37.965991 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 11:49:37 crc kubenswrapper[5002]: I1209 11:49:37.966055 5002 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" Dec 09 11:49:37 crc kubenswrapper[5002]: I1209 11:49:37.966956 5002 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0b77c9efff9501c1fc3a03ad7a9700e7bb9032c2f86b3d0fba67afd52dc18d5f"} pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 11:49:37 crc kubenswrapper[5002]: I1209 11:49:37.967018 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" containerID="cri-o://0b77c9efff9501c1fc3a03ad7a9700e7bb9032c2f86b3d0fba67afd52dc18d5f" gracePeriod=600 Dec 09 11:49:38 crc kubenswrapper[5002]: I1209 11:49:38.657075 5002 generic.go:334] "Generic (PLEG): container finished" podID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerID="0b77c9efff9501c1fc3a03ad7a9700e7bb9032c2f86b3d0fba67afd52dc18d5f" exitCode=0 Dec 09 11:49:38 crc kubenswrapper[5002]: I1209 11:49:38.657146 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerDied","Data":"0b77c9efff9501c1fc3a03ad7a9700e7bb9032c2f86b3d0fba67afd52dc18d5f"} Dec 09 11:49:38 crc kubenswrapper[5002]: I1209 11:49:38.657515 5002 scope.go:117] "RemoveContainer" containerID="39962d0376837cc534e6b0a62303166efdae767fb36cfb81ae7c7eb077d56c3e" Dec 09 11:49:39 crc kubenswrapper[5002]: I1209 11:49:39.670516 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerStarted","Data":"743dea4cb011efac40dec51eea5c3a2c001d7c8c55dd0ca3738e5f1789063389"} Dec 09 11:49:46 crc kubenswrapper[5002]: I1209 11:49:46.054512 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2x44d"] Dec 09 11:49:46 crc kubenswrapper[5002]: I1209 11:49:46.057632 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2x44d" Dec 09 11:49:46 crc kubenswrapper[5002]: I1209 11:49:46.074862 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2x44d"] Dec 09 11:49:46 crc kubenswrapper[5002]: I1209 11:49:46.123260 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a5f5179-c349-4468-92ac-ebed5c82f34a-utilities\") pod \"certified-operators-2x44d\" (UID: \"9a5f5179-c349-4468-92ac-ebed5c82f34a\") " pod="openshift-marketplace/certified-operators-2x44d" Dec 09 11:49:46 crc kubenswrapper[5002]: I1209 11:49:46.123707 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc87t\" (UniqueName: \"kubernetes.io/projected/9a5f5179-c349-4468-92ac-ebed5c82f34a-kube-api-access-vc87t\") pod \"certified-operators-2x44d\" (UID: \"9a5f5179-c349-4468-92ac-ebed5c82f34a\") " pod="openshift-marketplace/certified-operators-2x44d" Dec 09 11:49:46 crc kubenswrapper[5002]: I1209 11:49:46.124095 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a5f5179-c349-4468-92ac-ebed5c82f34a-catalog-content\") pod \"certified-operators-2x44d\" (UID: \"9a5f5179-c349-4468-92ac-ebed5c82f34a\") " pod="openshift-marketplace/certified-operators-2x44d" Dec 09 11:49:46 crc kubenswrapper[5002]: I1209 11:49:46.226694 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vc87t\" (UniqueName: \"kubernetes.io/projected/9a5f5179-c349-4468-92ac-ebed5c82f34a-kube-api-access-vc87t\") pod \"certified-operators-2x44d\" (UID: \"9a5f5179-c349-4468-92ac-ebed5c82f34a\") " pod="openshift-marketplace/certified-operators-2x44d" Dec 09 11:49:46 crc kubenswrapper[5002]: I1209 11:49:46.227079 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a5f5179-c349-4468-92ac-ebed5c82f34a-catalog-content\") pod \"certified-operators-2x44d\" (UID: \"9a5f5179-c349-4468-92ac-ebed5c82f34a\") " pod="openshift-marketplace/certified-operators-2x44d" Dec 09 11:49:46 crc kubenswrapper[5002]: I1209 11:49:46.227212 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a5f5179-c349-4468-92ac-ebed5c82f34a-utilities\") pod \"certified-operators-2x44d\" (UID: \"9a5f5179-c349-4468-92ac-ebed5c82f34a\") " pod="openshift-marketplace/certified-operators-2x44d" Dec 09 11:49:46 crc kubenswrapper[5002]: I1209 11:49:46.227895 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a5f5179-c349-4468-92ac-ebed5c82f34a-catalog-content\") pod \"certified-operators-2x44d\" (UID: \"9a5f5179-c349-4468-92ac-ebed5c82f34a\") " pod="openshift-marketplace/certified-operators-2x44d" Dec 09 11:49:46 crc kubenswrapper[5002]: I1209 11:49:46.228004 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a5f5179-c349-4468-92ac-ebed5c82f34a-utilities\") pod \"certified-operators-2x44d\" (UID: \"9a5f5179-c349-4468-92ac-ebed5c82f34a\") " pod="openshift-marketplace/certified-operators-2x44d" Dec 09 11:49:46 crc kubenswrapper[5002]: I1209 11:49:46.250315 5002 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-vc87t\" (UniqueName: \"kubernetes.io/projected/9a5f5179-c349-4468-92ac-ebed5c82f34a-kube-api-access-vc87t\") pod \"certified-operators-2x44d\" (UID: \"9a5f5179-c349-4468-92ac-ebed5c82f34a\") " pod="openshift-marketplace/certified-operators-2x44d" Dec 09 11:49:46 crc kubenswrapper[5002]: I1209 11:49:46.388741 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2x44d" Dec 09 11:49:46 crc kubenswrapper[5002]: I1209 11:49:46.936580 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2x44d"] Dec 09 11:49:47 crc kubenswrapper[5002]: I1209 11:49:47.753779 5002 generic.go:334] "Generic (PLEG): container finished" podID="9a5f5179-c349-4468-92ac-ebed5c82f34a" containerID="914b44ded9b3fc855c5c035fb6d0a98e926a14e64d30a572dc3fbd2ff8635111" exitCode=0 Dec 09 11:49:47 crc kubenswrapper[5002]: I1209 11:49:47.753861 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2x44d" event={"ID":"9a5f5179-c349-4468-92ac-ebed5c82f34a","Type":"ContainerDied","Data":"914b44ded9b3fc855c5c035fb6d0a98e926a14e64d30a572dc3fbd2ff8635111"} Dec 09 11:49:47 crc kubenswrapper[5002]: I1209 11:49:47.754122 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2x44d" event={"ID":"9a5f5179-c349-4468-92ac-ebed5c82f34a","Type":"ContainerStarted","Data":"d3d8e3776be6613b2f97739e738f7148d6cd7e77e3deea654b292646a231d2a4"} Dec 09 11:49:49 crc kubenswrapper[5002]: I1209 11:49:49.775504 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2x44d" event={"ID":"9a5f5179-c349-4468-92ac-ebed5c82f34a","Type":"ContainerStarted","Data":"6bb1158440f2e4b9febbe7c8c4d5a87535420f67bf3f64f7a5508c1d5fd5eeda"} Dec 09 11:49:51 crc kubenswrapper[5002]: I1209 11:49:51.798732 5002 generic.go:334] "Generic (PLEG): container finished" podID="9a5f5179-c349-4468-92ac-ebed5c82f34a" containerID="6bb1158440f2e4b9febbe7c8c4d5a87535420f67bf3f64f7a5508c1d5fd5eeda" exitCode=0 Dec 09 11:49:51 crc kubenswrapper[5002]: I1209 11:49:51.799102 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2x44d" event={"ID":"9a5f5179-c349-4468-92ac-ebed5c82f34a","Type":"ContainerDied","Data":"6bb1158440f2e4b9febbe7c8c4d5a87535420f67bf3f64f7a5508c1d5fd5eeda"} Dec 09 11:49:53 crc kubenswrapper[5002]: I1209 11:49:53.731638 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ovn-northd-0" podUID="603f27c8-09c3-4c86-8774-3a5937dfaf8e" containerName="ovn-northd" probeResult="failure" output="command timed out" Dec 09 11:49:53 crc kubenswrapper[5002]: I1209 11:49:53.731668 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="603f27c8-09c3-4c86-8774-3a5937dfaf8e" containerName="ovn-northd" probeResult="failure" output="command timed out" Dec 09 11:49:54 crc kubenswrapper[5002]: I1209 11:49:54.826243 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2x44d" event={"ID":"9a5f5179-c349-4468-92ac-ebed5c82f34a","Type":"ContainerStarted","Data":"dabfb7c90ad20e6f2e7521132c1243b2ff0d0338ebf4ea004bf5e397bfb8f67f"} Dec 09 11:49:54 crc kubenswrapper[5002]: I1209 11:49:54.858764 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2x44d" 
podStartSLOduration=2.336756573 podStartE2EDuration="8.858743521s" podCreationTimestamp="2025-12-09 11:49:46 +0000 UTC" firstStartedPulling="2025-12-09 11:49:47.757273621 +0000 UTC m=+6520.149324702" lastFinishedPulling="2025-12-09 11:49:54.279260569 +0000 UTC m=+6526.671311650" observedRunningTime="2025-12-09 11:49:54.851460936 +0000 UTC m=+6527.243512027" watchObservedRunningTime="2025-12-09 11:49:54.858743521 +0000 UTC m=+6527.250794592" Dec 09 11:49:55 crc kubenswrapper[5002]: I1209 11:49:55.142558 5002 scope.go:117] "RemoveContainer" containerID="0842b6fbe1e457931cb81a051aae1496d027ef1ec2164002202ebc57531755f3" Dec 09 11:49:55 crc kubenswrapper[5002]: I1209 11:49:55.166055 5002 scope.go:117] "RemoveContainer" containerID="38b392a7acf133fa7119c93e414e97230818278d2853225f3a89404e4fca77eb" Dec 09 11:49:56 crc kubenswrapper[5002]: I1209 11:49:56.389587 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2x44d" Dec 09 11:49:56 crc kubenswrapper[5002]: I1209 11:49:56.389979 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2x44d" Dec 09 11:49:57 crc kubenswrapper[5002]: I1209 11:49:57.453793 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-2x44d" podUID="9a5f5179-c349-4468-92ac-ebed5c82f34a" containerName="registry-server" probeResult="failure" output=< Dec 09 11:49:57 crc kubenswrapper[5002]: timeout: failed to connect service ":50051" within 1s Dec 09 11:49:57 crc kubenswrapper[5002]: > Dec 09 11:50:06 crc kubenswrapper[5002]: I1209 11:50:06.479259 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2x44d" Dec 09 11:50:06 crc kubenswrapper[5002]: I1209 11:50:06.540630 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2x44d" Dec 09 11:50:06 crc kubenswrapper[5002]: I1209 11:50:06.721731 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2x44d"] Dec 09 11:50:07 crc kubenswrapper[5002]: I1209 11:50:07.992519 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2x44d" podUID="9a5f5179-c349-4468-92ac-ebed5c82f34a" containerName="registry-server" containerID="cri-o://dabfb7c90ad20e6f2e7521132c1243b2ff0d0338ebf4ea004bf5e397bfb8f67f" gracePeriod=2 Dec 09 11:50:15 crc kubenswrapper[5002]: I1209 11:50:15.396098 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2x44d" Dec 09 11:50:15 crc kubenswrapper[5002]: I1209 11:50:15.525509 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a5f5179-c349-4468-92ac-ebed5c82f34a-catalog-content\") pod \"9a5f5179-c349-4468-92ac-ebed5c82f34a\" (UID: \"9a5f5179-c349-4468-92ac-ebed5c82f34a\") " Dec 09 11:50:15 crc kubenswrapper[5002]: I1209 11:50:15.525786 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vc87t\" (UniqueName: \"kubernetes.io/projected/9a5f5179-c349-4468-92ac-ebed5c82f34a-kube-api-access-vc87t\") pod \"9a5f5179-c349-4468-92ac-ebed5c82f34a\" (UID: \"9a5f5179-c349-4468-92ac-ebed5c82f34a\") " Dec 09 11:50:15 crc kubenswrapper[5002]: I1209 11:50:15.525952 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a5f5179-c349-4468-92ac-ebed5c82f34a-utilities\") pod \"9a5f5179-c349-4468-92ac-ebed5c82f34a\" (UID: \"9a5f5179-c349-4468-92ac-ebed5c82f34a\") " Dec 09 11:50:15 crc kubenswrapper[5002]: I1209 11:50:15.527334 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a5f5179-c349-4468-92ac-ebed5c82f34a-utilities" (OuterVolumeSpecName: "utilities") pod "9a5f5179-c349-4468-92ac-ebed5c82f34a" (UID: "9a5f5179-c349-4468-92ac-ebed5c82f34a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:50:15 crc kubenswrapper[5002]: I1209 11:50:15.531004 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a5f5179-c349-4468-92ac-ebed5c82f34a-kube-api-access-vc87t" (OuterVolumeSpecName: "kube-api-access-vc87t") pod "9a5f5179-c349-4468-92ac-ebed5c82f34a" (UID: "9a5f5179-c349-4468-92ac-ebed5c82f34a"). InnerVolumeSpecName "kube-api-access-vc87t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:50:15 crc kubenswrapper[5002]: I1209 11:50:15.591387 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a5f5179-c349-4468-92ac-ebed5c82f34a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9a5f5179-c349-4468-92ac-ebed5c82f34a" (UID: "9a5f5179-c349-4468-92ac-ebed5c82f34a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:50:15 crc kubenswrapper[5002]: I1209 11:50:15.629866 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vc87t\" (UniqueName: \"kubernetes.io/projected/9a5f5179-c349-4468-92ac-ebed5c82f34a-kube-api-access-vc87t\") on node \"crc\" DevicePath \"\"" Dec 09 11:50:15 crc kubenswrapper[5002]: I1209 11:50:15.629903 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a5f5179-c349-4468-92ac-ebed5c82f34a-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 11:50:15 crc kubenswrapper[5002]: I1209 11:50:15.629916 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a5f5179-c349-4468-92ac-ebed5c82f34a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 11:50:15 crc kubenswrapper[5002]: I1209 11:50:15.733197 5002 generic.go:334] "Generic (PLEG): container finished" podID="9a5f5179-c349-4468-92ac-ebed5c82f34a" containerID="dabfb7c90ad20e6f2e7521132c1243b2ff0d0338ebf4ea004bf5e397bfb8f67f" exitCode=0 Dec 09 11:50:15 crc kubenswrapper[5002]: I1209 11:50:15.733550 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2x44d" event={"ID":"9a5f5179-c349-4468-92ac-ebed5c82f34a","Type":"ContainerDied","Data":"dabfb7c90ad20e6f2e7521132c1243b2ff0d0338ebf4ea004bf5e397bfb8f67f"} Dec 09 11:50:15 crc kubenswrapper[5002]: I1209 11:50:15.733589 5002 scope.go:117] "RemoveContainer" containerID="dabfb7c90ad20e6f2e7521132c1243b2ff0d0338ebf4ea004bf5e397bfb8f67f" Dec 09 11:50:15 crc kubenswrapper[5002]: I1209 11:50:15.759677 5002 scope.go:117] "RemoveContainer" containerID="6bb1158440f2e4b9febbe7c8c4d5a87535420f67bf3f64f7a5508c1d5fd5eeda" Dec 09 11:50:15 crc kubenswrapper[5002]: I1209 11:50:15.797229 5002 scope.go:117] "RemoveContainer" containerID="914b44ded9b3fc855c5c035fb6d0a98e926a14e64d30a572dc3fbd2ff8635111" Dec 09 11:50:16 crc kubenswrapper[5002]: I1209 11:50:16.745582 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2x44d" event={"ID":"9a5f5179-c349-4468-92ac-ebed5c82f34a","Type":"ContainerDied","Data":"d3d8e3776be6613b2f97739e738f7148d6cd7e77e3deea654b292646a231d2a4"} Dec 09 11:50:16 crc kubenswrapper[5002]: I1209 11:50:16.745732 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2x44d"
Dec 09 11:50:16 crc kubenswrapper[5002]: I1209 11:50:16.774454 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2x44d"]
Dec 09 11:50:16 crc kubenswrapper[5002]: I1209 11:50:16.783620 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2x44d"]
Dec 09 11:50:18 crc kubenswrapper[5002]: I1209 11:50:18.077915 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a5f5179-c349-4468-92ac-ebed5c82f34a" path="/var/lib/kubelet/pods/9a5f5179-c349-4468-92ac-ebed5c82f34a/volumes"
Dec 09 11:50:55 crc kubenswrapper[5002]: I1209 11:50:55.292067 5002 scope.go:117] "RemoveContainer" containerID="9ac4e5497a8525dd1daeabc5cd5c44539bcf3191e268cfa112e9eb1ccfa97d1c"
Dec 09 11:50:55 crc kubenswrapper[5002]: I1209 11:50:55.346656 5002 scope.go:117] "RemoveContainer" containerID="eb69c4d3ec3c48a1d30ac9d22291af2ada93379f84695287065c23ccd83ae433"
Dec 09 11:50:55 crc kubenswrapper[5002]: I1209 11:50:55.389231 5002 scope.go:117] "RemoveContainer" containerID="d2d75b4285fd009c67d50b1769cb947fa04d53ee9fa3fe058daf0901ca89c605"
Dec 09 11:51:58 crc kubenswrapper[5002]: I1209 11:51:58.042970 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-nbj95"]
Dec 09 11:51:58 crc kubenswrapper[5002]: I1209 11:51:58.074149 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-ffd2-account-create-update-7dng6"]
Dec 09 11:51:58 crc kubenswrapper[5002]: I1209 11:51:58.074185 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-nbj95"]
Dec 09 11:51:58 crc kubenswrapper[5002]: I1209 11:51:58.079261 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-ffd2-account-create-update-7dng6"]
Dec 09 11:52:00 crc kubenswrapper[5002]: I1209 11:52:00.076558 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4700fedc-db85-433e-b657-cecd4f746ca2" path="/var/lib/kubelet/pods/4700fedc-db85-433e-b657-cecd4f746ca2/volumes"
Dec 09 11:52:00 crc kubenswrapper[5002]: I1209 11:52:00.077559 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59341756-042a-491d-a6de-f3a3f8d06cd2" path="/var/lib/kubelet/pods/59341756-042a-491d-a6de-f3a3f8d06cd2/volumes"
Dec 09 11:52:07 crc kubenswrapper[5002]: I1209 11:52:07.964329 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 09 11:52:07 crc kubenswrapper[5002]: I1209 11:52:07.964873 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 09 11:52:10 crc kubenswrapper[5002]: I1209 11:52:10.027741 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-8wvkc"]
Dec 09 11:52:10 crc kubenswrapper[5002]: I1209 11:52:10.037681 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-8wvkc"]
Dec 09 11:52:10 crc kubenswrapper[5002]: I1209 11:52:10.071885 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="665c443a-dd3a-417d-a5b4-8a9851fe80c7" path="/var/lib/kubelet/pods/665c443a-dd3a-417d-a5b4-8a9851fe80c7/volumes"
Dec 09 11:52:37 crc kubenswrapper[5002]: I1209 11:52:37.965066 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 09 11:52:37 crc kubenswrapper[5002]: I1209 11:52:37.965547 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 09 11:52:55 crc kubenswrapper[5002]: I1209 11:52:55.498118 5002 scope.go:117] "RemoveContainer" containerID="9cb4bcc27e93416cb6b88f623e6f71e134169d4614fdd7ea8266151b23fc4e28"
Dec 09 11:52:55 crc kubenswrapper[5002]: I1209 11:52:55.525290 5002 scope.go:117] "RemoveContainer" containerID="d9be45632cab05c8d66b8dcd9fc2dbcb754354846d336d02aed7cabd2c1bdb19"
Dec 09 11:52:55 crc kubenswrapper[5002]: I1209 11:52:55.614182 5002 scope.go:117] "RemoveContainer" containerID="707fbd821f391fe27359a758a6735fa7aafcb1ae3a67e8f3c0015f0841e5b3e8"
Dec 09 11:53:07 crc kubenswrapper[5002]: I1209 11:53:07.965357 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 09 11:53:07 crc kubenswrapper[5002]: I1209 11:53:07.965879 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 09 11:53:07 crc kubenswrapper[5002]: I1209 11:53:07.965934 5002 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6"
Dec 09 11:53:07 crc kubenswrapper[5002]: I1209 11:53:07.966865 5002 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"743dea4cb011efac40dec51eea5c3a2c001d7c8c55dd0ca3738e5f1789063389"} pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 09 11:53:07 crc kubenswrapper[5002]: I1209 11:53:07.966924 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" containerID="cri-o://743dea4cb011efac40dec51eea5c3a2c001d7c8c55dd0ca3738e5f1789063389" gracePeriod=600
Dec 09 11:53:08 crc kubenswrapper[5002]: E1209 11:53:08.093643 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 11:53:08 crc kubenswrapper[5002]: I1209 11:53:08.607233 5002 generic.go:334] "Generic (PLEG): container finished" podID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerID="743dea4cb011efac40dec51eea5c3a2c001d7c8c55dd0ca3738e5f1789063389" exitCode=0
Dec 09 11:53:08 crc kubenswrapper[5002]: I1209 11:53:08.607273 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerDied","Data":"743dea4cb011efac40dec51eea5c3a2c001d7c8c55dd0ca3738e5f1789063389"}
Dec 09 11:53:08 crc kubenswrapper[5002]: I1209 11:53:08.607553 5002 scope.go:117] "RemoveContainer" containerID="0b77c9efff9501c1fc3a03ad7a9700e7bb9032c2f86b3d0fba67afd52dc18d5f"
Dec 09 11:53:08 crc kubenswrapper[5002]: I1209 11:53:08.608320 5002 scope.go:117] "RemoveContainer" containerID="743dea4cb011efac40dec51eea5c3a2c001d7c8c55dd0ca3738e5f1789063389"
Dec 09 11:53:08 crc kubenswrapper[5002]: E1209 11:53:08.608745 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 11:53:15 crc kubenswrapper[5002]: E1209 11:53:15.114947 5002 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf49c6392_68b2_4847_9291_a0b4d9c1cbef.slice/crio-conmon-743dea4cb011efac40dec51eea5c3a2c001d7c8c55dd0ca3738e5f1789063389.scope\": RecentStats: unable to find data in memory cache]"
Dec 09 11:53:20 crc kubenswrapper[5002]: I1209 11:53:20.070295 5002 scope.go:117] "RemoveContainer" containerID="743dea4cb011efac40dec51eea5c3a2c001d7c8c55dd0ca3738e5f1789063389"
Dec 09 11:53:20 crc kubenswrapper[5002]: E1209 11:53:20.071368 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 11:53:25 crc kubenswrapper[5002]: E1209 11:53:25.435109 5002 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf49c6392_68b2_4847_9291_a0b4d9c1cbef.slice/crio-conmon-743dea4cb011efac40dec51eea5c3a2c001d7c8c55dd0ca3738e5f1789063389.scope\": RecentStats: unable to find data in memory cache]"
Dec 09 11:53:31 crc kubenswrapper[5002]: I1209 11:53:31.060910 5002 scope.go:117] "RemoveContainer" containerID="743dea4cb011efac40dec51eea5c3a2c001d7c8c55dd0ca3738e5f1789063389"
Dec 09 11:53:31 crc kubenswrapper[5002]: E1209 11:53:31.061716 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 11:53:35 crc kubenswrapper[5002]: E1209 11:53:35.741131 5002 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf49c6392_68b2_4847_9291_a0b4d9c1cbef.slice/crio-conmon-743dea4cb011efac40dec51eea5c3a2c001d7c8c55dd0ca3738e5f1789063389.scope\": RecentStats: unable to find data in memory cache]"
Dec 09 11:53:46 crc kubenswrapper[5002]: I1209 11:53:46.060915 5002 scope.go:117] "RemoveContainer" containerID="743dea4cb011efac40dec51eea5c3a2c001d7c8c55dd0ca3738e5f1789063389"
Dec 09 11:53:46 crc kubenswrapper[5002]: E1209 11:53:46.061998 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 11:53:46 crc kubenswrapper[5002]: E1209 11:53:46.081697 5002 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf49c6392_68b2_4847_9291_a0b4d9c1cbef.slice/crio-conmon-743dea4cb011efac40dec51eea5c3a2c001d7c8c55dd0ca3738e5f1789063389.scope\": RecentStats: unable to find data in memory cache]"
Dec 09 11:53:56 crc kubenswrapper[5002]: E1209 11:53:56.361574 5002 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf49c6392_68b2_4847_9291_a0b4d9c1cbef.slice/crio-conmon-743dea4cb011efac40dec51eea5c3a2c001d7c8c55dd0ca3738e5f1789063389.scope\": RecentStats: unable to find data in memory cache]"
Dec 09 11:53:59 crc kubenswrapper[5002]: I1209 11:53:59.060141 5002 scope.go:117] "RemoveContainer" containerID="743dea4cb011efac40dec51eea5c3a2c001d7c8c55dd0ca3738e5f1789063389"
Dec 09 11:53:59 crc kubenswrapper[5002]: E1209 11:53:59.061145 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 11:54:06 crc kubenswrapper[5002]: E1209 11:54:06.652663 5002 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf49c6392_68b2_4847_9291_a0b4d9c1cbef.slice/crio-conmon-743dea4cb011efac40dec51eea5c3a2c001d7c8c55dd0ca3738e5f1789063389.scope\": RecentStats: unable to find data in memory cache]"
Dec 09 11:54:12 crc kubenswrapper[5002]: I1209 11:54:12.060246 5002 scope.go:117] "RemoveContainer" containerID="743dea4cb011efac40dec51eea5c3a2c001d7c8c55dd0ca3738e5f1789063389"
Dec 09 11:54:12 crc kubenswrapper[5002]: E1209 11:54:12.061187 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 11:54:22 crc kubenswrapper[5002]: I1209 11:54:22.580146 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-x28d8"]
Dec 09 11:54:22 crc kubenswrapper[5002]: E1209 11:54:22.581633 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a5f5179-c349-4468-92ac-ebed5c82f34a" containerName="extract-utilities"
Dec 09 11:54:22 crc kubenswrapper[5002]: I1209 11:54:22.581652 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a5f5179-c349-4468-92ac-ebed5c82f34a" containerName="extract-utilities"
Dec 09 11:54:22 crc kubenswrapper[5002]: E1209 11:54:22.581677 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a5f5179-c349-4468-92ac-ebed5c82f34a" containerName="registry-server"
Dec 09 11:54:22 crc kubenswrapper[5002]: I1209 11:54:22.581686 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a5f5179-c349-4468-92ac-ebed5c82f34a" containerName="registry-server"
Dec 09 11:54:22 crc kubenswrapper[5002]: E1209 11:54:22.581746 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a5f5179-c349-4468-92ac-ebed5c82f34a" containerName="extract-content"
Dec 09 11:54:22 crc kubenswrapper[5002]: I1209 11:54:22.581756 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a5f5179-c349-4468-92ac-ebed5c82f34a" containerName="extract-content"
Dec 09 11:54:22 crc kubenswrapper[5002]: I1209 11:54:22.582295 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a5f5179-c349-4468-92ac-ebed5c82f34a" containerName="registry-server"
Dec 09 11:54:22 crc kubenswrapper[5002]: I1209 11:54:22.598636 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x28d8"
Dec 09 11:54:22 crc kubenswrapper[5002]: I1209 11:54:22.602864 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x28d8"]
Dec 09 11:54:22 crc kubenswrapper[5002]: I1209 11:54:22.699599 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ad3dec3-37e9-463e-92b1-2648cbd3c67f-catalog-content\") pod \"redhat-operators-x28d8\" (UID: \"4ad3dec3-37e9-463e-92b1-2648cbd3c67f\") " pod="openshift-marketplace/redhat-operators-x28d8"
Dec 09 11:54:22 crc kubenswrapper[5002]: I1209 11:54:22.699695 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qllzp\" (UniqueName: \"kubernetes.io/projected/4ad3dec3-37e9-463e-92b1-2648cbd3c67f-kube-api-access-qllzp\") pod \"redhat-operators-x28d8\" (UID: \"4ad3dec3-37e9-463e-92b1-2648cbd3c67f\") " pod="openshift-marketplace/redhat-operators-x28d8"
Dec 09 11:54:22 crc kubenswrapper[5002]: I1209 11:54:22.699794 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ad3dec3-37e9-463e-92b1-2648cbd3c67f-utilities\") pod \"redhat-operators-x28d8\" (UID: \"4ad3dec3-37e9-463e-92b1-2648cbd3c67f\") " pod="openshift-marketplace/redhat-operators-x28d8"
Dec 09 11:54:22 crc kubenswrapper[5002]: I1209 11:54:22.801799 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ad3dec3-37e9-463e-92b1-2648cbd3c67f-utilities\") pod \"redhat-operators-x28d8\" (UID: \"4ad3dec3-37e9-463e-92b1-2648cbd3c67f\") " pod="openshift-marketplace/redhat-operators-x28d8"
Dec 09 11:54:22 crc kubenswrapper[5002]: I1209 11:54:22.801980 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ad3dec3-37e9-463e-92b1-2648cbd3c67f-catalog-content\") pod \"redhat-operators-x28d8\" (UID: \"4ad3dec3-37e9-463e-92b1-2648cbd3c67f\") " pod="openshift-marketplace/redhat-operators-x28d8"
Dec 09 11:54:22 crc kubenswrapper[5002]: I1209 11:54:22.802032 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qllzp\" (UniqueName: \"kubernetes.io/projected/4ad3dec3-37e9-463e-92b1-2648cbd3c67f-kube-api-access-qllzp\") pod \"redhat-operators-x28d8\" (UID: \"4ad3dec3-37e9-463e-92b1-2648cbd3c67f\") " pod="openshift-marketplace/redhat-operators-x28d8"
Dec 09 11:54:22 crc kubenswrapper[5002]: I1209 11:54:22.802297 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ad3dec3-37e9-463e-92b1-2648cbd3c67f-utilities\") pod \"redhat-operators-x28d8\" (UID: \"4ad3dec3-37e9-463e-92b1-2648cbd3c67f\") " pod="openshift-marketplace/redhat-operators-x28d8"
Dec 09 11:54:22 crc kubenswrapper[5002]: I1209 11:54:22.802584 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ad3dec3-37e9-463e-92b1-2648cbd3c67f-catalog-content\") pod \"redhat-operators-x28d8\" (UID: \"4ad3dec3-37e9-463e-92b1-2648cbd3c67f\") " pod="openshift-marketplace/redhat-operators-x28d8"
Dec 09 11:54:22 crc kubenswrapper[5002]: I1209 11:54:22.824935 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qllzp\" (UniqueName: \"kubernetes.io/projected/4ad3dec3-37e9-463e-92b1-2648cbd3c67f-kube-api-access-qllzp\") pod \"redhat-operators-x28d8\" (UID: \"4ad3dec3-37e9-463e-92b1-2648cbd3c67f\") " pod="openshift-marketplace/redhat-operators-x28d8"
Dec 09 11:54:22 crc kubenswrapper[5002]: I1209 11:54:22.927128 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x28d8"
Dec 09 11:54:23 crc kubenswrapper[5002]: W1209 11:54:23.429173 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ad3dec3_37e9_463e_92b1_2648cbd3c67f.slice/crio-a1af99a413735d6e27c2b022377e3c1342d35cfe7124caebcafb66929b5d4876 WatchSource:0}: Error finding container a1af99a413735d6e27c2b022377e3c1342d35cfe7124caebcafb66929b5d4876: Status 404 returned error can't find the container with id a1af99a413735d6e27c2b022377e3c1342d35cfe7124caebcafb66929b5d4876
Dec 09 11:54:23 crc kubenswrapper[5002]: I1209 11:54:23.431348 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x28d8"]
Dec 09 11:54:23 crc kubenswrapper[5002]: I1209 11:54:23.591725 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x28d8" event={"ID":"4ad3dec3-37e9-463e-92b1-2648cbd3c67f","Type":"ContainerStarted","Data":"a1af99a413735d6e27c2b022377e3c1342d35cfe7124caebcafb66929b5d4876"}
Dec 09 11:54:24 crc kubenswrapper[5002]: I1209 11:54:24.602323 5002 generic.go:334] "Generic (PLEG): container finished" podID="4ad3dec3-37e9-463e-92b1-2648cbd3c67f" containerID="019342f3963cf09421d02d09f2bb6e2edde8d23e8049679e0440dfb748483ab9" exitCode=0
Dec 09 11:54:24 crc kubenswrapper[5002]: I1209 11:54:24.602429 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x28d8" event={"ID":"4ad3dec3-37e9-463e-92b1-2648cbd3c67f","Type":"ContainerDied","Data":"019342f3963cf09421d02d09f2bb6e2edde8d23e8049679e0440dfb748483ab9"}
Dec 09 11:54:24 crc kubenswrapper[5002]: I1209 11:54:24.605569 5002 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 09 11:54:25 crc kubenswrapper[5002]: I1209 11:54:25.060680 5002 scope.go:117] "RemoveContainer" containerID="743dea4cb011efac40dec51eea5c3a2c001d7c8c55dd0ca3738e5f1789063389"
Dec 09 11:54:25 crc kubenswrapper[5002]: E1209 11:54:25.061493 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 11:54:25 crc kubenswrapper[5002]: I1209 11:54:25.615474 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x28d8" event={"ID":"4ad3dec3-37e9-463e-92b1-2648cbd3c67f","Type":"ContainerStarted","Data":"e0952c3622a4f09364d358616a50d16b182c42dd0af2fd33709f1d4c9608871a"}
Dec 09 11:54:30 crc kubenswrapper[5002]: I1209 11:54:30.683910 5002 generic.go:334] "Generic (PLEG): container finished" podID="4ad3dec3-37e9-463e-92b1-2648cbd3c67f" containerID="e0952c3622a4f09364d358616a50d16b182c42dd0af2fd33709f1d4c9608871a" exitCode=0
Dec 09 11:54:30 crc kubenswrapper[5002]: I1209 11:54:30.684005 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x28d8" event={"ID":"4ad3dec3-37e9-463e-92b1-2648cbd3c67f","Type":"ContainerDied","Data":"e0952c3622a4f09364d358616a50d16b182c42dd0af2fd33709f1d4c9608871a"}
Dec 09 11:54:31 crc kubenswrapper[5002]: I1209 11:54:31.698156 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x28d8" event={"ID":"4ad3dec3-37e9-463e-92b1-2648cbd3c67f","Type":"ContainerStarted","Data":"348339d64fda0685099195fef4150efc21465f6a44d71cf04d4f7ac334b7e1a4"}
Dec 09 11:54:31 crc kubenswrapper[5002]: I1209 11:54:31.721886 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-x28d8" podStartSLOduration=3.164996028 podStartE2EDuration="9.721855421s" podCreationTimestamp="2025-12-09 11:54:22 +0000 UTC" firstStartedPulling="2025-12-09 11:54:24.60535762 +0000 UTC m=+6796.997408701" lastFinishedPulling="2025-12-09 11:54:31.162217013 +0000 UTC m=+6803.554268094" observedRunningTime="2025-12-09 11:54:31.721229025 +0000 UTC m=+6804.113280126" watchObservedRunningTime="2025-12-09 11:54:31.721855421 +0000 UTC m=+6804.113906502"
Dec 09 11:54:32 crc kubenswrapper[5002]: I1209 11:54:32.927341 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-x28d8"
Dec 09 11:54:32 crc kubenswrapper[5002]: I1209 11:54:32.927717 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-x28d8"
Dec 09 11:54:33 crc kubenswrapper[5002]: I1209 11:54:33.973575 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-x28d8" podUID="4ad3dec3-37e9-463e-92b1-2648cbd3c67f" containerName="registry-server" probeResult="failure" output=<
Dec 09 11:54:33 crc kubenswrapper[5002]: timeout: failed to connect service ":50051" within 1s
Dec 09 11:54:33 crc kubenswrapper[5002]: >
Dec 09 11:54:37 crc kubenswrapper[5002]: I1209 11:54:37.060256 5002 scope.go:117] "RemoveContainer" containerID="743dea4cb011efac40dec51eea5c3a2c001d7c8c55dd0ca3738e5f1789063389"
Dec 09 11:54:37 crc kubenswrapper[5002]: E1209 11:54:37.060607 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 11:54:43 crc kubenswrapper[5002]: I1209 11:54:43.071610 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-x28d8"
Dec 09 11:54:43 crc kubenswrapper[5002]: I1209 11:54:43.118294 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-x28d8"
Dec 09 11:54:43 crc kubenswrapper[5002]: I1209 11:54:43.323267 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x28d8"]
Dec 09 11:54:44 crc kubenswrapper[5002]: I1209 11:54:44.816698 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-x28d8" podUID="4ad3dec3-37e9-463e-92b1-2648cbd3c67f" containerName="registry-server" containerID="cri-o://348339d64fda0685099195fef4150efc21465f6a44d71cf04d4f7ac334b7e1a4" gracePeriod=2
Dec 09 11:54:45 crc kubenswrapper[5002]: I1209 11:54:45.290027 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x28d8"
Dec 09 11:54:45 crc kubenswrapper[5002]: I1209 11:54:45.478450 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qllzp\" (UniqueName: \"kubernetes.io/projected/4ad3dec3-37e9-463e-92b1-2648cbd3c67f-kube-api-access-qllzp\") pod \"4ad3dec3-37e9-463e-92b1-2648cbd3c67f\" (UID: \"4ad3dec3-37e9-463e-92b1-2648cbd3c67f\") "
Dec 09 11:54:45 crc kubenswrapper[5002]: I1209 11:54:45.478579 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ad3dec3-37e9-463e-92b1-2648cbd3c67f-catalog-content\") pod \"4ad3dec3-37e9-463e-92b1-2648cbd3c67f\" (UID: \"4ad3dec3-37e9-463e-92b1-2648cbd3c67f\") "
Dec 09 11:54:45 crc kubenswrapper[5002]: I1209 11:54:45.478774 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ad3dec3-37e9-463e-92b1-2648cbd3c67f-utilities\") pod \"4ad3dec3-37e9-463e-92b1-2648cbd3c67f\" (UID: \"4ad3dec3-37e9-463e-92b1-2648cbd3c67f\") "
Dec 09 11:54:45 crc kubenswrapper[5002]: I1209 11:54:45.479505 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ad3dec3-37e9-463e-92b1-2648cbd3c67f-utilities" (OuterVolumeSpecName: "utilities") pod "4ad3dec3-37e9-463e-92b1-2648cbd3c67f" (UID: "4ad3dec3-37e9-463e-92b1-2648cbd3c67f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 11:54:45 crc kubenswrapper[5002]: I1209 11:54:45.484259 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ad3dec3-37e9-463e-92b1-2648cbd3c67f-kube-api-access-qllzp" (OuterVolumeSpecName: "kube-api-access-qllzp") pod "4ad3dec3-37e9-463e-92b1-2648cbd3c67f" (UID: "4ad3dec3-37e9-463e-92b1-2648cbd3c67f"). InnerVolumeSpecName "kube-api-access-qllzp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 11:54:45 crc kubenswrapper[5002]: I1209 11:54:45.581729 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ad3dec3-37e9-463e-92b1-2648cbd3c67f-utilities\") on node \"crc\" DevicePath \"\""
Dec 09 11:54:45 crc kubenswrapper[5002]: I1209 11:54:45.581766 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qllzp\" (UniqueName: \"kubernetes.io/projected/4ad3dec3-37e9-463e-92b1-2648cbd3c67f-kube-api-access-qllzp\") on node \"crc\" DevicePath \"\""
Dec 09 11:54:45 crc kubenswrapper[5002]: I1209 11:54:45.634199 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ad3dec3-37e9-463e-92b1-2648cbd3c67f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4ad3dec3-37e9-463e-92b1-2648cbd3c67f" (UID: "4ad3dec3-37e9-463e-92b1-2648cbd3c67f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 11:54:45 crc kubenswrapper[5002]: I1209 11:54:45.683435 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ad3dec3-37e9-463e-92b1-2648cbd3c67f-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 09 11:54:45 crc kubenswrapper[5002]: I1209 11:54:45.828032 5002 generic.go:334] "Generic (PLEG): container finished" podID="4ad3dec3-37e9-463e-92b1-2648cbd3c67f" containerID="348339d64fda0685099195fef4150efc21465f6a44d71cf04d4f7ac334b7e1a4" exitCode=0
Dec 09 11:54:45 crc kubenswrapper[5002]: I1209 11:54:45.828073 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x28d8" event={"ID":"4ad3dec3-37e9-463e-92b1-2648cbd3c67f","Type":"ContainerDied","Data":"348339d64fda0685099195fef4150efc21465f6a44d71cf04d4f7ac334b7e1a4"}
Dec 09 11:54:45 crc kubenswrapper[5002]: I1209 11:54:45.828132 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x28d8" event={"ID":"4ad3dec3-37e9-463e-92b1-2648cbd3c67f","Type":"ContainerDied","Data":"a1af99a413735d6e27c2b022377e3c1342d35cfe7124caebcafb66929b5d4876"}
Dec 09 11:54:45 crc kubenswrapper[5002]: I1209 11:54:45.828158 5002 scope.go:117] "RemoveContainer" containerID="348339d64fda0685099195fef4150efc21465f6a44d71cf04d4f7ac334b7e1a4"
Dec 09 11:54:45 crc kubenswrapper[5002]: I1209 11:54:45.828659 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x28d8"
Dec 09 11:54:45 crc kubenswrapper[5002]: I1209 11:54:45.859359 5002 scope.go:117] "RemoveContainer" containerID="e0952c3622a4f09364d358616a50d16b182c42dd0af2fd33709f1d4c9608871a"
Dec 09 11:54:45 crc kubenswrapper[5002]: I1209 11:54:45.874608 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x28d8"]
Dec 09 11:54:45 crc kubenswrapper[5002]: I1209 11:54:45.888669 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-x28d8"]
Dec 09 11:54:45 crc kubenswrapper[5002]: I1209 11:54:45.899988 5002 scope.go:117] "RemoveContainer" containerID="019342f3963cf09421d02d09f2bb6e2edde8d23e8049679e0440dfb748483ab9"
Dec 09 11:54:45 crc kubenswrapper[5002]: I1209 11:54:45.942752 5002 scope.go:117] "RemoveContainer" containerID="348339d64fda0685099195fef4150efc21465f6a44d71cf04d4f7ac334b7e1a4"
Dec 09 11:54:45 crc kubenswrapper[5002]: E1209 11:54:45.944219 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"348339d64fda0685099195fef4150efc21465f6a44d71cf04d4f7ac334b7e1a4\": container with ID starting with 348339d64fda0685099195fef4150efc21465f6a44d71cf04d4f7ac334b7e1a4 not found: ID does not exist" containerID="348339d64fda0685099195fef4150efc21465f6a44d71cf04d4f7ac334b7e1a4"
Dec 09 11:54:45 crc kubenswrapper[5002]: I1209 11:54:45.944270 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"348339d64fda0685099195fef4150efc21465f6a44d71cf04d4f7ac334b7e1a4"} err="failed to get container status \"348339d64fda0685099195fef4150efc21465f6a44d71cf04d4f7ac334b7e1a4\": rpc error: code = NotFound desc = could not find container \"348339d64fda0685099195fef4150efc21465f6a44d71cf04d4f7ac334b7e1a4\": container with ID starting with 348339d64fda0685099195fef4150efc21465f6a44d71cf04d4f7ac334b7e1a4 not found: ID does not exist"
Dec 09 11:54:45 crc kubenswrapper[5002]: I1209 11:54:45.944304 5002 scope.go:117] "RemoveContainer" containerID="e0952c3622a4f09364d358616a50d16b182c42dd0af2fd33709f1d4c9608871a"
Dec 09 11:54:45 crc kubenswrapper[5002]: E1209 11:54:45.945867 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0952c3622a4f09364d358616a50d16b182c42dd0af2fd33709f1d4c9608871a\": container with ID starting with e0952c3622a4f09364d358616a50d16b182c42dd0af2fd33709f1d4c9608871a not found: ID does not exist" containerID="e0952c3622a4f09364d358616a50d16b182c42dd0af2fd33709f1d4c9608871a"
Dec 09 11:54:45 crc kubenswrapper[5002]: I1209 11:54:45.945923 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0952c3622a4f09364d358616a50d16b182c42dd0af2fd33709f1d4c9608871a"} err="failed to get container status \"e0952c3622a4f09364d358616a50d16b182c42dd0af2fd33709f1d4c9608871a\": rpc error: code = NotFound desc = could not find container \"e0952c3622a4f09364d358616a50d16b182c42dd0af2fd33709f1d4c9608871a\": container with ID starting with e0952c3622a4f09364d358616a50d16b182c42dd0af2fd33709f1d4c9608871a not found: ID does not exist"
Dec 09 11:54:45 crc kubenswrapper[5002]: I1209 11:54:45.945957 5002 scope.go:117] "RemoveContainer" containerID="019342f3963cf09421d02d09f2bb6e2edde8d23e8049679e0440dfb748483ab9"
Dec 09 11:54:45 crc kubenswrapper[5002]: E1209 11:54:45.946261 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"019342f3963cf09421d02d09f2bb6e2edde8d23e8049679e0440dfb748483ab9\": container with ID starting with 019342f3963cf09421d02d09f2bb6e2edde8d23e8049679e0440dfb748483ab9 not found: ID does not exist" containerID="019342f3963cf09421d02d09f2bb6e2edde8d23e8049679e0440dfb748483ab9"
Dec 09 11:54:45 crc kubenswrapper[5002]: I1209 11:54:45.946303 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"019342f3963cf09421d02d09f2bb6e2edde8d23e8049679e0440dfb748483ab9"} err="failed to get container status \"019342f3963cf09421d02d09f2bb6e2edde8d23e8049679e0440dfb748483ab9\": rpc error: code = NotFound desc = could not find container \"019342f3963cf09421d02d09f2bb6e2edde8d23e8049679e0440dfb748483ab9\": container with ID starting with 019342f3963cf09421d02d09f2bb6e2edde8d23e8049679e0440dfb748483ab9 not found: ID does not exist"
Dec 09 11:54:46 crc kubenswrapper[5002]: I1209 11:54:46.083460 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ad3dec3-37e9-463e-92b1-2648cbd3c67f" path="/var/lib/kubelet/pods/4ad3dec3-37e9-463e-92b1-2648cbd3c67f/volumes"
Dec 09 11:54:51 crc kubenswrapper[5002]: I1209 11:54:51.060979 5002 scope.go:117] "RemoveContainer" containerID="743dea4cb011efac40dec51eea5c3a2c001d7c8c55dd0ca3738e5f1789063389"
Dec 09 11:54:51 crc kubenswrapper[5002]: E1209 11:54:51.061743 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 11:55:06 crc kubenswrapper[5002]: I1209 11:55:06.061133 5002 scope.go:117] "RemoveContainer" containerID="743dea4cb011efac40dec51eea5c3a2c001d7c8c55dd0ca3738e5f1789063389"
Dec 09 11:55:06 crc kubenswrapper[5002]: E1209 11:55:06.062625 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 11:55:07 crc kubenswrapper[5002]: I1209 11:55:07.058737 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-364d-account-create-update-5gppk"]
Dec 09 11:55:07 crc kubenswrapper[5002]: I1209 11:55:07.072595 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-78vks"]
Dec 09 11:55:07 crc kubenswrapper[5002]: I1209 11:55:07.080954 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-364d-account-create-update-5gppk"]
Dec 09 11:55:07 crc kubenswrapper[5002]: I1209 11:55:07.088302 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-78vks"]
Dec 09 11:55:08 crc kubenswrapper[5002]: I1209 11:55:08.073989 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45156329-405f-4f04-9a55-3568465720b3" path="/var/lib/kubelet/pods/45156329-405f-4f04-9a55-3568465720b3/volumes"
Dec 09 11:55:08 crc kubenswrapper[5002]: I1209 11:55:08.074547 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c94563f6-d63b-478e-a796-9615ab525f8b" path="/var/lib/kubelet/pods/c94563f6-d63b-478e-a796-9615ab525f8b/volumes"
Dec 09 11:55:17 crc kubenswrapper[5002]: I1209 11:55:17.061431 5002 scope.go:117] "RemoveContainer" containerID="743dea4cb011efac40dec51eea5c3a2c001d7c8c55dd0ca3738e5f1789063389"
Dec 09 11:55:17 crc kubenswrapper[5002]: E1209 11:55:17.062434 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 11:55:21 crc kubenswrapper[5002]: I1209 11:55:21.045652 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-6sw44"]
Dec 09 11:55:21 crc kubenswrapper[5002]: I1209 11:55:21.059419 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-6sw44"]
Dec 09 11:55:22 crc kubenswrapper[5002]: I1209 11:55:22.074545 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8067dc57-a44d-4997-bfe0-945cb47eabe3" path="/var/lib/kubelet/pods/8067dc57-a44d-4997-bfe0-945cb47eabe3/volumes"
Dec 09 11:55:32 crc kubenswrapper[5002]: I1209 11:55:32.060393 5002 scope.go:117] "RemoveContainer" containerID="743dea4cb011efac40dec51eea5c3a2c001d7c8c55dd0ca3738e5f1789063389"
Dec 09 11:55:32 crc kubenswrapper[5002]: E1209 11:55:32.061160 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 11:55:42 crc kubenswrapper[5002]: I1209 11:55:42.036426 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-w2hx7"]
Dec 09 11:55:42 crc kubenswrapper[5002]: I1209 11:55:42.044993 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-1042-account-create-update-z8pfb"]
Dec 09 11:55:42 crc kubenswrapper[5002]: I1209 11:55:42.053311 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-1042-account-create-update-z8pfb"]
Dec 09 11:55:42 crc kubenswrapper[5002]: I1209 11:55:42.078141 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1532e90-d56a-4f6c-9f0f-3aa418f6498a" path="/var/lib/kubelet/pods/f1532e90-d56a-4f6c-9f0f-3aa418f6498a/volumes"
Dec 09 11:55:42 crc kubenswrapper[5002]: I1209 11:55:42.078935 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-w2hx7"]
Dec 09 11:55:43 crc kubenswrapper[5002]: I1209 11:55:43.060858 5002 scope.go:117] "RemoveContainer" containerID="743dea4cb011efac40dec51eea5c3a2c001d7c8c55dd0ca3738e5f1789063389"
Dec 09 11:55:43 crc kubenswrapper[5002]: E1209 11:55:43.061166 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 11:55:44 crc kubenswrapper[5002]: I1209 11:55:44.074493 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94ee5751-ddf8-468a-b9e1-feb60a74481c" path="/var/lib/kubelet/pods/94ee5751-ddf8-468a-b9e1-feb60a74481c/volumes"
Dec 09 11:55:55 crc kubenswrapper[5002]: I1209 11:55:55.799449 5002 scope.go:117] "RemoveContainer" containerID="636e8e457008c6d06bcea2a0074f00b0ef6e363760f3d379a16dd2bbeaea2918"
Dec 09 11:55:55 crc kubenswrapper[5002]: I1209 11:55:55.825296 5002 scope.go:117] "RemoveContainer" containerID="44fb72e7119cc0ef6bc8a804a747b1a9d09bafd531f0db2bb4ab798bc2bb712c"
Dec 09 11:55:55 crc kubenswrapper[5002]: I1209 11:55:55.885606 5002 scope.go:117] "RemoveContainer" containerID="049f26688284e6813c69c3aa3f4ff7847df86e26ade416de72508c7c7eab60c5"
Dec 09 11:55:55 crc kubenswrapper[5002]: I1209 11:55:55.957376 5002 scope.go:117] "RemoveContainer" containerID="611aaee36f227f6bf7b43a35a0857458b868ae07fff5c6d9b65bfef529969114"
Dec 09 11:55:55 crc kubenswrapper[5002]: I1209 11:55:55.987985 5002 scope.go:117] "RemoveContainer" containerID="19c556d2629036d9a997f5c5b1a3bbf0146f2d067156fff6ad0847815dd15620"
Dec 09 11:55:57 crc kubenswrapper[5002]: I1209 11:55:57.060575 5002 scope.go:117] "RemoveContainer" containerID="743dea4cb011efac40dec51eea5c3a2c001d7c8c55dd0ca3738e5f1789063389"
Dec 09 11:55:57 crc kubenswrapper[5002]: E1209 11:55:57.061207 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 11:55:58 crc kubenswrapper[5002]: I1209 11:55:58.045916 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-qf6zt"]
Dec 09 11:55:58 crc kubenswrapper[5002]: I1209 11:55:58.055100 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-qf6zt"]
Dec 09 11:55:58 crc kubenswrapper[5002]: I1209 11:55:58.073050 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0393597-f5c1-4aea-83fd-dd94aa5018a8" path="/var/lib/kubelet/pods/a0393597-f5c1-4aea-83fd-dd94aa5018a8/volumes"
Dec 09 11:56:09 crc kubenswrapper[5002]: I1209 11:56:09.061187 5002 scope.go:117] "RemoveContainer" containerID="743dea4cb011efac40dec51eea5c3a2c001d7c8c55dd0ca3738e5f1789063389"
Dec 09 11:56:09 crc kubenswrapper[5002]: E1209 11:56:09.062070 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 11:56:12 crc kubenswrapper[5002]: I1209 11:56:12.702027 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7b5w5"]
Dec 09 11:56:12 crc kubenswrapper[5002]: E1209 11:56:12.703531 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ad3dec3-37e9-463e-92b1-2648cbd3c67f" containerName="extract-utilities"
Dec 09 11:56:12 crc kubenswrapper[5002]: I1209 11:56:12.703550 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ad3dec3-37e9-463e-92b1-2648cbd3c67f" containerName="extract-utilities"
Dec 09 11:56:12 crc kubenswrapper[5002]: E1209 11:56:12.703585 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ad3dec3-37e9-463e-92b1-2648cbd3c67f" containerName="registry-server"
Dec 09 11:56:12 crc kubenswrapper[5002]: I1209 11:56:12.703592 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ad3dec3-37e9-463e-92b1-2648cbd3c67f" containerName="registry-server"
Dec 09 11:56:12 crc kubenswrapper[5002]: E1209 11:56:12.703616 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ad3dec3-37e9-463e-92b1-2648cbd3c67f" containerName="extract-content"
Dec 09 11:56:12 crc kubenswrapper[5002]: I1209 11:56:12.703624 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ad3dec3-37e9-463e-92b1-2648cbd3c67f" containerName="extract-content"
Dec 09 11:56:12 crc kubenswrapper[5002]: I1209 11:56:12.704149 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ad3dec3-37e9-463e-92b1-2648cbd3c67f" containerName="registry-server"
Dec 09 11:56:12 crc kubenswrapper[5002]: I1209 11:56:12.711410 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7b5w5"
Dec 09 11:56:12 crc kubenswrapper[5002]: I1209 11:56:12.723298 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7b5w5"]
Dec 09 11:56:12 crc kubenswrapper[5002]: I1209 11:56:12.796437 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8gz6\" (UniqueName: \"kubernetes.io/projected/35c5f211-ceb3-42d3-bb99-d41af7772be6-kube-api-access-x8gz6\") pod \"community-operators-7b5w5\" (UID: \"35c5f211-ceb3-42d3-bb99-d41af7772be6\") " pod="openshift-marketplace/community-operators-7b5w5"
Dec 09 11:56:12 crc kubenswrapper[5002]: I1209 11:56:12.796540 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35c5f211-ceb3-42d3-bb99-d41af7772be6-catalog-content\") pod \"community-operators-7b5w5\" (UID: \"35c5f211-ceb3-42d3-bb99-d41af7772be6\") " pod="openshift-marketplace/community-operators-7b5w5"
Dec 09 11:56:12 crc kubenswrapper[5002]: I1209 11:56:12.796648 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35c5f211-ceb3-42d3-bb99-d41af7772be6-utilities\") pod \"community-operators-7b5w5\" (UID: \"35c5f211-ceb3-42d3-bb99-d41af7772be6\") " pod="openshift-marketplace/community-operators-7b5w5"
Dec 09 11:56:12 crc kubenswrapper[5002]: I1209 11:56:12.898351 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35c5f211-ceb3-42d3-bb99-d41af7772be6-utilities\") pod \"community-operators-7b5w5\" (UID: \"35c5f211-ceb3-42d3-bb99-d41af7772be6\") " pod="openshift-marketplace/community-operators-7b5w5"
Dec 09 11:56:12 crc kubenswrapper[5002]: I1209 11:56:12.898455 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8gz6\" (UniqueName: \"kubernetes.io/projected/35c5f211-ceb3-42d3-bb99-d41af7772be6-kube-api-access-x8gz6\") pod \"community-operators-7b5w5\" (UID: \"35c5f211-ceb3-42d3-bb99-d41af7772be6\") " pod="openshift-marketplace/community-operators-7b5w5"
Dec 09 11:56:12 crc kubenswrapper[5002]: I1209 11:56:12.898511 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35c5f211-ceb3-42d3-bb99-d41af7772be6-catalog-content\") pod \"community-operators-7b5w5\" (UID: \"35c5f211-ceb3-42d3-bb99-d41af7772be6\") " pod="openshift-marketplace/community-operators-7b5w5"
Dec 09 11:56:12 crc kubenswrapper[5002]: I1209 11:56:12.898955 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35c5f211-ceb3-42d3-bb99-d41af7772be6-utilities\") pod \"community-operators-7b5w5\" (UID: \"35c5f211-ceb3-42d3-bb99-d41af7772be6\") " pod="openshift-marketplace/community-operators-7b5w5"
Dec 09 11:56:12 crc kubenswrapper[5002]: I1209 11:56:12.898968 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35c5f211-ceb3-42d3-bb99-d41af7772be6-catalog-content\") pod \"community-operators-7b5w5\" (UID: \"35c5f211-ceb3-42d3-bb99-d41af7772be6\") " pod="openshift-marketplace/community-operators-7b5w5"
Dec 09 11:56:12 crc kubenswrapper[5002]: I1209 11:56:12.926218 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8gz6\" (UniqueName: \"kubernetes.io/projected/35c5f211-ceb3-42d3-bb99-d41af7772be6-kube-api-access-x8gz6\") pod \"community-operators-7b5w5\" (UID: \"35c5f211-ceb3-42d3-bb99-d41af7772be6\") " pod="openshift-marketplace/community-operators-7b5w5"
Dec 09 11:56:13 crc kubenswrapper[5002]: I1209 11:56:13.039911 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7b5w5"
Dec 09 11:56:13 crc kubenswrapper[5002]: I1209 11:56:13.624892 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7b5w5"]
Dec 09 11:56:13 crc kubenswrapper[5002]: I1209 11:56:13.714091 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7b5w5" event={"ID":"35c5f211-ceb3-42d3-bb99-d41af7772be6","Type":"ContainerStarted","Data":"161e5ad7441cec2f7a49268f5f3b32e47b33cb1de83a60dfc5cee585ff4438ad"}
Dec 09 11:56:14 crc kubenswrapper[5002]: I1209 11:56:14.725622 5002 generic.go:334] "Generic (PLEG): container finished" podID="35c5f211-ceb3-42d3-bb99-d41af7772be6" containerID="a73f0d35a64ccdc87ed4a161a2cf778c76481c9816437f3dcaf2f5eaccf0998a" exitCode=0
Dec 09 11:56:14 crc kubenswrapper[5002]: I1209 11:56:14.725966 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7b5w5" event={"ID":"35c5f211-ceb3-42d3-bb99-d41af7772be6","Type":"ContainerDied","Data":"a73f0d35a64ccdc87ed4a161a2cf778c76481c9816437f3dcaf2f5eaccf0998a"}
Dec 09 11:56:16 crc kubenswrapper[5002]: I1209 11:56:16.748011 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7b5w5" event={"ID":"35c5f211-ceb3-42d3-bb99-d41af7772be6","Type":"ContainerStarted","Data":"b3bad492fc2fd8455c9ff1444b7e3b06dd8392ed0820ae2bfe96e67b9c63f048"}
Dec 09 11:56:17 crc kubenswrapper[5002]: I1209 11:56:17.758348 5002 generic.go:334] "Generic (PLEG): container finished" podID="35c5f211-ceb3-42d3-bb99-d41af7772be6" containerID="b3bad492fc2fd8455c9ff1444b7e3b06dd8392ed0820ae2bfe96e67b9c63f048" exitCode=0
Dec 09 11:56:17 crc kubenswrapper[5002]: I1209 11:56:17.758391 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7b5w5" event={"ID":"35c5f211-ceb3-42d3-bb99-d41af7772be6","Type":"ContainerDied","Data":"b3bad492fc2fd8455c9ff1444b7e3b06dd8392ed0820ae2bfe96e67b9c63f048"}
Dec 09 11:56:19 crc kubenswrapper[5002]: I1209 11:56:19.803702 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7b5w5" event={"ID":"35c5f211-ceb3-42d3-bb99-d41af7772be6","Type":"ContainerStarted","Data":"4d84caf080dbf29f77a0217099d4ed0d349c70a56910f3d328707ff92061256a"}
Dec 09 11:56:19 crc kubenswrapper[5002]: I1209 11:56:19.833530 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7b5w5" podStartSLOduration=4.033583179 podStartE2EDuration="7.833491957s" podCreationTimestamp="2025-12-09 11:56:12 +0000 UTC" firstStartedPulling="2025-12-09 11:56:14.729008637 +0000 UTC m=+6907.121059718" lastFinishedPulling="2025-12-09 11:56:18.528917405 +0000 UTC m=+6910.920968496" observedRunningTime="2025-12-09 11:56:19.826368346 +0000 UTC m=+6912.218419427" watchObservedRunningTime="2025-12-09 11:56:19.833491957 +0000 UTC m=+6912.225543038"
Dec 09 11:56:21 crc kubenswrapper[5002]: I1209 11:56:21.061278 5002 scope.go:117] "RemoveContainer" containerID="743dea4cb011efac40dec51eea5c3a2c001d7c8c55dd0ca3738e5f1789063389"
Dec 09 11:56:21 crc kubenswrapper[5002]: E1209 11:56:21.061825 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 11:56:23 crc kubenswrapper[5002]: I1209 11:56:23.040112 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7b5w5"
Dec 09 11:56:23 crc kubenswrapper[5002]: I1209 11:56:23.040460 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7b5w5"
Dec 09 11:56:23 crc kubenswrapper[5002]: I1209 11:56:23.086611 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7b5w5"
Dec 09 11:56:23 crc kubenswrapper[5002]: I1209 11:56:23.897506 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7b5w5"
Dec 09 11:56:23 crc kubenswrapper[5002]: I1209 11:56:23.947516 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7b5w5"]
Dec 09 11:56:25 crc kubenswrapper[5002]: I1209 11:56:25.868497 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7b5w5" podUID="35c5f211-ceb3-42d3-bb99-d41af7772be6" containerName="registry-server" containerID="cri-o://4d84caf080dbf29f77a0217099d4ed0d349c70a56910f3d328707ff92061256a" gracePeriod=2
Dec 09 11:56:26 crc kubenswrapper[5002]: I1209 11:56:26.449289 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7b5w5"
Dec 09 11:56:26 crc kubenswrapper[5002]: I1209 11:56:26.640331 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35c5f211-ceb3-42d3-bb99-d41af7772be6-catalog-content\") pod \"35c5f211-ceb3-42d3-bb99-d41af7772be6\" (UID: \"35c5f211-ceb3-42d3-bb99-d41af7772be6\") "
Dec 09 11:56:26 crc kubenswrapper[5002]: I1209 11:56:26.640487 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35c5f211-ceb3-42d3-bb99-d41af7772be6-utilities\") pod \"35c5f211-ceb3-42d3-bb99-d41af7772be6\" (UID: \"35c5f211-ceb3-42d3-bb99-d41af7772be6\") "
Dec 09 11:56:26 crc kubenswrapper[5002]: I1209 11:56:26.641349 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35c5f211-ceb3-42d3-bb99-d41af7772be6-utilities" (OuterVolumeSpecName: "utilities") pod "35c5f211-ceb3-42d3-bb99-d41af7772be6" (UID: "35c5f211-ceb3-42d3-bb99-d41af7772be6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 11:56:26 crc kubenswrapper[5002]: I1209 11:56:26.641429 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8gz6\" (UniqueName: \"kubernetes.io/projected/35c5f211-ceb3-42d3-bb99-d41af7772be6-kube-api-access-x8gz6\") pod \"35c5f211-ceb3-42d3-bb99-d41af7772be6\" (UID: \"35c5f211-ceb3-42d3-bb99-d41af7772be6\") "
Dec 09 11:56:26 crc kubenswrapper[5002]: I1209 11:56:26.642762 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35c5f211-ceb3-42d3-bb99-d41af7772be6-utilities\") on node \"crc\" DevicePath \"\""
Dec 09 11:56:26 crc kubenswrapper[5002]: I1209 11:56:26.647623 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35c5f211-ceb3-42d3-bb99-d41af7772be6-kube-api-access-x8gz6" (OuterVolumeSpecName: "kube-api-access-x8gz6") pod "35c5f211-ceb3-42d3-bb99-d41af7772be6" (UID: "35c5f211-ceb3-42d3-bb99-d41af7772be6"). InnerVolumeSpecName "kube-api-access-x8gz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 11:56:26 crc kubenswrapper[5002]: I1209 11:56:26.692495 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35c5f211-ceb3-42d3-bb99-d41af7772be6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "35c5f211-ceb3-42d3-bb99-d41af7772be6" (UID: "35c5f211-ceb3-42d3-bb99-d41af7772be6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 11:56:26 crc kubenswrapper[5002]: I1209 11:56:26.744688 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8gz6\" (UniqueName: \"kubernetes.io/projected/35c5f211-ceb3-42d3-bb99-d41af7772be6-kube-api-access-x8gz6\") on node \"crc\" DevicePath \"\""
Dec 09 11:56:26 crc kubenswrapper[5002]: I1209 11:56:26.744728 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35c5f211-ceb3-42d3-bb99-d41af7772be6-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 09 11:56:26 crc kubenswrapper[5002]: I1209 11:56:26.880626 5002 generic.go:334] "Generic (PLEG): container finished" podID="35c5f211-ceb3-42d3-bb99-d41af7772be6" containerID="4d84caf080dbf29f77a0217099d4ed0d349c70a56910f3d328707ff92061256a" exitCode=0
Dec 09 11:56:26 crc kubenswrapper[5002]: I1209 11:56:26.880667 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7b5w5" event={"ID":"35c5f211-ceb3-42d3-bb99-d41af7772be6","Type":"ContainerDied","Data":"4d84caf080dbf29f77a0217099d4ed0d349c70a56910f3d328707ff92061256a"}
Dec 09 11:56:26 crc kubenswrapper[5002]: I1209 11:56:26.880710 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7b5w5" event={"ID":"35c5f211-ceb3-42d3-bb99-d41af7772be6","Type":"ContainerDied","Data":"161e5ad7441cec2f7a49268f5f3b32e47b33cb1de83a60dfc5cee585ff4438ad"}
Dec 09 11:56:26 crc kubenswrapper[5002]: I1209 11:56:26.880727 5002 scope.go:117] "RemoveContainer" containerID="4d84caf080dbf29f77a0217099d4ed0d349c70a56910f3d328707ff92061256a"
Dec 09 11:56:26 crc kubenswrapper[5002]: I1209 11:56:26.880879 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7b5w5"
Dec 09 11:56:26 crc kubenswrapper[5002]: I1209 11:56:26.928315 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7b5w5"]
Dec 09 11:56:26 crc kubenswrapper[5002]: I1209 11:56:26.928831 5002 scope.go:117] "RemoveContainer" containerID="b3bad492fc2fd8455c9ff1444b7e3b06dd8392ed0820ae2bfe96e67b9c63f048"
Dec 09 11:56:26 crc kubenswrapper[5002]: I1209 11:56:26.941218 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7b5w5"]
Dec 09 11:56:26 crc kubenswrapper[5002]: I1209 11:56:26.950333 5002 scope.go:117] "RemoveContainer" containerID="a73f0d35a64ccdc87ed4a161a2cf778c76481c9816437f3dcaf2f5eaccf0998a"
Dec 09 11:56:27 crc kubenswrapper[5002]: I1209 11:56:27.004407 5002 scope.go:117] "RemoveContainer" containerID="4d84caf080dbf29f77a0217099d4ed0d349c70a56910f3d328707ff92061256a"
Dec 09 11:56:27 crc kubenswrapper[5002]: E1209 11:56:27.005078 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d84caf080dbf29f77a0217099d4ed0d349c70a56910f3d328707ff92061256a\": container with ID starting with 4d84caf080dbf29f77a0217099d4ed0d349c70a56910f3d328707ff92061256a not found: ID does not exist" containerID="4d84caf080dbf29f77a0217099d4ed0d349c70a56910f3d328707ff92061256a"
Dec 09 11:56:27 crc kubenswrapper[5002]: I1209 11:56:27.005110 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d84caf080dbf29f77a0217099d4ed0d349c70a56910f3d328707ff92061256a"} err="failed to get container status \"4d84caf080dbf29f77a0217099d4ed0d349c70a56910f3d328707ff92061256a\": rpc error: code = NotFound desc = could not find container \"4d84caf080dbf29f77a0217099d4ed0d349c70a56910f3d328707ff92061256a\": container with ID starting with 4d84caf080dbf29f77a0217099d4ed0d349c70a56910f3d328707ff92061256a not found: ID does not exist"
Dec 09 11:56:27 crc kubenswrapper[5002]: I1209 11:56:27.005136 5002 scope.go:117] "RemoveContainer" containerID="b3bad492fc2fd8455c9ff1444b7e3b06dd8392ed0820ae2bfe96e67b9c63f048"
Dec 09 11:56:27 crc kubenswrapper[5002]: E1209 11:56:27.007270 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3bad492fc2fd8455c9ff1444b7e3b06dd8392ed0820ae2bfe96e67b9c63f048\": container with ID starting with b3bad492fc2fd8455c9ff1444b7e3b06dd8392ed0820ae2bfe96e67b9c63f048 not found: ID does not exist" containerID="b3bad492fc2fd8455c9ff1444b7e3b06dd8392ed0820ae2bfe96e67b9c63f048"
Dec 09 11:56:27 crc kubenswrapper[5002]: I1209 11:56:27.007320 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3bad492fc2fd8455c9ff1444b7e3b06dd8392ed0820ae2bfe96e67b9c63f048"} err="failed to get container status \"b3bad492fc2fd8455c9ff1444b7e3b06dd8392ed0820ae2bfe96e67b9c63f048\": rpc error: code = NotFound desc = could not find container \"b3bad492fc2fd8455c9ff1444b7e3b06dd8392ed0820ae2bfe96e67b9c63f048\": container with ID starting with b3bad492fc2fd8455c9ff1444b7e3b06dd8392ed0820ae2bfe96e67b9c63f048 not found: ID does not exist"
Dec 09 11:56:27 crc kubenswrapper[5002]: I1209 11:56:27.007337 5002 scope.go:117] "RemoveContainer" containerID="a73f0d35a64ccdc87ed4a161a2cf778c76481c9816437f3dcaf2f5eaccf0998a"
Dec 09 11:56:27 crc kubenswrapper[5002]: E1209 11:56:27.007657 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a73f0d35a64ccdc87ed4a161a2cf778c76481c9816437f3dcaf2f5eaccf0998a\": container with ID starting with a73f0d35a64ccdc87ed4a161a2cf778c76481c9816437f3dcaf2f5eaccf0998a not found: ID does not exist" containerID="a73f0d35a64ccdc87ed4a161a2cf778c76481c9816437f3dcaf2f5eaccf0998a"
Dec 09 11:56:27 crc kubenswrapper[5002]: I1209 11:56:27.007679 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a73f0d35a64ccdc87ed4a161a2cf778c76481c9816437f3dcaf2f5eaccf0998a"} err="failed to get container status \"a73f0d35a64ccdc87ed4a161a2cf778c76481c9816437f3dcaf2f5eaccf0998a\": rpc error: code = NotFound desc = could not find container \"a73f0d35a64ccdc87ed4a161a2cf778c76481c9816437f3dcaf2f5eaccf0998a\": container with ID starting with a73f0d35a64ccdc87ed4a161a2cf778c76481c9816437f3dcaf2f5eaccf0998a not found: ID does not exist"
Dec 09 11:56:28 crc kubenswrapper[5002]: I1209 11:56:28.073840 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35c5f211-ceb3-42d3-bb99-d41af7772be6" path="/var/lib/kubelet/pods/35c5f211-ceb3-42d3-bb99-d41af7772be6/volumes"
Dec 09 11:56:34 crc kubenswrapper[5002]: I1209 11:56:34.062053 5002 scope.go:117] "RemoveContainer" containerID="743dea4cb011efac40dec51eea5c3a2c001d7c8c55dd0ca3738e5f1789063389"
Dec 09 11:56:34 crc kubenswrapper[5002]: E1209 11:56:34.063003 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 11:56:49 crc kubenswrapper[5002]: I1209 11:56:49.062155 5002 scope.go:117] "RemoveContainer" containerID="743dea4cb011efac40dec51eea5c3a2c001d7c8c55dd0ca3738e5f1789063389"
Dec 09 11:56:49 crc kubenswrapper[5002]: E1209 11:56:49.062888 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 11:56:56 crc kubenswrapper[5002]: I1209 11:56:56.148705 5002 scope.go:117] "RemoveContainer" containerID="ea06cabb52460abfed294bbd120945f25757f9c70c094be62966355530e890ec"
Dec 09 11:57:02 crc kubenswrapper[5002]: I1209 11:57:02.079054 5002 scope.go:117] "RemoveContainer" containerID="743dea4cb011efac40dec51eea5c3a2c001d7c8c55dd0ca3738e5f1789063389"
Dec 09 11:57:02 crc kubenswrapper[5002]: E1209 11:57:02.081767 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 11:57:17 crc kubenswrapper[5002]: I1209 11:57:17.060183 5002 scope.go:117] "RemoveContainer"
containerID="743dea4cb011efac40dec51eea5c3a2c001d7c8c55dd0ca3738e5f1789063389" Dec 09 11:57:17 crc kubenswrapper[5002]: E1209 11:57:17.060944 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:57:29 crc kubenswrapper[5002]: I1209 11:57:29.059902 5002 scope.go:117] "RemoveContainer" containerID="743dea4cb011efac40dec51eea5c3a2c001d7c8c55dd0ca3738e5f1789063389" Dec 09 11:57:29 crc kubenswrapper[5002]: E1209 11:57:29.060736 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:57:40 crc kubenswrapper[5002]: I1209 11:57:40.060320 5002 scope.go:117] "RemoveContainer" containerID="743dea4cb011efac40dec51eea5c3a2c001d7c8c55dd0ca3738e5f1789063389" Dec 09 11:57:40 crc kubenswrapper[5002]: E1209 11:57:40.061303 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:57:53 crc kubenswrapper[5002]: I1209 11:57:53.060710 5002 scope.go:117] "RemoveContainer" containerID="743dea4cb011efac40dec51eea5c3a2c001d7c8c55dd0ca3738e5f1789063389" Dec 09 11:57:53 crc kubenswrapper[5002]: E1209 11:57:53.061458 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:58:07 crc kubenswrapper[5002]: I1209 11:58:07.061385 5002 scope.go:117] "RemoveContainer" containerID="743dea4cb011efac40dec51eea5c3a2c001d7c8c55dd0ca3738e5f1789063389" Dec 09 11:58:07 crc kubenswrapper[5002]: E1209 11:58:07.062406 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 11:58:18 crc kubenswrapper[5002]: I1209 11:58:18.067636 5002 scope.go:117] "RemoveContainer" containerID="743dea4cb011efac40dec51eea5c3a2c001d7c8c55dd0ca3738e5f1789063389" Dec 09 11:58:18 crc kubenswrapper[5002]: I1209 11:58:18.930567 5002 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerStarted","Data":"6226b704e39d43eed606b7631dfc787a50a79bc246be5230700c7cb27da7c704"} Dec 09 11:58:35 crc kubenswrapper[5002]: I1209 11:58:35.987719 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lmgwg"] Dec 09 11:58:35 crc kubenswrapper[5002]: E1209 11:58:35.988618 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35c5f211-ceb3-42d3-bb99-d41af7772be6" containerName="extract-content" Dec 09 11:58:35 crc kubenswrapper[5002]: I1209 11:58:35.988630 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="35c5f211-ceb3-42d3-bb99-d41af7772be6" containerName="extract-content" Dec 09 11:58:35 crc kubenswrapper[5002]: E1209 11:58:35.988649 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35c5f211-ceb3-42d3-bb99-d41af7772be6" containerName="registry-server" Dec 09 11:58:35 crc kubenswrapper[5002]: I1209 11:58:35.988654 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="35c5f211-ceb3-42d3-bb99-d41af7772be6" containerName="registry-server" Dec 09 11:58:35 crc kubenswrapper[5002]: E1209 11:58:35.988682 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35c5f211-ceb3-42d3-bb99-d41af7772be6" containerName="extract-utilities" Dec 09 11:58:35 crc kubenswrapper[5002]: I1209 11:58:35.988689 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="35c5f211-ceb3-42d3-bb99-d41af7772be6" containerName="extract-utilities" Dec 09 11:58:35 crc kubenswrapper[5002]: I1209 11:58:35.988910 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="35c5f211-ceb3-42d3-bb99-d41af7772be6" containerName="registry-server" Dec 09 11:58:35 crc kubenswrapper[5002]: I1209 11:58:35.990409 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lmgwg" Dec 09 11:58:35 crc kubenswrapper[5002]: I1209 11:58:35.997028 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lmgwg"] Dec 09 11:58:36 crc kubenswrapper[5002]: I1209 11:58:36.153795 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9037ed1f-97ad-4793-a98d-51a8507e95f1-catalog-content\") pod \"redhat-marketplace-lmgwg\" (UID: \"9037ed1f-97ad-4793-a98d-51a8507e95f1\") " pod="openshift-marketplace/redhat-marketplace-lmgwg" Dec 09 11:58:36 crc kubenswrapper[5002]: I1209 11:58:36.153999 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9037ed1f-97ad-4793-a98d-51a8507e95f1-utilities\") pod \"redhat-marketplace-lmgwg\" (UID: \"9037ed1f-97ad-4793-a98d-51a8507e95f1\") " pod="openshift-marketplace/redhat-marketplace-lmgwg" Dec 09 11:58:36 crc kubenswrapper[5002]: I1209 11:58:36.154390 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jddkl\" (UniqueName: \"kubernetes.io/projected/9037ed1f-97ad-4793-a98d-51a8507e95f1-kube-api-access-jddkl\") pod \"redhat-marketplace-lmgwg\" (UID: \"9037ed1f-97ad-4793-a98d-51a8507e95f1\") " pod="openshift-marketplace/redhat-marketplace-lmgwg" Dec 09 11:58:36 crc kubenswrapper[5002]: I1209 11:58:36.256504 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9037ed1f-97ad-4793-a98d-51a8507e95f1-utilities\") pod \"redhat-marketplace-lmgwg\" (UID: \"9037ed1f-97ad-4793-a98d-51a8507e95f1\") " pod="openshift-marketplace/redhat-marketplace-lmgwg" Dec 09 11:58:36 crc kubenswrapper[5002]: I1209 11:58:36.256644 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jddkl\" (UniqueName: \"kubernetes.io/projected/9037ed1f-97ad-4793-a98d-51a8507e95f1-kube-api-access-jddkl\") pod \"redhat-marketplace-lmgwg\" (UID: \"9037ed1f-97ad-4793-a98d-51a8507e95f1\") " pod="openshift-marketplace/redhat-marketplace-lmgwg" Dec 09 11:58:36 crc kubenswrapper[5002]: I1209 11:58:36.256700 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9037ed1f-97ad-4793-a98d-51a8507e95f1-catalog-content\") pod \"redhat-marketplace-lmgwg\" (UID: \"9037ed1f-97ad-4793-a98d-51a8507e95f1\") " pod="openshift-marketplace/redhat-marketplace-lmgwg" Dec 09 11:58:36 crc kubenswrapper[5002]: I1209 11:58:36.257060 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9037ed1f-97ad-4793-a98d-51a8507e95f1-utilities\") pod \"redhat-marketplace-lmgwg\" (UID: \"9037ed1f-97ad-4793-a98d-51a8507e95f1\") " pod="openshift-marketplace/redhat-marketplace-lmgwg" Dec 09 11:58:36 crc kubenswrapper[5002]: I1209 11:58:36.257128 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9037ed1f-97ad-4793-a98d-51a8507e95f1-catalog-content\") pod \"redhat-marketplace-lmgwg\" (UID: \"9037ed1f-97ad-4793-a98d-51a8507e95f1\") " pod="openshift-marketplace/redhat-marketplace-lmgwg" Dec 09 11:58:36 crc kubenswrapper[5002]: I1209 11:58:36.286882 5002 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-jddkl\" (UniqueName: \"kubernetes.io/projected/9037ed1f-97ad-4793-a98d-51a8507e95f1-kube-api-access-jddkl\") pod \"redhat-marketplace-lmgwg\" (UID: \"9037ed1f-97ad-4793-a98d-51a8507e95f1\") " pod="openshift-marketplace/redhat-marketplace-lmgwg" Dec 09 11:58:36 crc kubenswrapper[5002]: I1209 11:58:36.315666 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lmgwg" Dec 09 11:58:36 crc kubenswrapper[5002]: I1209 11:58:36.783607 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lmgwg"] Dec 09 11:58:36 crc kubenswrapper[5002]: W1209 11:58:36.786161 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9037ed1f_97ad_4793_a98d_51a8507e95f1.slice/crio-a8f09f43b2200d22e3d03b6c2e45c8b37da55d69f89807900a86ecc8ec3cba14 WatchSource:0}: Error finding container a8f09f43b2200d22e3d03b6c2e45c8b37da55d69f89807900a86ecc8ec3cba14: Status 404 returned error can't find the container with id a8f09f43b2200d22e3d03b6c2e45c8b37da55d69f89807900a86ecc8ec3cba14 Dec 09 11:58:37 crc kubenswrapper[5002]: I1209 11:58:37.141446 5002 generic.go:334] "Generic (PLEG): container finished" podID="9037ed1f-97ad-4793-a98d-51a8507e95f1" containerID="9d29dfaba9c36792a39aa5a317de5042d19a16117b755e8a020cbacbcee08c44" exitCode=0 Dec 09 11:58:37 crc kubenswrapper[5002]: I1209 11:58:37.141542 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lmgwg" event={"ID":"9037ed1f-97ad-4793-a98d-51a8507e95f1","Type":"ContainerDied","Data":"9d29dfaba9c36792a39aa5a317de5042d19a16117b755e8a020cbacbcee08c44"} Dec 09 11:58:37 crc kubenswrapper[5002]: I1209 11:58:37.142683 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lmgwg" event={"ID":"9037ed1f-97ad-4793-a98d-51a8507e95f1","Type":"ContainerStarted","Data":"a8f09f43b2200d22e3d03b6c2e45c8b37da55d69f89807900a86ecc8ec3cba14"} Dec 09 11:58:39 crc kubenswrapper[5002]: I1209 11:58:39.165159 5002 generic.go:334] "Generic (PLEG): container finished" podID="9037ed1f-97ad-4793-a98d-51a8507e95f1" containerID="e874ac37363c43ac042059ff380432826e57d3b1dd2131f8cdff08b944657fcc" exitCode=0 Dec 09 11:58:39 crc kubenswrapper[5002]: I1209 11:58:39.165257 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lmgwg" event={"ID":"9037ed1f-97ad-4793-a98d-51a8507e95f1","Type":"ContainerDied","Data":"e874ac37363c43ac042059ff380432826e57d3b1dd2131f8cdff08b944657fcc"} Dec 09 11:58:40 crc kubenswrapper[5002]: I1209 11:58:40.178413 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lmgwg" event={"ID":"9037ed1f-97ad-4793-a98d-51a8507e95f1","Type":"ContainerStarted","Data":"9ffef3dfcbf448f5771eef2c3dc70e1e729ca29041dd14bd997c3a5eb572b02f"} Dec 09 11:58:46 crc kubenswrapper[5002]: I1209 11:58:46.316038 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lmgwg" Dec 09 11:58:46 crc kubenswrapper[5002]: I1209 11:58:46.316697 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lmgwg" Dec 09 11:58:46 crc kubenswrapper[5002]: I1209 11:58:46.385907 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lmgwg" 
Dec 09 11:58:46 crc kubenswrapper[5002]: I1209 11:58:46.415738 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lmgwg" podStartSLOduration=8.857603597 podStartE2EDuration="11.415721405s" podCreationTimestamp="2025-12-09 11:58:35 +0000 UTC" firstStartedPulling="2025-12-09 11:58:37.143757281 +0000 UTC m=+7049.535808372" lastFinishedPulling="2025-12-09 11:58:39.701875099 +0000 UTC m=+7052.093926180" observedRunningTime="2025-12-09 11:58:40.201708807 +0000 UTC m=+7052.593759888" watchObservedRunningTime="2025-12-09 11:58:46.415721405 +0000 UTC m=+7058.807772486" Dec 09 11:58:47 crc kubenswrapper[5002]: I1209 11:58:47.312731 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lmgwg" Dec 09 11:58:49 crc kubenswrapper[5002]: I1209 11:58:49.976306 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lmgwg"] Dec 09 11:58:49 crc kubenswrapper[5002]: I1209 11:58:49.977275 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lmgwg" podUID="9037ed1f-97ad-4793-a98d-51a8507e95f1" containerName="registry-server" containerID="cri-o://9ffef3dfcbf448f5771eef2c3dc70e1e729ca29041dd14bd997c3a5eb572b02f" gracePeriod=2 Dec 09 11:58:50 crc kubenswrapper[5002]: I1209 11:58:50.270574 5002 generic.go:334] "Generic (PLEG): container finished" podID="9037ed1f-97ad-4793-a98d-51a8507e95f1" containerID="9ffef3dfcbf448f5771eef2c3dc70e1e729ca29041dd14bd997c3a5eb572b02f" exitCode=0 Dec 09 11:58:50 crc kubenswrapper[5002]: I1209 11:58:50.270620 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lmgwg" event={"ID":"9037ed1f-97ad-4793-a98d-51a8507e95f1","Type":"ContainerDied","Data":"9ffef3dfcbf448f5771eef2c3dc70e1e729ca29041dd14bd997c3a5eb572b02f"} Dec 09 11:58:50 crc kubenswrapper[5002]: I1209 11:58:50.464577 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lmgwg" Dec 09 11:58:50 crc kubenswrapper[5002]: I1209 11:58:50.574929 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jddkl\" (UniqueName: \"kubernetes.io/projected/9037ed1f-97ad-4793-a98d-51a8507e95f1-kube-api-access-jddkl\") pod \"9037ed1f-97ad-4793-a98d-51a8507e95f1\" (UID: \"9037ed1f-97ad-4793-a98d-51a8507e95f1\") " Dec 09 11:58:50 crc kubenswrapper[5002]: I1209 11:58:50.575054 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9037ed1f-97ad-4793-a98d-51a8507e95f1-utilities\") pod \"9037ed1f-97ad-4793-a98d-51a8507e95f1\" (UID: \"9037ed1f-97ad-4793-a98d-51a8507e95f1\") " Dec 09 11:58:50 crc kubenswrapper[5002]: I1209 11:58:50.575086 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9037ed1f-97ad-4793-a98d-51a8507e95f1-catalog-content\") pod \"9037ed1f-97ad-4793-a98d-51a8507e95f1\" (UID: \"9037ed1f-97ad-4793-a98d-51a8507e95f1\") " Dec 09 11:58:50 crc kubenswrapper[5002]: I1209 11:58:50.576255 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9037ed1f-97ad-4793-a98d-51a8507e95f1-utilities" (OuterVolumeSpecName: "utilities") pod "9037ed1f-97ad-4793-a98d-51a8507e95f1" (UID: "9037ed1f-97ad-4793-a98d-51a8507e95f1"). 
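The pod_startup_latency_tracker entry above reports two durations for redhat-marketplace-lmgwg: podStartE2EDuration, creation to observed running, and podStartSLOduration, the same interval minus the image-pull window. The arithmetic can be redone from the logged timestamps; a sketch (the last digit differs from the logged SLO value because the kubelet subtracts the monotonic m=+... readings rather than the wall-clock ones):

    package main

    import (
    	"fmt"
    	"time"
    )

    func mustParse(s string) time.Time {
    	t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
    	if err != nil {
    		panic(err)
    	}
    	return t
    }

    func main() {
    	// Timestamps copied from the pod_startup_latency_tracker entry above.
    	created := mustParse("2025-12-09 11:58:35 +0000 UTC")
    	firstPull := mustParse("2025-12-09 11:58:37.143757281 +0000 UTC")
    	lastPull := mustParse("2025-12-09 11:58:39.701875099 +0000 UTC")
    	observed := mustParse("2025-12-09 11:58:46.415721405 +0000 UTC")

    	e2e := observed.Sub(created)
    	slo := e2e - lastPull.Sub(firstPull) // E2E minus the image-pull window
    	fmt.Println("podStartE2EDuration:", e2e) // 11.415721405s
    	fmt.Println("podStartSLOduration:", slo) // 8.857603587s (logged: 8.857603597)
    }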
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:58:50 crc kubenswrapper[5002]: I1209 11:58:50.585087 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9037ed1f-97ad-4793-a98d-51a8507e95f1-kube-api-access-jddkl" (OuterVolumeSpecName: "kube-api-access-jddkl") pod "9037ed1f-97ad-4793-a98d-51a8507e95f1" (UID: "9037ed1f-97ad-4793-a98d-51a8507e95f1"). InnerVolumeSpecName "kube-api-access-jddkl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:58:50 crc kubenswrapper[5002]: I1209 11:58:50.604827 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9037ed1f-97ad-4793-a98d-51a8507e95f1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9037ed1f-97ad-4793-a98d-51a8507e95f1" (UID: "9037ed1f-97ad-4793-a98d-51a8507e95f1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:58:50 crc kubenswrapper[5002]: I1209 11:58:50.677144 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jddkl\" (UniqueName: \"kubernetes.io/projected/9037ed1f-97ad-4793-a98d-51a8507e95f1-kube-api-access-jddkl\") on node \"crc\" DevicePath \"\"" Dec 09 11:58:50 crc kubenswrapper[5002]: I1209 11:58:50.677191 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9037ed1f-97ad-4793-a98d-51a8507e95f1-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 11:58:50 crc kubenswrapper[5002]: I1209 11:58:50.677203 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9037ed1f-97ad-4793-a98d-51a8507e95f1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 11:58:51 crc kubenswrapper[5002]: I1209 11:58:51.281580 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lmgwg" event={"ID":"9037ed1f-97ad-4793-a98d-51a8507e95f1","Type":"ContainerDied","Data":"a8f09f43b2200d22e3d03b6c2e45c8b37da55d69f89807900a86ecc8ec3cba14"} Dec 09 11:58:51 crc kubenswrapper[5002]: I1209 11:58:51.281627 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lmgwg" Dec 09 11:58:51 crc kubenswrapper[5002]: I1209 11:58:51.281633 5002 scope.go:117] "RemoveContainer" containerID="9ffef3dfcbf448f5771eef2c3dc70e1e729ca29041dd14bd997c3a5eb572b02f" Dec 09 11:58:51 crc kubenswrapper[5002]: I1209 11:58:51.308450 5002 scope.go:117] "RemoveContainer" containerID="e874ac37363c43ac042059ff380432826e57d3b1dd2131f8cdff08b944657fcc" Dec 09 11:58:51 crc kubenswrapper[5002]: I1209 11:58:51.318581 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lmgwg"] Dec 09 11:58:51 crc kubenswrapper[5002]: I1209 11:58:51.327379 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lmgwg"] Dec 09 11:58:51 crc kubenswrapper[5002]: I1209 11:58:51.332993 5002 scope.go:117] "RemoveContainer" containerID="9d29dfaba9c36792a39aa5a317de5042d19a16117b755e8a020cbacbcee08c44" Dec 09 11:58:52 crc kubenswrapper[5002]: I1209 11:58:52.077624 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9037ed1f-97ad-4793-a98d-51a8507e95f1" path="/var/lib/kubelet/pods/9037ed1f-97ad-4793-a98d-51a8507e95f1/volumes" Dec 09 11:58:53 crc kubenswrapper[5002]: I1209 11:58:53.336317 5002 generic.go:334] "Generic (PLEG): container finished" podID="17cd99c9-1797-4270-9cdb-0589c1797d27" containerID="ff956f6c172e46d4bf8bd46747c0659c0db9544509e7602b0913d5c434524afc" exitCode=0 Dec 09 11:58:53 crc kubenswrapper[5002]: I1209 11:58:53.336383 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-9rspg" event={"ID":"17cd99c9-1797-4270-9cdb-0589c1797d27","Type":"ContainerDied","Data":"ff956f6c172e46d4bf8bd46747c0659c0db9544509e7602b0913d5c434524afc"} Dec 09 11:58:54 crc kubenswrapper[5002]: I1209 11:58:54.799541 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-9rspg" Dec 09 11:58:54 crc kubenswrapper[5002]: I1209 11:58:54.969517 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vz5t\" (UniqueName: \"kubernetes.io/projected/17cd99c9-1797-4270-9cdb-0589c1797d27-kube-api-access-2vz5t\") pod \"17cd99c9-1797-4270-9cdb-0589c1797d27\" (UID: \"17cd99c9-1797-4270-9cdb-0589c1797d27\") " Dec 09 11:58:54 crc kubenswrapper[5002]: I1209 11:58:54.969604 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17cd99c9-1797-4270-9cdb-0589c1797d27-tripleo-cleanup-combined-ca-bundle\") pod \"17cd99c9-1797-4270-9cdb-0589c1797d27\" (UID: \"17cd99c9-1797-4270-9cdb-0589c1797d27\") " Dec 09 11:58:54 crc kubenswrapper[5002]: I1209 11:58:54.969670 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/17cd99c9-1797-4270-9cdb-0589c1797d27-ceph\") pod \"17cd99c9-1797-4270-9cdb-0589c1797d27\" (UID: \"17cd99c9-1797-4270-9cdb-0589c1797d27\") " Dec 09 11:58:54 crc kubenswrapper[5002]: I1209 11:58:54.969735 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17cd99c9-1797-4270-9cdb-0589c1797d27-ssh-key\") pod \"17cd99c9-1797-4270-9cdb-0589c1797d27\" (UID: \"17cd99c9-1797-4270-9cdb-0589c1797d27\") " Dec 09 11:58:54 crc kubenswrapper[5002]: I1209 11:58:54.969899 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17cd99c9-1797-4270-9cdb-0589c1797d27-inventory\") pod \"17cd99c9-1797-4270-9cdb-0589c1797d27\" (UID: \"17cd99c9-1797-4270-9cdb-0589c1797d27\") " Dec 09 11:58:54 crc kubenswrapper[5002]: I1209 11:58:54.975790 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17cd99c9-1797-4270-9cdb-0589c1797d27-kube-api-access-2vz5t" (OuterVolumeSpecName: "kube-api-access-2vz5t") pod "17cd99c9-1797-4270-9cdb-0589c1797d27" (UID: "17cd99c9-1797-4270-9cdb-0589c1797d27"). InnerVolumeSpecName "kube-api-access-2vz5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:58:54 crc kubenswrapper[5002]: I1209 11:58:54.982993 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17cd99c9-1797-4270-9cdb-0589c1797d27-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "17cd99c9-1797-4270-9cdb-0589c1797d27" (UID: "17cd99c9-1797-4270-9cdb-0589c1797d27"). InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:58:54 crc kubenswrapper[5002]: I1209 11:58:54.983044 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17cd99c9-1797-4270-9cdb-0589c1797d27-ceph" (OuterVolumeSpecName: "ceph") pod "17cd99c9-1797-4270-9cdb-0589c1797d27" (UID: "17cd99c9-1797-4270-9cdb-0589c1797d27"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:58:55 crc kubenswrapper[5002]: I1209 11:58:55.000460 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17cd99c9-1797-4270-9cdb-0589c1797d27-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "17cd99c9-1797-4270-9cdb-0589c1797d27" (UID: "17cd99c9-1797-4270-9cdb-0589c1797d27"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:58:55 crc kubenswrapper[5002]: I1209 11:58:55.019996 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17cd99c9-1797-4270-9cdb-0589c1797d27-inventory" (OuterVolumeSpecName: "inventory") pod "17cd99c9-1797-4270-9cdb-0589c1797d27" (UID: "17cd99c9-1797-4270-9cdb-0589c1797d27"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:58:55 crc kubenswrapper[5002]: I1209 11:58:55.073867 5002 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17cd99c9-1797-4270-9cdb-0589c1797d27-inventory\") on node \"crc\" DevicePath \"\"" Dec 09 11:58:55 crc kubenswrapper[5002]: I1209 11:58:55.073907 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vz5t\" (UniqueName: \"kubernetes.io/projected/17cd99c9-1797-4270-9cdb-0589c1797d27-kube-api-access-2vz5t\") on node \"crc\" DevicePath \"\"" Dec 09 11:58:55 crc kubenswrapper[5002]: I1209 11:58:55.073930 5002 reconciler_common.go:293] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17cd99c9-1797-4270-9cdb-0589c1797d27-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:58:55 crc kubenswrapper[5002]: I1209 11:58:55.073943 5002 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/17cd99c9-1797-4270-9cdb-0589c1797d27-ceph\") on node \"crc\" DevicePath \"\"" Dec 09 11:58:55 crc kubenswrapper[5002]: I1209 11:58:55.073959 5002 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17cd99c9-1797-4270-9cdb-0589c1797d27-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 09 11:58:55 crc kubenswrapper[5002]: I1209 11:58:55.379051 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-9rspg" event={"ID":"17cd99c9-1797-4270-9cdb-0589c1797d27","Type":"ContainerDied","Data":"82d036f8296354d45f75d96b565605300536dd4cc977c1449062021f7ad7898a"} Dec 09 11:58:55 crc kubenswrapper[5002]: I1209 11:58:55.379101 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82d036f8296354d45f75d96b565605300536dd4cc977c1449062021f7ad7898a" Dec 09 11:58:55 crc kubenswrapper[5002]: I1209 11:58:55.379161 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-9rspg" Dec 09 11:59:00 crc kubenswrapper[5002]: I1209 11:59:00.900371 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-lhhxc"] Dec 09 11:59:00 crc kubenswrapper[5002]: E1209 11:59:00.901181 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9037ed1f-97ad-4793-a98d-51a8507e95f1" containerName="registry-server" Dec 09 11:59:00 crc kubenswrapper[5002]: I1209 11:59:00.901194 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="9037ed1f-97ad-4793-a98d-51a8507e95f1" containerName="registry-server" Dec 09 11:59:00 crc kubenswrapper[5002]: E1209 11:59:00.901219 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17cd99c9-1797-4270-9cdb-0589c1797d27" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Dec 09 11:59:00 crc kubenswrapper[5002]: I1209 11:59:00.901226 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="17cd99c9-1797-4270-9cdb-0589c1797d27" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Dec 09 11:59:00 crc kubenswrapper[5002]: E1209 11:59:00.901239 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9037ed1f-97ad-4793-a98d-51a8507e95f1" containerName="extract-content" Dec 09 11:59:00 crc kubenswrapper[5002]: I1209 11:59:00.901245 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="9037ed1f-97ad-4793-a98d-51a8507e95f1" containerName="extract-content" Dec 09 11:59:00 crc kubenswrapper[5002]: E1209 11:59:00.901263 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9037ed1f-97ad-4793-a98d-51a8507e95f1" containerName="extract-utilities" Dec 09 11:59:00 crc kubenswrapper[5002]: I1209 11:59:00.901268 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="9037ed1f-97ad-4793-a98d-51a8507e95f1" containerName="extract-utilities" Dec 09 11:59:00 crc kubenswrapper[5002]: I1209 11:59:00.901459 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="17cd99c9-1797-4270-9cdb-0589c1797d27" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Dec 09 11:59:00 crc kubenswrapper[5002]: I1209 11:59:00.901478 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="9037ed1f-97ad-4793-a98d-51a8507e95f1" containerName="registry-server" Dec 09 11:59:00 crc kubenswrapper[5002]: I1209 11:59:00.902190 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-lhhxc" Dec 09 11:59:00 crc kubenswrapper[5002]: I1209 11:59:00.906249 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ngftr" Dec 09 11:59:00 crc kubenswrapper[5002]: I1209 11:59:00.906271 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 09 11:59:00 crc kubenswrapper[5002]: I1209 11:59:00.906646 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 09 11:59:00 crc kubenswrapper[5002]: I1209 11:59:00.906687 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 09 11:59:00 crc kubenswrapper[5002]: I1209 11:59:00.930482 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6-inventory\") pod \"bootstrap-openstack-openstack-cell1-lhhxc\" (UID: \"ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6\") " pod="openstack/bootstrap-openstack-openstack-cell1-lhhxc" Dec 09 11:59:00 crc kubenswrapper[5002]: I1209 11:59:00.930705 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6-ceph\") pod \"bootstrap-openstack-openstack-cell1-lhhxc\" (UID: \"ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6\") " pod="openstack/bootstrap-openstack-openstack-cell1-lhhxc" Dec 09 11:59:00 crc kubenswrapper[5002]: I1209 11:59:00.930833 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr284\" (UniqueName: \"kubernetes.io/projected/ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6-kube-api-access-pr284\") pod \"bootstrap-openstack-openstack-cell1-lhhxc\" (UID: \"ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6\") " pod="openstack/bootstrap-openstack-openstack-cell1-lhhxc" Dec 09 11:59:00 crc kubenswrapper[5002]: I1209 11:59:00.930943 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-lhhxc\" (UID: \"ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6\") " pod="openstack/bootstrap-openstack-openstack-cell1-lhhxc" Dec 09 11:59:00 crc kubenswrapper[5002]: I1209 11:59:00.931373 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-lhhxc\" (UID: \"ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6\") " pod="openstack/bootstrap-openstack-openstack-cell1-lhhxc" Dec 09 11:59:00 crc kubenswrapper[5002]: I1209 11:59:00.946443 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-lhhxc"] Dec 09 11:59:01 crc kubenswrapper[5002]: I1209 11:59:01.035893 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6-inventory\") pod \"bootstrap-openstack-openstack-cell1-lhhxc\" (UID: \"ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6\") " pod="openstack/bootstrap-openstack-openstack-cell1-lhhxc" Dec 09 11:59:01 crc kubenswrapper[5002]: 
I1209 11:59:01.035944 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6-ceph\") pod \"bootstrap-openstack-openstack-cell1-lhhxc\" (UID: \"ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6\") " pod="openstack/bootstrap-openstack-openstack-cell1-lhhxc" Dec 09 11:59:01 crc kubenswrapper[5002]: I1209 11:59:01.035982 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pr284\" (UniqueName: \"kubernetes.io/projected/ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6-kube-api-access-pr284\") pod \"bootstrap-openstack-openstack-cell1-lhhxc\" (UID: \"ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6\") " pod="openstack/bootstrap-openstack-openstack-cell1-lhhxc" Dec 09 11:59:01 crc kubenswrapper[5002]: I1209 11:59:01.036333 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-lhhxc\" (UID: \"ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6\") " pod="openstack/bootstrap-openstack-openstack-cell1-lhhxc" Dec 09 11:59:01 crc kubenswrapper[5002]: I1209 11:59:01.036611 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-lhhxc\" (UID: \"ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6\") " pod="openstack/bootstrap-openstack-openstack-cell1-lhhxc" Dec 09 11:59:01 crc kubenswrapper[5002]: I1209 11:59:01.041552 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6-inventory\") pod \"bootstrap-openstack-openstack-cell1-lhhxc\" (UID: \"ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6\") " pod="openstack/bootstrap-openstack-openstack-cell1-lhhxc" Dec 09 11:59:01 crc kubenswrapper[5002]: I1209 11:59:01.042084 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-lhhxc\" (UID: \"ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6\") " pod="openstack/bootstrap-openstack-openstack-cell1-lhhxc" Dec 09 11:59:01 crc kubenswrapper[5002]: I1209 11:59:01.043790 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-lhhxc\" (UID: \"ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6\") " pod="openstack/bootstrap-openstack-openstack-cell1-lhhxc" Dec 09 11:59:01 crc kubenswrapper[5002]: I1209 11:59:01.050971 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6-ceph\") pod \"bootstrap-openstack-openstack-cell1-lhhxc\" (UID: \"ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6\") " pod="openstack/bootstrap-openstack-openstack-cell1-lhhxc" Dec 09 11:59:01 crc kubenswrapper[5002]: I1209 11:59:01.056072 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pr284\" (UniqueName: \"kubernetes.io/projected/ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6-kube-api-access-pr284\") pod \"bootstrap-openstack-openstack-cell1-lhhxc\" (UID: 
\"ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6\") " pod="openstack/bootstrap-openstack-openstack-cell1-lhhxc" Dec 09 11:59:01 crc kubenswrapper[5002]: I1209 11:59:01.242844 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-lhhxc" Dec 09 11:59:01 crc kubenswrapper[5002]: I1209 11:59:01.822263 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-lhhxc"] Dec 09 11:59:02 crc kubenswrapper[5002]: I1209 11:59:02.464052 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-lhhxc" event={"ID":"ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6","Type":"ContainerStarted","Data":"1d22b94f8b85f5b40e7a2e4f15c408dbc2c313792b231090de8963c72ec1d5e2"} Dec 09 11:59:02 crc kubenswrapper[5002]: I1209 11:59:02.464987 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-lhhxc" event={"ID":"ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6","Type":"ContainerStarted","Data":"26ed2d44aca7afcddfe98c93dc5bce9e40b8490c95e85b427a6a8303b4751cd7"} Dec 09 11:59:02 crc kubenswrapper[5002]: I1209 11:59:02.483136 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-cell1-lhhxc" podStartSLOduration=2.113099729 podStartE2EDuration="2.48311663s" podCreationTimestamp="2025-12-09 11:59:00 +0000 UTC" firstStartedPulling="2025-12-09 11:59:01.830401557 +0000 UTC m=+7074.222452638" lastFinishedPulling="2025-12-09 11:59:02.200418448 +0000 UTC m=+7074.592469539" observedRunningTime="2025-12-09 11:59:02.480857069 +0000 UTC m=+7074.872908180" watchObservedRunningTime="2025-12-09 11:59:02.48311663 +0000 UTC m=+7074.875167711" Dec 09 12:00:00 crc kubenswrapper[5002]: I1209 12:00:00.155603 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421360-lqkxh"] Dec 09 12:00:00 crc kubenswrapper[5002]: I1209 12:00:00.158044 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421360-lqkxh" Dec 09 12:00:00 crc kubenswrapper[5002]: I1209 12:00:00.167918 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 09 12:00:00 crc kubenswrapper[5002]: I1209 12:00:00.168174 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 09 12:00:00 crc kubenswrapper[5002]: I1209 12:00:00.177366 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421360-lqkxh"] Dec 09 12:00:00 crc kubenswrapper[5002]: I1209 12:00:00.222957 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8e8fe753-2f2c-4a3d-9287-6854676262e7-secret-volume\") pod \"collect-profiles-29421360-lqkxh\" (UID: \"8e8fe753-2f2c-4a3d-9287-6854676262e7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421360-lqkxh" Dec 09 12:00:00 crc kubenswrapper[5002]: I1209 12:00:00.223004 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9m5c4\" (UniqueName: \"kubernetes.io/projected/8e8fe753-2f2c-4a3d-9287-6854676262e7-kube-api-access-9m5c4\") pod \"collect-profiles-29421360-lqkxh\" (UID: \"8e8fe753-2f2c-4a3d-9287-6854676262e7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421360-lqkxh" Dec 09 12:00:00 crc kubenswrapper[5002]: I1209 12:00:00.223399 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8e8fe753-2f2c-4a3d-9287-6854676262e7-config-volume\") pod \"collect-profiles-29421360-lqkxh\" (UID: \"8e8fe753-2f2c-4a3d-9287-6854676262e7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421360-lqkxh" Dec 09 12:00:00 crc kubenswrapper[5002]: I1209 12:00:00.325585 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8e8fe753-2f2c-4a3d-9287-6854676262e7-config-volume\") pod \"collect-profiles-29421360-lqkxh\" (UID: \"8e8fe753-2f2c-4a3d-9287-6854676262e7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421360-lqkxh" Dec 09 12:00:00 crc kubenswrapper[5002]: I1209 12:00:00.326687 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8e8fe753-2f2c-4a3d-9287-6854676262e7-config-volume\") pod \"collect-profiles-29421360-lqkxh\" (UID: \"8e8fe753-2f2c-4a3d-9287-6854676262e7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421360-lqkxh" Dec 09 12:00:00 crc kubenswrapper[5002]: I1209 12:00:00.326956 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9m5c4\" (UniqueName: \"kubernetes.io/projected/8e8fe753-2f2c-4a3d-9287-6854676262e7-kube-api-access-9m5c4\") pod \"collect-profiles-29421360-lqkxh\" (UID: \"8e8fe753-2f2c-4a3d-9287-6854676262e7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421360-lqkxh" Dec 09 12:00:00 crc kubenswrapper[5002]: I1209 12:00:00.326986 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8e8fe753-2f2c-4a3d-9287-6854676262e7-secret-volume\") pod 
\"collect-profiles-29421360-lqkxh\" (UID: \"8e8fe753-2f2c-4a3d-9287-6854676262e7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421360-lqkxh" Dec 09 12:00:00 crc kubenswrapper[5002]: I1209 12:00:00.334215 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8e8fe753-2f2c-4a3d-9287-6854676262e7-secret-volume\") pod \"collect-profiles-29421360-lqkxh\" (UID: \"8e8fe753-2f2c-4a3d-9287-6854676262e7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421360-lqkxh" Dec 09 12:00:00 crc kubenswrapper[5002]: I1209 12:00:00.357903 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9m5c4\" (UniqueName: \"kubernetes.io/projected/8e8fe753-2f2c-4a3d-9287-6854676262e7-kube-api-access-9m5c4\") pod \"collect-profiles-29421360-lqkxh\" (UID: \"8e8fe753-2f2c-4a3d-9287-6854676262e7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421360-lqkxh" Dec 09 12:00:00 crc kubenswrapper[5002]: I1209 12:00:00.486150 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421360-lqkxh" Dec 09 12:00:00 crc kubenswrapper[5002]: I1209 12:00:00.942919 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421360-lqkxh"] Dec 09 12:00:00 crc kubenswrapper[5002]: I1209 12:00:00.981478 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421360-lqkxh" event={"ID":"8e8fe753-2f2c-4a3d-9287-6854676262e7","Type":"ContainerStarted","Data":"ac961e2a7b6704f77702c8fec028feac4a5e832d1db2db0e95d77878d0fe7b5e"} Dec 09 12:00:01 crc kubenswrapper[5002]: I1209 12:00:01.992412 5002 generic.go:334] "Generic (PLEG): container finished" podID="8e8fe753-2f2c-4a3d-9287-6854676262e7" containerID="9dd0b6e228adb197f9ddcf57ceee5e17f5be7e55833e7c886bfd14d89e09a9ae" exitCode=0 Dec 09 12:00:01 crc kubenswrapper[5002]: I1209 12:00:01.992520 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421360-lqkxh" event={"ID":"8e8fe753-2f2c-4a3d-9287-6854676262e7","Type":"ContainerDied","Data":"9dd0b6e228adb197f9ddcf57ceee5e17f5be7e55833e7c886bfd14d89e09a9ae"} Dec 09 12:00:03 crc kubenswrapper[5002]: I1209 12:00:03.376509 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421360-lqkxh" Dec 09 12:00:03 crc kubenswrapper[5002]: I1209 12:00:03.389705 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9m5c4\" (UniqueName: \"kubernetes.io/projected/8e8fe753-2f2c-4a3d-9287-6854676262e7-kube-api-access-9m5c4\") pod \"8e8fe753-2f2c-4a3d-9287-6854676262e7\" (UID: \"8e8fe753-2f2c-4a3d-9287-6854676262e7\") " Dec 09 12:00:03 crc kubenswrapper[5002]: I1209 12:00:03.389903 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8e8fe753-2f2c-4a3d-9287-6854676262e7-secret-volume\") pod \"8e8fe753-2f2c-4a3d-9287-6854676262e7\" (UID: \"8e8fe753-2f2c-4a3d-9287-6854676262e7\") " Dec 09 12:00:03 crc kubenswrapper[5002]: I1209 12:00:03.389950 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8e8fe753-2f2c-4a3d-9287-6854676262e7-config-volume\") pod \"8e8fe753-2f2c-4a3d-9287-6854676262e7\" (UID: \"8e8fe753-2f2c-4a3d-9287-6854676262e7\") " Dec 09 12:00:03 crc kubenswrapper[5002]: I1209 12:00:03.390743 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e8fe753-2f2c-4a3d-9287-6854676262e7-config-volume" (OuterVolumeSpecName: "config-volume") pod "8e8fe753-2f2c-4a3d-9287-6854676262e7" (UID: "8e8fe753-2f2c-4a3d-9287-6854676262e7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 12:00:03 crc kubenswrapper[5002]: I1209 12:00:03.400880 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e8fe753-2f2c-4a3d-9287-6854676262e7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8e8fe753-2f2c-4a3d-9287-6854676262e7" (UID: "8e8fe753-2f2c-4a3d-9287-6854676262e7"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:00:03 crc kubenswrapper[5002]: I1209 12:00:03.401066 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e8fe753-2f2c-4a3d-9287-6854676262e7-kube-api-access-9m5c4" (OuterVolumeSpecName: "kube-api-access-9m5c4") pod "8e8fe753-2f2c-4a3d-9287-6854676262e7" (UID: "8e8fe753-2f2c-4a3d-9287-6854676262e7"). InnerVolumeSpecName "kube-api-access-9m5c4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:00:03 crc kubenswrapper[5002]: I1209 12:00:03.493449 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9m5c4\" (UniqueName: \"kubernetes.io/projected/8e8fe753-2f2c-4a3d-9287-6854676262e7-kube-api-access-9m5c4\") on node \"crc\" DevicePath \"\"" Dec 09 12:00:03 crc kubenswrapper[5002]: I1209 12:00:03.493480 5002 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8e8fe753-2f2c-4a3d-9287-6854676262e7-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 09 12:00:03 crc kubenswrapper[5002]: I1209 12:00:03.493490 5002 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8e8fe753-2f2c-4a3d-9287-6854676262e7-config-volume\") on node \"crc\" DevicePath \"\"" Dec 09 12:00:04 crc kubenswrapper[5002]: I1209 12:00:04.017453 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421360-lqkxh" event={"ID":"8e8fe753-2f2c-4a3d-9287-6854676262e7","Type":"ContainerDied","Data":"ac961e2a7b6704f77702c8fec028feac4a5e832d1db2db0e95d77878d0fe7b5e"} Dec 09 12:00:04 crc kubenswrapper[5002]: I1209 12:00:04.017751 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac961e2a7b6704f77702c8fec028feac4a5e832d1db2db0e95d77878d0fe7b5e" Dec 09 12:00:04 crc kubenswrapper[5002]: I1209 12:00:04.017545 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421360-lqkxh" Dec 09 12:00:04 crc kubenswrapper[5002]: I1209 12:00:04.468542 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421315-pxbj9"] Dec 09 12:00:04 crc kubenswrapper[5002]: I1209 12:00:04.477654 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421315-pxbj9"] Dec 09 12:00:06 crc kubenswrapper[5002]: I1209 12:00:06.074054 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60ef1b3d-7380-4ad4-83c5-783d87789929" path="/var/lib/kubelet/pods/60ef1b3d-7380-4ad4-83c5-783d87789929/volumes" Dec 09 12:00:09 crc kubenswrapper[5002]: I1209 12:00:09.916367 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vpqj4"] Dec 09 12:00:09 crc kubenswrapper[5002]: E1209 12:00:09.917447 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e8fe753-2f2c-4a3d-9287-6854676262e7" containerName="collect-profiles" Dec 09 12:00:09 crc kubenswrapper[5002]: I1209 12:00:09.917463 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e8fe753-2f2c-4a3d-9287-6854676262e7" containerName="collect-profiles" Dec 09 12:00:09 crc kubenswrapper[5002]: I1209 12:00:09.917774 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e8fe753-2f2c-4a3d-9287-6854676262e7" containerName="collect-profiles" Dec 09 12:00:09 crc kubenswrapper[5002]: I1209 12:00:09.919935 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vpqj4" Dec 09 12:00:09 crc kubenswrapper[5002]: I1209 12:00:09.944760 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd438039-c1ce-47f5-8626-a7067e0593ec-utilities\") pod \"certified-operators-vpqj4\" (UID: \"dd438039-c1ce-47f5-8626-a7067e0593ec\") " pod="openshift-marketplace/certified-operators-vpqj4" Dec 09 12:00:09 crc kubenswrapper[5002]: I1209 12:00:09.945204 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4445h\" (UniqueName: \"kubernetes.io/projected/dd438039-c1ce-47f5-8626-a7067e0593ec-kube-api-access-4445h\") pod \"certified-operators-vpqj4\" (UID: \"dd438039-c1ce-47f5-8626-a7067e0593ec\") " pod="openshift-marketplace/certified-operators-vpqj4" Dec 09 12:00:09 crc kubenswrapper[5002]: I1209 12:00:09.945304 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd438039-c1ce-47f5-8626-a7067e0593ec-catalog-content\") pod \"certified-operators-vpqj4\" (UID: \"dd438039-c1ce-47f5-8626-a7067e0593ec\") " pod="openshift-marketplace/certified-operators-vpqj4" Dec 09 12:00:09 crc kubenswrapper[5002]: I1209 12:00:09.948067 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vpqj4"] Dec 09 12:00:10 crc kubenswrapper[5002]: I1209 12:00:10.047524 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4445h\" (UniqueName: \"kubernetes.io/projected/dd438039-c1ce-47f5-8626-a7067e0593ec-kube-api-access-4445h\") pod \"certified-operators-vpqj4\" (UID: \"dd438039-c1ce-47f5-8626-a7067e0593ec\") " pod="openshift-marketplace/certified-operators-vpqj4" Dec 09 12:00:10 crc kubenswrapper[5002]: I1209 12:00:10.047586 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd438039-c1ce-47f5-8626-a7067e0593ec-catalog-content\") pod \"certified-operators-vpqj4\" (UID: \"dd438039-c1ce-47f5-8626-a7067e0593ec\") " pod="openshift-marketplace/certified-operators-vpqj4" Dec 09 12:00:10 crc kubenswrapper[5002]: I1209 12:00:10.047668 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd438039-c1ce-47f5-8626-a7067e0593ec-utilities\") pod \"certified-operators-vpqj4\" (UID: \"dd438039-c1ce-47f5-8626-a7067e0593ec\") " pod="openshift-marketplace/certified-operators-vpqj4" Dec 09 12:00:10 crc kubenswrapper[5002]: I1209 12:00:10.048146 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd438039-c1ce-47f5-8626-a7067e0593ec-catalog-content\") pod \"certified-operators-vpqj4\" (UID: \"dd438039-c1ce-47f5-8626-a7067e0593ec\") " pod="openshift-marketplace/certified-operators-vpqj4" Dec 09 12:00:10 crc kubenswrapper[5002]: I1209 12:00:10.048266 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd438039-c1ce-47f5-8626-a7067e0593ec-utilities\") pod \"certified-operators-vpqj4\" (UID: \"dd438039-c1ce-47f5-8626-a7067e0593ec\") " pod="openshift-marketplace/certified-operators-vpqj4" Dec 09 12:00:10 crc kubenswrapper[5002]: I1209 12:00:10.073475 5002 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-4445h\" (UniqueName: \"kubernetes.io/projected/dd438039-c1ce-47f5-8626-a7067e0593ec-kube-api-access-4445h\") pod \"certified-operators-vpqj4\" (UID: \"dd438039-c1ce-47f5-8626-a7067e0593ec\") " pod="openshift-marketplace/certified-operators-vpqj4" Dec 09 12:00:10 crc kubenswrapper[5002]: I1209 12:00:10.264502 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vpqj4" Dec 09 12:00:10 crc kubenswrapper[5002]: I1209 12:00:10.812230 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vpqj4"] Dec 09 12:00:11 crc kubenswrapper[5002]: I1209 12:00:11.097134 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vpqj4" event={"ID":"dd438039-c1ce-47f5-8626-a7067e0593ec","Type":"ContainerStarted","Data":"c1100cd1456c99938e1f45086ac2402de1e93be7e17cc84a901f6df72609bcbb"} Dec 09 12:00:11 crc kubenswrapper[5002]: I1209 12:00:11.097184 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vpqj4" event={"ID":"dd438039-c1ce-47f5-8626-a7067e0593ec","Type":"ContainerStarted","Data":"5ab4585f9a2cb92d06dc958be97358fa75e4855dfba3caaedc89caa253b1c573"} Dec 09 12:00:11 crc kubenswrapper[5002]: I1209 12:00:11.100756 5002 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 12:00:12 crc kubenswrapper[5002]: I1209 12:00:12.109616 5002 generic.go:334] "Generic (PLEG): container finished" podID="dd438039-c1ce-47f5-8626-a7067e0593ec" containerID="c1100cd1456c99938e1f45086ac2402de1e93be7e17cc84a901f6df72609bcbb" exitCode=0 Dec 09 12:00:12 crc kubenswrapper[5002]: I1209 12:00:12.109677 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vpqj4" event={"ID":"dd438039-c1ce-47f5-8626-a7067e0593ec","Type":"ContainerDied","Data":"c1100cd1456c99938e1f45086ac2402de1e93be7e17cc84a901f6df72609bcbb"} Dec 09 12:00:12 crc kubenswrapper[5002]: I1209 12:00:12.110036 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vpqj4" event={"ID":"dd438039-c1ce-47f5-8626-a7067e0593ec","Type":"ContainerStarted","Data":"b3d6e42c0726679d60f9c22eabac6ecccb3fa91bcac3adb7105c4767b413a8ba"} Dec 09 12:00:13 crc kubenswrapper[5002]: I1209 12:00:13.128641 5002 generic.go:334] "Generic (PLEG): container finished" podID="dd438039-c1ce-47f5-8626-a7067e0593ec" containerID="b3d6e42c0726679d60f9c22eabac6ecccb3fa91bcac3adb7105c4767b413a8ba" exitCode=0 Dec 09 12:00:13 crc kubenswrapper[5002]: I1209 12:00:13.128793 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vpqj4" event={"ID":"dd438039-c1ce-47f5-8626-a7067e0593ec","Type":"ContainerDied","Data":"b3d6e42c0726679d60f9c22eabac6ecccb3fa91bcac3adb7105c4767b413a8ba"} Dec 09 12:00:14 crc kubenswrapper[5002]: I1209 12:00:14.140007 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vpqj4" event={"ID":"dd438039-c1ce-47f5-8626-a7067e0593ec","Type":"ContainerStarted","Data":"47527bc7a7064455b3619435d335e612e30d12198c55f84f36fce945e0257833"} Dec 09 12:00:14 crc kubenswrapper[5002]: I1209 12:00:14.169607 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vpqj4" podStartSLOduration=2.720465681 podStartE2EDuration="5.169584738s" 
podCreationTimestamp="2025-12-09 12:00:09 +0000 UTC" firstStartedPulling="2025-12-09 12:00:11.100507505 +0000 UTC m=+7143.492558586" lastFinishedPulling="2025-12-09 12:00:13.549626552 +0000 UTC m=+7145.941677643" observedRunningTime="2025-12-09 12:00:14.166067924 +0000 UTC m=+7146.558119005" watchObservedRunningTime="2025-12-09 12:00:14.169584738 +0000 UTC m=+7146.561635819" Dec 09 12:00:20 crc kubenswrapper[5002]: I1209 12:00:20.265493 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vpqj4" Dec 09 12:00:20 crc kubenswrapper[5002]: I1209 12:00:20.267434 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vpqj4" Dec 09 12:00:20 crc kubenswrapper[5002]: I1209 12:00:20.331119 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vpqj4" Dec 09 12:00:21 crc kubenswrapper[5002]: I1209 12:00:21.263945 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vpqj4" Dec 09 12:00:21 crc kubenswrapper[5002]: I1209 12:00:21.319100 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vpqj4"] Dec 09 12:00:23 crc kubenswrapper[5002]: I1209 12:00:23.235127 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vpqj4" podUID="dd438039-c1ce-47f5-8626-a7067e0593ec" containerName="registry-server" containerID="cri-o://47527bc7a7064455b3619435d335e612e30d12198c55f84f36fce945e0257833" gracePeriod=2 Dec 09 12:00:24 crc kubenswrapper[5002]: I1209 12:00:24.244583 5002 generic.go:334] "Generic (PLEG): container finished" podID="dd438039-c1ce-47f5-8626-a7067e0593ec" containerID="47527bc7a7064455b3619435d335e612e30d12198c55f84f36fce945e0257833" exitCode=0 Dec 09 12:00:24 crc kubenswrapper[5002]: I1209 12:00:24.244673 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vpqj4" event={"ID":"dd438039-c1ce-47f5-8626-a7067e0593ec","Type":"ContainerDied","Data":"47527bc7a7064455b3619435d335e612e30d12198c55f84f36fce945e0257833"} Dec 09 12:00:24 crc kubenswrapper[5002]: I1209 12:00:24.244970 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vpqj4" event={"ID":"dd438039-c1ce-47f5-8626-a7067e0593ec","Type":"ContainerDied","Data":"5ab4585f9a2cb92d06dc958be97358fa75e4855dfba3caaedc89caa253b1c573"} Dec 09 12:00:24 crc kubenswrapper[5002]: I1209 12:00:24.244993 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ab4585f9a2cb92d06dc958be97358fa75e4855dfba3caaedc89caa253b1c573" Dec 09 12:00:24 crc kubenswrapper[5002]: I1209 12:00:24.256881 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vpqj4" Dec 09 12:00:24 crc kubenswrapper[5002]: I1209 12:00:24.426362 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4445h\" (UniqueName: \"kubernetes.io/projected/dd438039-c1ce-47f5-8626-a7067e0593ec-kube-api-access-4445h\") pod \"dd438039-c1ce-47f5-8626-a7067e0593ec\" (UID: \"dd438039-c1ce-47f5-8626-a7067e0593ec\") " Dec 09 12:00:24 crc kubenswrapper[5002]: I1209 12:00:24.426489 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd438039-c1ce-47f5-8626-a7067e0593ec-utilities\") pod \"dd438039-c1ce-47f5-8626-a7067e0593ec\" (UID: \"dd438039-c1ce-47f5-8626-a7067e0593ec\") " Dec 09 12:00:24 crc kubenswrapper[5002]: I1209 12:00:24.426627 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd438039-c1ce-47f5-8626-a7067e0593ec-catalog-content\") pod \"dd438039-c1ce-47f5-8626-a7067e0593ec\" (UID: \"dd438039-c1ce-47f5-8626-a7067e0593ec\") " Dec 09 12:00:24 crc kubenswrapper[5002]: I1209 12:00:24.428151 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd438039-c1ce-47f5-8626-a7067e0593ec-utilities" (OuterVolumeSpecName: "utilities") pod "dd438039-c1ce-47f5-8626-a7067e0593ec" (UID: "dd438039-c1ce-47f5-8626-a7067e0593ec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:00:24 crc kubenswrapper[5002]: I1209 12:00:24.433182 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd438039-c1ce-47f5-8626-a7067e0593ec-kube-api-access-4445h" (OuterVolumeSpecName: "kube-api-access-4445h") pod "dd438039-c1ce-47f5-8626-a7067e0593ec" (UID: "dd438039-c1ce-47f5-8626-a7067e0593ec"). InnerVolumeSpecName "kube-api-access-4445h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:00:24 crc kubenswrapper[5002]: I1209 12:00:24.492312 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd438039-c1ce-47f5-8626-a7067e0593ec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dd438039-c1ce-47f5-8626-a7067e0593ec" (UID: "dd438039-c1ce-47f5-8626-a7067e0593ec"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:00:24 crc kubenswrapper[5002]: I1209 12:00:24.529697 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4445h\" (UniqueName: \"kubernetes.io/projected/dd438039-c1ce-47f5-8626-a7067e0593ec-kube-api-access-4445h\") on node \"crc\" DevicePath \"\"" Dec 09 12:00:24 crc kubenswrapper[5002]: I1209 12:00:24.529911 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd438039-c1ce-47f5-8626-a7067e0593ec-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 12:00:24 crc kubenswrapper[5002]: I1209 12:00:24.529976 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd438039-c1ce-47f5-8626-a7067e0593ec-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 12:00:25 crc kubenswrapper[5002]: I1209 12:00:25.256287 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vpqj4" Dec 09 12:00:25 crc kubenswrapper[5002]: I1209 12:00:25.310675 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vpqj4"] Dec 09 12:00:25 crc kubenswrapper[5002]: I1209 12:00:25.321854 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vpqj4"] Dec 09 12:00:26 crc kubenswrapper[5002]: I1209 12:00:26.079607 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd438039-c1ce-47f5-8626-a7067e0593ec" path="/var/lib/kubelet/pods/dd438039-c1ce-47f5-8626-a7067e0593ec/volumes" Dec 09 12:00:37 crc kubenswrapper[5002]: I1209 12:00:37.965166 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:00:37 crc kubenswrapper[5002]: I1209 12:00:37.965635 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:00:56 crc kubenswrapper[5002]: I1209 12:00:56.326521 5002 scope.go:117] "RemoveContainer" containerID="b74609cd71d8288e561ff40f177b861af5d648570db5eb2c8e3bcde13287198d" Dec 09 12:01:00 crc kubenswrapper[5002]: I1209 12:01:00.152344 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29421361-mhxrg"] Dec 09 12:01:00 crc kubenswrapper[5002]: E1209 12:01:00.153422 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd438039-c1ce-47f5-8626-a7067e0593ec" containerName="registry-server" Dec 09 12:01:00 crc kubenswrapper[5002]: I1209 12:01:00.153438 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd438039-c1ce-47f5-8626-a7067e0593ec" containerName="registry-server" Dec 09 12:01:00 crc kubenswrapper[5002]: E1209 12:01:00.153471 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd438039-c1ce-47f5-8626-a7067e0593ec" containerName="extract-content" Dec 09 12:01:00 crc kubenswrapper[5002]: I1209 12:01:00.153477 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd438039-c1ce-47f5-8626-a7067e0593ec" containerName="extract-content" Dec 09 12:01:00 crc kubenswrapper[5002]: E1209 12:01:00.153493 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd438039-c1ce-47f5-8626-a7067e0593ec" containerName="extract-utilities" Dec 09 12:01:00 crc kubenswrapper[5002]: I1209 12:01:00.153500 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd438039-c1ce-47f5-8626-a7067e0593ec" containerName="extract-utilities" Dec 09 12:01:00 crc kubenswrapper[5002]: I1209 12:01:00.153694 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd438039-c1ce-47f5-8626-a7067e0593ec" containerName="registry-server" Dec 09 12:01:00 crc kubenswrapper[5002]: I1209 12:01:00.154499 5002 util.go:30] "No sandbox for pod can be found. 
Dec 09 12:01:00 crc kubenswrapper[5002]: I1209 12:01:00.179420 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29421361-mhxrg"]
Dec 09 12:01:00 crc kubenswrapper[5002]: I1209 12:01:00.314830 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl9bl\" (UniqueName: \"kubernetes.io/projected/905f5826-4f4a-4ec8-8c82-b109a32cb9bf-kube-api-access-sl9bl\") pod \"keystone-cron-29421361-mhxrg\" (UID: \"905f5826-4f4a-4ec8-8c82-b109a32cb9bf\") " pod="openstack/keystone-cron-29421361-mhxrg"
Dec 09 12:01:00 crc kubenswrapper[5002]: I1209 12:01:00.315434 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/905f5826-4f4a-4ec8-8c82-b109a32cb9bf-config-data\") pod \"keystone-cron-29421361-mhxrg\" (UID: \"905f5826-4f4a-4ec8-8c82-b109a32cb9bf\") " pod="openstack/keystone-cron-29421361-mhxrg"
Dec 09 12:01:00 crc kubenswrapper[5002]: I1209 12:01:00.315648 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/905f5826-4f4a-4ec8-8c82-b109a32cb9bf-combined-ca-bundle\") pod \"keystone-cron-29421361-mhxrg\" (UID: \"905f5826-4f4a-4ec8-8c82-b109a32cb9bf\") " pod="openstack/keystone-cron-29421361-mhxrg"
Dec 09 12:01:00 crc kubenswrapper[5002]: I1209 12:01:00.315833 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/905f5826-4f4a-4ec8-8c82-b109a32cb9bf-fernet-keys\") pod \"keystone-cron-29421361-mhxrg\" (UID: \"905f5826-4f4a-4ec8-8c82-b109a32cb9bf\") " pod="openstack/keystone-cron-29421361-mhxrg"
Dec 09 12:01:00 crc kubenswrapper[5002]: I1209 12:01:00.417568 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sl9bl\" (UniqueName: \"kubernetes.io/projected/905f5826-4f4a-4ec8-8c82-b109a32cb9bf-kube-api-access-sl9bl\") pod \"keystone-cron-29421361-mhxrg\" (UID: \"905f5826-4f4a-4ec8-8c82-b109a32cb9bf\") " pod="openstack/keystone-cron-29421361-mhxrg"
Dec 09 12:01:00 crc kubenswrapper[5002]: I1209 12:01:00.417646 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/905f5826-4f4a-4ec8-8c82-b109a32cb9bf-config-data\") pod \"keystone-cron-29421361-mhxrg\" (UID: \"905f5826-4f4a-4ec8-8c82-b109a32cb9bf\") " pod="openstack/keystone-cron-29421361-mhxrg"
Dec 09 12:01:00 crc kubenswrapper[5002]: I1209 12:01:00.417740 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/905f5826-4f4a-4ec8-8c82-b109a32cb9bf-combined-ca-bundle\") pod \"keystone-cron-29421361-mhxrg\" (UID: \"905f5826-4f4a-4ec8-8c82-b109a32cb9bf\") " pod="openstack/keystone-cron-29421361-mhxrg"
Dec 09 12:01:00 crc kubenswrapper[5002]: I1209 12:01:00.417803 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/905f5826-4f4a-4ec8-8c82-b109a32cb9bf-fernet-keys\") pod \"keystone-cron-29421361-mhxrg\" (UID: \"905f5826-4f4a-4ec8-8c82-b109a32cb9bf\") " pod="openstack/keystone-cron-29421361-mhxrg"
Dec 09 12:01:00 crc kubenswrapper[5002]: I1209 12:01:00.423603 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/905f5826-4f4a-4ec8-8c82-b109a32cb9bf-combined-ca-bundle\") pod \"keystone-cron-29421361-mhxrg\" (UID: \"905f5826-4f4a-4ec8-8c82-b109a32cb9bf\") " pod="openstack/keystone-cron-29421361-mhxrg"
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/905f5826-4f4a-4ec8-8c82-b109a32cb9bf-combined-ca-bundle\") pod \"keystone-cron-29421361-mhxrg\" (UID: \"905f5826-4f4a-4ec8-8c82-b109a32cb9bf\") " pod="openstack/keystone-cron-29421361-mhxrg" Dec 09 12:01:00 crc kubenswrapper[5002]: I1209 12:01:00.424250 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/905f5826-4f4a-4ec8-8c82-b109a32cb9bf-fernet-keys\") pod \"keystone-cron-29421361-mhxrg\" (UID: \"905f5826-4f4a-4ec8-8c82-b109a32cb9bf\") " pod="openstack/keystone-cron-29421361-mhxrg" Dec 09 12:01:00 crc kubenswrapper[5002]: I1209 12:01:00.425584 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/905f5826-4f4a-4ec8-8c82-b109a32cb9bf-config-data\") pod \"keystone-cron-29421361-mhxrg\" (UID: \"905f5826-4f4a-4ec8-8c82-b109a32cb9bf\") " pod="openstack/keystone-cron-29421361-mhxrg" Dec 09 12:01:00 crc kubenswrapper[5002]: I1209 12:01:00.435737 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl9bl\" (UniqueName: \"kubernetes.io/projected/905f5826-4f4a-4ec8-8c82-b109a32cb9bf-kube-api-access-sl9bl\") pod \"keystone-cron-29421361-mhxrg\" (UID: \"905f5826-4f4a-4ec8-8c82-b109a32cb9bf\") " pod="openstack/keystone-cron-29421361-mhxrg" Dec 09 12:01:00 crc kubenswrapper[5002]: I1209 12:01:00.482258 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29421361-mhxrg" Dec 09 12:01:00 crc kubenswrapper[5002]: I1209 12:01:00.978180 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29421361-mhxrg"] Dec 09 12:01:01 crc kubenswrapper[5002]: I1209 12:01:01.373106 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29421361-mhxrg" event={"ID":"905f5826-4f4a-4ec8-8c82-b109a32cb9bf","Type":"ContainerStarted","Data":"ade8171636e76328e086eea9dff833c1997a0e6bce5b3a3911713b98323cc6f4"} Dec 09 12:01:01 crc kubenswrapper[5002]: I1209 12:01:01.373435 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29421361-mhxrg" event={"ID":"905f5826-4f4a-4ec8-8c82-b109a32cb9bf","Type":"ContainerStarted","Data":"c42071ee201b572c2c2788ac1bbece0408c06bcb555fa9b758986c57e1f4e120"} Dec 09 12:01:01 crc kubenswrapper[5002]: I1209 12:01:01.390275 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29421361-mhxrg" podStartSLOduration=1.390255069 podStartE2EDuration="1.390255069s" podCreationTimestamp="2025-12-09 12:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 12:01:01.386989111 +0000 UTC m=+7193.779040202" watchObservedRunningTime="2025-12-09 12:01:01.390255069 +0000 UTC m=+7193.782306150" Dec 09 12:01:04 crc kubenswrapper[5002]: I1209 12:01:04.401393 5002 generic.go:334] "Generic (PLEG): container finished" podID="905f5826-4f4a-4ec8-8c82-b109a32cb9bf" containerID="ade8171636e76328e086eea9dff833c1997a0e6bce5b3a3911713b98323cc6f4" exitCode=0 Dec 09 12:01:04 crc kubenswrapper[5002]: I1209 12:01:04.401499 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29421361-mhxrg" event={"ID":"905f5826-4f4a-4ec8-8c82-b109a32cb9bf","Type":"ContainerDied","Data":"ade8171636e76328e086eea9dff833c1997a0e6bce5b3a3911713b98323cc6f4"} Dec 09 12:01:05 crc kubenswrapper[5002]: 
Dec 09 12:01:05 crc kubenswrapper[5002]: I1209 12:01:05.937016 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/905f5826-4f4a-4ec8-8c82-b109a32cb9bf-combined-ca-bundle\") pod \"905f5826-4f4a-4ec8-8c82-b109a32cb9bf\" (UID: \"905f5826-4f4a-4ec8-8c82-b109a32cb9bf\") "
Dec 09 12:01:05 crc kubenswrapper[5002]: I1209 12:01:05.937196 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/905f5826-4f4a-4ec8-8c82-b109a32cb9bf-config-data\") pod \"905f5826-4f4a-4ec8-8c82-b109a32cb9bf\" (UID: \"905f5826-4f4a-4ec8-8c82-b109a32cb9bf\") "
Dec 09 12:01:05 crc kubenswrapper[5002]: I1209 12:01:05.937317 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/905f5826-4f4a-4ec8-8c82-b109a32cb9bf-fernet-keys\") pod \"905f5826-4f4a-4ec8-8c82-b109a32cb9bf\" (UID: \"905f5826-4f4a-4ec8-8c82-b109a32cb9bf\") "
Dec 09 12:01:05 crc kubenswrapper[5002]: I1209 12:01:05.937389 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sl9bl\" (UniqueName: \"kubernetes.io/projected/905f5826-4f4a-4ec8-8c82-b109a32cb9bf-kube-api-access-sl9bl\") pod \"905f5826-4f4a-4ec8-8c82-b109a32cb9bf\" (UID: \"905f5826-4f4a-4ec8-8c82-b109a32cb9bf\") "
Dec 09 12:01:05 crc kubenswrapper[5002]: I1209 12:01:05.943370 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/905f5826-4f4a-4ec8-8c82-b109a32cb9bf-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "905f5826-4f4a-4ec8-8c82-b109a32cb9bf" (UID: "905f5826-4f4a-4ec8-8c82-b109a32cb9bf"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 12:01:05 crc kubenswrapper[5002]: I1209 12:01:05.944384 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/905f5826-4f4a-4ec8-8c82-b109a32cb9bf-kube-api-access-sl9bl" (OuterVolumeSpecName: "kube-api-access-sl9bl") pod "905f5826-4f4a-4ec8-8c82-b109a32cb9bf" (UID: "905f5826-4f4a-4ec8-8c82-b109a32cb9bf"). InnerVolumeSpecName "kube-api-access-sl9bl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 12:01:06 crc kubenswrapper[5002]: I1209 12:01:06.044557 5002 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/905f5826-4f4a-4ec8-8c82-b109a32cb9bf-fernet-keys\") on node \"crc\" DevicePath \"\""
Dec 09 12:01:06 crc kubenswrapper[5002]: I1209 12:01:06.044592 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sl9bl\" (UniqueName: \"kubernetes.io/projected/905f5826-4f4a-4ec8-8c82-b109a32cb9bf-kube-api-access-sl9bl\") on node \"crc\" DevicePath \"\""
Dec 09 12:01:06 crc kubenswrapper[5002]: I1209 12:01:06.059475 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/905f5826-4f4a-4ec8-8c82-b109a32cb9bf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "905f5826-4f4a-4ec8-8c82-b109a32cb9bf" (UID: "905f5826-4f4a-4ec8-8c82-b109a32cb9bf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:01:06 crc kubenswrapper[5002]: I1209 12:01:06.060040 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/905f5826-4f4a-4ec8-8c82-b109a32cb9bf-config-data" (OuterVolumeSpecName: "config-data") pod "905f5826-4f4a-4ec8-8c82-b109a32cb9bf" (UID: "905f5826-4f4a-4ec8-8c82-b109a32cb9bf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:01:06 crc kubenswrapper[5002]: I1209 12:01:06.166225 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/905f5826-4f4a-4ec8-8c82-b109a32cb9bf-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 12:01:06 crc kubenswrapper[5002]: I1209 12:01:06.166280 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/905f5826-4f4a-4ec8-8c82-b109a32cb9bf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 12:01:06 crc kubenswrapper[5002]: I1209 12:01:06.420502 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29421361-mhxrg" event={"ID":"905f5826-4f4a-4ec8-8c82-b109a32cb9bf","Type":"ContainerDied","Data":"c42071ee201b572c2c2788ac1bbece0408c06bcb555fa9b758986c57e1f4e120"} Dec 09 12:01:06 crc kubenswrapper[5002]: I1209 12:01:06.420861 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c42071ee201b572c2c2788ac1bbece0408c06bcb555fa9b758986c57e1f4e120" Dec 09 12:01:06 crc kubenswrapper[5002]: I1209 12:01:06.420563 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29421361-mhxrg" Dec 09 12:01:07 crc kubenswrapper[5002]: I1209 12:01:07.964999 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:01:07 crc kubenswrapper[5002]: I1209 12:01:07.965421 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:01:37 crc kubenswrapper[5002]: I1209 12:01:37.964836 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:01:37 crc kubenswrapper[5002]: I1209 12:01:37.965349 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:01:37 crc kubenswrapper[5002]: I1209 12:01:37.965404 5002 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" Dec 09 12:01:37 crc kubenswrapper[5002]: I1209 12:01:37.966225 5002 kuberuntime_manager.go:1027] "Message for Container 
of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6226b704e39d43eed606b7631dfc787a50a79bc246be5230700c7cb27da7c704"} pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 12:01:37 crc kubenswrapper[5002]: I1209 12:01:37.966279 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" containerID="cri-o://6226b704e39d43eed606b7631dfc787a50a79bc246be5230700c7cb27da7c704" gracePeriod=600 Dec 09 12:01:38 crc kubenswrapper[5002]: I1209 12:01:38.738251 5002 generic.go:334] "Generic (PLEG): container finished" podID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerID="6226b704e39d43eed606b7631dfc787a50a79bc246be5230700c7cb27da7c704" exitCode=0 Dec 09 12:01:38 crc kubenswrapper[5002]: I1209 12:01:38.738332 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerDied","Data":"6226b704e39d43eed606b7631dfc787a50a79bc246be5230700c7cb27da7c704"} Dec 09 12:01:38 crc kubenswrapper[5002]: I1209 12:01:38.738778 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerStarted","Data":"e2529a7a9d52b066d9a89cea25f96d0ac287ad63943546043a02bb681ab4f3e6"} Dec 09 12:01:38 crc kubenswrapper[5002]: I1209 12:01:38.738799 5002 scope.go:117] "RemoveContainer" containerID="743dea4cb011efac40dec51eea5c3a2c001d7c8c55dd0ca3738e5f1789063389" Dec 09 12:02:22 crc kubenswrapper[5002]: I1209 12:02:22.221409 5002 generic.go:334] "Generic (PLEG): container finished" podID="ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6" containerID="1d22b94f8b85f5b40e7a2e4f15c408dbc2c313792b231090de8963c72ec1d5e2" exitCode=0 Dec 09 12:02:22 crc kubenswrapper[5002]: I1209 12:02:22.221539 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-lhhxc" event={"ID":"ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6","Type":"ContainerDied","Data":"1d22b94f8b85f5b40e7a2e4f15c408dbc2c313792b231090de8963c72ec1d5e2"} Dec 09 12:02:23 crc kubenswrapper[5002]: I1209 12:02:23.725544 5002 util.go:48] "No ready sandbox for pod can be found. 
Dec 09 12:02:23 crc kubenswrapper[5002]: I1209 12:02:23.830420 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pr284\" (UniqueName: \"kubernetes.io/projected/ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6-kube-api-access-pr284\") pod \"ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6\" (UID: \"ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6\") "
Dec 09 12:02:23 crc kubenswrapper[5002]: I1209 12:02:23.830483 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6-bootstrap-combined-ca-bundle\") pod \"ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6\" (UID: \"ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6\") "
Dec 09 12:02:23 crc kubenswrapper[5002]: I1209 12:02:23.830683 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6-inventory\") pod \"ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6\" (UID: \"ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6\") "
Dec 09 12:02:23 crc kubenswrapper[5002]: I1209 12:02:23.830740 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6-ssh-key\") pod \"ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6\" (UID: \"ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6\") "
Dec 09 12:02:23 crc kubenswrapper[5002]: I1209 12:02:23.830791 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6-ceph\") pod \"ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6\" (UID: \"ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6\") "
Dec 09 12:02:23 crc kubenswrapper[5002]: I1209 12:02:23.836917 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6" (UID: "ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 12:02:23 crc kubenswrapper[5002]: I1209 12:02:23.840936 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6-kube-api-access-pr284" (OuterVolumeSpecName: "kube-api-access-pr284") pod "ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6" (UID: "ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6"). InnerVolumeSpecName "kube-api-access-pr284". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 12:02:23 crc kubenswrapper[5002]: I1209 12:02:23.857667 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6-ceph" (OuterVolumeSpecName: "ceph") pod "ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6" (UID: "ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 12:02:23 crc kubenswrapper[5002]: I1209 12:02:23.867943 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6" (UID: "ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:02:23 crc kubenswrapper[5002]: I1209 12:02:23.879004 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6-inventory" (OuterVolumeSpecName: "inventory") pod "ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6" (UID: "ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:02:23 crc kubenswrapper[5002]: I1209 12:02:23.935499 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pr284\" (UniqueName: \"kubernetes.io/projected/ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6-kube-api-access-pr284\") on node \"crc\" DevicePath \"\"" Dec 09 12:02:23 crc kubenswrapper[5002]: I1209 12:02:23.935546 5002 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 12:02:23 crc kubenswrapper[5002]: I1209 12:02:23.935559 5002 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6-inventory\") on node \"crc\" DevicePath \"\"" Dec 09 12:02:23 crc kubenswrapper[5002]: I1209 12:02:23.935571 5002 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 09 12:02:23 crc kubenswrapper[5002]: I1209 12:02:23.935583 5002 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6-ceph\") on node \"crc\" DevicePath \"\"" Dec 09 12:02:24 crc kubenswrapper[5002]: I1209 12:02:24.249440 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-lhhxc" event={"ID":"ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6","Type":"ContainerDied","Data":"26ed2d44aca7afcddfe98c93dc5bce9e40b8490c95e85b427a6a8303b4751cd7"} Dec 09 12:02:24 crc kubenswrapper[5002]: I1209 12:02:24.249480 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26ed2d44aca7afcddfe98c93dc5bce9e40b8490c95e85b427a6a8303b4751cd7" Dec 09 12:02:24 crc kubenswrapper[5002]: I1209 12:02:24.249567 5002 util.go:48] "No ready sandbox for pod can be found. 
Dec 09 12:02:24 crc kubenswrapper[5002]: I1209 12:02:24.349523 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-fz8lt"]
Dec 09 12:02:24 crc kubenswrapper[5002]: E1209 12:02:24.350229 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6" containerName="bootstrap-openstack-openstack-cell1"
Dec 09 12:02:24 crc kubenswrapper[5002]: I1209 12:02:24.350248 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6" containerName="bootstrap-openstack-openstack-cell1"
Dec 09 12:02:24 crc kubenswrapper[5002]: E1209 12:02:24.350294 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="905f5826-4f4a-4ec8-8c82-b109a32cb9bf" containerName="keystone-cron"
Dec 09 12:02:24 crc kubenswrapper[5002]: I1209 12:02:24.350302 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="905f5826-4f4a-4ec8-8c82-b109a32cb9bf" containerName="keystone-cron"
Dec 09 12:02:24 crc kubenswrapper[5002]: I1209 12:02:24.350570 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="905f5826-4f4a-4ec8-8c82-b109a32cb9bf" containerName="keystone-cron"
Dec 09 12:02:24 crc kubenswrapper[5002]: I1209 12:02:24.350588 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6" containerName="bootstrap-openstack-openstack-cell1"
Dec 09 12:02:24 crc kubenswrapper[5002]: I1209 12:02:24.351750 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-fz8lt"
Dec 09 12:02:24 crc kubenswrapper[5002]: I1209 12:02:24.353602 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ngftr"
Dec 09 12:02:24 crc kubenswrapper[5002]: I1209 12:02:24.353847 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Dec 09 12:02:24 crc kubenswrapper[5002]: I1209 12:02:24.354059 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Dec 09 12:02:24 crc kubenswrapper[5002]: I1209 12:02:24.355072 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 09 12:02:24 crc kubenswrapper[5002]: I1209 12:02:24.357517 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-fz8lt"]
Dec 09 12:02:24 crc kubenswrapper[5002]: I1209 12:02:24.447206 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0c070b4d-58da-444a-ba0c-2dbc72842283-inventory\") pod \"download-cache-openstack-openstack-cell1-fz8lt\" (UID: \"0c070b4d-58da-444a-ba0c-2dbc72842283\") " pod="openstack/download-cache-openstack-openstack-cell1-fz8lt"
Dec 09 12:02:24 crc kubenswrapper[5002]: I1209 12:02:24.447263 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0c070b4d-58da-444a-ba0c-2dbc72842283-ssh-key\") pod \"download-cache-openstack-openstack-cell1-fz8lt\" (UID: \"0c070b4d-58da-444a-ba0c-2dbc72842283\") " pod="openstack/download-cache-openstack-openstack-cell1-fz8lt"
Dec 09 12:02:24 crc kubenswrapper[5002]: I1209 12:02:24.447360 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0c070b4d-58da-444a-ba0c-2dbc72842283-ceph\") pod \"download-cache-openstack-openstack-cell1-fz8lt\" (UID: \"0c070b4d-58da-444a-ba0c-2dbc72842283\") " pod="openstack/download-cache-openstack-openstack-cell1-fz8lt"
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0c070b4d-58da-444a-ba0c-2dbc72842283-ceph\") pod \"download-cache-openstack-openstack-cell1-fz8lt\" (UID: \"0c070b4d-58da-444a-ba0c-2dbc72842283\") " pod="openstack/download-cache-openstack-openstack-cell1-fz8lt" Dec 09 12:02:24 crc kubenswrapper[5002]: I1209 12:02:24.447758 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bh6t\" (UniqueName: \"kubernetes.io/projected/0c070b4d-58da-444a-ba0c-2dbc72842283-kube-api-access-4bh6t\") pod \"download-cache-openstack-openstack-cell1-fz8lt\" (UID: \"0c070b4d-58da-444a-ba0c-2dbc72842283\") " pod="openstack/download-cache-openstack-openstack-cell1-fz8lt" Dec 09 12:02:24 crc kubenswrapper[5002]: I1209 12:02:24.550234 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0c070b4d-58da-444a-ba0c-2dbc72842283-ceph\") pod \"download-cache-openstack-openstack-cell1-fz8lt\" (UID: \"0c070b4d-58da-444a-ba0c-2dbc72842283\") " pod="openstack/download-cache-openstack-openstack-cell1-fz8lt" Dec 09 12:02:24 crc kubenswrapper[5002]: I1209 12:02:24.550479 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bh6t\" (UniqueName: \"kubernetes.io/projected/0c070b4d-58da-444a-ba0c-2dbc72842283-kube-api-access-4bh6t\") pod \"download-cache-openstack-openstack-cell1-fz8lt\" (UID: \"0c070b4d-58da-444a-ba0c-2dbc72842283\") " pod="openstack/download-cache-openstack-openstack-cell1-fz8lt" Dec 09 12:02:24 crc kubenswrapper[5002]: I1209 12:02:24.550608 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0c070b4d-58da-444a-ba0c-2dbc72842283-inventory\") pod \"download-cache-openstack-openstack-cell1-fz8lt\" (UID: \"0c070b4d-58da-444a-ba0c-2dbc72842283\") " pod="openstack/download-cache-openstack-openstack-cell1-fz8lt" Dec 09 12:02:24 crc kubenswrapper[5002]: I1209 12:02:24.550695 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0c070b4d-58da-444a-ba0c-2dbc72842283-ssh-key\") pod \"download-cache-openstack-openstack-cell1-fz8lt\" (UID: \"0c070b4d-58da-444a-ba0c-2dbc72842283\") " pod="openstack/download-cache-openstack-openstack-cell1-fz8lt" Dec 09 12:02:24 crc kubenswrapper[5002]: I1209 12:02:24.554292 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0c070b4d-58da-444a-ba0c-2dbc72842283-inventory\") pod \"download-cache-openstack-openstack-cell1-fz8lt\" (UID: \"0c070b4d-58da-444a-ba0c-2dbc72842283\") " pod="openstack/download-cache-openstack-openstack-cell1-fz8lt" Dec 09 12:02:24 crc kubenswrapper[5002]: I1209 12:02:24.555057 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0c070b4d-58da-444a-ba0c-2dbc72842283-ceph\") pod \"download-cache-openstack-openstack-cell1-fz8lt\" (UID: \"0c070b4d-58da-444a-ba0c-2dbc72842283\") " pod="openstack/download-cache-openstack-openstack-cell1-fz8lt" Dec 09 12:02:24 crc kubenswrapper[5002]: I1209 12:02:24.557287 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0c070b4d-58da-444a-ba0c-2dbc72842283-ssh-key\") pod \"download-cache-openstack-openstack-cell1-fz8lt\" (UID: 
\"0c070b4d-58da-444a-ba0c-2dbc72842283\") " pod="openstack/download-cache-openstack-openstack-cell1-fz8lt" Dec 09 12:02:24 crc kubenswrapper[5002]: I1209 12:02:24.568977 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bh6t\" (UniqueName: \"kubernetes.io/projected/0c070b4d-58da-444a-ba0c-2dbc72842283-kube-api-access-4bh6t\") pod \"download-cache-openstack-openstack-cell1-fz8lt\" (UID: \"0c070b4d-58da-444a-ba0c-2dbc72842283\") " pod="openstack/download-cache-openstack-openstack-cell1-fz8lt" Dec 09 12:02:24 crc kubenswrapper[5002]: I1209 12:02:24.670736 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-fz8lt" Dec 09 12:02:25 crc kubenswrapper[5002]: I1209 12:02:25.227191 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-fz8lt"] Dec 09 12:02:25 crc kubenswrapper[5002]: I1209 12:02:25.260875 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-fz8lt" event={"ID":"0c070b4d-58da-444a-ba0c-2dbc72842283","Type":"ContainerStarted","Data":"3997af4027b3d1300d3dd4ea1c85b26f35b23cdc01ad906d67f58a9b8c46cacc"} Dec 09 12:02:26 crc kubenswrapper[5002]: I1209 12:02:26.272134 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-fz8lt" event={"ID":"0c070b4d-58da-444a-ba0c-2dbc72842283","Type":"ContainerStarted","Data":"25c2a6844494ab407296cdbc418397c690899c284c8e2d390b4fd7ea3da95b70"} Dec 09 12:02:26 crc kubenswrapper[5002]: I1209 12:02:26.296532 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-openstack-openstack-cell1-fz8lt" podStartSLOduration=1.7551344530000001 podStartE2EDuration="2.296515125s" podCreationTimestamp="2025-12-09 12:02:24 +0000 UTC" firstStartedPulling="2025-12-09 12:02:25.235305568 +0000 UTC m=+7277.627356649" lastFinishedPulling="2025-12-09 12:02:25.77668624 +0000 UTC m=+7278.168737321" observedRunningTime="2025-12-09 12:02:26.295244481 +0000 UTC m=+7278.687295562" watchObservedRunningTime="2025-12-09 12:02:26.296515125 +0000 UTC m=+7278.688566206" Dec 09 12:04:00 crc kubenswrapper[5002]: I1209 12:04:00.277948 5002 generic.go:334] "Generic (PLEG): container finished" podID="0c070b4d-58da-444a-ba0c-2dbc72842283" containerID="25c2a6844494ab407296cdbc418397c690899c284c8e2d390b4fd7ea3da95b70" exitCode=0 Dec 09 12:04:00 crc kubenswrapper[5002]: I1209 12:04:00.278665 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-fz8lt" event={"ID":"0c070b4d-58da-444a-ba0c-2dbc72842283","Type":"ContainerDied","Data":"25c2a6844494ab407296cdbc418397c690899c284c8e2d390b4fd7ea3da95b70"} Dec 09 12:04:01 crc kubenswrapper[5002]: I1209 12:04:01.764252 5002 util.go:48] "No ready sandbox for pod can be found. 
Dec 09 12:04:01 crc kubenswrapper[5002]: I1209 12:04:01.902184 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bh6t\" (UniqueName: \"kubernetes.io/projected/0c070b4d-58da-444a-ba0c-2dbc72842283-kube-api-access-4bh6t\") pod \"0c070b4d-58da-444a-ba0c-2dbc72842283\" (UID: \"0c070b4d-58da-444a-ba0c-2dbc72842283\") "
Dec 09 12:04:01 crc kubenswrapper[5002]: I1209 12:04:01.902357 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0c070b4d-58da-444a-ba0c-2dbc72842283-inventory\") pod \"0c070b4d-58da-444a-ba0c-2dbc72842283\" (UID: \"0c070b4d-58da-444a-ba0c-2dbc72842283\") "
Dec 09 12:04:01 crc kubenswrapper[5002]: I1209 12:04:01.902491 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0c070b4d-58da-444a-ba0c-2dbc72842283-ssh-key\") pod \"0c070b4d-58da-444a-ba0c-2dbc72842283\" (UID: \"0c070b4d-58da-444a-ba0c-2dbc72842283\") "
Dec 09 12:04:01 crc kubenswrapper[5002]: I1209 12:04:01.902526 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0c070b4d-58da-444a-ba0c-2dbc72842283-ceph\") pod \"0c070b4d-58da-444a-ba0c-2dbc72842283\" (UID: \"0c070b4d-58da-444a-ba0c-2dbc72842283\") "
Dec 09 12:04:01 crc kubenswrapper[5002]: I1209 12:04:01.908297 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c070b4d-58da-444a-ba0c-2dbc72842283-ceph" (OuterVolumeSpecName: "ceph") pod "0c070b4d-58da-444a-ba0c-2dbc72842283" (UID: "0c070b4d-58da-444a-ba0c-2dbc72842283"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 12:04:01 crc kubenswrapper[5002]: I1209 12:04:01.908582 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c070b4d-58da-444a-ba0c-2dbc72842283-kube-api-access-4bh6t" (OuterVolumeSpecName: "kube-api-access-4bh6t") pod "0c070b4d-58da-444a-ba0c-2dbc72842283" (UID: "0c070b4d-58da-444a-ba0c-2dbc72842283"). InnerVolumeSpecName "kube-api-access-4bh6t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 12:04:01 crc kubenswrapper[5002]: I1209 12:04:01.931604 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c070b4d-58da-444a-ba0c-2dbc72842283-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0c070b4d-58da-444a-ba0c-2dbc72842283" (UID: "0c070b4d-58da-444a-ba0c-2dbc72842283"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 12:04:01 crc kubenswrapper[5002]: I1209 12:04:01.955627 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c070b4d-58da-444a-ba0c-2dbc72842283-inventory" (OuterVolumeSpecName: "inventory") pod "0c070b4d-58da-444a-ba0c-2dbc72842283" (UID: "0c070b4d-58da-444a-ba0c-2dbc72842283"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:04:02 crc kubenswrapper[5002]: I1209 12:04:02.005649 5002 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0c070b4d-58da-444a-ba0c-2dbc72842283-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 09 12:04:02 crc kubenswrapper[5002]: I1209 12:04:02.005986 5002 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0c070b4d-58da-444a-ba0c-2dbc72842283-ceph\") on node \"crc\" DevicePath \"\"" Dec 09 12:04:02 crc kubenswrapper[5002]: I1209 12:04:02.006002 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bh6t\" (UniqueName: \"kubernetes.io/projected/0c070b4d-58da-444a-ba0c-2dbc72842283-kube-api-access-4bh6t\") on node \"crc\" DevicePath \"\"" Dec 09 12:04:02 crc kubenswrapper[5002]: I1209 12:04:02.006018 5002 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0c070b4d-58da-444a-ba0c-2dbc72842283-inventory\") on node \"crc\" DevicePath \"\"" Dec 09 12:04:02 crc kubenswrapper[5002]: I1209 12:04:02.307351 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-fz8lt" event={"ID":"0c070b4d-58da-444a-ba0c-2dbc72842283","Type":"ContainerDied","Data":"3997af4027b3d1300d3dd4ea1c85b26f35b23cdc01ad906d67f58a9b8c46cacc"} Dec 09 12:04:02 crc kubenswrapper[5002]: I1209 12:04:02.307416 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3997af4027b3d1300d3dd4ea1c85b26f35b23cdc01ad906d67f58a9b8c46cacc" Dec 09 12:04:02 crc kubenswrapper[5002]: I1209 12:04:02.307497 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-fz8lt" Dec 09 12:04:02 crc kubenswrapper[5002]: I1209 12:04:02.474935 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-nzm87"] Dec 09 12:04:02 crc kubenswrapper[5002]: E1209 12:04:02.475507 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c070b4d-58da-444a-ba0c-2dbc72842283" containerName="download-cache-openstack-openstack-cell1" Dec 09 12:04:02 crc kubenswrapper[5002]: I1209 12:04:02.475526 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c070b4d-58da-444a-ba0c-2dbc72842283" containerName="download-cache-openstack-openstack-cell1" Dec 09 12:04:02 crc kubenswrapper[5002]: I1209 12:04:02.475732 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c070b4d-58da-444a-ba0c-2dbc72842283" containerName="download-cache-openstack-openstack-cell1" Dec 09 12:04:02 crc kubenswrapper[5002]: I1209 12:04:02.476593 5002 util.go:30] "No sandbox for pod can be found. 
Dec 09 12:04:02 crc kubenswrapper[5002]: I1209 12:04:02.484938 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Dec 09 12:04:02 crc kubenswrapper[5002]: I1209 12:04:02.485283 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Dec 09 12:04:02 crc kubenswrapper[5002]: I1209 12:04:02.485626 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 09 12:04:02 crc kubenswrapper[5002]: I1209 12:04:02.485763 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ngftr"
Dec 09 12:04:02 crc kubenswrapper[5002]: I1209 12:04:02.486854 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-nzm87"]
Dec 09 12:04:02 crc kubenswrapper[5002]: I1209 12:04:02.633177 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9e049b56-1b0d-47da-a400-5bc3cd964980-ssh-key\") pod \"configure-network-openstack-openstack-cell1-nzm87\" (UID: \"9e049b56-1b0d-47da-a400-5bc3cd964980\") " pod="openstack/configure-network-openstack-openstack-cell1-nzm87"
Dec 09 12:04:02 crc kubenswrapper[5002]: I1209 12:04:02.633300 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9e049b56-1b0d-47da-a400-5bc3cd964980-ceph\") pod \"configure-network-openstack-openstack-cell1-nzm87\" (UID: \"9e049b56-1b0d-47da-a400-5bc3cd964980\") " pod="openstack/configure-network-openstack-openstack-cell1-nzm87"
Dec 09 12:04:02 crc kubenswrapper[5002]: I1209 12:04:02.633350 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9d9q\" (UniqueName: \"kubernetes.io/projected/9e049b56-1b0d-47da-a400-5bc3cd964980-kube-api-access-f9d9q\") pod \"configure-network-openstack-openstack-cell1-nzm87\" (UID: \"9e049b56-1b0d-47da-a400-5bc3cd964980\") " pod="openstack/configure-network-openstack-openstack-cell1-nzm87"
Dec 09 12:04:02 crc kubenswrapper[5002]: I1209 12:04:02.633525 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e049b56-1b0d-47da-a400-5bc3cd964980-inventory\") pod \"configure-network-openstack-openstack-cell1-nzm87\" (UID: \"9e049b56-1b0d-47da-a400-5bc3cd964980\") " pod="openstack/configure-network-openstack-openstack-cell1-nzm87"
Dec 09 12:04:02 crc kubenswrapper[5002]: I1209 12:04:02.735487 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e049b56-1b0d-47da-a400-5bc3cd964980-inventory\") pod \"configure-network-openstack-openstack-cell1-nzm87\" (UID: \"9e049b56-1b0d-47da-a400-5bc3cd964980\") " pod="openstack/configure-network-openstack-openstack-cell1-nzm87"
Dec 09 12:04:02 crc kubenswrapper[5002]: I1209 12:04:02.735651 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9e049b56-1b0d-47da-a400-5bc3cd964980-ssh-key\") pod \"configure-network-openstack-openstack-cell1-nzm87\" (UID: \"9e049b56-1b0d-47da-a400-5bc3cd964980\") " pod="openstack/configure-network-openstack-openstack-cell1-nzm87"
pod="openstack/configure-network-openstack-openstack-cell1-nzm87" Dec 09 12:04:02 crc kubenswrapper[5002]: I1209 12:04:02.735707 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9e049b56-1b0d-47da-a400-5bc3cd964980-ceph\") pod \"configure-network-openstack-openstack-cell1-nzm87\" (UID: \"9e049b56-1b0d-47da-a400-5bc3cd964980\") " pod="openstack/configure-network-openstack-openstack-cell1-nzm87" Dec 09 12:04:02 crc kubenswrapper[5002]: I1209 12:04:02.735723 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9d9q\" (UniqueName: \"kubernetes.io/projected/9e049b56-1b0d-47da-a400-5bc3cd964980-kube-api-access-f9d9q\") pod \"configure-network-openstack-openstack-cell1-nzm87\" (UID: \"9e049b56-1b0d-47da-a400-5bc3cd964980\") " pod="openstack/configure-network-openstack-openstack-cell1-nzm87" Dec 09 12:04:02 crc kubenswrapper[5002]: I1209 12:04:02.740954 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9e049b56-1b0d-47da-a400-5bc3cd964980-ceph\") pod \"configure-network-openstack-openstack-cell1-nzm87\" (UID: \"9e049b56-1b0d-47da-a400-5bc3cd964980\") " pod="openstack/configure-network-openstack-openstack-cell1-nzm87" Dec 09 12:04:02 crc kubenswrapper[5002]: I1209 12:04:02.742482 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9e049b56-1b0d-47da-a400-5bc3cd964980-ssh-key\") pod \"configure-network-openstack-openstack-cell1-nzm87\" (UID: \"9e049b56-1b0d-47da-a400-5bc3cd964980\") " pod="openstack/configure-network-openstack-openstack-cell1-nzm87" Dec 09 12:04:02 crc kubenswrapper[5002]: I1209 12:04:02.748103 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e049b56-1b0d-47da-a400-5bc3cd964980-inventory\") pod \"configure-network-openstack-openstack-cell1-nzm87\" (UID: \"9e049b56-1b0d-47da-a400-5bc3cd964980\") " pod="openstack/configure-network-openstack-openstack-cell1-nzm87" Dec 09 12:04:02 crc kubenswrapper[5002]: I1209 12:04:02.750873 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9d9q\" (UniqueName: \"kubernetes.io/projected/9e049b56-1b0d-47da-a400-5bc3cd964980-kube-api-access-f9d9q\") pod \"configure-network-openstack-openstack-cell1-nzm87\" (UID: \"9e049b56-1b0d-47da-a400-5bc3cd964980\") " pod="openstack/configure-network-openstack-openstack-cell1-nzm87" Dec 09 12:04:02 crc kubenswrapper[5002]: I1209 12:04:02.799554 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-nzm87" Dec 09 12:04:03 crc kubenswrapper[5002]: I1209 12:04:03.377782 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-nzm87"] Dec 09 12:04:04 crc kubenswrapper[5002]: I1209 12:04:04.326934 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-nzm87" event={"ID":"9e049b56-1b0d-47da-a400-5bc3cd964980","Type":"ContainerStarted","Data":"2372bf828665cd7416fd3e391fe7df7cb560084c50380efcec20ff84f98111ec"} Dec 09 12:04:04 crc kubenswrapper[5002]: I1209 12:04:04.327383 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-nzm87" event={"ID":"9e049b56-1b0d-47da-a400-5bc3cd964980","Type":"ContainerStarted","Data":"96ff887f795a584b966eadf25b31c5d7241f4d51e0d6a4c1a7d3210bc90ed079"} Dec 09 12:04:04 crc kubenswrapper[5002]: I1209 12:04:04.350092 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-openstack-openstack-cell1-nzm87" podStartSLOduration=1.849783299 podStartE2EDuration="2.35007439s" podCreationTimestamp="2025-12-09 12:04:02 +0000 UTC" firstStartedPulling="2025-12-09 12:04:03.375709872 +0000 UTC m=+7375.767760963" lastFinishedPulling="2025-12-09 12:04:03.876000943 +0000 UTC m=+7376.268052054" observedRunningTime="2025-12-09 12:04:04.343334279 +0000 UTC m=+7376.735385380" watchObservedRunningTime="2025-12-09 12:04:04.35007439 +0000 UTC m=+7376.742125471" Dec 09 12:04:07 crc kubenswrapper[5002]: I1209 12:04:07.964902 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:04:07 crc kubenswrapper[5002]: I1209 12:04:07.965571 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:04:37 crc kubenswrapper[5002]: I1209 12:04:37.964298 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:04:37 crc kubenswrapper[5002]: I1209 12:04:37.965020 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:05:07 crc kubenswrapper[5002]: I1209 12:05:07.964597 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:05:07 crc kubenswrapper[5002]: I1209 12:05:07.965333 5002 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:05:07 crc kubenswrapper[5002]: I1209 12:05:07.965400 5002 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" Dec 09 12:05:07 crc kubenswrapper[5002]: I1209 12:05:07.966631 5002 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e2529a7a9d52b066d9a89cea25f96d0ac287ad63943546043a02bb681ab4f3e6"} pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 12:05:07 crc kubenswrapper[5002]: I1209 12:05:07.966731 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" containerID="cri-o://e2529a7a9d52b066d9a89cea25f96d0ac287ad63943546043a02bb681ab4f3e6" gracePeriod=600 Dec 09 12:05:08 crc kubenswrapper[5002]: E1209 12:05:08.100071 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:05:08 crc kubenswrapper[5002]: I1209 12:05:08.977149 5002 generic.go:334] "Generic (PLEG): container finished" podID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerID="e2529a7a9d52b066d9a89cea25f96d0ac287ad63943546043a02bb681ab4f3e6" exitCode=0 Dec 09 12:05:08 crc kubenswrapper[5002]: I1209 12:05:08.977352 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerDied","Data":"e2529a7a9d52b066d9a89cea25f96d0ac287ad63943546043a02bb681ab4f3e6"} Dec 09 12:05:08 crc kubenswrapper[5002]: I1209 12:05:08.977518 5002 scope.go:117] "RemoveContainer" containerID="6226b704e39d43eed606b7631dfc787a50a79bc246be5230700c7cb27da7c704" Dec 09 12:05:08 crc kubenswrapper[5002]: I1209 12:05:08.978241 5002 scope.go:117] "RemoveContainer" containerID="e2529a7a9d52b066d9a89cea25f96d0ac287ad63943546043a02bb681ab4f3e6" Dec 09 12:05:08 crc kubenswrapper[5002]: E1209 12:05:08.978580 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:05:22 crc kubenswrapper[5002]: I1209 12:05:22.060609 5002 scope.go:117] "RemoveContainer" containerID="e2529a7a9d52b066d9a89cea25f96d0ac287ad63943546043a02bb681ab4f3e6" Dec 09 12:05:22 crc kubenswrapper[5002]: E1209 12:05:22.061392 5002 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:05:26 crc kubenswrapper[5002]: I1209 12:05:26.182787 5002 generic.go:334] "Generic (PLEG): container finished" podID="9e049b56-1b0d-47da-a400-5bc3cd964980" containerID="2372bf828665cd7416fd3e391fe7df7cb560084c50380efcec20ff84f98111ec" exitCode=0 Dec 09 12:05:26 crc kubenswrapper[5002]: I1209 12:05:26.183257 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-nzm87" event={"ID":"9e049b56-1b0d-47da-a400-5bc3cd964980","Type":"ContainerDied","Data":"2372bf828665cd7416fd3e391fe7df7cb560084c50380efcec20ff84f98111ec"} Dec 09 12:05:27 crc kubenswrapper[5002]: I1209 12:05:27.721006 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-nzm87" Dec 09 12:05:27 crc kubenswrapper[5002]: I1209 12:05:27.802726 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e049b56-1b0d-47da-a400-5bc3cd964980-inventory\") pod \"9e049b56-1b0d-47da-a400-5bc3cd964980\" (UID: \"9e049b56-1b0d-47da-a400-5bc3cd964980\") " Dec 09 12:05:27 crc kubenswrapper[5002]: I1209 12:05:27.802785 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9d9q\" (UniqueName: \"kubernetes.io/projected/9e049b56-1b0d-47da-a400-5bc3cd964980-kube-api-access-f9d9q\") pod \"9e049b56-1b0d-47da-a400-5bc3cd964980\" (UID: \"9e049b56-1b0d-47da-a400-5bc3cd964980\") " Dec 09 12:05:27 crc kubenswrapper[5002]: I1209 12:05:27.802867 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9e049b56-1b0d-47da-a400-5bc3cd964980-ceph\") pod \"9e049b56-1b0d-47da-a400-5bc3cd964980\" (UID: \"9e049b56-1b0d-47da-a400-5bc3cd964980\") " Dec 09 12:05:27 crc kubenswrapper[5002]: I1209 12:05:27.802905 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9e049b56-1b0d-47da-a400-5bc3cd964980-ssh-key\") pod \"9e049b56-1b0d-47da-a400-5bc3cd964980\" (UID: \"9e049b56-1b0d-47da-a400-5bc3cd964980\") " Dec 09 12:05:27 crc kubenswrapper[5002]: I1209 12:05:27.814209 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e049b56-1b0d-47da-a400-5bc3cd964980-ceph" (OuterVolumeSpecName: "ceph") pod "9e049b56-1b0d-47da-a400-5bc3cd964980" (UID: "9e049b56-1b0d-47da-a400-5bc3cd964980"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:05:27 crc kubenswrapper[5002]: I1209 12:05:27.814277 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e049b56-1b0d-47da-a400-5bc3cd964980-kube-api-access-f9d9q" (OuterVolumeSpecName: "kube-api-access-f9d9q") pod "9e049b56-1b0d-47da-a400-5bc3cd964980" (UID: "9e049b56-1b0d-47da-a400-5bc3cd964980"). InnerVolumeSpecName "kube-api-access-f9d9q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:05:27 crc kubenswrapper[5002]: I1209 12:05:27.838191 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e049b56-1b0d-47da-a400-5bc3cd964980-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9e049b56-1b0d-47da-a400-5bc3cd964980" (UID: "9e049b56-1b0d-47da-a400-5bc3cd964980"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:05:27 crc kubenswrapper[5002]: I1209 12:05:27.838370 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e049b56-1b0d-47da-a400-5bc3cd964980-inventory" (OuterVolumeSpecName: "inventory") pod "9e049b56-1b0d-47da-a400-5bc3cd964980" (UID: "9e049b56-1b0d-47da-a400-5bc3cd964980"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:05:27 crc kubenswrapper[5002]: I1209 12:05:27.905241 5002 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9e049b56-1b0d-47da-a400-5bc3cd964980-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 09 12:05:27 crc kubenswrapper[5002]: I1209 12:05:27.905270 5002 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e049b56-1b0d-47da-a400-5bc3cd964980-inventory\") on node \"crc\" DevicePath \"\"" Dec 09 12:05:27 crc kubenswrapper[5002]: I1209 12:05:27.905283 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9d9q\" (UniqueName: \"kubernetes.io/projected/9e049b56-1b0d-47da-a400-5bc3cd964980-kube-api-access-f9d9q\") on node \"crc\" DevicePath \"\"" Dec 09 12:05:27 crc kubenswrapper[5002]: I1209 12:05:27.905294 5002 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9e049b56-1b0d-47da-a400-5bc3cd964980-ceph\") on node \"crc\" DevicePath \"\"" Dec 09 12:05:28 crc kubenswrapper[5002]: I1209 12:05:28.209344 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-nzm87" event={"ID":"9e049b56-1b0d-47da-a400-5bc3cd964980","Type":"ContainerDied","Data":"96ff887f795a584b966eadf25b31c5d7241f4d51e0d6a4c1a7d3210bc90ed079"} Dec 09 12:05:28 crc kubenswrapper[5002]: I1209 12:05:28.209995 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96ff887f795a584b966eadf25b31c5d7241f4d51e0d6a4c1a7d3210bc90ed079" Dec 09 12:05:28 crc kubenswrapper[5002]: I1209 12:05:28.209429 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-nzm87" Dec 09 12:05:28 crc kubenswrapper[5002]: I1209 12:05:28.297585 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-n42l8"] Dec 09 12:05:28 crc kubenswrapper[5002]: E1209 12:05:28.298228 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e049b56-1b0d-47da-a400-5bc3cd964980" containerName="configure-network-openstack-openstack-cell1" Dec 09 12:05:28 crc kubenswrapper[5002]: I1209 12:05:28.298252 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e049b56-1b0d-47da-a400-5bc3cd964980" containerName="configure-network-openstack-openstack-cell1" Dec 09 12:05:28 crc kubenswrapper[5002]: I1209 12:05:28.298517 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e049b56-1b0d-47da-a400-5bc3cd964980" containerName="configure-network-openstack-openstack-cell1" Dec 09 12:05:28 crc kubenswrapper[5002]: I1209 12:05:28.299446 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-n42l8" Dec 09 12:05:28 crc kubenswrapper[5002]: I1209 12:05:28.302350 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ngftr" Dec 09 12:05:28 crc kubenswrapper[5002]: I1209 12:05:28.302380 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 09 12:05:28 crc kubenswrapper[5002]: I1209 12:05:28.302399 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 09 12:05:28 crc kubenswrapper[5002]: I1209 12:05:28.305484 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 09 12:05:28 crc kubenswrapper[5002]: I1209 12:05:28.311311 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-n42l8"] Dec 09 12:05:28 crc kubenswrapper[5002]: I1209 12:05:28.419215 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f848efb7-cc84-45e2-930a-efd7824640cf-inventory\") pod \"validate-network-openstack-openstack-cell1-n42l8\" (UID: \"f848efb7-cc84-45e2-930a-efd7824640cf\") " pod="openstack/validate-network-openstack-openstack-cell1-n42l8" Dec 09 12:05:28 crc kubenswrapper[5002]: I1209 12:05:28.419320 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f848efb7-cc84-45e2-930a-efd7824640cf-ssh-key\") pod \"validate-network-openstack-openstack-cell1-n42l8\" (UID: \"f848efb7-cc84-45e2-930a-efd7824640cf\") " pod="openstack/validate-network-openstack-openstack-cell1-n42l8" Dec 09 12:05:28 crc kubenswrapper[5002]: I1209 12:05:28.419350 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f848efb7-cc84-45e2-930a-efd7824640cf-ceph\") pod \"validate-network-openstack-openstack-cell1-n42l8\" (UID: \"f848efb7-cc84-45e2-930a-efd7824640cf\") " pod="openstack/validate-network-openstack-openstack-cell1-n42l8" Dec 09 12:05:28 crc kubenswrapper[5002]: I1209 12:05:28.419464 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwfqc\" (UniqueName: 
\"kubernetes.io/projected/f848efb7-cc84-45e2-930a-efd7824640cf-kube-api-access-zwfqc\") pod \"validate-network-openstack-openstack-cell1-n42l8\" (UID: \"f848efb7-cc84-45e2-930a-efd7824640cf\") " pod="openstack/validate-network-openstack-openstack-cell1-n42l8" Dec 09 12:05:28 crc kubenswrapper[5002]: I1209 12:05:28.521271 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f848efb7-cc84-45e2-930a-efd7824640cf-inventory\") pod \"validate-network-openstack-openstack-cell1-n42l8\" (UID: \"f848efb7-cc84-45e2-930a-efd7824640cf\") " pod="openstack/validate-network-openstack-openstack-cell1-n42l8" Dec 09 12:05:28 crc kubenswrapper[5002]: I1209 12:05:28.521351 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f848efb7-cc84-45e2-930a-efd7824640cf-ssh-key\") pod \"validate-network-openstack-openstack-cell1-n42l8\" (UID: \"f848efb7-cc84-45e2-930a-efd7824640cf\") " pod="openstack/validate-network-openstack-openstack-cell1-n42l8" Dec 09 12:05:28 crc kubenswrapper[5002]: I1209 12:05:28.521380 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f848efb7-cc84-45e2-930a-efd7824640cf-ceph\") pod \"validate-network-openstack-openstack-cell1-n42l8\" (UID: \"f848efb7-cc84-45e2-930a-efd7824640cf\") " pod="openstack/validate-network-openstack-openstack-cell1-n42l8" Dec 09 12:05:28 crc kubenswrapper[5002]: I1209 12:05:28.521478 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwfqc\" (UniqueName: \"kubernetes.io/projected/f848efb7-cc84-45e2-930a-efd7824640cf-kube-api-access-zwfqc\") pod \"validate-network-openstack-openstack-cell1-n42l8\" (UID: \"f848efb7-cc84-45e2-930a-efd7824640cf\") " pod="openstack/validate-network-openstack-openstack-cell1-n42l8" Dec 09 12:05:28 crc kubenswrapper[5002]: I1209 12:05:28.526615 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f848efb7-cc84-45e2-930a-efd7824640cf-inventory\") pod \"validate-network-openstack-openstack-cell1-n42l8\" (UID: \"f848efb7-cc84-45e2-930a-efd7824640cf\") " pod="openstack/validate-network-openstack-openstack-cell1-n42l8" Dec 09 12:05:28 crc kubenswrapper[5002]: I1209 12:05:28.530091 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f848efb7-cc84-45e2-930a-efd7824640cf-ssh-key\") pod \"validate-network-openstack-openstack-cell1-n42l8\" (UID: \"f848efb7-cc84-45e2-930a-efd7824640cf\") " pod="openstack/validate-network-openstack-openstack-cell1-n42l8" Dec 09 12:05:28 crc kubenswrapper[5002]: I1209 12:05:28.539729 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f848efb7-cc84-45e2-930a-efd7824640cf-ceph\") pod \"validate-network-openstack-openstack-cell1-n42l8\" (UID: \"f848efb7-cc84-45e2-930a-efd7824640cf\") " pod="openstack/validate-network-openstack-openstack-cell1-n42l8" Dec 09 12:05:28 crc kubenswrapper[5002]: I1209 12:05:28.545857 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwfqc\" (UniqueName: \"kubernetes.io/projected/f848efb7-cc84-45e2-930a-efd7824640cf-kube-api-access-zwfqc\") pod \"validate-network-openstack-openstack-cell1-n42l8\" (UID: \"f848efb7-cc84-45e2-930a-efd7824640cf\") " 
pod="openstack/validate-network-openstack-openstack-cell1-n42l8" Dec 09 12:05:28 crc kubenswrapper[5002]: I1209 12:05:28.624237 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-n42l8" Dec 09 12:05:29 crc kubenswrapper[5002]: I1209 12:05:29.187407 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-n42l8"] Dec 09 12:05:29 crc kubenswrapper[5002]: I1209 12:05:29.192624 5002 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 12:05:29 crc kubenswrapper[5002]: I1209 12:05:29.237503 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-n42l8" event={"ID":"f848efb7-cc84-45e2-930a-efd7824640cf","Type":"ContainerStarted","Data":"ba0a7e941036b823a5914de3fd35e87adfe3dcd34cee95fc23d547afb778cf12"} Dec 09 12:05:30 crc kubenswrapper[5002]: I1209 12:05:30.249684 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-n42l8" event={"ID":"f848efb7-cc84-45e2-930a-efd7824640cf","Type":"ContainerStarted","Data":"2e35912f3e8f0a7e143020912c594548121b0597830d7e6ce7483010d753c4d4"} Dec 09 12:05:30 crc kubenswrapper[5002]: I1209 12:05:30.283627 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-openstack-openstack-cell1-n42l8" podStartSLOduration=1.6552465079999998 podStartE2EDuration="2.283604212s" podCreationTimestamp="2025-12-09 12:05:28 +0000 UTC" firstStartedPulling="2025-12-09 12:05:29.192365291 +0000 UTC m=+7461.584416372" lastFinishedPulling="2025-12-09 12:05:29.820722955 +0000 UTC m=+7462.212774076" observedRunningTime="2025-12-09 12:05:30.280290823 +0000 UTC m=+7462.672341994" watchObservedRunningTime="2025-12-09 12:05:30.283604212 +0000 UTC m=+7462.675655303" Dec 09 12:05:33 crc kubenswrapper[5002]: I1209 12:05:33.886397 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9xwfq"] Dec 09 12:05:33 crc kubenswrapper[5002]: I1209 12:05:33.889703 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9xwfq" Dec 09 12:05:33 crc kubenswrapper[5002]: I1209 12:05:33.907624 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9xwfq"] Dec 09 12:05:33 crc kubenswrapper[5002]: I1209 12:05:33.951489 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8bbj\" (UniqueName: \"kubernetes.io/projected/7863fa45-a88a-4e01-b289-81f07cfec936-kube-api-access-k8bbj\") pod \"redhat-operators-9xwfq\" (UID: \"7863fa45-a88a-4e01-b289-81f07cfec936\") " pod="openshift-marketplace/redhat-operators-9xwfq" Dec 09 12:05:33 crc kubenswrapper[5002]: I1209 12:05:33.951549 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7863fa45-a88a-4e01-b289-81f07cfec936-utilities\") pod \"redhat-operators-9xwfq\" (UID: \"7863fa45-a88a-4e01-b289-81f07cfec936\") " pod="openshift-marketplace/redhat-operators-9xwfq" Dec 09 12:05:33 crc kubenswrapper[5002]: I1209 12:05:33.951582 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7863fa45-a88a-4e01-b289-81f07cfec936-catalog-content\") pod \"redhat-operators-9xwfq\" (UID: \"7863fa45-a88a-4e01-b289-81f07cfec936\") " pod="openshift-marketplace/redhat-operators-9xwfq" Dec 09 12:05:34 crc kubenswrapper[5002]: I1209 12:05:34.054104 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7863fa45-a88a-4e01-b289-81f07cfec936-utilities\") pod \"redhat-operators-9xwfq\" (UID: \"7863fa45-a88a-4e01-b289-81f07cfec936\") " pod="openshift-marketplace/redhat-operators-9xwfq" Dec 09 12:05:34 crc kubenswrapper[5002]: I1209 12:05:34.054147 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7863fa45-a88a-4e01-b289-81f07cfec936-catalog-content\") pod \"redhat-operators-9xwfq\" (UID: \"7863fa45-a88a-4e01-b289-81f07cfec936\") " pod="openshift-marketplace/redhat-operators-9xwfq" Dec 09 12:05:34 crc kubenswrapper[5002]: I1209 12:05:34.054358 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8bbj\" (UniqueName: \"kubernetes.io/projected/7863fa45-a88a-4e01-b289-81f07cfec936-kube-api-access-k8bbj\") pod \"redhat-operators-9xwfq\" (UID: \"7863fa45-a88a-4e01-b289-81f07cfec936\") " pod="openshift-marketplace/redhat-operators-9xwfq" Dec 09 12:05:34 crc kubenswrapper[5002]: I1209 12:05:34.054790 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7863fa45-a88a-4e01-b289-81f07cfec936-utilities\") pod \"redhat-operators-9xwfq\" (UID: \"7863fa45-a88a-4e01-b289-81f07cfec936\") " pod="openshift-marketplace/redhat-operators-9xwfq" Dec 09 12:05:34 crc kubenswrapper[5002]: I1209 12:05:34.055380 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7863fa45-a88a-4e01-b289-81f07cfec936-catalog-content\") pod \"redhat-operators-9xwfq\" (UID: \"7863fa45-a88a-4e01-b289-81f07cfec936\") " pod="openshift-marketplace/redhat-operators-9xwfq" Dec 09 12:05:34 crc kubenswrapper[5002]: I1209 12:05:34.074030 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-k8bbj\" (UniqueName: \"kubernetes.io/projected/7863fa45-a88a-4e01-b289-81f07cfec936-kube-api-access-k8bbj\") pod \"redhat-operators-9xwfq\" (UID: \"7863fa45-a88a-4e01-b289-81f07cfec936\") " pod="openshift-marketplace/redhat-operators-9xwfq" Dec 09 12:05:34 crc kubenswrapper[5002]: I1209 12:05:34.234281 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9xwfq" Dec 09 12:05:34 crc kubenswrapper[5002]: I1209 12:05:34.703380 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9xwfq"] Dec 09 12:05:35 crc kubenswrapper[5002]: I1209 12:05:35.317046 5002 generic.go:334] "Generic (PLEG): container finished" podID="f848efb7-cc84-45e2-930a-efd7824640cf" containerID="2e35912f3e8f0a7e143020912c594548121b0597830d7e6ce7483010d753c4d4" exitCode=0 Dec 09 12:05:35 crc kubenswrapper[5002]: I1209 12:05:35.317271 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-n42l8" event={"ID":"f848efb7-cc84-45e2-930a-efd7824640cf","Type":"ContainerDied","Data":"2e35912f3e8f0a7e143020912c594548121b0597830d7e6ce7483010d753c4d4"} Dec 09 12:05:35 crc kubenswrapper[5002]: I1209 12:05:35.319705 5002 generic.go:334] "Generic (PLEG): container finished" podID="7863fa45-a88a-4e01-b289-81f07cfec936" containerID="fbeed9c6b000894ab5f772be693596808d102ec77e13a080e12cb75919a9b5f9" exitCode=0 Dec 09 12:05:35 crc kubenswrapper[5002]: I1209 12:05:35.319739 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9xwfq" event={"ID":"7863fa45-a88a-4e01-b289-81f07cfec936","Type":"ContainerDied","Data":"fbeed9c6b000894ab5f772be693596808d102ec77e13a080e12cb75919a9b5f9"} Dec 09 12:05:35 crc kubenswrapper[5002]: I1209 12:05:35.319761 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9xwfq" event={"ID":"7863fa45-a88a-4e01-b289-81f07cfec936","Type":"ContainerStarted","Data":"0b713a6f920009e829bac37dcf42d15ca383e80590f2c497e29b0b25e5b06c9c"} Dec 09 12:05:36 crc kubenswrapper[5002]: I1209 12:05:36.808788 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-n42l8" Dec 09 12:05:36 crc kubenswrapper[5002]: I1209 12:05:36.934700 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwfqc\" (UniqueName: \"kubernetes.io/projected/f848efb7-cc84-45e2-930a-efd7824640cf-kube-api-access-zwfqc\") pod \"f848efb7-cc84-45e2-930a-efd7824640cf\" (UID: \"f848efb7-cc84-45e2-930a-efd7824640cf\") " Dec 09 12:05:36 crc kubenswrapper[5002]: I1209 12:05:36.934782 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f848efb7-cc84-45e2-930a-efd7824640cf-inventory\") pod \"f848efb7-cc84-45e2-930a-efd7824640cf\" (UID: \"f848efb7-cc84-45e2-930a-efd7824640cf\") " Dec 09 12:05:36 crc kubenswrapper[5002]: I1209 12:05:36.934884 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f848efb7-cc84-45e2-930a-efd7824640cf-ssh-key\") pod \"f848efb7-cc84-45e2-930a-efd7824640cf\" (UID: \"f848efb7-cc84-45e2-930a-efd7824640cf\") " Dec 09 12:05:36 crc kubenswrapper[5002]: I1209 12:05:36.934963 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f848efb7-cc84-45e2-930a-efd7824640cf-ceph\") pod \"f848efb7-cc84-45e2-930a-efd7824640cf\" (UID: \"f848efb7-cc84-45e2-930a-efd7824640cf\") " Dec 09 12:05:36 crc kubenswrapper[5002]: I1209 12:05:36.939915 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f848efb7-cc84-45e2-930a-efd7824640cf-kube-api-access-zwfqc" (OuterVolumeSpecName: "kube-api-access-zwfqc") pod "f848efb7-cc84-45e2-930a-efd7824640cf" (UID: "f848efb7-cc84-45e2-930a-efd7824640cf"). InnerVolumeSpecName "kube-api-access-zwfqc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:05:36 crc kubenswrapper[5002]: I1209 12:05:36.940330 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f848efb7-cc84-45e2-930a-efd7824640cf-ceph" (OuterVolumeSpecName: "ceph") pod "f848efb7-cc84-45e2-930a-efd7824640cf" (UID: "f848efb7-cc84-45e2-930a-efd7824640cf"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:05:36 crc kubenswrapper[5002]: E1209 12:05:36.964861 5002 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f848efb7-cc84-45e2-930a-efd7824640cf-inventory podName:f848efb7-cc84-45e2-930a-efd7824640cf nodeName:}" failed. No retries permitted until 2025-12-09 12:05:37.464804057 +0000 UTC m=+7469.856855138 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "inventory" (UniqueName: "kubernetes.io/secret/f848efb7-cc84-45e2-930a-efd7824640cf-inventory") pod "f848efb7-cc84-45e2-930a-efd7824640cf" (UID: "f848efb7-cc84-45e2-930a-efd7824640cf") : error deleting /var/lib/kubelet/pods/f848efb7-cc84-45e2-930a-efd7824640cf/volume-subpaths: remove /var/lib/kubelet/pods/f848efb7-cc84-45e2-930a-efd7824640cf/volume-subpaths: no such file or directory Dec 09 12:05:36 crc kubenswrapper[5002]: I1209 12:05:36.967793 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f848efb7-cc84-45e2-930a-efd7824640cf-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f848efb7-cc84-45e2-930a-efd7824640cf" (UID: "f848efb7-cc84-45e2-930a-efd7824640cf"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:05:37 crc kubenswrapper[5002]: I1209 12:05:37.037772 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwfqc\" (UniqueName: \"kubernetes.io/projected/f848efb7-cc84-45e2-930a-efd7824640cf-kube-api-access-zwfqc\") on node \"crc\" DevicePath \"\"" Dec 09 12:05:37 crc kubenswrapper[5002]: I1209 12:05:37.037806 5002 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f848efb7-cc84-45e2-930a-efd7824640cf-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 09 12:05:37 crc kubenswrapper[5002]: I1209 12:05:37.037830 5002 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f848efb7-cc84-45e2-930a-efd7824640cf-ceph\") on node \"crc\" DevicePath \"\"" Dec 09 12:05:37 crc kubenswrapper[5002]: I1209 12:05:37.060758 5002 scope.go:117] "RemoveContainer" containerID="e2529a7a9d52b066d9a89cea25f96d0ac287ad63943546043a02bb681ab4f3e6" Dec 09 12:05:37 crc kubenswrapper[5002]: E1209 12:05:37.061001 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:05:37 crc kubenswrapper[5002]: I1209 12:05:37.343051 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-n42l8" event={"ID":"f848efb7-cc84-45e2-930a-efd7824640cf","Type":"ContainerDied","Data":"ba0a7e941036b823a5914de3fd35e87adfe3dcd34cee95fc23d547afb778cf12"} Dec 09 12:05:37 crc kubenswrapper[5002]: I1209 12:05:37.343584 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba0a7e941036b823a5914de3fd35e87adfe3dcd34cee95fc23d547afb778cf12" Dec 09 12:05:37 crc kubenswrapper[5002]: I1209 12:05:37.343511 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-n42l8" Dec 09 12:05:37 crc kubenswrapper[5002]: I1209 12:05:37.346134 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9xwfq" event={"ID":"7863fa45-a88a-4e01-b289-81f07cfec936","Type":"ContainerStarted","Data":"15f0a2d87eca514468aab315e4903745801f02e7e0fee6a14f965c9dfef09268"} Dec 09 12:05:37 crc kubenswrapper[5002]: I1209 12:05:37.518944 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-openstack-openstack-cell1-r26xg"] Dec 09 12:05:37 crc kubenswrapper[5002]: E1209 12:05:37.519497 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f848efb7-cc84-45e2-930a-efd7824640cf" containerName="validate-network-openstack-openstack-cell1" Dec 09 12:05:37 crc kubenswrapper[5002]: I1209 12:05:37.519518 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="f848efb7-cc84-45e2-930a-efd7824640cf" containerName="validate-network-openstack-openstack-cell1" Dec 09 12:05:37 crc kubenswrapper[5002]: I1209 12:05:37.519829 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="f848efb7-cc84-45e2-930a-efd7824640cf" containerName="validate-network-openstack-openstack-cell1" Dec 09 12:05:37 crc kubenswrapper[5002]: I1209 12:05:37.520678 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-r26xg" Dec 09 12:05:37 crc kubenswrapper[5002]: I1209 12:05:37.539380 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-r26xg"] Dec 09 12:05:37 crc kubenswrapper[5002]: I1209 12:05:37.548314 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f848efb7-cc84-45e2-930a-efd7824640cf-inventory\") pod \"f848efb7-cc84-45e2-930a-efd7824640cf\" (UID: \"f848efb7-cc84-45e2-930a-efd7824640cf\") " Dec 09 12:05:37 crc kubenswrapper[5002]: I1209 12:05:37.552231 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f848efb7-cc84-45e2-930a-efd7824640cf-inventory" (OuterVolumeSpecName: "inventory") pod "f848efb7-cc84-45e2-930a-efd7824640cf" (UID: "f848efb7-cc84-45e2-930a-efd7824640cf"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:05:37 crc kubenswrapper[5002]: I1209 12:05:37.650414 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5a0b071a-c032-4ba5-a808-afcfda05fef6-ceph\") pod \"install-os-openstack-openstack-cell1-r26xg\" (UID: \"5a0b071a-c032-4ba5-a808-afcfda05fef6\") " pod="openstack/install-os-openstack-openstack-cell1-r26xg" Dec 09 12:05:37 crc kubenswrapper[5002]: I1209 12:05:37.650478 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9tvt\" (UniqueName: \"kubernetes.io/projected/5a0b071a-c032-4ba5-a808-afcfda05fef6-kube-api-access-w9tvt\") pod \"install-os-openstack-openstack-cell1-r26xg\" (UID: \"5a0b071a-c032-4ba5-a808-afcfda05fef6\") " pod="openstack/install-os-openstack-openstack-cell1-r26xg" Dec 09 12:05:37 crc kubenswrapper[5002]: I1209 12:05:37.650663 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a0b071a-c032-4ba5-a808-afcfda05fef6-inventory\") pod \"install-os-openstack-openstack-cell1-r26xg\" (UID: \"5a0b071a-c032-4ba5-a808-afcfda05fef6\") " pod="openstack/install-os-openstack-openstack-cell1-r26xg" Dec 09 12:05:37 crc kubenswrapper[5002]: I1209 12:05:37.650714 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5a0b071a-c032-4ba5-a808-afcfda05fef6-ssh-key\") pod \"install-os-openstack-openstack-cell1-r26xg\" (UID: \"5a0b071a-c032-4ba5-a808-afcfda05fef6\") " pod="openstack/install-os-openstack-openstack-cell1-r26xg" Dec 09 12:05:37 crc kubenswrapper[5002]: I1209 12:05:37.650928 5002 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f848efb7-cc84-45e2-930a-efd7824640cf-inventory\") on node \"crc\" DevicePath \"\"" Dec 09 12:05:37 crc kubenswrapper[5002]: I1209 12:05:37.752898 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5a0b071a-c032-4ba5-a808-afcfda05fef6-ceph\") pod \"install-os-openstack-openstack-cell1-r26xg\" (UID: \"5a0b071a-c032-4ba5-a808-afcfda05fef6\") " pod="openstack/install-os-openstack-openstack-cell1-r26xg" Dec 09 12:05:37 crc kubenswrapper[5002]: I1209 12:05:37.752960 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9tvt\" 
(UniqueName: \"kubernetes.io/projected/5a0b071a-c032-4ba5-a808-afcfda05fef6-kube-api-access-w9tvt\") pod \"install-os-openstack-openstack-cell1-r26xg\" (UID: \"5a0b071a-c032-4ba5-a808-afcfda05fef6\") " pod="openstack/install-os-openstack-openstack-cell1-r26xg" Dec 09 12:05:37 crc kubenswrapper[5002]: I1209 12:05:37.753002 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a0b071a-c032-4ba5-a808-afcfda05fef6-inventory\") pod \"install-os-openstack-openstack-cell1-r26xg\" (UID: \"5a0b071a-c032-4ba5-a808-afcfda05fef6\") " pod="openstack/install-os-openstack-openstack-cell1-r26xg" Dec 09 12:05:37 crc kubenswrapper[5002]: I1209 12:05:37.753055 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5a0b071a-c032-4ba5-a808-afcfda05fef6-ssh-key\") pod \"install-os-openstack-openstack-cell1-r26xg\" (UID: \"5a0b071a-c032-4ba5-a808-afcfda05fef6\") " pod="openstack/install-os-openstack-openstack-cell1-r26xg" Dec 09 12:05:37 crc kubenswrapper[5002]: I1209 12:05:37.757357 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a0b071a-c032-4ba5-a808-afcfda05fef6-inventory\") pod \"install-os-openstack-openstack-cell1-r26xg\" (UID: \"5a0b071a-c032-4ba5-a808-afcfda05fef6\") " pod="openstack/install-os-openstack-openstack-cell1-r26xg" Dec 09 12:05:37 crc kubenswrapper[5002]: I1209 12:05:37.757664 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5a0b071a-c032-4ba5-a808-afcfda05fef6-ssh-key\") pod \"install-os-openstack-openstack-cell1-r26xg\" (UID: \"5a0b071a-c032-4ba5-a808-afcfda05fef6\") " pod="openstack/install-os-openstack-openstack-cell1-r26xg" Dec 09 12:05:37 crc kubenswrapper[5002]: I1209 12:05:37.759464 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5a0b071a-c032-4ba5-a808-afcfda05fef6-ceph\") pod \"install-os-openstack-openstack-cell1-r26xg\" (UID: \"5a0b071a-c032-4ba5-a808-afcfda05fef6\") " pod="openstack/install-os-openstack-openstack-cell1-r26xg" Dec 09 12:05:37 crc kubenswrapper[5002]: I1209 12:05:37.770108 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9tvt\" (UniqueName: \"kubernetes.io/projected/5a0b071a-c032-4ba5-a808-afcfda05fef6-kube-api-access-w9tvt\") pod \"install-os-openstack-openstack-cell1-r26xg\" (UID: \"5a0b071a-c032-4ba5-a808-afcfda05fef6\") " pod="openstack/install-os-openstack-openstack-cell1-r26xg" Dec 09 12:05:37 crc kubenswrapper[5002]: I1209 12:05:37.851761 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-r26xg" Dec 09 12:05:39 crc kubenswrapper[5002]: I1209 12:05:39.146993 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-r26xg"] Dec 09 12:05:39 crc kubenswrapper[5002]: I1209 12:05:39.370428 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-r26xg" event={"ID":"5a0b071a-c032-4ba5-a808-afcfda05fef6","Type":"ContainerStarted","Data":"772f725e5736783b74a409329be8386181c78f2a15beb5d7c657583043207ad6"} Dec 09 12:05:45 crc kubenswrapper[5002]: I1209 12:05:45.432515 5002 generic.go:334] "Generic (PLEG): container finished" podID="7863fa45-a88a-4e01-b289-81f07cfec936" containerID="15f0a2d87eca514468aab315e4903745801f02e7e0fee6a14f965c9dfef09268" exitCode=0 Dec 09 12:05:45 crc kubenswrapper[5002]: I1209 12:05:45.432632 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9xwfq" event={"ID":"7863fa45-a88a-4e01-b289-81f07cfec936","Type":"ContainerDied","Data":"15f0a2d87eca514468aab315e4903745801f02e7e0fee6a14f965c9dfef09268"} Dec 09 12:05:46 crc kubenswrapper[5002]: I1209 12:05:46.446222 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-r26xg" event={"ID":"5a0b071a-c032-4ba5-a808-afcfda05fef6","Type":"ContainerStarted","Data":"b57ec7eb19269a4859bfeb3dddcbb0935f607b5ebd17997d3cd398b89a687d64"} Dec 09 12:05:46 crc kubenswrapper[5002]: I1209 12:05:46.471228 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-openstack-openstack-cell1-r26xg" podStartSLOduration=2.973949589 podStartE2EDuration="9.471207511s" podCreationTimestamp="2025-12-09 12:05:37 +0000 UTC" firstStartedPulling="2025-12-09 12:05:39.158962702 +0000 UTC m=+7471.551013783" lastFinishedPulling="2025-12-09 12:05:45.656220624 +0000 UTC m=+7478.048271705" observedRunningTime="2025-12-09 12:05:46.460548205 +0000 UTC m=+7478.852599276" watchObservedRunningTime="2025-12-09 12:05:46.471207511 +0000 UTC m=+7478.863258592" Dec 09 12:05:47 crc kubenswrapper[5002]: I1209 12:05:47.457396 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9xwfq" event={"ID":"7863fa45-a88a-4e01-b289-81f07cfec936","Type":"ContainerStarted","Data":"898aa1ae70d299823625e1f270c5a6d70cd5f2ee97963b24092a7bf257f1b595"} Dec 09 12:05:47 crc kubenswrapper[5002]: I1209 12:05:47.485866 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9xwfq" podStartSLOduration=3.280938655 podStartE2EDuration="14.485844139s" podCreationTimestamp="2025-12-09 12:05:33 +0000 UTC" firstStartedPulling="2025-12-09 12:05:35.321508717 +0000 UTC m=+7467.713559798" lastFinishedPulling="2025-12-09 12:05:46.526414201 +0000 UTC m=+7478.918465282" observedRunningTime="2025-12-09 12:05:47.479347045 +0000 UTC m=+7479.871398136" watchObservedRunningTime="2025-12-09 12:05:47.485844139 +0000 UTC m=+7479.877895220" Dec 09 12:05:51 crc kubenswrapper[5002]: I1209 12:05:51.060918 5002 scope.go:117] "RemoveContainer" containerID="e2529a7a9d52b066d9a89cea25f96d0ac287ad63943546043a02bb681ab4f3e6" Dec 09 12:05:51 crc kubenswrapper[5002]: E1209 12:05:51.061700 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:05:54 crc kubenswrapper[5002]: I1209 12:05:54.234879 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9xwfq" Dec 09 12:05:54 crc kubenswrapper[5002]: I1209 12:05:54.235406 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9xwfq" Dec 09 12:05:54 crc kubenswrapper[5002]: I1209 12:05:54.304654 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9xwfq" Dec 09 12:05:54 crc kubenswrapper[5002]: I1209 12:05:54.580573 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9xwfq" Dec 09 12:05:54 crc kubenswrapper[5002]: I1209 12:05:54.632011 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9xwfq"] Dec 09 12:05:56 crc kubenswrapper[5002]: I1209 12:05:56.550384 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9xwfq" podUID="7863fa45-a88a-4e01-b289-81f07cfec936" containerName="registry-server" containerID="cri-o://898aa1ae70d299823625e1f270c5a6d70cd5f2ee97963b24092a7bf257f1b595" gracePeriod=2 Dec 09 12:05:57 crc kubenswrapper[5002]: I1209 12:05:57.070150 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9xwfq" Dec 09 12:05:57 crc kubenswrapper[5002]: I1209 12:05:57.234476 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7863fa45-a88a-4e01-b289-81f07cfec936-utilities\") pod \"7863fa45-a88a-4e01-b289-81f07cfec936\" (UID: \"7863fa45-a88a-4e01-b289-81f07cfec936\") " Dec 09 12:05:57 crc kubenswrapper[5002]: I1209 12:05:57.234758 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7863fa45-a88a-4e01-b289-81f07cfec936-catalog-content\") pod \"7863fa45-a88a-4e01-b289-81f07cfec936\" (UID: \"7863fa45-a88a-4e01-b289-81f07cfec936\") " Dec 09 12:05:57 crc kubenswrapper[5002]: I1209 12:05:57.234961 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8bbj\" (UniqueName: \"kubernetes.io/projected/7863fa45-a88a-4e01-b289-81f07cfec936-kube-api-access-k8bbj\") pod \"7863fa45-a88a-4e01-b289-81f07cfec936\" (UID: \"7863fa45-a88a-4e01-b289-81f07cfec936\") " Dec 09 12:05:57 crc kubenswrapper[5002]: I1209 12:05:57.235274 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7863fa45-a88a-4e01-b289-81f07cfec936-utilities" (OuterVolumeSpecName: "utilities") pod "7863fa45-a88a-4e01-b289-81f07cfec936" (UID: "7863fa45-a88a-4e01-b289-81f07cfec936"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:05:57 crc kubenswrapper[5002]: I1209 12:05:57.235595 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7863fa45-a88a-4e01-b289-81f07cfec936-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 12:05:57 crc kubenswrapper[5002]: I1209 12:05:57.240161 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7863fa45-a88a-4e01-b289-81f07cfec936-kube-api-access-k8bbj" (OuterVolumeSpecName: "kube-api-access-k8bbj") pod "7863fa45-a88a-4e01-b289-81f07cfec936" (UID: "7863fa45-a88a-4e01-b289-81f07cfec936"). InnerVolumeSpecName "kube-api-access-k8bbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:05:57 crc kubenswrapper[5002]: I1209 12:05:57.337480 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8bbj\" (UniqueName: \"kubernetes.io/projected/7863fa45-a88a-4e01-b289-81f07cfec936-kube-api-access-k8bbj\") on node \"crc\" DevicePath \"\"" Dec 09 12:05:57 crc kubenswrapper[5002]: I1209 12:05:57.563658 5002 generic.go:334] "Generic (PLEG): container finished" podID="7863fa45-a88a-4e01-b289-81f07cfec936" containerID="898aa1ae70d299823625e1f270c5a6d70cd5f2ee97963b24092a7bf257f1b595" exitCode=0 Dec 09 12:05:57 crc kubenswrapper[5002]: I1209 12:05:57.563694 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9xwfq" event={"ID":"7863fa45-a88a-4e01-b289-81f07cfec936","Type":"ContainerDied","Data":"898aa1ae70d299823625e1f270c5a6d70cd5f2ee97963b24092a7bf257f1b595"} Dec 09 12:05:57 crc kubenswrapper[5002]: I1209 12:05:57.563719 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9xwfq" event={"ID":"7863fa45-a88a-4e01-b289-81f07cfec936","Type":"ContainerDied","Data":"0b713a6f920009e829bac37dcf42d15ca383e80590f2c497e29b0b25e5b06c9c"} Dec 09 12:05:57 crc kubenswrapper[5002]: I1209 12:05:57.563739 5002 scope.go:117] "RemoveContainer" containerID="898aa1ae70d299823625e1f270c5a6d70cd5f2ee97963b24092a7bf257f1b595" Dec 09 12:05:57 crc kubenswrapper[5002]: I1209 12:05:57.566116 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9xwfq" Dec 09 12:05:57 crc kubenswrapper[5002]: I1209 12:05:57.605050 5002 scope.go:117] "RemoveContainer" containerID="15f0a2d87eca514468aab315e4903745801f02e7e0fee6a14f965c9dfef09268" Dec 09 12:05:57 crc kubenswrapper[5002]: I1209 12:05:57.637642 5002 scope.go:117] "RemoveContainer" containerID="fbeed9c6b000894ab5f772be693596808d102ec77e13a080e12cb75919a9b5f9" Dec 09 12:05:57 crc kubenswrapper[5002]: I1209 12:05:57.709097 5002 scope.go:117] "RemoveContainer" containerID="898aa1ae70d299823625e1f270c5a6d70cd5f2ee97963b24092a7bf257f1b595" Dec 09 12:05:57 crc kubenswrapper[5002]: E1209 12:05:57.709679 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"898aa1ae70d299823625e1f270c5a6d70cd5f2ee97963b24092a7bf257f1b595\": container with ID starting with 898aa1ae70d299823625e1f270c5a6d70cd5f2ee97963b24092a7bf257f1b595 not found: ID does not exist" containerID="898aa1ae70d299823625e1f270c5a6d70cd5f2ee97963b24092a7bf257f1b595" Dec 09 12:05:57 crc kubenswrapper[5002]: I1209 12:05:57.709747 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"898aa1ae70d299823625e1f270c5a6d70cd5f2ee97963b24092a7bf257f1b595"} err="failed to get container status \"898aa1ae70d299823625e1f270c5a6d70cd5f2ee97963b24092a7bf257f1b595\": rpc error: code = NotFound desc = could not find container \"898aa1ae70d299823625e1f270c5a6d70cd5f2ee97963b24092a7bf257f1b595\": container with ID starting with 898aa1ae70d299823625e1f270c5a6d70cd5f2ee97963b24092a7bf257f1b595 not found: ID does not exist" Dec 09 12:05:57 crc kubenswrapper[5002]: I1209 12:05:57.709787 5002 scope.go:117] "RemoveContainer" containerID="15f0a2d87eca514468aab315e4903745801f02e7e0fee6a14f965c9dfef09268" Dec 09 12:05:57 crc kubenswrapper[5002]: E1209 12:05:57.710267 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15f0a2d87eca514468aab315e4903745801f02e7e0fee6a14f965c9dfef09268\": container with ID starting with 15f0a2d87eca514468aab315e4903745801f02e7e0fee6a14f965c9dfef09268 not found: ID does not exist" containerID="15f0a2d87eca514468aab315e4903745801f02e7e0fee6a14f965c9dfef09268" Dec 09 12:05:57 crc kubenswrapper[5002]: I1209 12:05:57.710311 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15f0a2d87eca514468aab315e4903745801f02e7e0fee6a14f965c9dfef09268"} err="failed to get container status \"15f0a2d87eca514468aab315e4903745801f02e7e0fee6a14f965c9dfef09268\": rpc error: code = NotFound desc = could not find container \"15f0a2d87eca514468aab315e4903745801f02e7e0fee6a14f965c9dfef09268\": container with ID starting with 15f0a2d87eca514468aab315e4903745801f02e7e0fee6a14f965c9dfef09268 not found: ID does not exist" Dec 09 12:05:57 crc kubenswrapper[5002]: I1209 12:05:57.710339 5002 scope.go:117] "RemoveContainer" containerID="fbeed9c6b000894ab5f772be693596808d102ec77e13a080e12cb75919a9b5f9" Dec 09 12:05:57 crc kubenswrapper[5002]: E1209 12:05:57.710733 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbeed9c6b000894ab5f772be693596808d102ec77e13a080e12cb75919a9b5f9\": container with ID starting with fbeed9c6b000894ab5f772be693596808d102ec77e13a080e12cb75919a9b5f9 not found: ID does not exist" containerID="fbeed9c6b000894ab5f772be693596808d102ec77e13a080e12cb75919a9b5f9" 
Dec 09 12:05:57 crc kubenswrapper[5002]: I1209 12:05:57.710760 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbeed9c6b000894ab5f772be693596808d102ec77e13a080e12cb75919a9b5f9"} err="failed to get container status \"fbeed9c6b000894ab5f772be693596808d102ec77e13a080e12cb75919a9b5f9\": rpc error: code = NotFound desc = could not find container \"fbeed9c6b000894ab5f772be693596808d102ec77e13a080e12cb75919a9b5f9\": container with ID starting with fbeed9c6b000894ab5f772be693596808d102ec77e13a080e12cb75919a9b5f9 not found: ID does not exist"
Dec 09 12:05:57 crc kubenswrapper[5002]: I1209 12:05:57.766783 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7863fa45-a88a-4e01-b289-81f07cfec936-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7863fa45-a88a-4e01-b289-81f07cfec936" (UID: "7863fa45-a88a-4e01-b289-81f07cfec936"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 12:05:57 crc kubenswrapper[5002]: I1209 12:05:57.848880 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7863fa45-a88a-4e01-b289-81f07cfec936-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 09 12:05:57 crc kubenswrapper[5002]: I1209 12:05:57.964926 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9xwfq"]
Dec 09 12:05:57 crc kubenswrapper[5002]: I1209 12:05:57.974962 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9xwfq"]
Dec 09 12:05:58 crc kubenswrapper[5002]: I1209 12:05:58.077725 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7863fa45-a88a-4e01-b289-81f07cfec936" path="/var/lib/kubelet/pods/7863fa45-a88a-4e01-b289-81f07cfec936/volumes"
Dec 09 12:06:04 crc kubenswrapper[5002]: I1209 12:06:04.060948 5002 scope.go:117] "RemoveContainer" containerID="e2529a7a9d52b066d9a89cea25f96d0ac287ad63943546043a02bb681ab4f3e6"
Dec 09 12:06:04 crc kubenswrapper[5002]: E1209 12:06:04.061758 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 12:06:15 crc kubenswrapper[5002]: I1209 12:06:15.060752 5002 scope.go:117] "RemoveContainer" containerID="e2529a7a9d52b066d9a89cea25f96d0ac287ad63943546043a02bb681ab4f3e6"
Dec 09 12:06:15 crc kubenswrapper[5002]: E1209 12:06:15.061571 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 12:06:27 crc kubenswrapper[5002]: I1209 12:06:27.060561 5002 scope.go:117] "RemoveContainer" containerID="e2529a7a9d52b066d9a89cea25f96d0ac287ad63943546043a02bb681ab4f3e6"
Dec 09 12:06:27 crc kubenswrapper[5002]: E1209 12:06:27.061252 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 12:06:33 crc kubenswrapper[5002]: I1209 12:06:33.985720 5002 generic.go:334] "Generic (PLEG): container finished" podID="5a0b071a-c032-4ba5-a808-afcfda05fef6" containerID="b57ec7eb19269a4859bfeb3dddcbb0935f607b5ebd17997d3cd398b89a687d64" exitCode=0
Dec 09 12:06:33 crc kubenswrapper[5002]: I1209 12:06:33.985805 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-r26xg" event={"ID":"5a0b071a-c032-4ba5-a808-afcfda05fef6","Type":"ContainerDied","Data":"b57ec7eb19269a4859bfeb3dddcbb0935f607b5ebd17997d3cd398b89a687d64"}
Dec 09 12:06:35 crc kubenswrapper[5002]: I1209 12:06:35.532123 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-r26xg"
Dec 09 12:06:35 crc kubenswrapper[5002]: I1209 12:06:35.621844 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9tvt\" (UniqueName: \"kubernetes.io/projected/5a0b071a-c032-4ba5-a808-afcfda05fef6-kube-api-access-w9tvt\") pod \"5a0b071a-c032-4ba5-a808-afcfda05fef6\" (UID: \"5a0b071a-c032-4ba5-a808-afcfda05fef6\") "
Dec 09 12:06:35 crc kubenswrapper[5002]: I1209 12:06:35.622228 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5a0b071a-c032-4ba5-a808-afcfda05fef6-ssh-key\") pod \"5a0b071a-c032-4ba5-a808-afcfda05fef6\" (UID: \"5a0b071a-c032-4ba5-a808-afcfda05fef6\") "
Dec 09 12:06:35 crc kubenswrapper[5002]: I1209 12:06:35.622413 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a0b071a-c032-4ba5-a808-afcfda05fef6-inventory\") pod \"5a0b071a-c032-4ba5-a808-afcfda05fef6\" (UID: \"5a0b071a-c032-4ba5-a808-afcfda05fef6\") "
Dec 09 12:06:35 crc kubenswrapper[5002]: I1209 12:06:35.622481 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5a0b071a-c032-4ba5-a808-afcfda05fef6-ceph\") pod \"5a0b071a-c032-4ba5-a808-afcfda05fef6\" (UID: \"5a0b071a-c032-4ba5-a808-afcfda05fef6\") "
Dec 09 12:06:35 crc kubenswrapper[5002]: I1209 12:06:35.627697 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a0b071a-c032-4ba5-a808-afcfda05fef6-ceph" (OuterVolumeSpecName: "ceph") pod "5a0b071a-c032-4ba5-a808-afcfda05fef6" (UID: "5a0b071a-c032-4ba5-a808-afcfda05fef6"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 12:06:35 crc kubenswrapper[5002]: I1209 12:06:35.633281 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a0b071a-c032-4ba5-a808-afcfda05fef6-kube-api-access-w9tvt" (OuterVolumeSpecName: "kube-api-access-w9tvt") pod "5a0b071a-c032-4ba5-a808-afcfda05fef6" (UID: "5a0b071a-c032-4ba5-a808-afcfda05fef6"). InnerVolumeSpecName "kube-api-access-w9tvt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 12:06:35 crc kubenswrapper[5002]: I1209 12:06:35.653667 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a0b071a-c032-4ba5-a808-afcfda05fef6-inventory" (OuterVolumeSpecName: "inventory") pod "5a0b071a-c032-4ba5-a808-afcfda05fef6" (UID: "5a0b071a-c032-4ba5-a808-afcfda05fef6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 12:06:35 crc kubenswrapper[5002]: I1209 12:06:35.659871 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a0b071a-c032-4ba5-a808-afcfda05fef6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5a0b071a-c032-4ba5-a808-afcfda05fef6" (UID: "5a0b071a-c032-4ba5-a808-afcfda05fef6"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 12:06:35 crc kubenswrapper[5002]: I1209 12:06:35.725425 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9tvt\" (UniqueName: \"kubernetes.io/projected/5a0b071a-c032-4ba5-a808-afcfda05fef6-kube-api-access-w9tvt\") on node \"crc\" DevicePath \"\""
Dec 09 12:06:35 crc kubenswrapper[5002]: I1209 12:06:35.725462 5002 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5a0b071a-c032-4ba5-a808-afcfda05fef6-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 09 12:06:35 crc kubenswrapper[5002]: I1209 12:06:35.725472 5002 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a0b071a-c032-4ba5-a808-afcfda05fef6-inventory\") on node \"crc\" DevicePath \"\""
Dec 09 12:06:35 crc kubenswrapper[5002]: I1209 12:06:35.725481 5002 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5a0b071a-c032-4ba5-a808-afcfda05fef6-ceph\") on node \"crc\" DevicePath \"\""
Dec 09 12:06:36 crc kubenswrapper[5002]: I1209 12:06:36.025437 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-r26xg" event={"ID":"5a0b071a-c032-4ba5-a808-afcfda05fef6","Type":"ContainerDied","Data":"772f725e5736783b74a409329be8386181c78f2a15beb5d7c657583043207ad6"}
Dec 09 12:06:36 crc kubenswrapper[5002]: I1209 12:06:36.025804 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="772f725e5736783b74a409329be8386181c78f2a15beb5d7c657583043207ad6"
Dec 09 12:06:36 crc kubenswrapper[5002]: I1209 12:06:36.025498 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-r26xg"
Dec 09 12:06:36 crc kubenswrapper[5002]: I1209 12:06:36.318744 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-s87gq"]
Dec 09 12:06:36 crc kubenswrapper[5002]: E1209 12:06:36.319448 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7863fa45-a88a-4e01-b289-81f07cfec936" containerName="extract-utilities"
Dec 09 12:06:36 crc kubenswrapper[5002]: I1209 12:06:36.319465 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="7863fa45-a88a-4e01-b289-81f07cfec936" containerName="extract-utilities"
Dec 09 12:06:36 crc kubenswrapper[5002]: E1209 12:06:36.319489 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a0b071a-c032-4ba5-a808-afcfda05fef6" containerName="install-os-openstack-openstack-cell1"
Dec 09 12:06:36 crc kubenswrapper[5002]: I1209 12:06:36.319496 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a0b071a-c032-4ba5-a808-afcfda05fef6" containerName="install-os-openstack-openstack-cell1"
Dec 09 12:06:36 crc kubenswrapper[5002]: E1209 12:06:36.319514 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7863fa45-a88a-4e01-b289-81f07cfec936" containerName="extract-content"
Dec 09 12:06:36 crc kubenswrapper[5002]: I1209 12:06:36.319520 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="7863fa45-a88a-4e01-b289-81f07cfec936" containerName="extract-content"
Dec 09 12:06:36 crc kubenswrapper[5002]: E1209 12:06:36.319544 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7863fa45-a88a-4e01-b289-81f07cfec936" containerName="registry-server"
Dec 09 12:06:36 crc kubenswrapper[5002]: I1209 12:06:36.319549 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="7863fa45-a88a-4e01-b289-81f07cfec936" containerName="registry-server"
Dec 09 12:06:36 crc kubenswrapper[5002]: I1209 12:06:36.319738 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a0b071a-c032-4ba5-a808-afcfda05fef6" containerName="install-os-openstack-openstack-cell1"
Dec 09 12:06:36 crc kubenswrapper[5002]: I1209 12:06:36.319754 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="7863fa45-a88a-4e01-b289-81f07cfec936" containerName="registry-server"
Dec 09 12:06:36 crc kubenswrapper[5002]: I1209 12:06:36.323692 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-s87gq"
Dec 09 12:06:36 crc kubenswrapper[5002]: I1209 12:06:36.326496 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Dec 09 12:06:36 crc kubenswrapper[5002]: I1209 12:06:36.326742 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ngftr"
Dec 09 12:06:36 crc kubenswrapper[5002]: I1209 12:06:36.326750 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Dec 09 12:06:36 crc kubenswrapper[5002]: I1209 12:06:36.334176 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 09 12:06:36 crc kubenswrapper[5002]: I1209 12:06:36.343823 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-s87gq"]
Dec 09 12:06:36 crc kubenswrapper[5002]: I1209 12:06:36.438413 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5287e90f-c20c-41df-8e2a-ced571a234d5-ceph\") pod \"configure-os-openstack-openstack-cell1-s87gq\" (UID: \"5287e90f-c20c-41df-8e2a-ced571a234d5\") " pod="openstack/configure-os-openstack-openstack-cell1-s87gq"
Dec 09 12:06:36 crc kubenswrapper[5002]: I1209 12:06:36.438472 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5287e90f-c20c-41df-8e2a-ced571a234d5-inventory\") pod \"configure-os-openstack-openstack-cell1-s87gq\" (UID: \"5287e90f-c20c-41df-8e2a-ced571a234d5\") " pod="openstack/configure-os-openstack-openstack-cell1-s87gq"
Dec 09 12:06:36 crc kubenswrapper[5002]: I1209 12:06:36.438520 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqblf\" (UniqueName: \"kubernetes.io/projected/5287e90f-c20c-41df-8e2a-ced571a234d5-kube-api-access-sqblf\") pod \"configure-os-openstack-openstack-cell1-s87gq\" (UID: \"5287e90f-c20c-41df-8e2a-ced571a234d5\") " pod="openstack/configure-os-openstack-openstack-cell1-s87gq"
Dec 09 12:06:36 crc kubenswrapper[5002]: I1209 12:06:36.438548 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5287e90f-c20c-41df-8e2a-ced571a234d5-ssh-key\") pod \"configure-os-openstack-openstack-cell1-s87gq\" (UID: \"5287e90f-c20c-41df-8e2a-ced571a234d5\") " pod="openstack/configure-os-openstack-openstack-cell1-s87gq"
Dec 09 12:06:36 crc kubenswrapper[5002]: I1209 12:06:36.539834 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5287e90f-c20c-41df-8e2a-ced571a234d5-ssh-key\") pod \"configure-os-openstack-openstack-cell1-s87gq\" (UID: \"5287e90f-c20c-41df-8e2a-ced571a234d5\") " pod="openstack/configure-os-openstack-openstack-cell1-s87gq"
Dec 09 12:06:36 crc kubenswrapper[5002]: I1209 12:06:36.540016 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5287e90f-c20c-41df-8e2a-ced571a234d5-ceph\") pod \"configure-os-openstack-openstack-cell1-s87gq\" (UID: \"5287e90f-c20c-41df-8e2a-ced571a234d5\") " pod="openstack/configure-os-openstack-openstack-cell1-s87gq"
Dec 09 12:06:36 crc kubenswrapper[5002]: I1209 12:06:36.540048 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5287e90f-c20c-41df-8e2a-ced571a234d5-inventory\") pod \"configure-os-openstack-openstack-cell1-s87gq\" (UID: \"5287e90f-c20c-41df-8e2a-ced571a234d5\") " pod="openstack/configure-os-openstack-openstack-cell1-s87gq"
Dec 09 12:06:36 crc kubenswrapper[5002]: I1209 12:06:36.540093 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqblf\" (UniqueName: \"kubernetes.io/projected/5287e90f-c20c-41df-8e2a-ced571a234d5-kube-api-access-sqblf\") pod \"configure-os-openstack-openstack-cell1-s87gq\" (UID: \"5287e90f-c20c-41df-8e2a-ced571a234d5\") " pod="openstack/configure-os-openstack-openstack-cell1-s87gq"
Dec 09 12:06:36 crc kubenswrapper[5002]: I1209 12:06:36.546243 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5287e90f-c20c-41df-8e2a-ced571a234d5-ssh-key\") pod \"configure-os-openstack-openstack-cell1-s87gq\" (UID: \"5287e90f-c20c-41df-8e2a-ced571a234d5\") " pod="openstack/configure-os-openstack-openstack-cell1-s87gq"
Dec 09 12:06:36 crc kubenswrapper[5002]: I1209 12:06:36.547038 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5287e90f-c20c-41df-8e2a-ced571a234d5-ceph\") pod \"configure-os-openstack-openstack-cell1-s87gq\" (UID: \"5287e90f-c20c-41df-8e2a-ced571a234d5\") " pod="openstack/configure-os-openstack-openstack-cell1-s87gq"
Dec 09 12:06:36 crc kubenswrapper[5002]: I1209 12:06:36.565411 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqblf\" (UniqueName: \"kubernetes.io/projected/5287e90f-c20c-41df-8e2a-ced571a234d5-kube-api-access-sqblf\") pod \"configure-os-openstack-openstack-cell1-s87gq\" (UID: \"5287e90f-c20c-41df-8e2a-ced571a234d5\") " pod="openstack/configure-os-openstack-openstack-cell1-s87gq"
Dec 09 12:06:36 crc kubenswrapper[5002]: I1209 12:06:36.566373 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5287e90f-c20c-41df-8e2a-ced571a234d5-inventory\") pod \"configure-os-openstack-openstack-cell1-s87gq\" (UID: \"5287e90f-c20c-41df-8e2a-ced571a234d5\") " pod="openstack/configure-os-openstack-openstack-cell1-s87gq"
Dec 09 12:06:36 crc kubenswrapper[5002]: I1209 12:06:36.688805 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-s87gq"
Dec 09 12:06:37 crc kubenswrapper[5002]: I1209 12:06:37.478048 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-s87gq"]
Dec 09 12:06:38 crc kubenswrapper[5002]: I1209 12:06:38.056947 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-s87gq" event={"ID":"5287e90f-c20c-41df-8e2a-ced571a234d5","Type":"ContainerStarted","Data":"d8863b0fb1a07f0f4af8166f35ccdef8c2ac057d901c67da0148c798a0f6faac"}
Dec 09 12:06:39 crc kubenswrapper[5002]: I1209 12:06:39.092007 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-s87gq" event={"ID":"5287e90f-c20c-41df-8e2a-ced571a234d5","Type":"ContainerStarted","Data":"b5a121e5ea6ce4d2080300af28645bace3748bb9acb67b8b09bb244296e64358"}
Dec 09 12:06:39 crc kubenswrapper[5002]: I1209 12:06:39.117259 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-openstack-openstack-cell1-s87gq" podStartSLOduration=2.377975185 podStartE2EDuration="3.117240691s" podCreationTimestamp="2025-12-09 12:06:36 +0000 UTC" firstStartedPulling="2025-12-09 12:06:37.484077553 +0000 UTC m=+7529.876128634" lastFinishedPulling="2025-12-09 12:06:38.223343059 +0000 UTC m=+7530.615394140" observedRunningTime="2025-12-09 12:06:39.109214705 +0000 UTC m=+7531.501265786" watchObservedRunningTime="2025-12-09 12:06:39.117240691 +0000 UTC m=+7531.509291772"
Dec 09 12:06:42 crc kubenswrapper[5002]: I1209 12:06:42.060401 5002 scope.go:117] "RemoveContainer" containerID="e2529a7a9d52b066d9a89cea25f96d0ac287ad63943546043a02bb681ab4f3e6"
Dec 09 12:06:42 crc kubenswrapper[5002]: E1209 12:06:42.061353 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 12:06:54 crc kubenswrapper[5002]: I1209 12:06:54.060923 5002 scope.go:117] "RemoveContainer" containerID="e2529a7a9d52b066d9a89cea25f96d0ac287ad63943546043a02bb681ab4f3e6"
Dec 09 12:06:54 crc kubenswrapper[5002]: E1209 12:06:54.062449 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 12:06:56 crc kubenswrapper[5002]: I1209 12:06:56.487430 5002 scope.go:117] "RemoveContainer" containerID="b3d6e42c0726679d60f9c22eabac6ecccb3fa91bcac3adb7105c4767b413a8ba"
Dec 09 12:06:56 crc kubenswrapper[5002]: I1209 12:06:56.521867 5002 scope.go:117] "RemoveContainer" containerID="c1100cd1456c99938e1f45086ac2402de1e93be7e17cc84a901f6df72609bcbb"
Dec 09 12:06:56 crc kubenswrapper[5002]: I1209 12:06:56.569698 5002 scope.go:117] "RemoveContainer" containerID="47527bc7a7064455b3619435d335e612e30d12198c55f84f36fce945e0257833"
Dec 09 12:06:58 crc kubenswrapper[5002]: I1209 12:06:58.890018 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sr552"]
Dec 09 12:06:58 crc kubenswrapper[5002]: I1209 12:06:58.917273 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sr552"]
Dec 09 12:06:58 crc kubenswrapper[5002]: I1209 12:06:58.917411 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sr552"
Dec 09 12:06:59 crc kubenswrapper[5002]: I1209 12:06:59.043802 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pkhz\" (UniqueName: \"kubernetes.io/projected/99134fa3-6ebd-4ab9-8871-409b8494fbc6-kube-api-access-2pkhz\") pod \"community-operators-sr552\" (UID: \"99134fa3-6ebd-4ab9-8871-409b8494fbc6\") " pod="openshift-marketplace/community-operators-sr552"
Dec 09 12:06:59 crc kubenswrapper[5002]: I1209 12:06:59.043897 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99134fa3-6ebd-4ab9-8871-409b8494fbc6-utilities\") pod \"community-operators-sr552\" (UID: \"99134fa3-6ebd-4ab9-8871-409b8494fbc6\") " pod="openshift-marketplace/community-operators-sr552"
Dec 09 12:06:59 crc kubenswrapper[5002]: I1209 12:06:59.044022 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99134fa3-6ebd-4ab9-8871-409b8494fbc6-catalog-content\") pod \"community-operators-sr552\" (UID: \"99134fa3-6ebd-4ab9-8871-409b8494fbc6\") " pod="openshift-marketplace/community-operators-sr552"
Dec 09 12:06:59 crc kubenswrapper[5002]: I1209 12:06:59.145708 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99134fa3-6ebd-4ab9-8871-409b8494fbc6-catalog-content\") pod \"community-operators-sr552\" (UID: \"99134fa3-6ebd-4ab9-8871-409b8494fbc6\") " pod="openshift-marketplace/community-operators-sr552"
Dec 09 12:06:59 crc kubenswrapper[5002]: I1209 12:06:59.146079 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pkhz\" (UniqueName: \"kubernetes.io/projected/99134fa3-6ebd-4ab9-8871-409b8494fbc6-kube-api-access-2pkhz\") pod \"community-operators-sr552\" (UID: \"99134fa3-6ebd-4ab9-8871-409b8494fbc6\") " pod="openshift-marketplace/community-operators-sr552"
Dec 09 12:06:59 crc kubenswrapper[5002]: I1209 12:06:59.146110 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99134fa3-6ebd-4ab9-8871-409b8494fbc6-utilities\") pod \"community-operators-sr552\" (UID: \"99134fa3-6ebd-4ab9-8871-409b8494fbc6\") " pod="openshift-marketplace/community-operators-sr552"
Dec 09 12:06:59 crc kubenswrapper[5002]: I1209 12:06:59.146614 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99134fa3-6ebd-4ab9-8871-409b8494fbc6-utilities\") pod \"community-operators-sr552\" (UID: \"99134fa3-6ebd-4ab9-8871-409b8494fbc6\") " pod="openshift-marketplace/community-operators-sr552"
Dec 09 12:06:59 crc kubenswrapper[5002]: I1209 12:06:59.146888 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99134fa3-6ebd-4ab9-8871-409b8494fbc6-catalog-content\") pod \"community-operators-sr552\" (UID: \"99134fa3-6ebd-4ab9-8871-409b8494fbc6\") " pod="openshift-marketplace/community-operators-sr552"
Dec 09 12:06:59 crc kubenswrapper[5002]: I1209 12:06:59.165144 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pkhz\" (UniqueName: \"kubernetes.io/projected/99134fa3-6ebd-4ab9-8871-409b8494fbc6-kube-api-access-2pkhz\") pod \"community-operators-sr552\" (UID: \"99134fa3-6ebd-4ab9-8871-409b8494fbc6\") " pod="openshift-marketplace/community-operators-sr552"
Dec 09 12:06:59 crc kubenswrapper[5002]: I1209 12:06:59.249328 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sr552"
Dec 09 12:06:59 crc kubenswrapper[5002]: I1209 12:06:59.925049 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sr552"]
Dec 09 12:07:00 crc kubenswrapper[5002]: I1209 12:07:00.297342 5002 generic.go:334] "Generic (PLEG): container finished" podID="99134fa3-6ebd-4ab9-8871-409b8494fbc6" containerID="705eb79b6ccb0ab9f294322eee486d35bc091addce69af6a03bd240c5474a941" exitCode=0
Dec 09 12:07:00 crc kubenswrapper[5002]: I1209 12:07:00.297418 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sr552" event={"ID":"99134fa3-6ebd-4ab9-8871-409b8494fbc6","Type":"ContainerDied","Data":"705eb79b6ccb0ab9f294322eee486d35bc091addce69af6a03bd240c5474a941"}
Dec 09 12:07:00 crc kubenswrapper[5002]: I1209 12:07:00.297695 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sr552" event={"ID":"99134fa3-6ebd-4ab9-8871-409b8494fbc6","Type":"ContainerStarted","Data":"1871fa5dbd1eb877b7ac75dc7eaf6945fec75b0fe37caa1e5d1306d2d5f520c4"}
Dec 09 12:07:01 crc kubenswrapper[5002]: I1209 12:07:01.316536 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sr552" event={"ID":"99134fa3-6ebd-4ab9-8871-409b8494fbc6","Type":"ContainerStarted","Data":"7a2df4f573a3f7949c1cdaa8cbac467294ecc69207fd129ade80e6677a92eb8e"}
Dec 09 12:07:02 crc kubenswrapper[5002]: I1209 12:07:02.330026 5002 generic.go:334] "Generic (PLEG): container finished" podID="99134fa3-6ebd-4ab9-8871-409b8494fbc6" containerID="7a2df4f573a3f7949c1cdaa8cbac467294ecc69207fd129ade80e6677a92eb8e" exitCode=0
Dec 09 12:07:02 crc kubenswrapper[5002]: I1209 12:07:02.330368 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sr552" event={"ID":"99134fa3-6ebd-4ab9-8871-409b8494fbc6","Type":"ContainerDied","Data":"7a2df4f573a3f7949c1cdaa8cbac467294ecc69207fd129ade80e6677a92eb8e"}
Dec 09 12:07:04 crc kubenswrapper[5002]: I1209 12:07:04.357876 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sr552" event={"ID":"99134fa3-6ebd-4ab9-8871-409b8494fbc6","Type":"ContainerStarted","Data":"9c8d6b4a84affcdadbdf38a934940561d44286ca2d0797e91753773e6c0eb03e"}
Dec 09 12:07:04 crc kubenswrapper[5002]: I1209 12:07:04.408110 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sr552" podStartSLOduration=2.985701868 podStartE2EDuration="6.408082398s" podCreationTimestamp="2025-12-09 12:06:58 +0000 UTC" firstStartedPulling="2025-12-09 12:07:00.29973786 +0000 UTC m=+7552.691788951" lastFinishedPulling="2025-12-09 12:07:03.7221184 +0000 UTC m=+7556.114169481" observedRunningTime="2025-12-09 12:07:04.391705529 +0000 UTC m=+7556.783756640" watchObservedRunningTime="2025-12-09 12:07:04.408082398 +0000 UTC m=+7556.800133519"
Dec 09 12:07:09 crc kubenswrapper[5002]: I1209 12:07:09.060714 5002 scope.go:117] "RemoveContainer" containerID="e2529a7a9d52b066d9a89cea25f96d0ac287ad63943546043a02bb681ab4f3e6"
Dec 09 12:07:09 crc kubenswrapper[5002]: E1209 12:07:09.061625 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 12:07:09 crc kubenswrapper[5002]: I1209 12:07:09.249701 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sr552"
Dec 09 12:07:09 crc kubenswrapper[5002]: I1209 12:07:09.249775 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sr552"
Dec 09 12:07:09 crc kubenswrapper[5002]: I1209 12:07:09.318716 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sr552"
Dec 09 12:07:09 crc kubenswrapper[5002]: I1209 12:07:09.475113 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sr552"
Dec 09 12:07:10 crc kubenswrapper[5002]: I1209 12:07:10.473772 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sr552"]
Dec 09 12:07:11 crc kubenswrapper[5002]: I1209 12:07:11.448617 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sr552" podUID="99134fa3-6ebd-4ab9-8871-409b8494fbc6" containerName="registry-server" containerID="cri-o://9c8d6b4a84affcdadbdf38a934940561d44286ca2d0797e91753773e6c0eb03e" gracePeriod=2
Dec 09 12:07:12 crc kubenswrapper[5002]: I1209 12:07:12.007797 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sr552"
Dec 09 12:07:12 crc kubenswrapper[5002]: I1209 12:07:12.177561 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pkhz\" (UniqueName: \"kubernetes.io/projected/99134fa3-6ebd-4ab9-8871-409b8494fbc6-kube-api-access-2pkhz\") pod \"99134fa3-6ebd-4ab9-8871-409b8494fbc6\" (UID: \"99134fa3-6ebd-4ab9-8871-409b8494fbc6\") "
Dec 09 12:07:12 crc kubenswrapper[5002]: I1209 12:07:12.177958 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99134fa3-6ebd-4ab9-8871-409b8494fbc6-utilities\") pod \"99134fa3-6ebd-4ab9-8871-409b8494fbc6\" (UID: \"99134fa3-6ebd-4ab9-8871-409b8494fbc6\") "
Dec 09 12:07:12 crc kubenswrapper[5002]: I1209 12:07:12.178071 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99134fa3-6ebd-4ab9-8871-409b8494fbc6-catalog-content\") pod \"99134fa3-6ebd-4ab9-8871-409b8494fbc6\" (UID: \"99134fa3-6ebd-4ab9-8871-409b8494fbc6\") "
Dec 09 12:07:12 crc kubenswrapper[5002]: I1209 12:07:12.179066 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99134fa3-6ebd-4ab9-8871-409b8494fbc6-utilities" (OuterVolumeSpecName: "utilities") pod "99134fa3-6ebd-4ab9-8871-409b8494fbc6" (UID: "99134fa3-6ebd-4ab9-8871-409b8494fbc6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 12:07:12 crc kubenswrapper[5002]: I1209 12:07:12.182876 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99134fa3-6ebd-4ab9-8871-409b8494fbc6-kube-api-access-2pkhz" (OuterVolumeSpecName: "kube-api-access-2pkhz") pod "99134fa3-6ebd-4ab9-8871-409b8494fbc6" (UID: "99134fa3-6ebd-4ab9-8871-409b8494fbc6"). InnerVolumeSpecName "kube-api-access-2pkhz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 12:07:12 crc kubenswrapper[5002]: I1209 12:07:12.242180 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99134fa3-6ebd-4ab9-8871-409b8494fbc6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "99134fa3-6ebd-4ab9-8871-409b8494fbc6" (UID: "99134fa3-6ebd-4ab9-8871-409b8494fbc6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 12:07:12 crc kubenswrapper[5002]: I1209 12:07:12.280726 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99134fa3-6ebd-4ab9-8871-409b8494fbc6-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 09 12:07:12 crc kubenswrapper[5002]: I1209 12:07:12.280768 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pkhz\" (UniqueName: \"kubernetes.io/projected/99134fa3-6ebd-4ab9-8871-409b8494fbc6-kube-api-access-2pkhz\") on node \"crc\" DevicePath \"\""
Dec 09 12:07:12 crc kubenswrapper[5002]: I1209 12:07:12.280781 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99134fa3-6ebd-4ab9-8871-409b8494fbc6-utilities\") on node \"crc\" DevicePath \"\""
Dec 09 12:07:12 crc kubenswrapper[5002]: I1209 12:07:12.467013 5002 generic.go:334] "Generic (PLEG): container finished" podID="99134fa3-6ebd-4ab9-8871-409b8494fbc6" containerID="9c8d6b4a84affcdadbdf38a934940561d44286ca2d0797e91753773e6c0eb03e" exitCode=0
Dec 09 12:07:12 crc kubenswrapper[5002]: I1209 12:07:12.467095 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sr552" event={"ID":"99134fa3-6ebd-4ab9-8871-409b8494fbc6","Type":"ContainerDied","Data":"9c8d6b4a84affcdadbdf38a934940561d44286ca2d0797e91753773e6c0eb03e"}
Dec 09 12:07:12 crc kubenswrapper[5002]: I1209 12:07:12.467130 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sr552" event={"ID":"99134fa3-6ebd-4ab9-8871-409b8494fbc6","Type":"ContainerDied","Data":"1871fa5dbd1eb877b7ac75dc7eaf6945fec75b0fe37caa1e5d1306d2d5f520c4"}
Dec 09 12:07:12 crc kubenswrapper[5002]: I1209 12:07:12.467153 5002 scope.go:117] "RemoveContainer" containerID="9c8d6b4a84affcdadbdf38a934940561d44286ca2d0797e91753773e6c0eb03e"
Dec 09 12:07:12 crc kubenswrapper[5002]: I1209 12:07:12.467341 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sr552"
Dec 09 12:07:12 crc kubenswrapper[5002]: I1209 12:07:12.494888 5002 scope.go:117] "RemoveContainer" containerID="7a2df4f573a3f7949c1cdaa8cbac467294ecc69207fd129ade80e6677a92eb8e"
Dec 09 12:07:12 crc kubenswrapper[5002]: I1209 12:07:12.508663 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sr552"]
Dec 09 12:07:12 crc kubenswrapper[5002]: I1209 12:07:12.520579 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sr552"]
Dec 09 12:07:12 crc kubenswrapper[5002]: I1209 12:07:12.535542 5002 scope.go:117] "RemoveContainer" containerID="705eb79b6ccb0ab9f294322eee486d35bc091addce69af6a03bd240c5474a941"
Dec 09 12:07:12 crc kubenswrapper[5002]: I1209 12:07:12.580512 5002 scope.go:117] "RemoveContainer" containerID="9c8d6b4a84affcdadbdf38a934940561d44286ca2d0797e91753773e6c0eb03e"
Dec 09 12:07:12 crc kubenswrapper[5002]: E1209 12:07:12.581122 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c8d6b4a84affcdadbdf38a934940561d44286ca2d0797e91753773e6c0eb03e\": container with ID starting with 9c8d6b4a84affcdadbdf38a934940561d44286ca2d0797e91753773e6c0eb03e not found: ID does not exist" containerID="9c8d6b4a84affcdadbdf38a934940561d44286ca2d0797e91753773e6c0eb03e"
Dec 09 12:07:12 crc kubenswrapper[5002]: I1209 12:07:12.581165 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c8d6b4a84affcdadbdf38a934940561d44286ca2d0797e91753773e6c0eb03e"} err="failed to get container status \"9c8d6b4a84affcdadbdf38a934940561d44286ca2d0797e91753773e6c0eb03e\": rpc error: code = NotFound desc = could not find container \"9c8d6b4a84affcdadbdf38a934940561d44286ca2d0797e91753773e6c0eb03e\": container with ID starting with 9c8d6b4a84affcdadbdf38a934940561d44286ca2d0797e91753773e6c0eb03e not found: ID does not exist"
Dec 09 12:07:12 crc kubenswrapper[5002]: I1209 12:07:12.581194 5002 scope.go:117] "RemoveContainer" containerID="7a2df4f573a3f7949c1cdaa8cbac467294ecc69207fd129ade80e6677a92eb8e"
Dec 09 12:07:12 crc kubenswrapper[5002]: E1209 12:07:12.581469 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a2df4f573a3f7949c1cdaa8cbac467294ecc69207fd129ade80e6677a92eb8e\": container with ID starting with 7a2df4f573a3f7949c1cdaa8cbac467294ecc69207fd129ade80e6677a92eb8e not found: ID does not exist" containerID="7a2df4f573a3f7949c1cdaa8cbac467294ecc69207fd129ade80e6677a92eb8e"
Dec 09 12:07:12 crc kubenswrapper[5002]: I1209 12:07:12.581577 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a2df4f573a3f7949c1cdaa8cbac467294ecc69207fd129ade80e6677a92eb8e"} err="failed to get container status \"7a2df4f573a3f7949c1cdaa8cbac467294ecc69207fd129ade80e6677a92eb8e\": rpc error: code = NotFound desc = could not find container \"7a2df4f573a3f7949c1cdaa8cbac467294ecc69207fd129ade80e6677a92eb8e\": container with ID starting with 7a2df4f573a3f7949c1cdaa8cbac467294ecc69207fd129ade80e6677a92eb8e not found: ID does not exist"
Dec 09 12:07:12 crc kubenswrapper[5002]: I1209 12:07:12.581671 5002 scope.go:117] "RemoveContainer" containerID="705eb79b6ccb0ab9f294322eee486d35bc091addce69af6a03bd240c5474a941"
Dec 09 12:07:12 crc kubenswrapper[5002]: E1209 12:07:12.582151 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"705eb79b6ccb0ab9f294322eee486d35bc091addce69af6a03bd240c5474a941\": container with ID starting with 705eb79b6ccb0ab9f294322eee486d35bc091addce69af6a03bd240c5474a941 not found: ID does not exist" containerID="705eb79b6ccb0ab9f294322eee486d35bc091addce69af6a03bd240c5474a941"
Dec 09 12:07:12 crc kubenswrapper[5002]: I1209 12:07:12.582895 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"705eb79b6ccb0ab9f294322eee486d35bc091addce69af6a03bd240c5474a941"} err="failed to get container status \"705eb79b6ccb0ab9f294322eee486d35bc091addce69af6a03bd240c5474a941\": rpc error: code = NotFound desc = could not find container \"705eb79b6ccb0ab9f294322eee486d35bc091addce69af6a03bd240c5474a941\": container with ID starting with 705eb79b6ccb0ab9f294322eee486d35bc091addce69af6a03bd240c5474a941 not found: ID does not exist"
Dec 09 12:07:14 crc kubenswrapper[5002]: I1209 12:07:14.076712 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99134fa3-6ebd-4ab9-8871-409b8494fbc6" path="/var/lib/kubelet/pods/99134fa3-6ebd-4ab9-8871-409b8494fbc6/volumes"
Dec 09 12:07:20 crc kubenswrapper[5002]: I1209 12:07:20.063775 5002 scope.go:117] "RemoveContainer" containerID="e2529a7a9d52b066d9a89cea25f96d0ac287ad63943546043a02bb681ab4f3e6"
Dec 09 12:07:20 crc kubenswrapper[5002]: E1209 12:07:20.065277 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 12:07:22 crc kubenswrapper[5002]: I1209 12:07:22.608151 5002 generic.go:334] "Generic (PLEG): container finished" podID="5287e90f-c20c-41df-8e2a-ced571a234d5" containerID="b5a121e5ea6ce4d2080300af28645bace3748bb9acb67b8b09bb244296e64358" exitCode=0
Dec 09 12:07:22 crc kubenswrapper[5002]: I1209 12:07:22.608218 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-s87gq" event={"ID":"5287e90f-c20c-41df-8e2a-ced571a234d5","Type":"ContainerDied","Data":"b5a121e5ea6ce4d2080300af28645bace3748bb9acb67b8b09bb244296e64358"}
Dec 09 12:07:24 crc kubenswrapper[5002]: I1209 12:07:24.146711 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-s87gq"
Dec 09 12:07:24 crc kubenswrapper[5002]: I1209 12:07:24.280281 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqblf\" (UniqueName: \"kubernetes.io/projected/5287e90f-c20c-41df-8e2a-ced571a234d5-kube-api-access-sqblf\") pod \"5287e90f-c20c-41df-8e2a-ced571a234d5\" (UID: \"5287e90f-c20c-41df-8e2a-ced571a234d5\") "
Dec 09 12:07:24 crc kubenswrapper[5002]: I1209 12:07:24.280417 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5287e90f-c20c-41df-8e2a-ced571a234d5-inventory\") pod \"5287e90f-c20c-41df-8e2a-ced571a234d5\" (UID: \"5287e90f-c20c-41df-8e2a-ced571a234d5\") "
Dec 09 12:07:24 crc kubenswrapper[5002]: I1209 12:07:24.280559 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5287e90f-c20c-41df-8e2a-ced571a234d5-ceph\") pod \"5287e90f-c20c-41df-8e2a-ced571a234d5\" (UID: \"5287e90f-c20c-41df-8e2a-ced571a234d5\") "
Dec 09 12:07:24 crc kubenswrapper[5002]: I1209 12:07:24.280687 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5287e90f-c20c-41df-8e2a-ced571a234d5-ssh-key\") pod \"5287e90f-c20c-41df-8e2a-ced571a234d5\" (UID: \"5287e90f-c20c-41df-8e2a-ced571a234d5\") "
Dec 09 12:07:24 crc kubenswrapper[5002]: I1209 12:07:24.293764 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5287e90f-c20c-41df-8e2a-ced571a234d5-kube-api-access-sqblf" (OuterVolumeSpecName: "kube-api-access-sqblf") pod "5287e90f-c20c-41df-8e2a-ced571a234d5" (UID: "5287e90f-c20c-41df-8e2a-ced571a234d5"). InnerVolumeSpecName "kube-api-access-sqblf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 12:07:24 crc kubenswrapper[5002]: I1209 12:07:24.294413 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5287e90f-c20c-41df-8e2a-ced571a234d5-ceph" (OuterVolumeSpecName: "ceph") pod "5287e90f-c20c-41df-8e2a-ced571a234d5" (UID: "5287e90f-c20c-41df-8e2a-ced571a234d5"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 12:07:24 crc kubenswrapper[5002]: I1209 12:07:24.318426 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5287e90f-c20c-41df-8e2a-ced571a234d5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5287e90f-c20c-41df-8e2a-ced571a234d5" (UID: "5287e90f-c20c-41df-8e2a-ced571a234d5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 12:07:24 crc kubenswrapper[5002]: I1209 12:07:24.319014 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5287e90f-c20c-41df-8e2a-ced571a234d5-inventory" (OuterVolumeSpecName: "inventory") pod "5287e90f-c20c-41df-8e2a-ced571a234d5" (UID: "5287e90f-c20c-41df-8e2a-ced571a234d5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 12:07:24 crc kubenswrapper[5002]: I1209 12:07:24.383639 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqblf\" (UniqueName: \"kubernetes.io/projected/5287e90f-c20c-41df-8e2a-ced571a234d5-kube-api-access-sqblf\") on node \"crc\" DevicePath \"\""
Dec 09 12:07:24 crc kubenswrapper[5002]: I1209 12:07:24.383686 5002 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5287e90f-c20c-41df-8e2a-ced571a234d5-inventory\") on node \"crc\" DevicePath \"\""
Dec 09 12:07:24 crc kubenswrapper[5002]: I1209 12:07:24.383701 5002 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5287e90f-c20c-41df-8e2a-ced571a234d5-ceph\") on node \"crc\" DevicePath \"\""
Dec 09 12:07:24 crc kubenswrapper[5002]: I1209 12:07:24.383713 5002 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5287e90f-c20c-41df-8e2a-ced571a234d5-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 09 12:07:24 crc kubenswrapper[5002]: I1209 12:07:24.628386 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-s87gq" event={"ID":"5287e90f-c20c-41df-8e2a-ced571a234d5","Type":"ContainerDied","Data":"d8863b0fb1a07f0f4af8166f35ccdef8c2ac057d901c67da0148c798a0f6faac"}
Dec 09 12:07:24 crc kubenswrapper[5002]: I1209 12:07:24.628425 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8863b0fb1a07f0f4af8166f35ccdef8c2ac057d901c67da0148c798a0f6faac"
Dec 09 12:07:24 crc kubenswrapper[5002]: I1209 12:07:24.628472 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-s87gq"
Dec 09 12:07:24 crc kubenswrapper[5002]: I1209 12:07:24.814240 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-openstack-r2pdh"]
Dec 09 12:07:24 crc kubenswrapper[5002]: E1209 12:07:24.815004 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99134fa3-6ebd-4ab9-8871-409b8494fbc6" containerName="registry-server"
Dec 09 12:07:24 crc kubenswrapper[5002]: I1209 12:07:24.815029 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="99134fa3-6ebd-4ab9-8871-409b8494fbc6" containerName="registry-server"
Dec 09 12:07:24 crc kubenswrapper[5002]: E1209 12:07:24.815054 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99134fa3-6ebd-4ab9-8871-409b8494fbc6" containerName="extract-content"
Dec 09 12:07:24 crc kubenswrapper[5002]: I1209 12:07:24.815061 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="99134fa3-6ebd-4ab9-8871-409b8494fbc6" containerName="extract-content"
Dec 09 12:07:24 crc kubenswrapper[5002]: E1209 12:07:24.815099 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5287e90f-c20c-41df-8e2a-ced571a234d5" containerName="configure-os-openstack-openstack-cell1"
Dec 09 12:07:24 crc kubenswrapper[5002]: I1209 12:07:24.815105 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="5287e90f-c20c-41df-8e2a-ced571a234d5" containerName="configure-os-openstack-openstack-cell1"
Dec 09 12:07:24 crc kubenswrapper[5002]: E1209 12:07:24.815114 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99134fa3-6ebd-4ab9-8871-409b8494fbc6" containerName="extract-utilities"
Dec 09 12:07:24 crc kubenswrapper[5002]: I1209 12:07:24.815120 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="99134fa3-6ebd-4ab9-8871-409b8494fbc6" containerName="extract-utilities"
Dec 09 12:07:24 crc kubenswrapper[5002]: I1209 12:07:24.815350 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="99134fa3-6ebd-4ab9-8871-409b8494fbc6" containerName="registry-server"
Dec 09 12:07:24 crc kubenswrapper[5002]: I1209 12:07:24.815376 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="5287e90f-c20c-41df-8e2a-ced571a234d5" containerName="configure-os-openstack-openstack-cell1"
Dec 09 12:07:24 crc kubenswrapper[5002]: I1209 12:07:24.816217 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-r2pdh"
Dec 09 12:07:24 crc kubenswrapper[5002]: I1209 12:07:24.818433 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ngftr"
Dec 09 12:07:24 crc kubenswrapper[5002]: I1209 12:07:24.818455 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 09 12:07:24 crc kubenswrapper[5002]: I1209 12:07:24.818718 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Dec 09 12:07:24 crc kubenswrapper[5002]: I1209 12:07:24.818774 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Dec 09 12:07:24 crc kubenswrapper[5002]: I1209 12:07:24.827242 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-r2pdh"]
Dec 09 12:07:24 crc kubenswrapper[5002]: I1209 12:07:24.994955 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/44c30495-c199-4f48-a69d-368ff294f0a8-ceph\") pod \"ssh-known-hosts-openstack-r2pdh\" (UID: \"44c30495-c199-4f48-a69d-368ff294f0a8\") " pod="openstack/ssh-known-hosts-openstack-r2pdh"
Dec 09 12:07:24 crc kubenswrapper[5002]: I1209 12:07:24.995041 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/44c30495-c199-4f48-a69d-368ff294f0a8-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-r2pdh\" (UID: \"44c30495-c199-4f48-a69d-368ff294f0a8\") " pod="openstack/ssh-known-hosts-openstack-r2pdh"
Dec 09 12:07:24 crc kubenswrapper[5002]: I1209 12:07:24.995096 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw9ls\" (UniqueName: \"kubernetes.io/projected/44c30495-c199-4f48-a69d-368ff294f0a8-kube-api-access-hw9ls\") pod \"ssh-known-hosts-openstack-r2pdh\" (UID: \"44c30495-c199-4f48-a69d-368ff294f0a8\") " pod="openstack/ssh-known-hosts-openstack-r2pdh"
Dec 09 12:07:24 crc kubenswrapper[5002]: I1209 12:07:24.995143 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/44c30495-c199-4f48-a69d-368ff294f0a8-inventory-0\") pod \"ssh-known-hosts-openstack-r2pdh\" (UID: \"44c30495-c199-4f48-a69d-368ff294f0a8\") " pod="openstack/ssh-known-hosts-openstack-r2pdh"
Dec 09 12:07:25 crc kubenswrapper[5002]: I1209 12:07:25.096917 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/44c30495-c199-4f48-a69d-368ff294f0a8-ceph\") pod \"ssh-known-hosts-openstack-r2pdh\" (UID: \"44c30495-c199-4f48-a69d-368ff294f0a8\") " pod="openstack/ssh-known-hosts-openstack-r2pdh"
Dec 09 12:07:25 crc kubenswrapper[5002]: I1209 12:07:25.097008 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/44c30495-c199-4f48-a69d-368ff294f0a8-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-r2pdh\" (UID: \"44c30495-c199-4f48-a69d-368ff294f0a8\") " pod="openstack/ssh-known-hosts-openstack-r2pdh"
Dec 09 12:07:25 crc kubenswrapper[5002]: I1209 12:07:25.097051 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hw9ls\" (UniqueName: \"kubernetes.io/projected/44c30495-c199-4f48-a69d-368ff294f0a8-kube-api-access-hw9ls\") pod \"ssh-known-hosts-openstack-r2pdh\" (UID: \"44c30495-c199-4f48-a69d-368ff294f0a8\") " pod="openstack/ssh-known-hosts-openstack-r2pdh"
Dec 09 12:07:25 crc kubenswrapper[5002]: I1209 12:07:25.097084 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/44c30495-c199-4f48-a69d-368ff294f0a8-inventory-0\") pod \"ssh-known-hosts-openstack-r2pdh\" (UID: \"44c30495-c199-4f48-a69d-368ff294f0a8\") " pod="openstack/ssh-known-hosts-openstack-r2pdh"
Dec 09 12:07:25 crc kubenswrapper[5002]: I1209 12:07:25.100908 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/44c30495-c199-4f48-a69d-368ff294f0a8-ceph\") pod \"ssh-known-hosts-openstack-r2pdh\" (UID: \"44c30495-c199-4f48-a69d-368ff294f0a8\") " pod="openstack/ssh-known-hosts-openstack-r2pdh"
Dec 09 12:07:25 crc kubenswrapper[5002]: I1209 12:07:25.101388 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/44c30495-c199-4f48-a69d-368ff294f0a8-inventory-0\") pod \"ssh-known-hosts-openstack-r2pdh\" (UID: \"44c30495-c199-4f48-a69d-368ff294f0a8\") " pod="openstack/ssh-known-hosts-openstack-r2pdh"
Dec 09 12:07:25 crc kubenswrapper[5002]: I1209 12:07:25.103617 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/44c30495-c199-4f48-a69d-368ff294f0a8-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-r2pdh\" (UID: \"44c30495-c199-4f48-a69d-368ff294f0a8\") " pod="openstack/ssh-known-hosts-openstack-r2pdh"
Dec 09 12:07:25 crc kubenswrapper[5002]: I1209 12:07:25.116895 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw9ls\" (UniqueName: \"kubernetes.io/projected/44c30495-c199-4f48-a69d-368ff294f0a8-kube-api-access-hw9ls\") pod \"ssh-known-hosts-openstack-r2pdh\" (UID: \"44c30495-c199-4f48-a69d-368ff294f0a8\") " pod="openstack/ssh-known-hosts-openstack-r2pdh"
Dec 09 12:07:25 crc kubenswrapper[5002]: I1209 12:07:25.142349 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-r2pdh"
Dec 09 12:07:25 crc kubenswrapper[5002]: I1209 12:07:25.770619 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-r2pdh"]
Dec 09 12:07:26 crc kubenswrapper[5002]: I1209 12:07:26.644891 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-r2pdh" event={"ID":"44c30495-c199-4f48-a69d-368ff294f0a8","Type":"ContainerStarted","Data":"25f035778712d3d35e3d347b2938a5e42e21defc7ca29978b8353bde7210f6b2"}
Dec 09 12:07:27 crc kubenswrapper[5002]: I1209 12:07:27.664642 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-r2pdh" event={"ID":"44c30495-c199-4f48-a69d-368ff294f0a8","Type":"ContainerStarted","Data":"25dc2b5328735dbb28e96951da6d163e622efba50fc6ea88b842e54f0e322004"}
Dec 09 12:07:27 crc kubenswrapper[5002]: I1209 12:07:27.696721 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-openstack-r2pdh" podStartSLOduration=3.04777856 podStartE2EDuration="3.696701735s" podCreationTimestamp="2025-12-09 12:07:24 +0000 UTC" firstStartedPulling="2025-12-09 12:07:25.776188504 +0000 UTC m=+7578.168239585" lastFinishedPulling="2025-12-09 12:07:26.425111679 +0000 UTC m=+7578.817162760" observedRunningTime="2025-12-09 12:07:27.687611931 +0000 UTC m=+7580.079663012" watchObservedRunningTime="2025-12-09 12:07:27.696701735 +0000 UTC m=+7580.088752816"
Dec 09 12:07:33 crc kubenswrapper[5002]: I1209 12:07:33.061323 5002 scope.go:117] "RemoveContainer" containerID="e2529a7a9d52b066d9a89cea25f96d0ac287ad63943546043a02bb681ab4f3e6"
Dec 09 12:07:33 crc kubenswrapper[5002]: E1209 12:07:33.061995 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 12:07:35 crc kubenswrapper[5002]: I1209 12:07:35.743520 5002 generic.go:334] "Generic (PLEG): container finished" podID="44c30495-c199-4f48-a69d-368ff294f0a8" containerID="25dc2b5328735dbb28e96951da6d163e622efba50fc6ea88b842e54f0e322004" exitCode=0
Dec 09 12:07:35 crc kubenswrapper[5002]: I1209 12:07:35.743645 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-r2pdh" event={"ID":"44c30495-c199-4f48-a69d-368ff294f0a8","Type":"ContainerDied","Data":"25dc2b5328735dbb28e96951da6d163e622efba50fc6ea88b842e54f0e322004"}
Dec 09 12:07:37 crc kubenswrapper[5002]: I1209 12:07:37.360158 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-r2pdh"
Dec 09 12:07:37 crc kubenswrapper[5002]: I1209 12:07:37.440262 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/44c30495-c199-4f48-a69d-368ff294f0a8-ssh-key-openstack-cell1\") pod \"44c30495-c199-4f48-a69d-368ff294f0a8\" (UID: \"44c30495-c199-4f48-a69d-368ff294f0a8\") "
Dec 09 12:07:37 crc kubenswrapper[5002]: I1209 12:07:37.440361 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hw9ls\" (UniqueName: \"kubernetes.io/projected/44c30495-c199-4f48-a69d-368ff294f0a8-kube-api-access-hw9ls\") pod \"44c30495-c199-4f48-a69d-368ff294f0a8\" (UID: \"44c30495-c199-4f48-a69d-368ff294f0a8\") "
Dec 09 12:07:37 crc kubenswrapper[5002]: I1209 12:07:37.441292 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/44c30495-c199-4f48-a69d-368ff294f0a8-inventory-0\") pod \"44c30495-c199-4f48-a69d-368ff294f0a8\" (UID: \"44c30495-c199-4f48-a69d-368ff294f0a8\") "
Dec 09 12:07:37 crc kubenswrapper[5002]: I1209 12:07:37.441524 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/44c30495-c199-4f48-a69d-368ff294f0a8-ceph\") pod \"44c30495-c199-4f48-a69d-368ff294f0a8\" (UID: \"44c30495-c199-4f48-a69d-368ff294f0a8\") "
Dec 09 12:07:37 crc kubenswrapper[5002]: I1209 12:07:37.452844 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44c30495-c199-4f48-a69d-368ff294f0a8-kube-api-access-hw9ls" (OuterVolumeSpecName: "kube-api-access-hw9ls") pod "44c30495-c199-4f48-a69d-368ff294f0a8" (UID: "44c30495-c199-4f48-a69d-368ff294f0a8"). InnerVolumeSpecName "kube-api-access-hw9ls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 12:07:37 crc kubenswrapper[5002]: I1209 12:07:37.464116 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44c30495-c199-4f48-a69d-368ff294f0a8-ceph" (OuterVolumeSpecName: "ceph") pod "44c30495-c199-4f48-a69d-368ff294f0a8" (UID: "44c30495-c199-4f48-a69d-368ff294f0a8"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 12:07:37 crc kubenswrapper[5002]: I1209 12:07:37.493150 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44c30495-c199-4f48-a69d-368ff294f0a8-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "44c30495-c199-4f48-a69d-368ff294f0a8" (UID: "44c30495-c199-4f48-a69d-368ff294f0a8"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 12:07:37 crc kubenswrapper[5002]: I1209 12:07:37.495437 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44c30495-c199-4f48-a69d-368ff294f0a8-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "44c30495-c199-4f48-a69d-368ff294f0a8" (UID: "44c30495-c199-4f48-a69d-368ff294f0a8"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 12:07:37 crc kubenswrapper[5002]: I1209 12:07:37.545358 5002 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/44c30495-c199-4f48-a69d-368ff294f0a8-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\""
Dec 09 12:07:37 crc kubenswrapper[5002]: I1209 12:07:37.545396 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hw9ls\" (UniqueName: \"kubernetes.io/projected/44c30495-c199-4f48-a69d-368ff294f0a8-kube-api-access-hw9ls\") on node \"crc\" DevicePath \"\""
Dec 09 12:07:37 crc kubenswrapper[5002]: I1209 12:07:37.545407 5002 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/44c30495-c199-4f48-a69d-368ff294f0a8-inventory-0\") on node \"crc\" DevicePath \"\""
Dec 09 12:07:37 crc kubenswrapper[5002]: I1209 12:07:37.545419 5002 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/44c30495-c199-4f48-a69d-368ff294f0a8-ceph\") on node \"crc\" DevicePath \"\""
Dec 09 12:07:37 crc kubenswrapper[5002]: I1209 12:07:37.767180 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-r2pdh" event={"ID":"44c30495-c199-4f48-a69d-368ff294f0a8","Type":"ContainerDied","Data":"25f035778712d3d35e3d347b2938a5e42e21defc7ca29978b8353bde7210f6b2"}
Dec 09 12:07:37 crc kubenswrapper[5002]: I1209 12:07:37.767240 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25f035778712d3d35e3d347b2938a5e42e21defc7ca29978b8353bde7210f6b2"
Dec 09 12:07:37 crc kubenswrapper[5002]: I1209 12:07:37.767275 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-r2pdh"
Dec 09 12:07:37 crc kubenswrapper[5002]: I1209 12:07:37.853650 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-openstack-openstack-cell1-f9wq6"]
Dec 09 12:07:37 crc kubenswrapper[5002]: E1209 12:07:37.854129 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44c30495-c199-4f48-a69d-368ff294f0a8" containerName="ssh-known-hosts-openstack"
Dec 09 12:07:37 crc kubenswrapper[5002]: I1209 12:07:37.854149 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="44c30495-c199-4f48-a69d-368ff294f0a8" containerName="ssh-known-hosts-openstack"
Dec 09 12:07:37 crc kubenswrapper[5002]: I1209 12:07:37.854367 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="44c30495-c199-4f48-a69d-368ff294f0a8" containerName="ssh-known-hosts-openstack"
Dec 09 12:07:37 crc kubenswrapper[5002]: I1209 12:07:37.855130 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-f9wq6"
Dec 09 12:07:37 crc kubenswrapper[5002]: I1209 12:07:37.861708 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Dec 09 12:07:37 crc kubenswrapper[5002]: I1209 12:07:37.862234 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ngftr"
Dec 09 12:07:37 crc kubenswrapper[5002]: I1209 12:07:37.862338 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 09 12:07:37 crc kubenswrapper[5002]: I1209 12:07:37.862512 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Dec 09 12:07:37 crc kubenswrapper[5002]: I1209 12:07:37.881004 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-f9wq6"]
Dec 09 12:07:37 crc kubenswrapper[5002]: I1209 12:07:37.954470 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/01e50bcf-190a-49f2-bde4-84a59dc60d2f-inventory\") pod \"run-os-openstack-openstack-cell1-f9wq6\" (UID: \"01e50bcf-190a-49f2-bde4-84a59dc60d2f\") " pod="openstack/run-os-openstack-openstack-cell1-f9wq6"
Dec 09 12:07:37 crc kubenswrapper[5002]: I1209 12:07:37.954854 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/01e50bcf-190a-49f2-bde4-84a59dc60d2f-ssh-key\") pod \"run-os-openstack-openstack-cell1-f9wq6\" (UID: \"01e50bcf-190a-49f2-bde4-84a59dc60d2f\") " pod="openstack/run-os-openstack-openstack-cell1-f9wq6"
Dec 09 12:07:37 crc kubenswrapper[5002]: I1209 12:07:37.954928 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/01e50bcf-190a-49f2-bde4-84a59dc60d2f-ceph\") pod \"run-os-openstack-openstack-cell1-f9wq6\" (UID: \"01e50bcf-190a-49f2-bde4-84a59dc60d2f\") " pod="openstack/run-os-openstack-openstack-cell1-f9wq6"
Dec 09 12:07:37 crc kubenswrapper[5002]: I1209 12:07:37.955010 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5rsr\" (UniqueName: \"kubernetes.io/projected/01e50bcf-190a-49f2-bde4-84a59dc60d2f-kube-api-access-q5rsr\") pod \"run-os-openstack-openstack-cell1-f9wq6\" (UID: \"01e50bcf-190a-49f2-bde4-84a59dc60d2f\") " pod="openstack/run-os-openstack-openstack-cell1-f9wq6"
Dec 09 12:07:38 crc kubenswrapper[5002]: I1209 12:07:38.057009 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/01e50bcf-190a-49f2-bde4-84a59dc60d2f-ssh-key\") pod \"run-os-openstack-openstack-cell1-f9wq6\" (UID: \"01e50bcf-190a-49f2-bde4-84a59dc60d2f\") " pod="openstack/run-os-openstack-openstack-cell1-f9wq6"
Dec 09 12:07:38 crc kubenswrapper[5002]: I1209 12:07:38.057083 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/01e50bcf-190a-49f2-bde4-84a59dc60d2f-ceph\") pod \"run-os-openstack-openstack-cell1-f9wq6\" (UID: \"01e50bcf-190a-49f2-bde4-84a59dc60d2f\") " pod="openstack/run-os-openstack-openstack-cell1-f9wq6"
Dec 09 12:07:38 crc kubenswrapper[5002]: I1209 12:07:38.057149 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume
\"kube-api-access-q5rsr\" (UniqueName: \"kubernetes.io/projected/01e50bcf-190a-49f2-bde4-84a59dc60d2f-kube-api-access-q5rsr\") pod \"run-os-openstack-openstack-cell1-f9wq6\" (UID: \"01e50bcf-190a-49f2-bde4-84a59dc60d2f\") " pod="openstack/run-os-openstack-openstack-cell1-f9wq6" Dec 09 12:07:38 crc kubenswrapper[5002]: I1209 12:07:38.057227 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/01e50bcf-190a-49f2-bde4-84a59dc60d2f-inventory\") pod \"run-os-openstack-openstack-cell1-f9wq6\" (UID: \"01e50bcf-190a-49f2-bde4-84a59dc60d2f\") " pod="openstack/run-os-openstack-openstack-cell1-f9wq6" Dec 09 12:07:38 crc kubenswrapper[5002]: I1209 12:07:38.062870 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/01e50bcf-190a-49f2-bde4-84a59dc60d2f-ssh-key\") pod \"run-os-openstack-openstack-cell1-f9wq6\" (UID: \"01e50bcf-190a-49f2-bde4-84a59dc60d2f\") " pod="openstack/run-os-openstack-openstack-cell1-f9wq6" Dec 09 12:07:38 crc kubenswrapper[5002]: I1209 12:07:38.063699 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/01e50bcf-190a-49f2-bde4-84a59dc60d2f-ceph\") pod \"run-os-openstack-openstack-cell1-f9wq6\" (UID: \"01e50bcf-190a-49f2-bde4-84a59dc60d2f\") " pod="openstack/run-os-openstack-openstack-cell1-f9wq6" Dec 09 12:07:38 crc kubenswrapper[5002]: I1209 12:07:38.069355 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/01e50bcf-190a-49f2-bde4-84a59dc60d2f-inventory\") pod \"run-os-openstack-openstack-cell1-f9wq6\" (UID: \"01e50bcf-190a-49f2-bde4-84a59dc60d2f\") " pod="openstack/run-os-openstack-openstack-cell1-f9wq6" Dec 09 12:07:38 crc kubenswrapper[5002]: I1209 12:07:38.096399 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5rsr\" (UniqueName: \"kubernetes.io/projected/01e50bcf-190a-49f2-bde4-84a59dc60d2f-kube-api-access-q5rsr\") pod \"run-os-openstack-openstack-cell1-f9wq6\" (UID: \"01e50bcf-190a-49f2-bde4-84a59dc60d2f\") " pod="openstack/run-os-openstack-openstack-cell1-f9wq6" Dec 09 12:07:38 crc kubenswrapper[5002]: I1209 12:07:38.186827 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-f9wq6" Dec 09 12:07:38 crc kubenswrapper[5002]: I1209 12:07:38.848198 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-f9wq6"] Dec 09 12:07:39 crc kubenswrapper[5002]: I1209 12:07:39.791133 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-f9wq6" event={"ID":"01e50bcf-190a-49f2-bde4-84a59dc60d2f","Type":"ContainerStarted","Data":"e396565d70e6b2ddf22fff736d04bd25530a5b4e8ce5d84bd3d573945c6bccb9"} Dec 09 12:07:40 crc kubenswrapper[5002]: I1209 12:07:40.802179 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-f9wq6" event={"ID":"01e50bcf-190a-49f2-bde4-84a59dc60d2f","Type":"ContainerStarted","Data":"6a9bd2a715f6ea0c798877e21a846303336dee830ff4c47861dd037234fc278d"} Dec 09 12:07:40 crc kubenswrapper[5002]: I1209 12:07:40.821743 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-openstack-openstack-cell1-f9wq6" podStartSLOduration=3.115670184 podStartE2EDuration="3.821717919s" podCreationTimestamp="2025-12-09 12:07:37 +0000 UTC" firstStartedPulling="2025-12-09 12:07:38.853458399 +0000 UTC m=+7591.245509480" lastFinishedPulling="2025-12-09 12:07:39.559506134 +0000 UTC m=+7591.951557215" observedRunningTime="2025-12-09 12:07:40.816934171 +0000 UTC m=+7593.208985252" watchObservedRunningTime="2025-12-09 12:07:40.821717919 +0000 UTC m=+7593.213769000" Dec 09 12:07:45 crc kubenswrapper[5002]: I1209 12:07:45.060334 5002 scope.go:117] "RemoveContainer" containerID="e2529a7a9d52b066d9a89cea25f96d0ac287ad63943546043a02bb681ab4f3e6" Dec 09 12:07:45 crc kubenswrapper[5002]: E1209 12:07:45.061180 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:07:49 crc kubenswrapper[5002]: I1209 12:07:49.936579 5002 generic.go:334] "Generic (PLEG): container finished" podID="01e50bcf-190a-49f2-bde4-84a59dc60d2f" containerID="6a9bd2a715f6ea0c798877e21a846303336dee830ff4c47861dd037234fc278d" exitCode=0 Dec 09 12:07:49 crc kubenswrapper[5002]: I1209 12:07:49.936649 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-f9wq6" event={"ID":"01e50bcf-190a-49f2-bde4-84a59dc60d2f","Type":"ContainerDied","Data":"6a9bd2a715f6ea0c798877e21a846303336dee830ff4c47861dd037234fc278d"} Dec 09 12:07:51 crc kubenswrapper[5002]: I1209 12:07:51.447275 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-f9wq6" Dec 09 12:07:51 crc kubenswrapper[5002]: I1209 12:07:51.565295 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/01e50bcf-190a-49f2-bde4-84a59dc60d2f-inventory\") pod \"01e50bcf-190a-49f2-bde4-84a59dc60d2f\" (UID: \"01e50bcf-190a-49f2-bde4-84a59dc60d2f\") " Dec 09 12:07:51 crc kubenswrapper[5002]: I1209 12:07:51.565375 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/01e50bcf-190a-49f2-bde4-84a59dc60d2f-ssh-key\") pod \"01e50bcf-190a-49f2-bde4-84a59dc60d2f\" (UID: \"01e50bcf-190a-49f2-bde4-84a59dc60d2f\") " Dec 09 12:07:51 crc kubenswrapper[5002]: I1209 12:07:51.565513 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5rsr\" (UniqueName: \"kubernetes.io/projected/01e50bcf-190a-49f2-bde4-84a59dc60d2f-kube-api-access-q5rsr\") pod \"01e50bcf-190a-49f2-bde4-84a59dc60d2f\" (UID: \"01e50bcf-190a-49f2-bde4-84a59dc60d2f\") " Dec 09 12:07:51 crc kubenswrapper[5002]: I1209 12:07:51.565781 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/01e50bcf-190a-49f2-bde4-84a59dc60d2f-ceph\") pod \"01e50bcf-190a-49f2-bde4-84a59dc60d2f\" (UID: \"01e50bcf-190a-49f2-bde4-84a59dc60d2f\") " Dec 09 12:07:51 crc kubenswrapper[5002]: I1209 12:07:51.570704 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01e50bcf-190a-49f2-bde4-84a59dc60d2f-ceph" (OuterVolumeSpecName: "ceph") pod "01e50bcf-190a-49f2-bde4-84a59dc60d2f" (UID: "01e50bcf-190a-49f2-bde4-84a59dc60d2f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:07:51 crc kubenswrapper[5002]: I1209 12:07:51.571465 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01e50bcf-190a-49f2-bde4-84a59dc60d2f-kube-api-access-q5rsr" (OuterVolumeSpecName: "kube-api-access-q5rsr") pod "01e50bcf-190a-49f2-bde4-84a59dc60d2f" (UID: "01e50bcf-190a-49f2-bde4-84a59dc60d2f"). InnerVolumeSpecName "kube-api-access-q5rsr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:07:51 crc kubenswrapper[5002]: I1209 12:07:51.599489 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01e50bcf-190a-49f2-bde4-84a59dc60d2f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "01e50bcf-190a-49f2-bde4-84a59dc60d2f" (UID: "01e50bcf-190a-49f2-bde4-84a59dc60d2f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:07:51 crc kubenswrapper[5002]: I1209 12:07:51.606054 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01e50bcf-190a-49f2-bde4-84a59dc60d2f-inventory" (OuterVolumeSpecName: "inventory") pod "01e50bcf-190a-49f2-bde4-84a59dc60d2f" (UID: "01e50bcf-190a-49f2-bde4-84a59dc60d2f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:07:51 crc kubenswrapper[5002]: I1209 12:07:51.669104 5002 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/01e50bcf-190a-49f2-bde4-84a59dc60d2f-ceph\") on node \"crc\" DevicePath \"\"" Dec 09 12:07:51 crc kubenswrapper[5002]: I1209 12:07:51.669163 5002 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/01e50bcf-190a-49f2-bde4-84a59dc60d2f-inventory\") on node \"crc\" DevicePath \"\"" Dec 09 12:07:51 crc kubenswrapper[5002]: I1209 12:07:51.669183 5002 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/01e50bcf-190a-49f2-bde4-84a59dc60d2f-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 09 12:07:51 crc kubenswrapper[5002]: I1209 12:07:51.669205 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5rsr\" (UniqueName: \"kubernetes.io/projected/01e50bcf-190a-49f2-bde4-84a59dc60d2f-kube-api-access-q5rsr\") on node \"crc\" DevicePath \"\"" Dec 09 12:07:51 crc kubenswrapper[5002]: I1209 12:07:51.966952 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-f9wq6" event={"ID":"01e50bcf-190a-49f2-bde4-84a59dc60d2f","Type":"ContainerDied","Data":"e396565d70e6b2ddf22fff736d04bd25530a5b4e8ce5d84bd3d573945c6bccb9"} Dec 09 12:07:51 crc kubenswrapper[5002]: I1209 12:07:51.966988 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e396565d70e6b2ddf22fff736d04bd25530a5b4e8ce5d84bd3d573945c6bccb9" Dec 09 12:07:51 crc kubenswrapper[5002]: I1209 12:07:51.967045 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-f9wq6" Dec 09 12:07:52 crc kubenswrapper[5002]: I1209 12:07:52.073944 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-5wsxk"] Dec 09 12:07:52 crc kubenswrapper[5002]: E1209 12:07:52.074369 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01e50bcf-190a-49f2-bde4-84a59dc60d2f" containerName="run-os-openstack-openstack-cell1" Dec 09 12:07:52 crc kubenswrapper[5002]: I1209 12:07:52.074390 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="01e50bcf-190a-49f2-bde4-84a59dc60d2f" containerName="run-os-openstack-openstack-cell1" Dec 09 12:07:52 crc kubenswrapper[5002]: I1209 12:07:52.074779 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="01e50bcf-190a-49f2-bde4-84a59dc60d2f" containerName="run-os-openstack-openstack-cell1" Dec 09 12:07:52 crc kubenswrapper[5002]: I1209 12:07:52.075883 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-5wsxk"] Dec 09 12:07:52 crc kubenswrapper[5002]: I1209 12:07:52.076025 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-5wsxk" Dec 09 12:07:52 crc kubenswrapper[5002]: I1209 12:07:52.082176 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 09 12:07:52 crc kubenswrapper[5002]: I1209 12:07:52.082302 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 09 12:07:52 crc kubenswrapper[5002]: I1209 12:07:52.082326 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ngftr" Dec 09 12:07:52 crc kubenswrapper[5002]: I1209 12:07:52.088391 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 09 12:07:52 crc kubenswrapper[5002]: I1209 12:07:52.189853 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/06e24666-e7a6-4d17-b4fa-237ecc575eff-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-5wsxk\" (UID: \"06e24666-e7a6-4d17-b4fa-237ecc575eff\") " pod="openstack/reboot-os-openstack-openstack-cell1-5wsxk" Dec 09 12:07:52 crc kubenswrapper[5002]: I1209 12:07:52.190229 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlf4g\" (UniqueName: \"kubernetes.io/projected/06e24666-e7a6-4d17-b4fa-237ecc575eff-kube-api-access-xlf4g\") pod \"reboot-os-openstack-openstack-cell1-5wsxk\" (UID: \"06e24666-e7a6-4d17-b4fa-237ecc575eff\") " pod="openstack/reboot-os-openstack-openstack-cell1-5wsxk" Dec 09 12:07:52 crc kubenswrapper[5002]: I1209 12:07:52.190294 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/06e24666-e7a6-4d17-b4fa-237ecc575eff-ceph\") pod \"reboot-os-openstack-openstack-cell1-5wsxk\" (UID: \"06e24666-e7a6-4d17-b4fa-237ecc575eff\") " pod="openstack/reboot-os-openstack-openstack-cell1-5wsxk" Dec 09 12:07:52 crc kubenswrapper[5002]: I1209 12:07:52.190464 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/06e24666-e7a6-4d17-b4fa-237ecc575eff-inventory\") pod \"reboot-os-openstack-openstack-cell1-5wsxk\" (UID: \"06e24666-e7a6-4d17-b4fa-237ecc575eff\") " pod="openstack/reboot-os-openstack-openstack-cell1-5wsxk" Dec 09 12:07:52 crc kubenswrapper[5002]: I1209 12:07:52.292454 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/06e24666-e7a6-4d17-b4fa-237ecc575eff-ceph\") pod \"reboot-os-openstack-openstack-cell1-5wsxk\" (UID: \"06e24666-e7a6-4d17-b4fa-237ecc575eff\") " pod="openstack/reboot-os-openstack-openstack-cell1-5wsxk" Dec 09 12:07:52 crc kubenswrapper[5002]: I1209 12:07:52.292605 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/06e24666-e7a6-4d17-b4fa-237ecc575eff-inventory\") pod \"reboot-os-openstack-openstack-cell1-5wsxk\" (UID: \"06e24666-e7a6-4d17-b4fa-237ecc575eff\") " pod="openstack/reboot-os-openstack-openstack-cell1-5wsxk" Dec 09 12:07:52 crc kubenswrapper[5002]: I1209 12:07:52.292772 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/06e24666-e7a6-4d17-b4fa-237ecc575eff-ssh-key\") pod 
\"reboot-os-openstack-openstack-cell1-5wsxk\" (UID: \"06e24666-e7a6-4d17-b4fa-237ecc575eff\") " pod="openstack/reboot-os-openstack-openstack-cell1-5wsxk" Dec 09 12:07:52 crc kubenswrapper[5002]: I1209 12:07:52.292831 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlf4g\" (UniqueName: \"kubernetes.io/projected/06e24666-e7a6-4d17-b4fa-237ecc575eff-kube-api-access-xlf4g\") pod \"reboot-os-openstack-openstack-cell1-5wsxk\" (UID: \"06e24666-e7a6-4d17-b4fa-237ecc575eff\") " pod="openstack/reboot-os-openstack-openstack-cell1-5wsxk" Dec 09 12:07:52 crc kubenswrapper[5002]: I1209 12:07:52.298083 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/06e24666-e7a6-4d17-b4fa-237ecc575eff-ceph\") pod \"reboot-os-openstack-openstack-cell1-5wsxk\" (UID: \"06e24666-e7a6-4d17-b4fa-237ecc575eff\") " pod="openstack/reboot-os-openstack-openstack-cell1-5wsxk" Dec 09 12:07:52 crc kubenswrapper[5002]: I1209 12:07:52.298377 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/06e24666-e7a6-4d17-b4fa-237ecc575eff-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-5wsxk\" (UID: \"06e24666-e7a6-4d17-b4fa-237ecc575eff\") " pod="openstack/reboot-os-openstack-openstack-cell1-5wsxk" Dec 09 12:07:52 crc kubenswrapper[5002]: I1209 12:07:52.300511 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/06e24666-e7a6-4d17-b4fa-237ecc575eff-inventory\") pod \"reboot-os-openstack-openstack-cell1-5wsxk\" (UID: \"06e24666-e7a6-4d17-b4fa-237ecc575eff\") " pod="openstack/reboot-os-openstack-openstack-cell1-5wsxk" Dec 09 12:07:52 crc kubenswrapper[5002]: I1209 12:07:52.315017 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlf4g\" (UniqueName: \"kubernetes.io/projected/06e24666-e7a6-4d17-b4fa-237ecc575eff-kube-api-access-xlf4g\") pod \"reboot-os-openstack-openstack-cell1-5wsxk\" (UID: \"06e24666-e7a6-4d17-b4fa-237ecc575eff\") " pod="openstack/reboot-os-openstack-openstack-cell1-5wsxk" Dec 09 12:07:52 crc kubenswrapper[5002]: I1209 12:07:52.407025 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-5wsxk" Dec 09 12:07:53 crc kubenswrapper[5002]: I1209 12:07:53.130113 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-5wsxk"] Dec 09 12:07:53 crc kubenswrapper[5002]: W1209 12:07:53.138235 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06e24666_e7a6_4d17_b4fa_237ecc575eff.slice/crio-4f382dff4a5b3ab5230e4c178f79d5467d388117ffdf86f770cadd4dda1ebc0f WatchSource:0}: Error finding container 4f382dff4a5b3ab5230e4c178f79d5467d388117ffdf86f770cadd4dda1ebc0f: Status 404 returned error can't find the container with id 4f382dff4a5b3ab5230e4c178f79d5467d388117ffdf86f770cadd4dda1ebc0f Dec 09 12:07:53 crc kubenswrapper[5002]: I1209 12:07:53.988585 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-5wsxk" event={"ID":"06e24666-e7a6-4d17-b4fa-237ecc575eff","Type":"ContainerStarted","Data":"6a3f96eadbbf9fdc3674d971dd8d464d4628cc7b383e766d3ebd45d3615c3840"} Dec 09 12:07:53 crc kubenswrapper[5002]: I1209 12:07:53.989039 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-5wsxk" event={"ID":"06e24666-e7a6-4d17-b4fa-237ecc575eff","Type":"ContainerStarted","Data":"4f382dff4a5b3ab5230e4c178f79d5467d388117ffdf86f770cadd4dda1ebc0f"} Dec 09 12:07:54 crc kubenswrapper[5002]: I1209 12:07:54.017216 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-openstack-openstack-cell1-5wsxk" podStartSLOduration=1.482721646 podStartE2EDuration="2.017197782s" podCreationTimestamp="2025-12-09 12:07:52 +0000 UTC" firstStartedPulling="2025-12-09 12:07:53.140406239 +0000 UTC m=+7605.532457320" lastFinishedPulling="2025-12-09 12:07:53.674882365 +0000 UTC m=+7606.066933456" observedRunningTime="2025-12-09 12:07:54.013367649 +0000 UTC m=+7606.405418740" watchObservedRunningTime="2025-12-09 12:07:54.017197782 +0000 UTC m=+7606.409248863" Dec 09 12:07:59 crc kubenswrapper[5002]: I1209 12:07:59.060274 5002 scope.go:117] "RemoveContainer" containerID="e2529a7a9d52b066d9a89cea25f96d0ac287ad63943546043a02bb681ab4f3e6" Dec 09 12:07:59 crc kubenswrapper[5002]: E1209 12:07:59.061087 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:08:10 crc kubenswrapper[5002]: I1209 12:08:10.061462 5002 scope.go:117] "RemoveContainer" containerID="e2529a7a9d52b066d9a89cea25f96d0ac287ad63943546043a02bb681ab4f3e6" Dec 09 12:08:10 crc kubenswrapper[5002]: E1209 12:08:10.062330 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:08:10 crc kubenswrapper[5002]: I1209 12:08:10.158649 5002 generic.go:334] "Generic (PLEG): container 
finished" podID="06e24666-e7a6-4d17-b4fa-237ecc575eff" containerID="6a3f96eadbbf9fdc3674d971dd8d464d4628cc7b383e766d3ebd45d3615c3840" exitCode=0 Dec 09 12:08:10 crc kubenswrapper[5002]: I1209 12:08:10.158704 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-5wsxk" event={"ID":"06e24666-e7a6-4d17-b4fa-237ecc575eff","Type":"ContainerDied","Data":"6a3f96eadbbf9fdc3674d971dd8d464d4628cc7b383e766d3ebd45d3615c3840"} Dec 09 12:08:11 crc kubenswrapper[5002]: I1209 12:08:11.722093 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-5wsxk" Dec 09 12:08:11 crc kubenswrapper[5002]: I1209 12:08:11.869424 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/06e24666-e7a6-4d17-b4fa-237ecc575eff-ssh-key\") pod \"06e24666-e7a6-4d17-b4fa-237ecc575eff\" (UID: \"06e24666-e7a6-4d17-b4fa-237ecc575eff\") " Dec 09 12:08:11 crc kubenswrapper[5002]: I1209 12:08:11.869472 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/06e24666-e7a6-4d17-b4fa-237ecc575eff-inventory\") pod \"06e24666-e7a6-4d17-b4fa-237ecc575eff\" (UID: \"06e24666-e7a6-4d17-b4fa-237ecc575eff\") " Dec 09 12:08:11 crc kubenswrapper[5002]: I1209 12:08:11.869600 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/06e24666-e7a6-4d17-b4fa-237ecc575eff-ceph\") pod \"06e24666-e7a6-4d17-b4fa-237ecc575eff\" (UID: \"06e24666-e7a6-4d17-b4fa-237ecc575eff\") " Dec 09 12:08:11 crc kubenswrapper[5002]: I1209 12:08:11.869647 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlf4g\" (UniqueName: \"kubernetes.io/projected/06e24666-e7a6-4d17-b4fa-237ecc575eff-kube-api-access-xlf4g\") pod \"06e24666-e7a6-4d17-b4fa-237ecc575eff\" (UID: \"06e24666-e7a6-4d17-b4fa-237ecc575eff\") " Dec 09 12:08:11 crc kubenswrapper[5002]: I1209 12:08:11.879140 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06e24666-e7a6-4d17-b4fa-237ecc575eff-ceph" (OuterVolumeSpecName: "ceph") pod "06e24666-e7a6-4d17-b4fa-237ecc575eff" (UID: "06e24666-e7a6-4d17-b4fa-237ecc575eff"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:08:11 crc kubenswrapper[5002]: I1209 12:08:11.879189 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06e24666-e7a6-4d17-b4fa-237ecc575eff-kube-api-access-xlf4g" (OuterVolumeSpecName: "kube-api-access-xlf4g") pod "06e24666-e7a6-4d17-b4fa-237ecc575eff" (UID: "06e24666-e7a6-4d17-b4fa-237ecc575eff"). InnerVolumeSpecName "kube-api-access-xlf4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:08:11 crc kubenswrapper[5002]: I1209 12:08:11.901314 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06e24666-e7a6-4d17-b4fa-237ecc575eff-inventory" (OuterVolumeSpecName: "inventory") pod "06e24666-e7a6-4d17-b4fa-237ecc575eff" (UID: "06e24666-e7a6-4d17-b4fa-237ecc575eff"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:08:11 crc kubenswrapper[5002]: I1209 12:08:11.903873 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06e24666-e7a6-4d17-b4fa-237ecc575eff-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "06e24666-e7a6-4d17-b4fa-237ecc575eff" (UID: "06e24666-e7a6-4d17-b4fa-237ecc575eff"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:08:11 crc kubenswrapper[5002]: I1209 12:08:11.972512 5002 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/06e24666-e7a6-4d17-b4fa-237ecc575eff-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 09 12:08:11 crc kubenswrapper[5002]: I1209 12:08:11.972553 5002 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/06e24666-e7a6-4d17-b4fa-237ecc575eff-inventory\") on node \"crc\" DevicePath \"\"" Dec 09 12:08:11 crc kubenswrapper[5002]: I1209 12:08:11.972574 5002 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/06e24666-e7a6-4d17-b4fa-237ecc575eff-ceph\") on node \"crc\" DevicePath \"\"" Dec 09 12:08:11 crc kubenswrapper[5002]: I1209 12:08:11.972590 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlf4g\" (UniqueName: \"kubernetes.io/projected/06e24666-e7a6-4d17-b4fa-237ecc575eff-kube-api-access-xlf4g\") on node \"crc\" DevicePath \"\"" Dec 09 12:08:12 crc kubenswrapper[5002]: I1209 12:08:12.180071 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-5wsxk" event={"ID":"06e24666-e7a6-4d17-b4fa-237ecc575eff","Type":"ContainerDied","Data":"4f382dff4a5b3ab5230e4c178f79d5467d388117ffdf86f770cadd4dda1ebc0f"} Dec 09 12:08:12 crc kubenswrapper[5002]: I1209 12:08:12.180112 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f382dff4a5b3ab5230e4c178f79d5467d388117ffdf86f770cadd4dda1ebc0f" Dec 09 12:08:12 crc kubenswrapper[5002]: I1209 12:08:12.180148 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-5wsxk" Dec 09 12:08:12 crc kubenswrapper[5002]: I1209 12:08:12.277410 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-j5mgh"] Dec 09 12:08:12 crc kubenswrapper[5002]: E1209 12:08:12.278253 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06e24666-e7a6-4d17-b4fa-237ecc575eff" containerName="reboot-os-openstack-openstack-cell1" Dec 09 12:08:12 crc kubenswrapper[5002]: I1209 12:08:12.278273 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="06e24666-e7a6-4d17-b4fa-237ecc575eff" containerName="reboot-os-openstack-openstack-cell1" Dec 09 12:08:12 crc kubenswrapper[5002]: I1209 12:08:12.278526 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="06e24666-e7a6-4d17-b4fa-237ecc575eff" containerName="reboot-os-openstack-openstack-cell1" Dec 09 12:08:12 crc kubenswrapper[5002]: I1209 12:08:12.279435 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-j5mgh" Dec 09 12:08:12 crc kubenswrapper[5002]: I1209 12:08:12.282364 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 09 12:08:12 crc kubenswrapper[5002]: I1209 12:08:12.282561 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ngftr" Dec 09 12:08:12 crc kubenswrapper[5002]: I1209 12:08:12.282656 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 09 12:08:12 crc kubenswrapper[5002]: I1209 12:08:12.282739 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 09 12:08:12 crc kubenswrapper[5002]: I1209 12:08:12.290231 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-j5mgh"] Dec 09 12:08:12 crc kubenswrapper[5002]: I1209 12:08:12.380155 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-inventory\") pod \"install-certs-openstack-openstack-cell1-j5mgh\" (UID: \"16490275-ede7-4620-aa87-69e7c31b3dec\") " pod="openstack/install-certs-openstack-openstack-cell1-j5mgh" Dec 09 12:08:12 crc kubenswrapper[5002]: I1209 12:08:12.380217 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-j5mgh\" (UID: \"16490275-ede7-4620-aa87-69e7c31b3dec\") " pod="openstack/install-certs-openstack-openstack-cell1-j5mgh" Dec 09 12:08:12 crc kubenswrapper[5002]: I1209 12:08:12.380239 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-j5mgh\" (UID: \"16490275-ede7-4620-aa87-69e7c31b3dec\") " pod="openstack/install-certs-openstack-openstack-cell1-j5mgh" Dec 09 12:08:12 crc kubenswrapper[5002]: I1209 12:08:12.380291 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-ceph\") pod \"install-certs-openstack-openstack-cell1-j5mgh\" (UID: \"16490275-ede7-4620-aa87-69e7c31b3dec\") " pod="openstack/install-certs-openstack-openstack-cell1-j5mgh" Dec 09 12:08:12 crc kubenswrapper[5002]: I1209 12:08:12.380349 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-j5mgh\" (UID: \"16490275-ede7-4620-aa87-69e7c31b3dec\") " pod="openstack/install-certs-openstack-openstack-cell1-j5mgh" Dec 09 12:08:12 crc kubenswrapper[5002]: I1209 12:08:12.380368 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-neutron-dhcp-combined-ca-bundle\") pod 
\"install-certs-openstack-openstack-cell1-j5mgh\" (UID: \"16490275-ede7-4620-aa87-69e7c31b3dec\") " pod="openstack/install-certs-openstack-openstack-cell1-j5mgh" Dec 09 12:08:12 crc kubenswrapper[5002]: I1209 12:08:12.380543 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-j5mgh\" (UID: \"16490275-ede7-4620-aa87-69e7c31b3dec\") " pod="openstack/install-certs-openstack-openstack-cell1-j5mgh" Dec 09 12:08:12 crc kubenswrapper[5002]: I1209 12:08:12.380622 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-j5mgh\" (UID: \"16490275-ede7-4620-aa87-69e7c31b3dec\") " pod="openstack/install-certs-openstack-openstack-cell1-j5mgh" Dec 09 12:08:12 crc kubenswrapper[5002]: I1209 12:08:12.380721 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b4zz\" (UniqueName: \"kubernetes.io/projected/16490275-ede7-4620-aa87-69e7c31b3dec-kube-api-access-9b4zz\") pod \"install-certs-openstack-openstack-cell1-j5mgh\" (UID: \"16490275-ede7-4620-aa87-69e7c31b3dec\") " pod="openstack/install-certs-openstack-openstack-cell1-j5mgh" Dec 09 12:08:12 crc kubenswrapper[5002]: I1209 12:08:12.380857 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-ssh-key\") pod \"install-certs-openstack-openstack-cell1-j5mgh\" (UID: \"16490275-ede7-4620-aa87-69e7c31b3dec\") " pod="openstack/install-certs-openstack-openstack-cell1-j5mgh" Dec 09 12:08:12 crc kubenswrapper[5002]: I1209 12:08:12.380906 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-j5mgh\" (UID: \"16490275-ede7-4620-aa87-69e7c31b3dec\") " pod="openstack/install-certs-openstack-openstack-cell1-j5mgh" Dec 09 12:08:12 crc kubenswrapper[5002]: I1209 12:08:12.380951 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-j5mgh\" (UID: \"16490275-ede7-4620-aa87-69e7c31b3dec\") " pod="openstack/install-certs-openstack-openstack-cell1-j5mgh" Dec 09 12:08:12 crc kubenswrapper[5002]: I1209 12:08:12.482691 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-inventory\") pod \"install-certs-openstack-openstack-cell1-j5mgh\" (UID: \"16490275-ede7-4620-aa87-69e7c31b3dec\") " pod="openstack/install-certs-openstack-openstack-cell1-j5mgh" Dec 09 12:08:12 crc kubenswrapper[5002]: I1209 12:08:12.482750 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-j5mgh\" (UID: \"16490275-ede7-4620-aa87-69e7c31b3dec\") " pod="openstack/install-certs-openstack-openstack-cell1-j5mgh" Dec 09 12:08:12 crc kubenswrapper[5002]: I1209 12:08:12.482775 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-j5mgh\" (UID: \"16490275-ede7-4620-aa87-69e7c31b3dec\") " pod="openstack/install-certs-openstack-openstack-cell1-j5mgh" Dec 09 12:08:12 crc kubenswrapper[5002]: I1209 12:08:12.482827 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-ceph\") pod \"install-certs-openstack-openstack-cell1-j5mgh\" (UID: \"16490275-ede7-4620-aa87-69e7c31b3dec\") " pod="openstack/install-certs-openstack-openstack-cell1-j5mgh" Dec 09 12:08:12 crc kubenswrapper[5002]: I1209 12:08:12.482888 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-j5mgh\" (UID: \"16490275-ede7-4620-aa87-69e7c31b3dec\") " pod="openstack/install-certs-openstack-openstack-cell1-j5mgh" Dec 09 12:08:12 crc kubenswrapper[5002]: I1209 12:08:12.482913 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-j5mgh\" (UID: \"16490275-ede7-4620-aa87-69e7c31b3dec\") " pod="openstack/install-certs-openstack-openstack-cell1-j5mgh" Dec 09 12:08:12 crc kubenswrapper[5002]: I1209 12:08:12.482951 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-j5mgh\" (UID: \"16490275-ede7-4620-aa87-69e7c31b3dec\") " pod="openstack/install-certs-openstack-openstack-cell1-j5mgh" Dec 09 12:08:12 crc kubenswrapper[5002]: I1209 12:08:12.482968 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-j5mgh\" (UID: \"16490275-ede7-4620-aa87-69e7c31b3dec\") " pod="openstack/install-certs-openstack-openstack-cell1-j5mgh" Dec 09 12:08:12 crc kubenswrapper[5002]: I1209 12:08:12.483746 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9b4zz\" (UniqueName: \"kubernetes.io/projected/16490275-ede7-4620-aa87-69e7c31b3dec-kube-api-access-9b4zz\") pod \"install-certs-openstack-openstack-cell1-j5mgh\" (UID: \"16490275-ede7-4620-aa87-69e7c31b3dec\") " pod="openstack/install-certs-openstack-openstack-cell1-j5mgh" Dec 09 12:08:12 crc kubenswrapper[5002]: I1209 12:08:12.483791 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-ssh-key\") pod \"install-certs-openstack-openstack-cell1-j5mgh\" (UID: \"16490275-ede7-4620-aa87-69e7c31b3dec\") " pod="openstack/install-certs-openstack-openstack-cell1-j5mgh" Dec 09 12:08:12 crc kubenswrapper[5002]: I1209 12:08:12.483862 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-j5mgh\" (UID: \"16490275-ede7-4620-aa87-69e7c31b3dec\") " pod="openstack/install-certs-openstack-openstack-cell1-j5mgh" Dec 09 12:08:12 crc kubenswrapper[5002]: I1209 12:08:12.483897 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-j5mgh\" (UID: \"16490275-ede7-4620-aa87-69e7c31b3dec\") " pod="openstack/install-certs-openstack-openstack-cell1-j5mgh" Dec 09 12:08:12 crc kubenswrapper[5002]: I1209 12:08:12.488940 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-ssh-key\") pod \"install-certs-openstack-openstack-cell1-j5mgh\" (UID: \"16490275-ede7-4620-aa87-69e7c31b3dec\") " pod="openstack/install-certs-openstack-openstack-cell1-j5mgh" Dec 09 12:08:12 crc kubenswrapper[5002]: I1209 12:08:12.489289 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-j5mgh\" (UID: \"16490275-ede7-4620-aa87-69e7c31b3dec\") " pod="openstack/install-certs-openstack-openstack-cell1-j5mgh" Dec 09 12:08:12 crc kubenswrapper[5002]: I1209 12:08:12.489599 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-j5mgh\" (UID: \"16490275-ede7-4620-aa87-69e7c31b3dec\") " pod="openstack/install-certs-openstack-openstack-cell1-j5mgh" Dec 09 12:08:12 crc kubenswrapper[5002]: I1209 12:08:12.489856 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-j5mgh\" (UID: \"16490275-ede7-4620-aa87-69e7c31b3dec\") " pod="openstack/install-certs-openstack-openstack-cell1-j5mgh" Dec 09 12:08:12 crc kubenswrapper[5002]: I1209 12:08:12.490052 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-j5mgh\" (UID: \"16490275-ede7-4620-aa87-69e7c31b3dec\") " pod="openstack/install-certs-openstack-openstack-cell1-j5mgh" Dec 09 12:08:12 crc kubenswrapper[5002]: I1209 12:08:12.490150 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-j5mgh\" (UID: \"16490275-ede7-4620-aa87-69e7c31b3dec\") " pod="openstack/install-certs-openstack-openstack-cell1-j5mgh" Dec 09 12:08:12 crc kubenswrapper[5002]: I1209 12:08:12.490242 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-ceph\") pod \"install-certs-openstack-openstack-cell1-j5mgh\" (UID: \"16490275-ede7-4620-aa87-69e7c31b3dec\") " pod="openstack/install-certs-openstack-openstack-cell1-j5mgh" Dec 09 12:08:12 crc kubenswrapper[5002]: I1209 12:08:12.491470 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-inventory\") pod \"install-certs-openstack-openstack-cell1-j5mgh\" (UID: \"16490275-ede7-4620-aa87-69e7c31b3dec\") " pod="openstack/install-certs-openstack-openstack-cell1-j5mgh" Dec 09 12:08:12 crc kubenswrapper[5002]: I1209 12:08:12.491598 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-j5mgh\" (UID: \"16490275-ede7-4620-aa87-69e7c31b3dec\") " pod="openstack/install-certs-openstack-openstack-cell1-j5mgh" Dec 09 12:08:12 crc kubenswrapper[5002]: I1209 12:08:12.494636 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-j5mgh\" (UID: \"16490275-ede7-4620-aa87-69e7c31b3dec\") " pod="openstack/install-certs-openstack-openstack-cell1-j5mgh" Dec 09 12:08:12 crc kubenswrapper[5002]: I1209 12:08:12.498157 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-j5mgh\" (UID: \"16490275-ede7-4620-aa87-69e7c31b3dec\") " pod="openstack/install-certs-openstack-openstack-cell1-j5mgh" Dec 09 12:08:12 crc kubenswrapper[5002]: I1209 12:08:12.501217 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9b4zz\" (UniqueName: \"kubernetes.io/projected/16490275-ede7-4620-aa87-69e7c31b3dec-kube-api-access-9b4zz\") pod \"install-certs-openstack-openstack-cell1-j5mgh\" (UID: \"16490275-ede7-4620-aa87-69e7c31b3dec\") " pod="openstack/install-certs-openstack-openstack-cell1-j5mgh" Dec 09 12:08:12 crc kubenswrapper[5002]: I1209 12:08:12.604454 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-j5mgh" Dec 09 12:08:13 crc kubenswrapper[5002]: I1209 12:08:13.185862 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-j5mgh"] Dec 09 12:08:13 crc kubenswrapper[5002]: W1209 12:08:13.195574 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16490275_ede7_4620_aa87_69e7c31b3dec.slice/crio-1a925d01dc39337ce3ba8b1e72c0d2994886f3d100239888f2a7d418c55e1ca3 WatchSource:0}: Error finding container 1a925d01dc39337ce3ba8b1e72c0d2994886f3d100239888f2a7d418c55e1ca3: Status 404 returned error can't find the container with id 1a925d01dc39337ce3ba8b1e72c0d2994886f3d100239888f2a7d418c55e1ca3 Dec 09 12:08:14 crc kubenswrapper[5002]: I1209 12:08:14.204497 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-j5mgh" event={"ID":"16490275-ede7-4620-aa87-69e7c31b3dec","Type":"ContainerStarted","Data":"9034ab3fc8118f4c8d6d47195bba2c805f45bff3602ec789aad93123985ac942"} Dec 09 12:08:14 crc kubenswrapper[5002]: I1209 12:08:14.205401 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-j5mgh" event={"ID":"16490275-ede7-4620-aa87-69e7c31b3dec","Type":"ContainerStarted","Data":"1a925d01dc39337ce3ba8b1e72c0d2994886f3d100239888f2a7d418c55e1ca3"} Dec 09 12:08:21 crc kubenswrapper[5002]: I1209 12:08:21.060961 5002 scope.go:117] "RemoveContainer" containerID="e2529a7a9d52b066d9a89cea25f96d0ac287ad63943546043a02bb681ab4f3e6" Dec 09 12:08:21 crc kubenswrapper[5002]: E1209 12:08:21.061907 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:08:34 crc kubenswrapper[5002]: I1209 12:08:34.060621 5002 scope.go:117] "RemoveContainer" containerID="e2529a7a9d52b066d9a89cea25f96d0ac287ad63943546043a02bb681ab4f3e6" Dec 09 12:08:34 crc kubenswrapper[5002]: E1209 12:08:34.061404 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:08:34 crc kubenswrapper[5002]: I1209 12:08:34.429046 5002 generic.go:334] "Generic (PLEG): container finished" podID="16490275-ede7-4620-aa87-69e7c31b3dec" containerID="9034ab3fc8118f4c8d6d47195bba2c805f45bff3602ec789aad93123985ac942" exitCode=0 Dec 09 12:08:34 crc kubenswrapper[5002]: I1209 12:08:34.429092 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-j5mgh" event={"ID":"16490275-ede7-4620-aa87-69e7c31b3dec","Type":"ContainerDied","Data":"9034ab3fc8118f4c8d6d47195bba2c805f45bff3602ec789aad93123985ac942"} Dec 09 12:08:35 crc kubenswrapper[5002]: I1209 12:08:35.980989 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-j5mgh" Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.046129 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-neutron-metadata-combined-ca-bundle\") pod \"16490275-ede7-4620-aa87-69e7c31b3dec\" (UID: \"16490275-ede7-4620-aa87-69e7c31b3dec\") " Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.046258 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-bootstrap-combined-ca-bundle\") pod \"16490275-ede7-4620-aa87-69e7c31b3dec\" (UID: \"16490275-ede7-4620-aa87-69e7c31b3dec\") " Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.046301 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9b4zz\" (UniqueName: \"kubernetes.io/projected/16490275-ede7-4620-aa87-69e7c31b3dec-kube-api-access-9b4zz\") pod \"16490275-ede7-4620-aa87-69e7c31b3dec\" (UID: \"16490275-ede7-4620-aa87-69e7c31b3dec\") " Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.046325 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-neutron-sriov-combined-ca-bundle\") pod \"16490275-ede7-4620-aa87-69e7c31b3dec\" (UID: \"16490275-ede7-4620-aa87-69e7c31b3dec\") " Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.046367 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-neutron-dhcp-combined-ca-bundle\") pod \"16490275-ede7-4620-aa87-69e7c31b3dec\" (UID: \"16490275-ede7-4620-aa87-69e7c31b3dec\") " Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.046400 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-telemetry-combined-ca-bundle\") pod \"16490275-ede7-4620-aa87-69e7c31b3dec\" (UID: \"16490275-ede7-4620-aa87-69e7c31b3dec\") " Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.046465 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-ovn-combined-ca-bundle\") pod \"16490275-ede7-4620-aa87-69e7c31b3dec\" (UID: \"16490275-ede7-4620-aa87-69e7c31b3dec\") " Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.046497 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-inventory\") pod \"16490275-ede7-4620-aa87-69e7c31b3dec\" (UID: \"16490275-ede7-4620-aa87-69e7c31b3dec\") " Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.046524 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-nova-combined-ca-bundle\") pod \"16490275-ede7-4620-aa87-69e7c31b3dec\" (UID: \"16490275-ede7-4620-aa87-69e7c31b3dec\") " Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.046611 5002 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-ssh-key\") pod \"16490275-ede7-4620-aa87-69e7c31b3dec\" (UID: \"16490275-ede7-4620-aa87-69e7c31b3dec\") " Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.046664 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-libvirt-combined-ca-bundle\") pod \"16490275-ede7-4620-aa87-69e7c31b3dec\" (UID: \"16490275-ede7-4620-aa87-69e7c31b3dec\") " Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.046783 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-ceph\") pod \"16490275-ede7-4620-aa87-69e7c31b3dec\" (UID: \"16490275-ede7-4620-aa87-69e7c31b3dec\") " Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.052434 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "16490275-ede7-4620-aa87-69e7c31b3dec" (UID: "16490275-ede7-4620-aa87-69e7c31b3dec"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.053013 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "16490275-ede7-4620-aa87-69e7c31b3dec" (UID: "16490275-ede7-4620-aa87-69e7c31b3dec"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.053372 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16490275-ede7-4620-aa87-69e7c31b3dec-kube-api-access-9b4zz" (OuterVolumeSpecName: "kube-api-access-9b4zz") pod "16490275-ede7-4620-aa87-69e7c31b3dec" (UID: "16490275-ede7-4620-aa87-69e7c31b3dec"). InnerVolumeSpecName "kube-api-access-9b4zz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.053537 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "16490275-ede7-4620-aa87-69e7c31b3dec" (UID: "16490275-ede7-4620-aa87-69e7c31b3dec"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.053628 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "16490275-ede7-4620-aa87-69e7c31b3dec" (UID: "16490275-ede7-4620-aa87-69e7c31b3dec"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.053832 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "16490275-ede7-4620-aa87-69e7c31b3dec" (UID: "16490275-ede7-4620-aa87-69e7c31b3dec"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.053903 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-ceph" (OuterVolumeSpecName: "ceph") pod "16490275-ede7-4620-aa87-69e7c31b3dec" (UID: "16490275-ede7-4620-aa87-69e7c31b3dec"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.053922 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "16490275-ede7-4620-aa87-69e7c31b3dec" (UID: "16490275-ede7-4620-aa87-69e7c31b3dec"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.054128 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "16490275-ede7-4620-aa87-69e7c31b3dec" (UID: "16490275-ede7-4620-aa87-69e7c31b3dec"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.055203 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "16490275-ede7-4620-aa87-69e7c31b3dec" (UID: "16490275-ede7-4620-aa87-69e7c31b3dec"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.082050 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "16490275-ede7-4620-aa87-69e7c31b3dec" (UID: "16490275-ede7-4620-aa87-69e7c31b3dec"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.082984 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-inventory" (OuterVolumeSpecName: "inventory") pod "16490275-ede7-4620-aa87-69e7c31b3dec" (UID: "16490275-ede7-4620-aa87-69e7c31b3dec"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.149967 5002 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-ceph\") on node \"crc\" DevicePath \"\"" Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.149990 5002 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.150001 5002 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.150013 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9b4zz\" (UniqueName: \"kubernetes.io/projected/16490275-ede7-4620-aa87-69e7c31b3dec-kube-api-access-9b4zz\") on node \"crc\" DevicePath \"\"" Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.150022 5002 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.150030 5002 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.150039 5002 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.150048 5002 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.150057 5002 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-inventory\") on node \"crc\" DevicePath \"\"" Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.150065 5002 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.150072 5002 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.150079 5002 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16490275-ede7-4620-aa87-69e7c31b3dec-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.454159 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-certs-openstack-openstack-cell1-j5mgh" event={"ID":"16490275-ede7-4620-aa87-69e7c31b3dec","Type":"ContainerDied","Data":"1a925d01dc39337ce3ba8b1e72c0d2994886f3d100239888f2a7d418c55e1ca3"} Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.454453 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a925d01dc39337ce3ba8b1e72c0d2994886f3d100239888f2a7d418c55e1ca3" Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.454216 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-j5mgh" Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.591608 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-hzzxs"] Dec 09 12:08:36 crc kubenswrapper[5002]: E1209 12:08:36.592233 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16490275-ede7-4620-aa87-69e7c31b3dec" containerName="install-certs-openstack-openstack-cell1" Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.592274 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="16490275-ede7-4620-aa87-69e7c31b3dec" containerName="install-certs-openstack-openstack-cell1" Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.592575 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="16490275-ede7-4620-aa87-69e7c31b3dec" containerName="install-certs-openstack-openstack-cell1" Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.593540 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-hzzxs" Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.596180 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ngftr" Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.596499 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.596595 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.596648 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.611604 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-hzzxs"] Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.661842 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgd2j\" (UniqueName: \"kubernetes.io/projected/a980500d-11f1-45f3-8eeb-9bec9abe3fd2-kube-api-access-qgd2j\") pod \"ceph-client-openstack-openstack-cell1-hzzxs\" (UID: \"a980500d-11f1-45f3-8eeb-9bec9abe3fd2\") " pod="openstack/ceph-client-openstack-openstack-cell1-hzzxs" Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.662314 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a980500d-11f1-45f3-8eeb-9bec9abe3fd2-inventory\") pod \"ceph-client-openstack-openstack-cell1-hzzxs\" (UID: \"a980500d-11f1-45f3-8eeb-9bec9abe3fd2\") " pod="openstack/ceph-client-openstack-openstack-cell1-hzzxs" Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.662441 5002 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a980500d-11f1-45f3-8eeb-9bec9abe3fd2-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-hzzxs\" (UID: \"a980500d-11f1-45f3-8eeb-9bec9abe3fd2\") " pod="openstack/ceph-client-openstack-openstack-cell1-hzzxs" Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.662562 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a980500d-11f1-45f3-8eeb-9bec9abe3fd2-ceph\") pod \"ceph-client-openstack-openstack-cell1-hzzxs\" (UID: \"a980500d-11f1-45f3-8eeb-9bec9abe3fd2\") " pod="openstack/ceph-client-openstack-openstack-cell1-hzzxs" Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.764003 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a980500d-11f1-45f3-8eeb-9bec9abe3fd2-inventory\") pod \"ceph-client-openstack-openstack-cell1-hzzxs\" (UID: \"a980500d-11f1-45f3-8eeb-9bec9abe3fd2\") " pod="openstack/ceph-client-openstack-openstack-cell1-hzzxs" Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.764060 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a980500d-11f1-45f3-8eeb-9bec9abe3fd2-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-hzzxs\" (UID: \"a980500d-11f1-45f3-8eeb-9bec9abe3fd2\") " pod="openstack/ceph-client-openstack-openstack-cell1-hzzxs" Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.764158 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a980500d-11f1-45f3-8eeb-9bec9abe3fd2-ceph\") pod \"ceph-client-openstack-openstack-cell1-hzzxs\" (UID: \"a980500d-11f1-45f3-8eeb-9bec9abe3fd2\") " pod="openstack/ceph-client-openstack-openstack-cell1-hzzxs" Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.764257 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgd2j\" (UniqueName: \"kubernetes.io/projected/a980500d-11f1-45f3-8eeb-9bec9abe3fd2-kube-api-access-qgd2j\") pod \"ceph-client-openstack-openstack-cell1-hzzxs\" (UID: \"a980500d-11f1-45f3-8eeb-9bec9abe3fd2\") " pod="openstack/ceph-client-openstack-openstack-cell1-hzzxs" Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.768774 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a980500d-11f1-45f3-8eeb-9bec9abe3fd2-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-hzzxs\" (UID: \"a980500d-11f1-45f3-8eeb-9bec9abe3fd2\") " pod="openstack/ceph-client-openstack-openstack-cell1-hzzxs" Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.768784 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a980500d-11f1-45f3-8eeb-9bec9abe3fd2-ceph\") pod \"ceph-client-openstack-openstack-cell1-hzzxs\" (UID: \"a980500d-11f1-45f3-8eeb-9bec9abe3fd2\") " pod="openstack/ceph-client-openstack-openstack-cell1-hzzxs" Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.770603 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a980500d-11f1-45f3-8eeb-9bec9abe3fd2-inventory\") pod \"ceph-client-openstack-openstack-cell1-hzzxs\" (UID: \"a980500d-11f1-45f3-8eeb-9bec9abe3fd2\") " 
pod="openstack/ceph-client-openstack-openstack-cell1-hzzxs" Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.787279 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgd2j\" (UniqueName: \"kubernetes.io/projected/a980500d-11f1-45f3-8eeb-9bec9abe3fd2-kube-api-access-qgd2j\") pod \"ceph-client-openstack-openstack-cell1-hzzxs\" (UID: \"a980500d-11f1-45f3-8eeb-9bec9abe3fd2\") " pod="openstack/ceph-client-openstack-openstack-cell1-hzzxs" Dec 09 12:08:36 crc kubenswrapper[5002]: I1209 12:08:36.923087 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-hzzxs" Dec 09 12:08:37 crc kubenswrapper[5002]: I1209 12:08:37.565431 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-hzzxs"] Dec 09 12:08:38 crc kubenswrapper[5002]: I1209 12:08:38.484132 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-hzzxs" event={"ID":"a980500d-11f1-45f3-8eeb-9bec9abe3fd2","Type":"ContainerStarted","Data":"be2de329d50a15c5cb16975e613bfc42fd0f1fda0d18855ff9b703f15a0916c8"} Dec 09 12:08:38 crc kubenswrapper[5002]: I1209 12:08:38.484565 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-hzzxs" event={"ID":"a980500d-11f1-45f3-8eeb-9bec9abe3fd2","Type":"ContainerStarted","Data":"212eae0e297a55d40810e677465b10157559cc28e06ef4ad492388b1849b6d6e"} Dec 09 12:08:38 crc kubenswrapper[5002]: I1209 12:08:38.510064 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-openstack-openstack-cell1-hzzxs" podStartSLOduration=2.067533318 podStartE2EDuration="2.51004172s" podCreationTimestamp="2025-12-09 12:08:36 +0000 UTC" firstStartedPulling="2025-12-09 12:08:37.570510155 +0000 UTC m=+7649.962561226" lastFinishedPulling="2025-12-09 12:08:38.013018547 +0000 UTC m=+7650.405069628" observedRunningTime="2025-12-09 12:08:38.506080364 +0000 UTC m=+7650.898131465" watchObservedRunningTime="2025-12-09 12:08:38.51004172 +0000 UTC m=+7650.902092811" Dec 09 12:08:43 crc kubenswrapper[5002]: I1209 12:08:43.537722 5002 generic.go:334] "Generic (PLEG): container finished" podID="a980500d-11f1-45f3-8eeb-9bec9abe3fd2" containerID="be2de329d50a15c5cb16975e613bfc42fd0f1fda0d18855ff9b703f15a0916c8" exitCode=0 Dec 09 12:08:43 crc kubenswrapper[5002]: I1209 12:08:43.537806 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-hzzxs" event={"ID":"a980500d-11f1-45f3-8eeb-9bec9abe3fd2","Type":"ContainerDied","Data":"be2de329d50a15c5cb16975e613bfc42fd0f1fda0d18855ff9b703f15a0916c8"} Dec 09 12:08:45 crc kubenswrapper[5002]: I1209 12:08:45.112310 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-hzzxs" Dec 09 12:08:45 crc kubenswrapper[5002]: I1209 12:08:45.197336 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a980500d-11f1-45f3-8eeb-9bec9abe3fd2-ssh-key\") pod \"a980500d-11f1-45f3-8eeb-9bec9abe3fd2\" (UID: \"a980500d-11f1-45f3-8eeb-9bec9abe3fd2\") " Dec 09 12:08:45 crc kubenswrapper[5002]: I1209 12:08:45.197419 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a980500d-11f1-45f3-8eeb-9bec9abe3fd2-ceph\") pod \"a980500d-11f1-45f3-8eeb-9bec9abe3fd2\" (UID: \"a980500d-11f1-45f3-8eeb-9bec9abe3fd2\") " Dec 09 12:08:45 crc kubenswrapper[5002]: I1209 12:08:45.197461 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgd2j\" (UniqueName: \"kubernetes.io/projected/a980500d-11f1-45f3-8eeb-9bec9abe3fd2-kube-api-access-qgd2j\") pod \"a980500d-11f1-45f3-8eeb-9bec9abe3fd2\" (UID: \"a980500d-11f1-45f3-8eeb-9bec9abe3fd2\") " Dec 09 12:08:45 crc kubenswrapper[5002]: I1209 12:08:45.197481 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a980500d-11f1-45f3-8eeb-9bec9abe3fd2-inventory\") pod \"a980500d-11f1-45f3-8eeb-9bec9abe3fd2\" (UID: \"a980500d-11f1-45f3-8eeb-9bec9abe3fd2\") " Dec 09 12:08:45 crc kubenswrapper[5002]: I1209 12:08:45.203310 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a980500d-11f1-45f3-8eeb-9bec9abe3fd2-ceph" (OuterVolumeSpecName: "ceph") pod "a980500d-11f1-45f3-8eeb-9bec9abe3fd2" (UID: "a980500d-11f1-45f3-8eeb-9bec9abe3fd2"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:08:45 crc kubenswrapper[5002]: I1209 12:08:45.204009 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a980500d-11f1-45f3-8eeb-9bec9abe3fd2-kube-api-access-qgd2j" (OuterVolumeSpecName: "kube-api-access-qgd2j") pod "a980500d-11f1-45f3-8eeb-9bec9abe3fd2" (UID: "a980500d-11f1-45f3-8eeb-9bec9abe3fd2"). InnerVolumeSpecName "kube-api-access-qgd2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:08:45 crc kubenswrapper[5002]: I1209 12:08:45.230632 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a980500d-11f1-45f3-8eeb-9bec9abe3fd2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a980500d-11f1-45f3-8eeb-9bec9abe3fd2" (UID: "a980500d-11f1-45f3-8eeb-9bec9abe3fd2"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:08:45 crc kubenswrapper[5002]: I1209 12:08:45.240557 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a980500d-11f1-45f3-8eeb-9bec9abe3fd2-inventory" (OuterVolumeSpecName: "inventory") pod "a980500d-11f1-45f3-8eeb-9bec9abe3fd2" (UID: "a980500d-11f1-45f3-8eeb-9bec9abe3fd2"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:08:45 crc kubenswrapper[5002]: I1209 12:08:45.300743 5002 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a980500d-11f1-45f3-8eeb-9bec9abe3fd2-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 09 12:08:45 crc kubenswrapper[5002]: I1209 12:08:45.300780 5002 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a980500d-11f1-45f3-8eeb-9bec9abe3fd2-ceph\") on node \"crc\" DevicePath \"\"" Dec 09 12:08:45 crc kubenswrapper[5002]: I1209 12:08:45.300789 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgd2j\" (UniqueName: \"kubernetes.io/projected/a980500d-11f1-45f3-8eeb-9bec9abe3fd2-kube-api-access-qgd2j\") on node \"crc\" DevicePath \"\"" Dec 09 12:08:45 crc kubenswrapper[5002]: I1209 12:08:45.300798 5002 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a980500d-11f1-45f3-8eeb-9bec9abe3fd2-inventory\") on node \"crc\" DevicePath \"\"" Dec 09 12:08:45 crc kubenswrapper[5002]: I1209 12:08:45.561574 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-hzzxs" event={"ID":"a980500d-11f1-45f3-8eeb-9bec9abe3fd2","Type":"ContainerDied","Data":"212eae0e297a55d40810e677465b10157559cc28e06ef4ad492388b1849b6d6e"} Dec 09 12:08:45 crc kubenswrapper[5002]: I1209 12:08:45.561627 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-hzzxs" Dec 09 12:08:45 crc kubenswrapper[5002]: I1209 12:08:45.561629 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="212eae0e297a55d40810e677465b10157559cc28e06ef4ad492388b1849b6d6e" Dec 09 12:08:46 crc kubenswrapper[5002]: I1209 12:08:46.064435 5002 scope.go:117] "RemoveContainer" containerID="e2529a7a9d52b066d9a89cea25f96d0ac287ad63943546043a02bb681ab4f3e6" Dec 09 12:08:46 crc kubenswrapper[5002]: E1209 12:08:46.065548 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:08:46 crc kubenswrapper[5002]: I1209 12:08:46.079786 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-cell1-hhmxb"] Dec 09 12:08:46 crc kubenswrapper[5002]: E1209 12:08:46.080236 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a980500d-11f1-45f3-8eeb-9bec9abe3fd2" containerName="ceph-client-openstack-openstack-cell1" Dec 09 12:08:46 crc kubenswrapper[5002]: I1209 12:08:46.080259 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="a980500d-11f1-45f3-8eeb-9bec9abe3fd2" containerName="ceph-client-openstack-openstack-cell1" Dec 09 12:08:46 crc kubenswrapper[5002]: I1209 12:08:46.080524 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="a980500d-11f1-45f3-8eeb-9bec9abe3fd2" containerName="ceph-client-openstack-openstack-cell1" Dec 09 12:08:46 crc kubenswrapper[5002]: I1209 12:08:46.081535 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-hhmxb" Dec 09 12:08:46 crc kubenswrapper[5002]: I1209 12:08:46.083771 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 09 12:08:46 crc kubenswrapper[5002]: I1209 12:08:46.083832 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 09 12:08:46 crc kubenswrapper[5002]: I1209 12:08:46.084703 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 09 12:08:46 crc kubenswrapper[5002]: I1209 12:08:46.084937 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 09 12:08:46 crc kubenswrapper[5002]: I1209 12:08:46.086038 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ngftr" Dec 09 12:08:46 crc kubenswrapper[5002]: I1209 12:08:46.098354 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-hhmxb"] Dec 09 12:08:46 crc kubenswrapper[5002]: I1209 12:08:46.220113 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c17d588e-aecc-4b15-b345-a461e9e20a58-inventory\") pod \"ovn-openstack-openstack-cell1-hhmxb\" (UID: \"c17d588e-aecc-4b15-b345-a461e9e20a58\") " pod="openstack/ovn-openstack-openstack-cell1-hhmxb" Dec 09 12:08:46 crc kubenswrapper[5002]: I1209 12:08:46.220246 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b544\" (UniqueName: \"kubernetes.io/projected/c17d588e-aecc-4b15-b345-a461e9e20a58-kube-api-access-4b544\") pod \"ovn-openstack-openstack-cell1-hhmxb\" (UID: \"c17d588e-aecc-4b15-b345-a461e9e20a58\") " pod="openstack/ovn-openstack-openstack-cell1-hhmxb" Dec 09 12:08:46 crc kubenswrapper[5002]: I1209 12:08:46.220330 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c17d588e-aecc-4b15-b345-a461e9e20a58-ssh-key\") pod \"ovn-openstack-openstack-cell1-hhmxb\" (UID: \"c17d588e-aecc-4b15-b345-a461e9e20a58\") " pod="openstack/ovn-openstack-openstack-cell1-hhmxb" Dec 09 12:08:46 crc kubenswrapper[5002]: I1209 12:08:46.220363 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c17d588e-aecc-4b15-b345-a461e9e20a58-ceph\") pod \"ovn-openstack-openstack-cell1-hhmxb\" (UID: \"c17d588e-aecc-4b15-b345-a461e9e20a58\") " pod="openstack/ovn-openstack-openstack-cell1-hhmxb" Dec 09 12:08:46 crc kubenswrapper[5002]: I1209 12:08:46.220640 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c17d588e-aecc-4b15-b345-a461e9e20a58-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-hhmxb\" (UID: \"c17d588e-aecc-4b15-b345-a461e9e20a58\") " pod="openstack/ovn-openstack-openstack-cell1-hhmxb" Dec 09 12:08:46 crc kubenswrapper[5002]: I1209 12:08:46.220686 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/c17d588e-aecc-4b15-b345-a461e9e20a58-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-hhmxb\" (UID: 
\"c17d588e-aecc-4b15-b345-a461e9e20a58\") " pod="openstack/ovn-openstack-openstack-cell1-hhmxb" Dec 09 12:08:46 crc kubenswrapper[5002]: I1209 12:08:46.322296 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c17d588e-aecc-4b15-b345-a461e9e20a58-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-hhmxb\" (UID: \"c17d588e-aecc-4b15-b345-a461e9e20a58\") " pod="openstack/ovn-openstack-openstack-cell1-hhmxb" Dec 09 12:08:46 crc kubenswrapper[5002]: I1209 12:08:46.322349 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/c17d588e-aecc-4b15-b345-a461e9e20a58-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-hhmxb\" (UID: \"c17d588e-aecc-4b15-b345-a461e9e20a58\") " pod="openstack/ovn-openstack-openstack-cell1-hhmxb" Dec 09 12:08:46 crc kubenswrapper[5002]: I1209 12:08:46.322380 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c17d588e-aecc-4b15-b345-a461e9e20a58-inventory\") pod \"ovn-openstack-openstack-cell1-hhmxb\" (UID: \"c17d588e-aecc-4b15-b345-a461e9e20a58\") " pod="openstack/ovn-openstack-openstack-cell1-hhmxb" Dec 09 12:08:46 crc kubenswrapper[5002]: I1209 12:08:46.322427 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b544\" (UniqueName: \"kubernetes.io/projected/c17d588e-aecc-4b15-b345-a461e9e20a58-kube-api-access-4b544\") pod \"ovn-openstack-openstack-cell1-hhmxb\" (UID: \"c17d588e-aecc-4b15-b345-a461e9e20a58\") " pod="openstack/ovn-openstack-openstack-cell1-hhmxb" Dec 09 12:08:46 crc kubenswrapper[5002]: I1209 12:08:46.322453 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c17d588e-aecc-4b15-b345-a461e9e20a58-ssh-key\") pod \"ovn-openstack-openstack-cell1-hhmxb\" (UID: \"c17d588e-aecc-4b15-b345-a461e9e20a58\") " pod="openstack/ovn-openstack-openstack-cell1-hhmxb" Dec 09 12:08:46 crc kubenswrapper[5002]: I1209 12:08:46.322479 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c17d588e-aecc-4b15-b345-a461e9e20a58-ceph\") pod \"ovn-openstack-openstack-cell1-hhmxb\" (UID: \"c17d588e-aecc-4b15-b345-a461e9e20a58\") " pod="openstack/ovn-openstack-openstack-cell1-hhmxb" Dec 09 12:08:46 crc kubenswrapper[5002]: I1209 12:08:46.323386 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/c17d588e-aecc-4b15-b345-a461e9e20a58-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-hhmxb\" (UID: \"c17d588e-aecc-4b15-b345-a461e9e20a58\") " pod="openstack/ovn-openstack-openstack-cell1-hhmxb" Dec 09 12:08:46 crc kubenswrapper[5002]: I1209 12:08:46.329269 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c17d588e-aecc-4b15-b345-a461e9e20a58-ssh-key\") pod \"ovn-openstack-openstack-cell1-hhmxb\" (UID: \"c17d588e-aecc-4b15-b345-a461e9e20a58\") " pod="openstack/ovn-openstack-openstack-cell1-hhmxb" Dec 09 12:08:46 crc kubenswrapper[5002]: I1209 12:08:46.330495 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c17d588e-aecc-4b15-b345-a461e9e20a58-ceph\") pod 
\"ovn-openstack-openstack-cell1-hhmxb\" (UID: \"c17d588e-aecc-4b15-b345-a461e9e20a58\") " pod="openstack/ovn-openstack-openstack-cell1-hhmxb" Dec 09 12:08:46 crc kubenswrapper[5002]: I1209 12:08:46.337370 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c17d588e-aecc-4b15-b345-a461e9e20a58-inventory\") pod \"ovn-openstack-openstack-cell1-hhmxb\" (UID: \"c17d588e-aecc-4b15-b345-a461e9e20a58\") " pod="openstack/ovn-openstack-openstack-cell1-hhmxb" Dec 09 12:08:46 crc kubenswrapper[5002]: I1209 12:08:46.337910 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c17d588e-aecc-4b15-b345-a461e9e20a58-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-hhmxb\" (UID: \"c17d588e-aecc-4b15-b345-a461e9e20a58\") " pod="openstack/ovn-openstack-openstack-cell1-hhmxb" Dec 09 12:08:46 crc kubenswrapper[5002]: I1209 12:08:46.342803 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b544\" (UniqueName: \"kubernetes.io/projected/c17d588e-aecc-4b15-b345-a461e9e20a58-kube-api-access-4b544\") pod \"ovn-openstack-openstack-cell1-hhmxb\" (UID: \"c17d588e-aecc-4b15-b345-a461e9e20a58\") " pod="openstack/ovn-openstack-openstack-cell1-hhmxb" Dec 09 12:08:46 crc kubenswrapper[5002]: I1209 12:08:46.406685 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-hhmxb" Dec 09 12:08:47 crc kubenswrapper[5002]: I1209 12:08:47.009088 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-hhmxb"] Dec 09 12:08:47 crc kubenswrapper[5002]: I1209 12:08:47.594165 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-hhmxb" event={"ID":"c17d588e-aecc-4b15-b345-a461e9e20a58","Type":"ContainerStarted","Data":"c2427246ab6a7be826cf1ad2e9ee45183de75582152e2adc88f8ac08ffaaf478"} Dec 09 12:08:48 crc kubenswrapper[5002]: I1209 12:08:48.094839 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lf6jh"] Dec 09 12:08:48 crc kubenswrapper[5002]: I1209 12:08:48.097374 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lf6jh" Dec 09 12:08:48 crc kubenswrapper[5002]: I1209 12:08:48.105404 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lf6jh"] Dec 09 12:08:48 crc kubenswrapper[5002]: I1209 12:08:48.164437 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8565\" (UniqueName: \"kubernetes.io/projected/34e3e3c9-e348-43f4-b529-94e49179d92b-kube-api-access-n8565\") pod \"redhat-marketplace-lf6jh\" (UID: \"34e3e3c9-e348-43f4-b529-94e49179d92b\") " pod="openshift-marketplace/redhat-marketplace-lf6jh" Dec 09 12:08:48 crc kubenswrapper[5002]: I1209 12:08:48.164947 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34e3e3c9-e348-43f4-b529-94e49179d92b-utilities\") pod \"redhat-marketplace-lf6jh\" (UID: \"34e3e3c9-e348-43f4-b529-94e49179d92b\") " pod="openshift-marketplace/redhat-marketplace-lf6jh" Dec 09 12:08:48 crc kubenswrapper[5002]: I1209 12:08:48.165017 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34e3e3c9-e348-43f4-b529-94e49179d92b-catalog-content\") pod \"redhat-marketplace-lf6jh\" (UID: \"34e3e3c9-e348-43f4-b529-94e49179d92b\") " pod="openshift-marketplace/redhat-marketplace-lf6jh" Dec 09 12:08:48 crc kubenswrapper[5002]: I1209 12:08:48.268543 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8565\" (UniqueName: \"kubernetes.io/projected/34e3e3c9-e348-43f4-b529-94e49179d92b-kube-api-access-n8565\") pod \"redhat-marketplace-lf6jh\" (UID: \"34e3e3c9-e348-43f4-b529-94e49179d92b\") " pod="openshift-marketplace/redhat-marketplace-lf6jh" Dec 09 12:08:48 crc kubenswrapper[5002]: I1209 12:08:48.268668 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34e3e3c9-e348-43f4-b529-94e49179d92b-utilities\") pod \"redhat-marketplace-lf6jh\" (UID: \"34e3e3c9-e348-43f4-b529-94e49179d92b\") " pod="openshift-marketplace/redhat-marketplace-lf6jh" Dec 09 12:08:48 crc kubenswrapper[5002]: I1209 12:08:48.268717 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34e3e3c9-e348-43f4-b529-94e49179d92b-catalog-content\") pod \"redhat-marketplace-lf6jh\" (UID: \"34e3e3c9-e348-43f4-b529-94e49179d92b\") " pod="openshift-marketplace/redhat-marketplace-lf6jh" Dec 09 12:08:48 crc kubenswrapper[5002]: I1209 12:08:48.269427 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34e3e3c9-e348-43f4-b529-94e49179d92b-catalog-content\") pod \"redhat-marketplace-lf6jh\" (UID: \"34e3e3c9-e348-43f4-b529-94e49179d92b\") " pod="openshift-marketplace/redhat-marketplace-lf6jh" Dec 09 12:08:48 crc kubenswrapper[5002]: I1209 12:08:48.269592 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34e3e3c9-e348-43f4-b529-94e49179d92b-utilities\") pod \"redhat-marketplace-lf6jh\" (UID: \"34e3e3c9-e348-43f4-b529-94e49179d92b\") " pod="openshift-marketplace/redhat-marketplace-lf6jh" Dec 09 12:08:48 crc kubenswrapper[5002]: I1209 12:08:48.295971 5002 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-n8565\" (UniqueName: \"kubernetes.io/projected/34e3e3c9-e348-43f4-b529-94e49179d92b-kube-api-access-n8565\") pod \"redhat-marketplace-lf6jh\" (UID: \"34e3e3c9-e348-43f4-b529-94e49179d92b\") " pod="openshift-marketplace/redhat-marketplace-lf6jh" Dec 09 12:08:48 crc kubenswrapper[5002]: I1209 12:08:48.443006 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lf6jh" Dec 09 12:08:48 crc kubenswrapper[5002]: I1209 12:08:48.612235 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-hhmxb" event={"ID":"c17d588e-aecc-4b15-b345-a461e9e20a58","Type":"ContainerStarted","Data":"d2fdc580979c399c391c8256c3eaf2cb91801501be9b9bade5523922b21f5c07"} Dec 09 12:08:48 crc kubenswrapper[5002]: I1209 12:08:48.640300 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-openstack-openstack-cell1-hhmxb" podStartSLOduration=1.924368436 podStartE2EDuration="2.640274367s" podCreationTimestamp="2025-12-09 12:08:46 +0000 UTC" firstStartedPulling="2025-12-09 12:08:47.018494894 +0000 UTC m=+7659.410545975" lastFinishedPulling="2025-12-09 12:08:47.734400825 +0000 UTC m=+7660.126451906" observedRunningTime="2025-12-09 12:08:48.632779087 +0000 UTC m=+7661.024830168" watchObservedRunningTime="2025-12-09 12:08:48.640274367 +0000 UTC m=+7661.032325468" Dec 09 12:08:48 crc kubenswrapper[5002]: I1209 12:08:48.988090 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lf6jh"] Dec 09 12:08:49 crc kubenswrapper[5002]: W1209 12:08:49.008904 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34e3e3c9_e348_43f4_b529_94e49179d92b.slice/crio-c0a0ba7f0c523a4c73afa46c3bee975a602980dae0710e97ee2f4a15bd919dd8 WatchSource:0}: Error finding container c0a0ba7f0c523a4c73afa46c3bee975a602980dae0710e97ee2f4a15bd919dd8: Status 404 returned error can't find the container with id c0a0ba7f0c523a4c73afa46c3bee975a602980dae0710e97ee2f4a15bd919dd8 Dec 09 12:08:49 crc kubenswrapper[5002]: I1209 12:08:49.629752 5002 generic.go:334] "Generic (PLEG): container finished" podID="34e3e3c9-e348-43f4-b529-94e49179d92b" containerID="c6273d6042e741ea4d54b68c52f4b3a1fe014e22fd27589ce55de754d2761fbb" exitCode=0 Dec 09 12:08:49 crc kubenswrapper[5002]: I1209 12:08:49.629792 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lf6jh" event={"ID":"34e3e3c9-e348-43f4-b529-94e49179d92b","Type":"ContainerDied","Data":"c6273d6042e741ea4d54b68c52f4b3a1fe014e22fd27589ce55de754d2761fbb"} Dec 09 12:08:49 crc kubenswrapper[5002]: I1209 12:08:49.630202 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lf6jh" event={"ID":"34e3e3c9-e348-43f4-b529-94e49179d92b","Type":"ContainerStarted","Data":"c0a0ba7f0c523a4c73afa46c3bee975a602980dae0710e97ee2f4a15bd919dd8"} Dec 09 12:08:50 crc kubenswrapper[5002]: I1209 12:08:50.644572 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lf6jh" event={"ID":"34e3e3c9-e348-43f4-b529-94e49179d92b","Type":"ContainerStarted","Data":"aba7664e2e927fc134616067e986dc990cf82ba764da4f3a380a9b0580b8e4fa"} Dec 09 12:08:51 crc kubenswrapper[5002]: I1209 12:08:51.660455 5002 generic.go:334] "Generic (PLEG): container finished" podID="34e3e3c9-e348-43f4-b529-94e49179d92b" 
containerID="aba7664e2e927fc134616067e986dc990cf82ba764da4f3a380a9b0580b8e4fa" exitCode=0 Dec 09 12:08:51 crc kubenswrapper[5002]: I1209 12:08:51.660614 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lf6jh" event={"ID":"34e3e3c9-e348-43f4-b529-94e49179d92b","Type":"ContainerDied","Data":"aba7664e2e927fc134616067e986dc990cf82ba764da4f3a380a9b0580b8e4fa"} Dec 09 12:08:52 crc kubenswrapper[5002]: I1209 12:08:52.675072 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lf6jh" event={"ID":"34e3e3c9-e348-43f4-b529-94e49179d92b","Type":"ContainerStarted","Data":"f9e3f36a4e8e49b34455dbb873a17468d3ba59f29ea4c26cee5d33259ed6553d"} Dec 09 12:08:52 crc kubenswrapper[5002]: I1209 12:08:52.697513 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lf6jh" podStartSLOduration=2.273881367 podStartE2EDuration="4.697493803s" podCreationTimestamp="2025-12-09 12:08:48 +0000 UTC" firstStartedPulling="2025-12-09 12:08:49.633307296 +0000 UTC m=+7662.025358417" lastFinishedPulling="2025-12-09 12:08:52.056919782 +0000 UTC m=+7664.448970853" observedRunningTime="2025-12-09 12:08:52.692598142 +0000 UTC m=+7665.084649243" watchObservedRunningTime="2025-12-09 12:08:52.697493803 +0000 UTC m=+7665.089544884" Dec 09 12:08:57 crc kubenswrapper[5002]: I1209 12:08:57.061088 5002 scope.go:117] "RemoveContainer" containerID="e2529a7a9d52b066d9a89cea25f96d0ac287ad63943546043a02bb681ab4f3e6" Dec 09 12:08:57 crc kubenswrapper[5002]: E1209 12:08:57.062069 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:08:58 crc kubenswrapper[5002]: I1209 12:08:58.443425 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lf6jh" Dec 09 12:08:58 crc kubenswrapper[5002]: I1209 12:08:58.443489 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lf6jh" Dec 09 12:08:58 crc kubenswrapper[5002]: I1209 12:08:58.508334 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lf6jh" Dec 09 12:08:58 crc kubenswrapper[5002]: I1209 12:08:58.793676 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lf6jh" Dec 09 12:08:58 crc kubenswrapper[5002]: I1209 12:08:58.867881 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lf6jh"] Dec 09 12:09:00 crc kubenswrapper[5002]: I1209 12:09:00.761970 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lf6jh" podUID="34e3e3c9-e348-43f4-b529-94e49179d92b" containerName="registry-server" containerID="cri-o://f9e3f36a4e8e49b34455dbb873a17468d3ba59f29ea4c26cee5d33259ed6553d" gracePeriod=2 Dec 09 12:09:01 crc kubenswrapper[5002]: I1209 12:09:01.560045 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lf6jh" Dec 09 12:09:01 crc kubenswrapper[5002]: I1209 12:09:01.709082 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8565\" (UniqueName: \"kubernetes.io/projected/34e3e3c9-e348-43f4-b529-94e49179d92b-kube-api-access-n8565\") pod \"34e3e3c9-e348-43f4-b529-94e49179d92b\" (UID: \"34e3e3c9-e348-43f4-b529-94e49179d92b\") " Dec 09 12:09:01 crc kubenswrapper[5002]: I1209 12:09:01.709226 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34e3e3c9-e348-43f4-b529-94e49179d92b-utilities\") pod \"34e3e3c9-e348-43f4-b529-94e49179d92b\" (UID: \"34e3e3c9-e348-43f4-b529-94e49179d92b\") " Dec 09 12:09:01 crc kubenswrapper[5002]: I1209 12:09:01.709335 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34e3e3c9-e348-43f4-b529-94e49179d92b-catalog-content\") pod \"34e3e3c9-e348-43f4-b529-94e49179d92b\" (UID: \"34e3e3c9-e348-43f4-b529-94e49179d92b\") " Dec 09 12:09:01 crc kubenswrapper[5002]: I1209 12:09:01.710860 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34e3e3c9-e348-43f4-b529-94e49179d92b-utilities" (OuterVolumeSpecName: "utilities") pod "34e3e3c9-e348-43f4-b529-94e49179d92b" (UID: "34e3e3c9-e348-43f4-b529-94e49179d92b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:09:01 crc kubenswrapper[5002]: I1209 12:09:01.722721 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34e3e3c9-e348-43f4-b529-94e49179d92b-kube-api-access-n8565" (OuterVolumeSpecName: "kube-api-access-n8565") pod "34e3e3c9-e348-43f4-b529-94e49179d92b" (UID: "34e3e3c9-e348-43f4-b529-94e49179d92b"). InnerVolumeSpecName "kube-api-access-n8565". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:09:01 crc kubenswrapper[5002]: I1209 12:09:01.750493 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34e3e3c9-e348-43f4-b529-94e49179d92b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "34e3e3c9-e348-43f4-b529-94e49179d92b" (UID: "34e3e3c9-e348-43f4-b529-94e49179d92b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:09:01 crc kubenswrapper[5002]: I1209 12:09:01.773460 5002 generic.go:334] "Generic (PLEG): container finished" podID="34e3e3c9-e348-43f4-b529-94e49179d92b" containerID="f9e3f36a4e8e49b34455dbb873a17468d3ba59f29ea4c26cee5d33259ed6553d" exitCode=0 Dec 09 12:09:01 crc kubenswrapper[5002]: I1209 12:09:01.773505 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lf6jh" event={"ID":"34e3e3c9-e348-43f4-b529-94e49179d92b","Type":"ContainerDied","Data":"f9e3f36a4e8e49b34455dbb873a17468d3ba59f29ea4c26cee5d33259ed6553d"} Dec 09 12:09:01 crc kubenswrapper[5002]: I1209 12:09:01.773544 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lf6jh" event={"ID":"34e3e3c9-e348-43f4-b529-94e49179d92b","Type":"ContainerDied","Data":"c0a0ba7f0c523a4c73afa46c3bee975a602980dae0710e97ee2f4a15bd919dd8"} Dec 09 12:09:01 crc kubenswrapper[5002]: I1209 12:09:01.773561 5002 scope.go:117] "RemoveContainer" containerID="f9e3f36a4e8e49b34455dbb873a17468d3ba59f29ea4c26cee5d33259ed6553d" Dec 09 12:09:01 crc kubenswrapper[5002]: I1209 12:09:01.773564 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lf6jh" Dec 09 12:09:01 crc kubenswrapper[5002]: I1209 12:09:01.807945 5002 scope.go:117] "RemoveContainer" containerID="aba7664e2e927fc134616067e986dc990cf82ba764da4f3a380a9b0580b8e4fa" Dec 09 12:09:01 crc kubenswrapper[5002]: I1209 12:09:01.813360 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8565\" (UniqueName: \"kubernetes.io/projected/34e3e3c9-e348-43f4-b529-94e49179d92b-kube-api-access-n8565\") on node \"crc\" DevicePath \"\"" Dec 09 12:09:01 crc kubenswrapper[5002]: I1209 12:09:01.813400 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34e3e3c9-e348-43f4-b529-94e49179d92b-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 12:09:01 crc kubenswrapper[5002]: I1209 12:09:01.813418 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34e3e3c9-e348-43f4-b529-94e49179d92b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 12:09:01 crc kubenswrapper[5002]: I1209 12:09:01.822939 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lf6jh"] Dec 09 12:09:01 crc kubenswrapper[5002]: I1209 12:09:01.834841 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lf6jh"] Dec 09 12:09:01 crc kubenswrapper[5002]: I1209 12:09:01.843247 5002 scope.go:117] "RemoveContainer" containerID="c6273d6042e741ea4d54b68c52f4b3a1fe014e22fd27589ce55de754d2761fbb" Dec 09 12:09:01 crc kubenswrapper[5002]: I1209 12:09:01.880725 5002 scope.go:117] "RemoveContainer" containerID="f9e3f36a4e8e49b34455dbb873a17468d3ba59f29ea4c26cee5d33259ed6553d" Dec 09 12:09:01 crc kubenswrapper[5002]: E1209 12:09:01.881205 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9e3f36a4e8e49b34455dbb873a17468d3ba59f29ea4c26cee5d33259ed6553d\": container with ID starting with f9e3f36a4e8e49b34455dbb873a17468d3ba59f29ea4c26cee5d33259ed6553d not found: ID does not exist" containerID="f9e3f36a4e8e49b34455dbb873a17468d3ba59f29ea4c26cee5d33259ed6553d" Dec 09 12:09:01 crc kubenswrapper[5002]: I1209 12:09:01.881256 5002 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9e3f36a4e8e49b34455dbb873a17468d3ba59f29ea4c26cee5d33259ed6553d"} err="failed to get container status \"f9e3f36a4e8e49b34455dbb873a17468d3ba59f29ea4c26cee5d33259ed6553d\": rpc error: code = NotFound desc = could not find container \"f9e3f36a4e8e49b34455dbb873a17468d3ba59f29ea4c26cee5d33259ed6553d\": container with ID starting with f9e3f36a4e8e49b34455dbb873a17468d3ba59f29ea4c26cee5d33259ed6553d not found: ID does not exist" Dec 09 12:09:01 crc kubenswrapper[5002]: I1209 12:09:01.881284 5002 scope.go:117] "RemoveContainer" containerID="aba7664e2e927fc134616067e986dc990cf82ba764da4f3a380a9b0580b8e4fa" Dec 09 12:09:01 crc kubenswrapper[5002]: E1209 12:09:01.881862 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aba7664e2e927fc134616067e986dc990cf82ba764da4f3a380a9b0580b8e4fa\": container with ID starting with aba7664e2e927fc134616067e986dc990cf82ba764da4f3a380a9b0580b8e4fa not found: ID does not exist" containerID="aba7664e2e927fc134616067e986dc990cf82ba764da4f3a380a9b0580b8e4fa" Dec 09 12:09:01 crc kubenswrapper[5002]: I1209 12:09:01.881936 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aba7664e2e927fc134616067e986dc990cf82ba764da4f3a380a9b0580b8e4fa"} err="failed to get container status \"aba7664e2e927fc134616067e986dc990cf82ba764da4f3a380a9b0580b8e4fa\": rpc error: code = NotFound desc = could not find container \"aba7664e2e927fc134616067e986dc990cf82ba764da4f3a380a9b0580b8e4fa\": container with ID starting with aba7664e2e927fc134616067e986dc990cf82ba764da4f3a380a9b0580b8e4fa not found: ID does not exist" Dec 09 12:09:01 crc kubenswrapper[5002]: I1209 12:09:01.881976 5002 scope.go:117] "RemoveContainer" containerID="c6273d6042e741ea4d54b68c52f4b3a1fe014e22fd27589ce55de754d2761fbb" Dec 09 12:09:01 crc kubenswrapper[5002]: E1209 12:09:01.882379 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6273d6042e741ea4d54b68c52f4b3a1fe014e22fd27589ce55de754d2761fbb\": container with ID starting with c6273d6042e741ea4d54b68c52f4b3a1fe014e22fd27589ce55de754d2761fbb not found: ID does not exist" containerID="c6273d6042e741ea4d54b68c52f4b3a1fe014e22fd27589ce55de754d2761fbb" Dec 09 12:09:01 crc kubenswrapper[5002]: I1209 12:09:01.882409 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6273d6042e741ea4d54b68c52f4b3a1fe014e22fd27589ce55de754d2761fbb"} err="failed to get container status \"c6273d6042e741ea4d54b68c52f4b3a1fe014e22fd27589ce55de754d2761fbb\": rpc error: code = NotFound desc = could not find container \"c6273d6042e741ea4d54b68c52f4b3a1fe014e22fd27589ce55de754d2761fbb\": container with ID starting with c6273d6042e741ea4d54b68c52f4b3a1fe014e22fd27589ce55de754d2761fbb not found: ID does not exist" Dec 09 12:09:02 crc kubenswrapper[5002]: I1209 12:09:02.088566 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34e3e3c9-e348-43f4-b529-94e49179d92b" path="/var/lib/kubelet/pods/34e3e3c9-e348-43f4-b529-94e49179d92b/volumes" Dec 09 12:09:09 crc kubenswrapper[5002]: I1209 12:09:09.060257 5002 scope.go:117] "RemoveContainer" containerID="e2529a7a9d52b066d9a89cea25f96d0ac287ad63943546043a02bb681ab4f3e6" Dec 09 12:09:09 crc kubenswrapper[5002]: E1209 12:09:09.061030 5002 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:09:24 crc kubenswrapper[5002]: I1209 12:09:24.063092 5002 scope.go:117] "RemoveContainer" containerID="e2529a7a9d52b066d9a89cea25f96d0ac287ad63943546043a02bb681ab4f3e6" Dec 09 12:09:24 crc kubenswrapper[5002]: E1209 12:09:24.063977 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:09:39 crc kubenswrapper[5002]: I1209 12:09:39.061127 5002 scope.go:117] "RemoveContainer" containerID="e2529a7a9d52b066d9a89cea25f96d0ac287ad63943546043a02bb681ab4f3e6" Dec 09 12:09:39 crc kubenswrapper[5002]: E1209 12:09:39.062170 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:09:51 crc kubenswrapper[5002]: I1209 12:09:51.059684 5002 scope.go:117] "RemoveContainer" containerID="e2529a7a9d52b066d9a89cea25f96d0ac287ad63943546043a02bb681ab4f3e6" Dec 09 12:09:51 crc kubenswrapper[5002]: E1209 12:09:51.060589 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:09:54 crc kubenswrapper[5002]: E1209 12:09:54.217169 5002 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc17d588e_aecc_4b15_b345_a461e9e20a58.slice/crio-conmon-d2fdc580979c399c391c8256c3eaf2cb91801501be9b9bade5523922b21f5c07.scope\": RecentStats: unable to find data in memory cache]" Dec 09 12:09:54 crc kubenswrapper[5002]: I1209 12:09:54.295588 5002 generic.go:334] "Generic (PLEG): container finished" podID="c17d588e-aecc-4b15-b345-a461e9e20a58" containerID="d2fdc580979c399c391c8256c3eaf2cb91801501be9b9bade5523922b21f5c07" exitCode=0 Dec 09 12:09:54 crc kubenswrapper[5002]: I1209 12:09:54.295920 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-hhmxb" event={"ID":"c17d588e-aecc-4b15-b345-a461e9e20a58","Type":"ContainerDied","Data":"d2fdc580979c399c391c8256c3eaf2cb91801501be9b9bade5523922b21f5c07"} Dec 09 12:09:55 crc kubenswrapper[5002]: I1209 12:09:55.752959 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-hhmxb" Dec 09 12:09:55 crc kubenswrapper[5002]: I1209 12:09:55.857798 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c17d588e-aecc-4b15-b345-a461e9e20a58-ceph\") pod \"c17d588e-aecc-4b15-b345-a461e9e20a58\" (UID: \"c17d588e-aecc-4b15-b345-a461e9e20a58\") " Dec 09 12:09:55 crc kubenswrapper[5002]: I1209 12:09:55.857868 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4b544\" (UniqueName: \"kubernetes.io/projected/c17d588e-aecc-4b15-b345-a461e9e20a58-kube-api-access-4b544\") pod \"c17d588e-aecc-4b15-b345-a461e9e20a58\" (UID: \"c17d588e-aecc-4b15-b345-a461e9e20a58\") " Dec 09 12:09:55 crc kubenswrapper[5002]: I1209 12:09:55.858098 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c17d588e-aecc-4b15-b345-a461e9e20a58-ovn-combined-ca-bundle\") pod \"c17d588e-aecc-4b15-b345-a461e9e20a58\" (UID: \"c17d588e-aecc-4b15-b345-a461e9e20a58\") " Dec 09 12:09:55 crc kubenswrapper[5002]: I1209 12:09:55.858121 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/c17d588e-aecc-4b15-b345-a461e9e20a58-ovncontroller-config-0\") pod \"c17d588e-aecc-4b15-b345-a461e9e20a58\" (UID: \"c17d588e-aecc-4b15-b345-a461e9e20a58\") " Dec 09 12:09:55 crc kubenswrapper[5002]: I1209 12:09:55.858145 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c17d588e-aecc-4b15-b345-a461e9e20a58-ssh-key\") pod \"c17d588e-aecc-4b15-b345-a461e9e20a58\" (UID: \"c17d588e-aecc-4b15-b345-a461e9e20a58\") " Dec 09 12:09:55 crc kubenswrapper[5002]: I1209 12:09:55.858207 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c17d588e-aecc-4b15-b345-a461e9e20a58-inventory\") pod \"c17d588e-aecc-4b15-b345-a461e9e20a58\" (UID: \"c17d588e-aecc-4b15-b345-a461e9e20a58\") " Dec 09 12:09:55 crc kubenswrapper[5002]: I1209 12:09:55.873078 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c17d588e-aecc-4b15-b345-a461e9e20a58-ceph" (OuterVolumeSpecName: "ceph") pod "c17d588e-aecc-4b15-b345-a461e9e20a58" (UID: "c17d588e-aecc-4b15-b345-a461e9e20a58"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:09:55 crc kubenswrapper[5002]: I1209 12:09:55.873094 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c17d588e-aecc-4b15-b345-a461e9e20a58-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "c17d588e-aecc-4b15-b345-a461e9e20a58" (UID: "c17d588e-aecc-4b15-b345-a461e9e20a58"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:09:55 crc kubenswrapper[5002]: I1209 12:09:55.879011 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c17d588e-aecc-4b15-b345-a461e9e20a58-kube-api-access-4b544" (OuterVolumeSpecName: "kube-api-access-4b544") pod "c17d588e-aecc-4b15-b345-a461e9e20a58" (UID: "c17d588e-aecc-4b15-b345-a461e9e20a58"). InnerVolumeSpecName "kube-api-access-4b544". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:09:55 crc kubenswrapper[5002]: I1209 12:09:55.886710 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c17d588e-aecc-4b15-b345-a461e9e20a58-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "c17d588e-aecc-4b15-b345-a461e9e20a58" (UID: "c17d588e-aecc-4b15-b345-a461e9e20a58"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 12:09:55 crc kubenswrapper[5002]: I1209 12:09:55.902631 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c17d588e-aecc-4b15-b345-a461e9e20a58-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c17d588e-aecc-4b15-b345-a461e9e20a58" (UID: "c17d588e-aecc-4b15-b345-a461e9e20a58"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:09:55 crc kubenswrapper[5002]: I1209 12:09:55.911461 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c17d588e-aecc-4b15-b345-a461e9e20a58-inventory" (OuterVolumeSpecName: "inventory") pod "c17d588e-aecc-4b15-b345-a461e9e20a58" (UID: "c17d588e-aecc-4b15-b345-a461e9e20a58"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:09:55 crc kubenswrapper[5002]: I1209 12:09:55.960731 5002 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c17d588e-aecc-4b15-b345-a461e9e20a58-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 12:09:55 crc kubenswrapper[5002]: I1209 12:09:55.960769 5002 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/c17d588e-aecc-4b15-b345-a461e9e20a58-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 09 12:09:55 crc kubenswrapper[5002]: I1209 12:09:55.960780 5002 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c17d588e-aecc-4b15-b345-a461e9e20a58-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 09 12:09:55 crc kubenswrapper[5002]: I1209 12:09:55.960792 5002 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c17d588e-aecc-4b15-b345-a461e9e20a58-inventory\") on node \"crc\" DevicePath \"\"" Dec 09 12:09:55 crc kubenswrapper[5002]: I1209 12:09:55.960800 5002 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c17d588e-aecc-4b15-b345-a461e9e20a58-ceph\") on node \"crc\" DevicePath \"\"" Dec 09 12:09:55 crc kubenswrapper[5002]: I1209 12:09:55.960823 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4b544\" (UniqueName: \"kubernetes.io/projected/c17d588e-aecc-4b15-b345-a461e9e20a58-kube-api-access-4b544\") on node \"crc\" DevicePath \"\"" Dec 09 12:09:56 crc kubenswrapper[5002]: I1209 12:09:56.321625 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-hhmxb" event={"ID":"c17d588e-aecc-4b15-b345-a461e9e20a58","Type":"ContainerDied","Data":"c2427246ab6a7be826cf1ad2e9ee45183de75582152e2adc88f8ac08ffaaf478"} Dec 09 12:09:56 crc kubenswrapper[5002]: I1209 12:09:56.321686 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2427246ab6a7be826cf1ad2e9ee45183de75582152e2adc88f8ac08ffaaf478" Dec 09 12:09:56 crc kubenswrapper[5002]: I1209 
12:09:56.321721 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-hhmxb" Dec 09 12:09:56 crc kubenswrapper[5002]: I1209 12:09:56.496698 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-wctks"] Dec 09 12:09:56 crc kubenswrapper[5002]: E1209 12:09:56.497153 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34e3e3c9-e348-43f4-b529-94e49179d92b" containerName="extract-content" Dec 09 12:09:56 crc kubenswrapper[5002]: I1209 12:09:56.497170 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="34e3e3c9-e348-43f4-b529-94e49179d92b" containerName="extract-content" Dec 09 12:09:56 crc kubenswrapper[5002]: E1209 12:09:56.497191 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c17d588e-aecc-4b15-b345-a461e9e20a58" containerName="ovn-openstack-openstack-cell1" Dec 09 12:09:56 crc kubenswrapper[5002]: I1209 12:09:56.497198 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="c17d588e-aecc-4b15-b345-a461e9e20a58" containerName="ovn-openstack-openstack-cell1" Dec 09 12:09:56 crc kubenswrapper[5002]: E1209 12:09:56.497218 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34e3e3c9-e348-43f4-b529-94e49179d92b" containerName="extract-utilities" Dec 09 12:09:56 crc kubenswrapper[5002]: I1209 12:09:56.497226 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="34e3e3c9-e348-43f4-b529-94e49179d92b" containerName="extract-utilities" Dec 09 12:09:56 crc kubenswrapper[5002]: E1209 12:09:56.497241 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34e3e3c9-e348-43f4-b529-94e49179d92b" containerName="registry-server" Dec 09 12:09:56 crc kubenswrapper[5002]: I1209 12:09:56.497248 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="34e3e3c9-e348-43f4-b529-94e49179d92b" containerName="registry-server" Dec 09 12:09:56 crc kubenswrapper[5002]: I1209 12:09:56.497435 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="34e3e3c9-e348-43f4-b529-94e49179d92b" containerName="registry-server" Dec 09 12:09:56 crc kubenswrapper[5002]: I1209 12:09:56.497452 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="c17d588e-aecc-4b15-b345-a461e9e20a58" containerName="ovn-openstack-openstack-cell1" Dec 09 12:09:56 crc kubenswrapper[5002]: I1209 12:09:56.498219 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-wctks" Dec 09 12:09:56 crc kubenswrapper[5002]: I1209 12:09:56.501105 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 09 12:09:56 crc kubenswrapper[5002]: I1209 12:09:56.501286 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 09 12:09:56 crc kubenswrapper[5002]: I1209 12:09:56.501337 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ngftr" Dec 09 12:09:56 crc kubenswrapper[5002]: I1209 12:09:56.501371 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 09 12:09:56 crc kubenswrapper[5002]: I1209 12:09:56.501300 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 09 12:09:56 crc kubenswrapper[5002]: I1209 12:09:56.511735 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 09 12:09:56 crc kubenswrapper[5002]: I1209 12:09:56.522864 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-wctks"] Dec 09 12:09:56 crc kubenswrapper[5002]: I1209 12:09:56.680011 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7d587a7b-aec3-4210-b8d7-b428bcf38686-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-wctks\" (UID: \"7d587a7b-aec3-4210-b8d7-b428bcf38686\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-wctks" Dec 09 12:09:56 crc kubenswrapper[5002]: I1209 12:09:56.680072 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7d587a7b-aec3-4210-b8d7-b428bcf38686-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-wctks\" (UID: \"7d587a7b-aec3-4210-b8d7-b428bcf38686\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-wctks" Dec 09 12:09:56 crc kubenswrapper[5002]: I1209 12:09:56.680157 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7d587a7b-aec3-4210-b8d7-b428bcf38686-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-wctks\" (UID: \"7d587a7b-aec3-4210-b8d7-b428bcf38686\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-wctks" Dec 09 12:09:56 crc kubenswrapper[5002]: I1209 12:09:56.680313 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7d587a7b-aec3-4210-b8d7-b428bcf38686-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-wctks\" (UID: \"7d587a7b-aec3-4210-b8d7-b428bcf38686\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-wctks" Dec 09 12:09:56 crc kubenswrapper[5002]: I1209 12:09:56.680404 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4ccf\" (UniqueName: \"kubernetes.io/projected/7d587a7b-aec3-4210-b8d7-b428bcf38686-kube-api-access-n4ccf\") pod \"neutron-metadata-openstack-openstack-cell1-wctks\" (UID: 
\"7d587a7b-aec3-4210-b8d7-b428bcf38686\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-wctks" Dec 09 12:09:56 crc kubenswrapper[5002]: I1209 12:09:56.680449 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d587a7b-aec3-4210-b8d7-b428bcf38686-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-wctks\" (UID: \"7d587a7b-aec3-4210-b8d7-b428bcf38686\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-wctks" Dec 09 12:09:56 crc kubenswrapper[5002]: I1209 12:09:56.680677 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d587a7b-aec3-4210-b8d7-b428bcf38686-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-wctks\" (UID: \"7d587a7b-aec3-4210-b8d7-b428bcf38686\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-wctks" Dec 09 12:09:56 crc kubenswrapper[5002]: I1209 12:09:56.782303 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7d587a7b-aec3-4210-b8d7-b428bcf38686-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-wctks\" (UID: \"7d587a7b-aec3-4210-b8d7-b428bcf38686\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-wctks" Dec 09 12:09:56 crc kubenswrapper[5002]: I1209 12:09:56.782377 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7d587a7b-aec3-4210-b8d7-b428bcf38686-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-wctks\" (UID: \"7d587a7b-aec3-4210-b8d7-b428bcf38686\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-wctks" Dec 09 12:09:56 crc kubenswrapper[5002]: I1209 12:09:56.782487 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7d587a7b-aec3-4210-b8d7-b428bcf38686-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-wctks\" (UID: \"7d587a7b-aec3-4210-b8d7-b428bcf38686\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-wctks" Dec 09 12:09:56 crc kubenswrapper[5002]: I1209 12:09:56.782552 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7d587a7b-aec3-4210-b8d7-b428bcf38686-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-wctks\" (UID: \"7d587a7b-aec3-4210-b8d7-b428bcf38686\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-wctks" Dec 09 12:09:56 crc kubenswrapper[5002]: I1209 12:09:56.782602 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4ccf\" (UniqueName: \"kubernetes.io/projected/7d587a7b-aec3-4210-b8d7-b428bcf38686-kube-api-access-n4ccf\") pod \"neutron-metadata-openstack-openstack-cell1-wctks\" (UID: \"7d587a7b-aec3-4210-b8d7-b428bcf38686\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-wctks" Dec 09 12:09:56 crc kubenswrapper[5002]: I1209 12:09:56.782642 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d587a7b-aec3-4210-b8d7-b428bcf38686-neutron-metadata-combined-ca-bundle\") pod 
\"neutron-metadata-openstack-openstack-cell1-wctks\" (UID: \"7d587a7b-aec3-4210-b8d7-b428bcf38686\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-wctks" Dec 09 12:09:56 crc kubenswrapper[5002]: I1209 12:09:56.782719 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d587a7b-aec3-4210-b8d7-b428bcf38686-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-wctks\" (UID: \"7d587a7b-aec3-4210-b8d7-b428bcf38686\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-wctks" Dec 09 12:09:56 crc kubenswrapper[5002]: I1209 12:09:56.787181 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7d587a7b-aec3-4210-b8d7-b428bcf38686-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-wctks\" (UID: \"7d587a7b-aec3-4210-b8d7-b428bcf38686\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-wctks" Dec 09 12:09:56 crc kubenswrapper[5002]: I1209 12:09:56.787276 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7d587a7b-aec3-4210-b8d7-b428bcf38686-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-wctks\" (UID: \"7d587a7b-aec3-4210-b8d7-b428bcf38686\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-wctks" Dec 09 12:09:56 crc kubenswrapper[5002]: I1209 12:09:56.787781 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7d587a7b-aec3-4210-b8d7-b428bcf38686-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-wctks\" (UID: \"7d587a7b-aec3-4210-b8d7-b428bcf38686\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-wctks" Dec 09 12:09:56 crc kubenswrapper[5002]: I1209 12:09:56.788114 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d587a7b-aec3-4210-b8d7-b428bcf38686-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-wctks\" (UID: \"7d587a7b-aec3-4210-b8d7-b428bcf38686\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-wctks" Dec 09 12:09:56 crc kubenswrapper[5002]: I1209 12:09:56.788670 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d587a7b-aec3-4210-b8d7-b428bcf38686-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-wctks\" (UID: \"7d587a7b-aec3-4210-b8d7-b428bcf38686\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-wctks" Dec 09 12:09:56 crc kubenswrapper[5002]: I1209 12:09:56.790743 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7d587a7b-aec3-4210-b8d7-b428bcf38686-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-wctks\" (UID: \"7d587a7b-aec3-4210-b8d7-b428bcf38686\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-wctks" Dec 09 12:09:56 crc kubenswrapper[5002]: I1209 12:09:56.803727 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4ccf\" (UniqueName: \"kubernetes.io/projected/7d587a7b-aec3-4210-b8d7-b428bcf38686-kube-api-access-n4ccf\") pod \"neutron-metadata-openstack-openstack-cell1-wctks\" (UID: \"7d587a7b-aec3-4210-b8d7-b428bcf38686\") " 
pod="openstack/neutron-metadata-openstack-openstack-cell1-wctks" Dec 09 12:09:56 crc kubenswrapper[5002]: I1209 12:09:56.820871 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-wctks" Dec 09 12:09:57 crc kubenswrapper[5002]: I1209 12:09:57.380368 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-wctks"] Dec 09 12:09:58 crc kubenswrapper[5002]: I1209 12:09:58.337273 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-wctks" event={"ID":"7d587a7b-aec3-4210-b8d7-b428bcf38686","Type":"ContainerStarted","Data":"8bca25cdb8c40686538019381760286686f410c07dd8d19403ef10cb8b01d9e4"} Dec 09 12:09:58 crc kubenswrapper[5002]: I1209 12:09:58.337783 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-wctks" event={"ID":"7d587a7b-aec3-4210-b8d7-b428bcf38686","Type":"ContainerStarted","Data":"947102950a6281ffcc4a02ebe4e1eb3d31fe5599374027d3d9b41171d7710e87"} Dec 09 12:09:58 crc kubenswrapper[5002]: I1209 12:09:58.358119 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-openstack-openstack-cell1-wctks" podStartSLOduration=1.789084063 podStartE2EDuration="2.358101096s" podCreationTimestamp="2025-12-09 12:09:56 +0000 UTC" firstStartedPulling="2025-12-09 12:09:57.394942678 +0000 UTC m=+7729.786993759" lastFinishedPulling="2025-12-09 12:09:57.963959701 +0000 UTC m=+7730.356010792" observedRunningTime="2025-12-09 12:09:58.353493182 +0000 UTC m=+7730.745544263" watchObservedRunningTime="2025-12-09 12:09:58.358101096 +0000 UTC m=+7730.750152177" Dec 09 12:10:02 crc kubenswrapper[5002]: I1209 12:10:02.061429 5002 scope.go:117] "RemoveContainer" containerID="e2529a7a9d52b066d9a89cea25f96d0ac287ad63943546043a02bb681ab4f3e6" Dec 09 12:10:02 crc kubenswrapper[5002]: E1209 12:10:02.062396 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:10:17 crc kubenswrapper[5002]: I1209 12:10:17.060655 5002 scope.go:117] "RemoveContainer" containerID="e2529a7a9d52b066d9a89cea25f96d0ac287ad63943546043a02bb681ab4f3e6" Dec 09 12:10:17 crc kubenswrapper[5002]: I1209 12:10:17.552704 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerStarted","Data":"65f13888df0eef768bcada705a22c1b4b809be0c915a856082867f96060629f5"} Dec 09 12:10:51 crc kubenswrapper[5002]: I1209 12:10:51.930595 5002 generic.go:334] "Generic (PLEG): container finished" podID="7d587a7b-aec3-4210-b8d7-b428bcf38686" containerID="8bca25cdb8c40686538019381760286686f410c07dd8d19403ef10cb8b01d9e4" exitCode=0 Dec 09 12:10:51 crc kubenswrapper[5002]: I1209 12:10:51.930681 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-wctks" event={"ID":"7d587a7b-aec3-4210-b8d7-b428bcf38686","Type":"ContainerDied","Data":"8bca25cdb8c40686538019381760286686f410c07dd8d19403ef10cb8b01d9e4"} Dec 09 
12:10:53 crc kubenswrapper[5002]: I1209 12:10:53.412045 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-wctks" Dec 09 12:10:53 crc kubenswrapper[5002]: I1209 12:10:53.443346 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d587a7b-aec3-4210-b8d7-b428bcf38686-neutron-metadata-combined-ca-bundle\") pod \"7d587a7b-aec3-4210-b8d7-b428bcf38686\" (UID: \"7d587a7b-aec3-4210-b8d7-b428bcf38686\") " Dec 09 12:10:53 crc kubenswrapper[5002]: I1209 12:10:53.443493 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4ccf\" (UniqueName: \"kubernetes.io/projected/7d587a7b-aec3-4210-b8d7-b428bcf38686-kube-api-access-n4ccf\") pod \"7d587a7b-aec3-4210-b8d7-b428bcf38686\" (UID: \"7d587a7b-aec3-4210-b8d7-b428bcf38686\") " Dec 09 12:10:53 crc kubenswrapper[5002]: I1209 12:10:53.443521 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7d587a7b-aec3-4210-b8d7-b428bcf38686-ssh-key\") pod \"7d587a7b-aec3-4210-b8d7-b428bcf38686\" (UID: \"7d587a7b-aec3-4210-b8d7-b428bcf38686\") " Dec 09 12:10:53 crc kubenswrapper[5002]: I1209 12:10:53.443542 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d587a7b-aec3-4210-b8d7-b428bcf38686-inventory\") pod \"7d587a7b-aec3-4210-b8d7-b428bcf38686\" (UID: \"7d587a7b-aec3-4210-b8d7-b428bcf38686\") " Dec 09 12:10:53 crc kubenswrapper[5002]: I1209 12:10:53.443573 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7d587a7b-aec3-4210-b8d7-b428bcf38686-neutron-ovn-metadata-agent-neutron-config-0\") pod \"7d587a7b-aec3-4210-b8d7-b428bcf38686\" (UID: \"7d587a7b-aec3-4210-b8d7-b428bcf38686\") " Dec 09 12:10:53 crc kubenswrapper[5002]: I1209 12:10:53.443666 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7d587a7b-aec3-4210-b8d7-b428bcf38686-ceph\") pod \"7d587a7b-aec3-4210-b8d7-b428bcf38686\" (UID: \"7d587a7b-aec3-4210-b8d7-b428bcf38686\") " Dec 09 12:10:53 crc kubenswrapper[5002]: I1209 12:10:53.443704 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7d587a7b-aec3-4210-b8d7-b428bcf38686-nova-metadata-neutron-config-0\") pod \"7d587a7b-aec3-4210-b8d7-b428bcf38686\" (UID: \"7d587a7b-aec3-4210-b8d7-b428bcf38686\") " Dec 09 12:10:53 crc kubenswrapper[5002]: I1209 12:10:53.452556 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d587a7b-aec3-4210-b8d7-b428bcf38686-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "7d587a7b-aec3-4210-b8d7-b428bcf38686" (UID: "7d587a7b-aec3-4210-b8d7-b428bcf38686"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:10:53 crc kubenswrapper[5002]: I1209 12:10:53.460435 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d587a7b-aec3-4210-b8d7-b428bcf38686-kube-api-access-n4ccf" (OuterVolumeSpecName: "kube-api-access-n4ccf") pod "7d587a7b-aec3-4210-b8d7-b428bcf38686" (UID: "7d587a7b-aec3-4210-b8d7-b428bcf38686"). InnerVolumeSpecName "kube-api-access-n4ccf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:10:53 crc kubenswrapper[5002]: I1209 12:10:53.463856 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d587a7b-aec3-4210-b8d7-b428bcf38686-ceph" (OuterVolumeSpecName: "ceph") pod "7d587a7b-aec3-4210-b8d7-b428bcf38686" (UID: "7d587a7b-aec3-4210-b8d7-b428bcf38686"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:10:53 crc kubenswrapper[5002]: I1209 12:10:53.480248 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d587a7b-aec3-4210-b8d7-b428bcf38686-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7d587a7b-aec3-4210-b8d7-b428bcf38686" (UID: "7d587a7b-aec3-4210-b8d7-b428bcf38686"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:10:53 crc kubenswrapper[5002]: I1209 12:10:53.487973 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d587a7b-aec3-4210-b8d7-b428bcf38686-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "7d587a7b-aec3-4210-b8d7-b428bcf38686" (UID: "7d587a7b-aec3-4210-b8d7-b428bcf38686"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:10:53 crc kubenswrapper[5002]: I1209 12:10:53.498979 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d587a7b-aec3-4210-b8d7-b428bcf38686-inventory" (OuterVolumeSpecName: "inventory") pod "7d587a7b-aec3-4210-b8d7-b428bcf38686" (UID: "7d587a7b-aec3-4210-b8d7-b428bcf38686"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:10:53 crc kubenswrapper[5002]: I1209 12:10:53.517149 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d587a7b-aec3-4210-b8d7-b428bcf38686-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "7d587a7b-aec3-4210-b8d7-b428bcf38686" (UID: "7d587a7b-aec3-4210-b8d7-b428bcf38686"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:10:53 crc kubenswrapper[5002]: I1209 12:10:53.546615 5002 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d587a7b-aec3-4210-b8d7-b428bcf38686-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 12:10:53 crc kubenswrapper[5002]: I1209 12:10:53.546658 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4ccf\" (UniqueName: \"kubernetes.io/projected/7d587a7b-aec3-4210-b8d7-b428bcf38686-kube-api-access-n4ccf\") on node \"crc\" DevicePath \"\"" Dec 09 12:10:53 crc kubenswrapper[5002]: I1209 12:10:53.546670 5002 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7d587a7b-aec3-4210-b8d7-b428bcf38686-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 09 12:10:53 crc kubenswrapper[5002]: I1209 12:10:53.546678 5002 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d587a7b-aec3-4210-b8d7-b428bcf38686-inventory\") on node \"crc\" DevicePath \"\"" Dec 09 12:10:53 crc kubenswrapper[5002]: I1209 12:10:53.546688 5002 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7d587a7b-aec3-4210-b8d7-b428bcf38686-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 09 12:10:53 crc kubenswrapper[5002]: I1209 12:10:53.546699 5002 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7d587a7b-aec3-4210-b8d7-b428bcf38686-ceph\") on node \"crc\" DevicePath \"\"" Dec 09 12:10:53 crc kubenswrapper[5002]: I1209 12:10:53.546709 5002 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7d587a7b-aec3-4210-b8d7-b428bcf38686-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 09 12:10:53 crc kubenswrapper[5002]: I1209 12:10:53.950748 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-wctks" event={"ID":"7d587a7b-aec3-4210-b8d7-b428bcf38686","Type":"ContainerDied","Data":"947102950a6281ffcc4a02ebe4e1eb3d31fe5599374027d3d9b41171d7710e87"} Dec 09 12:10:53 crc kubenswrapper[5002]: I1209 12:10:53.951361 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="947102950a6281ffcc4a02ebe4e1eb3d31fe5599374027d3d9b41171d7710e87" Dec 09 12:10:53 crc kubenswrapper[5002]: I1209 12:10:53.950862 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-wctks" Dec 09 12:10:54 crc kubenswrapper[5002]: I1209 12:10:54.073130 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-mkzb6"] Dec 09 12:10:54 crc kubenswrapper[5002]: E1209 12:10:54.073577 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d587a7b-aec3-4210-b8d7-b428bcf38686" containerName="neutron-metadata-openstack-openstack-cell1" Dec 09 12:10:54 crc kubenswrapper[5002]: I1209 12:10:54.073602 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d587a7b-aec3-4210-b8d7-b428bcf38686" containerName="neutron-metadata-openstack-openstack-cell1" Dec 09 12:10:54 crc kubenswrapper[5002]: I1209 12:10:54.073865 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d587a7b-aec3-4210-b8d7-b428bcf38686" containerName="neutron-metadata-openstack-openstack-cell1" Dec 09 12:10:54 crc kubenswrapper[5002]: I1209 12:10:54.074735 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-mkzb6"] Dec 09 12:10:54 crc kubenswrapper[5002]: I1209 12:10:54.074854 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-mkzb6" Dec 09 12:10:54 crc kubenswrapper[5002]: I1209 12:10:54.078175 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 09 12:10:54 crc kubenswrapper[5002]: I1209 12:10:54.078426 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 09 12:10:54 crc kubenswrapper[5002]: I1209 12:10:54.078616 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 09 12:10:54 crc kubenswrapper[5002]: I1209 12:10:54.078725 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ngftr" Dec 09 12:10:54 crc kubenswrapper[5002]: I1209 12:10:54.078760 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 09 12:10:54 crc kubenswrapper[5002]: I1209 12:10:54.160135 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b8783f64-c4b2-43fa-bc56-b5eb5d5c0345-ssh-key\") pod \"libvirt-openstack-openstack-cell1-mkzb6\" (UID: \"b8783f64-c4b2-43fa-bc56-b5eb5d5c0345\") " pod="openstack/libvirt-openstack-openstack-cell1-mkzb6" Dec 09 12:10:54 crc kubenswrapper[5002]: I1209 12:10:54.160213 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j5gx\" (UniqueName: \"kubernetes.io/projected/b8783f64-c4b2-43fa-bc56-b5eb5d5c0345-kube-api-access-5j5gx\") pod \"libvirt-openstack-openstack-cell1-mkzb6\" (UID: \"b8783f64-c4b2-43fa-bc56-b5eb5d5c0345\") " pod="openstack/libvirt-openstack-openstack-cell1-mkzb6" Dec 09 12:10:54 crc kubenswrapper[5002]: I1209 12:10:54.160403 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b8783f64-c4b2-43fa-bc56-b5eb5d5c0345-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-mkzb6\" (UID: \"b8783f64-c4b2-43fa-bc56-b5eb5d5c0345\") " pod="openstack/libvirt-openstack-openstack-cell1-mkzb6" Dec 09 12:10:54 crc kubenswrapper[5002]: I1209 12:10:54.160528 5002 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b8783f64-c4b2-43fa-bc56-b5eb5d5c0345-ceph\") pod \"libvirt-openstack-openstack-cell1-mkzb6\" (UID: \"b8783f64-c4b2-43fa-bc56-b5eb5d5c0345\") " pod="openstack/libvirt-openstack-openstack-cell1-mkzb6" Dec 09 12:10:54 crc kubenswrapper[5002]: I1209 12:10:54.160630 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8783f64-c4b2-43fa-bc56-b5eb5d5c0345-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-mkzb6\" (UID: \"b8783f64-c4b2-43fa-bc56-b5eb5d5c0345\") " pod="openstack/libvirt-openstack-openstack-cell1-mkzb6" Dec 09 12:10:54 crc kubenswrapper[5002]: I1209 12:10:54.160849 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8783f64-c4b2-43fa-bc56-b5eb5d5c0345-inventory\") pod \"libvirt-openstack-openstack-cell1-mkzb6\" (UID: \"b8783f64-c4b2-43fa-bc56-b5eb5d5c0345\") " pod="openstack/libvirt-openstack-openstack-cell1-mkzb6" Dec 09 12:10:54 crc kubenswrapper[5002]: I1209 12:10:54.262648 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b8783f64-c4b2-43fa-bc56-b5eb5d5c0345-ssh-key\") pod \"libvirt-openstack-openstack-cell1-mkzb6\" (UID: \"b8783f64-c4b2-43fa-bc56-b5eb5d5c0345\") " pod="openstack/libvirt-openstack-openstack-cell1-mkzb6" Dec 09 12:10:54 crc kubenswrapper[5002]: I1209 12:10:54.262836 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5j5gx\" (UniqueName: \"kubernetes.io/projected/b8783f64-c4b2-43fa-bc56-b5eb5d5c0345-kube-api-access-5j5gx\") pod \"libvirt-openstack-openstack-cell1-mkzb6\" (UID: \"b8783f64-c4b2-43fa-bc56-b5eb5d5c0345\") " pod="openstack/libvirt-openstack-openstack-cell1-mkzb6" Dec 09 12:10:54 crc kubenswrapper[5002]: I1209 12:10:54.262918 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b8783f64-c4b2-43fa-bc56-b5eb5d5c0345-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-mkzb6\" (UID: \"b8783f64-c4b2-43fa-bc56-b5eb5d5c0345\") " pod="openstack/libvirt-openstack-openstack-cell1-mkzb6" Dec 09 12:10:54 crc kubenswrapper[5002]: I1209 12:10:54.262972 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b8783f64-c4b2-43fa-bc56-b5eb5d5c0345-ceph\") pod \"libvirt-openstack-openstack-cell1-mkzb6\" (UID: \"b8783f64-c4b2-43fa-bc56-b5eb5d5c0345\") " pod="openstack/libvirt-openstack-openstack-cell1-mkzb6" Dec 09 12:10:54 crc kubenswrapper[5002]: I1209 12:10:54.263020 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8783f64-c4b2-43fa-bc56-b5eb5d5c0345-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-mkzb6\" (UID: \"b8783f64-c4b2-43fa-bc56-b5eb5d5c0345\") " pod="openstack/libvirt-openstack-openstack-cell1-mkzb6" Dec 09 12:10:54 crc kubenswrapper[5002]: I1209 12:10:54.263134 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8783f64-c4b2-43fa-bc56-b5eb5d5c0345-inventory\") pod \"libvirt-openstack-openstack-cell1-mkzb6\" (UID: 
\"b8783f64-c4b2-43fa-bc56-b5eb5d5c0345\") " pod="openstack/libvirt-openstack-openstack-cell1-mkzb6" Dec 09 12:10:54 crc kubenswrapper[5002]: I1209 12:10:54.268094 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8783f64-c4b2-43fa-bc56-b5eb5d5c0345-inventory\") pod \"libvirt-openstack-openstack-cell1-mkzb6\" (UID: \"b8783f64-c4b2-43fa-bc56-b5eb5d5c0345\") " pod="openstack/libvirt-openstack-openstack-cell1-mkzb6" Dec 09 12:10:54 crc kubenswrapper[5002]: I1209 12:10:54.268468 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8783f64-c4b2-43fa-bc56-b5eb5d5c0345-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-mkzb6\" (UID: \"b8783f64-c4b2-43fa-bc56-b5eb5d5c0345\") " pod="openstack/libvirt-openstack-openstack-cell1-mkzb6" Dec 09 12:10:54 crc kubenswrapper[5002]: I1209 12:10:54.269296 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b8783f64-c4b2-43fa-bc56-b5eb5d5c0345-ceph\") pod \"libvirt-openstack-openstack-cell1-mkzb6\" (UID: \"b8783f64-c4b2-43fa-bc56-b5eb5d5c0345\") " pod="openstack/libvirt-openstack-openstack-cell1-mkzb6" Dec 09 12:10:54 crc kubenswrapper[5002]: I1209 12:10:54.272115 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b8783f64-c4b2-43fa-bc56-b5eb5d5c0345-ssh-key\") pod \"libvirt-openstack-openstack-cell1-mkzb6\" (UID: \"b8783f64-c4b2-43fa-bc56-b5eb5d5c0345\") " pod="openstack/libvirt-openstack-openstack-cell1-mkzb6" Dec 09 12:10:54 crc kubenswrapper[5002]: I1209 12:10:54.273977 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b8783f64-c4b2-43fa-bc56-b5eb5d5c0345-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-mkzb6\" (UID: \"b8783f64-c4b2-43fa-bc56-b5eb5d5c0345\") " pod="openstack/libvirt-openstack-openstack-cell1-mkzb6" Dec 09 12:10:54 crc kubenswrapper[5002]: I1209 12:10:54.278796 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j5gx\" (UniqueName: \"kubernetes.io/projected/b8783f64-c4b2-43fa-bc56-b5eb5d5c0345-kube-api-access-5j5gx\") pod \"libvirt-openstack-openstack-cell1-mkzb6\" (UID: \"b8783f64-c4b2-43fa-bc56-b5eb5d5c0345\") " pod="openstack/libvirt-openstack-openstack-cell1-mkzb6" Dec 09 12:10:54 crc kubenswrapper[5002]: I1209 12:10:54.402402 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-mkzb6" Dec 09 12:10:54 crc kubenswrapper[5002]: I1209 12:10:54.960938 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-mkzb6"] Dec 09 12:10:54 crc kubenswrapper[5002]: I1209 12:10:54.965488 5002 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 12:10:55 crc kubenswrapper[5002]: I1209 12:10:55.971366 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-mkzb6" event={"ID":"b8783f64-c4b2-43fa-bc56-b5eb5d5c0345","Type":"ContainerStarted","Data":"e82ef753b115147ce979b77999dfa7ec87fb122e9f6a76b97edce155e9b37732"} Dec 09 12:10:55 crc kubenswrapper[5002]: I1209 12:10:55.975103 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-mkzb6" event={"ID":"b8783f64-c4b2-43fa-bc56-b5eb5d5c0345","Type":"ContainerStarted","Data":"9f4bbb6944f0ad9f639e033f0e1e9e8d01edbd61866475ccf896200489e51324"} Dec 09 12:10:55 crc kubenswrapper[5002]: I1209 12:10:55.994930 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-openstack-openstack-cell1-mkzb6" podStartSLOduration=1.505367953 podStartE2EDuration="1.994895685s" podCreationTimestamp="2025-12-09 12:10:54 +0000 UTC" firstStartedPulling="2025-12-09 12:10:54.965128952 +0000 UTC m=+7787.357180033" lastFinishedPulling="2025-12-09 12:10:55.454656684 +0000 UTC m=+7787.846707765" observedRunningTime="2025-12-09 12:10:55.9917275 +0000 UTC m=+7788.383778581" watchObservedRunningTime="2025-12-09 12:10:55.994895685 +0000 UTC m=+7788.386946766" Dec 09 12:11:06 crc kubenswrapper[5002]: I1209 12:11:06.602540 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tngvs"] Dec 09 12:11:06 crc kubenswrapper[5002]: I1209 12:11:06.606056 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tngvs" Dec 09 12:11:06 crc kubenswrapper[5002]: I1209 12:11:06.616883 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tngvs"] Dec 09 12:11:06 crc kubenswrapper[5002]: I1209 12:11:06.628669 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmlv4\" (UniqueName: \"kubernetes.io/projected/3a809f38-ab68-4052-9c46-8a0f46161053-kube-api-access-nmlv4\") pod \"certified-operators-tngvs\" (UID: \"3a809f38-ab68-4052-9c46-8a0f46161053\") " pod="openshift-marketplace/certified-operators-tngvs" Dec 09 12:11:06 crc kubenswrapper[5002]: I1209 12:11:06.628942 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a809f38-ab68-4052-9c46-8a0f46161053-catalog-content\") pod \"certified-operators-tngvs\" (UID: \"3a809f38-ab68-4052-9c46-8a0f46161053\") " pod="openshift-marketplace/certified-operators-tngvs" Dec 09 12:11:06 crc kubenswrapper[5002]: I1209 12:11:06.629115 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a809f38-ab68-4052-9c46-8a0f46161053-utilities\") pod \"certified-operators-tngvs\" (UID: \"3a809f38-ab68-4052-9c46-8a0f46161053\") " pod="openshift-marketplace/certified-operators-tngvs" Dec 09 12:11:06 crc kubenswrapper[5002]: I1209 12:11:06.730984 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a809f38-ab68-4052-9c46-8a0f46161053-utilities\") pod \"certified-operators-tngvs\" (UID: \"3a809f38-ab68-4052-9c46-8a0f46161053\") " pod="openshift-marketplace/certified-operators-tngvs" Dec 09 12:11:06 crc kubenswrapper[5002]: I1209 12:11:06.731090 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmlv4\" (UniqueName: \"kubernetes.io/projected/3a809f38-ab68-4052-9c46-8a0f46161053-kube-api-access-nmlv4\") pod \"certified-operators-tngvs\" (UID: \"3a809f38-ab68-4052-9c46-8a0f46161053\") " pod="openshift-marketplace/certified-operators-tngvs" Dec 09 12:11:06 crc kubenswrapper[5002]: I1209 12:11:06.731243 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a809f38-ab68-4052-9c46-8a0f46161053-catalog-content\") pod \"certified-operators-tngvs\" (UID: \"3a809f38-ab68-4052-9c46-8a0f46161053\") " pod="openshift-marketplace/certified-operators-tngvs" Dec 09 12:11:06 crc kubenswrapper[5002]: I1209 12:11:06.731973 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a809f38-ab68-4052-9c46-8a0f46161053-catalog-content\") pod \"certified-operators-tngvs\" (UID: \"3a809f38-ab68-4052-9c46-8a0f46161053\") " pod="openshift-marketplace/certified-operators-tngvs" Dec 09 12:11:06 crc kubenswrapper[5002]: I1209 12:11:06.732226 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a809f38-ab68-4052-9c46-8a0f46161053-utilities\") pod \"certified-operators-tngvs\" (UID: \"3a809f38-ab68-4052-9c46-8a0f46161053\") " pod="openshift-marketplace/certified-operators-tngvs" Dec 09 12:11:06 crc kubenswrapper[5002]: I1209 12:11:06.757094 5002 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-nmlv4\" (UniqueName: \"kubernetes.io/projected/3a809f38-ab68-4052-9c46-8a0f46161053-kube-api-access-nmlv4\") pod \"certified-operators-tngvs\" (UID: \"3a809f38-ab68-4052-9c46-8a0f46161053\") " pod="openshift-marketplace/certified-operators-tngvs" Dec 09 12:11:06 crc kubenswrapper[5002]: I1209 12:11:06.933539 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tngvs" Dec 09 12:11:07 crc kubenswrapper[5002]: I1209 12:11:07.540633 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tngvs"] Dec 09 12:11:08 crc kubenswrapper[5002]: I1209 12:11:08.176499 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tngvs" event={"ID":"3a809f38-ab68-4052-9c46-8a0f46161053","Type":"ContainerDied","Data":"269bb5cd278a707aff909db01af5aef5710c5f134d757fb8a5021d00a3b66eea"} Dec 09 12:11:08 crc kubenswrapper[5002]: I1209 12:11:08.179748 5002 generic.go:334] "Generic (PLEG): container finished" podID="3a809f38-ab68-4052-9c46-8a0f46161053" containerID="269bb5cd278a707aff909db01af5aef5710c5f134d757fb8a5021d00a3b66eea" exitCode=0 Dec 09 12:11:08 crc kubenswrapper[5002]: I1209 12:11:08.181683 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tngvs" event={"ID":"3a809f38-ab68-4052-9c46-8a0f46161053","Type":"ContainerStarted","Data":"cd8bea0d6b86985766ca813a5df745fdd59fa1de1ba2c955343bb29f42455a72"} Dec 09 12:11:09 crc kubenswrapper[5002]: I1209 12:11:09.193593 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tngvs" event={"ID":"3a809f38-ab68-4052-9c46-8a0f46161053","Type":"ContainerStarted","Data":"45dcd32ab645cade6c66c2c72247425c5765fe6791f6dc733f2e1f3826134b0f"} Dec 09 12:11:10 crc kubenswrapper[5002]: I1209 12:11:10.208517 5002 generic.go:334] "Generic (PLEG): container finished" podID="3a809f38-ab68-4052-9c46-8a0f46161053" containerID="45dcd32ab645cade6c66c2c72247425c5765fe6791f6dc733f2e1f3826134b0f" exitCode=0 Dec 09 12:11:10 crc kubenswrapper[5002]: I1209 12:11:10.208643 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tngvs" event={"ID":"3a809f38-ab68-4052-9c46-8a0f46161053","Type":"ContainerDied","Data":"45dcd32ab645cade6c66c2c72247425c5765fe6791f6dc733f2e1f3826134b0f"} Dec 09 12:11:11 crc kubenswrapper[5002]: I1209 12:11:11.220503 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tngvs" event={"ID":"3a809f38-ab68-4052-9c46-8a0f46161053","Type":"ContainerStarted","Data":"86c924253e6586a2b0e9247fb2e0d53f512ae923b7026a884973bda6714fe2bf"} Dec 09 12:11:11 crc kubenswrapper[5002]: I1209 12:11:11.246934 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tngvs" podStartSLOduration=2.8015898679999998 podStartE2EDuration="5.246911136s" podCreationTimestamp="2025-12-09 12:11:06 +0000 UTC" firstStartedPulling="2025-12-09 12:11:08.179836691 +0000 UTC m=+7800.571887772" lastFinishedPulling="2025-12-09 12:11:10.625157959 +0000 UTC m=+7803.017209040" observedRunningTime="2025-12-09 12:11:11.242391284 +0000 UTC m=+7803.634442365" watchObservedRunningTime="2025-12-09 12:11:11.246911136 +0000 UTC m=+7803.638962237" Dec 09 12:11:16 crc kubenswrapper[5002]: I1209 12:11:16.933918 5002 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/certified-operators-tngvs" Dec 09 12:11:16 crc kubenswrapper[5002]: I1209 12:11:16.934480 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tngvs" Dec 09 12:11:16 crc kubenswrapper[5002]: I1209 12:11:16.999625 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tngvs" Dec 09 12:11:17 crc kubenswrapper[5002]: I1209 12:11:17.322331 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tngvs" Dec 09 12:11:17 crc kubenswrapper[5002]: I1209 12:11:17.375314 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tngvs"] Dec 09 12:11:19 crc kubenswrapper[5002]: I1209 12:11:19.295106 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tngvs" podUID="3a809f38-ab68-4052-9c46-8a0f46161053" containerName="registry-server" containerID="cri-o://86c924253e6586a2b0e9247fb2e0d53f512ae923b7026a884973bda6714fe2bf" gracePeriod=2 Dec 09 12:11:19 crc kubenswrapper[5002]: I1209 12:11:19.907870 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tngvs" Dec 09 12:11:19 crc kubenswrapper[5002]: I1209 12:11:19.927025 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a809f38-ab68-4052-9c46-8a0f46161053-utilities\") pod \"3a809f38-ab68-4052-9c46-8a0f46161053\" (UID: \"3a809f38-ab68-4052-9c46-8a0f46161053\") " Dec 09 12:11:19 crc kubenswrapper[5002]: I1209 12:11:19.929442 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a809f38-ab68-4052-9c46-8a0f46161053-utilities" (OuterVolumeSpecName: "utilities") pod "3a809f38-ab68-4052-9c46-8a0f46161053" (UID: "3a809f38-ab68-4052-9c46-8a0f46161053"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:11:19 crc kubenswrapper[5002]: I1209 12:11:19.931500 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a809f38-ab68-4052-9c46-8a0f46161053-catalog-content\") pod \"3a809f38-ab68-4052-9c46-8a0f46161053\" (UID: \"3a809f38-ab68-4052-9c46-8a0f46161053\") " Dec 09 12:11:19 crc kubenswrapper[5002]: I1209 12:11:19.931571 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmlv4\" (UniqueName: \"kubernetes.io/projected/3a809f38-ab68-4052-9c46-8a0f46161053-kube-api-access-nmlv4\") pod \"3a809f38-ab68-4052-9c46-8a0f46161053\" (UID: \"3a809f38-ab68-4052-9c46-8a0f46161053\") " Dec 09 12:11:19 crc kubenswrapper[5002]: I1209 12:11:19.935399 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a809f38-ab68-4052-9c46-8a0f46161053-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 12:11:19 crc kubenswrapper[5002]: I1209 12:11:19.946145 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a809f38-ab68-4052-9c46-8a0f46161053-kube-api-access-nmlv4" (OuterVolumeSpecName: "kube-api-access-nmlv4") pod "3a809f38-ab68-4052-9c46-8a0f46161053" (UID: "3a809f38-ab68-4052-9c46-8a0f46161053"). 
InnerVolumeSpecName "kube-api-access-nmlv4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:11:19 crc kubenswrapper[5002]: I1209 12:11:19.990777 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a809f38-ab68-4052-9c46-8a0f46161053-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3a809f38-ab68-4052-9c46-8a0f46161053" (UID: "3a809f38-ab68-4052-9c46-8a0f46161053"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:11:20 crc kubenswrapper[5002]: I1209 12:11:20.037015 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a809f38-ab68-4052-9c46-8a0f46161053-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 12:11:20 crc kubenswrapper[5002]: I1209 12:11:20.037053 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmlv4\" (UniqueName: \"kubernetes.io/projected/3a809f38-ab68-4052-9c46-8a0f46161053-kube-api-access-nmlv4\") on node \"crc\" DevicePath \"\"" Dec 09 12:11:20 crc kubenswrapper[5002]: I1209 12:11:20.335332 5002 generic.go:334] "Generic (PLEG): container finished" podID="3a809f38-ab68-4052-9c46-8a0f46161053" containerID="86c924253e6586a2b0e9247fb2e0d53f512ae923b7026a884973bda6714fe2bf" exitCode=0 Dec 09 12:11:20 crc kubenswrapper[5002]: I1209 12:11:20.335399 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tngvs" event={"ID":"3a809f38-ab68-4052-9c46-8a0f46161053","Type":"ContainerDied","Data":"86c924253e6586a2b0e9247fb2e0d53f512ae923b7026a884973bda6714fe2bf"} Dec 09 12:11:20 crc kubenswrapper[5002]: I1209 12:11:20.335433 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tngvs" event={"ID":"3a809f38-ab68-4052-9c46-8a0f46161053","Type":"ContainerDied","Data":"cd8bea0d6b86985766ca813a5df745fdd59fa1de1ba2c955343bb29f42455a72"} Dec 09 12:11:20 crc kubenswrapper[5002]: I1209 12:11:20.335454 5002 scope.go:117] "RemoveContainer" containerID="86c924253e6586a2b0e9247fb2e0d53f512ae923b7026a884973bda6714fe2bf" Dec 09 12:11:20 crc kubenswrapper[5002]: I1209 12:11:20.335422 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tngvs" Dec 09 12:11:20 crc kubenswrapper[5002]: I1209 12:11:20.364128 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tngvs"] Dec 09 12:11:20 crc kubenswrapper[5002]: I1209 12:11:20.375791 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tngvs"] Dec 09 12:11:20 crc kubenswrapper[5002]: I1209 12:11:20.378177 5002 scope.go:117] "RemoveContainer" containerID="45dcd32ab645cade6c66c2c72247425c5765fe6791f6dc733f2e1f3826134b0f" Dec 09 12:11:20 crc kubenswrapper[5002]: I1209 12:11:20.406790 5002 scope.go:117] "RemoveContainer" containerID="269bb5cd278a707aff909db01af5aef5710c5f134d757fb8a5021d00a3b66eea" Dec 09 12:11:20 crc kubenswrapper[5002]: I1209 12:11:20.453463 5002 scope.go:117] "RemoveContainer" containerID="86c924253e6586a2b0e9247fb2e0d53f512ae923b7026a884973bda6714fe2bf" Dec 09 12:11:20 crc kubenswrapper[5002]: E1209 12:11:20.456407 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86c924253e6586a2b0e9247fb2e0d53f512ae923b7026a884973bda6714fe2bf\": container with ID starting with 86c924253e6586a2b0e9247fb2e0d53f512ae923b7026a884973bda6714fe2bf not found: ID does not exist" containerID="86c924253e6586a2b0e9247fb2e0d53f512ae923b7026a884973bda6714fe2bf" Dec 09 12:11:20 crc kubenswrapper[5002]: I1209 12:11:20.456445 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86c924253e6586a2b0e9247fb2e0d53f512ae923b7026a884973bda6714fe2bf"} err="failed to get container status \"86c924253e6586a2b0e9247fb2e0d53f512ae923b7026a884973bda6714fe2bf\": rpc error: code = NotFound desc = could not find container \"86c924253e6586a2b0e9247fb2e0d53f512ae923b7026a884973bda6714fe2bf\": container with ID starting with 86c924253e6586a2b0e9247fb2e0d53f512ae923b7026a884973bda6714fe2bf not found: ID does not exist" Dec 09 12:11:20 crc kubenswrapper[5002]: I1209 12:11:20.456469 5002 scope.go:117] "RemoveContainer" containerID="45dcd32ab645cade6c66c2c72247425c5765fe6791f6dc733f2e1f3826134b0f" Dec 09 12:11:20 crc kubenswrapper[5002]: E1209 12:11:20.458172 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45dcd32ab645cade6c66c2c72247425c5765fe6791f6dc733f2e1f3826134b0f\": container with ID starting with 45dcd32ab645cade6c66c2c72247425c5765fe6791f6dc733f2e1f3826134b0f not found: ID does not exist" containerID="45dcd32ab645cade6c66c2c72247425c5765fe6791f6dc733f2e1f3826134b0f" Dec 09 12:11:20 crc kubenswrapper[5002]: I1209 12:11:20.458199 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45dcd32ab645cade6c66c2c72247425c5765fe6791f6dc733f2e1f3826134b0f"} err="failed to get container status \"45dcd32ab645cade6c66c2c72247425c5765fe6791f6dc733f2e1f3826134b0f\": rpc error: code = NotFound desc = could not find container \"45dcd32ab645cade6c66c2c72247425c5765fe6791f6dc733f2e1f3826134b0f\": container with ID starting with 45dcd32ab645cade6c66c2c72247425c5765fe6791f6dc733f2e1f3826134b0f not found: ID does not exist" Dec 09 12:11:20 crc kubenswrapper[5002]: I1209 12:11:20.458218 5002 scope.go:117] "RemoveContainer" containerID="269bb5cd278a707aff909db01af5aef5710c5f134d757fb8a5021d00a3b66eea" Dec 09 12:11:20 crc kubenswrapper[5002]: E1209 12:11:20.458745 5002 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"269bb5cd278a707aff909db01af5aef5710c5f134d757fb8a5021d00a3b66eea\": container with ID starting with 269bb5cd278a707aff909db01af5aef5710c5f134d757fb8a5021d00a3b66eea not found: ID does not exist" containerID="269bb5cd278a707aff909db01af5aef5710c5f134d757fb8a5021d00a3b66eea" Dec 09 12:11:20 crc kubenswrapper[5002]: I1209 12:11:20.458796 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"269bb5cd278a707aff909db01af5aef5710c5f134d757fb8a5021d00a3b66eea"} err="failed to get container status \"269bb5cd278a707aff909db01af5aef5710c5f134d757fb8a5021d00a3b66eea\": rpc error: code = NotFound desc = could not find container \"269bb5cd278a707aff909db01af5aef5710c5f134d757fb8a5021d00a3b66eea\": container with ID starting with 269bb5cd278a707aff909db01af5aef5710c5f134d757fb8a5021d00a3b66eea not found: ID does not exist" Dec 09 12:11:22 crc kubenswrapper[5002]: I1209 12:11:22.077124 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a809f38-ab68-4052-9c46-8a0f46161053" path="/var/lib/kubelet/pods/3a809f38-ab68-4052-9c46-8a0f46161053/volumes" Dec 09 12:12:37 crc kubenswrapper[5002]: I1209 12:12:37.964770 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:12:37 crc kubenswrapper[5002]: I1209 12:12:37.965334 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:13:07 crc kubenswrapper[5002]: I1209 12:13:07.965315 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:13:07 crc kubenswrapper[5002]: I1209 12:13:07.965912 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:13:37 crc kubenswrapper[5002]: I1209 12:13:37.965144 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:13:37 crc kubenswrapper[5002]: I1209 12:13:37.965656 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:13:37 crc kubenswrapper[5002]: I1209 12:13:37.965707 5002 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" Dec 09 12:13:37 crc kubenswrapper[5002]: I1209 12:13:37.966568 5002 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"65f13888df0eef768bcada705a22c1b4b809be0c915a856082867f96060629f5"} pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 12:13:37 crc kubenswrapper[5002]: I1209 12:13:37.966626 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" containerID="cri-o://65f13888df0eef768bcada705a22c1b4b809be0c915a856082867f96060629f5" gracePeriod=600 Dec 09 12:13:38 crc kubenswrapper[5002]: I1209 12:13:38.833303 5002 generic.go:334] "Generic (PLEG): container finished" podID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerID="65f13888df0eef768bcada705a22c1b4b809be0c915a856082867f96060629f5" exitCode=0 Dec 09 12:13:38 crc kubenswrapper[5002]: I1209 12:13:38.833637 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerDied","Data":"65f13888df0eef768bcada705a22c1b4b809be0c915a856082867f96060629f5"} Dec 09 12:13:38 crc kubenswrapper[5002]: I1209 12:13:38.834433 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerStarted","Data":"7780f1b6ff39bc4223d84da4678f12a7a9ba1a1c0041213b8f77f935907dd6aa"} Dec 09 12:13:38 crc kubenswrapper[5002]: I1209 12:13:38.834485 5002 scope.go:117] "RemoveContainer" containerID="e2529a7a9d52b066d9a89cea25f96d0ac287ad63943546043a02bb681ab4f3e6" Dec 09 12:15:00 crc kubenswrapper[5002]: I1209 12:15:00.173848 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421375-mbbh5"] Dec 09 12:15:00 crc kubenswrapper[5002]: E1209 12:15:00.174857 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a809f38-ab68-4052-9c46-8a0f46161053" containerName="extract-utilities" Dec 09 12:15:00 crc kubenswrapper[5002]: I1209 12:15:00.174873 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a809f38-ab68-4052-9c46-8a0f46161053" containerName="extract-utilities" Dec 09 12:15:00 crc kubenswrapper[5002]: E1209 12:15:00.174880 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a809f38-ab68-4052-9c46-8a0f46161053" containerName="registry-server" Dec 09 12:15:00 crc kubenswrapper[5002]: I1209 12:15:00.174885 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a809f38-ab68-4052-9c46-8a0f46161053" containerName="registry-server" Dec 09 12:15:00 crc kubenswrapper[5002]: E1209 12:15:00.174900 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a809f38-ab68-4052-9c46-8a0f46161053" containerName="extract-content" Dec 09 12:15:00 crc kubenswrapper[5002]: I1209 12:15:00.174907 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a809f38-ab68-4052-9c46-8a0f46161053" containerName="extract-content" Dec 09 12:15:00 crc kubenswrapper[5002]: I1209 12:15:00.175126 5002 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3a809f38-ab68-4052-9c46-8a0f46161053" containerName="registry-server" Dec 09 12:15:00 crc kubenswrapper[5002]: I1209 12:15:00.175937 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421375-mbbh5" Dec 09 12:15:00 crc kubenswrapper[5002]: I1209 12:15:00.179085 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 09 12:15:00 crc kubenswrapper[5002]: I1209 12:15:00.179362 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 09 12:15:00 crc kubenswrapper[5002]: I1209 12:15:00.213909 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421375-mbbh5"] Dec 09 12:15:00 crc kubenswrapper[5002]: I1209 12:15:00.327240 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llkmg\" (UniqueName: \"kubernetes.io/projected/06ef45c4-9c5d-4008-9c09-9e521e865225-kube-api-access-llkmg\") pod \"collect-profiles-29421375-mbbh5\" (UID: \"06ef45c4-9c5d-4008-9c09-9e521e865225\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421375-mbbh5" Dec 09 12:15:00 crc kubenswrapper[5002]: I1209 12:15:00.327388 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06ef45c4-9c5d-4008-9c09-9e521e865225-secret-volume\") pod \"collect-profiles-29421375-mbbh5\" (UID: \"06ef45c4-9c5d-4008-9c09-9e521e865225\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421375-mbbh5" Dec 09 12:15:00 crc kubenswrapper[5002]: I1209 12:15:00.327546 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06ef45c4-9c5d-4008-9c09-9e521e865225-config-volume\") pod \"collect-profiles-29421375-mbbh5\" (UID: \"06ef45c4-9c5d-4008-9c09-9e521e865225\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421375-mbbh5" Dec 09 12:15:00 crc kubenswrapper[5002]: I1209 12:15:00.429593 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06ef45c4-9c5d-4008-9c09-9e521e865225-config-volume\") pod \"collect-profiles-29421375-mbbh5\" (UID: \"06ef45c4-9c5d-4008-9c09-9e521e865225\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421375-mbbh5" Dec 09 12:15:00 crc kubenswrapper[5002]: I1209 12:15:00.430381 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llkmg\" (UniqueName: \"kubernetes.io/projected/06ef45c4-9c5d-4008-9c09-9e521e865225-kube-api-access-llkmg\") pod \"collect-profiles-29421375-mbbh5\" (UID: \"06ef45c4-9c5d-4008-9c09-9e521e865225\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421375-mbbh5" Dec 09 12:15:00 crc kubenswrapper[5002]: I1209 12:15:00.430502 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06ef45c4-9c5d-4008-9c09-9e521e865225-secret-volume\") pod \"collect-profiles-29421375-mbbh5\" (UID: \"06ef45c4-9c5d-4008-9c09-9e521e865225\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421375-mbbh5" Dec 09 12:15:00 crc kubenswrapper[5002]: I1209 12:15:00.432235 
5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06ef45c4-9c5d-4008-9c09-9e521e865225-config-volume\") pod \"collect-profiles-29421375-mbbh5\" (UID: \"06ef45c4-9c5d-4008-9c09-9e521e865225\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421375-mbbh5" Dec 09 12:15:00 crc kubenswrapper[5002]: I1209 12:15:00.441672 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06ef45c4-9c5d-4008-9c09-9e521e865225-secret-volume\") pod \"collect-profiles-29421375-mbbh5\" (UID: \"06ef45c4-9c5d-4008-9c09-9e521e865225\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421375-mbbh5" Dec 09 12:15:00 crc kubenswrapper[5002]: I1209 12:15:00.453057 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llkmg\" (UniqueName: \"kubernetes.io/projected/06ef45c4-9c5d-4008-9c09-9e521e865225-kube-api-access-llkmg\") pod \"collect-profiles-29421375-mbbh5\" (UID: \"06ef45c4-9c5d-4008-9c09-9e521e865225\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421375-mbbh5" Dec 09 12:15:00 crc kubenswrapper[5002]: I1209 12:15:00.498085 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421375-mbbh5" Dec 09 12:15:00 crc kubenswrapper[5002]: I1209 12:15:00.951635 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421375-mbbh5"] Dec 09 12:15:01 crc kubenswrapper[5002]: I1209 12:15:01.057713 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421375-mbbh5" event={"ID":"06ef45c4-9c5d-4008-9c09-9e521e865225","Type":"ContainerStarted","Data":"1682e0156c05587c876d65440959c23d4b489a5e809909682907e81f51b8c5ab"} Dec 09 12:15:02 crc kubenswrapper[5002]: I1209 12:15:02.073588 5002 generic.go:334] "Generic (PLEG): container finished" podID="06ef45c4-9c5d-4008-9c09-9e521e865225" containerID="77d80c17caaac4bf99d2fa7411a4400442715426d9c463128d1dbef9ccaba3ea" exitCode=0 Dec 09 12:15:02 crc kubenswrapper[5002]: I1209 12:15:02.075072 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421375-mbbh5" event={"ID":"06ef45c4-9c5d-4008-9c09-9e521e865225","Type":"ContainerDied","Data":"77d80c17caaac4bf99d2fa7411a4400442715426d9c463128d1dbef9ccaba3ea"} Dec 09 12:15:03 crc kubenswrapper[5002]: I1209 12:15:03.461004 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421375-mbbh5" Dec 09 12:15:03 crc kubenswrapper[5002]: I1209 12:15:03.599145 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06ef45c4-9c5d-4008-9c09-9e521e865225-config-volume\") pod \"06ef45c4-9c5d-4008-9c09-9e521e865225\" (UID: \"06ef45c4-9c5d-4008-9c09-9e521e865225\") " Dec 09 12:15:03 crc kubenswrapper[5002]: I1209 12:15:03.599222 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06ef45c4-9c5d-4008-9c09-9e521e865225-secret-volume\") pod \"06ef45c4-9c5d-4008-9c09-9e521e865225\" (UID: \"06ef45c4-9c5d-4008-9c09-9e521e865225\") " Dec 09 12:15:03 crc kubenswrapper[5002]: I1209 12:15:03.599255 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llkmg\" (UniqueName: \"kubernetes.io/projected/06ef45c4-9c5d-4008-9c09-9e521e865225-kube-api-access-llkmg\") pod \"06ef45c4-9c5d-4008-9c09-9e521e865225\" (UID: \"06ef45c4-9c5d-4008-9c09-9e521e865225\") " Dec 09 12:15:03 crc kubenswrapper[5002]: I1209 12:15:03.599715 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06ef45c4-9c5d-4008-9c09-9e521e865225-config-volume" (OuterVolumeSpecName: "config-volume") pod "06ef45c4-9c5d-4008-9c09-9e521e865225" (UID: "06ef45c4-9c5d-4008-9c09-9e521e865225"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 12:15:03 crc kubenswrapper[5002]: I1209 12:15:03.608906 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06ef45c4-9c5d-4008-9c09-9e521e865225-kube-api-access-llkmg" (OuterVolumeSpecName: "kube-api-access-llkmg") pod "06ef45c4-9c5d-4008-9c09-9e521e865225" (UID: "06ef45c4-9c5d-4008-9c09-9e521e865225"). InnerVolumeSpecName "kube-api-access-llkmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:15:03 crc kubenswrapper[5002]: I1209 12:15:03.609038 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06ef45c4-9c5d-4008-9c09-9e521e865225-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "06ef45c4-9c5d-4008-9c09-9e521e865225" (UID: "06ef45c4-9c5d-4008-9c09-9e521e865225"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:15:03 crc kubenswrapper[5002]: I1209 12:15:03.701855 5002 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06ef45c4-9c5d-4008-9c09-9e521e865225-config-volume\") on node \"crc\" DevicePath \"\"" Dec 09 12:15:03 crc kubenswrapper[5002]: I1209 12:15:03.701908 5002 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06ef45c4-9c5d-4008-9c09-9e521e865225-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 09 12:15:03 crc kubenswrapper[5002]: I1209 12:15:03.701921 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llkmg\" (UniqueName: \"kubernetes.io/projected/06ef45c4-9c5d-4008-9c09-9e521e865225-kube-api-access-llkmg\") on node \"crc\" DevicePath \"\"" Dec 09 12:15:04 crc kubenswrapper[5002]: I1209 12:15:04.095533 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421375-mbbh5" event={"ID":"06ef45c4-9c5d-4008-9c09-9e521e865225","Type":"ContainerDied","Data":"1682e0156c05587c876d65440959c23d4b489a5e809909682907e81f51b8c5ab"} Dec 09 12:15:04 crc kubenswrapper[5002]: I1209 12:15:04.095748 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1682e0156c05587c876d65440959c23d4b489a5e809909682907e81f51b8c5ab" Dec 09 12:15:04 crc kubenswrapper[5002]: I1209 12:15:04.095593 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421375-mbbh5" Dec 09 12:15:04 crc kubenswrapper[5002]: I1209 12:15:04.547984 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421330-gjnvd"] Dec 09 12:15:04 crc kubenswrapper[5002]: I1209 12:15:04.561549 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421330-gjnvd"] Dec 09 12:15:06 crc kubenswrapper[5002]: I1209 12:15:06.071241 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a5df655-0ab1-4b91-8815-ef28939edb6b" path="/var/lib/kubelet/pods/3a5df655-0ab1-4b91-8815-ef28939edb6b/volumes" Dec 09 12:15:43 crc kubenswrapper[5002]: I1209 12:15:43.518947 5002 generic.go:334] "Generic (PLEG): container finished" podID="b8783f64-c4b2-43fa-bc56-b5eb5d5c0345" containerID="e82ef753b115147ce979b77999dfa7ec87fb122e9f6a76b97edce155e9b37732" exitCode=0 Dec 09 12:15:43 crc kubenswrapper[5002]: I1209 12:15:43.519061 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-mkzb6" event={"ID":"b8783f64-c4b2-43fa-bc56-b5eb5d5c0345","Type":"ContainerDied","Data":"e82ef753b115147ce979b77999dfa7ec87fb122e9f6a76b97edce155e9b37732"} Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.033797 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-mkzb6" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.155474 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8783f64-c4b2-43fa-bc56-b5eb5d5c0345-inventory\") pod \"b8783f64-c4b2-43fa-bc56-b5eb5d5c0345\" (UID: \"b8783f64-c4b2-43fa-bc56-b5eb5d5c0345\") " Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.155778 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b8783f64-c4b2-43fa-bc56-b5eb5d5c0345-libvirt-secret-0\") pod \"b8783f64-c4b2-43fa-bc56-b5eb5d5c0345\" (UID: \"b8783f64-c4b2-43fa-bc56-b5eb5d5c0345\") " Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.155864 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b8783f64-c4b2-43fa-bc56-b5eb5d5c0345-ceph\") pod \"b8783f64-c4b2-43fa-bc56-b5eb5d5c0345\" (UID: \"b8783f64-c4b2-43fa-bc56-b5eb5d5c0345\") " Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.155889 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8783f64-c4b2-43fa-bc56-b5eb5d5c0345-libvirt-combined-ca-bundle\") pod \"b8783f64-c4b2-43fa-bc56-b5eb5d5c0345\" (UID: \"b8783f64-c4b2-43fa-bc56-b5eb5d5c0345\") " Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.155943 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b8783f64-c4b2-43fa-bc56-b5eb5d5c0345-ssh-key\") pod \"b8783f64-c4b2-43fa-bc56-b5eb5d5c0345\" (UID: \"b8783f64-c4b2-43fa-bc56-b5eb5d5c0345\") " Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.156068 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5j5gx\" (UniqueName: \"kubernetes.io/projected/b8783f64-c4b2-43fa-bc56-b5eb5d5c0345-kube-api-access-5j5gx\") pod \"b8783f64-c4b2-43fa-bc56-b5eb5d5c0345\" (UID: \"b8783f64-c4b2-43fa-bc56-b5eb5d5c0345\") " Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.160958 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8783f64-c4b2-43fa-bc56-b5eb5d5c0345-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "b8783f64-c4b2-43fa-bc56-b5eb5d5c0345" (UID: "b8783f64-c4b2-43fa-bc56-b5eb5d5c0345"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.161580 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8783f64-c4b2-43fa-bc56-b5eb5d5c0345-ceph" (OuterVolumeSpecName: "ceph") pod "b8783f64-c4b2-43fa-bc56-b5eb5d5c0345" (UID: "b8783f64-c4b2-43fa-bc56-b5eb5d5c0345"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.181010 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8783f64-c4b2-43fa-bc56-b5eb5d5c0345-kube-api-access-5j5gx" (OuterVolumeSpecName: "kube-api-access-5j5gx") pod "b8783f64-c4b2-43fa-bc56-b5eb5d5c0345" (UID: "b8783f64-c4b2-43fa-bc56-b5eb5d5c0345"). InnerVolumeSpecName "kube-api-access-5j5gx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.186259 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8783f64-c4b2-43fa-bc56-b5eb5d5c0345-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "b8783f64-c4b2-43fa-bc56-b5eb5d5c0345" (UID: "b8783f64-c4b2-43fa-bc56-b5eb5d5c0345"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.194008 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8783f64-c4b2-43fa-bc56-b5eb5d5c0345-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b8783f64-c4b2-43fa-bc56-b5eb5d5c0345" (UID: "b8783f64-c4b2-43fa-bc56-b5eb5d5c0345"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.200542 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8783f64-c4b2-43fa-bc56-b5eb5d5c0345-inventory" (OuterVolumeSpecName: "inventory") pod "b8783f64-c4b2-43fa-bc56-b5eb5d5c0345" (UID: "b8783f64-c4b2-43fa-bc56-b5eb5d5c0345"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.258100 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5j5gx\" (UniqueName: \"kubernetes.io/projected/b8783f64-c4b2-43fa-bc56-b5eb5d5c0345-kube-api-access-5j5gx\") on node \"crc\" DevicePath \"\"" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.258134 5002 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8783f64-c4b2-43fa-bc56-b5eb5d5c0345-inventory\") on node \"crc\" DevicePath \"\"" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.258145 5002 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b8783f64-c4b2-43fa-bc56-b5eb5d5c0345-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.258153 5002 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b8783f64-c4b2-43fa-bc56-b5eb5d5c0345-ceph\") on node \"crc\" DevicePath \"\"" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.258163 5002 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8783f64-c4b2-43fa-bc56-b5eb5d5c0345-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.258172 5002 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b8783f64-c4b2-43fa-bc56-b5eb5d5c0345-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.552334 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-mkzb6" event={"ID":"b8783f64-c4b2-43fa-bc56-b5eb5d5c0345","Type":"ContainerDied","Data":"9f4bbb6944f0ad9f639e033f0e1e9e8d01edbd61866475ccf896200489e51324"} Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.552382 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f4bbb6944f0ad9f639e033f0e1e9e8d01edbd61866475ccf896200489e51324" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.552483 5002 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-mkzb6" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.672933 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-mtwlk"] Dec 09 12:15:45 crc kubenswrapper[5002]: E1209 12:15:45.673905 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06ef45c4-9c5d-4008-9c09-9e521e865225" containerName="collect-profiles" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.673936 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="06ef45c4-9c5d-4008-9c09-9e521e865225" containerName="collect-profiles" Dec 09 12:15:45 crc kubenswrapper[5002]: E1209 12:15:45.673976 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8783f64-c4b2-43fa-bc56-b5eb5d5c0345" containerName="libvirt-openstack-openstack-cell1" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.673986 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8783f64-c4b2-43fa-bc56-b5eb5d5c0345" containerName="libvirt-openstack-openstack-cell1" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.674266 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8783f64-c4b2-43fa-bc56-b5eb5d5c0345" containerName="libvirt-openstack-openstack-cell1" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.674310 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="06ef45c4-9c5d-4008-9c09-9e521e865225" containerName="collect-profiles" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.675280 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-mtwlk" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.679282 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.679509 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.679508 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.679784 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.679966 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.684560 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.685258 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ngftr" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.689515 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-mtwlk"] Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.769227 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/1bb12456-0517-4952-a5ef-b2a06433e6b1-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-mtwlk\" (UID: \"1bb12456-0517-4952-a5ef-b2a06433e6b1\") " 
pod="openstack/nova-cell1-openstack-openstack-cell1-mtwlk" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.769288 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/1bb12456-0517-4952-a5ef-b2a06433e6b1-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-mtwlk\" (UID: \"1bb12456-0517-4952-a5ef-b2a06433e6b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-mtwlk" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.769321 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1bb12456-0517-4952-a5ef-b2a06433e6b1-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-mtwlk\" (UID: \"1bb12456-0517-4952-a5ef-b2a06433e6b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-mtwlk" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.769363 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/1bb12456-0517-4952-a5ef-b2a06433e6b1-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-mtwlk\" (UID: \"1bb12456-0517-4952-a5ef-b2a06433e6b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-mtwlk" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.769408 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bb12456-0517-4952-a5ef-b2a06433e6b1-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-mtwlk\" (UID: \"1bb12456-0517-4952-a5ef-b2a06433e6b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-mtwlk" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.769528 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1bb12456-0517-4952-a5ef-b2a06433e6b1-ceph\") pod \"nova-cell1-openstack-openstack-cell1-mtwlk\" (UID: \"1bb12456-0517-4952-a5ef-b2a06433e6b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-mtwlk" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.769709 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/1bb12456-0517-4952-a5ef-b2a06433e6b1-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-mtwlk\" (UID: \"1bb12456-0517-4952-a5ef-b2a06433e6b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-mtwlk" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.769768 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq5gr\" (UniqueName: \"kubernetes.io/projected/1bb12456-0517-4952-a5ef-b2a06433e6b1-kube-api-access-hq5gr\") pod \"nova-cell1-openstack-openstack-cell1-mtwlk\" (UID: \"1bb12456-0517-4952-a5ef-b2a06433e6b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-mtwlk" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.769788 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/1bb12456-0517-4952-a5ef-b2a06433e6b1-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-mtwlk\" (UID: 
\"1bb12456-0517-4952-a5ef-b2a06433e6b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-mtwlk" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.769955 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/1bb12456-0517-4952-a5ef-b2a06433e6b1-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-mtwlk\" (UID: \"1bb12456-0517-4952-a5ef-b2a06433e6b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-mtwlk" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.770069 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1bb12456-0517-4952-a5ef-b2a06433e6b1-inventory\") pod \"nova-cell1-openstack-openstack-cell1-mtwlk\" (UID: \"1bb12456-0517-4952-a5ef-b2a06433e6b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-mtwlk" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.872329 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/1bb12456-0517-4952-a5ef-b2a06433e6b1-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-mtwlk\" (UID: \"1bb12456-0517-4952-a5ef-b2a06433e6b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-mtwlk" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.872421 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/1bb12456-0517-4952-a5ef-b2a06433e6b1-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-mtwlk\" (UID: \"1bb12456-0517-4952-a5ef-b2a06433e6b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-mtwlk" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.872466 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1bb12456-0517-4952-a5ef-b2a06433e6b1-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-mtwlk\" (UID: \"1bb12456-0517-4952-a5ef-b2a06433e6b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-mtwlk" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.872525 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/1bb12456-0517-4952-a5ef-b2a06433e6b1-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-mtwlk\" (UID: \"1bb12456-0517-4952-a5ef-b2a06433e6b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-mtwlk" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.872574 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bb12456-0517-4952-a5ef-b2a06433e6b1-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-mtwlk\" (UID: \"1bb12456-0517-4952-a5ef-b2a06433e6b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-mtwlk" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.872611 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1bb12456-0517-4952-a5ef-b2a06433e6b1-ceph\") pod \"nova-cell1-openstack-openstack-cell1-mtwlk\" (UID: \"1bb12456-0517-4952-a5ef-b2a06433e6b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-mtwlk" Dec 09 12:15:45 crc 
kubenswrapper[5002]: I1209 12:15:45.872679 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/1bb12456-0517-4952-a5ef-b2a06433e6b1-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-mtwlk\" (UID: \"1bb12456-0517-4952-a5ef-b2a06433e6b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-mtwlk" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.872713 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq5gr\" (UniqueName: \"kubernetes.io/projected/1bb12456-0517-4952-a5ef-b2a06433e6b1-kube-api-access-hq5gr\") pod \"nova-cell1-openstack-openstack-cell1-mtwlk\" (UID: \"1bb12456-0517-4952-a5ef-b2a06433e6b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-mtwlk" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.872740 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/1bb12456-0517-4952-a5ef-b2a06433e6b1-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-mtwlk\" (UID: \"1bb12456-0517-4952-a5ef-b2a06433e6b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-mtwlk" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.872779 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/1bb12456-0517-4952-a5ef-b2a06433e6b1-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-mtwlk\" (UID: \"1bb12456-0517-4952-a5ef-b2a06433e6b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-mtwlk" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.872875 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1bb12456-0517-4952-a5ef-b2a06433e6b1-inventory\") pod \"nova-cell1-openstack-openstack-cell1-mtwlk\" (UID: \"1bb12456-0517-4952-a5ef-b2a06433e6b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-mtwlk" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.873625 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/1bb12456-0517-4952-a5ef-b2a06433e6b1-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-mtwlk\" (UID: \"1bb12456-0517-4952-a5ef-b2a06433e6b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-mtwlk" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.874276 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/1bb12456-0517-4952-a5ef-b2a06433e6b1-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-mtwlk\" (UID: \"1bb12456-0517-4952-a5ef-b2a06433e6b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-mtwlk" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.879387 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/1bb12456-0517-4952-a5ef-b2a06433e6b1-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-mtwlk\" (UID: \"1bb12456-0517-4952-a5ef-b2a06433e6b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-mtwlk" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.880393 5002 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/1bb12456-0517-4952-a5ef-b2a06433e6b1-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-mtwlk\" (UID: \"1bb12456-0517-4952-a5ef-b2a06433e6b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-mtwlk" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.880531 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/1bb12456-0517-4952-a5ef-b2a06433e6b1-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-mtwlk\" (UID: \"1bb12456-0517-4952-a5ef-b2a06433e6b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-mtwlk" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.880629 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/1bb12456-0517-4952-a5ef-b2a06433e6b1-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-mtwlk\" (UID: \"1bb12456-0517-4952-a5ef-b2a06433e6b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-mtwlk" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.881183 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1bb12456-0517-4952-a5ef-b2a06433e6b1-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-mtwlk\" (UID: \"1bb12456-0517-4952-a5ef-b2a06433e6b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-mtwlk" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.881568 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bb12456-0517-4952-a5ef-b2a06433e6b1-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-mtwlk\" (UID: \"1bb12456-0517-4952-a5ef-b2a06433e6b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-mtwlk" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.882089 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1bb12456-0517-4952-a5ef-b2a06433e6b1-ceph\") pod \"nova-cell1-openstack-openstack-cell1-mtwlk\" (UID: \"1bb12456-0517-4952-a5ef-b2a06433e6b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-mtwlk" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.900873 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1bb12456-0517-4952-a5ef-b2a06433e6b1-inventory\") pod \"nova-cell1-openstack-openstack-cell1-mtwlk\" (UID: \"1bb12456-0517-4952-a5ef-b2a06433e6b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-mtwlk" Dec 09 12:15:45 crc kubenswrapper[5002]: I1209 12:15:45.901168 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq5gr\" (UniqueName: \"kubernetes.io/projected/1bb12456-0517-4952-a5ef-b2a06433e6b1-kube-api-access-hq5gr\") pod \"nova-cell1-openstack-openstack-cell1-mtwlk\" (UID: \"1bb12456-0517-4952-a5ef-b2a06433e6b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-mtwlk" Dec 09 12:15:46 crc kubenswrapper[5002]: I1209 12:15:46.043156 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-mtwlk" Dec 09 12:15:46 crc kubenswrapper[5002]: I1209 12:15:46.593362 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-mtwlk"] Dec 09 12:15:47 crc kubenswrapper[5002]: I1209 12:15:47.586415 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-mtwlk" event={"ID":"1bb12456-0517-4952-a5ef-b2a06433e6b1","Type":"ContainerStarted","Data":"6dc4c6cf2cbce76cc0db6a2e6db1adb8319c34b64a256174231ea5bfd22c5c7a"} Dec 09 12:15:47 crc kubenswrapper[5002]: I1209 12:15:47.586702 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-mtwlk" event={"ID":"1bb12456-0517-4952-a5ef-b2a06433e6b1","Type":"ContainerStarted","Data":"27f50ac30508d9693f03f624f3339531784fc5780538f0ffe8e9777ac9ea4817"} Dec 09 12:15:47 crc kubenswrapper[5002]: I1209 12:15:47.621897 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-openstack-cell1-mtwlk" podStartSLOduration=2.148269473 podStartE2EDuration="2.621873879s" podCreationTimestamp="2025-12-09 12:15:45 +0000 UTC" firstStartedPulling="2025-12-09 12:15:46.606880341 +0000 UTC m=+8078.998931422" lastFinishedPulling="2025-12-09 12:15:47.080484747 +0000 UTC m=+8079.472535828" observedRunningTime="2025-12-09 12:15:47.613298619 +0000 UTC m=+8080.005349710" watchObservedRunningTime="2025-12-09 12:15:47.621873879 +0000 UTC m=+8080.013924960" Dec 09 12:15:56 crc kubenswrapper[5002]: I1209 12:15:56.871720 5002 scope.go:117] "RemoveContainer" containerID="7540502a9ab8de6a3a0ca4c160bf882c0f2c1d0e46a8d0f7bb9653ac963c8da5" Dec 09 12:16:07 crc kubenswrapper[5002]: I1209 12:16:07.965233 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:16:07 crc kubenswrapper[5002]: I1209 12:16:07.965885 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:16:11 crc kubenswrapper[5002]: I1209 12:16:11.184678 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pjgxt"] Dec 09 12:16:11 crc kubenswrapper[5002]: I1209 12:16:11.188160 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pjgxt" Dec 09 12:16:11 crc kubenswrapper[5002]: I1209 12:16:11.203491 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pjgxt"] Dec 09 12:16:11 crc kubenswrapper[5002]: I1209 12:16:11.341120 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbp89\" (UniqueName: \"kubernetes.io/projected/f98aeab2-acca-483b-86d6-22e294b6598b-kube-api-access-xbp89\") pod \"redhat-operators-pjgxt\" (UID: \"f98aeab2-acca-483b-86d6-22e294b6598b\") " pod="openshift-marketplace/redhat-operators-pjgxt" Dec 09 12:16:11 crc kubenswrapper[5002]: I1209 12:16:11.341221 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f98aeab2-acca-483b-86d6-22e294b6598b-utilities\") pod \"redhat-operators-pjgxt\" (UID: \"f98aeab2-acca-483b-86d6-22e294b6598b\") " pod="openshift-marketplace/redhat-operators-pjgxt" Dec 09 12:16:11 crc kubenswrapper[5002]: I1209 12:16:11.341350 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f98aeab2-acca-483b-86d6-22e294b6598b-catalog-content\") pod \"redhat-operators-pjgxt\" (UID: \"f98aeab2-acca-483b-86d6-22e294b6598b\") " pod="openshift-marketplace/redhat-operators-pjgxt" Dec 09 12:16:11 crc kubenswrapper[5002]: I1209 12:16:11.443523 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f98aeab2-acca-483b-86d6-22e294b6598b-utilities\") pod \"redhat-operators-pjgxt\" (UID: \"f98aeab2-acca-483b-86d6-22e294b6598b\") " pod="openshift-marketplace/redhat-operators-pjgxt" Dec 09 12:16:11 crc kubenswrapper[5002]: I1209 12:16:11.443582 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f98aeab2-acca-483b-86d6-22e294b6598b-catalog-content\") pod \"redhat-operators-pjgxt\" (UID: \"f98aeab2-acca-483b-86d6-22e294b6598b\") " pod="openshift-marketplace/redhat-operators-pjgxt" Dec 09 12:16:11 crc kubenswrapper[5002]: I1209 12:16:11.443770 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbp89\" (UniqueName: \"kubernetes.io/projected/f98aeab2-acca-483b-86d6-22e294b6598b-kube-api-access-xbp89\") pod \"redhat-operators-pjgxt\" (UID: \"f98aeab2-acca-483b-86d6-22e294b6598b\") " pod="openshift-marketplace/redhat-operators-pjgxt" Dec 09 12:16:11 crc kubenswrapper[5002]: I1209 12:16:11.444030 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f98aeab2-acca-483b-86d6-22e294b6598b-utilities\") pod \"redhat-operators-pjgxt\" (UID: \"f98aeab2-acca-483b-86d6-22e294b6598b\") " pod="openshift-marketplace/redhat-operators-pjgxt" Dec 09 12:16:11 crc kubenswrapper[5002]: I1209 12:16:11.444096 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f98aeab2-acca-483b-86d6-22e294b6598b-catalog-content\") pod \"redhat-operators-pjgxt\" (UID: \"f98aeab2-acca-483b-86d6-22e294b6598b\") " pod="openshift-marketplace/redhat-operators-pjgxt" Dec 09 12:16:11 crc kubenswrapper[5002]: I1209 12:16:11.462495 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-xbp89\" (UniqueName: \"kubernetes.io/projected/f98aeab2-acca-483b-86d6-22e294b6598b-kube-api-access-xbp89\") pod \"redhat-operators-pjgxt\" (UID: \"f98aeab2-acca-483b-86d6-22e294b6598b\") " pod="openshift-marketplace/redhat-operators-pjgxt" Dec 09 12:16:11 crc kubenswrapper[5002]: I1209 12:16:11.515046 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pjgxt" Dec 09 12:16:11 crc kubenswrapper[5002]: I1209 12:16:11.969585 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pjgxt"] Dec 09 12:16:12 crc kubenswrapper[5002]: I1209 12:16:12.842646 5002 generic.go:334] "Generic (PLEG): container finished" podID="f98aeab2-acca-483b-86d6-22e294b6598b" containerID="2c46957e65b9a991576dded6cfaa8b585f9ad4621d28563d673bcea1b7b2b591" exitCode=0 Dec 09 12:16:12 crc kubenswrapper[5002]: I1209 12:16:12.842698 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pjgxt" event={"ID":"f98aeab2-acca-483b-86d6-22e294b6598b","Type":"ContainerDied","Data":"2c46957e65b9a991576dded6cfaa8b585f9ad4621d28563d673bcea1b7b2b591"} Dec 09 12:16:12 crc kubenswrapper[5002]: I1209 12:16:12.843156 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pjgxt" event={"ID":"f98aeab2-acca-483b-86d6-22e294b6598b","Type":"ContainerStarted","Data":"6ec38b55a10ebb676c5a3e3d03f07e3a09e481585a9d52df70438c8679ae10c7"} Dec 09 12:16:12 crc kubenswrapper[5002]: I1209 12:16:12.845579 5002 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 12:16:24 crc kubenswrapper[5002]: I1209 12:16:24.985951 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pjgxt" event={"ID":"f98aeab2-acca-483b-86d6-22e294b6598b","Type":"ContainerStarted","Data":"882803e12ed0164c1cae707c35f93fe08968f58e13e280221398ba042038f444"} Dec 09 12:16:27 crc kubenswrapper[5002]: I1209 12:16:27.008560 5002 generic.go:334] "Generic (PLEG): container finished" podID="f98aeab2-acca-483b-86d6-22e294b6598b" containerID="882803e12ed0164c1cae707c35f93fe08968f58e13e280221398ba042038f444" exitCode=0 Dec 09 12:16:27 crc kubenswrapper[5002]: I1209 12:16:27.008795 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pjgxt" event={"ID":"f98aeab2-acca-483b-86d6-22e294b6598b","Type":"ContainerDied","Data":"882803e12ed0164c1cae707c35f93fe08968f58e13e280221398ba042038f444"} Dec 09 12:16:29 crc kubenswrapper[5002]: I1209 12:16:29.031000 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pjgxt" event={"ID":"f98aeab2-acca-483b-86d6-22e294b6598b","Type":"ContainerStarted","Data":"4cf1a8a39975e10eaaf8c71e8e5aa0b8b3adf5672acea04d533549e7c03a04ed"} Dec 09 12:16:29 crc kubenswrapper[5002]: I1209 12:16:29.054083 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pjgxt" podStartSLOduration=2.9359255539999998 podStartE2EDuration="18.054064654s" podCreationTimestamp="2025-12-09 12:16:11 +0000 UTC" firstStartedPulling="2025-12-09 12:16:12.845211747 +0000 UTC m=+8105.237262838" lastFinishedPulling="2025-12-09 12:16:27.963350847 +0000 UTC m=+8120.355401938" observedRunningTime="2025-12-09 12:16:29.049372448 +0000 UTC m=+8121.441423539" watchObservedRunningTime="2025-12-09 12:16:29.054064654 +0000 UTC m=+8121.446115745" Dec 09 12:16:31 crc 
kubenswrapper[5002]: I1209 12:16:31.515571 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pjgxt" Dec 09 12:16:31 crc kubenswrapper[5002]: I1209 12:16:31.516048 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pjgxt" Dec 09 12:16:32 crc kubenswrapper[5002]: I1209 12:16:32.605102 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pjgxt" podUID="f98aeab2-acca-483b-86d6-22e294b6598b" containerName="registry-server" probeResult="failure" output=< Dec 09 12:16:32 crc kubenswrapper[5002]: timeout: failed to connect service ":50051" within 1s Dec 09 12:16:32 crc kubenswrapper[5002]: > Dec 09 12:16:37 crc kubenswrapper[5002]: I1209 12:16:37.964846 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:16:37 crc kubenswrapper[5002]: I1209 12:16:37.966248 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:16:41 crc kubenswrapper[5002]: I1209 12:16:41.582244 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pjgxt" Dec 09 12:16:41 crc kubenswrapper[5002]: I1209 12:16:41.661180 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pjgxt" Dec 09 12:16:43 crc kubenswrapper[5002]: I1209 12:16:43.459743 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pjgxt"] Dec 09 12:16:43 crc kubenswrapper[5002]: I1209 12:16:43.519892 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lp6l4"] Dec 09 12:16:43 crc kubenswrapper[5002]: I1209 12:16:43.520225 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lp6l4" podUID="9dc64ca0-5e54-466d-b19f-7726b430268a" containerName="registry-server" containerID="cri-o://0f18d8edee8bc8d1606f74cb4bf2f562b8ee9b1c438cd91f2ddb43e9b8578ab6" gracePeriod=2 Dec 09 12:16:44 crc kubenswrapper[5002]: I1209 12:16:44.190827 5002 generic.go:334] "Generic (PLEG): container finished" podID="9dc64ca0-5e54-466d-b19f-7726b430268a" containerID="0f18d8edee8bc8d1606f74cb4bf2f562b8ee9b1c438cd91f2ddb43e9b8578ab6" exitCode=0 Dec 09 12:16:44 crc kubenswrapper[5002]: I1209 12:16:44.190855 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lp6l4" event={"ID":"9dc64ca0-5e54-466d-b19f-7726b430268a","Type":"ContainerDied","Data":"0f18d8edee8bc8d1606f74cb4bf2f562b8ee9b1c438cd91f2ddb43e9b8578ab6"} Dec 09 12:16:44 crc kubenswrapper[5002]: I1209 12:16:44.889890 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lp6l4" Dec 09 12:16:44 crc kubenswrapper[5002]: I1209 12:16:44.988695 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9dc64ca0-5e54-466d-b19f-7726b430268a-utilities\") pod \"9dc64ca0-5e54-466d-b19f-7726b430268a\" (UID: \"9dc64ca0-5e54-466d-b19f-7726b430268a\") " Dec 09 12:16:44 crc kubenswrapper[5002]: I1209 12:16:44.988739 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9dc64ca0-5e54-466d-b19f-7726b430268a-catalog-content\") pod \"9dc64ca0-5e54-466d-b19f-7726b430268a\" (UID: \"9dc64ca0-5e54-466d-b19f-7726b430268a\") " Dec 09 12:16:44 crc kubenswrapper[5002]: I1209 12:16:44.988787 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xb7m\" (UniqueName: \"kubernetes.io/projected/9dc64ca0-5e54-466d-b19f-7726b430268a-kube-api-access-6xb7m\") pod \"9dc64ca0-5e54-466d-b19f-7726b430268a\" (UID: \"9dc64ca0-5e54-466d-b19f-7726b430268a\") " Dec 09 12:16:44 crc kubenswrapper[5002]: I1209 12:16:44.997567 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9dc64ca0-5e54-466d-b19f-7726b430268a-utilities" (OuterVolumeSpecName: "utilities") pod "9dc64ca0-5e54-466d-b19f-7726b430268a" (UID: "9dc64ca0-5e54-466d-b19f-7726b430268a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:16:44 crc kubenswrapper[5002]: I1209 12:16:44.998214 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dc64ca0-5e54-466d-b19f-7726b430268a-kube-api-access-6xb7m" (OuterVolumeSpecName: "kube-api-access-6xb7m") pod "9dc64ca0-5e54-466d-b19f-7726b430268a" (UID: "9dc64ca0-5e54-466d-b19f-7726b430268a"). InnerVolumeSpecName "kube-api-access-6xb7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:16:45 crc kubenswrapper[5002]: I1209 12:16:45.091694 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9dc64ca0-5e54-466d-b19f-7726b430268a-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 12:16:45 crc kubenswrapper[5002]: I1209 12:16:45.091738 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xb7m\" (UniqueName: \"kubernetes.io/projected/9dc64ca0-5e54-466d-b19f-7726b430268a-kube-api-access-6xb7m\") on node \"crc\" DevicePath \"\"" Dec 09 12:16:45 crc kubenswrapper[5002]: I1209 12:16:45.203289 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lp6l4" event={"ID":"9dc64ca0-5e54-466d-b19f-7726b430268a","Type":"ContainerDied","Data":"8186823966bea611dd060448f76f12653c0b07b0f415c9dbe0f634b6d6b5da91"} Dec 09 12:16:45 crc kubenswrapper[5002]: I1209 12:16:45.203343 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lp6l4" Dec 09 12:16:45 crc kubenswrapper[5002]: I1209 12:16:45.203354 5002 scope.go:117] "RemoveContainer" containerID="0f18d8edee8bc8d1606f74cb4bf2f562b8ee9b1c438cd91f2ddb43e9b8578ab6" Dec 09 12:16:45 crc kubenswrapper[5002]: I1209 12:16:45.235796 5002 scope.go:117] "RemoveContainer" containerID="b5488d9913306cef1d39b41e2c80ea75cfef4f60cadc56af7f872aba129b5892" Dec 09 12:16:45 crc kubenswrapper[5002]: I1209 12:16:45.263404 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9dc64ca0-5e54-466d-b19f-7726b430268a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9dc64ca0-5e54-466d-b19f-7726b430268a" (UID: "9dc64ca0-5e54-466d-b19f-7726b430268a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:16:45 crc kubenswrapper[5002]: I1209 12:16:45.274030 5002 scope.go:117] "RemoveContainer" containerID="8ceb08ae4329a958a7aab9104ead6e429d6f81115d8c1bb32338e2b5c2dad0fd" Dec 09 12:16:45 crc kubenswrapper[5002]: I1209 12:16:45.297574 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9dc64ca0-5e54-466d-b19f-7726b430268a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 12:16:45 crc kubenswrapper[5002]: I1209 12:16:45.541690 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lp6l4"] Dec 09 12:16:45 crc kubenswrapper[5002]: I1209 12:16:45.551465 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lp6l4"] Dec 09 12:16:46 crc kubenswrapper[5002]: I1209 12:16:46.073064 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dc64ca0-5e54-466d-b19f-7726b430268a" path="/var/lib/kubelet/pods/9dc64ca0-5e54-466d-b19f-7726b430268a/volumes" Dec 09 12:17:07 crc kubenswrapper[5002]: I1209 12:17:07.965068 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:17:07 crc kubenswrapper[5002]: I1209 12:17:07.965566 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:17:07 crc kubenswrapper[5002]: I1209 12:17:07.965614 5002 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" Dec 09 12:17:07 crc kubenswrapper[5002]: I1209 12:17:07.966438 5002 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7780f1b6ff39bc4223d84da4678f12a7a9ba1a1c0041213b8f77f935907dd6aa"} pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 12:17:07 crc kubenswrapper[5002]: I1209 12:17:07.966508 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" 
podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" containerID="cri-o://7780f1b6ff39bc4223d84da4678f12a7a9ba1a1c0041213b8f77f935907dd6aa" gracePeriod=600 Dec 09 12:17:08 crc kubenswrapper[5002]: E1209 12:17:08.085978 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:17:08 crc kubenswrapper[5002]: I1209 12:17:08.447979 5002 generic.go:334] "Generic (PLEG): container finished" podID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerID="7780f1b6ff39bc4223d84da4678f12a7a9ba1a1c0041213b8f77f935907dd6aa" exitCode=0 Dec 09 12:17:08 crc kubenswrapper[5002]: I1209 12:17:08.448068 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerDied","Data":"7780f1b6ff39bc4223d84da4678f12a7a9ba1a1c0041213b8f77f935907dd6aa"} Dec 09 12:17:08 crc kubenswrapper[5002]: I1209 12:17:08.448317 5002 scope.go:117] "RemoveContainer" containerID="65f13888df0eef768bcada705a22c1b4b809be0c915a856082867f96060629f5" Dec 09 12:17:08 crc kubenswrapper[5002]: I1209 12:17:08.449507 5002 scope.go:117] "RemoveContainer" containerID="7780f1b6ff39bc4223d84da4678f12a7a9ba1a1c0041213b8f77f935907dd6aa" Dec 09 12:17:08 crc kubenswrapper[5002]: E1209 12:17:08.449842 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:17:10 crc kubenswrapper[5002]: I1209 12:17:10.198878 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-prhvr"] Dec 09 12:17:10 crc kubenswrapper[5002]: E1209 12:17:10.199871 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dc64ca0-5e54-466d-b19f-7726b430268a" containerName="extract-utilities" Dec 09 12:17:10 crc kubenswrapper[5002]: I1209 12:17:10.199891 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dc64ca0-5e54-466d-b19f-7726b430268a" containerName="extract-utilities" Dec 09 12:17:10 crc kubenswrapper[5002]: E1209 12:17:10.199945 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dc64ca0-5e54-466d-b19f-7726b430268a" containerName="registry-server" Dec 09 12:17:10 crc kubenswrapper[5002]: I1209 12:17:10.199955 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dc64ca0-5e54-466d-b19f-7726b430268a" containerName="registry-server" Dec 09 12:17:10 crc kubenswrapper[5002]: E1209 12:17:10.199974 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dc64ca0-5e54-466d-b19f-7726b430268a" containerName="extract-content" Dec 09 12:17:10 crc kubenswrapper[5002]: I1209 12:17:10.199983 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dc64ca0-5e54-466d-b19f-7726b430268a" containerName="extract-content" Dec 09 12:17:10 crc kubenswrapper[5002]: I1209 12:17:10.200250 5002 
Dec 09 12:17:10 crc kubenswrapper[5002]: I1209 12:17:10.202272 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-prhvr"
Dec 09 12:17:10 crc kubenswrapper[5002]: I1209 12:17:10.212175 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-prhvr"]
Dec 09 12:17:10 crc kubenswrapper[5002]: I1209 12:17:10.361675 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2c27e80-c4f3-4e58-bb65-96071dba6eef-catalog-content\") pod \"community-operators-prhvr\" (UID: \"a2c27e80-c4f3-4e58-bb65-96071dba6eef\") " pod="openshift-marketplace/community-operators-prhvr"
Dec 09 12:17:10 crc kubenswrapper[5002]: I1209 12:17:10.361739 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2c27e80-c4f3-4e58-bb65-96071dba6eef-utilities\") pod \"community-operators-prhvr\" (UID: \"a2c27e80-c4f3-4e58-bb65-96071dba6eef\") " pod="openshift-marketplace/community-operators-prhvr"
Dec 09 12:17:10 crc kubenswrapper[5002]: I1209 12:17:10.361920 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxqq7\" (UniqueName: \"kubernetes.io/projected/a2c27e80-c4f3-4e58-bb65-96071dba6eef-kube-api-access-zxqq7\") pod \"community-operators-prhvr\" (UID: \"a2c27e80-c4f3-4e58-bb65-96071dba6eef\") " pod="openshift-marketplace/community-operators-prhvr"
Dec 09 12:17:10 crc kubenswrapper[5002]: I1209 12:17:10.463939 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2c27e80-c4f3-4e58-bb65-96071dba6eef-catalog-content\") pod \"community-operators-prhvr\" (UID: \"a2c27e80-c4f3-4e58-bb65-96071dba6eef\") " pod="openshift-marketplace/community-operators-prhvr"
Dec 09 12:17:10 crc kubenswrapper[5002]: I1209 12:17:10.463995 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2c27e80-c4f3-4e58-bb65-96071dba6eef-utilities\") pod \"community-operators-prhvr\" (UID: \"a2c27e80-c4f3-4e58-bb65-96071dba6eef\") " pod="openshift-marketplace/community-operators-prhvr"
Dec 09 12:17:10 crc kubenswrapper[5002]: I1209 12:17:10.464100 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxqq7\" (UniqueName: \"kubernetes.io/projected/a2c27e80-c4f3-4e58-bb65-96071dba6eef-kube-api-access-zxqq7\") pod \"community-operators-prhvr\" (UID: \"a2c27e80-c4f3-4e58-bb65-96071dba6eef\") " pod="openshift-marketplace/community-operators-prhvr"
Dec 09 12:17:10 crc kubenswrapper[5002]: I1209 12:17:10.465083 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2c27e80-c4f3-4e58-bb65-96071dba6eef-catalog-content\") pod \"community-operators-prhvr\" (UID: \"a2c27e80-c4f3-4e58-bb65-96071dba6eef\") " pod="openshift-marketplace/community-operators-prhvr"
Dec 09 12:17:10 crc kubenswrapper[5002]: I1209 12:17:10.465122 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2c27e80-c4f3-4e58-bb65-96071dba6eef-utilities\") pod \"community-operators-prhvr\" (UID: \"a2c27e80-c4f3-4e58-bb65-96071dba6eef\") " pod="openshift-marketplace/community-operators-prhvr"
Dec 09 12:17:10 crc kubenswrapper[5002]: I1209 12:17:10.495694 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxqq7\" (UniqueName: \"kubernetes.io/projected/a2c27e80-c4f3-4e58-bb65-96071dba6eef-kube-api-access-zxqq7\") pod \"community-operators-prhvr\" (UID: \"a2c27e80-c4f3-4e58-bb65-96071dba6eef\") " pod="openshift-marketplace/community-operators-prhvr"
Dec 09 12:17:10 crc kubenswrapper[5002]: I1209 12:17:10.528553 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-prhvr"
Dec 09 12:17:11 crc kubenswrapper[5002]: I1209 12:17:11.629367 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-prhvr"]
Dec 09 12:17:12 crc kubenswrapper[5002]: I1209 12:17:12.498985 5002 generic.go:334] "Generic (PLEG): container finished" podID="a2c27e80-c4f3-4e58-bb65-96071dba6eef" containerID="53bfbb90c522aa6e7285b12c9ff14700714c34779b9a595ea2d262e3159ade5a" exitCode=0
Dec 09 12:17:12 crc kubenswrapper[5002]: I1209 12:17:12.499041 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-prhvr" event={"ID":"a2c27e80-c4f3-4e58-bb65-96071dba6eef","Type":"ContainerDied","Data":"53bfbb90c522aa6e7285b12c9ff14700714c34779b9a595ea2d262e3159ade5a"}
Dec 09 12:17:12 crc kubenswrapper[5002]: I1209 12:17:12.499520 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-prhvr" event={"ID":"a2c27e80-c4f3-4e58-bb65-96071dba6eef","Type":"ContainerStarted","Data":"d46d5ad1ebb084247e879dbcc3b5074caa46b49a94b00fe2598fc04ee23096d2"}
Dec 09 12:17:13 crc kubenswrapper[5002]: I1209 12:17:13.513448 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-prhvr" event={"ID":"a2c27e80-c4f3-4e58-bb65-96071dba6eef","Type":"ContainerStarted","Data":"80e878e0782b2b5f4737efefd991e879e252204a3eaa2517ea2578deb91a9b9a"}
Dec 09 12:17:14 crc kubenswrapper[5002]: I1209 12:17:14.530068 5002 generic.go:334] "Generic (PLEG): container finished" podID="a2c27e80-c4f3-4e58-bb65-96071dba6eef" containerID="80e878e0782b2b5f4737efefd991e879e252204a3eaa2517ea2578deb91a9b9a" exitCode=0
Dec 09 12:17:14 crc kubenswrapper[5002]: I1209 12:17:14.530112 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-prhvr" event={"ID":"a2c27e80-c4f3-4e58-bb65-96071dba6eef","Type":"ContainerDied","Data":"80e878e0782b2b5f4737efefd991e879e252204a3eaa2517ea2578deb91a9b9a"}
Dec 09 12:17:15 crc kubenswrapper[5002]: I1209 12:17:15.546994 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-prhvr" event={"ID":"a2c27e80-c4f3-4e58-bb65-96071dba6eef","Type":"ContainerStarted","Data":"0cc4ae9e1d24a7388557d4193d816cc2572a0f805eb2a85b054f3c8f65809ec0"}
Dec 09 12:17:15 crc kubenswrapper[5002]: I1209 12:17:15.576266 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-prhvr" podStartSLOduration=3.093323193 podStartE2EDuration="5.57624093s" podCreationTimestamp="2025-12-09 12:17:10 +0000 UTC" firstStartedPulling="2025-12-09 12:17:12.501395077 +0000 UTC m=+8164.893446178" lastFinishedPulling="2025-12-09 12:17:14.984312844 +0000 UTC m=+8167.376363915" observedRunningTime="2025-12-09 12:17:15.569405837 +0000 UTC m=+8167.961456928" watchObservedRunningTime="2025-12-09 12:17:15.57624093 +0000 UTC m=+8167.968292021"
UTC m=+8167.376363915" observedRunningTime="2025-12-09 12:17:15.569405837 +0000 UTC m=+8167.961456928" watchObservedRunningTime="2025-12-09 12:17:15.57624093 +0000 UTC m=+8167.968292021" Dec 09 12:17:20 crc kubenswrapper[5002]: I1209 12:17:20.529185 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-prhvr" Dec 09 12:17:20 crc kubenswrapper[5002]: I1209 12:17:20.529785 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-prhvr" Dec 09 12:17:20 crc kubenswrapper[5002]: I1209 12:17:20.589040 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-prhvr" Dec 09 12:17:20 crc kubenswrapper[5002]: I1209 12:17:20.654492 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-prhvr" Dec 09 12:17:20 crc kubenswrapper[5002]: I1209 12:17:20.843981 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-prhvr"] Dec 09 12:17:22 crc kubenswrapper[5002]: I1209 12:17:22.620593 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-prhvr" podUID="a2c27e80-c4f3-4e58-bb65-96071dba6eef" containerName="registry-server" containerID="cri-o://0cc4ae9e1d24a7388557d4193d816cc2572a0f805eb2a85b054f3c8f65809ec0" gracePeriod=2 Dec 09 12:17:23 crc kubenswrapper[5002]: I1209 12:17:23.061066 5002 scope.go:117] "RemoveContainer" containerID="7780f1b6ff39bc4223d84da4678f12a7a9ba1a1c0041213b8f77f935907dd6aa" Dec 09 12:17:23 crc kubenswrapper[5002]: E1209 12:17:23.061672 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:17:23 crc kubenswrapper[5002]: I1209 12:17:23.195567 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-prhvr" Dec 09 12:17:23 crc kubenswrapper[5002]: I1209 12:17:23.353665 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxqq7\" (UniqueName: \"kubernetes.io/projected/a2c27e80-c4f3-4e58-bb65-96071dba6eef-kube-api-access-zxqq7\") pod \"a2c27e80-c4f3-4e58-bb65-96071dba6eef\" (UID: \"a2c27e80-c4f3-4e58-bb65-96071dba6eef\") " Dec 09 12:17:23 crc kubenswrapper[5002]: I1209 12:17:23.353792 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2c27e80-c4f3-4e58-bb65-96071dba6eef-utilities\") pod \"a2c27e80-c4f3-4e58-bb65-96071dba6eef\" (UID: \"a2c27e80-c4f3-4e58-bb65-96071dba6eef\") " Dec 09 12:17:23 crc kubenswrapper[5002]: I1209 12:17:23.353984 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2c27e80-c4f3-4e58-bb65-96071dba6eef-catalog-content\") pod \"a2c27e80-c4f3-4e58-bb65-96071dba6eef\" (UID: \"a2c27e80-c4f3-4e58-bb65-96071dba6eef\") " Dec 09 12:17:23 crc kubenswrapper[5002]: I1209 12:17:23.356595 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2c27e80-c4f3-4e58-bb65-96071dba6eef-utilities" (OuterVolumeSpecName: "utilities") pod "a2c27e80-c4f3-4e58-bb65-96071dba6eef" (UID: "a2c27e80-c4f3-4e58-bb65-96071dba6eef"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:17:23 crc kubenswrapper[5002]: I1209 12:17:23.362072 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2c27e80-c4f3-4e58-bb65-96071dba6eef-kube-api-access-zxqq7" (OuterVolumeSpecName: "kube-api-access-zxqq7") pod "a2c27e80-c4f3-4e58-bb65-96071dba6eef" (UID: "a2c27e80-c4f3-4e58-bb65-96071dba6eef"). InnerVolumeSpecName "kube-api-access-zxqq7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:17:23 crc kubenswrapper[5002]: I1209 12:17:23.430747 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2c27e80-c4f3-4e58-bb65-96071dba6eef-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a2c27e80-c4f3-4e58-bb65-96071dba6eef" (UID: "a2c27e80-c4f3-4e58-bb65-96071dba6eef"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:17:23 crc kubenswrapper[5002]: I1209 12:17:23.457106 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxqq7\" (UniqueName: \"kubernetes.io/projected/a2c27e80-c4f3-4e58-bb65-96071dba6eef-kube-api-access-zxqq7\") on node \"crc\" DevicePath \"\"" Dec 09 12:17:23 crc kubenswrapper[5002]: I1209 12:17:23.457171 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2c27e80-c4f3-4e58-bb65-96071dba6eef-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 12:17:23 crc kubenswrapper[5002]: I1209 12:17:23.457185 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2c27e80-c4f3-4e58-bb65-96071dba6eef-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 12:17:23 crc kubenswrapper[5002]: I1209 12:17:23.633392 5002 generic.go:334] "Generic (PLEG): container finished" podID="a2c27e80-c4f3-4e58-bb65-96071dba6eef" containerID="0cc4ae9e1d24a7388557d4193d816cc2572a0f805eb2a85b054f3c8f65809ec0" exitCode=0 Dec 09 12:17:23 crc kubenswrapper[5002]: I1209 12:17:23.633436 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-prhvr" Dec 09 12:17:23 crc kubenswrapper[5002]: I1209 12:17:23.633453 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-prhvr" event={"ID":"a2c27e80-c4f3-4e58-bb65-96071dba6eef","Type":"ContainerDied","Data":"0cc4ae9e1d24a7388557d4193d816cc2572a0f805eb2a85b054f3c8f65809ec0"} Dec 09 12:17:23 crc kubenswrapper[5002]: I1209 12:17:23.633484 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-prhvr" event={"ID":"a2c27e80-c4f3-4e58-bb65-96071dba6eef","Type":"ContainerDied","Data":"d46d5ad1ebb084247e879dbcc3b5074caa46b49a94b00fe2598fc04ee23096d2"} Dec 09 12:17:23 crc kubenswrapper[5002]: I1209 12:17:23.633517 5002 scope.go:117] "RemoveContainer" containerID="0cc4ae9e1d24a7388557d4193d816cc2572a0f805eb2a85b054f3c8f65809ec0" Dec 09 12:17:23 crc kubenswrapper[5002]: I1209 12:17:23.670750 5002 scope.go:117] "RemoveContainer" containerID="80e878e0782b2b5f4737efefd991e879e252204a3eaa2517ea2578deb91a9b9a" Dec 09 12:17:23 crc kubenswrapper[5002]: I1209 12:17:23.685268 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-prhvr"] Dec 09 12:17:23 crc kubenswrapper[5002]: I1209 12:17:23.700692 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-prhvr"] Dec 09 12:17:23 crc kubenswrapper[5002]: I1209 12:17:23.701396 5002 scope.go:117] "RemoveContainer" containerID="53bfbb90c522aa6e7285b12c9ff14700714c34779b9a595ea2d262e3159ade5a" Dec 09 12:17:23 crc kubenswrapper[5002]: I1209 12:17:23.745866 5002 scope.go:117] "RemoveContainer" containerID="0cc4ae9e1d24a7388557d4193d816cc2572a0f805eb2a85b054f3c8f65809ec0" Dec 09 12:17:23 crc kubenswrapper[5002]: E1209 12:17:23.746642 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cc4ae9e1d24a7388557d4193d816cc2572a0f805eb2a85b054f3c8f65809ec0\": container with ID starting with 0cc4ae9e1d24a7388557d4193d816cc2572a0f805eb2a85b054f3c8f65809ec0 not found: ID does not exist" containerID="0cc4ae9e1d24a7388557d4193d816cc2572a0f805eb2a85b054f3c8f65809ec0" Dec 09 12:17:23 crc kubenswrapper[5002]: I1209 12:17:23.746713 
Dec 09 12:17:23 crc kubenswrapper[5002]: I1209 12:17:23.746754 5002 scope.go:117] "RemoveContainer" containerID="80e878e0782b2b5f4737efefd991e879e252204a3eaa2517ea2578deb91a9b9a"
Dec 09 12:17:23 crc kubenswrapper[5002]: E1209 12:17:23.747346 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80e878e0782b2b5f4737efefd991e879e252204a3eaa2517ea2578deb91a9b9a\": container with ID starting with 80e878e0782b2b5f4737efefd991e879e252204a3eaa2517ea2578deb91a9b9a not found: ID does not exist" containerID="80e878e0782b2b5f4737efefd991e879e252204a3eaa2517ea2578deb91a9b9a"
Dec 09 12:17:23 crc kubenswrapper[5002]: I1209 12:17:23.747390 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80e878e0782b2b5f4737efefd991e879e252204a3eaa2517ea2578deb91a9b9a"} err="failed to get container status \"80e878e0782b2b5f4737efefd991e879e252204a3eaa2517ea2578deb91a9b9a\": rpc error: code = NotFound desc = could not find container \"80e878e0782b2b5f4737efefd991e879e252204a3eaa2517ea2578deb91a9b9a\": container with ID starting with 80e878e0782b2b5f4737efefd991e879e252204a3eaa2517ea2578deb91a9b9a not found: ID does not exist"
Dec 09 12:17:23 crc kubenswrapper[5002]: I1209 12:17:23.747421 5002 scope.go:117] "RemoveContainer" containerID="53bfbb90c522aa6e7285b12c9ff14700714c34779b9a595ea2d262e3159ade5a"
Dec 09 12:17:23 crc kubenswrapper[5002]: E1209 12:17:23.747832 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53bfbb90c522aa6e7285b12c9ff14700714c34779b9a595ea2d262e3159ade5a\": container with ID starting with 53bfbb90c522aa6e7285b12c9ff14700714c34779b9a595ea2d262e3159ade5a not found: ID does not exist" containerID="53bfbb90c522aa6e7285b12c9ff14700714c34779b9a595ea2d262e3159ade5a"
Dec 09 12:17:23 crc kubenswrapper[5002]: I1209 12:17:23.747890 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53bfbb90c522aa6e7285b12c9ff14700714c34779b9a595ea2d262e3159ade5a"} err="failed to get container status \"53bfbb90c522aa6e7285b12c9ff14700714c34779b9a595ea2d262e3159ade5a\": rpc error: code = NotFound desc = could not find container \"53bfbb90c522aa6e7285b12c9ff14700714c34779b9a595ea2d262e3159ade5a\": container with ID starting with 53bfbb90c522aa6e7285b12c9ff14700714c34779b9a595ea2d262e3159ade5a not found: ID does not exist"
Dec 09 12:17:24 crc kubenswrapper[5002]: I1209 12:17:24.073724 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2c27e80-c4f3-4e58-bb65-96071dba6eef" path="/var/lib/kubelet/pods/a2c27e80-c4f3-4e58-bb65-96071dba6eef/volumes"
Dec 09 12:17:37 crc kubenswrapper[5002]: I1209 12:17:37.060641 5002 scope.go:117] "RemoveContainer" containerID="7780f1b6ff39bc4223d84da4678f12a7a9ba1a1c0041213b8f77f935907dd6aa"
Dec 09 12:17:37 crc kubenswrapper[5002]: E1209 12:17:37.061540 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 12:17:48 crc kubenswrapper[5002]: I1209 12:17:48.060583 5002 scope.go:117] "RemoveContainer" containerID="7780f1b6ff39bc4223d84da4678f12a7a9ba1a1c0041213b8f77f935907dd6aa"
Dec 09 12:17:48 crc kubenswrapper[5002]: E1209 12:17:48.061624 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 12:18:03 crc kubenswrapper[5002]: I1209 12:18:03.061023 5002 scope.go:117] "RemoveContainer" containerID="7780f1b6ff39bc4223d84da4678f12a7a9ba1a1c0041213b8f77f935907dd6aa"
Dec 09 12:18:03 crc kubenswrapper[5002]: E1209 12:18:03.061795 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 12:18:16 crc kubenswrapper[5002]: I1209 12:18:16.060321 5002 scope.go:117] "RemoveContainer" containerID="7780f1b6ff39bc4223d84da4678f12a7a9ba1a1c0041213b8f77f935907dd6aa"
Dec 09 12:18:16 crc kubenswrapper[5002]: E1209 12:18:16.061354 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 12:18:27 crc kubenswrapper[5002]: I1209 12:18:27.061839 5002 scope.go:117] "RemoveContainer" containerID="7780f1b6ff39bc4223d84da4678f12a7a9ba1a1c0041213b8f77f935907dd6aa"
Dec 09 12:18:27 crc kubenswrapper[5002]: E1209 12:18:27.062859 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 12:18:38 crc kubenswrapper[5002]: I1209 12:18:38.072283 5002 scope.go:117] "RemoveContainer" containerID="7780f1b6ff39bc4223d84da4678f12a7a9ba1a1c0041213b8f77f935907dd6aa"
Dec 09 12:18:38 crc kubenswrapper[5002]: E1209 12:18:38.073639 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 12:18:50 crc kubenswrapper[5002]: I1209 12:18:50.060613 5002 scope.go:117] "RemoveContainer" containerID="7780f1b6ff39bc4223d84da4678f12a7a9ba1a1c0041213b8f77f935907dd6aa"
Dec 09 12:18:50 crc kubenswrapper[5002]: E1209 12:18:50.061581 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 12:18:57 crc kubenswrapper[5002]: I1209 12:18:57.738726 5002 generic.go:334] "Generic (PLEG): container finished" podID="1bb12456-0517-4952-a5ef-b2a06433e6b1" containerID="6dc4c6cf2cbce76cc0db6a2e6db1adb8319c34b64a256174231ea5bfd22c5c7a" exitCode=0
Dec 09 12:18:57 crc kubenswrapper[5002]: I1209 12:18:57.738931 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-mtwlk" event={"ID":"1bb12456-0517-4952-a5ef-b2a06433e6b1","Type":"ContainerDied","Data":"6dc4c6cf2cbce76cc0db6a2e6db1adb8319c34b64a256174231ea5bfd22c5c7a"}
Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.259804 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-mtwlk"
Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.382639 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/1bb12456-0517-4952-a5ef-b2a06433e6b1-nova-cells-global-config-1\") pod \"1bb12456-0517-4952-a5ef-b2a06433e6b1\" (UID: \"1bb12456-0517-4952-a5ef-b2a06433e6b1\") "
Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.382679 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hq5gr\" (UniqueName: \"kubernetes.io/projected/1bb12456-0517-4952-a5ef-b2a06433e6b1-kube-api-access-hq5gr\") pod \"1bb12456-0517-4952-a5ef-b2a06433e6b1\" (UID: \"1bb12456-0517-4952-a5ef-b2a06433e6b1\") "
Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.382708 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bb12456-0517-4952-a5ef-b2a06433e6b1-nova-cell1-combined-ca-bundle\") pod \"1bb12456-0517-4952-a5ef-b2a06433e6b1\" (UID: \"1bb12456-0517-4952-a5ef-b2a06433e6b1\") "
Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.382875 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1bb12456-0517-4952-a5ef-b2a06433e6b1-ssh-key\") pod \"1bb12456-0517-4952-a5ef-b2a06433e6b1\" (UID: \"1bb12456-0517-4952-a5ef-b2a06433e6b1\") "
Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.382897 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1bb12456-0517-4952-a5ef-b2a06433e6b1-ceph\") pod \"1bb12456-0517-4952-a5ef-b2a06433e6b1\" (UID: \"1bb12456-0517-4952-a5ef-b2a06433e6b1\") "
Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.382917 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/1bb12456-0517-4952-a5ef-b2a06433e6b1-nova-cell1-compute-config-0\") pod \"1bb12456-0517-4952-a5ef-b2a06433e6b1\" (UID: \"1bb12456-0517-4952-a5ef-b2a06433e6b1\") "
Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.382932 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/1bb12456-0517-4952-a5ef-b2a06433e6b1-nova-cells-global-config-0\") pod \"1bb12456-0517-4952-a5ef-b2a06433e6b1\" (UID: \"1bb12456-0517-4952-a5ef-b2a06433e6b1\") "
Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.383006 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/1bb12456-0517-4952-a5ef-b2a06433e6b1-nova-migration-ssh-key-0\") pod \"1bb12456-0517-4952-a5ef-b2a06433e6b1\" (UID: \"1bb12456-0517-4952-a5ef-b2a06433e6b1\") "
Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.383066 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/1bb12456-0517-4952-a5ef-b2a06433e6b1-nova-migration-ssh-key-1\") pod \"1bb12456-0517-4952-a5ef-b2a06433e6b1\" (UID: \"1bb12456-0517-4952-a5ef-b2a06433e6b1\") "
Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.383102 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/1bb12456-0517-4952-a5ef-b2a06433e6b1-nova-cell1-compute-config-1\") pod \"1bb12456-0517-4952-a5ef-b2a06433e6b1\" (UID: \"1bb12456-0517-4952-a5ef-b2a06433e6b1\") "
Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.383131 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1bb12456-0517-4952-a5ef-b2a06433e6b1-inventory\") pod \"1bb12456-0517-4952-a5ef-b2a06433e6b1\" (UID: \"1bb12456-0517-4952-a5ef-b2a06433e6b1\") "
Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.405585 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bb12456-0517-4952-a5ef-b2a06433e6b1-ceph" (OuterVolumeSpecName: "ceph") pod "1bb12456-0517-4952-a5ef-b2a06433e6b1" (UID: "1bb12456-0517-4952-a5ef-b2a06433e6b1"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.412497 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bb12456-0517-4952-a5ef-b2a06433e6b1-kube-api-access-hq5gr" (OuterVolumeSpecName: "kube-api-access-hq5gr") pod "1bb12456-0517-4952-a5ef-b2a06433e6b1" (UID: "1bb12456-0517-4952-a5ef-b2a06433e6b1"). InnerVolumeSpecName "kube-api-access-hq5gr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.417317 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bb12456-0517-4952-a5ef-b2a06433e6b1-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "1bb12456-0517-4952-a5ef-b2a06433e6b1" (UID: "1bb12456-0517-4952-a5ef-b2a06433e6b1"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.437336 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bb12456-0517-4952-a5ef-b2a06433e6b1-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "1bb12456-0517-4952-a5ef-b2a06433e6b1" (UID: "1bb12456-0517-4952-a5ef-b2a06433e6b1"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.437907 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bb12456-0517-4952-a5ef-b2a06433e6b1-inventory" (OuterVolumeSpecName: "inventory") pod "1bb12456-0517-4952-a5ef-b2a06433e6b1" (UID: "1bb12456-0517-4952-a5ef-b2a06433e6b1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.454794 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bb12456-0517-4952-a5ef-b2a06433e6b1-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "1bb12456-0517-4952-a5ef-b2a06433e6b1" (UID: "1bb12456-0517-4952-a5ef-b2a06433e6b1"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.455759 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bb12456-0517-4952-a5ef-b2a06433e6b1-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "1bb12456-0517-4952-a5ef-b2a06433e6b1" (UID: "1bb12456-0517-4952-a5ef-b2a06433e6b1"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.459999 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bb12456-0517-4952-a5ef-b2a06433e6b1-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "1bb12456-0517-4952-a5ef-b2a06433e6b1" (UID: "1bb12456-0517-4952-a5ef-b2a06433e6b1"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.460246 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bb12456-0517-4952-a5ef-b2a06433e6b1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1bb12456-0517-4952-a5ef-b2a06433e6b1" (UID: "1bb12456-0517-4952-a5ef-b2a06433e6b1"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.462085 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bb12456-0517-4952-a5ef-b2a06433e6b1-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "1bb12456-0517-4952-a5ef-b2a06433e6b1" (UID: "1bb12456-0517-4952-a5ef-b2a06433e6b1"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.473717 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bb12456-0517-4952-a5ef-b2a06433e6b1-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "1bb12456-0517-4952-a5ef-b2a06433e6b1" (UID: "1bb12456-0517-4952-a5ef-b2a06433e6b1"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.485842 5002 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/1bb12456-0517-4952-a5ef-b2a06433e6b1-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\""
Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.485870 5002 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/1bb12456-0517-4952-a5ef-b2a06433e6b1-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\""
Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.485880 5002 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/1bb12456-0517-4952-a5ef-b2a06433e6b1-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\""
Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.485892 5002 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/1bb12456-0517-4952-a5ef-b2a06433e6b1-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\""
Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.485901 5002 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/1bb12456-0517-4952-a5ef-b2a06433e6b1-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\""
Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.485910 5002 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1bb12456-0517-4952-a5ef-b2a06433e6b1-inventory\") on node \"crc\" DevicePath \"\""
Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.485918 5002 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/1bb12456-0517-4952-a5ef-b2a06433e6b1-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\""
Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.485926 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hq5gr\" (UniqueName: \"kubernetes.io/projected/1bb12456-0517-4952-a5ef-b2a06433e6b1-kube-api-access-hq5gr\") on node \"crc\" DevicePath \"\""
Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.485934 5002 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bb12456-0517-4952-a5ef-b2a06433e6b1-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.485943 5002 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1bb12456-0517-4952-a5ef-b2a06433e6b1-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.485952 5002 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1bb12456-0517-4952-a5ef-b2a06433e6b1-ceph\") on node \"crc\" DevicePath \"\""
node \"crc\" DevicePath \"\"" Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.761291 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-mtwlk" event={"ID":"1bb12456-0517-4952-a5ef-b2a06433e6b1","Type":"ContainerDied","Data":"27f50ac30508d9693f03f624f3339531784fc5780538f0ffe8e9777ac9ea4817"} Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.761531 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27f50ac30508d9693f03f624f3339531784fc5780538f0ffe8e9777ac9ea4817" Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.761349 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-mtwlk" Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.897781 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-7nglp"] Dec 09 12:18:59 crc kubenswrapper[5002]: E1209 12:18:59.898391 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2c27e80-c4f3-4e58-bb65-96071dba6eef" containerName="extract-utilities" Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.898416 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2c27e80-c4f3-4e58-bb65-96071dba6eef" containerName="extract-utilities" Dec 09 12:18:59 crc kubenswrapper[5002]: E1209 12:18:59.898447 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bb12456-0517-4952-a5ef-b2a06433e6b1" containerName="nova-cell1-openstack-openstack-cell1" Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.898456 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bb12456-0517-4952-a5ef-b2a06433e6b1" containerName="nova-cell1-openstack-openstack-cell1" Dec 09 12:18:59 crc kubenswrapper[5002]: E1209 12:18:59.898478 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2c27e80-c4f3-4e58-bb65-96071dba6eef" containerName="extract-content" Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.898488 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2c27e80-c4f3-4e58-bb65-96071dba6eef" containerName="extract-content" Dec 09 12:18:59 crc kubenswrapper[5002]: E1209 12:18:59.898517 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2c27e80-c4f3-4e58-bb65-96071dba6eef" containerName="registry-server" Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.898525 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2c27e80-c4f3-4e58-bb65-96071dba6eef" containerName="registry-server" Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.898785 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2c27e80-c4f3-4e58-bb65-96071dba6eef" containerName="registry-server" Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.898839 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bb12456-0517-4952-a5ef-b2a06433e6b1" containerName="nova-cell1-openstack-openstack-cell1" Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.899830 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-7nglp" Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.902030 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.902493 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ngftr" Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.902699 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.902886 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.903104 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.909867 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-7nglp"] Dec 09 12:18:59 crc kubenswrapper[5002]: E1209 12:18:59.957385 5002 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1bb12456_0517_4952_a5ef_b2a06433e6b1.slice\": RecentStats: unable to find data in memory cache]" Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.995617 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/0625b628-5e86-426a-a766-ce710d59a41e-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-7nglp\" (UID: \"0625b628-5e86-426a-a766-ce710d59a41e\") " pod="openstack/telemetry-openstack-openstack-cell1-7nglp" Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.995679 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0625b628-5e86-426a-a766-ce710d59a41e-inventory\") pod \"telemetry-openstack-openstack-cell1-7nglp\" (UID: \"0625b628-5e86-426a-a766-ce710d59a41e\") " pod="openstack/telemetry-openstack-openstack-cell1-7nglp" Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.995727 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n4fp\" (UniqueName: \"kubernetes.io/projected/0625b628-5e86-426a-a766-ce710d59a41e-kube-api-access-6n4fp\") pod \"telemetry-openstack-openstack-cell1-7nglp\" (UID: \"0625b628-5e86-426a-a766-ce710d59a41e\") " pod="openstack/telemetry-openstack-openstack-cell1-7nglp" Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.995789 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/0625b628-5e86-426a-a766-ce710d59a41e-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-7nglp\" (UID: \"0625b628-5e86-426a-a766-ce710d59a41e\") " pod="openstack/telemetry-openstack-openstack-cell1-7nglp" Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.995927 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0625b628-5e86-426a-a766-ce710d59a41e-ceph\") pod 
\"telemetry-openstack-openstack-cell1-7nglp\" (UID: \"0625b628-5e86-426a-a766-ce710d59a41e\") " pod="openstack/telemetry-openstack-openstack-cell1-7nglp" Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.996016 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0625b628-5e86-426a-a766-ce710d59a41e-ssh-key\") pod \"telemetry-openstack-openstack-cell1-7nglp\" (UID: \"0625b628-5e86-426a-a766-ce710d59a41e\") " pod="openstack/telemetry-openstack-openstack-cell1-7nglp" Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.996055 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0625b628-5e86-426a-a766-ce710d59a41e-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-7nglp\" (UID: \"0625b628-5e86-426a-a766-ce710d59a41e\") " pod="openstack/telemetry-openstack-openstack-cell1-7nglp" Dec 09 12:18:59 crc kubenswrapper[5002]: I1209 12:18:59.996097 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/0625b628-5e86-426a-a766-ce710d59a41e-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-7nglp\" (UID: \"0625b628-5e86-426a-a766-ce710d59a41e\") " pod="openstack/telemetry-openstack-openstack-cell1-7nglp" Dec 09 12:19:00 crc kubenswrapper[5002]: I1209 12:19:00.098198 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0625b628-5e86-426a-a766-ce710d59a41e-ceph\") pod \"telemetry-openstack-openstack-cell1-7nglp\" (UID: \"0625b628-5e86-426a-a766-ce710d59a41e\") " pod="openstack/telemetry-openstack-openstack-cell1-7nglp" Dec 09 12:19:00 crc kubenswrapper[5002]: I1209 12:19:00.098323 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0625b628-5e86-426a-a766-ce710d59a41e-ssh-key\") pod \"telemetry-openstack-openstack-cell1-7nglp\" (UID: \"0625b628-5e86-426a-a766-ce710d59a41e\") " pod="openstack/telemetry-openstack-openstack-cell1-7nglp" Dec 09 12:19:00 crc kubenswrapper[5002]: I1209 12:19:00.098370 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0625b628-5e86-426a-a766-ce710d59a41e-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-7nglp\" (UID: \"0625b628-5e86-426a-a766-ce710d59a41e\") " pod="openstack/telemetry-openstack-openstack-cell1-7nglp" Dec 09 12:19:00 crc kubenswrapper[5002]: I1209 12:19:00.098416 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/0625b628-5e86-426a-a766-ce710d59a41e-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-7nglp\" (UID: \"0625b628-5e86-426a-a766-ce710d59a41e\") " pod="openstack/telemetry-openstack-openstack-cell1-7nglp" Dec 09 12:19:00 crc kubenswrapper[5002]: I1209 12:19:00.098468 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/0625b628-5e86-426a-a766-ce710d59a41e-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-7nglp\" (UID: 
\"0625b628-5e86-426a-a766-ce710d59a41e\") " pod="openstack/telemetry-openstack-openstack-cell1-7nglp" Dec 09 12:19:00 crc kubenswrapper[5002]: I1209 12:19:00.098496 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0625b628-5e86-426a-a766-ce710d59a41e-inventory\") pod \"telemetry-openstack-openstack-cell1-7nglp\" (UID: \"0625b628-5e86-426a-a766-ce710d59a41e\") " pod="openstack/telemetry-openstack-openstack-cell1-7nglp" Dec 09 12:19:00 crc kubenswrapper[5002]: I1209 12:19:00.098542 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n4fp\" (UniqueName: \"kubernetes.io/projected/0625b628-5e86-426a-a766-ce710d59a41e-kube-api-access-6n4fp\") pod \"telemetry-openstack-openstack-cell1-7nglp\" (UID: \"0625b628-5e86-426a-a766-ce710d59a41e\") " pod="openstack/telemetry-openstack-openstack-cell1-7nglp" Dec 09 12:19:00 crc kubenswrapper[5002]: I1209 12:19:00.098601 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/0625b628-5e86-426a-a766-ce710d59a41e-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-7nglp\" (UID: \"0625b628-5e86-426a-a766-ce710d59a41e\") " pod="openstack/telemetry-openstack-openstack-cell1-7nglp" Dec 09 12:19:00 crc kubenswrapper[5002]: I1209 12:19:00.103801 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/0625b628-5e86-426a-a766-ce710d59a41e-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-7nglp\" (UID: \"0625b628-5e86-426a-a766-ce710d59a41e\") " pod="openstack/telemetry-openstack-openstack-cell1-7nglp" Dec 09 12:19:00 crc kubenswrapper[5002]: I1209 12:19:00.103791 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0625b628-5e86-426a-a766-ce710d59a41e-inventory\") pod \"telemetry-openstack-openstack-cell1-7nglp\" (UID: \"0625b628-5e86-426a-a766-ce710d59a41e\") " pod="openstack/telemetry-openstack-openstack-cell1-7nglp" Dec 09 12:19:00 crc kubenswrapper[5002]: I1209 12:19:00.104046 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0625b628-5e86-426a-a766-ce710d59a41e-ceph\") pod \"telemetry-openstack-openstack-cell1-7nglp\" (UID: \"0625b628-5e86-426a-a766-ce710d59a41e\") " pod="openstack/telemetry-openstack-openstack-cell1-7nglp" Dec 09 12:19:00 crc kubenswrapper[5002]: I1209 12:19:00.104372 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/0625b628-5e86-426a-a766-ce710d59a41e-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-7nglp\" (UID: \"0625b628-5e86-426a-a766-ce710d59a41e\") " pod="openstack/telemetry-openstack-openstack-cell1-7nglp" Dec 09 12:19:00 crc kubenswrapper[5002]: I1209 12:19:00.104609 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0625b628-5e86-426a-a766-ce710d59a41e-ssh-key\") pod \"telemetry-openstack-openstack-cell1-7nglp\" (UID: \"0625b628-5e86-426a-a766-ce710d59a41e\") " pod="openstack/telemetry-openstack-openstack-cell1-7nglp" Dec 09 12:19:00 crc kubenswrapper[5002]: I1209 12:19:00.105325 5002 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0625b628-5e86-426a-a766-ce710d59a41e-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-7nglp\" (UID: \"0625b628-5e86-426a-a766-ce710d59a41e\") " pod="openstack/telemetry-openstack-openstack-cell1-7nglp" Dec 09 12:19:00 crc kubenswrapper[5002]: I1209 12:19:00.115831 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/0625b628-5e86-426a-a766-ce710d59a41e-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-7nglp\" (UID: \"0625b628-5e86-426a-a766-ce710d59a41e\") " pod="openstack/telemetry-openstack-openstack-cell1-7nglp" Dec 09 12:19:00 crc kubenswrapper[5002]: I1209 12:19:00.120735 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n4fp\" (UniqueName: \"kubernetes.io/projected/0625b628-5e86-426a-a766-ce710d59a41e-kube-api-access-6n4fp\") pod \"telemetry-openstack-openstack-cell1-7nglp\" (UID: \"0625b628-5e86-426a-a766-ce710d59a41e\") " pod="openstack/telemetry-openstack-openstack-cell1-7nglp" Dec 09 12:19:00 crc kubenswrapper[5002]: I1209 12:19:00.229927 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-7nglp" Dec 09 12:19:00 crc kubenswrapper[5002]: I1209 12:19:00.962074 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-7nglp"] Dec 09 12:19:01 crc kubenswrapper[5002]: I1209 12:19:01.781986 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-7nglp" event={"ID":"0625b628-5e86-426a-a766-ce710d59a41e","Type":"ContainerStarted","Data":"25f0dc80d777d0aa7ee10887c8ce5a8cf0b0b93fcc79d166cdba0b89a6b0387b"} Dec 09 12:19:01 crc kubenswrapper[5002]: I1209 12:19:01.782341 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-7nglp" event={"ID":"0625b628-5e86-426a-a766-ce710d59a41e","Type":"ContainerStarted","Data":"4ea7f66b41606be7eb77375376091637746003fc938119f64a221a5bf0ba300a"} Dec 09 12:19:01 crc kubenswrapper[5002]: I1209 12:19:01.806333 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-openstack-openstack-cell1-7nglp" podStartSLOduration=2.337480179 podStartE2EDuration="2.806307046s" podCreationTimestamp="2025-12-09 12:18:59 +0000 UTC" firstStartedPulling="2025-12-09 12:19:00.967005358 +0000 UTC m=+8273.359056439" lastFinishedPulling="2025-12-09 12:19:01.435832225 +0000 UTC m=+8273.827883306" observedRunningTime="2025-12-09 12:19:01.796547814 +0000 UTC m=+8274.188598905" watchObservedRunningTime="2025-12-09 12:19:01.806307046 +0000 UTC m=+8274.198358127" Dec 09 12:19:04 crc kubenswrapper[5002]: I1209 12:19:04.061038 5002 scope.go:117] "RemoveContainer" containerID="7780f1b6ff39bc4223d84da4678f12a7a9ba1a1c0041213b8f77f935907dd6aa" Dec 09 12:19:04 crc kubenswrapper[5002]: E1209 12:19:04.061975 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:19:17 crc kubenswrapper[5002]: 
I1209 12:19:17.060884 5002 scope.go:117] "RemoveContainer" containerID="7780f1b6ff39bc4223d84da4678f12a7a9ba1a1c0041213b8f77f935907dd6aa" Dec 09 12:19:17 crc kubenswrapper[5002]: E1209 12:19:17.061868 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:19:31 crc kubenswrapper[5002]: I1209 12:19:31.060572 5002 scope.go:117] "RemoveContainer" containerID="7780f1b6ff39bc4223d84da4678f12a7a9ba1a1c0041213b8f77f935907dd6aa" Dec 09 12:19:31 crc kubenswrapper[5002]: E1209 12:19:31.062180 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:19:43 crc kubenswrapper[5002]: I1209 12:19:43.061162 5002 scope.go:117] "RemoveContainer" containerID="7780f1b6ff39bc4223d84da4678f12a7a9ba1a1c0041213b8f77f935907dd6aa" Dec 09 12:19:43 crc kubenswrapper[5002]: E1209 12:19:43.062049 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:19:54 crc kubenswrapper[5002]: I1209 12:19:54.060880 5002 scope.go:117] "RemoveContainer" containerID="7780f1b6ff39bc4223d84da4678f12a7a9ba1a1c0041213b8f77f935907dd6aa" Dec 09 12:19:54 crc kubenswrapper[5002]: E1209 12:19:54.062334 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:20:05 crc kubenswrapper[5002]: I1209 12:20:05.060550 5002 scope.go:117] "RemoveContainer" containerID="7780f1b6ff39bc4223d84da4678f12a7a9ba1a1c0041213b8f77f935907dd6aa" Dec 09 12:20:05 crc kubenswrapper[5002]: E1209 12:20:05.062900 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:20:19 crc kubenswrapper[5002]: I1209 12:20:19.061464 5002 scope.go:117] "RemoveContainer" containerID="7780f1b6ff39bc4223d84da4678f12a7a9ba1a1c0041213b8f77f935907dd6aa" Dec 09 12:20:19 crc kubenswrapper[5002]: E1209 
12:20:19.062542 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:20:34 crc kubenswrapper[5002]: I1209 12:20:34.061521 5002 scope.go:117] "RemoveContainer" containerID="7780f1b6ff39bc4223d84da4678f12a7a9ba1a1c0041213b8f77f935907dd6aa" Dec 09 12:20:34 crc kubenswrapper[5002]: E1209 12:20:34.062477 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:20:49 crc kubenswrapper[5002]: I1209 12:20:49.060276 5002 scope.go:117] "RemoveContainer" containerID="7780f1b6ff39bc4223d84da4678f12a7a9ba1a1c0041213b8f77f935907dd6aa" Dec 09 12:20:49 crc kubenswrapper[5002]: E1209 12:20:49.062106 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:21:03 crc kubenswrapper[5002]: I1209 12:21:03.060111 5002 scope.go:117] "RemoveContainer" containerID="7780f1b6ff39bc4223d84da4678f12a7a9ba1a1c0041213b8f77f935907dd6aa" Dec 09 12:21:03 crc kubenswrapper[5002]: E1209 12:21:03.060775 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:21:17 crc kubenswrapper[5002]: I1209 12:21:17.060719 5002 scope.go:117] "RemoveContainer" containerID="7780f1b6ff39bc4223d84da4678f12a7a9ba1a1c0041213b8f77f935907dd6aa" Dec 09 12:21:17 crc kubenswrapper[5002]: E1209 12:21:17.061490 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:21:31 crc kubenswrapper[5002]: I1209 12:21:31.060077 5002 scope.go:117] "RemoveContainer" containerID="7780f1b6ff39bc4223d84da4678f12a7a9ba1a1c0041213b8f77f935907dd6aa" Dec 09 12:21:31 crc kubenswrapper[5002]: E1209 12:21:31.061122 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:21:35 crc kubenswrapper[5002]: I1209 12:21:35.622766 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gh2p4"] Dec 09 12:21:35 crc kubenswrapper[5002]: I1209 12:21:35.626547 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gh2p4" Dec 09 12:21:35 crc kubenswrapper[5002]: I1209 12:21:35.639201 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gh2p4"] Dec 09 12:21:35 crc kubenswrapper[5002]: I1209 12:21:35.696081 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsvwt\" (UniqueName: \"kubernetes.io/projected/ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b-kube-api-access-nsvwt\") pod \"certified-operators-gh2p4\" (UID: \"ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b\") " pod="openshift-marketplace/certified-operators-gh2p4" Dec 09 12:21:35 crc kubenswrapper[5002]: I1209 12:21:35.696292 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b-catalog-content\") pod \"certified-operators-gh2p4\" (UID: \"ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b\") " pod="openshift-marketplace/certified-operators-gh2p4" Dec 09 12:21:35 crc kubenswrapper[5002]: I1209 12:21:35.696360 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b-utilities\") pod \"certified-operators-gh2p4\" (UID: \"ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b\") " pod="openshift-marketplace/certified-operators-gh2p4" Dec 09 12:21:35 crc kubenswrapper[5002]: I1209 12:21:35.797676 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b-utilities\") pod \"certified-operators-gh2p4\" (UID: \"ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b\") " pod="openshift-marketplace/certified-operators-gh2p4" Dec 09 12:21:35 crc kubenswrapper[5002]: I1209 12:21:35.797827 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsvwt\" (UniqueName: \"kubernetes.io/projected/ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b-kube-api-access-nsvwt\") pod \"certified-operators-gh2p4\" (UID: \"ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b\") " pod="openshift-marketplace/certified-operators-gh2p4" Dec 09 12:21:35 crc kubenswrapper[5002]: I1209 12:21:35.797960 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b-catalog-content\") pod \"certified-operators-gh2p4\" (UID: \"ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b\") " pod="openshift-marketplace/certified-operators-gh2p4" Dec 09 12:21:35 crc kubenswrapper[5002]: I1209 12:21:35.798392 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b-catalog-content\") pod \"certified-operators-gh2p4\" (UID: 
\"ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b\") " pod="openshift-marketplace/certified-operators-gh2p4" Dec 09 12:21:35 crc kubenswrapper[5002]: I1209 12:21:35.798505 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b-utilities\") pod \"certified-operators-gh2p4\" (UID: \"ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b\") " pod="openshift-marketplace/certified-operators-gh2p4" Dec 09 12:21:35 crc kubenswrapper[5002]: I1209 12:21:35.822376 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsvwt\" (UniqueName: \"kubernetes.io/projected/ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b-kube-api-access-nsvwt\") pod \"certified-operators-gh2p4\" (UID: \"ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b\") " pod="openshift-marketplace/certified-operators-gh2p4" Dec 09 12:21:35 crc kubenswrapper[5002]: I1209 12:21:35.957546 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gh2p4" Dec 09 12:21:36 crc kubenswrapper[5002]: I1209 12:21:36.521052 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gh2p4"] Dec 09 12:21:37 crc kubenswrapper[5002]: I1209 12:21:37.516430 5002 generic.go:334] "Generic (PLEG): container finished" podID="ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b" containerID="41401a0761db8c216b477f6410ee8368a9c19ff642eb6bc72adfa3ddd1a9c5c5" exitCode=0 Dec 09 12:21:37 crc kubenswrapper[5002]: I1209 12:21:37.516503 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gh2p4" event={"ID":"ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b","Type":"ContainerDied","Data":"41401a0761db8c216b477f6410ee8368a9c19ff642eb6bc72adfa3ddd1a9c5c5"} Dec 09 12:21:37 crc kubenswrapper[5002]: I1209 12:21:37.516758 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gh2p4" event={"ID":"ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b","Type":"ContainerStarted","Data":"749b6c140415cce5246fb1d5c4143e1d916c2b3e502b88e695be58dd4a724d44"} Dec 09 12:21:37 crc kubenswrapper[5002]: I1209 12:21:37.521690 5002 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 12:21:38 crc kubenswrapper[5002]: I1209 12:21:38.531011 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gh2p4" event={"ID":"ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b","Type":"ContainerStarted","Data":"0ecb546dbfac9f88c65a5a6140d79b1caf777d03cf0c0d20d8d966200039a0f2"} Dec 09 12:21:39 crc kubenswrapper[5002]: I1209 12:21:39.543238 5002 generic.go:334] "Generic (PLEG): container finished" podID="ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b" containerID="0ecb546dbfac9f88c65a5a6140d79b1caf777d03cf0c0d20d8d966200039a0f2" exitCode=0 Dec 09 12:21:39 crc kubenswrapper[5002]: I1209 12:21:39.543265 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gh2p4" event={"ID":"ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b","Type":"ContainerDied","Data":"0ecb546dbfac9f88c65a5a6140d79b1caf777d03cf0c0d20d8d966200039a0f2"} Dec 09 12:21:40 crc kubenswrapper[5002]: I1209 12:21:40.554878 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gh2p4" event={"ID":"ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b","Type":"ContainerStarted","Data":"847557f2537b6b8bcb58f87a98fec4f4c9231822e8562d9717b4dbbe81448575"} Dec 09 12:21:40 crc 
kubenswrapper[5002]: I1209 12:21:40.581172 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gh2p4" podStartSLOduration=3.165565244 podStartE2EDuration="5.581150505s" podCreationTimestamp="2025-12-09 12:21:35 +0000 UTC" firstStartedPulling="2025-12-09 12:21:37.52128911 +0000 UTC m=+8429.913340231" lastFinishedPulling="2025-12-09 12:21:39.936874411 +0000 UTC m=+8432.328925492" observedRunningTime="2025-12-09 12:21:40.576976663 +0000 UTC m=+8432.969027764" watchObservedRunningTime="2025-12-09 12:21:40.581150505 +0000 UTC m=+8432.973201586" Dec 09 12:21:45 crc kubenswrapper[5002]: I1209 12:21:45.061652 5002 scope.go:117] "RemoveContainer" containerID="7780f1b6ff39bc4223d84da4678f12a7a9ba1a1c0041213b8f77f935907dd6aa" Dec 09 12:21:45 crc kubenswrapper[5002]: E1209 12:21:45.063212 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:21:45 crc kubenswrapper[5002]: I1209 12:21:45.958412 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gh2p4" Dec 09 12:21:45 crc kubenswrapper[5002]: I1209 12:21:45.959416 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gh2p4" Dec 09 12:21:46 crc kubenswrapper[5002]: I1209 12:21:46.006160 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gh2p4" Dec 09 12:21:46 crc kubenswrapper[5002]: I1209 12:21:46.675605 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gh2p4" Dec 09 12:21:46 crc kubenswrapper[5002]: I1209 12:21:46.749496 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gh2p4"] Dec 09 12:21:48 crc kubenswrapper[5002]: I1209 12:21:48.645172 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gh2p4" podUID="ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b" containerName="registry-server" containerID="cri-o://847557f2537b6b8bcb58f87a98fec4f4c9231822e8562d9717b4dbbe81448575" gracePeriod=2 Dec 09 12:21:49 crc kubenswrapper[5002]: I1209 12:21:49.172389 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gh2p4" Dec 09 12:21:49 crc kubenswrapper[5002]: I1209 12:21:49.317068 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsvwt\" (UniqueName: \"kubernetes.io/projected/ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b-kube-api-access-nsvwt\") pod \"ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b\" (UID: \"ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b\") " Dec 09 12:21:49 crc kubenswrapper[5002]: I1209 12:21:49.317551 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b-catalog-content\") pod \"ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b\" (UID: \"ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b\") " Dec 09 12:21:49 crc kubenswrapper[5002]: I1209 12:21:49.317651 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b-utilities\") pod \"ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b\" (UID: \"ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b\") " Dec 09 12:21:49 crc kubenswrapper[5002]: I1209 12:21:49.319454 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b-utilities" (OuterVolumeSpecName: "utilities") pod "ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b" (UID: "ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:21:49 crc kubenswrapper[5002]: I1209 12:21:49.332195 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b-kube-api-access-nsvwt" (OuterVolumeSpecName: "kube-api-access-nsvwt") pod "ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b" (UID: "ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b"). InnerVolumeSpecName "kube-api-access-nsvwt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:21:49 crc kubenswrapper[5002]: I1209 12:21:49.419944 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 12:21:49 crc kubenswrapper[5002]: I1209 12:21:49.419981 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsvwt\" (UniqueName: \"kubernetes.io/projected/ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b-kube-api-access-nsvwt\") on node \"crc\" DevicePath \"\"" Dec 09 12:21:49 crc kubenswrapper[5002]: I1209 12:21:49.535268 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b" (UID: "ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:21:49 crc kubenswrapper[5002]: I1209 12:21:49.623998 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 12:21:49 crc kubenswrapper[5002]: I1209 12:21:49.662596 5002 generic.go:334] "Generic (PLEG): container finished" podID="ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b" containerID="847557f2537b6b8bcb58f87a98fec4f4c9231822e8562d9717b4dbbe81448575" exitCode=0 Dec 09 12:21:49 crc kubenswrapper[5002]: I1209 12:21:49.662634 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gh2p4" event={"ID":"ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b","Type":"ContainerDied","Data":"847557f2537b6b8bcb58f87a98fec4f4c9231822e8562d9717b4dbbe81448575"} Dec 09 12:21:49 crc kubenswrapper[5002]: I1209 12:21:49.662686 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gh2p4" event={"ID":"ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b","Type":"ContainerDied","Data":"749b6c140415cce5246fb1d5c4143e1d916c2b3e502b88e695be58dd4a724d44"} Dec 09 12:21:49 crc kubenswrapper[5002]: I1209 12:21:49.662706 5002 scope.go:117] "RemoveContainer" containerID="847557f2537b6b8bcb58f87a98fec4f4c9231822e8562d9717b4dbbe81448575" Dec 09 12:21:49 crc kubenswrapper[5002]: I1209 12:21:49.662891 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gh2p4" Dec 09 12:21:49 crc kubenswrapper[5002]: I1209 12:21:49.687109 5002 scope.go:117] "RemoveContainer" containerID="0ecb546dbfac9f88c65a5a6140d79b1caf777d03cf0c0d20d8d966200039a0f2" Dec 09 12:21:49 crc kubenswrapper[5002]: I1209 12:21:49.717565 5002 scope.go:117] "RemoveContainer" containerID="41401a0761db8c216b477f6410ee8368a9c19ff642eb6bc72adfa3ddd1a9c5c5" Dec 09 12:21:49 crc kubenswrapper[5002]: I1209 12:21:49.736536 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gh2p4"] Dec 09 12:21:49 crc kubenswrapper[5002]: I1209 12:21:49.754085 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gh2p4"] Dec 09 12:21:49 crc kubenswrapper[5002]: I1209 12:21:49.785454 5002 scope.go:117] "RemoveContainer" containerID="847557f2537b6b8bcb58f87a98fec4f4c9231822e8562d9717b4dbbe81448575" Dec 09 12:21:49 crc kubenswrapper[5002]: E1209 12:21:49.786681 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"847557f2537b6b8bcb58f87a98fec4f4c9231822e8562d9717b4dbbe81448575\": container with ID starting with 847557f2537b6b8bcb58f87a98fec4f4c9231822e8562d9717b4dbbe81448575 not found: ID does not exist" containerID="847557f2537b6b8bcb58f87a98fec4f4c9231822e8562d9717b4dbbe81448575" Dec 09 12:21:49 crc kubenswrapper[5002]: I1209 12:21:49.786740 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"847557f2537b6b8bcb58f87a98fec4f4c9231822e8562d9717b4dbbe81448575"} err="failed to get container status \"847557f2537b6b8bcb58f87a98fec4f4c9231822e8562d9717b4dbbe81448575\": rpc error: code = NotFound desc = could not find container \"847557f2537b6b8bcb58f87a98fec4f4c9231822e8562d9717b4dbbe81448575\": container with ID starting with 847557f2537b6b8bcb58f87a98fec4f4c9231822e8562d9717b4dbbe81448575 not found: ID does not exist" Dec 09 
12:21:49 crc kubenswrapper[5002]: I1209 12:21:49.786776 5002 scope.go:117] "RemoveContainer" containerID="0ecb546dbfac9f88c65a5a6140d79b1caf777d03cf0c0d20d8d966200039a0f2" Dec 09 12:21:49 crc kubenswrapper[5002]: E1209 12:21:49.787452 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ecb546dbfac9f88c65a5a6140d79b1caf777d03cf0c0d20d8d966200039a0f2\": container with ID starting with 0ecb546dbfac9f88c65a5a6140d79b1caf777d03cf0c0d20d8d966200039a0f2 not found: ID does not exist" containerID="0ecb546dbfac9f88c65a5a6140d79b1caf777d03cf0c0d20d8d966200039a0f2" Dec 09 12:21:49 crc kubenswrapper[5002]: I1209 12:21:49.787499 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ecb546dbfac9f88c65a5a6140d79b1caf777d03cf0c0d20d8d966200039a0f2"} err="failed to get container status \"0ecb546dbfac9f88c65a5a6140d79b1caf777d03cf0c0d20d8d966200039a0f2\": rpc error: code = NotFound desc = could not find container \"0ecb546dbfac9f88c65a5a6140d79b1caf777d03cf0c0d20d8d966200039a0f2\": container with ID starting with 0ecb546dbfac9f88c65a5a6140d79b1caf777d03cf0c0d20d8d966200039a0f2 not found: ID does not exist" Dec 09 12:21:49 crc kubenswrapper[5002]: I1209 12:21:49.787529 5002 scope.go:117] "RemoveContainer" containerID="41401a0761db8c216b477f6410ee8368a9c19ff642eb6bc72adfa3ddd1a9c5c5" Dec 09 12:21:49 crc kubenswrapper[5002]: E1209 12:21:49.788570 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41401a0761db8c216b477f6410ee8368a9c19ff642eb6bc72adfa3ddd1a9c5c5\": container with ID starting with 41401a0761db8c216b477f6410ee8368a9c19ff642eb6bc72adfa3ddd1a9c5c5 not found: ID does not exist" containerID="41401a0761db8c216b477f6410ee8368a9c19ff642eb6bc72adfa3ddd1a9c5c5" Dec 09 12:21:49 crc kubenswrapper[5002]: I1209 12:21:49.788600 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41401a0761db8c216b477f6410ee8368a9c19ff642eb6bc72adfa3ddd1a9c5c5"} err="failed to get container status \"41401a0761db8c216b477f6410ee8368a9c19ff642eb6bc72adfa3ddd1a9c5c5\": rpc error: code = NotFound desc = could not find container \"41401a0761db8c216b477f6410ee8368a9c19ff642eb6bc72adfa3ddd1a9c5c5\": container with ID starting with 41401a0761db8c216b477f6410ee8368a9c19ff642eb6bc72adfa3ddd1a9c5c5 not found: ID does not exist" Dec 09 12:21:50 crc kubenswrapper[5002]: I1209 12:21:50.074067 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b" path="/var/lib/kubelet/pods/ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b/volumes" Dec 09 12:21:58 crc kubenswrapper[5002]: I1209 12:21:58.075428 5002 scope.go:117] "RemoveContainer" containerID="7780f1b6ff39bc4223d84da4678f12a7a9ba1a1c0041213b8f77f935907dd6aa" Dec 09 12:21:58 crc kubenswrapper[5002]: E1209 12:21:58.077169 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:22:11 crc kubenswrapper[5002]: I1209 12:22:11.060634 5002 scope.go:117] "RemoveContainer" 
containerID="7780f1b6ff39bc4223d84da4678f12a7a9ba1a1c0041213b8f77f935907dd6aa" Dec 09 12:22:11 crc kubenswrapper[5002]: I1209 12:22:11.928961 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerStarted","Data":"e9381aa30da767b73ba7a9c72912271f443653b1e1e8285560563da3163c3e90"} Dec 09 12:23:26 crc kubenswrapper[5002]: I1209 12:23:26.735344 5002 generic.go:334] "Generic (PLEG): container finished" podID="0625b628-5e86-426a-a766-ce710d59a41e" containerID="25f0dc80d777d0aa7ee10887c8ce5a8cf0b0b93fcc79d166cdba0b89a6b0387b" exitCode=0 Dec 09 12:23:26 crc kubenswrapper[5002]: I1209 12:23:26.735426 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-7nglp" event={"ID":"0625b628-5e86-426a-a766-ce710d59a41e","Type":"ContainerDied","Data":"25f0dc80d777d0aa7ee10887c8ce5a8cf0b0b93fcc79d166cdba0b89a6b0387b"} Dec 09 12:23:28 crc kubenswrapper[5002]: I1209 12:23:28.173893 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-7nglp" Dec 09 12:23:28 crc kubenswrapper[5002]: I1209 12:23:28.286225 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0625b628-5e86-426a-a766-ce710d59a41e-ssh-key\") pod \"0625b628-5e86-426a-a766-ce710d59a41e\" (UID: \"0625b628-5e86-426a-a766-ce710d59a41e\") " Dec 09 12:23:28 crc kubenswrapper[5002]: I1209 12:23:28.286591 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/0625b628-5e86-426a-a766-ce710d59a41e-ceilometer-compute-config-data-1\") pod \"0625b628-5e86-426a-a766-ce710d59a41e\" (UID: \"0625b628-5e86-426a-a766-ce710d59a41e\") " Dec 09 12:23:28 crc kubenswrapper[5002]: I1209 12:23:28.286619 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/0625b628-5e86-426a-a766-ce710d59a41e-ceilometer-compute-config-data-0\") pod \"0625b628-5e86-426a-a766-ce710d59a41e\" (UID: \"0625b628-5e86-426a-a766-ce710d59a41e\") " Dec 09 12:23:28 crc kubenswrapper[5002]: I1209 12:23:28.286656 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0625b628-5e86-426a-a766-ce710d59a41e-telemetry-combined-ca-bundle\") pod \"0625b628-5e86-426a-a766-ce710d59a41e\" (UID: \"0625b628-5e86-426a-a766-ce710d59a41e\") " Dec 09 12:23:28 crc kubenswrapper[5002]: I1209 12:23:28.286771 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0625b628-5e86-426a-a766-ce710d59a41e-inventory\") pod \"0625b628-5e86-426a-a766-ce710d59a41e\" (UID: \"0625b628-5e86-426a-a766-ce710d59a41e\") " Dec 09 12:23:28 crc kubenswrapper[5002]: I1209 12:23:28.286804 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6n4fp\" (UniqueName: \"kubernetes.io/projected/0625b628-5e86-426a-a766-ce710d59a41e-kube-api-access-6n4fp\") pod \"0625b628-5e86-426a-a766-ce710d59a41e\" (UID: \"0625b628-5e86-426a-a766-ce710d59a41e\") " Dec 09 12:23:28 crc kubenswrapper[5002]: I1209 12:23:28.286853 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/0625b628-5e86-426a-a766-ce710d59a41e-ceilometer-compute-config-data-2\") pod \"0625b628-5e86-426a-a766-ce710d59a41e\" (UID: \"0625b628-5e86-426a-a766-ce710d59a41e\") " Dec 09 12:23:28 crc kubenswrapper[5002]: I1209 12:23:28.286870 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0625b628-5e86-426a-a766-ce710d59a41e-ceph\") pod \"0625b628-5e86-426a-a766-ce710d59a41e\" (UID: \"0625b628-5e86-426a-a766-ce710d59a41e\") " Dec 09 12:23:28 crc kubenswrapper[5002]: I1209 12:23:28.297081 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0625b628-5e86-426a-a766-ce710d59a41e-ceph" (OuterVolumeSpecName: "ceph") pod "0625b628-5e86-426a-a766-ce710d59a41e" (UID: "0625b628-5e86-426a-a766-ce710d59a41e"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:23:28 crc kubenswrapper[5002]: I1209 12:23:28.297204 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0625b628-5e86-426a-a766-ce710d59a41e-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "0625b628-5e86-426a-a766-ce710d59a41e" (UID: "0625b628-5e86-426a-a766-ce710d59a41e"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:23:28 crc kubenswrapper[5002]: I1209 12:23:28.297210 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0625b628-5e86-426a-a766-ce710d59a41e-kube-api-access-6n4fp" (OuterVolumeSpecName: "kube-api-access-6n4fp") pod "0625b628-5e86-426a-a766-ce710d59a41e" (UID: "0625b628-5e86-426a-a766-ce710d59a41e"). InnerVolumeSpecName "kube-api-access-6n4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:23:28 crc kubenswrapper[5002]: I1209 12:23:28.316493 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0625b628-5e86-426a-a766-ce710d59a41e-inventory" (OuterVolumeSpecName: "inventory") pod "0625b628-5e86-426a-a766-ce710d59a41e" (UID: "0625b628-5e86-426a-a766-ce710d59a41e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:23:28 crc kubenswrapper[5002]: I1209 12:23:28.317917 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0625b628-5e86-426a-a766-ce710d59a41e-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "0625b628-5e86-426a-a766-ce710d59a41e" (UID: "0625b628-5e86-426a-a766-ce710d59a41e"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:23:28 crc kubenswrapper[5002]: I1209 12:23:28.319273 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0625b628-5e86-426a-a766-ce710d59a41e-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "0625b628-5e86-426a-a766-ce710d59a41e" (UID: "0625b628-5e86-426a-a766-ce710d59a41e"). InnerVolumeSpecName "ceilometer-compute-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:23:28 crc kubenswrapper[5002]: I1209 12:23:28.320472 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0625b628-5e86-426a-a766-ce710d59a41e-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "0625b628-5e86-426a-a766-ce710d59a41e" (UID: "0625b628-5e86-426a-a766-ce710d59a41e"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:23:28 crc kubenswrapper[5002]: I1209 12:23:28.322049 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0625b628-5e86-426a-a766-ce710d59a41e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0625b628-5e86-426a-a766-ce710d59a41e" (UID: "0625b628-5e86-426a-a766-ce710d59a41e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:23:28 crc kubenswrapper[5002]: I1209 12:23:28.389894 5002 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0625b628-5e86-426a-a766-ce710d59a41e-inventory\") on node \"crc\" DevicePath \"\"" Dec 09 12:23:28 crc kubenswrapper[5002]: I1209 12:23:28.390194 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6n4fp\" (UniqueName: \"kubernetes.io/projected/0625b628-5e86-426a-a766-ce710d59a41e-kube-api-access-6n4fp\") on node \"crc\" DevicePath \"\"" Dec 09 12:23:28 crc kubenswrapper[5002]: I1209 12:23:28.390279 5002 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/0625b628-5e86-426a-a766-ce710d59a41e-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 09 12:23:28 crc kubenswrapper[5002]: I1209 12:23:28.390354 5002 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0625b628-5e86-426a-a766-ce710d59a41e-ceph\") on node \"crc\" DevicePath \"\"" Dec 09 12:23:28 crc kubenswrapper[5002]: I1209 12:23:28.390433 5002 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0625b628-5e86-426a-a766-ce710d59a41e-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 09 12:23:28 crc kubenswrapper[5002]: I1209 12:23:28.390505 5002 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/0625b628-5e86-426a-a766-ce710d59a41e-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 09 12:23:28 crc kubenswrapper[5002]: I1209 12:23:28.390580 5002 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/0625b628-5e86-426a-a766-ce710d59a41e-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 09 12:23:28 crc kubenswrapper[5002]: I1209 12:23:28.390653 5002 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0625b628-5e86-426a-a766-ce710d59a41e-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 12:23:28 crc kubenswrapper[5002]: I1209 12:23:28.760402 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-7nglp" event={"ID":"0625b628-5e86-426a-a766-ce710d59a41e","Type":"ContainerDied","Data":"4ea7f66b41606be7eb77375376091637746003fc938119f64a221a5bf0ba300a"} Dec 09 12:23:28 crc kubenswrapper[5002]: 
I1209 12:23:28.760443 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-7nglp" Dec 09 12:23:28 crc kubenswrapper[5002]: I1209 12:23:28.760461 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ea7f66b41606be7eb77375376091637746003fc938119f64a221a5bf0ba300a" Dec 09 12:23:28 crc kubenswrapper[5002]: I1209 12:23:28.895682 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-frp4b"] Dec 09 12:23:28 crc kubenswrapper[5002]: E1209 12:23:28.896200 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0625b628-5e86-426a-a766-ce710d59a41e" containerName="telemetry-openstack-openstack-cell1" Dec 09 12:23:28 crc kubenswrapper[5002]: I1209 12:23:28.896215 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="0625b628-5e86-426a-a766-ce710d59a41e" containerName="telemetry-openstack-openstack-cell1" Dec 09 12:23:28 crc kubenswrapper[5002]: E1209 12:23:28.896231 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b" containerName="extract-utilities" Dec 09 12:23:28 crc kubenswrapper[5002]: I1209 12:23:28.896237 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b" containerName="extract-utilities" Dec 09 12:23:28 crc kubenswrapper[5002]: E1209 12:23:28.896274 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b" containerName="extract-content" Dec 09 12:23:28 crc kubenswrapper[5002]: I1209 12:23:28.896280 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b" containerName="extract-content" Dec 09 12:23:28 crc kubenswrapper[5002]: E1209 12:23:28.896299 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b" containerName="registry-server" Dec 09 12:23:28 crc kubenswrapper[5002]: I1209 12:23:28.896305 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b" containerName="registry-server" Dec 09 12:23:28 crc kubenswrapper[5002]: I1209 12:23:28.896479 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee72f03e-9f9a-45d0-9aa6-3a9bffe6e73b" containerName="registry-server" Dec 09 12:23:28 crc kubenswrapper[5002]: I1209 12:23:28.896500 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="0625b628-5e86-426a-a766-ce710d59a41e" containerName="telemetry-openstack-openstack-cell1" Dec 09 12:23:28 crc kubenswrapper[5002]: I1209 12:23:28.897336 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-frp4b" Dec 09 12:23:28 crc kubenswrapper[5002]: I1209 12:23:28.904581 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 09 12:23:28 crc kubenswrapper[5002]: I1209 12:23:28.904845 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 09 12:23:28 crc kubenswrapper[5002]: I1209 12:23:28.905006 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-sriov-agent-neutron-config" Dec 09 12:23:28 crc kubenswrapper[5002]: I1209 12:23:28.905659 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 09 12:23:28 crc kubenswrapper[5002]: I1209 12:23:28.907950 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ngftr" Dec 09 12:23:28 crc kubenswrapper[5002]: I1209 12:23:28.908005 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-frp4b"] Dec 09 12:23:29 crc kubenswrapper[5002]: I1209 12:23:29.004881 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/effd7072-b70f-4208-a804-82ed8c1fca04-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-frp4b\" (UID: \"effd7072-b70f-4208-a804-82ed8c1fca04\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-frp4b" Dec 09 12:23:29 crc kubenswrapper[5002]: I1209 12:23:29.004951 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/effd7072-b70f-4208-a804-82ed8c1fca04-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-frp4b\" (UID: \"effd7072-b70f-4208-a804-82ed8c1fca04\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-frp4b" Dec 09 12:23:29 crc kubenswrapper[5002]: I1209 12:23:29.005166 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd9ml\" (UniqueName: \"kubernetes.io/projected/effd7072-b70f-4208-a804-82ed8c1fca04-kube-api-access-gd9ml\") pod \"neutron-sriov-openstack-openstack-cell1-frp4b\" (UID: \"effd7072-b70f-4208-a804-82ed8c1fca04\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-frp4b" Dec 09 12:23:29 crc kubenswrapper[5002]: I1209 12:23:29.005275 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/effd7072-b70f-4208-a804-82ed8c1fca04-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-frp4b\" (UID: \"effd7072-b70f-4208-a804-82ed8c1fca04\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-frp4b" Dec 09 12:23:29 crc kubenswrapper[5002]: I1209 12:23:29.005378 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/effd7072-b70f-4208-a804-82ed8c1fca04-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-frp4b\" (UID: \"effd7072-b70f-4208-a804-82ed8c1fca04\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-frp4b" Dec 09 12:23:29 crc kubenswrapper[5002]: I1209 12:23:29.005489 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/effd7072-b70f-4208-a804-82ed8c1fca04-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-frp4b\" (UID: \"effd7072-b70f-4208-a804-82ed8c1fca04\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-frp4b" Dec 09 12:23:29 crc kubenswrapper[5002]: I1209 12:23:29.107564 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/effd7072-b70f-4208-a804-82ed8c1fca04-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-frp4b\" (UID: \"effd7072-b70f-4208-a804-82ed8c1fca04\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-frp4b" Dec 09 12:23:29 crc kubenswrapper[5002]: I1209 12:23:29.107772 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd9ml\" (UniqueName: \"kubernetes.io/projected/effd7072-b70f-4208-a804-82ed8c1fca04-kube-api-access-gd9ml\") pod \"neutron-sriov-openstack-openstack-cell1-frp4b\" (UID: \"effd7072-b70f-4208-a804-82ed8c1fca04\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-frp4b" Dec 09 12:23:29 crc kubenswrapper[5002]: I1209 12:23:29.107886 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/effd7072-b70f-4208-a804-82ed8c1fca04-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-frp4b\" (UID: \"effd7072-b70f-4208-a804-82ed8c1fca04\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-frp4b" Dec 09 12:23:29 crc kubenswrapper[5002]: I1209 12:23:29.108022 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/effd7072-b70f-4208-a804-82ed8c1fca04-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-frp4b\" (UID: \"effd7072-b70f-4208-a804-82ed8c1fca04\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-frp4b" Dec 09 12:23:29 crc kubenswrapper[5002]: I1209 12:23:29.108131 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/effd7072-b70f-4208-a804-82ed8c1fca04-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-frp4b\" (UID: \"effd7072-b70f-4208-a804-82ed8c1fca04\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-frp4b" Dec 09 12:23:29 crc kubenswrapper[5002]: I1209 12:23:29.109519 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/effd7072-b70f-4208-a804-82ed8c1fca04-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-frp4b\" (UID: \"effd7072-b70f-4208-a804-82ed8c1fca04\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-frp4b" Dec 09 12:23:29 crc kubenswrapper[5002]: I1209 12:23:29.111781 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/effd7072-b70f-4208-a804-82ed8c1fca04-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-frp4b\" (UID: \"effd7072-b70f-4208-a804-82ed8c1fca04\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-frp4b" Dec 09 12:23:29 crc kubenswrapper[5002]: I1209 12:23:29.113191 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/effd7072-b70f-4208-a804-82ed8c1fca04-neutron-sriov-combined-ca-bundle\") pod 
\"neutron-sriov-openstack-openstack-cell1-frp4b\" (UID: \"effd7072-b70f-4208-a804-82ed8c1fca04\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-frp4b" Dec 09 12:23:29 crc kubenswrapper[5002]: I1209 12:23:29.115260 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/effd7072-b70f-4208-a804-82ed8c1fca04-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-frp4b\" (UID: \"effd7072-b70f-4208-a804-82ed8c1fca04\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-frp4b" Dec 09 12:23:29 crc kubenswrapper[5002]: I1209 12:23:29.119614 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/effd7072-b70f-4208-a804-82ed8c1fca04-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-frp4b\" (UID: \"effd7072-b70f-4208-a804-82ed8c1fca04\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-frp4b" Dec 09 12:23:29 crc kubenswrapper[5002]: I1209 12:23:29.122023 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/effd7072-b70f-4208-a804-82ed8c1fca04-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-frp4b\" (UID: \"effd7072-b70f-4208-a804-82ed8c1fca04\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-frp4b" Dec 09 12:23:29 crc kubenswrapper[5002]: I1209 12:23:29.134179 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd9ml\" (UniqueName: \"kubernetes.io/projected/effd7072-b70f-4208-a804-82ed8c1fca04-kube-api-access-gd9ml\") pod \"neutron-sriov-openstack-openstack-cell1-frp4b\" (UID: \"effd7072-b70f-4208-a804-82ed8c1fca04\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-frp4b" Dec 09 12:23:29 crc kubenswrapper[5002]: I1209 12:23:29.225747 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-frp4b" Dec 09 12:23:29 crc kubenswrapper[5002]: I1209 12:23:29.838536 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-frp4b"] Dec 09 12:23:30 crc kubenswrapper[5002]: I1209 12:23:30.787733 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-frp4b" event={"ID":"effd7072-b70f-4208-a804-82ed8c1fca04","Type":"ContainerStarted","Data":"59b611fb7b36477cbd0e5a928c4fa3885c6aa252e0ef972212fb9833cda77f7e"} Dec 09 12:23:31 crc kubenswrapper[5002]: I1209 12:23:31.828539 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-frp4b" event={"ID":"effd7072-b70f-4208-a804-82ed8c1fca04","Type":"ContainerStarted","Data":"3756e85e152246557bcb41354a04b2063fe3375019d35cdde339a55a98f01b46"} Dec 09 12:23:31 crc kubenswrapper[5002]: I1209 12:23:31.860142 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-sriov-openstack-openstack-cell1-frp4b" podStartSLOduration=2.883548416 podStartE2EDuration="3.860117592s" podCreationTimestamp="2025-12-09 12:23:28 +0000 UTC" firstStartedPulling="2025-12-09 12:23:29.851605883 +0000 UTC m=+8542.243656964" lastFinishedPulling="2025-12-09 12:23:30.828175039 +0000 UTC m=+8543.220226140" observedRunningTime="2025-12-09 12:23:31.850177995 +0000 UTC m=+8544.242229096" watchObservedRunningTime="2025-12-09 12:23:31.860117592 +0000 UTC m=+8544.252168693" Dec 09 12:24:13 crc kubenswrapper[5002]: I1209 12:24:13.083248 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-d9xbg"] Dec 09 12:24:13 crc kubenswrapper[5002]: I1209 12:24:13.087737 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d9xbg" Dec 09 12:24:13 crc kubenswrapper[5002]: I1209 12:24:13.108755 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d9xbg"] Dec 09 12:24:13 crc kubenswrapper[5002]: I1209 12:24:13.183477 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6ba5490-be24-419e-8d3a-c80cb70ead3a-catalog-content\") pod \"redhat-marketplace-d9xbg\" (UID: \"f6ba5490-be24-419e-8d3a-c80cb70ead3a\") " pod="openshift-marketplace/redhat-marketplace-d9xbg" Dec 09 12:24:13 crc kubenswrapper[5002]: I1209 12:24:13.183603 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx6fk\" (UniqueName: \"kubernetes.io/projected/f6ba5490-be24-419e-8d3a-c80cb70ead3a-kube-api-access-hx6fk\") pod \"redhat-marketplace-d9xbg\" (UID: \"f6ba5490-be24-419e-8d3a-c80cb70ead3a\") " pod="openshift-marketplace/redhat-marketplace-d9xbg" Dec 09 12:24:13 crc kubenswrapper[5002]: I1209 12:24:13.184261 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6ba5490-be24-419e-8d3a-c80cb70ead3a-utilities\") pod \"redhat-marketplace-d9xbg\" (UID: \"f6ba5490-be24-419e-8d3a-c80cb70ead3a\") " pod="openshift-marketplace/redhat-marketplace-d9xbg" Dec 09 12:24:13 crc kubenswrapper[5002]: I1209 12:24:13.286356 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6ba5490-be24-419e-8d3a-c80cb70ead3a-catalog-content\") pod \"redhat-marketplace-d9xbg\" (UID: \"f6ba5490-be24-419e-8d3a-c80cb70ead3a\") " pod="openshift-marketplace/redhat-marketplace-d9xbg" Dec 09 12:24:13 crc kubenswrapper[5002]: I1209 12:24:13.286749 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx6fk\" (UniqueName: \"kubernetes.io/projected/f6ba5490-be24-419e-8d3a-c80cb70ead3a-kube-api-access-hx6fk\") pod \"redhat-marketplace-d9xbg\" (UID: \"f6ba5490-be24-419e-8d3a-c80cb70ead3a\") " pod="openshift-marketplace/redhat-marketplace-d9xbg" Dec 09 12:24:13 crc kubenswrapper[5002]: I1209 12:24:13.287020 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6ba5490-be24-419e-8d3a-c80cb70ead3a-catalog-content\") pod \"redhat-marketplace-d9xbg\" (UID: \"f6ba5490-be24-419e-8d3a-c80cb70ead3a\") " pod="openshift-marketplace/redhat-marketplace-d9xbg" Dec 09 12:24:13 crc kubenswrapper[5002]: I1209 12:24:13.287070 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6ba5490-be24-419e-8d3a-c80cb70ead3a-utilities\") pod \"redhat-marketplace-d9xbg\" (UID: \"f6ba5490-be24-419e-8d3a-c80cb70ead3a\") " pod="openshift-marketplace/redhat-marketplace-d9xbg" Dec 09 12:24:13 crc kubenswrapper[5002]: I1209 12:24:13.287348 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6ba5490-be24-419e-8d3a-c80cb70ead3a-utilities\") pod \"redhat-marketplace-d9xbg\" (UID: \"f6ba5490-be24-419e-8d3a-c80cb70ead3a\") " pod="openshift-marketplace/redhat-marketplace-d9xbg" Dec 09 12:24:13 crc kubenswrapper[5002]: I1209 12:24:13.306629 5002 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-hx6fk\" (UniqueName: \"kubernetes.io/projected/f6ba5490-be24-419e-8d3a-c80cb70ead3a-kube-api-access-hx6fk\") pod \"redhat-marketplace-d9xbg\" (UID: \"f6ba5490-be24-419e-8d3a-c80cb70ead3a\") " pod="openshift-marketplace/redhat-marketplace-d9xbg" Dec 09 12:24:13 crc kubenswrapper[5002]: I1209 12:24:13.420840 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d9xbg" Dec 09 12:24:13 crc kubenswrapper[5002]: I1209 12:24:13.918763 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d9xbg"] Dec 09 12:24:13 crc kubenswrapper[5002]: W1209 12:24:13.923301 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6ba5490_be24_419e_8d3a_c80cb70ead3a.slice/crio-45ff73a8b406f840b706519131f9d299ecbf3f62a57fa00ccd72f5a4adf71af0 WatchSource:0}: Error finding container 45ff73a8b406f840b706519131f9d299ecbf3f62a57fa00ccd72f5a4adf71af0: Status 404 returned error can't find the container with id 45ff73a8b406f840b706519131f9d299ecbf3f62a57fa00ccd72f5a4adf71af0 Dec 09 12:24:14 crc kubenswrapper[5002]: I1209 12:24:14.288223 5002 generic.go:334] "Generic (PLEG): container finished" podID="f6ba5490-be24-419e-8d3a-c80cb70ead3a" containerID="1f04e53311017b9d87920375aa0d12762534983f12eea35957ff4e9872f701b6" exitCode=0 Dec 09 12:24:14 crc kubenswrapper[5002]: I1209 12:24:14.288274 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d9xbg" event={"ID":"f6ba5490-be24-419e-8d3a-c80cb70ead3a","Type":"ContainerDied","Data":"1f04e53311017b9d87920375aa0d12762534983f12eea35957ff4e9872f701b6"} Dec 09 12:24:14 crc kubenswrapper[5002]: I1209 12:24:14.288571 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d9xbg" event={"ID":"f6ba5490-be24-419e-8d3a-c80cb70ead3a","Type":"ContainerStarted","Data":"45ff73a8b406f840b706519131f9d299ecbf3f62a57fa00ccd72f5a4adf71af0"} Dec 09 12:24:15 crc kubenswrapper[5002]: I1209 12:24:15.303997 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d9xbg" event={"ID":"f6ba5490-be24-419e-8d3a-c80cb70ead3a","Type":"ContainerStarted","Data":"f1ad2d0288b13514be06b13b8abd5ab2209cd8ac01930d862919bb12b8c93f8b"} Dec 09 12:24:16 crc kubenswrapper[5002]: I1209 12:24:16.314322 5002 generic.go:334] "Generic (PLEG): container finished" podID="f6ba5490-be24-419e-8d3a-c80cb70ead3a" containerID="f1ad2d0288b13514be06b13b8abd5ab2209cd8ac01930d862919bb12b8c93f8b" exitCode=0 Dec 09 12:24:16 crc kubenswrapper[5002]: I1209 12:24:16.314402 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d9xbg" event={"ID":"f6ba5490-be24-419e-8d3a-c80cb70ead3a","Type":"ContainerDied","Data":"f1ad2d0288b13514be06b13b8abd5ab2209cd8ac01930d862919bb12b8c93f8b"} Dec 09 12:24:17 crc kubenswrapper[5002]: I1209 12:24:17.334873 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d9xbg" event={"ID":"f6ba5490-be24-419e-8d3a-c80cb70ead3a","Type":"ContainerStarted","Data":"4839e7b6f99ca34b58d1b1158a3fe5ff01d635637bb217a41cb39f03d6407eec"} Dec 09 12:24:23 crc kubenswrapper[5002]: I1209 12:24:23.421772 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-d9xbg" Dec 09 12:24:23 crc kubenswrapper[5002]: I1209 
12:24:23.422550 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-d9xbg" Dec 09 12:24:23 crc kubenswrapper[5002]: I1209 12:24:23.470637 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-d9xbg" Dec 09 12:24:23 crc kubenswrapper[5002]: I1209 12:24:23.500123 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-d9xbg" podStartSLOduration=8.080230061 podStartE2EDuration="10.500098537s" podCreationTimestamp="2025-12-09 12:24:13 +0000 UTC" firstStartedPulling="2025-12-09 12:24:14.290164175 +0000 UTC m=+8586.682215256" lastFinishedPulling="2025-12-09 12:24:16.710032651 +0000 UTC m=+8589.102083732" observedRunningTime="2025-12-09 12:24:17.36115995 +0000 UTC m=+8589.753211051" watchObservedRunningTime="2025-12-09 12:24:23.500098537 +0000 UTC m=+8595.892149658" Dec 09 12:24:24 crc kubenswrapper[5002]: I1209 12:24:24.455591 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-d9xbg" Dec 09 12:24:24 crc kubenswrapper[5002]: I1209 12:24:24.665285 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d9xbg"] Dec 09 12:24:26 crc kubenswrapper[5002]: I1209 12:24:26.423198 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-d9xbg" podUID="f6ba5490-be24-419e-8d3a-c80cb70ead3a" containerName="registry-server" containerID="cri-o://4839e7b6f99ca34b58d1b1158a3fe5ff01d635637bb217a41cb39f03d6407eec" gracePeriod=2 Dec 09 12:24:26 crc kubenswrapper[5002]: I1209 12:24:26.907464 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d9xbg" Dec 09 12:24:26 crc kubenswrapper[5002]: I1209 12:24:26.989290 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6ba5490-be24-419e-8d3a-c80cb70ead3a-catalog-content\") pod \"f6ba5490-be24-419e-8d3a-c80cb70ead3a\" (UID: \"f6ba5490-be24-419e-8d3a-c80cb70ead3a\") " Dec 09 12:24:26 crc kubenswrapper[5002]: I1209 12:24:26.989384 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hx6fk\" (UniqueName: \"kubernetes.io/projected/f6ba5490-be24-419e-8d3a-c80cb70ead3a-kube-api-access-hx6fk\") pod \"f6ba5490-be24-419e-8d3a-c80cb70ead3a\" (UID: \"f6ba5490-be24-419e-8d3a-c80cb70ead3a\") " Dec 09 12:24:26 crc kubenswrapper[5002]: I1209 12:24:26.989461 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6ba5490-be24-419e-8d3a-c80cb70ead3a-utilities\") pod \"f6ba5490-be24-419e-8d3a-c80cb70ead3a\" (UID: \"f6ba5490-be24-419e-8d3a-c80cb70ead3a\") " Dec 09 12:24:26 crc kubenswrapper[5002]: I1209 12:24:26.991044 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6ba5490-be24-419e-8d3a-c80cb70ead3a-utilities" (OuterVolumeSpecName: "utilities") pod "f6ba5490-be24-419e-8d3a-c80cb70ead3a" (UID: "f6ba5490-be24-419e-8d3a-c80cb70ead3a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:24:27 crc kubenswrapper[5002]: I1209 12:24:27.000052 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6ba5490-be24-419e-8d3a-c80cb70ead3a-kube-api-access-hx6fk" (OuterVolumeSpecName: "kube-api-access-hx6fk") pod "f6ba5490-be24-419e-8d3a-c80cb70ead3a" (UID: "f6ba5490-be24-419e-8d3a-c80cb70ead3a"). InnerVolumeSpecName "kube-api-access-hx6fk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:24:27 crc kubenswrapper[5002]: I1209 12:24:27.011006 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6ba5490-be24-419e-8d3a-c80cb70ead3a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f6ba5490-be24-419e-8d3a-c80cb70ead3a" (UID: "f6ba5490-be24-419e-8d3a-c80cb70ead3a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:24:27 crc kubenswrapper[5002]: I1209 12:24:27.091914 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6ba5490-be24-419e-8d3a-c80cb70ead3a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 12:24:27 crc kubenswrapper[5002]: I1209 12:24:27.091951 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hx6fk\" (UniqueName: \"kubernetes.io/projected/f6ba5490-be24-419e-8d3a-c80cb70ead3a-kube-api-access-hx6fk\") on node \"crc\" DevicePath \"\"" Dec 09 12:24:27 crc kubenswrapper[5002]: I1209 12:24:27.091961 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6ba5490-be24-419e-8d3a-c80cb70ead3a-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 12:24:27 crc kubenswrapper[5002]: I1209 12:24:27.433774 5002 generic.go:334] "Generic (PLEG): container finished" podID="f6ba5490-be24-419e-8d3a-c80cb70ead3a" containerID="4839e7b6f99ca34b58d1b1158a3fe5ff01d635637bb217a41cb39f03d6407eec" exitCode=0 Dec 09 12:24:27 crc kubenswrapper[5002]: I1209 12:24:27.433839 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d9xbg" event={"ID":"f6ba5490-be24-419e-8d3a-c80cb70ead3a","Type":"ContainerDied","Data":"4839e7b6f99ca34b58d1b1158a3fe5ff01d635637bb217a41cb39f03d6407eec"} Dec 09 12:24:27 crc kubenswrapper[5002]: I1209 12:24:27.434110 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d9xbg" event={"ID":"f6ba5490-be24-419e-8d3a-c80cb70ead3a","Type":"ContainerDied","Data":"45ff73a8b406f840b706519131f9d299ecbf3f62a57fa00ccd72f5a4adf71af0"} Dec 09 12:24:27 crc kubenswrapper[5002]: I1209 12:24:27.434132 5002 scope.go:117] "RemoveContainer" containerID="4839e7b6f99ca34b58d1b1158a3fe5ff01d635637bb217a41cb39f03d6407eec" Dec 09 12:24:27 crc kubenswrapper[5002]: I1209 12:24:27.433878 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d9xbg" Dec 09 12:24:27 crc kubenswrapper[5002]: I1209 12:24:27.474985 5002 scope.go:117] "RemoveContainer" containerID="f1ad2d0288b13514be06b13b8abd5ab2209cd8ac01930d862919bb12b8c93f8b" Dec 09 12:24:27 crc kubenswrapper[5002]: I1209 12:24:27.485194 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d9xbg"] Dec 09 12:24:27 crc kubenswrapper[5002]: I1209 12:24:27.500910 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-d9xbg"] Dec 09 12:24:27 crc kubenswrapper[5002]: I1209 12:24:27.508060 5002 scope.go:117] "RemoveContainer" containerID="1f04e53311017b9d87920375aa0d12762534983f12eea35957ff4e9872f701b6" Dec 09 12:24:27 crc kubenswrapper[5002]: I1209 12:24:27.567793 5002 scope.go:117] "RemoveContainer" containerID="4839e7b6f99ca34b58d1b1158a3fe5ff01d635637bb217a41cb39f03d6407eec" Dec 09 12:24:27 crc kubenswrapper[5002]: E1209 12:24:27.568228 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4839e7b6f99ca34b58d1b1158a3fe5ff01d635637bb217a41cb39f03d6407eec\": container with ID starting with 4839e7b6f99ca34b58d1b1158a3fe5ff01d635637bb217a41cb39f03d6407eec not found: ID does not exist" containerID="4839e7b6f99ca34b58d1b1158a3fe5ff01d635637bb217a41cb39f03d6407eec" Dec 09 12:24:27 crc kubenswrapper[5002]: I1209 12:24:27.568347 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4839e7b6f99ca34b58d1b1158a3fe5ff01d635637bb217a41cb39f03d6407eec"} err="failed to get container status \"4839e7b6f99ca34b58d1b1158a3fe5ff01d635637bb217a41cb39f03d6407eec\": rpc error: code = NotFound desc = could not find container \"4839e7b6f99ca34b58d1b1158a3fe5ff01d635637bb217a41cb39f03d6407eec\": container with ID starting with 4839e7b6f99ca34b58d1b1158a3fe5ff01d635637bb217a41cb39f03d6407eec not found: ID does not exist" Dec 09 12:24:27 crc kubenswrapper[5002]: I1209 12:24:27.568454 5002 scope.go:117] "RemoveContainer" containerID="f1ad2d0288b13514be06b13b8abd5ab2209cd8ac01930d862919bb12b8c93f8b" Dec 09 12:24:27 crc kubenswrapper[5002]: E1209 12:24:27.569089 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1ad2d0288b13514be06b13b8abd5ab2209cd8ac01930d862919bb12b8c93f8b\": container with ID starting with f1ad2d0288b13514be06b13b8abd5ab2209cd8ac01930d862919bb12b8c93f8b not found: ID does not exist" containerID="f1ad2d0288b13514be06b13b8abd5ab2209cd8ac01930d862919bb12b8c93f8b" Dec 09 12:24:27 crc kubenswrapper[5002]: I1209 12:24:27.569133 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1ad2d0288b13514be06b13b8abd5ab2209cd8ac01930d862919bb12b8c93f8b"} err="failed to get container status \"f1ad2d0288b13514be06b13b8abd5ab2209cd8ac01930d862919bb12b8c93f8b\": rpc error: code = NotFound desc = could not find container \"f1ad2d0288b13514be06b13b8abd5ab2209cd8ac01930d862919bb12b8c93f8b\": container with ID starting with f1ad2d0288b13514be06b13b8abd5ab2209cd8ac01930d862919bb12b8c93f8b not found: ID does not exist" Dec 09 12:24:27 crc kubenswrapper[5002]: I1209 12:24:27.569166 5002 scope.go:117] "RemoveContainer" containerID="1f04e53311017b9d87920375aa0d12762534983f12eea35957ff4e9872f701b6" Dec 09 12:24:27 crc kubenswrapper[5002]: E1209 12:24:27.570046 5002 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"1f04e53311017b9d87920375aa0d12762534983f12eea35957ff4e9872f701b6\": container with ID starting with 1f04e53311017b9d87920375aa0d12762534983f12eea35957ff4e9872f701b6 not found: ID does not exist" containerID="1f04e53311017b9d87920375aa0d12762534983f12eea35957ff4e9872f701b6" Dec 09 12:24:27 crc kubenswrapper[5002]: I1209 12:24:27.570162 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f04e53311017b9d87920375aa0d12762534983f12eea35957ff4e9872f701b6"} err="failed to get container status \"1f04e53311017b9d87920375aa0d12762534983f12eea35957ff4e9872f701b6\": rpc error: code = NotFound desc = could not find container \"1f04e53311017b9d87920375aa0d12762534983f12eea35957ff4e9872f701b6\": container with ID starting with 1f04e53311017b9d87920375aa0d12762534983f12eea35957ff4e9872f701b6 not found: ID does not exist" Dec 09 12:24:28 crc kubenswrapper[5002]: I1209 12:24:28.082993 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6ba5490-be24-419e-8d3a-c80cb70ead3a" path="/var/lib/kubelet/pods/f6ba5490-be24-419e-8d3a-c80cb70ead3a/volumes" Dec 09 12:24:37 crc kubenswrapper[5002]: I1209 12:24:37.964279 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:24:37 crc kubenswrapper[5002]: I1209 12:24:37.964906 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:25:07 crc kubenswrapper[5002]: I1209 12:25:07.965117 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:25:07 crc kubenswrapper[5002]: I1209 12:25:07.965653 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:25:19 crc kubenswrapper[5002]: I1209 12:25:19.982923 5002 generic.go:334] "Generic (PLEG): container finished" podID="effd7072-b70f-4208-a804-82ed8c1fca04" containerID="3756e85e152246557bcb41354a04b2063fe3375019d35cdde339a55a98f01b46" exitCode=0 Dec 09 12:25:19 crc kubenswrapper[5002]: I1209 12:25:19.983016 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-frp4b" event={"ID":"effd7072-b70f-4208-a804-82ed8c1fca04","Type":"ContainerDied","Data":"3756e85e152246557bcb41354a04b2063fe3375019d35cdde339a55a98f01b46"} Dec 09 12:25:21 crc kubenswrapper[5002]: I1209 12:25:21.558503 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-frp4b" Dec 09 12:25:21 crc kubenswrapper[5002]: I1209 12:25:21.702950 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/effd7072-b70f-4208-a804-82ed8c1fca04-inventory\") pod \"effd7072-b70f-4208-a804-82ed8c1fca04\" (UID: \"effd7072-b70f-4208-a804-82ed8c1fca04\") " Dec 09 12:25:21 crc kubenswrapper[5002]: I1209 12:25:21.703342 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/effd7072-b70f-4208-a804-82ed8c1fca04-neutron-sriov-agent-neutron-config-0\") pod \"effd7072-b70f-4208-a804-82ed8c1fca04\" (UID: \"effd7072-b70f-4208-a804-82ed8c1fca04\") " Dec 09 12:25:21 crc kubenswrapper[5002]: I1209 12:25:21.703390 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/effd7072-b70f-4208-a804-82ed8c1fca04-ceph\") pod \"effd7072-b70f-4208-a804-82ed8c1fca04\" (UID: \"effd7072-b70f-4208-a804-82ed8c1fca04\") " Dec 09 12:25:21 crc kubenswrapper[5002]: I1209 12:25:21.703470 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gd9ml\" (UniqueName: \"kubernetes.io/projected/effd7072-b70f-4208-a804-82ed8c1fca04-kube-api-access-gd9ml\") pod \"effd7072-b70f-4208-a804-82ed8c1fca04\" (UID: \"effd7072-b70f-4208-a804-82ed8c1fca04\") " Dec 09 12:25:21 crc kubenswrapper[5002]: I1209 12:25:21.703548 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/effd7072-b70f-4208-a804-82ed8c1fca04-neutron-sriov-combined-ca-bundle\") pod \"effd7072-b70f-4208-a804-82ed8c1fca04\" (UID: \"effd7072-b70f-4208-a804-82ed8c1fca04\") " Dec 09 12:25:21 crc kubenswrapper[5002]: I1209 12:25:21.703600 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/effd7072-b70f-4208-a804-82ed8c1fca04-ssh-key\") pod \"effd7072-b70f-4208-a804-82ed8c1fca04\" (UID: \"effd7072-b70f-4208-a804-82ed8c1fca04\") " Dec 09 12:25:21 crc kubenswrapper[5002]: I1209 12:25:21.729511 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/effd7072-b70f-4208-a804-82ed8c1fca04-ceph" (OuterVolumeSpecName: "ceph") pod "effd7072-b70f-4208-a804-82ed8c1fca04" (UID: "effd7072-b70f-4208-a804-82ed8c1fca04"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:25:21 crc kubenswrapper[5002]: I1209 12:25:21.732069 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/effd7072-b70f-4208-a804-82ed8c1fca04-kube-api-access-gd9ml" (OuterVolumeSpecName: "kube-api-access-gd9ml") pod "effd7072-b70f-4208-a804-82ed8c1fca04" (UID: "effd7072-b70f-4208-a804-82ed8c1fca04"). InnerVolumeSpecName "kube-api-access-gd9ml". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:25:21 crc kubenswrapper[5002]: I1209 12:25:21.751051 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/effd7072-b70f-4208-a804-82ed8c1fca04-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "effd7072-b70f-4208-a804-82ed8c1fca04" (UID: "effd7072-b70f-4208-a804-82ed8c1fca04"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:25:21 crc kubenswrapper[5002]: I1209 12:25:21.790120 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/effd7072-b70f-4208-a804-82ed8c1fca04-inventory" (OuterVolumeSpecName: "inventory") pod "effd7072-b70f-4208-a804-82ed8c1fca04" (UID: "effd7072-b70f-4208-a804-82ed8c1fca04"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:25:21 crc kubenswrapper[5002]: I1209 12:25:21.810849 5002 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/effd7072-b70f-4208-a804-82ed8c1fca04-inventory\") on node \"crc\" DevicePath \"\"" Dec 09 12:25:21 crc kubenswrapper[5002]: I1209 12:25:21.810883 5002 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/effd7072-b70f-4208-a804-82ed8c1fca04-ceph\") on node \"crc\" DevicePath \"\"" Dec 09 12:25:21 crc kubenswrapper[5002]: I1209 12:25:21.810897 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gd9ml\" (UniqueName: \"kubernetes.io/projected/effd7072-b70f-4208-a804-82ed8c1fca04-kube-api-access-gd9ml\") on node \"crc\" DevicePath \"\"" Dec 09 12:25:21 crc kubenswrapper[5002]: I1209 12:25:21.810911 5002 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/effd7072-b70f-4208-a804-82ed8c1fca04-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 12:25:21 crc kubenswrapper[5002]: I1209 12:25:21.811758 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/effd7072-b70f-4208-a804-82ed8c1fca04-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "effd7072-b70f-4208-a804-82ed8c1fca04" (UID: "effd7072-b70f-4208-a804-82ed8c1fca04"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:25:21 crc kubenswrapper[5002]: I1209 12:25:21.815082 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/effd7072-b70f-4208-a804-82ed8c1fca04-neutron-sriov-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-sriov-agent-neutron-config-0") pod "effd7072-b70f-4208-a804-82ed8c1fca04" (UID: "effd7072-b70f-4208-a804-82ed8c1fca04"). InnerVolumeSpecName "neutron-sriov-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:25:21 crc kubenswrapper[5002]: I1209 12:25:21.913617 5002 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/effd7072-b70f-4208-a804-82ed8c1fca04-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 09 12:25:21 crc kubenswrapper[5002]: I1209 12:25:21.913658 5002 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/effd7072-b70f-4208-a804-82ed8c1fca04-neutron-sriov-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 09 12:25:22 crc kubenswrapper[5002]: I1209 12:25:22.008937 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-frp4b" event={"ID":"effd7072-b70f-4208-a804-82ed8c1fca04","Type":"ContainerDied","Data":"59b611fb7b36477cbd0e5a928c4fa3885c6aa252e0ef972212fb9833cda77f7e"} Dec 09 12:25:22 crc kubenswrapper[5002]: I1209 12:25:22.008974 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59b611fb7b36477cbd0e5a928c4fa3885c6aa252e0ef972212fb9833cda77f7e" Dec 09 12:25:22 crc kubenswrapper[5002]: I1209 12:25:22.009013 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-frp4b" Dec 09 12:25:22 crc kubenswrapper[5002]: I1209 12:25:22.195726 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-w6qc7"] Dec 09 12:25:22 crc kubenswrapper[5002]: E1209 12:25:22.197174 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6ba5490-be24-419e-8d3a-c80cb70ead3a" containerName="registry-server" Dec 09 12:25:22 crc kubenswrapper[5002]: I1209 12:25:22.197217 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6ba5490-be24-419e-8d3a-c80cb70ead3a" containerName="registry-server" Dec 09 12:25:22 crc kubenswrapper[5002]: E1209 12:25:22.197242 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="effd7072-b70f-4208-a804-82ed8c1fca04" containerName="neutron-sriov-openstack-openstack-cell1" Dec 09 12:25:22 crc kubenswrapper[5002]: I1209 12:25:22.197250 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="effd7072-b70f-4208-a804-82ed8c1fca04" containerName="neutron-sriov-openstack-openstack-cell1" Dec 09 12:25:22 crc kubenswrapper[5002]: E1209 12:25:22.197271 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6ba5490-be24-419e-8d3a-c80cb70ead3a" containerName="extract-content" Dec 09 12:25:22 crc kubenswrapper[5002]: I1209 12:25:22.197279 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6ba5490-be24-419e-8d3a-c80cb70ead3a" containerName="extract-content" Dec 09 12:25:22 crc kubenswrapper[5002]: E1209 12:25:22.197329 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6ba5490-be24-419e-8d3a-c80cb70ead3a" containerName="extract-utilities" Dec 09 12:25:22 crc kubenswrapper[5002]: I1209 12:25:22.197337 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6ba5490-be24-419e-8d3a-c80cb70ead3a" containerName="extract-utilities" Dec 09 12:25:22 crc kubenswrapper[5002]: I1209 12:25:22.197619 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6ba5490-be24-419e-8d3a-c80cb70ead3a" containerName="registry-server" Dec 09 12:25:22 crc kubenswrapper[5002]: I1209 12:25:22.197645 5002 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="effd7072-b70f-4208-a804-82ed8c1fca04" containerName="neutron-sriov-openstack-openstack-cell1" Dec 09 12:25:22 crc kubenswrapper[5002]: I1209 12:25:22.198809 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-w6qc7" Dec 09 12:25:22 crc kubenswrapper[5002]: I1209 12:25:22.202345 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 09 12:25:22 crc kubenswrapper[5002]: I1209 12:25:22.202636 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ngftr" Dec 09 12:25:22 crc kubenswrapper[5002]: I1209 12:25:22.202902 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 09 12:25:22 crc kubenswrapper[5002]: I1209 12:25:22.203373 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 09 12:25:22 crc kubenswrapper[5002]: I1209 12:25:22.203595 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-dhcp-agent-neutron-config" Dec 09 12:25:22 crc kubenswrapper[5002]: I1209 12:25:22.211510 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-w6qc7"] Dec 09 12:25:22 crc kubenswrapper[5002]: I1209 12:25:22.321289 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb33c5e1-0410-4e89-951b-b7443e522f5f-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-w6qc7\" (UID: \"cb33c5e1-0410-4e89-951b-b7443e522f5f\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-w6qc7" Dec 09 12:25:22 crc kubenswrapper[5002]: I1209 12:25:22.321389 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrvsn\" (UniqueName: \"kubernetes.io/projected/cb33c5e1-0410-4e89-951b-b7443e522f5f-kube-api-access-mrvsn\") pod \"neutron-dhcp-openstack-openstack-cell1-w6qc7\" (UID: \"cb33c5e1-0410-4e89-951b-b7443e522f5f\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-w6qc7" Dec 09 12:25:22 crc kubenswrapper[5002]: I1209 12:25:22.322683 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cb33c5e1-0410-4e89-951b-b7443e522f5f-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-w6qc7\" (UID: \"cb33c5e1-0410-4e89-951b-b7443e522f5f\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-w6qc7" Dec 09 12:25:22 crc kubenswrapper[5002]: I1209 12:25:22.322781 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cb33c5e1-0410-4e89-951b-b7443e522f5f-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-w6qc7\" (UID: \"cb33c5e1-0410-4e89-951b-b7443e522f5f\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-w6qc7" Dec 09 12:25:22 crc kubenswrapper[5002]: I1209 12:25:22.325139 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cb33c5e1-0410-4e89-951b-b7443e522f5f-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-w6qc7\" (UID: \"cb33c5e1-0410-4e89-951b-b7443e522f5f\") " 
pod="openstack/neutron-dhcp-openstack-openstack-cell1-w6qc7" Dec 09 12:25:22 crc kubenswrapper[5002]: I1209 12:25:22.325184 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cb33c5e1-0410-4e89-951b-b7443e522f5f-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-w6qc7\" (UID: \"cb33c5e1-0410-4e89-951b-b7443e522f5f\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-w6qc7" Dec 09 12:25:22 crc kubenswrapper[5002]: I1209 12:25:22.426730 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cb33c5e1-0410-4e89-951b-b7443e522f5f-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-w6qc7\" (UID: \"cb33c5e1-0410-4e89-951b-b7443e522f5f\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-w6qc7" Dec 09 12:25:22 crc kubenswrapper[5002]: I1209 12:25:22.427188 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cb33c5e1-0410-4e89-951b-b7443e522f5f-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-w6qc7\" (UID: \"cb33c5e1-0410-4e89-951b-b7443e522f5f\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-w6qc7" Dec 09 12:25:22 crc kubenswrapper[5002]: I1209 12:25:22.427225 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cb33c5e1-0410-4e89-951b-b7443e522f5f-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-w6qc7\" (UID: \"cb33c5e1-0410-4e89-951b-b7443e522f5f\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-w6qc7" Dec 09 12:25:22 crc kubenswrapper[5002]: I1209 12:25:22.427264 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb33c5e1-0410-4e89-951b-b7443e522f5f-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-w6qc7\" (UID: \"cb33c5e1-0410-4e89-951b-b7443e522f5f\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-w6qc7" Dec 09 12:25:22 crc kubenswrapper[5002]: I1209 12:25:22.427295 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrvsn\" (UniqueName: \"kubernetes.io/projected/cb33c5e1-0410-4e89-951b-b7443e522f5f-kube-api-access-mrvsn\") pod \"neutron-dhcp-openstack-openstack-cell1-w6qc7\" (UID: \"cb33c5e1-0410-4e89-951b-b7443e522f5f\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-w6qc7" Dec 09 12:25:22 crc kubenswrapper[5002]: I1209 12:25:22.427427 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cb33c5e1-0410-4e89-951b-b7443e522f5f-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-w6qc7\" (UID: \"cb33c5e1-0410-4e89-951b-b7443e522f5f\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-w6qc7" Dec 09 12:25:22 crc kubenswrapper[5002]: I1209 12:25:22.431890 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cb33c5e1-0410-4e89-951b-b7443e522f5f-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-w6qc7\" (UID: \"cb33c5e1-0410-4e89-951b-b7443e522f5f\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-w6qc7" Dec 09 12:25:22 crc kubenswrapper[5002]: I1209 12:25:22.431890 5002 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cb33c5e1-0410-4e89-951b-b7443e522f5f-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-w6qc7\" (UID: \"cb33c5e1-0410-4e89-951b-b7443e522f5f\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-w6qc7" Dec 09 12:25:22 crc kubenswrapper[5002]: I1209 12:25:22.432560 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cb33c5e1-0410-4e89-951b-b7443e522f5f-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-w6qc7\" (UID: \"cb33c5e1-0410-4e89-951b-b7443e522f5f\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-w6qc7" Dec 09 12:25:22 crc kubenswrapper[5002]: I1209 12:25:22.433235 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cb33c5e1-0410-4e89-951b-b7443e522f5f-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-w6qc7\" (UID: \"cb33c5e1-0410-4e89-951b-b7443e522f5f\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-w6qc7" Dec 09 12:25:22 crc kubenswrapper[5002]: I1209 12:25:22.433899 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb33c5e1-0410-4e89-951b-b7443e522f5f-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-w6qc7\" (UID: \"cb33c5e1-0410-4e89-951b-b7443e522f5f\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-w6qc7" Dec 09 12:25:22 crc kubenswrapper[5002]: I1209 12:25:22.448651 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrvsn\" (UniqueName: \"kubernetes.io/projected/cb33c5e1-0410-4e89-951b-b7443e522f5f-kube-api-access-mrvsn\") pod \"neutron-dhcp-openstack-openstack-cell1-w6qc7\" (UID: \"cb33c5e1-0410-4e89-951b-b7443e522f5f\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-w6qc7" Dec 09 12:25:22 crc kubenswrapper[5002]: I1209 12:25:22.528725 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-w6qc7" Dec 09 12:25:23 crc kubenswrapper[5002]: I1209 12:25:23.117073 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-w6qc7"] Dec 09 12:25:24 crc kubenswrapper[5002]: I1209 12:25:24.028857 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-w6qc7" event={"ID":"cb33c5e1-0410-4e89-951b-b7443e522f5f","Type":"ContainerStarted","Data":"1988f720d916dd9971bf10ddba1c66b77ca5b1f771524954b781e70cf1c0ba6a"} Dec 09 12:25:24 crc kubenswrapper[5002]: I1209 12:25:24.029204 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-w6qc7" event={"ID":"cb33c5e1-0410-4e89-951b-b7443e522f5f","Type":"ContainerStarted","Data":"9e07780d6f78cd21a29506ff6c344d9e97f0bb06b7606a227ebaabfd9506ff6b"} Dec 09 12:25:24 crc kubenswrapper[5002]: I1209 12:25:24.048924 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-dhcp-openstack-openstack-cell1-w6qc7" podStartSLOduration=1.578296311 podStartE2EDuration="2.048903329s" podCreationTimestamp="2025-12-09 12:25:22 +0000 UTC" firstStartedPulling="2025-12-09 12:25:23.123677503 +0000 UTC m=+8655.515728584" lastFinishedPulling="2025-12-09 12:25:23.594284521 +0000 UTC m=+8655.986335602" observedRunningTime="2025-12-09 12:25:24.044148882 +0000 UTC m=+8656.436199963" watchObservedRunningTime="2025-12-09 12:25:24.048903329 +0000 UTC m=+8656.440954400" Dec 09 12:25:37 crc kubenswrapper[5002]: I1209 12:25:37.965091 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:25:37 crc kubenswrapper[5002]: I1209 12:25:37.965910 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:25:37 crc kubenswrapper[5002]: I1209 12:25:37.965962 5002 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" Dec 09 12:25:37 crc kubenswrapper[5002]: I1209 12:25:37.966648 5002 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e9381aa30da767b73ba7a9c72912271f443653b1e1e8285560563da3163c3e90"} pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 12:25:37 crc kubenswrapper[5002]: I1209 12:25:37.966701 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" containerID="cri-o://e9381aa30da767b73ba7a9c72912271f443653b1e1e8285560563da3163c3e90" gracePeriod=600 Dec 09 12:25:38 crc kubenswrapper[5002]: I1209 12:25:38.195549 5002 generic.go:334] "Generic (PLEG): container finished" podID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" 
containerID="e9381aa30da767b73ba7a9c72912271f443653b1e1e8285560563da3163c3e90" exitCode=0 Dec 09 12:25:38 crc kubenswrapper[5002]: I1209 12:25:38.195672 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerDied","Data":"e9381aa30da767b73ba7a9c72912271f443653b1e1e8285560563da3163c3e90"} Dec 09 12:25:38 crc kubenswrapper[5002]: I1209 12:25:38.195890 5002 scope.go:117] "RemoveContainer" containerID="7780f1b6ff39bc4223d84da4678f12a7a9ba1a1c0041213b8f77f935907dd6aa" Dec 09 12:25:39 crc kubenswrapper[5002]: I1209 12:25:39.217335 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerStarted","Data":"4b150fb9f983a4e6a1a2f55ee6d78c3489d3daaae10fa0510a80b1084f5abe7f"} Dec 09 12:26:38 crc kubenswrapper[5002]: I1209 12:26:38.803912 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gqhzt"] Dec 09 12:26:38 crc kubenswrapper[5002]: I1209 12:26:38.807372 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gqhzt" Dec 09 12:26:38 crc kubenswrapper[5002]: I1209 12:26:38.816739 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gqhzt"] Dec 09 12:26:38 crc kubenswrapper[5002]: I1209 12:26:38.888658 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4e4c4b9-1818-4f83-87d6-21f219b7ea7e-utilities\") pod \"redhat-operators-gqhzt\" (UID: \"e4e4c4b9-1818-4f83-87d6-21f219b7ea7e\") " pod="openshift-marketplace/redhat-operators-gqhzt" Dec 09 12:26:38 crc kubenswrapper[5002]: I1209 12:26:38.888753 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7tp9\" (UniqueName: \"kubernetes.io/projected/e4e4c4b9-1818-4f83-87d6-21f219b7ea7e-kube-api-access-t7tp9\") pod \"redhat-operators-gqhzt\" (UID: \"e4e4c4b9-1818-4f83-87d6-21f219b7ea7e\") " pod="openshift-marketplace/redhat-operators-gqhzt" Dec 09 12:26:38 crc kubenswrapper[5002]: I1209 12:26:38.888850 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4e4c4b9-1818-4f83-87d6-21f219b7ea7e-catalog-content\") pod \"redhat-operators-gqhzt\" (UID: \"e4e4c4b9-1818-4f83-87d6-21f219b7ea7e\") " pod="openshift-marketplace/redhat-operators-gqhzt" Dec 09 12:26:38 crc kubenswrapper[5002]: I1209 12:26:38.990382 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4e4c4b9-1818-4f83-87d6-21f219b7ea7e-utilities\") pod \"redhat-operators-gqhzt\" (UID: \"e4e4c4b9-1818-4f83-87d6-21f219b7ea7e\") " pod="openshift-marketplace/redhat-operators-gqhzt" Dec 09 12:26:38 crc kubenswrapper[5002]: I1209 12:26:38.990465 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7tp9\" (UniqueName: \"kubernetes.io/projected/e4e4c4b9-1818-4f83-87d6-21f219b7ea7e-kube-api-access-t7tp9\") pod \"redhat-operators-gqhzt\" (UID: \"e4e4c4b9-1818-4f83-87d6-21f219b7ea7e\") " pod="openshift-marketplace/redhat-operators-gqhzt" Dec 09 12:26:38 crc kubenswrapper[5002]: I1209 12:26:38.990522 5002 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4e4c4b9-1818-4f83-87d6-21f219b7ea7e-catalog-content\") pod \"redhat-operators-gqhzt\" (UID: \"e4e4c4b9-1818-4f83-87d6-21f219b7ea7e\") " pod="openshift-marketplace/redhat-operators-gqhzt" Dec 09 12:26:38 crc kubenswrapper[5002]: I1209 12:26:38.991662 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4e4c4b9-1818-4f83-87d6-21f219b7ea7e-utilities\") pod \"redhat-operators-gqhzt\" (UID: \"e4e4c4b9-1818-4f83-87d6-21f219b7ea7e\") " pod="openshift-marketplace/redhat-operators-gqhzt" Dec 09 12:26:38 crc kubenswrapper[5002]: I1209 12:26:38.991686 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4e4c4b9-1818-4f83-87d6-21f219b7ea7e-catalog-content\") pod \"redhat-operators-gqhzt\" (UID: \"e4e4c4b9-1818-4f83-87d6-21f219b7ea7e\") " pod="openshift-marketplace/redhat-operators-gqhzt" Dec 09 12:26:39 crc kubenswrapper[5002]: I1209 12:26:39.019641 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7tp9\" (UniqueName: \"kubernetes.io/projected/e4e4c4b9-1818-4f83-87d6-21f219b7ea7e-kube-api-access-t7tp9\") pod \"redhat-operators-gqhzt\" (UID: \"e4e4c4b9-1818-4f83-87d6-21f219b7ea7e\") " pod="openshift-marketplace/redhat-operators-gqhzt" Dec 09 12:26:39 crc kubenswrapper[5002]: I1209 12:26:39.133208 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gqhzt" Dec 09 12:26:39 crc kubenswrapper[5002]: I1209 12:26:39.726773 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gqhzt"] Dec 09 12:26:39 crc kubenswrapper[5002]: I1209 12:26:39.832581 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqhzt" event={"ID":"e4e4c4b9-1818-4f83-87d6-21f219b7ea7e","Type":"ContainerStarted","Data":"d2bfcdc6048ee4744d8f8c7abe5986c31c04214beeb7cbeb9b747519e9ee78c8"} Dec 09 12:26:40 crc kubenswrapper[5002]: I1209 12:26:40.847467 5002 generic.go:334] "Generic (PLEG): container finished" podID="e4e4c4b9-1818-4f83-87d6-21f219b7ea7e" containerID="89198ed1b709eee55c8bc42495e91bedef52598bc8534e5664bf9d659302e44c" exitCode=0 Dec 09 12:26:40 crc kubenswrapper[5002]: I1209 12:26:40.847604 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqhzt" event={"ID":"e4e4c4b9-1818-4f83-87d6-21f219b7ea7e","Type":"ContainerDied","Data":"89198ed1b709eee55c8bc42495e91bedef52598bc8534e5664bf9d659302e44c"} Dec 09 12:26:40 crc kubenswrapper[5002]: I1209 12:26:40.850200 5002 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 12:26:41 crc kubenswrapper[5002]: I1209 12:26:41.859465 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqhzt" event={"ID":"e4e4c4b9-1818-4f83-87d6-21f219b7ea7e","Type":"ContainerStarted","Data":"618c4fd4fabf4dd6d16d41e8e2867ca0217a843a9a745ec97d673031414c0f97"} Dec 09 12:26:45 crc kubenswrapper[5002]: I1209 12:26:45.907889 5002 generic.go:334] "Generic (PLEG): container finished" podID="e4e4c4b9-1818-4f83-87d6-21f219b7ea7e" containerID="618c4fd4fabf4dd6d16d41e8e2867ca0217a843a9a745ec97d673031414c0f97" exitCode=0 Dec 09 12:26:45 crc kubenswrapper[5002]: I1209 12:26:45.907963 5002 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqhzt" event={"ID":"e4e4c4b9-1818-4f83-87d6-21f219b7ea7e","Type":"ContainerDied","Data":"618c4fd4fabf4dd6d16d41e8e2867ca0217a843a9a745ec97d673031414c0f97"} Dec 09 12:26:47 crc kubenswrapper[5002]: I1209 12:26:47.937319 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqhzt" event={"ID":"e4e4c4b9-1818-4f83-87d6-21f219b7ea7e","Type":"ContainerStarted","Data":"8aa918740c2a4ace9357d41a9a0dd2af3e041d19b2f7640817b7d95215d8e5c3"} Dec 09 12:26:47 crc kubenswrapper[5002]: I1209 12:26:47.967912 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gqhzt" podStartSLOduration=3.270097807 podStartE2EDuration="9.967893674s" podCreationTimestamp="2025-12-09 12:26:38 +0000 UTC" firstStartedPulling="2025-12-09 12:26:40.849890355 +0000 UTC m=+8733.241941436" lastFinishedPulling="2025-12-09 12:26:47.547686222 +0000 UTC m=+8739.939737303" observedRunningTime="2025-12-09 12:26:47.957426033 +0000 UTC m=+8740.349477134" watchObservedRunningTime="2025-12-09 12:26:47.967893674 +0000 UTC m=+8740.359944745" Dec 09 12:26:49 crc kubenswrapper[5002]: I1209 12:26:49.133608 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gqhzt" Dec 09 12:26:49 crc kubenswrapper[5002]: I1209 12:26:49.134089 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gqhzt" Dec 09 12:26:50 crc kubenswrapper[5002]: I1209 12:26:50.187669 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gqhzt" podUID="e4e4c4b9-1818-4f83-87d6-21f219b7ea7e" containerName="registry-server" probeResult="failure" output=< Dec 09 12:26:50 crc kubenswrapper[5002]: timeout: failed to connect service ":50051" within 1s Dec 09 12:26:50 crc kubenswrapper[5002]: > Dec 09 12:26:59 crc kubenswrapper[5002]: I1209 12:26:59.192176 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gqhzt" Dec 09 12:26:59 crc kubenswrapper[5002]: I1209 12:26:59.251088 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gqhzt" Dec 09 12:27:01 crc kubenswrapper[5002]: I1209 12:27:01.187799 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gqhzt"] Dec 09 12:27:01 crc kubenswrapper[5002]: I1209 12:27:01.188278 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gqhzt" podUID="e4e4c4b9-1818-4f83-87d6-21f219b7ea7e" containerName="registry-server" containerID="cri-o://8aa918740c2a4ace9357d41a9a0dd2af3e041d19b2f7640817b7d95215d8e5c3" gracePeriod=2 Dec 09 12:27:01 crc kubenswrapper[5002]: I1209 12:27:01.905247 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gqhzt" Dec 09 12:27:02 crc kubenswrapper[5002]: I1209 12:27:02.069008 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7tp9\" (UniqueName: \"kubernetes.io/projected/e4e4c4b9-1818-4f83-87d6-21f219b7ea7e-kube-api-access-t7tp9\") pod \"e4e4c4b9-1818-4f83-87d6-21f219b7ea7e\" (UID: \"e4e4c4b9-1818-4f83-87d6-21f219b7ea7e\") " Dec 09 12:27:02 crc kubenswrapper[5002]: I1209 12:27:02.069080 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4e4c4b9-1818-4f83-87d6-21f219b7ea7e-catalog-content\") pod \"e4e4c4b9-1818-4f83-87d6-21f219b7ea7e\" (UID: \"e4e4c4b9-1818-4f83-87d6-21f219b7ea7e\") " Dec 09 12:27:02 crc kubenswrapper[5002]: I1209 12:27:02.069152 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4e4c4b9-1818-4f83-87d6-21f219b7ea7e-utilities\") pod \"e4e4c4b9-1818-4f83-87d6-21f219b7ea7e\" (UID: \"e4e4c4b9-1818-4f83-87d6-21f219b7ea7e\") " Dec 09 12:27:02 crc kubenswrapper[5002]: I1209 12:27:02.070443 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4e4c4b9-1818-4f83-87d6-21f219b7ea7e-utilities" (OuterVolumeSpecName: "utilities") pod "e4e4c4b9-1818-4f83-87d6-21f219b7ea7e" (UID: "e4e4c4b9-1818-4f83-87d6-21f219b7ea7e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:27:02 crc kubenswrapper[5002]: I1209 12:27:02.077094 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4e4c4b9-1818-4f83-87d6-21f219b7ea7e-kube-api-access-t7tp9" (OuterVolumeSpecName: "kube-api-access-t7tp9") pod "e4e4c4b9-1818-4f83-87d6-21f219b7ea7e" (UID: "e4e4c4b9-1818-4f83-87d6-21f219b7ea7e"). InnerVolumeSpecName "kube-api-access-t7tp9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:27:02 crc kubenswrapper[5002]: I1209 12:27:02.125363 5002 generic.go:334] "Generic (PLEG): container finished" podID="e4e4c4b9-1818-4f83-87d6-21f219b7ea7e" containerID="8aa918740c2a4ace9357d41a9a0dd2af3e041d19b2f7640817b7d95215d8e5c3" exitCode=0 Dec 09 12:27:02 crc kubenswrapper[5002]: I1209 12:27:02.125432 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqhzt" event={"ID":"e4e4c4b9-1818-4f83-87d6-21f219b7ea7e","Type":"ContainerDied","Data":"8aa918740c2a4ace9357d41a9a0dd2af3e041d19b2f7640817b7d95215d8e5c3"} Dec 09 12:27:02 crc kubenswrapper[5002]: I1209 12:27:02.125448 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gqhzt" Dec 09 12:27:02 crc kubenswrapper[5002]: I1209 12:27:02.125467 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqhzt" event={"ID":"e4e4c4b9-1818-4f83-87d6-21f219b7ea7e","Type":"ContainerDied","Data":"d2bfcdc6048ee4744d8f8c7abe5986c31c04214beeb7cbeb9b747519e9ee78c8"} Dec 09 12:27:02 crc kubenswrapper[5002]: I1209 12:27:02.125487 5002 scope.go:117] "RemoveContainer" containerID="8aa918740c2a4ace9357d41a9a0dd2af3e041d19b2f7640817b7d95215d8e5c3" Dec 09 12:27:02 crc kubenswrapper[5002]: I1209 12:27:02.148761 5002 scope.go:117] "RemoveContainer" containerID="618c4fd4fabf4dd6d16d41e8e2867ca0217a843a9a745ec97d673031414c0f97" Dec 09 12:27:02 crc kubenswrapper[5002]: I1209 12:27:02.171556 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7tp9\" (UniqueName: \"kubernetes.io/projected/e4e4c4b9-1818-4f83-87d6-21f219b7ea7e-kube-api-access-t7tp9\") on node \"crc\" DevicePath \"\"" Dec 09 12:27:02 crc kubenswrapper[5002]: I1209 12:27:02.171909 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4e4c4b9-1818-4f83-87d6-21f219b7ea7e-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 12:27:02 crc kubenswrapper[5002]: I1209 12:27:02.172294 5002 scope.go:117] "RemoveContainer" containerID="89198ed1b709eee55c8bc42495e91bedef52598bc8534e5664bf9d659302e44c" Dec 09 12:27:02 crc kubenswrapper[5002]: I1209 12:27:02.180340 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4e4c4b9-1818-4f83-87d6-21f219b7ea7e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e4e4c4b9-1818-4f83-87d6-21f219b7ea7e" (UID: "e4e4c4b9-1818-4f83-87d6-21f219b7ea7e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:27:02 crc kubenswrapper[5002]: I1209 12:27:02.225960 5002 scope.go:117] "RemoveContainer" containerID="8aa918740c2a4ace9357d41a9a0dd2af3e041d19b2f7640817b7d95215d8e5c3" Dec 09 12:27:02 crc kubenswrapper[5002]: E1209 12:27:02.226679 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8aa918740c2a4ace9357d41a9a0dd2af3e041d19b2f7640817b7d95215d8e5c3\": container with ID starting with 8aa918740c2a4ace9357d41a9a0dd2af3e041d19b2f7640817b7d95215d8e5c3 not found: ID does not exist" containerID="8aa918740c2a4ace9357d41a9a0dd2af3e041d19b2f7640817b7d95215d8e5c3" Dec 09 12:27:02 crc kubenswrapper[5002]: I1209 12:27:02.226733 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8aa918740c2a4ace9357d41a9a0dd2af3e041d19b2f7640817b7d95215d8e5c3"} err="failed to get container status \"8aa918740c2a4ace9357d41a9a0dd2af3e041d19b2f7640817b7d95215d8e5c3\": rpc error: code = NotFound desc = could not find container \"8aa918740c2a4ace9357d41a9a0dd2af3e041d19b2f7640817b7d95215d8e5c3\": container with ID starting with 8aa918740c2a4ace9357d41a9a0dd2af3e041d19b2f7640817b7d95215d8e5c3 not found: ID does not exist" Dec 09 12:27:02 crc kubenswrapper[5002]: I1209 12:27:02.226764 5002 scope.go:117] "RemoveContainer" containerID="618c4fd4fabf4dd6d16d41e8e2867ca0217a843a9a745ec97d673031414c0f97" Dec 09 12:27:02 crc kubenswrapper[5002]: E1209 12:27:02.227432 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"618c4fd4fabf4dd6d16d41e8e2867ca0217a843a9a745ec97d673031414c0f97\": container with ID starting with 618c4fd4fabf4dd6d16d41e8e2867ca0217a843a9a745ec97d673031414c0f97 not found: ID does not exist" containerID="618c4fd4fabf4dd6d16d41e8e2867ca0217a843a9a745ec97d673031414c0f97" Dec 09 12:27:02 crc kubenswrapper[5002]: I1209 12:27:02.227502 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"618c4fd4fabf4dd6d16d41e8e2867ca0217a843a9a745ec97d673031414c0f97"} err="failed to get container status \"618c4fd4fabf4dd6d16d41e8e2867ca0217a843a9a745ec97d673031414c0f97\": rpc error: code = NotFound desc = could not find container \"618c4fd4fabf4dd6d16d41e8e2867ca0217a843a9a745ec97d673031414c0f97\": container with ID starting with 618c4fd4fabf4dd6d16d41e8e2867ca0217a843a9a745ec97d673031414c0f97 not found: ID does not exist" Dec 09 12:27:02 crc kubenswrapper[5002]: I1209 12:27:02.227529 5002 scope.go:117] "RemoveContainer" containerID="89198ed1b709eee55c8bc42495e91bedef52598bc8534e5664bf9d659302e44c" Dec 09 12:27:02 crc kubenswrapper[5002]: E1209 12:27:02.227933 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89198ed1b709eee55c8bc42495e91bedef52598bc8534e5664bf9d659302e44c\": container with ID starting with 89198ed1b709eee55c8bc42495e91bedef52598bc8534e5664bf9d659302e44c not found: ID does not exist" containerID="89198ed1b709eee55c8bc42495e91bedef52598bc8534e5664bf9d659302e44c" Dec 09 12:27:02 crc kubenswrapper[5002]: I1209 12:27:02.227966 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89198ed1b709eee55c8bc42495e91bedef52598bc8534e5664bf9d659302e44c"} err="failed to get container status \"89198ed1b709eee55c8bc42495e91bedef52598bc8534e5664bf9d659302e44c\": rpc error: code = NotFound desc = could not 
find container \"89198ed1b709eee55c8bc42495e91bedef52598bc8534e5664bf9d659302e44c\": container with ID starting with 89198ed1b709eee55c8bc42495e91bedef52598bc8534e5664bf9d659302e44c not found: ID does not exist" Dec 09 12:27:02 crc kubenswrapper[5002]: I1209 12:27:02.274233 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4e4c4b9-1818-4f83-87d6-21f219b7ea7e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 12:27:02 crc kubenswrapper[5002]: I1209 12:27:02.468301 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gqhzt"] Dec 09 12:27:02 crc kubenswrapper[5002]: I1209 12:27:02.476523 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gqhzt"] Dec 09 12:27:04 crc kubenswrapper[5002]: I1209 12:27:04.075479 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4e4c4b9-1818-4f83-87d6-21f219b7ea7e" path="/var/lib/kubelet/pods/e4e4c4b9-1818-4f83-87d6-21f219b7ea7e/volumes" Dec 09 12:28:07 crc kubenswrapper[5002]: I1209 12:28:07.964374 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:28:07 crc kubenswrapper[5002]: I1209 12:28:07.965149 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:28:37 crc kubenswrapper[5002]: I1209 12:28:37.965089 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:28:37 crc kubenswrapper[5002]: I1209 12:28:37.965709 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:29:07 crc kubenswrapper[5002]: I1209 12:29:07.964533 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:29:07 crc kubenswrapper[5002]: I1209 12:29:07.965143 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:29:07 crc kubenswrapper[5002]: I1209 12:29:07.965209 5002 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" Dec 09 12:29:07 crc 
kubenswrapper[5002]: I1209 12:29:07.966156 5002 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4b150fb9f983a4e6a1a2f55ee6d78c3489d3daaae10fa0510a80b1084f5abe7f"} pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 12:29:07 crc kubenswrapper[5002]: I1209 12:29:07.966226 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" containerID="cri-o://4b150fb9f983a4e6a1a2f55ee6d78c3489d3daaae10fa0510a80b1084f5abe7f" gracePeriod=600 Dec 09 12:29:08 crc kubenswrapper[5002]: E1209 12:29:08.088705 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:29:08 crc kubenswrapper[5002]: I1209 12:29:08.450701 5002 generic.go:334] "Generic (PLEG): container finished" podID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerID="4b150fb9f983a4e6a1a2f55ee6d78c3489d3daaae10fa0510a80b1084f5abe7f" exitCode=0 Dec 09 12:29:08 crc kubenswrapper[5002]: I1209 12:29:08.450742 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerDied","Data":"4b150fb9f983a4e6a1a2f55ee6d78c3489d3daaae10fa0510a80b1084f5abe7f"} Dec 09 12:29:08 crc kubenswrapper[5002]: I1209 12:29:08.450774 5002 scope.go:117] "RemoveContainer" containerID="e9381aa30da767b73ba7a9c72912271f443653b1e1e8285560563da3163c3e90" Dec 09 12:29:08 crc kubenswrapper[5002]: I1209 12:29:08.451462 5002 scope.go:117] "RemoveContainer" containerID="4b150fb9f983a4e6a1a2f55ee6d78c3489d3daaae10fa0510a80b1084f5abe7f" Dec 09 12:29:08 crc kubenswrapper[5002]: E1209 12:29:08.451836 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:29:24 crc kubenswrapper[5002]: I1209 12:29:24.060623 5002 scope.go:117] "RemoveContainer" containerID="4b150fb9f983a4e6a1a2f55ee6d78c3489d3daaae10fa0510a80b1084f5abe7f" Dec 09 12:29:24 crc kubenswrapper[5002]: E1209 12:29:24.061849 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:29:37 crc kubenswrapper[5002]: I1209 12:29:37.061713 5002 scope.go:117] "RemoveContainer" 
containerID="4b150fb9f983a4e6a1a2f55ee6d78c3489d3daaae10fa0510a80b1084f5abe7f" Dec 09 12:29:37 crc kubenswrapper[5002]: E1209 12:29:37.063089 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:29:41 crc kubenswrapper[5002]: I1209 12:29:41.801034 5002 generic.go:334] "Generic (PLEG): container finished" podID="cb33c5e1-0410-4e89-951b-b7443e522f5f" containerID="1988f720d916dd9971bf10ddba1c66b77ca5b1f771524954b781e70cf1c0ba6a" exitCode=0 Dec 09 12:29:41 crc kubenswrapper[5002]: I1209 12:29:41.801117 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-w6qc7" event={"ID":"cb33c5e1-0410-4e89-951b-b7443e522f5f","Type":"ContainerDied","Data":"1988f720d916dd9971bf10ddba1c66b77ca5b1f771524954b781e70cf1c0ba6a"} Dec 09 12:29:43 crc kubenswrapper[5002]: I1209 12:29:43.285592 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-w6qc7" Dec 09 12:29:43 crc kubenswrapper[5002]: I1209 12:29:43.393153 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cb33c5e1-0410-4e89-951b-b7443e522f5f-neutron-dhcp-agent-neutron-config-0\") pod \"cb33c5e1-0410-4e89-951b-b7443e522f5f\" (UID: \"cb33c5e1-0410-4e89-951b-b7443e522f5f\") " Dec 09 12:29:43 crc kubenswrapper[5002]: I1209 12:29:43.393293 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cb33c5e1-0410-4e89-951b-b7443e522f5f-ssh-key\") pod \"cb33c5e1-0410-4e89-951b-b7443e522f5f\" (UID: \"cb33c5e1-0410-4e89-951b-b7443e522f5f\") " Dec 09 12:29:43 crc kubenswrapper[5002]: I1209 12:29:43.393461 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb33c5e1-0410-4e89-951b-b7443e522f5f-neutron-dhcp-combined-ca-bundle\") pod \"cb33c5e1-0410-4e89-951b-b7443e522f5f\" (UID: \"cb33c5e1-0410-4e89-951b-b7443e522f5f\") " Dec 09 12:29:43 crc kubenswrapper[5002]: I1209 12:29:43.393551 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cb33c5e1-0410-4e89-951b-b7443e522f5f-ceph\") pod \"cb33c5e1-0410-4e89-951b-b7443e522f5f\" (UID: \"cb33c5e1-0410-4e89-951b-b7443e522f5f\") " Dec 09 12:29:43 crc kubenswrapper[5002]: I1209 12:29:43.393656 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrvsn\" (UniqueName: \"kubernetes.io/projected/cb33c5e1-0410-4e89-951b-b7443e522f5f-kube-api-access-mrvsn\") pod \"cb33c5e1-0410-4e89-951b-b7443e522f5f\" (UID: \"cb33c5e1-0410-4e89-951b-b7443e522f5f\") " Dec 09 12:29:43 crc kubenswrapper[5002]: I1209 12:29:43.393763 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cb33c5e1-0410-4e89-951b-b7443e522f5f-inventory\") pod \"cb33c5e1-0410-4e89-951b-b7443e522f5f\" (UID: \"cb33c5e1-0410-4e89-951b-b7443e522f5f\") " Dec 09 12:29:43 crc 
kubenswrapper[5002]: I1209 12:29:43.400572 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb33c5e1-0410-4e89-951b-b7443e522f5f-ceph" (OuterVolumeSpecName: "ceph") pod "cb33c5e1-0410-4e89-951b-b7443e522f5f" (UID: "cb33c5e1-0410-4e89-951b-b7443e522f5f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:29:43 crc kubenswrapper[5002]: I1209 12:29:43.400991 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb33c5e1-0410-4e89-951b-b7443e522f5f-kube-api-access-mrvsn" (OuterVolumeSpecName: "kube-api-access-mrvsn") pod "cb33c5e1-0410-4e89-951b-b7443e522f5f" (UID: "cb33c5e1-0410-4e89-951b-b7443e522f5f"). InnerVolumeSpecName "kube-api-access-mrvsn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:29:43 crc kubenswrapper[5002]: I1209 12:29:43.403024 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb33c5e1-0410-4e89-951b-b7443e522f5f-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "cb33c5e1-0410-4e89-951b-b7443e522f5f" (UID: "cb33c5e1-0410-4e89-951b-b7443e522f5f"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:29:43 crc kubenswrapper[5002]: I1209 12:29:43.429854 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb33c5e1-0410-4e89-951b-b7443e522f5f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "cb33c5e1-0410-4e89-951b-b7443e522f5f" (UID: "cb33c5e1-0410-4e89-951b-b7443e522f5f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:29:43 crc kubenswrapper[5002]: I1209 12:29:43.431872 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb33c5e1-0410-4e89-951b-b7443e522f5f-inventory" (OuterVolumeSpecName: "inventory") pod "cb33c5e1-0410-4e89-951b-b7443e522f5f" (UID: "cb33c5e1-0410-4e89-951b-b7443e522f5f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:29:43 crc kubenswrapper[5002]: I1209 12:29:43.444498 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb33c5e1-0410-4e89-951b-b7443e522f5f-neutron-dhcp-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-dhcp-agent-neutron-config-0") pod "cb33c5e1-0410-4e89-951b-b7443e522f5f" (UID: "cb33c5e1-0410-4e89-951b-b7443e522f5f"). InnerVolumeSpecName "neutron-dhcp-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:29:43 crc kubenswrapper[5002]: I1209 12:29:43.497144 5002 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cb33c5e1-0410-4e89-951b-b7443e522f5f-inventory\") on node \"crc\" DevicePath \"\"" Dec 09 12:29:43 crc kubenswrapper[5002]: I1209 12:29:43.497188 5002 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cb33c5e1-0410-4e89-951b-b7443e522f5f-neutron-dhcp-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 09 12:29:43 crc kubenswrapper[5002]: I1209 12:29:43.497199 5002 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cb33c5e1-0410-4e89-951b-b7443e522f5f-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 09 12:29:43 crc kubenswrapper[5002]: I1209 12:29:43.497209 5002 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb33c5e1-0410-4e89-951b-b7443e522f5f-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 12:29:43 crc kubenswrapper[5002]: I1209 12:29:43.497218 5002 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cb33c5e1-0410-4e89-951b-b7443e522f5f-ceph\") on node \"crc\" DevicePath \"\"" Dec 09 12:29:43 crc kubenswrapper[5002]: I1209 12:29:43.497227 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrvsn\" (UniqueName: \"kubernetes.io/projected/cb33c5e1-0410-4e89-951b-b7443e522f5f-kube-api-access-mrvsn\") on node \"crc\" DevicePath \"\"" Dec 09 12:29:43 crc kubenswrapper[5002]: I1209 12:29:43.835221 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-w6qc7" event={"ID":"cb33c5e1-0410-4e89-951b-b7443e522f5f","Type":"ContainerDied","Data":"9e07780d6f78cd21a29506ff6c344d9e97f0bb06b7606a227ebaabfd9506ff6b"} Dec 09 12:29:43 crc kubenswrapper[5002]: I1209 12:29:43.835288 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e07780d6f78cd21a29506ff6c344d9e97f0bb06b7606a227ebaabfd9506ff6b" Dec 09 12:29:43 crc kubenswrapper[5002]: I1209 12:29:43.835426 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-w6qc7" Dec 09 12:29:51 crc kubenswrapper[5002]: I1209 12:29:51.251667 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 09 12:29:51 crc kubenswrapper[5002]: I1209 12:29:51.252674 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1" containerName="nova-cell0-conductor-conductor" containerID="cri-o://5ebd010935b181e73099be1ce513a2ab84bb15b83e52724c0b3aac325675e772" gracePeriod=30 Dec 09 12:29:51 crc kubenswrapper[5002]: I1209 12:29:51.287769 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 09 12:29:51 crc kubenswrapper[5002]: I1209 12:29:51.288265 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="7d9ef5f3-f7e3-435c-a053-f63dff9e5a09" containerName="nova-cell1-conductor-conductor" containerID="cri-o://4dcc8a09bf27504e0d7254ed726d01ff3d84b152bc832cf0f854262abe188c0f" gracePeriod=30 Dec 09 12:29:52 crc kubenswrapper[5002]: I1209 12:29:52.052036 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 12:29:52 crc kubenswrapper[5002]: I1209 12:29:52.052577 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8" containerName="nova-scheduler-scheduler" containerID="cri-o://c8450d3a608834ab9f31afd013a7fb75dc5024672797bf598790495aec53f4ac" gracePeriod=30 Dec 09 12:29:52 crc kubenswrapper[5002]: I1209 12:29:52.061027 5002 scope.go:117] "RemoveContainer" containerID="4b150fb9f983a4e6a1a2f55ee6d78c3489d3daaae10fa0510a80b1084f5abe7f" Dec 09 12:29:52 crc kubenswrapper[5002]: E1209 12:29:52.061393 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:29:52 crc kubenswrapper[5002]: I1209 12:29:52.073718 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 09 12:29:52 crc kubenswrapper[5002]: I1209 12:29:52.073973 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c46fe918-c220-4c57-812e-7a085b199362" containerName="nova-api-log" containerID="cri-o://a5f4153c494e051ca3a95763a488b6d59775fcea18f2eef4a7f65e480a887a30" gracePeriod=30 Dec 09 12:29:52 crc kubenswrapper[5002]: I1209 12:29:52.074319 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c46fe918-c220-4c57-812e-7a085b199362" containerName="nova-api-api" containerID="cri-o://79506c16e9a3624e2e56c4d1192962fd11abf39bba5efc9e0da33fae1bea68a5" gracePeriod=30 Dec 09 12:29:52 crc kubenswrapper[5002]: I1209 12:29:52.119494 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 12:29:52 crc kubenswrapper[5002]: I1209 12:29:52.119723 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="dd771216-04c2-4469-9146-6a732a36843d" 
containerName="nova-metadata-log" containerID="cri-o://0ae179bd45135554ae40623e3f8436aebb71e338b5e536f221e0e5bb9258cbbc" gracePeriod=30 Dec 09 12:29:52 crc kubenswrapper[5002]: I1209 12:29:52.119759 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="dd771216-04c2-4469-9146-6a732a36843d" containerName="nova-metadata-metadata" containerID="cri-o://458af308bf65a903ca6da8a820f6f3734105a18e2c74e18e70f8c34b16c61eaf" gracePeriod=30 Dec 09 12:29:52 crc kubenswrapper[5002]: I1209 12:29:52.660990 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 09 12:29:52 crc kubenswrapper[5002]: I1209 12:29:52.716992 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d9ef5f3-f7e3-435c-a053-f63dff9e5a09-config-data\") pod \"7d9ef5f3-f7e3-435c-a053-f63dff9e5a09\" (UID: \"7d9ef5f3-f7e3-435c-a053-f63dff9e5a09\") " Dec 09 12:29:52 crc kubenswrapper[5002]: I1209 12:29:52.717271 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95h97\" (UniqueName: \"kubernetes.io/projected/7d9ef5f3-f7e3-435c-a053-f63dff9e5a09-kube-api-access-95h97\") pod \"7d9ef5f3-f7e3-435c-a053-f63dff9e5a09\" (UID: \"7d9ef5f3-f7e3-435c-a053-f63dff9e5a09\") " Dec 09 12:29:52 crc kubenswrapper[5002]: I1209 12:29:52.717335 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d9ef5f3-f7e3-435c-a053-f63dff9e5a09-combined-ca-bundle\") pod \"7d9ef5f3-f7e3-435c-a053-f63dff9e5a09\" (UID: \"7d9ef5f3-f7e3-435c-a053-f63dff9e5a09\") " Dec 09 12:29:52 crc kubenswrapper[5002]: I1209 12:29:52.725241 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d9ef5f3-f7e3-435c-a053-f63dff9e5a09-kube-api-access-95h97" (OuterVolumeSpecName: "kube-api-access-95h97") pod "7d9ef5f3-f7e3-435c-a053-f63dff9e5a09" (UID: "7d9ef5f3-f7e3-435c-a053-f63dff9e5a09"). InnerVolumeSpecName "kube-api-access-95h97". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:29:52 crc kubenswrapper[5002]: I1209 12:29:52.751369 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d9ef5f3-f7e3-435c-a053-f63dff9e5a09-config-data" (OuterVolumeSpecName: "config-data") pod "7d9ef5f3-f7e3-435c-a053-f63dff9e5a09" (UID: "7d9ef5f3-f7e3-435c-a053-f63dff9e5a09"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:29:52 crc kubenswrapper[5002]: I1209 12:29:52.756297 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d9ef5f3-f7e3-435c-a053-f63dff9e5a09-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7d9ef5f3-f7e3-435c-a053-f63dff9e5a09" (UID: "7d9ef5f3-f7e3-435c-a053-f63dff9e5a09"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:29:52 crc kubenswrapper[5002]: I1209 12:29:52.820470 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d9ef5f3-f7e3-435c-a053-f63dff9e5a09-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 12:29:52 crc kubenswrapper[5002]: I1209 12:29:52.820502 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95h97\" (UniqueName: \"kubernetes.io/projected/7d9ef5f3-f7e3-435c-a053-f63dff9e5a09-kube-api-access-95h97\") on node \"crc\" DevicePath \"\"" Dec 09 12:29:52 crc kubenswrapper[5002]: I1209 12:29:52.820513 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d9ef5f3-f7e3-435c-a053-f63dff9e5a09-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 12:29:52 crc kubenswrapper[5002]: I1209 12:29:52.927441 5002 generic.go:334] "Generic (PLEG): container finished" podID="dd771216-04c2-4469-9146-6a732a36843d" containerID="0ae179bd45135554ae40623e3f8436aebb71e338b5e536f221e0e5bb9258cbbc" exitCode=143 Dec 09 12:29:52 crc kubenswrapper[5002]: I1209 12:29:52.927542 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dd771216-04c2-4469-9146-6a732a36843d","Type":"ContainerDied","Data":"0ae179bd45135554ae40623e3f8436aebb71e338b5e536f221e0e5bb9258cbbc"} Dec 09 12:29:52 crc kubenswrapper[5002]: I1209 12:29:52.930376 5002 generic.go:334] "Generic (PLEG): container finished" podID="7d9ef5f3-f7e3-435c-a053-f63dff9e5a09" containerID="4dcc8a09bf27504e0d7254ed726d01ff3d84b152bc832cf0f854262abe188c0f" exitCode=0 Dec 09 12:29:52 crc kubenswrapper[5002]: I1209 12:29:52.930430 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"7d9ef5f3-f7e3-435c-a053-f63dff9e5a09","Type":"ContainerDied","Data":"4dcc8a09bf27504e0d7254ed726d01ff3d84b152bc832cf0f854262abe188c0f"} Dec 09 12:29:52 crc kubenswrapper[5002]: I1209 12:29:52.930456 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 09 12:29:52 crc kubenswrapper[5002]: I1209 12:29:52.930488 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"7d9ef5f3-f7e3-435c-a053-f63dff9e5a09","Type":"ContainerDied","Data":"59dfbed8097003e16b2f892d0f431bba75ded2ff53dd1f3352d8a6c94d85d267"} Dec 09 12:29:52 crc kubenswrapper[5002]: I1209 12:29:52.930516 5002 scope.go:117] "RemoveContainer" containerID="4dcc8a09bf27504e0d7254ed726d01ff3d84b152bc832cf0f854262abe188c0f" Dec 09 12:29:52 crc kubenswrapper[5002]: I1209 12:29:52.933755 5002 generic.go:334] "Generic (PLEG): container finished" podID="c46fe918-c220-4c57-812e-7a085b199362" containerID="a5f4153c494e051ca3a95763a488b6d59775fcea18f2eef4a7f65e480a887a30" exitCode=143 Dec 09 12:29:52 crc kubenswrapper[5002]: I1209 12:29:52.933851 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c46fe918-c220-4c57-812e-7a085b199362","Type":"ContainerDied","Data":"a5f4153c494e051ca3a95763a488b6d59775fcea18f2eef4a7f65e480a887a30"} Dec 09 12:29:52 crc kubenswrapper[5002]: I1209 12:29:52.980747 5002 scope.go:117] "RemoveContainer" containerID="4dcc8a09bf27504e0d7254ed726d01ff3d84b152bc832cf0f854262abe188c0f" Dec 09 12:29:52 crc kubenswrapper[5002]: E1209 12:29:52.981576 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dcc8a09bf27504e0d7254ed726d01ff3d84b152bc832cf0f854262abe188c0f\": container with ID starting with 4dcc8a09bf27504e0d7254ed726d01ff3d84b152bc832cf0f854262abe188c0f not found: ID does not exist" containerID="4dcc8a09bf27504e0d7254ed726d01ff3d84b152bc832cf0f854262abe188c0f" Dec 09 12:29:52 crc kubenswrapper[5002]: I1209 12:29:52.981616 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dcc8a09bf27504e0d7254ed726d01ff3d84b152bc832cf0f854262abe188c0f"} err="failed to get container status \"4dcc8a09bf27504e0d7254ed726d01ff3d84b152bc832cf0f854262abe188c0f\": rpc error: code = NotFound desc = could not find container \"4dcc8a09bf27504e0d7254ed726d01ff3d84b152bc832cf0f854262abe188c0f\": container with ID starting with 4dcc8a09bf27504e0d7254ed726d01ff3d84b152bc832cf0f854262abe188c0f not found: ID does not exist" Dec 09 12:29:52 crc kubenswrapper[5002]: I1209 12:29:52.982833 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 09 12:29:53 crc kubenswrapper[5002]: I1209 12:29:53.003645 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 09 12:29:53 crc kubenswrapper[5002]: I1209 12:29:53.019349 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 09 12:29:53 crc kubenswrapper[5002]: E1209 12:29:53.019868 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4e4c4b9-1818-4f83-87d6-21f219b7ea7e" containerName="extract-utilities" Dec 09 12:29:53 crc kubenswrapper[5002]: I1209 12:29:53.019893 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4e4c4b9-1818-4f83-87d6-21f219b7ea7e" containerName="extract-utilities" Dec 09 12:29:53 crc kubenswrapper[5002]: E1209 12:29:53.019919 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4e4c4b9-1818-4f83-87d6-21f219b7ea7e" containerName="extract-content" Dec 09 12:29:53 crc kubenswrapper[5002]: I1209 12:29:53.019926 5002 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e4e4c4b9-1818-4f83-87d6-21f219b7ea7e" containerName="extract-content" Dec 09 12:29:53 crc kubenswrapper[5002]: E1209 12:29:53.019956 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d9ef5f3-f7e3-435c-a053-f63dff9e5a09" containerName="nova-cell1-conductor-conductor" Dec 09 12:29:53 crc kubenswrapper[5002]: I1209 12:29:53.019962 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d9ef5f3-f7e3-435c-a053-f63dff9e5a09" containerName="nova-cell1-conductor-conductor" Dec 09 12:29:53 crc kubenswrapper[5002]: E1209 12:29:53.019981 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4e4c4b9-1818-4f83-87d6-21f219b7ea7e" containerName="registry-server" Dec 09 12:29:53 crc kubenswrapper[5002]: I1209 12:29:53.019987 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4e4c4b9-1818-4f83-87d6-21f219b7ea7e" containerName="registry-server" Dec 09 12:29:53 crc kubenswrapper[5002]: E1209 12:29:53.020004 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb33c5e1-0410-4e89-951b-b7443e522f5f" containerName="neutron-dhcp-openstack-openstack-cell1" Dec 09 12:29:53 crc kubenswrapper[5002]: I1209 12:29:53.020010 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb33c5e1-0410-4e89-951b-b7443e522f5f" containerName="neutron-dhcp-openstack-openstack-cell1" Dec 09 12:29:53 crc kubenswrapper[5002]: I1209 12:29:53.020207 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d9ef5f3-f7e3-435c-a053-f63dff9e5a09" containerName="nova-cell1-conductor-conductor" Dec 09 12:29:53 crc kubenswrapper[5002]: I1209 12:29:53.020232 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb33c5e1-0410-4e89-951b-b7443e522f5f" containerName="neutron-dhcp-openstack-openstack-cell1" Dec 09 12:29:53 crc kubenswrapper[5002]: I1209 12:29:53.020244 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4e4c4b9-1818-4f83-87d6-21f219b7ea7e" containerName="registry-server" Dec 09 12:29:53 crc kubenswrapper[5002]: I1209 12:29:53.021119 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 09 12:29:53 crc kubenswrapper[5002]: I1209 12:29:53.023761 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 09 12:29:53 crc kubenswrapper[5002]: I1209 12:29:53.032299 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 09 12:29:53 crc kubenswrapper[5002]: I1209 12:29:53.125934 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a21591e-3a1c-4ff7-bf7b-a0b40bad00ac-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"5a21591e-3a1c-4ff7-bf7b-a0b40bad00ac\") " pod="openstack/nova-cell1-conductor-0" Dec 09 12:29:53 crc kubenswrapper[5002]: I1209 12:29:53.126710 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a21591e-3a1c-4ff7-bf7b-a0b40bad00ac-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"5a21591e-3a1c-4ff7-bf7b-a0b40bad00ac\") " pod="openstack/nova-cell1-conductor-0" Dec 09 12:29:53 crc kubenswrapper[5002]: I1209 12:29:53.127492 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt6gm\" (UniqueName: \"kubernetes.io/projected/5a21591e-3a1c-4ff7-bf7b-a0b40bad00ac-kube-api-access-pt6gm\") pod \"nova-cell1-conductor-0\" (UID: \"5a21591e-3a1c-4ff7-bf7b-a0b40bad00ac\") " pod="openstack/nova-cell1-conductor-0" Dec 09 12:29:53 crc kubenswrapper[5002]: I1209 12:29:53.229202 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt6gm\" (UniqueName: \"kubernetes.io/projected/5a21591e-3a1c-4ff7-bf7b-a0b40bad00ac-kube-api-access-pt6gm\") pod \"nova-cell1-conductor-0\" (UID: \"5a21591e-3a1c-4ff7-bf7b-a0b40bad00ac\") " pod="openstack/nova-cell1-conductor-0" Dec 09 12:29:53 crc kubenswrapper[5002]: I1209 12:29:53.229612 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a21591e-3a1c-4ff7-bf7b-a0b40bad00ac-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"5a21591e-3a1c-4ff7-bf7b-a0b40bad00ac\") " pod="openstack/nova-cell1-conductor-0" Dec 09 12:29:53 crc kubenswrapper[5002]: I1209 12:29:53.230766 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a21591e-3a1c-4ff7-bf7b-a0b40bad00ac-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"5a21591e-3a1c-4ff7-bf7b-a0b40bad00ac\") " pod="openstack/nova-cell1-conductor-0" Dec 09 12:29:53 crc kubenswrapper[5002]: I1209 12:29:53.234147 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a21591e-3a1c-4ff7-bf7b-a0b40bad00ac-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"5a21591e-3a1c-4ff7-bf7b-a0b40bad00ac\") " pod="openstack/nova-cell1-conductor-0" Dec 09 12:29:53 crc kubenswrapper[5002]: I1209 12:29:53.234246 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a21591e-3a1c-4ff7-bf7b-a0b40bad00ac-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"5a21591e-3a1c-4ff7-bf7b-a0b40bad00ac\") " pod="openstack/nova-cell1-conductor-0" Dec 09 12:29:53 crc kubenswrapper[5002]: I1209 12:29:53.257363 5002 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt6gm\" (UniqueName: \"kubernetes.io/projected/5a21591e-3a1c-4ff7-bf7b-a0b40bad00ac-kube-api-access-pt6gm\") pod \"nova-cell1-conductor-0\" (UID: \"5a21591e-3a1c-4ff7-bf7b-a0b40bad00ac\") " pod="openstack/nova-cell1-conductor-0" Dec 09 12:29:53 crc kubenswrapper[5002]: I1209 12:29:53.347242 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 09 12:29:53 crc kubenswrapper[5002]: I1209 12:29:53.841250 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 09 12:29:53 crc kubenswrapper[5002]: I1209 12:29:53.948443 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"5a21591e-3a1c-4ff7-bf7b-a0b40bad00ac","Type":"ContainerStarted","Data":"0947e0aae8c84c2cd51e9f14456b8b39b4cfa874b7d762c9b8e56376de65bb23"} Dec 09 12:29:54 crc kubenswrapper[5002]: I1209 12:29:54.074979 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d9ef5f3-f7e3-435c-a053-f63dff9e5a09" path="/var/lib/kubelet/pods/7d9ef5f3-f7e3-435c-a053-f63dff9e5a09/volumes" Dec 09 12:29:54 crc kubenswrapper[5002]: E1209 12:29:54.757429 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5ebd010935b181e73099be1ce513a2ab84bb15b83e52724c0b3aac325675e772" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 09 12:29:54 crc kubenswrapper[5002]: E1209 12:29:54.759208 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5ebd010935b181e73099be1ce513a2ab84bb15b83e52724c0b3aac325675e772" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 09 12:29:54 crc kubenswrapper[5002]: E1209 12:29:54.760440 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5ebd010935b181e73099be1ce513a2ab84bb15b83e52724c0b3aac325675e772" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 09 12:29:54 crc kubenswrapper[5002]: E1209 12:29:54.760488 5002 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1" containerName="nova-cell0-conductor-conductor" Dec 09 12:29:54 crc kubenswrapper[5002]: I1209 12:29:54.968755 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"5a21591e-3a1c-4ff7-bf7b-a0b40bad00ac","Type":"ContainerStarted","Data":"e81a60ab551ef1cf6d664e3f05628ee39157f4efc2b5c332c29ad16768cf9b3b"} Dec 09 12:29:54 crc kubenswrapper[5002]: I1209 12:29:54.969066 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 09 12:29:54 crc kubenswrapper[5002]: I1209 12:29:54.996090 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.996064732 podStartE2EDuration="2.996064732s" podCreationTimestamp="2025-12-09 12:29:52 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 12:29:54.988502379 +0000 UTC m=+8927.380553470" watchObservedRunningTime="2025-12-09 12:29:54.996064732 +0000 UTC m=+8927.388115813" Dec 09 12:29:55 crc kubenswrapper[5002]: I1209 12:29:55.550764 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="dd771216-04c2-4469-9146-6a732a36843d" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.79:8775/\": read tcp 10.217.0.2:40024->10.217.1.79:8775: read: connection reset by peer" Dec 09 12:29:55 crc kubenswrapper[5002]: I1209 12:29:55.550805 5002 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="dd771216-04c2-4469-9146-6a732a36843d" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.79:8775/\": read tcp 10.217.0.2:40028->10.217.1.79:8775: read: connection reset by peer" Dec 09 12:29:55 crc kubenswrapper[5002]: I1209 12:29:55.720180 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 09 12:29:55 crc kubenswrapper[5002]: I1209 12:29:55.784746 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1-config-data\") pod \"6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1\" (UID: \"6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1\") " Dec 09 12:29:55 crc kubenswrapper[5002]: I1209 12:29:55.784805 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1-combined-ca-bundle\") pod \"6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1\" (UID: \"6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1\") " Dec 09 12:29:55 crc kubenswrapper[5002]: I1209 12:29:55.785078 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcv4b\" (UniqueName: \"kubernetes.io/projected/6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1-kube-api-access-pcv4b\") pod \"6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1\" (UID: \"6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1\") " Dec 09 12:29:55 crc kubenswrapper[5002]: I1209 12:29:55.814562 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1-kube-api-access-pcv4b" (OuterVolumeSpecName: "kube-api-access-pcv4b") pod "6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1" (UID: "6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1"). InnerVolumeSpecName "kube-api-access-pcv4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:29:55 crc kubenswrapper[5002]: I1209 12:29:55.821938 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1-config-data" (OuterVolumeSpecName: "config-data") pod "6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1" (UID: "6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:29:55 crc kubenswrapper[5002]: I1209 12:29:55.891350 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 12:29:55 crc kubenswrapper[5002]: I1209 12:29:55.891384 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcv4b\" (UniqueName: \"kubernetes.io/projected/6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1-kube-api-access-pcv4b\") on node \"crc\" DevicePath \"\"" Dec 09 12:29:55 crc kubenswrapper[5002]: I1209 12:29:55.891668 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1" (UID: "6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:29:55 crc kubenswrapper[5002]: I1209 12:29:55.998280 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.017332 5002 generic.go:334] "Generic (PLEG): container finished" podID="dd771216-04c2-4469-9146-6a732a36843d" containerID="458af308bf65a903ca6da8a820f6f3734105a18e2c74e18e70f8c34b16c61eaf" exitCode=0 Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.017413 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dd771216-04c2-4469-9146-6a732a36843d","Type":"ContainerDied","Data":"458af308bf65a903ca6da8a820f6f3734105a18e2c74e18e70f8c34b16c61eaf"} Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.023761 5002 generic.go:334] "Generic (PLEG): container finished" podID="6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1" containerID="5ebd010935b181e73099be1ce513a2ab84bb15b83e52724c0b3aac325675e772" exitCode=0 Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.023935 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.024009 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1","Type":"ContainerDied","Data":"5ebd010935b181e73099be1ce513a2ab84bb15b83e52724c0b3aac325675e772"} Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.024048 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1","Type":"ContainerDied","Data":"9ef541577938a089596ea3b9f0db2391e229f83220bd4d1ecdefbc0450b7c9bb"} Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.024070 5002 scope.go:117] "RemoveContainer" containerID="5ebd010935b181e73099be1ce513a2ab84bb15b83e52724c0b3aac325675e772" Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.039103 5002 generic.go:334] "Generic (PLEG): container finished" podID="c46fe918-c220-4c57-812e-7a085b199362" containerID="79506c16e9a3624e2e56c4d1192962fd11abf39bba5efc9e0da33fae1bea68a5" exitCode=0 Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.039198 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c46fe918-c220-4c57-812e-7a085b199362","Type":"ContainerDied","Data":"79506c16e9a3624e2e56c4d1192962fd11abf39bba5efc9e0da33fae1bea68a5"} Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.088436 5002 scope.go:117] "RemoveContainer" containerID="5ebd010935b181e73099be1ce513a2ab84bb15b83e52724c0b3aac325675e772" Dec 09 12:29:56 crc kubenswrapper[5002]: E1209 12:29:56.089752 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ebd010935b181e73099be1ce513a2ab84bb15b83e52724c0b3aac325675e772\": container with ID starting with 5ebd010935b181e73099be1ce513a2ab84bb15b83e52724c0b3aac325675e772 not found: ID does not exist" containerID="5ebd010935b181e73099be1ce513a2ab84bb15b83e52724c0b3aac325675e772" Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.089790 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ebd010935b181e73099be1ce513a2ab84bb15b83e52724c0b3aac325675e772"} err="failed to get container status \"5ebd010935b181e73099be1ce513a2ab84bb15b83e52724c0b3aac325675e772\": rpc error: code = NotFound desc = could not find container \"5ebd010935b181e73099be1ce513a2ab84bb15b83e52724c0b3aac325675e772\": container with ID starting with 5ebd010935b181e73099be1ce513a2ab84bb15b83e52724c0b3aac325675e772 not found: ID does not exist" Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.101762 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.101911 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.114677 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 09 12:29:56 crc kubenswrapper[5002]: E1209 12:29:56.115294 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1" containerName="nova-cell0-conductor-conductor" Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.115314 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1" containerName="nova-cell0-conductor-conductor" Dec 09 
12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.115540 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1" containerName="nova-cell0-conductor-conductor" Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.116390 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.118499 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.125880 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.204445 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e16a9875-2e67-4fd1-aad1-4e56d5d8f460-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e16a9875-2e67-4fd1-aad1-4e56d5d8f460\") " pod="openstack/nova-cell0-conductor-0" Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.205184 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e16a9875-2e67-4fd1-aad1-4e56d5d8f460-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e16a9875-2e67-4fd1-aad1-4e56d5d8f460\") " pod="openstack/nova-cell0-conductor-0" Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.205424 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bqn4\" (UniqueName: \"kubernetes.io/projected/e16a9875-2e67-4fd1-aad1-4e56d5d8f460-kube-api-access-8bqn4\") pod \"nova-cell0-conductor-0\" (UID: \"e16a9875-2e67-4fd1-aad1-4e56d5d8f460\") " pod="openstack/nova-cell0-conductor-0" Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.307157 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bqn4\" (UniqueName: \"kubernetes.io/projected/e16a9875-2e67-4fd1-aad1-4e56d5d8f460-kube-api-access-8bqn4\") pod \"nova-cell0-conductor-0\" (UID: \"e16a9875-2e67-4fd1-aad1-4e56d5d8f460\") " pod="openstack/nova-cell0-conductor-0" Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.307637 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e16a9875-2e67-4fd1-aad1-4e56d5d8f460-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e16a9875-2e67-4fd1-aad1-4e56d5d8f460\") " pod="openstack/nova-cell0-conductor-0" Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.308007 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e16a9875-2e67-4fd1-aad1-4e56d5d8f460-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e16a9875-2e67-4fd1-aad1-4e56d5d8f460\") " pod="openstack/nova-cell0-conductor-0" Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.321208 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww"] Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.322858 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww" Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.333675 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.333970 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.333747 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.334379 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.335031 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.335248 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ngftr" Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.340849 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.356860 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww"] Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.411637 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/e448dd02-1116-4b33-9298-7053138072e9-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww\" (UID: \"e448dd02-1116-4b33-9298-7053138072e9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww" Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.411729 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/e448dd02-1116-4b33-9298-7053138072e9-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww\" (UID: \"e448dd02-1116-4b33-9298-7053138072e9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww" Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.411909 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e448dd02-1116-4b33-9298-7053138072e9-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww\" (UID: \"e448dd02-1116-4b33-9298-7053138072e9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww" Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.411980 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e448dd02-1116-4b33-9298-7053138072e9-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww\" (UID: \"e448dd02-1116-4b33-9298-7053138072e9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww" Dec 09 12:29:56 crc kubenswrapper[5002]: 
I1209 12:29:56.412089 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms7km\" (UniqueName: \"kubernetes.io/projected/e448dd02-1116-4b33-9298-7053138072e9-kube-api-access-ms7km\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww\" (UID: \"e448dd02-1116-4b33-9298-7053138072e9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww"
Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.412159 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e448dd02-1116-4b33-9298-7053138072e9-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww\" (UID: \"e448dd02-1116-4b33-9298-7053138072e9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww"
Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.412460 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e448dd02-1116-4b33-9298-7053138072e9-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww\" (UID: \"e448dd02-1116-4b33-9298-7053138072e9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww"
Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.412555 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e448dd02-1116-4b33-9298-7053138072e9-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww\" (UID: \"e448dd02-1116-4b33-9298-7053138072e9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww"
Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.412626 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e448dd02-1116-4b33-9298-7053138072e9-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww\" (UID: \"e448dd02-1116-4b33-9298-7053138072e9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww"
Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.412658 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e448dd02-1116-4b33-9298-7053138072e9-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww\" (UID: \"e448dd02-1116-4b33-9298-7053138072e9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww"
Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.412733 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e448dd02-1116-4b33-9298-7053138072e9-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww\" (UID: \"e448dd02-1116-4b33-9298-7053138072e9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww"
Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.515243 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e448dd02-1116-4b33-9298-7053138072e9-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww\" (UID: \"e448dd02-1116-4b33-9298-7053138072e9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww"
Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.515429 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e448dd02-1116-4b33-9298-7053138072e9-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww\" (UID: \"e448dd02-1116-4b33-9298-7053138072e9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww"
Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.515938 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms7km\" (UniqueName: \"kubernetes.io/projected/e448dd02-1116-4b33-9298-7053138072e9-kube-api-access-ms7km\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww\" (UID: \"e448dd02-1116-4b33-9298-7053138072e9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww"
Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.516015 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e448dd02-1116-4b33-9298-7053138072e9-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww\" (UID: \"e448dd02-1116-4b33-9298-7053138072e9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww"
Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.516113 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e448dd02-1116-4b33-9298-7053138072e9-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww\" (UID: \"e448dd02-1116-4b33-9298-7053138072e9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww"
Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.516195 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e448dd02-1116-4b33-9298-7053138072e9-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww\" (UID: \"e448dd02-1116-4b33-9298-7053138072e9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww"
Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.516257 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e448dd02-1116-4b33-9298-7053138072e9-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww\" (UID: \"e448dd02-1116-4b33-9298-7053138072e9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww"
Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.516281 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e448dd02-1116-4b33-9298-7053138072e9-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww\" (UID: \"e448dd02-1116-4b33-9298-7053138072e9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww"
Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.516316 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e448dd02-1116-4b33-9298-7053138072e9-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww\" (UID: \"e448dd02-1116-4b33-9298-7053138072e9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww"
Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.516422 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/e448dd02-1116-4b33-9298-7053138072e9-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww\" (UID: \"e448dd02-1116-4b33-9298-7053138072e9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww"
Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.516488 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/e448dd02-1116-4b33-9298-7053138072e9-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww\" (UID: \"e448dd02-1116-4b33-9298-7053138072e9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww"
Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.517800 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/e448dd02-1116-4b33-9298-7053138072e9-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww\" (UID: \"e448dd02-1116-4b33-9298-7053138072e9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww"
Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.518283 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/e448dd02-1116-4b33-9298-7053138072e9-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww\" (UID: \"e448dd02-1116-4b33-9298-7053138072e9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww"
Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.741846 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bqn4\" (UniqueName: \"kubernetes.io/projected/e16a9875-2e67-4fd1-aad1-4e56d5d8f460-kube-api-access-8bqn4\") pod \"nova-cell0-conductor-0\" (UID: \"e16a9875-2e67-4fd1-aad1-4e56d5d8f460\") " pod="openstack/nova-cell0-conductor-0"
Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.742380 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e16a9875-2e67-4fd1-aad1-4e56d5d8f460-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e16a9875-2e67-4fd1-aad1-4e56d5d8f460\") " pod="openstack/nova-cell0-conductor-0"
Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.746579 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e16a9875-2e67-4fd1-aad1-4e56d5d8f460-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e16a9875-2e67-4fd1-aad1-4e56d5d8f460\") " pod="openstack/nova-cell0-conductor-0"
Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.759458 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.763090 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e448dd02-1116-4b33-9298-7053138072e9-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww\" (UID: \"e448dd02-1116-4b33-9298-7053138072e9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww"
Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.763201 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e448dd02-1116-4b33-9298-7053138072e9-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww\" (UID: \"e448dd02-1116-4b33-9298-7053138072e9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww"
Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.765463 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e448dd02-1116-4b33-9298-7053138072e9-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww\" (UID: \"e448dd02-1116-4b33-9298-7053138072e9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww"
Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.766087 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e448dd02-1116-4b33-9298-7053138072e9-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww\" (UID: \"e448dd02-1116-4b33-9298-7053138072e9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww"
Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.768365 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e448dd02-1116-4b33-9298-7053138072e9-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww\" (UID: \"e448dd02-1116-4b33-9298-7053138072e9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww"
Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.768631 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e448dd02-1116-4b33-9298-7053138072e9-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww\" (UID: \"e448dd02-1116-4b33-9298-7053138072e9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww"
Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.769700 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e448dd02-1116-4b33-9298-7053138072e9-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww\" (UID: \"e448dd02-1116-4b33-9298-7053138072e9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww"
Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.771464 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e448dd02-1116-4b33-9298-7053138072e9-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww\" (UID: \"e448dd02-1116-4b33-9298-7053138072e9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww"
Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.772673 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms7km\" (UniqueName: \"kubernetes.io/projected/e448dd02-1116-4b33-9298-7053138072e9-kube-api-access-ms7km\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww\" (UID: \"e448dd02-1116-4b33-9298-7053138072e9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww"
Dec 09 12:29:56 crc kubenswrapper[5002]: E1209 12:29:56.957731 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c8450d3a608834ab9f31afd013a7fb75dc5024672797bf598790495aec53f4ac is running failed: container process not found" containerID="c8450d3a608834ab9f31afd013a7fb75dc5024672797bf598790495aec53f4ac" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Dec 09 12:29:56 crc kubenswrapper[5002]: E1209 12:29:56.958229 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c8450d3a608834ab9f31afd013a7fb75dc5024672797bf598790495aec53f4ac is running failed: container process not found" containerID="c8450d3a608834ab9f31afd013a7fb75dc5024672797bf598790495aec53f4ac" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Dec 09 12:29:56 crc kubenswrapper[5002]: E1209 12:29:56.958485 5002 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c8450d3a608834ab9f31afd013a7fb75dc5024672797bf598790495aec53f4ac is running failed: container process not found" containerID="c8450d3a608834ab9f31afd013a7fb75dc5024672797bf598790495aec53f4ac" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Dec 09 12:29:56 crc kubenswrapper[5002]: E1209 12:29:56.958523 5002 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c8450d3a608834ab9f31afd013a7fb75dc5024672797bf598790495aec53f4ac is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8" containerName="nova-scheduler-scheduler"
Dec 09 12:29:56 crc kubenswrapper[5002]: I1209 12:29:56.982758 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww"
Dec 09 12:29:57 crc kubenswrapper[5002]: I1209 12:29:57.054519 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c46fe918-c220-4c57-812e-7a085b199362","Type":"ContainerDied","Data":"7b92fffe4b161f51da5de6be9cf521b7451e8261d8afedf2f90ca0da8e3a0f12"}
Dec 09 12:29:57 crc kubenswrapper[5002]: I1209 12:29:57.054924 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b92fffe4b161f51da5de6be9cf521b7451e8261d8afedf2f90ca0da8e3a0f12"
Dec 09 12:29:57 crc kubenswrapper[5002]: I1209 12:29:57.057328 5002 generic.go:334] "Generic (PLEG): container finished" podID="4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8" containerID="c8450d3a608834ab9f31afd013a7fb75dc5024672797bf598790495aec53f4ac" exitCode=0
Dec 09 12:29:57 crc kubenswrapper[5002]: I1209 12:29:57.057415 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8","Type":"ContainerDied","Data":"c8450d3a608834ab9f31afd013a7fb75dc5024672797bf598790495aec53f4ac"}
Dec 09 12:29:57 crc kubenswrapper[5002]: I1209 12:29:57.066130 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dd771216-04c2-4469-9146-6a732a36843d","Type":"ContainerDied","Data":"061bcacb81e49f1850ad96ad24125d6c2c4b7345885f80879aece99ed3c55362"}
Dec 09 12:29:57 crc kubenswrapper[5002]: I1209 12:29:57.066193 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="061bcacb81e49f1850ad96ad24125d6c2c4b7345885f80879aece99ed3c55362"
Dec 09 12:29:57 crc kubenswrapper[5002]: I1209 12:29:57.151224 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 09 12:29:57 crc kubenswrapper[5002]: I1209 12:29:57.157861 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 09 12:29:57 crc kubenswrapper[5002]: I1209 12:29:57.234688 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd771216-04c2-4469-9146-6a732a36843d-combined-ca-bundle\") pod \"dd771216-04c2-4469-9146-6a732a36843d\" (UID: \"dd771216-04c2-4469-9146-6a732a36843d\") "
Dec 09 12:29:57 crc kubenswrapper[5002]: I1209 12:29:57.235180 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd771216-04c2-4469-9146-6a732a36843d-config-data\") pod \"dd771216-04c2-4469-9146-6a732a36843d\" (UID: \"dd771216-04c2-4469-9146-6a732a36843d\") "
Dec 09 12:29:57 crc kubenswrapper[5002]: I1209 12:29:57.235562 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hslx\" (UniqueName: \"kubernetes.io/projected/c46fe918-c220-4c57-812e-7a085b199362-kube-api-access-5hslx\") pod \"c46fe918-c220-4c57-812e-7a085b199362\" (UID: \"c46fe918-c220-4c57-812e-7a085b199362\") "
Dec 09 12:29:57 crc kubenswrapper[5002]: I1209 12:29:57.235676 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd771216-04c2-4469-9146-6a732a36843d-logs\") pod \"dd771216-04c2-4469-9146-6a732a36843d\" (UID: \"dd771216-04c2-4469-9146-6a732a36843d\") "
Dec 09 12:29:57 crc kubenswrapper[5002]: I1209 12:29:57.235874 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c46fe918-c220-4c57-812e-7a085b199362-config-data\") pod \"c46fe918-c220-4c57-812e-7a085b199362\" (UID: \"c46fe918-c220-4c57-812e-7a085b199362\") "
Dec 09 12:29:57 crc kubenswrapper[5002]: I1209 12:29:57.236003 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zq42m\" (UniqueName: \"kubernetes.io/projected/dd771216-04c2-4469-9146-6a732a36843d-kube-api-access-zq42m\") pod \"dd771216-04c2-4469-9146-6a732a36843d\" (UID: \"dd771216-04c2-4469-9146-6a732a36843d\") "
Dec 09 12:29:57 crc kubenswrapper[5002]: I1209 12:29:57.236129 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c46fe918-c220-4c57-812e-7a085b199362-logs\") pod \"c46fe918-c220-4c57-812e-7a085b199362\" (UID: \"c46fe918-c220-4c57-812e-7a085b199362\") "
Dec 09 12:29:57 crc kubenswrapper[5002]: I1209 12:29:57.236337 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c46fe918-c220-4c57-812e-7a085b199362-combined-ca-bundle\") pod \"c46fe918-c220-4c57-812e-7a085b199362\" (UID: \"c46fe918-c220-4c57-812e-7a085b199362\") "
Dec 09 12:29:57 crc kubenswrapper[5002]: I1209 12:29:57.241978 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd771216-04c2-4469-9146-6a732a36843d-logs" (OuterVolumeSpecName: "logs") pod "dd771216-04c2-4469-9146-6a732a36843d" (UID: "dd771216-04c2-4469-9146-6a732a36843d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 12:29:57 crc kubenswrapper[5002]: I1209 12:29:57.242897 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c46fe918-c220-4c57-812e-7a085b199362-logs" (OuterVolumeSpecName: "logs") pod "c46fe918-c220-4c57-812e-7a085b199362" (UID: "c46fe918-c220-4c57-812e-7a085b199362"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 12:29:57 crc kubenswrapper[5002]: I1209 12:29:57.245150 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c46fe918-c220-4c57-812e-7a085b199362-kube-api-access-5hslx" (OuterVolumeSpecName: "kube-api-access-5hslx") pod "c46fe918-c220-4c57-812e-7a085b199362" (UID: "c46fe918-c220-4c57-812e-7a085b199362"). InnerVolumeSpecName "kube-api-access-5hslx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 12:29:57 crc kubenswrapper[5002]: I1209 12:29:57.250004 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd771216-04c2-4469-9146-6a732a36843d-kube-api-access-zq42m" (OuterVolumeSpecName: "kube-api-access-zq42m") pod "dd771216-04c2-4469-9146-6a732a36843d" (UID: "dd771216-04c2-4469-9146-6a732a36843d"). InnerVolumeSpecName "kube-api-access-zq42m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 12:29:57 crc kubenswrapper[5002]: I1209 12:29:57.274543 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c46fe918-c220-4c57-812e-7a085b199362-config-data" (OuterVolumeSpecName: "config-data") pod "c46fe918-c220-4c57-812e-7a085b199362" (UID: "c46fe918-c220-4c57-812e-7a085b199362"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 12:29:57 crc kubenswrapper[5002]: I1209 12:29:57.277290 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd771216-04c2-4469-9146-6a732a36843d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd771216-04c2-4469-9146-6a732a36843d" (UID: "dd771216-04c2-4469-9146-6a732a36843d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 12:29:57 crc kubenswrapper[5002]: I1209 12:29:57.289982 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd771216-04c2-4469-9146-6a732a36843d-config-data" (OuterVolumeSpecName: "config-data") pod "dd771216-04c2-4469-9146-6a732a36843d" (UID: "dd771216-04c2-4469-9146-6a732a36843d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 12:29:57 crc kubenswrapper[5002]: I1209 12:29:57.302073 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c46fe918-c220-4c57-812e-7a085b199362-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c46fe918-c220-4c57-812e-7a085b199362" (UID: "c46fe918-c220-4c57-812e-7a085b199362"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 12:29:57 crc kubenswrapper[5002]: I1209 12:29:57.317840 5002 scope.go:117] "RemoveContainer" containerID="0ae179bd45135554ae40623e3f8436aebb71e338b5e536f221e0e5bb9258cbbc"
Dec 09 12:29:57 crc kubenswrapper[5002]: I1209 12:29:57.341631 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd771216-04c2-4469-9146-6a732a36843d-config-data\") on node \"crc\" DevicePath \"\""
Dec 09 12:29:57 crc kubenswrapper[5002]: I1209 12:29:57.341669 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hslx\" (UniqueName: \"kubernetes.io/projected/c46fe918-c220-4c57-812e-7a085b199362-kube-api-access-5hslx\") on node \"crc\" DevicePath \"\""
Dec 09 12:29:57 crc kubenswrapper[5002]: I1209 12:29:57.341687 5002 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd771216-04c2-4469-9146-6a732a36843d-logs\") on node \"crc\" DevicePath \"\""
Dec 09 12:29:57 crc kubenswrapper[5002]: I1209 12:29:57.341697 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c46fe918-c220-4c57-812e-7a085b199362-config-data\") on node \"crc\" DevicePath \"\""
Dec 09 12:29:57 crc kubenswrapper[5002]: I1209 12:29:57.341708 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zq42m\" (UniqueName: \"kubernetes.io/projected/dd771216-04c2-4469-9146-6a732a36843d-kube-api-access-zq42m\") on node \"crc\" DevicePath \"\""
Dec 09 12:29:57 crc kubenswrapper[5002]: I1209 12:29:57.341720 5002 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c46fe918-c220-4c57-812e-7a085b199362-logs\") on node \"crc\" DevicePath \"\""
Dec 09 12:29:57 crc kubenswrapper[5002]: I1209 12:29:57.341730 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c46fe918-c220-4c57-812e-7a085b199362-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 09 12:29:57 crc kubenswrapper[5002]: I1209 12:29:57.341743 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd771216-04c2-4469-9146-6a732a36843d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 09 12:29:57 crc kubenswrapper[5002]: I1209 12:29:57.422017 5002 scope.go:117] "RemoveContainer" containerID="458af308bf65a903ca6da8a820f6f3734105a18e2c74e18e70f8c34b16c61eaf"
Dec 09 12:29:57 crc kubenswrapper[5002]: I1209 12:29:57.452450 5002 scope.go:117] "RemoveContainer" containerID="a5f4153c494e051ca3a95763a488b6d59775fcea18f2eef4a7f65e480a887a30"
Dec 09 12:29:57 crc kubenswrapper[5002]: I1209 12:29:57.476140 5002 scope.go:117] "RemoveContainer" containerID="79506c16e9a3624e2e56c4d1192962fd11abf39bba5efc9e0da33fae1bea68a5"
Dec 09 12:29:57 crc kubenswrapper[5002]: I1209 12:29:57.608314 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 09 12:29:57 crc kubenswrapper[5002]: I1209 12:29:57.669258 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Dec 09 12:29:57 crc kubenswrapper[5002]: I1209 12:29:57.751935 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8-combined-ca-bundle\") pod \"4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8\" (UID: \"4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8\") "
Dec 09 12:29:57 crc kubenswrapper[5002]: I1209 12:29:57.752636 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfz9l\" (UniqueName: \"kubernetes.io/projected/4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8-kube-api-access-rfz9l\") pod \"4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8\" (UID: \"4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8\") "
Dec 09 12:29:57 crc kubenswrapper[5002]: I1209 12:29:57.752686 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8-config-data\") pod \"4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8\" (UID: \"4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8\") "
Dec 09 12:29:57 crc kubenswrapper[5002]: I1209 12:29:57.770548 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8-kube-api-access-rfz9l" (OuterVolumeSpecName: "kube-api-access-rfz9l") pod "4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8" (UID: "4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8"). InnerVolumeSpecName "kube-api-access-rfz9l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 12:29:57 crc kubenswrapper[5002]: I1209 12:29:57.791261 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8-config-data" (OuterVolumeSpecName: "config-data") pod "4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8" (UID: "4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 12:29:57 crc kubenswrapper[5002]: I1209 12:29:57.795301 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8" (UID: "4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 12:29:57 crc kubenswrapper[5002]: I1209 12:29:57.796631 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww"]
Dec 09 12:29:57 crc kubenswrapper[5002]: I1209 12:29:57.855800 5002 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 09 12:29:57 crc kubenswrapper[5002]: I1209 12:29:57.855868 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfz9l\" (UniqueName: \"kubernetes.io/projected/4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8-kube-api-access-rfz9l\") on node \"crc\" DevicePath \"\""
Dec 09 12:29:57 crc kubenswrapper[5002]: I1209 12:29:57.855881 5002 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8-config-data\") on node \"crc\" DevicePath \"\""
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.080959 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.083848 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.083876 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.097315 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1" path="/var/lib/kubelet/pods/6f9f69d8-c33e-4a92-9a10-e4f4d6dde7b1/volumes"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.106695 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.106743 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww" event={"ID":"e448dd02-1116-4b33-9298-7053138072e9","Type":"ContainerStarted","Data":"10adb23d808bbe5f3e880bd75b92702ae2e705a9161ab1561cdb0d9aa26be9a4"}
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.106784 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8","Type":"ContainerDied","Data":"b645a19032d129bdc850b33463612cf05f6314c0748ac5c46b79b9db8ec5d3ba"}
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.106854 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"e16a9875-2e67-4fd1-aad1-4e56d5d8f460","Type":"ContainerStarted","Data":"c575d708a573e4c1faed4310c6dab4b11a0f62727d7548563a860caafe6f527e"}
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.106871 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"e16a9875-2e67-4fd1-aad1-4e56d5d8f460","Type":"ContainerStarted","Data":"e669f87e85d1af1af44cedc344941155fae854fa7b19b2da880a89a8658ecf30"}
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.106897 5002 scope.go:117] "RemoveContainer" containerID="c8450d3a608834ab9f31afd013a7fb75dc5024672797bf598790495aec53f4ac"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.198568 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.215737 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.267716 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Dec 09 12:29:58 crc kubenswrapper[5002]: E1209 12:29:58.269280 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c46fe918-c220-4c57-812e-7a085b199362" containerName="nova-api-log"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.269315 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="c46fe918-c220-4c57-812e-7a085b199362" containerName="nova-api-log"
Dec 09 12:29:58 crc kubenswrapper[5002]: E1209 12:29:58.269490 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd771216-04c2-4469-9146-6a732a36843d" containerName="nova-metadata-metadata"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.269511 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd771216-04c2-4469-9146-6a732a36843d" containerName="nova-metadata-metadata"
Dec 09 12:29:58 crc kubenswrapper[5002]: E1209 12:29:58.269539 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c46fe918-c220-4c57-812e-7a085b199362" containerName="nova-api-api"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.269548 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="c46fe918-c220-4c57-812e-7a085b199362" containerName="nova-api-api"
Dec 09 12:29:58 crc kubenswrapper[5002]: E1209 12:29:58.269567 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd771216-04c2-4469-9146-6a732a36843d" containerName="nova-metadata-log"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.269576 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd771216-04c2-4469-9146-6a732a36843d" containerName="nova-metadata-log"
Dec 09 12:29:58 crc kubenswrapper[5002]: E1209 12:29:58.269595 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8" containerName="nova-scheduler-scheduler"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.269605 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8" containerName="nova-scheduler-scheduler"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.270184 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="c46fe918-c220-4c57-812e-7a085b199362" containerName="nova-api-api"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.270221 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="c46fe918-c220-4c57-812e-7a085b199362" containerName="nova-api-log"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.270246 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8" containerName="nova-scheduler-scheduler"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.270267 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd771216-04c2-4469-9146-6a732a36843d" containerName="nova-metadata-log"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.270288 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd771216-04c2-4469-9146-6a732a36843d" containerName="nova-metadata-metadata"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.272408 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.290953 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.290923923 podStartE2EDuration="2.290923923s" podCreationTimestamp="2025-12-09 12:29:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 12:29:58.179244991 +0000 UTC m=+8930.571296072" watchObservedRunningTime="2025-12-09 12:29:58.290923923 +0000 UTC m=+8930.682975004"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.296333 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.341287 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.354601 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.368992 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.377103 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8zt6\" (UniqueName: \"kubernetes.io/projected/94c16746-775c-4ee9-8cdc-898ff655cd41-kube-api-access-v8zt6\") pod \"nova-metadata-0\" (UID: \"94c16746-775c-4ee9-8cdc-898ff655cd41\") " pod="openstack/nova-metadata-0"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.377212 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94c16746-775c-4ee9-8cdc-898ff655cd41-logs\") pod \"nova-metadata-0\" (UID: \"94c16746-775c-4ee9-8cdc-898ff655cd41\") " pod="openstack/nova-metadata-0"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.377267 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94c16746-775c-4ee9-8cdc-898ff655cd41-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"94c16746-775c-4ee9-8cdc-898ff655cd41\") " pod="openstack/nova-metadata-0"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.377304 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94c16746-775c-4ee9-8cdc-898ff655cd41-config-data\") pod \"nova-metadata-0\" (UID: \"94c16746-775c-4ee9-8cdc-898ff655cd41\") " pod="openstack/nova-metadata-0"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.381294 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.394582 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.406967 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.408678 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.411258 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.424672 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.438527 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.441166 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.443512 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.454199 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.480163 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfnc7\" (UniqueName: \"kubernetes.io/projected/e0fbb27e-b807-4823-978f-0ff21020d012-kube-api-access-nfnc7\") pod \"nova-scheduler-0\" (UID: \"e0fbb27e-b807-4823-978f-0ff21020d012\") " pod="openstack/nova-scheduler-0"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.480435 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8zt6\" (UniqueName: \"kubernetes.io/projected/94c16746-775c-4ee9-8cdc-898ff655cd41-kube-api-access-v8zt6\") pod \"nova-metadata-0\" (UID: \"94c16746-775c-4ee9-8cdc-898ff655cd41\") " pod="openstack/nova-metadata-0"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.480630 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94c16746-775c-4ee9-8cdc-898ff655cd41-logs\") pod \"nova-metadata-0\" (UID: \"94c16746-775c-4ee9-8cdc-898ff655cd41\") " pod="openstack/nova-metadata-0"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.480675 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0fbb27e-b807-4823-978f-0ff21020d012-config-data\") pod \"nova-scheduler-0\" (UID: \"e0fbb27e-b807-4823-978f-0ff21020d012\") " pod="openstack/nova-scheduler-0"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.480705 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94c16746-775c-4ee9-8cdc-898ff655cd41-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"94c16746-775c-4ee9-8cdc-898ff655cd41\") " pod="openstack/nova-metadata-0"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.480743 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0fbb27e-b807-4823-978f-0ff21020d012-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e0fbb27e-b807-4823-978f-0ff21020d012\") " pod="openstack/nova-scheduler-0"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.480779 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94c16746-775c-4ee9-8cdc-898ff655cd41-config-data\") pod \"nova-metadata-0\" (UID: \"94c16746-775c-4ee9-8cdc-898ff655cd41\") " pod="openstack/nova-metadata-0"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.481583 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94c16746-775c-4ee9-8cdc-898ff655cd41-logs\") pod \"nova-metadata-0\" (UID: \"94c16746-775c-4ee9-8cdc-898ff655cd41\") " pod="openstack/nova-metadata-0"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.583172 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17dde858-c91e-499f-be5e-1e859b11c81f-config-data\") pod \"nova-api-0\" (UID: \"17dde858-c91e-499f-be5e-1e859b11c81f\") " pod="openstack/nova-api-0"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.583259 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0fbb27e-b807-4823-978f-0ff21020d012-config-data\") pod \"nova-scheduler-0\" (UID: \"e0fbb27e-b807-4823-978f-0ff21020d012\") " pod="openstack/nova-scheduler-0"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.583307 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0fbb27e-b807-4823-978f-0ff21020d012-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e0fbb27e-b807-4823-978f-0ff21020d012\") " pod="openstack/nova-scheduler-0"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.583418 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17dde858-c91e-499f-be5e-1e859b11c81f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"17dde858-c91e-499f-be5e-1e859b11c81f\") " pod="openstack/nova-api-0"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.583493 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfnc7\" (UniqueName: \"kubernetes.io/projected/e0fbb27e-b807-4823-978f-0ff21020d012-kube-api-access-nfnc7\") pod \"nova-scheduler-0\" (UID: \"e0fbb27e-b807-4823-978f-0ff21020d012\") " pod="openstack/nova-scheduler-0"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.583524 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17dde858-c91e-499f-be5e-1e859b11c81f-logs\") pod \"nova-api-0\" (UID: \"17dde858-c91e-499f-be5e-1e859b11c81f\") " pod="openstack/nova-api-0"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.583563 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2ttq\" (UniqueName: \"kubernetes.io/projected/17dde858-c91e-499f-be5e-1e859b11c81f-kube-api-access-z2ttq\") pod \"nova-api-0\" (UID: \"17dde858-c91e-499f-be5e-1e859b11c81f\") " pod="openstack/nova-api-0"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.685663 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2ttq\" (UniqueName: \"kubernetes.io/projected/17dde858-c91e-499f-be5e-1e859b11c81f-kube-api-access-z2ttq\") pod \"nova-api-0\" (UID: \"17dde858-c91e-499f-be5e-1e859b11c81f\") " pod="openstack/nova-api-0"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.685798 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17dde858-c91e-499f-be5e-1e859b11c81f-config-data\") pod \"nova-api-0\" (UID: \"17dde858-c91e-499f-be5e-1e859b11c81f\") " pod="openstack/nova-api-0"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.686090 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17dde858-c91e-499f-be5e-1e859b11c81f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"17dde858-c91e-499f-be5e-1e859b11c81f\") " pod="openstack/nova-api-0"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.686235 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17dde858-c91e-499f-be5e-1e859b11c81f-logs\") pod \"nova-api-0\" (UID: \"17dde858-c91e-499f-be5e-1e859b11c81f\") " pod="openstack/nova-api-0"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.687061 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17dde858-c91e-499f-be5e-1e859b11c81f-logs\") pod \"nova-api-0\" (UID: \"17dde858-c91e-499f-be5e-1e859b11c81f\") " pod="openstack/nova-api-0"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.840886 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94c16746-775c-4ee9-8cdc-898ff655cd41-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"94c16746-775c-4ee9-8cdc-898ff655cd41\") " pod="openstack/nova-metadata-0"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.841919 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94c16746-775c-4ee9-8cdc-898ff655cd41-config-data\") pod \"nova-metadata-0\" (UID: \"94c16746-775c-4ee9-8cdc-898ff655cd41\") " pod="openstack/nova-metadata-0"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.842594 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8zt6\" (UniqueName: \"kubernetes.io/projected/94c16746-775c-4ee9-8cdc-898ff655cd41-kube-api-access-v8zt6\") pod \"nova-metadata-0\" (UID: \"94c16746-775c-4ee9-8cdc-898ff655cd41\") " pod="openstack/nova-metadata-0"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.843743 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0fbb27e-b807-4823-978f-0ff21020d012-config-data\") pod \"nova-scheduler-0\" (UID: \"e0fbb27e-b807-4823-978f-0ff21020d012\") " pod="openstack/nova-scheduler-0"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.852574 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0fbb27e-b807-4823-978f-0ff21020d012-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e0fbb27e-b807-4823-978f-0ff21020d012\") " pod="openstack/nova-scheduler-0"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.852950 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfnc7\" (UniqueName: \"kubernetes.io/projected/e0fbb27e-b807-4823-978f-0ff21020d012-kube-api-access-nfnc7\") pod \"nova-scheduler-0\" (UID: \"e0fbb27e-b807-4823-978f-0ff21020d012\") " pod="openstack/nova-scheduler-0"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.858370 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17dde858-c91e-499f-be5e-1e859b11c81f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"17dde858-c91e-499f-be5e-1e859b11c81f\") " pod="openstack/nova-api-0"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.859086 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17dde858-c91e-499f-be5e-1e859b11c81f-config-data\") pod \"nova-api-0\" (UID: \"17dde858-c91e-499f-be5e-1e859b11c81f\") " pod="openstack/nova-api-0"
Dec 09 12:29:58 crc kubenswrapper[5002]: I1209 12:29:58.890392 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2ttq\" (UniqueName: \"kubernetes.io/projected/17dde858-c91e-499f-be5e-1e859b11c81f-kube-api-access-z2ttq\") pod \"nova-api-0\" (UID: \"17dde858-c91e-499f-be5e-1e859b11c81f\") " pod="openstack/nova-api-0"
Dec 09 12:29:59 crc kubenswrapper[5002]: I1209 12:29:59.257766 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 09 12:29:59 crc kubenswrapper[5002]: I1209 12:29:59.409846 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 09 12:29:59 crc kubenswrapper[5002]: I1209 12:29:59.433965 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 09 12:29:59 crc kubenswrapper[5002]: W1209 12:29:59.774335 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94c16746_775c_4ee9_8cdc_898ff655cd41.slice/crio-8cda45c581bb897a670e092227bb8a4bc79010059791cb14dd40d5d22073ff0b WatchSource:0}: Error finding container 8cda45c581bb897a670e092227bb8a4bc79010059791cb14dd40d5d22073ff0b: Status 404 returned error can't find the container with id 8cda45c581bb897a670e092227bb8a4bc79010059791cb14dd40d5d22073ff0b
Dec 09 12:29:59 crc kubenswrapper[5002]: I1209 12:29:59.777546 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 09 12:30:00 crc kubenswrapper[5002]: I1209 12:30:00.021303 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 09 12:30:00 crc kubenswrapper[5002]: W1209 12:30:00.024442 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17dde858_c91e_499f_be5e_1e859b11c81f.slice/crio-2749e6015b8eeab3660dc1966161005b2c30e5b0496294a56754cc1b6a3c6f48 WatchSource:0}: Error finding container 2749e6015b8eeab3660dc1966161005b2c30e5b0496294a56754cc1b6a3c6f48: Status 404 returned error can't find the container with id 2749e6015b8eeab3660dc1966161005b2c30e5b0496294a56754cc1b6a3c6f48
Dec 09 12:30:00 crc kubenswrapper[5002]: I1209 12:30:00.088580 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8" path="/var/lib/kubelet/pods/4356bbcb-7ff8-4553-ac2e-bc86d2a9c1d8/volumes"
Dec 09 12:30:00 crc kubenswrapper[5002]: I1209 12:30:00.089518 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c46fe918-c220-4c57-812e-7a085b199362" path="/var/lib/kubelet/pods/c46fe918-c220-4c57-812e-7a085b199362/volumes"
Dec 09 12:30:00 crc kubenswrapper[5002]: I1209 12:30:00.090274 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd771216-04c2-4469-9146-6a732a36843d" path="/var/lib/kubelet/pods/dd771216-04c2-4469-9146-6a732a36843d/volumes"
Dec 09 12:30:00 crc kubenswrapper[5002]: I1209 12:30:00.134565 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 09 12:30:00 crc kubenswrapper[5002]: I1209 12:30:00.136609 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"17dde858-c91e-499f-be5e-1e859b11c81f","Type":"ContainerStarted","Data":"2749e6015b8eeab3660dc1966161005b2c30e5b0496294a56754cc1b6a3c6f48"}
Dec 09 12:30:00 crc kubenswrapper[5002]: I1209 12:30:00.144341 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"94c16746-775c-4ee9-8cdc-898ff655cd41","Type":"ContainerStarted","Data":"34a2b1aeda1eb825043f1fdb02725c9b7099a2a916ff84a2a0cee54c91bea7f1"}
Dec 09 12:30:00 crc kubenswrapper[5002]: I1209 12:30:00.144395 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"94c16746-775c-4ee9-8cdc-898ff655cd41","Type":"ContainerStarted","Data":"8cda45c581bb897a670e092227bb8a4bc79010059791cb14dd40d5d22073ff0b"}
Dec 09 12:30:00 crc kubenswrapper[5002]: I1209 12:30:00.149362 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421390-rvbcz"]
Dec 09 12:30:00 crc kubenswrapper[5002]: I1209 12:30:00.151997 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421390-rvbcz"
Dec 09 12:30:00 crc kubenswrapper[5002]: I1209 12:30:00.152630 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww" event={"ID":"e448dd02-1116-4b33-9298-7053138072e9","Type":"ContainerStarted","Data":"6a52117b925a89a617f2a80100ef70755aa70686ff4956c5b51d90fbf4903da6"}
Dec 09 12:30:00 crc kubenswrapper[5002]: I1209 12:30:00.160308 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 09 12:30:00 crc kubenswrapper[5002]: I1209 12:30:00.160535 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 09 12:30:00 crc kubenswrapper[5002]: I1209 12:30:00.162240 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421390-rvbcz"]
Dec 09 12:30:00 crc kubenswrapper[5002]: I1209 12:30:00.222547 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww" podStartSLOduration=3.650545343 podStartE2EDuration="4.222523636s" podCreationTimestamp="2025-12-09 12:29:56 +0000 UTC" firstStartedPulling="2025-12-09 12:29:57.8071256 +0000 UTC m=+8930.199176681" lastFinishedPulling="2025-12-09 12:29:58.379103893 +0000 UTC m=+8930.771154974" observedRunningTime="2025-12-09 12:30:00.208935911 +0000 UTC m=+8932.600987002" watchObservedRunningTime="2025-12-09 12:30:00.222523636 +0000 UTC m=+8932.614574717"
Dec 09 12:30:00 crc kubenswrapper[5002]: I1209 12:30:00.231391 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2l85\" (UniqueName: \"kubernetes.io/projected/d97cfde0-d55c-4e80-87fa-3d94006e25d1-kube-api-access-s2l85\") pod \"collect-profiles-29421390-rvbcz\" (UID: \"d97cfde0-d55c-4e80-87fa-3d94006e25d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421390-rvbcz"
Dec 09 12:30:00 crc kubenswrapper[5002]: I1209 12:30:00.231788 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d97cfde0-d55c-4e80-87fa-3d94006e25d1-config-volume\") pod \"collect-profiles-29421390-rvbcz\" (UID: \"d97cfde0-d55c-4e80-87fa-3d94006e25d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421390-rvbcz"
Dec 09 12:30:00 crc kubenswrapper[5002]: I1209 12:30:00.231957 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d97cfde0-d55c-4e80-87fa-3d94006e25d1-secret-volume\") pod \"collect-profiles-29421390-rvbcz\" (UID: \"d97cfde0-d55c-4e80-87fa-3d94006e25d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421390-rvbcz"
Dec 09 12:30:00 crc kubenswrapper[5002]: I1209 12:30:00.333664 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2l85\" (UniqueName: \"kubernetes.io/projected/d97cfde0-d55c-4e80-87fa-3d94006e25d1-kube-api-access-s2l85\") pod \"collect-profiles-29421390-rvbcz\" (UID: \"d97cfde0-d55c-4e80-87fa-3d94006e25d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421390-rvbcz"
Dec 09 12:30:00 crc kubenswrapper[5002]: I1209 12:30:00.335457 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d97cfde0-d55c-4e80-87fa-3d94006e25d1-config-volume\") pod \"collect-profiles-29421390-rvbcz\" (UID: \"d97cfde0-d55c-4e80-87fa-3d94006e25d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421390-rvbcz"
Dec 09 12:30:00 crc kubenswrapper[5002]: I1209 12:30:00.335663 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d97cfde0-d55c-4e80-87fa-3d94006e25d1-secret-volume\") pod \"collect-profiles-29421390-rvbcz\" (UID: \"d97cfde0-d55c-4e80-87fa-3d94006e25d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421390-rvbcz"
Dec 09 12:30:00 crc kubenswrapper[5002]: I1209 12:30:00.337573 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d97cfde0-d55c-4e80-87fa-3d94006e25d1-config-volume\") pod \"collect-profiles-29421390-rvbcz\" (UID: \"d97cfde0-d55c-4e80-87fa-3d94006e25d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421390-rvbcz"
Dec 09 12:30:00 crc kubenswrapper[5002]: I1209 12:30:00.346484 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d97cfde0-d55c-4e80-87fa-3d94006e25d1-secret-volume\") pod \"collect-profiles-29421390-rvbcz\" (UID: \"d97cfde0-d55c-4e80-87fa-3d94006e25d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421390-rvbcz"
Dec 09 12:30:00 crc kubenswrapper[5002]: I1209 12:30:00.360799 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2l85\" (UniqueName: \"kubernetes.io/projected/d97cfde0-d55c-4e80-87fa-3d94006e25d1-kube-api-access-s2l85\") pod \"collect-profiles-29421390-rvbcz\" (UID: \"d97cfde0-d55c-4e80-87fa-3d94006e25d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421390-rvbcz"
Dec 09 12:30:00 crc kubenswrapper[5002]: I1209 12:30:00.540890 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421390-rvbcz"
Dec 09 12:30:01 crc kubenswrapper[5002]: I1209 12:30:01.084520 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421390-rvbcz"]
Dec 09 12:30:01 crc kubenswrapper[5002]: W1209 12:30:01.107744 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd97cfde0_d55c_4e80_87fa_3d94006e25d1.slice/crio-6af452e1fb0ef3a08849e5571ae75009a629f6e18342f4d0a364775e677396f8 WatchSource:0}: Error finding container 6af452e1fb0ef3a08849e5571ae75009a629f6e18342f4d0a364775e677396f8: Status 404 returned error can't find the container with id 6af452e1fb0ef3a08849e5571ae75009a629f6e18342f4d0a364775e677396f8
Dec 09 12:30:01 crc kubenswrapper[5002]: I1209 12:30:01.172632 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"94c16746-775c-4ee9-8cdc-898ff655cd41","Type":"ContainerStarted","Data":"c289b2c03d6820cb8014b1e5c9e9ec79eb7261afd3f7b125cb0ed9b88947c818"}
Dec 09 12:30:01 crc kubenswrapper[5002]: I1209 12:30:01.174789 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421390-rvbcz" event={"ID":"d97cfde0-d55c-4e80-87fa-3d94006e25d1","Type":"ContainerStarted","Data":"6af452e1fb0ef3a08849e5571ae75009a629f6e18342f4d0a364775e677396f8"}
Dec 09 12:30:01 crc kubenswrapper[5002]: I1209 12:30:01.176759 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e0fbb27e-b807-4823-978f-0ff21020d012","Type":"ContainerStarted","Data":"64dd46d157826649f5e4bc03c6261a05167df7c93096aa006ec4bbe1fd73b0a8"}
Dec 09 12:30:01 crc kubenswrapper[5002]: I1209 12:30:01.176836 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e0fbb27e-b807-4823-978f-0ff21020d012","Type":"ContainerStarted","Data":"878d713e7df8cffc644af7f772d8de844a7ef38df40fb9e92588364a32421bce"}
Dec 09 12:30:01 crc kubenswrapper[5002]: I1209 12:30:01.180173 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"17dde858-c91e-499f-be5e-1e859b11c81f","Type":"ContainerStarted","Data":"4d3fb49fdb294dbc5b8c196247affff40e9ffbf174cf40c12aaae7808dec6c4b"}
Dec 09 12:30:01 crc kubenswrapper[5002]: I1209 12:30:01.180223 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"17dde858-c91e-499f-be5e-1e859b11c81f","Type":"ContainerStarted","Data":"8d5c1e96de276bb295dcdf13ba4c5e9f07d8bff344056ceb648dac8a2fd3e38b"}
Dec 09 12:30:01 crc kubenswrapper[5002]: I1209 12:30:01.205117 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.205095742 podStartE2EDuration="3.205095742s" podCreationTimestamp="2025-12-09 12:29:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 12:30:01.192773671 +0000 UTC m=+8933.584824752" watchObservedRunningTime="2025-12-09 12:30:01.205095742 +0000 UTC m=+8933.597146823"
Dec 09 12:30:01 crc kubenswrapper[5002]: I1209 12:30:01.227125 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.227101904 podStartE2EDuration="3.227101904s" podCreationTimestamp="2025-12-09 12:29:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 12:30:01.209620864 +0000 UTC m=+8933.601671945" watchObservedRunningTime="2025-12-09 12:30:01.227101904 +0000 UTC m=+8933.619152985"
Dec 09 12:30:01 crc kubenswrapper[5002]: I1209 12:30:01.248861 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.248837968 podStartE2EDuration="3.248837968s" podCreationTimestamp="2025-12-09 12:29:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 12:30:01.233997079 +0000 UTC m=+8933.626048160" watchObservedRunningTime="2025-12-09 12:30:01.248837968 +0000 UTC m=+8933.640889049"
Dec 09 12:30:02 crc kubenswrapper[5002]: I1209 12:30:02.194515 5002 generic.go:334] "Generic (PLEG): container finished" podID="d97cfde0-d55c-4e80-87fa-3d94006e25d1" containerID="472b4174f8feabd8ce67403abc0b6f79765bb656c5f2843d5588e0388d2dd959" exitCode=0
Dec 09 12:30:02 crc kubenswrapper[5002]: I1209 12:30:02.194608 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421390-rvbcz" event={"ID":"d97cfde0-d55c-4e80-87fa-3d94006e25d1","Type":"ContainerDied","Data":"472b4174f8feabd8ce67403abc0b6f79765bb656c5f2843d5588e0388d2dd959"}
Dec 09 12:30:03 crc kubenswrapper[5002]: I1209 12:30:03.377747 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Dec 09 12:30:03 crc kubenswrapper[5002]: I1209 12:30:03.640171 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421390-rvbcz"
Dec 09 12:30:03 crc kubenswrapper[5002]: I1209 12:30:03.728108 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d97cfde0-d55c-4e80-87fa-3d94006e25d1-config-volume\") pod \"d97cfde0-d55c-4e80-87fa-3d94006e25d1\" (UID: \"d97cfde0-d55c-4e80-87fa-3d94006e25d1\") "
Dec 09 12:30:03 crc kubenswrapper[5002]: I1209 12:30:03.728157 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2l85\" (UniqueName: \"kubernetes.io/projected/d97cfde0-d55c-4e80-87fa-3d94006e25d1-kube-api-access-s2l85\") pod \"d97cfde0-d55c-4e80-87fa-3d94006e25d1\" (UID: \"d97cfde0-d55c-4e80-87fa-3d94006e25d1\") "
Dec 09 12:30:03 crc kubenswrapper[5002]: I1209 12:30:03.728436 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d97cfde0-d55c-4e80-87fa-3d94006e25d1-secret-volume\") pod \"d97cfde0-d55c-4e80-87fa-3d94006e25d1\" (UID: \"d97cfde0-d55c-4e80-87fa-3d94006e25d1\") "
Dec 09 12:30:03 crc kubenswrapper[5002]: I1209 12:30:03.728681 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d97cfde0-d55c-4e80-87fa-3d94006e25d1-config-volume" (OuterVolumeSpecName: "config-volume") pod "d97cfde0-d55c-4e80-87fa-3d94006e25d1" (UID: "d97cfde0-d55c-4e80-87fa-3d94006e25d1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 12:30:03 crc kubenswrapper[5002]: I1209 12:30:03.729129 5002 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d97cfde0-d55c-4e80-87fa-3d94006e25d1-config-volume\") on node \"crc\" DevicePath \"\""
Dec 09 12:30:03 crc kubenswrapper[5002]: I1209 12:30:03.734896 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d97cfde0-d55c-4e80-87fa-3d94006e25d1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d97cfde0-d55c-4e80-87fa-3d94006e25d1" (UID: "d97cfde0-d55c-4e80-87fa-3d94006e25d1"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 12:30:03 crc kubenswrapper[5002]: I1209 12:30:03.741578 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d97cfde0-d55c-4e80-87fa-3d94006e25d1-kube-api-access-s2l85" (OuterVolumeSpecName: "kube-api-access-s2l85") pod "d97cfde0-d55c-4e80-87fa-3d94006e25d1" (UID: "d97cfde0-d55c-4e80-87fa-3d94006e25d1"). InnerVolumeSpecName "kube-api-access-s2l85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 12:30:03 crc kubenswrapper[5002]: I1209 12:30:03.831245 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2l85\" (UniqueName: \"kubernetes.io/projected/d97cfde0-d55c-4e80-87fa-3d94006e25d1-kube-api-access-s2l85\") on node \"crc\" DevicePath \"\""
Dec 09 12:30:03 crc kubenswrapper[5002]: I1209 12:30:03.831291 5002 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d97cfde0-d55c-4e80-87fa-3d94006e25d1-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 09 12:30:04 crc kubenswrapper[5002]: I1209 12:30:04.217271 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421390-rvbcz" event={"ID":"d97cfde0-d55c-4e80-87fa-3d94006e25d1","Type":"ContainerDied","Data":"6af452e1fb0ef3a08849e5571ae75009a629f6e18342f4d0a364775e677396f8"}
Dec 09 12:30:04 crc kubenswrapper[5002]: I1209 12:30:04.217315 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6af452e1fb0ef3a08849e5571ae75009a629f6e18342f4d0a364775e677396f8"
Dec 09 12:30:04 crc kubenswrapper[5002]: I1209 12:30:04.217322 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421390-rvbcz" Dec 09 12:30:04 crc kubenswrapper[5002]: I1209 12:30:04.258118 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 09 12:30:04 crc kubenswrapper[5002]: I1209 12:30:04.258162 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 09 12:30:04 crc kubenswrapper[5002]: I1209 12:30:04.411511 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 09 12:30:04 crc kubenswrapper[5002]: I1209 12:30:04.717654 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421345-6ksl5"] Dec 09 12:30:04 crc kubenswrapper[5002]: I1209 12:30:04.727003 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421345-6ksl5"] Dec 09 12:30:06 crc kubenswrapper[5002]: I1209 12:30:06.073539 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84b92a8a-c566-499c-903d-9fc0021b343e" path="/var/lib/kubelet/pods/84b92a8a-c566-499c-903d-9fc0021b343e/volumes" Dec 09 12:30:06 crc kubenswrapper[5002]: I1209 12:30:06.792658 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 09 12:30:07 crc kubenswrapper[5002]: I1209 12:30:07.061140 5002 scope.go:117] "RemoveContainer" containerID="4b150fb9f983a4e6a1a2f55ee6d78c3489d3daaae10fa0510a80b1084f5abe7f" Dec 09 12:30:07 crc kubenswrapper[5002]: E1209 12:30:07.061632 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:30:09 crc kubenswrapper[5002]: I1209 12:30:09.258864 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 09 12:30:09 crc kubenswrapper[5002]: I1209 12:30:09.259202 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 09 12:30:09 crc kubenswrapper[5002]: I1209 12:30:09.411904 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 09 12:30:09 crc kubenswrapper[5002]: I1209 12:30:09.435128 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 09 12:30:09 crc kubenswrapper[5002]: I1209 12:30:09.435180 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 09 12:30:09 crc kubenswrapper[5002]: I1209 12:30:09.584183 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 09 12:30:10 crc kubenswrapper[5002]: I1209 12:30:10.315964 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 09 12:30:10 crc kubenswrapper[5002]: I1209 12:30:10.340980 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="94c16746-775c-4ee9-8cdc-898ff655cd41" containerName="nova-metadata-log" probeResult="failure" output="Get 
\"http://10.217.1.186:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 09 12:30:10 crc kubenswrapper[5002]: I1209 12:30:10.341326 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="94c16746-775c-4ee9-8cdc-898ff655cd41" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.186:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 09 12:30:10 crc kubenswrapper[5002]: I1209 12:30:10.517064 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="17dde858-c91e-499f-be5e-1e859b11c81f" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.188:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 09 12:30:10 crc kubenswrapper[5002]: I1209 12:30:10.517086 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="17dde858-c91e-499f-be5e-1e859b11c81f" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.188:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 09 12:30:18 crc kubenswrapper[5002]: I1209 12:30:18.071206 5002 scope.go:117] "RemoveContainer" containerID="4b150fb9f983a4e6a1a2f55ee6d78c3489d3daaae10fa0510a80b1084f5abe7f" Dec 09 12:30:18 crc kubenswrapper[5002]: E1209 12:30:18.071966 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:30:19 crc kubenswrapper[5002]: I1209 12:30:19.261357 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 09 12:30:19 crc kubenswrapper[5002]: I1209 12:30:19.265595 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 09 12:30:19 crc kubenswrapper[5002]: I1209 12:30:19.267489 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 09 12:30:19 crc kubenswrapper[5002]: I1209 12:30:19.389233 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 09 12:30:19 crc kubenswrapper[5002]: I1209 12:30:19.450267 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 09 12:30:19 crc kubenswrapper[5002]: I1209 12:30:19.451921 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 09 12:30:19 crc kubenswrapper[5002]: I1209 12:30:19.452074 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 09 12:30:19 crc kubenswrapper[5002]: I1209 12:30:19.460835 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 09 12:30:20 crc kubenswrapper[5002]: I1209 12:30:20.397202 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 09 12:30:20 crc kubenswrapper[5002]: I1209 12:30:20.401578 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 09 12:30:29 crc 
Dec 09 12:30:29 crc kubenswrapper[5002]: I1209 12:30:29.060340 5002 scope.go:117] "RemoveContainer" containerID="4b150fb9f983a4e6a1a2f55ee6d78c3489d3daaae10fa0510a80b1084f5abe7f" Dec 09 12:30:29 crc kubenswrapper[5002]: E1209 12:30:29.061163 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:30:40 crc kubenswrapper[5002]: I1209 12:30:40.060103 5002 scope.go:117] "RemoveContainer" containerID="4b150fb9f983a4e6a1a2f55ee6d78c3489d3daaae10fa0510a80b1084f5abe7f" Dec 09 12:30:40 crc kubenswrapper[5002]: E1209 12:30:40.061022 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:30:54 crc kubenswrapper[5002]: I1209 12:30:54.066500 5002 scope.go:117] "RemoveContainer" containerID="4b150fb9f983a4e6a1a2f55ee6d78c3489d3daaae10fa0510a80b1084f5abe7f" Dec 09 12:30:54 crc kubenswrapper[5002]: E1209 12:30:54.067545 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:30:57 crc kubenswrapper[5002]: I1209 12:30:57.791869 5002 scope.go:117] "RemoveContainer" containerID="8a1b453c23b1d66d7543315ea300dee42cc788781abcda5d26c38ed4ee58aa23" Dec 09 12:31:05 crc kubenswrapper[5002]: I1209 12:31:05.059915 5002 scope.go:117] "RemoveContainer" containerID="4b150fb9f983a4e6a1a2f55ee6d78c3489d3daaae10fa0510a80b1084f5abe7f" Dec 09 12:31:05 crc kubenswrapper[5002]: E1209 12:31:05.060701 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:31:18 crc kubenswrapper[5002]: I1209 12:31:18.060910 5002 scope.go:117] "RemoveContainer" containerID="4b150fb9f983a4e6a1a2f55ee6d78c3489d3daaae10fa0510a80b1084f5abe7f" Dec 09 12:31:18 crc kubenswrapper[5002]: E1209 12:31:18.061868 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
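Note: the machine-config-daemon pair of records (RemoveContainer followed by the CrashLoopBackOff sync error) repeats on every pod sync while the restart back-off window is open; the pod worker re-logs the same error each pass until the back-off expires. A sketch of the back-off schedule, assuming the usual kubelet defaults of a 10s initial delay that doubles per crash and is capped at 5m (the "back-off 5m0s" quoted above is that cap):

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        backoff := 10 * time.Second        // assumed initial delay
        const maxBackoff = 5 * time.Minute // the cap quoted in the log
        for restart := 1; restart <= 7; restart++ {
            fmt.Printf("restart %d: wait %v\n", restart, backoff)
            backoff *= 2
            if backoff > maxBackoff {
                backoff = maxBackoff // 10s, 20s, 40s, 1m20s, 2m40s, 5m0s, 5m0s
            }
        }
    }

Here the daemon is already at the cap, so the identical error recurs every sync pass (roughly every 11-14 seconds) until the five-minute window lapses and the container finally restarts at 12:34:16 below.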
kubenswrapper[5002]: I1209 12:31:29.061090 5002 scope.go:117] "RemoveContainer" containerID="4b150fb9f983a4e6a1a2f55ee6d78c3489d3daaae10fa0510a80b1084f5abe7f" Dec 09 12:31:29 crc kubenswrapper[5002]: E1209 12:31:29.062043 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:31:41 crc kubenswrapper[5002]: I1209 12:31:41.060324 5002 scope.go:117] "RemoveContainer" containerID="4b150fb9f983a4e6a1a2f55ee6d78c3489d3daaae10fa0510a80b1084f5abe7f" Dec 09 12:31:41 crc kubenswrapper[5002]: E1209 12:31:41.061282 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:31:55 crc kubenswrapper[5002]: I1209 12:31:55.060399 5002 scope.go:117] "RemoveContainer" containerID="4b150fb9f983a4e6a1a2f55ee6d78c3489d3daaae10fa0510a80b1084f5abe7f" Dec 09 12:31:55 crc kubenswrapper[5002]: E1209 12:31:55.062525 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:32:06 crc kubenswrapper[5002]: I1209 12:32:06.060775 5002 scope.go:117] "RemoveContainer" containerID="4b150fb9f983a4e6a1a2f55ee6d78c3489d3daaae10fa0510a80b1084f5abe7f" Dec 09 12:32:06 crc kubenswrapper[5002]: E1209 12:32:06.061566 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:32:19 crc kubenswrapper[5002]: I1209 12:32:19.060483 5002 scope.go:117] "RemoveContainer" containerID="4b150fb9f983a4e6a1a2f55ee6d78c3489d3daaae10fa0510a80b1084f5abe7f" Dec 09 12:32:19 crc kubenswrapper[5002]: E1209 12:32:19.061240 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:32:20 crc kubenswrapper[5002]: I1209 12:32:20.321532 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-l76fw"] Dec 09 12:32:20 crc kubenswrapper[5002]: 
E1209 12:32:20.322224 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d97cfde0-d55c-4e80-87fa-3d94006e25d1" containerName="collect-profiles" Dec 09 12:32:20 crc kubenswrapper[5002]: I1209 12:32:20.322237 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="d97cfde0-d55c-4e80-87fa-3d94006e25d1" containerName="collect-profiles" Dec 09 12:32:20 crc kubenswrapper[5002]: I1209 12:32:20.322449 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="d97cfde0-d55c-4e80-87fa-3d94006e25d1" containerName="collect-profiles" Dec 09 12:32:20 crc kubenswrapper[5002]: I1209 12:32:20.323942 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l76fw" Dec 09 12:32:20 crc kubenswrapper[5002]: I1209 12:32:20.339453 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l76fw"] Dec 09 12:32:20 crc kubenswrapper[5002]: I1209 12:32:20.439657 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trfql\" (UniqueName: \"kubernetes.io/projected/59170ede-fbef-47c8-b0e1-a97fd96a4f41-kube-api-access-trfql\") pod \"certified-operators-l76fw\" (UID: \"59170ede-fbef-47c8-b0e1-a97fd96a4f41\") " pod="openshift-marketplace/certified-operators-l76fw" Dec 09 12:32:20 crc kubenswrapper[5002]: I1209 12:32:20.439719 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59170ede-fbef-47c8-b0e1-a97fd96a4f41-catalog-content\") pod \"certified-operators-l76fw\" (UID: \"59170ede-fbef-47c8-b0e1-a97fd96a4f41\") " pod="openshift-marketplace/certified-operators-l76fw" Dec 09 12:32:20 crc kubenswrapper[5002]: I1209 12:32:20.439760 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59170ede-fbef-47c8-b0e1-a97fd96a4f41-utilities\") pod \"certified-operators-l76fw\" (UID: \"59170ede-fbef-47c8-b0e1-a97fd96a4f41\") " pod="openshift-marketplace/certified-operators-l76fw" Dec 09 12:32:20 crc kubenswrapper[5002]: I1209 12:32:20.541445 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trfql\" (UniqueName: \"kubernetes.io/projected/59170ede-fbef-47c8-b0e1-a97fd96a4f41-kube-api-access-trfql\") pod \"certified-operators-l76fw\" (UID: \"59170ede-fbef-47c8-b0e1-a97fd96a4f41\") " pod="openshift-marketplace/certified-operators-l76fw" Dec 09 12:32:20 crc kubenswrapper[5002]: I1209 12:32:20.541510 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59170ede-fbef-47c8-b0e1-a97fd96a4f41-catalog-content\") pod \"certified-operators-l76fw\" (UID: \"59170ede-fbef-47c8-b0e1-a97fd96a4f41\") " pod="openshift-marketplace/certified-operators-l76fw" Dec 09 12:32:20 crc kubenswrapper[5002]: I1209 12:32:20.541545 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59170ede-fbef-47c8-b0e1-a97fd96a4f41-utilities\") pod \"certified-operators-l76fw\" (UID: \"59170ede-fbef-47c8-b0e1-a97fd96a4f41\") " pod="openshift-marketplace/certified-operators-l76fw" Dec 09 12:32:20 crc kubenswrapper[5002]: I1209 12:32:20.542005 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/59170ede-fbef-47c8-b0e1-a97fd96a4f41-catalog-content\") pod \"certified-operators-l76fw\" (UID: \"59170ede-fbef-47c8-b0e1-a97fd96a4f41\") " pod="openshift-marketplace/certified-operators-l76fw" Dec 09 12:32:20 crc kubenswrapper[5002]: I1209 12:32:20.542087 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59170ede-fbef-47c8-b0e1-a97fd96a4f41-utilities\") pod \"certified-operators-l76fw\" (UID: \"59170ede-fbef-47c8-b0e1-a97fd96a4f41\") " pod="openshift-marketplace/certified-operators-l76fw" Dec 09 12:32:20 crc kubenswrapper[5002]: I1209 12:32:20.563680 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trfql\" (UniqueName: \"kubernetes.io/projected/59170ede-fbef-47c8-b0e1-a97fd96a4f41-kube-api-access-trfql\") pod \"certified-operators-l76fw\" (UID: \"59170ede-fbef-47c8-b0e1-a97fd96a4f41\") " pod="openshift-marketplace/certified-operators-l76fw" Dec 09 12:32:20 crc kubenswrapper[5002]: I1209 12:32:20.657156 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l76fw" Dec 09 12:32:21 crc kubenswrapper[5002]: I1209 12:32:21.210728 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l76fw"] Dec 09 12:32:21 crc kubenswrapper[5002]: I1209 12:32:21.629868 5002 generic.go:334] "Generic (PLEG): container finished" podID="59170ede-fbef-47c8-b0e1-a97fd96a4f41" containerID="4a6b09c28c885ead46f3d9bacead92bf8d9cacb23acc64a479599ed6a7caacae" exitCode=0 Dec 09 12:32:21 crc kubenswrapper[5002]: I1209 12:32:21.629908 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l76fw" event={"ID":"59170ede-fbef-47c8-b0e1-a97fd96a4f41","Type":"ContainerDied","Data":"4a6b09c28c885ead46f3d9bacead92bf8d9cacb23acc64a479599ed6a7caacae"} Dec 09 12:32:21 crc kubenswrapper[5002]: I1209 12:32:21.630114 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l76fw" event={"ID":"59170ede-fbef-47c8-b0e1-a97fd96a4f41","Type":"ContainerStarted","Data":"6ff94db2b9177a72c915a0860c42f1eb6835df3f2383832a7f3f2c853bc28dd3"} Dec 09 12:32:21 crc kubenswrapper[5002]: I1209 12:32:21.632349 5002 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 12:32:23 crc kubenswrapper[5002]: I1209 12:32:23.652167 5002 generic.go:334] "Generic (PLEG): container finished" podID="59170ede-fbef-47c8-b0e1-a97fd96a4f41" containerID="990afe5757a7162e9edc2289f3ab9c10f3263a240a97856a49e096f568e266ac" exitCode=0 Dec 09 12:32:23 crc kubenswrapper[5002]: I1209 12:32:23.652214 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l76fw" event={"ID":"59170ede-fbef-47c8-b0e1-a97fd96a4f41","Type":"ContainerDied","Data":"990afe5757a7162e9edc2289f3ab9c10f3263a240a97856a49e096f568e266ac"} Dec 09 12:32:24 crc kubenswrapper[5002]: I1209 12:32:24.665156 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l76fw" event={"ID":"59170ede-fbef-47c8-b0e1-a97fd96a4f41","Type":"ContainerStarted","Data":"bbc8aca4f6a18f6ea75f4e307ed7d36ad58b40561cc9ac0c35558cb18b17f60d"} Dec 09 12:32:24 crc kubenswrapper[5002]: I1209 12:32:24.697682 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-l76fw" podStartSLOduration=2.261219262 
podStartE2EDuration="4.697659453s" podCreationTimestamp="2025-12-09 12:32:20 +0000 UTC" firstStartedPulling="2025-12-09 12:32:21.632094344 +0000 UTC m=+9074.024145425" lastFinishedPulling="2025-12-09 12:32:24.068534535 +0000 UTC m=+9076.460585616" observedRunningTime="2025-12-09 12:32:24.68564586 +0000 UTC m=+9077.077696941" watchObservedRunningTime="2025-12-09 12:32:24.697659453 +0000 UTC m=+9077.089710554" Dec 09 12:32:30 crc kubenswrapper[5002]: I1209 12:32:30.657556 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-l76fw" Dec 09 12:32:30 crc kubenswrapper[5002]: I1209 12:32:30.658196 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-l76fw" Dec 09 12:32:30 crc kubenswrapper[5002]: I1209 12:32:30.707752 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-l76fw" Dec 09 12:32:30 crc kubenswrapper[5002]: I1209 12:32:30.783146 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-l76fw" Dec 09 12:32:30 crc kubenswrapper[5002]: I1209 12:32:30.954290 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l76fw"] Dec 09 12:32:32 crc kubenswrapper[5002]: I1209 12:32:32.750798 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-l76fw" podUID="59170ede-fbef-47c8-b0e1-a97fd96a4f41" containerName="registry-server" containerID="cri-o://bbc8aca4f6a18f6ea75f4e307ed7d36ad58b40561cc9ac0c35558cb18b17f60d" gracePeriod=2 Dec 09 12:32:33 crc kubenswrapper[5002]: I1209 12:32:33.264424 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l76fw" Dec 09 12:32:33 crc kubenswrapper[5002]: I1209 12:32:33.430320 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trfql\" (UniqueName: \"kubernetes.io/projected/59170ede-fbef-47c8-b0e1-a97fd96a4f41-kube-api-access-trfql\") pod \"59170ede-fbef-47c8-b0e1-a97fd96a4f41\" (UID: \"59170ede-fbef-47c8-b0e1-a97fd96a4f41\") " Dec 09 12:32:33 crc kubenswrapper[5002]: I1209 12:32:33.430407 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59170ede-fbef-47c8-b0e1-a97fd96a4f41-utilities\") pod \"59170ede-fbef-47c8-b0e1-a97fd96a4f41\" (UID: \"59170ede-fbef-47c8-b0e1-a97fd96a4f41\") " Dec 09 12:32:33 crc kubenswrapper[5002]: I1209 12:32:33.430451 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59170ede-fbef-47c8-b0e1-a97fd96a4f41-catalog-content\") pod \"59170ede-fbef-47c8-b0e1-a97fd96a4f41\" (UID: \"59170ede-fbef-47c8-b0e1-a97fd96a4f41\") " Dec 09 12:32:33 crc kubenswrapper[5002]: I1209 12:32:33.433788 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59170ede-fbef-47c8-b0e1-a97fd96a4f41-utilities" (OuterVolumeSpecName: "utilities") pod "59170ede-fbef-47c8-b0e1-a97fd96a4f41" (UID: "59170ede-fbef-47c8-b0e1-a97fd96a4f41"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:32:33 crc kubenswrapper[5002]: I1209 12:32:33.442316 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59170ede-fbef-47c8-b0e1-a97fd96a4f41-kube-api-access-trfql" (OuterVolumeSpecName: "kube-api-access-trfql") pod "59170ede-fbef-47c8-b0e1-a97fd96a4f41" (UID: "59170ede-fbef-47c8-b0e1-a97fd96a4f41"). InnerVolumeSpecName "kube-api-access-trfql". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:32:33 crc kubenswrapper[5002]: I1209 12:32:33.484991 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59170ede-fbef-47c8-b0e1-a97fd96a4f41-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "59170ede-fbef-47c8-b0e1-a97fd96a4f41" (UID: "59170ede-fbef-47c8-b0e1-a97fd96a4f41"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:32:33 crc kubenswrapper[5002]: I1209 12:32:33.532636 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trfql\" (UniqueName: \"kubernetes.io/projected/59170ede-fbef-47c8-b0e1-a97fd96a4f41-kube-api-access-trfql\") on node \"crc\" DevicePath \"\"" Dec 09 12:32:33 crc kubenswrapper[5002]: I1209 12:32:33.532666 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59170ede-fbef-47c8-b0e1-a97fd96a4f41-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 12:32:33 crc kubenswrapper[5002]: I1209 12:32:33.532677 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59170ede-fbef-47c8-b0e1-a97fd96a4f41-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 12:32:33 crc kubenswrapper[5002]: I1209 12:32:33.764653 5002 generic.go:334] "Generic (PLEG): container finished" podID="59170ede-fbef-47c8-b0e1-a97fd96a4f41" containerID="bbc8aca4f6a18f6ea75f4e307ed7d36ad58b40561cc9ac0c35558cb18b17f60d" exitCode=0 Dec 09 12:32:33 crc kubenswrapper[5002]: I1209 12:32:33.764714 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l76fw" Dec 09 12:32:33 crc kubenswrapper[5002]: I1209 12:32:33.764740 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l76fw" event={"ID":"59170ede-fbef-47c8-b0e1-a97fd96a4f41","Type":"ContainerDied","Data":"bbc8aca4f6a18f6ea75f4e307ed7d36ad58b40561cc9ac0c35558cb18b17f60d"} Dec 09 12:32:33 crc kubenswrapper[5002]: I1209 12:32:33.766326 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l76fw" event={"ID":"59170ede-fbef-47c8-b0e1-a97fd96a4f41","Type":"ContainerDied","Data":"6ff94db2b9177a72c915a0860c42f1eb6835df3f2383832a7f3f2c853bc28dd3"} Dec 09 12:32:33 crc kubenswrapper[5002]: I1209 12:32:33.766358 5002 scope.go:117] "RemoveContainer" containerID="bbc8aca4f6a18f6ea75f4e307ed7d36ad58b40561cc9ac0c35558cb18b17f60d" Dec 09 12:32:33 crc kubenswrapper[5002]: I1209 12:32:33.788617 5002 scope.go:117] "RemoveContainer" containerID="990afe5757a7162e9edc2289f3ab9c10f3263a240a97856a49e096f568e266ac" Dec 09 12:32:33 crc kubenswrapper[5002]: I1209 12:32:33.809872 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l76fw"] Dec 09 12:32:33 crc kubenswrapper[5002]: I1209 12:32:33.819654 5002 scope.go:117] "RemoveContainer" containerID="4a6b09c28c885ead46f3d9bacead92bf8d9cacb23acc64a479599ed6a7caacae" Dec 09 12:32:33 crc kubenswrapper[5002]: I1209 12:32:33.822745 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-l76fw"] Dec 09 12:32:33 crc kubenswrapper[5002]: I1209 12:32:33.883033 5002 scope.go:117] "RemoveContainer" containerID="bbc8aca4f6a18f6ea75f4e307ed7d36ad58b40561cc9ac0c35558cb18b17f60d" Dec 09 12:32:33 crc kubenswrapper[5002]: E1209 12:32:33.884354 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbc8aca4f6a18f6ea75f4e307ed7d36ad58b40561cc9ac0c35558cb18b17f60d\": container with ID starting with bbc8aca4f6a18f6ea75f4e307ed7d36ad58b40561cc9ac0c35558cb18b17f60d not found: ID does not exist" containerID="bbc8aca4f6a18f6ea75f4e307ed7d36ad58b40561cc9ac0c35558cb18b17f60d" Dec 09 12:32:33 crc kubenswrapper[5002]: I1209 12:32:33.884419 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbc8aca4f6a18f6ea75f4e307ed7d36ad58b40561cc9ac0c35558cb18b17f60d"} err="failed to get container status \"bbc8aca4f6a18f6ea75f4e307ed7d36ad58b40561cc9ac0c35558cb18b17f60d\": rpc error: code = NotFound desc = could not find container \"bbc8aca4f6a18f6ea75f4e307ed7d36ad58b40561cc9ac0c35558cb18b17f60d\": container with ID starting with bbc8aca4f6a18f6ea75f4e307ed7d36ad58b40561cc9ac0c35558cb18b17f60d not found: ID does not exist" Dec 09 12:32:33 crc kubenswrapper[5002]: I1209 12:32:33.884451 5002 scope.go:117] "RemoveContainer" containerID="990afe5757a7162e9edc2289f3ab9c10f3263a240a97856a49e096f568e266ac" Dec 09 12:32:33 crc kubenswrapper[5002]: E1209 12:32:33.884937 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"990afe5757a7162e9edc2289f3ab9c10f3263a240a97856a49e096f568e266ac\": container with ID starting with 990afe5757a7162e9edc2289f3ab9c10f3263a240a97856a49e096f568e266ac not found: ID does not exist" containerID="990afe5757a7162e9edc2289f3ab9c10f3263a240a97856a49e096f568e266ac" Dec 09 12:32:33 crc kubenswrapper[5002]: I1209 12:32:33.884962 5002 
Dec 09 12:32:33 crc kubenswrapper[5002]: I1209 12:32:33.884962 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"990afe5757a7162e9edc2289f3ab9c10f3263a240a97856a49e096f568e266ac"} err="failed to get container status \"990afe5757a7162e9edc2289f3ab9c10f3263a240a97856a49e096f568e266ac\": rpc error: code = NotFound desc = could not find container \"990afe5757a7162e9edc2289f3ab9c10f3263a240a97856a49e096f568e266ac\": container with ID starting with 990afe5757a7162e9edc2289f3ab9c10f3263a240a97856a49e096f568e266ac not found: ID does not exist" Dec 09 12:32:33 crc kubenswrapper[5002]: I1209 12:32:33.884982 5002 scope.go:117] "RemoveContainer" containerID="4a6b09c28c885ead46f3d9bacead92bf8d9cacb23acc64a479599ed6a7caacae" Dec 09 12:32:33 crc kubenswrapper[5002]: E1209 12:32:33.885362 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a6b09c28c885ead46f3d9bacead92bf8d9cacb23acc64a479599ed6a7caacae\": container with ID starting with 4a6b09c28c885ead46f3d9bacead92bf8d9cacb23acc64a479599ed6a7caacae not found: ID does not exist" containerID="4a6b09c28c885ead46f3d9bacead92bf8d9cacb23acc64a479599ed6a7caacae" Dec 09 12:32:33 crc kubenswrapper[5002]: I1209 12:32:33.885392 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a6b09c28c885ead46f3d9bacead92bf8d9cacb23acc64a479599ed6a7caacae"} err="failed to get container status \"4a6b09c28c885ead46f3d9bacead92bf8d9cacb23acc64a479599ed6a7caacae\": rpc error: code = NotFound desc = could not find container \"4a6b09c28c885ead46f3d9bacead92bf8d9cacb23acc64a479599ed6a7caacae\": container with ID starting with 4a6b09c28c885ead46f3d9bacead92bf8d9cacb23acc64a479599ed6a7caacae not found: ID does not exist" Dec 09 12:32:34 crc kubenswrapper[5002]: I1209 12:32:34.060184 5002 scope.go:117] "RemoveContainer" containerID="4b150fb9f983a4e6a1a2f55ee6d78c3489d3daaae10fa0510a80b1084f5abe7f" Dec 09 12:32:34 crc kubenswrapper[5002]: E1209 12:32:34.060571 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:32:34 crc kubenswrapper[5002]: I1209 12:32:34.074897 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59170ede-fbef-47c8-b0e1-a97fd96a4f41" path="/var/lib/kubelet/pods/59170ede-fbef-47c8-b0e1-a97fd96a4f41/volumes" Dec 09 12:32:47 crc kubenswrapper[5002]: I1209 12:32:47.059745 5002 scope.go:117] "RemoveContainer" containerID="4b150fb9f983a4e6a1a2f55ee6d78c3489d3daaae10fa0510a80b1084f5abe7f" Dec 09 12:32:47 crc kubenswrapper[5002]: E1209 12:32:47.060631 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:32:59 crc kubenswrapper[5002]: I1209 12:32:59.069092 5002 scope.go:117] "RemoveContainer" containerID="4b150fb9f983a4e6a1a2f55ee6d78c3489d3daaae10fa0510a80b1084f5abe7f" Dec 09 12:32:59 crc
kubenswrapper[5002]: E1209 12:32:59.070301 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:33:12 crc kubenswrapper[5002]: I1209 12:33:12.064305 5002 scope.go:117] "RemoveContainer" containerID="4b150fb9f983a4e6a1a2f55ee6d78c3489d3daaae10fa0510a80b1084f5abe7f" Dec 09 12:33:12 crc kubenswrapper[5002]: E1209 12:33:12.065097 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:33:26 crc kubenswrapper[5002]: I1209 12:33:26.061287 5002 scope.go:117] "RemoveContainer" containerID="4b150fb9f983a4e6a1a2f55ee6d78c3489d3daaae10fa0510a80b1084f5abe7f" Dec 09 12:33:26 crc kubenswrapper[5002]: E1209 12:33:26.062319 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:33:40 crc kubenswrapper[5002]: I1209 12:33:40.060233 5002 scope.go:117] "RemoveContainer" containerID="4b150fb9f983a4e6a1a2f55ee6d78c3489d3daaae10fa0510a80b1084f5abe7f" Dec 09 12:33:40 crc kubenswrapper[5002]: E1209 12:33:40.061006 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:33:53 crc kubenswrapper[5002]: I1209 12:33:53.060587 5002 scope.go:117] "RemoveContainer" containerID="4b150fb9f983a4e6a1a2f55ee6d78c3489d3daaae10fa0510a80b1084f5abe7f" Dec 09 12:33:53 crc kubenswrapper[5002]: E1209 12:33:53.061410 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:34:05 crc kubenswrapper[5002]: I1209 12:34:05.060454 5002 scope.go:117] "RemoveContainer" containerID="4b150fb9f983a4e6a1a2f55ee6d78c3489d3daaae10fa0510a80b1084f5abe7f" Dec 09 12:34:05 crc kubenswrapper[5002]: E1209 12:34:05.061408 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:34:16 crc kubenswrapper[5002]: I1209 12:34:16.060334 5002 scope.go:117] "RemoveContainer" containerID="4b150fb9f983a4e6a1a2f55ee6d78c3489d3daaae10fa0510a80b1084f5abe7f" Dec 09 12:34:17 crc kubenswrapper[5002]: I1209 12:34:17.052593 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerStarted","Data":"20a8a57c0480b1e66e8fe238396fa88d6e5a58fcd236299b623d24862a22772e"} Dec 09 12:35:31 crc kubenswrapper[5002]: I1209 12:35:31.787216 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-s69dn"] Dec 09 12:35:31 crc kubenswrapper[5002]: E1209 12:35:31.789240 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59170ede-fbef-47c8-b0e1-a97fd96a4f41" containerName="extract-utilities" Dec 09 12:35:31 crc kubenswrapper[5002]: I1209 12:35:31.789326 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="59170ede-fbef-47c8-b0e1-a97fd96a4f41" containerName="extract-utilities" Dec 09 12:35:31 crc kubenswrapper[5002]: E1209 12:35:31.789404 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59170ede-fbef-47c8-b0e1-a97fd96a4f41" containerName="extract-content" Dec 09 12:35:31 crc kubenswrapper[5002]: I1209 12:35:31.789460 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="59170ede-fbef-47c8-b0e1-a97fd96a4f41" containerName="extract-content" Dec 09 12:35:31 crc kubenswrapper[5002]: E1209 12:35:31.789552 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59170ede-fbef-47c8-b0e1-a97fd96a4f41" containerName="registry-server" Dec 09 12:35:31 crc kubenswrapper[5002]: I1209 12:35:31.789606 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="59170ede-fbef-47c8-b0e1-a97fd96a4f41" containerName="registry-server" Dec 09 12:35:31 crc kubenswrapper[5002]: I1209 12:35:31.789874 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="59170ede-fbef-47c8-b0e1-a97fd96a4f41" containerName="registry-server" Dec 09 12:35:31 crc kubenswrapper[5002]: I1209 12:35:31.791900 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s69dn" Dec 09 12:35:31 crc kubenswrapper[5002]: I1209 12:35:31.799283 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s69dn"] Dec 09 12:35:31 crc kubenswrapper[5002]: I1209 12:35:31.873977 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329-catalog-content\") pod \"redhat-marketplace-s69dn\" (UID: \"b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329\") " pod="openshift-marketplace/redhat-marketplace-s69dn" Dec 09 12:35:31 crc kubenswrapper[5002]: I1209 12:35:31.874024 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtf7j\" (UniqueName: \"kubernetes.io/projected/b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329-kube-api-access-vtf7j\") pod \"redhat-marketplace-s69dn\" (UID: \"b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329\") " pod="openshift-marketplace/redhat-marketplace-s69dn" Dec 09 12:35:31 crc kubenswrapper[5002]: I1209 12:35:31.874158 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329-utilities\") pod \"redhat-marketplace-s69dn\" (UID: \"b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329\") " pod="openshift-marketplace/redhat-marketplace-s69dn" Dec 09 12:35:31 crc kubenswrapper[5002]: I1209 12:35:31.976424 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329-catalog-content\") pod \"redhat-marketplace-s69dn\" (UID: \"b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329\") " pod="openshift-marketplace/redhat-marketplace-s69dn" Dec 09 12:35:31 crc kubenswrapper[5002]: I1209 12:35:31.976483 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtf7j\" (UniqueName: \"kubernetes.io/projected/b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329-kube-api-access-vtf7j\") pod \"redhat-marketplace-s69dn\" (UID: \"b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329\") " pod="openshift-marketplace/redhat-marketplace-s69dn" Dec 09 12:35:31 crc kubenswrapper[5002]: I1209 12:35:31.976618 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329-utilities\") pod \"redhat-marketplace-s69dn\" (UID: \"b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329\") " pod="openshift-marketplace/redhat-marketplace-s69dn" Dec 09 12:35:31 crc kubenswrapper[5002]: I1209 12:35:31.977127 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329-catalog-content\") pod \"redhat-marketplace-s69dn\" (UID: \"b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329\") " pod="openshift-marketplace/redhat-marketplace-s69dn" Dec 09 12:35:31 crc kubenswrapper[5002]: I1209 12:35:31.977202 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329-utilities\") pod \"redhat-marketplace-s69dn\" (UID: \"b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329\") " pod="openshift-marketplace/redhat-marketplace-s69dn" Dec 09 12:35:31 crc kubenswrapper[5002]: I1209 12:35:31.997312 5002 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-vtf7j\" (UniqueName: \"kubernetes.io/projected/b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329-kube-api-access-vtf7j\") pod \"redhat-marketplace-s69dn\" (UID: \"b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329\") " pod="openshift-marketplace/redhat-marketplace-s69dn" Dec 09 12:35:32 crc kubenswrapper[5002]: I1209 12:35:32.120129 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s69dn" Dec 09 12:35:32 crc kubenswrapper[5002]: I1209 12:35:32.686996 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s69dn"] Dec 09 12:35:32 crc kubenswrapper[5002]: I1209 12:35:32.898132 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s69dn" event={"ID":"b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329","Type":"ContainerStarted","Data":"ee76655fa02d45185176214214a417a093fa509d99b086c3ceb4394525f67ab2"} Dec 09 12:35:33 crc kubenswrapper[5002]: I1209 12:35:33.909943 5002 generic.go:334] "Generic (PLEG): container finished" podID="b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329" containerID="4fd137a5c7a1b8b15645686ef45133438513f242b036b2a59c9ea21fb06ff3f2" exitCode=0 Dec 09 12:35:33 crc kubenswrapper[5002]: I1209 12:35:33.910003 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s69dn" event={"ID":"b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329","Type":"ContainerDied","Data":"4fd137a5c7a1b8b15645686ef45133438513f242b036b2a59c9ea21fb06ff3f2"} Dec 09 12:35:40 crc kubenswrapper[5002]: I1209 12:35:40.978243 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s69dn" event={"ID":"b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329","Type":"ContainerStarted","Data":"776186a74e29fd7919b1d3dad271a5175f832d9005d9436834459da6a32093dd"} Dec 09 12:35:41 crc kubenswrapper[5002]: I1209 12:35:41.991547 5002 generic.go:334] "Generic (PLEG): container finished" podID="b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329" containerID="776186a74e29fd7919b1d3dad271a5175f832d9005d9436834459da6a32093dd" exitCode=0 Dec 09 12:35:41 crc kubenswrapper[5002]: I1209 12:35:41.991587 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s69dn" event={"ID":"b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329","Type":"ContainerDied","Data":"776186a74e29fd7919b1d3dad271a5175f832d9005d9436834459da6a32093dd"} Dec 09 12:35:44 crc kubenswrapper[5002]: I1209 12:35:44.015553 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s69dn" event={"ID":"b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329","Type":"ContainerStarted","Data":"a26a88ea73c8f31c01349fab038932a910e462878b67ce1e211723bf9b581ebc"} Dec 09 12:35:44 crc kubenswrapper[5002]: I1209 12:35:44.040727 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-s69dn" podStartSLOduration=4.163879641 podStartE2EDuration="13.040708749s" podCreationTimestamp="2025-12-09 12:35:31 +0000 UTC" firstStartedPulling="2025-12-09 12:35:33.913021034 +0000 UTC m=+9266.305072145" lastFinishedPulling="2025-12-09 12:35:42.789850172 +0000 UTC m=+9275.181901253" observedRunningTime="2025-12-09 12:35:44.034568274 +0000 UTC m=+9276.426619375" watchObservedRunningTime="2025-12-09 12:35:44.040708749 +0000 UTC m=+9276.432759830" Dec 09 12:35:52 crc kubenswrapper[5002]: I1209 12:35:52.120348 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-s69dn" Dec 09 12:35:52 crc kubenswrapper[5002]: I1209 12:35:52.121098 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-s69dn" Dec 09 12:35:52 crc kubenswrapper[5002]: I1209 12:35:52.177410 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-s69dn" Dec 09 12:35:53 crc kubenswrapper[5002]: I1209 12:35:53.204939 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-s69dn" Dec 09 12:35:53 crc kubenswrapper[5002]: I1209 12:35:53.264245 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s69dn"] Dec 09 12:35:55 crc kubenswrapper[5002]: I1209 12:35:55.158277 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-s69dn" podUID="b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329" containerName="registry-server" containerID="cri-o://a26a88ea73c8f31c01349fab038932a910e462878b67ce1e211723bf9b581ebc" gracePeriod=2 Dec 09 12:35:55 crc kubenswrapper[5002]: I1209 12:35:55.699606 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s69dn" Dec 09 12:35:55 crc kubenswrapper[5002]: I1209 12:35:55.827951 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329-utilities\") pod \"b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329\" (UID: \"b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329\") " Dec 09 12:35:55 crc kubenswrapper[5002]: I1209 12:35:55.828023 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329-catalog-content\") pod \"b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329\" (UID: \"b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329\") " Dec 09 12:35:55 crc kubenswrapper[5002]: I1209 12:35:55.828322 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtf7j\" (UniqueName: \"kubernetes.io/projected/b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329-kube-api-access-vtf7j\") pod \"b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329\" (UID: \"b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329\") " Dec 09 12:35:55 crc kubenswrapper[5002]: I1209 12:35:55.829036 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329-utilities" (OuterVolumeSpecName: "utilities") pod "b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329" (UID: "b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:35:55 crc kubenswrapper[5002]: I1209 12:35:55.835121 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329-kube-api-access-vtf7j" (OuterVolumeSpecName: "kube-api-access-vtf7j") pod "b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329" (UID: "b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329"). InnerVolumeSpecName "kube-api-access-vtf7j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:35:55 crc kubenswrapper[5002]: I1209 12:35:55.857784 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329" (UID: "b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:35:55 crc kubenswrapper[5002]: I1209 12:35:55.930449 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtf7j\" (UniqueName: \"kubernetes.io/projected/b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329-kube-api-access-vtf7j\") on node \"crc\" DevicePath \"\"" Dec 09 12:35:55 crc kubenswrapper[5002]: I1209 12:35:55.930487 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 12:35:55 crc kubenswrapper[5002]: I1209 12:35:55.930499 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 12:35:56 crc kubenswrapper[5002]: I1209 12:35:56.170595 5002 generic.go:334] "Generic (PLEG): container finished" podID="b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329" containerID="a26a88ea73c8f31c01349fab038932a910e462878b67ce1e211723bf9b581ebc" exitCode=0 Dec 09 12:35:56 crc kubenswrapper[5002]: I1209 12:35:56.170638 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s69dn" event={"ID":"b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329","Type":"ContainerDied","Data":"a26a88ea73c8f31c01349fab038932a910e462878b67ce1e211723bf9b581ebc"} Dec 09 12:35:56 crc kubenswrapper[5002]: I1209 12:35:56.170654 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s69dn" Dec 09 12:35:56 crc kubenswrapper[5002]: I1209 12:35:56.170698 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s69dn" event={"ID":"b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329","Type":"ContainerDied","Data":"ee76655fa02d45185176214214a417a093fa509d99b086c3ceb4394525f67ab2"} Dec 09 12:35:56 crc kubenswrapper[5002]: I1209 12:35:56.170721 5002 scope.go:117] "RemoveContainer" containerID="a26a88ea73c8f31c01349fab038932a910e462878b67ce1e211723bf9b581ebc" Dec 09 12:35:56 crc kubenswrapper[5002]: I1209 12:35:56.197522 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s69dn"] Dec 09 12:35:56 crc kubenswrapper[5002]: I1209 12:35:56.202771 5002 scope.go:117] "RemoveContainer" containerID="776186a74e29fd7919b1d3dad271a5175f832d9005d9436834459da6a32093dd" Dec 09 12:35:56 crc kubenswrapper[5002]: I1209 12:35:56.210804 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-s69dn"] Dec 09 12:35:56 crc kubenswrapper[5002]: I1209 12:35:56.445063 5002 scope.go:117] "RemoveContainer" containerID="4fd137a5c7a1b8b15645686ef45133438513f242b036b2a59c9ea21fb06ff3f2" Dec 09 12:35:56 crc kubenswrapper[5002]: I1209 12:35:56.505418 5002 scope.go:117] "RemoveContainer" containerID="a26a88ea73c8f31c01349fab038932a910e462878b67ce1e211723bf9b581ebc" Dec 09 12:35:56 crc kubenswrapper[5002]: E1209 12:35:56.506302 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a26a88ea73c8f31c01349fab038932a910e462878b67ce1e211723bf9b581ebc\": container with ID starting with a26a88ea73c8f31c01349fab038932a910e462878b67ce1e211723bf9b581ebc not found: ID does not exist" containerID="a26a88ea73c8f31c01349fab038932a910e462878b67ce1e211723bf9b581ebc" Dec 09 12:35:56 crc kubenswrapper[5002]: I1209 12:35:56.506398 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a26a88ea73c8f31c01349fab038932a910e462878b67ce1e211723bf9b581ebc"} err="failed to get container status \"a26a88ea73c8f31c01349fab038932a910e462878b67ce1e211723bf9b581ebc\": rpc error: code = NotFound desc = could not find container \"a26a88ea73c8f31c01349fab038932a910e462878b67ce1e211723bf9b581ebc\": container with ID starting with a26a88ea73c8f31c01349fab038932a910e462878b67ce1e211723bf9b581ebc not found: ID does not exist" Dec 09 12:35:56 crc kubenswrapper[5002]: I1209 12:35:56.506498 5002 scope.go:117] "RemoveContainer" containerID="776186a74e29fd7919b1d3dad271a5175f832d9005d9436834459da6a32093dd" Dec 09 12:35:56 crc kubenswrapper[5002]: E1209 12:35:56.507056 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"776186a74e29fd7919b1d3dad271a5175f832d9005d9436834459da6a32093dd\": container with ID starting with 776186a74e29fd7919b1d3dad271a5175f832d9005d9436834459da6a32093dd not found: ID does not exist" containerID="776186a74e29fd7919b1d3dad271a5175f832d9005d9436834459da6a32093dd" Dec 09 12:35:56 crc kubenswrapper[5002]: I1209 12:35:56.507095 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"776186a74e29fd7919b1d3dad271a5175f832d9005d9436834459da6a32093dd"} err="failed to get container status \"776186a74e29fd7919b1d3dad271a5175f832d9005d9436834459da6a32093dd\": rpc error: code = NotFound desc = could not find 
container \"776186a74e29fd7919b1d3dad271a5175f832d9005d9436834459da6a32093dd\": container with ID starting with 776186a74e29fd7919b1d3dad271a5175f832d9005d9436834459da6a32093dd not found: ID does not exist" Dec 09 12:35:56 crc kubenswrapper[5002]: I1209 12:35:56.507149 5002 scope.go:117] "RemoveContainer" containerID="4fd137a5c7a1b8b15645686ef45133438513f242b036b2a59c9ea21fb06ff3f2" Dec 09 12:35:56 crc kubenswrapper[5002]: E1209 12:35:56.507480 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fd137a5c7a1b8b15645686ef45133438513f242b036b2a59c9ea21fb06ff3f2\": container with ID starting with 4fd137a5c7a1b8b15645686ef45133438513f242b036b2a59c9ea21fb06ff3f2 not found: ID does not exist" containerID="4fd137a5c7a1b8b15645686ef45133438513f242b036b2a59c9ea21fb06ff3f2" Dec 09 12:35:56 crc kubenswrapper[5002]: I1209 12:35:56.507508 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fd137a5c7a1b8b15645686ef45133438513f242b036b2a59c9ea21fb06ff3f2"} err="failed to get container status \"4fd137a5c7a1b8b15645686ef45133438513f242b036b2a59c9ea21fb06ff3f2\": rpc error: code = NotFound desc = could not find container \"4fd137a5c7a1b8b15645686ef45133438513f242b036b2a59c9ea21fb06ff3f2\": container with ID starting with 4fd137a5c7a1b8b15645686ef45133438513f242b036b2a59c9ea21fb06ff3f2 not found: ID does not exist" Dec 09 12:35:58 crc kubenswrapper[5002]: I1209 12:35:58.074138 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329" path="/var/lib/kubelet/pods/b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329/volumes" Dec 09 12:36:37 crc kubenswrapper[5002]: I1209 12:36:37.965037 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:36:37 crc kubenswrapper[5002]: I1209 12:36:37.965611 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:36:46 crc kubenswrapper[5002]: I1209 12:36:46.329360 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-phkqg"] Dec 09 12:36:46 crc kubenswrapper[5002]: E1209 12:36:46.330400 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329" containerName="extract-content" Dec 09 12:36:46 crc kubenswrapper[5002]: I1209 12:36:46.330415 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329" containerName="extract-content" Dec 09 12:36:46 crc kubenswrapper[5002]: E1209 12:36:46.330430 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329" containerName="extract-utilities" Dec 09 12:36:46 crc kubenswrapper[5002]: I1209 12:36:46.330436 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329" containerName="extract-utilities" Dec 09 12:36:46 crc kubenswrapper[5002]: E1209 12:36:46.330468 5002 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329" containerName="registry-server" Dec 09 12:36:46 crc kubenswrapper[5002]: I1209 12:36:46.330474 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329" containerName="registry-server" Dec 09 12:36:46 crc kubenswrapper[5002]: I1209 12:36:46.330720 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4ecb2d8-2e1a-44d7-a53e-3762d0d0a329" containerName="registry-server" Dec 09 12:36:46 crc kubenswrapper[5002]: I1209 12:36:46.332340 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-phkqg" Dec 09 12:36:46 crc kubenswrapper[5002]: I1209 12:36:46.345595 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-phkqg"] Dec 09 12:36:46 crc kubenswrapper[5002]: I1209 12:36:46.468179 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b84228a0-7141-4897-aa01-bd2c1f36e49f-catalog-content\") pod \"redhat-operators-phkqg\" (UID: \"b84228a0-7141-4897-aa01-bd2c1f36e49f\") " pod="openshift-marketplace/redhat-operators-phkqg" Dec 09 12:36:46 crc kubenswrapper[5002]: I1209 12:36:46.468549 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm5sm\" (UniqueName: \"kubernetes.io/projected/b84228a0-7141-4897-aa01-bd2c1f36e49f-kube-api-access-cm5sm\") pod \"redhat-operators-phkqg\" (UID: \"b84228a0-7141-4897-aa01-bd2c1f36e49f\") " pod="openshift-marketplace/redhat-operators-phkqg" Dec 09 12:36:46 crc kubenswrapper[5002]: I1209 12:36:46.468789 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b84228a0-7141-4897-aa01-bd2c1f36e49f-utilities\") pod \"redhat-operators-phkqg\" (UID: \"b84228a0-7141-4897-aa01-bd2c1f36e49f\") " pod="openshift-marketplace/redhat-operators-phkqg" Dec 09 12:36:46 crc kubenswrapper[5002]: I1209 12:36:46.571020 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm5sm\" (UniqueName: \"kubernetes.io/projected/b84228a0-7141-4897-aa01-bd2c1f36e49f-kube-api-access-cm5sm\") pod \"redhat-operators-phkqg\" (UID: \"b84228a0-7141-4897-aa01-bd2c1f36e49f\") " pod="openshift-marketplace/redhat-operators-phkqg" Dec 09 12:36:46 crc kubenswrapper[5002]: I1209 12:36:46.571097 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b84228a0-7141-4897-aa01-bd2c1f36e49f-utilities\") pod \"redhat-operators-phkqg\" (UID: \"b84228a0-7141-4897-aa01-bd2c1f36e49f\") " pod="openshift-marketplace/redhat-operators-phkqg" Dec 09 12:36:46 crc kubenswrapper[5002]: I1209 12:36:46.571209 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b84228a0-7141-4897-aa01-bd2c1f36e49f-catalog-content\") pod \"redhat-operators-phkqg\" (UID: \"b84228a0-7141-4897-aa01-bd2c1f36e49f\") " pod="openshift-marketplace/redhat-operators-phkqg" Dec 09 12:36:46 crc kubenswrapper[5002]: I1209 12:36:46.571732 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b84228a0-7141-4897-aa01-bd2c1f36e49f-utilities\") pod \"redhat-operators-phkqg\" (UID: \"b84228a0-7141-4897-aa01-bd2c1f36e49f\") 
" pod="openshift-marketplace/redhat-operators-phkqg" Dec 09 12:36:46 crc kubenswrapper[5002]: I1209 12:36:46.571763 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b84228a0-7141-4897-aa01-bd2c1f36e49f-catalog-content\") pod \"redhat-operators-phkqg\" (UID: \"b84228a0-7141-4897-aa01-bd2c1f36e49f\") " pod="openshift-marketplace/redhat-operators-phkqg" Dec 09 12:36:46 crc kubenswrapper[5002]: I1209 12:36:46.590077 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm5sm\" (UniqueName: \"kubernetes.io/projected/b84228a0-7141-4897-aa01-bd2c1f36e49f-kube-api-access-cm5sm\") pod \"redhat-operators-phkqg\" (UID: \"b84228a0-7141-4897-aa01-bd2c1f36e49f\") " pod="openshift-marketplace/redhat-operators-phkqg" Dec 09 12:36:46 crc kubenswrapper[5002]: I1209 12:36:46.664770 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-phkqg" Dec 09 12:36:47 crc kubenswrapper[5002]: I1209 12:36:47.183658 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-phkqg"] Dec 09 12:36:47 crc kubenswrapper[5002]: I1209 12:36:47.734091 5002 generic.go:334] "Generic (PLEG): container finished" podID="b84228a0-7141-4897-aa01-bd2c1f36e49f" containerID="05ae1815b258c2d82b9922a94d1e5563b72e5eee5ff0f343add79b5d7903a5cc" exitCode=0 Dec 09 12:36:47 crc kubenswrapper[5002]: I1209 12:36:47.734159 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phkqg" event={"ID":"b84228a0-7141-4897-aa01-bd2c1f36e49f","Type":"ContainerDied","Data":"05ae1815b258c2d82b9922a94d1e5563b72e5eee5ff0f343add79b5d7903a5cc"} Dec 09 12:36:47 crc kubenswrapper[5002]: I1209 12:36:47.734406 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phkqg" event={"ID":"b84228a0-7141-4897-aa01-bd2c1f36e49f","Type":"ContainerStarted","Data":"baf4039f5a5cecb1a3a16ab04bb6866a4b2386a5f6368ff96f1e630cd7c88968"} Dec 09 12:36:49 crc kubenswrapper[5002]: I1209 12:36:49.776342 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phkqg" event={"ID":"b84228a0-7141-4897-aa01-bd2c1f36e49f","Type":"ContainerStarted","Data":"3a28bbe9dd9f518bc33a7132294b50a1a00860645b17c1425471e021a05baef8"} Dec 09 12:36:52 crc kubenswrapper[5002]: I1209 12:36:52.810884 5002 generic.go:334] "Generic (PLEG): container finished" podID="b84228a0-7141-4897-aa01-bd2c1f36e49f" containerID="3a28bbe9dd9f518bc33a7132294b50a1a00860645b17c1425471e021a05baef8" exitCode=0 Dec 09 12:36:52 crc kubenswrapper[5002]: I1209 12:36:52.810936 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phkqg" event={"ID":"b84228a0-7141-4897-aa01-bd2c1f36e49f","Type":"ContainerDied","Data":"3a28bbe9dd9f518bc33a7132294b50a1a00860645b17c1425471e021a05baef8"} Dec 09 12:36:53 crc kubenswrapper[5002]: I1209 12:36:53.826156 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phkqg" event={"ID":"b84228a0-7141-4897-aa01-bd2c1f36e49f","Type":"ContainerStarted","Data":"4884aedb99491aeccd8f7718cfdb5e155b4da436239b2c051526789b4a1f2174"} Dec 09 12:36:53 crc kubenswrapper[5002]: I1209 12:36:53.858795 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-phkqg" podStartSLOduration=2.289452124 
podStartE2EDuration="7.858770241s" podCreationTimestamp="2025-12-09 12:36:46 +0000 UTC" firstStartedPulling="2025-12-09 12:36:47.736206057 +0000 UTC m=+9340.128257138" lastFinishedPulling="2025-12-09 12:36:53.305524174 +0000 UTC m=+9345.697575255" observedRunningTime="2025-12-09 12:36:53.84534138 +0000 UTC m=+9346.237392481" watchObservedRunningTime="2025-12-09 12:36:53.858770241 +0000 UTC m=+9346.250821322" Dec 09 12:36:56 crc kubenswrapper[5002]: I1209 12:36:56.665584 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-phkqg" Dec 09 12:36:56 crc kubenswrapper[5002]: I1209 12:36:56.665955 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-phkqg" Dec 09 12:36:57 crc kubenswrapper[5002]: I1209 12:36:57.712212 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-phkqg" podUID="b84228a0-7141-4897-aa01-bd2c1f36e49f" containerName="registry-server" probeResult="failure" output=< Dec 09 12:36:57 crc kubenswrapper[5002]: timeout: failed to connect service ":50051" within 1s Dec 09 12:36:57 crc kubenswrapper[5002]: > Dec 09 12:37:06 crc kubenswrapper[5002]: I1209 12:37:06.733565 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-phkqg" Dec 09 12:37:06 crc kubenswrapper[5002]: I1209 12:37:06.802482 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-phkqg" Dec 09 12:37:06 crc kubenswrapper[5002]: I1209 12:37:06.970470 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-phkqg"] Dec 09 12:37:07 crc kubenswrapper[5002]: I1209 12:37:07.965039 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:37:07 crc kubenswrapper[5002]: I1209 12:37:07.965126 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:37:07 crc kubenswrapper[5002]: I1209 12:37:07.975689 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-phkqg" podUID="b84228a0-7141-4897-aa01-bd2c1f36e49f" containerName="registry-server" containerID="cri-o://4884aedb99491aeccd8f7718cfdb5e155b4da436239b2c051526789b4a1f2174" gracePeriod=2 Dec 09 12:37:08 crc kubenswrapper[5002]: I1209 12:37:08.490258 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-phkqg" Dec 09 12:37:08 crc kubenswrapper[5002]: I1209 12:37:08.657617 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b84228a0-7141-4897-aa01-bd2c1f36e49f-utilities\") pod \"b84228a0-7141-4897-aa01-bd2c1f36e49f\" (UID: \"b84228a0-7141-4897-aa01-bd2c1f36e49f\") " Dec 09 12:37:08 crc kubenswrapper[5002]: I1209 12:37:08.657759 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b84228a0-7141-4897-aa01-bd2c1f36e49f-catalog-content\") pod \"b84228a0-7141-4897-aa01-bd2c1f36e49f\" (UID: \"b84228a0-7141-4897-aa01-bd2c1f36e49f\") " Dec 09 12:37:08 crc kubenswrapper[5002]: I1209 12:37:08.657789 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cm5sm\" (UniqueName: \"kubernetes.io/projected/b84228a0-7141-4897-aa01-bd2c1f36e49f-kube-api-access-cm5sm\") pod \"b84228a0-7141-4897-aa01-bd2c1f36e49f\" (UID: \"b84228a0-7141-4897-aa01-bd2c1f36e49f\") " Dec 09 12:37:08 crc kubenswrapper[5002]: I1209 12:37:08.658615 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b84228a0-7141-4897-aa01-bd2c1f36e49f-utilities" (OuterVolumeSpecName: "utilities") pod "b84228a0-7141-4897-aa01-bd2c1f36e49f" (UID: "b84228a0-7141-4897-aa01-bd2c1f36e49f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:37:08 crc kubenswrapper[5002]: I1209 12:37:08.665010 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b84228a0-7141-4897-aa01-bd2c1f36e49f-kube-api-access-cm5sm" (OuterVolumeSpecName: "kube-api-access-cm5sm") pod "b84228a0-7141-4897-aa01-bd2c1f36e49f" (UID: "b84228a0-7141-4897-aa01-bd2c1f36e49f"). InnerVolumeSpecName "kube-api-access-cm5sm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:37:08 crc kubenswrapper[5002]: I1209 12:37:08.760418 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b84228a0-7141-4897-aa01-bd2c1f36e49f-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 12:37:08 crc kubenswrapper[5002]: I1209 12:37:08.760698 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cm5sm\" (UniqueName: \"kubernetes.io/projected/b84228a0-7141-4897-aa01-bd2c1f36e49f-kube-api-access-cm5sm\") on node \"crc\" DevicePath \"\"" Dec 09 12:37:08 crc kubenswrapper[5002]: I1209 12:37:08.772982 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b84228a0-7141-4897-aa01-bd2c1f36e49f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b84228a0-7141-4897-aa01-bd2c1f36e49f" (UID: "b84228a0-7141-4897-aa01-bd2c1f36e49f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:37:08 crc kubenswrapper[5002]: I1209 12:37:08.863237 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b84228a0-7141-4897-aa01-bd2c1f36e49f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 12:37:08 crc kubenswrapper[5002]: I1209 12:37:08.988178 5002 generic.go:334] "Generic (PLEG): container finished" podID="b84228a0-7141-4897-aa01-bd2c1f36e49f" containerID="4884aedb99491aeccd8f7718cfdb5e155b4da436239b2c051526789b4a1f2174" exitCode=0 Dec 09 12:37:08 crc kubenswrapper[5002]: I1209 12:37:08.988246 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-phkqg" Dec 09 12:37:08 crc kubenswrapper[5002]: I1209 12:37:08.988222 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phkqg" event={"ID":"b84228a0-7141-4897-aa01-bd2c1f36e49f","Type":"ContainerDied","Data":"4884aedb99491aeccd8f7718cfdb5e155b4da436239b2c051526789b4a1f2174"} Dec 09 12:37:08 crc kubenswrapper[5002]: I1209 12:37:08.988379 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phkqg" event={"ID":"b84228a0-7141-4897-aa01-bd2c1f36e49f","Type":"ContainerDied","Data":"baf4039f5a5cecb1a3a16ab04bb6866a4b2386a5f6368ff96f1e630cd7c88968"} Dec 09 12:37:08 crc kubenswrapper[5002]: I1209 12:37:08.988397 5002 scope.go:117] "RemoveContainer" containerID="4884aedb99491aeccd8f7718cfdb5e155b4da436239b2c051526789b4a1f2174" Dec 09 12:37:09 crc kubenswrapper[5002]: I1209 12:37:09.023878 5002 scope.go:117] "RemoveContainer" containerID="3a28bbe9dd9f518bc33a7132294b50a1a00860645b17c1425471e021a05baef8" Dec 09 12:37:09 crc kubenswrapper[5002]: I1209 12:37:09.032903 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-phkqg"] Dec 09 12:37:09 crc kubenswrapper[5002]: I1209 12:37:09.049800 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-phkqg"] Dec 09 12:37:09 crc kubenswrapper[5002]: I1209 12:37:09.053558 5002 scope.go:117] "RemoveContainer" containerID="05ae1815b258c2d82b9922a94d1e5563b72e5eee5ff0f343add79b5d7903a5cc" Dec 09 12:37:09 crc kubenswrapper[5002]: I1209 12:37:09.097234 5002 scope.go:117] "RemoveContainer" containerID="4884aedb99491aeccd8f7718cfdb5e155b4da436239b2c051526789b4a1f2174" Dec 09 12:37:09 crc kubenswrapper[5002]: E1209 12:37:09.097682 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4884aedb99491aeccd8f7718cfdb5e155b4da436239b2c051526789b4a1f2174\": container with ID starting with 4884aedb99491aeccd8f7718cfdb5e155b4da436239b2c051526789b4a1f2174 not found: ID does not exist" containerID="4884aedb99491aeccd8f7718cfdb5e155b4da436239b2c051526789b4a1f2174" Dec 09 12:37:09 crc kubenswrapper[5002]: I1209 12:37:09.097800 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4884aedb99491aeccd8f7718cfdb5e155b4da436239b2c051526789b4a1f2174"} err="failed to get container status \"4884aedb99491aeccd8f7718cfdb5e155b4da436239b2c051526789b4a1f2174\": rpc error: code = NotFound desc = could not find container \"4884aedb99491aeccd8f7718cfdb5e155b4da436239b2c051526789b4a1f2174\": container with ID starting with 4884aedb99491aeccd8f7718cfdb5e155b4da436239b2c051526789b4a1f2174 not found: ID does not exist" Dec 09 12:37:09 crc 
Dec 09 12:37:09 crc kubenswrapper[5002]: E1209 12:37:09.098168 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a28bbe9dd9f518bc33a7132294b50a1a00860645b17c1425471e021a05baef8\": container with ID starting with 3a28bbe9dd9f518bc33a7132294b50a1a00860645b17c1425471e021a05baef8 not found: ID does not exist" containerID="3a28bbe9dd9f518bc33a7132294b50a1a00860645b17c1425471e021a05baef8"
Dec 09 12:37:09 crc kubenswrapper[5002]: I1209 12:37:09.098197 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a28bbe9dd9f518bc33a7132294b50a1a00860645b17c1425471e021a05baef8"} err="failed to get container status \"3a28bbe9dd9f518bc33a7132294b50a1a00860645b17c1425471e021a05baef8\": rpc error: code = NotFound desc = could not find container \"3a28bbe9dd9f518bc33a7132294b50a1a00860645b17c1425471e021a05baef8\": container with ID starting with 3a28bbe9dd9f518bc33a7132294b50a1a00860645b17c1425471e021a05baef8 not found: ID does not exist"
Dec 09 12:37:09 crc kubenswrapper[5002]: I1209 12:37:09.098225 5002 scope.go:117] "RemoveContainer" containerID="05ae1815b258c2d82b9922a94d1e5563b72e5eee5ff0f343add79b5d7903a5cc"
Dec 09 12:37:09 crc kubenswrapper[5002]: E1209 12:37:09.098480 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05ae1815b258c2d82b9922a94d1e5563b72e5eee5ff0f343add79b5d7903a5cc\": container with ID starting with 05ae1815b258c2d82b9922a94d1e5563b72e5eee5ff0f343add79b5d7903a5cc not found: ID does not exist" containerID="05ae1815b258c2d82b9922a94d1e5563b72e5eee5ff0f343add79b5d7903a5cc"
Dec 09 12:37:09 crc kubenswrapper[5002]: I1209 12:37:09.098497 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05ae1815b258c2d82b9922a94d1e5563b72e5eee5ff0f343add79b5d7903a5cc"} err="failed to get container status \"05ae1815b258c2d82b9922a94d1e5563b72e5eee5ff0f343add79b5d7903a5cc\": rpc error: code = NotFound desc = could not find container \"05ae1815b258c2d82b9922a94d1e5563b72e5eee5ff0f343add79b5d7903a5cc\": container with ID starting with 05ae1815b258c2d82b9922a94d1e5563b72e5eee5ff0f343add79b5d7903a5cc not found: ID does not exist"
Dec 09 12:37:10 crc kubenswrapper[5002]: I1209 12:37:10.083606 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b84228a0-7141-4897-aa01-bd2c1f36e49f" path="/var/lib/kubelet/pods/b84228a0-7141-4897-aa01-bd2c1f36e49f/volumes"
Dec 09 12:37:11 crc kubenswrapper[5002]: E1209 12:37:11.141684 5002 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb84228a0_7141_4897_aa01_bd2c1f36e49f.slice/crio-conmon-4884aedb99491aeccd8f7718cfdb5e155b4da436239b2c051526789b4a1f2174.scope\": RecentStats: unable to find data in memory cache]"
Dec 09 12:37:21 crc kubenswrapper[5002]: E1209 12:37:21.465831 5002 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb84228a0_7141_4897_aa01_bd2c1f36e49f.slice/crio-conmon-4884aedb99491aeccd8f7718cfdb5e155b4da436239b2c051526789b4a1f2174.scope\": RecentStats: unable to find data in memory cache]"
Dec 09 12:37:31 crc kubenswrapper[5002]: E1209 12:37:31.736648 5002 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb84228a0_7141_4897_aa01_bd2c1f36e49f.slice/crio-conmon-4884aedb99491aeccd8f7718cfdb5e155b4da436239b2c051526789b4a1f2174.scope\": RecentStats: unable to find data in memory cache]"
Dec 09 12:37:37 crc kubenswrapper[5002]: I1209 12:37:37.965088 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 09 12:37:37 crc kubenswrapper[5002]: I1209 12:37:37.965623 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 09 12:37:37 crc kubenswrapper[5002]: I1209 12:37:37.965680 5002 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6"
Dec 09 12:37:37 crc kubenswrapper[5002]: I1209 12:37:37.966573 5002 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"20a8a57c0480b1e66e8fe238396fa88d6e5a58fcd236299b623d24862a22772e"} pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 09 12:37:37 crc kubenswrapper[5002]: I1209 12:37:37.966628 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" containerID="cri-o://20a8a57c0480b1e66e8fe238396fa88d6e5a58fcd236299b623d24862a22772e" gracePeriod=600
Dec 09 12:37:38 crc kubenswrapper[5002]: I1209 12:37:38.287943 5002 generic.go:334] "Generic (PLEG): container finished" podID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerID="20a8a57c0480b1e66e8fe238396fa88d6e5a58fcd236299b623d24862a22772e" exitCode=0
Dec 09 12:37:38 crc kubenswrapper[5002]: I1209 12:37:38.288085 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerDied","Data":"20a8a57c0480b1e66e8fe238396fa88d6e5a58fcd236299b623d24862a22772e"}
Dec 09 12:37:38 crc kubenswrapper[5002]: I1209 12:37:38.288279 5002 scope.go:117] "RemoveContainer" containerID="4b150fb9f983a4e6a1a2f55ee6d78c3489d3daaae10fa0510a80b1084f5abe7f"
Dec 09 12:37:39 crc kubenswrapper[5002]: I1209 12:37:39.302466 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerStarted","Data":"2fa91ed46b52ab7a1e437a2bf0fbb9ef768679dcceaa767b67034d3abb00357b"}
Dec 09 12:37:42 crc kubenswrapper[5002]: E1209 12:37:42.065277 5002 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb84228a0_7141_4897_aa01_bd2c1f36e49f.slice/crio-conmon-4884aedb99491aeccd8f7718cfdb5e155b4da436239b2c051526789b4a1f2174.scope\": RecentStats: unable to find data in memory cache]"
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb84228a0_7141_4897_aa01_bd2c1f36e49f.slice/crio-conmon-4884aedb99491aeccd8f7718cfdb5e155b4da436239b2c051526789b4a1f2174.scope\": RecentStats: unable to find data in memory cache]" Dec 09 12:37:52 crc kubenswrapper[5002]: E1209 12:37:52.353718 5002 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb84228a0_7141_4897_aa01_bd2c1f36e49f.slice/crio-conmon-4884aedb99491aeccd8f7718cfdb5e155b4da436239b2c051526789b4a1f2174.scope\": RecentStats: unable to find data in memory cache]" Dec 09 12:38:01 crc kubenswrapper[5002]: I1209 12:38:01.044252 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f5pvx"] Dec 09 12:38:01 crc kubenswrapper[5002]: E1209 12:38:01.046409 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b84228a0-7141-4897-aa01-bd2c1f36e49f" containerName="extract-utilities" Dec 09 12:38:01 crc kubenswrapper[5002]: I1209 12:38:01.046535 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="b84228a0-7141-4897-aa01-bd2c1f36e49f" containerName="extract-utilities" Dec 09 12:38:01 crc kubenswrapper[5002]: E1209 12:38:01.046640 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b84228a0-7141-4897-aa01-bd2c1f36e49f" containerName="registry-server" Dec 09 12:38:01 crc kubenswrapper[5002]: I1209 12:38:01.046723 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="b84228a0-7141-4897-aa01-bd2c1f36e49f" containerName="registry-server" Dec 09 12:38:01 crc kubenswrapper[5002]: E1209 12:38:01.046843 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b84228a0-7141-4897-aa01-bd2c1f36e49f" containerName="extract-content" Dec 09 12:38:01 crc kubenswrapper[5002]: I1209 12:38:01.046932 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="b84228a0-7141-4897-aa01-bd2c1f36e49f" containerName="extract-content" Dec 09 12:38:01 crc kubenswrapper[5002]: I1209 12:38:01.047262 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="b84228a0-7141-4897-aa01-bd2c1f36e49f" containerName="registry-server" Dec 09 12:38:01 crc kubenswrapper[5002]: I1209 12:38:01.049060 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f5pvx" Dec 09 12:38:01 crc kubenswrapper[5002]: I1209 12:38:01.055226 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f5pvx"] Dec 09 12:38:01 crc kubenswrapper[5002]: I1209 12:38:01.221597 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd06493e-9ed3-4dec-848f-3108b431179e-utilities\") pod \"community-operators-f5pvx\" (UID: \"cd06493e-9ed3-4dec-848f-3108b431179e\") " pod="openshift-marketplace/community-operators-f5pvx" Dec 09 12:38:01 crc kubenswrapper[5002]: I1209 12:38:01.221701 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd06493e-9ed3-4dec-848f-3108b431179e-catalog-content\") pod \"community-operators-f5pvx\" (UID: \"cd06493e-9ed3-4dec-848f-3108b431179e\") " pod="openshift-marketplace/community-operators-f5pvx" Dec 09 12:38:01 crc kubenswrapper[5002]: I1209 12:38:01.221779 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q6cp\" (UniqueName: \"kubernetes.io/projected/cd06493e-9ed3-4dec-848f-3108b431179e-kube-api-access-2q6cp\") pod \"community-operators-f5pvx\" (UID: \"cd06493e-9ed3-4dec-848f-3108b431179e\") " pod="openshift-marketplace/community-operators-f5pvx" Dec 09 12:38:01 crc kubenswrapper[5002]: I1209 12:38:01.323568 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd06493e-9ed3-4dec-848f-3108b431179e-utilities\") pod \"community-operators-f5pvx\" (UID: \"cd06493e-9ed3-4dec-848f-3108b431179e\") " pod="openshift-marketplace/community-operators-f5pvx" Dec 09 12:38:01 crc kubenswrapper[5002]: I1209 12:38:01.323627 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd06493e-9ed3-4dec-848f-3108b431179e-catalog-content\") pod \"community-operators-f5pvx\" (UID: \"cd06493e-9ed3-4dec-848f-3108b431179e\") " pod="openshift-marketplace/community-operators-f5pvx" Dec 09 12:38:01 crc kubenswrapper[5002]: I1209 12:38:01.323646 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q6cp\" (UniqueName: \"kubernetes.io/projected/cd06493e-9ed3-4dec-848f-3108b431179e-kube-api-access-2q6cp\") pod \"community-operators-f5pvx\" (UID: \"cd06493e-9ed3-4dec-848f-3108b431179e\") " pod="openshift-marketplace/community-operators-f5pvx" Dec 09 12:38:01 crc kubenswrapper[5002]: I1209 12:38:01.324150 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd06493e-9ed3-4dec-848f-3108b431179e-utilities\") pod \"community-operators-f5pvx\" (UID: \"cd06493e-9ed3-4dec-848f-3108b431179e\") " pod="openshift-marketplace/community-operators-f5pvx" Dec 09 12:38:01 crc kubenswrapper[5002]: I1209 12:38:01.324240 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd06493e-9ed3-4dec-848f-3108b431179e-catalog-content\") pod \"community-operators-f5pvx\" (UID: \"cd06493e-9ed3-4dec-848f-3108b431179e\") " pod="openshift-marketplace/community-operators-f5pvx" Dec 09 12:38:01 crc kubenswrapper[5002]: I1209 12:38:01.343606 5002 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-2q6cp\" (UniqueName: \"kubernetes.io/projected/cd06493e-9ed3-4dec-848f-3108b431179e-kube-api-access-2q6cp\") pod \"community-operators-f5pvx\" (UID: \"cd06493e-9ed3-4dec-848f-3108b431179e\") " pod="openshift-marketplace/community-operators-f5pvx" Dec 09 12:38:01 crc kubenswrapper[5002]: I1209 12:38:01.377896 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f5pvx" Dec 09 12:38:01 crc kubenswrapper[5002]: I1209 12:38:01.933498 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f5pvx"] Dec 09 12:38:02 crc kubenswrapper[5002]: I1209 12:38:02.597020 5002 generic.go:334] "Generic (PLEG): container finished" podID="cd06493e-9ed3-4dec-848f-3108b431179e" containerID="eb1172dd270bb07b7bd3d0e3cf095347f88c3973d93c81ec1e892aee949eb235" exitCode=0 Dec 09 12:38:02 crc kubenswrapper[5002]: I1209 12:38:02.597133 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f5pvx" event={"ID":"cd06493e-9ed3-4dec-848f-3108b431179e","Type":"ContainerDied","Data":"eb1172dd270bb07b7bd3d0e3cf095347f88c3973d93c81ec1e892aee949eb235"} Dec 09 12:38:02 crc kubenswrapper[5002]: I1209 12:38:02.597331 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f5pvx" event={"ID":"cd06493e-9ed3-4dec-848f-3108b431179e","Type":"ContainerStarted","Data":"f23d3d5bee4a2ad161436f34b4e90650bfc8a6bc1d8eb585ed52e12d421b8c54"} Dec 09 12:38:02 crc kubenswrapper[5002]: I1209 12:38:02.599201 5002 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 12:38:02 crc kubenswrapper[5002]: E1209 12:38:02.625138 5002 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb84228a0_7141_4897_aa01_bd2c1f36e49f.slice/crio-conmon-4884aedb99491aeccd8f7718cfdb5e155b4da436239b2c051526789b4a1f2174.scope\": RecentStats: unable to find data in memory cache]" Dec 09 12:38:04 crc kubenswrapper[5002]: I1209 12:38:04.621610 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f5pvx" event={"ID":"cd06493e-9ed3-4dec-848f-3108b431179e","Type":"ContainerStarted","Data":"4ace3695813e7195cb2319701aade0c7d25e8b25d8c29eae2291251a40887957"} Dec 09 12:38:05 crc kubenswrapper[5002]: I1209 12:38:05.630988 5002 generic.go:334] "Generic (PLEG): container finished" podID="cd06493e-9ed3-4dec-848f-3108b431179e" containerID="4ace3695813e7195cb2319701aade0c7d25e8b25d8c29eae2291251a40887957" exitCode=0 Dec 09 12:38:05 crc kubenswrapper[5002]: I1209 12:38:05.631030 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f5pvx" event={"ID":"cd06493e-9ed3-4dec-848f-3108b431179e","Type":"ContainerDied","Data":"4ace3695813e7195cb2319701aade0c7d25e8b25d8c29eae2291251a40887957"} Dec 09 12:38:06 crc kubenswrapper[5002]: I1209 12:38:06.644571 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f5pvx" event={"ID":"cd06493e-9ed3-4dec-848f-3108b431179e","Type":"ContainerStarted","Data":"99a0ebf7b9df27501ccc438a283c6bd6f173366d0c25f471339edddc0d607b15"} Dec 09 12:38:06 crc kubenswrapper[5002]: I1209 12:38:06.671346 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-f5pvx" podStartSLOduration=2.051839673 podStartE2EDuration="5.671322203s" podCreationTimestamp="2025-12-09 12:38:01 +0000 UTC" firstStartedPulling="2025-12-09 12:38:02.598957915 +0000 UTC m=+9414.991008986" lastFinishedPulling="2025-12-09 12:38:06.218440435 +0000 UTC m=+9418.610491516" observedRunningTime="2025-12-09 12:38:06.664078278 +0000 UTC m=+9419.056129369" watchObservedRunningTime="2025-12-09 12:38:06.671322203 +0000 UTC m=+9419.063373304" Dec 09 12:38:11 crc kubenswrapper[5002]: I1209 12:38:11.387176 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f5pvx" Dec 09 12:38:11 crc kubenswrapper[5002]: I1209 12:38:11.388159 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-f5pvx" Dec 09 12:38:11 crc kubenswrapper[5002]: I1209 12:38:11.458389 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f5pvx" Dec 09 12:38:11 crc kubenswrapper[5002]: I1209 12:38:11.741765 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f5pvx" Dec 09 12:38:11 crc kubenswrapper[5002]: I1209 12:38:11.793488 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f5pvx"] Dec 09 12:38:13 crc kubenswrapper[5002]: I1209 12:38:13.714683 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-f5pvx" podUID="cd06493e-9ed3-4dec-848f-3108b431179e" containerName="registry-server" containerID="cri-o://99a0ebf7b9df27501ccc438a283c6bd6f173366d0c25f471339edddc0d607b15" gracePeriod=2 Dec 09 12:38:14 crc kubenswrapper[5002]: I1209 12:38:14.227872 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f5pvx" Dec 09 12:38:14 crc kubenswrapper[5002]: I1209 12:38:14.317046 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd06493e-9ed3-4dec-848f-3108b431179e-utilities\") pod \"cd06493e-9ed3-4dec-848f-3108b431179e\" (UID: \"cd06493e-9ed3-4dec-848f-3108b431179e\") " Dec 09 12:38:14 crc kubenswrapper[5002]: I1209 12:38:14.317226 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd06493e-9ed3-4dec-848f-3108b431179e-catalog-content\") pod \"cd06493e-9ed3-4dec-848f-3108b431179e\" (UID: \"cd06493e-9ed3-4dec-848f-3108b431179e\") " Dec 09 12:38:14 crc kubenswrapper[5002]: I1209 12:38:14.317517 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2q6cp\" (UniqueName: \"kubernetes.io/projected/cd06493e-9ed3-4dec-848f-3108b431179e-kube-api-access-2q6cp\") pod \"cd06493e-9ed3-4dec-848f-3108b431179e\" (UID: \"cd06493e-9ed3-4dec-848f-3108b431179e\") " Dec 09 12:38:14 crc kubenswrapper[5002]: I1209 12:38:14.318004 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd06493e-9ed3-4dec-848f-3108b431179e-utilities" (OuterVolumeSpecName: "utilities") pod "cd06493e-9ed3-4dec-848f-3108b431179e" (UID: "cd06493e-9ed3-4dec-848f-3108b431179e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:38:14 crc kubenswrapper[5002]: I1209 12:38:14.318096 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd06493e-9ed3-4dec-848f-3108b431179e-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 12:38:14 crc kubenswrapper[5002]: I1209 12:38:14.323746 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd06493e-9ed3-4dec-848f-3108b431179e-kube-api-access-2q6cp" (OuterVolumeSpecName: "kube-api-access-2q6cp") pod "cd06493e-9ed3-4dec-848f-3108b431179e" (UID: "cd06493e-9ed3-4dec-848f-3108b431179e"). InnerVolumeSpecName "kube-api-access-2q6cp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:38:14 crc kubenswrapper[5002]: I1209 12:38:14.372109 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd06493e-9ed3-4dec-848f-3108b431179e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cd06493e-9ed3-4dec-848f-3108b431179e" (UID: "cd06493e-9ed3-4dec-848f-3108b431179e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:38:14 crc kubenswrapper[5002]: I1209 12:38:14.419849 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd06493e-9ed3-4dec-848f-3108b431179e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 12:38:14 crc kubenswrapper[5002]: I1209 12:38:14.419880 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2q6cp\" (UniqueName: \"kubernetes.io/projected/cd06493e-9ed3-4dec-848f-3108b431179e-kube-api-access-2q6cp\") on node \"crc\" DevicePath \"\"" Dec 09 12:38:14 crc kubenswrapper[5002]: I1209 12:38:14.732253 5002 generic.go:334] "Generic (PLEG): container finished" podID="cd06493e-9ed3-4dec-848f-3108b431179e" containerID="99a0ebf7b9df27501ccc438a283c6bd6f173366d0c25f471339edddc0d607b15" exitCode=0 Dec 09 12:38:14 crc kubenswrapper[5002]: I1209 12:38:14.732334 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f5pvx" event={"ID":"cd06493e-9ed3-4dec-848f-3108b431179e","Type":"ContainerDied","Data":"99a0ebf7b9df27501ccc438a283c6bd6f173366d0c25f471339edddc0d607b15"} Dec 09 12:38:14 crc kubenswrapper[5002]: I1209 12:38:14.732341 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f5pvx" Dec 09 12:38:14 crc kubenswrapper[5002]: I1209 12:38:14.732389 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f5pvx" event={"ID":"cd06493e-9ed3-4dec-848f-3108b431179e","Type":"ContainerDied","Data":"f23d3d5bee4a2ad161436f34b4e90650bfc8a6bc1d8eb585ed52e12d421b8c54"} Dec 09 12:38:14 crc kubenswrapper[5002]: I1209 12:38:14.732427 5002 scope.go:117] "RemoveContainer" containerID="99a0ebf7b9df27501ccc438a283c6bd6f173366d0c25f471339edddc0d607b15" Dec 09 12:38:14 crc kubenswrapper[5002]: I1209 12:38:14.760706 5002 scope.go:117] "RemoveContainer" containerID="4ace3695813e7195cb2319701aade0c7d25e8b25d8c29eae2291251a40887957" Dec 09 12:38:14 crc kubenswrapper[5002]: I1209 12:38:14.779898 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f5pvx"] Dec 09 12:38:14 crc kubenswrapper[5002]: I1209 12:38:14.790380 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-f5pvx"] Dec 09 12:38:14 crc kubenswrapper[5002]: I1209 12:38:14.797293 5002 scope.go:117] "RemoveContainer" containerID="eb1172dd270bb07b7bd3d0e3cf095347f88c3973d93c81ec1e892aee949eb235" Dec 09 12:38:14 crc kubenswrapper[5002]: I1209 12:38:14.863663 5002 scope.go:117] "RemoveContainer" containerID="99a0ebf7b9df27501ccc438a283c6bd6f173366d0c25f471339edddc0d607b15" Dec 09 12:38:14 crc kubenswrapper[5002]: E1209 12:38:14.864553 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99a0ebf7b9df27501ccc438a283c6bd6f173366d0c25f471339edddc0d607b15\": container with ID starting with 99a0ebf7b9df27501ccc438a283c6bd6f173366d0c25f471339edddc0d607b15 not found: ID does not exist" containerID="99a0ebf7b9df27501ccc438a283c6bd6f173366d0c25f471339edddc0d607b15" Dec 09 12:38:14 crc kubenswrapper[5002]: I1209 12:38:14.864627 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99a0ebf7b9df27501ccc438a283c6bd6f173366d0c25f471339edddc0d607b15"} err="failed to get container status \"99a0ebf7b9df27501ccc438a283c6bd6f173366d0c25f471339edddc0d607b15\": rpc error: code = NotFound desc = could not find container \"99a0ebf7b9df27501ccc438a283c6bd6f173366d0c25f471339edddc0d607b15\": container with ID starting with 99a0ebf7b9df27501ccc438a283c6bd6f173366d0c25f471339edddc0d607b15 not found: ID does not exist" Dec 09 12:38:14 crc kubenswrapper[5002]: I1209 12:38:14.864670 5002 scope.go:117] "RemoveContainer" containerID="4ace3695813e7195cb2319701aade0c7d25e8b25d8c29eae2291251a40887957" Dec 09 12:38:14 crc kubenswrapper[5002]: E1209 12:38:14.865301 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ace3695813e7195cb2319701aade0c7d25e8b25d8c29eae2291251a40887957\": container with ID starting with 4ace3695813e7195cb2319701aade0c7d25e8b25d8c29eae2291251a40887957 not found: ID does not exist" containerID="4ace3695813e7195cb2319701aade0c7d25e8b25d8c29eae2291251a40887957" Dec 09 12:38:14 crc kubenswrapper[5002]: I1209 12:38:14.865958 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ace3695813e7195cb2319701aade0c7d25e8b25d8c29eae2291251a40887957"} err="failed to get container status \"4ace3695813e7195cb2319701aade0c7d25e8b25d8c29eae2291251a40887957\": rpc error: code = NotFound desc = could not find 
container \"4ace3695813e7195cb2319701aade0c7d25e8b25d8c29eae2291251a40887957\": container with ID starting with 4ace3695813e7195cb2319701aade0c7d25e8b25d8c29eae2291251a40887957 not found: ID does not exist" Dec 09 12:38:14 crc kubenswrapper[5002]: I1209 12:38:14.865989 5002 scope.go:117] "RemoveContainer" containerID="eb1172dd270bb07b7bd3d0e3cf095347f88c3973d93c81ec1e892aee949eb235" Dec 09 12:38:14 crc kubenswrapper[5002]: E1209 12:38:14.866741 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb1172dd270bb07b7bd3d0e3cf095347f88c3973d93c81ec1e892aee949eb235\": container with ID starting with eb1172dd270bb07b7bd3d0e3cf095347f88c3973d93c81ec1e892aee949eb235 not found: ID does not exist" containerID="eb1172dd270bb07b7bd3d0e3cf095347f88c3973d93c81ec1e892aee949eb235" Dec 09 12:38:14 crc kubenswrapper[5002]: I1209 12:38:14.866801 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb1172dd270bb07b7bd3d0e3cf095347f88c3973d93c81ec1e892aee949eb235"} err="failed to get container status \"eb1172dd270bb07b7bd3d0e3cf095347f88c3973d93c81ec1e892aee949eb235\": rpc error: code = NotFound desc = could not find container \"eb1172dd270bb07b7bd3d0e3cf095347f88c3973d93c81ec1e892aee949eb235\": container with ID starting with eb1172dd270bb07b7bd3d0e3cf095347f88c3973d93c81ec1e892aee949eb235 not found: ID does not exist" Dec 09 12:38:16 crc kubenswrapper[5002]: I1209 12:38:16.079988 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd06493e-9ed3-4dec-848f-3108b431179e" path="/var/lib/kubelet/pods/cd06493e-9ed3-4dec-848f-3108b431179e/volumes" Dec 09 12:40:07 crc kubenswrapper[5002]: I1209 12:40:07.965000 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:40:07 crc kubenswrapper[5002]: I1209 12:40:07.965673 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:40:37 crc kubenswrapper[5002]: I1209 12:40:37.964880 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:40:37 crc kubenswrapper[5002]: I1209 12:40:37.965462 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:41:04 crc kubenswrapper[5002]: I1209 12:41:04.628366 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-5z75v" podUID="f85c45d5-d9fa-418c-91ea-0f054181f4c7" containerName="hostpath-provisioner" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 12:41:07 crc kubenswrapper[5002]: 
Dec 09 12:41:07 crc kubenswrapper[5002]: I1209 12:41:07.966105 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 09 12:41:07 crc kubenswrapper[5002]: I1209 12:41:07.966205 5002 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6"
Dec 09 12:41:07 crc kubenswrapper[5002]: I1209 12:41:07.967349 5002 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2fa91ed46b52ab7a1e437a2bf0fbb9ef768679dcceaa767b67034d3abb00357b"} pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 09 12:41:07 crc kubenswrapper[5002]: I1209 12:41:07.967469 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" containerID="cri-o://2fa91ed46b52ab7a1e437a2bf0fbb9ef768679dcceaa767b67034d3abb00357b" gracePeriod=600
Dec 09 12:41:08 crc kubenswrapper[5002]: I1209 12:41:08.518179 5002 generic.go:334] "Generic (PLEG): container finished" podID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerID="2fa91ed46b52ab7a1e437a2bf0fbb9ef768679dcceaa767b67034d3abb00357b" exitCode=0
Dec 09 12:41:08 crc kubenswrapper[5002]: I1209 12:41:08.518442 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerDied","Data":"2fa91ed46b52ab7a1e437a2bf0fbb9ef768679dcceaa767b67034d3abb00357b"}
Dec 09 12:41:08 crc kubenswrapper[5002]: I1209 12:41:08.518581 5002 scope.go:117] "RemoveContainer" containerID="20a8a57c0480b1e66e8fe238396fa88d6e5a58fcd236299b623d24862a22772e"
Dec 09 12:41:08 crc kubenswrapper[5002]: E1209 12:41:08.651904 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 12:41:09 crc kubenswrapper[5002]: I1209 12:41:09.532465 5002 scope.go:117] "RemoveContainer" containerID="2fa91ed46b52ab7a1e437a2bf0fbb9ef768679dcceaa767b67034d3abb00357b"
Dec 09 12:41:09 crc kubenswrapper[5002]: E1209 12:41:09.533277 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 12:41:11 crc kubenswrapper[5002]: I1209 12:41:11.559317 5002 generic.go:334] "Generic (PLEG): container finished" podID="e448dd02-1116-4b33-9298-7053138072e9" containerID="6a52117b925a89a617f2a80100ef70755aa70686ff4956c5b51d90fbf4903da6" exitCode=0
Dec 09 12:41:11 crc kubenswrapper[5002]: I1209 12:41:11.559402 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww" event={"ID":"e448dd02-1116-4b33-9298-7053138072e9","Type":"ContainerDied","Data":"6a52117b925a89a617f2a80100ef70755aa70686ff4956c5b51d90fbf4903da6"}
Dec 09 12:41:13 crc kubenswrapper[5002]: I1209 12:41:13.244808 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww"
Dec 09 12:41:13 crc kubenswrapper[5002]: I1209 12:41:13.396061 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ms7km\" (UniqueName: \"kubernetes.io/projected/e448dd02-1116-4b33-9298-7053138072e9-kube-api-access-ms7km\") pod \"e448dd02-1116-4b33-9298-7053138072e9\" (UID: \"e448dd02-1116-4b33-9298-7053138072e9\") "
Dec 09 12:41:13 crc kubenswrapper[5002]: I1209 12:41:13.396102 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e448dd02-1116-4b33-9298-7053138072e9-ceph\") pod \"e448dd02-1116-4b33-9298-7053138072e9\" (UID: \"e448dd02-1116-4b33-9298-7053138072e9\") "
Dec 09 12:41:13 crc kubenswrapper[5002]: I1209 12:41:13.396126 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e448dd02-1116-4b33-9298-7053138072e9-ssh-key\") pod \"e448dd02-1116-4b33-9298-7053138072e9\" (UID: \"e448dd02-1116-4b33-9298-7053138072e9\") "
Dec 09 12:41:13 crc kubenswrapper[5002]: I1209 12:41:13.396144 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/e448dd02-1116-4b33-9298-7053138072e9-nova-cells-global-config-1\") pod \"e448dd02-1116-4b33-9298-7053138072e9\" (UID: \"e448dd02-1116-4b33-9298-7053138072e9\") "
Dec 09 12:41:13 crc kubenswrapper[5002]: I1209 12:41:13.396174 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e448dd02-1116-4b33-9298-7053138072e9-inventory\") pod \"e448dd02-1116-4b33-9298-7053138072e9\" (UID: \"e448dd02-1116-4b33-9298-7053138072e9\") "
Dec 09 12:41:13 crc kubenswrapper[5002]: I1209 12:41:13.396188 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e448dd02-1116-4b33-9298-7053138072e9-nova-cell1-compute-config-1\") pod \"e448dd02-1116-4b33-9298-7053138072e9\" (UID: \"e448dd02-1116-4b33-9298-7053138072e9\") "
Dec 09 12:41:13 crc kubenswrapper[5002]: I1209 12:41:13.396218 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e448dd02-1116-4b33-9298-7053138072e9-nova-cell1-combined-ca-bundle\") pod \"e448dd02-1116-4b33-9298-7053138072e9\" (UID: \"e448dd02-1116-4b33-9298-7053138072e9\") "
\"e448dd02-1116-4b33-9298-7053138072e9\") " Dec 09 12:41:13 crc kubenswrapper[5002]: I1209 12:41:13.396290 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e448dd02-1116-4b33-9298-7053138072e9-nova-cell1-compute-config-0\") pod \"e448dd02-1116-4b33-9298-7053138072e9\" (UID: \"e448dd02-1116-4b33-9298-7053138072e9\") " Dec 09 12:41:13 crc kubenswrapper[5002]: I1209 12:41:13.396312 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e448dd02-1116-4b33-9298-7053138072e9-nova-migration-ssh-key-0\") pod \"e448dd02-1116-4b33-9298-7053138072e9\" (UID: \"e448dd02-1116-4b33-9298-7053138072e9\") " Dec 09 12:41:13 crc kubenswrapper[5002]: I1209 12:41:13.396347 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/e448dd02-1116-4b33-9298-7053138072e9-nova-cells-global-config-0\") pod \"e448dd02-1116-4b33-9298-7053138072e9\" (UID: \"e448dd02-1116-4b33-9298-7053138072e9\") " Dec 09 12:41:13 crc kubenswrapper[5002]: I1209 12:41:13.396377 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e448dd02-1116-4b33-9298-7053138072e9-nova-migration-ssh-key-1\") pod \"e448dd02-1116-4b33-9298-7053138072e9\" (UID: \"e448dd02-1116-4b33-9298-7053138072e9\") " Dec 09 12:41:13 crc kubenswrapper[5002]: I1209 12:41:13.403714 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e448dd02-1116-4b33-9298-7053138072e9-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "e448dd02-1116-4b33-9298-7053138072e9" (UID: "e448dd02-1116-4b33-9298-7053138072e9"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:41:13 crc kubenswrapper[5002]: I1209 12:41:13.403780 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e448dd02-1116-4b33-9298-7053138072e9-ceph" (OuterVolumeSpecName: "ceph") pod "e448dd02-1116-4b33-9298-7053138072e9" (UID: "e448dd02-1116-4b33-9298-7053138072e9"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:41:13 crc kubenswrapper[5002]: I1209 12:41:13.405368 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e448dd02-1116-4b33-9298-7053138072e9-kube-api-access-ms7km" (OuterVolumeSpecName: "kube-api-access-ms7km") pod "e448dd02-1116-4b33-9298-7053138072e9" (UID: "e448dd02-1116-4b33-9298-7053138072e9"). InnerVolumeSpecName "kube-api-access-ms7km". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:41:13 crc kubenswrapper[5002]: I1209 12:41:13.433455 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e448dd02-1116-4b33-9298-7053138072e9-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "e448dd02-1116-4b33-9298-7053138072e9" (UID: "e448dd02-1116-4b33-9298-7053138072e9"). InnerVolumeSpecName "nova-cells-global-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 12:41:13 crc kubenswrapper[5002]: I1209 12:41:13.438703 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e448dd02-1116-4b33-9298-7053138072e9-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "e448dd02-1116-4b33-9298-7053138072e9" (UID: "e448dd02-1116-4b33-9298-7053138072e9"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:41:13 crc kubenswrapper[5002]: I1209 12:41:13.444045 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e448dd02-1116-4b33-9298-7053138072e9-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "e448dd02-1116-4b33-9298-7053138072e9" (UID: "e448dd02-1116-4b33-9298-7053138072e9"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:41:13 crc kubenswrapper[5002]: I1209 12:41:13.444278 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e448dd02-1116-4b33-9298-7053138072e9-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "e448dd02-1116-4b33-9298-7053138072e9" (UID: "e448dd02-1116-4b33-9298-7053138072e9"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:41:13 crc kubenswrapper[5002]: I1209 12:41:13.451162 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e448dd02-1116-4b33-9298-7053138072e9-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "e448dd02-1116-4b33-9298-7053138072e9" (UID: "e448dd02-1116-4b33-9298-7053138072e9"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 12:41:13 crc kubenswrapper[5002]: I1209 12:41:13.457624 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e448dd02-1116-4b33-9298-7053138072e9-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "e448dd02-1116-4b33-9298-7053138072e9" (UID: "e448dd02-1116-4b33-9298-7053138072e9"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:41:13 crc kubenswrapper[5002]: I1209 12:41:13.460391 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e448dd02-1116-4b33-9298-7053138072e9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e448dd02-1116-4b33-9298-7053138072e9" (UID: "e448dd02-1116-4b33-9298-7053138072e9"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:41:13 crc kubenswrapper[5002]: I1209 12:41:13.463276 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e448dd02-1116-4b33-9298-7053138072e9-inventory" (OuterVolumeSpecName: "inventory") pod "e448dd02-1116-4b33-9298-7053138072e9" (UID: "e448dd02-1116-4b33-9298-7053138072e9"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:41:13 crc kubenswrapper[5002]: I1209 12:41:13.498957 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ms7km\" (UniqueName: \"kubernetes.io/projected/e448dd02-1116-4b33-9298-7053138072e9-kube-api-access-ms7km\") on node \"crc\" DevicePath \"\"" Dec 09 12:41:13 crc kubenswrapper[5002]: I1209 12:41:13.498993 5002 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e448dd02-1116-4b33-9298-7053138072e9-ceph\") on node \"crc\" DevicePath \"\"" Dec 09 12:41:13 crc kubenswrapper[5002]: I1209 12:41:13.499004 5002 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e448dd02-1116-4b33-9298-7053138072e9-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 09 12:41:13 crc kubenswrapper[5002]: I1209 12:41:13.499014 5002 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/e448dd02-1116-4b33-9298-7053138072e9-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\"" Dec 09 12:41:13 crc kubenswrapper[5002]: I1209 12:41:13.499024 5002 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e448dd02-1116-4b33-9298-7053138072e9-inventory\") on node \"crc\" DevicePath \"\"" Dec 09 12:41:13 crc kubenswrapper[5002]: I1209 12:41:13.499033 5002 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e448dd02-1116-4b33-9298-7053138072e9-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 09 12:41:13 crc kubenswrapper[5002]: I1209 12:41:13.499041 5002 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e448dd02-1116-4b33-9298-7053138072e9-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 12:41:13 crc kubenswrapper[5002]: I1209 12:41:13.499050 5002 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e448dd02-1116-4b33-9298-7053138072e9-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 09 12:41:13 crc kubenswrapper[5002]: I1209 12:41:13.499059 5002 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e448dd02-1116-4b33-9298-7053138072e9-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 09 12:41:13 crc kubenswrapper[5002]: I1209 12:41:13.499067 5002 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/e448dd02-1116-4b33-9298-7053138072e9-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Dec 09 12:41:13 crc kubenswrapper[5002]: I1209 12:41:13.499075 5002 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e448dd02-1116-4b33-9298-7053138072e9-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 09 12:41:13 crc kubenswrapper[5002]: I1209 12:41:13.583176 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww" event={"ID":"e448dd02-1116-4b33-9298-7053138072e9","Type":"ContainerDied","Data":"10adb23d808bbe5f3e880bd75b92702ae2e705a9161ab1561cdb0d9aa26be9a4"} Dec 09 12:41:13 crc kubenswrapper[5002]: I1209 12:41:13.583216 5002 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10adb23d808bbe5f3e880bd75b92702ae2e705a9161ab1561cdb0d9aa26be9a4" Dec 09 12:41:13 crc kubenswrapper[5002]: I1209 12:41:13.583243 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww" Dec 09 12:41:24 crc kubenswrapper[5002]: I1209 12:41:24.061194 5002 scope.go:117] "RemoveContainer" containerID="2fa91ed46b52ab7a1e437a2bf0fbb9ef768679dcceaa767b67034d3abb00357b" Dec 09 12:41:24 crc kubenswrapper[5002]: E1209 12:41:24.062177 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:41:38 crc kubenswrapper[5002]: I1209 12:41:38.067633 5002 scope.go:117] "RemoveContainer" containerID="2fa91ed46b52ab7a1e437a2bf0fbb9ef768679dcceaa767b67034d3abb00357b" Dec 09 12:41:38 crc kubenswrapper[5002]: E1209 12:41:38.069760 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:41:52 crc kubenswrapper[5002]: I1209 12:41:52.060781 5002 scope.go:117] "RemoveContainer" containerID="2fa91ed46b52ab7a1e437a2bf0fbb9ef768679dcceaa767b67034d3abb00357b" Dec 09 12:41:52 crc kubenswrapper[5002]: E1209 12:41:52.061652 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:42:07 crc kubenswrapper[5002]: I1209 12:42:07.060565 5002 scope.go:117] "RemoveContainer" containerID="2fa91ed46b52ab7a1e437a2bf0fbb9ef768679dcceaa767b67034d3abb00357b" Dec 09 12:42:07 crc kubenswrapper[5002]: E1209 12:42:07.061471 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:42:19 crc kubenswrapper[5002]: I1209 12:42:19.061217 5002 scope.go:117] "RemoveContainer" containerID="2fa91ed46b52ab7a1e437a2bf0fbb9ef768679dcceaa767b67034d3abb00357b" Dec 09 12:42:19 crc kubenswrapper[5002]: E1209 12:42:19.063262 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:42:32 crc kubenswrapper[5002]: I1209 12:42:32.060510 5002 scope.go:117] "RemoveContainer" containerID="2fa91ed46b52ab7a1e437a2bf0fbb9ef768679dcceaa767b67034d3abb00357b" Dec 09 12:42:32 crc kubenswrapper[5002]: E1209 12:42:32.061598 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:42:45 crc kubenswrapper[5002]: I1209 12:42:45.060858 5002 scope.go:117] "RemoveContainer" containerID="2fa91ed46b52ab7a1e437a2bf0fbb9ef768679dcceaa767b67034d3abb00357b" Dec 09 12:42:45 crc kubenswrapper[5002]: E1209 12:42:45.062337 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:42:56 crc kubenswrapper[5002]: I1209 12:42:56.060176 5002 scope.go:117] "RemoveContainer" containerID="2fa91ed46b52ab7a1e437a2bf0fbb9ef768679dcceaa767b67034d3abb00357b" Dec 09 12:42:56 crc kubenswrapper[5002]: E1209 12:42:56.060926 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:43:07 crc kubenswrapper[5002]: I1209 12:43:07.066595 5002 scope.go:117] "RemoveContainer" containerID="2fa91ed46b52ab7a1e437a2bf0fbb9ef768679dcceaa767b67034d3abb00357b" Dec 09 12:43:07 crc kubenswrapper[5002]: E1209 12:43:07.071328 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:43:18 crc kubenswrapper[5002]: I1209 12:43:18.068446 5002 scope.go:117] "RemoveContainer" containerID="2fa91ed46b52ab7a1e437a2bf0fbb9ef768679dcceaa767b67034d3abb00357b" Dec 09 12:43:18 crc kubenswrapper[5002]: E1209 12:43:18.069368 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" 
podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:43:22 crc kubenswrapper[5002]: I1209 12:43:22.701353 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Dec 09 12:43:22 crc kubenswrapper[5002]: I1209 12:43:22.701984 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-copy-data" podUID="64e3023c-35e4-43fb-b414-e7c2abbcd64d" containerName="adoption" containerID="cri-o://a3e07ec08f505c0916ba42005e65f044690c20195ecda081e0183617c886894f" gracePeriod=30 Dec 09 12:43:33 crc kubenswrapper[5002]: I1209 12:43:33.060235 5002 scope.go:117] "RemoveContainer" containerID="2fa91ed46b52ab7a1e437a2bf0fbb9ef768679dcceaa767b67034d3abb00357b" Dec 09 12:43:33 crc kubenswrapper[5002]: E1209 12:43:33.062329 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:43:47 crc kubenswrapper[5002]: I1209 12:43:47.061706 5002 scope.go:117] "RemoveContainer" containerID="2fa91ed46b52ab7a1e437a2bf0fbb9ef768679dcceaa767b67034d3abb00357b" Dec 09 12:43:47 crc kubenswrapper[5002]: E1209 12:43:47.062648 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:43:52 crc kubenswrapper[5002]: I1209 12:43:52.891289 5002 generic.go:334] "Generic (PLEG): container finished" podID="64e3023c-35e4-43fb-b414-e7c2abbcd64d" containerID="a3e07ec08f505c0916ba42005e65f044690c20195ecda081e0183617c886894f" exitCode=137 Dec 09 12:43:52 crc kubenswrapper[5002]: I1209 12:43:52.891380 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"64e3023c-35e4-43fb-b414-e7c2abbcd64d","Type":"ContainerDied","Data":"a3e07ec08f505c0916ba42005e65f044690c20195ecda081e0183617c886894f"} Dec 09 12:43:53 crc kubenswrapper[5002]: I1209 12:43:53.251957 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Dec 09 12:43:53 crc kubenswrapper[5002]: I1209 12:43:53.375176 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mariadb-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7452df44-56a9-4ffd-999a-668c99b19f56\") pod \"64e3023c-35e4-43fb-b414-e7c2abbcd64d\" (UID: \"64e3023c-35e4-43fb-b414-e7c2abbcd64d\") " Dec 09 12:43:53 crc kubenswrapper[5002]: I1209 12:43:53.375454 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfm6d\" (UniqueName: \"kubernetes.io/projected/64e3023c-35e4-43fb-b414-e7c2abbcd64d-kube-api-access-bfm6d\") pod \"64e3023c-35e4-43fb-b414-e7c2abbcd64d\" (UID: \"64e3023c-35e4-43fb-b414-e7c2abbcd64d\") " Dec 09 12:43:53 crc kubenswrapper[5002]: I1209 12:43:53.384227 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64e3023c-35e4-43fb-b414-e7c2abbcd64d-kube-api-access-bfm6d" (OuterVolumeSpecName: "kube-api-access-bfm6d") pod "64e3023c-35e4-43fb-b414-e7c2abbcd64d" (UID: "64e3023c-35e4-43fb-b414-e7c2abbcd64d"). InnerVolumeSpecName "kube-api-access-bfm6d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:43:53 crc kubenswrapper[5002]: I1209 12:43:53.396679 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7452df44-56a9-4ffd-999a-668c99b19f56" (OuterVolumeSpecName: "mariadb-data") pod "64e3023c-35e4-43fb-b414-e7c2abbcd64d" (UID: "64e3023c-35e4-43fb-b414-e7c2abbcd64d"). InnerVolumeSpecName "pvc-7452df44-56a9-4ffd-999a-668c99b19f56". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 09 12:43:53 crc kubenswrapper[5002]: I1209 12:43:53.478421 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfm6d\" (UniqueName: \"kubernetes.io/projected/64e3023c-35e4-43fb-b414-e7c2abbcd64d-kube-api-access-bfm6d\") on node \"crc\" DevicePath \"\"" Dec 09 12:43:53 crc kubenswrapper[5002]: I1209 12:43:53.478480 5002 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-7452df44-56a9-4ffd-999a-668c99b19f56\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7452df44-56a9-4ffd-999a-668c99b19f56\") on node \"crc\" " Dec 09 12:43:53 crc kubenswrapper[5002]: I1209 12:43:53.510935 5002 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Dec 09 12:43:53 crc kubenswrapper[5002]: I1209 12:43:53.511172 5002 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-7452df44-56a9-4ffd-999a-668c99b19f56" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7452df44-56a9-4ffd-999a-668c99b19f56") on node "crc" Dec 09 12:43:53 crc kubenswrapper[5002]: I1209 12:43:53.580116 5002 reconciler_common.go:293] "Volume detached for volume \"pvc-7452df44-56a9-4ffd-999a-668c99b19f56\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7452df44-56a9-4ffd-999a-668c99b19f56\") on node \"crc\" DevicePath \"\"" Dec 09 12:43:53 crc kubenswrapper[5002]: I1209 12:43:53.900940 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"64e3023c-35e4-43fb-b414-e7c2abbcd64d","Type":"ContainerDied","Data":"00f9c63990c46149b68ea76671d594a55c85b27cfbc8d24b9ec42c07c48e59c0"} Dec 09 12:43:53 crc kubenswrapper[5002]: I1209 12:43:53.900991 5002 scope.go:117] "RemoveContainer" containerID="a3e07ec08f505c0916ba42005e65f044690c20195ecda081e0183617c886894f" Dec 09 12:43:53 crc kubenswrapper[5002]: I1209 12:43:53.900998 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Dec 09 12:43:53 crc kubenswrapper[5002]: I1209 12:43:53.937112 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Dec 09 12:43:53 crc kubenswrapper[5002]: I1209 12:43:53.965955 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-copy-data"] Dec 09 12:43:54 crc kubenswrapper[5002]: I1209 12:43:54.078371 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64e3023c-35e4-43fb-b414-e7c2abbcd64d" path="/var/lib/kubelet/pods/64e3023c-35e4-43fb-b414-e7c2abbcd64d/volumes" Dec 09 12:43:54 crc kubenswrapper[5002]: I1209 12:43:54.547769 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Dec 09 12:43:54 crc kubenswrapper[5002]: I1209 12:43:54.548036 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-copy-data" podUID="e89a37d0-8021-46be-bf1e-9e9f9b6b0869" containerName="adoption" containerID="cri-o://5755b257f1b347ee74e0fd5292a0d84b6a6e41938757236bbbba486d2e64c698" gracePeriod=30 Dec 09 12:44:00 crc kubenswrapper[5002]: I1209 12:44:00.061451 5002 scope.go:117] "RemoveContainer" containerID="2fa91ed46b52ab7a1e437a2bf0fbb9ef768679dcceaa767b67034d3abb00357b" Dec 09 12:44:00 crc kubenswrapper[5002]: E1209 12:44:00.062174 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:44:15 crc kubenswrapper[5002]: I1209 12:44:15.060570 5002 scope.go:117] "RemoveContainer" containerID="2fa91ed46b52ab7a1e437a2bf0fbb9ef768679dcceaa767b67034d3abb00357b" Dec 09 12:44:15 crc kubenswrapper[5002]: E1209 12:44:15.061429 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:44:20 crc kubenswrapper[5002]: I1209 12:44:20.358382 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-knxz2"] Dec 09 12:44:20 crc kubenswrapper[5002]: E1209 12:44:20.360602 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e448dd02-1116-4b33-9298-7053138072e9" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Dec 09 12:44:20 crc kubenswrapper[5002]: I1209 12:44:20.360691 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="e448dd02-1116-4b33-9298-7053138072e9" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Dec 09 12:44:20 crc kubenswrapper[5002]: E1209 12:44:20.360770 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd06493e-9ed3-4dec-848f-3108b431179e" containerName="extract-content" Dec 09 12:44:20 crc kubenswrapper[5002]: I1209 12:44:20.360844 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd06493e-9ed3-4dec-848f-3108b431179e" containerName="extract-content" Dec 09 12:44:20 crc kubenswrapper[5002]: E1209 12:44:20.360927 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64e3023c-35e4-43fb-b414-e7c2abbcd64d" containerName="adoption" Dec 09 12:44:20 crc kubenswrapper[5002]: I1209 12:44:20.360996 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="64e3023c-35e4-43fb-b414-e7c2abbcd64d" containerName="adoption" Dec 09 12:44:20 crc kubenswrapper[5002]: E1209 12:44:20.361067 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd06493e-9ed3-4dec-848f-3108b431179e" containerName="extract-utilities" Dec 09 12:44:20 crc kubenswrapper[5002]: I1209 12:44:20.361122 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd06493e-9ed3-4dec-848f-3108b431179e" containerName="extract-utilities" Dec 09 12:44:20 crc kubenswrapper[5002]: E1209 12:44:20.361198 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd06493e-9ed3-4dec-848f-3108b431179e" containerName="registry-server" Dec 09 12:44:20 crc kubenswrapper[5002]: I1209 12:44:20.361256 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd06493e-9ed3-4dec-848f-3108b431179e" containerName="registry-server" Dec 09 12:44:20 crc kubenswrapper[5002]: I1209 12:44:20.361522 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="e448dd02-1116-4b33-9298-7053138072e9" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Dec 09 12:44:20 crc kubenswrapper[5002]: I1209 12:44:20.361594 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd06493e-9ed3-4dec-848f-3108b431179e" containerName="registry-server" Dec 09 12:44:20 crc kubenswrapper[5002]: I1209 12:44:20.361667 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="64e3023c-35e4-43fb-b414-e7c2abbcd64d" containerName="adoption" Dec 09 12:44:20 crc kubenswrapper[5002]: I1209 12:44:20.363317 5002 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-knxz2" Dec 09 12:44:20 crc kubenswrapper[5002]: I1209 12:44:20.383871 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-knxz2"] Dec 09 12:44:20 crc kubenswrapper[5002]: I1209 12:44:20.493053 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27dce769-1505-4bcc-89cd-fef1e4393662-catalog-content\") pod \"certified-operators-knxz2\" (UID: \"27dce769-1505-4bcc-89cd-fef1e4393662\") " pod="openshift-marketplace/certified-operators-knxz2" Dec 09 12:44:20 crc kubenswrapper[5002]: I1209 12:44:20.494832 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27dce769-1505-4bcc-89cd-fef1e4393662-utilities\") pod \"certified-operators-knxz2\" (UID: \"27dce769-1505-4bcc-89cd-fef1e4393662\") " pod="openshift-marketplace/certified-operators-knxz2" Dec 09 12:44:20 crc kubenswrapper[5002]: I1209 12:44:20.495003 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8vvr\" (UniqueName: \"kubernetes.io/projected/27dce769-1505-4bcc-89cd-fef1e4393662-kube-api-access-v8vvr\") pod \"certified-operators-knxz2\" (UID: \"27dce769-1505-4bcc-89cd-fef1e4393662\") " pod="openshift-marketplace/certified-operators-knxz2" Dec 09 12:44:20 crc kubenswrapper[5002]: I1209 12:44:20.596860 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27dce769-1505-4bcc-89cd-fef1e4393662-catalog-content\") pod \"certified-operators-knxz2\" (UID: \"27dce769-1505-4bcc-89cd-fef1e4393662\") " pod="openshift-marketplace/certified-operators-knxz2" Dec 09 12:44:20 crc kubenswrapper[5002]: I1209 12:44:20.597203 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27dce769-1505-4bcc-89cd-fef1e4393662-utilities\") pod \"certified-operators-knxz2\" (UID: \"27dce769-1505-4bcc-89cd-fef1e4393662\") " pod="openshift-marketplace/certified-operators-knxz2" Dec 09 12:44:20 crc kubenswrapper[5002]: I1209 12:44:20.597479 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27dce769-1505-4bcc-89cd-fef1e4393662-catalog-content\") pod \"certified-operators-knxz2\" (UID: \"27dce769-1505-4bcc-89cd-fef1e4393662\") " pod="openshift-marketplace/certified-operators-knxz2" Dec 09 12:44:20 crc kubenswrapper[5002]: I1209 12:44:20.597607 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27dce769-1505-4bcc-89cd-fef1e4393662-utilities\") pod \"certified-operators-knxz2\" (UID: \"27dce769-1505-4bcc-89cd-fef1e4393662\") " pod="openshift-marketplace/certified-operators-knxz2" Dec 09 12:44:20 crc kubenswrapper[5002]: I1209 12:44:20.597672 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8vvr\" (UniqueName: \"kubernetes.io/projected/27dce769-1505-4bcc-89cd-fef1e4393662-kube-api-access-v8vvr\") pod \"certified-operators-knxz2\" (UID: \"27dce769-1505-4bcc-89cd-fef1e4393662\") " pod="openshift-marketplace/certified-operators-knxz2" Dec 09 12:44:20 crc kubenswrapper[5002]: I1209 12:44:20.617123 5002 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-v8vvr\" (UniqueName: \"kubernetes.io/projected/27dce769-1505-4bcc-89cd-fef1e4393662-kube-api-access-v8vvr\") pod \"certified-operators-knxz2\" (UID: \"27dce769-1505-4bcc-89cd-fef1e4393662\") " pod="openshift-marketplace/certified-operators-knxz2" Dec 09 12:44:20 crc kubenswrapper[5002]: I1209 12:44:20.752482 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-knxz2" Dec 09 12:44:21 crc kubenswrapper[5002]: I1209 12:44:21.305057 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-knxz2"] Dec 09 12:44:22 crc kubenswrapper[5002]: I1209 12:44:22.221357 5002 generic.go:334] "Generic (PLEG): container finished" podID="27dce769-1505-4bcc-89cd-fef1e4393662" containerID="c9c464527152173224567b1e3691b95d64e6e2c595a3901ab161f6244d33e095" exitCode=0 Dec 09 12:44:22 crc kubenswrapper[5002]: I1209 12:44:22.221612 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-knxz2" event={"ID":"27dce769-1505-4bcc-89cd-fef1e4393662","Type":"ContainerDied","Data":"c9c464527152173224567b1e3691b95d64e6e2c595a3901ab161f6244d33e095"} Dec 09 12:44:22 crc kubenswrapper[5002]: I1209 12:44:22.221728 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-knxz2" event={"ID":"27dce769-1505-4bcc-89cd-fef1e4393662","Type":"ContainerStarted","Data":"149bf435428fd2ef7696b094311af57cb6e12d31a9b6aec263fd3256b5681251"} Dec 09 12:44:22 crc kubenswrapper[5002]: I1209 12:44:22.223918 5002 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 12:44:23 crc kubenswrapper[5002]: I1209 12:44:23.233901 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-knxz2" event={"ID":"27dce769-1505-4bcc-89cd-fef1e4393662","Type":"ContainerStarted","Data":"2274085a67e70cc5a5ea69b0001d3655c73ba822c8fbb0c9c5e2a3a9962081a2"} Dec 09 12:44:24 crc kubenswrapper[5002]: I1209 12:44:24.244003 5002 generic.go:334] "Generic (PLEG): container finished" podID="27dce769-1505-4bcc-89cd-fef1e4393662" containerID="2274085a67e70cc5a5ea69b0001d3655c73ba822c8fbb0c9c5e2a3a9962081a2" exitCode=0 Dec 09 12:44:24 crc kubenswrapper[5002]: I1209 12:44:24.244361 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-knxz2" event={"ID":"27dce769-1505-4bcc-89cd-fef1e4393662","Type":"ContainerDied","Data":"2274085a67e70cc5a5ea69b0001d3655c73ba822c8fbb0c9c5e2a3a9962081a2"} Dec 09 12:44:25 crc kubenswrapper[5002]: I1209 12:44:25.030869 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Dec 09 12:44:25 crc kubenswrapper[5002]: I1209 12:44:25.108764 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99e3e2fb-eae6-4ac9-bc44-cb4e8f659d26\") pod \"e89a37d0-8021-46be-bf1e-9e9f9b6b0869\" (UID: \"e89a37d0-8021-46be-bf1e-9e9f9b6b0869\") " Dec 09 12:44:25 crc kubenswrapper[5002]: I1209 12:44:25.109878 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npwzn\" (UniqueName: \"kubernetes.io/projected/e89a37d0-8021-46be-bf1e-9e9f9b6b0869-kube-api-access-npwzn\") pod \"e89a37d0-8021-46be-bf1e-9e9f9b6b0869\" (UID: \"e89a37d0-8021-46be-bf1e-9e9f9b6b0869\") " Dec 09 12:44:25 crc kubenswrapper[5002]: I1209 12:44:25.109920 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/e89a37d0-8021-46be-bf1e-9e9f9b6b0869-ovn-data-cert\") pod \"e89a37d0-8021-46be-bf1e-9e9f9b6b0869\" (UID: \"e89a37d0-8021-46be-bf1e-9e9f9b6b0869\") " Dec 09 12:44:25 crc kubenswrapper[5002]: I1209 12:44:25.116303 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e89a37d0-8021-46be-bf1e-9e9f9b6b0869-ovn-data-cert" (OuterVolumeSpecName: "ovn-data-cert") pod "e89a37d0-8021-46be-bf1e-9e9f9b6b0869" (UID: "e89a37d0-8021-46be-bf1e-9e9f9b6b0869"). InnerVolumeSpecName "ovn-data-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:44:25 crc kubenswrapper[5002]: I1209 12:44:25.124152 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e89a37d0-8021-46be-bf1e-9e9f9b6b0869-kube-api-access-npwzn" (OuterVolumeSpecName: "kube-api-access-npwzn") pod "e89a37d0-8021-46be-bf1e-9e9f9b6b0869" (UID: "e89a37d0-8021-46be-bf1e-9e9f9b6b0869"). InnerVolumeSpecName "kube-api-access-npwzn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:44:25 crc kubenswrapper[5002]: I1209 12:44:25.128190 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99e3e2fb-eae6-4ac9-bc44-cb4e8f659d26" (OuterVolumeSpecName: "ovn-data") pod "e89a37d0-8021-46be-bf1e-9e9f9b6b0869" (UID: "e89a37d0-8021-46be-bf1e-9e9f9b6b0869"). InnerVolumeSpecName "pvc-99e3e2fb-eae6-4ac9-bc44-cb4e8f659d26". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 09 12:44:25 crc kubenswrapper[5002]: I1209 12:44:25.212284 5002 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-99e3e2fb-eae6-4ac9-bc44-cb4e8f659d26\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99e3e2fb-eae6-4ac9-bc44-cb4e8f659d26\") on node \"crc\" " Dec 09 12:44:25 crc kubenswrapper[5002]: I1209 12:44:25.212332 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npwzn\" (UniqueName: \"kubernetes.io/projected/e89a37d0-8021-46be-bf1e-9e9f9b6b0869-kube-api-access-npwzn\") on node \"crc\" DevicePath \"\"" Dec 09 12:44:25 crc kubenswrapper[5002]: I1209 12:44:25.212346 5002 reconciler_common.go:293] "Volume detached for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/e89a37d0-8021-46be-bf1e-9e9f9b6b0869-ovn-data-cert\") on node \"crc\" DevicePath \"\"" Dec 09 12:44:25 crc kubenswrapper[5002]: I1209 12:44:25.241382 5002 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. 
Skipping UnmountDevice... Dec 09 12:44:25 crc kubenswrapper[5002]: I1209 12:44:25.241541 5002 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-99e3e2fb-eae6-4ac9-bc44-cb4e8f659d26" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99e3e2fb-eae6-4ac9-bc44-cb4e8f659d26") on node "crc" Dec 09 12:44:25 crc kubenswrapper[5002]: I1209 12:44:25.259925 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-knxz2" event={"ID":"27dce769-1505-4bcc-89cd-fef1e4393662","Type":"ContainerStarted","Data":"d8b93e8a6c83167b48bc218d4cba3ca5a4c4cddc0bd5718c91ff27f4f9606a10"} Dec 09 12:44:25 crc kubenswrapper[5002]: I1209 12:44:25.267231 5002 generic.go:334] "Generic (PLEG): container finished" podID="e89a37d0-8021-46be-bf1e-9e9f9b6b0869" containerID="5755b257f1b347ee74e0fd5292a0d84b6a6e41938757236bbbba486d2e64c698" exitCode=137 Dec 09 12:44:25 crc kubenswrapper[5002]: I1209 12:44:25.267300 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"e89a37d0-8021-46be-bf1e-9e9f9b6b0869","Type":"ContainerDied","Data":"5755b257f1b347ee74e0fd5292a0d84b6a6e41938757236bbbba486d2e64c698"} Dec 09 12:44:25 crc kubenswrapper[5002]: I1209 12:44:25.267381 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"e89a37d0-8021-46be-bf1e-9e9f9b6b0869","Type":"ContainerDied","Data":"cf4c850ad41285eae556a4359ec47ff505da1290b49801d6f9cf9faa6e70a80c"} Dec 09 12:44:25 crc kubenswrapper[5002]: I1209 12:44:25.267322 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Dec 09 12:44:25 crc kubenswrapper[5002]: I1209 12:44:25.267414 5002 scope.go:117] "RemoveContainer" containerID="5755b257f1b347ee74e0fd5292a0d84b6a6e41938757236bbbba486d2e64c698" Dec 09 12:44:25 crc kubenswrapper[5002]: I1209 12:44:25.304846 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-knxz2" podStartSLOduration=2.853390147 podStartE2EDuration="5.304803986s" podCreationTimestamp="2025-12-09 12:44:20 +0000 UTC" firstStartedPulling="2025-12-09 12:44:22.223639072 +0000 UTC m=+9794.615690143" lastFinishedPulling="2025-12-09 12:44:24.675052901 +0000 UTC m=+9797.067103982" observedRunningTime="2025-12-09 12:44:25.281299104 +0000 UTC m=+9797.673350195" watchObservedRunningTime="2025-12-09 12:44:25.304803986 +0000 UTC m=+9797.696855067" Dec 09 12:44:25 crc kubenswrapper[5002]: I1209 12:44:25.308559 5002 scope.go:117] "RemoveContainer" containerID="5755b257f1b347ee74e0fd5292a0d84b6a6e41938757236bbbba486d2e64c698" Dec 09 12:44:25 crc kubenswrapper[5002]: E1209 12:44:25.309015 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5755b257f1b347ee74e0fd5292a0d84b6a6e41938757236bbbba486d2e64c698\": container with ID starting with 5755b257f1b347ee74e0fd5292a0d84b6a6e41938757236bbbba486d2e64c698 not found: ID does not exist" containerID="5755b257f1b347ee74e0fd5292a0d84b6a6e41938757236bbbba486d2e64c698" Dec 09 12:44:25 crc kubenswrapper[5002]: I1209 12:44:25.309051 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5755b257f1b347ee74e0fd5292a0d84b6a6e41938757236bbbba486d2e64c698"} err="failed to get container status \"5755b257f1b347ee74e0fd5292a0d84b6a6e41938757236bbbba486d2e64c698\": rpc error: code = NotFound desc = could not find container 
\"5755b257f1b347ee74e0fd5292a0d84b6a6e41938757236bbbba486d2e64c698\": container with ID starting with 5755b257f1b347ee74e0fd5292a0d84b6a6e41938757236bbbba486d2e64c698 not found: ID does not exist" Dec 09 12:44:25 crc kubenswrapper[5002]: I1209 12:44:25.314700 5002 reconciler_common.go:293] "Volume detached for volume \"pvc-99e3e2fb-eae6-4ac9-bc44-cb4e8f659d26\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99e3e2fb-eae6-4ac9-bc44-cb4e8f659d26\") on node \"crc\" DevicePath \"\"" Dec 09 12:44:25 crc kubenswrapper[5002]: I1209 12:44:25.357814 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Dec 09 12:44:25 crc kubenswrapper[5002]: I1209 12:44:25.369218 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-copy-data"] Dec 09 12:44:26 crc kubenswrapper[5002]: I1209 12:44:26.076763 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e89a37d0-8021-46be-bf1e-9e9f9b6b0869" path="/var/lib/kubelet/pods/e89a37d0-8021-46be-bf1e-9e9f9b6b0869/volumes" Dec 09 12:44:28 crc kubenswrapper[5002]: I1209 12:44:28.068639 5002 scope.go:117] "RemoveContainer" containerID="2fa91ed46b52ab7a1e437a2bf0fbb9ef768679dcceaa767b67034d3abb00357b" Dec 09 12:44:28 crc kubenswrapper[5002]: E1209 12:44:28.069322 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:44:30 crc kubenswrapper[5002]: I1209 12:44:30.752994 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-knxz2" Dec 09 12:44:30 crc kubenswrapper[5002]: I1209 12:44:30.753408 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-knxz2" Dec 09 12:44:30 crc kubenswrapper[5002]: I1209 12:44:30.802713 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-knxz2" Dec 09 12:44:31 crc kubenswrapper[5002]: I1209 12:44:31.374302 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-knxz2" Dec 09 12:44:31 crc kubenswrapper[5002]: I1209 12:44:31.425062 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-knxz2"] Dec 09 12:44:33 crc kubenswrapper[5002]: I1209 12:44:33.337134 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-knxz2" podUID="27dce769-1505-4bcc-89cd-fef1e4393662" containerName="registry-server" containerID="cri-o://d8b93e8a6c83167b48bc218d4cba3ca5a4c4cddc0bd5718c91ff27f4f9606a10" gracePeriod=2 Dec 09 12:44:33 crc kubenswrapper[5002]: I1209 12:44:33.815204 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-knxz2" Dec 09 12:44:33 crc kubenswrapper[5002]: I1209 12:44:33.910543 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8vvr\" (UniqueName: \"kubernetes.io/projected/27dce769-1505-4bcc-89cd-fef1e4393662-kube-api-access-v8vvr\") pod \"27dce769-1505-4bcc-89cd-fef1e4393662\" (UID: \"27dce769-1505-4bcc-89cd-fef1e4393662\") " Dec 09 12:44:33 crc kubenswrapper[5002]: I1209 12:44:33.910669 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27dce769-1505-4bcc-89cd-fef1e4393662-utilities\") pod \"27dce769-1505-4bcc-89cd-fef1e4393662\" (UID: \"27dce769-1505-4bcc-89cd-fef1e4393662\") " Dec 09 12:44:33 crc kubenswrapper[5002]: I1209 12:44:33.910721 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27dce769-1505-4bcc-89cd-fef1e4393662-catalog-content\") pod \"27dce769-1505-4bcc-89cd-fef1e4393662\" (UID: \"27dce769-1505-4bcc-89cd-fef1e4393662\") " Dec 09 12:44:33 crc kubenswrapper[5002]: I1209 12:44:33.911509 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27dce769-1505-4bcc-89cd-fef1e4393662-utilities" (OuterVolumeSpecName: "utilities") pod "27dce769-1505-4bcc-89cd-fef1e4393662" (UID: "27dce769-1505-4bcc-89cd-fef1e4393662"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:44:33 crc kubenswrapper[5002]: I1209 12:44:33.919045 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27dce769-1505-4bcc-89cd-fef1e4393662-kube-api-access-v8vvr" (OuterVolumeSpecName: "kube-api-access-v8vvr") pod "27dce769-1505-4bcc-89cd-fef1e4393662" (UID: "27dce769-1505-4bcc-89cd-fef1e4393662"). InnerVolumeSpecName "kube-api-access-v8vvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:44:34 crc kubenswrapper[5002]: I1209 12:44:34.014311 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8vvr\" (UniqueName: \"kubernetes.io/projected/27dce769-1505-4bcc-89cd-fef1e4393662-kube-api-access-v8vvr\") on node \"crc\" DevicePath \"\"" Dec 09 12:44:34 crc kubenswrapper[5002]: I1209 12:44:34.014374 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27dce769-1505-4bcc-89cd-fef1e4393662-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 12:44:34 crc kubenswrapper[5002]: I1209 12:44:34.313254 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27dce769-1505-4bcc-89cd-fef1e4393662-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "27dce769-1505-4bcc-89cd-fef1e4393662" (UID: "27dce769-1505-4bcc-89cd-fef1e4393662"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:44:34 crc kubenswrapper[5002]: I1209 12:44:34.320627 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27dce769-1505-4bcc-89cd-fef1e4393662-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 12:44:34 crc kubenswrapper[5002]: I1209 12:44:34.349843 5002 generic.go:334] "Generic (PLEG): container finished" podID="27dce769-1505-4bcc-89cd-fef1e4393662" containerID="d8b93e8a6c83167b48bc218d4cba3ca5a4c4cddc0bd5718c91ff27f4f9606a10" exitCode=0 Dec 09 12:44:34 crc kubenswrapper[5002]: I1209 12:44:34.349882 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-knxz2" Dec 09 12:44:34 crc kubenswrapper[5002]: I1209 12:44:34.349888 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-knxz2" event={"ID":"27dce769-1505-4bcc-89cd-fef1e4393662","Type":"ContainerDied","Data":"d8b93e8a6c83167b48bc218d4cba3ca5a4c4cddc0bd5718c91ff27f4f9606a10"} Dec 09 12:44:34 crc kubenswrapper[5002]: I1209 12:44:34.349917 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-knxz2" event={"ID":"27dce769-1505-4bcc-89cd-fef1e4393662","Type":"ContainerDied","Data":"149bf435428fd2ef7696b094311af57cb6e12d31a9b6aec263fd3256b5681251"} Dec 09 12:44:34 crc kubenswrapper[5002]: I1209 12:44:34.349937 5002 scope.go:117] "RemoveContainer" containerID="d8b93e8a6c83167b48bc218d4cba3ca5a4c4cddc0bd5718c91ff27f4f9606a10" Dec 09 12:44:34 crc kubenswrapper[5002]: I1209 12:44:34.372789 5002 scope.go:117] "RemoveContainer" containerID="2274085a67e70cc5a5ea69b0001d3655c73ba822c8fbb0c9c5e2a3a9962081a2" Dec 09 12:44:34 crc kubenswrapper[5002]: I1209 12:44:34.403355 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-knxz2"] Dec 09 12:44:34 crc kubenswrapper[5002]: I1209 12:44:34.414205 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-knxz2"] Dec 09 12:44:34 crc kubenswrapper[5002]: I1209 12:44:34.420058 5002 scope.go:117] "RemoveContainer" containerID="c9c464527152173224567b1e3691b95d64e6e2c595a3901ab161f6244d33e095" Dec 09 12:44:34 crc kubenswrapper[5002]: I1209 12:44:34.454282 5002 scope.go:117] "RemoveContainer" containerID="d8b93e8a6c83167b48bc218d4cba3ca5a4c4cddc0bd5718c91ff27f4f9606a10" Dec 09 12:44:34 crc kubenswrapper[5002]: E1209 12:44:34.454735 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8b93e8a6c83167b48bc218d4cba3ca5a4c4cddc0bd5718c91ff27f4f9606a10\": container with ID starting with d8b93e8a6c83167b48bc218d4cba3ca5a4c4cddc0bd5718c91ff27f4f9606a10 not found: ID does not exist" containerID="d8b93e8a6c83167b48bc218d4cba3ca5a4c4cddc0bd5718c91ff27f4f9606a10" Dec 09 12:44:34 crc kubenswrapper[5002]: I1209 12:44:34.454787 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8b93e8a6c83167b48bc218d4cba3ca5a4c4cddc0bd5718c91ff27f4f9606a10"} err="failed to get container status \"d8b93e8a6c83167b48bc218d4cba3ca5a4c4cddc0bd5718c91ff27f4f9606a10\": rpc error: code = NotFound desc = could not find container \"d8b93e8a6c83167b48bc218d4cba3ca5a4c4cddc0bd5718c91ff27f4f9606a10\": container with ID starting with d8b93e8a6c83167b48bc218d4cba3ca5a4c4cddc0bd5718c91ff27f4f9606a10 not found: ID does not exist" Dec 09 
12:44:34 crc kubenswrapper[5002]: I1209 12:44:34.454847 5002 scope.go:117] "RemoveContainer" containerID="2274085a67e70cc5a5ea69b0001d3655c73ba822c8fbb0c9c5e2a3a9962081a2" Dec 09 12:44:34 crc kubenswrapper[5002]: E1209 12:44:34.455138 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2274085a67e70cc5a5ea69b0001d3655c73ba822c8fbb0c9c5e2a3a9962081a2\": container with ID starting with 2274085a67e70cc5a5ea69b0001d3655c73ba822c8fbb0c9c5e2a3a9962081a2 not found: ID does not exist" containerID="2274085a67e70cc5a5ea69b0001d3655c73ba822c8fbb0c9c5e2a3a9962081a2" Dec 09 12:44:34 crc kubenswrapper[5002]: I1209 12:44:34.455164 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2274085a67e70cc5a5ea69b0001d3655c73ba822c8fbb0c9c5e2a3a9962081a2"} err="failed to get container status \"2274085a67e70cc5a5ea69b0001d3655c73ba822c8fbb0c9c5e2a3a9962081a2\": rpc error: code = NotFound desc = could not find container \"2274085a67e70cc5a5ea69b0001d3655c73ba822c8fbb0c9c5e2a3a9962081a2\": container with ID starting with 2274085a67e70cc5a5ea69b0001d3655c73ba822c8fbb0c9c5e2a3a9962081a2 not found: ID does not exist" Dec 09 12:44:34 crc kubenswrapper[5002]: I1209 12:44:34.455179 5002 scope.go:117] "RemoveContainer" containerID="c9c464527152173224567b1e3691b95d64e6e2c595a3901ab161f6244d33e095" Dec 09 12:44:34 crc kubenswrapper[5002]: E1209 12:44:34.455407 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9c464527152173224567b1e3691b95d64e6e2c595a3901ab161f6244d33e095\": container with ID starting with c9c464527152173224567b1e3691b95d64e6e2c595a3901ab161f6244d33e095 not found: ID does not exist" containerID="c9c464527152173224567b1e3691b95d64e6e2c595a3901ab161f6244d33e095" Dec 09 12:44:34 crc kubenswrapper[5002]: I1209 12:44:34.455427 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9c464527152173224567b1e3691b95d64e6e2c595a3901ab161f6244d33e095"} err="failed to get container status \"c9c464527152173224567b1e3691b95d64e6e2c595a3901ab161f6244d33e095\": rpc error: code = NotFound desc = could not find container \"c9c464527152173224567b1e3691b95d64e6e2c595a3901ab161f6244d33e095\": container with ID starting with c9c464527152173224567b1e3691b95d64e6e2c595a3901ab161f6244d33e095 not found: ID does not exist" Dec 09 12:44:36 crc kubenswrapper[5002]: I1209 12:44:36.071072 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27dce769-1505-4bcc-89cd-fef1e4393662" path="/var/lib/kubelet/pods/27dce769-1505-4bcc-89cd-fef1e4393662/volumes" Dec 09 12:44:40 crc kubenswrapper[5002]: I1209 12:44:40.060582 5002 scope.go:117] "RemoveContainer" containerID="2fa91ed46b52ab7a1e437a2bf0fbb9ef768679dcceaa767b67034d3abb00357b" Dec 09 12:44:40 crc kubenswrapper[5002]: E1209 12:44:40.061380 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:44:55 crc kubenswrapper[5002]: I1209 12:44:55.060122 5002 scope.go:117] "RemoveContainer" 
containerID="2fa91ed46b52ab7a1e437a2bf0fbb9ef768679dcceaa767b67034d3abb00357b" Dec 09 12:44:55 crc kubenswrapper[5002]: E1209 12:44:55.060881 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:45:00 crc kubenswrapper[5002]: I1209 12:45:00.153066 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421405-2xfn7"] Dec 09 12:45:00 crc kubenswrapper[5002]: E1209 12:45:00.154151 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27dce769-1505-4bcc-89cd-fef1e4393662" containerName="extract-content" Dec 09 12:45:00 crc kubenswrapper[5002]: I1209 12:45:00.154167 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="27dce769-1505-4bcc-89cd-fef1e4393662" containerName="extract-content" Dec 09 12:45:00 crc kubenswrapper[5002]: E1209 12:45:00.154211 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e89a37d0-8021-46be-bf1e-9e9f9b6b0869" containerName="adoption" Dec 09 12:45:00 crc kubenswrapper[5002]: I1209 12:45:00.154218 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="e89a37d0-8021-46be-bf1e-9e9f9b6b0869" containerName="adoption" Dec 09 12:45:00 crc kubenswrapper[5002]: E1209 12:45:00.154231 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27dce769-1505-4bcc-89cd-fef1e4393662" containerName="extract-utilities" Dec 09 12:45:00 crc kubenswrapper[5002]: I1209 12:45:00.154241 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="27dce769-1505-4bcc-89cd-fef1e4393662" containerName="extract-utilities" Dec 09 12:45:00 crc kubenswrapper[5002]: E1209 12:45:00.154273 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27dce769-1505-4bcc-89cd-fef1e4393662" containerName="registry-server" Dec 09 12:45:00 crc kubenswrapper[5002]: I1209 12:45:00.154280 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="27dce769-1505-4bcc-89cd-fef1e4393662" containerName="registry-server" Dec 09 12:45:00 crc kubenswrapper[5002]: I1209 12:45:00.154556 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="27dce769-1505-4bcc-89cd-fef1e4393662" containerName="registry-server" Dec 09 12:45:00 crc kubenswrapper[5002]: I1209 12:45:00.154579 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="e89a37d0-8021-46be-bf1e-9e9f9b6b0869" containerName="adoption" Dec 09 12:45:00 crc kubenswrapper[5002]: I1209 12:45:00.155506 5002 util.go:30] "No sandbox for pod can be found. 
Dec 09 12:45:00 crc kubenswrapper[5002]: I1209 12:45:00.158046 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 09 12:45:00 crc kubenswrapper[5002]: I1209 12:45:00.159239 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 09 12:45:00 crc kubenswrapper[5002]: I1209 12:45:00.168596 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421405-2xfn7"]
Dec 09 12:45:00 crc kubenswrapper[5002]: I1209 12:45:00.281270 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2bc3e2a0-63e3-4a6c-9221-c3b48849884b-secret-volume\") pod \"collect-profiles-29421405-2xfn7\" (UID: \"2bc3e2a0-63e3-4a6c-9221-c3b48849884b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421405-2xfn7"
Dec 09 12:45:00 crc kubenswrapper[5002]: I1209 12:45:00.281743 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx58j\" (UniqueName: \"kubernetes.io/projected/2bc3e2a0-63e3-4a6c-9221-c3b48849884b-kube-api-access-xx58j\") pod \"collect-profiles-29421405-2xfn7\" (UID: \"2bc3e2a0-63e3-4a6c-9221-c3b48849884b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421405-2xfn7"
Dec 09 12:45:00 crc kubenswrapper[5002]: I1209 12:45:00.281794 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2bc3e2a0-63e3-4a6c-9221-c3b48849884b-config-volume\") pod \"collect-profiles-29421405-2xfn7\" (UID: \"2bc3e2a0-63e3-4a6c-9221-c3b48849884b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421405-2xfn7"
Dec 09 12:45:00 crc kubenswrapper[5002]: I1209 12:45:00.384466 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx58j\" (UniqueName: \"kubernetes.io/projected/2bc3e2a0-63e3-4a6c-9221-c3b48849884b-kube-api-access-xx58j\") pod \"collect-profiles-29421405-2xfn7\" (UID: \"2bc3e2a0-63e3-4a6c-9221-c3b48849884b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421405-2xfn7"
Dec 09 12:45:00 crc kubenswrapper[5002]: I1209 12:45:00.384539 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2bc3e2a0-63e3-4a6c-9221-c3b48849884b-config-volume\") pod \"collect-profiles-29421405-2xfn7\" (UID: \"2bc3e2a0-63e3-4a6c-9221-c3b48849884b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421405-2xfn7"
Dec 09 12:45:00 crc kubenswrapper[5002]: I1209 12:45:00.384646 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2bc3e2a0-63e3-4a6c-9221-c3b48849884b-secret-volume\") pod \"collect-profiles-29421405-2xfn7\" (UID: \"2bc3e2a0-63e3-4a6c-9221-c3b48849884b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421405-2xfn7"
Dec 09 12:45:00 crc kubenswrapper[5002]: I1209 12:45:00.385871 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2bc3e2a0-63e3-4a6c-9221-c3b48849884b-config-volume\") pod \"collect-profiles-29421405-2xfn7\" (UID: \"2bc3e2a0-63e3-4a6c-9221-c3b48849884b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421405-2xfn7"
Dec 09 12:45:00 crc kubenswrapper[5002]: I1209 12:45:00.398720 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2bc3e2a0-63e3-4a6c-9221-c3b48849884b-secret-volume\") pod \"collect-profiles-29421405-2xfn7\" (UID: \"2bc3e2a0-63e3-4a6c-9221-c3b48849884b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421405-2xfn7"
Dec 09 12:45:00 crc kubenswrapper[5002]: I1209 12:45:00.401015 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx58j\" (UniqueName: \"kubernetes.io/projected/2bc3e2a0-63e3-4a6c-9221-c3b48849884b-kube-api-access-xx58j\") pod \"collect-profiles-29421405-2xfn7\" (UID: \"2bc3e2a0-63e3-4a6c-9221-c3b48849884b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421405-2xfn7"
Dec 09 12:45:00 crc kubenswrapper[5002]: I1209 12:45:00.484979 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421405-2xfn7"
Dec 09 12:45:00 crc kubenswrapper[5002]: I1209 12:45:00.953139 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421405-2xfn7"]
Dec 09 12:45:01 crc kubenswrapper[5002]: I1209 12:45:01.042435 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421405-2xfn7" event={"ID":"2bc3e2a0-63e3-4a6c-9221-c3b48849884b","Type":"ContainerStarted","Data":"4969c2ff1b64af6b1b342ad7c1fa20c7d1fc16dc1254e2bfcb727c28e7d31d4f"}
Dec 09 12:45:02 crc kubenswrapper[5002]: I1209 12:45:02.057796 5002 generic.go:334] "Generic (PLEG): container finished" podID="2bc3e2a0-63e3-4a6c-9221-c3b48849884b" containerID="0ccd464fbf8f19c2ad19c0ff77ef44e089806f834bb2f0c2c9714ecd0c513c69" exitCode=0
Dec 09 12:45:02 crc kubenswrapper[5002]: I1209 12:45:02.057854 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421405-2xfn7" event={"ID":"2bc3e2a0-63e3-4a6c-9221-c3b48849884b","Type":"ContainerDied","Data":"0ccd464fbf8f19c2ad19c0ff77ef44e089806f834bb2f0c2c9714ecd0c513c69"}
Dec 09 12:45:03 crc kubenswrapper[5002]: I1209 12:45:03.439547 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421405-2xfn7"
Dec 09 12:45:03 crc kubenswrapper[5002]: I1209 12:45:03.559218 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xx58j\" (UniqueName: \"kubernetes.io/projected/2bc3e2a0-63e3-4a6c-9221-c3b48849884b-kube-api-access-xx58j\") pod \"2bc3e2a0-63e3-4a6c-9221-c3b48849884b\" (UID: \"2bc3e2a0-63e3-4a6c-9221-c3b48849884b\") "
Dec 09 12:45:03 crc kubenswrapper[5002]: I1209 12:45:03.559540 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2bc3e2a0-63e3-4a6c-9221-c3b48849884b-config-volume\") pod \"2bc3e2a0-63e3-4a6c-9221-c3b48849884b\" (UID: \"2bc3e2a0-63e3-4a6c-9221-c3b48849884b\") "
Dec 09 12:45:03 crc kubenswrapper[5002]: I1209 12:45:03.559882 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2bc3e2a0-63e3-4a6c-9221-c3b48849884b-secret-volume\") pod \"2bc3e2a0-63e3-4a6c-9221-c3b48849884b\" (UID: \"2bc3e2a0-63e3-4a6c-9221-c3b48849884b\") "
Dec 09 12:45:03 crc kubenswrapper[5002]: I1209 12:45:03.560253 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bc3e2a0-63e3-4a6c-9221-c3b48849884b-config-volume" (OuterVolumeSpecName: "config-volume") pod "2bc3e2a0-63e3-4a6c-9221-c3b48849884b" (UID: "2bc3e2a0-63e3-4a6c-9221-c3b48849884b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 12:45:03 crc kubenswrapper[5002]: I1209 12:45:03.560720 5002 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2bc3e2a0-63e3-4a6c-9221-c3b48849884b-config-volume\") on node \"crc\" DevicePath \"\""
Dec 09 12:45:03 crc kubenswrapper[5002]: I1209 12:45:03.564440 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bc3e2a0-63e3-4a6c-9221-c3b48849884b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2bc3e2a0-63e3-4a6c-9221-c3b48849884b" (UID: "2bc3e2a0-63e3-4a6c-9221-c3b48849884b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 12:45:03 crc kubenswrapper[5002]: I1209 12:45:03.564879 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bc3e2a0-63e3-4a6c-9221-c3b48849884b-kube-api-access-xx58j" (OuterVolumeSpecName: "kube-api-access-xx58j") pod "2bc3e2a0-63e3-4a6c-9221-c3b48849884b" (UID: "2bc3e2a0-63e3-4a6c-9221-c3b48849884b"). InnerVolumeSpecName "kube-api-access-xx58j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 12:45:03 crc kubenswrapper[5002]: I1209 12:45:03.662649 5002 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2bc3e2a0-63e3-4a6c-9221-c3b48849884b-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 09 12:45:03 crc kubenswrapper[5002]: I1209 12:45:03.662923 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xx58j\" (UniqueName: \"kubernetes.io/projected/2bc3e2a0-63e3-4a6c-9221-c3b48849884b-kube-api-access-xx58j\") on node \"crc\" DevicePath \"\""
Dec 09 12:45:04 crc kubenswrapper[5002]: I1209 12:45:04.077504 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421405-2xfn7" event={"ID":"2bc3e2a0-63e3-4a6c-9221-c3b48849884b","Type":"ContainerDied","Data":"4969c2ff1b64af6b1b342ad7c1fa20c7d1fc16dc1254e2bfcb727c28e7d31d4f"}
Dec 09 12:45:04 crc kubenswrapper[5002]: I1209 12:45:04.077543 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4969c2ff1b64af6b1b342ad7c1fa20c7d1fc16dc1254e2bfcb727c28e7d31d4f"
Dec 09 12:45:04 crc kubenswrapper[5002]: I1209 12:45:04.077552 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421405-2xfn7"
Dec 09 12:45:04 crc kubenswrapper[5002]: I1209 12:45:04.521094 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421360-lqkxh"]
Dec 09 12:45:04 crc kubenswrapper[5002]: I1209 12:45:04.533025 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421360-lqkxh"]
Dec 09 12:45:06 crc kubenswrapper[5002]: I1209 12:45:06.074093 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e8fe753-2f2c-4a3d-9287-6854676262e7" path="/var/lib/kubelet/pods/8e8fe753-2f2c-4a3d-9287-6854676262e7/volumes"
Dec 09 12:45:08 crc kubenswrapper[5002]: I1209 12:45:08.069415 5002 scope.go:117] "RemoveContainer" containerID="2fa91ed46b52ab7a1e437a2bf0fbb9ef768679dcceaa767b67034d3abb00357b"
Dec 09 12:45:08 crc kubenswrapper[5002]: E1209 12:45:08.069962 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 12:45:20 crc kubenswrapper[5002]: I1209 12:45:20.060161 5002 scope.go:117] "RemoveContainer" containerID="2fa91ed46b52ab7a1e437a2bf0fbb9ef768679dcceaa767b67034d3abb00357b"
Dec 09 12:45:20 crc kubenswrapper[5002]: E1209 12:45:20.060955 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 12:45:33 crc kubenswrapper[5002]: I1209 12:45:33.063126 5002 scope.go:117] "RemoveContainer" containerID="2fa91ed46b52ab7a1e437a2bf0fbb9ef768679dcceaa767b67034d3abb00357b"
Dec 09 12:45:33 crc kubenswrapper[5002]: E1209 12:45:33.064043 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 12:45:33 crc kubenswrapper[5002]: I1209 12:45:33.403464 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-x7gz9/must-gather-85wlw"]
Dec 09 12:45:33 crc kubenswrapper[5002]: E1209 12:45:33.403933 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bc3e2a0-63e3-4a6c-9221-c3b48849884b" containerName="collect-profiles"
Dec 09 12:45:33 crc kubenswrapper[5002]: I1209 12:45:33.403950 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bc3e2a0-63e3-4a6c-9221-c3b48849884b" containerName="collect-profiles"
Dec 09 12:45:33 crc kubenswrapper[5002]: I1209 12:45:33.404137 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bc3e2a0-63e3-4a6c-9221-c3b48849884b" containerName="collect-profiles"
Dec 09 12:45:33 crc kubenswrapper[5002]: I1209 12:45:33.405278 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x7gz9/must-gather-85wlw"
Dec 09 12:45:33 crc kubenswrapper[5002]: I1209 12:45:33.407226 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-x7gz9"/"kube-root-ca.crt"
Dec 09 12:45:33 crc kubenswrapper[5002]: I1209 12:45:33.413232 5002 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-x7gz9"/"default-dockercfg-h4zxt"
Dec 09 12:45:33 crc kubenswrapper[5002]: I1209 12:45:33.413620 5002 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-x7gz9"/"openshift-service-ca.crt"
Dec 09 12:45:33 crc kubenswrapper[5002]: I1209 12:45:33.418252 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x7gz9/must-gather-85wlw"]
Dec 09 12:45:33 crc kubenswrapper[5002]: I1209 12:45:33.532848 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9wt7\" (UniqueName: \"kubernetes.io/projected/1d3dbff2-1469-42d0-84d4-71bc89cf68f9-kube-api-access-h9wt7\") pod \"must-gather-85wlw\" (UID: \"1d3dbff2-1469-42d0-84d4-71bc89cf68f9\") " pod="openshift-must-gather-x7gz9/must-gather-85wlw"
Dec 09 12:45:33 crc kubenswrapper[5002]: I1209 12:45:33.533211 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1d3dbff2-1469-42d0-84d4-71bc89cf68f9-must-gather-output\") pod \"must-gather-85wlw\" (UID: \"1d3dbff2-1469-42d0-84d4-71bc89cf68f9\") " pod="openshift-must-gather-x7gz9/must-gather-85wlw"
Dec 09 12:45:33 crc kubenswrapper[5002]: I1209 12:45:33.635621 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9wt7\" (UniqueName: \"kubernetes.io/projected/1d3dbff2-1469-42d0-84d4-71bc89cf68f9-kube-api-access-h9wt7\") pod \"must-gather-85wlw\" (UID: \"1d3dbff2-1469-42d0-84d4-71bc89cf68f9\") " pod="openshift-must-gather-x7gz9/must-gather-85wlw"
Dec 09 12:45:33 crc kubenswrapper[5002]: I1209 12:45:33.635837 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1d3dbff2-1469-42d0-84d4-71bc89cf68f9-must-gather-output\") pod \"must-gather-85wlw\" (UID: \"1d3dbff2-1469-42d0-84d4-71bc89cf68f9\") " pod="openshift-must-gather-x7gz9/must-gather-85wlw"
Dec 09 12:45:33 crc kubenswrapper[5002]: I1209 12:45:33.636360 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1d3dbff2-1469-42d0-84d4-71bc89cf68f9-must-gather-output\") pod \"must-gather-85wlw\" (UID: \"1d3dbff2-1469-42d0-84d4-71bc89cf68f9\") " pod="openshift-must-gather-x7gz9/must-gather-85wlw"
Dec 09 12:45:33 crc kubenswrapper[5002]: I1209 12:45:33.654587 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9wt7\" (UniqueName: \"kubernetes.io/projected/1d3dbff2-1469-42d0-84d4-71bc89cf68f9-kube-api-access-h9wt7\") pod \"must-gather-85wlw\" (UID: \"1d3dbff2-1469-42d0-84d4-71bc89cf68f9\") " pod="openshift-must-gather-x7gz9/must-gather-85wlw"
Dec 09 12:45:33 crc kubenswrapper[5002]: I1209 12:45:33.730681 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x7gz9/must-gather-85wlw"
Dec 09 12:45:34 crc kubenswrapper[5002]: I1209 12:45:34.254977 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x7gz9/must-gather-85wlw"]
Dec 09 12:45:34 crc kubenswrapper[5002]: I1209 12:45:34.378613 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x7gz9/must-gather-85wlw" event={"ID":"1d3dbff2-1469-42d0-84d4-71bc89cf68f9","Type":"ContainerStarted","Data":"54adf994c53164c31746e438e45051ca1bc68dd72835fd36a4ab4665ce28969e"}
Dec 09 12:45:42 crc kubenswrapper[5002]: I1209 12:45:42.459110 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x7gz9/must-gather-85wlw" event={"ID":"1d3dbff2-1469-42d0-84d4-71bc89cf68f9","Type":"ContainerStarted","Data":"4688a9782d27d6a6545b61de56a652004422a0f1de22907ca5b3ade35c1659ba"}
Dec 09 12:45:42 crc kubenswrapper[5002]: I1209 12:45:42.459743 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x7gz9/must-gather-85wlw" event={"ID":"1d3dbff2-1469-42d0-84d4-71bc89cf68f9","Type":"ContainerStarted","Data":"63ebb7f2d4affbc1e84f6cccb4488e35ed2d4e24cdebd892a6f60684f2a0b357"}
Dec 09 12:45:42 crc kubenswrapper[5002]: I1209 12:45:42.486710 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-x7gz9/must-gather-85wlw" podStartSLOduration=2.381358336 podStartE2EDuration="9.486675122s" podCreationTimestamp="2025-12-09 12:45:33 +0000 UTC" firstStartedPulling="2025-12-09 12:45:34.268966543 +0000 UTC m=+9866.661017624" lastFinishedPulling="2025-12-09 12:45:41.374283329 +0000 UTC m=+9873.766334410" observedRunningTime="2025-12-09 12:45:42.475956524 +0000 UTC m=+9874.868007615" watchObservedRunningTime="2025-12-09 12:45:42.486675122 +0000 UTC m=+9874.878726203"
Dec 09 12:45:46 crc kubenswrapper[5002]: I1209 12:45:46.243095 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-x7gz9/crc-debug-z2lj2"]
Dec 09 12:45:46 crc kubenswrapper[5002]: I1209 12:45:46.245610 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x7gz9/crc-debug-z2lj2"
Dec 09 12:45:46 crc kubenswrapper[5002]: I1209 12:45:46.404193 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7fd2f0f7-9ece-4664-8450-7d0dd02e5ceb-host\") pod \"crc-debug-z2lj2\" (UID: \"7fd2f0f7-9ece-4664-8450-7d0dd02e5ceb\") " pod="openshift-must-gather-x7gz9/crc-debug-z2lj2"
Dec 09 12:45:46 crc kubenswrapper[5002]: I1209 12:45:46.404247 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6rpw\" (UniqueName: \"kubernetes.io/projected/7fd2f0f7-9ece-4664-8450-7d0dd02e5ceb-kube-api-access-l6rpw\") pod \"crc-debug-z2lj2\" (UID: \"7fd2f0f7-9ece-4664-8450-7d0dd02e5ceb\") " pod="openshift-must-gather-x7gz9/crc-debug-z2lj2"
Dec 09 12:45:46 crc kubenswrapper[5002]: I1209 12:45:46.506470 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7fd2f0f7-9ece-4664-8450-7d0dd02e5ceb-host\") pod \"crc-debug-z2lj2\" (UID: \"7fd2f0f7-9ece-4664-8450-7d0dd02e5ceb\") " pod="openshift-must-gather-x7gz9/crc-debug-z2lj2"
Dec 09 12:45:46 crc kubenswrapper[5002]: I1209 12:45:46.506544 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6rpw\" (UniqueName: \"kubernetes.io/projected/7fd2f0f7-9ece-4664-8450-7d0dd02e5ceb-kube-api-access-l6rpw\") pod \"crc-debug-z2lj2\" (UID: \"7fd2f0f7-9ece-4664-8450-7d0dd02e5ceb\") " pod="openshift-must-gather-x7gz9/crc-debug-z2lj2"
Dec 09 12:45:46 crc kubenswrapper[5002]: I1209 12:45:46.506677 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7fd2f0f7-9ece-4664-8450-7d0dd02e5ceb-host\") pod \"crc-debug-z2lj2\" (UID: \"7fd2f0f7-9ece-4664-8450-7d0dd02e5ceb\") " pod="openshift-must-gather-x7gz9/crc-debug-z2lj2"
Dec 09 12:45:46 crc kubenswrapper[5002]: I1209 12:45:46.527643 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6rpw\" (UniqueName: \"kubernetes.io/projected/7fd2f0f7-9ece-4664-8450-7d0dd02e5ceb-kube-api-access-l6rpw\") pod \"crc-debug-z2lj2\" (UID: \"7fd2f0f7-9ece-4664-8450-7d0dd02e5ceb\") " pod="openshift-must-gather-x7gz9/crc-debug-z2lj2"
Dec 09 12:45:46 crc kubenswrapper[5002]: I1209 12:45:46.568336 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x7gz9/crc-debug-z2lj2"
Dec 09 12:45:46 crc kubenswrapper[5002]: W1209 12:45:46.626832 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7fd2f0f7_9ece_4664_8450_7d0dd02e5ceb.slice/crio-46ed0dfe3af65a0691bbb9c3f95d1c4f4855a87852f27e8190326b511dbebc83 WatchSource:0}: Error finding container 46ed0dfe3af65a0691bbb9c3f95d1c4f4855a87852f27e8190326b511dbebc83: Status 404 returned error can't find the container with id 46ed0dfe3af65a0691bbb9c3f95d1c4f4855a87852f27e8190326b511dbebc83
Dec 09 12:45:47 crc kubenswrapper[5002]: I1209 12:45:47.061199 5002 scope.go:117] "RemoveContainer" containerID="2fa91ed46b52ab7a1e437a2bf0fbb9ef768679dcceaa767b67034d3abb00357b"
Dec 09 12:45:47 crc kubenswrapper[5002]: E1209 12:45:47.061793 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 12:45:47 crc kubenswrapper[5002]: I1209 12:45:47.511210 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x7gz9/crc-debug-z2lj2" event={"ID":"7fd2f0f7-9ece-4664-8450-7d0dd02e5ceb","Type":"ContainerStarted","Data":"46ed0dfe3af65a0691bbb9c3f95d1c4f4855a87852f27e8190326b511dbebc83"}
Dec 09 12:45:58 crc kubenswrapper[5002]: I1209 12:45:58.345579 5002 scope.go:117] "RemoveContainer" containerID="9dd0b6e228adb197f9ddcf57ceee5e17f5be7e55833e7c886bfd14d89e09a9ae"
Dec 09 12:45:59 crc kubenswrapper[5002]: I1209 12:45:59.633515 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x7gz9/crc-debug-z2lj2" event={"ID":"7fd2f0f7-9ece-4664-8450-7d0dd02e5ceb","Type":"ContainerStarted","Data":"9fdc15fc9c73bd0cd85aa9b68a692f7ad21e112c5a041c44d24a46bb706222c0"}
Dec 09 12:45:59 crc kubenswrapper[5002]: I1209 12:45:59.655559 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-x7gz9/crc-debug-z2lj2" podStartSLOduration=1.7319702609999998 podStartE2EDuration="13.655532023s" podCreationTimestamp="2025-12-09 12:45:46 +0000 UTC" firstStartedPulling="2025-12-09 12:45:46.632713471 +0000 UTC m=+9879.024764552" lastFinishedPulling="2025-12-09 12:45:58.556275233 +0000 UTC m=+9890.948326314" observedRunningTime="2025-12-09 12:45:59.646203003 +0000 UTC m=+9892.038254084" watchObservedRunningTime="2025-12-09 12:45:59.655532023 +0000 UTC m=+9892.047583124"
Dec 09 12:46:02 crc kubenswrapper[5002]: I1209 12:46:02.060441 5002 scope.go:117] "RemoveContainer" containerID="2fa91ed46b52ab7a1e437a2bf0fbb9ef768679dcceaa767b67034d3abb00357b"
Dec 09 12:46:02 crc kubenswrapper[5002]: E1209 12:46:02.061325 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 12:46:15 crc kubenswrapper[5002]: I1209 12:46:15.060101 5002 scope.go:117] "RemoveContainer" containerID="2fa91ed46b52ab7a1e437a2bf0fbb9ef768679dcceaa767b67034d3abb00357b"
containerID="2fa91ed46b52ab7a1e437a2bf0fbb9ef768679dcceaa767b67034d3abb00357b" Dec 09 12:46:16 crc kubenswrapper[5002]: I1209 12:46:16.854185 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerStarted","Data":"2b189dbc4da5bd17d3dbe1a55a72bd7d5ec553e1d7ef1f793ee32247b44861fa"} Dec 09 12:46:17 crc kubenswrapper[5002]: I1209 12:46:17.658575 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-q5fmn"] Dec 09 12:46:17 crc kubenswrapper[5002]: I1209 12:46:17.662254 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q5fmn" Dec 09 12:46:17 crc kubenswrapper[5002]: I1209 12:46:17.669668 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q5fmn"] Dec 09 12:46:17 crc kubenswrapper[5002]: I1209 12:46:17.719337 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a948f8d-e351-4382-830b-ee25cd35e3b6-utilities\") pod \"redhat-marketplace-q5fmn\" (UID: \"1a948f8d-e351-4382-830b-ee25cd35e3b6\") " pod="openshift-marketplace/redhat-marketplace-q5fmn" Dec 09 12:46:17 crc kubenswrapper[5002]: I1209 12:46:17.719425 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl9ft\" (UniqueName: \"kubernetes.io/projected/1a948f8d-e351-4382-830b-ee25cd35e3b6-kube-api-access-cl9ft\") pod \"redhat-marketplace-q5fmn\" (UID: \"1a948f8d-e351-4382-830b-ee25cd35e3b6\") " pod="openshift-marketplace/redhat-marketplace-q5fmn" Dec 09 12:46:17 crc kubenswrapper[5002]: I1209 12:46:17.719628 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a948f8d-e351-4382-830b-ee25cd35e3b6-catalog-content\") pod \"redhat-marketplace-q5fmn\" (UID: \"1a948f8d-e351-4382-830b-ee25cd35e3b6\") " pod="openshift-marketplace/redhat-marketplace-q5fmn" Dec 09 12:46:17 crc kubenswrapper[5002]: I1209 12:46:17.822063 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a948f8d-e351-4382-830b-ee25cd35e3b6-catalog-content\") pod \"redhat-marketplace-q5fmn\" (UID: \"1a948f8d-e351-4382-830b-ee25cd35e3b6\") " pod="openshift-marketplace/redhat-marketplace-q5fmn" Dec 09 12:46:17 crc kubenswrapper[5002]: I1209 12:46:17.822147 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a948f8d-e351-4382-830b-ee25cd35e3b6-utilities\") pod \"redhat-marketplace-q5fmn\" (UID: \"1a948f8d-e351-4382-830b-ee25cd35e3b6\") " pod="openshift-marketplace/redhat-marketplace-q5fmn" Dec 09 12:46:17 crc kubenswrapper[5002]: I1209 12:46:17.822190 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cl9ft\" (UniqueName: \"kubernetes.io/projected/1a948f8d-e351-4382-830b-ee25cd35e3b6-kube-api-access-cl9ft\") pod \"redhat-marketplace-q5fmn\" (UID: \"1a948f8d-e351-4382-830b-ee25cd35e3b6\") " pod="openshift-marketplace/redhat-marketplace-q5fmn" Dec 09 12:46:17 crc kubenswrapper[5002]: I1209 12:46:17.822952 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/1a948f8d-e351-4382-830b-ee25cd35e3b6-utilities\") pod \"redhat-marketplace-q5fmn\" (UID: \"1a948f8d-e351-4382-830b-ee25cd35e3b6\") " pod="openshift-marketplace/redhat-marketplace-q5fmn" Dec 09 12:46:17 crc kubenswrapper[5002]: I1209 12:46:17.823075 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a948f8d-e351-4382-830b-ee25cd35e3b6-catalog-content\") pod \"redhat-marketplace-q5fmn\" (UID: \"1a948f8d-e351-4382-830b-ee25cd35e3b6\") " pod="openshift-marketplace/redhat-marketplace-q5fmn" Dec 09 12:46:17 crc kubenswrapper[5002]: I1209 12:46:17.859635 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl9ft\" (UniqueName: \"kubernetes.io/projected/1a948f8d-e351-4382-830b-ee25cd35e3b6-kube-api-access-cl9ft\") pod \"redhat-marketplace-q5fmn\" (UID: \"1a948f8d-e351-4382-830b-ee25cd35e3b6\") " pod="openshift-marketplace/redhat-marketplace-q5fmn" Dec 09 12:46:17 crc kubenswrapper[5002]: I1209 12:46:17.983852 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q5fmn" Dec 09 12:46:18 crc kubenswrapper[5002]: I1209 12:46:18.749542 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q5fmn"] Dec 09 12:46:18 crc kubenswrapper[5002]: I1209 12:46:18.886565 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q5fmn" event={"ID":"1a948f8d-e351-4382-830b-ee25cd35e3b6","Type":"ContainerStarted","Data":"09d1112af4679b5d69c00010b306e166156739fbb1685ba42adb4a163d42d012"} Dec 09 12:46:18 crc kubenswrapper[5002]: I1209 12:46:18.893525 5002 generic.go:334] "Generic (PLEG): container finished" podID="7fd2f0f7-9ece-4664-8450-7d0dd02e5ceb" containerID="9fdc15fc9c73bd0cd85aa9b68a692f7ad21e112c5a041c44d24a46bb706222c0" exitCode=0 Dec 09 12:46:18 crc kubenswrapper[5002]: I1209 12:46:18.893567 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x7gz9/crc-debug-z2lj2" event={"ID":"7fd2f0f7-9ece-4664-8450-7d0dd02e5ceb","Type":"ContainerDied","Data":"9fdc15fc9c73bd0cd85aa9b68a692f7ad21e112c5a041c44d24a46bb706222c0"} Dec 09 12:46:19 crc kubenswrapper[5002]: I1209 12:46:19.905207 5002 generic.go:334] "Generic (PLEG): container finished" podID="1a948f8d-e351-4382-830b-ee25cd35e3b6" containerID="2824381e06efe240369b224895ea18ff77c765ab61308dbd951c2c8ed47f3fd2" exitCode=0 Dec 09 12:46:19 crc kubenswrapper[5002]: I1209 12:46:19.905268 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q5fmn" event={"ID":"1a948f8d-e351-4382-830b-ee25cd35e3b6","Type":"ContainerDied","Data":"2824381e06efe240369b224895ea18ff77c765ab61308dbd951c2c8ed47f3fd2"} Dec 09 12:46:20 crc kubenswrapper[5002]: I1209 12:46:20.053045 5002 util.go:48] "No ready sandbox for pod can be found. 
Dec 09 12:46:20 crc kubenswrapper[5002]: I1209 12:46:20.103769 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-x7gz9/crc-debug-z2lj2"]
Dec 09 12:46:20 crc kubenswrapper[5002]: I1209 12:46:20.126117 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-x7gz9/crc-debug-z2lj2"]
Dec 09 12:46:20 crc kubenswrapper[5002]: I1209 12:46:20.185036 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7fd2f0f7-9ece-4664-8450-7d0dd02e5ceb-host\") pod \"7fd2f0f7-9ece-4664-8450-7d0dd02e5ceb\" (UID: \"7fd2f0f7-9ece-4664-8450-7d0dd02e5ceb\") "
Dec 09 12:46:20 crc kubenswrapper[5002]: I1209 12:46:20.185168 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6rpw\" (UniqueName: \"kubernetes.io/projected/7fd2f0f7-9ece-4664-8450-7d0dd02e5ceb-kube-api-access-l6rpw\") pod \"7fd2f0f7-9ece-4664-8450-7d0dd02e5ceb\" (UID: \"7fd2f0f7-9ece-4664-8450-7d0dd02e5ceb\") "
Dec 09 12:46:20 crc kubenswrapper[5002]: I1209 12:46:20.187349 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7fd2f0f7-9ece-4664-8450-7d0dd02e5ceb-host" (OuterVolumeSpecName: "host") pod "7fd2f0f7-9ece-4664-8450-7d0dd02e5ceb" (UID: "7fd2f0f7-9ece-4664-8450-7d0dd02e5ceb"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 09 12:46:20 crc kubenswrapper[5002]: I1209 12:46:20.233632 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fd2f0f7-9ece-4664-8450-7d0dd02e5ceb-kube-api-access-l6rpw" (OuterVolumeSpecName: "kube-api-access-l6rpw") pod "7fd2f0f7-9ece-4664-8450-7d0dd02e5ceb" (UID: "7fd2f0f7-9ece-4664-8450-7d0dd02e5ceb"). InnerVolumeSpecName "kube-api-access-l6rpw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 12:46:20 crc kubenswrapper[5002]: I1209 12:46:20.287865 5002 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7fd2f0f7-9ece-4664-8450-7d0dd02e5ceb-host\") on node \"crc\" DevicePath \"\""
Dec 09 12:46:20 crc kubenswrapper[5002]: I1209 12:46:20.287909 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6rpw\" (UniqueName: \"kubernetes.io/projected/7fd2f0f7-9ece-4664-8450-7d0dd02e5ceb-kube-api-access-l6rpw\") on node \"crc\" DevicePath \"\""
Dec 09 12:46:20 crc kubenswrapper[5002]: I1209 12:46:20.923555 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q5fmn" event={"ID":"1a948f8d-e351-4382-830b-ee25cd35e3b6","Type":"ContainerStarted","Data":"5109254cc9190f10933c09502783efb6e961bf116ede562d508aa93ba14c892d"}
Dec 09 12:46:20 crc kubenswrapper[5002]: I1209 12:46:20.930287 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46ed0dfe3af65a0691bbb9c3f95d1c4f4855a87852f27e8190326b511dbebc83"
Dec 09 12:46:20 crc kubenswrapper[5002]: I1209 12:46:20.930369 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x7gz9/crc-debug-z2lj2"
Dec 09 12:46:21 crc kubenswrapper[5002]: I1209 12:46:21.329259 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-x7gz9/crc-debug-kkbd7"]
Dec 09 12:46:21 crc kubenswrapper[5002]: E1209 12:46:21.330015 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fd2f0f7-9ece-4664-8450-7d0dd02e5ceb" containerName="container-00"
Dec 09 12:46:21 crc kubenswrapper[5002]: I1209 12:46:21.330027 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fd2f0f7-9ece-4664-8450-7d0dd02e5ceb" containerName="container-00"
Dec 09 12:46:21 crc kubenswrapper[5002]: I1209 12:46:21.330248 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fd2f0f7-9ece-4664-8450-7d0dd02e5ceb" containerName="container-00"
Dec 09 12:46:21 crc kubenswrapper[5002]: I1209 12:46:21.331088 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x7gz9/crc-debug-kkbd7"
Dec 09 12:46:21 crc kubenswrapper[5002]: I1209 12:46:21.513363 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tj52\" (UniqueName: \"kubernetes.io/projected/f54048fb-5c29-4ff7-866c-4e9b28fc85ff-kube-api-access-7tj52\") pod \"crc-debug-kkbd7\" (UID: \"f54048fb-5c29-4ff7-866c-4e9b28fc85ff\") " pod="openshift-must-gather-x7gz9/crc-debug-kkbd7"
Dec 09 12:46:21 crc kubenswrapper[5002]: I1209 12:46:21.513499 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f54048fb-5c29-4ff7-866c-4e9b28fc85ff-host\") pod \"crc-debug-kkbd7\" (UID: \"f54048fb-5c29-4ff7-866c-4e9b28fc85ff\") " pod="openshift-must-gather-x7gz9/crc-debug-kkbd7"
Dec 09 12:46:21 crc kubenswrapper[5002]: I1209 12:46:21.614961 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f54048fb-5c29-4ff7-866c-4e9b28fc85ff-host\") pod \"crc-debug-kkbd7\" (UID: \"f54048fb-5c29-4ff7-866c-4e9b28fc85ff\") " pod="openshift-must-gather-x7gz9/crc-debug-kkbd7"
Dec 09 12:46:21 crc kubenswrapper[5002]: I1209 12:46:21.615100 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tj52\" (UniqueName: \"kubernetes.io/projected/f54048fb-5c29-4ff7-866c-4e9b28fc85ff-kube-api-access-7tj52\") pod \"crc-debug-kkbd7\" (UID: \"f54048fb-5c29-4ff7-866c-4e9b28fc85ff\") " pod="openshift-must-gather-x7gz9/crc-debug-kkbd7"
Dec 09 12:46:21 crc kubenswrapper[5002]: I1209 12:46:21.615138 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f54048fb-5c29-4ff7-866c-4e9b28fc85ff-host\") pod \"crc-debug-kkbd7\" (UID: \"f54048fb-5c29-4ff7-866c-4e9b28fc85ff\") " pod="openshift-must-gather-x7gz9/crc-debug-kkbd7"
Dec 09 12:46:21 crc kubenswrapper[5002]: I1209 12:46:21.637282 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tj52\" (UniqueName: \"kubernetes.io/projected/f54048fb-5c29-4ff7-866c-4e9b28fc85ff-kube-api-access-7tj52\") pod \"crc-debug-kkbd7\" (UID: \"f54048fb-5c29-4ff7-866c-4e9b28fc85ff\") " pod="openshift-must-gather-x7gz9/crc-debug-kkbd7"
Dec 09 12:46:21 crc kubenswrapper[5002]: I1209 12:46:21.650513 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x7gz9/crc-debug-kkbd7"
Dec 09 12:46:21 crc kubenswrapper[5002]: W1209 12:46:21.684012 5002 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf54048fb_5c29_4ff7_866c_4e9b28fc85ff.slice/crio-5931e4c250f13d25b99bd55c3b0f2ea4957dc311b5cac5b7a7d5b5f026d06262 WatchSource:0}: Error finding container 5931e4c250f13d25b99bd55c3b0f2ea4957dc311b5cac5b7a7d5b5f026d06262: Status 404 returned error can't find the container with id 5931e4c250f13d25b99bd55c3b0f2ea4957dc311b5cac5b7a7d5b5f026d06262
Dec 09 12:46:21 crc kubenswrapper[5002]: I1209 12:46:21.940667 5002 generic.go:334] "Generic (PLEG): container finished" podID="f54048fb-5c29-4ff7-866c-4e9b28fc85ff" containerID="c24c20ac52fa0c2b24d7cb499d28c52e3456cd8621d719d8c090e1fdee84cee2" exitCode=1
Dec 09 12:46:21 crc kubenswrapper[5002]: I1209 12:46:21.940737 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x7gz9/crc-debug-kkbd7" event={"ID":"f54048fb-5c29-4ff7-866c-4e9b28fc85ff","Type":"ContainerDied","Data":"c24c20ac52fa0c2b24d7cb499d28c52e3456cd8621d719d8c090e1fdee84cee2"}
Dec 09 12:46:21 crc kubenswrapper[5002]: I1209 12:46:21.941005 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x7gz9/crc-debug-kkbd7" event={"ID":"f54048fb-5c29-4ff7-866c-4e9b28fc85ff","Type":"ContainerStarted","Data":"5931e4c250f13d25b99bd55c3b0f2ea4957dc311b5cac5b7a7d5b5f026d06262"}
Dec 09 12:46:21 crc kubenswrapper[5002]: I1209 12:46:21.944031 5002 generic.go:334] "Generic (PLEG): container finished" podID="1a948f8d-e351-4382-830b-ee25cd35e3b6" containerID="5109254cc9190f10933c09502783efb6e961bf116ede562d508aa93ba14c892d" exitCode=0
Dec 09 12:46:21 crc kubenswrapper[5002]: I1209 12:46:21.944059 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q5fmn" event={"ID":"1a948f8d-e351-4382-830b-ee25cd35e3b6","Type":"ContainerDied","Data":"5109254cc9190f10933c09502783efb6e961bf116ede562d508aa93ba14c892d"}
Dec 09 12:46:22 crc kubenswrapper[5002]: I1209 12:46:22.000295 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-x7gz9/crc-debug-kkbd7"]
Dec 09 12:46:22 crc kubenswrapper[5002]: I1209 12:46:22.019085 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-x7gz9/crc-debug-kkbd7"]
Dec 09 12:46:22 crc kubenswrapper[5002]: I1209 12:46:22.072139 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fd2f0f7-9ece-4664-8450-7d0dd02e5ceb" path="/var/lib/kubelet/pods/7fd2f0f7-9ece-4664-8450-7d0dd02e5ceb/volumes"
Dec 09 12:46:22 crc kubenswrapper[5002]: I1209 12:46:22.955428 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q5fmn" event={"ID":"1a948f8d-e351-4382-830b-ee25cd35e3b6","Type":"ContainerStarted","Data":"9e9f76a14651656696421c7000ea36e350e79e3e04488fcbc404f364e55e857c"}
Dec 09 12:46:22 crc kubenswrapper[5002]: I1209 12:46:22.981664 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-q5fmn" podStartSLOduration=3.26298495 podStartE2EDuration="5.981651067s" podCreationTimestamp="2025-12-09 12:46:17 +0000 UTC" firstStartedPulling="2025-12-09 12:46:19.909667229 +0000 UTC m=+9912.301718310" lastFinishedPulling="2025-12-09 12:46:22.628333346 +0000 UTC m=+9915.020384427" observedRunningTime="2025-12-09 12:46:22.980157327 +0000 UTC m=+9915.372208408" watchObservedRunningTime="2025-12-09 12:46:22.981651067 +0000 UTC m=+9915.373702148"
Dec 09 12:46:23 crc kubenswrapper[5002]: I1209 12:46:23.074451 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x7gz9/crc-debug-kkbd7"
Dec 09 12:46:23 crc kubenswrapper[5002]: I1209 12:46:23.170669 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tj52\" (UniqueName: \"kubernetes.io/projected/f54048fb-5c29-4ff7-866c-4e9b28fc85ff-kube-api-access-7tj52\") pod \"f54048fb-5c29-4ff7-866c-4e9b28fc85ff\" (UID: \"f54048fb-5c29-4ff7-866c-4e9b28fc85ff\") "
Dec 09 12:46:23 crc kubenswrapper[5002]: I1209 12:46:23.170918 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f54048fb-5c29-4ff7-866c-4e9b28fc85ff-host\") pod \"f54048fb-5c29-4ff7-866c-4e9b28fc85ff\" (UID: \"f54048fb-5c29-4ff7-866c-4e9b28fc85ff\") "
Dec 09 12:46:23 crc kubenswrapper[5002]: I1209 12:46:23.171062 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f54048fb-5c29-4ff7-866c-4e9b28fc85ff-host" (OuterVolumeSpecName: "host") pod "f54048fb-5c29-4ff7-866c-4e9b28fc85ff" (UID: "f54048fb-5c29-4ff7-866c-4e9b28fc85ff"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 09 12:46:23 crc kubenswrapper[5002]: I1209 12:46:23.171724 5002 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f54048fb-5c29-4ff7-866c-4e9b28fc85ff-host\") on node \"crc\" DevicePath \"\""
Dec 09 12:46:23 crc kubenswrapper[5002]: I1209 12:46:23.182137 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f54048fb-5c29-4ff7-866c-4e9b28fc85ff-kube-api-access-7tj52" (OuterVolumeSpecName: "kube-api-access-7tj52") pod "f54048fb-5c29-4ff7-866c-4e9b28fc85ff" (UID: "f54048fb-5c29-4ff7-866c-4e9b28fc85ff"). InnerVolumeSpecName "kube-api-access-7tj52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 12:46:23 crc kubenswrapper[5002]: I1209 12:46:23.273721 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tj52\" (UniqueName: \"kubernetes.io/projected/f54048fb-5c29-4ff7-866c-4e9b28fc85ff-kube-api-access-7tj52\") on node \"crc\" DevicePath \"\""
Dec 09 12:46:23 crc kubenswrapper[5002]: I1209 12:46:23.966187 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x7gz9/crc-debug-kkbd7"
Dec 09 12:46:23 crc kubenswrapper[5002]: I1209 12:46:23.966210 5002 scope.go:117] "RemoveContainer" containerID="c24c20ac52fa0c2b24d7cb499d28c52e3456cd8621d719d8c090e1fdee84cee2"
Dec 09 12:46:24 crc kubenswrapper[5002]: I1209 12:46:24.075792 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f54048fb-5c29-4ff7-866c-4e9b28fc85ff" path="/var/lib/kubelet/pods/f54048fb-5c29-4ff7-866c-4e9b28fc85ff/volumes"
Dec 09 12:46:27 crc kubenswrapper[5002]: I1209 12:46:27.984156 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-q5fmn"
Dec 09 12:46:27 crc kubenswrapper[5002]: I1209 12:46:27.984744 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-q5fmn"
Dec 09 12:46:28 crc kubenswrapper[5002]: I1209 12:46:28.051461 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-q5fmn"
Dec 09 12:46:28 crc kubenswrapper[5002]: I1209 12:46:28.104394 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-q5fmn"
Dec 09 12:46:28 crc kubenswrapper[5002]: I1209 12:46:28.304352 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q5fmn"]
Dec 09 12:46:30 crc kubenswrapper[5002]: I1209 12:46:30.019351 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-q5fmn" podUID="1a948f8d-e351-4382-830b-ee25cd35e3b6" containerName="registry-server" containerID="cri-o://9e9f76a14651656696421c7000ea36e350e79e3e04488fcbc404f364e55e857c" gracePeriod=2
Dec 09 12:46:31 crc kubenswrapper[5002]: I1209 12:46:31.030457 5002 generic.go:334] "Generic (PLEG): container finished" podID="1a948f8d-e351-4382-830b-ee25cd35e3b6" containerID="9e9f76a14651656696421c7000ea36e350e79e3e04488fcbc404f364e55e857c" exitCode=0
Dec 09 12:46:31 crc kubenswrapper[5002]: I1209 12:46:31.030531 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q5fmn" event={"ID":"1a948f8d-e351-4382-830b-ee25cd35e3b6","Type":"ContainerDied","Data":"9e9f76a14651656696421c7000ea36e350e79e3e04488fcbc404f364e55e857c"}
Dec 09 12:46:31 crc kubenswrapper[5002]: I1209 12:46:31.030948 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q5fmn" event={"ID":"1a948f8d-e351-4382-830b-ee25cd35e3b6","Type":"ContainerDied","Data":"09d1112af4679b5d69c00010b306e166156739fbb1685ba42adb4a163d42d012"}
Dec 09 12:46:31 crc kubenswrapper[5002]: I1209 12:46:31.030969 5002 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09d1112af4679b5d69c00010b306e166156739fbb1685ba42adb4a163d42d012"
Dec 09 12:46:31 crc kubenswrapper[5002]: I1209 12:46:31.061846 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q5fmn"
Dec 09 12:46:31 crc kubenswrapper[5002]: I1209 12:46:31.155089 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cl9ft\" (UniqueName: \"kubernetes.io/projected/1a948f8d-e351-4382-830b-ee25cd35e3b6-kube-api-access-cl9ft\") pod \"1a948f8d-e351-4382-830b-ee25cd35e3b6\" (UID: \"1a948f8d-e351-4382-830b-ee25cd35e3b6\") "
Dec 09 12:46:31 crc kubenswrapper[5002]: I1209 12:46:31.155299 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a948f8d-e351-4382-830b-ee25cd35e3b6-utilities\") pod \"1a948f8d-e351-4382-830b-ee25cd35e3b6\" (UID: \"1a948f8d-e351-4382-830b-ee25cd35e3b6\") "
Dec 09 12:46:31 crc kubenswrapper[5002]: I1209 12:46:31.155351 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a948f8d-e351-4382-830b-ee25cd35e3b6-catalog-content\") pod \"1a948f8d-e351-4382-830b-ee25cd35e3b6\" (UID: \"1a948f8d-e351-4382-830b-ee25cd35e3b6\") "
Dec 09 12:46:31 crc kubenswrapper[5002]: I1209 12:46:31.155976 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a948f8d-e351-4382-830b-ee25cd35e3b6-utilities" (OuterVolumeSpecName: "utilities") pod "1a948f8d-e351-4382-830b-ee25cd35e3b6" (UID: "1a948f8d-e351-4382-830b-ee25cd35e3b6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 12:46:31 crc kubenswrapper[5002]: I1209 12:46:31.156089 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a948f8d-e351-4382-830b-ee25cd35e3b6-utilities\") on node \"crc\" DevicePath \"\""
Dec 09 12:46:31 crc kubenswrapper[5002]: I1209 12:46:31.172760 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a948f8d-e351-4382-830b-ee25cd35e3b6-kube-api-access-cl9ft" (OuterVolumeSpecName: "kube-api-access-cl9ft") pod "1a948f8d-e351-4382-830b-ee25cd35e3b6" (UID: "1a948f8d-e351-4382-830b-ee25cd35e3b6"). InnerVolumeSpecName "kube-api-access-cl9ft". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 12:46:31 crc kubenswrapper[5002]: I1209 12:46:31.176954 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a948f8d-e351-4382-830b-ee25cd35e3b6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1a948f8d-e351-4382-830b-ee25cd35e3b6" (UID: "1a948f8d-e351-4382-830b-ee25cd35e3b6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 12:46:31 crc kubenswrapper[5002]: I1209 12:46:31.258367 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a948f8d-e351-4382-830b-ee25cd35e3b6-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 09 12:46:31 crc kubenswrapper[5002]: I1209 12:46:31.258406 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cl9ft\" (UniqueName: \"kubernetes.io/projected/1a948f8d-e351-4382-830b-ee25cd35e3b6-kube-api-access-cl9ft\") on node \"crc\" DevicePath \"\""
Dec 09 12:46:32 crc kubenswrapper[5002]: I1209 12:46:32.042003 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q5fmn"
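[editor's note] The kuberuntime_container.go:808 "Killing container with a grace period" entry above (gracePeriod=2) follows the usual term-then-kill pattern: ask the container to stop, wait out the grace period, then force-kill. The kubelet delegates the actual stop to CRI-O; the sketch below only illustrates the shape of the pattern on a local Unix process, and is not the kubelet's code path:

package main

import (
	"fmt"
	"os"
	"os/exec"
	"syscall"
	"time"
)

func main() {
	// Stand-in for a container's main process (Unix-only sketch).
	cmd := exec.Command("sleep", "30")
	if err := cmd.Start(); err != nil {
		panic(err)
	}
	done := make(chan error, 1)
	go func() { done <- cmd.Wait() }()

	cmd.Process.Signal(syscall.SIGTERM) // polite request to exit
	select {
	case err := <-done:
		fmt.Println("exited within grace period:", err)
	case <-time.After(2 * time.Second): // the gracePeriod=2 from the log
		cmd.Process.Signal(os.Kill) // force-kill after the grace period
		fmt.Println("forced kill:", <-done)
	}
}

Here registry-server exits with exitCode=0 inside the window, so the PLEG reports ContainerDied without a forced kill.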
Dec 09 12:46:32 crc kubenswrapper[5002]: I1209 12:46:32.094654 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q5fmn"]
Dec 09 12:46:32 crc kubenswrapper[5002]: I1209 12:46:32.109545 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-q5fmn"]
Dec 09 12:46:34 crc kubenswrapper[5002]: I1209 12:46:34.072522 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a948f8d-e351-4382-830b-ee25cd35e3b6" path="/var/lib/kubelet/pods/1a948f8d-e351-4382-830b-ee25cd35e3b6/volumes"
Dec 09 12:46:55 crc kubenswrapper[5002]: I1209 12:46:55.179619 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-w8824"]
Dec 09 12:46:55 crc kubenswrapper[5002]: E1209 12:46:55.181927 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a948f8d-e351-4382-830b-ee25cd35e3b6" containerName="registry-server"
Dec 09 12:46:55 crc kubenswrapper[5002]: I1209 12:46:55.181963 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a948f8d-e351-4382-830b-ee25cd35e3b6" containerName="registry-server"
Dec 09 12:46:55 crc kubenswrapper[5002]: E1209 12:46:55.182008 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a948f8d-e351-4382-830b-ee25cd35e3b6" containerName="extract-utilities"
Dec 09 12:46:55 crc kubenswrapper[5002]: I1209 12:46:55.182021 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a948f8d-e351-4382-830b-ee25cd35e3b6" containerName="extract-utilities"
Dec 09 12:46:55 crc kubenswrapper[5002]: E1209 12:46:55.182044 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f54048fb-5c29-4ff7-866c-4e9b28fc85ff" containerName="container-00"
Dec 09 12:46:55 crc kubenswrapper[5002]: I1209 12:46:55.182056 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="f54048fb-5c29-4ff7-866c-4e9b28fc85ff" containerName="container-00"
Dec 09 12:46:55 crc kubenswrapper[5002]: E1209 12:46:55.182075 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a948f8d-e351-4382-830b-ee25cd35e3b6" containerName="extract-content"
Dec 09 12:46:55 crc kubenswrapper[5002]: I1209 12:46:55.182089 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a948f8d-e351-4382-830b-ee25cd35e3b6" containerName="extract-content"
Dec 09 12:46:55 crc kubenswrapper[5002]: I1209 12:46:55.182449 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="f54048fb-5c29-4ff7-866c-4e9b28fc85ff" containerName="container-00"
Dec 09 12:46:55 crc kubenswrapper[5002]: I1209 12:46:55.182481 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a948f8d-e351-4382-830b-ee25cd35e3b6" containerName="registry-server"
Dec 09 12:46:55 crc kubenswrapper[5002]: I1209 12:46:55.185389 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w8824"
Dec 09 12:46:55 crc kubenswrapper[5002]: I1209 12:46:55.199620 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w8824"]
Dec 09 12:46:55 crc kubenswrapper[5002]: I1209 12:46:55.384245 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f1ed9a6-2aaf-4643-b149-28158e7635a9-catalog-content\") pod \"redhat-operators-w8824\" (UID: \"1f1ed9a6-2aaf-4643-b149-28158e7635a9\") " pod="openshift-marketplace/redhat-operators-w8824"
Dec 09 12:46:55 crc kubenswrapper[5002]: I1209 12:46:55.384652 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ksx5\" (UniqueName: \"kubernetes.io/projected/1f1ed9a6-2aaf-4643-b149-28158e7635a9-kube-api-access-7ksx5\") pod \"redhat-operators-w8824\" (UID: \"1f1ed9a6-2aaf-4643-b149-28158e7635a9\") " pod="openshift-marketplace/redhat-operators-w8824"
Dec 09 12:46:55 crc kubenswrapper[5002]: I1209 12:46:55.384796 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f1ed9a6-2aaf-4643-b149-28158e7635a9-utilities\") pod \"redhat-operators-w8824\" (UID: \"1f1ed9a6-2aaf-4643-b149-28158e7635a9\") " pod="openshift-marketplace/redhat-operators-w8824"
Dec 09 12:46:55 crc kubenswrapper[5002]: I1209 12:46:55.486658 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f1ed9a6-2aaf-4643-b149-28158e7635a9-utilities\") pod \"redhat-operators-w8824\" (UID: \"1f1ed9a6-2aaf-4643-b149-28158e7635a9\") " pod="openshift-marketplace/redhat-operators-w8824"
Dec 09 12:46:55 crc kubenswrapper[5002]: I1209 12:46:55.486749 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f1ed9a6-2aaf-4643-b149-28158e7635a9-catalog-content\") pod \"redhat-operators-w8824\" (UID: \"1f1ed9a6-2aaf-4643-b149-28158e7635a9\") " pod="openshift-marketplace/redhat-operators-w8824"
Dec 09 12:46:55 crc kubenswrapper[5002]: I1209 12:46:55.486914 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ksx5\" (UniqueName: \"kubernetes.io/projected/1f1ed9a6-2aaf-4643-b149-28158e7635a9-kube-api-access-7ksx5\") pod \"redhat-operators-w8824\" (UID: \"1f1ed9a6-2aaf-4643-b149-28158e7635a9\") " pod="openshift-marketplace/redhat-operators-w8824"
Dec 09 12:46:55 crc kubenswrapper[5002]: I1209 12:46:55.487307 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f1ed9a6-2aaf-4643-b149-28158e7635a9-catalog-content\") pod \"redhat-operators-w8824\" (UID: \"1f1ed9a6-2aaf-4643-b149-28158e7635a9\") " pod="openshift-marketplace/redhat-operators-w8824"
Dec 09 12:46:55 crc kubenswrapper[5002]: I1209 12:46:55.487541 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f1ed9a6-2aaf-4643-b149-28158e7635a9-utilities\") pod \"redhat-operators-w8824\" (UID: \"1f1ed9a6-2aaf-4643-b149-28158e7635a9\") " pod="openshift-marketplace/redhat-operators-w8824"
Dec 09 12:46:55 crc kubenswrapper[5002]: I1209 12:46:55.508652 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ksx5\" (UniqueName: \"kubernetes.io/projected/1f1ed9a6-2aaf-4643-b149-28158e7635a9-kube-api-access-7ksx5\") pod \"redhat-operators-w8824\" (UID: \"1f1ed9a6-2aaf-4643-b149-28158e7635a9\") " pod="openshift-marketplace/redhat-operators-w8824"
\"kube-api-access-7ksx5\" (UniqueName: \"kubernetes.io/projected/1f1ed9a6-2aaf-4643-b149-28158e7635a9-kube-api-access-7ksx5\") pod \"redhat-operators-w8824\" (UID: \"1f1ed9a6-2aaf-4643-b149-28158e7635a9\") " pod="openshift-marketplace/redhat-operators-w8824" Dec 09 12:46:55 crc kubenswrapper[5002]: I1209 12:46:55.518392 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w8824" Dec 09 12:46:55 crc kubenswrapper[5002]: I1209 12:46:55.987059 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w8824"] Dec 09 12:46:56 crc kubenswrapper[5002]: I1209 12:46:56.337161 5002 generic.go:334] "Generic (PLEG): container finished" podID="1f1ed9a6-2aaf-4643-b149-28158e7635a9" containerID="5c35ead0b2916cdeb986ac62439e94c10fec6a61d171d011caa70061de72c31a" exitCode=0 Dec 09 12:46:56 crc kubenswrapper[5002]: I1209 12:46:56.337295 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w8824" event={"ID":"1f1ed9a6-2aaf-4643-b149-28158e7635a9","Type":"ContainerDied","Data":"5c35ead0b2916cdeb986ac62439e94c10fec6a61d171d011caa70061de72c31a"} Dec 09 12:46:56 crc kubenswrapper[5002]: I1209 12:46:56.337413 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w8824" event={"ID":"1f1ed9a6-2aaf-4643-b149-28158e7635a9","Type":"ContainerStarted","Data":"d3d438be19ec86c3d30d15b66bd92c62a0ce8924645a151de2a1f756b5997b2e"} Dec 09 12:46:57 crc kubenswrapper[5002]: I1209 12:46:57.350395 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w8824" event={"ID":"1f1ed9a6-2aaf-4643-b149-28158e7635a9","Type":"ContainerStarted","Data":"0fb34e359b8e7cb8803d7d262a10b3731251686f8ef9bf8350a6a2150ef705a1"} Dec 09 12:47:00 crc kubenswrapper[5002]: I1209 12:47:00.380291 5002 generic.go:334] "Generic (PLEG): container finished" podID="1f1ed9a6-2aaf-4643-b149-28158e7635a9" containerID="0fb34e359b8e7cb8803d7d262a10b3731251686f8ef9bf8350a6a2150ef705a1" exitCode=0 Dec 09 12:47:00 crc kubenswrapper[5002]: I1209 12:47:00.380950 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w8824" event={"ID":"1f1ed9a6-2aaf-4643-b149-28158e7635a9","Type":"ContainerDied","Data":"0fb34e359b8e7cb8803d7d262a10b3731251686f8ef9bf8350a6a2150ef705a1"} Dec 09 12:47:01 crc kubenswrapper[5002]: I1209 12:47:01.398544 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w8824" event={"ID":"1f1ed9a6-2aaf-4643-b149-28158e7635a9","Type":"ContainerStarted","Data":"59344506ce3b49c48a45dac93c751776ebba4bb4512cda22ee207d06bdf06af9"} Dec 09 12:47:01 crc kubenswrapper[5002]: I1209 12:47:01.423001 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-w8824" podStartSLOduration=1.996064354 podStartE2EDuration="6.422976137s" podCreationTimestamp="2025-12-09 12:46:55 +0000 UTC" firstStartedPulling="2025-12-09 12:46:56.338957655 +0000 UTC m=+9948.731008736" lastFinishedPulling="2025-12-09 12:47:00.765869408 +0000 UTC m=+9953.157920519" observedRunningTime="2025-12-09 12:47:01.418518417 +0000 UTC m=+9953.810569568" watchObservedRunningTime="2025-12-09 12:47:01.422976137 +0000 UTC m=+9953.815027228" Dec 09 12:47:05 crc kubenswrapper[5002]: I1209 12:47:05.518907 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-w8824" Dec 09 
Dec 09 12:47:05 crc kubenswrapper[5002]: I1209 12:47:05.520664 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-w8824"
Dec 09 12:47:06 crc kubenswrapper[5002]: I1209 12:47:06.583845 5002 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-w8824" podUID="1f1ed9a6-2aaf-4643-b149-28158e7635a9" containerName="registry-server" probeResult="failure" output=<
Dec 09 12:47:06 crc kubenswrapper[5002]: timeout: failed to connect service ":50051" within 1s
Dec 09 12:47:06 crc kubenswrapper[5002]: >
Dec 09 12:47:15 crc kubenswrapper[5002]: I1209 12:47:15.595319 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-w8824"
Dec 09 12:47:15 crc kubenswrapper[5002]: I1209 12:47:15.660138 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-w8824"
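Note: the probe output ("timeout: failed to connect service \":50051\" within 1s") is the message grpc_health_probe prints when the registry-server's gRPC endpoint is not yet serving; one failure at 12:47:06 followed by "started" at 12:47:15 just means the catalog needed a few extra seconds. Roughly what such a startup probe looks like in Go client types; the command and thresholds here are assumptions, not values read from this log:

```go
package main

import corev1 "k8s.io/api/core/v1"

// startupProbe approximates the probe behind the "Probe failed" entry above:
// an exec probe running grpc_health_probe against :50051. Command, period and
// failureThreshold are illustrative guesses, not taken from this log.
var startupProbe = &corev1.Probe{
	ProbeHandler: corev1.ProbeHandler{
		Exec: &corev1.ExecAction{
			Command: []string{"grpc_health_probe", "-addr=:50051"},
		},
	},
	TimeoutSeconds:   1, // matches "within 1s" in the probe output
	PeriodSeconds:    10,
	FailureThreshold: 3,
}

func main() { _ = startupProbe }
```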
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:47:17 crc kubenswrapper[5002]: I1209 12:47:17.195044 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f1ed9a6-2aaf-4643-b149-28158e7635a9-kube-api-access-7ksx5" (OuterVolumeSpecName: "kube-api-access-7ksx5") pod "1f1ed9a6-2aaf-4643-b149-28158e7635a9" (UID: "1f1ed9a6-2aaf-4643-b149-28158e7635a9"). InnerVolumeSpecName "kube-api-access-7ksx5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:47:17 crc kubenswrapper[5002]: I1209 12:47:17.287598 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ksx5\" (UniqueName: \"kubernetes.io/projected/1f1ed9a6-2aaf-4643-b149-28158e7635a9-kube-api-access-7ksx5\") on node \"crc\" DevicePath \"\"" Dec 09 12:47:17 crc kubenswrapper[5002]: I1209 12:47:17.287722 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f1ed9a6-2aaf-4643-b149-28158e7635a9-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 12:47:17 crc kubenswrapper[5002]: I1209 12:47:17.312904 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f1ed9a6-2aaf-4643-b149-28158e7635a9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1f1ed9a6-2aaf-4643-b149-28158e7635a9" (UID: "1f1ed9a6-2aaf-4643-b149-28158e7635a9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:47:17 crc kubenswrapper[5002]: I1209 12:47:17.391061 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f1ed9a6-2aaf-4643-b149-28158e7635a9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 12:47:17 crc kubenswrapper[5002]: I1209 12:47:17.627799 5002 generic.go:334] "Generic (PLEG): container finished" podID="1f1ed9a6-2aaf-4643-b149-28158e7635a9" containerID="59344506ce3b49c48a45dac93c751776ebba4bb4512cda22ee207d06bdf06af9" exitCode=0 Dec 09 12:47:17 crc kubenswrapper[5002]: I1209 12:47:17.627851 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w8824" event={"ID":"1f1ed9a6-2aaf-4643-b149-28158e7635a9","Type":"ContainerDied","Data":"59344506ce3b49c48a45dac93c751776ebba4bb4512cda22ee207d06bdf06af9"} Dec 09 12:47:17 crc kubenswrapper[5002]: I1209 12:47:17.627953 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w8824" Dec 09 12:47:17 crc kubenswrapper[5002]: I1209 12:47:17.629027 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w8824" event={"ID":"1f1ed9a6-2aaf-4643-b149-28158e7635a9","Type":"ContainerDied","Data":"d3d438be19ec86c3d30d15b66bd92c62a0ce8924645a151de2a1f756b5997b2e"} Dec 09 12:47:17 crc kubenswrapper[5002]: I1209 12:47:17.629136 5002 scope.go:117] "RemoveContainer" containerID="59344506ce3b49c48a45dac93c751776ebba4bb4512cda22ee207d06bdf06af9" Dec 09 12:47:17 crc kubenswrapper[5002]: I1209 12:47:17.661458 5002 scope.go:117] "RemoveContainer" containerID="0fb34e359b8e7cb8803d7d262a10b3731251686f8ef9bf8350a6a2150ef705a1" Dec 09 12:47:17 crc kubenswrapper[5002]: I1209 12:47:17.669574 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w8824"] Dec 09 12:47:17 crc kubenswrapper[5002]: I1209 12:47:17.681685 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-w8824"] Dec 09 12:47:17 crc kubenswrapper[5002]: I1209 12:47:17.698766 5002 scope.go:117] "RemoveContainer" containerID="5c35ead0b2916cdeb986ac62439e94c10fec6a61d171d011caa70061de72c31a" Dec 09 12:47:17 crc kubenswrapper[5002]: I1209 12:47:17.733595 5002 scope.go:117] "RemoveContainer" containerID="59344506ce3b49c48a45dac93c751776ebba4bb4512cda22ee207d06bdf06af9" Dec 09 12:47:17 crc kubenswrapper[5002]: E1209 12:47:17.734143 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59344506ce3b49c48a45dac93c751776ebba4bb4512cda22ee207d06bdf06af9\": container with ID starting with 59344506ce3b49c48a45dac93c751776ebba4bb4512cda22ee207d06bdf06af9 not found: ID does not exist" containerID="59344506ce3b49c48a45dac93c751776ebba4bb4512cda22ee207d06bdf06af9" Dec 09 12:47:17 crc kubenswrapper[5002]: I1209 12:47:17.734183 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59344506ce3b49c48a45dac93c751776ebba4bb4512cda22ee207d06bdf06af9"} err="failed to get container status \"59344506ce3b49c48a45dac93c751776ebba4bb4512cda22ee207d06bdf06af9\": rpc error: code = NotFound desc = could not find container \"59344506ce3b49c48a45dac93c751776ebba4bb4512cda22ee207d06bdf06af9\": container with ID starting with 59344506ce3b49c48a45dac93c751776ebba4bb4512cda22ee207d06bdf06af9 not found: ID does not exist" Dec 09 12:47:17 crc kubenswrapper[5002]: I1209 12:47:17.734206 5002 scope.go:117] "RemoveContainer" containerID="0fb34e359b8e7cb8803d7d262a10b3731251686f8ef9bf8350a6a2150ef705a1" Dec 09 12:47:17 crc kubenswrapper[5002]: E1209 12:47:17.734584 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fb34e359b8e7cb8803d7d262a10b3731251686f8ef9bf8350a6a2150ef705a1\": container with ID starting with 0fb34e359b8e7cb8803d7d262a10b3731251686f8ef9bf8350a6a2150ef705a1 not found: ID does not exist" containerID="0fb34e359b8e7cb8803d7d262a10b3731251686f8ef9bf8350a6a2150ef705a1" Dec 09 12:47:17 crc kubenswrapper[5002]: I1209 12:47:17.734613 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fb34e359b8e7cb8803d7d262a10b3731251686f8ef9bf8350a6a2150ef705a1"} err="failed to get container status \"0fb34e359b8e7cb8803d7d262a10b3731251686f8ef9bf8350a6a2150ef705a1\": rpc error: code = NotFound desc = could not find container 
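Note: the RemoveContainer / NotFound exchange above is benign: the kubelet retries deletion of containers CRI-O has already removed and treats NotFound as "already gone". A sketch of that idempotent-delete pattern (the gRPC status handling is real; the remover interface and removeIfPresent helper are illustrative, not kubelet's API):

```go
package main

import (
	"context"
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// remover abstracts the runtime call; illustrative, not the real CRI interface.
type remover interface {
	RemoveContainer(ctx context.Context, id string) error
}

// removeIfPresent treats NotFound as success, mirroring how the
// pod_container_deletor entries above end harmlessly.
func removeIfPresent(ctx context.Context, r remover, id string) error {
	err := r.RemoveContainer(ctx, id)
	if status.Code(err) == codes.NotFound {
		fmt.Printf("container %s already gone\n", id)
		return nil
	}
	return err
}

// gone simulates a runtime that has already deleted the container.
type gone struct{}

func (gone) RemoveContainer(_ context.Context, _ string) error {
	return status.Error(codes.NotFound, "could not find container")
}

func main() {
	_ = removeIfPresent(context.Background(), gone{}, "59344506ce3b")
}
```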
\"0fb34e359b8e7cb8803d7d262a10b3731251686f8ef9bf8350a6a2150ef705a1\": container with ID starting with 0fb34e359b8e7cb8803d7d262a10b3731251686f8ef9bf8350a6a2150ef705a1 not found: ID does not exist" Dec 09 12:47:17 crc kubenswrapper[5002]: I1209 12:47:17.734633 5002 scope.go:117] "RemoveContainer" containerID="5c35ead0b2916cdeb986ac62439e94c10fec6a61d171d011caa70061de72c31a" Dec 09 12:47:17 crc kubenswrapper[5002]: E1209 12:47:17.734946 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c35ead0b2916cdeb986ac62439e94c10fec6a61d171d011caa70061de72c31a\": container with ID starting with 5c35ead0b2916cdeb986ac62439e94c10fec6a61d171d011caa70061de72c31a not found: ID does not exist" containerID="5c35ead0b2916cdeb986ac62439e94c10fec6a61d171d011caa70061de72c31a" Dec 09 12:47:17 crc kubenswrapper[5002]: I1209 12:47:17.734968 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c35ead0b2916cdeb986ac62439e94c10fec6a61d171d011caa70061de72c31a"} err="failed to get container status \"5c35ead0b2916cdeb986ac62439e94c10fec6a61d171d011caa70061de72c31a\": rpc error: code = NotFound desc = could not find container \"5c35ead0b2916cdeb986ac62439e94c10fec6a61d171d011caa70061de72c31a\": container with ID starting with 5c35ead0b2916cdeb986ac62439e94c10fec6a61d171d011caa70061de72c31a not found: ID does not exist" Dec 09 12:47:18 crc kubenswrapper[5002]: I1209 12:47:18.082652 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f1ed9a6-2aaf-4643-b149-28158e7635a9" path="/var/lib/kubelet/pods/1f1ed9a6-2aaf-4643-b149-28158e7635a9/volumes" Dec 09 12:48:37 crc kubenswrapper[5002]: I1209 12:48:37.964368 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:48:37 crc kubenswrapper[5002]: I1209 12:48:37.964920 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:48:57 crc kubenswrapper[5002]: I1209 12:48:57.336728 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lm9hs"] Dec 09 12:48:57 crc kubenswrapper[5002]: E1209 12:48:57.338692 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f1ed9a6-2aaf-4643-b149-28158e7635a9" containerName="registry-server" Dec 09 12:48:57 crc kubenswrapper[5002]: I1209 12:48:57.338714 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f1ed9a6-2aaf-4643-b149-28158e7635a9" containerName="registry-server" Dec 09 12:48:57 crc kubenswrapper[5002]: E1209 12:48:57.338757 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f1ed9a6-2aaf-4643-b149-28158e7635a9" containerName="extract-content" Dec 09 12:48:57 crc kubenswrapper[5002]: I1209 12:48:57.338764 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f1ed9a6-2aaf-4643-b149-28158e7635a9" containerName="extract-content" Dec 09 12:48:57 crc kubenswrapper[5002]: E1209 12:48:57.338780 5002 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1f1ed9a6-2aaf-4643-b149-28158e7635a9" containerName="extract-utilities" Dec 09 12:48:57 crc kubenswrapper[5002]: I1209 12:48:57.338788 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f1ed9a6-2aaf-4643-b149-28158e7635a9" containerName="extract-utilities" Dec 09 12:48:57 crc kubenswrapper[5002]: I1209 12:48:57.339111 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f1ed9a6-2aaf-4643-b149-28158e7635a9" containerName="registry-server" Dec 09 12:48:57 crc kubenswrapper[5002]: I1209 12:48:57.344866 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lm9hs" Dec 09 12:48:57 crc kubenswrapper[5002]: I1209 12:48:57.349904 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lm9hs"] Dec 09 12:48:57 crc kubenswrapper[5002]: I1209 12:48:57.350446 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7326dcf-8a30-4b22-af34-b7705e04e54c-catalog-content\") pod \"community-operators-lm9hs\" (UID: \"d7326dcf-8a30-4b22-af34-b7705e04e54c\") " pod="openshift-marketplace/community-operators-lm9hs" Dec 09 12:48:57 crc kubenswrapper[5002]: I1209 12:48:57.350584 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csrgk\" (UniqueName: \"kubernetes.io/projected/d7326dcf-8a30-4b22-af34-b7705e04e54c-kube-api-access-csrgk\") pod \"community-operators-lm9hs\" (UID: \"d7326dcf-8a30-4b22-af34-b7705e04e54c\") " pod="openshift-marketplace/community-operators-lm9hs" Dec 09 12:48:57 crc kubenswrapper[5002]: I1209 12:48:57.350760 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7326dcf-8a30-4b22-af34-b7705e04e54c-utilities\") pod \"community-operators-lm9hs\" (UID: \"d7326dcf-8a30-4b22-af34-b7705e04e54c\") " pod="openshift-marketplace/community-operators-lm9hs" Dec 09 12:48:57 crc kubenswrapper[5002]: I1209 12:48:57.458942 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7326dcf-8a30-4b22-af34-b7705e04e54c-utilities\") pod \"community-operators-lm9hs\" (UID: \"d7326dcf-8a30-4b22-af34-b7705e04e54c\") " pod="openshift-marketplace/community-operators-lm9hs" Dec 09 12:48:57 crc kubenswrapper[5002]: I1209 12:48:57.459206 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7326dcf-8a30-4b22-af34-b7705e04e54c-catalog-content\") pod \"community-operators-lm9hs\" (UID: \"d7326dcf-8a30-4b22-af34-b7705e04e54c\") " pod="openshift-marketplace/community-operators-lm9hs" Dec 09 12:48:57 crc kubenswrapper[5002]: I1209 12:48:57.459505 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csrgk\" (UniqueName: \"kubernetes.io/projected/d7326dcf-8a30-4b22-af34-b7705e04e54c-kube-api-access-csrgk\") pod \"community-operators-lm9hs\" (UID: \"d7326dcf-8a30-4b22-af34-b7705e04e54c\") " pod="openshift-marketplace/community-operators-lm9hs" Dec 09 12:48:57 crc kubenswrapper[5002]: I1209 12:48:57.459550 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7326dcf-8a30-4b22-af34-b7705e04e54c-utilities\") pod \"community-operators-lm9hs\" 
(UID: \"d7326dcf-8a30-4b22-af34-b7705e04e54c\") " pod="openshift-marketplace/community-operators-lm9hs" Dec 09 12:48:57 crc kubenswrapper[5002]: I1209 12:48:57.460061 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7326dcf-8a30-4b22-af34-b7705e04e54c-catalog-content\") pod \"community-operators-lm9hs\" (UID: \"d7326dcf-8a30-4b22-af34-b7705e04e54c\") " pod="openshift-marketplace/community-operators-lm9hs" Dec 09 12:48:57 crc kubenswrapper[5002]: I1209 12:48:57.484229 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csrgk\" (UniqueName: \"kubernetes.io/projected/d7326dcf-8a30-4b22-af34-b7705e04e54c-kube-api-access-csrgk\") pod \"community-operators-lm9hs\" (UID: \"d7326dcf-8a30-4b22-af34-b7705e04e54c\") " pod="openshift-marketplace/community-operators-lm9hs" Dec 09 12:48:57 crc kubenswrapper[5002]: I1209 12:48:57.682141 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lm9hs" Dec 09 12:48:58 crc kubenswrapper[5002]: I1209 12:48:58.240866 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lm9hs"] Dec 09 12:48:58 crc kubenswrapper[5002]: I1209 12:48:58.844061 5002 generic.go:334] "Generic (PLEG): container finished" podID="d7326dcf-8a30-4b22-af34-b7705e04e54c" containerID="d6e5462a50d600c80c6ea7fe9e16001b4911c2244d48d17157177ad292b940df" exitCode=0 Dec 09 12:48:58 crc kubenswrapper[5002]: I1209 12:48:58.844264 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lm9hs" event={"ID":"d7326dcf-8a30-4b22-af34-b7705e04e54c","Type":"ContainerDied","Data":"d6e5462a50d600c80c6ea7fe9e16001b4911c2244d48d17157177ad292b940df"} Dec 09 12:48:58 crc kubenswrapper[5002]: I1209 12:48:58.844414 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lm9hs" event={"ID":"d7326dcf-8a30-4b22-af34-b7705e04e54c","Type":"ContainerStarted","Data":"4fe3ab958452cb82f8136f8ce98d956889d2cda8796ff3866178d5177ffa0c46"} Dec 09 12:49:00 crc kubenswrapper[5002]: I1209 12:49:00.890533 5002 generic.go:334] "Generic (PLEG): container finished" podID="d7326dcf-8a30-4b22-af34-b7705e04e54c" containerID="8f24699505d2407b4093ae9ef70a8f27fa9ae862f88162ba1adb425a1771ab76" exitCode=0 Dec 09 12:49:00 crc kubenswrapper[5002]: I1209 12:49:00.890678 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lm9hs" event={"ID":"d7326dcf-8a30-4b22-af34-b7705e04e54c","Type":"ContainerDied","Data":"8f24699505d2407b4093ae9ef70a8f27fa9ae862f88162ba1adb425a1771ab76"} Dec 09 12:49:01 crc kubenswrapper[5002]: I1209 12:49:01.901566 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lm9hs" event={"ID":"d7326dcf-8a30-4b22-af34-b7705e04e54c","Type":"ContainerStarted","Data":"e4894ec811ffe601bb28c7862fe240959794960fe5871a5cc680ac0ff8d92920"} Dec 09 12:49:01 crc kubenswrapper[5002]: I1209 12:49:01.919735 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lm9hs" podStartSLOduration=2.472587503 podStartE2EDuration="4.919714588s" podCreationTimestamp="2025-12-09 12:48:57 +0000 UTC" firstStartedPulling="2025-12-09 12:48:58.84772485 +0000 UTC m=+10071.239775931" lastFinishedPulling="2025-12-09 12:49:01.294851935 +0000 UTC m=+10073.686903016" 
observedRunningTime="2025-12-09 12:49:01.91646029 +0000 UTC m=+10074.308511391" watchObservedRunningTime="2025-12-09 12:49:01.919714588 +0000 UTC m=+10074.311765689" Dec 09 12:49:07 crc kubenswrapper[5002]: I1209 12:49:07.682869 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lm9hs" Dec 09 12:49:07 crc kubenswrapper[5002]: I1209 12:49:07.683497 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lm9hs" Dec 09 12:49:07 crc kubenswrapper[5002]: I1209 12:49:07.775415 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lm9hs" Dec 09 12:49:07 crc kubenswrapper[5002]: I1209 12:49:07.964963 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:49:07 crc kubenswrapper[5002]: I1209 12:49:07.965023 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:49:08 crc kubenswrapper[5002]: I1209 12:49:08.029037 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lm9hs" Dec 09 12:49:08 crc kubenswrapper[5002]: I1209 12:49:08.084757 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lm9hs"] Dec 09 12:49:09 crc kubenswrapper[5002]: I1209 12:49:09.973044 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lm9hs" podUID="d7326dcf-8a30-4b22-af34-b7705e04e54c" containerName="registry-server" containerID="cri-o://e4894ec811ffe601bb28c7862fe240959794960fe5871a5cc680ac0ff8d92920" gracePeriod=2 Dec 09 12:49:10 crc kubenswrapper[5002]: I1209 12:49:10.494620 5002 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lm9hs" Dec 09 12:49:10 crc kubenswrapper[5002]: I1209 12:49:10.527612 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7326dcf-8a30-4b22-af34-b7705e04e54c-utilities\") pod \"d7326dcf-8a30-4b22-af34-b7705e04e54c\" (UID: \"d7326dcf-8a30-4b22-af34-b7705e04e54c\") " Dec 09 12:49:10 crc kubenswrapper[5002]: I1209 12:49:10.527742 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7326dcf-8a30-4b22-af34-b7705e04e54c-catalog-content\") pod \"d7326dcf-8a30-4b22-af34-b7705e04e54c\" (UID: \"d7326dcf-8a30-4b22-af34-b7705e04e54c\") " Dec 09 12:49:10 crc kubenswrapper[5002]: I1209 12:49:10.527775 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csrgk\" (UniqueName: \"kubernetes.io/projected/d7326dcf-8a30-4b22-af34-b7705e04e54c-kube-api-access-csrgk\") pod \"d7326dcf-8a30-4b22-af34-b7705e04e54c\" (UID: \"d7326dcf-8a30-4b22-af34-b7705e04e54c\") " Dec 09 12:49:10 crc kubenswrapper[5002]: I1209 12:49:10.528320 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7326dcf-8a30-4b22-af34-b7705e04e54c-utilities" (OuterVolumeSpecName: "utilities") pod "d7326dcf-8a30-4b22-af34-b7705e04e54c" (UID: "d7326dcf-8a30-4b22-af34-b7705e04e54c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:49:10 crc kubenswrapper[5002]: I1209 12:49:10.544058 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7326dcf-8a30-4b22-af34-b7705e04e54c-kube-api-access-csrgk" (OuterVolumeSpecName: "kube-api-access-csrgk") pod "d7326dcf-8a30-4b22-af34-b7705e04e54c" (UID: "d7326dcf-8a30-4b22-af34-b7705e04e54c"). InnerVolumeSpecName "kube-api-access-csrgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:49:10 crc kubenswrapper[5002]: I1209 12:49:10.598776 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7326dcf-8a30-4b22-af34-b7705e04e54c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d7326dcf-8a30-4b22-af34-b7705e04e54c" (UID: "d7326dcf-8a30-4b22-af34-b7705e04e54c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:49:10 crc kubenswrapper[5002]: I1209 12:49:10.630126 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7326dcf-8a30-4b22-af34-b7705e04e54c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 12:49:10 crc kubenswrapper[5002]: I1209 12:49:10.630162 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csrgk\" (UniqueName: \"kubernetes.io/projected/d7326dcf-8a30-4b22-af34-b7705e04e54c-kube-api-access-csrgk\") on node \"crc\" DevicePath \"\"" Dec 09 12:49:10 crc kubenswrapper[5002]: I1209 12:49:10.630175 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7326dcf-8a30-4b22-af34-b7705e04e54c-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 12:49:10 crc kubenswrapper[5002]: I1209 12:49:10.985889 5002 generic.go:334] "Generic (PLEG): container finished" podID="d7326dcf-8a30-4b22-af34-b7705e04e54c" containerID="e4894ec811ffe601bb28c7862fe240959794960fe5871a5cc680ac0ff8d92920" exitCode=0 Dec 09 12:49:10 crc kubenswrapper[5002]: I1209 12:49:10.985999 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lm9hs" Dec 09 12:49:10 crc kubenswrapper[5002]: I1209 12:49:10.985996 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lm9hs" event={"ID":"d7326dcf-8a30-4b22-af34-b7705e04e54c","Type":"ContainerDied","Data":"e4894ec811ffe601bb28c7862fe240959794960fe5871a5cc680ac0ff8d92920"} Dec 09 12:49:10 crc kubenswrapper[5002]: I1209 12:49:10.986333 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lm9hs" event={"ID":"d7326dcf-8a30-4b22-af34-b7705e04e54c","Type":"ContainerDied","Data":"4fe3ab958452cb82f8136f8ce98d956889d2cda8796ff3866178d5177ffa0c46"} Dec 09 12:49:10 crc kubenswrapper[5002]: I1209 12:49:10.986360 5002 scope.go:117] "RemoveContainer" containerID="e4894ec811ffe601bb28c7862fe240959794960fe5871a5cc680ac0ff8d92920" Dec 09 12:49:11 crc kubenswrapper[5002]: I1209 12:49:11.014355 5002 scope.go:117] "RemoveContainer" containerID="8f24699505d2407b4093ae9ef70a8f27fa9ae862f88162ba1adb425a1771ab76" Dec 09 12:49:11 crc kubenswrapper[5002]: I1209 12:49:11.029397 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lm9hs"] Dec 09 12:49:11 crc kubenswrapper[5002]: I1209 12:49:11.041599 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lm9hs"] Dec 09 12:49:11 crc kubenswrapper[5002]: I1209 12:49:11.053844 5002 scope.go:117] "RemoveContainer" containerID="d6e5462a50d600c80c6ea7fe9e16001b4911c2244d48d17157177ad292b940df" Dec 09 12:49:11 crc kubenswrapper[5002]: I1209 12:49:11.088098 5002 scope.go:117] "RemoveContainer" containerID="e4894ec811ffe601bb28c7862fe240959794960fe5871a5cc680ac0ff8d92920" Dec 09 12:49:11 crc kubenswrapper[5002]: E1209 12:49:11.088455 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4894ec811ffe601bb28c7862fe240959794960fe5871a5cc680ac0ff8d92920\": container with ID starting with e4894ec811ffe601bb28c7862fe240959794960fe5871a5cc680ac0ff8d92920 not found: ID does not exist" containerID="e4894ec811ffe601bb28c7862fe240959794960fe5871a5cc680ac0ff8d92920" Dec 09 12:49:11 crc kubenswrapper[5002]: I1209 12:49:11.088485 
Dec 09 12:49:11 crc kubenswrapper[5002]: I1209 12:49:11.088485 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4894ec811ffe601bb28c7862fe240959794960fe5871a5cc680ac0ff8d92920"} err="failed to get container status \"e4894ec811ffe601bb28c7862fe240959794960fe5871a5cc680ac0ff8d92920\": rpc error: code = NotFound desc = could not find container \"e4894ec811ffe601bb28c7862fe240959794960fe5871a5cc680ac0ff8d92920\": container with ID starting with e4894ec811ffe601bb28c7862fe240959794960fe5871a5cc680ac0ff8d92920 not found: ID does not exist"
Dec 09 12:49:11 crc kubenswrapper[5002]: I1209 12:49:11.088506 5002 scope.go:117] "RemoveContainer" containerID="8f24699505d2407b4093ae9ef70a8f27fa9ae862f88162ba1adb425a1771ab76"
Dec 09 12:49:11 crc kubenswrapper[5002]: E1209 12:49:11.088722 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f24699505d2407b4093ae9ef70a8f27fa9ae862f88162ba1adb425a1771ab76\": container with ID starting with 8f24699505d2407b4093ae9ef70a8f27fa9ae862f88162ba1adb425a1771ab76 not found: ID does not exist" containerID="8f24699505d2407b4093ae9ef70a8f27fa9ae862f88162ba1adb425a1771ab76"
Dec 09 12:49:11 crc kubenswrapper[5002]: I1209 12:49:11.088748 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f24699505d2407b4093ae9ef70a8f27fa9ae862f88162ba1adb425a1771ab76"} err="failed to get container status \"8f24699505d2407b4093ae9ef70a8f27fa9ae862f88162ba1adb425a1771ab76\": rpc error: code = NotFound desc = could not find container \"8f24699505d2407b4093ae9ef70a8f27fa9ae862f88162ba1adb425a1771ab76\": container with ID starting with 8f24699505d2407b4093ae9ef70a8f27fa9ae862f88162ba1adb425a1771ab76 not found: ID does not exist"
Dec 09 12:49:11 crc kubenswrapper[5002]: I1209 12:49:11.088766 5002 scope.go:117] "RemoveContainer" containerID="d6e5462a50d600c80c6ea7fe9e16001b4911c2244d48d17157177ad292b940df"
Dec 09 12:49:11 crc kubenswrapper[5002]: E1209 12:49:11.089132 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6e5462a50d600c80c6ea7fe9e16001b4911c2244d48d17157177ad292b940df\": container with ID starting with d6e5462a50d600c80c6ea7fe9e16001b4911c2244d48d17157177ad292b940df not found: ID does not exist" containerID="d6e5462a50d600c80c6ea7fe9e16001b4911c2244d48d17157177ad292b940df"
Dec 09 12:49:11 crc kubenswrapper[5002]: I1209 12:49:11.089152 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6e5462a50d600c80c6ea7fe9e16001b4911c2244d48d17157177ad292b940df"} err="failed to get container status \"d6e5462a50d600c80c6ea7fe9e16001b4911c2244d48d17157177ad292b940df\": rpc error: code = NotFound desc = could not find container \"d6e5462a50d600c80c6ea7fe9e16001b4911c2244d48d17157177ad292b940df\": container with ID starting with d6e5462a50d600c80c6ea7fe9e16001b4911c2244d48d17157177ad292b940df not found: ID does not exist"
Dec 09 12:49:12 crc kubenswrapper[5002]: I1209 12:49:12.075514 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7326dcf-8a30-4b22-af34-b7705e04e54c" path="/var/lib/kubelet/pods/d7326dcf-8a30-4b22-af34-b7705e04e54c/volumes"
Dec 09 12:49:29 crc kubenswrapper[5002]: I1209 12:49:29.744582 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_3fe6db24-e908-49ab-be4c-5482809166e0/init-config-reloader/0.log"
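Note: the long run of "Finished parsing log file" entries that starts here is the kubelet's log handler reading container log files under /var/log/pods, typically because something is collecting logs across many pods (a must-gather, for instance). Each of those files uses the CRI logging format, one "<RFC3339Nano timestamp> <stream> <tag> <message>" record per line; a minimal parser for that format (the format is real, the helper is illustrative):

```go
package main

import (
	"fmt"
	"strings"
	"time"
)

// parseCRILogLine splits one line of a /var/log/pods/.../<n>.log file into the
// CRI logging fields: timestamp, stream (stdout/stderr), tag (F=full, P=partial),
// and the message itself.
func parseCRILogLine(line string) (time.Time, string, string, string, error) {
	parts := strings.SplitN(line, " ", 4)
	if len(parts) != 4 {
		return time.Time{}, "", "", "", fmt.Errorf("malformed CRI log line: %q", line)
	}
	ts, err := time.Parse(time.RFC3339Nano, parts[0])
	if err != nil {
		return time.Time{}, "", "", "", err
	}
	return ts, parts[1], parts[2], parts[3], nil
}

func main() {
	ts, stream, tag, msg, _ := parseCRILogLine(
		"2025-12-09T12:49:29.000000000Z stdout F level=info msg=\"listening\"")
	fmt.Println(ts.UTC(), stream, tag, msg)
}
```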
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_3fe6db24-e908-49ab-be4c-5482809166e0/config-reloader/0.log" Dec 09 12:49:30 crc kubenswrapper[5002]: I1209 12:49:30.202918 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_3fe6db24-e908-49ab-be4c-5482809166e0/init-config-reloader/0.log" Dec 09 12:49:30 crc kubenswrapper[5002]: I1209 12:49:30.241026 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_3fe6db24-e908-49ab-be4c-5482809166e0/alertmanager/0.log" Dec 09 12:49:30 crc kubenswrapper[5002]: I1209 12:49:30.397138 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_d6b2002d-986f-41bd-a7d8-1b59680b72e7/aodh-api/0.log" Dec 09 12:49:30 crc kubenswrapper[5002]: I1209 12:49:30.445148 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_d6b2002d-986f-41bd-a7d8-1b59680b72e7/aodh-evaluator/0.log" Dec 09 12:49:30 crc kubenswrapper[5002]: I1209 12:49:30.466922 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_d6b2002d-986f-41bd-a7d8-1b59680b72e7/aodh-listener/0.log" Dec 09 12:49:30 crc kubenswrapper[5002]: I1209 12:49:30.616414 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_d6b2002d-986f-41bd-a7d8-1b59680b72e7/aodh-notifier/0.log" Dec 09 12:49:30 crc kubenswrapper[5002]: I1209 12:49:30.705002 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-746d5c8778-w7l5z_88442869-dcc3-4d84-9ee5-91756a2394ef/barbican-api/0.log" Dec 09 12:49:30 crc kubenswrapper[5002]: I1209 12:49:30.710095 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-746d5c8778-w7l5z_88442869-dcc3-4d84-9ee5-91756a2394ef/barbican-api-log/0.log" Dec 09 12:49:30 crc kubenswrapper[5002]: I1209 12:49:30.915318 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7c6866897d-sgztm_04e2ad2f-2d91-4eb3-95bb-62229762bc1c/barbican-keystone-listener-log/0.log" Dec 09 12:49:30 crc kubenswrapper[5002]: I1209 12:49:30.931517 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7c6866897d-sgztm_04e2ad2f-2d91-4eb3-95bb-62229762bc1c/barbican-keystone-listener/0.log" Dec 09 12:49:31 crc kubenswrapper[5002]: I1209 12:49:31.144260 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-69b68b4587-6nfgt_785d609c-802a-48b2-b58a-7eaa54867270/barbican-worker/0.log" Dec 09 12:49:31 crc kubenswrapper[5002]: I1209 12:49:31.154951 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-69b68b4587-6nfgt_785d609c-802a-48b2-b58a-7eaa54867270/barbican-worker-log/0.log" Dec 09 12:49:31 crc kubenswrapper[5002]: I1209 12:49:31.278297 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-openstack-openstack-cell1-lhhxc_ef92a5b7-d0a2-4b0a-bba8-d4c806f276d6/bootstrap-openstack-openstack-cell1/0.log" Dec 09 12:49:31 crc kubenswrapper[5002]: I1209 12:49:31.420728 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1db29fd6-56f8-4153-9760-638a8baac418/ceilometer-central-agent/0.log" Dec 09 12:49:31 crc kubenswrapper[5002]: I1209 12:49:31.495424 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1db29fd6-56f8-4153-9760-638a8baac418/ceilometer-notification-agent/0.log" Dec 09 
12:49:31 crc kubenswrapper[5002]: I1209 12:49:31.595132 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1db29fd6-56f8-4153-9760-638a8baac418/proxy-httpd/0.log" Dec 09 12:49:31 crc kubenswrapper[5002]: I1209 12:49:31.627943 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1db29fd6-56f8-4153-9760-638a8baac418/sg-core/0.log" Dec 09 12:49:31 crc kubenswrapper[5002]: I1209 12:49:31.732064 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-openstack-openstack-cell1-hzzxs_a980500d-11f1-45f3-8eeb-9bec9abe3fd2/ceph-client-openstack-openstack-cell1/0.log" Dec 09 12:49:31 crc kubenswrapper[5002]: I1209 12:49:31.942951 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_89609918-9011-4ef3-82eb-d8f791fc2979/cinder-api/0.log" Dec 09 12:49:31 crc kubenswrapper[5002]: I1209 12:49:31.972136 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_89609918-9011-4ef3-82eb-d8f791fc2979/cinder-api-log/0.log" Dec 09 12:49:32 crc kubenswrapper[5002]: I1209 12:49:32.220415 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f/probe/0.log" Dec 09 12:49:32 crc kubenswrapper[5002]: I1209 12:49:32.244098 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_494d3baa-ddb8-4cfe-bcac-27a3bd8efc1f/cinder-backup/0.log" Dec 09 12:49:32 crc kubenswrapper[5002]: I1209 12:49:32.358444 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_ebae47db-99fe-43b2-bbf8-00d7ba18b901/cinder-scheduler/0.log" Dec 09 12:49:32 crc kubenswrapper[5002]: I1209 12:49:32.476145 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_ebae47db-99fe-43b2-bbf8-00d7ba18b901/probe/0.log" Dec 09 12:49:32 crc kubenswrapper[5002]: I1209 12:49:32.575217 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_44fcf36e-889c-47c3-96a7-cdc8263c3a01/cinder-volume/0.log" Dec 09 12:49:32 crc kubenswrapper[5002]: I1209 12:49:32.717898 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_44fcf36e-889c-47c3-96a7-cdc8263c3a01/probe/0.log" Dec 09 12:49:32 crc kubenswrapper[5002]: I1209 12:49:32.781312 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-openstack-openstack-cell1-nzm87_9e049b56-1b0d-47da-a400-5bc3cd964980/configure-network-openstack-openstack-cell1/0.log" Dec 09 12:49:32 crc kubenswrapper[5002]: I1209 12:49:32.980218 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-openstack-openstack-cell1-s87gq_5287e90f-c20c-41df-8e2a-ced571a234d5/configure-os-openstack-openstack-cell1/0.log" Dec 09 12:49:33 crc kubenswrapper[5002]: I1209 12:49:33.143653 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-d6cd869d9-7gk6k_4e6f3711-6d8b-451e-84db-d89343c6f1cd/init/0.log" Dec 09 12:49:33 crc kubenswrapper[5002]: I1209 12:49:33.244273 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-d6cd869d9-7gk6k_4e6f3711-6d8b-451e-84db-d89343c6f1cd/init/0.log" Dec 09 12:49:33 crc kubenswrapper[5002]: I1209 12:49:33.285832 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-d6cd869d9-7gk6k_4e6f3711-6d8b-451e-84db-d89343c6f1cd/dnsmasq-dns/0.log" Dec 09 12:49:33 crc 
kubenswrapper[5002]: I1209 12:49:33.469086 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-openstack-openstack-cell1-fz8lt_0c070b4d-58da-444a-ba0c-2dbc72842283/download-cache-openstack-openstack-cell1/0.log" Dec 09 12:49:33 crc kubenswrapper[5002]: I1209 12:49:33.497323 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_e75ade8b-f2b9-4102-a485-11ad67b3dd5f/glance-httpd/0.log" Dec 09 12:49:33 crc kubenswrapper[5002]: I1209 12:49:33.645465 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_e75ade8b-f2b9-4102-a485-11ad67b3dd5f/glance-log/0.log" Dec 09 12:49:33 crc kubenswrapper[5002]: I1209 12:49:33.721853 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c2535b42-a1ab-41bb-9b0e-7970e69d34a3/glance-httpd/0.log" Dec 09 12:49:33 crc kubenswrapper[5002]: I1209 12:49:33.809024 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c2535b42-a1ab-41bb-9b0e-7970e69d34a3/glance-log/0.log" Dec 09 12:49:33 crc kubenswrapper[5002]: I1209 12:49:33.985517 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-54698c9446-z6mqw_671233cb-6429-4ecb-ad1d-9a40d9407b60/heat-api/0.log" Dec 09 12:49:34 crc kubenswrapper[5002]: I1209 12:49:34.147665 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-74d764bbbb-8rsqh_09b6ad8c-27bd-4485-94ac-ff455749056e/heat-cfnapi/0.log" Dec 09 12:49:34 crc kubenswrapper[5002]: I1209 12:49:34.387864 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-5844579b4-rxqx6_fcab3bf4-1659-4429-93f1-c941380fa773/heat-engine/0.log" Dec 09 12:49:34 crc kubenswrapper[5002]: I1209 12:49:34.508050 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-77b6c8965c-q29mq_e575f542-5dff-4ec7-b430-8e18382bc2e0/horizon/0.log" Dec 09 12:49:34 crc kubenswrapper[5002]: I1209 12:49:34.541494 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-77b6c8965c-q29mq_e575f542-5dff-4ec7-b430-8e18382bc2e0/horizon-log/0.log" Dec 09 12:49:34 crc kubenswrapper[5002]: I1209 12:49:34.612567 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-openstack-openstack-cell1-j5mgh_16490275-ede7-4620-aa87-69e7c31b3dec/install-certs-openstack-openstack-cell1/0.log" Dec 09 12:49:34 crc kubenswrapper[5002]: I1209 12:49:34.747084 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-openstack-openstack-cell1-r26xg_5a0b071a-c032-4ba5-a808-afcfda05fef6/install-os-openstack-openstack-cell1/0.log" Dec 09 12:49:34 crc kubenswrapper[5002]: I1209 12:49:34.926045 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-55cb897df9-2xg2j_1b5c3bc5-7b21-43f4-9172-e5604eb79053/keystone-api/0.log" Dec 09 12:49:34 crc kubenswrapper[5002]: I1209 12:49:34.959988 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29421361-mhxrg_905f5826-4f4a-4ec8-8c82-b109a32cb9bf/keystone-cron/0.log" Dec 09 12:49:35 crc kubenswrapper[5002]: I1209 12:49:35.074289 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_dedd9325-8ad0-49bf-9e6d-f14ea11b47dc/kube-state-metrics/0.log" Dec 09 12:49:35 crc kubenswrapper[5002]: I1209 12:49:35.203994 5002 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-openstack-openstack-cell1-mkzb6_b8783f64-c4b2-43fa-bc56-b5eb5d5c0345/libvirt-openstack-openstack-cell1/0.log" Dec 09 12:49:35 crc kubenswrapper[5002]: I1209 12:49:35.293425 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_570b8a7a-298d-45bf-b6e3-eeb433ca5d87/manila-api-log/0.log" Dec 09 12:49:35 crc kubenswrapper[5002]: I1209 12:49:35.330472 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_570b8a7a-298d-45bf-b6e3-eeb433ca5d87/manila-api/0.log" Dec 09 12:49:35 crc kubenswrapper[5002]: I1209 12:49:35.493675 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_7865f250-f9ab-4946-bf60-7cd667f2dbfa/probe/0.log" Dec 09 12:49:35 crc kubenswrapper[5002]: I1209 12:49:35.508380 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_7865f250-f9ab-4946-bf60-7cd667f2dbfa/manila-scheduler/0.log" Dec 09 12:49:35 crc kubenswrapper[5002]: I1209 12:49:35.612085 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_b4edcd85-a7e2-4745-bb74-b3babfa3caf0/manila-share/0.log" Dec 09 12:49:35 crc kubenswrapper[5002]: I1209 12:49:35.746759 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_b4edcd85-a7e2-4745-bb74-b3babfa3caf0/probe/0.log" Dec 09 12:49:35 crc kubenswrapper[5002]: I1209 12:49:35.993970 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7fb76d694f-vmgzb_45c5af3b-9ac2-4d9d-b1e4-e7bf01b80558/neutron-api/0.log" Dec 09 12:49:36 crc kubenswrapper[5002]: I1209 12:49:36.142263 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7fb76d694f-vmgzb_45c5af3b-9ac2-4d9d-b1e4-e7bf01b80558/neutron-httpd/0.log" Dec 09 12:49:36 crc kubenswrapper[5002]: I1209 12:49:36.222861 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-dhcp-openstack-openstack-cell1-w6qc7_cb33c5e1-0410-4e89-951b-b7443e522f5f/neutron-dhcp-openstack-openstack-cell1/0.log" Dec 09 12:49:36 crc kubenswrapper[5002]: I1209 12:49:36.510135 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-openstack-openstack-cell1-wctks_7d587a7b-aec3-4210-b8d7-b428bcf38686/neutron-metadata-openstack-openstack-cell1/0.log" Dec 09 12:49:36 crc kubenswrapper[5002]: I1209 12:49:36.553012 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-sriov-openstack-openstack-cell1-frp4b_effd7072-b70f-4208-a804-82ed8c1fca04/neutron-sriov-openstack-openstack-cell1/0.log" Dec 09 12:49:36 crc kubenswrapper[5002]: I1209 12:49:36.807309 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_17dde858-c91e-499f-be5e-1e859b11c81f/nova-api-api/0.log" Dec 09 12:49:36 crc kubenswrapper[5002]: I1209 12:49:36.913368 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_17dde858-c91e-499f-be5e-1e859b11c81f/nova-api-log/0.log" Dec 09 12:49:36 crc kubenswrapper[5002]: I1209 12:49:36.966923 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_e16a9875-2e67-4fd1-aad1-4e56d5d8f460/nova-cell0-conductor-conductor/0.log" Dec 09 12:49:37 crc kubenswrapper[5002]: I1209 12:49:37.129953 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_5a21591e-3a1c-4ff7-bf7b-a0b40bad00ac/nova-cell1-conductor-conductor/0.log" Dec 09 12:49:37 crc 
Dec 09 12:49:37 crc kubenswrapper[5002]: I1209 12:49:37.328026 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_6b037088-45da-4b2b-85d1-777da6060b4f/nova-cell1-novncproxy-novncproxy/0.log"
Dec 09 12:49:37 crc kubenswrapper[5002]: I1209 12:49:37.448311 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellknkww_e448dd02-1116-4b33-9298-7053138072e9/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1/0.log"
Dec 09 12:49:37 crc kubenswrapper[5002]: I1209 12:49:37.540787 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-openstack-cell1-mtwlk_1bb12456-0517-4952-a5ef-b2a06433e6b1/nova-cell1-openstack-openstack-cell1/0.log"
Dec 09 12:49:37 crc kubenswrapper[5002]: I1209 12:49:37.800744 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_94c16746-775c-4ee9-8cdc-898ff655cd41/nova-metadata-metadata/0.log"
Dec 09 12:49:37 crc kubenswrapper[5002]: I1209 12:49:37.842567 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_94c16746-775c-4ee9-8cdc-898ff655cd41/nova-metadata-log/0.log"
Dec 09 12:49:37 crc kubenswrapper[5002]: I1209 12:49:37.966521 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 09 12:49:37 crc kubenswrapper[5002]: I1209 12:49:37.966592 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 09 12:49:37 crc kubenswrapper[5002]: I1209 12:49:37.966648 5002 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6"
Dec 09 12:49:37 crc kubenswrapper[5002]: I1209 12:49:37.967602 5002 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2b189dbc4da5bd17d3dbe1a55a72bd7d5ec553e1d7ef1f793ee32247b44861fa"} pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 09 12:49:37 crc kubenswrapper[5002]: I1209 12:49:37.967669 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" containerID="cri-o://2b189dbc4da5bd17d3dbe1a55a72bd7d5ec553e1d7ef1f793ee32247b44861fa" gracePeriod=600
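Note: the machine-config-daemon liveness failures recur at exactly 30s intervals (12:48:37.96, 12:49:07.96, 12:49:37.96) and the kill follows the third one, consistent with periodSeconds=30 and failureThreshold=3; the gracePeriod=600 comes from the pod's termination grace period. A reconstruction in Go client types; the period and threshold are inferred from the timing above, not read from the pod spec:

```go
package main

import (
	corev1 "k8s.io/api/core/v1"
	"k8s.io/apimachinery/pkg/util/intstr"
)

// livenessProbe approximates the machine-config-daemon probe behind the
// entries above. Endpoint matches the log (http://127.0.0.1:8798/health);
// PeriodSeconds and FailureThreshold are inferred, so treat them as assumptions.
var livenessProbe = &corev1.Probe{
	ProbeHandler: corev1.ProbeHandler{
		HTTPGet: &corev1.HTTPGetAction{
			Host: "127.0.0.1",
			Path: "/health",
			Port: intstr.FromInt(8798),
		},
	},
	PeriodSeconds:    30, // failures land exactly 30s apart in the log
	FailureThreshold: 3,  // the kill follows the third consecutive failure
}

func main() { _ = livenessProbe }
```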
event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerDied","Data":"2b189dbc4da5bd17d3dbe1a55a72bd7d5ec553e1d7ef1f793ee32247b44861fa"} Dec 09 12:49:38 crc kubenswrapper[5002]: I1209 12:49:38.299617 5002 scope.go:117] "RemoveContainer" containerID="2fa91ed46b52ab7a1e437a2bf0fbb9ef768679dcceaa767b67034d3abb00357b" Dec 09 12:49:38 crc kubenswrapper[5002]: I1209 12:49:38.493548 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_e0fbb27e-b807-4823-978f-0ff21020d012/nova-scheduler-scheduler/0.log" Dec 09 12:49:38 crc kubenswrapper[5002]: I1209 12:49:38.641140 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-7f5cdc56bb-d9kbw_a8e7b10c-91b5-4632-9ded-327e013f19db/init/0.log" Dec 09 12:49:38 crc kubenswrapper[5002]: I1209 12:49:38.936892 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-7f5cdc56bb-d9kbw_a8e7b10c-91b5-4632-9ded-327e013f19db/octavia-api-provider-agent/0.log" Dec 09 12:49:38 crc kubenswrapper[5002]: I1209 12:49:38.985517 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-7f5cdc56bb-d9kbw_a8e7b10c-91b5-4632-9ded-327e013f19db/init/0.log" Dec 09 12:49:39 crc kubenswrapper[5002]: I1209 12:49:39.193886 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-jdm5x_3b1a1027-677c-4d78-b807-84091ef5fa84/init/0.log" Dec 09 12:49:39 crc kubenswrapper[5002]: I1209 12:49:39.210442 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-7f5cdc56bb-d9kbw_a8e7b10c-91b5-4632-9ded-327e013f19db/octavia-api/0.log" Dec 09 12:49:39 crc kubenswrapper[5002]: I1209 12:49:39.322405 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerStarted","Data":"29f1eaa11fdfd853d7d438349fc9003be9b862bc0e494c5b01deab20dcab658f"} Dec 09 12:49:39 crc kubenswrapper[5002]: I1209 12:49:39.563785 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-jdm5x_3b1a1027-677c-4d78-b807-84091ef5fa84/init/0.log" Dec 09 12:49:39 crc kubenswrapper[5002]: I1209 12:49:39.578060 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-ljd8f_649fd26a-484b-46f2-b1d5-5acffdf62265/init/0.log" Dec 09 12:49:39 crc kubenswrapper[5002]: I1209 12:49:39.656555 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-jdm5x_3b1a1027-677c-4d78-b807-84091ef5fa84/octavia-healthmanager/0.log" Dec 09 12:49:39 crc kubenswrapper[5002]: I1209 12:49:39.790071 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-ljd8f_649fd26a-484b-46f2-b1d5-5acffdf62265/init/0.log" Dec 09 12:49:40 crc kubenswrapper[5002]: I1209 12:49:40.138854 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-ljd8f_649fd26a-484b-46f2-b1d5-5acffdf62265/octavia-housekeeping/0.log" Dec 09 12:49:40 crc kubenswrapper[5002]: I1209 12:49:40.608002 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-59f8cff499-njkxr_e83bfed7-2c34-4972-bcac-89bb8621051c/init/0.log" Dec 09 12:49:40 crc kubenswrapper[5002]: I1209 12:49:40.887410 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-59f8cff499-njkxr_e83bfed7-2c34-4972-bcac-89bb8621051c/init/0.log" Dec 09 
12:49:40 crc kubenswrapper[5002]: I1209 12:49:40.900131 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-59f8cff499-njkxr_e83bfed7-2c34-4972-bcac-89bb8621051c/octavia-amphora-httpd/0.log" Dec 09 12:49:40 crc kubenswrapper[5002]: I1209 12:49:40.921471 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-5gzkr_a44e68c5-c542-4f82-af1f-a3405ef59656/init/0.log" Dec 09 12:49:41 crc kubenswrapper[5002]: I1209 12:49:41.172275 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-5gzkr_a44e68c5-c542-4f82-af1f-a3405ef59656/init/0.log" Dec 09 12:49:41 crc kubenswrapper[5002]: I1209 12:49:41.182154 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-5gzkr_a44e68c5-c542-4f82-af1f-a3405ef59656/octavia-rsyslog/0.log" Dec 09 12:49:41 crc kubenswrapper[5002]: I1209 12:49:41.342977 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-l54zc_a1f5f591-fa99-49ac-a17c-b643790ffdba/init/0.log" Dec 09 12:49:41 crc kubenswrapper[5002]: I1209 12:49:41.571650 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-l54zc_a1f5f591-fa99-49ac-a17c-b643790ffdba/init/0.log" Dec 09 12:49:41 crc kubenswrapper[5002]: I1209 12:49:41.608304 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_22c2018d-2934-4552-a848-770614c6ff8c/mysql-bootstrap/0.log" Dec 09 12:49:41 crc kubenswrapper[5002]: I1209 12:49:41.698034 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-l54zc_a1f5f591-fa99-49ac-a17c-b643790ffdba/octavia-worker/0.log" Dec 09 12:49:41 crc kubenswrapper[5002]: I1209 12:49:41.824298 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_22c2018d-2934-4552-a848-770614c6ff8c/mysql-bootstrap/0.log" Dec 09 12:49:41 crc kubenswrapper[5002]: I1209 12:49:41.887166 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_22c2018d-2934-4552-a848-770614c6ff8c/galera/0.log" Dec 09 12:49:41 crc kubenswrapper[5002]: I1209 12:49:41.954126 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_68ec5be1-063f-481b-a00c-cf882c596d5f/mysql-bootstrap/0.log" Dec 09 12:49:42 crc kubenswrapper[5002]: I1209 12:49:42.216443 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_68ec5be1-063f-481b-a00c-cf882c596d5f/mysql-bootstrap/0.log" Dec 09 12:49:42 crc kubenswrapper[5002]: I1209 12:49:42.323526 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_68ec5be1-063f-481b-a00c-cf882c596d5f/galera/0.log" Dec 09 12:49:42 crc kubenswrapper[5002]: I1209 12:49:42.336683 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_2bdcec8a-0ec1-47ef-986c-3ad7d4ed7b4b/openstackclient/0.log" Dec 09 12:49:42 crc kubenswrapper[5002]: I1209 12:49:42.582485 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-wxgg4_f7449046-0fa6-4724-9f8c-dd9cff7bc95d/openstack-network-exporter/0.log" Dec 09 12:49:42 crc kubenswrapper[5002]: I1209 12:49:42.589301 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-9glwj_3e229031-30d4-44e3-897b-b4c4252e7b99/ovn-controller/0.log" Dec 09 12:49:42 crc kubenswrapper[5002]: I1209 12:49:42.842614 5002 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ff8qg_5707bebc-bcab-4d7a-a4b9-28cb8b6a9b86/ovsdb-server-init/0.log" Dec 09 12:49:43 crc kubenswrapper[5002]: I1209 12:49:43.078430 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ff8qg_5707bebc-bcab-4d7a-a4b9-28cb8b6a9b86/ovs-vswitchd/0.log" Dec 09 12:49:43 crc kubenswrapper[5002]: I1209 12:49:43.081329 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ff8qg_5707bebc-bcab-4d7a-a4b9-28cb8b6a9b86/ovsdb-server-init/0.log" Dec 09 12:49:43 crc kubenswrapper[5002]: I1209 12:49:43.147923 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ff8qg_5707bebc-bcab-4d7a-a4b9-28cb8b6a9b86/ovsdb-server/0.log" Dec 09 12:49:43 crc kubenswrapper[5002]: I1209 12:49:43.549429 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_603f27c8-09c3-4c86-8774-3a5937dfaf8e/ovn-northd/0.log" Dec 09 12:49:43 crc kubenswrapper[5002]: I1209 12:49:43.560135 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_603f27c8-09c3-4c86-8774-3a5937dfaf8e/openstack-network-exporter/0.log" Dec 09 12:49:43 crc kubenswrapper[5002]: I1209 12:49:43.819309 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-cell1-hhmxb_c17d588e-aecc-4b15-b345-a461e9e20a58/ovn-openstack-openstack-cell1/0.log" Dec 09 12:49:43 crc kubenswrapper[5002]: I1209 12:49:43.911137 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_c2e2b902-40f9-4150-b6c8-e88c10db76fe/openstack-network-exporter/0.log" Dec 09 12:49:44 crc kubenswrapper[5002]: I1209 12:49:44.030645 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_c2e2b902-40f9-4150-b6c8-e88c10db76fe/ovsdbserver-nb/0.log" Dec 09 12:49:44 crc kubenswrapper[5002]: I1209 12:49:44.153010 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_e01218b6-7004-46d8-91c5-cc3206cc6804/openstack-network-exporter/0.log" Dec 09 12:49:44 crc kubenswrapper[5002]: I1209 12:49:44.188283 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_e01218b6-7004-46d8-91c5-cc3206cc6804/ovsdbserver-nb/0.log" Dec 09 12:49:44 crc kubenswrapper[5002]: I1209 12:49:44.340169 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_67bf2a52-93b6-4957-97e5-4c41c6b9fcb1/openstack-network-exporter/0.log" Dec 09 12:49:44 crc kubenswrapper[5002]: I1209 12:49:44.396533 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_67bf2a52-93b6-4957-97e5-4c41c6b9fcb1/ovsdbserver-nb/0.log" Dec 09 12:49:44 crc kubenswrapper[5002]: I1209 12:49:44.608787 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_0d72442d-739b-4835-a59a-f701330dd8d5/ovsdbserver-sb/0.log" Dec 09 12:49:44 crc kubenswrapper[5002]: I1209 12:49:44.612488 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_0d72442d-739b-4835-a59a-f701330dd8d5/openstack-network-exporter/0.log" Dec 09 12:49:44 crc kubenswrapper[5002]: I1209 12:49:44.710352 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_c73129e1-6a6c-485e-8601-f89ab669974d/openstack-network-exporter/0.log" Dec 09 12:49:44 crc kubenswrapper[5002]: I1209 12:49:44.796420 5002 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-sb-1_c73129e1-6a6c-485e-8601-f89ab669974d/ovsdbserver-sb/0.log" Dec 09 12:49:44 crc kubenswrapper[5002]: I1209 12:49:44.916305 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_b41cb4fe-b7f6-4994-a92c-faa5675b32ce/openstack-network-exporter/0.log" Dec 09 12:49:44 crc kubenswrapper[5002]: I1209 12:49:44.968122 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_b41cb4fe-b7f6-4994-a92c-faa5675b32ce/ovsdbserver-sb/0.log" Dec 09 12:49:45 crc kubenswrapper[5002]: I1209 12:49:45.257383 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-77cc5c8946-bsgmg_26231343-3514-4a68-bbb5-95594b525082/placement-api/0.log" Dec 09 12:49:45 crc kubenswrapper[5002]: I1209 12:49:45.263644 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-77cc5c8946-bsgmg_26231343-3514-4a68-bbb5-95594b525082/placement-log/0.log" Dec 09 12:49:45 crc kubenswrapper[5002]: I1209 12:49:45.293399 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-openstack-cdnffp_03b994f6-6474-46af-8a64-6e9f9169e073/pre-adoption-validation-openstack-pre-adoption-openstack-cell1/0.log" Dec 09 12:49:45 crc kubenswrapper[5002]: I1209 12:49:45.460890 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_49ee88e1-200f-4c9b-8411-cedfd8cf9afb/init-config-reloader/0.log" Dec 09 12:49:45 crc kubenswrapper[5002]: I1209 12:49:45.695026 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_49ee88e1-200f-4c9b-8411-cedfd8cf9afb/init-config-reloader/0.log" Dec 09 12:49:45 crc kubenswrapper[5002]: I1209 12:49:45.753492 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_49ee88e1-200f-4c9b-8411-cedfd8cf9afb/config-reloader/0.log" Dec 09 12:49:45 crc kubenswrapper[5002]: I1209 12:49:45.766873 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_49ee88e1-200f-4c9b-8411-cedfd8cf9afb/prometheus/0.log" Dec 09 12:49:45 crc kubenswrapper[5002]: I1209 12:49:45.908533 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_49ee88e1-200f-4c9b-8411-cedfd8cf9afb/thanos-sidecar/0.log" Dec 09 12:49:45 crc kubenswrapper[5002]: I1209 12:49:45.987077 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c/setup-container/0.log" Dec 09 12:49:46 crc kubenswrapper[5002]: I1209 12:49:46.226668 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c/setup-container/0.log" Dec 09 12:49:46 crc kubenswrapper[5002]: I1209 12:49:46.239893 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_7a5d1355-cfbe-4ebb-bb37-a8ce5b29de81/memcached/0.log" Dec 09 12:49:46 crc kubenswrapper[5002]: I1209 12:49:46.336404 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_8edfa23e-fdc1-4a53-be18-0ba6cf8fd30c/rabbitmq/0.log" Dec 09 12:49:46 crc kubenswrapper[5002]: I1209 12:49:46.354873 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_22ca3dfe-9996-43c3-89b4-ee6624561059/setup-container/0.log" Dec 09 12:49:46 crc kubenswrapper[5002]: I1209 
12:49:46.591559 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-openstack-openstack-cell1-5wsxk_06e24666-e7a6-4d17-b4fa-237ecc575eff/reboot-os-openstack-openstack-cell1/0.log" Dec 09 12:49:46 crc kubenswrapper[5002]: I1209 12:49:46.602293 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_22ca3dfe-9996-43c3-89b4-ee6624561059/setup-container/0.log" Dec 09 12:49:46 crc kubenswrapper[5002]: I1209 12:49:46.762579 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-openstack-openstack-cell1-f9wq6_01e50bcf-190a-49f2-bde4-84a59dc60d2f/run-os-openstack-openstack-cell1/0.log" Dec 09 12:49:46 crc kubenswrapper[5002]: I1209 12:49:46.860853 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-openstack-r2pdh_44c30495-c199-4f48-a69d-368ff294f0a8/ssh-known-hosts-openstack/0.log" Dec 09 12:49:47 crc kubenswrapper[5002]: I1209 12:49:47.101218 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-openstack-openstack-cell1-7nglp_0625b628-5e86-426a-a766-ce710d59a41e/telemetry-openstack-openstack-cell1/0.log" Dec 09 12:49:47 crc kubenswrapper[5002]: I1209 12:49:47.234957 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-cell1-9rspg_17cd99c9-1797-4270-9cdb-0589c1797d27/tripleo-cleanup-tripleo-cleanup-openstack-cell1/0.log" Dec 09 12:49:47 crc kubenswrapper[5002]: I1209 12:49:47.383533 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-openstack-openstack-cell1-n42l8_f848efb7-cc84-45e2-930a-efd7824640cf/validate-network-openstack-openstack-cell1/0.log" Dec 09 12:49:47 crc kubenswrapper[5002]: I1209 12:49:47.549734 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_22ca3dfe-9996-43c3-89b4-ee6624561059/rabbitmq/0.log" Dec 09 12:50:11 crc kubenswrapper[5002]: I1209 12:50:11.011987 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_538f80a306e96e20ae463cfd8070d55ec3227b7ea36219057239327815sk54c_c673bba7-aec6-4d60-a236-e168b4570805/util/0.log" Dec 09 12:50:11 crc kubenswrapper[5002]: I1209 12:50:11.185356 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_538f80a306e96e20ae463cfd8070d55ec3227b7ea36219057239327815sk54c_c673bba7-aec6-4d60-a236-e168b4570805/util/0.log" Dec 09 12:50:11 crc kubenswrapper[5002]: I1209 12:50:11.238503 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_538f80a306e96e20ae463cfd8070d55ec3227b7ea36219057239327815sk54c_c673bba7-aec6-4d60-a236-e168b4570805/pull/0.log" Dec 09 12:50:11 crc kubenswrapper[5002]: I1209 12:50:11.240763 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_538f80a306e96e20ae463cfd8070d55ec3227b7ea36219057239327815sk54c_c673bba7-aec6-4d60-a236-e168b4570805/pull/0.log" Dec 09 12:50:11 crc kubenswrapper[5002]: I1209 12:50:11.422057 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_538f80a306e96e20ae463cfd8070d55ec3227b7ea36219057239327815sk54c_c673bba7-aec6-4d60-a236-e168b4570805/util/0.log" Dec 09 12:50:11 crc kubenswrapper[5002]: I1209 12:50:11.445956 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_538f80a306e96e20ae463cfd8070d55ec3227b7ea36219057239327815sk54c_c673bba7-aec6-4d60-a236-e168b4570805/extract/0.log" Dec 09 12:50:11 crc kubenswrapper[5002]: I1209 
12:50:11.455755 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_538f80a306e96e20ae463cfd8070d55ec3227b7ea36219057239327815sk54c_c673bba7-aec6-4d60-a236-e168b4570805/pull/0.log" Dec 09 12:50:11 crc kubenswrapper[5002]: I1209 12:50:11.617422 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-kj2xr_7fce722c-f3f5-48b6-a567-e867e4e1f27b/kube-rbac-proxy/0.log" Dec 09 12:50:11 crc kubenswrapper[5002]: I1209 12:50:11.772141 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6c677c69b-4dzfs_78d17849-0dbc-4b43-b5e6-49bd7c766aa1/kube-rbac-proxy/0.log" Dec 09 12:50:11 crc kubenswrapper[5002]: I1209 12:50:11.808573 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-kj2xr_7fce722c-f3f5-48b6-a567-e867e4e1f27b/manager/0.log" Dec 09 12:50:12 crc kubenswrapper[5002]: I1209 12:50:12.439713 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-697fb699cf-9th65_80762edf-2f88-41a7-b96b-2856b2e2c00e/kube-rbac-proxy/0.log" Dec 09 12:50:12 crc kubenswrapper[5002]: I1209 12:50:12.447210 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-697fb699cf-9th65_80762edf-2f88-41a7-b96b-2856b2e2c00e/manager/0.log" Dec 09 12:50:12 crc kubenswrapper[5002]: I1209 12:50:12.487090 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6c677c69b-4dzfs_78d17849-0dbc-4b43-b5e6-49bd7c766aa1/manager/0.log" Dec 09 12:50:12 crc kubenswrapper[5002]: I1209 12:50:12.646751 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5697bb5779-vcspp_ce6b14b8-1656-4ac0-b6e2-24fd9329faf4/kube-rbac-proxy/0.log" Dec 09 12:50:12 crc kubenswrapper[5002]: I1209 12:50:12.812687 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5697bb5779-vcspp_ce6b14b8-1656-4ac0-b6e2-24fd9329faf4/manager/0.log" Dec 09 12:50:12 crc kubenswrapper[5002]: I1209 12:50:12.836826 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-5pq7d_f0f92bf3-053d-4ea9-bd30-4e6724c414c8/kube-rbac-proxy/0.log" Dec 09 12:50:12 crc kubenswrapper[5002]: I1209 12:50:12.941453 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-5pq7d_f0f92bf3-053d-4ea9-bd30-4e6724c414c8/manager/0.log" Dec 09 12:50:13 crc kubenswrapper[5002]: I1209 12:50:13.022548 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-b8mpv_8519dc96-e20f-47d9-9b48-e64a07393d39/kube-rbac-proxy/0.log" Dec 09 12:50:13 crc kubenswrapper[5002]: I1209 12:50:13.081503 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-b8mpv_8519dc96-e20f-47d9-9b48-e64a07393d39/manager/0.log" Dec 09 12:50:13 crc kubenswrapper[5002]: I1209 12:50:13.184202 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-78d48bff9d-qqkqr_531ff665-3733-405e-b178-0f185d8cb22e/kube-rbac-proxy/0.log" Dec 09 12:50:13 crc 
kubenswrapper[5002]: I1209 12:50:13.444318 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-967d97867-qvnbd_d3588b02-67b1-4172-9ab8-9da4cb7b09dc/kube-rbac-proxy/0.log" Dec 09 12:50:13 crc kubenswrapper[5002]: I1209 12:50:13.491844 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-967d97867-qvnbd_d3588b02-67b1-4172-9ab8-9da4cb7b09dc/manager/0.log" Dec 09 12:50:13 crc kubenswrapper[5002]: I1209 12:50:13.724460 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-495q6_a3c0ed3b-56a5-490b-bca4-9541eeb8e2cf/kube-rbac-proxy/0.log" Dec 09 12:50:13 crc kubenswrapper[5002]: I1209 12:50:13.838159 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-78d48bff9d-qqkqr_531ff665-3733-405e-b178-0f185d8cb22e/manager/0.log" Dec 09 12:50:13 crc kubenswrapper[5002]: I1209 12:50:13.894723 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-495q6_a3c0ed3b-56a5-490b-bca4-9541eeb8e2cf/manager/0.log" Dec 09 12:50:13 crc kubenswrapper[5002]: I1209 12:50:13.937177 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5b5fd79c9c-8jgwr_2e565997-148f-4d70-ae78-89f194f791f8/kube-rbac-proxy/0.log" Dec 09 12:50:14 crc kubenswrapper[5002]: I1209 12:50:14.078270 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5b5fd79c9c-8jgwr_2e565997-148f-4d70-ae78-89f194f791f8/manager/0.log" Dec 09 12:50:14 crc kubenswrapper[5002]: I1209 12:50:14.139923 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-79c8c4686c-25fmz_2eeed466-946c-49a5-9fe9-b393629c3394/kube-rbac-proxy/0.log" Dec 09 12:50:14 crc kubenswrapper[5002]: I1209 12:50:14.202462 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-79c8c4686c-25fmz_2eeed466-946c-49a5-9fe9-b393629c3394/manager/0.log" Dec 09 12:50:14 crc kubenswrapper[5002]: I1209 12:50:14.298262 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-q294d_5f6d862d-d775-4712-b455-ad5110968d19/kube-rbac-proxy/0.log" Dec 09 12:50:14 crc kubenswrapper[5002]: I1209 12:50:14.443249 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-q294d_5f6d862d-d775-4712-b455-ad5110968d19/manager/0.log" Dec 09 12:50:14 crc kubenswrapper[5002]: I1209 12:50:14.492764 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-7kqxz_51978e4b-eb47-4e87-8528-38631838226a/kube-rbac-proxy/0.log" Dec 09 12:50:14 crc kubenswrapper[5002]: I1209 12:50:14.658286 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-7kqxz_51978e4b-eb47-4e87-8528-38631838226a/manager/0.log" Dec 09 12:50:14 crc kubenswrapper[5002]: I1209 12:50:14.690327 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-jb68h_097e947b-8923-4155-9d10-f0241f751ad8/kube-rbac-proxy/0.log" Dec 
09 12:50:14 crc kubenswrapper[5002]: I1209 12:50:14.706747 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-jb68h_097e947b-8923-4155-9d10-f0241f751ad8/manager/0.log" Dec 09 12:50:14 crc kubenswrapper[5002]: I1209 12:50:14.823416 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-84b575879fm85h8_3f8701c7-f70e-4584-8cf1-ed40adda7b84/kube-rbac-proxy/0.log" Dec 09 12:50:14 crc kubenswrapper[5002]: I1209 12:50:14.867547 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-84b575879fm85h8_3f8701c7-f70e-4584-8cf1-ed40adda7b84/manager/0.log" Dec 09 12:50:15 crc kubenswrapper[5002]: I1209 12:50:15.083321 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-s67vq_267396c4-1ded-4436-9caa-a45bb8a54e75/registry-server/0.log" Dec 09 12:50:15 crc kubenswrapper[5002]: I1209 12:50:15.209320 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-5d69f78f7c-pmv68_71024129-a040-442d-9e93-f3d0125153ac/operator/0.log" Dec 09 12:50:15 crc kubenswrapper[5002]: I1209 12:50:15.293863 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-rdjzk_276be0c6-fc03-4693-91c7-5a999e6f0d89/kube-rbac-proxy/0.log" Dec 09 12:50:15 crc kubenswrapper[5002]: I1209 12:50:15.472911 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-rdjzk_276be0c6-fc03-4693-91c7-5a999e6f0d89/manager/0.log" Dec 09 12:50:15 crc kubenswrapper[5002]: I1209 12:50:15.483136 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-hnnmt_d172ce03-c24c-41ca-a61a-eecdd9572f5d/kube-rbac-proxy/0.log" Dec 09 12:50:15 crc kubenswrapper[5002]: I1209 12:50:15.583338 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-hnnmt_d172ce03-c24c-41ca-a61a-eecdd9572f5d/manager/0.log" Dec 09 12:50:15 crc kubenswrapper[5002]: I1209 12:50:15.745735 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-x578z_cecd7de4-71b0-4914-a5c5-68b8da7704cf/operator/0.log" Dec 09 12:50:15 crc kubenswrapper[5002]: I1209 12:50:15.804515 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9d58d64bc-vvm2b_1c8db604-51f8-4fe5-b659-fc14b7c706f1/kube-rbac-proxy/0.log" Dec 09 12:50:15 crc kubenswrapper[5002]: I1209 12:50:15.959664 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9d58d64bc-vvm2b_1c8db604-51f8-4fe5-b659-fc14b7c706f1/manager/0.log" Dec 09 12:50:16 crc kubenswrapper[5002]: I1209 12:50:16.034253 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-58d5ff84df-4vg8l_c66676bd-3461-405a-bce7-60c91858b55e/kube-rbac-proxy/0.log" Dec 09 12:50:16 crc kubenswrapper[5002]: I1209 12:50:16.272645 5002 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-8n5z8_048abed6-46b3-4dba-bf28-8c0ee1685817/kube-rbac-proxy/0.log" Dec 09 12:50:16 crc kubenswrapper[5002]: I1209 12:50:16.276405 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-8n5z8_048abed6-46b3-4dba-bf28-8c0ee1685817/manager/0.log" Dec 09 12:50:16 crc kubenswrapper[5002]: I1209 12:50:16.410151 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-58d5ff84df-4vg8l_c66676bd-3461-405a-bce7-60c91858b55e/manager/0.log" Dec 09 12:50:16 crc kubenswrapper[5002]: I1209 12:50:16.457152 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-75944c9b7-vkkw9_53e6de02-2fb5-4491-aa93-7ad8bd2a190c/kube-rbac-proxy/0.log" Dec 09 12:50:16 crc kubenswrapper[5002]: I1209 12:50:16.530449 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-75944c9b7-vkkw9_53e6de02-2fb5-4491-aa93-7ad8bd2a190c/manager/0.log" Dec 09 12:50:17 crc kubenswrapper[5002]: I1209 12:50:17.315589 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7f8d8fb65b-6gj42_2c2f2646-d609-433d-894e-5577cc497adc/manager/0.log" Dec 09 12:50:36 crc kubenswrapper[5002]: I1209 12:50:36.609241 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-r22tz_0af4f459-58c9-4433-9faf-5e669262fb2e/control-plane-machine-set-operator/0.log" Dec 09 12:50:36 crc kubenswrapper[5002]: I1209 12:50:36.741656 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-cmm6f_dcdecdfe-61c0-4d85-99b5-e1fe25727259/kube-rbac-proxy/0.log" Dec 09 12:50:36 crc kubenswrapper[5002]: I1209 12:50:36.800362 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-cmm6f_dcdecdfe-61c0-4d85-99b5-e1fe25727259/machine-api-operator/0.log" Dec 09 12:50:49 crc kubenswrapper[5002]: I1209 12:50:49.968912 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-86cb77c54b-lqcpz_e8dfcd91-b1dc-4187-98b6-5d7e10e0a92d/cert-manager-controller/0.log" Dec 09 12:50:50 crc kubenswrapper[5002]: I1209 12:50:50.199311 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-855d9ccff4-wwznf_0f05f554-fd63-4f53-bd2a-229c13a59ae2/cert-manager-cainjector/0.log" Dec 09 12:50:50 crc kubenswrapper[5002]: I1209 12:50:50.217038 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-f4fb5df64-nm6zd_19550cb9-1d04-4081-b7e9-f0f678c45926/cert-manager-webhook/0.log" Dec 09 12:51:03 crc kubenswrapper[5002]: I1209 12:51:03.328555 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-qmxnt_fcee10e4-739d-4559-a50f-3fbb1300ad78/nmstate-console-plugin/0.log" Dec 09 12:51:03 crc kubenswrapper[5002]: I1209 12:51:03.507728 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-q4qht_802b1295-7ccc-4cf1-a241-045ff519eae4/nmstate-handler/0.log" Dec 09 12:51:03 crc kubenswrapper[5002]: I1209 12:51:03.524052 5002 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-xw282_14984393-4074-4385-80a1-04fc38c0580f/kube-rbac-proxy/0.log" Dec 09 12:51:03 crc kubenswrapper[5002]: I1209 12:51:03.557504 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-xw282_14984393-4074-4385-80a1-04fc38c0580f/nmstate-metrics/0.log" Dec 09 12:51:03 crc kubenswrapper[5002]: I1209 12:51:03.711239 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-dzhl5_c1916146-c9a5-4a76-ba95-d677b5bc1263/nmstate-operator/0.log" Dec 09 12:51:03 crc kubenswrapper[5002]: I1209 12:51:03.795675 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-crlfm_f62b3fbf-c7ef-4477-a655-5432c8392db4/nmstate-webhook/0.log" Dec 09 12:51:18 crc kubenswrapper[5002]: I1209 12:51:18.921561 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-v2gfv_74c8a781-bfca-4207-89e2-31bc463c8db9/kube-rbac-proxy/0.log" Dec 09 12:51:19 crc kubenswrapper[5002]: I1209 12:51:19.256279 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cbcs_21791258-f31e-49b6-8470-1915bc504a3f/cp-frr-files/0.log" Dec 09 12:51:19 crc kubenswrapper[5002]: I1209 12:51:19.502121 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cbcs_21791258-f31e-49b6-8470-1915bc504a3f/cp-frr-files/0.log" Dec 09 12:51:19 crc kubenswrapper[5002]: I1209 12:51:19.510048 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cbcs_21791258-f31e-49b6-8470-1915bc504a3f/cp-metrics/0.log" Dec 09 12:51:19 crc kubenswrapper[5002]: I1209 12:51:19.515014 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-v2gfv_74c8a781-bfca-4207-89e2-31bc463c8db9/controller/0.log" Dec 09 12:51:19 crc kubenswrapper[5002]: I1209 12:51:19.529420 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cbcs_21791258-f31e-49b6-8470-1915bc504a3f/cp-reloader/0.log" Dec 09 12:51:19 crc kubenswrapper[5002]: I1209 12:51:19.677019 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cbcs_21791258-f31e-49b6-8470-1915bc504a3f/cp-reloader/0.log" Dec 09 12:51:19 crc kubenswrapper[5002]: I1209 12:51:19.869316 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cbcs_21791258-f31e-49b6-8470-1915bc504a3f/cp-reloader/0.log" Dec 09 12:51:19 crc kubenswrapper[5002]: I1209 12:51:19.936430 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cbcs_21791258-f31e-49b6-8470-1915bc504a3f/cp-metrics/0.log" Dec 09 12:51:19 crc kubenswrapper[5002]: I1209 12:51:19.944042 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cbcs_21791258-f31e-49b6-8470-1915bc504a3f/cp-metrics/0.log" Dec 09 12:51:19 crc kubenswrapper[5002]: I1209 12:51:19.960609 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cbcs_21791258-f31e-49b6-8470-1915bc504a3f/cp-frr-files/0.log" Dec 09 12:51:20 crc kubenswrapper[5002]: I1209 12:51:20.093656 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cbcs_21791258-f31e-49b6-8470-1915bc504a3f/cp-frr-files/0.log" Dec 09 12:51:20 crc kubenswrapper[5002]: I1209 12:51:20.144225 5002 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-8cbcs_21791258-f31e-49b6-8470-1915bc504a3f/cp-reloader/0.log" Dec 09 12:51:20 crc kubenswrapper[5002]: I1209 12:51:20.145686 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cbcs_21791258-f31e-49b6-8470-1915bc504a3f/cp-metrics/0.log" Dec 09 12:51:20 crc kubenswrapper[5002]: I1209 12:51:20.184070 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cbcs_21791258-f31e-49b6-8470-1915bc504a3f/controller/0.log" Dec 09 12:51:20 crc kubenswrapper[5002]: I1209 12:51:20.346299 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cbcs_21791258-f31e-49b6-8470-1915bc504a3f/frr-metrics/0.log" Dec 09 12:51:20 crc kubenswrapper[5002]: I1209 12:51:20.360271 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cbcs_21791258-f31e-49b6-8470-1915bc504a3f/kube-rbac-proxy/0.log" Dec 09 12:51:20 crc kubenswrapper[5002]: I1209 12:51:20.612234 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cbcs_21791258-f31e-49b6-8470-1915bc504a3f/kube-rbac-proxy-frr/0.log" Dec 09 12:51:20 crc kubenswrapper[5002]: I1209 12:51:20.676661 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cbcs_21791258-f31e-49b6-8470-1915bc504a3f/reloader/0.log" Dec 09 12:51:20 crc kubenswrapper[5002]: I1209 12:51:20.917514 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-qwxw6_cb8964b1-8a1b-4af6-8340-b7678fef088c/frr-k8s-webhook-server/0.log" Dec 09 12:51:20 crc kubenswrapper[5002]: I1209 12:51:20.936592 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-86c8bdfb4c-jrzfr_d3a83a4b-db5e-4bed-817a-64fefad2fb7c/manager/0.log" Dec 09 12:51:21 crc kubenswrapper[5002]: I1209 12:51:21.140790 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-75dc995dc4-dczqv_72f4f251-fe6b-4c0e-8d63-927a2d37eba5/webhook-server/0.log" Dec 09 12:51:21 crc kubenswrapper[5002]: I1209 12:51:21.360377 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-jdlbp_2b43b64e-4d97-41ba-8514-550bbb0f497b/kube-rbac-proxy/0.log" Dec 09 12:51:22 crc kubenswrapper[5002]: I1209 12:51:22.270755 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-jdlbp_2b43b64e-4d97-41ba-8514-550bbb0f497b/speaker/0.log" Dec 09 12:51:23 crc kubenswrapper[5002]: I1209 12:51:23.655212 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cbcs_21791258-f31e-49b6-8470-1915bc504a3f/frr/0.log" Dec 09 12:51:35 crc kubenswrapper[5002]: I1209 12:51:35.835122 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ad4jrx_98fed57c-cdf3-4e3e-a7e5-422973325063/util/0.log" Dec 09 12:51:36 crc kubenswrapper[5002]: I1209 12:51:36.063044 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ad4jrx_98fed57c-cdf3-4e3e-a7e5-422973325063/util/0.log" Dec 09 12:51:36 crc kubenswrapper[5002]: I1209 12:51:36.117352 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ad4jrx_98fed57c-cdf3-4e3e-a7e5-422973325063/pull/0.log" Dec 09 
12:51:36 crc kubenswrapper[5002]: I1209 12:51:36.151422 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ad4jrx_98fed57c-cdf3-4e3e-a7e5-422973325063/pull/0.log" Dec 09 12:51:36 crc kubenswrapper[5002]: I1209 12:51:36.765211 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ad4jrx_98fed57c-cdf3-4e3e-a7e5-422973325063/util/0.log" Dec 09 12:51:36 crc kubenswrapper[5002]: I1209 12:51:36.787731 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ad4jrx_98fed57c-cdf3-4e3e-a7e5-422973325063/pull/0.log" Dec 09 12:51:36 crc kubenswrapper[5002]: I1209 12:51:36.802841 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ad4jrx_98fed57c-cdf3-4e3e-a7e5-422973325063/extract/0.log" Dec 09 12:51:36 crc kubenswrapper[5002]: I1209 12:51:36.971958 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbppkz_90bbec9d-d897-4556-845c-249b0d52c202/util/0.log" Dec 09 12:51:37 crc kubenswrapper[5002]: I1209 12:51:37.151353 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbppkz_90bbec9d-d897-4556-845c-249b0d52c202/pull/0.log" Dec 09 12:51:37 crc kubenswrapper[5002]: I1209 12:51:37.157026 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbppkz_90bbec9d-d897-4556-845c-249b0d52c202/util/0.log" Dec 09 12:51:37 crc kubenswrapper[5002]: I1209 12:51:37.163430 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbppkz_90bbec9d-d897-4556-845c-249b0d52c202/pull/0.log" Dec 09 12:51:37 crc kubenswrapper[5002]: I1209 12:51:37.334988 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbppkz_90bbec9d-d897-4556-845c-249b0d52c202/pull/0.log" Dec 09 12:51:37 crc kubenswrapper[5002]: I1209 12:51:37.342455 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbppkz_90bbec9d-d897-4556-845c-249b0d52c202/util/0.log" Dec 09 12:51:37 crc kubenswrapper[5002]: I1209 12:51:37.373707 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbppkz_90bbec9d-d897-4556-845c-249b0d52c202/extract/0.log" Dec 09 12:51:37 crc kubenswrapper[5002]: I1209 12:51:37.499064 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92104j42l_4a24ce03-952b-4bbe-aa94-26132c1f7655/util/0.log" Dec 09 12:51:37 crc kubenswrapper[5002]: I1209 12:51:37.708054 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92104j42l_4a24ce03-952b-4bbe-aa94-26132c1f7655/pull/0.log" Dec 09 12:51:37 crc kubenswrapper[5002]: I1209 12:51:37.737754 5002 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92104j42l_4a24ce03-952b-4bbe-aa94-26132c1f7655/util/0.log" Dec 09 12:51:37 crc kubenswrapper[5002]: I1209 12:51:37.753328 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92104j42l_4a24ce03-952b-4bbe-aa94-26132c1f7655/pull/0.log" Dec 09 12:51:37 crc kubenswrapper[5002]: I1209 12:51:37.928328 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92104j42l_4a24ce03-952b-4bbe-aa94-26132c1f7655/util/0.log" Dec 09 12:51:37 crc kubenswrapper[5002]: I1209 12:51:37.930297 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92104j42l_4a24ce03-952b-4bbe-aa94-26132c1f7655/extract/0.log" Dec 09 12:51:37 crc kubenswrapper[5002]: I1209 12:51:37.938274 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92104j42l_4a24ce03-952b-4bbe-aa94-26132c1f7655/pull/0.log" Dec 09 12:51:38 crc kubenswrapper[5002]: I1209 12:51:38.102537 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ctlb2_4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c/util/0.log" Dec 09 12:51:38 crc kubenswrapper[5002]: I1209 12:51:38.281288 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ctlb2_4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c/util/0.log" Dec 09 12:51:38 crc kubenswrapper[5002]: I1209 12:51:38.288491 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ctlb2_4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c/pull/0.log" Dec 09 12:51:38 crc kubenswrapper[5002]: I1209 12:51:38.305782 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ctlb2_4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c/pull/0.log" Dec 09 12:51:38 crc kubenswrapper[5002]: I1209 12:51:38.516536 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ctlb2_4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c/extract/0.log" Dec 09 12:51:38 crc kubenswrapper[5002]: I1209 12:51:38.535829 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ctlb2_4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c/pull/0.log" Dec 09 12:51:38 crc kubenswrapper[5002]: I1209 12:51:38.552227 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ctlb2_4ed47aa8-6b9f-4f89-b6b1-c6c87f8d567c/util/0.log" Dec 09 12:51:38 crc kubenswrapper[5002]: I1209 12:51:38.695619 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kgbq4_170c8ce8-33c6-4369-bee5-88bbf86890ca/extract-utilities/0.log" Dec 09 12:51:38 crc kubenswrapper[5002]: I1209 12:51:38.935262 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kgbq4_170c8ce8-33c6-4369-bee5-88bbf86890ca/extract-content/0.log" Dec 09 12:51:38 crc 
kubenswrapper[5002]: I1209 12:51:38.939339 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kgbq4_170c8ce8-33c6-4369-bee5-88bbf86890ca/extract-utilities/0.log" Dec 09 12:51:38 crc kubenswrapper[5002]: I1209 12:51:38.954988 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kgbq4_170c8ce8-33c6-4369-bee5-88bbf86890ca/extract-content/0.log" Dec 09 12:51:39 crc kubenswrapper[5002]: I1209 12:51:39.171957 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kgbq4_170c8ce8-33c6-4369-bee5-88bbf86890ca/extract-utilities/0.log" Dec 09 12:51:39 crc kubenswrapper[5002]: I1209 12:51:39.174675 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kgbq4_170c8ce8-33c6-4369-bee5-88bbf86890ca/extract-content/0.log" Dec 09 12:51:39 crc kubenswrapper[5002]: I1209 12:51:39.290885 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6h4f9_35816d2c-1627-4817-a15b-4339b0ca0ef6/extract-utilities/0.log" Dec 09 12:51:39 crc kubenswrapper[5002]: I1209 12:51:39.849844 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6h4f9_35816d2c-1627-4817-a15b-4339b0ca0ef6/extract-content/0.log" Dec 09 12:51:39 crc kubenswrapper[5002]: I1209 12:51:39.849843 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6h4f9_35816d2c-1627-4817-a15b-4339b0ca0ef6/extract-utilities/0.log" Dec 09 12:51:40 crc kubenswrapper[5002]: I1209 12:51:40.042203 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6h4f9_35816d2c-1627-4817-a15b-4339b0ca0ef6/extract-content/0.log" Dec 09 12:51:40 crc kubenswrapper[5002]: I1209 12:51:40.356247 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6h4f9_35816d2c-1627-4817-a15b-4339b0ca0ef6/extract-content/0.log" Dec 09 12:51:40 crc kubenswrapper[5002]: I1209 12:51:40.387470 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6h4f9_35816d2c-1627-4817-a15b-4339b0ca0ef6/extract-utilities/0.log" Dec 09 12:51:40 crc kubenswrapper[5002]: I1209 12:51:40.658854 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-ph5kb_fe698bbf-4fa3-4d21-880a-110c1eedf006/marketplace-operator/0.log" Dec 09 12:51:40 crc kubenswrapper[5002]: I1209 12:51:40.713047 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-q9xhw_cf487cfa-4480-4b72-86a8-6c57cc06403a/extract-utilities/0.log" Dec 09 12:51:40 crc kubenswrapper[5002]: I1209 12:51:40.967337 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-q9xhw_cf487cfa-4480-4b72-86a8-6c57cc06403a/extract-utilities/0.log" Dec 09 12:51:40 crc kubenswrapper[5002]: I1209 12:51:40.980567 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-q9xhw_cf487cfa-4480-4b72-86a8-6c57cc06403a/extract-content/0.log" Dec 09 12:51:41 crc kubenswrapper[5002]: I1209 12:51:41.008831 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-q9xhw_cf487cfa-4480-4b72-86a8-6c57cc06403a/extract-content/0.log" Dec 09 12:51:41 crc 
kubenswrapper[5002]: I1209 12:51:41.204738 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-q9xhw_cf487cfa-4480-4b72-86a8-6c57cc06403a/extract-utilities/0.log" Dec 09 12:51:41 crc kubenswrapper[5002]: I1209 12:51:41.266740 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-q9xhw_cf487cfa-4480-4b72-86a8-6c57cc06403a/extract-content/0.log" Dec 09 12:51:41 crc kubenswrapper[5002]: I1209 12:51:41.545272 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kgbq4_170c8ce8-33c6-4369-bee5-88bbf86890ca/registry-server/0.log" Dec 09 12:51:41 crc kubenswrapper[5002]: I1209 12:51:41.610567 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pjgxt_f98aeab2-acca-483b-86d6-22e294b6598b/extract-utilities/0.log" Dec 09 12:51:41 crc kubenswrapper[5002]: I1209 12:51:41.782311 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pjgxt_f98aeab2-acca-483b-86d6-22e294b6598b/extract-utilities/0.log" Dec 09 12:51:41 crc kubenswrapper[5002]: I1209 12:51:41.812056 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pjgxt_f98aeab2-acca-483b-86d6-22e294b6598b/extract-content/0.log" Dec 09 12:51:41 crc kubenswrapper[5002]: I1209 12:51:41.833833 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pjgxt_f98aeab2-acca-483b-86d6-22e294b6598b/extract-content/0.log" Dec 09 12:51:41 crc kubenswrapper[5002]: I1209 12:51:41.844216 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-q9xhw_cf487cfa-4480-4b72-86a8-6c57cc06403a/registry-server/0.log" Dec 09 12:51:42 crc kubenswrapper[5002]: I1209 12:51:42.100386 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pjgxt_f98aeab2-acca-483b-86d6-22e294b6598b/extract-content/0.log" Dec 09 12:51:42 crc kubenswrapper[5002]: I1209 12:51:42.172020 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pjgxt_f98aeab2-acca-483b-86d6-22e294b6598b/extract-utilities/0.log" Dec 09 12:51:42 crc kubenswrapper[5002]: I1209 12:51:42.507242 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pjgxt_f98aeab2-acca-483b-86d6-22e294b6598b/registry-server/0.log" Dec 09 12:51:42 crc kubenswrapper[5002]: I1209 12:51:42.639932 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6h4f9_35816d2c-1627-4817-a15b-4339b0ca0ef6/registry-server/0.log" Dec 09 12:51:56 crc kubenswrapper[5002]: I1209 12:51:56.984368 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-2d286_37dc16a6-bb85-4730-bb3f-acebf2ad7404/prometheus-operator/0.log" Dec 09 12:51:57 crc kubenswrapper[5002]: I1209 12:51:57.059501 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-b764dd5f8-8x8l9_183516b3-813f-49e0-9b21-0baa88a22cc6/prometheus-operator-admission-webhook/0.log" Dec 09 12:51:57 crc kubenswrapper[5002]: I1209 12:51:57.138912 5002 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-b764dd5f8-kswbr_ec9ffe1c-2fa3-440d-a9de-de56929b29db/prometheus-operator-admission-webhook/0.log" Dec 09 12:51:57 crc kubenswrapper[5002]: I1209 12:51:57.256655 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-qjfz8_f110f2fc-51fa-41da-8787-682fe8ceb5ff/operator/0.log" Dec 09 12:51:57 crc kubenswrapper[5002]: I1209 12:51:57.812729 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-87dfp_5cc80363-bb46-4085-b008-4dd0eecd9a09/perses-operator/0.log" Dec 09 12:51:58 crc kubenswrapper[5002]: I1209 12:51:58.727084 5002 scope.go:117] "RemoveContainer" containerID="9fdc15fc9c73bd0cd85aa9b68a692f7ad21e112c5a041c44d24a46bb706222c0" Dec 09 12:52:07 crc kubenswrapper[5002]: I1209 12:52:07.964498 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:52:07 crc kubenswrapper[5002]: I1209 12:52:07.965098 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:52:07 crc kubenswrapper[5002]: E1209 12:52:07.993152 5002 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.132:45508->38.102.83.132:39941: write tcp 38.102.83.132:45508->38.102.83.132:39941: write: broken pipe Dec 09 12:52:08 crc kubenswrapper[5002]: E1209 12:52:08.850217 5002 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.132:45652->38.102.83.132:39941: write tcp 38.102.83.132:45652->38.102.83.132:39941: write: broken pipe Dec 09 12:52:37 crc kubenswrapper[5002]: I1209 12:52:37.965284 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:52:37 crc kubenswrapper[5002]: I1209 12:52:37.966077 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:52:58 crc kubenswrapper[5002]: I1209 12:52:58.787593 5002 scope.go:117] "RemoveContainer" containerID="5109254cc9190f10933c09502783efb6e961bf116ede562d508aa93ba14c892d" Dec 09 12:52:58 crc kubenswrapper[5002]: I1209 12:52:58.820544 5002 scope.go:117] "RemoveContainer" containerID="9e9f76a14651656696421c7000ea36e350e79e3e04488fcbc404f364e55e857c" Dec 09 12:52:58 crc kubenswrapper[5002]: I1209 12:52:58.906989 5002 scope.go:117] "RemoveContainer" containerID="2824381e06efe240369b224895ea18ff77c765ab61308dbd951c2c8ed47f3fd2" Dec 09 12:53:07 crc kubenswrapper[5002]: I1209 12:53:07.965300 5002 patch_prober.go:28] interesting pod/machine-config-daemon-kxpn6 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:53:07 crc kubenswrapper[5002]: I1209 12:53:07.965986 5002 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:53:07 crc kubenswrapper[5002]: I1209 12:53:07.966045 5002 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" Dec 09 12:53:07 crc kubenswrapper[5002]: I1209 12:53:07.967268 5002 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"29f1eaa11fdfd853d7d438349fc9003be9b862bc0e494c5b01deab20dcab658f"} pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 12:53:07 crc kubenswrapper[5002]: I1209 12:53:07.967356 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerName="machine-config-daemon" containerID="cri-o://29f1eaa11fdfd853d7d438349fc9003be9b862bc0e494c5b01deab20dcab658f" gracePeriod=600 Dec 09 12:53:08 crc kubenswrapper[5002]: E1209 12:53:08.105511 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" Dec 09 12:53:08 crc kubenswrapper[5002]: I1209 12:53:08.888019 5002 generic.go:334] "Generic (PLEG): container finished" podID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" containerID="29f1eaa11fdfd853d7d438349fc9003be9b862bc0e494c5b01deab20dcab658f" exitCode=0 Dec 09 12:53:08 crc kubenswrapper[5002]: I1209 12:53:08.888060 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" event={"ID":"f49c6392-68b2-4847-9291-a0b4d9c1cbef","Type":"ContainerDied","Data":"29f1eaa11fdfd853d7d438349fc9003be9b862bc0e494c5b01deab20dcab658f"} Dec 09 12:53:08 crc kubenswrapper[5002]: I1209 12:53:08.888092 5002 scope.go:117] "RemoveContainer" containerID="2b189dbc4da5bd17d3dbe1a55a72bd7d5ec553e1d7ef1f793ee32247b44861fa" Dec 09 12:53:08 crc kubenswrapper[5002]: I1209 12:53:08.888782 5002 scope.go:117] "RemoveContainer" containerID="29f1eaa11fdfd853d7d438349fc9003be9b862bc0e494c5b01deab20dcab658f" Dec 09 12:53:08 crc kubenswrapper[5002]: E1209 12:53:08.889124 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef" 
Dec 09 12:53:24 crc kubenswrapper[5002]: I1209 12:53:24.061516 5002 scope.go:117] "RemoveContainer" containerID="29f1eaa11fdfd853d7d438349fc9003be9b862bc0e494c5b01deab20dcab658f"
Dec 09 12:53:24 crc kubenswrapper[5002]: E1209 12:53:24.062450 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 12:53:36 crc kubenswrapper[5002]: I1209 12:53:36.060955 5002 scope.go:117] "RemoveContainer" containerID="29f1eaa11fdfd853d7d438349fc9003be9b862bc0e494c5b01deab20dcab658f"
Dec 09 12:53:36 crc kubenswrapper[5002]: E1209 12:53:36.061736 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 12:53:51 crc kubenswrapper[5002]: I1209 12:53:51.060497 5002 scope.go:117] "RemoveContainer" containerID="29f1eaa11fdfd853d7d438349fc9003be9b862bc0e494c5b01deab20dcab658f"
Dec 09 12:53:51 crc kubenswrapper[5002]: E1209 12:53:51.061174 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 12:54:02 crc kubenswrapper[5002]: I1209 12:54:02.060844 5002 scope.go:117] "RemoveContainer" containerID="29f1eaa11fdfd853d7d438349fc9003be9b862bc0e494c5b01deab20dcab658f"
Dec 09 12:54:02 crc kubenswrapper[5002]: E1209 12:54:02.061777 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 12:54:04 crc kubenswrapper[5002]: I1209 12:54:04.603202 5002 generic.go:334] "Generic (PLEG): container finished" podID="1d3dbff2-1469-42d0-84d4-71bc89cf68f9" containerID="63ebb7f2d4affbc1e84f6cccb4488e35ed2d4e24cdebd892a6f60684f2a0b357" exitCode=0
Dec 09 12:54:04 crc kubenswrapper[5002]: I1209 12:54:04.603280 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x7gz9/must-gather-85wlw" event={"ID":"1d3dbff2-1469-42d0-84d4-71bc89cf68f9","Type":"ContainerDied","Data":"63ebb7f2d4affbc1e84f6cccb4488e35ed2d4e24cdebd892a6f60684f2a0b357"}
Dec 09 12:54:04 crc kubenswrapper[5002]: I1209 12:54:04.605379 5002 scope.go:117] "RemoveContainer" containerID="63ebb7f2d4affbc1e84f6cccb4488e35ed2d4e24cdebd892a6f60684f2a0b357"
Dec 09 12:54:04 crc kubenswrapper[5002]: I1209 12:54:04.997675 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-x7gz9_must-gather-85wlw_1d3dbff2-1469-42d0-84d4-71bc89cf68f9/gather/0.log"
Dec 09 12:54:14 crc kubenswrapper[5002]: I1209 12:54:14.744474 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-x7gz9/must-gather-85wlw"]
Dec 09 12:54:14 crc kubenswrapper[5002]: I1209 12:54:14.745573 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-x7gz9/must-gather-85wlw" podUID="1d3dbff2-1469-42d0-84d4-71bc89cf68f9" containerName="copy" containerID="cri-o://4688a9782d27d6a6545b61de56a652004422a0f1de22907ca5b3ade35c1659ba" gracePeriod=2
Dec 09 12:54:14 crc kubenswrapper[5002]: I1209 12:54:14.762235 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-x7gz9/must-gather-85wlw"]
Dec 09 12:54:15 crc kubenswrapper[5002]: I1209 12:54:15.248720 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-x7gz9_must-gather-85wlw_1d3dbff2-1469-42d0-84d4-71bc89cf68f9/copy/0.log"
Dec 09 12:54:15 crc kubenswrapper[5002]: I1209 12:54:15.249323 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x7gz9/must-gather-85wlw"
Dec 09 12:54:15 crc kubenswrapper[5002]: I1209 12:54:15.357900 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9wt7\" (UniqueName: \"kubernetes.io/projected/1d3dbff2-1469-42d0-84d4-71bc89cf68f9-kube-api-access-h9wt7\") pod \"1d3dbff2-1469-42d0-84d4-71bc89cf68f9\" (UID: \"1d3dbff2-1469-42d0-84d4-71bc89cf68f9\") "
Dec 09 12:54:15 crc kubenswrapper[5002]: I1209 12:54:15.358232 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1d3dbff2-1469-42d0-84d4-71bc89cf68f9-must-gather-output\") pod \"1d3dbff2-1469-42d0-84d4-71bc89cf68f9\" (UID: \"1d3dbff2-1469-42d0-84d4-71bc89cf68f9\") "
Dec 09 12:54:15 crc kubenswrapper[5002]: I1209 12:54:15.363501 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d3dbff2-1469-42d0-84d4-71bc89cf68f9-kube-api-access-h9wt7" (OuterVolumeSpecName: "kube-api-access-h9wt7") pod "1d3dbff2-1469-42d0-84d4-71bc89cf68f9" (UID: "1d3dbff2-1469-42d0-84d4-71bc89cf68f9"). InnerVolumeSpecName "kube-api-access-h9wt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 12:54:15 crc kubenswrapper[5002]: I1209 12:54:15.461403 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9wt7\" (UniqueName: \"kubernetes.io/projected/1d3dbff2-1469-42d0-84d4-71bc89cf68f9-kube-api-access-h9wt7\") on node \"crc\" DevicePath \"\""
Dec 09 12:54:15 crc kubenswrapper[5002]: I1209 12:54:15.531331 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d3dbff2-1469-42d0-84d4-71bc89cf68f9-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "1d3dbff2-1469-42d0-84d4-71bc89cf68f9" (UID: "1d3dbff2-1469-42d0-84d4-71bc89cf68f9"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 12:54:15 crc kubenswrapper[5002]: I1209 12:54:15.563324 5002 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1d3dbff2-1469-42d0-84d4-71bc89cf68f9-must-gather-output\") on node \"crc\" DevicePath \"\""
Dec 09 12:54:15 crc kubenswrapper[5002]: I1209 12:54:15.721653 5002 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-x7gz9_must-gather-85wlw_1d3dbff2-1469-42d0-84d4-71bc89cf68f9/copy/0.log"
Dec 09 12:54:15 crc kubenswrapper[5002]: I1209 12:54:15.722150 5002 generic.go:334] "Generic (PLEG): container finished" podID="1d3dbff2-1469-42d0-84d4-71bc89cf68f9" containerID="4688a9782d27d6a6545b61de56a652004422a0f1de22907ca5b3ade35c1659ba" exitCode=143
Dec 09 12:54:15 crc kubenswrapper[5002]: I1209 12:54:15.722219 5002 scope.go:117] "RemoveContainer" containerID="4688a9782d27d6a6545b61de56a652004422a0f1de22907ca5b3ade35c1659ba"
Dec 09 12:54:15 crc kubenswrapper[5002]: I1209 12:54:15.722271 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x7gz9/must-gather-85wlw"
Dec 09 12:54:15 crc kubenswrapper[5002]: I1209 12:54:15.768467 5002 scope.go:117] "RemoveContainer" containerID="63ebb7f2d4affbc1e84f6cccb4488e35ed2d4e24cdebd892a6f60684f2a0b357"
Dec 09 12:54:15 crc kubenswrapper[5002]: I1209 12:54:15.838199 5002 scope.go:117] "RemoveContainer" containerID="4688a9782d27d6a6545b61de56a652004422a0f1de22907ca5b3ade35c1659ba"
Dec 09 12:54:15 crc kubenswrapper[5002]: E1209 12:54:15.839379 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4688a9782d27d6a6545b61de56a652004422a0f1de22907ca5b3ade35c1659ba\": container with ID starting with 4688a9782d27d6a6545b61de56a652004422a0f1de22907ca5b3ade35c1659ba not found: ID does not exist" containerID="4688a9782d27d6a6545b61de56a652004422a0f1de22907ca5b3ade35c1659ba"
Dec 09 12:54:15 crc kubenswrapper[5002]: I1209 12:54:15.839436 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4688a9782d27d6a6545b61de56a652004422a0f1de22907ca5b3ade35c1659ba"} err="failed to get container status \"4688a9782d27d6a6545b61de56a652004422a0f1de22907ca5b3ade35c1659ba\": rpc error: code = NotFound desc = could not find container \"4688a9782d27d6a6545b61de56a652004422a0f1de22907ca5b3ade35c1659ba\": container with ID starting with 4688a9782d27d6a6545b61de56a652004422a0f1de22907ca5b3ade35c1659ba not found: ID does not exist"
Dec 09 12:54:15 crc kubenswrapper[5002]: I1209 12:54:15.839469 5002 scope.go:117] "RemoveContainer" containerID="63ebb7f2d4affbc1e84f6cccb4488e35ed2d4e24cdebd892a6f60684f2a0b357"
Dec 09 12:54:15 crc kubenswrapper[5002]: E1209 12:54:15.841495 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63ebb7f2d4affbc1e84f6cccb4488e35ed2d4e24cdebd892a6f60684f2a0b357\": container with ID starting with 63ebb7f2d4affbc1e84f6cccb4488e35ed2d4e24cdebd892a6f60684f2a0b357 not found: ID does not exist" containerID="63ebb7f2d4affbc1e84f6cccb4488e35ed2d4e24cdebd892a6f60684f2a0b357"
Dec 09 12:54:15 crc kubenswrapper[5002]: I1209 12:54:15.841536 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63ebb7f2d4affbc1e84f6cccb4488e35ed2d4e24cdebd892a6f60684f2a0b357"} err="failed to get container status \"63ebb7f2d4affbc1e84f6cccb4488e35ed2d4e24cdebd892a6f60684f2a0b357\": rpc error: code = NotFound desc = could not find container \"63ebb7f2d4affbc1e84f6cccb4488e35ed2d4e24cdebd892a6f60684f2a0b357\": container with ID starting with 63ebb7f2d4affbc1e84f6cccb4488e35ed2d4e24cdebd892a6f60684f2a0b357 not found: ID does not exist"
Dec 09 12:54:16 crc kubenswrapper[5002]: I1209 12:54:16.060130 5002 scope.go:117] "RemoveContainer" containerID="29f1eaa11fdfd853d7d438349fc9003be9b862bc0e494c5b01deab20dcab658f"
Dec 09 12:54:16 crc kubenswrapper[5002]: E1209 12:54:16.060367 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 12:54:16 crc kubenswrapper[5002]: I1209 12:54:16.070805 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d3dbff2-1469-42d0-84d4-71bc89cf68f9" path="/var/lib/kubelet/pods/1d3dbff2-1469-42d0-84d4-71bc89cf68f9/volumes"
Dec 09 12:54:28 crc kubenswrapper[5002]: I1209 12:54:28.068532 5002 scope.go:117] "RemoveContainer" containerID="29f1eaa11fdfd853d7d438349fc9003be9b862bc0e494c5b01deab20dcab658f"
Dec 09 12:54:28 crc kubenswrapper[5002]: E1209 12:54:28.069411 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 12:54:38 crc kubenswrapper[5002]: I1209 12:54:38.804778 5002 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lhxsj"]
Dec 09 12:54:38 crc kubenswrapper[5002]: E1209 12:54:38.806164 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7326dcf-8a30-4b22-af34-b7705e04e54c" containerName="extract-utilities"
Dec 09 12:54:38 crc kubenswrapper[5002]: I1209 12:54:38.806183 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7326dcf-8a30-4b22-af34-b7705e04e54c" containerName="extract-utilities"
Dec 09 12:54:38 crc kubenswrapper[5002]: E1209 12:54:38.806210 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7326dcf-8a30-4b22-af34-b7705e04e54c" containerName="extract-content"
Dec 09 12:54:38 crc kubenswrapper[5002]: I1209 12:54:38.806218 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7326dcf-8a30-4b22-af34-b7705e04e54c" containerName="extract-content"
Dec 09 12:54:38 crc kubenswrapper[5002]: E1209 12:54:38.806231 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7326dcf-8a30-4b22-af34-b7705e04e54c" containerName="registry-server"
Dec 09 12:54:38 crc kubenswrapper[5002]: I1209 12:54:38.806239 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7326dcf-8a30-4b22-af34-b7705e04e54c" containerName="registry-server"
Dec 09 12:54:38 crc kubenswrapper[5002]: E1209 12:54:38.806268 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d3dbff2-1469-42d0-84d4-71bc89cf68f9" containerName="copy"
Dec 09 12:54:38 crc kubenswrapper[5002]: I1209 12:54:38.806287 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d3dbff2-1469-42d0-84d4-71bc89cf68f9" containerName="copy"
Dec 09 12:54:38 crc kubenswrapper[5002]: E1209 12:54:38.806316 5002 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d3dbff2-1469-42d0-84d4-71bc89cf68f9" containerName="gather"
Dec 09 12:54:38 crc kubenswrapper[5002]: I1209 12:54:38.806324 5002 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d3dbff2-1469-42d0-84d4-71bc89cf68f9" containerName="gather"
Dec 09 12:54:38 crc kubenswrapper[5002]: I1209 12:54:38.806590 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d3dbff2-1469-42d0-84d4-71bc89cf68f9" containerName="copy"
Dec 09 12:54:38 crc kubenswrapper[5002]: I1209 12:54:38.806613 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d3dbff2-1469-42d0-84d4-71bc89cf68f9" containerName="gather"
Dec 09 12:54:38 crc kubenswrapper[5002]: I1209 12:54:38.806629 5002 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7326dcf-8a30-4b22-af34-b7705e04e54c" containerName="registry-server"
Dec 09 12:54:38 crc kubenswrapper[5002]: I1209 12:54:38.808630 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lhxsj"
Dec 09 12:54:38 crc kubenswrapper[5002]: I1209 12:54:38.816994 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lhxsj"]
Dec 09 12:54:38 crc kubenswrapper[5002]: I1209 12:54:38.888396 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ba7ddaf-bb6d-4090-b9c8-584502b1f730-catalog-content\") pod \"certified-operators-lhxsj\" (UID: \"8ba7ddaf-bb6d-4090-b9c8-584502b1f730\") " pod="openshift-marketplace/certified-operators-lhxsj"
Dec 09 12:54:38 crc kubenswrapper[5002]: I1209 12:54:38.888518 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ba7ddaf-bb6d-4090-b9c8-584502b1f730-utilities\") pod \"certified-operators-lhxsj\" (UID: \"8ba7ddaf-bb6d-4090-b9c8-584502b1f730\") " pod="openshift-marketplace/certified-operators-lhxsj"
Dec 09 12:54:38 crc kubenswrapper[5002]: I1209 12:54:38.888557 5002 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzqwc\" (UniqueName: \"kubernetes.io/projected/8ba7ddaf-bb6d-4090-b9c8-584502b1f730-kube-api-access-zzqwc\") pod \"certified-operators-lhxsj\" (UID: \"8ba7ddaf-bb6d-4090-b9c8-584502b1f730\") " pod="openshift-marketplace/certified-operators-lhxsj"
Dec 09 12:54:38 crc kubenswrapper[5002]: I1209 12:54:38.990992 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ba7ddaf-bb6d-4090-b9c8-584502b1f730-catalog-content\") pod \"certified-operators-lhxsj\" (UID: \"8ba7ddaf-bb6d-4090-b9c8-584502b1f730\") " pod="openshift-marketplace/certified-operators-lhxsj"
Dec 09 12:54:38 crc kubenswrapper[5002]: I1209 12:54:38.991108 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ba7ddaf-bb6d-4090-b9c8-584502b1f730-utilities\") pod \"certified-operators-lhxsj\" (UID: \"8ba7ddaf-bb6d-4090-b9c8-584502b1f730\") " pod="openshift-marketplace/certified-operators-lhxsj"
Dec 09 12:54:38 crc kubenswrapper[5002]: I1209 12:54:38.991147 5002 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzqwc\" (UniqueName: \"kubernetes.io/projected/8ba7ddaf-bb6d-4090-b9c8-584502b1f730-kube-api-access-zzqwc\") pod \"certified-operators-lhxsj\" (UID: \"8ba7ddaf-bb6d-4090-b9c8-584502b1f730\") " pod="openshift-marketplace/certified-operators-lhxsj"
Dec 09 12:54:38 crc kubenswrapper[5002]: I1209 12:54:38.992122 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ba7ddaf-bb6d-4090-b9c8-584502b1f730-catalog-content\") pod \"certified-operators-lhxsj\" (UID: \"8ba7ddaf-bb6d-4090-b9c8-584502b1f730\") " pod="openshift-marketplace/certified-operators-lhxsj"
Dec 09 12:54:38 crc kubenswrapper[5002]: I1209 12:54:38.992158 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ba7ddaf-bb6d-4090-b9c8-584502b1f730-utilities\") pod \"certified-operators-lhxsj\" (UID: \"8ba7ddaf-bb6d-4090-b9c8-584502b1f730\") " pod="openshift-marketplace/certified-operators-lhxsj"
Dec 09 12:54:39 crc kubenswrapper[5002]: I1209 12:54:39.010283 5002 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzqwc\" (UniqueName: \"kubernetes.io/projected/8ba7ddaf-bb6d-4090-b9c8-584502b1f730-kube-api-access-zzqwc\") pod \"certified-operators-lhxsj\" (UID: \"8ba7ddaf-bb6d-4090-b9c8-584502b1f730\") " pod="openshift-marketplace/certified-operators-lhxsj"
Dec 09 12:54:39 crc kubenswrapper[5002]: I1209 12:54:39.061461 5002 scope.go:117] "RemoveContainer" containerID="29f1eaa11fdfd853d7d438349fc9003be9b862bc0e494c5b01deab20dcab658f"
Dec 09 12:54:39 crc kubenswrapper[5002]: E1209 12:54:39.061742 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 12:54:39 crc kubenswrapper[5002]: I1209 12:54:39.142319 5002 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lhxsj"
Dec 09 12:54:39 crc kubenswrapper[5002]: I1209 12:54:39.756027 5002 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lhxsj"]
Dec 09 12:54:40 crc kubenswrapper[5002]: I1209 12:54:40.073066 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lhxsj" event={"ID":"8ba7ddaf-bb6d-4090-b9c8-584502b1f730","Type":"ContainerStarted","Data":"cf38d7331f6869ce8f7cd6940272a975f87a4d783e22fa09e1667af80685de38"}
Dec 09 12:54:41 crc kubenswrapper[5002]: I1209 12:54:41.077661 5002 generic.go:334] "Generic (PLEG): container finished" podID="8ba7ddaf-bb6d-4090-b9c8-584502b1f730" containerID="07a3d8bf4a7d9714e8b74383540ba2ca54d9cfad31623e831d9b8a52fcad4f8a" exitCode=0
Dec 09 12:54:41 crc kubenswrapper[5002]: I1209 12:54:41.077710 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lhxsj" event={"ID":"8ba7ddaf-bb6d-4090-b9c8-584502b1f730","Type":"ContainerDied","Data":"07a3d8bf4a7d9714e8b74383540ba2ca54d9cfad31623e831d9b8a52fcad4f8a"}
Dec 09 12:54:41 crc kubenswrapper[5002]: I1209 12:54:41.079619 5002 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 09 12:54:45 crc kubenswrapper[5002]: I1209 12:54:45.117049 5002 generic.go:334] "Generic (PLEG): container finished" podID="8ba7ddaf-bb6d-4090-b9c8-584502b1f730" containerID="f49e6b43516a64a9de273fb4dde3cb08802cad633c9b4c31c01ef5157e988dc4" exitCode=0
Dec 09 12:54:45 crc kubenswrapper[5002]: I1209 12:54:45.117098 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lhxsj" event={"ID":"8ba7ddaf-bb6d-4090-b9c8-584502b1f730","Type":"ContainerDied","Data":"f49e6b43516a64a9de273fb4dde3cb08802cad633c9b4c31c01ef5157e988dc4"}
Dec 09 12:54:47 crc kubenswrapper[5002]: I1209 12:54:47.143186 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lhxsj" event={"ID":"8ba7ddaf-bb6d-4090-b9c8-584502b1f730","Type":"ContainerStarted","Data":"c8817d70963cfea1896abc28f3796c6147022035dc66c8c19dd64d7795c933df"}
Dec 09 12:54:47 crc kubenswrapper[5002]: I1209 12:54:47.162541 5002 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lhxsj" podStartSLOduration=4.397936201 podStartE2EDuration="9.162522033s" podCreationTimestamp="2025-12-09 12:54:38 +0000 UTC" firstStartedPulling="2025-12-09 12:54:41.079372832 +0000 UTC m=+10413.471423913" lastFinishedPulling="2025-12-09 12:54:45.843958614 +0000 UTC m=+10418.236009745" observedRunningTime="2025-12-09 12:54:47.15830515 +0000 UTC m=+10419.550356241" watchObservedRunningTime="2025-12-09 12:54:47.162522033 +0000 UTC m=+10419.554573114"
Dec 09 12:54:49 crc kubenswrapper[5002]: I1209 12:54:49.143413 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lhxsj"
Dec 09 12:54:49 crc kubenswrapper[5002]: I1209 12:54:49.143740 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lhxsj"
Dec 09 12:54:49 crc kubenswrapper[5002]: I1209 12:54:49.220701 5002 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lhxsj"
Dec 09 12:54:52 crc kubenswrapper[5002]: I1209 12:54:52.061374 5002 scope.go:117] "RemoveContainer" containerID="29f1eaa11fdfd853d7d438349fc9003be9b862bc0e494c5b01deab20dcab658f"
Dec 09 12:54:52 crc kubenswrapper[5002]: E1209 12:54:52.062492 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 12:54:59 crc kubenswrapper[5002]: I1209 12:54:59.200627 5002 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lhxsj"
Dec 09 12:54:59 crc kubenswrapper[5002]: I1209 12:54:59.262392 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lhxsj"]
Dec 09 12:54:59 crc kubenswrapper[5002]: I1209 12:54:59.289319 5002 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lhxsj" podUID="8ba7ddaf-bb6d-4090-b9c8-584502b1f730" containerName="registry-server" containerID="cri-o://c8817d70963cfea1896abc28f3796c6147022035dc66c8c19dd64d7795c933df" gracePeriod=2
Dec 09 12:54:59 crc kubenswrapper[5002]: I1209 12:54:59.799123 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lhxsj"
Dec 09 12:54:59 crc kubenswrapper[5002]: I1209 12:54:59.906255 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ba7ddaf-bb6d-4090-b9c8-584502b1f730-catalog-content\") pod \"8ba7ddaf-bb6d-4090-b9c8-584502b1f730\" (UID: \"8ba7ddaf-bb6d-4090-b9c8-584502b1f730\") "
Dec 09 12:54:59 crc kubenswrapper[5002]: I1209 12:54:59.907016 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ba7ddaf-bb6d-4090-b9c8-584502b1f730-utilities\") pod \"8ba7ddaf-bb6d-4090-b9c8-584502b1f730\" (UID: \"8ba7ddaf-bb6d-4090-b9c8-584502b1f730\") "
Dec 09 12:54:59 crc kubenswrapper[5002]: I1209 12:54:59.907382 5002 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzqwc\" (UniqueName: \"kubernetes.io/projected/8ba7ddaf-bb6d-4090-b9c8-584502b1f730-kube-api-access-zzqwc\") pod \"8ba7ddaf-bb6d-4090-b9c8-584502b1f730\" (UID: \"8ba7ddaf-bb6d-4090-b9c8-584502b1f730\") "
Dec 09 12:54:59 crc kubenswrapper[5002]: I1209 12:54:59.909480 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ba7ddaf-bb6d-4090-b9c8-584502b1f730-utilities" (OuterVolumeSpecName: "utilities") pod "8ba7ddaf-bb6d-4090-b9c8-584502b1f730" (UID: "8ba7ddaf-bb6d-4090-b9c8-584502b1f730"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 12:54:59 crc kubenswrapper[5002]: I1209 12:54:59.916608 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ba7ddaf-bb6d-4090-b9c8-584502b1f730-kube-api-access-zzqwc" (OuterVolumeSpecName: "kube-api-access-zzqwc") pod "8ba7ddaf-bb6d-4090-b9c8-584502b1f730" (UID: "8ba7ddaf-bb6d-4090-b9c8-584502b1f730"). InnerVolumeSpecName "kube-api-access-zzqwc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 12:54:59 crc kubenswrapper[5002]: I1209 12:54:59.984866 5002 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ba7ddaf-bb6d-4090-b9c8-584502b1f730-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8ba7ddaf-bb6d-4090-b9c8-584502b1f730" (UID: "8ba7ddaf-bb6d-4090-b9c8-584502b1f730"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 12:55:00 crc kubenswrapper[5002]: I1209 12:55:00.010315 5002 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzqwc\" (UniqueName: \"kubernetes.io/projected/8ba7ddaf-bb6d-4090-b9c8-584502b1f730-kube-api-access-zzqwc\") on node \"crc\" DevicePath \"\""
Dec 09 12:55:00 crc kubenswrapper[5002]: I1209 12:55:00.010368 5002 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ba7ddaf-bb6d-4090-b9c8-584502b1f730-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 09 12:55:00 crc kubenswrapper[5002]: I1209 12:55:00.010381 5002 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ba7ddaf-bb6d-4090-b9c8-584502b1f730-utilities\") on node \"crc\" DevicePath \"\""
Dec 09 12:55:00 crc kubenswrapper[5002]: I1209 12:55:00.310151 5002 generic.go:334] "Generic (PLEG): container finished" podID="8ba7ddaf-bb6d-4090-b9c8-584502b1f730" containerID="c8817d70963cfea1896abc28f3796c6147022035dc66c8c19dd64d7795c933df" exitCode=0
Dec 09 12:55:00 crc kubenswrapper[5002]: I1209 12:55:00.310207 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lhxsj" event={"ID":"8ba7ddaf-bb6d-4090-b9c8-584502b1f730","Type":"ContainerDied","Data":"c8817d70963cfea1896abc28f3796c6147022035dc66c8c19dd64d7795c933df"}
Dec 09 12:55:00 crc kubenswrapper[5002]: I1209 12:55:00.310244 5002 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lhxsj" event={"ID":"8ba7ddaf-bb6d-4090-b9c8-584502b1f730","Type":"ContainerDied","Data":"cf38d7331f6869ce8f7cd6940272a975f87a4d783e22fa09e1667af80685de38"}
Dec 09 12:55:00 crc kubenswrapper[5002]: I1209 12:55:00.310269 5002 scope.go:117] "RemoveContainer" containerID="c8817d70963cfea1896abc28f3796c6147022035dc66c8c19dd64d7795c933df"
Dec 09 12:55:00 crc kubenswrapper[5002]: I1209 12:55:00.310464 5002 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lhxsj"
Dec 09 12:55:00 crc kubenswrapper[5002]: I1209 12:55:00.684416 5002 scope.go:117] "RemoveContainer" containerID="f49e6b43516a64a9de273fb4dde3cb08802cad633c9b4c31c01ef5157e988dc4"
Dec 09 12:55:00 crc kubenswrapper[5002]: I1209 12:55:00.705049 5002 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lhxsj"]
Dec 09 12:55:00 crc kubenswrapper[5002]: I1209 12:55:00.709753 5002 scope.go:117] "RemoveContainer" containerID="07a3d8bf4a7d9714e8b74383540ba2ca54d9cfad31623e831d9b8a52fcad4f8a"
Dec 09 12:55:00 crc kubenswrapper[5002]: I1209 12:55:00.715177 5002 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lhxsj"]
Dec 09 12:55:00 crc kubenswrapper[5002]: I1209 12:55:00.845775 5002 scope.go:117] "RemoveContainer" containerID="c8817d70963cfea1896abc28f3796c6147022035dc66c8c19dd64d7795c933df"
Dec 09 12:55:00 crc kubenswrapper[5002]: E1209 12:55:00.846944 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8817d70963cfea1896abc28f3796c6147022035dc66c8c19dd64d7795c933df\": container with ID starting with c8817d70963cfea1896abc28f3796c6147022035dc66c8c19dd64d7795c933df not found: ID does not exist" containerID="c8817d70963cfea1896abc28f3796c6147022035dc66c8c19dd64d7795c933df"
Dec 09 12:55:00 crc kubenswrapper[5002]: I1209 12:55:00.846995 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8817d70963cfea1896abc28f3796c6147022035dc66c8c19dd64d7795c933df"} err="failed to get container status \"c8817d70963cfea1896abc28f3796c6147022035dc66c8c19dd64d7795c933df\": rpc error: code = NotFound desc = could not find container \"c8817d70963cfea1896abc28f3796c6147022035dc66c8c19dd64d7795c933df\": container with ID starting with c8817d70963cfea1896abc28f3796c6147022035dc66c8c19dd64d7795c933df not found: ID does not exist"
Dec 09 12:55:00 crc kubenswrapper[5002]: I1209 12:55:00.847019 5002 scope.go:117] "RemoveContainer" containerID="f49e6b43516a64a9de273fb4dde3cb08802cad633c9b4c31c01ef5157e988dc4"
Dec 09 12:55:00 crc kubenswrapper[5002]: E1209 12:55:00.847427 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f49e6b43516a64a9de273fb4dde3cb08802cad633c9b4c31c01ef5157e988dc4\": container with ID starting with f49e6b43516a64a9de273fb4dde3cb08802cad633c9b4c31c01ef5157e988dc4 not found: ID does not exist" containerID="f49e6b43516a64a9de273fb4dde3cb08802cad633c9b4c31c01ef5157e988dc4"
Dec 09 12:55:00 crc kubenswrapper[5002]: I1209 12:55:00.847472 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f49e6b43516a64a9de273fb4dde3cb08802cad633c9b4c31c01ef5157e988dc4"} err="failed to get container status \"f49e6b43516a64a9de273fb4dde3cb08802cad633c9b4c31c01ef5157e988dc4\": rpc error: code = NotFound desc = could not find container \"f49e6b43516a64a9de273fb4dde3cb08802cad633c9b4c31c01ef5157e988dc4\": container with ID starting with f49e6b43516a64a9de273fb4dde3cb08802cad633c9b4c31c01ef5157e988dc4 not found: ID does not exist"
Dec 09 12:55:00 crc kubenswrapper[5002]: I1209 12:55:00.847498 5002 scope.go:117] "RemoveContainer" containerID="07a3d8bf4a7d9714e8b74383540ba2ca54d9cfad31623e831d9b8a52fcad4f8a"
Dec 09 12:55:00 crc kubenswrapper[5002]: E1209 12:55:00.847950 5002 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07a3d8bf4a7d9714e8b74383540ba2ca54d9cfad31623e831d9b8a52fcad4f8a\": container with ID starting with 07a3d8bf4a7d9714e8b74383540ba2ca54d9cfad31623e831d9b8a52fcad4f8a not found: ID does not exist" containerID="07a3d8bf4a7d9714e8b74383540ba2ca54d9cfad31623e831d9b8a52fcad4f8a"
Dec 09 12:55:00 crc kubenswrapper[5002]: I1209 12:55:00.848004 5002 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07a3d8bf4a7d9714e8b74383540ba2ca54d9cfad31623e831d9b8a52fcad4f8a"} err="failed to get container status \"07a3d8bf4a7d9714e8b74383540ba2ca54d9cfad31623e831d9b8a52fcad4f8a\": rpc error: code = NotFound desc = could not find container \"07a3d8bf4a7d9714e8b74383540ba2ca54d9cfad31623e831d9b8a52fcad4f8a\": container with ID starting with 07a3d8bf4a7d9714e8b74383540ba2ca54d9cfad31623e831d9b8a52fcad4f8a not found: ID does not exist"
Dec 09 12:55:02 crc kubenswrapper[5002]: I1209 12:55:02.071695 5002 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ba7ddaf-bb6d-4090-b9c8-584502b1f730" path="/var/lib/kubelet/pods/8ba7ddaf-bb6d-4090-b9c8-584502b1f730/volumes"
Dec 09 12:55:06 crc kubenswrapper[5002]: I1209 12:55:06.061551 5002 scope.go:117] "RemoveContainer" containerID="29f1eaa11fdfd853d7d438349fc9003be9b862bc0e494c5b01deab20dcab658f"
Dec 09 12:55:06 crc kubenswrapper[5002]: E1209 12:55:06.062284 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 12:55:17 crc kubenswrapper[5002]: I1209 12:55:17.060671 5002 scope.go:117] "RemoveContainer" containerID="29f1eaa11fdfd853d7d438349fc9003be9b862bc0e494c5b01deab20dcab658f"
Dec 09 12:55:17 crc kubenswrapper[5002]: E1209 12:55:17.061430 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 12:55:28 crc kubenswrapper[5002]: I1209 12:55:28.072348 5002 scope.go:117] "RemoveContainer" containerID="29f1eaa11fdfd853d7d438349fc9003be9b862bc0e494c5b01deab20dcab658f"
Dec 09 12:55:28 crc kubenswrapper[5002]: E1209 12:55:28.073544 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 12:55:30 crc kubenswrapper[5002]: I1209 12:55:30.664603 5002 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","burstable","pod8ba7ddaf-bb6d-4090-b9c8-584502b1f730"] err="unable to destroy cgroup paths for cgroup [kubepods burstable pod8ba7ddaf-bb6d-4090-b9c8-584502b1f730] : Timed out while waiting for systemd to remove kubepods-burstable-pod8ba7ddaf_bb6d_4090_b9c8_584502b1f730.slice"
Dec 09 12:55:40 crc kubenswrapper[5002]: I1209 12:55:40.060898 5002 scope.go:117] "RemoveContainer" containerID="29f1eaa11fdfd853d7d438349fc9003be9b862bc0e494c5b01deab20dcab658f"
Dec 09 12:55:40 crc kubenswrapper[5002]: E1209 12:55:40.063601 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 12:55:55 crc kubenswrapper[5002]: I1209 12:55:55.060246 5002 scope.go:117] "RemoveContainer" containerID="29f1eaa11fdfd853d7d438349fc9003be9b862bc0e494c5b01deab20dcab658f"
Dec 09 12:55:55 crc kubenswrapper[5002]: E1209 12:55:55.060952 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 12:56:09 crc kubenswrapper[5002]: I1209 12:56:09.060463 5002 scope.go:117] "RemoveContainer" containerID="29f1eaa11fdfd853d7d438349fc9003be9b862bc0e494c5b01deab20dcab658f"
Dec 09 12:56:09 crc kubenswrapper[5002]: E1209 12:56:09.061469 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 12:56:20 crc kubenswrapper[5002]: I1209 12:56:20.064325 5002 scope.go:117] "RemoveContainer" containerID="29f1eaa11fdfd853d7d438349fc9003be9b862bc0e494c5b01deab20dcab658f"
Dec 09 12:56:20 crc kubenswrapper[5002]: E1209 12:56:20.065407 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 12:56:33 crc kubenswrapper[5002]: I1209 12:56:33.060321 5002 scope.go:117] "RemoveContainer" containerID="29f1eaa11fdfd853d7d438349fc9003be9b862bc0e494c5b01deab20dcab658f"
Dec 09 12:56:33 crc kubenswrapper[5002]: E1209 12:56:33.061516 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 12:56:46 crc kubenswrapper[5002]: I1209 12:56:46.060975 5002 scope.go:117] "RemoveContainer" containerID="29f1eaa11fdfd853d7d438349fc9003be9b862bc0e494c5b01deab20dcab658f"
Dec 09 12:56:46 crc kubenswrapper[5002]: E1209 12:56:46.062035 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"
Dec 09 12:57:01 crc kubenswrapper[5002]: I1209 12:57:01.060896 5002 scope.go:117] "RemoveContainer" containerID="29f1eaa11fdfd853d7d438349fc9003be9b862bc0e494c5b01deab20dcab658f"
Dec 09 12:57:01 crc kubenswrapper[5002]: E1209 12:57:01.061791 5002 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kxpn6_openshift-machine-config-operator(f49c6392-68b2-4847-9291-a0b4d9c1cbef)\"" pod="openshift-machine-config-operator/machine-config-daemon-kxpn6" podUID="f49c6392-68b2-4847-9291-a0b4d9c1cbef"